
Special:RecentChangesLinked returns 404
Closed, Invalid (Public)

Description

Both pages return HTTP status code 404, while it's clear that the pages work. What's happening here?

Event Timeline

Reedy renamed this task from Special:RecentChangesLinked/Speciaal:RecenteWijzigingenGelinkt returns 404 to Special:RecentChangesLinked returns 404.Dec 30 2018, 4:26 PM
Reedy edited projects, added MediaWiki-Special-pages; removed MediaWiki-General.

This confused me a bit. When I go to these pages it works as expected (it shows RC).

Aklapper changed the task status from Open to Stalled.Dec 31 2018, 8:14 PM

@Smile4ever: Both pages work for me. Have you tried with another machine and with another internet provider? For which items do you get an error if you open the "network" tab in your web browser's developer tools with debug=true? See for more info.

@Aklapper I tried with Postman on the same computer, and that returned HTTP 404.

Firefox on the same computer (Manjaro Linux + Firefox Nightly 65) gives 404 when using the "Network" tab:

Request headers:

I also got it on a different computer and operating system (Windows 10 + Firefox Nightly 66a1 2018-12-26 / 2019-01-01).

In Chrome I didn't get a 404 response. Maybe it has something to do with the user agent?

Aklapper changed the task status from Stalled to Open.Jan 1 2019, 9:22 PM

Does this also happen with Firefox non-Nightly (=stable)?

I tested now with Opera 57.0.3098.106 and it's happening there too.

It's definitely replicable, which is why I didn't close the bug. It's also not WMF-specific, as my dev wiki does the same.

$ curl -I
HTTP/2 404 
$ curl -I
HTTP/2 200 

It can also be seen in the Google Chrome console.

C:\Users\zoran\Desktop\development>curl -I
HTTP/1.1 404 Not Found
Date: Tue, 01 Jan 2019 21:24:45 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
Server: mw1326.eqiad.wmnet
X-Content-Type-Options: nosniff
P3P: CP="This is not a P3P policy! See for more info."
X-Powered-By: HHVM/3.18.6-dev
Content-language: en
X-Frame-Options: DENY
Backend-Timing: D=154062 t=1546377878830353
Vary: Accept-Encoding,Cookie,Authorization,X-Seven
X-Varnish: 585549347, 377890741 379722423, 1060887976
Via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
Age: 6
X-Cache: cp1083 pass, cp3033 hit/1, cp3042 miss
X-Cache-Status: hit-local
Server-Timing: cache;desc="hit-local"
Strict-Transport-Security: max-age=106384710; includeSubDomains; preload
Set-Cookie: WMF-Last-Access=01-Jan-2019;Path=/;HttpOnly;secure;Expires=Sat, 02 Feb 2019 12:00:00 GMT
Set-Cookie: WMF-Last-Access-Global=01-Jan-2019;Path=/;;HttpOnly;secure;Expires=Sat, 02 Feb 2019 12:00:00 GMT
X-Analytics: ns=-1;special=Recentchangeslinked;https=1;nocookies=1
Cache-Control: private, s-maxage=0, max-age=0, must-revalidate
Set-Cookie: GeoIP=RS:00:Belgrade:44.82:20.47:v4; Path=/; secure;


Uh, with curl -I I can also reproduce. Thanks everybody for chiming in!

MediaWiki's HTTP response codes for special pages are not always immediately intuitive or consistent, but this is working as designed. The page returns a 404 because no relevant title is specified; if you specify a target, the HTTP response is a 200.

Other special pages, for example Special:Contributions, behave differently: if no target is specified, the response is still a 200 rather than a 404.
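To make the distinction concrete, here is a minimal Python sketch of the behaviour described in this thread. The function and its parameters are purely illustrative and are not MediaWiki's actual API:

```python
# Illustrative model of the status codes described above; not MediaWiki code.
def http_status(page, target=None, has_results=True):
    if page == "RecentChangesLinked":
        # Working as designed: no target, or a target with no results,
        # yields a 404 even though an HTML page is still rendered.
        if not target or not has_results:
            return 404
        return 200
    if page == "Contributions":
        # Contributions returns a 200 even when no target is specified.
        return 200
    raise ValueError("unknown special page: " + page)
```

So the "bug" is simply the first branch: a missing (or empty-result) target maps to 404 by design.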

Yeah, it's probably this block of code:

		if ( !$this->runMainQueryHook( $tables, $select, $conds, $query_options, $join_conds,
			$opts )
		) {
			return false;
		}
Note how the page for contains no content. If you try with a target that has content, you should get a 200.


@kostajh: 204 No Content would arguably be a better response to return in that case, but in my opinion 204 No Content means returning an empty body rather than an HTML page with zero results.
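The 200-vs-204 distinction being argued here can be sketched with a hypothetical helper (assuming, as stated above, that a 204 carries no body at all):

```python
# Hypothetical sketch of the 200-vs-204 distinction; names are illustrative.
def respond(results_html):
    if results_html is None:
        # 204 No Content: the response body must be empty.
        return (204, "")
    # A rendered HTML page, even one showing zero results, carries a body,
    # so a body-bearing status such as 200 (or, as designed here, 404) fits.
    return (200, results_html)
```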

I can reproduce this on my test wiki too:

@Zoranzoki21: The behavior was already confirmed in T212702#4849075; more "I can reproduce this" comments do not help anybody at this stage. :)

SBisson added a subscriber: SBisson.

I think this is invalid as a bug. It is working as designed and has been for over 4 years. See T69182.

For anyone wanting to change this behaviour, I recommend going back to the original ticket and evaluating whether it is still an issue and whether it can be solved differently. It should also be investigated whether any bots or gadgets rely on the current behaviour, and what a transition would look like for them. This can be done by re-opening, renaming, and "re-describing" the current ticket, or by creating a new one.