
Double redirects should work
Closed, Declined · Public

Description

Author: pete

Description:
At the moment double redirects "don't work", i.e. if A redirects to B and B redirects to C and you request A, you do not get page C.

However, to keep Wikipedia (and other wikis) logical and well-structured, they should. There are absolutely tons of examples where A and B are synonyms (so one should redirect to the other) and are a subtopic of C.

Currently we have to write A redir C and B redir C. But now someone comes along and decides to expand the subtopic, so they undo the B redirect and start the new article, but A still redirs to C. If we allowed double redirects, then having A redir to B and B redir to C would future-proof against such expansion.

This may seem a trivial application, but I have personally seen dozens of occasions where bad redirects have arisen because of this, and I think it would be really useful.

On the flip side, there is the extra processing required on an ongoing basis (no idea if this is "critical path" in terms of performance), and of course cycle detection would have to be improved/implemented :/
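
A minimal sketch of what such cycle detection could look like, assuming a hypothetical lookupRedirectTarget() helper that returns a page's redirect target or null (illustrative only, not actual MediaWiki code):

function resolveRedirectChain( $title ) {
    // Remember every title already visited so a loop such as A -> B -> A is caught.
    $seen = array( $title => true );
    while ( ( $target = lookupRedirectTarget( $title ) ) !== null ) {
        if ( isset( $seen[$target] ) ) {
            break; // cycle detected: stop and display the current page
        }
        $seen[$target] = true;
        $title = $target;
    }
    return $title;
}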

thanks for listening,


Version: unspecified
Severity: enhancement

Details

Reference
bz5503

Event Timeline

bzimport raised the priority of this task to Lowest. Nov 21 2014, 9:10 PM
bzimport set Reference to bz5503.
bzimport added a subscriber: Unknown Object (MLST).

pete wrote:

Felt I should give at least one concrete example:
In en.wikipedia both [[Rest Mass]] and [[Rest mass]] currently point to [[Mass in special relativity]]. But if [[Rest mass]] ever gets spun out, we would want [[Rest Mass]] to point to [[Rest mass]].

gangleri wrote:

This seems to be a duplicate of
Bug 3747: Avoiding double redirects

pete wrote:

No. This is not a duplicate of that request. This is a request that double
redirects should work. That is some vague request about making it impossible to
create double redirects. That solution would actually make things worse in my
view, because "abnormal" redirects would be created "behind the backs" of users.

robchur wrote:

This has been proposed before and WONTFIXed.

ayg wrote:

*** Bug 10499 has been marked as a duplicate of this bug. ***

falkeli wrote:

Another reason for double redirects to work: page B has many redirects, and B is then moved to C. As a result, it takes a while until all the redirects are fixed. With software called AWB, 10 a minute seems like a maximum. This means that if there are 30 redirects, then for 3 minutes not all redirects work properly.

pete wrote:

It is a shame we never got a reason /why/ it is closed as WONTFIX, beyond it "has been proposed before"... If there is a technical reason why it is hard to implement well, then that is one thing; but if the rationale is that the current solution of bots "fixing" double redirects is a good one, then I wonder if the developers would reflect on my example above and realize it is not a good solution, hence the repeated requests for a better one.

ayg wrote:

WONTFIX is not used (or not supposed to be used) just because something is hard to do. In that case it's left open and unfixed. I have never seen a satisfactory explanation for why we can't allow two or three consecutive redirects either, and would also like to hear why the current "solution" (bots moving them) is better than any fix at all in the software. Brion, could you elaborate?

The longer you allow the chain to get, the harder it will be to figure out how to clean it up when you need to. Hence, the requirement for immediate cleanup.

pete wrote:

I'm sorry, I must not be expressing my point clearly enough, because it seems to have been misunderstood again. Let me try one more time:

Sometimes there is no "clean up" to do when a double redirect arises. I.e. sometimes having a double redirect is the correct, most logical layout for a particular set of overlapping topics.
E.g. A and B are synonymous, say [[Subset Topic]] and [[Subset topic]], but currently redirect to part of a more encompassing topic, e.g. [[Supratopic]]. Then all the bots and cleanup will force [[Subset topic]] and [[Subset Topic]] both to redirect to [[Supratopic]]. But it is much more logical for [[Subset Topic]] to redirect to [[Subset topic]], and indeed this is robust in the face of a new page on [[Subset topic]] being started.

Does that example make sense?


In any case I very much doubt that lengthy chains would arise - are there any natural examples where a quadruple or even treble redirect makes sense? Seems rather theoretical...


My take is that this would be a very minor improvement to the software, but perhaps a big deal to implement, in which case I would understand the developers saying it's not worth the bother. However, that is different from the developers misunderstanding the rationale...

And then you change one of them and you have inconsistency. Hence, cleanup work.

falkeli wrote:

And what about a situation where a page which already has many redirects needs to be moved? Allowing a second redirect level (not a third, just a second) means that the cleanup work doesn't need to be done between the page move and the first time someone wants to see the page.

robchur wrote:

Please note that bug warring (reopening/reclosing en masse) is unproductive and disruptive.

It's clear that there's strong feeling here from some parties that the issue needs further discussion, so I propose that this move to a mailing list or some other discussion forum, perhaps on MediaWiki.org.

It should be obvious to all that digging in and stubbornly messing with this bug is not the answer by any means. I will consider briefly locking the Bugzilla accounts of anyone who continues with this behaviour, though I would prefer to think that project contributors were capable of discussing issues and reaching an acceptable understanding in a more mature fashion.

pete wrote:

Developers tend to be busy and thus terse...

With the redirect table, imho, we have a very simple solution satisfying both standpoints:
A: forcing immediate "cleanup to chain length 1";
B: allowing a multi-hop logical redirect chain of at most N hops.

Introduce a variable:

$wgMaxRedirects = 1;

in DefaultSettings.php, which is counted down to zero as redirects are honored; display proceeds immediately when either a non-redirect is found or the count is exhausted.

As a side-effect $wgMaxRedirects = 0; can "disallow" redirects completely for a wiki.
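
A minimal sketch of how such a counted-down limit could behave, again using the hypothetical lookupRedirectTarget() helper (function names are illustrative assumptions, not actual MediaWiki code):

function resolveWithLimit( $title, $maxRedirects ) {
    // $maxRedirects is counted down toward zero as each redirect is honored.
    $hopsLeft = $maxRedirects;
    while ( $hopsLeft > 0 && ( $target = lookupRedirectTarget( $title ) ) !== null ) {
        $title = $target;
        $hopsLeft--;
    }
    // Display $title immediately: either it is not a redirect, or the
    // configured limit is exhausted. With $wgMaxRedirects = 0 no redirect
    // is followed at all.
    return $title;
}

At page view time this would be called as resolveWithLimit( $title, $wgMaxRedirects ); with the default of 1, the behaviour matches today's single-hop redirects.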

Since I do not want to reopen this bug by bending its purpose to something too different, I have made the proposal a new one.
See bug # 11644