The new patches needing review are:
Wed, Nov 22
If you already have to know someone's email to use it
I'm a bit confused here. Is this task about adding a count for the action API, turning an existing restbase-only count into a restbase+action API count, or adding a restbase+action API count in addition to an existing restbase-only count?
Tue, Nov 21
What kind of abuse is this preventing?
Yes, splitting to the final destination would be minimal churn in git, versus splitting to one filename and then renaming to a different name shortly after.
Mon, Nov 20
One thing to try would be to restart redis and nutcracker on the relevant Beta Cluster hosts and see if the problem goes away. I don't know if there's a less manual way to do it than sshing everywhere.
Further separation of redirect tags into un-redirecting, new redirect, changed redirect target, self redirect, redirect to non-existent page, etc. (Probably overkill.)
I'll let someone else make the final decision, but I'm pretty sure the answer is going to be that your skin there needs to be fixed somehow to not have web-loaded resources in the vendor directory.
If this is done, it should be done everywhere relevant and not just in globalallusers. The only place in core that I know of is ApiQueryAllUsers.
Sun, Nov 19
Fri, Nov 17
All I'm seeing in P6346 is the "clicked login, auto logged in" step. But, in the middle of that step, it looks like you might have hit the bug again. I don't see anything wrong with the cookies your browser is sending there. Unfortunately there's not too much logging (e.g. the session channel seems to be completely missing), but what I do see is consistent with redis being screwy as shown in T173646.
If you can reproduce it, please capture the relevant HTTP requests as mentioned at https://www.mediawiki.org/wiki/Manual:How_to_debug/Login_problems.
These patches block TemplateStyles deployment only if the decision in T155813 not to block that deployment on deduplication is being reversed.
This has nothing to do with the css-sanitizer library, nor with TemplateStyles.
Thu, Nov 16
Wed, Nov 15
Tue, Nov 14
Thanks. That's what we're already doing, so I'm going to decline this request to change it.
Except a cookie can't actually implement a global preference on Wikimedia sites since the sites are spread across multiple second-level domains. Look how much trouble CentralAuth has to go through to make login work.
Mon, Nov 13
More specifically, 32 characters from a base-32 character set, generated by PasswordFactory::generateRandomPasswordString(), so ideally 160 bits of randomness.
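For reference, a minimal sketch of generating such a token (the method is the one named above; the call site itself is illustrative):

```php
// 32 characters from a base-32 alphabet: 32 × log2(32) = 32 × 5 = 160 bits.
// PasswordFactory::generateRandomPasswordString() is the MediaWiki method
// named above; this surrounding usage is illustrative only.
$token = PasswordFactory::generateRandomPasswordString( 32 );
```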
You lost me in there, @jcrespo ;) I'll tell you what's being run and hopefully you can tell us what the indexes should look like.
forceHTTPS isn't stored for each subdomain separately, at least not when you're logged in with an SUL account, which everyone should be by now. If we want to get rid of the cookie entirely on HTTPS-only wikis, the check for "is this an HTTPS-only wiki?" should probably be added to MediaWiki\Session\CookieSessionProvider::setForceHTTPSCookie() and CentralAuthSessionProvider::setForceHTTPSCookie(), and either just return or set $set to false. For the CentralAuth case, ideally the test would be "are all wikis in this SUL-grouping HTTPS-only?", but that might be hard to determine unless we just add a flag that says so explicitly.
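A minimal sketch of that guard, assuming a hypothetical isHTTPSOnlyWiki() helper standing in for whatever configuration test actually answers "is this an HTTPS-only wiki?" (the signature is approximated from CookieSessionProvider):

```php
protected function setForceHTTPSCookie( $set, SessionBackend $backend = null, WebRequest $request ) {
	// Hypothetical guard: on an HTTPS-only wiki the cookie is pointless,
	// so skip setting it entirely. isHTTPSOnlyWiki() is a placeholder.
	if ( $this->isHTTPSOnlyWiki() ) {
		return;
	}
	// ... existing cookie-setting logic continues here ...
}
```

For the CentralAuth variant, the guard would instead ask whether every wiki in the SUL grouping is HTTPS-only, per the caveat above.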
You should compare the API's results to the equivalent Special:WhatLinksHere invocation that lists only links, not links plus transclusions plus redirects.
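For example (page title is a placeholder; parameters are the standard ones as far as I know): Special:WhatLinksHere/Example?hidetrans=1&hideredirs=1 should correspond to api.php?action=query&list=backlinks&bltitle=Example&blfilterredir=nonredirects, since the default WhatLinksHere view mixes links, transclusions, and redirects together.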
This is nothing specific to ApiSandbox; oojs-ui (or MediaWiki's integration of it) should load whatever messages it needs internally when the 'oojs-ui' module is required.
Fri, Nov 10
While MediaWiki shouldn't ever insert rows that violate the "one comment_id per revision" constraint, it's helpful to check it at the database level too.
Thu, Nov 9
You guessed correctly. Technically we could probably just make those unique indexes be the primary keys, but according to T153333#3283613 the PK should cover both columns for associative tables like these (presumably so queries can be satisfied "Using index").
Status "Blocked" because it's probably far too complex to try to do this on top of the complexity in https://gerrit.wikimedia.org/r/#/c/380669/ for the actor table schema migration.
It looks like T37349 was closed by the proposer after their patch ran into problems. As I mentioned, it's doable (although we might have to reduce the maximum uclimit and/or maximum number of ucusers when it's being done) for the non-prefix case, but the necessary logic to actually do it is complex. It'd probably be best to start fresh in a new task if we want to pursue it.
TL;DR: The ordering is entirely consistent; it's just not what you expected. In this case, the results are first ordered by user ID and then by timestamp. Thus, I'm closing this as Invalid. See below for a different task you might file if you want to propose changing the ordering to match what you expected.
We would need to know that User A edited the page (at some point in time), even though we are only requesting revisions in 2017.
The request for getting this information is something like this:
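For illustration, a query of roughly that shape (placeholder title and date bounds, not the actual request):

https://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=ExamplePage&rvprop=user|timestamp&rvstart=2017-12-31T23:59:59Z&rvend=2017-01-01T00:00:00Z&rvlimit=max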
The proximate cause here seems to be rMW1cc3a57296ff: Send a cookie with autoblocks to prevent vandalism., which added an attempt to check the blocked status in User::loadFromSession().
No, we're not going to rewrite the entire API in some other random framework just so you don't have to read documentation.
Tue, Nov 7
E006 means that the key you supplied isn't found in the database. It's possible it's the redirect as described in T74186#750517. Assuming things haven't changed since then, apparently you might need to use /wiki/Special:OAuth/authorize for the authorization step while using /w/index.php?title=Special:OAuth for any signed request.
What happens then if the attacker manages to create /tmp/mediawiki ahead of time?
I note that edit summaries will soon be allowed to be significantly longer: 1000 Unicode characters rather than 255 bytes. See T6714: Epic: Increasing the length of the edit summary for the overarching task and T166733: Deploy refactored comment storage to watch the progress.
Mon, Nov 6
It should be easy enough for someone to add to ApiQuerySiteinfo::appendGeneralInfo() following the example of the other stuff there.
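For illustration, the entries in that method follow a simple pattern; 'examplesetting' and 'ExampleSetting' below are placeholders, not real names:

```php
// Inside ApiQuerySiteinfo::appendGeneralInfo(), alongside the existing
// $data entries. The result key and config variable are hypothetical.
$config = $this->getConfig();
$data['examplesetting'] = $config->get( 'ExampleSetting' );
```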
Sun, Nov 5
Note that we specifically want certain things to remain present after logout, including the user name cookie used to prefill the field on a subsequent login, the cookie for the "cookie block" feature I've heard some talk about, and the new anonymous session cookie (if any). There's also T142542 that wants to return to setting a LoggedOut cookie.
Fri, Nov 3
This task is not the place to ask such questions. Your best bet would be to contact hoo on a wiki, on IRC, or via email. Even better would be to take advantage of a public mailing list, wiki help page, or Technical Advice IRC Meeting to ask your questions.
Use pcall() or xpcall() if you need to catch errors.
You should probably look into why you're having to check the existence of hundreds of images on a single page. That seems excessive.
Thu, Nov 2
I wonder how I missed that when searching for an existing bug.
Wed, Nov 1
Well, it's deprecated now, so let's mark this as Resolved.
Tue, Oct 31
Those look like decent explanations.
Nothing obvious, no. To be properly indexable at the database level, such a scheme would require denormalization along the lines of what's being done in the temporary table at https://gerrit.wikimedia.org/r/#/c/380669/6/maintenance/tables.sql@459. And then pretty much every consumer of the database would need to be updated to handle the possibility of having more than one row per revision, both on the back end and in the UI, which would likely be a project as large as the current Multi-Content Revisions project.
Mon, Oct 30
Due to T179156, 1.31.0-wmf.5 wasn't deployed to group 2 wikis (including urwiki) as scheduled. The problem does seem to be resolved in that version, as you can see by visiting https://ur.wiktionary.org/wiki/Special:ApiFeatureUsage.
See T172165#3699606 for some relevant criticism of this decision.
Sat, Oct 28
Fri, Oct 27
Oct 25 2017
There doesn't seem to be any problem in the API here, or in category handling. You're getting no popup because the API response contains only nonsense titles in namespace 0, which the widget ignores because it expects namespace 14; the API returns those because that's what the underlying search engine code is giving it.
Oct 24 2017
Oct 23 2017
Ok, I've added it to my schedule.
The fix should be deployed to WMF wikis with 1.31.0-wmf.5, see https://www.mediawiki.org/wiki/MediaWiki_1.31/Roadmap for the schedule.
Oct 22 2017
Works fine when I try it. You probably did something wrong when compiling and/or installing the luasandbox extension.
Oct 20 2017
Edge cases that come to mind:
Oct 19 2017
On import, if the user name attached to the revision exists then rev_user is set to that user's user_id, whether it's really the "same" user or not. If the user name attached to the revision doesn't exist, the revision is created with the name but with rev_user set to 0. It's probably something of a toss-up whether other things consider it the same user or not (depending on whether those things look at rev_user or rev_user_text).
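In rough pseudocode (the real import code is more involved; User::newFromName() and getId() are real MediaWiki calls, the rest is a paraphrase of the above):

```php
// Hedged sketch of revision attribution on import.
$user = User::newFromName( $importedUserText );
if ( $user && $user->getId() !== 0 ) {
	// A local account with that name exists: attach the revision to it,
	// whether or not it's really the "same" user.
	$revUser = $user->getId();
} else {
	// No local account: keep the name, leave rev_user at 0.
	$revUser = 0;
}
$revUserText = $importedUserText;
```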
Since T59346: Incorrectly attributed edits (from 2006) to me and T15798: Imported edits can be incorrectly attributed to whoever creates that account were closed as duplicates of T9240, let's follow suit here.