I have no opinion. I don't think the existing 50MB limit was chosen for any specific reason.
Can't reproduce with Firefox 66 or Chromium 74.
After digging through the stack trace, rMW855b1794b6b7: Unstub $wgLang for PageContentLanguage hook looks likely to have caused this. It's #12 in the first stack trace and #10 in the second.
Brainstorming a bit, please give feedback.
Tue, May 21
Mon, May 20
Fri, May 17
The COALESCE() is indeed what's preventing it from using the index. In general, the planner can't know how the output of a function depends on its input columns, so it can't use an index on those columns. It probably could special-case COALESCE(), but it doesn't look like it does.
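As a hypothetical illustration (table and values invented; the real rev_timestamp is NOT NULL, so the COALESCE here is contrived):

```
// Wrapping the indexed column in COALESCE() hides it from the planner,
// so an index on rev_timestamp goes unused:
$res = $dbr->query(
	"SELECT rev_id FROM revision
	WHERE COALESCE(rev_timestamp, '19700101000000') >= '20190101000000'",
	__METHOD__
);

// Comparing the bare column is equivalent here (a NULL timestamp would
// coalesce to a value below the cutoff anyway) and can use the index:
$res = $dbr->query(
	"SELECT rev_id FROM revision WHERE rev_timestamp >= '20190101000000'",
	__METHOD__
);
```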
How long is "a while"?
The queries quoted in this task seem to be as optimized as they can be, and the needed indexes seem to already be in place. #2 is probably the answer: fetch a batch of revisions, then in PHP collect the revision IDs from the batch and do SELECT slot_revision_id,slot_content_id,slot_origin,slot_role_id,content_size,content_sha1,content_address,content_model FROM `slots` JOIN `content` ON ((slot_content_id = content_id)) WHERE slot_revision_id IN (...) to fetch the slot data all at once and merge that in somehow. Then probably collect the content addresses and use ExternalStore::batchFetchFromURLs() to fetch the actual contents for all those slots in a batch as well.
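A rough sketch of that batching approach (variable names invented; the query and ExternalStore::batchFetchFromURLs() are as described above):

```
// Collect the revision IDs from the current batch.
$revIds = [];
foreach ( $revBatch as $rev ) {
	$revIds[] = $rev->getId();
}

// Fetch the slot data for the whole batch in one query.
$res = $dbr->select(
	[ 'slots', 'content' ],
	[ 'slot_revision_id', 'slot_content_id', 'slot_origin', 'slot_role_id',
		'content_size', 'content_sha1', 'content_address', 'content_model' ],
	[ 'slot_revision_id' => $revIds ],
	__METHOD__,
	[],
	[ 'content' => [ 'JOIN', 'slot_content_id = content_id' ] ]
);

// Then fetch all the slot contents in one batch as well.
$addresses = [];
foreach ( $res as $row ) {
	$addresses[] = $row->content_address;
}
$blobs = ExternalStore::batchFetchFromURLs( $addresses );
```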
Wed, May 15
Tue, May 14
Here's the series of events going on there:
- The page is saved containing the bad link.
- The Parser's output includes the bad link in the ParserOutput metadata.
- The job for inserting the externallinks table rows for the page runs. For our bad link it gets 0 indexes from LinkFilter::makeIndexes(), so it makes no insertion.
- The 0 index entries come because wfParseUrl() returns false, which in turn is because PHP's built-in parse_url() returns false: it seems to interpret the ":0000" at the end as a port number and find that invalid (see the sketch after this list).
- The subsequent edit is made, not modifying the bad link.
- Again, the Parser's output includes the bad link in the ParserOutput metadata.
- ConfirmEdit checks the externallinks table to find the existing links in the page, to detect whether any new links have been added in the edit being made. Since there's no externallinks entry for this bad link, it thinks it's new and so triggers a captcha.
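Demonstrating the parse_url() failure from the fourth bullet (the URL here is invented, but has the same trailing ":0000"):

```
var_dump( parse_url( 'http://example.com:0000' ) );
// bool(false) — the ":0000" is taken as an invalid port number, so
// wfParseUrl() and then LinkFilter::makeIndexes() come up empty too.
```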
Mon, May 13
Wed, May 8
Err, yeah, I meant "action=undelete requests".
image-rendering is a CSS Images Module Level 3 property
I think it would likely be best to block this on T222409: Standardize declarative object construction in MediaWiki, at which point we'll likely find that this has already been resolved.
The long-term plan, as I understand it, is that we'll run maintenance scripts to rename pages (see gerrit:507596) and users (see rEWMA9122f6c) that are affected by the change, then reverse rOMWC713a20a0f2dd: Add Language::ucfirst overrides for php 7.2 to have HHVM use PHP 7.2's uppercasing table. Note we may have to follow the same process in the future whenever we upgrade to a new version of PHP, as it seems upstream is intending to do a better job of tracking new versions of Unicode.
Tue, May 7
Should be resolved.
- MultiWriteBagOStuff: similar, but without the read fallback
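For reference, a minimal configuration sketch of that mechanism, loosely following the usual $wgObjectCaches examples (the cache names and tiers here are invented):

```
$wgObjectCaches['example-multiwrite'] = [
	'class' => MultiWriteBagOStuff::class,
	'caches' => [
		// Writes go to every tier; reads come only from the first,
		// with no fallback to the later ones.
		[ 'factory' => 'ObjectCache::getInstance', 'args' => [ 'apc' ] ],
		[ 'factory' => 'ObjectCache::getInstance', 'args' => [ 'memcached-php' ] ],
	],
];
```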
Mon, May 6
Fri, May 3
HHVM raises a warning about it, though, so we'd have to wait until after we drop HHVM support before using it.
Collecting the various mechanisms into a more usable list:
I doubt the cases where anything else wants to use ObjectFactory would really be suited to just pulling "a service" out of the container, as such code could as well get the service directly from the DI container. And I think the kind of "factory_service" that would be usable by ObjectFactory use cases is very likely to be exactly the code that's trying to use ObjectFactory in the first place.
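To make the comparison concrete, here are the two shapes of spec side by side; the 'factory' key exists in ObjectFactory today, while 'factory_service' is the hypothetical addition under discussion and MyThingFactory is an invented name:

```
// What exists now: a plain callable factory in the spec.
$spec = [
	'factory' => [ MyThingFactory::class, 'newMyThing' ],
	'args' => [ 'extra-arg' ],
];
$obj = ObjectFactory::getObjectFromSpec( $spec );

// The hypothetical 'factory_service' variant: the factory itself would be
// pulled out of the DI container by service name. As argued above, code in
// a position to define that service could usually just use it directly.
$spec = [
	'factory_service' => 'MyThingFactory',
	'args' => [ 'extra-arg' ],
];
```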
I don't think this is likely to be an issue in the API itself, but instead in the deletion code used by the API (and the normal web deletion action as well). My best guess, though, is that it's going to get into the tricky details of MySQL/MariaDB gap locking, which I can't say I'm very familiar with.
Doing some archaeology,
Thu, May 2
It's caused by the fix for T221458: Special:Log on commons -- entire web request took longer than 60 seconds and timed out, which forces MariaDB to use a bad query plan for the query here. Nothing to do with T222036 or T222038. I have a patch ready to upload for review as soon as we make this non-private.
The new errors in T212284#5125669 seem to be related to the database connection having been closed or dropped (e.g. it's reusing a connection that was dropped due to a timeout). Since it's using ->doQuery() to try to change the database, it doesn't get the automatic reconnection logic that ->query() has.
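Roughly, the reconnection logic that ->query() adds and a bare ->doQuery() skips looks like this (an illustrative sketch, not MediaWiki's actual implementation; reconnect() stands in for the internal reconnect step):

```
function queryWithReconnect( $conn, $sql ) {
	$ret = $conn->doQuery( $sql );
	if ( $ret === false && $conn->wasConnectionLoss() ) {
		// ->query() notices the dropped connection, reopens it, and
		// retries once; a direct ->doQuery() just returns the failure.
		$conn->reconnect();
		$ret = $conn->doQuery( $sql );
	}
	return $ret;
}
```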
Wed, May 1
@jcrespo and I have been discussing this same sort of idea, in a "we should do that someday once all the revision schema changing is over" sort of way. We don't seem to have ever gotten around to filing a Phab task for it though. Thanks for filing one!
We should probably review the Phab task for dependency injection. The constructor or entry point should probably take an abstract object for Env; I think that's the point at which all our "dependencies are injected".
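A minimal sketch of that shape, with an invented class name:

```
class SomeStage {
	/** @var Env */
	private $env;

	public function __construct( Env $env ) {
		// The Env abstraction is the single seam through which all of the
		// component's dependencies arrive, so replacing it (e.g. with a
		// mock in tests) injects everything at once.
		$this->env = $env;
	}
}
```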
Tue, Apr 30
Perhaps we should take another step back and ask what use case RevisionStore being able to load from other wikis is actually supposed to support. The only places outside of tests I see ->getRevisionStore( $wiki ) being passed a value for $wiki are two calls in Wikibase (but I can't tell with the chained DI there whether $wiki is ever actually non-local) and the implementation of the deprecated Revision::getRevisionText() (for which I see no non-test callers passing a value for $wiki).
I didn't see this post earlier; I went straight to the latest one. :(
Mon, Apr 29
I have been looking at the design of the three tables querycache, querycachetwo (!), and querycache_info.
I don't really like the fact that we have part of the array statically defined and another part included inside the constructor; it makes it even harder for people to find where the definition of a module lives. What would you say to moving the closures to static factory methods of their classes, so the "wiring" for these modules could still be in the self::$Modules initializer?
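A rough sketch of that suggestion, assuming ObjectFactory-style specs (module and method names invented):

```
private static $Modules = [
	'example' => [
		'class' => ApiExample::class,
		// A static factory method on the module's own class, so the
		// construction logic lives with the module while the wiring
		// stays here in the initializer:
		'factory' => [ ApiExample::class, 'newFromGlobalState' ],
	],
];
```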
Fri, Apr 26
JSON formatting does mean that the output isn't intended to be conceptually sorted, although you have a point that with formatversion=2 we output an array instead.
I'm skeptical that access to the history of pages is really something we want to provide for use during the parsing of the page. I'm not going to decline this just yet in case you want to make a counterargument, but in my opinion we should probably decline this request.
As far as MediaWiki-API is concerned, this is blocked on someone moving this "global ID" concept into core and probably having wfWikiID() itself return that ID. If that were done, the API would follow naturally.
From the MediaWiki perspective, this is a duplicate of T220999: Slow query "ApiQueryLogEvents::execute" after actor rollout, the fix for which should roll out with the train next week unless someone backports it on Monday.
Thu, Apr 25
The trick would be in making sure the resulting SQL queries still can use indexes efficiently.
I don't see anything in the API that seems like it would result in deadlocks here, so I'm going to move it to "non-core-API stuff" for now and add MediaWiki-Special-pages to investigate it from that angle. The API module is pretty simple and just calls MovePage to do most of the work.