
Fatal DBUnexpectedError: "Database servers in extension1 are overloaded"
Closed, Resolved · Public · PRODUCTION ERROR

Assigned To
Authored By
LD
Mar 30 2025, 10:40 PM
Referenced Files
F59667794: image.png
May 4 2025, 2:15 PM
F59667473: image.png
May 4 2025, 1:50 PM
F34563878: dddddd.PNG
Apr 30 2025, 4:01 PM
F59516766: grafik.png
Apr 28 2025, 6:25 PM
F59515304: grafik.png
Apr 28 2025, 5:01 PM
F59510345: grafik.png
Apr 28 2025, 12:53 PM
F59510336: grafik.png
Apr 28 2025, 12:53 PM
F59289300: grafik.png
Apr 18 2025, 12:32 PM

Description

on French Wikipedia:

MediaWiki internal error. Original exception: [bffe80d8-4f4e-48cd-bfba-a8038b22162a] 2025-03-30 22:34:58: Fatal exception of type "Wikimedia\Rdbms\DBUnexpectedError"
Logstash record
[bffe80d8-4f4e-48cd-bfba-a8038b22162a]

Wikimedia\Rdbms\DBUnexpectedError: Database servers in extension1 are overloaded. In order to protect application servers, the circuit breaking to databases of this section have been activated. Please try again a few seconds.

Related Objects

Event Timeline


First, to repeat: the errors are a symptom, not the cause. The DBs are getting overloaded, so MediaWiki actively starts fataling a certain portion of requests (in these cases around ~10%) to keep the rest of the infrastructure from going down. It is actually working as intended.
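The load-shedding behaviour described above can be sketched roughly like this. This is a minimal illustration of the idea only, not WMF's actual implementation; the connection threshold and reject ratio are made-up parameters, with the ~10% figure borrowed from the comment above:

```python
import random

class CircuitBreaker:
    """Minimal sketch of probabilistic load shedding: once a health signal
    (here, the db connection count) exceeds a threshold, reject a fixed
    fraction of incoming requests instead of letting everything fail."""

    def __init__(self, max_connections=300, reject_ratio=0.10):
        self.max_connections = max_connections  # illustrative threshold
        self.reject_ratio = reject_ratio        # shed ~10% when overloaded

    def allow(self, current_connections, rand=random.random):
        # Below the threshold every request goes through.
        if current_connections <= self.max_connections:
            return True
        # Above the threshold, shed roughly reject_ratio of traffic.
        return rand() >= self.reject_ratio

breaker = CircuitBreaker()
print(breaker.allow(50))                        # healthy: True
print(breaker.allow(500, rand=lambda: 0.05))    # overloaded, draw below ratio: False
```

The point of shedding only a fraction is that most users still get served while the database recovers, instead of the appservers all blocking on it.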

So what's causing the overload? Here is some of the debugging I've done. From what I'm seeing, reads on s3 jump to roughly 15 times the average during the spike:

grafik.png (826×1 px, 104 KB)

Writes are also spiky:

grafik.png (826×1 px, 159 KB)

but the writes might be a red herring; I went through the binlogs.

We clearly can't take 15 times the average load. I haven't figured out what's causing this spike: it's intermittent, and once it happens everything becomes slow, so everything starts logging and I get drowned in noise. I will debug a bit more to see where this is coming from.

I have done some improvements (T391153) that might have helped with this but can't say for sure. Let me wait and see if it gets triggered again.

So the change we deployed wasn't the reason. I've looked at today's spike, which is the usual pattern we are seeing: connections to databases suddenly start to pile up everywhere. There is no corresponding traffic-pattern change, nothing in the slow logs, nothing in the traces, etc.

In this specific case, while connections in almost all sections go up, rows read go down, which is natural. Except on s6, where rows read go up a lot. And among all replicas, there is only one with a massive rows-read spike:

grafik.png (422×1 px, 80 KB)

But I'm not seeing any slow query logged for db1168 during that time (or for a while before and after).

Change #1135454 had a related patch set uploaded (by Ladsgroup; author: Amir Sarabadani):

[operations/mediawiki-config@master] Increase max db connection count before circuit breaking

https://gerrit.wikimedia.org/r/1135454

Change #1135454 merged by jenkins-bot:

[operations/mediawiki-config@master] Increase max db connection count before circuit breaking

https://gerrit.wikimedia.org/r/1135454

Mentioned in SAL (#wikimedia-operations) [2025-04-09T17:04:36Z] <ladsgroup@deploy1003> Started scap sync-world: Backport for [[gerrit:1135454|Increase max db connection count before circuit breaking (T390510)]]

Mentioned in SAL (#wikimedia-operations) [2025-04-09T17:11:26Z] <ladsgroup@deploy1003> ladsgroup: Backport for [[gerrit:1135454|Increase max db connection count before circuit breaking (T390510)]] synced to the testservers (https://wikitech.wikimedia.org/wiki/Mwdebug)

Mentioned in SAL (#wikimedia-operations) [2025-04-09T17:21:24Z] <ladsgroup@deploy1003> Finished scap sync-world: Backport for [[gerrit:1135454|Increase max db connection count before circuit breaking (T390510)]] (duration: 16m 47s)

We made another change: we set the vslow group on s6, which was missing. That could explain the spikes on the normal replicas. I'll continue monitoring to see whether that fully resolves the issue.

This still seems unresolved, unless it's related to something else.

Logstash record
[97675ccf-d5d6-495e-92e7-858d0f311311]

Thanks. Investigating this one:

  • The ratio of user-facing errors has dropped from 12% to 4%. That's good progress.
  • It's again a db connection pile-up.
  • It's again one s6 replica getting a lot of read_next:

grafik.png (559×497 px, 45 KB)

The good news is that it's exactly the vslow replica we set up. So we have a much higher chance of isolating the problem, which is very likely an overly aggressive maintenance script. @Marostegui is it okay if we enable the slow log for db1180 and/or reduce its general weight (from 50 to something lower)?

I can enable it and reduce it to maybe 1?

Mentioned in SAL (#wikimedia-operations) [2025-04-11T13:25:18Z] <marostegui@cumin1002> dbctl commit (dc=all): 'Change weight for db1180 T390510', diff saved to https://phabricator.wikimedia.org/P74901 and previous config saved to /var/cache/conftool/dbconfig/20250411-132518-marostegui.json

Changed the weight to 1 and enabled the slow query log.
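For context, this works because the load balancer picks a read replica with probability roughly proportional to its configured weight, so a weight of 1 next to siblings in the hundreds makes the host nearly idle for normal traffic while the vslow/maintenance queries keep hitting it. A toy sketch of weighted selection; the hostnames and weights below are illustrative, not the real s6 configuration:

```python
import random

def pick_replica(weights, rand=random.random):
    """Pick a host with probability proportional to its weight."""
    total = sum(weights.values())
    threshold = rand() * total
    for host, weight in weights.items():
        threshold -= weight
        if threshold < 0:
            return host
    return host  # guard against floating-point edge cases

# Illustrative pool: db1180 now receives roughly 1/401 of read traffic.
weights = {"db1168": 200, "db1180": 1, "db1187": 200}
print(pick_replica(weights, rand=lambda: 0.0))  # first host wins the low draw
```

Dropping the weight rather than depooling entirely keeps the replica replicating and reachable, so the slow log still catches whatever is hammering it.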

FWIW, this is the only maintenance script that matches time-wise and runs every three hours:

Fri 2025-04-11 15:15:00 UTC  1h 49min left         Fri 2025-04-11 12:15:00 UTC  1h 10min ago       mediawiki_job_growthexperiments-updateMenteeData-s6.timer                              mediawiki_job_growthexper

https://grafana.wikimedia.org/d/000000273/mysql?orgId=1&var-job=All&var-server=db1180&var-port=9104&viewPanel=3&from=1744294159813&to=1744378323933

grafik.png (826×1 px, 104 KB)

Change #1135983 had a related patch set uploaded (by Ladsgroup; author: Amir Sarabadani):

[operations/puppet@production] mediawiki: Absent updatementeedata jobs

https://gerrit.wikimedia.org/r/1135983

Change #1135983 merged by Ladsgroup:

[operations/puppet@production] mediawiki: Absent updatementeedata jobs

https://gerrit.wikimedia.org/r/1135983

The slow query log shows that the majority of them come from maintenance scripts (wikiadmin user) (the wikiuser ones get killed after 60 seconds by the query killer - but wikiadmin ones do not).

That's somewhat expected, since this is a vslow replica, but as long as it's not scanning 700 million rows, it should be fine.

Change #1136970 had a related patch set uploaded (by Michael Große; author: Michael Große):

[operations/puppet@production] Revert "mediawiki: Absent updatementeedata jobs"

https://gerrit.wikimedia.org/r/1136970

Change #1136970 merged by Ladsgroup:

[operations/puppet@production] Revert "mediawiki: Absent updatementeedata jobs"

https://gerrit.wikimedia.org/r/1136970

Ladsgroup claimed this task.

I'm closing this since it has not happened since we disabled that script.

@Ladsgroup: could you examine these ones, just in case the task has to be reopened?

Logstash record
[08508217-6076-446a-b8c8-38a92dbafb44]
Logstash record
[1ec19a9c-6f4e-459d-8eaa-35e7d5d6ae30]
Logstash record
[ade4b03f-48b7-4ccd-934b-36920477efaf]

Those took place around 2025-04-18 10:14:00.

It's similar but different: the databases are getting overloaded again, twice so far, but it seems to be from a different origin. This time it's s1, and you can see the bumps in, for example, db1218:
https://grafana.wikimedia.org/d/000000273/mysql?orgId=1&from=1744959329293&to=1744979431607&var-job=All&var-server=db1218&var-port=9104&viewPanel=3

grafik.png (831×1 px, 140 KB)

Someone needs to investigate what's causing this one. It doesn't look like the GrowthExperiments job, as those spikes were at 3h intervals.

Mentioned in SAL (#wikimedia-operations) [2025-04-18T14:40:14Z] <_joe_> enabled slow query log on db1218, investigating T390510

After last night, we found another issue: T392784: CampaignEvents makes an uncached x1 DB query on pageviews. I also think there is another issue with x1: the read_rnd_next rate on x1 replicas is unusually high:
https://grafana.wikimedia.org/d/000000273/mysql?orgId=1&var-job=All&var-server=db1179&var-port=9104&viewPanel=3&from=1745834330067&to=1745844675963

grafik.png (941×1 px, 239 KB)

In comparison, a normal healthy replica shouldn't have that many read_rnd_next:

grafik.png (941×1 px, 342 KB)

Looking at the performance schema, a lot of these show up:

DIGEST_TEXT: DELETE FROM `cx_suggestions` WHERE `cxs_title` IN (...) AND `cxs_source_language` = ?
SUM_TIMER_WAIT: 2389741235102000
MIN_TIMER_WAIT: 201921194000
AVG_TIMER_WAIT: 241388003000
MAX_TIMER_WAIT: 433279866000
SUM_LOCK_TIME: 906605000000
DIGEST_TEXT: DELETE FROM `growthexperiments_mentee_data` WHERE `mentee_id` IN (...) 
COUNT_STAR: 39056
SUM_TIMER_WAIT: 3357347453632000
MIN_TIMER_WAIT: 60639000
AVG_TIMER_WAIT: 85962398000
MAX_TIMER_WAIT: 328616744000
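As an aside, the TIMER_WAIT columns in performance_schema are reported in picoseconds, which makes the raw numbers above hard to eyeball. A quick sanity check on the growthexperiments_mentee_data digest quoted above (figures copied from it) shows the scale of the problem and that SUM/COUNT agrees with the reported average:

```python
# performance_schema timer columns are in picoseconds (1e12 ps = 1 s).
PS_PER_SECOND = 10**12

# Figures copied from the growthexperiments_mentee_data digest above.
count_star = 39056
sum_timer_wait_ps = 3357347453632000
avg_timer_wait_ps = 85962398000

total_seconds = sum_timer_wait_ps / PS_PER_SECOND       # ~3357 s spent in this query shape
avg_ms = sum_timer_wait_ps / count_star / 10**9         # ~86 ms per statement

print(f"total time in this query shape: {total_seconds:.0f} s")
print(f"average per statement: {avg_ms:.1f} ms")

# SUM / COUNT should agree with the reported AVG_TIMER_WAIT.
assert abs(sum_timer_wait_ps / count_star - avg_timer_wait_ps) / avg_timer_wait_ps < 0.001
```

So this one digest alone accounts for close to an hour of cumulative database time.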

Created this index live on wikishared:

CREATE INDEX cxs_source_language_title ON cx_suggestions (cxs_source_language, cxs_title);

Mentioned in SAL (#wikimedia-operations) [2025-04-28T14:57:31Z] <Amir1> CREATE INDEX cxs_source_language_title ON cx_suggestions (cxs_source_language, cxs_title); on wikishared (T390510)

And another missing index:

cumin2024@db1179.eqiad.wmnet[wikishared]> explain SELECT /* ContentTranslation\\Store\\TranslationStore::getAllTranslationsByUserId  */  *  FROM `cx_translations`    WHERE translation_started_by = 75158556  ORDER BY translation_last_updated_timestamp DESC LIMIT 1;
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
| id   | select_type | table           | type | possible_keys | key  | key_len | ref  | rows    | Extra                       |
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
|    1 | SIMPLE      | cx_translations | ALL  | NULL          | NULL | NULL    | NULL | 3141607 | Using where; Using filesort |
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
1 row in set (0.001 sec)

Mentioned in SAL (#wikimedia-operations) [2025-04-28T15:11:33Z] <Amir1> CREATE INDEX translation_started_by_last_updated_timestamp ON cx_translations (translation_started_by, translation_last_updated_timestamp); (T390510)

Much better:

cumin2024@db1237.eqiad.wmnet[wikishared]> explain SELECT /* ContentTranslation\\Store\\TranslationStore::getAllTranslationsByUserId  */  *  FROM `cx_translations`    WHERE translation_started_by = 75158556  ORDER BY translation_last_updated_timestamp DESC LIMIT 1;
+------+-------------+-----------------+------+-----------------------------------------------+-----------------------------------------------+---------+-------+------+-------------+
| id   | select_type | table           | type | possible_keys                                 | key                                           | key_len | ref   | rows | Extra       |
+------+-------------+-----------------+------+-----------------------------------------------+-----------------------------------------------+---------+-------+------+-------------+
|    1 | SIMPLE      | cx_translations | ref  | translation_started_by_last_updated_timestamp | translation_started_by_last_updated_timestamp | 5       | const | 7    | Using where |
+------+-------------+-----------------+------+-----------------------------------------------+-----------------------------------------------+---------+-------+------+-------------+
1 row in set (0.001 sec)

x1 replicas are much healthier after these:

grafik.png (915×1 px, 207 KB)

But it looks like there are still one or two more missing indexes.

Mentioned in SAL (#wikimedia-operations) [2025-04-28T18:08:06Z] <Amir1> CREATE INDEX translation_last_update_by_last_updated_timestamp ON cx_translations (translation_last_update_by, translation_last_updated_timestamp); (T392839 and T390510)

Mentioned in SAL (#wikimedia-operations) [2025-04-28T18:11:56Z] <Amir1> CREATE INDEX cxl_owner ON cx_lists (cxl_owner); (T392839 and T390510)

Finally, it has gone down to zero-ish:

grafik.png (915×1 px, 234 KB)

@Ladsgroup: Thanks for keeping an eye on it. It might be helpful for the communities to receive some feedback on what's happening. Should I create a subtask?

From what I understand, some spikes were linked to misconfigured tools, while others could be related to AI scraping, even if that might only be an indirect effect, similar to a butterfly effect.

Sure. Which venue do you prefer for communicating it? I think I should write a blog post. Let me explain why: database overload has always been one of the biggest sources of our outages (if not the biggest); I've been seeing db overload issues for at least a decade now. What used to happen was that a database overload caused the appservers to overload (all waiting for the overloaded database to respond), which led to general slowdowns and outages such as T287983: Raw "upstream connect error or disconnect/reset before headers. reset reason: overflow" error message shown to users during outage. This famous message, which you might have seen before:

dddddd.PNG (102×844 px, 3 KB)

The difference now is that we have implemented database circuit breaking (T360930: Section-wide circuit breaking), so before the appservers get overloaded and envoy's circuit breaker kicks in (causing the old error page), the database layer actively kills a subset of requests to protect the rest. This also makes the outages much shorter; if you remember previous ones, they tended to take longer to recover. If you want to know more about circuit breaking, this is a decent overview: https://learn.microsoft.com/en-us/azure/architecture/patterns/circuit-breaker

I admit the error page is not very user-friendly, and that's on me, but: 1) it's not super straightforward to fix, since you don't have access to the database to parse messages and make them pretty, and MediaWiki has already flushed the 200 header so envoy's error page is not shown (I need to double-check, but last time I tried, that was the reason); 2) it was low priority while circuit breaking wasn't happening (we didn't have db-related outages), but now that we do, I will do my best to work on it given time constraints.

Then there is the discussion of what caused those outages. So far, I have identified multiple root causes for the different overloads (documented in detail as comments and subtickets here). They fed each other and made things worse, but it kept happening fairly often until T391695: UncachedMenteeOverviewDataProvider query is extremely aggressive causing partial outages got resolved, and then I think we finally fixed the last of them. To my knowledge, in these specific cases we didn't have any outages caused by AI scrapers (but we did have other outages caused by scrapers in this time period, specifically outages on images).

My point here is this: we will see more error pages like this from time to time. This is how db overload issues manifest now (whereas they used to manifest as much larger-scale outages). Each case requires its own ticket and its own investigation.

This seems to be happening again:
While browsing https://commons.wikimedia.org/wiki/Category:Deletion_requests_-_No_timestamp_given
MediaWiki internal error.

Original exception: [83831733-e17e-4d4c-ad50-c8ec6633ac1d] 2025-05-04 09:52:43: Fatal exception of type "Wikimedia\Rdbms\DBUnexpectedError"

Exception caught inside exception handler.

Set $wgShowExceptionDetails = true; at the bottom of LocalSettings.php to show detailed debugging information.

This seems to be happening again:

I can see spikes for x1 replicas (db1179, db1220, db1224), for example:

image.png (1×3 px, 374 KB)

Checking logstash, there aren't many MW logs for DB events, but I found the following problematic queries:

wikiadmin2023@10.64.16.208(wikishared)> explain SELECT  translation_target_language AS `language`,COUNT(DISTINCT translation_started_by) AS `translators`  FROM `cx_translations`    WHERE ((translation_status = 'published' OR translation_target_url IS NOT NULL))  GROUP BY translation_target_language ;
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
| id   | select_type | table           | type | possible_keys | key  | key_len | ref  | rows    | Extra                       |
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
|    1 | SIMPLE      | cx_translations | ALL  | NULL          | NULL | NULL    | NULL | 3148342 | Using where; Using filesort |
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
1 row in set (0.001 sec)
wikiadmin2023@10.64.16.208(wikishared)> explain SELECT  translation_source_language AS `language`,COUNT(DISTINCT translation_started_by) AS `translators`  FROM `cx_translations`    WHERE ((translation_status = 'published' OR translation_target_url IS NOT NULL))  GROUP BY translation_source_language ;
+------+-------------+-----------------+-------+---------------+--------------------------+---------+------+---------+-------------+
| id   | select_type | table           | type  | possible_keys | key                      | key_len | ref  | rows    | Extra       |
+------+-------------+-----------------+-------+---------------+--------------------------+---------+------+---------+-------------+
|    1 | SIMPLE      | cx_translations | index | NULL          | cx_translation_languages | 76      | NULL | 3148342 | Using where |
+------+-------------+-----------------+-------+---------------+--------------------------+---------+------+---------+-------------+
1 row in set (0.001 sec)
wikiadmin2023@10.64.16.208(wikishared)> explain SELECT  translation_source_language AS `sourceLanguage`,translation_target_language AS `targetLanguage`,(CASE WHEN (translation_status = 'published' OR translation_target_url IS NOT NULL) THEN 'published' ELSE 'draft' END) AS `status`,COUNT(*) AS `count`,COUNT(DISTINCT translation_started_by) AS `translators`  FROM `cx_translations`    WHERE translation_status IN ('draft','published')   GROUP BY translation_source_language,translation_target_language,status ;
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
| id   | select_type | table           | type | possible_keys | key  | key_len | ref  | rows    | Extra                       |
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
|    1 | SIMPLE      | cx_translations | ALL  | NULL          | NULL | NULL    | NULL | 3148342 | Using where; Using filesort |
+------+-------------+-----------------+------+---------------+------+---------+------+---------+-----------------------------+
1 row in set (0.001 sec)
wikiadmin2023@10.64.16.208(wikishared)> explain SELECT  MAX(translation_last_updated_timestamp) AS `date`,COUNT(translation_id) AS `count`  FROM `cx_translations`    WHERE ((translation_status = 'published' OR translation_target_url IS NOT NULL))  GROUP BY YEARWEEK(translation_last_updated_timestamp, 3) ;
+------+-------------+-----------------+------+---------------+------+---------+------+---------+----------------------------------------------+
| id   | select_type | table           | type | possible_keys | key  | key_len | ref  | rows    | Extra                                        |
+------+-------------+-----------------+------+---------------+------+---------+------+---------+----------------------------------------------+
|    1 | SIMPLE      | cx_translations | ALL  | NULL          | NULL | NULL    | NULL | 3148342 | Using where; Using temporary; Using filesort |
+------+-------------+-----------------+------+---------------+------+---------+------+---------+----------------------------------------------+
1 row in set (0.001 sec)

These queries (at least 3 of them) are issued WITHIN THE SAME REQUEST (example).

The queries above are problematic but might not be the cause of the observed unavailability. I was once again tricked by the timezone in Grafana, duh. The request reported above happened at 9:52 UTC; around that time we have a spike in open connections:

image.png (1×3 px, 389 KB)

Adapting the query in the task description of T392784, it seems that we had around 30k DB errors, 5.5k of which are from CampaignEvents. 7.8k are from Cognate, e.g.

from /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadmonitor/LoadMonitor.php(125)
#0 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadbalancer/LoadBalancer.php(453): Wikimedia\Rdbms\LoadMonitor->scaleLoads(array)
#1 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadbalancer/LoadBalancer.php(779): Wikimedia\Rdbms\LoadBalancer->getReaderIndex(string)
#2 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(111): Wikimedia\Rdbms\LoadBalancer->getConnectionInternal(int, array, string, int)
#3 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(125): Wikimedia\Rdbms\DBConnRef->ensureConnection()
#4 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(351): Wikimedia\Rdbms\DBConnRef->__call(string, array)
#5 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(762): Wikimedia\Rdbms\DBConnRef->select(array, array, array, string, array, array)
#6 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Cognate/src/CognateStore.php(175): Wikimedia\Rdbms\SelectQueryBuilder->fetchResultSet()
#7 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Cognate/src/CognateRepo.php(134): Cognate\CognateStore->selectLinkDetailsForPage(string, MediaWiki\Title\Title)
#8 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Cognate/src/HookHandler/CognateParseHookHandler.php(79): Cognate\CognateRepo->getLinksForPage(string, MediaWiki\Title\Title)
#9 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Cognate/src/CognateHooks.php(125): Cognate\HookHandler\CognateParseHookHandler->doContentAlterParserOutput(MediaWiki\Title\Title, MediaWiki\Parser\ParserOutput)
#10 /srv/mediawiki/php-1.44.0-wmf.27/includes/HookContainer/HookContainer.php(155): Cognate\CognateHooks->onContentAlterParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Title\Title, MediaWiki\Parser\ParserOutput)
#11 /srv/mediawiki/php-1.44.0-wmf.27/includes/HookContainer/HookRunner.php(1241): MediaWiki\HookContainer\HookContainer->run(string, array)
#12 /srv/mediawiki/php-1.44.0-wmf.27/includes/content/ContentHandler.php(1709): MediaWiki\HookContainer\HookRunner->onContentAlterParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Title\Title, MediaWiki\Parser\ParserOutput)
#13 /srv/mediawiki/php-1.44.0-wmf.27/includes/content/Renderer/ContentRenderer.php(75): MediaWiki\Content\ContentHandler->getParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Content\Renderer\ContentParseParams)
#14 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RenderedRevision.php(261): MediaWiki\Content\Renderer\ContentRenderer->getParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Page\PageIdentityValue, MediaWiki\Revision\RevisionStoreRecord, MediaWiki\Parser\ParserOptions, array)
#15 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RenderedRevision.php(233): MediaWiki\Revision\RenderedRevision->getSlotParserOutputUncached(MediaWiki\Content\WikitextContent, array)
#16 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RevisionRenderer.php(236): MediaWiki\Revision\RenderedRevision->getSlotParserOutput(string, array)
#17 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RevisionRenderer.php(169): MediaWiki\Revision\RevisionRenderer->combineSlotOutput(MediaWiki\Revision\RenderedRevision, MediaWiki\Parser\ParserOptions, array)
#18 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RenderedRevision.php(196): MediaWiki\Revision\RevisionRenderer->MediaWiki\Revision\{closure}(MediaWiki\Revision\RenderedRevision, array)
#19 /srv/mediawiki/php-1.44.0-wmf.27/includes/page/ParserOutputAccess.php(458): MediaWiki\Revision\RenderedRevision->getRevisionParserOutput()
#20 /srv/mediawiki/php-1.44.0-wmf.27/includes/page/ParserOutputAccess.php(368): MediaWiki\Page\ParserOutputAccess->renderRevision(MediaWiki\Page\WikiPage, MediaWiki\Parser\ParserOptions, MediaWiki\Revision\RevisionStoreRecord, int, null)
#21 /srv/mediawiki/php-1.44.0-wmf.27/includes/content/ContentHandler.php(1465): MediaWiki\Page\ParserOutputAccess->getParserOutput(MediaWiki\Page\WikiPage, MediaWiki\Parser\ParserOptions, MediaWiki\Revision\RevisionStoreRecord, int)
#22 /srv/mediawiki/php-1.44.0-wmf.27/extensions/CirrusSearch/includes/BuildDocument/ParserOutputPageProperties.php(102): MediaWiki\Content\ContentHandler->getParserOutputForIndexing(MediaWiki\Page\WikiPage, null, MediaWiki\Revision\RevisionStoreRecord)
#23 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/objectcache/WANObjectCache.php(1846): CirrusSearch\BuildDocument\ParserOutputPageProperties->CirrusSearch\BuildDocument\{closure}(bool, int, array, null, array)
#24 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/objectcache/WANObjectCache.php(1650): Wikimedia\ObjectCache\WANObjectCache->fetchOrRegenerate(string, int, Closure, array, array)
#25 /srv/mediawiki/php-1.44.0-wmf.27/extensions/CirrusSearch/includes/BuildDocument/ParserOutputPageProperties.php(116): Wikimedia\ObjectCache\WANObjectCache->getWithSetCallback(string, int, Closure)
#26 /srv/mediawiki/php-1.44.0-wmf.27/extensions/CirrusSearch/includes/BuildDocument/ParserOutputPageProperties.php(52): CirrusSearch\BuildDocument\ParserOutputPageProperties->finalizeReal(Elastica\Document, MediaWiki\Page\WikiPage, CirrusSearch\CirrusSearch, MediaWiki\Revision\RevisionStoreRecord)
#27 /srv/mediawiki/php-1.44.0-wmf.27/extensions/CirrusSearch/includes/BuildDocument/BuildDocument.php(215): CirrusSearch\BuildDocument\ParserOutputPageProperties->finalize(Elastica\Document, MediaWiki\Title\Title, MediaWiki\Revision\RevisionStoreRecord)
#28 /srv/mediawiki/php-1.44.0-wmf.27/extensions/CirrusSearch/includes/Api/QueryBuildDocument.php(128): CirrusSearch\BuildDocument\BuildDocument->finalize(Elastica\Document, bool, MediaWiki\Revision\RevisionStoreRecord)
#29 /srv/mediawiki/php-1.44.0-wmf.27/includes/api/ApiQuery.php(739): CirrusSearch\Api\QueryBuildDocument->execute()
#30 /srv/mediawiki/php-1.44.0-wmf.27/includes/api/ApiMain.php(2010): MediaWiki\Api\ApiQuery->execute()
#31 /srv/mediawiki/php-1.44.0-wmf.27/includes/api/ApiMain.php(948): MediaWiki\Api\ApiMain->executeAction()
#32 /srv/mediawiki/php-1.44.0-wmf.27/includes/api/ApiMain.php(919): MediaWiki\Api\ApiMain->executeActionWithErrorHandling()
#33 /srv/mediawiki/php-1.44.0-wmf.27/includes/api/ApiEntryPoint.php(152): MediaWiki\Api\ApiMain->execute()
#34 /srv/mediawiki/php-1.44.0-wmf.27/includes/MediaWikiEntryPoint.php(202): MediaWiki\Api\ApiEntryPoint->execute()
#35 /srv/mediawiki/php-1.44.0-wmf.27/api.php(44): MediaWiki\MediaWikiEntryPoint->run()
#36 /srv/mediawiki/w/api.php(3): require(string)
#37 {main}
	from /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadmonitor/LoadMonitor.php(125)
#0 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadbalancer/LoadBalancer.php(453): Wikimedia\Rdbms\LoadMonitor->scaleLoads(array)
#1 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadbalancer/LoadBalancer.php(779): Wikimedia\Rdbms\LoadBalancer->getReaderIndex(string)
#2 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(111): Wikimedia\Rdbms\LoadBalancer->getConnectionInternal(int, array, string, int)
#3 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(125): Wikimedia\Rdbms\DBConnRef->ensureConnection()
#4 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(351): Wikimedia\Rdbms\DBConnRef->__call(string, array)
#5 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(762): Wikimedia\Rdbms\DBConnRef->select(array, array, array, string, array, array)
#6 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Cognate/src/CognateStore.php(175): Wikimedia\Rdbms\SelectQueryBuilder->fetchResultSet()
#7 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Cognate/src/CognateRepo.php(134): Cognate\CognateStore->selectLinkDetailsForPage(string, MediaWiki\Title\Title)
#8 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Cognate/src/HookHandler/CognateParseHookHandler.php(79): Cognate\CognateRepo->getLinksForPage(string, MediaWiki\Title\Title)
#9 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Cognate/src/CognateHooks.php(125): Cognate\HookHandler\CognateParseHookHandler->doContentAlterParserOutput(MediaWiki\Title\Title, MediaWiki\Parser\ParserOutput)
#10 /srv/mediawiki/php-1.44.0-wmf.27/includes/HookContainer/HookContainer.php(155): Cognate\CognateHooks->onContentAlterParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Title\Title, MediaWiki\Parser\ParserOutput)
#11 /srv/mediawiki/php-1.44.0-wmf.27/includes/HookContainer/HookRunner.php(1241): MediaWiki\HookContainer\HookContainer->run(string, array)
#12 /srv/mediawiki/php-1.44.0-wmf.27/includes/content/ContentHandler.php(1709): MediaWiki\HookContainer\HookRunner->onContentAlterParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Title\Title, MediaWiki\Parser\ParserOutput)
#13 /srv/mediawiki/php-1.44.0-wmf.27/includes/content/Renderer/ContentRenderer.php(75): MediaWiki\Content\ContentHandler->getParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Content\Renderer\ContentParseParams)
#14 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RenderedRevision.php(261): MediaWiki\Content\Renderer\ContentRenderer->getParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Page\PageIdentityValue, MediaWiki\Revision\RevisionStoreRecord, MediaWiki\Parser\ParserOptions, array)
#15 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RenderedRevision.php(233): MediaWiki\Revision\RenderedRevision->getSlotParserOutputUncached(MediaWiki\Content\WikitextContent, array)
#16 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RevisionRenderer.php(236): MediaWiki\Revision\RenderedRevision->getSlotParserOutput(string, array)
#17 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RevisionRenderer.php(169): MediaWiki\Revision\RevisionRenderer->combineSlotOutput(MediaWiki\Revision\RenderedRevision, MediaWiki\Parser\ParserOptions, array)
#18 /srv/mediawiki/php-1.44.0-wmf.27/includes/Revision/RenderedRevision.php(196): MediaWiki\Revision\RevisionRenderer->MediaWiki\Revision\{closure}(MediaWiki\Revision\RenderedRevision, array)
#19 /srv/mediawiki/php-1.44.0-wmf.27/includes/page/ParserOutputAccess.php(458): MediaWiki\Revision\RenderedRevision->getRevisionParserOutput()
#20 /srv/mediawiki/php-1.44.0-wmf.27/includes/page/ParserOutputAccess.php(368): MediaWiki\Page\ParserOutputAccess->renderRevision(MediaWiki\Page\PageStoreRecord, MediaWiki\Parser\ParserOptions, MediaWiki\Revision\RevisionStoreRecord, int, MediaWiki\Parser\ParserOutput)
#21 /srv/mediawiki/php-1.44.0-wmf.27/includes/jobqueue/jobs/ParsoidCachePrewarmJob.php(150): MediaWiki\Page\ParserOutputAccess->getParserOutput(MediaWiki\Page\PageStoreRecord, MediaWiki\Parser\ParserOptions, MediaWiki\Revision\RevisionStoreRecord, int)
#22 /srv/mediawiki/php-1.44.0-wmf.27/includes/jobqueue/jobs/ParsoidCachePrewarmJob.php(166): MediaWiki\JobQueue\Jobs\ParsoidCachePrewarmJob->doParsoidCacheUpdate()
#23 /srv/mediawiki/php-1.44.0-wmf.27/extensions/EventBus/includes/JobExecutor.php(88): MediaWiki\JobQueue\Jobs\ParsoidCachePrewarmJob->run()
#24 /srv/mediawiki/rpc/RunSingleJob.php(60): MediaWiki\Extension\EventBus\JobExecutor->execute(array)
#25 {main}

Then 11.5k are from Flow:

	from /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadmonitor/LoadMonitor.php(125)
#0 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadbalancer/LoadBalancer.php(453): Wikimedia\Rdbms\LoadMonitor->scaleLoads(array)
#1 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/loadbalancer/LoadBalancer.php(779): Wikimedia\Rdbms\LoadBalancer->getReaderIndex(string)
#2 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(111): Wikimedia\Rdbms\LoadBalancer->getConnectionInternal(int, array, string, int)
#3 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(125): Wikimedia\Rdbms\DBConnRef->ensureConnection()
#4 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/DBConnRef.php(703): Wikimedia\Rdbms\DBConnRef->__call(string, array)
#5 /srv/mediawiki/php-1.44.0-wmf.27/includes/libs/rdbms/database/Database.php(2695): Wikimedia\Rdbms\DBConnRef->getSessionLagStatus()
#6 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/Data/FlowObjectCache.php(33): Wikimedia\Rdbms\Database::getCacheSetOptions(Wikimedia\Rdbms\DBConnRef)
#7 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/ServiceWiring.php(82): Flow\Data\FlowObjectCache->__construct(Wikimedia\ObjectCache\WANObjectCache, Flow\DbFactory, int)
#8 /srv/mediawiki/php-1.44.0-wmf.27/vendor/wikimedia/services/src/ServiceContainer.php(442): Wikimedia\Services\ServiceContainer::{closure}(MediaWiki\MediaWikiServices)
#9 /srv/mediawiki/php-1.44.0-wmf.27/vendor/wikimedia/services/src/ServiceContainer.php(406): Wikimedia\Services\ServiceContainer->createService(string)
#10 /srv/mediawiki/php-1.44.0-wmf.27/includes/MediaWikiServices.php(370): Wikimedia\Services\ServiceContainer->getService(string)
#11 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/ServiceWiring.php(395): MediaWiki\MediaWikiServices->getService(string)
#12 /srv/mediawiki/php-1.44.0-wmf.27/vendor/wikimedia/services/src/ServiceContainer.php(442): Wikimedia\Services\ServiceContainer::{closure}(MediaWiki\MediaWikiServices)
#13 /srv/mediawiki/php-1.44.0-wmf.27/vendor/wikimedia/services/src/ServiceContainer.php(406): Wikimedia\Services\ServiceContainer->createService(string)
#14 /srv/mediawiki/php-1.44.0-wmf.27/includes/MediaWikiServices.php(370): Wikimedia\Services\ServiceContainer->getService(string)
#15 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/container.php(739): MediaWiki\MediaWikiServices->getService(string)
#16 /srv/mediawiki/php-1.44.0-wmf.27/vendor/pimple/pimple/src/Pimple/Container.php(122): Flow\Container::{closure}(Flow\Container)
#17 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/Data/ManagerGroup.php(72): Pimple\Container->offsetGet(string)
#18 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/Data/ManagerGroup.php(127): Flow\Data\ManagerGroup->getStorage(string)
#19 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/Data/ManagerGroup.php(139): Flow\Data\ManagerGroup->call(string, array)
#20 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/ReferenceClarifier.php(115): Flow\Data\ManagerGroup->find(string, array)
#21 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/ReferenceClarifier.php(76): Flow\ReferenceClarifier->loadReferencesForPage(MediaWiki\Title\Title)
#22 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/ReferenceClarifier.php(30): Flow\ReferenceClarifier->getWikiReferences(MediaWiki\Title\Title, MediaWiki\Title\Title)
#23 /srv/mediawiki/php-1.44.0-wmf.27/extensions/Flow/includes/Hooks.php(1164): Flow\ReferenceClarifier->getWhatLinksHereProps(stdClass, MediaWiki\Title\Title, MediaWiki\Title\Title)
#24 /srv/mediawiki/php-1.44.0-wmf.27/includes/HookContainer/HookContainer.php(155): Flow\Hooks->onWhatLinksHereProps(stdClass, MediaWiki\Title\Title, MediaWiki\Title\Title, array)
#25 /srv/mediawiki/php-1.44.0-wmf.27/includes/HookContainer/HookRunner.php(4647): MediaWiki\HookContainer\HookContainer->run(string, array)
#26 /srv/mediawiki/php-1.44.0-wmf.27/includes/specials/SpecialWhatLinksHere.php(532): MediaWiki\HookContainer\HookRunner->onWhatLinksHereProps(stdClass, MediaWiki\Title\Title, MediaWiki\Title\Title, array)
#27 /srv/mediawiki/php-1.44.0-wmf.27/includes/specials/SpecialWhatLinksHere.php(478): MediaWiki\Specials\SpecialWhatLinksHere->listItem(stdClass, MediaWiki\Title\Title, MediaWiki\Title\Title)
#28 /srv/mediawiki/php-1.44.0-wmf.27/includes/specials/SpecialWhatLinksHere.php(117): MediaWiki\Specials\SpecialWhatLinksHere->showIndirectLinks(int, MediaWiki\Title\Title, int, int, int, string)
#29 /srv/mediawiki/php-1.44.0-wmf.27/includes/specialpage/FormSpecialPage.php(216): MediaWiki\Specials\SpecialWhatLinksHere->onSuccess()
#30 /srv/mediawiki/php-1.44.0-wmf.27/includes/specialpage/SpecialPage.php(734): MediaWiki\SpecialPage\FormSpecialPage->execute(null)
#31 /srv/mediawiki/php-1.44.0-wmf.27/includes/specialpage/SpecialPageFactory.php(1738): MediaWiki\SpecialPage\SpecialPage->run(null)
#32 /srv/mediawiki/php-1.44.0-wmf.27/includes/actions/ActionEntryPoint.php(499): MediaWiki\SpecialPage\SpecialPageFactory->executePath(string, MediaWiki\Context\RequestContext)
#33 /srv/mediawiki/php-1.44.0-wmf.27/includes/actions/ActionEntryPoint.php(143): MediaWiki\Actions\ActionEntryPoint->performRequest()
#34 /srv/mediawiki/php-1.44.0-wmf.27/includes/MediaWikiEntryPoint.php(202): MediaWiki\Actions\ActionEntryPoint->execute()
#35 /srv/mediawiki/php-1.44.0-wmf.27/index.php(58): MediaWiki\MediaWikiEntryPoint->run()
#36 /srv/mediawiki/w/index.php(3): require(string)
#37 {main}
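
Worth noting in this trace: the connection is lazy (frame #3, DBConnRef->ensureConnection()), so a replica is only picked and connected on first actual use. That is why the overload fatal surfaces deep inside Flow's service wiring on Special:WhatLinksHere rather than at an obvious query site. A minimal sketch of that lazy-proxy pattern, in Python with hypothetical names (not the real rdbms code):

```python
class OverloadError(RuntimeError):
    pass

class FakeLoadBalancer:
    """Hands out connections; trips when replicas are overloaded."""
    def __init__(self, overloaded=False):
        self.overloaded = overloaded

    def get_connection(self):
        if self.overloaded:
            raise OverloadError("Database servers in extension1 are overloaded")
        return object()  # stand-in for a live DB handle

class LazyConnRef:
    """Proxy that defers connecting until the first real call,
    mirroring DBConnRef->ensureConnection() in the trace above."""
    def __init__(self, lb):
        self._lb = lb
        self._conn = None

    def _ensure(self):
        if self._conn is None:
            # The overload error surfaces HERE, at first use - which may
            # be deep inside unrelated service-construction code.
            self._conn = self._lb.get_connection()
        return self._conn

    def get_session_lag_status(self):
        self._ensure()
        return {"lag": 0}

# Constructing the ref never touches the database...
ref = LazyConnRef(FakeLoadBalancer(overloaded=True))
# ...the fatal only appears on first use:
try:
    ref.get_session_lag_status()
except OverloadError as e:
    print(e)
```

So the Flow stack trace does not mean Flow is the heavy reader; it just happens to be where a deferred connection attempt first hits the tripped breaker.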

The rest is a long tail of other code paths.