- Blog: https://timotijhof.net
- Mastodon: @krinkle
(Photo by Niek Hidding.)
Reported upstream at https://github.com/webmandesign/modern/issues/2.
Boldly resolving, as any such caches will have long expired by now. Keeping this task open to improve discovery of its documented workaround is presumably no longer as important now that those caches have expired.
I see your point. The debug parameter is primarily a boolean param. People should only need to know its name and expect it to work like any other boolean param in MediaWiki. That is why debug=true has always been the canonical way we describe it in docs, with the version numbers merely a transition mechanism for those in the know who want to opt in or out.
In T405005#11437105, @gerritbot wrote:Change #1215666 had a related patch set uploaded (by Ladsgroup; author: Amir Sarabadani):
[mediawiki/core@master] SpecialLinkSearch: Add a message when domains are being ignored
Proposed changes for the checklist so far:
In T385310#11417701, @Tgr wrote:In T385310#11413400, @Krinkle wrote:The error is on this request to wikimania.wikimedia.org, and that request is indeed auto-creating the local account on wikimaniawiki.
That would be a major regression; we are not supposed to auto-create during edge login.
And in general we don't seem to - compare the number of account creations on enwiki vs wikimaniawiki. Although that's still way too much autocreation for wikimaniawiki. […]
For future reference, this wasn't caught in CI or Beta because most wikis either don't have language variants, or have a variant count by which 10000 is evenly divisible (e.g. 1, 2, 4, or 5). This includes the piglatin test variant in CI, which adds to English for 2 variants total.
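As a hedged illustration of the divisibility point above (the function below is hypothetical, not the actual MediaWiki code; it only shows why the bug stayed hidden in CI):

```python
# Hypothetical sketch of the masking effect: a bug in the "remainder"
# code path never fires when the variant count divides 10000 evenly,
# which holds for most wikis and for CI (2 variants).
def hits_remainder_path(num_variants: int, total: int = 10000) -> bool:
    """True when dividing `total` by the variant count leaves a remainder."""
    return total % num_variants != 0

# CI: English + piglatin = 2 variants; 10000 % 2 == 0, so the buggy
# path is never exercised there.
print(hits_remainder_path(2))  # False
# A wiki with 3 language variants would exercise the buggy path.
print(hits_remainder_path(3))  # True
```

Any variant count in {1, 2, 4, 5, 8, ...} divides 10000 without remainder, so only wikis with counts like 3, 6, or 7 could have surfaced the issue.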
Thank you.
Client preferences are implemented in ResourceLoader/ClientHtml.php, resources/src/startup/clientprefs.js, and mw.user.clientPrefs (part of mediawiki.user), which are maintained as part of ResourceLoader.
In T382904#10721521, @ihurbain wrote:Removing Content-Transform-Team-WIP to avoid other folks looking into this during EssentialWeek; to be discussed in tech forum for exact triage.
@hashar Aye, that would produce a more understandable CI failure, but a CI failure nonetheless.
Is it actually required to have the information available at this point? From my previous analysis, the caller seems indifferent to which wikis are included in the array. The requests could have completed in either order, so the concurrent callers don't know about those wikis, and a good number of them won't be included either way if both the CA and local writes haven't taken place yet. It's only because one of them happens to have finished already that we even try in the first place; the caller didn't specifically expect or ask for that.
Thanks.
In T400023#11414663, @tstarling wrote:The sitemap has only been submitted direct to Google, it's not in robots.txt and it hasn't been submitted to Bing. SRE are afraid that adding it to robots.txt will cause too much crawler traffic.
Progress report for 27 Oct - 24 Nov (four weeks) on WE6.4.8 Support PHP 8.3 upgrade (m:FY2025-2026#Q2).
On ParserCache hosts we instantiate the objectcache template 256 times as pcXXX, like pc128 (wmf-config).
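For illustration, the pcXXX naming can be sketched as follows. This is a minimal sketch only: the zero-padded pc000–pc255 range and the hash-based routing are assumptions for the example, not the actual wmf-config logic.

```python
import hashlib

# Sketch: 256 parser cache instance names in the pcXXX form, e.g. "pc128".
# The zero-padded pc000..pc255 range is an assumption for illustration.
SHARDS = [f"pc{i:03d}" for i in range(256)]

def shard_for(key: str) -> str:
    """Route a cache key to one of the 256 instances by hashing (illustrative)."""
    digest = int(hashlib.sha1(key.encode("utf-8")).hexdigest(), 16)
    return SHARDS[digest % len(SHARDS)]
```

The same key always routes to the same instance, which is the property such a sharding scheme needs regardless of the exact naming or hash used in production.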
This is still an issue it seems. LinkCache::addGoodLinkObjFromRow is called with primary/unsafe data in these places (instead of safe replica data):
In T385310#11413400, @Krinkle wrote:It is perfectly natural for the wikimania request to start a database connection to centralauth and to mediawiki.org, and use a replica that is behind by a fraction of a second.