Change #1214204 had a related patch set uploaded (by Eileen; author: Eileen):
[wikimedia/fundraising/crm@master] Fix merge failing on db error
Change #1208481 merged by jenkins-bot:
[mediawiki/extensions/DonationInterface@master] Reuse the popup modal for wikiminute
@MDemosWMF Yes, we'll aim for early 2026.
In T355763#9484918, @wmr wrote:I can't read PHP. ChatGPT says: In this part of the code, if $isEncoding is true and the key $k is a string containing only digits (checked using ctype_digit), it converts the string to an integer using (int)$k. This is done to handle cases where JSON decoding may result in numeric keys represented as strings. The function then checks if the converted integer key forms a sequential sequence.
Is this intended? Is this a bug? If this is intended, can you add an option to allow no conversion like mw.loadJsonData("file.json", digit_conversion=False).
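To make the reported behavior concrete, here is a rough sketch (in Python, not the actual MediaWiki PHP or Lua code, and with the sequentiality check assumed) of the conversion described above: JSON object keys are always strings, so after decoding, keys consisting only of digits may be converted to integers and, if they form a sequence, the object may be treated as an array.

```python
# Illustration only: JSON object keys like "0", "1", "2" become integer keys,
# and a sequential run starting at 0 collapses to a list. Details of the real
# implementation may differ.
import json

def decode_with_digit_conversion(text):
    data = json.loads(text)
    if not isinstance(data, dict):
        return data
    # Convert all-digit string keys to integers (analogous to ctype_digit + (int)$k).
    converted = {(int(k) if k.isdigit() else k): v for k, v in data.items()}
    keys = list(converted.keys())
    # If the integer keys form the sequence 0..n-1, treat the object as a list.
    if keys and all(k == i for i, k in enumerate(keys)):
        return [converted[i] for i in range(len(keys))]
    return converted

print(decode_with_digit_conversion('{"0": "a", "1": "b"}'))  # ['a', 'b']
print(decode_with_digit_conversion('{"0": "a", "2": "b"}'))  # {0: 'a', 2: 'b'}
```

An opt-out such as the suggested `digit_conversion=False` would simply skip the key-conversion step and return the decoded object with its string keys intact.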
Mentioned in SAL (#wikimedia-operations) [2025-12-03T03:26:20Z] <krinkle@deploy2002> Finished scap sync-world: Backport for [[gerrit:1214201|robots.php: Avoid "404 Not Found" for Sitemap rule (T400023)]] (duration: 11m 08s)
Change #1214203 had a related patch set uploaded (by Eileen; author: Eileen):
[wikimedia/fundraising/crm@master] Handle conflict on snooze_date
Change #1214202 merged by jenkins-bot:
[integration/config@master] zuul: Define jobs for mediawiki/extensions/WikimediaCustomizations repo
Change #1214202 had a related patch set uploaded (by Krinkle; author: Krinkle):
[integration/config@master] zuul: Define jobs for mediawiki/extensions/WikimediaCustomizations repo
Mentioned in SAL (#wikimedia-operations) [2025-12-03T03:17:53Z] <krinkle@deploy2002> krinkle: Backport for [[gerrit:1214201|robots.php: Avoid "404 Not Found" for Sitemap rule (T400023)]] synced to the testservers (see https://wikitech.wikimedia.org/wiki/Mwdebug). Changes can now be verified there.
Mentioned in SAL (#wikimedia-operations) [2025-12-03T03:15:12Z] <krinkle@deploy2002> Started scap sync-world: Backport for [[gerrit:1214201|robots.php: Avoid "404 Not Found" for Sitemap rule (T400023)]]
Change #1214201 merged by jenkins-bot:
[operations/mediawiki-config@master] robots.php: Avoid "404 Not Found" for Sitemap rule
Change #1214201 had a related patch set uploaded (by Krinkle; author: Krinkle):
[operations/mediawiki-config@master] robots.php: Avoid "404 Not Found" for Sitemap rule
Mentioned in SAL (#wikimedia-operations) [2025-12-03T03:08:02Z] <krinkle@deploy2002> Finished scap sync-world: Backport for [[gerrit:1201740|robots.php: Clean up unused site, lang, and x-subdomain (T407122)]], [[gerrit:1214148|Submit Commons sitemap to Bing/DuckDuckGo and remaining wikis to Google (T400023)]], [[gerrit:1214149|robots.txt: Clean up inline comments]], [[gerrit:1214150|robots.txt: Remove redundant "/wiki/Fundraising_2007/comments" disallow]] (duration: 08m 26s)
@SHust @AMJohnson @LWadleigh @MBeat33 The team has identified that a lack of caching in CiviCRM was causing a large number of unnecessary queries when editing contact details (and quite likely in other areas as well). We've implemented a fix, so please let us know if the general slowness seems improved on your end. We hope this improves slow editing and loading, but I don't think it's related to being unable to load pages at all.
Mentioned in SAL (#wikimedia-operations) [2025-12-03T03:02:25Z] <krinkle@deploy2002> krinkle: Backport for [[gerrit:1201740|robots.php: Clean up unused site, lang, and x-subdomain (T407122)]], [[gerrit:1214148|Submit Commons sitemap to Bing/DuckDuckGo and remaining wikis to Google (T400023)]], [[gerrit:1214149|robots.txt: Clean up inline comments]], [[gerrit:1214150|robots.txt: Remove redundant "/wiki/Fundraising_2007/comments" disallow]] synced to the testservers (see https://wiki
Mentioned in SAL (#wikimedia-operations) [2025-12-03T02:59:36Z] <krinkle@deploy2002> Started scap sync-world: Backport for [[gerrit:1201740|robots.php: Clean up unused site, lang, and x-subdomain (T407122)]], [[gerrit:1214148|Submit Commons sitemap to Bing/DuckDuckGo and remaining wikis to Google (T400023)]], [[gerrit:1214149|robots.txt: Clean up inline comments]], [[gerrit:1214150|robots.txt: Remove redundant "/wiki/Fundraising_2007/comments" disallow]]
Change #1214148 merged by jenkins-bot:
[operations/mediawiki-config@master] Submit Commons sitemap to Bing/DuckDuckGo and remaining wikis to Google
Change #1201740 merged by jenkins-bot:
[operations/mediawiki-config@master] robots.php: Clean up unused site, lang, and x-subdomain
Change #1214156 had a related patch set uploaded (by Krinkle; author: Krinkle):
[operations/puppet@production] varnish: Move error message from footer to body for HTTP 4xx responses
Change #1214199 abandoned by Zabe:
[mediawiki/core@master] refreshImageMetadata: Add support for file read new
Reason:
actually FileSelectQueryBuilder already takes care of this
Change #1214199 had a related patch set uploaded (by Zabe; author: Zabe):
[mediawiki/core@master] refreshImageMetadata: Add support for file read new
Change #1214198 had a related patch set uploaded (by Zabe; author: Zabe):
[mediawiki/core@master] rebuildImages: Add support for file read new
Change #1214189 merged by jenkins-bot:
[mediawiki/core@REL1_43] htmlform: Load ooui before infusing field cloner buttons
Change #1214188 merged by jenkins-bot:
[mediawiki/core@REL1_43] HTMLFormFieldCloner: Fix multiple bugs related to conditional states
It's interesting that the deletion in the task description is still not processed. I spot-checked every wdqs host in eqiad, and they all agree that the item still exists, so there's definitely an issue here and it's not merely confined to a few hosts.
In T411503#11425771, @daniel wrote:In T411503#11424246, @taavi wrote:I don't think we currently have any places outside of https://wikitech.wikimedia.org/wiki/Help:Cloud_VPS_IP_space that publish our IP space. Would it be helpful if we published the same information in some machine-readable format?
Probably... maybe this could use the same mechanism we use for "known clients" like googlebot? Curious what @CDanis thinks.
I believe relying on this is still functionally what @ori was against when he said "The task of making $wgRedactedFunctionArguments comprehensive is hopelessly gargantuan. It would require something like a full trace of the flow of data throughout all of MediaWiki and its extensions." in the linked email announcement. MediaWiki is too big to reasonably expect that we can track every variable that may contain "sensitive" data. I don't object to introducing the attribute, but I don't think we can reasonably expect it to become a comprehensive solution for web-facing stack traces.
Change #1214187 merged by jenkins-bot:
[mediawiki/core@REL1_43] htmlform: Fix rendering contents for cloner fields
Is there another reason that we have the local lore that any extension or skin must be in all deployed branches before being added to wmf-config/extension-list?
Change #1214186 merged by jenkins-bot:
[mediawiki/core@REL1_43] Replace uses of Xml::fieldset(), deprecated since 1.42
an-worker* reboots ongoing now
It turns out that the current mergeMessageFileList.php implementation does not treat missing extensions in the list-file input as a fatal error. It does just what I proposed: logs the missing extension/skin and moves on. There was a hard failure when rMW8da7f5eeff33: Added $wgExtensionEntryPointListFiles for use with mergeMessageFileList.php was first implemented, but that was removed by rMW76680021f80e: Don't hard fail when we couldn't find an entry point for an extension way back in 2016.
Change #1214194 had a related patch set uploaded (by Zabe; author: Zabe):
[mediawiki/core@master] RestrictionStore: Improve documentation of virtual domain check
Hrm. Well, SpiderPig does have a web daemon. That daemon doesn't persist any of the metric timings we used to track (as far as I'm aware). @dancy might be able to say something smarter than that about it :)
Change #1214182 merged by jenkins-bot:
[mediawiki/core@REL1_45] htmlform: Load ooui before infusing field cloner buttons
Change #1214185 merged by jenkins-bot:
[mediawiki/core@REL1_44] htmlform: Load ooui before infusing field cloner buttons
Change #1214184 merged by jenkins-bot:
[mediawiki/core@REL1_44] HTMLFormFieldCloner: Fix multiple bugs related to conditional states
Change #1214181 merged by jenkins-bot:
[mediawiki/core@REL1_45] HTMLFormFieldCloner: Fix multiple bugs related to conditional states
Forward the information here
Update function-schemata sub-module to HEAD (b22fbe6)
Change #1199275 abandoned by Reedy:
[mediawiki/extensions/OATHAuth@master] Create test for T408297
For batch-job oriented applications, we run a Prometheus PushGateway instance, which may be an option for this. A Prometheus metrics endpoint in a stateful service is the ideal, though. Is there an opportunity here, with spiderpig being a daemon service, for it to serve as the source of this data? (Guessing myself: probably not, but it can't hurt to ask the experts. 🙂)
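As a minimal sketch of the PushGateway option: the gateway accepts the Prometheus plain-text exposition format via HTTP PUT at `/metrics/job/<job_name>`. The metric name, value, and gateway URL below are hypothetical placeholders, not the real instance or any existing scap/SpiderPig metric.

```python
# Sketch: formatting one gauge in the Prometheus text exposition format and
# preparing an HTTP PUT to a (hypothetical) PushGateway per-job endpoint.
import urllib.request

def push_metric(gateway_url, job, name, value, help_text=''):
    """Build the exposition-format body and the PUT request for one gauge.
    The caller would send it with urllib.request.urlopen(req)."""
    body = ''
    if help_text:
        body += f'# HELP {name} {help_text}\n'
    body += f'# TYPE {name} gauge\n'
    body += f'{name} {value}\n'
    req = urllib.request.Request(
        f'{gateway_url}/metrics/job/{job}',
        data=body.encode(),
        method='PUT',
    )
    return body, req

body, req = push_metric('http://pushgateway.example.org:9091', 'scap_deploy',
                        'deploy_duration_seconds', 668,
                        'Wall-clock duration of the last deployment')
print(body)
```

The daemon alternative is simpler on the Prometheus side: a stateful service exposes a `/metrics` HTTP endpoint serving the same text format, and Prometheus scrapes it directly, with no gateway in between.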
Change #1214189 had a related patch set uploaded (by Reedy; author: Func):
[mediawiki/core@REL1_43] htmlform: Load ooui before infusing field cloner buttons
Change #1214188 had a related patch set uploaded (by Reedy; author: Func):
[mediawiki/core@REL1_43] HTMLFormFieldCloner: Fix multiple bugs related to conditional states
Change #1214187 had a related patch set uploaded (by Reedy; author: Máté Szabó):
[mediawiki/core@REL1_43] htmlform: Fix rendering contents for cloner fields
Change #1214186 had a related patch set uploaded (by Reedy; author: Jforrester):
[mediawiki/core@REL1_43] Replace uses of Xml::fieldset(), deprecated since 1.42
^ During the closure (T411501), I reassigned crwiki and klwiki from group2 to group0 and moved them to wmf.5 to match the train’s current stage.
Mentioned in SAL (#wikimedia-operations) [2025-12-03T00:33:31Z] <zabe@deploy2002> rebuilt and synchronized wikiversions files: group0 to 1.46.0-wmf.5 refs T408275
Change #1214180 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Cache pseudoconstantOptions
Please put this on hold for now; it's possible that the OKR 2.1.5 and the tag name will be changed. In the meantime, please advise if any further action is required to enable the tag.
Changed the story points, as more details are needed than expected and some still require clarification. It will launch this coming January, and we are preparing in advance.