    • Task
    EventBus tests fail when EventStreamConfig is not loaded:
```lines=5
There were 4 errors:

1) EventBusFactoryTest::testGetEventServiceName with data set "disabled service" ('disabled_stream', '_disabled_eventbus_')
Error: Class "MediaWiki\Extension\EventStreamConfig\StreamConfigs" not found
extensions/EventBus/tests/phpunit/unit/EventBusFactoryTest.php:149

2) EventBusFactoryTest::testGetEventServiceName with data set "stream without destination event service" ('stream_without_destination_ev...ervice', 'intake-main')
Error: Class "MediaWiki\Extension\EventStreamConfig\StreamConfigs" not found
extensions/EventBus/tests/phpunit/unit/EventBusFactoryTest.php:149

3) EventBusFactoryTest::testGetEventServiceName with data set "stream with explicit event service" ('other_stream', 'intake-other')
Error: Class "MediaWiki\Extension\EventStreamConfig\StreamConfigs" not found
extensions/EventBus/tests/phpunit/unit/EventBusFactoryTest.php:149

4) EventBusFactoryTest::testGetEventServiceName with data set "stream with T321557 BC setting" ('stream_with_destination_event...etting', 'intake-main')
Error: Class "MediaWiki\Extension\EventStreamConfig\StreamConfigs" not found
extensions/EventBus/tests/phpunit/unit/EventBusFactoryTest.php:149
```
This will also cause extensions that solely depend on EventBus to fail, for example WikimediaEvents. In production we always load `EventBus`, but `EventStreamConfig` is only loaded when `wmgUseEventLogging` is set. Thus I don't think it should be marked as a hard requirement (added to `requires` in `extension.json`). Some EventBus tests are already marked to be skipped when EventStreamConfig is not loaded. The above tests should be skipped as well (and that will be enforced by CI via T407797).
    • Task
    1. Rollbacker (Тез кайтаруучу)
Requested rights:
* rollback
Purpose: To allow trusted users to quickly revert vandalism and inappropriate edits.
Local community discussion and consensus: https://w.wiki/EG$W#Чечим_кабыл_алуучу_статусун_киргизүү

2. Eliminator (Чечим кабыл алуучу)
This user group is intended for users who summarize and close local discussions, such as deletion requests.
Local community discussion and consensus: https://w.wiki/EG$W#Чечим_кабыл_алуучу_статусун_киргизүү
Requested rights:
* delete

Additional notes: Both user groups were approved by the kywiki community through local discussion. Please update the configuration in operations/mediawiki-config accordingly. Thank you!
    • Task
    **Test Scope**
Usually, changing the colour scheme of the banner helps to counteract the downward trend in donation income.

**Acceptance Criteria**

**Both banners**
- The banners are based on the [variant banner of desktop-de-10](https://de.wikipedia.org/?banner=WMDE_FR_2025_Desktop_DE_10_var&useskin=vector-2024&vectornightmode=0&devMode).

**Variant banner**
- The variant banner uses the new design as defined in the design file (see [non-slider](https://www.figma.com/design/fFQ2XINsjVHgwMaTjL6hIG/Fundraising-2025?node-id=2723-664&t=Ne8KImsfLpaXUmCl-4) and [slider](https://www.figma.com/design/fFQ2XINsjVHgwMaTjL6hIG/Fundraising-2025?node-id=2764-1749&t=Ne8KImsfLpaXUmCl-4) for light mode and [non-slider](https://www.figma.com/design/fFQ2XINsjVHgwMaTjL6hIG/Fundraising-2025?node-id=2873-2070&t=8Adf9MugiemUEb81-4) for dark mode).
- The dark slider version uses a white active slide indicator and chevron icons, and black inactive slide indicators.
- The copy layout remains as it is in the base banner (headline, subheadline and paragraph).
- The colour scheme is applied.
- The use of funds link is moved to the top of the donation form.
- The form submit button size is increased.
- The form has a rounded corner border.
- The copy has a different background color and no border.

| {F70211838} | {F70211854} | {F70215394} | |
| light non-slider | light slider | dark non-slider | dark slider |

**Banner Preview**
[light control banner](https://de.wikipedia.org/?banner=WMDE_FR_2025_Desktop_DE_11_ctrl&useskin=vector-2024&vectornightmode=0&devMode)
[light variant banner](https://de.wikipedia.org/?banner=WMDE_FR_2025_Desktop_DE_11_var&useskin=vector-2024&vectornightmode=0&devMode)
[dark control banner](https://de.wikipedia.org/?banner=WMDE_FR_2025_Desktop_DE_11_ctrl&useskin=vector-2024&vectornightmode=1&devMode)
[dark variant banner](https://de.wikipedia.org/?banner=WMDE_FR_2025_Desktop_DE_11_var&useskin=vector-2024&vectornightmode=1&devMode)
    • Task
    The WikimediaBadges extension does not require WikibaseClient, but tests are failing when it is not loaded:
```
17:01:01 1) Error
17:01:01 The data provider specified for WikimediaBadges\Tests\WikibaseClientSiteLinksForItemHookHandlerTest::testDoAddToSidebar is invalid.
17:01:01 Error: Class "Wikibase\DataModel\SiteLink" not found
17:01:01 /workspace/src/extensions/WikimediaBadges/tests/phpunit/includes/WikibaseClientSiteLinksForItemHookHandlerTest.php:72
17:01:01 /workspace/src/tests/phpunit/suites/ExtensionsTestSuite.php:37
17:01:01 /workspace/src/tests/phpunit/suites/ExtensionsTestSuite.php:46
17:01:01
17:01:01 2) WikimediaBadges\Tests\WikibaseClientSiteLinksForItemHookHandlerTest::testDoAddToSidebar_disabled
17:01:01 Error: Interface "Wikibase\Client\Hooks\WikibaseClientSiteLinksForItemHook" not found
17:01:01
17:01:01 /workspace/src/extensions/WikimediaBadges/includes/WikibaseClientSiteLinksForItemHookHandler.php:31
17:01:01 /workspace/src/includes/AutoLoader.php:183
17:01:01 /workspace/src/extensions/WikimediaBadges/tests/phpunit/includes/WikibaseClientSiteLinksForItemHookHandlerTest.php:224 Logs generated by test
17:01:01
17:01:01 3) WikimediaBadges\Tests\WikibaseClientSiteLinksForItemHookHandlerTest::testAddToSidebar
17:01:01 Error: Class "WikimediaBadges\WikibaseClientSiteLinksForItemHookHandler" not found
17:01:01
17:01:01 /workspace/src/extensions/WikimediaBadges/tests/phpunit/includes/WikibaseClientSiteLinksForItemHookHandlerTest.php:241
```
This tests a hook provided by WikibaseClient, so the test should be skipped when that extension is not loaded.
    • Task
    TheWikipediaLibrary does not require the Echo extension in `extension.json`, but its tests are failing when Echo is not loaded:
```
1) TheWikipediaLibraryEchoTest::testNoDupes
Error: Class "MediaWiki\Extension\Notifications\DbFactory" not found

/workspace/src/extensions/TheWikipediaLibrary/tests/phpunit/TheWikipediaLibraryEchoTest.php:38
/workspace/src/extensions/TheWikipediaLibrary/tests/phpunit/TheWikipediaLibraryEchoTest.php:20
```
The test should use `$this->markTestSkippedIfExtensionNotLoaded( 'Echo' )`, or alternatively TheWikipediaLibrary can be marked to require Echo in `extension.json`.
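    A minimal sketch of the first option, assuming the test class extends `MediaWikiIntegrationTestCase` and that the check belongs in `setUp()` (the class and extension names come from the failure above; the exact placement is an assumption):
```lang=php
class TheWikipediaLibraryEchoTest extends MediaWikiIntegrationTestCase {

	protected function setUp(): void {
		parent::setUp();
		// Skip every test in this class when Echo is not loaded on the
		// current wiki, instead of failing on the missing Echo classes.
		$this->markTestSkippedIfExtensionNotLoaded( 'Echo' );
	}

	// ... existing test methods ...
}
```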
    • Task
    The ReportIncident extension does not require CommunityConfiguration in `extension.json`. Running the PHPUnit tests fails with:
```counterexample
PHP Fatal error: Uncaught Error: Class "MediaWiki\Extension\CommunityConfiguration\Tests\SchemaProviderTestCase" not found in extensions/ReportIncident/tests/phpunit/integration/Config/ReportIncidentSchemaProviderTest.php:12
```
```
#0 vendor/phpunit/phpunit/src/Util/FileLoader.php(66): include_once()
#1 vendor/phpunit/phpunit/src/Util/FileLoader.php(49): PHPUnit\Util\FileLoader::load()
#2 vendor/phpunit/phpunit/src/Framework/TestSuite.php(398): PHPUnit\Util\FileLoader::checkAndLoad()
#3 vendor/phpunit/phpunit/src/Framework/TestSuite.php(537): PHPUnit\Framework\TestSuite->addTestFile()
#4 tests/phpunit/suites/ExtensionsTestSuite.php(37): PHPUnit\Framework\TestSuite->addTestFiles()
#5 tests/phpunit/suites/ExtensionsTestSuite.php(46): ExtensionsTestSuite->__construct()
#6 [internal function]: ExtensionsTestSuite::suite()
#7 vendor/phpunit/phpunit/src/Framework/TestSuite.php(486): ReflectionMethod->invoke()
#8 vendor/phpunit/phpunit/src/TextUI/TestSuiteMapper.php(84): PHPUnit\Framework\TestSuite->addTestFile()
#9 vendor/phpunit/phpunit/src/TextUI/Command.php(393): PHPUnit\TextUI\TestSuiteMapper->map()
#10 vendor/phpunit/phpunit/src/TextUI/Command.php(114): PHPUnit\TextUI\Command->handleArguments()
#11 vendor/phpunit/phpunit/src/TextUI/Command.php(99): PHPUnit\TextUI\Command->run()
#12 vendor/phpunit/phpunit/phpunit(107): PHPUnit\TextUI\Command::main()
#13 vendor/bin/phpunit(122): include('...')
#14 {main}
```
See also {T410051}
    • Task
    It seems that the Page Options menu is being improperly placed, resulting in the menu options being cut off on any device width smaller than ~450px. I've observed this on a totally fresh installation of MediaWiki, using multiple skins, with nothing installed save for the most recent 1.44 release of the VisualEditor extension, across both Firefox and Chrome. This is also happening on multiple production wikis and skins, on multiple mobile and non-mobile devices. Firefox: {F70210283} and Chrome: {F70210294}

**Steps to replicate the issue** (include links if applicable):
* Download and install MediaWiki, using the current stable version of 1.44.2 from here https://www.mediawiki.org/wiki/Download , and the local development install instructions here https://www.mediawiki.org/wiki/Local_development_quickstart .
* Download and install VisualEditor from https://www.mediawiki.org/wiki/Special:ExtensionDistributor/VisualEditor , selecting 1.44.
* Update the MediaWiki installation as per the VisualEditor installation instructions, and enable the extension.
* Apply any skin with a mobile responsive layout; in the above case, Timeless.
* Attempt to edit any page from a device with a display width of less than ~450px, and click the Page Options menu.

**What happens?**: The menu is cut off by the edge of the display, making the options unreadable.

**What should have happened instead?**: I assume the menu should be displayed full-width across the center. If you disable the menu element's 'right' and 'margin-left' properties, it appears to do so, as shown here: {F70210331} These properties are being applied directly to the element in question, not via CSS selectors, and the 'right' property is re-applied if the page is scrolled.

**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
MediaWiki: 1.44.2
VisualEditor: – (56c6279) 07:43, 12 November 2025 (used for above testing) – (ae3bd62) 06:56, 24 September 2025

**Other information** (browser name/version, screenshots, etc.):
    • Task
    Username (the username of your existing LDAP account on https://idm.wikimedia.org): OKryva-WMF
Shell access (whether you currently have shell access): yes
Purpose (specify which service you need to get access to, e.g. Icinga, Grafana, Superset etc): logstash
Approver: @IBerker-WMF
Contract end date: March 8th

As Engineering Manager in the Product Safety and Integrity team, I would like to request access to Logstash to enhance my ability to track and analyze production errors more effectively within our environment.
    • Task
    The instructions to launch the unit tests as described in the airflow-dags [README](https://gitlab.wikimedia.org/repos/data-engineering/airflow-dags/#tests) are:
```console
docker run \
  --platform linux/x86_64 \
  --memory="8g" \
  --cpus="4.0" \
  --rm -it \
  --volume .:/opt/airflow \
  --env REBUILD_FIXTURES=yes --env PYTHONPATH=".:wmf_airflow_common/plugins" \
  airflow_env_test \
  bash -c "pytest --capture=no -o cache_dir=/tmp/pytest_cache_dir"
```
While this works, we should probably make it easier to run all of these common operations (running linters, tests, rebuilding fixtures, etc.).
    • Task
    **Steps to replicate the issue** (include links if applicable):
* Press "Get started" on the campaign: https://isa.toolforge.org/campaigns/421
* Press "next" to get new images

**What happens?**: You will find repeated images, all from Algeria (especially the ones that participants already edited). New images may appear after many repeated ones, which takes time and effort and makes participants bored.

**What should have happened instead?**: A new image should be shown each time you press "next", especially since the campaign has 31203 images (a huge number, so there is plenty of variety).

**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):

**Other information** (browser name/version, screenshots, etc.):
    • Task
    TASK AUTO-GENERATED by Nagios/Icinga RAID event handler A degraded RAID (broadcom) [[ https://icinga.wikimedia.org/cgi-bin/icinga/extinfo.cgi?type=2&host=an-worker1208&service=Dell PowerEdge or Supermicro Broadcom RAID Controller | was detected ]] on host `an-worker1208`. An automatic snapshot of the current RAID status is attached below. Please **sync with the service owner** to find the appropriate time window before actually replacing any failed hardware. ``` communication: 0 OK : controller: 1 Needs Attention : physical_disk: 2 Failed : virtual_disk: 2 OfLn : bbu: 0 OK : enclosure: 0 OK : CLI Version = 007.1910.0000.0000 Oct 08, 2021 $ sudo /usr/local/lib/nagios/plugins/get-raid-status-broadcom Failed to execute '['/usr/lib/nagios/plugins/check_nrpe', '-4', '-H', 'an-worker1208', '-c', 'get_raid_status_broadcom']': RETCODE: 2 STDOUT: communication: 0 OK ; controller: 1 Needs Attention ; physical_disk: 2 Failed ; virtual_disk: 2 OfLn ; bbu: 0 OK ; enclosure: 0 OK ; CLI Version = 007.1910.0000.0000 Oct 08, 2021 Operating system = Linux 5.10.0-34-amd64 Controller = 0 Status = Success Description = Show Drive Group Succeeded TOPOLOGY : ======== ----------------------------------------------------------------------------- DG Arr Row EID:Slot DID Type State BT Size PDC PI SED DS3 FSpace TR ----------------------------------------------------------------------------- 0 - - - - RAID1 Optl N 446.625 GB enbl N N dflt N N 0 0 - - - RAID1 Optl N 446.625 GB enbl N N dflt N N 0 0 0 251:0 5 DRIVE Onln N 446.625 GB enbl N N dflt - N 0 0 1 251:1 7 DRIVE Onln N 446.625 GB enbl N N dflt - N 1 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 1 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 1 0 0 252:0 0 DRIVE Onln N 7.276 TB enbl N N dflt - N 2 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 2 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 2 0 0 252:1 6 DRIVE Onln N 7.276 TB enbl N N dflt - N 3 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 3 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 3 0 0 252:2 8 DRIVE Onln N 7.276 TB enbl N N dflt - N 4 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 4 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 4 0 0 252:3 9 DRIVE Onln N 7.276 TB enbl N N dflt - N 5 - - - - RAID0 OfLn N 7.276 TB enbl N N dflt N N 5 0 - - - RAID0 Dgrd N 7.276 TB enbl N N dflt N N 6 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 6 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 6 0 0 252:5 11 DRIVE Onln N 7.276 TB enbl N N dflt - N 7 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 7 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 7 0 0 252:6 12 DRIVE Onln N 7.276 TB enbl N N dflt - N 8 - - - - RAID0 OfLn N 7.276 TB enbl N N dflt N N 8 0 - - - RAID0 Dgrd N 7.276 TB enbl N N dflt N N 9 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 9 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 9 0 0 252:9 2 DRIVE Onln N 7.276 TB enbl N N dflt - N 10 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 10 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 10 0 0 252:10 3 DRIVE Onln N 7.276 TB enbl N N dflt - N 11 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 11 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 11 0 0 252:11 4 DRIVE Onln N 7.276 TB enbl N N dflt - N 12 - - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 12 0 - - - RAID0 Optl N 7.276 TB enbl N N dflt N N 12 0 0 252:7 13 DRIVE Onln N 7.276 TB enbl N N dflt - N ----------------------------------------------------------------------------- DG=Disk Group Index|Arr=Array Index|Row=Row Index|EID=Enclosure Device ID DID=Device ID|Type=Drive 
Type|Onln=Online|Rbld=Rebuild|Optl=Optimal|Dgrd=Degraded Pdgd=Partially degraded|Offln=Offline|BT=Background Task Active PDC=PD Cache|PI=Protection Info|SED=Self Encrypting Drive|Frgn=Foreign DS3=Dimmer Switch 3|dflt=Default|Msng=Missing|FSpace=Free Space Present TR=Transport Ready STDERR: None ```
    • Task
    **Steps to replicate the issue** (include links if applicable):
* Go to https://zh.wikipedia.beta.wmcloud.org/wiki/Special:Import
* Select "wikipedia" as import target

**What happens?**: {F70206107} Many languages are not included in the importable list, such as `en` and `zh`.

**What should have happened instead?**: It should include all languages available on the production site.
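    For reference, the "import target" list on Special:Import comes from MediaWiki's `$wgImportSources` setting; a hedged sketch of what an expanded configuration could look like (the subproject list below is illustrative, not the real beta or production value):
```lang=php
// Illustrative sketch only: Special:Import builds its interwiki source
// dropdown from $wgImportSources. Listing more language subprojects under
// the "wikipedia" prefix is what would make `en`, `zh`, etc. selectable.
$wgImportSources = [
	'wikipedia' => [ 'de', 'en', 'fr', 'ja', 'zh' /* etc. */ ],
	'meta',
];
```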
    • Task
    === Background
The GrowthBook integration will enable experiment owners to run their experiments in phases. Until then, we can provide a way in the Test Kitchen UI to clone an existing experiment.

=== Description
Add an option to the action menu in the Catalog/List view to clone an experiment, auto-appending a version number to the machine-readable name.

=== Acceptance Criteria
[] Action menu for experiments includes a `Clone experiment` option
[] Cloning automatically navigates to a create experiment form populated by the cloned experiment's configuration
[] The duplicated experiment appends a `-{$n+1}` to the name and slug (machine-readable name)
[] Documentation is updated
    • Task
    **Steps to replicate the issue** (include links if applicable): * Generate 10 block logs on a user. * Go to Special:Block and check the counter on the row "Block log" **What happens?**: It reports "10+" blocks. **What should have happened instead?**: It should say "10". Only when it exceeds 10 (i.e., 11 and onwards) can we say "10+". **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): zhwiki **Other information** (browser name/version, screenshots, etc.): All these screenshots are taken after doing [[ https://zh.wikipedia.org/w/index.php?title=Special:BlockList&wpTarget=%23641777 | block #641777 ]] on zhwiki. After this block, the user HMOXDSS1 has exactly 10 block logs. Special:Block page: [[ https://zh.wikipedia.org/wiki/Special:Block/HMOXDSS1 ]] (zhwiki sysops only). (1) Special:Block in English; (2) Special:Block in qqx. {F70204429} {F70204434}
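    One common way to get this right is to fetch one more row than the display limit and only show the "+" when that extra row exists; a hedged sketch of the idea (illustrative names and query, not the actual Special:Block code):
```lang=php
use Wikimedia\Rdbms\IReadableDatabase;

// Sketch of the "N+" counting pattern: query LIMIT + 1 rows and only append
// "+" when more than $limit log entries really exist.
function formatBlockLogCount( IReadableDatabase $dbr, string $targetTitle, int $limit = 10 ): string {
	$count = $dbr->newSelectQueryBuilder()
		->select( '1' )
		->from( 'logging' )
		->where( [
			'log_type' => 'block',
			'log_namespace' => NS_USER,
			'log_title' => $targetTitle,
		] )
		->limit( $limit + 1 )
		->caller( __METHOD__ )
		->fetchRowCount();

	// Exactly 10 entries are shown as "10"; only 11 or more become "10+".
	return $count > $limit ? "$limit+" : (string)$count;
}
```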
    • Task
    **Steps to replicate the issue** (include links if applicable): * Attempt to edit a protected or deleted page, such as [[https://en.wikipedia.org/wiki/World%20of%20Warcraft?action=edit&useskin=minerva|World of Warcraft]] **What happens?**: {F70203364} **What should have happened instead?**: That thank link should not be that large and buttony. **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.): This probably doesn't show up in useformat=mobile given protection notices aren't displayed like this in that context?
    • Task
    **Steps to replicate the issue** (include links if applicable):
* Attempt to edit a page on desktop width Minerva e.g. [[https://en.wikipedia.org/wiki/World%20of%20Warcraft?action=edit&useskin=minerva | World of Warcraft]]

**What happens?**: Edit box stops expanding. {F70203283}

**What should have happened instead?**: Edit box should always be the full width of its container.

**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):

**Other information** (browser name/version, screenshots, etc.): Due to
```lang=css
@media screen and (min-width: 993.3px) {
  .banner-container, ..., #mw-content-text > form {
    ...
    width: 90%;
    max-width: 993.3px;
  }
}
```
at [[https://gerrit.wikimedia.org/g/mediawiki/skins/MinervaNeue/+/041492a9cc0c88b1f565a448d32d701d3ecb2859/resources/skins.minerva.styles/content/tablet/common.less#108|MinervaNeue/resources/skins.minerva.styles/content/tablet/common.less]]. Stated rationale seems to be for some special pages, so perhaps a selector for `.ns--1` applied here would be reasonable.
    • Task
    The command `py pwb.py interwiki -start -confirm -neverlink:de` attempts to delete all [[de: links. I expect that the -neverlink key will instruct the bot not to check the relevance of the link to the specified language's pages, but to leave it as is, assuming that it is correct, and sort it into a single list if necessary. By "sort it into a single list", I mean that the bot will not separate it out into a separate list of "outdated links".
    • Task
    In team or team bots channels like `#wikimedia-fundraising`, `#mediawiki-core-bots` or `#wikimedia-editing` it is customary to help the team gain awareness of each other's work in Phab and Gerrit by subscribing to activity in components they maintain/own. This works for standalone Git repos for libraries and extensions, but not for MediaWiki core components. This means:

1. Teams that own many core components (like MediaWiki Engineering) lack visibility into each other's work unless they use other avenues (email watching in Phab or manually polling workboards daily) or monitor much noisier channels (like `#mediawiki-feed`).
2. Teams that own only one or two components in MediaWiki core tend to miss change requests unless there is out-of-band begging to individuals for code review. This creates a slight incentive against ownership, and adds friction in otherwise potentially successful team interactions and trust/confidence building.

I suggest adding the ability to configure a channel not just by listing repos, but to also allow dictionary keys for repos that contain a list of file paths. For example:
```lang=diff,name=gerrit-channels.yaml
 "#mediawiki-core-bots":
+  "mediawiki/core":
+    - includes/ResourceLoader
   "RelPath":
   "WrappedString":
   "mediawiki/libs/Minify":
   "mediawiki/libs/less.php":
   "mediawiki/libs/node-cssjanus":
   "mediawiki/libs/php-cssjanus":
   "mediawiki/tools/grunt-cssjanus":
```
See also:
* <https://www.mediawiki.org/wiki/Developers/Maintainers> which lists file paths for each component.
* {T364652}
    • Task
    >>! From <https://wikitech.wikimedia.org/wiki/Help:Toolforge/Redis>
> […] using redis-cli, which is installed on the bastions: `redis-cli -h redis.svc.tools.eqiad1.wikimedia.cloud`.

```counterexample
[01:05 UTC] krinkle at tools-bastion-15.tools.eqiad1.wikimedia.cloud in ~
$ redis-cli -h redis.svc.tools.eqiad1.wikimedia.cloud
-bash: redis-cli: command not found
```
Context: I was inspecting a tool to debug it. I switched to using netcat instead, which works well enough. It lacks autocomplete and makes discovery a bit harder if you're not very familiar with the Redis line protocol (vs e.g. the PHP/Python APIs, which don't require prior knowledge of the exact position and formatting of each arg).
    • Task
    - this might involve changing the name when generated or when uploaded per https://docs.google.com/document/d/1Yv-bw5PhqwGQgd2jDAoX-6n76pPf0LRMMzaU0fMjJzs/edit?tab=t.0#bookmark=id.u0epvotgo2wa
    • Task
    **Steps to replicate the issue** (include links if applicable): * Navigate to YIR * Navigate to any slide * Swipe left or right on the far left or right sides of the screen **What happens?**: The YIR slides disappear. **What should have happened instead?**: The previous or next slide appears **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): Tested on 2.7.50556-r-2025-11-13, on OnePlus 8 on Android 13, and Pixel 6 on Android 16. **Other information** (browser name/version, screenshots, etc.):
    • Task
    **Steps to replicate the issue** (include links if applicable): * Register for at least two events with collaborative contributions enabled * Make an edit on any page * When the dialog shows up, DO NOT select any event, and just submit **What happens?**: The dialog closes. No indication is given of either success or failure. In the console, we can see that the request wasn't successful, e.g. ``` PUT http://localhost:8080/w/rest.php/campaignevents/v0/event_registration/null/edits/my_wiki/2190 400 (Bad Request) ``` **What should have happened instead?**: I think it should not let me submit, and the dialog should remain open. I don't know whether there is a predefined error/required state for the selector that we can use here, or if we need a custom error message.
    • Task
    The copy that accompanies the "Tone Check" Edit Check and the "Revise Tone" suggested edit differ in intentional ways. This task involves the work of ensuring the "Revise Tone" suggested edit copy that Growth surfaces within the newcomer homepage is aligned with the copy of the "Revise Tone" edit suggestion the Editing Team will be surfacing in suggestion mode. //More context in [Slack](https://wikimedia.slack.com/archives/C010UHLBLBX/p1762341136873339).//
    • Task
    In T410069, [we discovered](https://phabricator.wikimedia.org/T410069#11372205) that we could //potentially// offer people tapping any {nav Edit} button on mobile the ability to edit the entirety of the article's content while also increasing loading speed by decoupling the rendering of the section related to the {nav Edit} button someone tapped from the rendering of the rest of the article's content. In this ticket, we'd like to build a proof of concept that would enable us to try out this "lazy load" experience for the purpose of comparing it to the user experience we are prototyping in T409990. WARNING: before prioritizing work on this task, Editing Engineering will need to decide whether the approach David Chan discovered in T410069 is reliable/scalable enough to depend on.
    • Task
    | CID | transaction | date | note |
| [[ https://civicrm.wikimedia.org/civicrm/contact/view?reset=1&cid=69005565 | 69005565 ]] | [[ https://wikimedia.gr4vy.app/merchants/default/transactions/4400b395-e9ad-497c-afc0-455b11a5bd35/overview | 4400b395-e9ad-497c-afc0-455b11a5bd35 ]] | Nov 11 | PII is visible at Gravy |
| [[ https://civicrm.wikimedia.org/civicrm/contact/view?reset=1&cid=69004555 | 69004555 ]] | [[ https://wikimedia.gr4vy.app/merchants/default/transactions/60eba7bf-0312-44a2-83fb-ff8657608993/overview | 60eba7bf-0312-44a2-83fb-ff8657608993 ]] | Nov 11 | PII is visible at Gravy |

Unclear if related to prior tasks like {T395375}
    • Task
    ## Background
The **Vector 2022** skin provides flexibility for customizing user interface elements such as menus and layout widths. Recent design discussions, specifically arising from T407516 and T409334, have prompted an evaluation of how *pinnable menus* behave with complex layouts like Grid under various customization options, and how widths adapt between **wide** and **standard** viewport modes. Maintaining these customizations while ensuring consistent user satisfaction across different configurations comes at a cost. This task aims to evaluate the current customization capabilities, identify limitations, and gather insights to inform potential improvements or configuration defaults.

---

## Goal
- Assess how Vector’s pinnable menus are increasing the maintenance burden under complex layouts.
- Outline how available content widths adjust or are constrained under both wide and standard viewport modes.
- Gather instrumentation data and review it for use here.
- Determine whether customization settings lead to consistent and predictable layouts.
- Provide recommendations or follow-up tasks if changes are needed.

---

## Open questions
- To be decided whether any recommendation should apply solely to ReadingLists or beyond.

## Acceptance Criteria for Done
[ ] A documented outline of pinnable menu and appearance menu behavior across customization states.
[ ] An evaluation report describing how menu widths differ between wide and standard viewports.
[ ] Screenshots or screencasts demonstrating observed behavior for both configurations.
[ ] Identified list of any issues, layout breakages, or inconsistent user experiences.
[ ] Clear recommendations or proposed next steps (e.g. enhancement tasks).
[ ] All findings posted or linked in the task description or associated Phabricator subtask.
    • Task
    New Android beta is out, so it's time to run another regression test! Production APK: https://releases.wikimedia.org/mobile/android/wikipedia/stable/wikipedia-2.7.50556-r-2025-11-13.apk Beta APK: https://releases.wikimedia.org/mobile/android/wikipedia/betas/wikipedia-2.7.50556-beta-2025-11-13.apk Thank you!
    • Task
    NOTE: Due to the time-sensitive nature of the Wikipedia 25 birthday celebrations, it is unreasonable to expect a full security review for this extension before a production deployment in January/February 2026. The developer @ATitkov will provide a review to the best of their abilities and @Jdrewniak will assume the security risk for deploying this to production. This task will encompass the initial security review done by the extension developer, as well as an eventual review by the security team. **Project Information ** * Name of tool/project: Extension:WP25EasterEggs * Project home page: TODO https://www.mediawiki.org/wiki/Extension:WP25EasterEggs * Name of team requesting review: Reader Experience Team * Primary contact: @ATitkov (developer) @cmadeo (project lead) * Target date for deployment: January 2026 * Link to code repository: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WP25EasterEggs/+/refs/heads/master * Link to scc output for general sizing of codebases (https://github.com/boyter/scc): TODO **Description of the tool/project: ** The Wikipedia 25 Easter eggs extension will provide readers of Wikipedia with a celebratory mascot who will accompany them on their Wikipedia journey. **Description of how the tool will be used at WMF:** Lighthearted interventions to celebrate 25 years of Wikipedia. **Dependencies** TODO **Has this project been reviewed before?** No **Working test environment** >Please link or describe setup process for setting up a test environment. TODO Link to documented installation instructions **Post-deployment** > Name of team responsible for tool/project after deployment and primary contact. Reader Experience Team == Acceptance criteria== - use query param `uselang=x-xss` - confirm no alert messages appear
    • Task
    Upgrading from MediaWiki 1.39.x to 1.43.x we noticed that some tabs (conditional tabs) were missing from the tab bar, and were instead shown as inline headers. The version of HeaderTabs went from 2.2.2 to 2.4. We originally use it within Page Forms and with the Variables extension, but a simpler reproduction using just ParserFunctions on a single page with the following markup reproduces it:
```
= tab 1 =
= tab 2 =
{{#if: test string | = tab 3 = | value if test string is empty (or only white space)}}
= tab 4 =
<headertabs />
```
See the attached image where tab 3 is a header instead of being added as a third tab. {F70189296}
    • Task
    **Feature summary** (what you would like to be able to do and where):
There are two main ways to edit MediaWiki pages: the wikitext editor and the visual editor. There are also two ways to view changes (between versions, or before publishing an edit): a wikitext-based diff, and a "visual diff", showing the differences on something that looks like the rendered page. When comparing versions (e.g. from the version history page), or when reviewing changes before publishing **from the visual editor**, it is possible to switch between the two "diff viewers". However, when reviewing changes before publishing **from the wikitext editor**, the diff is shown in the wikitext diff viewer, and there is no way to switch to the visual diff viewer.

**My feature request**: allow switching to the "visual" "show changes" view from the wikitext editor.

**Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution):
To reproduce:
* Open the version history of any Wikipedia page with at least two versions.
* Click the "prev" link in the beginning of any of the version entries in the list, to open the version comparison page.
* Note that there are buttons to switch between "visual" and "wikitext" diff views.
* When editing a page using the visual editor and clicking "show changes", these two buttons appear as well, letting the editor view the changes both ways.
* But when editing a page using the wikitext editor and clicking "show changes", the diff is displayed in wikitext mode, and there are no buttons to switch to visual.

(By the way, using the wikitext editor, the "show changes" button exists only in desktop view, but it's not even there in mobile view ‐ but that should probably wait for another feature request, if no such request exists yet)

**Benefits** (why should this be implemented?):
Different users feel more comfortable using different editors; and different changes, or different aspects of an edit, are easier to see using different viewers. It can be beneficial to have both options for viewing the changes accessible from both editors. Thanks!
    • Task
    ### Description
As we expand our OpenAPI style guide and expand linting rules, it is helpful to have a spec that we can easily use for testing rule compliance and socializing best practices. This spec is intended to have examples that cover every linting rule, and can be used as a reference spec. We may introduce automated testing to verify linting behavior, as the spec should always pass all rules. Additionally, the simplified spec will create an easier route to test or debug new rules than live specs. This spec will also be useful to demonstrate simple examples of expected patterns, which will help guide teams who wish to follow the standards, potentially including in error message references for "good" examples.

### Conditions of acceptance
[] Create a sample spec that can be used for testing and demonstrating desired patterns across teams.
[] Clearly indicate that it is a test spec. The content should not be something that could be mistaken for an actual Wikimedia service. There are a few suggestions to accomplish this, but open to other ideas:
  [] Follow the "pet shop" example that is frequently used to demonstrate OpenAPI tools.
  [] Name the objects after the rules we are defining for clear example mapping (this is half baked and might get tricky, but could be useful)
  [] Pick any other "fun" example that can demonstrate functionality without risking Wikimedia domain conflicts or confusion.
[] Determine where we should house the spec, so that it is discoverable and we can ensure that it remains up to date as new rules are introduced.
[] Work with Moriel to determine if/how we might use it for automated linter testing.

NOTE: Although the tech writer team is creating the first instance of this spec, we will have joint ownership with MWI for iterative changes as new rules are created.
    • Task
    * Hook handlers of `TitleIsAlwaysKnown` can be more expensive than checking `Title::exists()` (e.g. GlobalUserPage) ** Introduce a new hook run after `Title::exists()` in `Title::isKnown()` * 1 extra database hit per contributor linked, even on parser cache hit. ** Cache links to the user namespace in LinkCache when $wgMaxCredits is set? Example: https://performance.wikimedia.beta.wmcloud.org/excimer/profile/fb82ebe99935364d
    • Task
    I'm unable to keep operating my bot on translatewiki.net since approx. November 5 due to repeated "Nonce already used" errors. I'm using Pywikibot stable ([[ https://phabricator.wikimedia.org/rPWBC3a5006da82c4e7a2613e9f3ac44d2cefae70bd7c | 3a5006da82c4 ]]) on Toolforge. Example error while using `pwb login -verbose -debug` (although I get the same error with `pwb version`):
```
Python 3.13.5 (main, Jun 25 2025, 18:55:22) [GCC 14.2.0]
Found 3 i18n:i18n processes running, including this one.
WARNING: API error mwoauth-invalid-authorization: The authorization headers in your request are not valid: Nonce already used: 34771845544456520691763062872
headers= {'Date': 'Thu, 13 Nov 2025 19:41:13 GMT', 'Content-Type': 'application/json; charset=utf-8', 'Transfer-Encoding': 'chunked', 'Connection': 'keep-alive', 'Server': 'cloudflare', 'x-content-type-options': 'nosniff', 'mediawiki-api-error': 'mwoauth-invalid-authorization', 'content-security-policy': "default-src 'self'; script-src 'none'; object-src 'none'", 'x-frame-options': 'DENY', 'content-disposition': 'inline; filename=api-result.json', 'Cache-Control': 'private, must-revalidate, max-age=0', 'x-request-id': '0337e3d5b445ed2bcb13da7b', 'strict-transport-security': 'max-age=31536000; includeSubDomains', 'Content-Encoding': 'gzip', 'cf-cache-status': 'DYNAMIC', 'vary': 'accept-encoding', 'CF-RAY': '99e0beccbfc58c41-EWR'}
ERROR: Retrying failed OAuth authentication for i18n:i18n: The authorization headers in your request are not valid: Nonce already used: 34771845544456520691763062872
```
I've created a new OAuth consumer to get new credentials, but it keeps erroring. Not sure if this is a translatewiki.net issue only, or if this is caused by the OAuth extension or a bug in the Pywikibot code. Thanks in advance for your assistance.
    • Task
    I've cloned a host with `--ignore-existing` but it looks like the cookbook still tried to add it to Zarcillo and I got this error:
```
[07:42:00] marostegui@cumin1003:~$ sudo cookbook sre.mysql.clone -t T409374 --source db1241.eqiad.wmnet --target db1262.eqiad.wmnet --nopool --ignore-existing
<snipped>
[cookbooks.sre.mysql.clone.zarc] Adding db1262.eqiad.wmnet to Zarcillo
Transaction rolled back due to: (1062, "Duplicate entry 'db1262' for key 'PRIMARY'")
[Error] 1062: Duplicate entry 'db1262' for key 'PRIMARY'
==> The above warnings were raised during the last query, do you want to proceed anyway?
Type "go" to proceed or "abort" to interrupt the execution
> go
User input is: "go"
[cookbooks.sre.mysql.clone.catchup_repl_s] Catching up replication lag on db1241.eqiad.wmnet before removing icinga downtime
```
The cookbook worked fine after skipping the above issue.
    • Task
    Share code with the update page where possible, but this page will only be for reducing the amount. Per Slack conversation with Kevin and Ramon: https://wikimedia.enterprise.slack.com/archives/C09RXAEFP33 {F70186523}{F70186518}

If the gift is already at the minimum, show an error message that indicates the minimum gift level. (Something like: "The minimum gift amount for monthly recurring donations is $X. If you wish to give less than this, please choose an annual recurring gift instead.")
    • Task
    Prompted by an offline discussion with @Sucheta-Salgaonkar-WMF and @nayoub and an underlying need to make the content Wikipedia offers "...more readable and accessible, and thus easier to discover and learn from..." ([source](https://www.mediawiki.org/wiki/Reading/Web/Content_Discovery_Experiments/Simple_Article_Summaries)), this task involves the work of introducing a signal/suggestion that would make people editing in VE (and maybe one day, reading (T409589)) aware of the reading length and complexity of a given portion of text and invite them to consider making changes that would reduce both.

//Note: were we to implement this as a suggestion, I could imagine a similar kind of {nav Recheck} interaction to the one we implemented with Tone Check, which could enable someone to see, in real time, the extent to which the changes they've made are effective at increasing the readability of the content they were editing.//
    • Task
    ==== Background
We need to notify users when easter eggs / birthday mode becomes available on Minerva. We also know from usability testing that readers have no problem finding the appearance settings in Minerva once they know they exist.

==== User story
As a reader on Minerva, I want to know that easter eggs are available and how to turn them on.

==== Requirements
- Promotional notification shown as soon as possible to logged-in and logged-out readers on Minerva where the configuration is enabled
- Notification only appears once, and is not shown again after it has been dismissed, regardless of the logged in or out status of the reader.
- Notification uses the Codex Dialog component and follows its mobile behaviour.

==== Design
Figma link: https://www.figma.com/design/wr6JE07aP6jaXNHWW04qzg/Easter-eggs?node-id=5-110&t=BUaUXs3wCpTQf4IE-1
{F70185863 width=1000}
    • Task
    @Lars I've been creating additional batch data entry templates for our team: {F70185625} I would like to add a shortcut to access these in the top toolbar like our current template is: {F70185645} Could we add these to that shortcut? I couldn't remember how to do this. Also it would be great to have the batch entry names reflected in each so it's clear what each template type is. Thanks!
    • Task
    We have a warning in App Store Connect. We have to submit new age rating responses by January 31, 2026 in order to continue app updates. {F70185473}
    • Task
    ==== Background
Since {T409714}, it is possible to configure a group such that its members must meet a set of conditions, and users who are adding members must meet certain conditions. This currently works only for local groups. We need to make this work for global groups too. This should be easier thanks to work already done to share logic between the special pages (T406003) and services (T405575).

**How it works for local groups**
* `SpecialUserRights` asks `UserGroupAssignmentService` for changeable groups
* `UserGroupAssignmentService` asks `UserGroupManager` for addable/removable groups
* `UserGroupAssignmentService` asks `RestrictedUserGroupChecker` for restricted groups
* `RestrictedUserGroupChecker` calculates this, using `UserRequirementsConditionChecker`

**How this could work for global groups**
* `SpecialGlobalGroupMembership` asks `GlobalGroupAssignmentService` for changeable groups
* `GlobalGroupAssignmentService` calculates addable/removable/automatic groups
* **New functionality:** `GlobalGroupAssignmentService` could ask `RestrictedUserGroupChecker` for restricted groups
* If using the same config, we would need to disallow global and local groups having the same name
* Or we could use a different config, `$wgGlobalRestrictedGroups`, and have a local and global version of the `RestrictedUserGroupChecker`
* `RestrictedUserGroupChecker` would calculate this, using `UserRequirementsConditionChecker`

This means the user would need to be a local user rather than a `CentralAuthUser`, since `UserRequirementsConditionChecker` works with local users. This seems OK though, since (1) the use-cases we currently know about involve checking 2FA on the local wiki, and (2) where global conditions need to be met (e.g. global editcount), CentralAuth would add these via a hook and check them against the central user.

==== Acceptance criteria
It is possible to configure a restricted global group, similarly to how a restricted local group would be configured.
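    To make the "different config" option above concrete, here is a purely hypothetical sketch of what a separate global setting could look like; the variable name comes from the task, but the array shape and condition keys are invented for illustration and need not match the existing local restricted-groups configuration:
```lang=php
// Hypothetical illustration only. $wgGlobalRestrictedGroups is mentioned in
// the task as one option; the structure and condition names below are
// assumptions, not the format used by the existing local configuration.
$wgGlobalRestrictedGroups = [
	'global-deleter' => [
		// Conditions the target user must meet to be added to the group.
		'memberConditions' => [ 'has2FAEnabled' => true ],
		// Conditions the performing user must meet to add other members.
		'granterConditions' => [ 'has2FAEnabled' => true ],
	],
];
```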
    • Task
    Cassandra clients are configured with at least one contact point, which can be any node in the cluster. From a single contact, clients are able to discover the complete topology. We typically supply //every// node as a contact point, to hedge against cluster changes that go undocumented. For example: if you added just one, and you missed updating that before replacing the host, service restarts would fail unexpectedly. We maintain these contact lists in many places, across many disparate systems, so we do frequently miss updates, which makes the threat real. We also require up-to-date lists of Cassandra cluster nodes to maintain access lists (we have different manually updated lists in helm too, for example). And some form of service discovery might also allow us to create a custom Cassandra `SeedProvider`, and clean up some [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/puppet/+/884227d573cb3fe64ad6b4616887a07de8b4247a/modules/cassandra/templates/cassandra.yaml-4.x.erb#590 | historically brittle Puppet code ]].
    • Task
    ## Summary
In the CheckUser repo, separate out the translations related to temp accounts, UIC, SI, and potentially other user interfaces. Goal: make it easier for translators to focus on a specific group of the most needed translations and get it to 100%.

## Background
- Currently, with about 400 messages in the repo, messages for different UIs are mixed. It's challenging for translators to make sure that 100% of the most relevant and needed ones are done. For example, after the temp accounts deployment on eswiki, the onboarding dialog wasn't translated, and community members needed to check each message in the large repo to see which ones should be prioritized.
- As a side-effect, this will also provide the translators with a few separate rewards for reaching each milestone: 100% translations for UIC, 100% translations for SI, etc.

## User story
> Do:
> - Explain who the user is, what they want to achieve and why (e.g. As a Steward, I want to automatically see all temp account IPs, in order to effectively mitigate on-wiki abuse)

As a translator, I want to be able to focus on translating the UI for Special:Investigate, SI, UIC, temp accounts etc.

## Technical notes
> Do:
> - Provide detailed technical notes
> - Delete this section if it is not relevant to the task

## Acceptance criteria
> Identify how we will know the task is done. Make these checkboxes specific, so that QA can meaningfully review completion.

- [ ] Component code updated
- [ ] Relevant component code is tested
- [ ] Documentation updated
    • Task
    https://netbox.wikimedia.org/extras/scripts/results/268387/

test_power_port_termination_names
incorrectly named power port cable termination: PS1
incorrectly named power port cable termination: PS2
incorrectly named console port cable termination: Console

Are we using this standard on all the new Nokia switches? If so, the report should be updated; or should these be named differently?
    • Task
    After exporting files to Acoustic, write the number of rows in each to Prometheus (unless that can just be handled by a different log scraper). We want to notice any large increases in unsubscribe list size like the one that may have caused {T409958}.
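    One hedged sketch of how this could be done without a log scraper, assuming the exporter runs on a host with a node_exporter textfile collector; the metric name, file path and the source of `$rowCounts` are all assumptions:
```lang=php
// Illustrative sketch: after the export job finishes, write one gauge per
// exported file in Prometheus text exposition format for the node_exporter
// textfile collector to pick up. Names and the path are hypothetical.
function writeExportRowMetrics( array $rowCounts, string $promFile ): void {
	$lines = [
		'# HELP acoustic_export_rows Rows in the last Acoustic export, per file',
		'# TYPE acoustic_export_rows gauge',
	];
	foreach ( $rowCounts as $fileName => $rows ) {
		$lines[] = sprintf( 'acoustic_export_rows{file="%s"} %d', $fileName, $rows );
	}
	// Write to a temp file and rename, so the collector never reads a partial file.
	$tmpFile = $promFile . '.tmp';
	file_put_contents( $tmpFile, implode( "\n", $lines ) . "\n" );
	rename( $tmpFile, $promFile );
}

// Example usage (hypothetical file names and counts):
writeExportRowMetrics(
	[ 'unsubscribes.csv' => 120345, 'contacts.csv' => 98211 ],
	'/var/lib/prometheus/node.d/acoustic_export.prom'
);
```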
    • Task
    This could potentially avoid a long delay when switching from section editing to full-page editing
    • Task
    documentation lives here: https://wikitech.wikimedia.org/wiki/Catalyst/Scaling
    • Task
    The first is a PayPal donation from October 18th that was manually captured on November 6th. Manually settling PayPal donations at Gravy is a new workflow.
9629b124-2829-4139-8842-21a7ff1946eb
cid=24645612
ZD #176575

The second is a new recurring card donation from yesterday, November 12th. @KHancock99 noted that this happened around the same time as a slowdown mentioned by Dallas [[ https://wikimedia.slack.com/archives/CNT0JM91U/p1762992310595919?thread_ts=1762979113.868509&cid=CNT0JM91U | in Slack ]].
80892609-9caf-4a3c-8d5b-a2f1c62a8ba4
ZD #1773036
multiple email addresses

I've added these to the Transactions Log in the Gravy tracker, and can manually import to Civi if needed.
    • Task
    On a new install of Lingo in MW 1.43 with SMW 5.1 and SemanticGlossary 5 I see the following warning ``` smw/schema:Group:Schema properties ...PHP Deprecated: Use of MediaWiki\Parser\Parser::getOutput before initialization was deprecated in MediaWiki 1.42. [Called from Lingo\LingoParser::shouldParse in extensions/Lingo/src/LingoParser.php at line 391] in includes/debug/MWDebug.php on line 385 ```
    • Task
    ## Description
Often, it's possible to have information about an object stored in memcached. Some of this information is stored in `ZWrapper`s but is lost when we create a `ZWrapper` from things retrieved from memcached. This information could be used to rehydrate `ZWrapper`s with more information.

**Desired behavior/Acceptance criteria (returned value, expected error, performance expectations, etc.)**

We could store any/all of the following in memcached:
[ ] object is already validated;
[ ] object is already fully resolved;
[ ] (for Wikidata) certain statements are not yet populated

**Remove all the non-applicable tags from the "Tags" field, leave only the tags of the projects/repositories related to this task**

---

## Completion checklist
* [ ] Before closing this task, review one by one the checklist available here: https://www.mediawiki.org/wiki/Abstract_Wikipedia_team/Definition_of_Done#Back-end_Task/Bug_completion_checklist
    • Task
    ( ! ) Warning: Undefined array key "id" in /var/www/html/extensions/DonationInterface/special/DonorPortal.php on line 89
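    The usual shape of a fix for this kind of notice, sketched with hypothetical variable names since the actual code around DonorPortal.php line 89 isn't shown here:
```lang=php
// Hedged sketch only: guard the array access instead of assuming "id" exists.
// $donorData and the handling below are assumptions, not the real DonorPortal code.
$contactId = $donorData['id'] ?? null;
if ( $contactId === null ) {
	// Decide explicitly what to do when no id was supplied (log, redirect,
	// show an error) rather than letting PHP emit "Undefined array key".
}
```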
    • Task
    ####Acceptance Criteria: * Move Edit Count to be between Articles edited and Bytes changed
    • Task
    As a user, I want to **edit** time using time datatypes to Wikidata, and see them displayed on the main page.
    • Task
    ==== Error ==== * service.version: 1.46.0-wmf.2 * timestamp: 2025-11-13T15:46:33.715Z * labels.phpversion: `8.3.26` * trace.id: `6eafc815-5bc2-4b1b-bacc-822351a5f43a` * [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2025-11-12T15:46:33.715Z',to:'2025-11-13T16:17:28.510Z'))&_a=(query:(query_string:(query:'reqId:%226eafc815-5bc2-4b1b-bacc-822351a5f43a%22'))) | Find trace.id in Logstash ]] ```name=labels.normalized_message,lines=10 [{reqId}] {exception_url} Wikimedia\Assert\InvariantException: Invariant failed: Parameter 1 should be positional! ``` | Frame | Location | Call | -- | -- | -- | from | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/assert/src/Assert.php#231 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/assert/src/Assert.php(231) ]] | | #0 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/NodeData/TemplateInfo.php#127 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/NodeData/TemplateInfo.php(127) ]] | Wikimedia\Assert\Assert::invariant(bool, string) | #1 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/NodeData/TemplateInfo.php#177 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/NodeData/TemplateInfo.php(177) ]] | Wikimedia\Parsoid\NodeData\TemplateInfo::renumberParamInfos(array) | #2 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/json-codec/src/JsonStaticClassCodec.php#46 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/json-codec/src/JsonStaticClassCodec.php(46) ]] | Wikimedia\Parsoid\NodeData\TemplateInfo->toJsonArray() | #3 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/json-codec/src/JsonCodec.php#244 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/json-codec/src/JsonCodec.php(244) ]] | Wikimedia\JsonCodec\JsonStaticClassCodec->toJsonArray(Wikimedia\Parsoid\NodeData\TemplateInfo) | #4 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/json-codec/src/JsonCodec.php#274 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/json-codec/src/JsonCodec.php(274) ]] | Wikimedia\JsonCodec\JsonCodec->toJsonArray(Wikimedia\Parsoid\NodeData\TemplateInfo, string) | #5 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/json-codec/src/JsonCodec.php#274 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/json-codec/src/JsonCodec.php(274) ]] | Wikimedia\JsonCodec\JsonCodec->toJsonArray(array, string) | #6 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/json-codec/src/JsonCodec.php#274 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/json-codec/src/JsonCodec.php(274) ]] | Wikimedia\JsonCodec\JsonCodec->toJsonArray(array, string) | #7 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/Utils/DOMDataUtils.php#889 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/Utils/DOMDataUtils.php(889) ]] | Wikimedia\JsonCodec\JsonCodec->toJsonArray(array, string) | #8 | [[ 
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/Utils/DOMUtils.php#73 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/Utils/DOMUtils.php(73) ]] | Wikimedia\Parsoid\Utils\DOMDataUtils::storeDataAttribs(Wikimedia\Parsoid\DOM\Element, array) | #9 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/Utils/DOMUtils.php#77 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/Utils/DOMUtils.php(77) ]] | Wikimedia\Parsoid\Utils\DOMUtils::visitDOM(Wikimedia\Parsoid\DOM\Element, array, array) | #10 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/Utils/DOMUtils.php#77 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/Utils/DOMUtils.php(77) ]] | Wikimedia\Parsoid\Utils\DOMUtils::visitDOM(Wikimedia\Parsoid\DOM\Element, array, array) | #11 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/Utils/DOMUtils.php#77 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/Utils/DOMUtils.php(77) ]] | Wikimedia\Parsoid\Utils\DOMUtils::visitDOM(Wikimedia\Parsoid\DOM\Element, array, array) | #12 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/Utils/DOMDataUtils.php#799 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/Utils/DOMDataUtils.php(799) ]] | Wikimedia\Parsoid\Utils\DOMUtils::visitDOM(Wikimedia\Parsoid\DOM\Element, array, array) | #13 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/vendor/+blame/refs/heads/wmf/1.46.0-wmf.2/wikimedia/parsoid/src/Parsoid.php#270 | /srv/mediawiki/php-1.46.0-wmf.2/vendor/wikimedia/parsoid/src/Parsoid.php(270) ]] | Wikimedia\Parsoid\Utils\DOMDataUtils::visitAndStoreDataAttribs(Wikimedia\Parsoid\DOM\Element, array) | #14 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Parser/Parsoid/ParsoidParser.php#153 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Parser/Parsoid/ParsoidParser.php(153) ]] | Wikimedia\Parsoid\Parsoid->wikitext2html(MediaWiki\Parser\Parsoid\Config\PageConfig, array, null, MediaWiki\Parser\ParserOutput) | #15 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Parser/Parsoid/ParsoidParser.php#286 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Parser/Parsoid/ParsoidParser.php(286) ]] | MediaWiki\Parser\Parsoid\ParsoidParser->genParserOutput(MediaWiki\Parser\Parsoid\Config\PageConfig, MediaWiki\Parser\ParserOptions, null) | #16 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Content/WikitextContentHandler.php#375 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Content/WikitextContentHandler.php(375) ]] | MediaWiki\Parser\Parsoid\ParsoidParser->parse(string, MediaWiki\Title\Title, MediaWiki\Parser\ParserOptions, bool, bool, int, null) | #17 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Content/ContentHandler.php#1574 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Content/ContentHandler.php(1574) ]] | MediaWiki\Content\WikitextContentHandler->fillParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Content\Renderer\ContentParseParams, MediaWiki\Parser\ParserOutput) | #18 | [[ 
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Content/Renderer/ContentRenderer.php#67 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Content/Renderer/ContentRenderer.php(67) ]] | MediaWiki\Content\ContentHandler->getParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Content\Renderer\ContentParseParams) | #19 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Revision/RenderedRevision.php#246 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Revision/RenderedRevision.php(246) ]] | MediaWiki\Content\Renderer\ContentRenderer->getParserOutput(MediaWiki\Content\WikitextContent, MediaWiki\Page\PageIdentityValue, MediaWiki\Revision\RevisionStoreCacheRecord, MediaWiki\Parser\ParserOptions, array) | #20 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Revision/RenderedRevision.php#219 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Revision/RenderedRevision.php(219) ]] | MediaWiki\Revision\RenderedRevision->getSlotParserOutputUncached(MediaWiki\Content\WikitextContent, array) | #21 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Revision/RevisionRenderer.php#225 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Revision/RevisionRenderer.php(225) ]] | MediaWiki\Revision\RenderedRevision->getSlotParserOutput(string, array) | #22 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Revision/RevisionRenderer.php#158 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Revision/RevisionRenderer.php(158) ]] | MediaWiki\Revision\RevisionRenderer->combineSlotOutput(MediaWiki\Revision\RenderedRevision, MediaWiki\Parser\ParserOptions, array) | #23 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Revision/RenderedRevision.php#182 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Revision/RenderedRevision.php(182) ]] | MediaWiki\Revision\RevisionRenderer->MediaWiki\Revision\{closure}(MediaWiki\Revision\RenderedRevision, array) | #24 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Page/ParserOutputAccess.php#586 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Page/ParserOutputAccess.php(586) ]] | MediaWiki\Revision\RenderedRevision->getRevisionParserOutput() | #25 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Page/ParserOutputAccess.php#672 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Page/ParserOutputAccess.php(672) ]] | MediaWiki\Page\ParserOutputAccess->renderRevision(MediaWiki\Page\WikiPage, MediaWiki\Parser\ParserOptions, MediaWiki\Revision\RevisionStoreCacheRecord, array) | #26 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/PoolCounter/PoolCounterWorkViaCallback.php#68 | /srv/mediawiki/php-1.46.0-wmf.2/includes/PoolCounter/PoolCounterWorkViaCallback.php(68) ]] | MediaWiki\Page\ParserOutputAccess->MediaWiki\Page\{closure}() | #27 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/PoolCounter/PoolCounterWork.php#159 | /srv/mediawiki/php-1.46.0-wmf.2/includes/PoolCounter/PoolCounterWork.php(159) ]] | MediaWiki\PoolCounter\PoolCounterWorkViaCallback->doWork() | #28 | [[ 
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Page/ParserOutputAccess.php#489 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Page/ParserOutputAccess.php(489) ]] | MediaWiki\PoolCounter\PoolCounterWork->execute() | #29 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Page/Article.php#829 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Page/Article.php(829) ]] | MediaWiki\Page\ParserOutputAccess->getParserOutput(MediaWiki\Page\WikiPage, MediaWiki\Parser\ParserOptions, MediaWiki\Revision\RevisionStoreCacheRecord, array) | #30 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Page/Article.php#538 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Page/Article.php(538) ]] | MediaWiki\Page\Article->generateContentOutput(MediaWiki\User\User, MediaWiki\Parser\ParserOptions, int, MediaWiki\Output\OutputPage, array) | #31 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Actions/ViewAction.php#71 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Actions/ViewAction.php(71) ]] | MediaWiki\Page\Article->view() | #32 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Actions/ActionEntryPoint.php#734 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Actions/ActionEntryPoint.php(734) ]] | MediaWiki\Actions\ViewAction->show() | #33 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Actions/ActionEntryPoint.php#505 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Actions/ActionEntryPoint.php(505) ]] | MediaWiki\Actions\ActionEntryPoint->performAction(MediaWiki\Page\Article, MediaWiki\Title\Title) | #34 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Actions/ActionEntryPoint.php#143 | /srv/mediawiki/php-1.46.0-wmf.2/includes/Actions/ActionEntryPoint.php(143) ]] | MediaWiki\Actions\ActionEntryPoint->performRequest() | #35 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/MediaWikiEntryPoint.php#184 | /srv/mediawiki/php-1.46.0-wmf.2/includes/MediaWikiEntryPoint.php(184) ]] | MediaWiki\Actions\ActionEntryPoint->execute() | #36 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/index.php#44 | /srv/mediawiki/php-1.46.0-wmf.2/index.php(44) ]] | MediaWiki\MediaWikiEntryPoint->run() | #37 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+blame/refs/heads/master/w/index.php#3 | /srv/mediawiki/w/index.php(3) ]] | require(string) | #38 | {main} | ==== Notes ==== * Happening at a very low rate (7 in 90 days) * Seems like it's exclusively to happen on this single template * Reproducible though, something about the template makes parsoid unhappy
    • Task
    It is useful to be able to build a newer/different code release (i.e. `--ref`) for testing prior to replacing the current image. For some tools, having a dedicated staging tool is overkill (having to selectively duplicate secrets and manage multiple OAuth credentials is overhead for maintainers). Today the only option is to build a different 'component', which is very Toolforge-specific; the common pattern elsewhere is to use tags (dev, staging, specific releases, etc.). builds-api constructs the target and hard-codes `latest`, which should be reasonably easy to change. After the recent changes to jobs-api it would likely just work; components-api still has one change outstanding to remove the hard-coding. Proposal: - Add `tagName`, similar to `imageName`, to builds-api - Add `-t/--tag` with a default of `latest` to builds-cli - Break jobs-api again
    • Task
    #mediawiki-extensions-newsletter does not require the Echo extension, but tests are failing when Echo is not loaded: ```counterexample There were 4 errors: 1) NewsletterAPIEditTest::testUpdateDescription Error: Class "MediaWiki\Extension\Notifications\Model\Event" not found 2) NewsletterAPIEditTest::testUpdateMainPage Error: Class "MediaWiki\Extension\Notifications\Model\Event" not found 3) NewsletterAPIEditTest::testAddPublisher Error: Class "MediaWiki\Extension\Notifications\Model\Event" not found 4) NewsletterAPIEditTest::testRemovePublisher Error: Class "MediaWiki\Extension\Notifications\Model\Event" not found ``` Those tests should use `$this->markTestSkippedIfExtensionNotLoaded( 'Echo' )` or alternatively Newsletter can be marked to require Echo in `extension.json`.
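    A minimal sketch of the first option, assuming the guard lives in the test class's `setUp()` so all four failing methods are skipped in one place (the base class shown here is illustrative; the real test class may extend a different API test base):
```lang=php
// Sketch only: skip the whole test class when the Echo extension is absent.
class NewsletterAPIEditTest extends MediaWikiIntegrationTestCase {
	protected function setUp(): void {
		parent::setUp();
		// markTestSkippedIfExtensionNotLoaded() is provided by
		// MediaWikiTestCaseTrait and skips the current test when the named
		// extension is not loaded.
		$this->markTestSkippedIfExtensionNotLoaded( 'Echo' );
	}
}
```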
    • Task
    #EventBus does not require EventStreamConfig in `extension.json`. The PHPUnit tests fail when it is not loaded, though: ``` 16:50:34 There were 4 errors: 16:50:34 16:50:34 1) EventBusFactoryTest::testGetEventServiceName with data set "disabled service" ('disabled_stream', '_disabled_eventbus_') 16:50:34 Error: Class "MediaWiki\Extension\EventStreamConfig\StreamConfigs" not found 16:50:34 16:50:34 /workspace/src/extensions/EventBus/tests/phpunit/unit/EventBusFactoryTest.php:149 16:50:34 16:50:34 2) EventBusFactoryTest::testGetEventServiceName with data set "stream without destination event service" ('stream_without_destination_ev...ervice', 'intake-main') 16:50:34 Error: Class "MediaWiki\Extension\EventStreamConfig\StreamConfigs" not found 16:50:34 16:50:34 /workspace/src/extensions/EventBus/tests/phpunit/unit/EventBusFactoryTest.php:149 16:50:34 16:50:34 3) EventBusFactoryTest::testGetEventServiceName with data set "stream with explicit event service" ('other_stream', 'intake-other') 16:50:34 Error: Class "MediaWiki\Extension\EventStreamConfig\StreamConfigs" not found 16:50:34 16:50:34 /workspace/src/extensions/EventBus/tests/phpunit/unit/EventBusFactoryTest.php:149 16:50:34 16:50:34 4) EventBusFactoryTest::testGetEventServiceName with data set "stream with T321557 BC setting" ('stream_with_destination_event...etting', 'intake-main') 16:50:34 Error: Class "MediaWiki\Extension\EventStreamConfig\StreamConfigs" not found 16:50:34 16:50:34 /workspace/src/extensions/EventBus/tests/phpunit/unit/EventBusFactoryTest.php:149 ``` Those tests should use `$this->markTestSkippedIfExtensionNotLoaded( 'EventStreamConfig' )` or alternatively EventBus can be marked to require EventStreamConfig in `extension.json`.
    • Task
    When using `--follow` an artificial log entry is emitted every ~15 seconds: ``` tools.cluebot3@tools-bastion-15:~$ date; toolforge jobs logs -f cluebot3 Thu Nov 13 15:28:44 UTC 2025 2025-11-13T15:29:00.278526Z [nopod] [nocontainer] No logs received yet for job 'cluebot3', maybe the tool is using filelog or the job name is not correct? Will continue waiting just in case 2025-11-13T15:29:15.283920Z [nopod] [nocontainer] No logs received yet for job 'cluebot3', maybe the tool is using filelog or the job name is not correct? Will continue waiting just in case 2025-11-13T15:29:30.285722Z [nopod] [nocontainer] No logs received yet for job 'cluebot3', maybe the tool is using filelog or the job name is not correct? Will continue waiting just in case ``` This makes the contents inconsistent with the 'fetch' mode: ``` tools.cluebot3@tools-bastion-15:~$ toolforge jobs logs cluebot3 ERROR: Job 'cluebot3' does not have any logs available ``` I am currently using the `get_raw_lines` method to fetch all logs, adding them to a list, until a pre-defined end marker is seen (working around previous issues with logs being dropped). This requires a lot of calls to the logging endpoint and would be better served by the streaming endpoint; however, the streaming endpoint pollutes the job output, which is persisted for the run (e.g. https://cluebotng-trainer.toolforge.org/Original%20Testing%20Training%20Set%20-%20Old%20Triplet/2025-08-30%2023:13:04/logs/bayes-train.log). These can be filtered out because the `pod` and `container` fields are known strings; however, those are really 'internal' identifiers. Having an additional field to identify that the content is a 'response message' (to use the same name as is used elsewhere) rather than a log entry would likely be better.
    • Task
    Continue the work Ehi started and get it to the finish line __________________________________ [] Update values access file [] Update WAF config [] deploy to dev **Acceptance Criteria** * Endpoint available in DEV
    • Task
    #articleplaceholder does not require Wikibase in `extension.json`. The tests fail, though, when it is not loaded: ```counterexample There were 22 errors: 1) ArticlePlaceholder\Tests\SearchHookHandlerTest::testNewFromGlobalState Error: Class "Wikibase\Client\WikibaseClient" not found /workspace/src/extensions/ArticlePlaceholder/includes/SearchHookHandler.php:54 /workspace/src/vendor/wikimedia/testing-access-wrapper/src/TestingAccessWrapper.php:114 /workspace/src/extensions/ArticlePlaceholder/tests/phpunit/includes/SearchHookHandlerTest.php:134 === Logs generated by test case [objectcache] [debug] MainWANObjectCache using store {class} {"class":"Wikimedia\\ObjectCache\\HashBagOStuff"} [localisation] [debug] LocalisationCache using store LCStoreNull [] === 2) ArticlePlaceholder\Tests\SearchHookHandlerTest::testAddToSearch with data set #0 ('get term, check if entity wit...turned', 'Unicorn', '>Q7246 en label</a>: Q7246 en...</div>') Error: Class "Wikibase\Lib\Interactors\MatchingTermsLookupSearchInteractor" not found /workspace/src/extensions/ArticlePlaceholder/tests/phpunit/includes/SearchHookHandlerTest.php:90 /workspace/src/extensions/ArticlePlaceholder/tests/phpunit/includes/SearchHookHandlerTest.php:119 /workspace/src/extensions/ArticlePlaceholder/tests/phpunit/includes/SearchHookHandlerTest.php:163 === Logs generated by test case [objectcache] [debug] MainWANObjectCache using store {class} {"class":"Wikimedia\\ObjectCache\\HashBagOStuff"} [localisation] [debug] LocalisationCache using store LCStoreNull [] === ... ```
    • Task
    #mediawiki-extensions-babel does not require CommunityConfiguration in `extension.json`. The PHPUnit tests fail, though: ``` 16:16:37 1) Babel\Tests\Unit\ConfigWrapperTest::testReturnsArray 16:16:37 PHPUnit\Framework\MockObject\UnknownTypeException: Class or interface "MediaWiki\Extension\CommunityConfiguration\Access\MediaWikiConfigRouter" does not exist 16:16:37 16:16:37 /workspace/src/tests/phpunit/MediaWikiTestCaseTrait.php:63 16:16:37 /workspace/src/extensions/Babel/tests/phpunit/unit/ConfigWrapperTest.php:15 16:16:37 ``` ``` 16:16:37 2) Babel\Tests\Unit\ConfigWrapperTest::testRelaysHas 16:16:37 PHPUnit\Framework\MockObject\UnknownTypeException: Class or interface "MediaWiki\Extension\CommunityConfiguration\Access\MediaWikiConfigRouter" does not exist 16:16:37 16:16:37 /workspace/src/tests/phpunit/MediaWikiTestCaseTrait.php:63 16:16:37 /workspace/src/extensions/Babel/tests/phpunit/unit/ConfigWrapperTest.php:29 16:16:37 ``` Those tests should be skipped when CommunityConfiguration is not present, using `markTestSkippedIfExtensionNotLoaded( 'CommunityConfiguration' )`. Note `tests/phpunit/integration/maintenance/MigrateConfigToCommunityTest.php` extends `MediaWiki\Extension\CommunityConfiguration\Tests\SchemaProviderTestCase`, so I don't think we can make it skippable. See also {T410117}
    • Task
    As a WME engineer, I want the Kafka health checks to actually validate that the topic exists, without causing unnecessary container restarts. The health check will be used as a readiness / startup probe (to gate rollout and load-balancer registration), and not as a liveness probe (to avoid Kubernetes repeatedly killing containers when Kafka has transient/network issues). @REsquito-WMF will sync with @RThomas-WMF on the readiness-only usage for Kafka health checks.
    • Task
    Notable changes: Change how file checksum is calculated when wildcards and include/exclude patterns are involved to better align with how they are calculated in the non-wildcard path. #6238 LLB Copy operation now allows specifying required paths to be included in the copy. #6229 Fixed race condition between cache and snapshot for the Git source. #6281 Fixed race condition in HTTP cache key digest computation that could cause duplicate requests and digest mismatch errors. #6292 Runc container runtime has been updated to v1.3.3. #6331 Source metadata requests via ResolveSourceMeta, previously available for image sources, can now be performed for Git sources. This can be used to resolve Git commit and tag checksums and also to access the raw commit and tag objects for further verification. #6283 Source metadata requests via ResolveSourceMeta, previously available for image sources, can now be performed for HTTP sources. This can be used to access artifact checksums, last-modified time etc. #6285 Git sources can now perform verification of GPG or SSH signatures on commits and tags. Enable git signature checks via source policy. #6300 #6344 contentutil package now supports moving referrer objects when using CopyChain function. #6336 Fix fetch by commit for git source when tags change or branch names are updated. #6259 Fix http connection leak when resolving metadata from http source on non-2xx HTTP status codes. #6313 A new type of source policies has been added that supports making policy decisions on the client side via session tunnel. #6276 Add buildkit capability for detecting if source policy decisions can be made via session tunnel. #6345 Avoid intermediate type wrappers for custom fields in provenance. #6275 Add raw commit/tag object access when resolving git source metadata. #6298 Move image source resolver away from the ResolveImageConfig type to ResolveSourceMetadata. #6330 # probably not needed for changelog Fix inline cache used with multiple exporters. #6263 Fix handling multiple inline cache exporters configured for single build. #6272 Fix handling of annotated Git tags. The pin of the annotated tag should be the SHA of the tag and not the commit it is pointing to. #6251 Fix source policy attributes validation when multiple rules use the same identifier. #6342 Deployment: [] gitlab-cloud-runners staging [] gitlab-cloud-runners production [] WMCS and Trusted runners
    • Task
    ``` tools.cluebot3@tools-bastion-15:~$ toolforge jobs delete cluebot3 tools.cluebot3@tools-bastion-15:~$ tools.cluebot3@tools-bastion-15:~$ toolforge jobs delete cluebot3 ERROR: Job 'cluebot3' does not exist ``` The command provides no meaningful feedback to the user, even though it takes a long time to return and appears to be hung (waiting for the deployment to be deleted). The API returns no messages and the CLI only logs output at debug level. Adding feedback for the human would make the result more intuitive.
    • Task
    @ssingh found that he could not create a new VM on the public vlan in ulsfo today, as there are no free IPs on the [[ https://netbox.wikimedia.org/ipam/prefixes/13/ip-addresses/ | allocated subnet ]]. **Widen range** Luckily we had planned to increase the size of this subnet to a /27 as part of the upcoming ulsfo network refresh (see T408892#11330727). I wasn't aware of the lack of free IPs, but the plan was to bring the subdivision of the public /24 there into line with what we have at other POPs, where we have a public /27 for each rack. So it is no problem for us to make the subnet 198.35.26.0/27. I can make this change in Netbox and also on the routers. This change will not disrupt any existing host traffic. **Subnet mask on existing hosts** The trickier problem is that when we assign a host to the IP 198.35.26.15/27, any of the existing hosts - for instance //dns4002// on 198.35.26.8/28 - will be unable to communicate with it. Once the router change is done, therefore, we need to somehow adjust the netmask on all the existing hosts on the vlan. Probably the simplest way to do this is for us to go through them one-by-one, change the netmask in ///etc/network/interfaces//, and reboot the host. **Existing hosts** Servers: ``` dns4003.wikimedia.org dns4004.wikimedia.org lvs4008.ulsfo.wmnet lvs4009.ulsfo.wmnet lvs4010.ulsfo.wmnet ``` VMs: ``` bast4005.wikimedia.org doh4001.wikimedia.org doh4002.wikimedia.org hcaptcha-proxy4001.wikimedia.org install4003.wikimedia.org ``` Once all existing hosts have had this done, we can safely add new hosts to the vlan, which will start using the free IPs in the upper half of the extended range.
    • Task
    ``` tools.cluebot3@tools-bastion-15:~$ kubectl get pods NAME READY STATUS RESTARTS AGE cluebot3-766f9864d8-zqwhl 1/1 Running 0 25m tools.cluebot3@tools-bastion-15:~$ toolforge jobs restart cluebot3 tools.cluebot3@tools-bastion-15:~$ kubectl get pods NAME READY STATUS RESTARTS AGE cluebot3-766f9864d8-zqwhl 1/1 Terminating 0 25m ``` It can be inferred from the show command (status change, then 'started at' change), but it's not intuitive ``` tools.cluebot3@tools-bastion-15:~$ toolforge jobs show cluebot3 +---------------+-----------------------------------------------------------------+ | Job name: | cluebot3 | +---------------+-----------------------------------------------------------------+ | Command: | run-bot | +---------------+-----------------------------------------------------------------+ | Job type: | continuous | +---------------+-----------------------------------------------------------------+ | Image: | tool-cluebot3/cluebot3:latest (unknown) | +---------------+-----------------------------------------------------------------+ | Port: | none | +---------------+-----------------------------------------------------------------+ | File log: | no | +---------------+-----------------------------------------------------------------+ | Output log: | | +---------------+-----------------------------------------------------------------+ | Error log: | | +---------------+-----------------------------------------------------------------+ | Emails: | none | +---------------+-----------------------------------------------------------------+ | Resources: | mem: 1.0Gi, cpu: 3.0 | +---------------+-----------------------------------------------------------------+ | Replicas: | 1 | +---------------+-----------------------------------------------------------------+ | Mounts: | none | +---------------+-----------------------------------------------------------------+ | Retry: | no | +---------------+-----------------------------------------------------------------+ | Timeout: | no | +---------------+-----------------------------------------------------------------+ | Health check: | script: health-check | +---------------+-----------------------------------------------------------------+ | Status: | Running | +---------------+-----------------------------------------------------------------+ | Hints: | Last run at 2025-11-13T15:20:22Z. Pod in 'Running' phase. State | | | 'running'. Started at '2025-11-13T15:20:24Z'. 
| +---------------+-----------------------------------------------------------------+ tools.cluebot3@tools-bastion-15:~$ toolforge jobs restart cluebot3 tools.cluebot3@tools-bastion-15:~$ toolforge jobs show cluebot3 +---------------+-----------------------------------------------------------------+ | Job name: | cluebot3 | +---------------+-----------------------------------------------------------------+ | Command: | run-bot | +---------------+-----------------------------------------------------------------+ | Job type: | continuous | +---------------+-----------------------------------------------------------------+ | Image: | tool-cluebot3/cluebot3:latest (unknown) | +---------------+-----------------------------------------------------------------+ | Port: | none | +---------------+-----------------------------------------------------------------+ | File log: | no | +---------------+-----------------------------------------------------------------+ | Output log: | | +---------------+-----------------------------------------------------------------+ | Error log: | | +---------------+-----------------------------------------------------------------+ | Emails: | none | +---------------+-----------------------------------------------------------------+ | Resources: | mem: 1.0Gi, cpu: 3.0 | +---------------+-----------------------------------------------------------------+ | Replicas: | 1 | +---------------+-----------------------------------------------------------------+ | Mounts: | none | +---------------+-----------------------------------------------------------------+ | Retry: | no | +---------------+-----------------------------------------------------------------+ | Timeout: | no | +---------------+-----------------------------------------------------------------+ | Health check: | script: health-check | +---------------+-----------------------------------------------------------------+ | Status: | Not running | +---------------+-----------------------------------------------------------------+ | Hints: | Last run at 2025-11-13T15:20:22Z. Pod in 'Running' phase. State | | | 'running'. Started at '2025-11-13T15:20:24Z'. | +---------------+-----------------------------------------------------------------+ ``` API endpoint returns no messages, CLI has logging but only at debug level. The command should tell the user something happened to avoid confusion. Perhaps it should also wait for for the pod to start then report 'restarted' or 'failed to restart', but that would be a behaviour change from today.
    • Task
    #MinervaNeue does not require MobileFrontend in its `extension.json`. Running the QUnit tests without MobileFrontend leads to failure: ``` counterexample 16:15:21 mediawiki.base/track 16:15:21 ✔ track 16:15:21 ✔ trackSubscribe 16:15:21 ✔ trackUnsubscribe 16:15:21 ERROR: 'trackError test: unexpected non-string data', Object{exception: TypeError: mobile.getOverlayManager is not a function ... 16:15:21 , module: 'test.MinervaNeue', source: 'module-execute'} 16:15:21 ✖ trackError ``` The test is in `tests/qunit/skins.minerva.scripts/page-issues/index.test.js`: ``` lang=javascript QUnit.module( 'Minerva pageIssues', () => { const mobile = require( 'mobile.startup' ); ... const overlayManager = mobile.getOverlayManager(); ``` My guess is `require` returns nothing/a stub, in which case the test module should be skipped?
    • Task
    While researching the parent ticket, I discovered [[ https://wikitech.wikimedia.org/wiki/Wikidata_Query_Service/Runbook#Logstash_Method | our docs linking to the Wikidata Query Service logstash dashboard ]] were pointing to a dead link. I did manage to dredge up [[ https://logstash.wikimedia.org/goto/06005ad39e06db4d82d959b1d99bebcb | the old Logstash dashboard ]], but several of the filters don't work. Creating this ticket to: [] Fix or remove the filters in error state [] Add more filters based on the available data. I'm not sure the data is available yet, but what I'd like to see is†: - A list of queries that are causing exceptions (found in `/var/log/wdqs/wdqs-blazegraph.log` on the wdqs hosts) - A list of queries > 60s (found in [[ https://github.com/blazegraph/database/wiki/Quick_Start | the Blazegraph workbench ]]) † As previously discussed in [[ https://docs.google.com/document/d/1194KzUUDLKMEigSKSs6VG26GLK0uzfDnzgqDet7U_80/edit?tab=t.0 | this Google doc ]]
    • Task
    This is an umbrella task for reviewing and identifying various alerts related to MediaWiki and other #serviceops areas within [[https://gerrit.wikimedia.org/r/q/project:operations/alerts | operations/alerts]], and determining what needs to be updated. Additionally, we can gather alert statistics from [[https://logstash.wikimedia.org/app/dashboards#/view/8b1907c0-2062-11ec-85b7-9d1831ce7631?_g=h@3f6e3a2&_a=h@7e7a0e9 | logstash alerts]], which could help inform adjustments to alerting thresholds. **Potential areas** * Grafana ** Deprecated dashboards ** "Noisy" dashboards (i.e., too many panels, little information) ** Introduction of more comprehensive dashboards for oncallers * Documentation ** Runbooks ** Troubleshooting/cheatsheet updates * Incident Response ** Review of past incidents to assess: ** What alerts fired and what didn't ** What information is missing that would help an oncaller identify the area of the issue
    • Task
    ``` cd 'wikimedia-tools/feverfew'; git add .; if ! git diff --cached --quiet; then git commit -m 'Localisation updates from https://translatewiki.net.'; git rebase 'origin/main' && git push --force origin HEAD:'i18n'; fi [main ab15d0e] Localisation updates from https://translatewiki.net. 1 file changed, 2 insertions(+), 2 deletions(-) Current branch main is up to date. To github.com:plantaest/feverfew.git + ad4fd07...ab15d0e HEAD -> i18n (forced update) wikimedia-tools/feverfew: Pull request has been open for 2 m and 5 d. ``` * https://translatewiki.net/wiki/Translating:Feverfew * https://translatewiki.net/wiki/Project_activity_requirements
    • Task
    **User story:** As a Wikipedia editor, I want to know that a new feature, the Dashboard, is available for me to use, so that I can try it out and learn if it will be useful to me. # Designs **Blue dot notice** | Minerva | Vector | Vector 2022 | | {F70175136} | {F70175138} | {F70175137} | **After clicking** | Minerva | Vector | Vector 2022 | | {F70175152} | {F70175154} | {F70175153} # Acceptance criteria * Users meeting a given edit count requirement (100) see a pulsing blue dot notification next to the Dashboard link location. * Selecting the link the blue dot is attached to opens the popover (and on Vector 2022, opens the personal tools menu as normal). * Clicking Got It closes the popover. * Users see the blue dot until they click Got It on the popover. * After clicking Got It, users do not see the blue dot or popover again. * It is possible to deploy Extension:PersonalDashboard without automatically displaying this notice to users.
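    A minimal sketch of the last two acceptance criteria (the config name and the helper are hypothetical, not the extension's actual API): gate the notice behind both a configuration switch and the edit-count threshold. Persisting the "Got It" dismissal is a separate concern and is not covered here.
```lang=php
use MediaWiki\MediaWikiServices;
use MediaWiki\User\User;

// Sketch only: decide whether the pulsing-dot notice should be shown.
// 'PersonalDashboardShowNewFeatureNotice' is an assumed config name; the
// 100-edit threshold comes from the acceptance criteria above.
function shouldShowDashboardNotice( User $user ): bool {
	$config = MediaWikiServices::getInstance()->getMainConfig();
	if ( !$config->get( 'PersonalDashboardShowNewFeatureNotice' ) ) {
		// Deploying the extension alone must not surface the notice.
		return false;
	}
	return $user->getEditCount() >= 100;
}
```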
    • Task
    ``` cd 'cita'; git add .; if ! git diff --cached --quiet; then git commit -m 'Localisation updates from https://translatewiki.net.'; git rebase 'origin/master' && git push --force origin HEAD:'i18n'; fi [master f8cd43b] Localisation updates from https://translatewiki.net. 6 files changed, 342 insertions(+), 105 deletions(-) create mode 100644 static/chrome/locale/cbk-zam/wikicite.properties Current branch master is up to date. To github.com:diegodlh/zotero-cita.git + eed015d...f8cd43b HEAD -> i18n (forced update) cita: Pull request has been open for 2 m and 2 d. ``` * https://translatewiki.net/wiki/Translating:Cita * https://translatewiki.net/wiki/Project_activity_requirements
    • Task
    Example: * View https://en.wikipedia.org/wiki/List_of_chained-brand_hotels * Sort on "Rooms" * The same number now appears multiple times, which is not perfect
    • Task
    Feedback from the doc reviews done by @JWuyts-WMF and @KMontalva-WMF, as well as reviews done by Benoit from Kiwix and @awight, indicates that users would prefer concrete examples in response codeboxes on Enterprise doc pages, instead of value type indicators. Providing real examples in response codeboxes seems to improve the speed of comprehension of the code response, and gives devs example values to work with. If we use the same sample values throughout, readers can also follow a single consistent example across endpoints. **Acceptance criteria** # API Response codeboxes across enterprise.wikimedia.org/docs contain real example values # example values are the same throughout all API codeboxes # response schemas are checked for accuracy, e.g. in the [[ https://enterprise.wikimedia.com/docs/snapshot/#snapshot-chunk-info | Chunk info response ]] there should not be a `chunks` array **ToDo** - [ ] research if and how to provide real examples in openapi spec files (https://learn.openapis.org/specification/docs.html) - [ ] check validity of codeboxes, refer to eng if needed - [ ] generate example API response to base example values off of - [ ] replace API responses in openapi spec file with actual example values ===== Test Strategy ===== deploy on dev site first, then on prod
    • Task
    **Steps to replicate the issue**: * Install TimedNotify: https://www.mediawiki.org/wiki/Extension:TimedNotify * Use TimedNotify to regularly notify people of certain issues, e.g. some page needs attention. We send to multiple users per page. * Run your jobs at 120 jobs every 10 seconds. **What happens?**: The number of `EchoNotificationDeleteJob` jobs keeps increasing by about 50 every 2 seconds. Without regularly clearing these jobs we have ended up with more than 300,000 jobs of this type, which is quite a lot for a wiki with 8000 content pages and 6000 users. **What should have happened instead?**: There must be a better way to do what EchoNotificationDeleteJob does, without overflowing the job queue. We have considered increasing our job throughput, but it seems to us that the task EchoNotificationDeleteJob has is not important enough to justify requiring that many resources. In particular, it should not be necessary to have multiple of these jobs for the same user (and having 300,000 jobs with only 6000 users means an average of 50 jobs per user). **Software version**: MediaWiki 1.43.1 PHP 8.1.33 (apache2handler) ICU 72.1 MariaDB 10.11.13-MariaDB-log
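    If the delete jobs are idempotent per user, one possible direction (a sketch only, not a statement about Echo's actual implementation) is to opt the job into MediaWiki's built-in job de-duplication so the queue keeps at most one pending delete job per user:
```lang=php
// Hypothetical sketch: class name, job name and parameter keys are
// illustrative, not Echo's actual code.
class NotificationPruneJob extends Job {
	public function __construct( array $params ) {
		parent::__construct( 'notificationPruneSketch', $params );
		// Tell the JobQueue that identical pending jobs may be collapsed,
		// so at most one prune job per user sits in the queue.
		$this->removeDuplicates = true;
	}

	public function run() {
		// ... prune old notifications for $this->params['userId'] here ...
		return true;
	}
}
```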
    • Task
    The language-data library uses the https://github.com/mustangostang/spyc library to generate the [[ https://github.com/wikimedia/language-data/blob/master/data/language-data.json | language-data.json ]] file from the [[ https://github.com/wikimedia/language-data/blob/master/data/langdb.yaml | YAML file ]] using the script here: https://github.com/wikimedia/language-data/blob/master/src/util/ulsdata2json.php This script is run when languages are added, and as such spyc can be marked as a dev dependency in `composer.json`. Also update the related documentation.
    • Task
    After the T376892 expansion we thought we had at least 2 years of media backup storage, but recently, in a very short time (weeks), 50 TB were added to our upload storage, leading to a non-linear growth in disk utilization: in the last year, storage has grown from 650 TB to 850 TB, getting uncomfortably close to the maximum available space of 960 TB, and 5 out of 6 servers are now at 90% disk usage. {F70174852} While we could do some magic (writing to only 1 server and using spare disk resources from other hosts with mixed use), it would be nice to do any refactoring at the same time as another expansion to avoid working twice: once for moving data around, and again for a refresh. Doing large expansions rather than small ones is less work- and time-intensive. Having a couple of extra servers per datacenter (8 instead of 6, or 850/1280 TB) will expand the current space by 2-3 extra years with the current projected growth (but as said, it varies a lot from year to year). There was already a planned refresh for Q3, and maybe it could be combined with garage or other software migration (T410020). Alternatively, we could refresh the hosts with more density per host, as we have the same density per host as 6 years ago. Another question is whether we need to change, while migrating, some of the storage formulas: no SSDs, RAID level, or replication factor + jbod.
    • Task
    **Steps to replicate the issue**: * Open Paulina on a mobile device or resize browser window to mobile width (< 992px) * Click the hamburger menu icon in the top navigation bar * Observe the menu behavior when it opens **What happens?**: 1. The hamburger menu pushes all page content down when opened, causing layout shift 2. The hamburger icon remains as three horizontal lines even when menu is open - no visual indication of open/closed state 3. No smooth animation on the hamburger icon transformation **What should have happened instead?**: 1. The hamburger menu should overlay the page content using absolute positioning instead of pushing it down 2. The hamburger icon should animate to an X icon when the menu is open, providing clear visual feedback 3. The icon transformation should have a smooth transition animation (0.3s ease)
    • Task
    ==== Technical background The new service `UserRequirementsConditionChecker` is able to distinguish between a user who is performing the request and one who is not. Currently callers are supposed to pass to `recursivelyCheckCondition` whether the user is performing the request or not. However, callers do not always know this. Since `UserRequirementsConditionChecker` has access to the request context, it could determine whether or not the passed-in user is performing the request. **Bug 1** Special:UserRights displays autopromote groups for a user. Some autopromote conditions, such as `APCOND_TOR`, check the condition against the request (making the assumption that the user we are interested in is the user performing the request). This leads to incorrect reporting on Special:UserRights. This is because `UserGroupManager::getUserAutopromoteGroups` and `UserGroupManager::getUserAutopromoteOnceGroups` do not pass in whether the user is performing the request, so the user is assumed to be performing the request. This was an existing bug, but can be fixed now that we have `UserRequirementsConditionChecker`. **Bug 2** When checking restricted groups, `RestrictedUserGroupChecker::doesTargetMeetConditions` assumes that the target is not the performing user. However, they may be the performing user if the performing user is editing their own groups. This is theoretical, as there are unlikely to be restricted groups that require their members to be the performing user at the time the group is added. However, correcting the assumption that the member is not the performing user may be worth doing, to avoid confusion in the future. ==== Acceptance criteria * Special:UserRights can display membership of a group that uses conditions that depend on features of the request (e.g. `APCOND_ISIP`, `APCOND_TOR`) correctly if the target is the performing user * Special:UserRights displays that a target who is not the performing user is not a member of the group that uses request-specific conditions * Special:UserRights correctly treats a group as restricted, even when the target is also the performer
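    Since the checker already has access to the request context, a minimal sketch of the idea (the method name and the injected `$this->requestContext` property are hypothetical, not the actual `UserRequirementsConditionChecker` internals) could be:
```lang=php
use MediaWiki\User\UserIdentity;

// Sketch only: let the service decide whether the passed-in user is the one
// performing the request, instead of requiring every caller to say so.
private function isPerformingUser( UserIdentity $target ): bool {
	// $this->requestContext is assumed to be an injected context object
	// that exposes the user attached to the current request.
	return $this->requestContext->getUser()->equals( $target );
}
```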
    • Task
    == Summary The #confirmedit hCaptcha captcha type verifies captchas using a call to an API named `siteverify`. This code should verify that the sitekey provided to the client matches the sitekey returned by the `siteverify` API. == Background * hCaptcha in #confirmedit allows varying the type of sitekey used for a given request ** This includes varying the sitekey for an #abusefilter consequence * Currently we do not enforce that the given response token sent by the client during the POST request is associated with any given sitekey ** This means that a client could modify the JavaScript config variables to use a sitekey that has potentially fewer restrictions and may be easier to solve * To address these issues, we should ensure that the `siteverify` API response's `sitekey` property is compared against the sitekey that should have been used for the request ** If the sitekeys do not match, then we should consider the hCaptcha captcha check to have failed == Acceptance criteria * [ ] In `HCaptcha::passCaptcha`, the method should return false if the sitekey returned by the `siteverify` API does not match the sitekey passed to the client
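    A minimal sketch of the acceptance criterion, assuming the `siteverify` response has already been decoded into `$response` and that a hypothetical `$expectedSiteKey` holds the key that was served to the client (neither variable name is from the actual ConfirmEdit code):
```lang=php
// Sketch only: compare the sitekey echoed back by siteverify against the
// sitekey the client was supposed to use; treat a mismatch as a failure.
$returnedSiteKey = $response['sitekey'] ?? null;
if ( $returnedSiteKey === null || !hash_equals( $expectedSiteKey, $returnedSiteKey ) ) {
	// Missing or mismatched sitekey: the captcha check did not pass.
	return false;
}
```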
    • Task
    [[ https://console.cloud.google.com/logs/query;cursorTimestamp=2025-11-13T11:05:12.884783951Z;duration=PT30M;query=resource.labels.cluster_name%3D%22wbaas-2%22%0Alabels.%22k8s-pod%2Fapp_kubernetes_io%2Finstance%22%3D%22api%22%0Atimestamp%3D%222025-11-13T11:05:12.884733696Z%22%0AinsertId%3D%22pie6ft0z93jeuhf7%22?project=wikibase-cloud | Example occurrence ]] ``` ERROR 2025-11-13T11:05:12.884733696Z [resource.labels.containerName: pre-install-job] [2025-11-13 11:05:12] staging.ERROR: SQLSTATE[42S01]: Base table or view already exists: 1050 Table 'migrations' already exists (Connection: mysql, SQL: create table `migrations` (`id` int unsigned not null auto_increment primary key, `migration` varchar(255) not null, `batch` int not null) default character set utf8mb4 collate 'utf8mb4_unicode_ci') {"exception":"[object] (Illuminate\\Database\\QueryException(code: 42S01): SQLSTATE[42S01]: Base table or view already exists: 1050 Table 'migrations' already exists (Connection: mysql, SQL: create table `migrations` (`id` int unsigned not null auto_increment primary key, `migration` varchar(255) not null, `batch` int not null) default character set utf8mb4 collate 'utf8mb4_unicode_ci') at /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php:829) ERROR 2025-11-13T11:05:12.884783951Z [resource.labels.containerName: pre-install-job] [stacktrace] ERROR 2025-11-13T11:05:12.884787489Z [resource.labels.containerName: pre-install-job] #0 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php(783): Illuminate\\Database\\Connection->runQueryCallback('create table `m...', Array, Object(Closure)) ERROR 2025-11-13T11:05:12.884789391Z [resource.labels.containerName: pre-install-job] #1 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php(576): Illuminate\\Database\\Connection->run('create table `m...', Array, Object(Closure)) ERROR 2025-11-13T11:05:12.884791363Z [resource.labels.containerName: pre-install-job] #2 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Schema/Blueprint.php(110): Illuminate\\Database\\Connection->statement('create table `m...') ERROR 2025-11-13T11:05:12.884793604Z [resource.labels.containerName: pre-install-job] #3 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Schema/Builder.php(602): Illuminate\\Database\\Schema\\Blueprint->build(Object(Illuminate\\Database\\MySqlConnection), Object(Illuminate\\Database\\Schema\\Grammars\\MySqlGrammar)) ERROR 2025-11-13T11:05:12.884795035Z [resource.labels.containerName: pre-install-job] #4 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Schema/Builder.php(456): Illuminate\\Database\\Schema\\Builder->build(Object(Illuminate\\Database\\Schema\\Blueprint)) ERROR 2025-11-13T11:05:12.884796361Z [resource.labels.containerName: pre-install-job] #5 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Migrations/DatabaseMigrationRepository.php(165): Illuminate\\Database\\Schema\\Builder->create('migrations', Object(Closure)) ERROR 2025-11-13T11:05:12.884797603Z [resource.labels.containerName: pre-install-job] #6 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Console/Migrations/InstallCommand.php(54): Illuminate\\Database\\Migrations\\DatabaseMigrationRepository->createRepository() ERROR 2025-11-13T11:05:12.884798946Z [resource.labels.containerName: pre-install-job] #7 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(36): Illuminate\\Database\\Console\\Migrations\\InstallCommand->handle() 
ERROR 2025-11-13T11:05:12.884800182Z [resource.labels.containerName: pre-install-job] #8 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/Util.php(41): Illuminate\\Container\\BoundMethod::Illuminate\\Container\\{closure}() ERROR 2025-11-13T11:05:12.884801421Z [resource.labels.containerName: pre-install-job] #9 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(93): Illuminate\\Container\\Util::unwrapIfClosure(Object(Closure)) ERROR 2025-11-13T11:05:12.884803194Z [resource.labels.containerName: pre-install-job] #10 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(35): Illuminate\\Container\\BoundMethod::callBoundMethod(Object(Illuminate\\Foundation\\Application), Array, Object(Closure)) ERROR 2025-11-13T11:05:12.884805160Z [resource.labels.containerName: pre-install-job] #11 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/Container.php(662): Illuminate\\Container\\BoundMethod::call(Object(Illuminate\\Foundation\\Application), Array, Array, NULL) ERROR 2025-11-13T11:05:12.884807088Z [resource.labels.containerName: pre-install-job] #12 /var/www/html/vendor/laravel/framework/src/Illuminate/Console/Command.php(211): Illuminate\\Container\\Container->call(Array) ERROR 2025-11-13T11:05:12.884808435Z [resource.labels.containerName: pre-install-job] #13 /var/www/html/vendor/symfony/console/Command/Command.php(326): Illuminate\\Console\\Command->execute(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Illuminate\\Console\\OutputStyle)) ERROR 2025-11-13T11:05:12.884810941Z [resource.labels.containerName: pre-install-job] #14 /var/www/html/vendor/laravel/framework/src/Illuminate/Console/Command.php(180): Symfony\\Component\\Console\\Command\\Command->run(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Illuminate\\Console\\OutputStyle)) ERROR 2025-11-13T11:05:12.884816727Z [resource.labels.containerName: pre-install-job] #15 /var/www/html/vendor/symfony/console/Application.php(1096): Illuminate\\Console\\Command->run(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884826329Z [resource.labels.containerName: pre-install-job] #16 /var/www/html/vendor/symfony/console/Application.php(324): Symfony\\Component\\Console\\Application->doRunCommand(Object(Illuminate\\Database\\Console\\Migrations\\InstallCommand), Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884827970Z [resource.labels.containerName: pre-install-job] #17 /var/www/html/vendor/symfony/console/Application.php(175): Symfony\\Component\\Console\\Application->doRun(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884829683Z [resource.labels.containerName: pre-install-job] #18 /var/www/html/vendor/laravel/framework/src/Illuminate/Foundation/Console/Kernel.php(201): Symfony\\Component\\Console\\Application->run(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884831934Z [resource.labels.containerName: pre-install-job] #19 /var/www/html/artisan(35): Illuminate\\Foundation\\Console\\Kernel->handle(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884834020Z 
[resource.labels.containerName: pre-install-job] #20 {main} ERROR 2025-11-13T11:05:12.884835762Z [resource.labels.containerName: pre-install-job] {} ERROR 2025-11-13T11:05:12.884838128Z [resource.labels.containerName: pre-install-job] [previous exception] [object] (PDOException(code: 42S01): SQLSTATE[42S01]: Base table or view already exists: 1050 Table 'migrations' already exists at /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php:587) ERROR 2025-11-13T11:05:12.884840118Z [resource.labels.containerName: pre-install-job] [stacktrace] ERROR 2025-11-13T11:05:12.884846789Z [resource.labels.containerName: pre-install-job] #0 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php(587): PDOStatement->execute() ERROR 2025-11-13T11:05:12.884849365Z [resource.labels.containerName: pre-install-job] #1 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php(816): Illuminate\\Database\\Connection->Illuminate\\Database\\{closure}('create table `m...', Array) ERROR 2025-11-13T11:05:12.884851673Z [resource.labels.containerName: pre-install-job] #2 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php(783): Illuminate\\Database\\Connection->runQueryCallback('create table `m...', Array, Object(Closure)) ERROR 2025-11-13T11:05:12.884853655Z [resource.labels.containerName: pre-install-job] #3 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php(576): Illuminate\\Database\\Connection->run('create table `m...', Array, Object(Closure)) ERROR 2025-11-13T11:05:12.884855516Z [resource.labels.containerName: pre-install-job] #4 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Schema/Blueprint.php(110): Illuminate\\Database\\Connection->statement('create table `m...') ERROR 2025-11-13T11:05:12.884857666Z [resource.labels.containerName: pre-install-job] #5 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Schema/Builder.php(602): Illuminate\\Database\\Schema\\Blueprint->build(Object(Illuminate\\Database\\MySqlConnection), Object(Illuminate\\Database\\Schema\\Grammars\\MySqlGrammar)) ERROR 2025-11-13T11:05:12.884860062Z [resource.labels.containerName: pre-install-job] #6 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Schema/Builder.php(456): Illuminate\\Database\\Schema\\Builder->build(Object(Illuminate\\Database\\Schema\\Blueprint)) ERROR 2025-11-13T11:05:12.884862218Z [resource.labels.containerName: pre-install-job] #7 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Migrations/DatabaseMigrationRepository.php(165): Illuminate\\Database\\Schema\\Builder->create('migrations', Object(Closure)) ERROR 2025-11-13T11:05:12.884864081Z [resource.labels.containerName: pre-install-job] #8 /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Console/Migrations/InstallCommand.php(54): Illuminate\\Database\\Migrations\\DatabaseMigrationRepository->createRepository() ERROR 2025-11-13T11:05:12.884866119Z [resource.labels.containerName: pre-install-job] #9 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(36): Illuminate\\Database\\Console\\Migrations\\InstallCommand->handle() ERROR 2025-11-13T11:05:12.884872091Z [resource.labels.containerName: pre-install-job] #10 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/Util.php(41): Illuminate\\Container\\BoundMethod::Illuminate\\Container\\{closure}() ERROR 2025-11-13T11:05:12.884874184Z [resource.labels.containerName: pre-install-job] #11 
/var/www/html/vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(93): Illuminate\\Container\\Util::unwrapIfClosure(Object(Closure)) ERROR 2025-11-13T11:05:12.884876120Z [resource.labels.containerName: pre-install-job] #12 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(35): Illuminate\\Container\\BoundMethod::callBoundMethod(Object(Illuminate\\Foundation\\Application), Array, Object(Closure)) ERROR 2025-11-13T11:05:12.884878193Z [resource.labels.containerName: pre-install-job] #13 /var/www/html/vendor/laravel/framework/src/Illuminate/Container/Container.php(662): Illuminate\\Container\\BoundMethod::call(Object(Illuminate\\Foundation\\Application), Array, Array, NULL) ERROR 2025-11-13T11:05:12.884880153Z [resource.labels.containerName: pre-install-job] #14 /var/www/html/vendor/laravel/framework/src/Illuminate/Console/Command.php(211): Illuminate\\Container\\Container->call(Array) ERROR 2025-11-13T11:05:12.884882431Z [resource.labels.containerName: pre-install-job] #15 /var/www/html/vendor/symfony/console/Command/Command.php(326): Illuminate\\Console\\Command->execute(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Illuminate\\Console\\OutputStyle)) ERROR 2025-11-13T11:05:12.884896121Z [resource.labels.containerName: pre-install-job] #16 /var/www/html/vendor/laravel/framework/src/Illuminate/Console/Command.php(180): Symfony\\Component\\Console\\Command\\Command->run(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Illuminate\\Console\\OutputStyle)) ERROR 2025-11-13T11:05:12.884898470Z [resource.labels.containerName: pre-install-job] #17 /var/www/html/vendor/symfony/console/Application.php(1096): Illuminate\\Console\\Command->run(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884900586Z [resource.labels.containerName: pre-install-job] #18 /var/www/html/vendor/symfony/console/Application.php(324): Symfony\\Component\\Console\\Application->doRunCommand(Object(Illuminate\\Database\\Console\\Migrations\\InstallCommand), Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884902568Z [resource.labels.containerName: pre-install-job] #19 /var/www/html/vendor/symfony/console/Application.php(175): Symfony\\Component\\Console\\Application->doRun(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884904851Z [resource.labels.containerName: pre-install-job] #20 /var/www/html/vendor/laravel/framework/src/Illuminate/Foundation/Console/Kernel.php(201): Symfony\\Component\\Console\\Application->run(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884906806Z [resource.labels.containerName: pre-install-job] #21 /var/www/html/artisan(35): Illuminate\\Foundation\\Console\\Kernel->handle(Object(Symfony\\Component\\Console\\Input\\ArgvInput), Object(Symfony\\Component\\Console\\Output\\ConsoleOutput)) ERROR 2025-11-13T11:05:12.884908804Z [resource.labels.containerName: pre-install-job] #22 {main} ERROR 2025-11-13T11:05:12.884911013Z [resource.labels.containerName: pre-install-job] "} ```
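    The failure appears to be that the pre-install job re-runs the migration-repository creation against a database that already has a `migrations` table. One possible direction, sketched here with Laravel facades purely as an illustration (not the chart's actual pre-install code), is to guard the install step so it becomes idempotent:
```lang=php
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Schema;

// Sketch only: create the migration repository only when it is missing,
// then run the outstanding migrations as usual.
if (! Schema::hasTable('migrations')) {
    Artisan::call('migrate:install');
}
Artisan::call('migrate', ['--force' => true]);
```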
    • Task
    [[ https://console.cloud.google.com/logs/query;cursorTimestamp=2025-11-13T11:05:30.322084072Z;duration=PT30M;query=resource.labels.cluster_name%3D%22wbaas-3%22%0Alabels.%22k8s-pod%2Fapp_kubernetes_io%2Finstance%22%3D%22api%22%0Atimestamp%3D%222025-11-13T11:05:30.322084072Z%22%0AinsertId%3D%22y9wfleachvtvjiiu%22?project=wikibase-cloud | Example of error in google logs explorer ]]. ``` [previous exception] [object] (Illuminate\\Queue\\MaxAttemptsExceededException(code: 0): App\\Jobs\\PollForMediaWikiJobsJob has been attempted too many times. at /var/www/html/vendor/laravel/framework/src/Illuminate/Queue/MaxAttemptsExceededException.php:24) ``` Note this doesn't appear as an error level log for some reason; just as an info.
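    If the polling job is expected to occasionally outlive its attempt budget, one hedged option (property names follow Laravel's queued-job conventions; the values are guesses, not a decision) is to raise the retry budget or add a backoff on the job class:
```lang=php
use Illuminate\Contracts\Queue\ShouldQueue;

class PollForMediaWikiJobsJob implements ShouldQueue
{
    // Sketch only: allow more attempts and wait between retries so transient
    // slowness does not immediately exhaust the attempt budget.
    public int $tries = 5;
    public int $backoff = 60; // seconds between retries

    public function handle(): void
    {
        // ... poll the MediaWiki job queues here ...
    }
}
```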
    • Task
    ## Context We already have a 1.43 version of MediaWiki running on staging. ## Goal We can also ship this to production at some point but not upgrade any Wikis or tweak the default db version. Running these on production is a necessary step and can be done independently of other tasks. We need to follow the same pattern of updating the image if and when we update the MW image on staging. ## Notes This task is as "simple" as removing [[ https://github.com/wmde/wbaas-deploy/blob/main/k8s/helmfile/helmfile.yaml#L158 | the check to not ship to production ]], confirming the right settings are passed to the production release, and watching the logs for anything unusual. It's probably just this one-line change and deploying it. ## Acceptance Criteria - [ ] MW 1.43 is also shipped to production
    • Task
    `scap train` for group2 initially failed due to a HTTP 500 error with the Docker Registry: T408272#11369913. Trying `scap train` again, it incorrectly showed that group2 is already on wmf.2: ``` ____ |DD|_____T_ |_ |-wmf.2|< @-@-@-oo ===================================================================== START testwikis group0 group1 group2 1.46.0-wmf.2 1.46.0-wmf.2 1.46.0-wmf.2 1.46.0-wmf.2 ``` `<hashar> my guess is scap thinks that because it reads the local /srv/mediawiki-staging/wikiversions.json` Not sure that can be corrected, if not, please decline this task. https://versions.toolforge.org/ correctly showed still old `1.46.0-wmf.1` for group2 instead of `1.46.0-wmf.2`.
    • Task
    For testing dbt jobs, spark session-mode starts spark jobs from stat machines. For production jobs launched via airflow, getting the session-mode will be more complicated, as it would mean running dbt from within a skein container. A different approach would be to provide one SparkThriftSQLServer per production user, allowing Airflow instances to run dbt jobs against those servers. For this spike, the idea would be: * install one SparkThriftServer using the analytics user * in k8s if possible, on bare metal/VM otherwise * with auto-restart in case of failure (k8s would help with this), as it would be used regularly from Airflow * with a defined DNS name to be accessed from Airflow * verify that the job can execute SQL and write production data If we can have all that, the next steps will be to make the replication of such a server easy, to provide this capability for every prod-user on the Hadoop cluster.
    • Task
    ==== Error ==== * service.version: 1.46.0-wmf.1 * timestamp: 2025-11-13T08:18:58.620Z * labels.phpversion: `8.3.26` * trace.id: `30d83ff5-e179-4897-9d1b-6ae76d3db5ac` * [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2025-11-12T08:18:58.620Z',to:'2025-11-13T10:08:31.429Z'))&_a=(query:(query_string:(query:'reqId:%2230d83ff5-e179-4897-9d1b-6ae76d3db5ac%22'))) | Find trace.id in Logstash ]] ```name=labels.normalized_message,lines=10 [{reqId}] {exception_url} ValueError: setcookie(): "expires" option cannot have a year greater than 9999 ``` | Frame | Location | Call | -- | -- | -- | from | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/CentralNotice/+blame/refs/heads/wmf/1.46.0-wmf.1/includes/specials/SpecialHideBanners.php#85 | /srv/mediawiki/php-1.46.0-wmf.1/extensions/CentralNotice/includes/specials/SpecialHideBanners.php(85) ]] | | #0 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/CentralNotice/+blame/refs/heads/wmf/1.46.0-wmf.1/includes/specials/SpecialHideBanners.php#85 | /srv/mediawiki/php-1.46.0-wmf.1/extensions/CentralNotice/includes/specials/SpecialHideBanners.php(85) ]] | setcookie(string, string, int, string, string, bool, bool) | #1 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/CentralNotice/+blame/refs/heads/wmf/1.46.0-wmf.1/includes/specials/SpecialHideBanners.php#52 | /srv/mediawiki/php-1.46.0-wmf.1/extensions/CentralNotice/includes/specials/SpecialHideBanners.php(52) ]] | SpecialHideBanners->setHideCookie(string, int, string) | #2 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.1/includes/SpecialPage/SpecialPage.php#711 | /srv/mediawiki/php-1.46.0-wmf.1/includes/SpecialPage/SpecialPage.php(711) ]] | SpecialHideBanners->execute(null) | #3 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.1/includes/SpecialPage/SpecialPageFactory.php#1736 | /srv/mediawiki/php-1.46.0-wmf.1/includes/SpecialPage/SpecialPageFactory.php(1736) ]] | MediaWiki\SpecialPage\SpecialPage->run(null) | #4 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.1/includes/actions/ActionEntryPoint.php#499 | /srv/mediawiki/php-1.46.0-wmf.1/includes/actions/ActionEntryPoint.php(499) ]] | MediaWiki\SpecialPage\SpecialPageFactory->executePath(string, MediaWiki\Context\RequestContext) | #5 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.1/includes/actions/ActionEntryPoint.php#143 | /srv/mediawiki/php-1.46.0-wmf.1/includes/actions/ActionEntryPoint.php(143) ]] | MediaWiki\Actions\ActionEntryPoint->performRequest() | #6 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.1/includes/MediaWikiEntryPoint.php#184 | /srv/mediawiki/php-1.46.0-wmf.1/includes/MediaWikiEntryPoint.php(184) ]] | MediaWiki\Actions\ActionEntryPoint->execute() | #7 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.1/index.php#44 | /srv/mediawiki/php-1.46.0-wmf.1/index.php(44) ]] | MediaWiki\MediaWikiEntryPoint->run() | #8 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+blame/refs/heads/master/w/index.php#3 | /srv/mediawiki/w/index.php(3) ]] | require(string) | #9 | {main} | ==== Impact ==== Low ==== Notes ==== This is on old `1.46.0-wmf.1` but as the involved code has not seen changes lately I assume 
that this is still a valid issue.
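    A minimal sketch of one possible fix (illustrative only, not the actual CentralNotice code; `$name`, `$value`, `$expiry` and `$domain` are placeholders): clamp the computed expiry before handing it to `setcookie()` so the year can never exceed 9999:
```lang=php
// Sketch only: PHP's setcookie() rejects an "expires" value whose year is
// greater than 9999, so cap whatever expiry was computed upstream.
$maxExpiry = gmmktime( 23, 59, 59, 12, 31, 9999 ); // last second of 9999 UTC
$expiry = min( $expiry, $maxExpiry );
setcookie( $name, $value, $expiry, '/', $domain );
```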
    • Task
    ==== Error ==== * mwversion: 1.46.0-wmf.2 * timestamp: 2025-11-13T09:48:25.567Z * phpversion: `8.3.26` * reqId: `b2b97f065baf41d8ca3d9a5b` * [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2025-11-12T09:48:25.567Z',to:'2025-11-13T09:58:13.427Z'))&_a=(query:(query_string:(query:'reqId:%22b2b97f065baf41d8ca3d9a5b%22'))) | Find reqId in Logstash ]] ```name=normalized_message,lines=10 [{reqId}] {exception_url} PHP Warning: Undefined array key "user" ``` | Frame | Location | Call | -- | -- | -- | from | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/LiquidThreadsApi/PageRevisionedObject.php#38 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/LiquidThreadsApi/PageRevisionedObject.php(38) ]] | | #0 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/LiquidThreadsApi/PageRevisionedObject.php#38 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/LiquidThreadsApi/PageRevisionedObject.php(38) ]] | MediaWiki\Exception\MWExceptionHandler::handleError(int, string, string, int) | #1 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/LiquidThreadsApi/PageRevisionedObject.php#49 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/LiquidThreadsApi/PageRevisionedObject.php(49) ]] | Flow\Import\LiquidThreadsApi\PageRevisionedObject->getRevisionData() | #2 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/LiquidThreadsApi/ImportPost.php#126 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/LiquidThreadsApi/ImportPost.php(126) ]] | Flow\Import\LiquidThreadsApi\PageRevisionedObject->getRevisions() | #3 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/TalkpageImportOperation.php#453 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/TalkpageImportOperation.php(453) ]] | Flow\Import\LiquidThreadsApi\ImportPost->getRevisions() | #4 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/TalkpageImportOperation.php#389 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/TalkpageImportOperation.php(389) ]] | Flow\Import\TalkpageImportOperation->importObjectWithHistory(Flow\Import\LiquidThreadsApi\ImportPost, Closure, string, Flow\Import\PageImportState, MediaWiki\Title\Title) | #5 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/TalkpageImportOperation.php#217 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/TalkpageImportOperation.php(217) ]] | Flow\Import\TalkpageImportOperation->importPost(Flow\Import\TopicImportState, Flow\Import\LiquidThreadsApi\ImportPost, Flow\Model\PostRevision) | #6 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/TalkpageImportOperation.php#131 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/TalkpageImportOperation.php(131) ]] | Flow\Import\TalkpageImportOperation->importTopic(Flow\Import\TopicImportState, Flow\Import\LiquidThreadsApi\ImportTopic) | #7 | [[ 
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/Importer.php#114 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/Importer.php(114) ]] | Flow\Import\TalkpageImportOperation->import(Flow\Import\PageImportState) | #8 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/Converter.php#215 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/Converter.php(215) ]] | Flow\Import\Importer->import(Flow\Import\LiquidThreadsApi\ImportSource, MediaWiki\Title\Title, MediaWiki\User\User, Flow\Import\SourceStore\FileImportSourceStore) | #9 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/Converter.php#157 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/Converter.php(157) ]] | Flow\Import\Converter->doConversion(MediaWiki\Title\Title, MediaWiki\Title\Title) | #10 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/includes/Import/Converter.php#113 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/includes/Import/Converter.php(113) ]] | Flow\Import\Converter->convert(MediaWiki\Title\Title, bool, bool) | #11 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Flow/+blame/refs/heads/wmf/1.46.0-wmf.2/maintenance/convertAllLqtPages.php#112 | /srv/mediawiki/php-1.46.0-wmf.2/extensions/Flow/maintenance/convertAllLqtPages.php(112) ]] | Flow\Import\Converter->convertAll(AppendIterator, bool, bool) | #12 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/maintenance/includes/MaintenanceRunner.php#696 | /srv/mediawiki/php-1.46.0-wmf.2/maintenance/includes/MaintenanceRunner.php(696) ]] | Flow\Maintenance\ConvertAllLqtPages->execute() | #13 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.46.0-wmf.2/maintenance/run.php#53 | /srv/mediawiki/php-1.46.0-wmf.2/maintenance/run.php(53) ]] | MediaWiki\Maintenance\MaintenanceRunner->run() | #14 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+blame/refs/heads/master/multiversion/MWScript.php#221 | /srv/mediawiki/multiversion/MWScript.php(221) ]] | require_once(string) | #15 | {main} | ==== Impact ==== ==== Notes ====
    • Task
We need a new version of the compact form that allows the donor to enter an address without asking for a donation receipt. The hope is that this will increase the amount of address data we receive. {F70214762} **Implementation Notes** - Dynamic required messages are added to the fields once the donor starts filling out an address - A message box above the fields explains that we require a full address if they enter data into any of the fields - The same box has a link to clear that part of the form - The message in the box is added to the aria-describedby attribute of all the fields - Programmatically check the entered data and set the appropriate donor type - Make sure the error summaries work properly **Spike Notes** - We can't prefill the country field
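A minimal sketch of the "required once the donor starts typing" behaviour described above. The `.address-field` selector, the `address-hint` id and the `clear-address` link id are hypothetical placeholders, not names from the Fundraising Application; the real form would wire this into its existing validation.

```lang=typescript
// Collect the (hypothetical) address inputs of the compact form.
const addressFields = Array.from(
  document.querySelectorAll<HTMLInputElement>('.address-field')
);
const hintId = 'address-hint'; // id of the message box explaining the requirement

function updateAddressRequirements(): void {
  // As soon as any address field contains data, the whole address becomes required.
  const anyFilled = addressFields.some((field) => field.value.trim() !== '');
  for (const field of addressFields) {
    field.required = anyFilled;
    // Expose the explanatory message box to assistive technology.
    field.setAttribute('aria-describedby', hintId);
  }
}

addressFields.forEach((field) => field.addEventListener('input', updateAddressRequirements));

// "Clear this part of the form" link inside the message box.
document.getElementById('clear-address')?.addEventListener('click', (event) => {
  event.preventDefault();
  addressFields.forEach((field) => {
    field.value = '';
  });
  updateAddressRequirements();
});
```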
    • Task
    **Feature summary** Make the navigation bar in Paulina sticky on scroll, ensuring it stays fixed at the top of the page. **Use case(s)** Users may lose the main navigation when scrolling long pages, which interrupts flow and orientation. **Benefits** Keeps the navigation always accessible, enhances user flow, and improves UX consistency across pages.
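In practice this is likely a single CSS rule on the Paulina navigation component; the sketch below only illustrates the properties involved, using a hypothetical `.navigation` selector.

```lang=typescript
// Pin the navigation bar to the top of the viewport once the page scrolls past it.
const nav = document.querySelector<HTMLElement>('.navigation');
if (nav && CSS.supports('position', 'sticky')) {
  nav.style.position = 'sticky';
  nav.style.top = '0';
  nav.style.zIndex = '10'; // keep the bar above scrolling page content
}
```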
    • Task
We segment HTML into sentences and wrap them in <span class="cx-segment">. These tags we insert into the HTML (a DOM modification) have been a source of [[ https://phabricator.wikimedia.org/search/query/IwsEr3qESdjR/#R | several bugs ]] - for example, leaking into published articles. We have tried to fix these bugs many times in the past, but what if we removed those spans instead? The basic idea is to use https://developer.mozilla.org/en-US/docs/Web/API/CSS_Custom_Highlight_API - this allows highlighting ranges in HTML without modifying the DOM, and it has been available in browsers since 2022. ## Steps * Extract the text out of the text nodes in the HTML. * Flatten the text from these nodes to get a plain text version of the content. * Use the plaintext sentence segmenter sentencex to get the sentence boundaries (start and end indexes). * Map the boundaries back to DOM nodes and calculate the ranges to highlight. * Create Ranges in JS and then use CSS to decorate them. ## Proof of concept Source code: https://codesandbox.io/p/sandbox/optimistic-sun-vhs4q3 Demo: https://vhs4q3.csb.app/ {F70170227 size=full} This POC uses the WASM version of the sentencex library. For DOM parsing, tree-sitter-html is used (this C library is very fast and error tolerant). We cannot use the DOMParser interface since it cannot give node offsets in the HTML source. The POC also uses the latest version of the [[ https://github.com/wikimedia/sentencex/ | sentencex ]] library, recently rewritten in Rust. ## Advantages * No DOM changes means eliminating the many bugs related to this markup leaking into published articles. * Reduced DOM size and reduced parallel corpus size. * VE can work very similarly to the article editing workflow since there are no sentence nodes. ## Current uses of sentence segmentation * In the article translation interface, to highlight source and target sentence pairs. * In section translation, the translation editor's sentence picker translates one sentence at a time. The underlying HTML sentence segmentation code uses a lineardoc model for the DOM that we inherited from VE a long time ago (contributed by @dchan) and that is quite hard to maintain due to its string-mangling-based parsing. The same code is used to update the model for the adaptation of links, references and templates. We need to see if this lineardoc model can be abandoned for a DOMParser implementation that would make the adaptation logic easier (adaptation is basically DOM manipulation). The section translation use case is worth reconsidering. Instead of thinking about how to apply sentence segmentation to that use case, I am more inclined to think about how to do section-level translation instead of sentence-level translation. Sections are semantically complete units to translate. An MT system can work better with sections than with sentences, which have only partial context. ## Pending [ ] Do an analysis of the cxserver integration - what is the impact and what are the opportunities to refresh the codebase? [ ] Discuss changing the section translation model to use paragraphs as the basic translation units instead of sentences. [ ] How does VE work with these highlight ranges? Does VE still need wrapping tags? [ ] Do we need sentence visualization at all? It is currently broken in CX and SX, so is there a real purpose it is serving?
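A minimal sketch of the highlight step, assuming a single text node and hard-coded boundaries for illustration; in CX the boundaries would come from sentencex and would need to be mapped to the correct text nodes and offsets. It also assumes a DOM lib recent enough to type `CSS.highlights` and `Highlight`, and `#mw-content-text` stands in for the content container.

```lang=typescript
const container = document.querySelector('#mw-content-text');
const textNode = container?.firstChild;

if (textNode instanceof Text && 'highlights' in CSS) {
  // Hypothetical sentence boundaries as [start, end) offsets into the text node.
  const boundaries: Array<[number, number]> = [
    [0, 40],
    [41, 95],
  ];

  const ranges = boundaries.map(([start, end]) => {
    const range = new Range();
    range.setStart(textNode, Math.min(start, textNode.length));
    range.setEnd(textNode, Math.min(end, textNode.length));
    return range;
  });

  // Register the ranges under a named highlight; no <span class="cx-segment">
  // wrappers are inserted, so the DOM and the published HTML stay untouched.
  CSS.highlights.set('cx-segment', new Highlight(...ranges));
  // Styled in CSS with: ::highlight(cx-segment) { background-color: #fef6e7; }
}
```

Because the ranges live only in memory, nothing of the segmentation can leak into the parallel corpus or into published wikitext.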
    • Task
    ## Summary The [[ https://grafana-rw.wikimedia.org/d/441b2def-52e9-49d6-acad-91f5bb748989/hcaptcha-reverse-proxy?orgId=1&from=now-3h&to=now&timezone=utc&var-instance=$__all&var-site=eqiad | Grafana dashboard for hCaptcha ]] needs to be updated to ensure existing product metrics panels are segmented by edit and account creation interactions. Currently the panels assume that all events are from account creation. ## Acceptance criteria - [ ] The Grafana dashboard allows us to visualize events created from editing workflows, separately from account creation. Most likely this should be done by segmenting existing panels, rather than adding new panels.
    • Task
**Steps to replicate the issue** (include links if applicable): * `wget -O - "https://de.wikipedia.org/w/api.php?format=json&formatversion=2&action=query&list=search&srsearch=insource%3Ahttps%20insource%3A%2F%5C%5Bhttps%3A%5C%2F%5C%2F%5B%5E%20%5C%5D%5D%2A%27%27%2F&srnamespace=0&srlimit=max&srinfo=&srprop="` **What happens?**: --2025-11-13 07:39:49-- https://de.wikipedia.org/w/api.php?format=json&formatversion=2&action=query&list=search&srsearch=insource%3Ahttps%20insource%3A%2F%5C%5Bhttps%3A%5C%2F%5C%2F%5B%5E%20%5C%5D%5D%2A%27%27%2F&srnamespace=0&srlimit=max&srinfo=&srprop= Resolving de.wikipedia.org (de.wikipedia.org)... 2620:0:861:ed1a::1, 208.80.154.224 Connecting to de.wikipedia.org (de.wikipedia.org)|2620:0:861:ed1a::1|:443... connected. HTTP request sent, awaiting response... 504 Gateway Timeout Retrying. **What should have happened instead?**: Some data should have been retrieved. This has been happening since the 11th of November; I did not see the problem before. It happens with various search patterns, both from my home machine (not logged in) and from within the Toolforge cloud when logged in with my bot account. In my PHP code I get this answer: {"httpReason":"upstream request timeout","httpCode":504} in addition to HTTP status 504.
    • Task
**Steps to replicate the issue** (include links if applicable): Add an interwiki link to any item. In the language field, type "ro" (the language code for Romanian) and press Tab. **What happens?**: The language code becomes "rmy" (the language code for Romani), which is the first entry in the suggestion list. **What should have happened instead?**: If the user does not touch the suggestion list by any means (mouse or keyboard), and the entered text is a valid language code, the user's entry should be used. **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
**Test Scope** This is the second iteration of the changed donation form layout. The first iteration showed/hid the postal address data fields depending on the `checked` status of the donation receipt radio button. In the second iteration we want to test whether users are willing to provide their postal address data without needing a donation receipt. **Acceptance Criteria** **Variant banner form** - The form always shows the fields for the postal address data, regardless of the `checked` status of the donation receipt radio button. - All postal address data fields are optional unless a user enters a value into one of the fields. {F70168505, size=full} **Notes** - This ticket requires deploying the Fundraising Application. - The user interaction design is not ready yet.
    • Task
**Test Scope** This is the second iteration of the changed donation form layout. The first iteration showed/hid the postal address data fields depending on the `checked` status of the donation receipt radio button. In the second iteration we want to test whether users are willing to provide their postal address data without needing a donation receipt. **Acceptance Criteria** **Both banners** - The banners are based on the **control|variant banner** of **mobile-de-09**. **Control banner** - Submitting the banner form redirects users to the default donation form (`ap=0`). **Variant banner** - Submitting the banner form redirects users to the variant donation form (`ap=2`). **Variant banner form** - The form always shows the fields for the postal address data, regardless of the `checked` status of the donation receipt radio button. - All postal address data fields are optional unless a user enters a value into one of the fields. **Banner Preview** [light control banner](https://de.wikipedia.org/?banner=WMDE_FR_2025_Mobile_DE_10_ctrl&useskin=minerva&minervanightmode=0&devMode) [light variant banner](https://de.wikipedia.org/?banner=WMDE_FR_2025_Mobile_DE_10_var&useskin=minerva&minervanightmode=0&devMode) [dark control banner](https://de.wikipedia.org/?banner=WMDE_FR_2025_Mobile_DE_10_ctrl&useskin=minerva&minervanightmode=1&devMode) [dark variant banner](https://de.wikipedia.org/?banner=WMDE_FR_2025_Mobile_DE_10_var&useskin=minerva&minervanightmode=1&devMode)
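A minimal sketch of the control/variant redirect difference, under the assumption that the banner builds the donation form URL itself. Only the `ap` parameter values (`0` for control, `2` for variant) come from the acceptance criteria; the bucket flag and the base URL are hypothetical.

```lang=typescript
// Build the donation form URL for the banner's bucket.
function donationFormUrl(isVariantBanner: boolean): string {
  const url = new URL('https://example.org/donation/new'); // placeholder base URL
  url.searchParams.set('ap', isVariantBanner ? '2' : '0');
  return url.toString();
}

// On banner form submit the banner would navigate to the returned URL, e.g.:
// window.location.assign(donationFormUrl(true));
```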