== Common information
* **dashboard**: https://w.wiki/DocP
* **description**: Use `kubectl get jobs -l team=mediawiki-special-pages,cronjob=updatequerypages-deadendpages-s3 --field-selector status.successful=0` to see failures
* **runbook**: https://wikitech.wikimedia.org/wiki/Periodic_jobs#Troubleshooting
* **summary**: MediaWiki periodic job updatequerypages-deadendpages-s3 failed
* **alertname**: MediaWikiCronJobFailed
* **label_cronjob**: updatequerypages-deadendpages-s3
* **label_team**: mediawiki-special-pages
* **prometheus**: k8s
* **severity**: task
* **site**: eqiad
* **source**: prometheus
* **team**: mediawiki-special-pages
== Firing alerts
---
* **dashboard**: https://w.wiki/DocP
* **description**: Use `kubectl get jobs -l team=mediawiki-special-pages,cronjob=updatequerypages-deadendpages-s3 --field-selector status.successful=0` to see failures
* **runbook**: https://wikitech.wikimedia.org/wiki/Periodic_jobs#Troubleshooting
* **summary**: MediaWiki periodic job updatequerypages-deadendpages-s3 failed
* **alertname**: MediaWikiCronJobFailed
* **label_cronjob**: updatequerypages-deadendpages-s3
* **label_team**: mediawiki-special-pages
* **prometheus**: k8s
* **severity**: task
* **site**: eqiad
* **source**: prometheus
* **team**: mediawiki-special-pages
* [Source](https://prometheus-eqiad.wikimedia.org/k8s/graph?g0.expr=sum+by+%28label_cronjob%2C+label_team%29+%28kube_job_status_failed%7Bnamespace%3D%22mw-cron%22%7D+%2A+on+%28namespace%2C+job_name%29+group_left+%28label_cronjob%2C+label_team%29+kube_job_labels%7Bnamespace%3D%22mw-cron%22%7D%29+%3E+0&g0.tab=1)
**Steps to replicate the issue** (include links if applicable):
In the Android app:
* Have an article open
* Deeplink from browser into another article
**What happens?**:
* The new article displays with the previous article's image
* Reloading the page shows the previous article rather than correcting the current article's presentation.
**What should have happened instead?**:
**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
**Other information** (browser name/version, screenshots, etc.):
Work with PgMs to identify current or prospective initiatives that fit into the PTAC recommendation number 2 thematic areas of:
- TOOLS
- NPOV
- CONTRIBUTORS
- CONTENT RE-USE
- COMMUNICATION & SUPPORT
- OTHER (cross-wiki flows, gamification and search)
As an FA team member, I want to track different analytics (such as DAU, session time, and retention) by difficulty mode, so that I can see which type of game experience appeals most to our target demographic.
**Acceptance Criteria**
1. First, implement the [[ https://create.roblox.com/docs/production/analytics/funnel-events | PlayerAdded event ]]. This will allow us to start tracking our funnel immediately on player join.
2. Then, use [[ https://create.roblox.com/docs/production/analytics/custom-fields | custom fields ]] to differentiate between Chill (zen), Classic (our initial game), and Challenge (mode with floor-falling).
3. Ensure that we can track our players' behavior by difficulty mode. Our key metrics here are:
- [[ https://create.roblox.com/docs/production/analytics/retention | Retention ]]
- [[ https://create.roblox.com/docs/production/analytics/engagement | Engagement ]] (average session time)
- Ideally, we can also measure [[ https://create.roblox.com/docs/production/analytics/acquisition | acquisition ]] by difficulty mode. This would inform us of the types of players that come in through, for example, homepage promotions, which would help us understand which promotional levers to focus on.
4. Ideally, we could also track [[ https://create.roblox.com/docs/production/analytics/funnel-events | funnel analytics ]] by difficulty mode. This would tell us the likelihood of drop-off based on different modes. Our funnel steps are:
- Entrance into lobby and tutorial modal
- Entrance into game (any mode, but ideally separated)
- Entrance into a room after the starting room. Since the player automatically lands in the starting room after entering the game, we will not count entrance into the starting room as a separate step in the funnel.
- Entrance into the target room
- Finished rebuilding of maze. Since the maze is automatically rebuilt/the player is automatically dropped back to the lobby after clicking on "Play Next," we will not count clicking on "Play Next" as a separate step in the funnel.
**Documentation**
1. PlayerAdded: https://create.roblox.com/docs/production/analytics/funnel-events
2. Custom fields: https://create.roblox.com/docs/production/analytics/custom-fields
3. Retention: https://create.roblox.com/docs/production/analytics/retention
4. Engagement: https://create.roblox.com/docs/production/analytics/engagement
5. Acquisition: https://create.roblox.com/docs/production/analytics/acquisition
6. Funnel analytics: https://create.roblox.com/docs/production/analytics/funnel-events
**Steps to replicate the issue** (include links if applicable):
* Edit https://test.wikipedia.org/wiki/User:Amire80/Z18784 in "Edit source" mode on desktop.
**What happens?**:
Template:Citation needed and other related templates and modules appear under "Templates used in this preview", but the wikifunction Z18784, which is also used there, doesn't.
**What should have happened instead?**:
There should be links from the editing mode to the function, just like there are for templates and modules. In fact, it's even more important for functions, because templates and modules can have a meaningful name, whereas functions have only a number, and there's no other way to know what the function does except copying its ZID and searching for it on Wikifunctions.
There is a link to the function from the visual editor, but it's needed in wikitext editing, too.
**Steps to replicate the issue** (include links if applicable):
* Go to https://dag.wikipedia.org/wiki/Special:Version
* Examine the "Parser function hooks" section
**What happens?**:
`{{#invoke}}` and `{{#expr}}` appear in the "Parser function hooks" section, but `{{#function}}` doesn't.
**What should have happened instead?**:
If `{{#invoke}}` and `{{#expr}}` appear on Special:Version, so should `{{#function}}`.
Even if it's somehow significantly different from other Parser function hooks, it //looks// just like they do, so it should appear on Special:Version somewhere.
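If it helps triage: Special:Version appears to enumerate the functions registered with the parser via `setFunctionHook()`, so presumably `{{#function}}` is wired up through some other path. A minimal sketch of the standard registration that does get listed (class and callback names are hypothetical):
```lang=php
<?php
class FunctionHooks {
	// Functions registered with the parser like this are what Special:Version
	// lists under "Parser function hooks".
	public static function onParserFirstCallInit( Parser $parser ) {
		$parser->setFunctionHook( 'function', [ self::class, 'renderFunction' ] );
	}

	public static function renderFunction( Parser $parser, ...$args ) {
		// The actual evaluation would call out to Wikifunctions here.
		return '';
	}
}
```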
**Steps to replicate the issue**:
* This image: https://commons.wikimedia.org/wiki/File:Prof.Anna-Teresa_Tymieniecka.JPG
* is broken at: https://en.wikipedia.org/wiki/Anna-Teresa_Tymieniecka
* is broken at: https://de.wikipedia.org/wiki/Anna-Teresa_Tymieniecka
* is broken at most other projects that I checked from the Commons "Global usage links"
* **but works at**: https://fr.wikipedia.org/wiki/Anna-Teresa_Tymieniecka
* Possibly because it uses a different (bigger) thumbnail?
Original bug report at: https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#An_image_not_displaying_properly
As a player, I want to be able to choose between different difficulty modes, so that I can have more freedom while playing.
**Acceptance Criteria**
1. Include 3 doors in the lobby. One should lead to "Chill", one should lead to "Classic" (our original game), and one should lead to "Challenge."
2. The doors for Chill and Classic can be the same (though ideally, the door for Chill could be a light purple or orange). The door for Challenge can be either the same door in a deep red, or the original black portal.
3. When a player finishes a round in any game mode and is dropped back to the lobby, the player should be able to switch to another mode without difficulty.
4. The leaderboards can be shared between all three game modes (unless it is easier to continue to not include Chill scores in the leaderboard).
**Steps to replicate the issue** (include links if applicable):
* Visit a page with DiscussionTools that gets topics added regularly, e.g. [[https://gom.wikipedia.org/wiki/Wikipedia:Tintto]]
* Check the timestamp in the topic containers for a topic that is 1 month old and 1 year old.
**What happens?**:
* When reading the page with the interface language as [[https://gom.wikipedia.org/wiki/Wikipedia:Tintto?uselang=gom|gom]], the topic containers say '1 म्हयन्यां आदीं' (1 //months// ago) and '1 वर्सां आदीं' (1 //years// ago)
* A similar thing happens with other interface languages such as [[https://gom.wikipedia.org/wiki/Wikipedia:Tintto?uselang=zh-hans|zh-hans]] and [[https://gom.wikipedia.org/wiki/Wikipedia:Tintto?uselang=id|id]], but not with [[https://gom.wikipedia.org/wiki/Wikipedia:Tintto?uselang=en|English]], [[https://gom.wikipedia.org/wiki/Wikipedia:Tintto?uselang=fr|French]] or [[https://gom.wikipedia.org/wiki/Wikipedia:Tintto?uselang=es|Spanish]].
**What should have happened instead?**:
* The singular form of the time unit (month or year) should have appeared.
**Why is this happening?**
* These time units come from the CLDR extension, and not all the time units are getting transferred from the CLDR database for some languages.
** Languages like [[https://github.com/wikimedia/mediawiki-extensions-cldr/blob/master/CldrMain/CldrMainGom_deva.php|gom]], [[https://github.com/wikimedia/mediawiki-extensions-cldr/blob/master/CldrMain/CldrMainId.php|id]] and [[https://github.com/wikimedia/mediawiki-extensions-cldr/blob/master/CldrMain/CldrMainZh_hans.php|zh-hans]] are missing time units like `month-past-one` and `year-past-one`
** Compare this with languages like [[https://github.com/wikimedia/mediawiki-extensions-cldr/blob/master/CldrMain/CldrMainEn.php|en]], [[https://github.com/wikimedia/mediawiki-extensions-cldr/blob/master/CldrMain/CldrMainEs.php|es]] and [[https://github.com/wikimedia/mediawiki-extensions-cldr/blob/master/CldrMain/CldrMainFr.php|fr]], which have units like `month-past-one` and `year-past-one`
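A sketch of the kind of entries that would need to be added to the affected CldrMain files, assuming they use the same `$timeUnits` array structure as CldrMainEn.php (placeholders are used here; the real strings must come from the CLDR database):
```lang=php
$timeUnits = [
	// Singular past forms, currently missing for gom, id and zh-hans.
	'month-past-one' => '…',
	'year-past-one' => '…',
];
```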
//As a// technical contributor
//I want to// load javascript and css content hosted on gitlab.wikimedia.org in userscripts and/or gadgets on Wikimedia wikis
//So I can// implement collaborative workflows using git and CI.
GitLab serves raw content with `Content-Type: text/plain` and `X-Content-Type-Options: nosniff` headers, which prevent us from trivially linking to content hosted on the service. These are reasonable protections against various forms of MIME confusion and XSS attacks. The upstream has rejected requests for integrated raw content delivery on multiple occasions.
Use of a reverse proxy separates the sessions that visitors may have on the gitlab.wm.o site from the delivery of content by the Toolforge tool, defusing XSS attacks against the gitlab site. This can be seen as operating similarly to [[https://docs.gitlab.com/user/project/pages/|gitlab pages]] by delivering the content from a separate domain.
Related: {T321458}
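A minimal sketch of what the proposed Toolforge reverse proxy could look like, here in PHP (illustrative only; a real tool would want caching, stricter path validation, and rate limiting):
```lang=php
<?php
// Map a small allowlist of file extensions to safe content types.
$types = [ 'js' => 'text/javascript', 'css' => 'text/css' ];

$path = $_GET['path'] ?? '';
$ext = strtolower( pathinfo( $path, PATHINFO_EXTENSION ) );

if ( !isset( $types[$ext] ) || !preg_match( '!^[\w.-]+(/[\w.-]+)*$!', $path ) ) {
	http_response_code( 400 );
	exit( 'Bad request' );
}

$content = file_get_contents( 'https://gitlab.wikimedia.org/' . $path );
if ( $content === false ) {
	http_response_code( 502 );
	exit( 'Upstream fetch failed' );
}

// Re-serve from the tool's own domain with a usable Content-Type,
// keeping visitor sessions on gitlab.wm.o separate from content delivery.
header( 'Content-Type: ' . $types[$ext] );
header( 'X-Content-Type-Options: nosniff' );
echo $content;
```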
**Steps to replicate the issue** (include links if applicable):
* Test case with const and function - run phpcs on the following code:
```lang=php
use MediaWiki\FileRepo\File\File;
class Bar {
private const FILE = 'test';
public function doWork() {
$file = file( self::FILE );
}
}
```
**What happens?**:
No error is shown
**What should have happened instead?**:
Warning `Unused use statement "File"` should be shown
`const FILE` is used in `tests/phpunit/unit/includes/filebackend/HTTPFileStreamerTest.php`; adding `use MediaWiki\FileRepo\File\File;` there does not get flagged as unused either.
**Feature summary**: An interface to edit author and work Wikidata information directly from Paulina.
**Use case(s)**: Currently, when an author's or work's page contains incorrect or missing information, the user must click a link on the page that reads "Missing/wrong data? Edit Wikidata item," which takes them to the Wikidata item. This user interface is overly complex, as it's not clear what properties the user must edit or add to the Wikidata item to correct or add the information.
**Benefits** (why should this be implemented?): Many Paulina users are knowledgeable about heritage, but are not Wikidata experts. A user-friendly editing interface, embedded in the Paulina web application itself, would help increase contributions to Wikidata from these users.
== Common information
* **description**: Rule: PDU sensor over limit Faults:
#1: Sensor 3610
** 30 over limit
Previous Measurement: 3442
Limit: 3580
#2: Sensor 1445
** 15 over limit
Previous Measurement: 1243
Limit: 1430
#3: Sensor 1472
** 42 over limit
Previous Measurement: 1289
Limit: 1430
https://wikitech.wikimedia.org/wiki/Network_monitoring#LibreNMS_alerts
* **summary**: Alert for device ps1-b4-eqiad.mgmt.eqiad.wmnet - PDU sensor over limit
* **timestamp**: 2025-04-22 14:33:58
* **alertname**: PDU sensor over limit
* **instance**: ps1-b4-eqiad.mgmt.eqiad.wmnet
* **scope**: network
* **severity**: task
* **source**: librenms
* **team**: dcops
== Firing alerts
---
* **description**: Rule: PDU sensor over limit Faults:
#1: Sensor 3610
** 30 over limit
Previous Measurement: 3442
Limit: 3580
#2: Sensor 1445
** 15 over limit
Previous Measurement: 1243
Limit: 1430
#3: Sensor 1472
** 42 over limit
Previous Measurement: 1289
Limit: 1430
https://wikitech.wikimedia.org/wiki/Network_monitoring#LibreNMS_alerts
* **summary**: Alert for device ps1-b4-eqiad.mgmt.eqiad.wmnet - PDU sensor over limit
* **timestamp**: 2025-04-22 14:33:58
* **alertname**: PDU sensor over limit
* **instance**: ps1-b4-eqiad.mgmt.eqiad.wmnet
* **scope**: network
* **severity**: task
* **source**: librenms
* **team**: dcops
* [Source](https://librenms.wikimedia.org/device/device=48/alerts)
== Common information
* **description**: Rule: PDU sensor over limit Faults:
#1: Sensor 3522
** 12 over limit
Previous Measurement: 3524
Limit: 3510
https://wikitech.wikimedia.org/wiki/Network_monitoring#LibreNMS_alerts
* **summary**: Alert for device ps1-a4-eqiad.mgmt.eqiad.wmnet - PDU sensor over limit
* **timestamp**: 2025-04-22 14:29:00
* **alertname**: PDU sensor over limit
* **instance**: ps1-a4-eqiad.mgmt.eqiad.wmnet
* **scope**: network
* **severity**: task
* **source**: librenms
* **team**: dcops
== Firing alerts
---
* **description**: Rule: PDU sensor over limit Faults:
#1: Sensor 3522
** 12 over limit
Previous Measurement: 3524
Limit: 3510
https://wikitech.wikimedia.org/wiki/Network_monitoring#LibreNMS_alerts
* **summary**: Alert for device ps1-a4-eqiad.mgmt.eqiad.wmnet - PDU sensor over limit
* **timestamp**: 2025-04-22 14:29:00
* **alertname**: PDU sensor over limit
* **instance**: ps1-a4-eqiad.mgmt.eqiad.wmnet
* **scope**: network
* **severity**: task
* **source**: librenms
* **team**: dcops
* [Source](https://librenms.wikimedia.org/device/device=40/alerts)
When you install a new Speechoid instance, the lexicon won't include any entries added from the wiki. There should be a maintenance script to add them.
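A rough sketch of how such a maintenance script might be shaped (the lexicon storage service names are assumptions based on this task, not confirmed Wikispeech APIs):
```lang=php
<?php
$IP = getenv( 'MW_INSTALL_PATH' );
if ( $IP === false ) {
	$IP = __DIR__ . '/../../..';
}
require_once "$IP/maintenance/Maintenance.php";

use MediaWiki\MediaWikiServices;

class PushLexiconToSpeechoid extends Maintenance {
	public function __construct() {
		parent::__construct();
		$this->addDescription(
			'Push lexicon entries stored on the wiki to a newly installed Speechoid.'
		);
		$this->requireExtension( 'Wikispeech' );
	}

	public function execute() {
		$services = MediaWikiServices::getInstance();
		// Hypothetical service names; the real wiki-side and Speechoid-side
		// lexicon storage services may be named differently.
		$wikiStorage = $services->getService( 'Wikispeech.LexiconWikiStorage' );
		$speechoidStorage = $services->getService( 'Wikispeech.LexiconSpeechoidStorage' );
		foreach ( $wikiStorage->getAllEntries() as $entry ) {
			$speechoidStorage->createEntry( $entry );
			$this->output( "Pushed lexicon entry\n" );
		}
	}
}

$maintClass = PushLexiconToSpeechoid::class;
require_once RUN_MAINTENANCE_IF_MAIN;
```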
TASK AUTO-GENERATED by Nagios/Icinga RAID event handler
A degraded RAID (md) [[ https://icinga.wikimedia.org/cgi-bin/icinga/extinfo.cgi?type=2&host=cloudcephmon1004&service=MD RAID | was detected ]] on host `cloudcephmon1004`. An automatic snapshot of the current RAID status is attached below.
Please **sync with the service owner** to find the appropriate time window before actually replacing any failed hardware.
```
CRITICAL: State: degraded, Active: 3, Working: 3, Failed: 1, Spare: 0
$ sudo /usr/local/lib/nagios/plugins/get-raid-status-md
Personalities : [raid10] [linear] [multipath] [raid0] [raid1] [raid6] [raid5] [raid4]
md0 : active raid10 sda2[0] sdd2[3] sdb2[1](F) sdc2[2]
1874534400 blocks super 1.2 512K chunks 2 near-copies [4/3] [U_UU]
bitmap: 1/14 pages [4KB], 65536KB chunk
unused devices: <none>
```
== Common information
* **dashboard**: https://grafana.wikimedia.org/d/b013af4c-d405-4d9f-85d4-985abb3dec0c/wmcs-kernel-errors?orgId=1&var-instance=cloudcephmon1004
* **description**: The server cloudcephmon1004 logged some kernel errors in the past 24h
* **runbook**: https://wikitech.wikimedia.org/wiki/Portal:Cloud_VPS/Admin/Runbooks/KernelErrors
* **summary**: Server cloudcephmon1004 logged kernel errors
* **alertname**: KernelErrors
* **cluster**: wmcs
* **instance**: cloudcephmon1004:9100
* **job**: node
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: wmcs
== Firing alerts
---
* **dashboard**: https://grafana.wikimedia.org/d/b013af4c-d405-4d9f-85d4-985abb3dec0c/wmcs-kernel-errors?orgId=1&var-instance=cloudcephmon1004
* **description**: The server cloudcephmon1004 logged some kernel errors in the past 24h
* **runbook**: https://wikitech.wikimedia.org/wiki/Portal:Cloud_VPS/Admin/Runbooks/KernelErrors
* **summary**: Server cloudcephmon1004 logged kernel errors
* **alertname**: KernelErrors
* **category**: priority_crit
* **cluster**: wmcs
* **instance**: cloudcephmon1004:9100
* **job**: node
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: wmcs
* [Source](https://prometheus-eqiad.wikimedia.org/ops/graph?g0.expr=max_over_time%28kernel_messages%7Bcategory%3D~%22keyword_%28panic%7Cerror%7Ctaint%29%7Cpriority_%28emerg%7Calert%7Ccrit%7Cerr%29%22%7D%5B1d%5D%29+%3E+0&g0.tab=1)
---
* **dashboard**: https://grafana.wikimedia.org/d/b013af4c-d405-4d9f-85d4-985abb3dec0c/wmcs-kernel-errors?orgId=1&var-instance=cloudcephmon1004
* **description**: The server cloudcephmon1004 logged some kernel errors in the past 24h
* **runbook**: https://wikitech.wikimedia.org/wiki/Portal:Cloud_VPS/Admin/Runbooks/KernelErrors
* **summary**: Server cloudcephmon1004 logged kernel errors
* **alertname**: KernelErrors
* **category**: priority_err
* **cluster**: wmcs
* **instance**: cloudcephmon1004:9100
* **job**: node
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: wmcs
* [Source](https://prometheus-eqiad.wikimedia.org/ops/graph?g0.expr=max_over_time%28kernel_messages%7Bcategory%3D~%22keyword_%28panic%7Cerror%7Ctaint%29%7Cpriority_%28emerg%7Calert%7Ccrit%7Cerr%29%22%7D%5B1d%5D%29+%3E+0&g0.tab=1)
**Steps to replicate the issue**:
* go to Library Card Platform
* click "Access Collection" at [[ https://wikipedialibrary.wmflabs.org/partners/126/ | OECD Data ]], [[ https://wikipedialibrary.wmflabs.org/partners/127/ | OECD Multimedia Gallery ]] or [[ https://wikipedialibrary.wmflabs.org/partners/125/ | OECD iLibrary ]]
**What happens?**:
Volunteers have reported that sometimes access is only possible on every 5th to 10th attempt and in the other cases the error message “Website is not accessible via this address.” appears. I was able to reproduce this problem, although not with the same regularity.
My error message referred to: //Ray ID: 9344b0941f5acf03 (Performance & security by Cloudflare)// – see screenshot
{F59342016}
**What should have happened instead?**:
Access to the OECD collections.
Steps to reproduce:
Do:
```docker compose exec mediawiki /bin/bash /docker/install.sh```
Results:
An error message says you shouldn't call install.php directly:
```
*******************************************************************************
NOTE: Do not run maintenance scripts directly, use maintenance/run.php instead!
Running scripts directly has been deprecated in MediaWiki 1.40.
It may not work for some (or any) scripts in the future.
***************************************************************
```
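Per the notice itself, the fix is presumably for `/docker/install.sh` to invoke the installer through `maintenance/run.php` (e.g. `php maintenance/run.php install`) rather than calling `install.php` directly.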
- Implement the design shown here: https://phabricator.wikimedia.org/T391262
- Final wording:
- Header: Review needed
- Body: The query service might not be the optimal choice for getting the information you require. We recommend exploring our alternative data access methods to find the one more suited to your needs.
- CTA 1 - Run query anyway
- CTA 2 - Learn about alternatives
- Note: If the Codex components are difficult to implement in the WDQS UI, use what is available to build a component as close to these as possible
https://bidra.wikimedia.no/gigave
The link styling for the "Les betingelsene for gaven" ("Read the terms for the gift") link needs to match other links (underlining), so it's clearer that it is actually a link.
As a follow-up to T391444, there are still some hooks in `extension-repo.json` that use static methods. Migrate these to use constructible classes, implementing the associated hook interfaces if they exist.
**Acceptance Criteria**
- [] All hooks in the `Hooks` section of Wikibase's `extension-repo.json` reference HookHandlers
- [] For all hooks that have a hook interface, the hook handler classes implement the interface
As a follow-up to T391442, there are still some hooks in `extension-client.json` that use static methods. Migrate these to use constructible classes, implementing the associated hook interfaces if they exist.
**Acceptance Criteria**
- [] All hooks in the `Hooks` section of Wikibase's `extension-client.json` reference HookHandlers
- [] For all hooks that have a hook interface, the hook handler classes implement the interface
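For both tasks, the target shape is the standard HookHandler pattern; a minimal sketch using a real core hook interface (the handler class name and moved logic are hypothetical):
```lang=php
<?php
namespace Wikibase\Client\Hooks;

use MediaWiki\Hook\SidebarBeforeOutputHook;

// Replaces a legacy static entry such as
//   "Hooks": { "SidebarBeforeOutput": "SomeClass::onSidebarBeforeOutput" }
// with a HookHandlers registration in extension-client.json:
//   "HookHandlers": { "sidebar": { "class": "Wikibase\\Client\\Hooks\\SidebarHookHandler" } },
//   "Hooks": { "SidebarBeforeOutput": "sidebar" }
class SidebarHookHandler implements SidebarBeforeOutputHook {

	public function onSidebarBeforeOutput( $skin, &$sidebar ): void {
		// Logic moved unchanged from the old static method.
	}
}
```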
| [[ https://phabricator.wikimedia.org/T356384 | ← Past Event ]] | Current Event 2025 |
Let's improve Phorge by at least 0.01 % during this Wikimedia Hackathon {icon heart spin}
A consolidated team of people who specialize in random yelling @ Phorge will help you triage, write, and review patches in the upstream Phorge.it (formerly Phabricator), to ideally see your patch online in Wikimedia Phabricator before XMAS! 🎄
== Anonymized testimonials from past events ==
* Maria: «I spent months of my life writing local patches since there was no way to easily contribute in secure.phabricator.com ; but then I changed company and they had their own Phorge/Phabricator, indeed without my patches... noooooooo... so, now I've spent months of my life promoting local patches in Phorge.it so I can easily move to other companies and still have my patches. MUAHAHAHAH»
* ...
(↑ Feel free to add more ~~fake~~ ehm anonymized testimonials)
== Activities Corner ==
Feel free to propose a problem/feature to be triaged, or highlight a particular code review, etc. Thanks!
=== Triaging ===
- ...
- https://we.phorge.it/T15081 · Figure out if there are patches from Wikimedia's fork that are desirable to upstream in Phorge
- {T107254}
- https://we.phorge.it/tag/affects-wikimedia/
- https://we.phorge.it/maniphest/
- ...
=== Code Review ===
Are you opinionated about PHP? Share your comments on a patch!
- https://we.phorge.it/differential/
=== Testing ===
Are you opinionated about features? Explore the new ones!
https://we.phorge.it/w/changelog/next_up/
=== Setup ===
More bonus activities involving Phorge:
- How to set up a local Phorge
- a trip into the "[[ https://en.wikipedia.org/wiki/Five_whys | Five whys ]]" interrogation
- a free coin for the Arcanist command line gamebox - https://commons.wikimedia.org/wiki/File:Arcanist_command_arc_anoid_-_winner.png
- ...
---
And finally:
# Post-Event Activity Report
## 🫠 Maximum Effort 🔝 leading to Minimum Result ⏬ Ranking
NOTE: To be done.
This ranking is dedicated to giving a big thank you to small things that required exaggerated human effort.
Please edit this section to add interesting things to be mentioned. No requirements. Just add to the list and talk about that.
....
## Team
Blessed committers who can approve your patches upstream:
https://we.phorge.it/project/members/3/
- https://we.phorge.it/p/valerio.bozzolan/ - Phorge hacker
- https://we.phorge.it/p/aklapper/ - epic refactor Phorge hacker
- ...
Other people that may want to join and meet together to talk/hack 5 minutes about Phorge lol:
- ...
- ...
- ...
MW version: 1.43
PHP version: 8.0.30
DB system (MySQL, Blazegraph, etc.) and version: mysql
Issue
Detailed description of the issue and a stack trace if applicable:
The AdvancedSearch extension is unable to search for titles containing the given text: no results are returned, although normal search works fine. The `intitle:` filter is not working.
{F59336871}
Hello,
> tools.wmtran@tools-bastion-13:~$ cat TOOL_DISABLED
> Your tool has been disabled as part of the grid-engine
> shutdown timeline. We have tried to contact you about this timeline but
> have so far been unable to reach the tool owners.
What is grid-engine? My tool is only a PHP/JS page, it does not require any changes.
>
> If you need to temporarily re-enable this tool on the grid, begin by
> locating the task associated with your tool (it will be a subtask of
> https://phabricator.wikimedia.org/project/view/6135/)
I can't do this, as wmtran isn't listed on this page.
> and comment on
> that task. Please include your plan for migration as well as a request
> to be re-enabled for a limited time.
>
> You may also get speedier support with re-enabling by using the 'help' keyword
> on the IRC wikimedia-cloud channel. Please engage on phabricator before
> pinging admins on IRC.
>
> Details about this process can be found here:
>
> https://wikitech.wikimedia.org/wiki/News/Toolforge_Grid_Engine_deprecation#Timeline
>
Your help would be appreciated.
(Please also open a task or a request at either Wikitech or Phabricator when anything needs to be done about my tool, and enable a way for a notification to be visible when I am logged into another WMF wiki.)
Thanks.
Background: it'll be easier to work during the hackathon if we have easy access to the existing upload flows
[Pre-hackathon] Document current flows for adding an image through
- Android Commons App
- Mobile Web Upload Wizard
##### Android Commons App
https://commons.wikimedia.org/wiki/Commons:Mobile_app
https://commons-app.github.io/
**Entry points & choosing photo**
From the App
| Click on "+" | 3 options: | 1) Take photo with camera | or 2) Select photo from camera roll | or 3) Custom selector
| {F59336450} | {F59336452} | {F59336483} | {F59336494} | {F59336497}
From Photos App > Share
{F59336562}
**Step 1: Media Details**
| Starting screen | Media Details how-to | Edit image > Allows you to rotate image | Add location | Choose language (ability to add second language for caption and description) | Caption how-to | Description how-to
- First caption is used as the file name
- This step validates that the file name is unique
- Reminds users to add location before proceeding
**Step 2: Depicts**
**Step 3: Categories**
**Step 4: Media License**
**Submitting**
This task adds "bread crumbing" instrumentation to the donate button utilizing existing infrastructure to log clicks. This task will also implement branching logic to display specific screens to a user depending on their login state.
**Expected Behavior**
Bread crumbing: When the donate button is clicked, the click event will be visible as a breadcrumb event in the streaming API.
Branching Logic: When a user is logged into the app, they will see personalized slides. However, when the user is logged out, they will see “collective” slides.
#####Background
See Epic: {T384758}
#####Requirements
- @QA engineer: perform the following checks on the Tabs Beta release in addition to the smoke test and individual tickets
[] Test Tabs under slow Wi-Fi and weak cellular connections, and surface any performance issues
[] Test having a lot of tabs (400+ different tabs) and see if the App takes up substantially more local storage (via settings) than no/few tabs
[] Test adding lots of tabs with a device that is low on storage - is there a point where you are unable to open new tabs?
[] Test opening external links with a variety of browsers
[] Test opening article links in tabs from notifications, talk pages, user page links, and diff view.
[] Check Tabs overview for article titles that have different text styles
[] Check Tabs overview for different language variants and verify the correct titles & article descriptions are being pulled in
#####Engineering Notes
#####Background
See Epic: {T384758}
#####Requirements
[] @Engineers build feature with compatibility built in for VoiceOver and Dynamic Type
[] @Design - during design review of individual tasks or once tabs is in Beta, check over functionality for accessing articles and the tabs overview
- VoiceOver
- Dynamic Type
[] @Design notate necessary & nice-to-have fixes on this task, and send back to Engineers for fixes
[] @Engineers: Fix must-haves before release
#####Engineering Notes
#####Background
See Epic: {T384758}
#####Requirements
- Ensure back behavior is working as expected across tabs
- ‘Back’ button takes users back to the previous item they visited.
- Example flow for switching tabs:
- User is on ‘Explore’ → Taps [Polar bear](https://en.wikipedia.org/wiki/Polar_bear) → Opens [Arctic](https://en.wikipedia.org/wiki/Arctic) in a new tab and switches to it → Opens [Polar region](https://en.wikipedia.org/wiki/Polar_regions_of_Earth) → Opens [Earth](https://en.wikipedia.org/wiki/Earth)
- Taps ‘Back’ on [Earth](https://en.wikipedia.org/wiki/Earth) → Taps ‘Back’ on [Polar region](https://en.wikipedia.org/wiki/Polar_regions_of_Earth) → Taps ‘Back’ on [Arctic](https://en.wikipedia.org/wiki/Arctic) [user is taken to the previous tab] → Taps ‘Back’ on [Polar bear](https://en.wikipedia.org/wiki/Polar_bear) → User is back on ‘Explore’
- ‘Back’ button label: whenever possible, show the label of the previous destination next to the back button to make navigating articles easier. It has helped users immensely to navigate the prototype in usability testing ([T389390](https://phabricator.wikimedia.org/T389390)) and understand what happens
- Filter out non-article detours (talk pages, article history, files)
Nice to have - to be done on separate tasks
- Animate the tabs icon (increase then decrease its size) when users tap ‘Open in new background tab’ to emphasize that a new tab has opened. Some users didn’t realize the tab opened in the background since the interface didn’t respond ([T389390](https://phabricator.wikimedia.org/T389390)) #nicetohave
- If the label on the back button is too long, apply truncation (ellipsis with …)
#####Designs
| {F59120255} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=241-1780&t=ZAPGkfpwkUomBWcu-4) | {F59120320} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=241-1538&t=ZAPGkfpwkUomBWcu-4)| {F59122869} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=241-1416&t=ZAPGkfpwkUomBWcu-4) | {F59122964} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=261-10449&t=ZAPGkfpwkUomBWcu-4) | {F59123040} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=261-10378&t=ZAPGkfpwkUomBWcu-4) | {F59123107} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=241-1653&t=ZAPGkfpwkUomBWcu-4)
#####Engineering Notes
#####Background
See Epic: {T384758}
#####Requirements
- Show optional survey **once** if users have used “Open in new tab” (using the plus icon in the tabs overview, via long-press on an article link or long-press on the tabs icon in the navigation bar) **AND** visited the tab overview page.
- Toast is shown after feedback is submitted
- Allow free-text entry, and allow user to type and see multiple lines of text in the free-text entry
#####Designs
| {F59267409} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=282-12770&t=ZPcdeEc5oI3PAkmd-4) | {F59129773} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=434-2364&t=8sLxXXk5UHzJKHD9-4)
- Reuse components from T370308 and YiR
#####Engineering Notes
#####Background
See Epic: {T384758}
#####Requirements
- If the tab limit is reached (500), the article always opens in the current tab
- If the limit is reached, show a toast message **without buttons** with the following copy any time they try to open a new tab from the contextual menu, the "new tab" option in the tabs overview, or from an external link:
- Tab limit reached (500). Please close one or more tabs.
To be done in separate tasks in V2
- Offering "close all tabs" option here
- Save all tabs to reading list flow
#####Designs
https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=261-11925&t=CLBPHIjMmq3eHD3T-4
#####Engineering Notes
#####Background
See Epic: {T384758}
#####Requirements
- When a user taps a Tab from the overview, open that article
- Remember the user's previous scroll position and immediately show the article in that position (avoid a scrolling motion flashing to the user)
- Only the most recently visited article in the Tab needs to be saved in the tab overview
- Filter out non-article detours (talk pages, article history, files) - if someone left off on one of these pages, show the last visited article in the Tab
- Accessing an article from Tabs overview should count as a pageview/impression in the instrumentation
- If the user has already viewed the tab once and is re-accessing it, count it as a new pageview (TBD: Should we check how Web handles this in the browser, when an article is open and folks access it multiple times)
#####Engineering Notes
#####Background
See Epic: {T384758}
#####Requirements
- Users can open articles in new tabs and go to them, or open them in the background for reading later
- Update the long-press contextual menu across the app to add two new options
- ‘Open in new tab’ (opens article link in new tab, and takes the user there)
- ‘Open in new background tab’ (opens article link in new tab, and does NOT take the user there)
- These two options always open the article in a new tab (regardless of whether an item is already open).
- Keep the current behavior when navigating links in articles with a short press
- Update the default behavior for opening articles with a short press
- Web browser:
- Article opens in new tab, user is taken there.
- If the tab limit is reached (500), the article opens in the current tab.
- Search input (anywhere in the app)
- Article opens in the current tab, user is taken there
- Explore:
- Article opens in the current tab, user is taken there.
- Places:
- Article opens in the current tab, user is taken there.
- Saved:
- Article opens in the current tab, user is taken there.
- History
- Article opens in the current tab, user is taken there.
- Anywhere else:
- Article opens in the current tab, user is taken there.
Nice-to-have
- Switch to existing tabs when an article is already open, instead of opening a new one [1]
- For web browsers: If the article is open at the top of an existing tab, switch to it
- For in-app search (anywhere in the app): If the article is open at the top of an existing tab, switch to it
[1] The history of that tab is not that important in this scenario. The main point is that tabs shouldn’t be duplicated to stay manageable. Back could simply take users to the root view from where the tabs has been opened initially (e.g. Explore, Places, Saved, History, Search).
#####Designs ([[ https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=261-10378&t=CLBPHIjMmq3eHD3T-4 | Figma ]])
| {F59335598} | {F59335600}
| Updated contextual menu | Nice to have, show open tab in search
- We aren’t complicating things for existing users who haven’t dealt with tabs before. Tabs are here when you use them, but stay out of the way if you don’t.
#####Engineering Notes
#####Background
See Epic: {T384758}
#####Requirements
- Add 2 tabs onboarding tooltips
- Show tab tooltips in the same session as the Wikipedia ‘W’ tooltip. This ensures a streamlined navigation onboarding experience for users.
- Ensure that users who have already seen the 'W' tooltip see the new tab-related tooltips too.
- It's ok to repeat showing the 'W' tooltip if needed
- Use sequential tooltips (similar to ‘Add image’) to introduce tabs
- Users must tap 'Got it' to continue interacting with the screen.
- The current behavior is to dismiss the tooltip once users click anywhere on the screen.
- To make sure users learn that there are tabs, we should “lock” the background, similar to the onboarding experience for ‘Add an image’.
- If the above is not feasible, reshow the tooltip until users tap 'Got it'. Maximum: 2 times.
Nice to have
- ‘Open in new tab’ tooltip pointing to a blue link in the article (it’s ok if it points to the article’s main content area to reduce complexity)
#####Designs
| {F59267836} [[ https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=454-2506&t=0bPtbBgKopsyCHDF-4 | Figma ]] | {F59118685} [Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=243-8453&t=wMZj2wNuI6b68ECt-4) | {F59118772}[Figma](https://www.figma.com/design/Z7ms5T86rziBoY9XKBOy62/iOS---Tabs?node-id=244-8664&t=wMZj2wNuI6b68ECt-4)
#####Engineering Notes
When analysing deadlocks the main command is `SHOW ENGINE INNODB STATUS` - this is not something we have permission to run.
I also note this write-up
https://dev.to/productive/how-we-handled-mysql-deadlocks-in-productive-part-1-15ce
talks about a `data_locks` table in the performance schema database - which might be useful? Although I don't see that table locally, so I wonder if it's MySQL-only (it was added in MySQL 8.0) and not available in native MariaDB.
Since we had the auth issue for a while and no one noticed it for about a month, we should think about a way to detect potential issues. With so many gateways working under Gravy, it is hard to notice if any one of them has an issue for a while.
We need to be more vigilant with Gravy transactions.
Hello, when I tried to run a transaction for Online Banking, I got the following error: {F59334846}. There was an outage at Adyen today impacting alternative payment types in the EU; however, I do not see any volume for this payment type for the last 2 weeks or so while our campaign has been running. If possible, can we re-enable this payment type, or at a minimum not show the button on the form while it is disabled?
**Steps to replicate the issue**:
* Go to a sandbox
* Enter the Wikitext `[[Special:TalkPage/foobar1234567890]]`
* Click save or preview
**What happens?**:
# A blue link is created to [[Special:TalkPage/foobar1234567890]], even though `[[Talk:foobar1234567890]]` doesn't actually exist.
# Clicking on the link takes you to https://en.wikipedia.org/wiki/Talk:Foobar1234567890
**What should have happened instead?**:
# The link should be a red link
# The link should go to https://en.wikipedia.org/w/index.php?title=Talk:Foobar1234567890&action=edit which is normal behaviour for red links.
**Notes**:
Special:TalkPage was introduced in {T242346}
The code is to be found in `includes/specials/redirects/SpecialTalkPage.php`
On 2025-05-06, Codex v2.0.0-rc1 will begin riding the deployment train. As that occurs, we can start testing on the wikis being deployed to.
### Testing plan
We'll test the following as applicable (some special pages won't appear on all wikis):
- Test in Vector 2022, Vector legacy, MinervaNeue, and Monobook
- Ensure the Vector 2022 header and sidebars do not scale with text size
- Test login and create account forms
- Test a messagebox in content (may need to add a CSS-only message to page content)
- Test Special:ContentTranslation
- Test Special:Block with `?usecodex=1` (multiblocks)
- Test Special:MediaSearch
- Test Special:Nearby
- Test Special:NewPagesFeed
- Test Special:Preferences (not a Codex UI; ensure nothing has changed)
#### Group 0 wikis (May 6)
- [] mediawiki.org
- [] test.wikipedia.org
- [] test.wikidata.org
- [] office.wikimedia.org
#### Group 1 wikis (May 7)
- [] wikidata.org
- [] wikifunctions.org
- [] commons.wikimedia.org
- [] meta.wikimedia.org
- [] ca.wikipedia.org
- [] he.wikipedia.org
#### Group 2 wikis (May 8-9)
- [] en.wikipedia.org
- Any other Wikipedias we think we should test
### Anticipated findings
Status quo:
- There should be no Less compilation errors; styles should load normally.
- Interfaces that do not use the Codex toolkit should be unchanged.
- Page titles and article text should be unchanged.
- In Vector 2022...
- The header and sidebars should not scale with text size. Only the content region should scale.
- The QuickSurveys UI and messages within content should not scale with text size.
Changes:
- Codex components and interfaces that use Codex design tokens may look slightly different but any changes should be either negligible or improvements. Some more noticeable visual changes include:
- Dialog titles have increased in size.
- Spacing between binary inputs within groups has reduced.
- Padding in non-inline Messages has reduced.
- In MonoBook...
- Codex interfaces (e.g. the login form, the Codex version of Special:Block) will use the small font mode and will therefore be based at a 0.875rem/14px font size. This means they will be ~10% larger than the existing size (based at 12.7px).
---
### Acceptance criteria
- [] Complete all testing on the dates specified
- [] Any findings are discussed within the team
See activities for CID 66493049 - 17.5K TY mail activities but 5 contributions.
Looking at a [[ https://github.com/toolforge/quarry/actions/runs/14513505830/job/40717349988?pr=75 | random CI job ]]:
```
Run docker run tox:01
flake8: OK ✔ in 0.22 seconds
pytest: OK ✔ in 0.09 seconds
black: OK ✔ in 0.09 seconds
flake8: OK (0.22 seconds)
pytest: OK (0.09 seconds)
black: OK (0.09 seconds)
mypy: OK (0.09 seconds)
congratulations :) (0.56 seconds)
```
This is not actually doing anything...
The Magnum dashboard doesn't seem to filter clusters (and cluster templates) by the specific project, and instead shows all the clusters.
When the underlying Gravy SDK's Gr4vyConfig::getTransaction returns null, we get this failmail:
Last chance exception handler fired.
TypeError@ SmashPig/PaymentProviders/Gravy/Api.php:148 (SmashPig\PaymentProviders\Gravy\Api::getTransaction(): Return value must be of type array, null returned)
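A minimal sketch of a guard at the failing call site (the method shape is assumed from the error, and the exception type is a stand-in for whatever SmashPig conventionally throws):
```lang=php
public function getTransaction( string $gatewayTxnId ): array {
	$rawResponse = $this->config->getTransaction( $gatewayTxnId );
	if ( $rawResponse === null ) {
		// Fail with a descriptive error rather than letting the declared
		// array return type blow up with a TypeError.
		throw new \RuntimeException(
			"Gravy SDK returned no transaction for ID $gatewayTxnId"
		);
	}
	return $rawResponse;
}
```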
As [[ https://meta.wikimedia.org/wiki/Talk:Web2Cit#c-Aaron_Liu-20250420161300-Aaron_Liu-20250220150500 | reported ]] by User:Aaron_Liu, Range transformation does not behave as expected when used with `itemwise=true`. Instead, it seems to behave just as it does with `itemwise=false`.
**Steps to replicate the issue** (include links if applicable):
* Create a translation template
* Use a selection step to select some string, for example `"some string"`
* Use a range transformation step with `itemwise=true`, for example `3:4`
**What happens?**:
In the example, the Range transformation is applied to the array `[ "some string", ]`. Therefore, the output is an empty array, since there is no 3rd or 4th item in the original array.
**What should have happened instead?**:
The Range transformation, used with `itemwise=true`, should have been applied to each individual string, considering each character of the string as an item in an array, as stated in the docs [[ https://meta.wikimedia.org/wiki/Web2Cit/Docs/Templates#Range_transformation | here ]].
The output may have been `[ "me", ]` instead.
Note that the documentation does not specify whether characters should be rejoined in a string after Range transformation (with `itemwise=true`) is applied.
**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
**Other information** (browser name/version, screenshots, etc.):
If the behavior described in the documentation is indeed the expected behavior, carefully consider which translation templates currently use Range transformation steps with `itemwise=true` and how outputs would change before deploying any changes.
Also note that the Join transformation has a similar "take string as a list of characters" behavior when `itemwise=true`, which does work as expected.
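To make the documented semantics concrete, here is a small illustration (in PHP, not Web2Cit's actual codebase) of the itemwise behavior described above, assuming a 1-based inclusive range and that characters are rejoined into a string:
```lang=php
<?php
// Apply a start:end range to each item, treating each string as a
// list of characters (the documented itemwise=true behavior).
function rangeItemwise( array $items, int $start, int $end ): array {
	return array_map(
		static function ( string $item ) use ( $start, $end ): string {
			$chars = mb_str_split( $item );
			return implode( '', array_slice( $chars, $start - 1, $end - $start + 1 ) );
		},
		$items
	);
}

// "some string" with range 3:4 selects the characters 'm' and 'e'.
var_dump( rangeItemwise( [ 'some string' ], 3, 4 ) ); // [ "me" ]
```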
== Common information
* **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status
* **description**: wmf_auto_restart_ssh-gitlab.service on gitlab1004:9100
* **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state
* **summary**: wmf_auto_restart_ssh-gitlab.service on gitlab1004:9100
* **alertname**: SystemdUnitFailed
* **instance**: gitlab1004:9100
* **name**: wmf_auto_restart_ssh-gitlab.service
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: collaboration-services
== Firing alerts
---
* **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status
* **description**: wmf_auto_restart_ssh-gitlab.service on gitlab1004:9100
* **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state
* **summary**: wmf_auto_restart_ssh-gitlab.service on gitlab1004:9100
* **alertname**: SystemdUnitFailed
* **instance**: gitlab1004:9100
* **name**: wmf_auto_restart_ssh-gitlab.service
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: collaboration-services
* [Source](https://prometheus-eqiad.wikimedia.org/ops/graph?g0.expr=%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%2C+name%29+group_left+%28team%29+systemd_unit_owner%29+%3D%3D+1+or+ignoring+%28team%29+%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%29+group_left+%28team%29+role_owner%7Bteam%21%3D%22wmcs%22%7D%29+%3D%3D+1&g0.tab=1)
On 2025-04-29, Codex v2.0.0-rc1 will be released and merged into MediaWiki core. Once that happens, we can start testing on beta wikis.
### Testing plan
Once the core patch is merged, we can begin testing on beta wikis:
- Beta English Wikipedia: https://en.wikipedia.beta.wmflabs.org/wiki/Main_Page
- Beta Meta-Wiki: https://meta.wikimedia.beta.wmflabs.org/wiki/Main_Page
We'll test the following:
- Test in Vector 2022, Vector legacy, MinervaNeue, and Monobook
- Ensure the Vector 2022 header and sidebars do not scale with text size
- Test login and create account forms
- Test a messagebox in content (may need to add a CSS-only message to page content)
- Test Special:ContentTranslation
- Test Special:Block with `?usecodex=1` (multiblocks)
- Test Special:MediaSearch
- Test Special:Nearby
- Test Special:NewPagesFeed
- Test Special:Preferences (not a Codex UI; ensure nothing has changed)
### Anticipated findings
Status quo:
- There should be no Less compilation errors; styles should load normally.
- Interfaces that do not use the Codex toolkit should be unchanged.
- Page titles and article text should be unchanged.
- In Vector 2022...
- The header and sidebars should not scale with text size. Only the content region should scale.
- The QuickSurveys UI and messages within content should not scale with text size.
Changes:
- Codex components and interfaces that use Codex design tokens may look slightly different but any changes should be either negligible or improvements. Some more noticeable visual changes include:
- Dialog titles have increased in size.
- Spacing between binary inputs within groups has reduced.
- Padding in non-inline Messages has reduced.
- In MonoBook...
- Codex interfaces (e.g. the login form, the Codex version of Special:Block) will use the small font mode and will therefore be based at a 0.875rem/14px font size. This means they will be ~10% larger than the existing size (based at 12.7px).
---
### Acceptance criteria
- [] Everything in the list above has been tested on at least one of the beta wikis
- [] Any findings are discussed within the team
As a user, I would like to Wikify my content, so that I can:
- Share my Wikified content with others, and/or
- Enhance my content consuming experience with Wikipedia.
**Acceptance Criteria**
1. Sketch 5-10 initial ideas for what WikiMe could be that fit within the "sweet spot" of "fun" and "serious" identified in Slide 12 of our [[ https://docs.google.com/presentation/d/1hrapQqw7TuqVG5U94RPabVq_F5Dtvy5wR9MMfJV3pgg/edit?usp=sharing | April 2025 Steerco presentation ]]. We aim to lean towards fun. These ideas should be centered around at least one of the following concepts:
- In both cases, users can import information--such as their Goodreads or Letterboxd lists, or even just their current interests--into a location.
- In our first case, the aim is to give users an artifact that users want to share with their friends and community. An example would be a personalized image of what a user's Wikipedia page could be. This use case is relatively one-time.
- In our second case, the aim is to drive traffic to Wikipedia by providing users with additional information (or other enhancements) that "only Wikipedia could provide." An example would be an enhanced reading list that adds context and relevant knowledge for the user to explore. This use case is relatively multi-use.
2. Important questions to ask include:
- Which external platforms should we target? Goodreads, Letterboxd, and RateYourMusic are good platforms that already filter for users that like to read. Are there other platforms we can leverage?
- Where should this live? Is this on our current app, on a separate portal, or somewhere else?
- How will users discover this feature? Can we approach these platforms with the angle of providing content enhancements for their users, or might they be opposed to it due to the nature of our feature idea?
Use case: for @RMurthy to help automate some elements of forecasting
An integration that would let us automatically write tables from MinIO into Google Sheets.
This would save a lot of the manual effort Runjini currently spends copying, pasting, calculating, and manipulating data from multiple tables in Superset into Google Sheets.
See https://docs.google.com/spreadsheets/d/1yfKqO0DD2Wo2HiwDfMVOuhBtMAco2ADNVJdxVrpg6Og/edit?gid=121501171#gid=121501171 for how she uses it :D
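As a feasibility sketch only (assuming service-account credentials, the `google/apiclient` and `aws/aws-sdk-php` packages, and invented bucket/object/tab names), the write path could look roughly like:
```lang=php
<?php
require 'vendor/autoload.php';

// MinIO speaks the S3 API, so the AWS SDK can read the exported table.
$s3 = new Aws\S3\S3Client( [
	'version' => 'latest',
	'region' => 'us-east-1',
	'endpoint' => 'https://minio.example.org', // assumed endpoint
	'use_path_style_endpoint' => true,
	'credentials' => [
		'key' => getenv( 'MINIO_KEY' ),
		'secret' => getenv( 'MINIO_SECRET' ),
	],
] );
$csv = (string)$s3->getObject( [ 'Bucket' => 'forecasting', 'Key' => 'table.csv' ] )['Body'];
$rows = array_map( 'str_getcsv', explode( "\n", trim( $csv ) ) );

// Push the rows into the Google Sheet via the Sheets API.
$client = new Google\Client();
$client->setAuthConfig( 'service-account.json' );
$client->addScope( Google\Service\Sheets::SPREADSHEETS );
$sheets = new Google\Service\Sheets( $client );
$body = new Google\Service\Sheets\ValueRange( [ 'values' => $rows ] );
$sheets->spreadsheets_values->update(
	'1yfKqO0DD2Wo2HiwDfMVOuhBtMAco2ADNVJdxVrpg6Og', // the sheet linked above
	'Sheet1!A1', // assumed tab name
	$body,
	[ 'valueInputOption' => 'RAW' ]
);
```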
When a donor starts a recurring charge in e.g. Brazil and then gets their country changed to e.g. US, we start seeing the recurring charges fail with "Processor failed to create new payment with response:Can't map fiscal number to Gravy Tax ID"
Note that the country change in the case of CID 21070433 is also suspect, but we should figure out where to stash that original country, probably on something related to the contribution_recur.
== Background
- Now that we are starting implementation for empty search recommendations, we need a feature flag
== User story
As a user, I want a responsive search that works consistently across my devices and has the same features on both desktop and mobile.
As a developer, I want an easy way to test and develop the empty search recommendation feature.
== Requirements
- A feature flag is created and used in Minerva; a class is present or not present in the body element depending on whether the flag is on or off
- Query params work for the feature flag
- Default config is deployed that disables the flag
- The flag is enabled on the beta cluster
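A hypothetical sketch of the flag wiring (the config name, query parameter, and body class are all invented for illustration), using the core `OutputPageBodyAttributes` hook to toggle the class:
```lang=php
// In a Minerva hooks class:
public static function onOutputPageBodyAttributes( $out, $skin, &$bodyAttrs ): void {
	// The query param overrides the (default-off) site config, so the
	// feature can be exercised anywhere and enabled wholesale on beta.
	$enabled = $out->getRequest()->getBool(
		'minervaemptysearchrecommendations',
		$out->getConfig()->get( 'MinervaEmptySearchRecommendations' )
	);
	if ( $enabled ) {
		$bodyAttrs['class'] .= ' minerva--empty-search-recommendations-enabled';
	}
}
```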
- Add activity tab data controller that contains A/B/C test logic for when to show the activity tab, and which variant of the activity tab to show. (see parent for details)
- Ensure History tab experience is brought back for those not eligible.
==== Error ====
* mwversion: 1.44.0-wmf.25
* reqId: `5a043710-8d61-4765-9d0f-c629de0d02f1`
* [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2025-04-20T06:30:58.258Z',to:'2025-04-21T15:36:55.039Z'))&_a=(query:(query_string:(query:'reqId:%225a043710-8d61-4765-9d0f-c629de0d02f1%22'))) | Find reqId in Logstash ]]
```name=normalized_message,lines=10
[{reqId}] {exception_url} Wikibase\Lexeme\Domain\Model\Exceptions\ConflictException: At least two forms with the same ID were provided: `L855-F1`
```
| Frame | Location | Call
| -- | -- | --
| from | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseLexeme/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Domain/Model/FormSet.php#92 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikibaseLexeme/src/Domain/Model/FormSet.php(92) ]] |
| #0 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseLexeme/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Domain/Model/LexemePatchAccess.php#43 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikibaseLexeme/src/Domain/Model/LexemePatchAccess.php(43) ]] | Wikibase\Lexeme\Domain\Model\FormSet->add(Wikibase\Lexeme\Domain\Model\Form)
| #1 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseLexeme/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Domain/Diff/LexemePatcher.php#164 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikibaseLexeme/src/Domain/Diff/LexemePatcher.php(164) ]] | Wikibase\Lexeme\Domain\Model\LexemePatchAccess->addForm(Wikibase\Lexeme\Domain\Model\Form)
| #2 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseLexeme/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Domain/Model/Lexeme.php#350 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikibaseLexeme/src/Domain/Model/Lexeme.php(350) ]] | Wikibase\Lexeme\Domain\Diff\LexemePatcher::Wikibase\Lexeme\Domain\Diff\{closure}(Wikibase\Lexeme\Domain\Model\LexemePatchAccess)
| #3 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseLexeme/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Domain/Diff/LexemePatcher.php#165 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikibaseLexeme/src/Domain/Diff/LexemePatcher.php(165) ]] | Wikibase\Lexeme\Domain\Model\Lexeme->patch(Closure)
| #4 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseLexeme/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Domain/Diff/LexemePatcher.php#92 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikibaseLexeme/src/Domain/Diff/LexemePatcher.php(92) ]] | Wikibase\Lexeme\Domain\Diff\LexemePatcher->patchForms(Wikibase\Lexeme\Domain\Model\Lexeme, Wikibase\Lexeme\Domain\Diff\LexemeDiff)
| #5 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+blame/refs/heads/wmf/1.44.0-wmf.25/lib/packages/wikibase/data-model-services/src/Diff/EntityPatcher.php#40 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/Wikibase/lib/packages/wikibase/data-model-services/src/Diff/EntityPatcher.php(40) ]] | Wikibase\Lexeme\Domain\Diff\LexemePatcher->patchEntity(Wikibase\Lexeme\Domain\Model\Lexeme, Wikibase\Lexeme\Domain\Diff\LexemeDiff)
| #6 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+blame/refs/heads/wmf/1.44.0-wmf.25/repo/includes/Content/EntityContent.php#406 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/Wikibase/repo/includes/Content/EntityContent.php(406) ]] | Wikibase\DataModel\Services\Diff\EntityPatcher->patchEntity(Wikibase\Lexeme\Domain\Model\Lexeme, Wikibase\Lexeme\Domain\Diff\LexemeDiff)
| #7 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+blame/refs/heads/wmf/1.44.0-wmf.25/repo/includes/Actions/EditEntityAction.php#295 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/Wikibase/repo/includes/Actions/EditEntityAction.php(295) ]] | Wikibase\Repo\Content\EntityContent->getPatchedCopy(Wikibase\Repo\Content\EntityContentDiff)
| #8 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+blame/refs/heads/wmf/1.44.0-wmf.25/repo/includes/Actions/EditEntityAction.php#234 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/Wikibase/repo/includes/Actions/EditEntityAction.php(234) ]] | Wikibase\Repo\Actions\EditEntityAction->showUndoForm()
| #9 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/actions/ActionEntryPoint.php#728 | /srv/mediawiki/php-1.44.0-wmf.25/includes/actions/ActionEntryPoint.php(728) ]] | Wikibase\Repo\Actions\EditEntityAction->show()
| #10 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/actions/ActionEntryPoint.php#505 | /srv/mediawiki/php-1.44.0-wmf.25/includes/actions/ActionEntryPoint.php(505) ]] | MediaWiki\Actions\ActionEntryPoint->performAction(MediaWiki\Page\Article, MediaWiki\Title\Title)
| #11 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/actions/ActionEntryPoint.php#143 | /srv/mediawiki/php-1.44.0-wmf.25/includes/actions/ActionEntryPoint.php(143) ]] | MediaWiki\Actions\ActionEntryPoint->performRequest()
| #12 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/MediaWikiEntryPoint.php#202 | /srv/mediawiki/php-1.44.0-wmf.25/includes/MediaWikiEntryPoint.php(202) ]] | MediaWiki\Actions\ActionEntryPoint->execute()
| #13 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/index.php#58 | /srv/mediawiki/php-1.44.0-wmf.25/index.php(58) ]] | MediaWiki\MediaWikiEntryPoint->run()
| #14 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+blame/refs/heads/master/w/index.php#3 | /srv/mediawiki/w/index.php(3) ]] | require(string)
| #15 | {main} |
==== Impact ====
==== Notes ====
==== Error ====
* mwversion: 1.44.0-wmf.25
* reqId: `b5a963f1-8bee-44f6-8e2d-8f6e69ba942f`
* [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2025-04-20T10:40:34.111Z',to:'2025-04-21T15:30:57.354Z'))&_a=(query:(query_string:(query:'reqId:%22b5a963f1-8bee-44f6-8e2d-8f6e69ba942f%22'))) | Find reqId in Logstash ]]
```name=normalized_message,lines=10
[{reqId}] {exception_url} Wikimedia\Rdbms\DBQueryError: Error 1146: Table 'centralauth.discussiontools_item_revisions' doesn't exist
Function: MediaWiki\Extension\DiscussionTools\ThreadItemStore::insertThreadItems
Query: SELECT itr_id FROM `discussion
```
| Frame | Location | Call
| -- | -- | --
| from | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/Database.php#1232 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/Database.php(1232) ]] |
| #0 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/Database.php#1216 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/Database.php(1216) ]] | Wikimedia\Rdbms\Database->getQueryException(string, int, string, string)
| #1 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/Database.php#1190 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/Database.php(1190) ]] | Wikimedia\Rdbms\Database->getQueryExceptionAndLog(string, int, string, string)
| #2 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/Database.php#647 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/Database.php(647) ]] | Wikimedia\Rdbms\Database->reportQueryError(string, int, string, string, bool)
| #3 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/Database.php#1367 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/Database.php(1367) ]] | Wikimedia\Rdbms\Database->query(Wikimedia\Rdbms\Query, string)
| #4 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/Database.php#1318 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/Database.php(1318) ]] | Wikimedia\Rdbms\Database->select(array, string, array, string, array, array)
| #5 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php#780 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(780) ]] | Wikimedia\Rdbms\Database->selectField(array, string, array, string, array, array)
| #6 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/DiscussionTools/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ThreadItemStore.php#739 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/DiscussionTools/includes/ThreadItemStore.php(739) ]] | Wikimedia\Rdbms\SelectQueryBuilder->fetchField()
| #7 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/Database.php#2293 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/Database.php(2293) ]] | MediaWiki\Extension\DiscussionTools\ThreadItemStore->MediaWiki\Extension\DiscussionTools\{closure}(Wikimedia\Rdbms\DatabaseMySQL, string)
| #8 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/DBConnRef.php#127 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/DBConnRef.php(127) ]] | Wikimedia\Rdbms\Database->doAtomicSection(string, Closure, string)
| #9 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/rdbms/database/DBConnRef.php#661 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/rdbms/database/DBConnRef.php(661) ]] | Wikimedia\Rdbms\DBConnRef->__call(string, array)
| #10 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/DiscussionTools/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ThreadItemStore.php#803 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/DiscussionTools/includes/ThreadItemStore.php(803) ]] | Wikimedia\Rdbms\DBConnRef->doAtomicSection(string, Closure, string)
| #11 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/DiscussionTools/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/Hooks/DataUpdatesHooks.php#50 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/DiscussionTools/includes/Hooks/DataUpdatesHooks.php(50) ]] | MediaWiki\Extension\DiscussionTools\ThreadItemStore->insertThreadItems(MediaWiki\Revision\RevisionStoreRecord, MediaWiki\Extension\DiscussionTools\ContentThreadItemSet)
| #12 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/deferred/MWCallableUpdate.php#52 | /srv/mediawiki/php-1.44.0-wmf.25/includes/deferred/MWCallableUpdate.php(52) ]] | MediaWiki\Extension\DiscussionTools\Hooks\DataUpdatesHooks->MediaWiki\Extension\DiscussionTools\Hooks\{closure}(string)
| #13 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/deferred/DeferredUpdates.php#459 | /srv/mediawiki/php-1.44.0-wmf.25/includes/deferred/DeferredUpdates.php(459) ]] | MediaWiki\Deferred\MWCallableUpdate->doUpdate()
| #14 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/deferred/RefreshSecondaryDataUpdate.php#112 | /srv/mediawiki/php-1.44.0-wmf.25/includes/deferred/RefreshSecondaryDataUpdate.php(112) ]] | MediaWiki\Deferred\DeferredUpdates::attemptUpdate(MediaWiki\Deferred\MWCallableUpdate)
| #15 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/deferred/DeferredUpdates.php#459 | /srv/mediawiki/php-1.44.0-wmf.25/includes/deferred/DeferredUpdates.php(459) ]] | MediaWiki\Deferred\RefreshSecondaryDataUpdate->doUpdate()
| #16 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/Storage/DerivedPageDataUpdater.php#1833 | /srv/mediawiki/php-1.44.0-wmf.25/includes/Storage/DerivedPageDataUpdater.php(1833) ]] | MediaWiki\Deferred\DeferredUpdates::attemptUpdate(MediaWiki\Deferred\RefreshSecondaryDataUpdate)
| #17 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/page/WikiPage.php#1895 | /srv/mediawiki/php-1.44.0-wmf.25/includes/page/WikiPage.php(1895) ]] | MediaWiki\Storage\DerivedPageDataUpdater->doSecondaryDataUpdates(array)
| #18 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/jobqueue/jobs/RefreshLinksJob.php#304 | /srv/mediawiki/php-1.44.0-wmf.25/includes/jobqueue/jobs/RefreshLinksJob.php(304) ]] | MediaWiki\Page\WikiPage->doSecondaryDataUpdates(array)
| #19 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/jobqueue/jobs/RefreshLinksJob.php#214 | /srv/mediawiki/php-1.44.0-wmf.25/includes/jobqueue/jobs/RefreshLinksJob.php(214) ]] | MediaWiki\JobQueue\Jobs\RefreshLinksJob->runForTitle(MediaWiki\Title\Title)
| #20 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/EventBus/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/JobExecutor.php#88 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/EventBus/includes/JobExecutor.php(88) ]] | MediaWiki\JobQueue\Jobs\RefreshLinksJob->run()
| #21 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+blame/refs/heads/master/rpc/RunSingleJob.php#60 | /srv/mediawiki/rpc/RunSingleJob.php(60) ]] | MediaWiki\Extension\EventBus\JobExecutor->execute(array)
| #22 | {main} |
==== Impact ====
==== Notes ====
==== Error ====
* mwversion: 1.44.0-wmf.25
* reqId: `823f35ce-676e-493a-82a8-9598728e9e6f`
* [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2025-04-20T15:22:44.128Z',to:'2025-04-21T15:26:25.862Z'))&_a=(query:(query_string:(query:'reqId:%22823f35ce-676e-493a-82a8-9598728e9e6f%22'))) | Find reqId in Logstash ]]
```name=normalized_message,lines=10
[{reqId}] {exception_url} PHP Warning: Undefined array key 0
```
| Frame | Location | Call
| -- | -- | --
| from | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjects/ZString.php#28 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjects/ZString.php(28) ]] |
| #0 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjects/ZString.php#28 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjects/ZString.php(28) ]] | MediaWiki\Exception\MWExceptionHandler::handleError(int, string, string, int)
| #1 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZErrorFactory.php#402 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZErrorFactory.php(402) ]] | MediaWiki\Extension\WikiLambda\ZObjects\ZString->__construct(array)
| #2 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#603 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(603) ]] | MediaWiki\Extension\WikiLambda\ZErrorFactory::createZErrorInstance(string, array)
| #3 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#300 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(300) ]] | MediaWiki\Extension\WikiLambda\ZObjectFactory::extractObjectType(stdClass)
| #4 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#417 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(417) ]] | MediaWiki\Extension\WikiLambda\ZObjectFactory::create(stdClass)
| #5 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#352 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(352) ]] | MediaWiki\Extension\WikiLambda\ZObjectFactory::createKeyValues(array, string)
| #6 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ActionAPI/ApiPerformTest.php#127 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ActionAPI/ApiPerformTest.php(127) ]] | MediaWiki\Extension\WikiLambda\ZObjectFactory::create(stdClass)
| #7 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ActionAPI/WikiLambdaApiBase.php#82 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ActionAPI/WikiLambdaApiBase.php(82) ]] | MediaWiki\Extension\WikiLambda\ActionAPI\ApiPerformTest->run()
| #8 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/api/ApiMain.php#2005 | /srv/mediawiki/php-1.44.0-wmf.25/includes/api/ApiMain.php(2005) ]] | MediaWiki\Extension\WikiLambda\ActionAPI\WikiLambdaApiBase->execute()
| #9 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/api/ApiMain.php#947 | /srv/mediawiki/php-1.44.0-wmf.25/includes/api/ApiMain.php(947) ]] | MediaWiki\Api\ApiMain->executeAction()
| #10 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/api/ApiMain.php#918 | /srv/mediawiki/php-1.44.0-wmf.25/includes/api/ApiMain.php(918) ]] | MediaWiki\Api\ApiMain->executeActionWithErrorHandling()
| #11 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/api/ApiEntryPoint.php#152 | /srv/mediawiki/php-1.44.0-wmf.25/includes/api/ApiEntryPoint.php(152) ]] | MediaWiki\Api\ApiMain->execute()
| #12 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/MediaWikiEntryPoint.php#202 | /srv/mediawiki/php-1.44.0-wmf.25/includes/MediaWikiEntryPoint.php(202) ]] | MediaWiki\Api\ApiEntryPoint->execute()
| #13 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/api.php#44 | /srv/mediawiki/php-1.44.0-wmf.25/api.php(44) ]] | MediaWiki\MediaWikiEntryPoint->run()
| #14 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+blame/refs/heads/master/w/api.php#3 | /srv/mediawiki/w/api.php(3) ]] | require(string)
| #15 | {main} |
==== Impact ====
==== Notes ====
==== Error ====
* mwversion: 1.44.0-wmf.25
* reqId: `823f35ce-676e-493a-82a8-9598728e9e6f`
* [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2025-04-20T15:22:44.127Z',to:'2025-04-21T15:26:53.824Z'))&_a=(query:(query_string:(query:'reqId:%22823f35ce-676e-493a-82a8-9598728e9e6f%22'))) | Find reqId in Logstash ]]
```name=normalized_message,lines=10
[{reqId}] {exception_url} PHP Warning: Array to string conversion
```
| Frame | Location | Call
| -- | -- | --
| from | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#598 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(598) ]] |
| #0 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#598 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(598) ]] | MediaWiki\Exception\MWExceptionHandler::handleError(int, string, string, int)
| #1 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#300 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(300) ]] | MediaWiki\Extension\WikiLambda\ZObjectFactory::extractObjectType(stdClass)
| #2 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#417 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(417) ]] | MediaWiki\Extension\WikiLambda\ZObjectFactory::create(stdClass)
| #3 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ZObjectFactory.php#352 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ZObjectFactory.php(352) ]] | MediaWiki\Extension\WikiLambda\ZObjectFactory::createKeyValues(array, string)
| #4 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ActionAPI/ApiPerformTest.php#127 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ActionAPI/ApiPerformTest.php(127) ]] | MediaWiki\Extension\WikiLambda\ZObjectFactory::create(stdClass)
| #5 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikiLambda/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/ActionAPI/WikiLambdaApiBase.php#82 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/WikiLambda/includes/ActionAPI/WikiLambdaApiBase.php(82) ]] | MediaWiki\Extension\WikiLambda\ActionAPI\ApiPerformTest->run()
| #6 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/api/ApiMain.php#2005 | /srv/mediawiki/php-1.44.0-wmf.25/includes/api/ApiMain.php(2005) ]] | MediaWiki\Extension\WikiLambda\ActionAPI\WikiLambdaApiBase->execute()
| #7 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/api/ApiMain.php#947 | /srv/mediawiki/php-1.44.0-wmf.25/includes/api/ApiMain.php(947) ]] | MediaWiki\Api\ApiMain->executeAction()
| #8 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/api/ApiMain.php#918 | /srv/mediawiki/php-1.44.0-wmf.25/includes/api/ApiMain.php(918) ]] | MediaWiki\Api\ApiMain->executeActionWithErrorHandling()
| #9 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/api/ApiEntryPoint.php#152 | /srv/mediawiki/php-1.44.0-wmf.25/includes/api/ApiEntryPoint.php(152) ]] | MediaWiki\Api\ApiMain->execute()
| #10 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/MediaWikiEntryPoint.php#202 | /srv/mediawiki/php-1.44.0-wmf.25/includes/MediaWikiEntryPoint.php(202) ]] | MediaWiki\Api\ApiEntryPoint->execute()
| #11 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/api.php#44 | /srv/mediawiki/php-1.44.0-wmf.25/api.php(44) ]] | MediaWiki\MediaWikiEntryPoint->run()
| #12 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+blame/refs/heads/master/w/api.php#3 | /srv/mediawiki/w/api.php(3) ]] | require(string)
| #13 | {main} |
==== Impact ====
==== Notes ====
It looks like we lost the payment method being passed through when we changed the PayPal SmashPig code for the new info-from-PayPal flow.
==== Error ====
* mwversion: 1.44.0-wmf.25
* reqId: `fb41361f31cd8cb7e49f59aa`
* [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2025-04-20T15:14:23.804Z',to:'2025-04-21T15:16:49.797Z'))&_a=(query:(query_string:(query:'reqId:%22fb41361f31cd8cb7e49f59aa%22'))) | Find reqId in Logstash ]]
```name=normalized_message,lines=10
[{reqId}] {exception_url} ValueError: Path cannot be empty
```
| Frame | Location | Call
| -- | -- | --
| from | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/mime/MimeAnalyzer.php#545 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/mime/MimeAnalyzer.php(545) ]] |
| #0 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/mime/MimeAnalyzer.php#545 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/mime/MimeAnalyzer.php(545) ]] | fopen(string, string)
| #1 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/libs/mime/MimeAnalyzer.php#519 | /srv/mediawiki/php-1.44.0-wmf.25/includes/libs/mime/MimeAnalyzer.php(519) ]] | Wikimedia\Mime\MimeAnalyzer->doGuessMimeType(string)
| #2 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/MediaModeration/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Services/MediaModerationImageContentsLookup.php#157 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/MediaModeration/src/Services/MediaModerationImageContentsLookup.php(157) ]] | Wikimedia\Mime\MimeAnalyzer->guessMimeType(string)
| #3 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/MediaModeration/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Services/MediaModerationImageContentsLookup.php#85 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/MediaModeration/src/Services/MediaModerationImageContentsLookup.php(85) ]] | MediaWiki\Extension\MediaModeration\Services\MediaModerationImageContentsLookup->getThumbnailMimeType(ThumbnailImage)
| #4 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/MediaModeration/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Services/MediaModerationPhotoDNAServiceProvider.php#145 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/MediaModeration/src/Services/MediaModerationPhotoDNAServiceProvider.php(145) ]] | MediaWiki\Extension\MediaModeration\Services\MediaModerationImageContentsLookup->getImageContents(MediaWiki\FileRepo\File\OldLocalFile)
| #5 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/MediaModeration/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Services/MediaModerationPhotoDNAServiceProvider.php#62 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/MediaModeration/src/Services/MediaModerationPhotoDNAServiceProvider.php(62) ]] | MediaWiki\Extension\MediaModeration\Services\MediaModerationPhotoDNAServiceProvider->getRequest(MediaWiki\FileRepo\File\OldLocalFile)
| #6 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/MediaModeration/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Services/MediaModerationFileScanner.php#80 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/MediaModeration/src/Services/MediaModerationFileScanner.php(80) ]] | MediaWiki\Extension\MediaModeration\Services\MediaModerationPhotoDNAServiceProvider->check(MediaWiki\FileRepo\File\OldLocalFile)
| #7 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/MediaModeration/+blame/refs/heads/wmf/1.44.0-wmf.25/src/Job/MediaModerationScanFileJob.php#21 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/MediaModeration/src/Job/MediaModerationScanFileJob.php(21) ]] | MediaWiki\Extension\MediaModeration\Services\MediaModerationFileScanner->scanSha1(string)
| #8 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/EventBus/+blame/refs/heads/wmf/1.44.0-wmf.25/includes/JobExecutor.php#88 | /srv/mediawiki/php-1.44.0-wmf.25/extensions/EventBus/includes/JobExecutor.php(88) ]] | MediaWiki\Extension\MediaModeration\Job\MediaModerationScanFileJob->run()
| #9 | [[ https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+blame/refs/heads/master/rpc/RunSingleJob.php#60 | /srv/mediawiki/rpc/RunSingleJob.php(60) ]] | MediaWiki\Extension\EventBus\JobExecutor->execute(array)
| #10 | {main} |
==== Impact ====
==== Notes ====
The stack looks very similar to {T330049}, which was solved by changing upstream callers.
It looks like the call to `$thumbnail->getLocalCopyPath()` is returning `''` rather than `false`, as might be expected in some error conditions...
During our OpenSearch migration (T388610), `cirrussearch2078` failed its reimage and came back up as `elastic2078`. However, this host was firewalled off from the rest of the cluster and unable to join. As such, it responded to API calls on its primary and secondary ports with `503 master_not_discovered`. Despite the 503, the host was not removed from its load balancer pools. Additionally, none of the Elastic hosts appear to be logging anything to nginx locally, so I had to dig through Logstash to find the problem host.
Creating this ticket to:
[] Fix health checks so servers that respond with a 503 are automatically removed from the load balancer pool (or see if this is a depool ratio issue, since we're reimaging and a lot of the hosts are either down or missing); a sketch of such a health probe follows this list
[] Decide whether or not we want to turn on nginx logging and either: A) enable or B) document our decision not to enable it.
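A minimal sketch of the kind of active probe we'd want, assuming a plain HTTP check against the search hosts' REST port; the hostnames, port, and endpoint here are illustrative, not our actual load balancer configuration:
```lang=python
import urllib.error
import urllib.request

def is_healthy(host, port=9200, timeout=2.0):
    """Probe the search HTTP port; any 5xx (e.g. 503 master_not_discovered)
    should mark the host unhealthy so the load balancer depools it."""
    url = f"http://{host}:{port}/_cluster/health"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except urllib.error.HTTPError as e:
        # urlopen raises on 4xx/5xx responses; a 503 lands here.
        return e.code < 500
    except OSError:
        # Connection refused or timed out, e.g. a firewalled-off host.
        return False
```
Whether a failing host actually gets removed also depends on the depool threshold, per the first checklist item.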
```
2025-04-21 12:09:24 H=maps-warper4.maps.eqiad1.wikimedia.cloud [172.16.7.145]:45522 I=[172.16.2.248]:25 Warning: Mail sent from Cloud VPS using non-supported domain warper.wmflabs.org
```
Please update your application to use a [[ https://wikitech.wikimedia.org/wiki/Help:Email_in_Cloud_VPS | supported domain ]] like `wmcloud.org`; arbitrary subdomains are not supported because we don't have SPF/DKIM set up for them and using them will likely lead to deliverability issues. {T366935} is planning to outright reject sending this type of mail.
```
2025-04-21 12:24:34 H=wikiwho01.wikiwho.eqiad1.wikimedia.cloud [172.16.6.48]:59444 I=[172.16.2.248]:25 Warning: Mail sent from Cloud VPS using non-supported domain localhost
```
As far as I can tell, this mail is being dropped at the destination due to the invalid sender, but it would be great if wikiwho didn't try to send it in the first place.
Per 5 (five) local discussions without opposition:
1) [[ https://sr.wikisource.org/wiki/Викизворник:Гласање/Додавање_могућности_блокирања_Филтеру_против_злоупотребе | sr.Wikisource ]],
2) [[ https://sr.wikiquote.org/wiki/Викицитат:Гласање/Додавање_могућности_блокирања_Филтеру_против_злоупотребе | sr.Wikiquote ]],
3) [[ https://sr.wikinews.org/wiki/Викиновости:Гласање/Додавање_могућности_блокирања_Филтеру_против_злоупотребе | sr.Wikinews ]],
4) [[ https://sr.wikibooks.org/wiki/Викикњиге:Гласање/Додавање_могућности_блокирања_Филтеру_против_злоупотребе | sr.Wikibooks ]],
5) [[ https://sr.wiktionary.org/wiki/Викиречник:Гласање/Додавање_могућности_блокирања_Филтеру_против_злоупотребе | sr.Wiktionary ]].
This task is to convert the current logic of storing gameplay history to use a proper database table, instead of the current json structure in SharedPreferences.
(Note: the "current" day's gameState can remain in SharedPrefs; this only applies to game history, which will need to be queried for statistics and archives.)
In order to future-proof the database table somewhat, let's think deliberately about the structure that the table should have:
Table name: `DailyGameHistory`
| `gameName` | `language` | `year` | `month` | `day` | `score` | `playType` | `gameData` |
| -- | -- | -- | -- | -- | -- | -- | -- |
| Identifier for the game, e.g. `whichcamefirst` | Language wiki on which this game was played | Year | Month | Day | The score for the game on this day | The "type" of gameplay this row represents | Additional data for this game, e.g. results for each question; can be a JSON string |
example:
| `whichcamefirst` | `en` | `2025` | `04` | `20` | `4` | `0` | `["true","true","false","true","true"]` |
* The `score` field is technically redundant, since it can be calculated based on the `gameData` field, but it will be useful and efficient this way for statistics.
* The `playType` field is a special field that indicates "how" this day's game was played. For example, was this game played on the day that it happened? Or was it played as part of an archived game? This can be an integer enum with values like `PLAYED_ON_SAME_DAY=0` and `PLAYED_ON_ARCHIVE=1`. This will be useful for calculating proper "streaks". If we allow users to play previous days' games, those games should not retroactively increase the streak of the player.
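To make the structure concrete, here is a minimal sketch of the proposed table and an example streak-relevant query, using SQLite for illustration (the app would presumably use its platform's database layer; the column types are assumptions, the names follow the table above):
```lang=python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE DailyGameHistory (
        gameName TEXT NOT NULL,     -- e.g. 'whichcamefirst'
        language TEXT NOT NULL,     -- language wiki, e.g. 'en'
        year     INTEGER NOT NULL,
        month    INTEGER NOT NULL,
        day      INTEGER NOT NULL,
        score    INTEGER NOT NULL,  -- redundant with gameData, kept for stats
        playType INTEGER NOT NULL,  -- 0=PLAYED_ON_SAME_DAY, 1=PLAYED_ON_ARCHIVE
        gameData TEXT,              -- JSON string with per-question results
        PRIMARY KEY (gameName, language, year, month, day)
    )
""")
conn.execute(
    "INSERT INTO DailyGameHistory VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
    ("whichcamefirst", "en", 2025, 4, 20, 4, 0,
     '["true","true","false","true","true"]'),
)
# Streak calculations only count same-day plays, per the playType rule above.
rows = conn.execute(
    "SELECT year, month, day FROM DailyGameHistory"
    " WHERE gameName = ? AND language = ? AND playType = 0"
    " ORDER BY year, month, day",
    ("whichcamefirst", "en"),
).fetchall()
```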
https://github.com/wikimedia links to the @wikimediatech Twitter account. AIUI that account is no longer in active use; can the link be removed?
Nagf has been broken since https://wikitech.wikimedia.org/wiki/News/2023_Cloud_VPS_metrics_changes.
In SUL2, you could go to the login page while already logged in, to log into a different account. There wasn't a link to it in the UI, but you could e.g. type it into the search box, link to `[[Special:UserLogin]]` in wikitext, or click the login link on the `Special:SpecialPages` page. In SUL3, this doesn't work:
https://test.wikipedia.org/wiki/Special:UserLogin?usesul3=0
https://test.wikipedia.org/wiki/Special:UserLogin?usesul3=1
(note you have to be logged in to see a difference)
You can still do it directly on the central domain, but that's a lot harder to find:
https://auth.wikimedia.org/testwiki/wiki/Special:UserLogin
The specific business logic used to be that Special:UserLogin redirects you away if there's a `returnto` parameter (on the assumption that you were redirected to the login page in multiple tabs when your session expired, and logged in in one of the other tabs, so you should now be sent back as if after a successful login), but shows the login page if there is no such parameter; the SUL3 redirect-based provider doesn't honor that.
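For clarity, the old behaviour amounts to something like this rough sketch (`redirect` and `render_login_form` are hypothetical placeholders, not the actual AuthManager code):
```lang=python
def special_user_login(user_is_logged_in, returnto):
    """Sketch of the SUL2-era Special:UserLogin behaviour for
    already-logged-in users, as described above."""
    if user_is_logged_in and returnto is not None:
        # Assume a stale login form in another tab: the user already logged
        # in elsewhere, so send them back as if after a successful login.
        return redirect(returnto)
    # No returnto: the visit is deliberate, so show the form and let the
    # user log in as a different account.
    return render_login_form()
```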
https://incubator.wikimedia.org/wiki/MediaWiki:Wminc-infopage-title-status-tocreate says "This is an open test wiki in the Wikimedia Incubator." However, it is also used for test wikis that are not actually open yet, but can be created.
Current example: https://incubator.wikimedia.org/wiki/Wp/na
I suggest changing the wording to match that of the automatically created info pages, `wminc-infopage-missingwiki-text` (example: https://incubator.wikimedia.org/wiki/Wp/fff).
Some of the files in https://github.com/wikimedia/mediawiki-extensions-cldr/tree/master/CldrMain start with "CldrMain", while others start with "CldrNames".
The `ingress-nginx` project that we heavily rely on for handling Toolforge traffic is now in maintenance mode: https://github.com/kubernetes/ingress-nginx/issues/13002
We have several options:
1. Do nothing.
2. Migrate to some other ingress provider.
3. Once it's ready, migrate to [[ https://github.com/kubernetes-sigs/ingate | InGate ]], which is a work-in-progress Ingress/[[ https://kubernetes.io/docs/concepts/services-networking/gateway/ | Gateway API ]] controller by the same people who maintain ingress-nginx.
4. Migrate to some other Gateway API implementation.
== Common information
* **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status
* **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state
* **alertname**: SystemdUnitFailed
* **instance**: gitlab1004:9100
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: collaboration-services
== Firing alerts
---
* **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status
* **description**: backup-restore.service on gitlab1004:9100
* **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state
* **summary**: backup-restore.service on gitlab1004:9100
* **alertname**: SystemdUnitFailed
* **instance**: gitlab1004:9100
* **name**: backup-restore.service
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: collaboration-services
* [Source](https://prometheus-eqiad.wikimedia.org/ops/graph?g0.expr=%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%2C+name%29+group_left+%28team%29+systemd_unit_owner%29+%3D%3D+1+or+ignoring+%28team%29+%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%29+group_left+%28team%29+role_owner%7Bteam%21%3D%22wmcs%22%7D%29+%3D%3D+1&g0.tab=1)
---
* **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status
* **description**: wmf_auto_restart_ssh-gitlab.service on gitlab1004:9100
* **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state
* **summary**: wmf_auto_restart_ssh-gitlab.service on gitlab1004:9100
* **alertname**: SystemdUnitFailed
* **instance**: gitlab1004:9100
* **name**: wmf_auto_restart_ssh-gitlab.service
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: collaboration-services
* [Source](https://prometheus-eqiad.wikimedia.org/ops/graph?g0.expr=%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%2C+name%29+group_left+%28team%29+systemd_unit_owner%29+%3D%3D+1+or+ignoring+%28team%29+%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%29+group_left+%28team%29+role_owner%7Bteam%21%3D%22wmcs%22%7D%29+%3D%3D+1&g0.tab=1)
It would be nice to have WikimediaDebug support in Vagrant. This would be:
* A new hostname like http://performance.local.wmftest.net:8080/
* Provision XHGui/Speedscope and serve their frontends under this hostname
* Provision the Tideways PHP extension as in production and save the profiles to XHGui or Excimer UI (https://gerrit.wikimedia.org/g/operations/mediawiki-config/+/master/src/Profiler.php)
* Add support in the browser extension to set the X-Wikimedia-Debug header and detect Vagrant, and correctly link to Logstash/XHGui/Excimer UI from the browser extension
**Feature summary** (what you would like to be able to do and where):
Implement [[https://docs.marimo.io/ | Marimo]] notebooks in PAWS as an alternative to Jupyter. It can be done through this [[ https://github.com/jyio/jupyter-marimo-proxy | proxy ]], but another implementation can be used.
**Benefits** (why should this be implemented?):
- Increased Flexibility: more choices mean more possibilities for users
- Enhanced User Experience: a different interface that some users may find more intuitive
- Consistent State: Marimo ensures that notebook code, outputs, and program state are always consistent, eliminating the hidden state issues common in Jupyter
- Interactivity: Marimo provides UI elements like sliders and interactive plots that are automatically synchronized with Python
**Steps to replicate the issue** (include links if applicable):
* Be logged into https://idm.wikimedia.org/
* Click the [sign out](https://idm.wikimedia.org/accounts/logout/?next=/) link in the menu
* Click the “Wikimedia Developer Single Sign On” button
**What happens?**:
I am immediately logged back in, with no opportunity to log in as anybody else.
**What should have happened instead?**:
I should be asked for my username and password again.
**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
N/A (Bitu has tagged versions, but doesn’t appear to show the current version anywhere in the UI, so all I can say is “probably the latest version”)
**Other information** (browser name/version, screenshots, etc.):
Apparently the correct way to log out / **workaround** is to log out in [idp](https://idp.wikimedia.org/login) ([logout link](https://idp.wikimedia.org/logout))
Currently the MediaWiki release tarballs are generated by an individual person on their local workstation and then directly uploaded to releases.wikimedia.org for public usage. Having that process be bit-for-bit [[ https://en.wikipedia.org/wiki/Reproducible_builds | reproducible ]] would allow setting up automated tests to ensure that those tarballs match what we expect them to match.
I ran [[ https://diffoscope.org/ | diffoscope ]] to compare the "official" 1.43.1 tarball and one I generated with `makerelease2.py` locally. The diff is here: https://people.wikimedia.org/~taavi/misc/diffoscope-mw-1.43-1/
Seemingly the main issue is that the generated files include local usernames and file modification timestamps. I think we could, for example, use the date of the Git tag as a consistent timestamp?
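As a proof of concept, here is a minimal sketch of that normalization, assuming the tag's commit date is an acceptable timestamp; the tag name and paths are illustrative, and `makerelease2.py` would need the equivalent changes:
```lang=python
import gzip
import subprocess
import tarfile

# Use the Git tag's commit time as the single, reproducible timestamp.
tag_time = int(subprocess.check_output(
    ["git", "log", "-1", "--format=%ct", "1.43.1"]).strip())

def normalize(info):
    # Scrub local usernames/uids and clamp mtimes to the tag date:
    # the two sources of nondeterminism visible in the diffoscope output.
    info.mtime = tag_time
    info.uid = info.gid = 0
    info.uname = info.gname = "root"
    return info

# The gzip header embeds its own timestamp, so pin that too.
with gzip.GzipFile("mediawiki-1.43.1.tar.gz", "wb", mtime=tag_time) as gz:
    with tarfile.open(fileobj=gz, mode="w") as tar:
        # tarfile.add() walks directories in sorted order (Python >= 3.7),
        # so entry order is deterministic as well.
        tar.add("mediawiki-1.43.1", filter=normalize)
```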
No thumbnail available at https://commons.wikimedia.org/wiki/File:Himalaya,_Indian_Atlas,_sheet_66_(15219000).jpg
```
Request served via cp3079 cp3079, Varnish XID 691707132
Upstream caches: cp3079 int
Error: 429, Too Many Requests at Sun, 20 Apr 2025 16:16:03 GMT
```
There was T307787, closed as a duplicate of T337649.
**Steps to replicate the issue** (include links if applicable):
* Create a page with content `-{[[man|gggg]]}-` or other piped links surrounded by `-{ }-` (a language variant block)
Find an example page at [[ https://zh.wiktionary.org/wiki/User:Hzy980512/sandbox/1 | my sandbox ]] on Chinese Wiktionary.
**What happens?**:
REST API `api/rest_v1/page/html/` returns faulty rendering of the link:
```
<span typeof="mw:LanguageVariant" data-mw-variant='{"disabled":{"t":"gggg]]"}}' id="mwBQ"></span>
```
**What should have happened instead?**:
It should render the link correctly and put it into the `data-mw-variant` attribute. For example, for wikitext `-{[[man]]}-` it works well, returning
```
<span typeof="mw:LanguageVariant" data-mw-variant='{"disabled":{"t":"<a rel=\"mw:WikiLink\" href=\"./man\" title=\"man\" data-parsoid='{\"stx\":\"simple\",\"a\":{\"href\":\"./man\"},\"sa\":{\"href\":\"man\"},\"dsr\":[25,32,2,2]}'>man</a>"}}' id="mwBA"></span>
```
**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
1.44.0-wmf.25 (964d7fd) on [[ https://zh.wiktionary.org/ | Chinese Wiktionary ]].
**Other information** (browser name/version, screenshots, etc.):
It seems to only affect the first link: if the variant block contains more piped links, the links other than the first one render correctly.
Currently MediaSearch searches only images by default. Proposed:
* We introduce a new tab "All media" which will display a montage of all media files matching a topic (including images, audio, videos and documents). Videos should be displayed as a thumbnail with a play button (to differentiate them from images). Audio files do not have thumbnails, so an icon may be displayed instead.
* "All media" should be the default tab.
`comms.http._decide_encoding` sometimes gives a warning like `WARNING: Unknown or invalid encoding 'ISO-8859-1,utf-8;q=0.7,*;q=0.7'` and returns None because encoding detection fails. But it could detect the encoding easily: `'ISO-8859-1,utf-8;q=0.7,*;q=0.7'` contains multiple encodings separated by commas, each with a quality factor, which is 1 by default (see the sketch after this list). The given string has the following encodings:
# ISO-8859-1 (quality 1.0)
# utf-8 (quality 0.7)
# * = wildcard for other encodings (quality 0.7)
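A sketch of the parsing `_decide_encoding` could do instead (a hypothetical helper, not the existing Pywikibot code): split on commas, honour the quality factors, and return the highest-quality charset that Python actually recognizes:
```lang=python
import codecs

def pick_encoding(header):
    """Parse e.g. 'ISO-8859-1,utf-8;q=0.7,*;q=0.7' and return the
    highest-quality encoding Python recognizes, or None."""
    candidates = []
    for part in header.split(","):
        name, _, params = part.partition(";")
        q = 1.0  # the quality factor defaults to 1
        for param in params.split(";"):
            key, _, value = param.partition("=")
            if key.strip().lower() == "q":
                try:
                    q = float(value)
                except ValueError:
                    pass
        candidates.append((q, name.strip()))
    for q, name in sorted(candidates, key=lambda c: -c[0]):
        if name == "*":  # wildcard, not a real codec
            continue
        try:
            codecs.lookup(name)
        except LookupError:
            continue
        return name
    return None

assert pick_encoding("ISO-8859-1,utf-8;q=0.7,*;q=0.7") == "ISO-8859-1"
```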
**Steps to replicate the issue** (include links if applicable):
* Start editing a page or start creating a new one. For reference, let's assume we're starting with the following text: "this test is a test to test find and replace bug"
* Open find and replace (does not matter whether by UI button or the keybinding)
* Type "test" into the 'find' field, "aaa" into replace. Hit the 'replace' button once. Hit 'done' to close the panel. The text is now "this aaa is a test to test find and replace bug".
* Reopen find and replace. UI displays "test" in the 'find' field, "aaa" in the 'replace' field, same way we've left it before.
* Hit replace.
**What happens?**:
The text is now "this aaa is a to test find and replace bug". The input phrase gets replaced with an empty string.
**What should have happened instead?**:
The replacement phrase used last time, which is still shown in the UI after reopening, should still be used after reopening, without requiring the user to cause an input event in the field.
**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
Reproducible on WMF wikis.
**Other information** (browser name/version, screenshots, etc.):
Reproducible in different browsers.
{F59310208}
**Steps to replicate the issue** (include links if applicable):
* Visit https://meta.wikimedia.org/wiki/Special:CentralAuth/User:GPT304
**What happens?**:
* "View account info" says "User:GPT304"
* Account info for "GPT304" is displayed
* The local wiki link for es.wikipedia.org is https://es.wikipedia.org/wiki/User:User:GPT304 (note the extra "User:")
* The edit count link is https://es.wikipedia.org/wiki/Special:Contributions/User:GPT304 (also an extra "User:")
When clicked, these links are both misleading (i.e. they say the user is registered/has no edits), as they are for the username "User:GPT304", not "GPT304" as intended.
**What should have happened instead?**:
The links should be corrected, or Special:CentralAuth/User:GPT304 should show an error message, or redirect to Special:CentralAuth/GPT304.
**Other information** (browser name/version, screenshots, etc.):
This problem is hidden on some (English-language?) wikis' contribs pages, as they fix the incorrect URL. For example, at https://meta.wikimedia.org/wiki/Special:CentralAuth/User:Commander_Keane the en.wiki edit count link works, while the es.wiki edit count link does not.
**Steps to replicate the issue** (include links if applicable):
* Use any modern and still supported version of MediaWiki (preferably 1.39.12 to 1.44.0)
* Create a template that uses this string: `{{#tag:pre|test-{0}?}}`.
**What happens?**:
On any non-Mandarin (zh) wiki, in this specific case an [English wiki on Miraheze](https://namuwitch.miraheze.org/wiki/), the entire string is escaped and returned as `{{#tag:pre|test-{0}?}}`, which is not the intended result. This ticket is related to T20958#237662, where a user reports that `<math>` operations are also being escaped when this setting is not set to false.
**What should have happened instead?**:
The string should've only returned `test-{0}`. However, because of the way language conversion works for the Mandarin (zh) language, the function doesn't take into account that conversion shouldn't apply when the wiki is set to any language other than Mandarin (zh), so wikis that aren't using Mandarin (zh) are hitting these issues.
To work around this on non-Mandarin (zh) wikis, set `$wgDisableLangConversion = true;` in `LocalSettings.php` and the issue will be resolved.
However, it shouldn't take setting this obscure variable to fix the issue; the function handling language conversion should account for it.
**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
| Product | Version
| -- | --
| MediaWiki | 1.43.1 (cde4276) 17:18, 10 April 2025
| PHP | 8.2.28 (fpm-fcgi)
| ICU | 72.1
| MariaDB | 10.11.11-MariaDB-deb12-log
| wikidiff2 | 1.13.0
| LuaSandbox | 4.1.2
| Lua | 5.1.5
| Pygments | 2.17.2
**Other information** (browser name/version, screenshots, etc.):
This issue was found on a wiki hosted by [Miraheze](https://meta.miraheze.org/wiki/Main_Page).
**Steps to replicate the issue** (include links if applicable):
* Please see [[https://www.wikifunctions.org/view/en/Z24094|Z24094]]
* Note the failing implementation
* Click details
**What happens?**:
The Wikidata item reference appears to contain a Typed Map rather than the expected string.
**What should have happened instead?**:
The expected string should be returned and the test should pass.
**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
**Other information** (browser name/version, screenshots, etc.):
This is an artificial recreation of a problem that has since been fixed (or worked around).
[[https://www.wikifunctions.org/view/en/Z23421|Implementation Z23421]] returns a (synthetic) Wikidata item reference object containing a String object. This is presumably valid and the implementation passes all the function’s test cases.
When the function was used in [[https://www.wikifunctions.org/view/en/Z24043|composition Z24043]], the composition failed with a mysterious error. {F59303087}
Alternative implementations for the [[https://www.wikifunctions.org/wiki/Z23419|inner function]] were created, allowing the composition to succeed.
Further investigation revealed that the [[https://www.wikifunctions.org/view/en/Z23050|Python implementation]] of [[https://www.wikifunctions.org/view/en/Z20041|Z20041]] would also fail if asked to return the string within the Wikidata item reference returned by Z23419 (although it, too, was passing all its test cases). This is because the object being returned was interpreted as a Typed Map rather than a String. This too was fixed (by returning the Z6K1).
Sandbox functions have been used to emulate the implementations as they were before being fixed.
* [[https://www.wikifunctions.org/view/en/Z24074|Sandbox JavaScript]] emulates Z23419, returning a synthesised Wikidata item reference object
* [[https://www.wikifunctions.org/view/en/Z24073|Sandbox composition]] emulates Z24041, with compositions calling either the emulated JavaScript function (Z24074) or the corrected function that it emulates (Z24041).
* [[https://www.wikifunctions.org/view/en/Z24093|Sandbox Python]] emulates Z20041. Called with a value returned from Sandbox JavaScript, it returns a Typed Map.
{F59303354}
It seems improper that a String object should ever be interpreted as a Typed Map, particularly in a context that should allow only objects that can resolve to strings (or references).
**Steps to replicate the issue**
* Be logged into a Wikimedia wiki
* Visit `Special:CreateAccount` on the local wiki (and be taken to a URL like `https://auth.wikimedia.org/testwiki/wiki/Special:CreateAccount?<parameters>`)
* Click on either the 'bell' icon or the 'tray' icon in the top-right of the screen (& next to your username)
**What happens?**
Even if you have read or unread notifications, the notifications popup says "There are no notifications."
**What should have happened instead?**:
The page shouldn’t (incorrectly) say that the end-user has no (read or unread) notifications; and - ideally - notifications would be displayed on `auth.wikimedia.org` as they would be on the local wiki itself.
**Steps to replicate the issue** (include links if applicable):
* Translate a few paragraphs of any article
* Go to the Content Translation main page
* See translations in progress
* Hover over the started translation progress bar
**What happens?**:
Nothing
**What should have happened instead?**:
A pop-up (tooltip) with information on what percentage of the translation has been finished. This is what happened until the recent UI update.
**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
**Other information** (browser name/version, screenshots, etc.):
The semantics of `EditResult::isNullEdit()` do not match the definition of a null edit given in <https://www.mediawiki.org/wiki/Manual:Purge#Null_edits>. It will also return true for null (dummy) revisions (see <https://www.mediawiki.org/wiki/Manual:Null_revision>). Indeed covering dummy revisions appears to be the original intent.
The method is [[https://codesearch.wmcloud.org/things/?q=-%3EisNullEdit|used fairly widely in extensions and skins]], and it's not immediately clear what the expected semantics are in each use case. To avoid further confusion, EditResult should use the same terminology as `PageRevisionUpdatedEvent`:
* `changedLatestRevisionId()` returns true for null edits but not for dummy revisions.
* `isEffectiveContentChange()` returns false for both null edits and dummy revisions.
* `isNominalContentChange()` returns false for dummy revisions but true for null edits.
The `isNullEdit` method should be deprecated and replaced by the three methods above. Ideally, callers would use `PageRevisionUpdatedEvent` for this information, but that requires all handlers for the `PageSaveCompleteHook` to be converted to listeners for `PageRevisionUpdatedEvent`.
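Restated as a truth table (the normal-edit row is my assumption, included for comparison; the other two rows follow the semantics described above):
```lang=python
# (changedLatestRevisionId, isEffectiveContentChange, isNominalContentChange)
SEMANTICS = {
    "normal edit":    (True,  True,  True),   # assumed baseline
    "null edit":      (True,  False, True),   # per the list above
    "dummy revision": (False, False, False),  # per the list above
}
```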
**Steps to replicate the issue**
* Navigate to <https://phabricator.wikimedia.org/maniphest/>.
* In the sidebar, under "Queries", click "Subscribed". (Alternatively, navigate to <https://phabricator.wikimedia.org/maniphest/query/subscribed/> directly.)
* Click on the "Edit Query" button near the top-right of the page, in order to view the filters that have been applied.
**What happens?**
In addition to the filter that removes tasks that you're not subscribed to, the "Subscribed" search query is also filtering on the "Open", "In Progress", & "Stalled" task statuses.
|{F59294337 width=600}|
This also affects queries made through the [[https://phabricator.wikimedia.org/conduit/method/maniphest.search/|maniphest.search Conduit API endpoint]] using the [[https://phabricator.wikimedia.org/conduit/method/maniphest.search/#queries|'subscribed' queryKey]].
**What should have happened instead?**
IMO, it would make more sense for the prebuilt "subscribed" query to not filter by task status, as it doesn't advertise that it'll be doing this. To me, especially from an API point-of-view, a filter labelled "subscribed" implies that task subscription status will be the //only// thing that the search query is filtering by. This additional filtering also doesn't make sense IMO when there's also a prebuilt "Open + Subscribed" filter immediately below it in the sidebar.
The current behaviour ended up massively confusing me for a bit while I was testing something that used the Phab Conduit API - to the point where I decided to file this task about it (lol) - as I'd just assumed the whole time that the prebuilt `subscribed` filter would be filtering by task subscription status and nothing more.
I'll leave the final call on this to the Phab maintainers though :)
**Background**:
Regex lists exist in MediaWiki installations, including:
- MediaWiki:Titleblacklist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Titleblacklist
- MediaWiki:Titlewhitelist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Titlewhitelist
- MediaWiki:filename-prefix-blacklist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Filename-prefix-blacklist
- MediaWiki:Spam-blacklist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Spam-blacklist
- MediaWiki:Captcha-addurl-whitelist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Captcha-addurl-whitelist
- MediaWiki:Spam-whitelist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Spam-whitelist
- MediaWiki:Email-blacklist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Email-blacklist
- MediaWiki:Email-whitelist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Email-whitelist
- MediaWiki:pageimages-blacklist, e.g., https://en.wikipedia.org/wiki/MediaWiki:Pageimages-blacklist
Plus
https://meta.wikimedia.org/wiki/Title_blacklist
And possibly others that I've missed.
**Observations**:
There is currently no syntax highlighting on the Regex pages on Wikipedia. Presumably, nobody has thought to do it yet.
Syntax highlighting is possible however using `<syntaxhighlight lang=python>` or `<syntaxhighlight lang=perl>`, or `<syntaxhighlight lang=bash>`, etc. Note that, unfortunately, there is no `<syntaxhighlight lang=regex>`. The colour schemes vary a little depending on the lingo, but you end up with something like:
{F59298014}
But where the code block begins...
```
# <syntaxhighlight lang=python>
```
The formatting ends up a little bit weird, because the parser initially treats it as WikiText:
{F59298046}
or
{F59298074}
It might be possible to start with the `<syntaxhighlight>` tag not in a comment, but that would be treated as regex and might have unintended consequences, although page titles can't include `<>` so maybe it won't have any impact.
OK, so there is a quick fix, but let's dig a little deeper...
**Which content model is best?**:
Two possible content models are possible for this (both work the same, functionally):
- Plain text, which permits *no* syntax highlighting. After T202424 at least this looks like plain text.
- Wikitext, which allows syntax highlighting, but this clearly isn't really wikitext. It doesn't function as wikitext, and doing so produces some formatting issues.
And btw, it clearly isn't CSS or JavaScript, but those two existing content models serve as examples of how to use content models on different types of pages, including appropriate syntax highlighting and line numbers.
**New content model is needed**:
I think what's needed here, therefore, is an additional content model called `Regex` (or similar name). This would wrap the whole page in a `<syntaxhighlight lang=python>` (or whatever tag is preferred), and add line numbers, but otherwise function as Wikitext, and allow `[[double-square bracket]]` links and urls, e.g.
{F59298147}
Thank you for your consideration. Sorry, this one is a bit long-winded.
**Notes**:
Add further notes here.
>>! In T59138#8885443, @Winston_Sung wrote:
> To-do:
>
> * Fix conversion table:
> ** 鬱 <≠> 郁
> ** 台 <≠> 臺
> * https://wuu.wikipedia.org/wiki/MediaWiki:Conversiontable/wuu-hans
> ```
> 箇=>箇;
> 纔=>纔;
> 睏=>睏;
> ```
> * https://wuu.wikipedia.org/wiki/MediaWiki:Conversiontable/wuu-hant
> ```
> 个=>个;
> 个人=>個人;
> 箇=>箇;
> ```
Document Wikisource-Wikidata Linking on Malayalam Wikisource
**Feature summary** (what you would like to be able to do and where):
The wscontest tool should show a dashboard for a contest, with stats like total users participated, total proofread pages, validated pages, etc.
**Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution):
Currently there is no dashboard available which can show the stats for any contest.
**Benefits** (why should this be implemented?):
It would make it easy to get the statistics of a contest hosted on the wscontest tool.
We recently had a spate of fail mail on `omnimail_recipient_process_unsubscribes`. It turned out to relate to an attempt to add data to a contact that had been hard deleted, but the contact_id was still wrong in the `civicrm_mailing_provider_data` table.
It looks like the contact was merged on 2024-09-21 13:35:06. I can't recall when we started updating the contact_id on that table when merges happen, but I am sure it was well before that, so maybe this was a merge that crashed? We can investigate by querying rows in that table with contact IDs not in the `civicrm_contact` table and attempting to fix them. We may be able to monitor them going forwards, like we do contributions against deleted contacts.
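A sketch of the investigation query (table and column names as described above; the connection details are placeholders):
```lang=python
import pymysql  # assumption: any MySQL client would do

ORPHANS_SQL = """
    SELECT mpd.contact_id, COUNT(*) AS n
    FROM civicrm_mailing_provider_data mpd
    LEFT JOIN civicrm_contact c ON c.id = mpd.contact_id
    WHERE c.id IS NULL
    GROUP BY mpd.contact_id
"""

conn = pymysql.connect(host="localhost", database="civicrm", user="reader")
with conn.cursor() as cur:
    # Rows whose contact_id no longer exists, e.g. after a crashed merge
    # or a hard delete; candidates for fixing and ongoing monitoring.
    cur.execute(ORPHANS_SQL)
    for contact_id, n in cur.fetchall():
        print(contact_id, n)
```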
QuickSurvey request for Design Research participant database recruitment which will also serve as the recruitment strategy for the Deep Readers project.
Objective: Recruit readers into the design research participant database
Goal: 5000 readers on en.wiki sign up for the database
Audience: en.wiki readers, mix of desktop and mobile web visitors, logged in or anonymous
Type: External on Airtable
When: Over the week of April 21
Duration: Until we reach 5000 sign-ups
Coverage: 0.01% (7000 unique devices), or per your recommendation
**Tasks**
[] Pre-deploy survey to target wiki
[] Create messages on EN wiki
[] Deploy to target wiki
[] Monitor sign-ups in Airtable (Daniel and Bethany)
[] Disable survey
**Message**
question: //Want to help improve Wikipedia by participating in research studies?//
description: //We’re looking for Wikipedia readers like you to take part in research studies that will help improve user experiences. Click the link below to join our research participant list. You’ll be contacted periodically with opportunities to participate, and you can unsubscribe at any time.//
link: [[ https://airtable.com/appNd1Rm2PloP90vU/shr0x60AzeK832ext? | https://airtable.com/appNd1Rm2PloP90vU/shr0x60AzeK832ext? ]]
privacy policy: //See the privacy notice for this form [[https://foundation.wikimedia.org/wiki/Legal:Research_Participant_List_Privacy_Notice | here]].//
In T388471, we will be inviting volunteers to review the Peacock Check language model using Annotool.
This task represents the work of updating Annotool to enable the above.
=== Requirements
1. Annotool is updated so that each edit/diff includes a free-text field wherein volunteers can offer context about what language they consider to be non-neutral within a given edit/diff
2. Annotool instances are created for each of the languages listed in the `=== Language instances` section below
-- //Note: the Machine Learning Team will be updating T388471 as `Eval Data` is ready for inclusion in Annotool.//
=== Language instances
| Wiki | Language | Status | Link | Notes |
|------------|-------------------|--------|------|-------|
| ar.wiki | Arabic | | | |
| cs.wiki | Czech | | | |
| de.wiki | German | | | |
| en.wiki | English | | | |
| es.wiki | Spanish | | | |
| fa.wiki | Persian | | | |
| fr.wiki | French | | | |
| he.wiki | Hebrew | | | |
| id.wiki | Indonesian | | | |
| it.wiki | Italian | | | |
| ja.wiki | Japanese | | | |
| nl.wiki | Dutch | | | |
| no.wiki | Norwegian Bokmål | | | |
| pl.wiki | Polish | | | |
| pt.wiki | Portuguese | | | |
| ro.wiki | Romanian | | | |
| ru.wiki | Russian | | | |
| tr.wiki | Turkish | | | |
| uk.wiki | Ukrainian | | | |
| zh.wiki | Chinese | | | |