    • Task
    This task is the same as {T203651}, but the 2024 edition. Taint-check has received some new features, and the runtime has increased. It's hard to set a specific goal as the definition of done for this task. As of 2024-03-03, it takes 2m 48s to run phan on MW core on my laptop. I'm hoping to decrease that by ~30 seconds if possible.
    • Task
    **Steps to replicate the issue** (include links if applicable): * visit https://en.m.wikipedia.org/wiki/SS_Kroonland with new parser HTML **What happens?**: {F42369416} **What should have happened instead?**: Infobox should be below first paragraph {F42369417} **Software version** (skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
    **Steps to replicate the issue** (include links if applicable): * visit mobile site with new parser **What happens?**: * all sections are expanded and there is no way to collapse them **What should have happened instead?**: * h2s should be collapsible **Software version** (skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
    **Steps to replicate the issue** (include links if applicable): * enable new parser * visit any mobile page * **What happens?**: {F42369342} **What should have happened instead?**: The text should be centered, with the correct text color. **Software version** (skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
    {fd2d21a36fde102dd8841459479c5ad9d8730306} implemented a solution for {T169695} that allows `webservice TYPE shell -- something` to work by passing `something` as the entrypoint for the container with output returning to the calling shell. I expected `toolforge webservice TYPE shell -- something` to work in the same way, but it appears that this is not currently the case. Here is an example from the wikibugs tool: ```lang=shell-session tools.wikibugs@tools-sgebastion-10:~$ webservice python3.9 shell -- venv-wikibugs2-39/bin/python3 -m wikibugs2 --help Usage: python -m wikibugs2 [OPTIONS] COMMAND [ARGS]... IRC announce bot for issue tracker and forge events. Options: --version Show the version and exit. -v, --verbose Increase debug logging verbosity --logfile FILE Log to this (rotated) log file --help Show this message and exit. Commands: gerrit Process Gerrit event-stream events and enqueue IRC... irc Read messages from redis, format them, and send them to... phorge Process Phorge events and enqueue data for creating IRC... update-credits Update CREDITS file with current contributors. ``` ```lang=shell-session, counterexample tools.wikibugs@tools-sgebastion-10:~$ toolforge webservice python3.9 shell -- venv-wikibugs2-39/bin/python3 -m wikibugs2 --help usage: toolforge webservice [-h] [--template TEMPLATE_FILE] [--backend {gridengine,kubernetes}] [--release {buster}] [-m MEMORY] [-c CPU] [-r REPLICAS] [--buildservice-image BUILDSERVICE_IMAGE] [--mount {all,none}] [--health-check-path HEALTH_CHECK_PATH] [-f] [-l LAST] [TYPE] ACTION [... [... ...]] toolforge webservice: error: argument -m/--mem: wikibugs2 is not a valid Kubernetes quantity ``` I'm not yet sure if this is an unintended side effect of nested argparse systems or some deeper issue in the `toolforge` command dispatch process.
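The interception can be reproduced with a plain argparse sketch (hypothetical parser and option names, not the actual toolforge code): once the protective `--` separator is lost anywhere in the dispatch chain, a later `-m` intended for the wrapped interpreter is claimed by the outer `--mem` option.

```lang=python
import argparse

# Hypothetical stand-in for the outer `toolforge webservice` parser; the real
# option table is much larger, but -m/--mem is what matters here.
parser = argparse.ArgumentParser(prog="toolforge webservice", add_help=False)
parser.add_argument("-m", "--mem")          # Kubernetes memory quantity
parser.add_argument("type_", nargs="?")
parser.add_argument("action")
parser.add_argument("extra", nargs="*")

# Same argv as the failing call, but without the protective "--" separator,
# as if a nested parse had already consumed it.
argv = ["python3.9", "shell", "venv-wikibugs2-39/bin/python3",
        "-m", "wikibugs2", "--help"]
ns, unknown = parser.parse_known_args(argv)

# argparse freely interleaves optionals with positionals, so "-m wikibugs2"
# is matched against --mem instead of staying with the entrypoint command.
print(ns.mem)     # "wikibugs2" -- the value the real CLI then rejects
print(ns.extra)   # only part of the intended entrypoint survives
```

This would be consistent with the observed error, where `-m/--mem` rejects `wikibugs2` as a memory quantity.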
    • Task
    **Feature summary** (what you would like to be able to do and where): When visiting the libup web view, the libraries bundled in `mediawiki/vendor` should be marked as wmf deployed 🗄 to match what happens on wmf servers. In the list under https://libup.wmcloud.org/r?branch=main, the repos `mediawiki/services/parsoid` and `mediawiki/libs/IPUtils` should be marked with 🗄, as they are listed as libraries on https://libup.wmcloud.org/r/mediawiki/vendor?branch=main. This could be tricky, as the package name in `mediawiki/vendor` is not the repo name used in libup. **Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution): When looking at the list of repos with errors, it is relevant whether a repo is wmf deployed, as a missing automatic security update may have more impact there than in other libraries, due to higher usage. **Benefits** (why should this be implemented?): Easier to spot all deployed repos
    • Task
    **Feature summary** (what you would like to be able to do and where): When visiting the libup web view, the repo `mediawiki/core` should be marked as wmf deployed 🗄 and bundled with the tarball 🔮, to match what happens on wmf servers and in release management. https://libup.wmcloud.org/r?branch=main **Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution): When looking at the list of repos with errors, it is relevant whether a repo is wmf deployed or tarballed, as a missing automatic security update may have more impact there than in other extensions or skins, due to higher usage. **Benefits** (why should this be implemented?): Easier to spot all deployed or tarballed repos
    • Task
    Please rename my Phabricator account from `GeoffreyT2000` to `GTrang`, as my Wikimedia account was also renamed.
    • Task
    ``` D:\pwb\GIT\core>pwb logentries_tests -v tests: max_retries reduced from 15 to 1 setUpClass (__main__.TestLogentries) ... ERROR setUpClass (__main__.TestLogentryParams) ... ERROR setUpClass (__main__.TestSimpleLogentries) ... ERROR ====================================================================== ERROR: setUpClass (__main__.TestLogentries) ---------------------------------------------------------------------- Traceback (most recent call last): File "D:\pwb\GIT\core\tests\aspects.py", line 460, in setUpClass super().setUpClass() File "D:\pwb\GIT\core\tests\aspects.py", line 927, in setUpClass data['site'] = Site(data['code'], data['family'], ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\__init__.py", line 243, in Site _sites[key] = interface(code=code, fam=fam, user=user) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 140, in __init__ self.login(cookie_only=True) File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 400, in login if self.userinfo['name'] == self.user(): ^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 668, in userinfo uidata = uirequest.submit() ^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\data\api\_requests.py", line 993, in submit response, use_get = self._http_request(use_get, uri, body, headers, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\data\api\_requests.py", line 684, in _http_request response = http.request(self.site, uri=uri, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\comms\http.py", line 283, in request r = fetch(baseuri, headers=headers, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\comms\http.py", line 457, in fetch callback(response) File "D:\pwb\GIT\core\pywikibot\comms\http.py", line 322, in error_handling_callback raise FatalServerError(str(response)) pywikibot.exceptions.FatalServerError: 
HTTPSConnectionPool(host='infogalactic.com', port=443): Max retries exceeded with url: /w/api.php?action=query&meta=userinfo&uiprop=blockinfo%7Cgroups%7Chasmsg%7Cratelimits%7Crights&formatversion=2&maxlag=5&format=json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:992)'))) ====================================================================== ERROR: setUpClass (__main__.TestLogentryParams) ---------------------------------------------------------------------- Traceback (most recent call last): File "D:\pwb\GIT\core\tests\aspects.py", line 460, in setUpClass super().setUpClass() File "D:\pwb\GIT\core\tests\aspects.py", line 927, in setUpClass data['site'] = Site(data['code'], data['family'], ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\__init__.py", line 243, in Site _sites[key] = interface(code=code, fam=fam, user=user) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 140, in __init__ self.login(cookie_only=True) File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 400, in login if self.userinfo['name'] == self.user(): ^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 668, in userinfo uidata = uirequest.submit() ^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\data\api\_requests.py", line 993, in submit response, use_get = self._http_request(use_get, uri, body, headers, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\data\api\_requests.py", line 684, in _http_request response = http.request(self.site, uri=uri, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\comms\http.py", line 283, in request r = fetch(baseuri, headers=headers, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\comms\http.py", line 457, in fetch callback(response) File "D:\pwb\GIT\core\pywikibot\comms\http.py", line 322, in 
error_handling_callback raise FatalServerError(str(response)) pywikibot.exceptions.FatalServerError: HTTPSConnectionPool(host='infogalactic.com', port=443): Max retries exceeded with url: /w/api.php?action=query&meta=userinfo&uiprop=blockinfo%7Cgroups%7Chasmsg%7Cratelimits%7Crights&formatversion=2&maxlag=5&format=json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:992)'))) ====================================================================== ERROR: setUpClass (__main__.TestSimpleLogentries) ---------------------------------------------------------------------- Traceback (most recent call last): File "D:\pwb\GIT\core\tests\aspects.py", line 460, in setUpClass super().setUpClass() File "D:\pwb\GIT\core\tests\aspects.py", line 927, in setUpClass data['site'] = Site(data['code'], data['family'], ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\__init__.py", line 243, in Site _sites[key] = interface(code=code, fam=fam, user=user) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 140, in __init__ self.login(cookie_only=True) File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 400, in login if self.userinfo['name'] == self.user(): ^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\site\_apisite.py", line 668, in userinfo uidata = uirequest.submit() ^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\data\api\_requests.py", line 993, in submit response, use_get = self._http_request(use_get, uri, body, headers, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\data\api\_requests.py", line 684, in _http_request response = http.request(self.site, uri=uri, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\comms\http.py", line 283, in request r = fetch(baseuri, headers=headers, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\pwb\GIT\core\pywikibot\comms\http.py", 
line 457, in fetch callback(response) File "D:\pwb\GIT\core\pywikibot\comms\http.py", line 322, in error_handling_callback raise FatalServerError(str(response)) pywikibot.exceptions.FatalServerError: HTTPSConnectionPool(host='infogalactic.com', port=443): Max retries exceeded with url: /w/api.php?action=query&meta=userinfo&uiprop=blockinfo%7Cgroups%7Chasmsg%7Cratelimits%7Crights&formatversion=2&maxlag=5&format=json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:992)'))) ---------------------------------------------------------------------- Ran 0 tests in 1.536s FAILED (errors=3) D:\pwb\GIT\core> ```
    • Task
    Soon after [[https://gerrit.wikimedia.org/r/c/integration/config/+/1007742 | adding ]] Translate as a CI dependency of the CampaignEvents extension, lots of selenium and api-testing tests started failing. All the failures have one thing in common: the user performing a certain action (like creating an event) is reportedly not allowed to do so. To provide a bit of context, the CampaignEvents extension mostly only works with global accounts (via core's CentralIdLookup); having a global account is a prerequisite for creating events. Looking at the CI logs, it was immediately clear that the addition of Translate as a dependency brought in a total of 38 new dependencies, among which is CentralAuth. I still haven't found the courage to install CentralAuth locally, so I ran a few tests in [[https://gerrit.wikimedia.org/r/c/mediawiki/extensions/CampaignEvents/+/1008063 | this patch ]]. All the failing selenium and api-testing tests use the default user account to perform actions; this account is created by install.php. However, it looks like this account is just a local account, as there's no sign of it in the `globaluser` and `localuser` tables, which are in fact empty. If I create a new account via the API, CentralAuth makes it global and I can find it in its tables; still, no sign of the default account. I'm not sure if this is an intentional design choice, but it would be nice to change that, as it makes testing much easier. I also don't know whether this is just a CI thing, or if it actually happens for every new wiki. I also wanted to see if CentralAuth had a solution for this in its own selenium tests, but I think you already know what I found (or didn't). This also means that perhaps nobody ever noticed (it's maybe not too common for extensions to work with central accounts only). Finally, I'm not sure what tags to use here. I don't know if this is a bug (feature?) 
in CentralAuth itself, in how the installer creates the default account, or in how CI is configured; as such, I'm tagging all these for the time being.
    • Task
    In Toolforge there is a script https://gerrit.wikimedia.org/r/plugins/gitiles/labs/toollabs/+/refs/heads/master/misctools/sql which provides a `sql` tool, so that you can connect to the enwiki replica database via `sql enwiki`. In the PAWS terminal, connecting to the enwiki replica database requires `mariadb -h enwiki.analytics.db.svc.wikimedia.cloud enwiki_p`. We could bring the `sql` script to PAWS, so that the same easy way to connect to a replica database also works in the PAWS terminal.
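A minimal sketch of what a PAWS wrapper could look like (names assumed from the task; the real misctools script handles more cases, this only shows the core mapping):

```lang=shell
#!/bin/sh
# Hypothetical sketch of a PAWS `sql` wrapper: `sql enwiki` should map to
# `mariadb -h enwiki.analytics.db.svc.wikimedia.cloud enwiki_p`.
db="${1:-enwiki}"
host="${db}.analytics.db.svc.wikimedia.cloud"
# A real wrapper would `exec mariadb -h "$host" "${db}_p"`; echo the command
# here instead, so the mapping is visible without a database connection.
echo mariadb -h "$host" "${db}_p"
```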
    • Task
    **Steps to replicate the issue** (include links if applicable): * https://codesearch.wmcloud.org/search/?q=toollabs-jobutils&files=&excludeFiles=&repos= **What happens?**: Nothing is returned **What should have happened instead?**: https://gerrit.wikimedia.org/r/plugins/gitiles/labs/toollabs/+/refs/heads/master/setup.py should be returned **Software version** (skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.): It seems https://gerrit.wikimedia.org/r/plugins/gitiles/labs/toollabs/ is not indexed in codesearch.
    • Task
    **Steps to replicate the issue** (include links if applicable): Click on "Upload a new version of this file" for a file whose original is larger than 100 MB. **What happens?**: The upload is blocked because the 100 MB upload limit is lower than the size of the original file. **What should have happened instead?**: It should be possible to upload a new version of the file without hitting the limit. **Software version** (skip for WMF-hosted wikis like Wikipedia): Wikimedia Commons **Other information** (browser name/version, screenshots, etc.): It has been years with the same situation; I'm very tired. {F42367106} {F42367105} {F42367104} {F42367103}
    • Task
    **Steps to replicate the issue** (include links if applicable): * Enable parsoid read views everywhere (or be on Wikitech) * Navigate to a talk page in mobile view **What happens?**: No discussions are shown, unless you click "Learn more about this page" {F42365834} {F42365839} **What should have happened instead?**: Discussions are shown normally {F42365850} **Software version** (skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
    **Steps to replicate the issue** (include links if applicable): * Enable parsoid read views on all pages * Navigate to any page with an image in mobile view **What happens?**: {F42365796} **What should have happened instead?**: Only one image appears **Software version** (skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
    When global account blocks are enabled, `Special:CentralAuth/<username>` should display information about any global blocks on that specific account, as well as a link to block that user if the viewer has the permissions to do so.
    • Task
    Hello Cyberpower678, I need read access to the 404 link database, to see how we can best integrate this into fr.wikipedia.org. Thanks in advance
    • Task
    Currently we only notify users when they make 1, 10, 100, 1000, 10000, ... edits. Boooo. Boring. Death to the decimal-centric notification system. For example, my fitbit gave me a "London metro" badge after walking 402 kilometers (the length of the London Underground). It has more like that (42 km for the Marathon badge, 112 km for the Penguin badge, 563 km for the Hawaii badge, etc.). We could have some notifications on top of the current decimal ones (suggestions): - 42 edits: obviously. - 118 edits: number of chemical elements - 272 edits: number of London Underground stations - ... (suggestions welcome)
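As a sketch of how cheap such a check could be (thresholds taken from the suggestions above; all names hypothetical, not actual Echo code):

```lang=python
# Hypothetical milestone lookup; the themed thresholds come from the task's
# suggestions, and the decimal branch mirrors the current 1, 10, 100, ... scheme.
THEMED = {
    42: "the Answer to everything",
    118: "number of chemical elements",
    272: "number of London Underground stations",
}

def milestone_label(edit_count):
    """Return a label if this edit count deserves a notification, else None."""
    if edit_count in THEMED:
        return THEMED[edit_count]
    # Current decimal scheme: exact powers of ten.
    n = edit_count
    while n > 1 and n % 10 == 0:
        n //= 10
    if n == 1:
        return f"{edit_count} edits"
    return None

print(milestone_label(272))   # number of London Underground stations
print(milestone_label(1000))  # 1000 edits
print(milestone_label(7))     # None
```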
    • Task
    The “first time” you run `kubectl` in a certain user account (after “a while”, presumably some cache’s TTL), it’s quite slow to start up. Afterwards, it will be much faster in the same account, but still slow in another user account, even on the same host / bastion. ```lang=shell-session,name=from 8½ seconds to half a second (2×) lucaswerkmeister@tools-sgebastion-10:~$ time become lexeme-forms kubectl get pods NAME READY STATUS RESTARTS AGE lexeme-forms-f5794849b-jnvls 1/1 Running 0 10h real 0m8,613s user 0m0,324s sys 0m0,358s lucaswerkmeister@tools-sgebastion-10:~$ time become lexeme-forms kubectl get pods NAME READY STATUS RESTARTS AGE lexeme-forms-f5794849b-jnvls 1/1 Running 0 10h real 0m0,586s user 0m0,184s sys 0m0,073s lucaswerkmeister@tools-sgebastion-10:~$ time become lexeme-forms kubectl get pods NAME READY STATUS RESTARTS AGE lexeme-forms-f5794849b-jnvls 1/1 Running 0 10h real 0m0,412s user 0m0,178s sys 0m0,089s lucaswerkmeister@tools-sgebastion-10:~$ time become quickcategories kubectl get pods NAME READY STATUS RESTARTS AGE background-runner-688875655c-2k9jm 1/1 Running 6 (37h ago) 12d quickcategories-686d67f74d-qddtc 1/1 Running 0 11d real 0m8,353s user 0m0,369s sys 0m0,321s lucaswerkmeister@tools-sgebastion-10:~$ time become quickcategories kubectl get pods NAME READY STATUS RESTARTS AGE background-runner-688875655c-2k9jm 1/1 Running 6 (37h ago) 12d quickcategories-686d67f74d-qddtc 1/1 Running 0 11d real 0m0,546s user 0m0,186s sys 0m0,085s lucaswerkmeister@tools-sgebastion-10:~$ time become quickcategories kubectl get pods NAME READY STATUS RESTARTS AGE background-runner-688875655c-2k9jm 1/1 Running 6 (37h ago) 12d quickcategories-686d67f74d-qddtc 1/1 Running 0 11d real 0m0,424s user 0m0,173s sys 0m0,069s ``` I think to some extent this isn’t new, but it does feel like it’s been getting worse the past few weeks or so, and I thought it was worth mentioning here.
    • Task
    Example: https://www.wikidata.org/w/index.php?diff=2092515682. The bot calls an API module (`action=wbremovequalifiers` probably?) which prepends `/* ‎wbremovequalifiers-update:0|: */` to the summary. There is no message `wikibase-entity-summary-wbremovequalifiers-update`, therefore no human-readable comment is displayed. Two similar messages exist: - `wikibase-entity-summary-wbremovequalifiers-remove` ([[ https://www.wikidata.org/w/index.php?diff=2092524996 | example ]]) - `wikibase-entity-summary-wbsetqualifier-update` ([[ https://www.wikidata.org/w/index.php?diff=2092772098 | example ]])
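For illustration, a short sketch (hypothetical parsing, not Wikibase's actual code) of how the autocomment key in the summary maps to the missing message name:

```lang=python
import re

# Hypothetical sketch: the API prepends "/* <key>:... */" to the summary, and
# the UI renders it via a message named "wikibase-entity-summary-<key>".
summary = "/* wbremovequalifiers-update:0| */"
m = re.match(r"/\*\s*([\w-]+):", summary)
message_name = f"wikibase-entity-summary-{m.group(1)}"
print(message_name)  # wikibase-entity-summary-wbremovequalifiers-update
```

Since no message with that name exists, no human-readable comment is shown.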
    • Task
    VisualEditor shows an extra empty row in [[https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2024/Connect|mw:Wikimedia_Hackathon_2024/Connect]], as of [[https://www.mediawiki.org/w/index.php?title=Wikimedia_Hackathon_2024/Connect&oldid=6395643|r6395643]]: | view | VisualEditor | {F42362186} | {F42362190} This does not seem to be a Parsoid issue; the `?useparsoid=1` view is identical to the normal web view.
    • Task
    The Bangla character set has been updated in Unicode. Accordingly, the following amendments to WikiEditor are requested: 1. Replacement of character no. 66 from ৷ to । (U+0964) 2. Addition at the end (after zwj): ৼ (U+09FC), ৽ (U+09FD), ৾ (U+09FE). Thanks.
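For reference, the requested codepoints can be sanity-checked directly (the glyphs below are copied from the request above):

```lang=python
# The characters named in the request, with their codepoints:
# danda, vedic anusvara, abbreviation sign, sandhi mark.
chars = "।ৼ৽৾"
for ch in chars:
    print(f"U+{ord(ch):04X}")
```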
    • Task
    **Steps to replicate the issue** (include links if applicable): * Using any Chromium-based browser on Windows with UI set to English, navigate to https://yue.wiktionary.org/wiki/Module:languages?action=edit * Observe that the source code in the editor is displayed in sans serif * Observe that the cursor does not align with the text **What happens?**: Using CodeMirror with the 2010 editor causes the page source to display in a sans serif font, but the cursor is presumably aligned as if the page source is in a monospace font. **What should have happened instead?**: Cursor aligns with text; ideally the text should be in monospace instead. **Other information** (browser name/version, screenshots, etc.): Latest stable Chrome on Windows 10. Cannot reproduce on Firefox. Cannot reproduce on macOS. {F42354330} {F42354355} {F42354366}
    • Task
    I am a member of the Femiwiki Team, and we want the following extensions to be ported from our GitHub to Wikimedia Gerrit, to get mature continuous integration for the extensions.
    Done:
    - [x] DiscordRCFeed
    - [x] PageViewInfoGA
    WIP:
    - [ ] AchievementBadges
      - [ ] Code changes (@Lens0021)
      - [ ] Create Gerrit repository and configure the rights
      - [ ] Update Translatewiki.net
      - [ ] Update the extension page on MediaWikiWiki
      - [ ] Create a tag in Phabricator?
    - [ ] FacetedCategory
      - [ ] Code changes (@Lens0021)
      - [ ] Create Gerrit repository and configure the rights
      - [ ] Update Translatewiki.net
      - [ ] Update the extension page on MediaWikiWiki
      - [ ] Create a tag in Phabricator?
    - [ ] Sanctions
      - [ ] Code changes (@Lens0021)
      - [ ] Create Gerrit repository and configure the rights
      - [ ] Update Translatewiki.net
      - [ ] Update the extension page on MediaWikiWiki
      - [ ] Create a tag in Phabricator?
    • Task
    The enwiki article [[https://en.wikipedia.org/wiki/Sigma_male|Sigma male]] is marked as reviewed in Special:NewPagesFeed, despite not having any PageTriage log entries since its move from draftspace. It seems that either a log entry wasn't created when it should have been, or the page was marked as reviewed when it shouldn't have been. **Page review history (as far as I can see)** * A redirect at the title "Sigma male" was created in August 2021, and was automatically reviewed by DannyS712 bot III (and then again by 'zinbot, after being nominated at RfD in November). * In May 2022, the redirect was converted into an article. It was later turned back into a redirect, and then marked as reviewed, by MarioGom. * In July 2022, the redirect was again converted into an article. A revision was marked as a potential copyright violation by EranBot, and the (now-article) was marked as reviewed by Alexandermcnabb later in the month. * In September 2023, the now-article was blanked-and-redirected by GorillaWarfare. * In March 2024, the mainspace redirect was [[https://en.wikipedia.org/wiki/WP:ROUNDROBIN|pageswapped]] with a draftspace page of the same name by Queen of Hearts (who is not autopatrolled). Because of the page-swap, the [[https://en.wikipedia.org/wiki/Special:Log?page=Sigma+male|log entries for the page "Sigma male"]] prior to March 2024 relate to the previous page (now moved to [[https://en.wikipedia.org/wiki/Draft:Sigma male|Draft:Sigma male]]), rather than the current page.
    • Task
    Tool Name: **wikibugs-testing** Quota increase requested: **+2 services (3 total), +3 apps (6 total)** Reason: Trying to test a full wikibugs deploy + space to experiment with additional features. The wikibugs tool currently has this quota: ``` $ kubectl describe quota Name: tool-wikibugs Namespace: tool-wikibugs Resource Used Hard -------- ---- ---- configmaps 2 10 count/cronjobs.batch 0 50 count/deployments.apps 4 6 count/jobs.batch 0 15 limits.cpu 2 8 limits.memory 2Gi 8Gi persistentvolumeclaims 0 0 pods 4 16 requests.cpu 875m 4 requests.memory 1Gi 4Gi secrets 9 30 services 1 2 services.nodeports 0 0 ``` I would like to be able to run the same 4 continuous jobs (`count/deployments.apps`) plus experiment with additional feeds from GitLab. Additional service quota will be used similarly, +1 for a webservice & +1 more for experiments (local redis for queuing replacement)
    • Task
    Undo the manual edits to `$HOME/public_html/pull.php` from: ```lang=irc [18:37] < wm-bot> !log bd808@tools-sgebastion-11 tools.wikibugs Comment out git pull logic in pull.php while landing and deploying refactor. ```
    • Task
    The description of this task has to begin with a confession; take it as a user story, if you want. If I find code like the following ([[https://gerrit.wikimedia.org/g/mediawiki/core/+/b48017285d449fcb01ffc07d36391bf39b45ffe8/includes/rcfeed/IRCColourfulRCFeedFormatter.php#99 | source ]]): ```lang=php if ( !$attribs['rc_patrolled'] && ( $useRCPatrol || $attribs['rc_type'] == RC_NEW && $useNPPatrol ) // <-- relevant line ) { $flag .= '!'; } ``` I don't know what it does. I just do not remember what the operator precedence is; maybe I never even learned it. Is it `||` before `&&` or the other way around? Whenever I find such code, I generally do one of two things: 1) go check the manual to see what the order is (and promptly forget it afterwards), or 2) trust the author of the code to know operator precedence better than me, and hope that the code does what it looks like it was meant to do (which is probably the case if I'm looking at code written in 2004). Clearly, this is subpar: I have to spend a decent amount of time parsing the code in question. But then I think, maybe it's just me. After all, there are so many things that I have to re-learn often because I forget them immediately. But now that PHPCS 3.9.0 is out and it introduces a new `Generic.CodeAnalysis.RequireExplicitBooleanOperatorPrecedence` sniff, I think it's time for me to actually ask this and see if it's just me--or actually, ask: would it be a problem if we enabled this new rule to forbid mixing boolean operators without parentheses? To clarify, the code above would be flagged, and the accepted way of rewriting it (no autofix available) would be: ```lang=php if ( !$attribs['rc_patrolled'] && ( $useRCPatrol || ( $attribs['rc_type'] == RC_NEW && $useNPPatrol ) ) ) { $flag .= '!'; } ``` where the precedence is made very clear by the parentheses. I think the extra parentheses are well worth the clarity, as well as the reassurance that the code is doing exactly what we want it to do. 
Still, there are a few occurrences of this pattern (at least in MW core, see patch attached to this task) and we don't have a sniff for it, so I'm wondering if I'm missing some potential downsides.
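For what it's worth, the trap is not PHP-specific: Python's `and` binds tighter than `or` in exactly the same way `&&` binds tighter than `||` in PHP, which makes the ambiguity easy to demonstrate:

```lang=python
# `and` binds tighter than `or`, mirroring && vs || in PHP:
# a or b and c  is parsed as  a or (b and c),  not  (a or b) and c.
a, b, c = True, False, False

implicit = a or b and c
explicit_right = a or (b and c)   # matches the implicit parse
explicit_wrong = (a or b) and c   # what a reader might have guessed

print(implicit)        # True
print(explicit_right)  # True
print(explicit_wrong)  # False -- a different result from the same operands
```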
    • Task
    **Steps to replicate the issue** (include links if applicable): * Create a PDF (with "Download as PDF") of a Wikisource page that has notes (the footnote shortcut links). * Click one of the note shortcuts to jump to the content of the note below. **What happens?**: Instead of taking you to the note's content within the PDF file, it redirects you to the note on the web page. **What should have happened instead?**: It should take you to the associated note in the PDF file. **Software version** (skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.): Edge 122. See the video to understand: https://www.youtube.com/watch?v=z_qznXW2iVU
    • Task
    Seen during {T296183}. ``` reedy@mwmaint2002:~/uploads$ ls -al total 5855072 drwxrwxr-x 2 reedy wikidev 4096 Mar 2 21:36 . drwxr-xr-x 6 reedy wikidev 4096 Mar 2 18:24 .. -rwx------ 1 reedy wikidev 517299187 Sep 10 2021 Ilustrovana_vojna_enciklopedija_1.pdf -rwx------ 1 reedy wikidev 534614764 Sep 10 2021 Ilustrovana_vojna_enciklopedija_2.pdf -rwx------ 1 reedy wikidev 4943651881 Sep 21 2021 Narodna_enciklopedija_srpsko-hrvatsko-slovenacka_1.pdf reedy@mwmaint2002:~/uploads$ ``` And uploading fails... ``` reedy@mwmaint2002:~/uploads$ mwscript importImages.php --wiki=commonswiki --user="Nikola_Smolenski" ~/uploads --overwrite Importing Files Ilustrovana_vojna_enciklopedija_2.pdf exists, overwriting...failed. (The file /home/reedy/uploads/Ilustrovana_vojna_enciklopedija_2.pdf does not exist.) Ilustrovana_vojna_enciklopedija_1.pdf exists, overwriting...failed. (The file /home/reedy/uploads/Ilustrovana_vojna_enciklopedija_1.pdf does not exist.) Narodna_enciklopedija_srpsko-hrvatsko-slovenacka_1.pdf exists, overwriting...failed. (The file /home/reedy/uploads/Narodna_enciklopedija_srpsko-hrvatsko-slovenacka_1.pdf does not exist.) Found: 3 Failed: 3 ``` Why does it say the file on disk doesn't exist?
    • Task
    https://en.wikipedia.org/wiki/Wikipedia_talk:AutoWikiBrowser#Request_to_change_banner_shell_general_fixes
    • Task
    When loading mobile VE, the expand/collapse icon disappears and the margin animates away. At the moment the margin doesn't fully animate away, and there is a jump when the editor finishes loading. | {F42343375} | {F42343374}
    • Task
    The constructors of `OrExpressionGroup` and `AndExpressionGroup` are marked as `@internal` and neither class is marked as `@stable`, but they are used outside of the rdbms library code[1]. Replace the usages, or clarify whether the expression groups are okay to use. Some `OrExpressionGroup` usages could use `ISQLPlatform::factorConds`; avoiding `AndExpressionGroup` should be easy. ---- [1] Simple search: https://codesearch.wmcloud.org/search/?q=new%5Cs%2B%28Or%7CAnd%29ExpressionGroup&files=&excludeFiles=&repos=
    • Task
    **Steps to replicate the issue** (include links if applicable): * Open https://ru.wikipedia.org/w/index.php?diff=136459717&oldid=136291538&title=Атака_мертвецов * Look at edit summary **What happens?**: {F42339947} **What should have happened instead?**: This shouldn't happen.
    • Task
    It'd be useful if importImages.php output timestamps when an import for an individual file started/ended. When you're importing large files, it's nice to know how long it's been going for. Progress bars would be amazing, but probably overkill :P ```lang=bash reedy@mwmaint2002:~/uploads$ ls -al total 14443756 drwxrwxr-x 2 reedy wikidev 4096 Mar 2 18:24 . drwxr-xr-x 6 reedy wikidev 4096 Mar 2 18:24 .. -rw-rw-r-- 1 reedy wikidev 3709453185 Feb 8 20:25 'Cyrano de Bergerac (1950).webm' -rw-rw-r-- 1 reedy wikidev 3555024932 Feb 11 16:13 'His Girl Friday (1940) by Howard Hawks.webm' -rw-rw-r-- 1 reedy wikidev 4093040711 Jan 15 04:44 'Night of the Living Dead (1968).webm' -rw-rw-r-- 1 reedy wikidev 3432851873 Jan 16 02:29 'The Freshman.webm' (reverse-i-search)`up': cd ^Cloads/ reedy@mwmaint2002:~/uploads$ mwscript importImages.php --wiki=commonswiki --user="D. Benjamin Miller" ~/uploads --overwrite Importing Files The Freshman.webm exists, overwriting...done. His Girl Friday (1940) by Howard Hawks.webm exists, overwriting...done. Cyrano de Bergerac (1950).webm exists, overwriting...done. Night of the Living Dead (1968).webm exists, overwriting...done. Found: 4 Overwritten: 4 ```
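The requested output could be as simple as prefixing each progress line with a timestamp; a hypothetical helper to show the desired shape (importImages.php itself is PHP, this is only a sketch):

```lang=python
import time

def stamped(msg, now=None):
    """Prefix a progress message with a wall-clock timestamp.

    `now` is an optional epoch-seconds override, useful for testing;
    by default the current time is used.
    """
    t = time.localtime(now if now is not None else time.time())
    return f"[{time.strftime('%Y-%m-%d %H:%M:%S', t)}] {msg}"

print(stamped("The Freshman.webm exists, overwriting..."))
# e.g. [2024-03-02 18:30:12] The Freshman.webm exists, overwriting...
```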
    • Task
    ==== Error ==== * mwversion: 1.42.0-wmf.19 * reqId: 7298fffc-fbdf-452c-b68a-aecc7876d22e * [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2024-02-22T13:02:55.829Z',to:'2024-02-24T13:02:55.829Z'))&_a=(query:(query_string:(query:'reqId:%227298fffc-fbdf-452c-b68a-aecc7876d22e%22'))) | Find reqId in Logstash ]] ```name=normalized_message,lines=10 [error/wt2html] TableFixups: Failed to successfully reparse title="{{{drużyna3}}}-{{{drużyna2}}}" | as table cell attributes ``` ```name=,lines=10 exception.trace ``` ==== Impact ==== ==== Notes ====
    • Task
    This is probably related to T358814.

# On the operating system, set dark mode as the preference
# Install some script that adds <input>, <select>, or <textarea>
# Go to some page on which the tags are generated, on mobile
# Night mode applies to the aforementioned DOM elements even if on-wiki night mode is disabled

For instance: {F42337410}

This looks weird when all other software-generated tags are shown in light mode. This could be considered a script-internal issue, but the new look may make it unnecessarily difficult to style elements added by user scripts. For instance, assuming that night mode is enabled on the operating system: {F42337854} {F42337865}

The topmost input box in the first image has text and a CSS specification of `color: white;`, and is hence invisible. I learned about the change in night-mode handling when someone reported a "bug" in my user script, saying "texts are suddenly invisible"; I had a `color: white;` specification for a CSS class when the script is used with the Minerva skin. I can modify the code to make it work with the change, but please check whether the new look ("night mode under light mode") is intentional.
    • Task
    {F42338285 size=full} Not sure if this is just PhpStorm being weird... But I keep getting this error browsing #parsoid files in PhpStorm. In which case, may be #upstream. xref {db786d224aad3174f4121a64e228fd57074397a7} and {T349327}.
    • Task
    Seen on https://gerrit.wikimedia.org/r/c/mediawiki/services/parsoid/+/1007028 in https://integration.wikimedia.org/ci/job/composer-coverage-patch-docker/1365/console

Is there a lot of code that is just being incidentally tested, or are we missing various `@covers`? I filed {T358954} originally for `LinterTest`, due to the same patch, noticing that while we had a `LinterTest`, the coverage was shown as zero. The patch above suggests it takes the coverage from 0% to 24%... Which makes me wonder if missing annotations (or similar) are making the reported code coverage (10.70% of lines!) look less healthy than it should be, considering the amount of tests etc.

{F42334327 size=full}

```
23:35:40 +------------------------------------------------------+--------+--------+
23:35:40 | Filename                                             | Old %  | New %  |
23:35:40 +------------------------------------------------------+--------+--------+
23:35:40 | Config/Env.php | 00.00 | 43.00 |
23:35:40 | Config/PageConfig.php | 00.00 | 50.00 |
23:35:40 | Config/SiteConfig.php | 00.00 | 25.00 |
23:35:40 | Config/StubMetadataCollector.php | 00.00 | 23.00 |
23:35:40 | Core/DomSourceRange.php | 00.00 | 19.00 |
23:35:40 | Core/LinkTargetTrait.php | 00.00 | 19.00 |
23:35:40 | Core/PageBundle.php | 00.00 | 10.00 |
23:35:40 | Core/Sanitizer.php | 00.00 | 23.00 |
23:35:40 | Core/SectionMetadata.php | 00.00 | 10.00 |
23:35:40 | Core/TOCData.php | 00.00 | 04.00 |
23:35:40 | DOM/Document.php | 00.00 | 100.00 |
23:35:40 | Ext/DOMUtils.php | 00.00 | 09.00 |
23:35:40 | Ext/Gallery/Gallery.php | 00.00 | 09.00 |
23:35:40 | Ext/Indicator/Indicator.php | 00.00 | 35.00 |
23:35:40 | Ext/JSON/JSON.php | 00.00 | 04.00 |
23:35:40 | Ext/LST/LST.php | 00.00 | 56.00 |
23:35:40 | Ext/Nowiki/Nowiki.php | 00.00 | 16.00 |
23:35:40 | Ext/ParsoidExtensionAPI.php | 00.00 | 02.00 |
23:35:40 | Ext/Poem/Poem.php | 00.00 | 28.00 |
23:35:40 | Ext/Poem/PoemProcessor.php | 00.00 | 20.00 |
23:35:40 | Ext/Pre/Pre.php | 00.00 | 45.00 |
23:35:40 | Language/LanguageConverter.php | 00.00 | 03.00 |
23:35:40 | Logger/ParsoidLogger.php | 00.00 | 19.00 |
23:35:40 | Mocks/MockDataAccess.php | 00.00 | 03.00 |
23:35:40 | Mocks/MockMetrics.php | 00.00 | 11.00 |
23:35:40 | Mocks/MockPageConfig.php | 00.00 | 82.00 |
23:35:40 | Mocks/MockPageContent.php | 00.00 | 43.00 |
23:35:40 | Mocks/MockSiteConfig.php | 00.00 | 33.00 |
23:35:40 | NodeData/DataBag.php | 00.00 | 100.00 |
23:35:40 | NodeData/DataMw.php | 00.00 | 50.00 |
23:35:40 | NodeData/DataParsoid.php | 00.00 | 15.00 |
23:35:40 | Parsoid.php | 00.00 | 14.00 |
23:35:40 | Tokens/EOFTk.php | 00.00 | 33.00 |
23:35:40 | Tokens/EndTagTk.php | 00.00 | 46.00 |
23:35:40 | Tokens/KV.php | 00.00 | 45.00 |
23:35:40 | Tokens/KVSourceRange.php | 00.00 | 13.00 |
23:35:40 | Tokens/SourceRange.php | 00.00 | 45.00 |
23:35:40 | Tokens/TagTk.php | 00.00 | 46.00 |
23:35:40 | Tokens/Token.php | 00.00 | 07.00 |
23:35:40 | Utils/ContentUtils.php | 00.00 | 04.00 |
23:35:40 | Utils/DOMCompat.php | 00.00 | 45.00 |
23:35:40 | Utils/DOMCompat/TokenList.php | 00.00 | 40.00 |
23:35:40 | Utils/DOMDataUtils.php | 00.00 | 28.00 |
23:35:40 | Utils/DOMTraverser.php | 00.00 | 73.00 |
23:35:40 | Utils/DOMUtils.php | 00.00 | 24.00 |
23:35:40 | Utils/DTState.php | 00.00 | 100.00 |
23:35:40 | Utils/PHPUtils.php | 00.00 | 37.00 |
23:35:40 | Utils/Timing.php | 00.00 | 100.00 |
23:35:40 | Utils/Title.php | 00.00 | 49.00 |
23:35:40 | Utils/TokenUtils.php | 00.00 | 17.00 |
23:35:40 | Utils/Utils.php | 00.00 | 36.00 |
23:35:40 | Utils/WTUtils.php | 00.00 | 18.00 |
23:35:40 | Wikitext/ContentModelHandler.php | 00.00 | 19.00 |
23:35:40 | Wt2Html/DOMPPTraverser.php | 00.00 | 100.00 |
23:35:40 | Wt2Html/DOMPostProcessor.php | 00.00 | 87.10 |
23:35:40 | Wt2Html/Frame.php | 00.00 | 16.00 |
23:35:40 | Wt2Html/Grammar.php | 00.00 | 19.02 |
23:35:40 | Wt2Html/PP/Handlers/CleanUp.php | 00.00 | 31.00 |
23:35:40 | Wt2Html/PP/Handlers/DisplaySpace.php | 00.00 | 20.00 |
23:35:40 | Wt2Html/PP/Handlers/Headings.php | 00.00 | 14.00 |
23:35:40 | Wt2Html/PP/Handlers/TableFixups.php | 00.00 | 01.00 |
23:35:40 | Wt2Html/PP/Handlers/UnpackDOMFragments.php | 00.00 | 03.00 |
23:35:40 | Wt2Html/PP/Processors/AddLinkAttributes.php | 00.00 | 24.00 |
23:35:40 | Wt2Html/PP/Processors/AddMediaInfo.php | 00.00 | 02.00 |
23:35:40 | Wt2Html/PP/Processors/AddRedLinks.php | 00.00 | 10.00 |
23:35:40 | Wt2Html/PP/Processors/ComputeDSR.php | 00.00 | 41.00 |
23:35:40 | Wt2Html/PP/Processors/ConvertOffsets.php | 00.00 | 75.00 |
23:35:40 | Wt2Html/PP/Processors/DOMRangeBuilder.php | 00.00 | 05.00 |
23:35:40 | Wt2Html/PP/Processors/I18n.php | 00.00 | 38.00 |
23:35:40 | Wt2Html/PP/Processors/LangConverter.php | 00.00 | 90.00 |
23:35:40 | Wt2Html/PP/Processors/Linter.php | 00.00 | 24.00 |
23:35:40 | Wt2Html/PP/Processors/MarkFosteredContent.php | 00.00 | 10.00 |
23:35:40 | Wt2Html/PP/Processors/MigrateTemplateMarkerMetas.php | 00.00 | 16.00 |
23:35:40 | Wt2Html/PP/Processors/MigrateTrailingNLs.php | 00.00 | 35.00 |
23:35:40 | Wt2Html/PP/Processors/Normalize.php | 00.00 | 100.00 |
23:35:40 | Wt2Html/PP/Processors/PWrap.php | 00.00 | 20.00 |
23:35:40 | Wt2Html/PP/Processors/PWrapState.php | 00.00 | 33.00 |
23:35:40 | Wt2Html/PP/Processors/ProcessTreeBuilderFixups.php | 00.00 | 67.00 |
23:35:40 | Wt2Html/PP/Processors/Section.php | 00.00 | 78.00 |
23:35:40 | Wt2Html/PP/Processors/WrapSections.php | 00.00 | 89.00 |
23:35:40 | Wt2Html/PP/Processors/WrapSectionsState.php | 00.00 | 15.00 |
23:35:40 | Wt2Html/PP/Processors/WrapTemplates.php | 00.00 | 100.00 |
23:35:40 | Wt2Html/PageConfigFrame.php | 00.00 | 100.00 |
23:35:40 | Wt2Html/Params.php | 00.00 | 10.00 |
23:35:40 | Wt2Html/ParserPipelineFactory.php | 00.00 | 88.00 |
23:35:40 | Wt2Html/PegTokenizer.php | 00.00 | 37.00 |
23:35:40 | Wt2Html/PipelineStage.php | 00.00 | 76.00 |
23:35:40 | Wt2Html/TT/AttributeExpander.php | 00.00 | 28.00 |
23:35:40 | Wt2Html/TT/AttributeTransformManager.php | 00.00 | 28.00 |
23:35:40 | Wt2Html/TT/BehaviorSwitchHandler.php | 00.00 | 29.00 |
23:35:40 | Wt2Html/TT/DOMFragmentBuilder.php | 00.00 | 10.00 |
23:35:40 | Wt2Html/TT/ExtensionHandler.php | 00.00 | 03.00 |
23:35:40 | Wt2Html/TT/ExternalLinkHandler.php | 00.00 | 06.00 |
23:35:40 | Wt2Html/TT/IncludeOnly.php | 00.00 | 09.00 |
23:35:40 | Wt2Html/TT/LanguageVariantHandler.php | 00.00 | 02.00 |
23:35:40 | Wt2Html/TT/ListHandler.php | 00.00 | 16.00 |
23:35:40 | Wt2Html/TT/NoInclude.php | 00.00 | 09.00 |
23:35:40 | Wt2Html/TT/OnlyInclude.php | 00.00 | 17.00 |
23:35:40 | Wt2Html/TT/ParagraphWrapper.php | 00.00 | 40.00 |
23:35:40 | Wt2Html/TT/PreHandler.php | 00.00 | 36.00 |
23:35:40 | Wt2Html/TT/QuoteTransformer.php | 00.00 | 08.00 |
23:35:40 | Wt2Html/TT/SanitizerHandler.php | 00.00 | 61.00 |
23:35:40 | Wt2Html/TT/TemplateHandler.php | 00.00 | 04.00 |
23:35:40 | Wt2Html/TT/TokenCollector.php | 00.00 | 09.00 |
23:35:40 | Wt2Html/TT/TokenHandler.php | 00.00 | 91.00 |
23:35:40 | Wt2Html/TT/TokenHandlerResult.php | 00.00 | 86.00 |
23:35:40 | Wt2Html/TT/TokenStreamPatcher.php | 00.00 | 24.00 |
23:35:40 | Wt2Html/TT/WikiLinkHandler.php | 00.00 | 01.10 |
23:35:40 | Wt2Html/TokenTransformManager.php | 00.00 | 58.00 |
23:35:40 | Wt2Html/TokenizerUtils.php | 00.00 | 41.00 |
23:35:40 | Wt2Html/TreeBuilder/Attributes.php | 00.00 | 14.00 |
23:35:40 | Wt2Html/TreeBuilder/DOMBuilder.php | 00.00 | 100.00 |
23:35:40 | Wt2Html/TreeBuilder/RemexPipeline.php | 00.00 | 58.00 |
23:35:40 | Wt2Html/TreeBuilder/TreeBuilderStage.php | 00.00 | 41.00 |
23:35:40 | Wt2Html/TreeBuilder/TreeMutationRelay.php | 00.00 | 73.00 |
23:35:40 +------------------------------------------------------+--------+--------+
```
    • Task
    **Steps to replicate the issue** (include links if applicable):
* Replace an old ORES V1 request with a V3 "equivalent", such as https://ores.wikimedia.org/v3/scores/ptwiki/60845189/articlequality

**What happens?**: The predicted article class is now a boolean instead of one of the strings that appear as keys of the probability dictionary:

```lang=json
{
  "ptwiki": {
    "models": {
      "articlequality": {
        "version": "0.8.0"
      }
    },
    "scores": {
      "60845189": {
        "articlequality": {
          "score": {
            "prediction": true,
            "probability": {
              "1": 0.6505766596726758,
              "2": 0.09876741829105372,
              "3": 0.0634780511261501,
              "4": 0.06104126161283134,
              "5": 0.0573997480745124,
              "6": 0.06873686122277664
            }
          }
        }
      }
    }
  }
}
```

**What should have happened instead?**: The predicted article class should be identical to one of the keys in the probability dictionary (the string "1" in the example, which is the key that maximizes the probability value):

```lang=json
...
"prediction": "1",
"probability": {
    "1": 0.6505766596726758,
...
```

**Other information**: The problem does not happen when one of the other classes has the highest probability:
* `"prediction": "2"`: https://ores.wikimedia.org/v3/scores/ptwiki/66497121/articlequality
* `"prediction": "3"`: https://ores.wikimedia.org/v3/scores/ptwiki/66513703/articlequality
* `"prediction": "4"`: https://ores.wikimedia.org/v3/scores/ptwiki/66078571/articlequality
* `"prediction": "5"`: https://ores.wikimedia.org/v3/scores/ptwiki/65832901/articlequality
* `"prediction": "6"`: https://ores.wikimedia.org/v3/scores/ptwiki/67282651/articlequality
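For reference, the expected invariant can be stated in one line; this is a hypothetical helper illustrating the bug report, not ORES code:

```lang=python
# Hypothetical helper: "prediction" should equal the probability key
# with the highest value (a key string, never a boolean).
def expected_prediction(probability):
    return max(probability, key=probability.get)

probability = {
    "1": 0.6505766596726758,
    "2": 0.09876741829105372,
    "3": 0.0634780511261501,
    "4": 0.06104126161283134,
    "5": 0.0573997480745124,
    "6": 0.06873686122277664,
}
print(expected_prediction(probability))  # "1"
```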
    • Task
    I realized this as I was looking through a rather complex form submit. There are lots of places where we use wp* string constants in $request->getText() etc. We should have classes of constants for these query parameter names, so that we don't use the strings directly. One very good reason to do so is that it allows for easier tracking of which query param names we have to begin with; but much more importantly, it allows them to be documented in code, so that I don't have to wonder what a given value even does. https://codesearch-beta.wmcloud.org/things/?q=%5C%27wp&files=&excludeFiles=&repos=
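As a rough illustration of the proposed pattern (all names here are hypothetical examples, not an existing MediaWiki class, and sketched in Python rather than PHP), a constants class makes the parameter names discoverable and documentable:

```lang=python
# Hypothetical sketch: collect the 'wp*' query parameter names in one
# documented class instead of scattering raw strings through the code.
class SubmitParams:
    """Names of form-submit query parameters (examples only)."""

    # Free-text reason recorded in the log entry.
    REASON = "wpReason"
    # Requested expiry for the action, e.g. "infinite".
    EXPIRY = "wpExpiry"


def get_text(request, name, default=""):
    # Stand-in for $request->getText()
    return request.get(name, default)


request = {"wpReason": "vandalism"}
print(get_text(request, SubmitParams.REASON))  # "vandalism"
```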
    • Task
    == Steps to reproduce
1. Open https://hu.wikipedia.org/w/index.php?title=Wikip%C3%A9dia:Karbantart%C3%B3m%C5%B1hely&useparsoid=0.
2. Notice that there are a couple of interlanguage links in the language selector / sidebar.
3. Open https://hu.wikipedia.org/w/index.php?title=Wikip%C3%A9dia:Karbantart%C3%B3m%C5%B1hely&useparsoid=1.

== Actual result
4. Notice that there are no interlanguage links in the language selector / sidebar.

== Expected result
4. Notice that the same interlanguage links continue to appear.

== Other information
Wikidata-based interlanguage links continue to appear using Parsoid, which both makes the issue less severe and made it more challenging to find the root cause (I first experienced it in an article that has both Wikidata-based and local interlanguage links, so I only saw that the Wikidata-based one overrides the local one).
    • Task
    This is how a dialog from VisualEditor is displayed in Russian: https://ru.wikipedia.org/wiki/Фасмер,_Макс?veaction=edit {F42328887} In Russian, the text in two of the panel labels says ‘Additional se...’ and ‘Used te...’. At the same time, there is more than enough space to avoid this sort of text truncation. I think that in such cases text truncation should be entirely avoided, and the modern Codex style guide should clarify that truncation is a last resort, used only when there is really no space in a component. Although this is probably fine in English, in Russian it degrades the experience for people who wouldn't otherwise know that the truncated text hides ‘settings’ and ‘templates’.
    • Task
    **Steps to replicate the issue** (include links if applicable): On your cell phone, visit a random-language Wikipedia homepage. In our sample today, we visited the Vietnamese Wikipedia homepage. We tried it on several browsers; the browser type is not important, we got the same results, as you can see in the attached image. **What happens?**: As you can see, the page is a complete mess. **What should have happened instead?**: The page should look readable. **Software version** (skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.): We specifically tested while not logged in, to see what the majority of users will see when first visiting the site. We also picked a random browser whose settings we had not changed. It is a big shame that many language Wikipedias are one big mess when viewed on a cell phone. I am just guessing that the banner size has not been tested on more than one language version, or perhaps the banners have only been tested on desktops. But that ignores the fact that there are many people, a growing number, who use cell phones. Note: you might think the website looks fine, but please give it a minute or two for the page to fully load. {F42328879}
    • Task
    **Steps to replicate the issue** (include links if applicable):
* Navigate to plwiki
* Type `mw.loader.load('ext.gadget.wp_sk-T358946')` in the browser console

**What happens?**: An error is displayed in the console:
> ext.gadget.wp_sk-script-0.js:
> Parse error: Unexpected \ on line 910

**What should have happened instead?**: The gadget should have loaded as it did before the wmf.20 deployment

**Software version** (skip for WMF-hosted wikis like Wikipedia): wmf.20

**Other information** (browser name/version, screenshots, etc.): The linter complains about this line, which is syntactically correct: https://pl.wikipedia.org/wiki/MediaWiki:Gadget-sk.js#L-910 The file hasn't been changed significantly for a few weeks, and all the recent changes to it were attempts to work around this error. The error is present in the ResourceLoader output, implying that the linter fails to recognize something in the code: https://pl.wikipedia.org/w/load.php?lang=pl&modules=ext.gadget.wp_sk-T358946&skin=vector The error was first reported within an hour of MediaWiki 1.42-wmf.20 being deployed to plwiki. (Edited the task to use a gadget copy; the main one has been hotfixed)
    • Task
    ```
[37746786.414774] scsi 0:0:6:0: rejecting I/O to dead device
[37746786.414775] print_req_error: I/O error, dev sdg, sector 175978944
```
```
Mar 01 21:00:00 cloudcephosd1017 systemd[1]: Started Ceph object storage daemon osd.132.
Mar 01 21:00:00 cloudcephosd1017 ceph-osd[2959]: 2024-03-01T21:00:00.362+0000 7f2983904e00 0 set uid:gid to 499:499 (ceph:ceph)
Mar 01 21:00:00 cloudcephosd1017 ceph-osd[2959]: 2024-03-01T21:00:00.362+0000 7f2983904e00 0 ceph version 15.2.16 (d46a73d6d0a67a79558054a3a5a72cb561724974) octopus (stable), process ceph-osd, pid 2959
Mar 01 21:00:00 cloudcephosd1017 ceph-osd[2959]: 2024-03-01T21:00:00.362+0000 7f2983904e00 0 pidfile_write: ignore empty --pid-file
Mar 01 21:00:00 cloudcephosd1017 ceph-osd[2959]: 2024-03-01T21:00:00.370+0000 7f2983904e00 -1 bluestore(/var/lib/ceph/osd/ceph-132/block) _read_bdev_label failed to read from /var/lib/ceph/osd/ceph-132/block: (5) Input/output error
Mar 01 21:00:00 cloudcephosd1017 ceph-osd[2959]: 2024-03-01T21:00:00.370+0000 7f2983904e00 -1 ** ERROR: unable to open OSD superblock on /var/lib/ceph/osd/ceph-132: (2) No such file or directory
Mar 01 21:00:00 cloudcephosd1017 ceph-osd[2959]: 2024-03-01T21:00:00.370+0000 7f2983904e00 -1 bluestore(/var/lib/ceph/osd/ceph-132/block) _read_bdev_label failed to read from /var/lib/ceph/osd/ceph-132/block: (5) Input/output error
Mar 01 21:00:00 cloudcephosd1017 ceph-osd[2959]: 2024-03-01T21:00:00.370+0000 7f2983904e00 -1 ** ERROR: unable to open OSD superblock on /var/lib/ceph/osd/ceph-132: (2) No such file or directory
Mar 01 21:00:00 cloudcephosd1017 systemd[1]: ceph-osd@132.service: Main process exited, code=exited, status=1/FAILURE
```
    • Task
    **Steps to replicate the issue** (include links if applicable):
* open the Commons file File:Bikers in Helsinki 1940 (2516C; JOKAHBL3C A51-2).tif
* open it with Pillow and calculate a hash
* the hash is bogus
* convert the same file to JPEG
* open it with Pillow and calculate a hash
* the hash is valid

**What happens?**: Calculating a hash from a TIFF image does not work correctly and gives the same value for multiple different TIFF images. Converting the image to JPEG first gives a valid hash. There are also other issues when using Pillow to convert TIFF images into JPEG images: Pillow produces an empty image (for 'L' band images), while using other software for the conversion produces a valid image.

**What should have happened instead?**: The hash should be valid regardless of the format of the image given to the hashing function, and two different images should not have the same hash. Also, conversion from TIFF to other image formats should not give empty images.

**Software version** (skip for WMF-hosted wikis like Wikipedia):

**Other information** (browser name/version, screenshots, etc.): You get the same bogus hash with other TIFF images as well; for example, Athletics Championships at Ratina Stadium in Tampere in 1991 (JOKAJUK3D C-5).tif and Cleopatra -film's Finland's premier 1963 (JOKAUAS2 11016-5).tif result in the same bogus hash. Converting the images to JPEG before hashing gives a valid hash.
**Reproducing results**

```
import imagehash
from PIL import Image


def openimage(name):
    f = open(name, "rb")
    return Image.open(f)


def getimagehash(img, hashlen=8):
    phash = imagehash.phash(img, hash_size=hashlen)
    dhash = imagehash.dhash(img, hash_size=hashlen)
    return tuple((hashlen, str(phash), str(dhash)))


name = "Bikers_in_Helsinki_1940_(2516C;_JOKAHBL3C_A51-2).tif"
commons_image = openimage(name)
commonshash = getimagehash(commons_image)
print("phash ", commonshash[1], " dhash ", commonshash[2])

name = "Bikers_in_Helsinki_1940_(2516C;_JOKAHBL3C_A51-2).jpg"
commons_image = openimage(name)
commonshash = getimagehash(commons_image)
print("phash ", commonshash[1], " dhash ", commonshash[2])
```
    • Task
    My mwbot-rs activity usually comes in waves, and I'd like to not be a roadblock to other people making improvements. @XtexChooser, @mirror-kt, would you both be interested in getting access to make releases of the mwbot-rs crates? If so, please let me know your github username and I can add you.
    • Task
    GerritBot comments for 7-digit Gerrit changes conflict with Diffusion commit hashes. That's a mouthful, so just look at T356157#9581302 or T358932#9592922: {F42319843} {F42319847} The first link goes to a completely unrelated commit – the hexadecimal commit hash just happens to start with 7 decimal digits, and we now have 7-digit changes in Gerrit. Can we configure Phabricator so that it doesn't try to auto-link commit hashes that only contain digits? Or failing that, hashes shorter than 8 characters?
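One way to picture the requested rule (a hypothetical sketch, not Phabricator's actual linking code): skip candidates that are all decimal digits, since those are far more likely to be Gerrit change numbers than abbreviated hashes, and apply the length fallback otherwise:

```lang=python
import re

# Hypothetical heuristic from the task description, not Phabricator code:
# don't auto-link all-decimal candidates (likely Gerrit change numbers),
# and as a fallback don't auto-link hashes shorter than 8 characters.
HEX_RE = re.compile(r"^[0-9a-f]{7,40}$")

def should_autolink_hash(candidate):
    if not HEX_RE.match(candidate):
        return False
    if candidate.isdigit():        # e.g. "1007028" is a Gerrit change number
        return False
    return len(candidate) >= 8     # fallback rule from the task

print(should_autolink_hash("1007028"))     # False: 7 decimal digits
print(should_autolink_hash("db786d224a"))  # True: contains hex letters
```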
    • Task
    In practice this doesn't matter too much, as most jobs are retryable; just filing a bug for the record. In WMF's job queue, the logic that handles RunnableJob::allowsRetries() lives in MediaWiki\Extension\EventBus\JobExecutor. If the HTTP connection dies before this code is executed (for example, due to Apache config or the timeout config in change-propagation), then the change-propagation service will retry the job. This is pretty unexpected: if you really don't want your jobs to be retried, you have to set that in the change-propagation config as well as implementing allowsRetries()
    • Task
    **Steps to replicate the issue**:
* On `wikifunctions wmf.20`, in the search field, enter **Special:Run** - two suggestions are displayed: **Special:RunFunction** and **Special:RunJobs**
* Click on **Special:RunJobs** - a `Bad Request Request must be POSTed.` page will be displayed {F42311585}

```
https://www.wikifunctions.org/wiki/Special:RunJobs
Request Method: GET
Status Code: 400 Bad Request
```

**What should have happened instead?**:
- if a request for a page results in a bad request, the page should not be in the search suggestions list
- bad requests should (if possible) be handled in a more user-friendly way

**Other information**:
- the suggestion **Special:RunJobs** exists on all wikis (lang wikis and non-lang wikis). What makes the issue more prominent on `wikifunctions` is that there is a **Special:RunFunction** page, so users have more chances to see the **Special:RunJobs** page as a suggestion.
- checked: non-existing Special pages are handled gracefully
    • Task
    Display an accordion for active blocks and an accordion for account's block log in Codex. These two accordions should display when the selected user has current active blocks and past blocks. The component should look and feel as defined in [[ https://www.figma.com/file/3PthdmimQ8FiuhGIsLhmZi/Multiblock?type=design&node-id=853-19790&mode=design&t=M42UnSCtZKDoPhy7-0 | Figma file ]]
    • Task
    Last night we got paged for probe failures for the Kubernetes apiserver in codfw:

```
01:35:41 <jinxer-wm> (ProbeDown) firing: (2) Service kubemaster2001:6443 has failed probes (http_codfw_kube_apiserver_ip4) #page - https://wikitech.wikimedia.org/wiki/Runbook#kubemaster2001:6443 - https://grafana.wikimedia.org/d/O0nHhdhnz/network-probes-overview?var-job=probes/custom&var-module=All - https://alerts.wikimedia.org/?q=alertname%3DProbeDown
```

It turns out the apiserver on kubemaster2001 restarted at 00:53, and again at 01:33:

```
Mar 01 00:53:35 kubemaster2001 systemd[1]: kube-apiserver.service: Succeeded.
Mar 01 00:53:35 kubemaster2001 systemd[1]: Stopped Kubernetes API Server.
Mar 01 00:53:35 kubemaster2001 systemd[1]: kube-apiserver.service: Consumed 1w 4d 11h 55min 22.441s CPU time.
Mar 01 00:53:35 kubemaster2001 systemd[1]: Starting Kubernetes API Server...
```

The same happened on kubemaster2002 at 00:52, and again at 01:34. We lost probes on both hosts both times, but it only paged the second time: https://grafana.wikimedia.org/goto/_YP4RkASz?orgId=1 It shows up in the resource graphs: https://grafana.wikimedia.org/goto/by8I-qTSz?orgId=1

@Clement_Goubert identified this as a Puppet cert renewal, which makes sense: https://puppetboard.wikimedia.org/report/kubemaster2001.codfw.wmnet/5a536b70ffe6b5d7b7d0c62ddde739b101a4bc25 But Puppet ran on kubemaster2001 at 00:51, and on kubemaster2002 at 01:31, so it looks like we restarted both servers both times -- that might be an effect of coordinating a rolling restart to minimize leader elections, but doing the rolling restart //twice// might be a bug; I haven't dug into it yet.

So, questions:
* Is the 2x2 restart expected?
* Of course starting up the apiserver is expensive. Is it expected to lock up for so long that probes fail?
** If not, should we do something about that?
** If so, should we bump the alert thresholds so it doesn't page?
* Did the rolling restart succeed in avoiding any actual API unavailability (because the elected leader was always serving)?
** If so, nothing was actually wrong -- should we only alert on the API service, and not on individual machines?
    • Task
    Implement username lookup field in Codex. This field will allow admins to find a username or enter an IP address. The component should look and feel as defined in [[ https://www.figma.com/file/3PthdmimQ8FiuhGIsLhmZi/Multiblock?type=design&node-id=853-19790&mode=design&t=M42UnSCtZKDoPhy7-0 | Figma file ]]
    • Task
    The existing Special:Block Form UI needs to be refactored in Codex. The look and feel should be as specified in the [[ https://www.figma.com/file/3PthdmimQ8FiuhGIsLhmZi/Multiblock?type=design&node-id=853-19790&mode=design&t=M42UnSCtZKDoPhy7-0 | Figma file ]]
    • Task
    Notes: Send the already-used page IDs with the GET request to avoid receiving repeated tasks. Maybe: caching to optimize task loading - discuss
    • Task
    **Steps to replicate the issue** (include links if applicable): * Visit https://de.wikipedia.beta.wmflabs.org/wiki/Special:RecentChanges?uselang=en * Enable the "Newcomers", "Learners" or "Experienced users" filters **What happens?**: Temporary users are shown in the results: {F42309002} **What should have happened instead?**: Temporary users are not shown in the results – this screenshot is taken after selecting "Registered" too, which should not have an effect (as explained in the tooltip), but it does: {F42309058}
    • Task
    ====Why are we doing this?
We want to ensure that people who use assistive technologies and accessible type sizes can easily navigate through the Image recommendations task.

====Elements to audit
- Ensure that screen reader options on the Image recommendations flow are all clear
- Ensure that accessible type sizes for titles, messages, and interactive elements work in the Image recommendations flow
- Ensure that it is possible to see all actions in the flow with accessible type sizes enabled
    • Task
    JSON documents in the HTML dump (I tested on enwiki, namespace 0) now contain an "event" field inside the "version" field:

```
"version": {
    "identifier": 1139735274,
    "comment": "[[WP:AES|←]]Created page with '== الأكواد في العلوم الاجتماعية == في العلوم الاجتماعية، كتابة الأكواد عمومًا تكون عملية تحليلية حيث تكون المعلومة، إما : كمي أو نوعي. . يوجد غرض واحد من كتابة الأكواد، ألا وهو تحويل البيانات لشيء يتم فهمه من قبل الحاسب. تصنيف البيانات تعتبر خطوة مهمة، مث...'",
    "tags": [
        "visualeditor"
    ],
    "scores": {
        "damaging": {
            "probability": {
                "false": 0.9262439762028375,
                "true": 0.07375602379716255
            }
        },
        "goodfaith": {
            "prediction": true,
            "probability": {
                "false": 0.03032542228178925,
                "true": 0.9696745777182108
            }
        }
    },
    "editor": {
        "identifier": 45367280,
        "name": "STORM091"
    },
    "number_of_characters": 1362,
    "size": {
        "value": 2392,
        "unit_text": "B"
    },
    "event": {
        "identifier": "fa4bf5e9-476f-4fea-be58-4238e066377a",
        "type": "create",
        "date_created": "2023-02-16T17:07:10.046857Z",
        "date_published": "1970-01-01T00:00:00Z"
    }
},
```

Note that `date_published` is an invalid timestamp. Given that `date_published` is optional anyway, it should be omitted. I am guessing that in the underlying Go struct, it should be:

```
DatePublished *time.Time `json:"date_published,omitempty"`
```

But it is probably not a pointer.
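Until the field is omitted properly, consumers of the dump can guard against the epoch sentinel themselves. A hedged, consumer-side sketch (hypothetical helper, not part of the dump tooling):

```lang=python
from datetime import datetime, timezone

# Consumer-side workaround sketch: treat the Unix-epoch sentinel that the
# non-pointer Go field serializes as "no publication date".
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def date_published(version):
    raw = version.get("event", {}).get("date_published")
    if raw is None:
        return None
    dt = datetime.fromisoformat(raw.replace("Z", "+00:00"))
    return None if dt == EPOCH else dt

print(date_published({"event": {"date_published": "1970-01-01T00:00:00Z"}}))  # None
```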
    • Task
    Create view for when there are no image recommendations available according to [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=154-12308&mode=design&t=WcdHOxEKBRCRqqL0-0 | figma ]] reference {F42307971}
    • Task
    == Common information
* **dashboard**: https://grafana.wikimedia.org/d/000000377/host-overview?orgId=1&var-server=cloudcephosd1017
* **description**: Unit ceph-osd@132.service on node cloudcephosd1017 has been down for long.
* **runbook**: https://wikitech.wikimedia.org/wiki/Portal:Cloud_VPS/Admin/Runbooks/SystemdUnitDown
* **summary**: The systemd unit ceph-osd@132.service on node cloudcephosd1017 has been failing for more than two hours.
* **alertname**: SystemdUnitDown
* **cluster**: wmcs
* **instance**: cloudcephosd1017:9100
* **job**: node
* **name**: ceph-osd@132.service
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **state**: failed
* **team**: wmcs

== Firing alerts
---
* **dashboard**: https://grafana.wikimedia.org/d/000000377/host-overview?orgId=1&var-server=cloudcephosd1017
* **description**: Unit ceph-osd@132.service on node cloudcephosd1017 has been down for long.
* **runbook**: https://wikitech.wikimedia.org/wiki/Portal:Cloud_VPS/Admin/Runbooks/SystemdUnitDown
* **summary**: The systemd unit ceph-osd@132.service on node cloudcephosd1017 has been failing for more than two hours.
* **alertname**: SystemdUnitDown
* **cluster**: wmcs
* **instance**: cloudcephosd1017:9100
* **job**: node
* **name**: ceph-osd@132.service
* **prometheus**: ops
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **state**: failed
* **team**: wmcs
* [Source](https://prometheus-eqiad.wikimedia.org/ops/graph?g0.expr=node_systemd_unit_state%7Bcluster%3D%22wmcs%22%2Cstate%3D%22failed%22%7D+%3D%3D+1&g0.tab=1)
    • Task
    ####Background
If someone rejects an image suggestion using the "No" option, it's important to understand why so we can improve our model. One of our guardrail metrics is that less than 5% of users report NSFW or offensive content being suggested through the tool.

####Requirements
- People may select multiple options.
- Tapping Cancel takes you back to the image recommendation.
- Trying to type a reason in "Other" automatically selects this option.
- Deselecting "Other" greys out any leftover text in the input field.
- Tapping Submit takes you to the next article/image suggestion.
- Data is stored / events are logged in accordance with Shay's requirements.

####Design:
Create the survey according to the [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=154-13103&mode=design&t=WcdHOxEKBRCRqqL0-0 | figma ]] reference

| {F42307048} | {F42306956} | {F42306971}|
    • Task
    Create the Save Changes (Edit Summary) screen according to the [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=154-12628&mode=design&t=WcdHOxEKBRCRqqL0-0 | figma ]] reference {F42306696}

####Requirements:
- 'Publish' is disabled until an edit summary is added. (This pattern should apply everywhere.)
- Add tappable pre-populated edit summaries "Added image" and "Added image and caption" which autofill into the field
    • Task
    == Requestor provided information and prerequisites == **Complete ALL items below as the individual person who is requesting access:** * Wikimedia developer account username: GeorgeMikesell * Email address: gmikesell-ctr@wikimedia.org * SSH public key (must be a separate key from Wikimedia cloud SSH access): ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOnm+TGg4lLdOeDMaeB+X8Gtkjn84OJvzX4FRkjTUvnm gmikesell-ctr@wikimedia.org * Requested group membership: analytics-private-data (along with Kerberos identity) * Reason for access: user needs to do QA work for instrumentation and schemas * Name of approving party (manager for WMF/WMDE staff): @SBisson * Ensure you have signed the L3 Wikimedia Server Access Responsibilities document: Yes * Please coordinate obtaining a comment of approval on this task from the approving party. == SRE Clinic Duty Confirmation Checklist for Access Requests == This checklist should be used on all access requests to ensure that all steps are covered, including expansion to existing access. Please double check the step has been completed before checking it off. **This section is to be confirmed and completed by a member of the #SRE team.** [] - User has signed the L3 Acknowledgement of Wikimedia Server Access Responsibilities Document. [] - User has a valid NDA on file with WMF legal. (All WMF Staff/Contractor hiring are covered by NDA. Other users can be validated via the NDA tracking sheet) [] - User has provided the following: developer account username, email address, and full reasoning for access (including what commands and/or tasks they expect to perform) [] - User has provided a public SSH key. This ssh key pair should only be used for WMF cluster access, and not shared with any other service (this includes not sharing with WMCS access, no shared keys.) [] - The provided SSH key has been confirmed out of band and is verified not being used in WMCS. 
[] - access request (or expansion) has sign off of WMF sponsor/manager (sponsor for volunteers, manager for wmf staff) [] - access request (or expansion) has sign off of group approver indicated by the approval field in data.yaml For additional details regarding access request requirements, please see https://wikitech.wikimedia.org/wiki/Requesting_shell_access
    • Task
**Steps to replicate the issue** (include links if applicable):
* See T352875 for precursor discussions

**What happens?**:
Something possibly related to this series of changes broke the symmetry of the padding in two en.WP templates. See https://en.wikipedia.org/w/index.php?title=Template_talk:Quote_box&oldid=1209709062#bottom-of-box_sizing and https://en.wikipedia.org/w/index.php?title=Template_talk:Talk_quote_block&oldid=1209711889#Ugly_bottom_padding

Also see https://en.wikipedia.org/w/index.php?title=Template_talk:Markup&oldid=1211286676#Row_height

**What should have happened instead?**:
Spacing at the bottom of these templates should not have been affected.

**Software version** (skip for WMF-hosted wikis like Wikipedia):

**Other information** (browser name/version, screenshots, etc.):
    • Task
Note: Reuse existing code

Create the article preview screen according to the [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=154-12709&mode=design&t=WcdHOxEKBRCRqqL0-0 | Figma ]] reference. {F42306472}

####Requirements
- Preview of image in article
- User can navigate back to the Add Image details screen
- Next takes them to the Edit Summary screen
    • Task
    Reuse Alt text work from T347121 / prototype
    • Task
Note: If we can reuse from insert image flow (maybe S)

Create the add images screen according to the [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=154-12538&mode=design&t=WcdHOxEKBRCRqqL0-0 | Figma ]] reference {F42305819} {F42305811}

####Requirements
- When no information is added, people may tap 'Next' but it is not progressive.
- Once a caption and/or alt text is added, the 'Next' button becomes progressive.
- If the image already has alt text, it is populated

Question:
- Should we support the ability to copy/paste text into the caption and alt text fields? (From image details or image file name.)
    • Task
- Continue testing [[ https://phabricator.wikimedia.org/T350501 | Autorescue ]]
- Continue enabling [[ https://phabricator.wikimedia.org/T324517 | ACH ]] on Adyen
- Continue segmentation work: [[ https://phabricator.wikimedia.org/T353264#9589064 | adding back middle donor segment ]]
- Amazon reintegration: [[ https://phabricator.wikimedia.org/T358624 | scope whether this can be achieved with Adyen ]]
- Japan form in advance of campaign: [[ https://phabricator.wikimedia.org/T228902 | 1 ]] [[ https://phabricator.wikimedia.org/T358525 | 2 ]]
- [[ https://phabricator.wikimedia.org/T357334 | Spike - Payments orchestration ]]
- Email Prefs: review [[ https://phabricator.wikimedia.org/T358878 | new criteria ]]

**Not priorities for this Sprint, but to be triaged:**
- [[ https://phabricator.wikimedia.org/T358914 | Annual recurring ]]
- [[ https://phabricator.wikimedia.org/T351325 | UTM params ]]
    • Task
**Feature summary** (what you would like to be able to do and where):
Add a "Live" (live edits count) column to Pages Created: https://xtools.wmcloud.org/pages/en.wikipedia.org/Pfomma

Before: {F42305969}
After: {F42306001}

**Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution):

**Benefits** (why should this be implemented?):
Shows how many non-deleted articles a user has created without making readers do the mental math of subtracting deleted pages from the total pages created.
    • Task
* We'd like to offer our supporters the ability to donate yearly [or pledge to give to us once a year] via our existing donation flows.
* We need to consider how to record and treat these supporters in Civi and through stewardship comms, and work out exactly what the UX flow would be (including "pledge" wording and functionality).
* Think through technical requirements and any potential edge-case issues.

Note: this is an update to an old task: {T251418}
    • Task
Create the webview of the suggested image according to the [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=152-827&mode=design&t=WcdHOxEKBRCRqqL0-0 | Figma ]] reference {F42305769} {F42305554}

####Requirements
- Webview of the Commons entry for the suggested image displays after the user taps the external image link
- User can navigate back to the suggested edit flow
    • Task
**Steps to replicate the issue** (include links if applicable):
I'm not sure this can be replicated easily, because it showed up only once and disappeared after a page refresh. Here's what I did:
* Opened https://en.wikipedia.org/wiki/Project:Village_pump_(technical) with the Vector 2022 skin
* Clicked the user menu dropdown button

**What happens?**:
{F42305341}
I couldn't figure out which element the pulsating dot was pointing to. After looking into the HTML, I saw this: {F42305361}
Then I realized the dot is clickable, but after a click I got this: {F42305403}
After I refreshed the page, I saw this and no pulsating dot: {F42305427}

**Other information** (browser name/version, screenshots, etc.): Latest Chrome.
    • Task
When LibUp upgrades `mediawiki/mediawiki-codesniffer` to version `>=43.0.0`, the plugin `dealerdirect/phpcodesniffer-composer-installer` must be allowed. This can be done by answering yes to the Composer dialog:

> Do you trust "dealerdirect/phpcodesniffer-composer-installer" to execute code and wish to enable it now? (writes "allow-plugins" to composer.json) [y,n,d,?] y

Or by adding to `composer.json` (when missing):

```
"config": {
    "allow-plugins": {
        "dealerdirect/phpcodesniffer-composer-installer": true
    }
}
```

Note that the key `config.allow-plugins` may already exist for other plugins, and must not be clobbered in that case.
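A minimal sketch of the merge logic an automated updater such as LibUp would need (the function name `allow_plugin` is illustrative, not real LibUp code): the `allow-plugins` entry is added without disturbing plugins that are already listed.

```python
import json


def allow_plugin(composer: dict, plugin: str) -> dict:
    """Merge an allow-plugins entry into parsed composer.json data,
    preserving any plugins that are already allowed."""
    config = composer.setdefault("config", {})
    allow = config.setdefault("allow-plugins", {})
    allow.setdefault(plugin, True)
    return composer


# Example: a composer.json that already allows another plugin.
data = {"config": {"allow-plugins": {"composer/installers": True}}}
allow_plugin(data, "dealerdirect/phpcodesniffer-composer-installer")
print(json.dumps(data, indent=4))
```

For one-off manual fixes, `composer config allow-plugins.dealerdirect/phpcodesniffer-composer-installer true` writes the same key.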
    • Task
Create the full-screen image view according to the [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=152-1582&mode=design&t=WcdHOxEKBRCRqqL0-0 | Figma ]] reference {F42305228}

####Requirements
- Allow users to zoom/pinch on the image
- Users should be able to access image metadata
    • Task
    When a pod in an admin-managed namespace is crashing or not starting, or the replicaset/deployment managing it is not up-to-date, there should be an alert.
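A sketch of the alert conditions described above, as pure logic over status data distilled from the Kubernetes API (the dict field names here are illustrative, not a real client binding):

```python
def unhealthy_pods(pods):
    """Return names of pods that are stuck pending or crash-looping.

    Each pod is a dict with 'name', 'phase', and a list of
    'container_statuses' carrying an optional 'waiting_reason'.
    """
    bad = []
    for pod in pods:
        if pod.get("phase") == "Pending":
            bad.append(pod["name"])
        elif any(cs.get("waiting_reason") == "CrashLoopBackOff"
                 for cs in pod.get("container_statuses", [])):
            bad.append(pod["name"])
    return bad


def stale_deployments(deployments):
    """A deployment is out of date when its controller has not yet
    observed the latest spec generation, or ready replicas lag the
    desired count."""
    return [d["name"] for d in deployments
            if d["observed_generation"] < d["generation"]
            or d.get("ready_replicas", 0) < d["replicas"]]
```

In practice these conditions would likely be expressed as Prometheus alert rules over kube-state-metrics rather than imperative code; the functions just pin down what "crashing", "not starting", and "not up-to-date" mean.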
    • Task
There should be some sort of alert when what is running in a live cluster does not match what's in the `toolforge-deploy.git` repository.
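The core of such a check is a diff between two component-to-version maps. A sketch (component names and version strings below are purely illustrative; a real implementation would parse the values from the toolforge-deploy repo and query the cluster for what is actually deployed):

```python
def find_drift(repo_versions, live_versions):
    """Compare what toolforge-deploy.git says should be running
    against what the cluster reports.

    Returns {component: (expected, actual)} for every mismatch;
    a component missing from the cluster maps to (expected, None).
    """
    drift = {}
    for component, expected in repo_versions.items():
        actual = live_versions.get(component)
        if actual != expected:
            drift[component] = (expected, actual)
    return drift
```

A non-empty return value is the condition that should fire the alert (e.g. exported as a gauge that Prometheus can alert on).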
    • Task
== Common information

* **dashboard**: https://grafana.wikimedia.org/d/P1tFnn3Mk/wmcs-ceph-eqiad-health?orgId=1&search=open&tag=ceph&tag=health&tag=WMCS
* **description**: Ceph cluster in eqiad has slow ops, which might be blocking some writes
* **runbook**: https://wikitech.wikimedia.org/wiki/Portal:Cloud_VPS/Admin/Runbooks/CephSlowOps
* **summary**: Ceph cluster in eqiad has 354 slow ops
* **alertname**: CephSlowOps
* **cluster**: wmcs
* **job**: ceph_eqiad
* **prometheus**: cloud
* **service**: ceph,cloudvps
* **severity**: critical
* **site**: eqiad
* **source**: prometheus
* **team**: wmcs

== Firing alerts

---
(labels identical to the common information above)
* [Source](https://prometheus-eqiad.wikimedia.org/cloud/graph?g0.expr=ceph_healthcheck_slow_ops%7Bjob%3D%22ceph_eqiad%22%7D+%3E+0&g0.tab=1)
    • Task
## Context/background

Currently, TextInput, Select, and Combobox have a min-width of 256px. This is applied in code and in Figma.

### Problem

In certain layouts, such as some new designs, the form itself has a max-width of 512px, determined by the Constructing forms guidelines in the Style Guide and meant to prevent unnecessarily long inputs in forms. In such cases, when there is a fieldset or multiple columns, two fields next to one another using a TextInput and Select, for example, need to be shorter than 256px. Example below.

{F42304409}

### Proposed solution

To ensure flexibility and the ability to build such interfaces, I propose we remove the restriction from the component itself, and instead lead with guidance on why 256px is the preferred min-width, except when labels and input values will remain short even when translated to other languages. This guidance would be provided on the Field component page.
    • Task
**Steps to replicate the issue** (include links if applicable): I don't know.

**What happens?**:
The transcodes of File:Parallax_mapping.ogv on Commons are badly broken: there are lots of dark gray boxes.

**What should have happened instead?**:
Dark gray boxes should not be visible.

**Software version** (skip for WMF-hosted wikis like Wikipedia):

**Other information** (browser name/version, screenshots, etc.):
I tried this in both Firefox and Chrome; the problem happens in both, so it is a problem with the transcode generator.
    • Task
== Background

The mobile search overlay, and in particular the image placeholder, is not friendly to night mode. Given our work making Codex night mode friendly, rather than revising the existing code to make it work in night mode, I think it is a good idea to align our designs for desktop and mobile a little more. Rather than re-implementing the whole thing on mobile, I think it would be a good idea to revise the existing PageList markup to use Codex.

== User story

As a user on mobile, I want a search experience that is consistent with the desktop site and works in night mode.

== Requirements

[] When an article has no image, the placeholder should be visible in both light and night themes.
[] The article search icon should be visible (currently not shown at all)
[] The designer should be happy with the alignment of icons

== Design

# Current experience
{F42301932}
# Proposed experience
{F42302206}

== Acceptance criteria

- [] resources/mobile.pagesummary.styles/noimage.svg should be removed in favor of a Codex icon
- [] The search in pages icon should be restored.

== Communication criteria - does this need an announcement or discussion?

- Add communication criteria
    • Task
Create the overflow menu according to [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=152-232&mode=design&t=XoKbKR6tChgsBaCT-0 | Figma ]]

{F42301488}

#####Requirements:
- Learn more links to the [[ https://www.mediawiki.org/wiki/Wikimedia_Apps/iOS_Suggested_edits_project#Add_an_image | Suggested Edits FAQ page ]] (WIP by Haley)
- Tutorial launches the tooltips flow (T358896)
- Problem with feature launches an autofilled email ([[ https://docs.google.com/spreadsheets/d/1y5gGEjYvZxtOxAOMvCnBmGnyaNx3rjU7rPnmxtApEQM/edit#gid=0 | translations ]])

#####Support email copy:

**Subject:** Issue Report - Add an Image Feature

**Body:**
I've encountered a problem with the Add an Image Suggested Edits Feature:
- [Describe specific problem]

The behavior I would like to see is:
- [Describe proposed solution]

[Screenshots or Links]
    • Task
There are a few stray `wfDebug()` calls left here, which make these messages harder to find in production, as well as harder to isolate during local development among the flood of other debug messages. There is already a PSR log channel set up for PoolCounter, aptly named `poolcounter`, but it is not yet dependency-injected and not yet adopted by most of the code, except for one call in `PoolCounterWork`.
    • Task
Rebasing a patch set where someone has set Code-Review+2 (but which was not merged due to a dependency) does not trigger a fresh test run, while 'patch-performance' and 'coverage' do run. Is this the expected behaviour?

I would find it useful to also get a test run in this case, as the patch set may stay in that state for longer, and a patch set without Verified+1 looks like something is wrong/broken. Feel free to decline if this is the intended behaviour.

-----

Example: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1007440

After uploading a change to the parent patch set, I rebased this patch set, but no test run was started.

I have also seen this with LibUp patches where a +2 already exists: rebasing after other changes get merged keeps the history from accumulating too many entries, and the previous patch set would also get a fresh run without needing a recheck comment.
    • Task
    At the moment, when you post a new topic on a Talk page (or when replying to an existing topic), we don't show a preview of the resulting text, and instead just post it immediately. This can be a bit jarring and unexpected, especially since we're now expanding the kinds of wikitext markup (templates etc) that can be included in a talk message. This task is to introduce a Preview screen, identical to the screen we use in the general Editing activity, so that the user gets a chance to see their message as it will be formatted upon posting. Note that this will be a WebView-based preview (same as in the general editor), so the appearance of the previewed message may be slightly different from how it appears in the native Talk screen.
    • Task
I request being added as a co-maintainer of Muninnbot. The tools admin link is https://toolsadmin.wikimedia.org/tools/id/muninnbot

Following the [[https://wikitech.wikimedia.org/wiki/Help:Toolforge/Abandoned_tool_policy#Adoption |Adoption policy]]:

* The tool has been non-functional for 14 days, per https://guc.toolforge.org/?by=date&user=Muninnbot
* The current maintainer(s) have been inactive for 28 days, per https://guc.toolforge.org/?by=date&user=Tigraan

The current maintainer(s) have been notified on all of their:
* wikitech user talk pages, per https://wikitech.wikimedia.org/wiki/User_talk:Tigraan#Muninnbot
* homewiki user talk pages, per https://en.wikipedia.org/wiki/User_talk:Tigraan#Muninnbot
* I confirm that I have notified them all via email (if available): yes
* All of this was done 14 (or more) days ago and there have been no objections

Could the TFSC please:
[] check the tool's home directory for obvious secret information, following the Adoption policy instructions
    • Task
Create the onboarding flow according to the [[ https://www.figma.com/file/cyI6KSOoNlhsqUQ3l3156U/iOS-Suggested-edits%3A-Add-an-image?type=design&node-id=194-8768&mode=design&t=XoKbKR6tChgsBaCT-0 | Figma ]] reference {F42301156}

#####Requirements
- Sheet appears once, after someone opens the feature for the first time
- Tooltips show up the first time someone opens the feature
- Tutorial option in the overflow menu re-triggers the tooltips