Open Tasks
    • Task
    Follow up to T331596. Namespace translations provided by User:Katelem at https://translatewiki.net/w/i.php?title=User_talk:Katelem&oldid=12631440#Namespaces_for_Obolo
    • Task
    Brought up as part of a discussion on a patch [here](https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1072713/comment/a01de207_98b6f052/). Following the pattern of MediaWiki's [web2017-polyfills](https://gerrit.wikimedia.org/g/mediawiki/core/+/f632039f0364a134f9b9bcd5c181a75b3dd64a56/resources/Resources.php#114) module, perhaps we can have a polyfill for [Object.fromEntries](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/fromEntries), possibly as part of a new module called 'web2019-polyfills' (?), containing the implementation from [core-js](https://github.com/zloirock/core-js/blob/f0688cde0f4de245bdf7e5913e6f00c63ddd0c7d/packages/core-js/modules/es.object.from-entries.js#L6) or [this](https://github.com/tc39/proposal-object-from-entries/blob/main/polyfill.js) Object.fromEntries polyfill. Maybe it could also include an implementation of the 'flat' function; have a look at T357197#9531203.
    • Task
    #wikiproject-tools seems to have been an old 2015 project that is not being monitored by anyone and no longer tracks anything of note. Noticed this when the (otherwise valid and appropriately tagged) T374761 was tagged with it.
    • Task
    **Steps to replicate the issue**:
* Uploading files using a Python script. These are the parameters I use:
```
params = {
    'action': 'upload',
    'filename': file,
    'text': page,
    'token': tok,
    'format': 'json',
    'formatversion': 'latest',
    'comment': page,
    'ignorewarnings': 1
}
files = {
    'file': (file, open(file, 'rb'), 'multipart/form-data')
}
req.post(url, files=files, data=params)
```

**What happens?**:
With no clear pattern the following happens:
* Instead of a success response I get an error response like the following one:
```
{'error': {'code': 'backend-fail-internal', 'info': 'An unknown error occurred in storage backend "local-swift-codfw".', 'stasherrors': [{'message': 'uploadstash-exception', 'params': ['UploadStashBadPathException', "Path doesn't exist."], 'code': 'uploadstash-exception', 'type': 'error'}], 'docref': 'See https://commons.wikimedia.org/w/api.php for API usage. Subscribe to the mediawiki-api-announce mailing list at <https://lists.wikimedia.org/postorius/lists/mediawiki-api-announce.lists.wikimedia.org/> for notice of API deprecations and breaking changes.'}, 'servedby': 'mw-api-ext.eqiad.main-79cbd6dc7f-rql5d'}
```
* My script retries the upload and repeatedly gets the message that the file is a duplicate of an existing file.
* I checked the uploaded files, and indeed the file was uploaded correctly.

This makes proper error handling very difficult and causes problems with after-upload edits like adding structured data.

**What should have happened instead?**:
If a file was uploaded correctly the API should always respond with:
```
{'upload': {'result': 'Success',...
```
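Not part of the original report: as a stopgap while the storage backend issue is investigated, the uploading script could verify whether the file actually landed before retrying. A minimal sketch, assuming a `requests` session and the standard `action=query&prop=imageinfo` module; the helper names and retry policy are illustrative only:
```lang=python
import requests

def file_exists(session, api_url, filename):
    """Hypothetical helper: check whether File:<filename> already exists on the wiki."""
    data = session.get(api_url, params={
        'action': 'query',
        'prop': 'imageinfo',
        'titles': f'File:{filename}',
        'format': 'json',
        'formatversion': '2',
    }).json()
    page = data['query']['pages'][0]
    return 'missing' not in page

def upload_once(session, api_url, filename, page_text, token):
    """Single upload attempt that treats backend-fail-internal as 'verify, then decide'."""
    with open(filename, 'rb') as fh:
        resp = session.post(api_url, data={
            'action': 'upload',
            'filename': filename,
            'text': page_text,
            'comment': page_text,
            'token': token,
            'format': 'json',
            'formatversion': 'latest',
            'ignorewarnings': 1,
        }, files={'file': (filename, fh, 'multipart/form-data')}).json()
    if resp.get('upload', {}).get('result') == 'Success':
        return resp
    if resp.get('error', {}).get('code') == 'backend-fail-internal' and file_exists(session, api_url, filename):
        # The upload succeeded despite the error response; skip the retry so
        # follow-up edits (e.g. structured data) are not applied to a "duplicate".
        return resp
    raise RuntimeError(f'upload failed: {resp}')
```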
    • Task
    >>! In T374394#10142758, @Ladsgroup wrote:
> A better solution would be to make PageAssessments extension expose wikiprojects as wg config variables. I'm sure there will be a lot of usecases for it.

**Feature summary**: Surfacing the WikiProject through wg config variables on PageAssessments.

**Use case(s)**: CentralNotice is a viable use case, though there are probably many others. It is unfortunately not possible to surface WikiProjects on enwiki or wikis with a similar setup, as the relevant wikiproject categories are on the talk page. And the mainspace categories are not organized enough to do something meaningful on this front without a list of hundreds or thousands of subcategories.

**Benefits**: I think CentralNotice use could be of high utility in reaching out to particular interest areas and demographic groups, especially underrepresented ones. And I believe it has great potential for both virtual and in-person events, and potentially also for fundraising applications.
    • Task
    On [[Special:Whatlinkshere]] the text "Displayed 50 items" is shown. It is defined in the MediaWiki message "whatlinkshere-count": "Displayed $1 {{PLURAL:$1|item|items}}." Changing this to "Displaying 50 items" would be better (T44357#5887338). (Another possibility is "Showing 50 items", like English Wikipedia has done locally.)
    • Task
    **Steps to replicate the issue** (include links if applicable):
* Open the article [[https://en.wikipedia.org/wiki/Aokora_Mosque|Aokora Mosque]] created by Tanbiruzzaman, a non-autopatrolled user.

**What happens?**:
* It was autopatrolled, and the history and logs don’t show any clue that justifies its autopatrolled status.

**What should have happened instead?**:
* It shouldn't be autopatrolled, as articles created by non-autopatrolled users are not automatically patrolled.

**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):

**Other information** (browser name/version, screenshots, etc.):
    • Task
    **Steps to replicate the issue** (include links if applicable):
* Attempt to log in via Google Chrome or Arc (which is Chromium based)

**What happens?**:
* Even with third party cookies allowed and after clearing browser data, login is sometimes blocked or produces a bizarre error where it says the login was blocked but also that you're already logged in
{F57508047} {F57508046}

**What should have happened instead?**:

**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):
This happens on the latest version of Chrome for MacOS 128.0.6613.138

**Other information** (browser name/version, screenshots, etc.):
    • Task
    **Steps to replicate the issue** (include links if applicable):
* Acoustic query: WHEN donor_status_id is equal to one of the following (80 | 85 | 90 | 95)

**What happens?**:
11 donors in the results. CIDS: 58590405 61837508 62632871 62825957 50193811 62996739 42034634 62667164 60845717 61375489 12884388

**What should have happened instead?**:
There should not be any donors with these status ids according to https://civicrm.wikimedia.org/civicrm/wmf-segment
    • Task
    == Background

Currently we are increasing the spacing between h3s and a preceding paragraph for the legacy parser output, here: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/skins/MinervaNeue/+/842a91590a9d838f3f70678b83581c02b75297ca/resources/skins.minerva.base.styles/content/headings.less#119

Due to the layout of the Parsoid output, it's harder to target these same situations. It was suggested that the following:
```
.mw-heading3 {
    margin-top: 1.5em;
}

// If the h3 is not preceded by any text, i.e. is the first child of the parent section, remove the top margin.
section > div > section:first-child > .mw-heading3 {
    margin-top: 0;
}
```
would be a close approximation.

== User story
As a reader of wikis on a mobile device, I want to see a visual break between subsections so that I understand I am reading a new section of content.

== Requirements
- [ ] h3s preceded by a paragraph have a margin-top of `headingMargin * 3`
- [ ] h3s that are not preceded by a paragraph do not have added top margin

=== BDD
- For QA engineer to fill out

=== Test Steps
- For QA engineer to fill out

== Acceptance criteria
- [ ] https://en.m.wikivoyage.org/wiki/Hospet?useparsoid=0 and https://en.m.wikivoyage.org/wiki/Hospet?useparsoid=1 have identical spacing
- [ ] https://en.m.wikipedia.org/wiki/Taylor_Swift?useparsoid=0#Life_and_career and https://en.m.wikipedia.org/wiki/Taylor_Swift?useparsoid=1#Life_and_career have identical spacing

== Rollback plan
This should be a CSS change so it can be rolled back freely and without concern for the cache.
    • Task
    ####User Story:

Specific user story:
- As the Growth team Product Manager, I want to ensure we are sharing user testing results with the community and other interested stakeholders.

Underlying user stories:
- As a new Wikipedia account holder on mobile, I want editing workflows that are broken into a series of easy steps, so that I can successfully contribute.
- As a new Wikipedia account holder on mobile, I want editing support that is surfaced in the moment that I need it, so that I can successfully contribute.

####Task Scope:

After completion of {T373723} we should share results from our user tests on MediaWiki: https://www.mediawiki.org/wiki/Growth/Constructive_activation_experimentation

----

###Background
- Project page: [[ https://www.mediawiki.org/wiki/Growth/Constructive_activation_experimentation | Constructive activation experimentation ]]
- Related epic: {T368187}
- Related work: [[ https://www.mediawiki.org/wiki/Growth/Personalized_first_day/Structured_tasks | Structured tasks ]] & [[ https://www.mediawiki.org/wiki/Edit_check | Edit check ]]

Current full-page editing experiences require too much context, patience, and trial and error for many newcomers to contribute constructively. To support a new generation of volunteers, we will increase the number and availability of smaller, structured, and more task-specific editing workflows (e.g. Edit Check and Structured Tasks). The Growth team will primarily focus on Structured Tasks, while working closely with the Editing team to ensure our work integrates well with Edit Check.

This project aims to address the following user problem: **Getting started editing on Wikipedia is difficult and especially frustrating on mobile devices. I want the editing interface to provide the in-the-moment policy and technical guidance I need, so my initial efforts aren't reverted.**

This project aims to achieve the following user outcome: **As a new Wikipedia volunteer, I feel confident and enthusiastic about contributing to the Wikimedia movement by editing Wikipedia articles. The tools provided guide me step-by-step, limit distractions, and allow me to learn progressively so I can successfully contribute on my mobile device.**

#####Growth team hypothesis

As part of the Growth team 2024/2025 Annual Plan, the Growth team will explore various ways to increase constructive activation on mobile. This is part of the Wikimedia Foundation 2024-2025 Annual Plan, specifically the [[ https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2024-2025/Product_%26_Technology_OKRs#WE_KRs | Wiki Experiences 1.2 Key Result ]]

**Wiki Experiences 1.2 Key Result**
**Constructive activation:** Increase in the percentage of newcomers who publish ≥1 constructive edit in the main namespace on a mobile device.

**Wiki Experiences 1.2.3 Hypothesis**: If we conduct user tests on two or more design prototypes introducing Structured Tasks to newcomers within/proximate to the Visual Editor, then we can quickly learn which designs will work best for new editors, while also enabling engineers to assess technical feasibility and estimate effort for each approach.

----

####Acceptance Criteria
[] Review user testing results
[] Write a summary and share with Growth team
[] Post designs and a user testing summary on the related MediaWiki project page: https://www.mediawiki.org/wiki/Growth/Constructive_activation_experimentation#Design
    • Task
    There is a repeated word ["which which"](https://codesearch.wmcloud.org/search/?q=which+which&files=&excludeFiles=&repos=) in a description in some of the Metrics Platform schemas in https://gerrit.wikimedia.org/r/plugins/gitiles/schemas/event/secondary/ (I attempted submitting a patch for this but ran into an error `Cannot find module './yerror'`)
    • Task
    Documentation at https://www.mediawiki.org/wiki/Content_Transform_Team/Chores
- [] Vendor patch for commit `0c3bc681` for train 1.43.0-wmf.23 (T373642 (train ticket))
-- [x] RT-testing started (Friday by Europe EOD)
-- [x] regression script run
-- [x] RT-testing logs checked
-- [] Vendor+core patch created
-- [] Deployment changelog
-- [] Vendor patch reviewed
-- [] Patches (vendor + core) merged (Monday by US EOD)
- [] Group 0
-- [] logstash checked
-- [] Grafana checked
- [] Group 1
-- [] logstash checked
-- [] Grafana checked
- [] Group 2
-- [] logstash checked
-- [] Grafana checked
- [] Update status on [[ https://www.mediawiki.org/wiki/Parsoid/Deployments | deployment changelog ]] to done
- [] Monitor [[ https://www.mediawiki.org/wiki/Talk:Parsoid/Parser_Unification/Known_Issues | Parsoid Community-reported issues ]] (Thursday before triage meeting)
- [] PCS deployment
- [] Wikifeeds deployment
- [] Next week's phab created and linked on Slack bookmarks (template: https://www.mediawiki.org/wiki/Content_Transform_Team/Chores/Phabricator_template; link it here)
    • Task
    ===Background
Following on from {T373217}, we'd like to send a survey out to Stewards and Admins in order to check our assumptions.

===Key questions
1. Which types of users with extended rights (Stewards, CU admins, non-CU admins) regularly make IP address or IP range blocks?
2. What IP information do users consider most important when deciding whether to make an IP address or IP range block?
3. Do admins who make IP/range blocks understand IP information and how to use it in block decision-making?

===Checklist
[] Survey plan
[] Review survey questions
[] Participant recruitment
[] Setup Qualtrics
[] Deploy survey
    • Task
    Implementing the usage tracking tables envisioned in T370378; this will live mostly in JsonConfig but may need explicit invocation from Chart, and work is being done as part of the Charts task force:

* in every wiki (uncertain if this is needed, let's confirm or remove this):
** `jsonlinks` table with local JSON links matching linktarget IDs
* in a shared database, likely `x1` in production:
** `globaljsonlinks`, `globaljsonlinks_source_ns`, `globaljsonlinks_target` tables with wiki id, namespace name, and titles broken out to help avoid duplicate string storage

[todo: paste or reference the full ADR]

Internally, JsonConfig will add an API for recording usage of a data page into a ParserOutput, and then on page update/creation/deletion will update the local and/or tracking tables. It's unclear whether we should just use the global tracking table always, or use a local tracking table with a shared global tracking table as an add-on (this may simplify database administration).
    • Task
    This work will live mostly in JsonConfig with a portion in Chart, and is being done as part of the Charts taskforce.

Implementation for the cache invalidation based on global Data: page usage tracking as envisioned by T370378:

On all wikis:
* job queue job to follow the tracking tables for local usage of a given Data: page and purge those pages' cached output
** when a local JsonConfig page changes, fire off this job
** provide an API action to fire off this job for a given Data: page

On the shared wiki (Commons):
* job queue job to get a list of all wikis with usages for a given Data: page
** when a local JsonConfig page changes and we are the shared repository, fire off API requests to each wiki using the page, requesting them to do their local cache invalidation (see the sketch below)

This allows for safely propagating data from the Commons context to the individual wikis, which then perform their own invalidations in their local databases, with only a limited number of HTTP hits made even in the case of thousands of pages using a resource.

Concerns:
* We don't expect to need to put rate limiting or permission controls in this system, but if they were applied we'd want to use some secret key or something to validate the requests from the automated system.
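For illustration only, a rough Python sketch of the shared-wiki fan-out step; the real implementation would be a PHP job inside JsonConfig, and the API action name, usage-lookup helper, and payload are all assumptions:
```lang=python
import requests

def wikis_using(data_page: str) -> list[str]:
    """Hypothetical helper: return the api.php URLs of wikis recorded in the
    globaljsonlinks tracking tables as using the given Data: page."""
    raise NotImplementedError

def propagate_invalidation(data_page: str) -> None:
    # One HTTP request per wiki that uses the page; each wiki then runs its own
    # local job to purge the cached output of pages embedding the data.
    for api_url in wikis_using(data_page):
        requests.post(api_url, data={
            'action': 'jsonconfig-invalidate',  # hypothetical API action
            'title': data_page,
            'format': 'json',
        }, timeout=10)
```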
    • Task
    Steps to replicate:
* consider an item X with `mul` label "Jack", no English label and English description "given name"
* go to an item Y and add "Jack" as English label and "given name" as English description

What happens:
* the edits on item Y are saved

What should happen:
* the edits on item Y should not be saved, and a message should be displayed: "Could not save due to an error. Item X already has label "Jack" associated with language code en, using the same description text."

In fact in item X there is no English label, but since there is no English label the software should interpret the `mul` label as if it were the English label and thus inhibit the save.

Cf. T285156 regarding `mul` labels
See also: {T306918} (prevent adding duplicated labels to //the same// item)
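A minimal sketch of the intended fallback check, in Python pseudocode (the data shapes are assumptions, not Wikibase internals):
```lang=python
def effective_label(item: dict, lang: str):
    # Label used for the uniqueness check: the explicit label for `lang`,
    # falling back to the `mul` label when no label in `lang` exists.
    return item['labels'].get(lang) or item['labels'].get('mul')

def conflicts(existing_item: dict, new_label: str, new_description: str, lang: str) -> bool:
    return (effective_label(existing_item, lang) == new_label
            and existing_item['descriptions'].get(lang) == new_description)

# Example matching the report: item X has mul label "Jack", no English label,
# and English description "given name" -- saving the same en label/description
# on item Y should therefore be rejected.
item_x = {'labels': {'mul': 'Jack'}, 'descriptions': {'en': 'given name'}}
assert conflicts(item_x, 'Jack', 'given name', 'en')
```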
    • Task
    {T370880}

==Description
This task is the Java client update AC of T368326. Update the Metrics Platform Java client library to include the experiments enrollment data object in the final submit method of the metrics client. We will need to explore how to do bucketing assignments of users (in MW, this is handled by the MP extension) in Android << this should be spun off into a different task.

==Acceptance Criteria
[] Create a spike to determine best approach for bucketing users in the Android context
[] Update the app base schema to include the updated common fragment v1.2.0
[] Update the prod Android instruments to use the new app base schema
[] Test event validation e2e using beta stream

==Required
[] Unit/Integration tests?
[] Documentation?
[] Passed QA?
    • Task
    == Background
We enabled two quick surveys last sprint. We have enough data/browser extension users so should disable them both.

== User story
As a reader, I don't want to spend time doing quick surveys if the data is not going to be used.

== Requirements
- Add task requirements. Requirements should be user-centric, well-defined, unambiguous, implementable, testable, consistent, and comprehensive

=== BDD
- For QA engineer to fill out

=== Test Steps
- For QA engineer to fill out

== Design
- Add mockups and design requirements

== Acceptance criteria
- Add acceptance criteria

== Communication criteria - does this need an announcement or discussion?
- Add communication criteria

== Rollback plan
- What is the rollback plan in production for this task if something goes wrong?

//This task was created by Version 1.2.0 of the [[ https://mediawiki.org/w/index.php?title=Reading/Web/Request_process | Web team task template ]] using [[ https://phabulous.toolforge.org/ | phabulous ]] //
    • Task
    Currently the Metrics Platform docs state that all contextual attributes are optional. However, as part of investigating T374116, we discovered that instrument creators are required to specify `agent_client_platform_family` in the stream config for the JS and PHP clients. To do: Update https://wikitech.wikimedia.org/wiki/Metrics_Platform/How_to/Create_First_Metrics_Platform_Instrument and https://wikitech.wikimedia.org/wiki/Metrics_Platform/How_to/Creating_a_Stream_Configuration to reflect that `agent_client_platform_family` is a required contextual attribute
    • Task
    This task will track the #decommission of server frban2002.frack.codfw.wmnet.

With the launch of updates to the decom cookbook, the majority of these steps can be handled by the service owners directly. The DC Ops team only gets involved once the system has been fully removed from service and powered down by the decommission cookbook.

**frban2001.frack.codfw.wmnet**

**Steps for service owner:**
[x] - all system services confirmed offline from production use
[x] - set all icinga checks to maint mode/disabled while reclaim/decommission takes place. (likely done by script)
[x] - remove system from all lvs/pybal active configuration
[x] - any service group puppet/hiera/dsh config removed
[x] - netbox update to decommissioning status
[x] - host powered down
[] - remove all remaining puppet references and all host entries in the puppet repo
[] - remove from icinga config
[] - reassign task from service owner to no owner and ensure the site project (ops-sitename depending on site of server) is assigned.

**End service owner steps / Begin DC-Ops team steps:**
[] - system disks removed (by onsite)
[] - determine system age, under 5 years are reclaimed to spare, over 5 years are decommissioned.
[] - IF DECOM: system unracked and decommissioned (by onsite), update netbox with result and set state to offline
[] - IF DECOM: mgmt dns entries removed.
[] - IF RECLAIM: set netbox state to 'inventory' and hostname to asset tag
    • Task
    Release version/Date: 2.7.50499-r-2024-09-05
End experiment Date: 2024-09-25

#### OKR Hypothesis

This work is a part of the [[https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2024-2025/Product_%26_Technology_OKRs#Draft_Key_Results | 2024-2025 Annual Plan Wiki Experiences 3.1]] work.

**Hypothesis**
If we enhance the search field in the Android app to recommend personalized content based on a user's interest and display better results, we will learn if this improves user engagement by observing whether it increases the impression and click-through rate (CTR) of search results by 5% in the experiment groups compared to the control group over a 30-day A/B/C test. This improvement could potentially lead to a 1% increase in the retention of logged-out users.

#### How will we know we were successful

**Validation**
* Search Satisfaction rate of 65%
* 1% higher search retention rate from experiment group vs control during experiment period
* 5% of unique users click suggestion in search more than once in a 15 day period
* 5% increase in CTR of Search from experiment group compared to control group
* Personalized suggestion has 10% higher CTR than Generalized Suggestions

**Guardrails:**
* Experiment group doesn't have a higher abandonment rate than control
* No more than 2% of feedback includes reports of NSFW, Vandalism or Offensive recommendations
* Search doesn't worsen geographic bias

**Curiosities:**
- Required: Do we see a difference in metrics between logged in and logged out users?
- Does the preference for the type of content shown in search differ by platform and language?
- Would users like to see suggestions presented somewhere other than search?
- Do people return to the search just to click a suggestion?
- Should there be a filter for the type of content suggested (BLP, NSFW, Controversial Topics, etc.)?

#### Decision Matrix
TBD

####User Stories
- As a member of The Beyhive that read an article about Beyonce, I want to know that her new line of hair care products, Cecred, has a Wikipedia article the next time I open my search, so that I can read it, realize there is a reference to Fenty Beauty on it, and join a discussion on the talk page of whether it has relevance.
- As a football fan that stays up to date with current events, I want to be notified that the latest AFCON 2023 statistics are available on Wikipedia, so that I can see if Nigeria made it to the next round despite being at work completing a research paper.
- As a student learning about astronomy, I want to know there is an article about Titan after I read the Saturn article, so that I can check the references used for my research paper.
- As a Wikipedian in Kolkata, I want encouragement to check out The National Library of India article, so that it inspires me to visit in person and explore the 2.2 million books available and use it to add citations to articles I care about.

####Must Haves
- Retain access to recent search results
- Show suggestions in search when accessing search from the main page / explore feed and article view of Wikipedia in the main namespace
- Fallback solution for latency issues or lack of content
- Use existing APIs
- Experiment must be an ABC test using Metrics Platform

####Nice to Haves
- Show the topmost recommendation related to the current article in the search bar

#### Target Quant Regions and Languages
South Asia & Sub-Saharan Africa

**User Testing Languages**
- English
- Hindi
- French
- Arabic

**User Testing Considerations**
- Impact for screenreaders
- Impact for RtL readers
- Preferences based on Age

####Resources
Work in Progress Product [[ https://docs.google.com/presentation/d/17Zcb6-OiUS1mqJSCzLzh694SHkmZPEusKQ0jUUJoZzM/edit#slide=id.g2407935fefb_1_0 | Deck ]]
Data Instrumentation Planning [[ https://docs.google.com/presentation/d/1Wbab2-_cQQwkviD_-WmZ_2F67m63jEDf2768YS5uTfY/edit#slide=id.g2f4f4d6d613_0_0| Deck ]]
Instrumentation Process and Spec [[ https://docs.google.com/spreadsheets/d/1KtQGNL8QnNbmIL5fbxUmUpjoi9TqyvjVG-y-yp77IDQ/edit?gid=0#gid=0 | spreadsheet ]]
    • Task
    When I open Special:ArticleFeedbackv5, I get this error message 174 times before the page content:
Deprecated: json_decode(): Passing null to parameter #1 ($json) of type string is deprecated in /var/www/html/extensions/ArticleFeedbackv5/includes/ArticleFeedbackv5Utils.php on line 240
    • Task
    What/Why: [[ https://gitlab.wikimedia.org/repos/abstract-wiki/wikifunctions/function-orchestrator/-/merge_requests/218 | This MR ]] added an env var to toggle on metrics creation; we only want it turned on in Prod. Metrics will therefore not be generated elsewhere, because generating them interferes with running the test suite.

How: Add `GENERATE_FUNCTIONS_METRICS: true` to the OrchestratorConfig before deploying the above MR/commit to Production.
    • Task
    Old test cases add language links like `'foo'` and `'bar'` which aren't actually valid data; every language link entry should have a language prefix.
    • Task
    ScheduleDeploymentBot is great. Unfortunately, its edits are now tagged as bot edits, which means people watchlisting Deployments no longer get updates as soon as the bot makes an edit unless they manually visit. Can it be unmarked as a bot, please?
    • Task
    In T5233, a feature was added that adds a cookie to the browser of users who attempt to edit using a locally blocked IP or account. If they then attempt to edit while logged out or create an account, the system will block the attempt. The #GlobalBlocking extension should also have this functionality. This is so that global blocks can have the same effectiveness at preventing further edits as local blocks. It may also help with mitigating the impact of abuse of users who use #temporary_accounts as the system could apply cookie blocks to blocked temporary accounts.
    • Task
    Please relabel the following nodes following their rename:
[ ] `kubernetes2059` to `wikikube-worker2114`
[ ] `kubernetes2060` to `wikikube-worker2115`
[ ] `mw2301` to `wikikube-worker2116`
[ ] `mw2302` to `wikikube-worker2117`
[ ] `mw2303` to `wikikube-worker2118`
[ ] `mw2304` to `wikikube-worker2119`
[ ] `mw2305` to `wikikube-worker2120`
[ ] `mw2313` to `wikikube-worker2121`
[ ] `mw2314` to `wikikube-worker2122`
[ ] `mw2315` to `wikikube-worker2123`
Thanks!
    • Task
    Ex, ``` {| |- | colspan="2" style="height:1px;" | |} ``` The legacy parser puts the closing table data tag on a newline, whereas Parsoid does not. The whitespace results in a difference in rendering. https://parsoid-vs-core.wmflabs.org/diff/viwikivoyage/B%E1%BA%A3n_m%E1%BA%ABu:Quickbar https://vi.wikivoyage.org/wiki/B%E1%BA%A3n_m%E1%BA%ABu%3AQuickbar?useparsoid=0 https://vi.wikivoyage.org/wiki/B%E1%BA%A3n_m%E1%BA%ABu%3AQuickbar?useparsoid=1
    • Task
    Would it be possible to also add the full-color icons of the sibling projects beside the monochrome versions in our icon set? In our own case, we'd love to give stronger attribution to Wikidata. {F57505938}
    • Task
    The `Maintenance::runChild` method was marked as stable to override until it was renamed to `Maintenance::createChild`, leaving the old name as a deprecated alias. Because it was marked as stable to override we might need to wait for a release cycle to ensure that anyone who is overriding it can rename their overrides. This task is to replace usages of `Maintenance::runChild` with `::createChild` once this is appropriate [[ https://www.mediawiki.org/wiki/Stable_interface_policy#Stable_to_override | per the stable interface policy ]] and to do the pre-work necessary to reach this point.
    • Task
    As of today we have to explicitly list broker IPs in the values file of a service using a Kafka client. Recent Kafka clients should offer a way to discover those by doing a first NS lookup and then resolving the IPs returned by the name server.

In https://gerrit.wikimedia.org/r/c/operations/deployment-charts/+/1072759 we tested this approach declaring:
```
bootstrap.servers: kafka-main-eqiad.external-services.svc.cluster.local:9093
client.dns.lookup: resolve_canonical_bootstrap_servers_only
```

According to https://docs.cloudera.com/runtime/7.2.18/kafka-configuring/topics/kafka-client-dns-lookup-property.html
> Cloudera recommends that you set the resolve_canonical_bootstrap_servers_only value because this option provides the most fault tolerance.

I think this is the option we need in our case because it allows SSL. My understanding of the way it works is:
# do a lookup on kafka-main-eqiad.external-services.svc.cluster.local, which properly resolves to the IPv4 and IPv6 addresses of the 5 kafka-main hosts
# do a reverse lookup on these IPs to resolve their canonical name so that SSL hostname verification can happen
# initiate the discovery

In step 2, sadly, the canonical name of these IPs is not what I would expect; for instance `10.64.16.37` resolves to `10-64-16-37.kafka-main-eqiad.external-services.svc.cluster.local.`, which then fails SSL hostname verification with:
```
SSLHandshakeException: No subject alternative DNS name matching 2620-0-861-107-10-64-48-30.kafka-main-eqiad.external-services.svc.cluster.local found
```

We could disable hostname verification but this seems not ideal. I'm not sure what solutions we could explore (tell the nameserver running in k8s to resolve `10.64.16.37` to `kafka-main1002.eqiad.wmnet`, or add `10-64-16-37.kafka-main-eqiad.external-services.svc.cluster.local` to the Kafka nodes' certificates as a valid hostname).
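Not from the task: a small Python sketch of steps 1–2 above, handy for checking what the in-cluster resolver actually returns for the forward and reverse lookups; it only illustrates the DNS behaviour, not the Kafka client itself:
```lang=python
import socket

BOOTSTRAP = 'kafka-main-eqiad.external-services.svc.cluster.local'

# Step 1: forward lookup of the bootstrap name, returning the broker IPs.
ips = sorted({info[4][0] for info in socket.getaddrinfo(BOOTSTRAP, 9093)})

# Step 2: reverse lookup of each IP; this canonical name is what
# resolve_canonical_bootstrap_servers_only feeds into SSL hostname verification.
for ip in ips:
    try:
        canonical, _aliases, _addrs = socket.gethostbyaddr(ip)
    except socket.herror:
        canonical = '<no PTR record>'
    print(f'{ip} -> {canonical}')
```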
    • Task
    === Steps to reproduce
# Change device calendar to Japanese (iOS Settings > General > Language and Region > Calendar)
# Install app and On This Day widget

=== Expected results
On This Day Explore card and widget show correct relative "years ago" dates

=== Actual results
On This Day Explore card and widget show incorrect relative "years ago" dates

=== Screenshots
{F57505861} {F57505858}

=== Environments observed
**App version: ** 7.5.9 (4044)
**OS versions:** iOS 17.7
**Device model:** iPhone 12 Pro Max
**Device language:** EN
    • Task
    Surprising to find this still, but in https://pl.wikivoyage.org/wiki/Pilzno?useparsoid=0#Noclegi item 8 is:
```
* {{Znacznik|'''Zakwaterowanie Silenus'''|49.7333100 |13.3903428|typ=nocleg|url=|kolor=000080|treść=}}'''[https://plzen-ubytovani.cz/ Zakwaterowanie Silenus]''' (''Ubytovna Silenus'''), Zahradní 1941/25, Východní Předměstí (''przystanek Liliová: linie tramwajowe nr 1, 4 i linie autobusowe nr N2''),
```
If you look closely, you can see that `Ubytovna Silenus` has two leading single-quotes but three trailing single-quotes.

Comparing legacy and Parsoid, legacy renders this as:
* **Zakwaterowanie Silenus**' (Ubytovna Silenus),
while Parsoid renders it as:
* '//Zakwaterowanie Silenus// (Ubytovna Silenus)
That is, legacy boldfaces the first part and puts the unmatched quote last, while Parsoid renders it as italics and puts the unmatched quote first. Oddly, the "Ubytovna Silenus" part, which actually has the quote mismatch, is boldfaced in both renderings, and it's the leading part which gets unboldfaced by Parsoid.
    • Task
    From Slack. Donor said: "I found the error, please look at the below image, where "house number" is not mandatory, while somebody tries to donate without a house number there will be an error." @RKumar_WMF said, yes it's required. Screenshot: {F57505823} Let's update the form
    • Task
    Ex, ``` asdf 123 ``` Legacy ``` <pre>asdf </pre> <p>123 </p> ``` Parsoid ``` <pre data-parsoid='{"dsr":[0,5,1,0]}'>asdf</pre> <p data-parsoid='{"dsr":[7,12,0,0]}'> <br data-parsoid='{"dsr":[8,8,0,0]}'/> 123</p> ``` Can probably be closed as a known difference. From https://parsoid-vs-core.wmflabs.org/diff/viwikivoyage/S%E1%BB%95_tay_ti%E1%BA%BFng_Th%C3%A1i https://vi.wikivoyage.org/wiki/S%E1%BB%95_tay_ti%E1%BA%BFng_Th%C3%A1i?useparsoid=0 https://vi.wikivoyage.org/wiki/S%E1%BB%95_tay_ti%E1%BA%BFng_Th%C3%A1i?useparsoid=1
    • Task
    From https://pl.wikivoyage.org/w/index.php?title=Wikipodr%C3%B3%C5%BCe:Pub_podr%C3%B3%C5%BCnika&action=edit&section=49 ``` <section begin="announcement-content" />:''[[.....]] ``` Legacy parser renders that as indented text; parsoid renders a literal colon.
    • Task
    With the forthcoming deployment of the web interface for requestctl, it will introduce the ability to create and modify objects from it. This of course creates a conflict with the model used by the CLI tool, which uses //requestctl sync// to sync changes from a git repository.

How can we transition to the use of the web application? There are a few possible ways to go here.

**Keep sync working**
This would mean that we'd have to co-locate the web app and the git repository, and have the web app commit changes to git whenever something gets changed.
Pros:
- it would allow people who want to keep using the CLI to keep their workflow mostly unchanged
- existing scripts would keep working
- we'd keep having the same form of auditing that we have had up until now.
Cons:
- I can see all sorts of race conditions we'd have to solve with locks, which then would risk creating deadlocks, which is definitely something you don't want to have to debug when responding to an outage
- It would tie the deployment of the webapp to the git repository location, and/or discourage deployment on k8s even in the future.

**Remove the sync function from the CLI tool**
This would imply that the CLI tool would retain all the rest of its functions, including the ability to enable/disable rules and to commit changes to the DSL. Given we don't need to be tightly coupled, we could even retain the audit logs on git, by creating an auxiliary program that would receive audit log messages from the application and convert those to git commits, although I think just writing a good series of audit log messages should be enough.
Pros:
* It would allow us complete freedom to deploy the webapp anywhere
* It would clarify there's only one place where changes should be made by humans
Cons:
* It would disrupt the workflow of people who prefer the CLI (but, is there anyone that would prefer editing a yaml file in vim rather than filling a web form with better feedback?)
* If we don't implement a git-based audit trail, we would probably have a much worse level of detail in our changes

# Migration plan

If we choose to keep sync working, the migration plan is mostly the migration plan of the git repository to the alerting hosts:
* Add a locking mechanism to requestctl so that the webapp can lock syncing until it has updated the git repository
* Create a repo on the alert hosts, make them sync similarly to how /srv/private is kept in sync
* Import all requestctl data and the corresponding commit history, if possible (I don't think it would be, but git has surprised me time and time again; I'm done betting against its ability to do anything)
* Move the automated scripts to update publicly available ipblocks and data from our IP reputation provider
* Install the webapp to run similarly to what klaxon does

If we instead decide to stop supporting sync, we need to:
* Develop an audit log functionality for the web app specifically, optionally create a small python app that takes an audit log line as input and generates a git commit (a rough sketch follows below)
* Prepare a conftool release where sync is disabled and prints out a message directing people to use the webapp
* Install the web application on the alert hosts, and the log-to-git converter on the puppetservers where `/srv/private` is located
* Send an email to ops@ announcing the breaking change, X days in advance
* Prepare some documentation changes
* Deploy the new conftool release, send an email to ops@ announcing the change has happened
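For discussion purposes only, a minimal sketch of that audit-log-to-git converter; the log format, repository path, and author mapping are all assumptions, not decisions:
```lang=python
#!/usr/bin/env python3
"""Read requestctl web-app audit log lines from stdin and turn each into a git commit.
Assumed line format: "<user> <action> <object-path>", e.g. "jdoe update pattern/cache-text/foo".
"""
import subprocess
import sys

REPO = '/srv/private/requestctl'  # assumed location of the requestctl git checkout


def commit_audit_line(line: str) -> None:
    user, action, obj = line.strip().split(maxsplit=2)
    # Stage whatever the web app just changed on disk, then record who did it.
    subprocess.run(['git', '-C', REPO, 'add', '--all'], check=True)
    subprocess.run(
        ['git', '-C', REPO, 'commit',
         '--author', f'{user} <{user}@wikimedia.org>',
         '-m', f'requestctl-web: {user} {action} {obj}'],
        check=True,
    )


if __name__ == '__main__':
    for audit_line in sys.stdin:
        if audit_line.strip():
            commit_audit_line(audit_line)
```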
    • Task
    This task builds on the v0 metrics definitions defined during the work outlined in T372102.

[] Research: Define the scope of content to measure with v0 proposed tech docs metrics. Resolve [[ https://www.mediawiki.org/wiki/Wikimedia_Technical_Documentation_Team/Doc_metrics/Research#RQ3 | RQ3 ]].
[] Design: Use either developer workflows, MediaWiki core code structure, or some other logical framework to identify several content collections to measure.
[] Implement: Use PagePile or other mechanism to create collections.

Target completion date: October 30, 2024
    • Task
    == GitLab account activation ==

**To activate your account, please sign into GitLab and then answer the following question:**
* Developer account / GitLab username: @ilario

== Activation checklist ==

[] User has provided the following: developer account username
[] User has an existing developer account, and has used it to log in to GitLab

If any of the following criteria are met, user should be approved immediately:
[] User has a history of contributions on-wiki, on Gerrit, Phabricator, etc.
[] User is known to the admin
[] User is vouched for by a known contributor
[X] User is a member of a movement organization
    • Task
    Please create two new options in the communications area: No Direct Mail and No Paper TY. No Direct Mail will need to sync with Acoustic at some point.
    • Task
    **Feature summary** (what you would like to be able to do and where):
I would like to be able to make use of WMF's subscriptions to query IP information on arbitrary IPs, IP ranges, and ASNs. Subsets of this information are available via the IP Information tools, but only on a project where the IP was recently used for writing.

**Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution):
Would like to be able to input via a dedicated method such as on the web UI "Special:IPInfo/$1", where $1 can be an IP, IP range, or ASN. Potentially also via the API. Permissions for such could reuse the existing permission model for IP information tools. Would like the output to include as much information as feasible, including making use of subscription services such as SPUR.

**Benefits** (why should this be implemented?):
In dealing with IP blocks, unblocks, range blocks, and exceptions it is very useful to have knowledge of the IP information and usage for such IPs. Several volunteer services on Toolforge have worked on this (e.g. ipcheck.toolforge.org, bullseye.toolforge.org); however, these tools have become deprecated or unmaintained.
    • Task
    Checking a build of [[ https://integration.wikimedia.org/ci/job/wmf-quibble-core-vendor-mysql-php74/ | wmf-quibble-core-vendor-mysql-php74 ]], cloning and checking out the repos took 157 seconds or 2 minutes 37 seconds. That is a bit long.

I copied the Jenkins console output with the elapsed time and went to process them with some python:

``` lang=python
#!/usr/bin/env python
import datetime
import re


def parse(logs):
    times = {}
    for line in logs.splitlines():
        match = re.match(
            '(?P<timing>.+) INFO:zuul.Cloner.(?P<repo>.+):(Creating|Prepared)',
            line
        )
        if match:
            (hours, minutes, seconds, milliseconds) = re.split(
                '[\\.:]', match['timing'])
            delta = datetime.timedelta(
                hours=int(hours),
                minutes=int(minutes),
                seconds=int(seconds),
                milliseconds=int(milliseconds),
            )
            if match['repo'] in times:
                duration = delta - times[match['repo']]
                times[match['repo']] = duration
            else:
                times[match['repo']] = delta

    for repo in reversed(sorted(times, key=times.get)):
        print("%s %s" % (repo, times[repo].total_seconds()))


parse("""
<LOG HERE>
""")
```

Which gives:

``` lines=15
mediawiki/extensions/GrowthExperiments 24.505
mediawiki/extensions/Translate 22.374
mediawiki/extensions/MobileFrontend 22.021
mediawiki/extensions/CirrusSearch 20.855
mediawiki/extensions/ContentTranslation 20.646
mediawiki/extensions/InputBox 20.413
mediawiki/extensions/GuidedTour 19.909
mediawiki/extensions/Interwiki 19.711
mediawiki/extensions/ConfirmEdit 19.399
mediawiki/extensions/ImageMap 19.205
mediawiki/extensions/ProofreadPage 19.131
mediawiki/extensions/Kartographer 19.046
mediawiki/extensions/GlobalPreferences 18.968
mediawiki/extensions/Babel 18.693
mediawiki/extensions/UniversalLanguageSelector 18.612
mediawiki/extensions/Graph 18.571
mediawiki/extensions/JsonConfig 18.464
mediawiki/extensions/MobileApp 18.39
mediawiki/extensions/Elastica 18.371
mediawiki/extensions/FileImporter 18.058
mediawiki/extensions/Poem 17.87
mediawiki/extensions/PdfHandler 17.818
mediawiki/extensions/EventBus 17.694
mediawiki/extensions/PageImages 17.465
mediawiki/extensions/SpamBlacklist 17.376
mediawiki/extensions/GeoData 17.373
mediawiki/extensions/SandboxLink 16.735
mediawiki/extensions/Disambiguator 16.731
mediawiki/extensions/TimedMediaHandler 16.286
mediawiki/extensions/EventStreamConfig 16.053
mediawiki/extensions/SiteMatrix 15.782
mediawiki/extensions/Thanks 15.562
mediawiki/extensions/WikimediaMessages 15.456
mediawiki/extensions/CiteThisPage 15.192
mediawiki/extensions/MediaModeration 14.591
mediawiki/extensions/WikiEditor 14.407
mediawiki/extensions/IPInfo 14.226
mediawiki/extensions/AntiSpoof 13.773
mediawiki/extensions/WikiLove 13.699
mediawiki/extensions/CheckUser 13.034
mediawiki/extensions/AbuseFilter 12.911
mediawiki/extensions/WikibaseMediaInfo 12.286
mediawiki/extensions/EventLogging 12.246
mediawiki/extensions/Math 12.085
mediawiki/extensions/PageTriage 11.916
mediawiki/extensions/Wikibase 11.675
mediawiki/extensions/PageViewInfo 11.412
mediawiki/extensions/WikibaseCirrusSearch 11.406
mediawiki/extensions/Scribunto 10.858
mediawiki/extensions/cldr 10.443
mediawiki/extensions/TemplateData 10.215
mediawiki/skins/MinervaNeue 9.873
mediawiki/extensions/Gadgets 9.757
mediawiki/extensions/ParserFunctions 9.713
mediawiki/extensions/Echo 9.664
mediawiki/extensions/GlobalCssJs 9.473
mediawiki/extensions/VisualEditor 9.34
mediawiki/extensions/CodeEditor 8.944
mediawiki/vendor 8.903
mediawiki/skins/Vector 8.818
mediawiki/extensions/BetaFeatures 8.6
mediawiki/extensions/NavigationTiming 7.646
mediawiki/extensions/Cite 5.806
mediawiki/extensions/CommunityConfiguration 3.48
```

The sum is around 15 minutes but we run up to 8 clones in parallel. My guess is something is off in the infra. Maybe the mirrors are not working or the Zuul merger has too many refs.

The data are from cloning from `contint1002.wikimedia.org`.
    • Task
    Add support and enable IPv6 on cloudgw.
    • Task
    We want the FQDN for VMs to have an IPv6 record registered in openstack designate. This ticket is to track the work to make it happen.
    • Task
    We need to make sure neutron security groups works as expected using IPv6.
    • Task
    As part of {T364725} and {T245495}, we need to enable IPv6 on the codfw cloudsw device, so we can provide edge IPv6 routing to the cloud.
    • Task
    In parent task {T187929} there has been work done to decide on the CIDR allocations for IPv6 in Cloud VPS. This ticket is to track the work to reflect that in netbox.
    • Task
    I was debugging an issue with `sync_check_icinga_contacts` as part of T372418 and the problem was that `keyholder-proxy` didn't restart after `/etc/keyholder-auth.d/metamonitor.yml` was deployed, and thus didn't grant access to `metamonitor` user to the respective ssh key. Not surprisingly, `modules/keyholder/manifests/agent.pp` doesn't refresh the service. I tried a straightforward fix: ```lang=diff --- a/modules/keyholder/manifests/init.pp +++ b/modules/keyholder/manifests/init.pp @@ -109,6 +109,7 @@ class keyholder( group => 'root', mode => '0444', content => "REQUIRE_ENCRYPTED_KEYS='${require_encrypted_keys}'\n", + notify => Service['keyholder-proxy'], } # The `keyholder` script provides a simplified command-line ``` Which doesn't work as I thought and results in a dependency cycle: ``` Error: Found 1 dependency cycle: (File[/etc/keyholder-auth.d/metamonitor.yml] => Service[keyholder-proxy] => Systemd::Service[keyholder-proxy] => Class[Keyholder] => Keyholder::Agent[metamonitor] => File[/etc/keyholder-auth.d/metamonitor.yml])\nTry the '--graph' option and opening the resulting '.dot' file in OmniGraffle or GraphViz Error: Failed to apply catalog: One or more resource dependency cycles detected in graph ``` And I can't quite figure out what the problem or the fix can be in this situation
    • Task
    The current main/main is deployed at test-spenden-2. Problem: - when entering a valid IBAN, the error does not vanish / validation does not pass {F57505429} Solution: - investigate recent changes around DirectDebit components
    • Task
    The access for Chelsy Xie (chelsyx) was removed. It needs to be checked if data was left in home dirs on stat*/HDFS since they were part of the "analytics-privatedata-users" group. There was no Kerberos principal.
    • Task
    The access for Erik Zachte (ezachte) was removed. It needs to be checked if data was left in home dirs on stat*/HDFS since they were part of the "analytics-privatedata-users" group. There was no Kerberos principal.
    • Task
    This will include multiple pipelines, with separate sub-tasks as needed.
- The first set of pipelines will be using Airflow to calculate the required metrics, and publishing to https://analytics.wikimedia.org/published/datasets
- The second set of pipelines will be using [[ https://wikitech.wikimedia.org/wiki/Help:Toolforge/Jobs_framework | Toolforge jobs framework ]] to read from the published dataset, and update the db in ToolsDB (which will be used by the dashboard).

----

=== Set A (Airflow jobs) ===
To aggregate and publish metrics (a minimal DAG sketch follows below).

**Automoderator config**
* Data source: https://noc.wikimedia.org/conf/InitialiseSettings.php.txt
** wmgUseAutoModerator & wgAutoModeratorUsername
* Processing: Job repo & Airflow DAG
* Output:
** `wmf_product.automoderator_config`
** https://analytics.wikimedia.org/published/datasets
* Frequency: weekly

**Daily monitoring metrics**
* Data source: MariaDB-replicas (tables: revision, change_tag, change_tag_def, actor, user)
* Processing: SQL queries (tweaked version of [[ https://gitlab.wikimedia.org/kcvelaga/automoderator-measurement/-/blob/main/pilot_analysis/queries/automod_reverts_info.sql | automod_reverts_info.sql ]] can be used), Job script and Airflow DAG
* Output:
** `wmf_product.automoderator_monitoring_metrics_daily`
** https://analytics.wikimedia.org/published/datasets
* Frequency: daily

**Monthly key metrics**
* Data source: `wmf.mediawiki_history`
* Processing: HQL queries and Airflow DAG
* Output:
** `wmf_product.automoderator_key_metrics_monthly`
** https://analytics.wikimedia.org/published/datasets
* Frequency: monthly

**Revert proportion handled monthly**
* Data source: `wmf.mediawiki_history`
* Processing: HQL queries and Airflow DAG
* Output:
** `wmf_product.automoderator_revert_proportion_monthly`
** https://analytics.wikimedia.org/published/datasets
* Frequency: monthly

=== Set B (Toolforge jobs) ===
To load the published metrics into ToolsDB.
* A daily job to update the monitoring metrics.
* A monthly job to update the key metrics.
* maybe split into another one to update the revert proportion handled data.
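For orientation only, a minimal sketch of what one of the Set A jobs could look like as an Airflow DAG (assuming Airflow 2.4+); the DAG id, callable, and scheduling details are assumptions, and the real job would follow the conventions of the analytics airflow-dags repo:
```lang=python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def snapshot_automoderator_config(**_context):
    # Placeholder: fetch https://noc.wikimedia.org/conf/InitialiseSettings.php.txt,
    # extract wmgUseAutoModerator / wgAutoModeratorUsername per wiki, then write to
    # wmf_product.automoderator_config and the published datasets directory.
    raise NotImplementedError


with DAG(
    dag_id='automoderator_config_weekly',  # hypothetical name
    start_date=datetime(2024, 9, 1),
    schedule='@weekly',
    catchup=False,
) as dag:
    PythonOperator(
        task_id='snapshot_config',
        python_callable=snapshot_automoderator_config,
    )
```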
    • Task
    As described in the parent ticket, we are exploring the idea of a global Community Configuration page which would enable Stewards and Meta administrators to control Automoderator across a large number of Wikipedias. We would like to start exploring designs for this global configuration so that we can identify open questions we might still have about how it should work.
    • Task
    As part of T372904 some methods concerned with weighted tags have been deprecated. For the sake of cleaning up the code base, they shall be removed. **AC:** * remove methods marked `@deprecated` as part of T372904, in particular from `CirrusSearch`, `DataSender`, and `MultiListBuilder` * adapt tests to use the replacement methods
    • Task
    Per the parent task, we would like to provide a global configuration, for Stewards and Meta administrators to control Automoderator across our smaller Wikimedia projects. This configuration might have a single set of options which apply to all the projects within the scope of this control (i.e. bot flag, minor edit flag, talk page message), and then a list of wikis for global control of where Automoderator should be active.

The initial list of possible wikis for global control should be defined by the 'Opted-out of global sysop wikis' WikiSet (https://meta.wikimedia.org/wiki/Special:WikiSets/7) - i.e. only the wikis which have not opted out. Further decisions about which wikis should be in scope or not can be deferred to Stewards and Meta admins. This WikiSet does not change often so it is likely acceptable to hard-code this list based on the wikis in the set at the time of implementation.

It should be possible for a wiki in scope of this configuration to assume control over Automoderator by setting up local configuration which would take precedence over the global configuration. See T372413, particularly T372413#10130621 for the technical approach to tackling this.
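Purely as an illustration of the precedence rule (the real configuration schema, and whether overrides are per-key or all-or-nothing, are still open questions):
```lang=python
def resolve_automoderator_config(global_config: dict, local_config: dict | None) -> dict:
    # A wiki with its own local Community Configuration overrides the global
    # settings key by key; otherwise the global configuration applies as-is.
    if not local_config:
        return dict(global_config)
    return {**global_config, **local_config}

# Example with hypothetical keys:
global_cfg = {'use_bot_flag': True, 'use_minor_flag': False, 'leave_talk_page_message': True}
local_cfg = {'use_minor_flag': True}
assert resolve_automoderator_config(global_cfg, local_cfg)['use_minor_flag'] is True
```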
    • Task
    Umbrella task for manual account linking.

If your [[ https://www.mediawiki.org/wiki/Developer_account | Wikimedia Developer Account ]] is different from your [[ https://meta.wikimedia.org/wiki/Help:Unified_login | Wikimedia Unified Login (SUL) ]]:
* Please visit [[ https://idm.wikimedia.org/ | Wikimedia IDM ]] where you can verify the ownership of your SUL account
* Leave a comment in this task stating both usernames

**Sample Comment**
* **Wikitech:** https://wikitech.wikimedia.org/wiki/User:Effie_Mouzeli
* **SUL:** https://meta.wikimedia.org/wiki/User:EMouzeli_(WMF)
    • Task
    To be able to create some Superset dashboards, we need some data in the data lake. However, we are not likely to have real production data for some days (weeks?) yet. But if we generate some fake data that looks more or less like the community updates data, we can still play with Superset, see how far we can get with the current assumptions, and encounter problems that we can start thinking about.
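A throwaway sketch of what such a generator could look like; the column names, value ranges, and CSV output are all made up, since the real schema for the community updates data is still to be defined:
```lang=python
import csv
import random
from datetime import date, timedelta

WIKIS = ['enwiki', 'dewiki', 'frwiki', 'eswiki']  # arbitrary sample of projects


def fake_rows(days=90):
    """Yield one fake 'community updates'-like row per wiki per day."""
    start = date.today() - timedelta(days=days)
    for offset in range(days):
        day = start + timedelta(days=offset)
        for wiki in WIKIS:
            yield {
                'dt': day.isoformat(),
                'wiki': wiki,
                'updates_published': random.randint(0, 20),
                'unique_authors': random.randint(0, 10),
            }


with open('fake_community_updates.csv', 'w', newline='') as fh:
    writer = csv.DictWriter(fh, fieldnames=['dt', 'wiki', 'updates_published', 'unique_authors'])
    writer.writeheader()
    writer.writerows(fake_rows())
```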
    • Task
    There are 9 Wikipedia projects with volunteer-maintained anti-vandalism bots:

| Project | Bot |
| ----- | ----- |
| en.wiki | [[ https://en.wikipedia.org/wiki/User:ClueBot_NG | ClueBot NG ]] |
| es.wiki | [[ https://es.wikipedia.org/wiki/Usuario:SeroBOT | SeroBOT ]] |
| fr.wiki | [[ https://fr.wikipedia.org/wiki/Utilisateur:Salebot | Salebot ]] |
| pt.wiki | [[ https://pt.wikipedia.org/wiki/Usuário(a):Salebot | Salebot ]] |
| fa.wiki | [[ https://fa.wikipedia.org/wiki/%DA%A9%D8%A7%D8%B1%D8%A8%D8%B1:Dexbot | Dexbot ]] |
| bg.wiki | [[ https://bg.wikipedia.org/wiki/%D0%9F%D0%BE%D1%82%D1%80%D0%B5%D0%B1%D0%B8%D1%82%D0%B5%D0%BB:PSS_9 | PSS 9 ]] |
| simple.wiki | [[ https://simple.wikipedia.org/wiki/User:ChenzwBot | ChenzwBot ]] |
| ru.wiki | [[ https://ru.wikipedia.org/wiki/%D0%A3%D1%87%D0%B0%D1%81%D1%82%D0%BD%D0%B8%D1%86%D0%B0:%D0%A0%D0%B5%D0%B9%D0%BC%D1%83_%D0%A5%D0%B0%D0%BA%D1%83%D1%80%D0%B5%D0%B9 | Рейму Хакурей ]] |
| ro.wiki | [[ https://ro.wikipedia.org/wiki/Utilizator:PatrocleBot | PatrocleBot ]] |

See {T341857} for relevant background data on these bots' activity.

At some point in the future these communities may want to evaluate moving from their volunteer-maintained tool to Automoderator. We should be ready with data comparing their options so that they can make informed decisions.

Questions we may want to answer:
* What is the median (and mode) number of edits that the existing bot and Automoderator would revert per month?*
* What percentage of the bot's edits would Automoderator revert?
* What percentage of the reverts Automoderator would make is the bot currently making?
* What is the false positive rate of each bot?
* What is the average time to revert for the bot and for Automoderator?
* What is the user type of edits reverted (as in the [[ https://superset.wmcloud.org/superset/dashboard/37/?native_filters_key=Wswo8SSCin522BRV85LmiiqAmiJNzvHrW5liwY-ZM08_uObzm9jnGppk6r3apJMN | dashboard ]])?

//*This could also be per day, but given the low numbers for some projects monthly might be more illustrative?//
    • Task
    As part of the effort of simplifying namespaces and deployments we want to move the article-descriptions model to the new article-models namespace. The migration plan will look something like this:
# Deploy article-descriptions to article-models namespace
# Make sure the required configuration and connectivity exists in the new namespace
# Update related documentation
# Ensure no traffic exists on the current service and delete the previous deployment
    • Task
    In future iterations, we would need to account for **campaign/event pages **that: a) present their lists in multiple sub-pages, for example, -> [[ https://meta.wikimedia.org/wiki/Wikimedia_CEE_Spring_2024/Structure/Armenia | Wikimedia_CEE_Spring_2024 ]] b) present their lists in the form of categories, for example, -> [[ https://meta.wikimedia.org/wiki/Igbo_Wikimedians_User_Group_Black_History_Month_2022/To_do | Igbo_Wikimedians_User_Group_Black_History_Month_2022 ]]
    • Task
    The "list has X moderation requests waiting" email previously provided a link to the mailing list that I could easily click to go and deal with the requests, but this seems to have disappeared, so the email now provides no link at all to lists.wikimedia.org, meaning I have to navigate there manually after getting the email. This is frustrating and I'd like the link back :)
    • Task
    **Steps to replicate the issue** (include links if applicable):
* see https://de.wikipedia.beta.wmflabs.org/wiki/Benutzer:Hgzh/autolinkText.css

**What happens?**:
In line 3, the template isn't linked because there is a wikilink following on the same line

**What should have happened instead?**:
Template should have been linked regardless of the following content
    • Task
    ## Description

These functions should ensure the `Z669*K1` String passes a regex (e.g. `/L[1-9]\d*/` for `Z6695/Wikidata lexeme reference`); see the illustrative snippet below.

[] Validator function for `Z6691`
[] Validator function for `Z6692`
[] Validator function for `Z6694`
[] Validator function for `Z6695`
[] Validator function for `Z6696`

**Desired behavior/Acceptance criteria (returned value, expected error, performance expectations, etc.)**
[] Each validator function returns true if and only if the `Z669*K1` String matches the appropriate regex.

---

## Completion checklist

* [ ] Before closing this task, review one by one the checklist available here: https://www.mediawiki.org/wiki/Abstract_Wikipedia_team/Definition_of_Done#Back-end_Task/Bug_completion_checklist
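Only to make the expected semantics concrete, a plain-Python rendering of the check (the real validators will be implemented as Wikifunctions functions, not Python; only the `Z6695` pattern comes from the description above, the other types' patterns are still to be defined):
```lang=python
import re

# Per-type patterns; only Z6695 is given in the task description, the rest are TBD.
PATTERNS = {
    'Z6695': re.compile(r'L[1-9]\d*'),  # Wikidata lexeme reference
}


def is_valid_reference(ztype: str, value: str) -> bool:
    """True if and only if the Z669*K1 string fully matches the pattern for its type."""
    pattern = PATTERNS.get(ztype)
    return bool(pattern and pattern.fullmatch(value))


assert is_valid_reference('Z6695', 'L12345')
assert not is_valid_reference('Z6695', 'L0123')
assert not is_valid_reference('Z6695', 'Q42')
```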
    • Task
    **Steps to replicate the issue** (include links if applicable):
(Observed but not tested. I don’t trust automatic translation so I don’t use it.)
* Translate an article on enwiki to zh-yue

**What happens?**:
Various things in citations which should **not** be translated are being translated. Specifically,
* Template names are being translated, but not to the correct names, causing the translated page to transclude **non-existent templates**.
* Template parameters are being translated, causing **all** parameters to become **illegal parameters**.
* Things like author names and titles are all being translated, causing the cited work to become unrecognizable (and potentially incorrect).

**What should have happened instead?**:
Citations should be left alone.

**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):

**Other information** (browser name/version, screenshots, etc.):
    • Task
    This task will track the #decommission of server kafka-main2005.codfw.wmnet.

With the launch of updates to the decom cookbook, the majority of these steps can be handled by the service owners directly. The DC Ops team only gets involved once the system has been fully removed from service and powered down by the decommission cookbook.

kafka-main2005.codfw.wmnet

**Steps for service owner:**
[x] - all system services confirmed offline from production use
[x] - set all icinga checks to maint mode/disabled while reclaim/decommission takes place. (likely done by script)
[x] - remove system from all lvs/pybal active configuration
[x] - any service group puppet/hiera/dsh config removed
[x] - remove site.pp, replace with role(spare::system) recommended to ensure services offline but not 100% required as long as the decom script is IMMEDIATELY run below.
[x] - login to cumin host and run the decom cookbook: cookbook sre.hosts.decommission <host fqdn> -t <phab task>. This does: bootloader wipe, host power down, netbox update to decommissioning status, puppet node clean, puppet node deactivate, debmonitor removal, and run homer.
[x] - remove all remaining puppet references and all host entries in the puppet repo
[x] - reassign task from service owner to no owner and ensure the site project (ops-sitename depending on site of server) is assigned.

**End service owner steps / Begin DC-Ops team steps:**
[] - system disks removed (by onsite)
[] - determine system age, under 5 years are reclaimed to spare, over 5 years are decommissioned.
[] - IF DECOM: system unracked and decommissioned (by onsite), update netbox with result and set state to offline
[] - IF DECOM: mgmt dns entries removed.
[] - IF RECLAIM: set netbox state to 'inventory' and hostname to asset tag
    • Task
    ==== Error ==== * mwversion: 1.43.0-wmf.22 * reqId: af161e5c-577e-4abb-8ca2-0ed35185f077 * [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2024-09-12T06:52:44.694Z',to:'2024-09-13T07:12:50.697Z'))&_a=(query:(query_string:(query:'reqId:%22af161e5c-577e-4abb-8ca2-0ed35185f077%22'))) | Find reqId in Logstash ]] ```name=normalized_message,lines=10 Expectation (writes <= 0) by MediaWiki\MediaWikiEntryPoint::restInPeace not met (actual: {actualSeconds}): {query} ``` ```name=exception.trace,lines=10 from /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/TransactionProfiler.php(534) #0 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/TransactionProfiler.php(335): Wikimedia\Rdbms\TransactionProfiler->reportExpectationViolated(string, Wikimedia\Rdbms\Query, int, string, string) #1 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/TransactionManager.php(593): Wikimedia\Rdbms\TransactionProfiler->recordQueryCompletion(Wikimedia\Rdbms\Query, float, bool, int, string, string) #2 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/Database.php(822): Wikimedia\Rdbms\TransactionManager->recordQueryCompletion(Wikimedia\Rdbms\Query, float, bool, int, string) #3 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/Database.php(707): Wikimedia\Rdbms\Database->attemptQuery(Wikimedia\Rdbms\Query, string, bool) #4 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/Database.php(634): Wikimedia\Rdbms\Database->executeQuery(Wikimedia\Rdbms\Query, string, int) #5 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/Database.php(1471): Wikimedia\Rdbms\Database->query(Wikimedia\Rdbms\Query, string) #6 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/DBConnRef.php(127): Wikimedia\Rdbms\Database->insert(string, array, string, array) #7 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/DBConnRef.php(407): Wikimedia\Rdbms\DBConnRef->__call(string, array) #8 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/querybuilder/InsertQueryBuilder.php(343): Wikimedia\Rdbms\DBConnRef->insert(string, array, string, array) #9 /srv/mediawiki/php-1.43.0-wmf.22/extensions/CheckUser/src/Services/CheckUserCentralIndexManager.php(272): Wikimedia\Rdbms\InsertQueryBuilder->execute() #10 /srv/mediawiki/php-1.43.0-wmf.22/extensions/CheckUser/src/Services/CheckUserCentralIndexManager.php(97): MediaWiki\CheckUser\Services\CheckUserCentralIndexManager->getWikiMapIdForDomainId(string) #11 /srv/mediawiki/php-1.43.0-wmf.22/extensions/CheckUser/src/Services/CheckUserInsert.php(194): MediaWiki\CheckUser\Services\CheckUserCentralIndexManager->recordActionInCentralIndexes(MediaWiki\User\User, string, string, string, bool) #12 /srv/mediawiki/php-1.43.0-wmf.22/includes/deferred/MWCallableUpdate.php(52): MediaWiki\CheckUser\Services\CheckUserInsert->MediaWiki\CheckUser\Services\{closure}(string) #13 /srv/mediawiki/php-1.43.0-wmf.22/includes/deferred/DeferredUpdates.php(460): MediaWiki\Deferred\MWCallableUpdate->doUpdate() #14 /srv/mediawiki/php-1.43.0-wmf.22/includes/deferred/DeferredUpdates.php(204): MediaWiki\Deferred\DeferredUpdates::attemptUpdate(MediaWiki\Deferred\MWCallableUpdate) #15 /srv/mediawiki/php-1.43.0-wmf.22/includes/deferred/DeferredUpdates.php(291): MediaWiki\Deferred\DeferredUpdates::run(MediaWiki\Deferred\MWCallableUpdate) #16 /srv/mediawiki/php-1.43.0-wmf.22/includes/deferred/DeferredUpdatesScope.php(243): MediaWiki\Deferred\DeferredUpdates::MediaWiki\Deferred\{closure}(MediaWiki\Deferred\MWCallableUpdate, 
int) #17 /srv/mediawiki/php-1.43.0-wmf.22/includes/deferred/DeferredUpdatesScope.php(172): MediaWiki\Deferred\DeferredUpdatesScope->processStageQueue(int, int, Closure) #18 /srv/mediawiki/php-1.43.0-wmf.22/includes/deferred/DeferredUpdates.php(310): MediaWiki\Deferred\DeferredUpdatesScope->processUpdates(int, Closure) #19 /srv/mediawiki/php-1.43.0-wmf.22/includes/MediaWikiEntryPoint.php(674): MediaWiki\Deferred\DeferredUpdates::doUpdates() #20 /srv/mediawiki/php-1.43.0-wmf.22/includes/MediaWikiEntryPoint.php(496): MediaWiki\MediaWikiEntryPoint->restInPeace() #21 /srv/mediawiki/php-1.43.0-wmf.22/includes/MediaWikiEntryPoint.php(454): MediaWiki\MediaWikiEntryPoint->doPostOutputShutdown() #22 /srv/mediawiki/php-1.43.0-wmf.22/includes/MediaWikiEntryPoint.php(209): MediaWiki\MediaWikiEntryPoint->postOutputShutdown() #23 /srv/mediawiki/php-1.43.0-wmf.22/index.php(58): MediaWiki\MediaWikiEntryPoint->run() #24 /srv/mediawiki/w/index.php(3): require(string) #25 {main} ``` ==== Impact ==== ==== Notes ==== The write happens in a deferred update on a GET request; it presumably needs to be moved into a job. There's also a complaint from TransactionProfiler about `Expectation (masterConns <= 0) by MediaWiki\MediaWikiEntryPoint::restInPeace not met (actual: {actualSeconds}): {query} `
    • Task
    ==== Error ==== * mwversion: 1.43.0-wmf.22 * reqId: 4e888b69-4e53-4c93-bdc7-bb07de7848b2 * [[ https://logstash.wikimedia.org/app/dashboards#/view/AXFV7JE83bOlOASGccsT?_g=(time:(from:'2024-09-11T23:33:53.364Z',to:'2024-09-13T07:11:05.007Z'))&_a=(query:(query_string:(query:'reqId:%224e888b69-4e53-4c93-bdc7-bb07de7848b2%22'))) | Find reqId in Logstash ]] ```name=normalized_message,lines=10 Expectation (readQueryRows <= 10000) by MediaWiki\Actions\ActionEntryPoint::execute not met (actual: {actualSeconds}) in trx #{trxId}: {query} ``` ```name=exception.trace,lines=10 from /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/TransactionProfiler.php(534) #0 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/TransactionProfiler.php(327): Wikimedia\Rdbms\TransactionProfiler->reportExpectationViolated(string, Wikimedia\Rdbms\GeneralizedSql, int, string, string) #1 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/TransactionManager.php(593): Wikimedia\Rdbms\TransactionProfiler->recordQueryCompletion(Wikimedia\Rdbms\GeneralizedSql, float, bool, int, string, string) #2 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/Database.php(822): Wikimedia\Rdbms\TransactionManager->recordQueryCompletion(Wikimedia\Rdbms\GeneralizedSql, float, bool, int, string) #3 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/Database.php(707): Wikimedia\Rdbms\Database->attemptQuery(Wikimedia\Rdbms\Query, string, bool) #4 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/Database.php(634): Wikimedia\Rdbms\Database->executeQuery(Wikimedia\Rdbms\Query, string, int) #5 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/Database.php(1340): Wikimedia\Rdbms\Database->query(Wikimedia\Rdbms\Query, string) #6 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/DBConnRef.php(127): Wikimedia\Rdbms\Database->select(array, array, array, string, array, array) #7 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/database/DBConnRef.php(351): Wikimedia\Rdbms\DBConnRef->__call(string, array) #8 /srv/mediawiki/php-1.43.0-wmf.22/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(746): Wikimedia\Rdbms\DBConnRef->select(array, array, array, string, array, array) #9 /srv/mediawiki/php-1.43.0-wmf.22/extensions/GlobalBlocking/includes/Services/GlobalBlockLookup.php(297): Wikimedia\Rdbms\SelectQueryBuilder->fetchResultSet() #10 /srv/mediawiki/php-1.43.0-wmf.22/extensions/GlobalBlocking/includes/Services/GlobalBlockLookup.php(171): MediaWiki\Extension\GlobalBlocking\Services\GlobalBlockLookup->getGlobalBlockingBlock(string, int, int) #11 /srv/mediawiki/php-1.43.0-wmf.22/extensions/GlobalBlocking/includes/Services/GlobalBlockLookup.php(80): MediaWiki\Extension\GlobalBlocking\Services\GlobalBlockLookup->getUserBlockDetails(MediaWiki\User\User, string) #12 /srv/mediawiki/php-1.43.0-wmf.22/extensions/GlobalBlocking/includes/GlobalBlockingHooks.php(109): MediaWiki\Extension\GlobalBlocking\Services\GlobalBlockLookup->getUserBlock(MediaWiki\User\User, null) #13 /srv/mediawiki/php-1.43.0-wmf.22/includes/HookContainer/HookContainer.php(159): MediaWiki\Extension\GlobalBlocking\GlobalBlockingHooks->onGetUserBlock(MediaWiki\User\User, null, null) #14 /srv/mediawiki/php-1.43.0-wmf.22/includes/HookContainer/HookRunner.php(2001): MediaWiki\HookContainer\HookContainer->run(string, array) #15 /srv/mediawiki/php-1.43.0-wmf.22/includes/block/BlockManager.php(223): MediaWiki\HookContainer\HookRunner->onGetUserBlock(MediaWiki\User\User, null, null) #16 
/srv/mediawiki/php-1.43.0-wmf.22/includes/user/User.php(1450): MediaWiki\Block\BlockManager->getBlock(MediaWiki\User\User, null, bool) #17 /srv/mediawiki/php-1.43.0-wmf.22/includes/user/User.php(1526): MediaWiki\User\User->getBlock() #18 /srv/mediawiki/php-1.43.0-wmf.22/extensions/CheckUser/src/CheckUser/Pagers/CheckUserLogPager.php(174): MediaWiki\User\User->isHidden() #19 /srv/mediawiki/php-1.43.0-wmf.22/includes/pager/ReverseChronologicalPager.php(134): MediaWiki\CheckUser\CheckUser\Pagers\CheckUserLogPager->formatRow(stdClass) #20 /srv/mediawiki/php-1.43.0-wmf.22/includes/pager/IndexPager.php(595): MediaWiki\Pager\ReverseChronologicalPager->getRow(stdClass) #21 /srv/mediawiki/php-1.43.0-wmf.22/extensions/CheckUser/src/CheckUser/SpecialCheckUserLog.php(158): MediaWiki\Pager\IndexPager->getBody() #22 /srv/mediawiki/php-1.43.0-wmf.22/includes/specialpage/SpecialPage.php(719): MediaWiki\CheckUser\CheckUser\SpecialCheckUserLog->execute(null) #23 /srv/mediawiki/php-1.43.0-wmf.22/includes/specialpage/SpecialPageFactory.php(1708): MediaWiki\SpecialPage\SpecialPage->run(null) #24 /srv/mediawiki/php-1.43.0-wmf.22/includes/actions/ActionEntryPoint.php(502): MediaWiki\SpecialPage\SpecialPageFactory->executePath(string, MediaWiki\Context\RequestContext) #25 /srv/mediawiki/php-1.43.0-wmf.22/includes/actions/ActionEntryPoint.php(145): MediaWiki\Actions\ActionEntryPoint->performRequest() #26 /srv/mediawiki/php-1.43.0-wmf.22/includes/MediaWikiEntryPoint.php(200): MediaWiki\Actions\ActionEntryPoint->execute() #27 /srv/mediawiki/php-1.43.0-wmf.22/index.php(58): MediaWiki\MediaWikiEntryPoint->run() #28 /srv/mediawiki/w/index.php(3): require(string) #29 {main} ``` ==== Impact ==== Slow queries on `Special:CheckUserLog` for some old `cu_log` entries. ==== Notes ==== Appears to be caused by leading 0s in IP addresses used as the target for some old `cu_log` entries. Sanitising the IP address with `IPUtils::sanitizeIP` should fix the problem as it removes the leading 0s.
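Purely for illustration of the proposed fix, a rough Python equivalent of the leading-zero stripping that `IPUtils::sanitizeIP` would do for IPv4 targets (the real fix would call the existing PHP helper, which also handles IPv6 and ranges):
```python
def strip_leading_zeros_ipv4(ip: str) -> str:
    """Remove leading zeros from each octet of an IPv4 address,
    e.g. '127.000.000.001' -> '127.0.0.1'."""
    return '.'.join(str(int(octet)) for octet in ip.split('.'))

assert strip_leading_zeros_ipv4('127.000.000.001') == '127.0.0.1'
```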
    • Task
    IMPORTANT: This is not yet actionable by serviceops! A few compatibility changes on the MediaWiki REST endpoints remain to be coded/merged/deployed. This task is being created ahead of that work being completed to open conversation and allow scheduling of the rerouting. See the checklist below for status. NOTE: ServiceOps will be tagged once this task is reviewed and approved by MediaWiki Engineering. Summary: certain endpoints currently handled by RESTbase should be rerouted to equivalent MediaWiki REST endpoints. This will unblock this portion of RESTbase sunset. It is expected that callers will eventually be moved to new canonical urls, allowing these paths to be completely retired. However, new urls have not yet been defined. Implementing new urls and moving callers will take at least months, if not longer. See {T366835} for related discussion. === Mapping of production URLs to be routed to #mediawiki-rest-api --- === **Page and revision meta-data** 1. **Get revision metadata for a title** - **HTTP Verb:** `GET` - **Production Endpoint:** - `<domain>/api/rest_v1/page/title/{title}` - Example: https://en.wikipedia.org/api/rest_v1/page/title/Earth - **MW REST Endpoint:** - `<domain>/w/rest.php/v1/page/{title}/bare` - Example: https://en.wikipedia.org/w/rest.php/v1/page/Earth/bare 2. **Get revision metadata for a title, by revision id** - **HTTP Verb:** `GET` - **Production Endpoint:** - `<domain>/api/rest_v1/page/title/{title}/{revision}` - Example: https://en.wikipedia.org/api/rest_v1/page/title/Earth/1244226389 - **MW REST Endpoint:** - `<domain>/w/rest.php/v1/revision/{id}/bare` - Example: https://en.wikipedia.org/w/rest.php/v1/revision/1244226389/bare - Notes: - The `{revision}` parameter in the Production Endpoint corresponds to the `{id}` parameter in the MW REST Endpoint - The `{title}` parameter in the Production Endpoint does not have an equivalent in the MW REST Endpoint --- === **Rendered HTML** 3. **Get latest HTML for a title.** - **HTTP Verb:** `GET` - **Production Endpoint:** - `<domain>/api/rest_v1/page/html/{title}` - Example: https://en.wikipedia.org/api/rest_v1/page/html/Earth - **MW REST Endpoint:** - `<domain>/w/rest.php/v1/page/{title}/html` - Example: https://en.wikipedia.org/w/rest.php/v1/page/Earth/html 4. **Get HTML for a specific title/revision & optionally timeuuid.** - **HTTP Verb:** `GET` - **Production Endpoint:** - `<domain>/api/rest_v1/page/html/{title}/{revision}` - Example: https://en.wikipedia.org/api/rest_v1/page/html/Earth/1244226389 - **MW REST Endpoint:** - `<domain>/w/rest.php/v1/revision/{id}/html` - Example: https://en.wikipedia.org/w/rest.php/v1/revision/1244226389/html - Notes: - The `{revision}` parameter in the Production Endpoint corresponds to the `{id}` parameter in the MW REST Endpoint - The `{title}` parameter in the Production Endpoint does not have an equivalent in the MW REST Endpoint === Additional Configuration: All forwarded calls should include a `x-restbase-compat` header with a value of '`1`'. This instructs the MW REST endpoints to return RESTbase-compatible data. === Acceptance Criteria - [ ] This list is reviewed and approved by @MSantos, @HCoplin-WMF , and @daniel - [ ] This list is reviewed and approved by ServiceOps (will tag them once the first review phase is complete) - [ ] Endpoints are routed through REST Gateway
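Purely as an illustration of the mapping and header behaviour described above (not the actual gateway configuration), a sketch of what a rerouted call looks like from the caller's perspective; the function name and use of `requests` are assumptions for demonstration:
```python
import requests

# Hypothetical example of the rerouting described above:
# /api/rest_v1/page/html/{title}  ->  /w/rest.php/v1/page/{title}/html
def fetch_rerouted_html(domain: str, title: str) -> str:
    url = f'https://{domain}/w/rest.php/v1/page/{title}/html'
    # The gateway is expected to add this header so the MW REST endpoint
    # returns RESTbase-compatible output.
    resp = requests.get(url, headers={'x-restbase-compat': '1'})
    resp.raise_for_status()
    return resp.text

# fetch_rerouted_html('en.wikipedia.org', 'Earth')
```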
    • Task
    Currently, the bulk of a focus area page is in the `|description=` parameter of the [[https://meta.wikimedia.org/wiki/Template:Community_Wishlist/Focus_area/Full|{{Community Wishlist/Focus area}}]] template. This means that even when editing with VisualEditor, that part of the page must be edited as wikitext within the template dialog: {F57504437} Instead of this, it should be made possible to edit this largest part of the page directly in the editor.
    • Task
    I am running this code: ```python import pywikibot site = pywikibot.Site("wikidata", "wikidata") repo = site.data_repository() item_id = "Q64835" # Özcan Arkoç item = pywikibot.ItemPage(repo, item_id) if item.exists(): print(f"{item_id} exists") else: print(f"{item_id} doesn't exist") ``` It crashes on `item.exists()`: ``` Traceback (most recent call last): File "/home/amir/devel/pywikibot-core/pywikibot/tools/__init__.py", line 771, in wrapper return getattr(obj, cache_name) ^^^^^^^^^^^^^^^^^^^^^^^^ AttributeError: 'Claim' object has no attribute '_type'. Did you mean: 'type'? During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/amir/devel/pywikibot-core/exists.py", line 8, in <module> if item.exists(): ^^^^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/page/_wikibase.py", line 698, in exists self.get(get_redirect=True) File "/home/amir/devel/pywikibot-core/pywikibot/page/_wikibase.py", line 1165, in get data = super().get(force, *args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/page/_wikibase.py", line 737, in get data = WikibaseEntity.get(self, force=force) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/page/_wikibase.py", line 285, in get value = cls.fromJSON(self._content.get(key, {}), self.repo) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/page/_collections.py", line 218, in fromJSON this[key] = [pywikibot.page.Claim.fromJSON(repo, claim) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/page/_wikibase.py", line 1753, in fromJSON claim.sources.append(cls.referenceFromJSON(site, source)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/page/_wikibase.py", line 1777, in referenceFromJSON claim = cls.fromJSON(site, {'mainsnak': claimsnak, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/page/_wikibase.py", line 1742, in fromJSON if claim.type in cls.types: ^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/tools/__init__.py", line 773, in wrapper val = fn(obj) ^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/page/_wikibase.py", line 1427, in type return self.repo.getPropertyType(self) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/amir/devel/pywikibot-core/pywikibot/site/_datasite.py", line 278, in getPropertyType raise KeyError(f'{prop} does not exist') KeyError: "Claim(DataSite('wikidata', 'wikidata'), 'P6615') does not exist" CRITICAL: Exiting due to uncaught exception KeyError: "Claim(DataSite('wikidata', 'wikidata'), 'P6615') does not exist" ``` The same code outputs "Q64834 exists", "Q64836 exists", and "Q7 doesn't exist" when running with these Q numbers, but [[ https://www.wikidata.org/wiki/Q64835 | Q64835 ]] causes it to crash. Maybe it happens because [[ https://www.wikidata.org/wiki/Property:P6615 | property 6615 ]], which was used in a statement on that item, was deleted a couple of months ago. The [[ https://www.wikidata.org/w/index.php?title=Q64835&diff=prev&oldid=2190834717 | statement was deleted, too ]], and it's not even the latest edit on the item. In any case, such a thing probably shouldn't cause the whole script to crash.
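Until this is fixed in Pywikibot, a caller-side workaround might look like the sketch below; treating the item as existing when the `KeyError` is raised is an assumption that may or may not be acceptable for a given use case:
```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q64835")

try:
    exists = item.exists()
except KeyError as e:
    # Reached when a property referenced on the item (here P6615) has been
    # deleted; assume the item exists but its claims are partially unreadable.
    print(f"Could not fully load claims: {e}")
    exists = True
print("Q64835 exists" if exists else "Q64835 doesn't exist")
```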
    • Task
**Steps to replicate the issue** (include links if applicable): * go to https://commons.wikimedia.org/wiki/Special:UploadWizard?campaign=wlm-it * upload several media files via the UploadWizard * in the first file, add a caption and check "Same as caption" for the description, add a category and add the Monument ID (for example 07I2250005) * in "Copy information to other uploads", check "caption", "description", "categories", "other information" but not "title" * click the button "copy selections to other uploads" * (imagine making small changes to each description; this is not necessary to reproduce the bug) * now, you want to create a standard title for your photo series, so * write a significant title in the first file * in "Copy information to other uploads", check "title" and uncheck "caption", "description", "categories", "other information" **What happens?**: UploadWizard hides/deletes the captions and descriptions already copied to other uploads, and it is impossible to add or modify them again (the form to fill out disappears) **What should have happened instead?**: Because they were already copied, the captions and descriptions should not disappear from the other uploads. When I uncheck them in "Copy information to other uploads", my goal is to add other information while keeping the information already added, not to delete the previous information. Unchecking a box should not coincide with deleting the information. **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.): Windows 10 Home version 22H2, browser Firefox 126.0 (64 bit) before clicking on "Copy information to other uploads" the second time: {F57504322} after clicking on "Copy information to other uploads" the second time: {F57504329}
    • Task
    == Background We should continue to roll out dark mode as projects are ready. We have been checking this on a monthly basis. == User story As a project admin who has been updating my project, I'd like to see dark mode enabled for everyone as a result of my actions == Requirements [] Review green results on "tier 3" https://night-mode-checker.wmcloud.org/ [] Promote any sites that match our criteria. === BDD - For QA engineer to fill out === Test Steps - For QA engineer to fill out == Design - Add mockups and design requirements == Acceptance criteria - Add acceptance criteria == Communication criteria - does this need an announcement or discussion? - Add communication criteria == Rollback plan - What is the rollback plan in production for this task if something goes wrong? //This task was created by Version 1.2.0 of the [[ https://mediawiki.org/w/index.php?title=Reading/Web/Request_process | Web team task template ]] using [[ https://phabulous.toolforge.org/ | phabulous ]] //
    • Task
    All RSS items in the watchlist and other MediaWiki feeds have the same URL. For example, my watchlist on enwikipedia starts with: ```lang=xml <channel> <title>Wikipedia - Watchlist [en]</title> <link>https://en.wikipedia.org/wiki/Special:Watchlist</link> <description>Watchlist</description> <language>en</language> <generator>MediaWiki 1.43.0-wmf.22</generator> <lastBuildDate>Fri, 13 Sep 2024 00:49:28 GMT</lastBuildDate> <item> <title>Wikipedia:Village pump (technical)</title> <link>https://en.wikipedia.org/wiki/Special:Watchlist</link> <guid isPermaLink="false">https://en.wikipedia.org/w/index.php?title=Wikipedia:Village_pump_(technical)&amp;diff=1245437892</guid> <description>/* New pages highlight color changed */ new section (Maile66)</description> <pubDate>Fri, 13 Sep 2024 00:25:20 GMT</pubDate> <dc:creator>Maile66</dc:creator> </item> <item> <title>Wikipedia:Village pump (technical)</title> <link>https://en.wikipedia.org/wiki/Special:Watchlist</link> <guid isPermaLink="false">https://en.wikipedia.org/w/index.php?title=Wikipedia:Village_pump_(technical)&amp;diff=1245435874</guid> <description>/* IP editor(s) cannot edit talk pages */ Reply (Swan2024)</description> <pubDate>Fri, 13 Sep 2024 00:11:23 GMT</pubDate> <dc:creator>Swan2024</dc:creator> </item> ``` The `<link>https://en.wikipedia.org/wiki/Special:Watchlist</link>` should have the URL of the Village pump rather than the Watchlist (the latter is correctly the feed's `<link>` value). Atom feeds do not have this bug.
    • Task
    I'm running this code: ```python import pywikibot site = pywikibot.Site("wikidata", "wikidata") repo = site.data_repository() item_id = "Q254" # Wolfgang Amadeus Mozart item = pywikibot.ItemPage(repo, item_id) if item.exists(): print(f"item {item_id} exists") item.get() ``` It produces these warnings at `item.exists()` and `item.get()`: ``` WARNING: entity-schema datatype is not supported yet. ``` The reason may be that https://www.wikidata.org/wiki/Q254 has a statement with the property [[ https://www.wikidata.org/wiki/Property:P12861 | EntitySchema for this class ]]. However, showing this warning at .exists() and .get() looks like an exaggeration, because these methods don't seem to do anything with the statement with this property.
    • Task
    ####User Story: As the Growth team Product Manager, I want to be sure our instrumentation is working as expected, because then I can track the metrics as outlined in our [[ https://docs.google.com/spreadsheets/d/1tSjGZPL9wMe1Egbq-YLL4MNoiqRBaK8g_Q84ukJuszI/edit?gid=847167629#gid=847167629 | Community Updates Instrumentation Specs]]. ####Acceptance Criteria: Given the Community Updates module is available on Test Wiki, When users test the module, Then the following metrics are tracked as outlined in the Instrumentation Specs: - Number of users who visited the Homepage during the experiment - Number of impressions of the Community Updates module - CTR of Community Updates
    • Task
=== How many times were you able to reproduce it? Every time I cloned the repo and set it up (more than 3 times) === Steps to reproduce # Clone the repository. # Open the project in Xcode (I am using Xcode version 15.0.1). # Attempt to build and run the project. === Expected results The project should build and run successfully. === Actual results The build fails with the following errors: Inheritance from non-protocol type 'any NSCopying'. Unknown attribute 'retroactive'. === Screenshots {F57503972}
    • Task
Hi all, I will be on vacation from September 16 - September 27, 2024. Rachel Stallman has agreed to cover any NDA requests while I'm away. I'll be back at work on September 30, 2024. Please route NDA requests to Rachel during the time frame I'll be OOO. Thanks so much!
    • Task
    == Common information * **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status * **description**: jenkins.service on releases1003:9100 * **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state * **summary**: jenkins.service on releases1003:9100 * **alertname**: SystemdUnitFailed * **instance**: releases1003:9100 * **name**: jenkins.service * **prometheus**: ops * **severity**: critical * **site**: eqiad * **source**: prometheus * **team**: collaboration-services == Firing alerts --- * **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status * **description**: jenkins.service on releases1003:9100 * **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state * **summary**: jenkins.service on releases1003:9100 * **alertname**: SystemdUnitFailed * **instance**: releases1003:9100 * **name**: jenkins.service * **prometheus**: ops * **severity**: critical * **site**: eqiad * **source**: prometheus * **team**: collaboration-services * [Source](https://prometheus-eqiad.wikimedia.org/ops/graph?g0.expr=%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%2C+name%29+group_left+%28team%29+systemd_unit_owner%29+%3D%3D+1+or+ignoring+%28team%29+%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%29+group_left+%28team%29+role_owner%7Bteam%21%3D%22wmcs%22%7D%29+%3D%3D+1&g0.tab=1)
    • Task
    We'd like to know which recommendation engine performed best in our non-UI testing. [[ https://docs.google.com/spreadsheets/d/1bxVj-hTc0zz2-jri-X3cEdsYrVWic1R_ERZ_kef14E0/edit?usp=sharing | Spanish survey results ]] [[ https://docs.google.com/spreadsheets/d/1OZz_YOw-FBEKYCxXMWl-tq_z2LjyX-bmA-IHr_w4gd0/edit?usp=sharing | English survey results ]] Some questions we'd like to ask the data: - Were there preferences for different APIs depending on reading frequency? - Is there a correlation between people who chose "none of the above" and their stated likelihood to read about a topic outside the study? - Is there a clear "winner" between different recommendation engines?
    • Task
== Background We are [[ https://phabricator.wikimedia.org/T373039 | running a survey ]] against the output of three different recommendation engines: - morelike - vectorsearch - listreader Assuming that none of these recommendation sources are ideal, I'd like to run a follow-up survey with the "winner" of the first experiment and two other recommendation sources: # The "See Also" links from the articles # The [[ https://superlative-jalebi-8c2fbb.netlify.app/ | Most Linked API ]] The goal is to see how the performance of a recommendation engine based on an article's position in the knowledge network and hand-curated recommendations from community contributors compares to the article-content-based APIs we tested in the first experiment. == User story As a member of the web team, I want to know if any other sources of recommendations outperform the 3 APIs in the non-UI experiment. == Requirements - Generate top 3 articles for the two new recommendation sources - Create EN and ES surveys following the same format with that new content - Point the links from Quicksurveys to the new surveys === BDD - For QA engineer to fill out === Test Steps - For QA engineer to fill out == Design n/a == Acceptance criteria - Add acceptance criteria == Communication criteria - does this need an announcement or discussion? - Add communication criteria == Rollback plan - What is the rollback plan in production for this task if something goes wrong? //This task was created by Version 1.2.0 of the [[ https://mediawiki.org/w/index.php?title=Reading/Web/Request_process | Web team task template ]] using [[ https://phabulous.toolforge.org/ | phabulous ]] //
    • Task
    **Build**: 7.5.9 (4049) **Due Date**: Sept 18th **Non-EN** [x] translations pulled in from TWN [] smoke test in non-Latin language **Other** [] smoke test [] beta test [] tablet display [] VoiceOver
    • Task
    **Release version**: **Release tag**: **Release SHA1**: **App Store Submission Date**: **App Store Approval Date**: **App Store Release Date**: --- **Pre-release** [ ] If needed: Gather new screenshots & app icons from design for App Store Connect, add to project. [ ] If needed: Get "What's new" release notes from Product. Otherwise use "Minor updates and enhancements." [ ] Confirm release build was expanded to TestFlight external testers group, with no major issues [ ] If needed: Post to mobile-l when release build is expanded to external testers [ ] If needed: File subtask for smoke tests and notify TSG with release candidate build number [ ] If needed: Update contributors list in project [ ] If needed: Update "Licenses" section (for any new third-party libraries, fonts, images, etc) in project [ ] If needed: Review Regression testing report, triage bugs, and identify any blockers **Day of submit** [ ] If needed: Upload new screenshots to App Store Connect [ ] If needed: Translate "What's new" release notes via Google Translate, copy & paste into each App Store language, otherwise paste "Minor updates and enhancements" translations from [[https://docs.google.com/document/d/1W0QIM4AumND64nVmn4-sWeOQNODHCpXyhfVBkV7ZE7E/edit#heading=h.r5tvupqekyzr | release document]]. [ ] Install previous (active) App Store version, then update to release candidate via TestFlight and verify app load [ ] Submit release candidate on App Store Connect for App Store review with automatic release disabled **Day of approval** [ ] Release to App Store (NOT on a Friday!) Post-release: [ ] Update [[https://www.mediawiki.org/wiki/Wikimedia_Apps/Team/iOS/Release_history | release notes]] once released
    • Task
####User story & summary: As a Wikimedian interested in supporting a new generation of editors, I want to know about the Community Updates module, so I can use it when appropriate. ####Project details: https://www.mediawiki.org/wiki/Growth/Community_Updates Configuration form: [[ https://en.wikipedia.beta.wmflabs.org/wiki/Special:CommunityConfiguration/CommunityUpdates | Special:CommunityConfiguration/CommunityUpdates ]] ####Acceptance Criteria: [] Create a brief Help page for the Community Updates module - Drafted at: https://www.mediawiki.org/wiki/Help:Growth/Tools/Community_updates_module [] Add the Help page to MediaWiki.org [x] Integrate into https://www.mediawiki.org/wiki/Help:Growth/Tools [x] Integrate into Help:Community Configuration (added as a commented element [[ https://www.mediawiki.org/w/index.php?title=Help:Community_Configuration&diff=prev&oldid=6752261 | diff ]]) (You can use [[ https://docs.google.com/document/d/1Ti461Q6kcLwiXoHXfbg_A1NHjRBSZSIrcVUAUdkaVmc/edit | this draft ]] to start, or create a different help doc).
    • Task
####User story & summary: //Specific user story:// As the Growth team Product Manager, I want to be able to test this new feature on our pilot wikis (Arabic & Spanish Wikipedia), so we can make an informed decision about making improvements, scaling to other wikis, or removing the feature. //Broader user story: //As a new editor visiting my Newcomer homepage, I want to see community updates relevant to new editors, so that I can deepen my involvement in the wikis. ####Project details: This task covers scaling the new [[ https://www.mediawiki.org/wiki/Community_configuration | configurable ]] "[[ https://www.mediawiki.org/wiki/Growth/Community_Updates | Community Updates ]]" module for the [[ https://www.mediawiki.org/wiki/Growth/Personalized_first_day/Newcomer_homepage | Newcomer homepage ]] to Growth's pilot wikis. - Project page: https://www.mediawiki.org/wiki/Growth/Community_Updates - Release plan: {T372840} - Epic: {T360485} ####Acceptance Criteria: [] Release the Community Updates module to Growth Pilot Wikis (Arabic & Spanish Wikipedia) NOTE: the Community Updates module configuration form should be "Turned off" upon release, so the only user-facing changes initially visible will be the new item on the Community Configuration dashboard (Special:CommunityConfiguration) and the empty form visible via Special:CommunityConfiguration/CommunityUpdates.
    • Task
**Feature summary** (what you would like to be able to do and where): Add a new option to hide indirect transclusions, preferably both in WhatLinksHere and in Cirrus search. **Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution): I want to check how a certain template is used. For example, I want to check whether Hatnote is transcluded directly in some articles. I use WhatLinksHere: https://en.wikipedia.org/wiki/Special:WhatLinksHere?target=Template%3AHatnote&namespace=&hidelinks=1&hideredirs=1 This, however, shows all transclusions, not just direct transclusions. No luck here. I can search with insource, which is partially better, but it also shows {{hatnote group}}: https://en.wikipedia.org/w/index.php?search=insource%3AHatnote++hastemplate%3AHatnote&title=Special:Search&profile=advanced&fulltext=1&advancedSearch-current=%7B%22fields%22%3A%7B%22hastemplate%22%3A%5B%22Hatnote%22%5D%7D%7D&ns0=1&searchToken=b13ftph0d34jafwmh71y16jsi There just seems to be no easy way to find actual template usage. I can sometimes use a regexp, but that is slow and has other risks (see the sketch after this report). **Benefits** (why should this be implemented?): Helpful for: - Template editors. - People looking for example usages of templates.
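For reference, the insource + hastemplate workaround mentioned in the use case can also be run through the MediaWiki search API; this sketch is only an approximation of "direct" transclusion and still has the {{hatnote group}} false-positive problem described above:
```python
import requests

# Approximation of the on-wiki search linked above, via the search API.
params = {
    'action': 'query',
    'list': 'search',
    'srsearch': 'insource:"Hatnote" hastemplate:"Hatnote"',
    'srnamespace': 0,
    'srlimit': 50,
    'format': 'json',
}
resp = requests.get('https://en.wikipedia.org/w/api.php', params=params)
for hit in resp.json()['query']['search']:
    print(hit['title'])
```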
    • Task
In the parent ticket we are debugging a problem with missing events from edit requests. The source of this problem appears to be that a request ran for multiple hours in post-send, and many (all?) of the deferreds timed out. To see how common this is, I wrote a script to query reqIds that logged `EmergencyTimeoutException`, and then did an aggregation query that filtered logs for the same (host, reqId) combo (to exclude jobs that reuse reqId) and reported the delta between the earliest and latest log message. This reports requests with > 10 minutes between start and end (a simplified sketch of the aggregation is below). Script: P69109 Results for Sep 1 - 11: P69110 It's not a crazy number of requests per day, on average < 10 with 20 on the worst day, but we have multiple requests per day that run for 2+ hours. The longest request runs for 173 minutes. I should note that this depends on the initial request logging something. If the initial request didn't log anything and the logs start at the timeout, it is likely not included here. Perhaps `EmergencyTimeoutException` could be adjusted to report the current request runtime in the error message to give more concrete information.
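The actual script is in P69109; purely as an illustration of the aggregation described above (group log lines by (host, reqId) and report the spread between the first and last timestamp), a simplified sketch over an exported set of log records might look like this, where the record shape is an assumption:
```python
from collections import defaultdict
from datetime import datetime, timedelta

def long_running_requests(records, threshold=timedelta(minutes=10)):
    """Yield (host, reqId) pairs whose log lines span more than `threshold`.

    `records` is assumed to be an iterable of dicts with 'host', 'reqId' and
    an ISO-8601 '@timestamp' field, e.g. parsed from an exported Logstash query.
    """
    spans = defaultdict(list)
    for rec in records:
        ts = datetime.fromisoformat(rec['@timestamp'].replace('Z', '+00:00'))
        spans[(rec['host'], rec['reqId'])].append(ts)
    for key, stamps in spans.items():
        delta = max(stamps) - min(stamps)
        if delta > threshold:
            yield key, delta
```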
    • Task
    **Steps to replicate the issue** (include links if applicable): * Visit https://en.wikipedia.beta.wmflabs.org/w/index.php?title=User:RoanKattouw/sandbox * Click edit and load VE * **What happens?**: {F57503839} I'm seeing the inlined SVG. **What should have happened instead?**: The chart should render and be easy to edit. **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
    The following test failure is blocking merges in WikimediaMessages on this patch: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/WikimediaMessages/+/1071984 ``` 11:50:15 mw-error.log:2024-09-12 18:29:00 1a43f383625e wikidb: [e298a95dce5ecb7f35eb7ab1] [no req] Wikimedia\Assert\ParameterTypeException: Bad value for parameter $repoDomain: must be a string 11:50:15 mw-error.log:#0 /workspace/src/extensions/Wikibase/lib/includes/Rdbms/RepoDomainDbFactory.php(37): Wikimedia\Assert\Assert::parameterType() 11:50:15 mw-error.log:#1 /workspace/src/extensions/Wikibase/client/WikibaseClient.ServiceWiring.php(844): Wikibase\Lib\Rdbms\RepoDomainDbFactory->__construct() 11:50:15 mw-error.log:#2 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(447): MediaWikiIntegrationTestCase::{closure}() 11:50:15 mw-error.log:#3 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(411): Wikimedia\Services\ServiceContainer->createService() 11:50:15 mw-error.log:#4 /workspace/src/includes/MediaWikiServices.php(355): Wikimedia\Services\ServiceContainer->getService() 11:50:15 mw-error.log:#5 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(419): MediaWiki\MediaWikiServices->getService() 11:50:15 mw-error.log:#6 /workspace/src/extensions/Wikibase/client/includes/WikibaseClient.php(523): Wikimedia\Services\ServiceContainer->get() 11:50:15 mw-error.log:#7 /workspace/src/extensions/Wikibase/client/WikibaseClient.ServiceWiring.php(1095): Wikibase\Client\WikibaseClient::getRepoDomainDbFactory() 11:50:15 mw-error.log:#8 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(447): MediaWikiIntegrationTestCase::{closure}() 11:50:15 mw-error.log:#9 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(411): Wikimedia\Services\ServiceContainer->createService() 11:50:15 mw-error.log:#10 /workspace/src/includes/MediaWikiServices.php(355): Wikimedia\Services\ServiceContainer->getService() 11:50:15 mw-error.log:#11 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(419): MediaWiki\MediaWikiServices->getService() 11:50:15 mw-error.log:#12 /workspace/src/extensions/Wikibase/client/includes/WikibaseClient.php(150): Wikimedia\Services\ServiceContainer->get() 11:50:15 mw-error.log:#13 /workspace/src/extensions/Wikibase/client/WikibaseClient.ServiceWiring.php(984): Wikibase\Client\WikibaseClient::getWikibaseServices() 11:50:15 mw-error.log:#14 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(447): MediaWikiIntegrationTestCase::{closure}() 11:50:15 mw-error.log:#15 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(411): Wikimedia\Services\ServiceContainer->createService() 11:50:15 mw-error.log:#16 /workspace/src/includes/MediaWikiServices.php(355): Wikimedia\Services\ServiceContainer->getService() 11:50:15 mw-error.log:#17 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(419): MediaWiki\MediaWikiServices->getService() 11:50:15 mw-error.log:#18 /workspace/src/extensions/Wikibase/client/includes/WikibaseClient.php(215): Wikimedia\Services\ServiceContainer->get() 11:50:15 mw-error.log:#19 /workspace/src/extensions/Wikibase/client/WikibaseClient.ServiceWiring.php(407): Wikibase\Client\WikibaseClient::getStore() 11:50:15 mw-error.log:#20 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(447): MediaWikiIntegrationTestCase::{closure}() 11:50:15 mw-error.log:#21 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(411): Wikimedia\Services\ServiceContainer->createService() 11:50:15 
mw-error.log:#22 /workspace/src/includes/MediaWikiServices.php(355): Wikimedia\Services\ServiceContainer->getService() 11:50:15 mw-error.log:#23 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(419): MediaWiki\MediaWikiServices->getService() 11:50:15 mw-error.log:#24 /workspace/src/extensions/Wikibase/client/includes/WikibaseClient.php(160): Wikimedia\Services\ServiceContainer->get() 11:50:15 mw-error.log:#25 /workspace/src/extensions/Wikibase/client/WikibaseClient.ServiceWiring.php(872): Wikibase\Client\WikibaseClient::getEntityLookup() 11:50:15 mw-error.log:#26 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(447): MediaWikiIntegrationTestCase::{closure}() 11:50:15 mw-error.log:#27 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(411): Wikimedia\Services\ServiceContainer->createService() 11:50:15 mw-error.log:#28 /workspace/src/includes/MediaWikiServices.php(355): Wikimedia\Services\ServiceContainer->getService() 11:50:15 mw-error.log:#29 /workspace/src/vendor/wikimedia/services/src/ServiceContainer.php(419): MediaWiki\MediaWikiServices->getService() 11:50:15 mw-error.log:#30 /workspace/src/vendor/wikimedia/object-factory/src/ObjectFactory.php(204): Wikimedia\Services\ServiceContainer->get() 11:50:15 mw-error.log:#31 /workspace/src/vendor/wikimedia/object-factory/src/ObjectFactory.php(149): Wikimedia\ObjectFactory\ObjectFactory::getObjectFromSpec() 11:50:15 mw-error.log:#32 /workspace/src/includes/HookContainer/HookContainer.php(256): Wikimedia\ObjectFactory\ObjectFactory->createObject() 11:50:15 mw-error.log:#33 /workspace/src/includes/HookContainer/HookContainer.php(348): MediaWiki\HookContainer\HookContainer->makeExtensionHandlerCallback() 11:50:15 mw-error.log:#34 /workspace/src/includes/HookContainer/HookContainer.php(524): MediaWiki\HookContainer\HookContainer->normalizeHandler() 11:50:15 mw-error.log:#35 /workspace/src/includes/HookContainer/HookContainer.php(146): MediaWiki\HookContainer\HookContainer->getHandlers() 11:50:15 mw-error.log:#36 /workspace/src/includes/HookContainer/HookRunner.php(2983): MediaWiki\HookContainer\HookContainer->run() 11:50:15 mw-error.log:#37 /workspace/src/includes/parser/Parser.php(653): MediaWiki\HookContainer\HookRunner->onParserClearState() 11:50:15 mw-error.log:#38 /workspace/src/includes/parser/Parser.php(4929): MediaWiki\Parser\Parser->clearState() 11:50:15 mw-error.log:#39 /workspace/src/includes/parser/Parser.php(696): MediaWiki\Parser\Parser->startParse() 11:50:15 mw-error.log:#40 /workspace/src/includes/language/MessageCache.php(1533): MediaWiki\Parser\Parser->parse() 11:50:15 mw-error.log:#41 /workspace/src/includes/Message/Message.php(1492): MessageCache->parse() 11:50:15 mw-error.log:#42 /workspace/src/includes/Message/Message.php(1059): MediaWiki\Message\Message->parseText() 11:50:15 mw-error.log:#43 /workspace/src/includes/Message/Message.php(1087): MediaWiki\Message\Message->format() 11:50:15 mw-error.log:#44 [internal function]: MediaWiki\Message\Message->__toString() 11:50:15 mw-error.log:#45 /workspace/src/extensions/WikimediaMessages/includes/SiteAdminHelperModule.php(69): explode() 11:50:15 mw-error.log:#46 /workspace/src/includes/ResourceLoader/FileModule.php(410): MediaWiki\Extension\WikimediaMessages\SiteAdminHelperModule->getStyleFiles() 11:50:15 mw-error.log:#47 /workspace/src/includes/ResourceLoader/Module.php(853): MediaWiki\ResourceLoader\FileModule->getStyles() 11:50:15 mw-error.log:#48 /workspace/src/includes/ResourceLoader/Module.php(812): 
MediaWiki\ResourceLoader\Module->buildContent() 11:50:15 mw-error.log:#49 /workspace/src/includes/ResourceLoader/ResourceLoader.php(1268): MediaWiki\ResourceLoader\Module->getModuleContent() 11:50:15 mw-error.log:#50 /workspace/src/includes/ResourceLoader/ResourceLoader.php(1192): MediaWiki\ResourceLoader\ResourceLoader->addOneModuleResponse() 11:50:15 mw-error.log:#51 /workspace/src/includes/ResourceLoader/ResourceLoader.php(1110): MediaWiki\ResourceLoader\ResourceLoader->getOneModuleResponse() 11:50:15 mw-error.log:#52 /workspace/src/tests/phpunit/structure/BundleSizeTestBase.php(81): MediaWiki\ResourceLoader\ResourceLoader->makeModuleResponse() 11:50:15 mw-error.log:#53 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestCase.php(1617): MediaWiki\Tests\Structure\BundleSizeTestBase->testBundleSize() 11:50:15 mw-error.log:#54 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestCase.php(1223): PHPUnit\Framework\TestCase->runTest() 11:50:15 mw-error.log:#55 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestResult.php(729): PHPUnit\Framework\TestCase->runBare() 11:50:15 mw-error.log:#56 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestCase.php(973): PHPUnit\Framework\TestResult->run() 11:50:15 mw-error.log:#57 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestSuite.php(685): PHPUnit\Framework\TestCase->run() 11:50:15 mw-error.log:#58 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestSuite.php(685): PHPUnit\Framework\TestSuite->run() 11:50:15 mw-error.log:#59 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestSuite.php(685): PHPUnit\Framework\TestSuite->run() 11:50:15 mw-error.log:#60 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestSuite.php(685): PHPUnit\Framework\TestSuite->run() 11:50:15 mw-error.log:#61 /workspace/src/vendor/phpunit/phpunit/src/Framework/TestSuite.php(685): PHPUnit\Framework\TestSuite->run() 11:50:15 mw-error.log:#62 /workspace/src/vendor/phpunit/phpunit/src/TextUI/TestRunner.php(651): PHPUnit\Framework\TestSuite->run() 11:50:15 mw-error.log:#63 /workspace/src/vendor/phpunit/phpunit/src/TextUI/Command.php(146): PHPUnit\TextUI\TestRunner->run() 11:50:15 mw-error.log:#64 /workspace/src/vendor/phpunit/phpunit/src/TextUI/Command.php(99): PHPUnit\TextUI\Command->run() 11:50:15 mw-error.log:#65 phpvfscomposer:///workspace/src/vendor/phpunit/phpunit/phpunit(106): PHPUnit\TextUI\Command::main() 11:50:15 mw-error.log:#66 /workspace/src/vendor/bin/phpunit(118): include(string) 11:50:15 mw-error.log:#67 {main} 11:50:15 + set -e 11:50:15 + echo -e 'MediaWiki emitted some errors. Check output above.' 11:50:15 MediaWiki emitted some errors. Check output above. 11:50:15 + exit 1 11:50:15 Build step 'Execute shell' marked build as failure 11:50:15 [PostBuildScript] - [INFO] Executing post build scripts. 11:50:15 [wmf-quibble-vendor-mysql-php74] $ /bin/bash /tmp/jenkins3798968297428817702.sh 11:50:15 + set -o pipefail 11:50:15 ++ pwd 11:50:15 + '[' '!' -d /srv/jenkins/workspace/wmf-quibble-vendor-mysql-php74/log ']' 11:50:15 ++ pwd 11:50:15 + exec docker run --entrypoint=/bin/rm --volume /srv/jenkins/workspace/wmf-quibble-vendor-mysql-php74/log:/log --security-opt seccomp=unconfined --init --rm --label jenkins.job=wmf-quibble-vendor-mysql-php74 --label jenkins.build=17712 --env-file /dev/fd/63 docker-registry.wikimedia.org/buster:latest -fR /log/rawSeleniumVideoGrabs 11:50:15 ++ /usr/bin/env 11:50:15 ++ egrep -v '^(HOME|SHELL|PATH|LOGNAME|MAIL)=' 11:50:15 [PostBuildScript] - [INFO] Executing post build scripts. 
11:50:15 [wmf-quibble-vendor-mysql-php74] $ /bin/bash -xe /tmp/jenkins2937141356033878064.sh 11:50:15 + find log/ -name 'mw-debug-*.log' -exec gzip '{}' + ```
    • Task
    **Steps to replicate the issue**: * Visit https://doc.wikimedia.org/codex/latest/components/demos/chip-input.html#configurable * Turn on `separateInput` and `disabled` **What happens?**: The input element is still white {F57503769} **What should have happened instead?**: It should be `background-colored-disabled` like the rest of the component {F57503772}
    • Task
    In T369868#10142190, we made a benchmark on top of `wmf_dumps.wikitext_raw`, and found out the following query planning times: ``` # With 8 driver cores, 66k files scanned, read 47M rows, 148k splits, 4.4GB # Elapsed time: 303.5358393192291 seconds (120 query, 183 planning) # Elapsed time: 301.08918023109436 seconds (102 query, 199 planning) # Elapsed time: 286.9110209941864 seconds (90 query, 196 planning) # Elapsed time: x seconds (x query, x planning) for i in range(0,3): start_time = time.time() spark.sql(""" SELECT count(1) as count FROM wmf_dumps.wikitext_raw_rc2 WHERE (wiki_db = 'enwiki' AND page_id IN (12924534, 29687986, 35328557, 73692977, 74530252, 74530254, 75962364, 75962367, 75971447, 75971928, 75977325, 75985100, 75985872, 76310582, 76310588, 76310589, 77328875, 77478992, 77479936, 77480638, 77480639, 77486461, 77486512, 77488096, 77488322, 77488407, 77488486)) """).show(20) end_time = time.time() print(f"Elapsed time: {end_time - start_time} seconds") ``` ~190 seconds is a lot of time spent on query planning, especially considering the actual query time is ~100. A quick investigation shows that: 1) The Iceberg java threadpool is doing the right thing [[ https://github.com/apache/iceberg/blob/8e9d59d299be42b0bca9461457cd1e95dbaad086/core/src/main/java/org/apache/iceberg/SystemConfigs.java#L38-L43 | as per code ]]. I can confirm this on a thread dump. 2) Most all the Iceberg threadpool threads are waiting on: ``` java.lang.Object.wait(Native Method) java.lang.Object.wait(Object.java:502) org.apache.hadoop.util.concurrent.AsyncGet$Util.wait(AsyncGet.java:59) org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1600) org.apache.hadoop.ipc.Client.call(Client.java:1558) org.apache.hadoop.ipc.Client.call(Client.java:1455) org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242) org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:129) com.sun.proxy.$Proxy30.getBlockLocations(Unknown Source) org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:333) sun.reflect.GeneratedMethodAccessor39.invoke(Unknown Source) sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) java.lang.reflect.Method.invoke(Method.java:498) org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) => holding Monitor(org.apache.hadoop.io.retry.RetryInvocationHandler$Call@768716090}) org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) com.sun.proxy.$Proxy31.getBlockLocations(Unknown Source) org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:900) org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:889) org.apache.hadoop.hdfs.DFSClient.getBlockLocations(DFSClient.java:946) org.apache.hadoop.hdfs.DistributedFileSystem$2.doCall(DistributedFileSystem.java:288) org.apache.hadoop.hdfs.DistributedFileSystem$2.doCall(DistributedFileSystem.java:285) org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) org.apache.hadoop.hdfs.DistributedFileSystem.getFileBlockLocations(DistributedFileSystem.java:295) 
org.apache.iceberg.hadoop.HadoopInputFile.getBlockLocations(HadoopInputFile.java:210) org.apache.iceberg.hadoop.Util.blockLocations(Util.java:111) org.apache.iceberg.hadoop.Util.blockLocations(Util.java:84) org.apache.iceberg.spark.source.SparkInputPartition.<init>(SparkInputPartition.java:62) org.apache.iceberg.spark.source.SparkBatch.lambda$planInputPartitions$0(SparkBatch.java:90) org.apache.iceberg.spark.source.SparkBatch$$Lambda$2759/410160600.run(Unknown Source) org.apache.iceberg.util.Tasks$Builder.runTaskWithRetry(Tasks.java:413) org.apache.iceberg.util.Tasks$Builder.access$300(Tasks.java:69) org.apache.iceberg.util.Tasks$Builder$1.run(Tasks.java:315) java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) java.util.concurrent.FutureTask.run(FutureTask.java:266) java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) java.lang.Thread.run(Thread.java:750) ``` This points to HDFS's `org.apache.hadoop.hdfs.DistributedFileSystem.getFileBlockLocations()` being a bottleneck. In this task we should: [] Figure out if this speculation is correct. [] Figure out whether we can tune the HDFS client to utilize more network and/or more cores. We can see from the HDFS Namenode that there are no spikes coming from our query, so we should be able to squeeze more bytes out of it. [] Explore the Iceberg metadata caching mechanism as suggested by @JAllemandou and figure out if it would be beneficial for us (see the config sketch below the checklist): [] Blog: https://blog.cloudera.com/12-times-faster-query-planning-with-iceberg-manifest-caching-in-impala/ [] Iceberg MR: https://github.com/apache/iceberg/pull/4518/files
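If the manifest caching route looks promising, enabling it should roughly amount to setting Iceberg catalog properties on the Spark session, along the lines of the sketch below; the catalog name and the cache sizes are assumptions, not a tested configuration, and the property names should be checked against the Iceberg manifest caching documentation:
```python
from pyspark.sql import SparkSession

# Hypothetical session config; 'spark_catalog' stands in for whatever
# catalog name wmf_dumps is registered under in our Spark setup.
spark = (
    SparkSession.builder
    .config("spark.sql.catalog.spark_catalog.io.manifest.cache-enabled", "true")
    .config("spark.sql.catalog.spark_catalog.io.manifest.cache.expiration-interval-ms", str(10 * 60 * 1000))
    .config("spark.sql.catalog.spark_catalog.io.manifest.cache.max-total-bytes", str(256 * 1024 * 1024))
    .getOrCreate()
)
```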
    • Task
    After enabling ERROR logging for all channels on the beta cluster (T228838#10137684), these new errors appeared: https://beta-logs.wmcloud.org/goto/d6429a185b706ce8065ae6b77488cae2 (for access credentials, see: https://www.mediawiki.org/wiki/Beta_Cluster#Testing_changes_on_Beta_Cluster) {F57503595} ``` Bad survey configuration: The "Empty search experiment survey" external survey must have a secure url. Bad survey configuration: The "Web non-UI experiment survey" external survey must have a secure url. ``` If this is not really an error, please change the code so that it does not log it. If it is, please fix the surveys. The QuickSurvey log channel is not currently enabled in production, so I don't know whether the issue affects production as well, but please look into that as well, since the beta and prod config look quite similar. We're planning to make the same logging config change in production soon.
    • Task
    == Common information * **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status * **description**: apache2.service on vrts2002:9100 * **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state * **summary**: apache2.service on vrts2002:9100 * **alertname**: SystemdUnitFailed * **instance**: vrts2002:9100 * **name**: apache2.service * **prometheus**: ops * **severity**: critical * **site**: codfw * **source**: prometheus * **team**: collaboration-services == Firing alerts --- * **dashboard**: https://grafana.wikimedia.org/d/g-AaZRFWk/systemd-status * **description**: apache2.service on vrts2002:9100 * **runbook**: https://wikitech.wikimedia.org/wiki/Monitoring/check_systemd_state * **summary**: apache2.service on vrts2002:9100 * **alertname**: SystemdUnitFailed * **instance**: vrts2002:9100 * **name**: apache2.service * **prometheus**: ops * **severity**: critical * **site**: codfw * **source**: prometheus * **team**: collaboration-services * [Source](https://prometheus-codfw.wikimedia.org/ops/graph?g0.expr=%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%2C+name%29+group_left+%28team%29+systemd_unit_owner%29+%3D%3D+1+or+ignoring+%28team%29+%28instance_name%3Anode_systemd_unit_state_failed%3Acount1+%2A+on+%28instance%29+group_left+%28team%29+role_owner%7Bteam%21%3D%22wmcs%22%7D%29+%3D%3D+1&g0.tab=1)
    • Task
    TASK AUTO-GENERATED by Nagios/Icinga RAID event handler A degraded RAID (md) [[ https://icinga.wikimedia.org/cgi-bin/icinga/extinfo.cgi?type=2&host=aqs1014&service=MD RAID | was detected ]] on host `aqs1014`. An automatic snapshot of the current RAID status is attached below. Please **sync with the service owner** to find the appropriate time window before actually replacing any failed hardware. ``` CRITICAL: State: degraded, Active: 10, Working: 12, Failed: 0, Spare: 0 $ sudo /usr/local/lib/nagios/plugins/get-raid-status-md Personalities : [raid10] [linear] [multipath] [raid0] [raid1] [raid6] [raid5] [raid4] md126 : active raid10 sdb[1] sdf[0] 3749445632 blocks super external:/md127/0 64K chunks 2 near-copies [4/2] [_U_U] md127 : inactive sdb[1](S) sdf[0](S) 1303216 blocks super external:ddf md1 : active raid10 sda2[0] sdc2[2] sdd2[3] 3701655552 blocks super 1.2 512K chunks 2 near-copies [4/3] [U_UU] bitmap: 12/28 pages [48KB], 65536KB chunk md0 : active raid10 sda1[0] sdc1[2] sdd1[3] 48791552 blocks super 1.2 512K chunks 2 near-copies [4/3] [U_UU] md2 : active raid10 sdh2[3] sde2[0] 3701655552 blocks super 1.2 512K chunks 2 near-copies [4/2] [U__U] bitmap: 28/28 pages [112KB], 65536KB chunk unused devices: <none> ```
    • Task
    Something has broken the crawler cronjob again. Web UI is reporting Jul 23, 2024 9:03 PM as the last successful run. {T366506} was the last time this was noticed as being broken.
    • Task
    Please import `moswiki` from incubator, once it is created. Thanks!
    • Task
    Please add new wiki `moswiki` to Wikistats, once it is created. Thanks!
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created