    • Task
**Steps to replicate the issue** (include links if applicable):
* Have a Toolforge tool that uses a reverse proxy.
** I have the following two proxies configured on my uploadmap tool; they had worked without problems for many months:

```
---
# Service object for routing requests to cdn.example.com
apiVersion: v1
kind: Service
metadata:
  name: cdn-map
  namespace: tool-uploadmap
spec:
  type: ExternalName
  externalName: map.leolenz.de
---
# Ingress object for routing requests to cdn.example.com
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: proxy-cdn-map
  namespace: tool-uploadmap
  annotations:
    nginx.ingress.kubernetes.io/rewrite-target: /$2
    nginx.ingress.kubernetes.io/upstream-vhost: map.leolenz.de
    nginx.ingress.kubernetes.io/backend-protocol: https
    nginx.ingress.kubernetes.io/proxy-ssl-server-name: "on"
    nginx.ingress.kubernetes.io/proxy-ssl-name: map.leolenz.de
spec:
  rules:
  - host: uploadmap.toolforge.org
    http:
      paths:
      - backend:
          service:
            name: cdn-map
            port:
              number: 443
        path: /cdn(/|$)(.*)
        pathType: ImplementationSpecific
```

```
---
# Service object for routing requests to cdn.example.com
apiVersion: v1
kind: Service
metadata:
  name: cdn-osm-org
  namespace: tool-uploadmap
spec:
  type: ExternalName
  externalName: tile.openstreetmap.org
---
# Ingress object for routing requests to cdn.example.com
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: proxy-cdn-osm-org
  namespace: tool-uploadmap
  annotations:
    nginx.ingress.kubernetes.io/rewrite-target: /$2
    nginx.ingress.kubernetes.io/upstream-vhost: tile.openstreetmap.org
    nginx.ingress.kubernetes.io/backend-protocol: https
    nginx.ingress.kubernetes.io/proxy-ssl-server-name: "on"
    nginx.ingress.kubernetes.io/proxy-ssl-name: tile.openstreetmap.org
spec:
  rules:
  - host: uploadmap.toolforge.org
    http:
      paths:
      - backend:
          service:
            name: cdn-osm-org
            port:
              number: 443
        path: /osm(/|$)(.*)
        pathType: ImplementationSpecific
```

**What happens?**:
* For all content from these proxies I only get 404 errors.
** It is unlikely to be a problem with the source servers, as two totally different servers are affected.
** Requesting the data using curl in a shell as the tool works as expected.

**What should have happened instead?**:
* The requested files should be available through the proxies.

**Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia):

**Other information** (browser name/version, screenshots, etc.):
    • Task
In {T423027} the lack of up-to-date runbooks was mentioned. As a follow-up we should improve the [documentation for Gerrit](https://wikitech.wikimedia.org/wiki/Gerrit/Operations#Runbooks) so that on-call SREs also know how to perform basic Gerrit operations. Additionally (if possible) we should document how to temporarily escalate privileges, since some incidents might require Gerrit admin permissions.
* Runbooks are missing, and the documented Gerrit ssh commands do not work when Gerrit is stuck/down.
* How to:
[] restart CI
[x] restart Gerrit
[] fix disk issues on Gerrit
[x] use Gerrit ssh commands to elevate privileges
    • Task
    The working directory seems to be only partially specified. This feels like a translation or message integration problem. I think this message used to be more cut-n-paste friendly. {F76114540,size=full} Additionally there could be a runbook at https://www.mediawiki.org/wiki/Phabricator/Help#Batch_edits that explained how to actually find the Phabricator server and perform the action. Doing this stuff less than once a year makes it hard to keep in my head.
    • Task
The documentation today does not cover how to mount tmpfs, which makes our database test runs unrealistic compared to CI. Acceptance Criteria: * Update the documentation on how to run the Docker dev container with tmpfs
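As a sketch of what the missing documentation could show, assuming a Docker Compose based dev setup with a MariaDB service storing its data in /var/lib/mysql (the service name, image, and size below are illustrative, not the project's actual values):

```
services:
  database:
    image: mariadb
    volumes:
      # Keep the database files in RAM so local test runs
      # behave like CI (no disk I/O for the datadir)
      - type: tmpfs
        target: /var/lib/mysql
        tmpfs:
          size: 1073741824  # 1 GiB, in bytes
```

The equivalent for a plain `docker run` invocation is the `--tmpfs` flag with the same target path.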
    • Task
    The title says it all: the [[ https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual#Configurable_properties | WDQS user manual on Mediawiki.org ]] will be outdated when the new backend is live. It needs to be updated/rewritten.
    • Task
    This page is more about the entire project than a specific wiki called "Abstract Wikipedia".
    • Task
    **Feature summary** (what you would like to be able to do and where): There should be a single OpenAPI spec document of all MediaWiki REST API endpoints deployed on Wikimedia sites. **Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution): As a REST API developer, I want to know which endpoints are available. Currently, this information is scattered in several places: * https://en.wikipedia.org/w/rest.php/specs/v0/module/- (only some modules) * https://www.wikidata.org/w/rest.php/wikibase/v1/openapi.json (#wikibase_rest_api_wpp) * https://en.wikipedia.org/w/rest.php/specs/v0/module/growthexperiments/v0 (#growthexperiments) * https://en.wikipedia.org/w/rest.php/specs/v0/module/specs/v0 (broken?) * (Special:RestSandbox also shows some specs that aren’t under `/w/rest.php`, but I would think those are out of scope for this task.) More specifically, as the developer of a REST API client library (T411325), I want to know which request and response formats are used by existing API endpoints. In [this commit](https://gitlab.wikimedia.org/repos/m3api/m3api-rest/-/commit/bf7b46aaf8), I mistakenly concluded (based only on the first spec linked above) that all JSON-returning endpoints return objects or arrays, not primitives, and designed my API around that; I’ve now discovered that this is not true in Wikibase, and have to adjust my API to account for this (currently in [this commit](https://gitlab.wikimedia.org/repos/m3api/m3api-rest/-/commit/d1c9c240c7), but that will end up being rebased around a bunch of times, so the link will probably be dead at some point; search for a commit message with the subject “Also allow Strings from getJson(), postForJson()”, either on the `fetch` branch or eventually on `main`). This could have been avoided if I had been able to query a merged document for all REST APIs from the beginning. 
**Benefits** (why should this be implemented?): Better information for REST API consumers. Possibly related: {T416532}
    • Task
I am trying to install Extension:ReportIncident on my wiki, but I am just stuck at
```
Original exception: [a637c5a75694681cd1034d7a] /wiki/Lipu_open
MediaWiki\Config\ConfigException: Config variable ReportIncidentEnabledNamespaces not found in community configuration.Should be requested via MediaWikiConfigRouter instead.
Backtrace:
from /home/taavi/src/mediawiki/extensions/CommunityConfiguration/src/Access/MediaWikiConfigReader.php(111)
#0 /home/taavi/src/mediawiki/extensions/CommunityConfiguration/src/Access/MediaWikiConfigReader.php(123): MediaWiki\Extension\CommunityConfiguration\Access\MediaWikiConfigReader->getConfigByVariableName()
#1 /home/taavi/src/mediawiki/extensions/ReportIncident/src/Services/ReportIncidentController.php(180): MediaWiki\Extension\CommunityConfiguration\Access\MediaWikiConfigReader->get()
#2 /home/taavi/src/mediawiki/extensions/ReportIncident/src/Services/ReportIncidentController.php(29): MediaWiki\Extension\ReportIncident\Services\ReportIncidentController->getLocalConfig()
#3 /home/taavi/src/mediawiki/extensions/ReportIncident/src/Services/ReportIncidentController.php(63): MediaWiki\Extension\ReportIncident\Services\ReportIncidentController->shouldShowButtonForNamespace()
#4 /home/taavi/src/mediawiki/extensions/ReportIncident/src/Hooks/Handlers/MainHooksHandler.php(45): MediaWiki\Extension\ReportIncident\Services\ReportIncidentController->shouldAddMenuItem()
#5 /home/taavi/src/mediawiki/includes/HookContainer/HookContainer.php(135): MediaWiki\Extension\ReportIncident\Hooks\Handlers\MainHooksHandler->onBeforePageDisplay()
#6 /home/taavi/src/mediawiki/includes/HookContainer/HookRunner.php(1023): MediaWiki\HookContainer\HookContainer->run()
#7 /home/taavi/src/mediawiki/includes/Output/OutputPage.php(3301): MediaWiki\HookContainer\HookRunner->onBeforePageDisplay()
#8 /home/taavi/src/mediawiki/includes/Actions/ActionEntryPoint.php(161): MediaWiki\Output\OutputPage->output()
#9 /home/taavi/src/mediawiki/includes/MediaWikiEntryPoint.php(180): MediaWiki\Actions\ActionEntryPoint->execute()
#10 /home/taavi/src/mediawiki/index.php(44): MediaWiki\MediaWikiEntryPoint->run()
#11 {main}
```
I found https://www.mediawiki.org/wiki/Extension:CommunityConfiguration/Developer_setup which has zero information about this. There is also https://www.mediawiki.org/wiki/Extension:CommunityConfiguration/Technical_documentation which says you can run `CommunityConfiguration:ChangeWikiConfig` but that does nothing (except saying `Saved!`) for me. Nor did something like `$wgReportIncidentEnabledNamespaces = [ NS_MAIN, NS_TALK ];` in LocalSettings do anything. Please document how to run the extension.
    • Task
    Information at the following locations should be updated to reflect the current status of the ORES migration / deprecation process. The information on these pages would also benefit from being combined into one reliable resource, and the remaining pages should just link to that single source rather than having duplicate and scattered information that is difficult to find and to maintain: * https://wikitech.wikimedia.org/wiki/ORES * https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing/Usage#Differences_using_Lift_Wing_instead_of_ORES * https://www.mediawiki.org/wiki/Machine_Learning/Modernization#Migration_from_ORES_to_Lift_Wing
    • Task
Several Wikimedia API endpoints use a page title as a path parameter. The purpose of this task is to decide on an accurate description for title parameters and update the docs for any applicable APIs. ### Problem statement It would be ideal to be able to document the title parameter as simply "Page title". However, this may lead to confusion if an API user tries to use a title in the wrong format. For example: - Some special characters don't work unless encoded: - example page title: `Fate/stay_night` - example API URL with 404 response due to `/` in page title: https://en.wikipedia.org/w/rest.php/v1/page/Fate/stay_night - Titles with spaces don't work with curl: - example: `curl "https://en.wikipedia.org/w/rest.php/v1/page/North America"` ### Consideration 1: Correct format It seems that the most correct format for titles in API path parameters is one in which, first, spaces are replaced with underscores, and then the result is URL-encoded. Example: `Man%27s_best_friend` (https://en.wikipedia.org/w/rest.php/v1/page/Man%27s_best_friend) This is the format returned by the MediaWiki REST API when providing an API URL (see html_url in [this response](https://en.wikipedia.org/w/rest.php/v1/page/Man%27s_best_friend/bare)). ### Consideration 2: API sandbox behavior However, the format described above does not work in the API sandbox since swagger UI automatically encodes path parameters: - Putting `Man%27s_best_friend` into the sandbox results in a request for `Man%2527s_best_friend`, which 404s - Putting `Man's best friend` into the sandbox results in a request for `Man%27s%20best%20friend`, which results in a redirect when using curl Since the documentation will be read primarily in the context of the sandbox, we should call this out. ### Proposed description "Page title. To format the title, replace spaces with underscores, and then URL-encode the result (example: Steller%27s_jay). 
In the API sandbox, replace spaces with underscores, but do not URL-encode (example: Steller's_jay)." Note: I had originally hoped to use "API explorer" instead of "API sandbox", but I found the term "sandbox" has already been well established in the context of Special:RESTSandbox, and that it would be more confusing to try and introduce a different term at this point. ### Potential future opportunities for improvement (Out of scope for this task) - API responses could consistently contain page titles in a format suitable for use with the API. For example, the MediaWiki REST API Search pages endpoint returns page titles only in `key` (`Fate/stay_night`) and `title` (`Fate/stay night`) formats. - The API could accept any format for page titles - The API could accept page IDs instead of titles - The API sandbox could behave the same way as the API
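The proposed rule (underscores first, then percent-encoding) can be sketched in JavaScript; the helper name is ours, not part of any MediaWiki client library. Note that plain `encodeURIComponent` leaves `' ( ) ! *` unescaped, while the REST API's own `html_url` uses `%27`, so those characters are escaped explicitly here:

```js
function formatTitle( title ) {
	// Replace spaces with underscores, then percent-encode everything
	// else, including "/" and the sub-delims encodeURIComponent skips.
	return encodeURIComponent( title.replace( / /g, '_' ) )
		.replace( /[!'()*]/g, ( c ) => '%' + c.charCodeAt( 0 ).toString( 16 ).toUpperCase() );
}

console.log( formatTitle( "Man's best friend" ) ); // Man%27s_best_friend
console.log( formatTitle( 'Fate/stay night' ) );   // Fate%2Fstay_night
```
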
    • Task
Revscoring model endpoints available via Lift Wing API are intended to replace the legacy ORES services, but most of the model cards on Meta-wiki currently only link to or reference ores.wikimedia.org. Even though ORES may still be available as a service, if we want users to be able to find appropriate replacement endpoints using our preferred, more modern infrastructure, we should update the model cards to facilitate discovery of the appropriate Lift Wing API endpoints. (This will also solve the current issue of [[ https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing#Current_Inference_Services | this table ]] being confusing by linking to model cards that only reference ORES). ## Documentation updates needed For each of the models listed below: 1. Supplement or replace the ORES links in the model card infobox (template) with links to Lift Wing API docs. ([[ https://meta.wikimedia.org/w/index.php?title=Machine_learning_models/Production/Multilingual_readability_model_card&diff=prev&oldid=30159036 | example change ]]) 2. Supplement or replace example ORES API calls in the "Implementation" section of the page with Lift Wing API examples. 3. Check and update all other ORES references on the page, and determine whether to supplement or replace them with Lift Wing references. 4. Verify that the Lift Wing API reference documentation links to the model card on Meta ([[ https://api.wikimedia.org/wiki/Lift_Wing_API/Reference/Get_articletopic_outlink_prediction | example ]]). * Note that the API portal is going to be shut down (T415293), so it may not be ideal to make these updates there, or it may be the best option for now, as the final location and generation process for Lift Wing API docs is TBD. 
## Model cards to update, and their corresponding Lift Wing API endpoints **Article quality** * 12 model cards in the "Article quality" column of [[ https://meta.wikimedia.org/wiki/Machine_learning_models#Revscoring_models | this table ]] * https://api.wikimedia.org/wiki/Lift_Wing_API/Reference/Get_revscoring_articlequality_prediction **Draft quality** * Model cards in the "Draft quality" column of [[ https://meta.wikimedia.org/wiki/Machine_learning_models#Revscoring_models | this table ]] are red links (do we need to create them?) * https://api.wikimedia.org/wiki/Lift_Wing_API/Reference/Get_revscoring_draftquality_prediction **Damaging edit** * 33 model cards linked in the "Damaging edit" column of [[ https://meta.wikimedia.org/wiki/Machine_learning_models#Revscoring_models | this table ]] * https://api.wikimedia.org/wiki/Lift_Wing_API/Reference/Get_revscoring_damaging_prediction **Goodfaith edit** * 33 model cards linked in the "Goodfaith edit" column of [[ https://meta.wikimedia.org/wiki/Machine_learning_models#Revscoring_models | this table ]] * https://api.wikimedia.org/wiki/Lift_Wing_API/Reference/Get_revscoring_goodfaith_prediction **Revert risk** * 9 models with red links in the "Revert risk" column of [[ https://meta.wikimedia.org/wiki/Machine_learning_models#Revscoring_models | this table ]] (do these model cards need to be created?) * https://api.wikimedia.org/wiki/Lift_Wing_API/Reference/Get_revscoring_reverted_prediction **Article topic** * 11 model cards linked in the "Article topic" column of [[ https://meta.wikimedia.org/wiki/Machine_learning_models#Revscoring_models | this table ]] * https://api.wikimedia.org/wiki/Lift_Wing_API/Reference/Get_revscoring_articletopic_prediction **Draft topic** * https://meta.wikimedia.org/wiki/Machine_learning_models/Production/English_Wikipedia_draft_topic * https://api.wikimedia.org/wiki/Lift_Wing_API/Reference/Get_revscoring_drafttopic_prediction
    • Task
Now that most of the work from T416894 is complete, migrating instruments from the old setup, which uses EventLogging and legacy schemas directly, to the new one, which uses TestKitchen pre-configured clients, instruments, or experiments, should be easier. This both reduces data collection risks and helps with creating KPI dashboards.
**Acceptance criteria**
- [] A mw.org page/section is created with clear instructions on how to migrate an existing instrument, which includes:
- [] Which streams/instruments/schemas are available and correct to use
- [] Where the collected data should be consumed
    • Task
https://www.mediawiki.org/wiki/Wikidata_Query_Service doesn't mention the split service and still lists the Search team as supporting it.
    • Task
We should iterate and make sure we cover the most common use cases. For example, today many examples use mediawiki-docker as the default; that should be quickstart or fresh. We should also review the most common use cases: are they captured correctly? Maybe there are other use cases we should include. https://www.mediawiki.org/wiki/Selenium
    • Task
Let's document what we've done and why in a blog post. We should write about our measurements and how we made them, and then about our actual changes and what numbers/metrics they gave us. **Acceptance criteria:** * Blog post published about our changes
    • Task
Right now there are different places where we document watchlists and how to use them. There's: - https://www.mediawiki.org/wiki/Help:Watching_pages and - https://www.mediawiki.org/wiki/Help:Watchlist Once you can watch pages while editing, we want to make sure that watchlist labels are mentioned or referenced in the existing documentation. https://www.mediawiki.org/wiki/Help:Watchlist_labels
    • Task
    So far: - `action=watch` - `action=query&prop=info` ...
    • Task
The [Add Link](https://wikitech.wikimedia.org/wiki/Add_Link) Wikitech page contains several pieces of information that are no longer up to date. For example, it contains obsolete information on the codebase location (see T416877 for a more detailed description). Similarly, the instructions on how to run the training no longer apply (as of now, whoever wishes to do that needs to schedule an Airflow DAG on the ML instance). Let's update the #documentation!
    • Task
    From T416483: > There's nothing in the wikitech docs about adding an ssh key for codfw1dev ldap, or even a note that you NEED to. A reasonable user would expect their existing eqiad1 key to just work which it definitely does not.
    • Task
    The mediawiki/vendor [[https://github.com/wikimedia/mediawiki-vendor/blob/master/README.md|README]] has a detailed explanation of how to create new patches, but the part on how to merge / deploy them is very vague (basically "ask someone who knows"). The process is nontrivial (or at least used to be, not sure about the exact state today) due to circular dependencies, and the uncertainty around it can make vendor patches a bottleneck - e.g. the train was just delayed by a day because {T416456} blocked it and no one available was confident how to revert a library version bump. There should be clear step-by-step documentation, just like for creating the patch.
    • Task
**Description:** Prepare all sections of the Traduc Ivoir' documentation for the Translate extension. Add translation markup while preserving the formatting, headings, code examples, templates, and links. Make sure each unit is properly segmented and that terminology stays consistent across translation units. **Domain:** Documentation (translation) **Difficulty:** Hard **Steps to reproduce:** - Add `<translate>` tags around the translatable parts of the documentation. - Exclude non-translatable elements (e.g. code blocks, parameter names) where necessary. - Check that the page renders correctly after adding the markup. - Enable translation of the page in the page translation interface. **Expected result:** A fully marked-up documentation page, ready for multilingual translation, with all sections correctly recognized by the Translate extension. **Links or references:** - [[ https://meta.wikimedia.org/wiki/Traduc_Ivoir%27 | Original French documentation of Traduc Ivoir' ]] - [[ https://www.mediawiki.org/wiki/Help:Extension:Translate/Page_translation_administration | Wikimedia translation guidelines ]] **Setup steps:** Enable the translation markup preview and bilingual editing tools in the editing interface.
    • Task
**Description:** Reorganize the documentation into a clearer structure: - Overview - Features - Installation - Contributing - Support - License **Domain:** Documentation **Difficulty:** Intermediate **Steps to reproduce:** - Analyze the current documentation. - Identify missing or misplaced sections. - Rewrite the structure following the outline above. **Expected result:** Well-organized documentation. **Links or references:** - [[ https://meta.wikimedia.org/wiki/Traduc_Ivoir%27 | Traduc Ivoir' documentation ]] - [[ https://www.wikidata.org/wiki/Wikidata:ArchiveExternalLinks | ArchiveExternalLinks documentation (example) ]] **Setup steps:** None.
    • Task
    What can we use from https://curl.se/.well-known/security.txt to update https://en.wikipedia.org/.well-known/security.txt ? Follows up: {T187617}, {T337949}.
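For reference, RFC 9116 defines the fields a security.txt may carry; a fragment using the common ones might look like the following (all addresses and URLs below are placeholders, not Wikimedia's or curl's actual values):

```
# RFC 9116 requires Contact and Expires; the rest are optional.
Contact: mailto:security@example.org
Expires: 2026-12-31T23:00:00.000Z
Encryption: https://example.org/pgp-key.txt
Preferred-Languages: en
Canonical: https://example.org/.well-known/security.txt
Policy: https://example.org/security-policy
```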
    • Task
We are missing documentation on how to correctly override hooks in the wdio configuration. The implementation of //wdio-mediawiki// takes for granted that all hooks are run, and if you extend the wdio configuration and do not call the function you are overriding, things will break. Acceptance criteria: * Add documentation with a couple of examples on how to correctly override a couple of hooks
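As a sketch of the pattern such documentation could show (the import path and hook choice below are assumptions, not the actual //wdio-mediawiki// API): when overriding a hook, call the inherited implementation first so its setup still runs.

```js
// wdio.conf.js — illustrative sketch; the base config path is hypothetical
const { config: baseConfig } = require( 'wdio-mediawiki/wdio.conf' );

exports.config = {
	...baseConfig,
	// Override the `before` hook without losing the inherited behavior
	async before( capabilities, specs ) {
		if ( typeof baseConfig.before === 'function' ) {
			// Run the base hook first so its setup is not skipped
			await baseConfig.before( capabilities, specs );
		}
		// ...project-specific setup goes here...
	}
};
```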
    • Task
    User story: As a developer writing code to add new functionality for AuthManager, I'd like a tutorial or how-to guide to show me the essential steps to implement a [[ https://doc.wikimedia.org/mediawiki-core/master/php/interfaceMediaWiki_1_1Auth_1_1SecondaryAuthenticationProvider.html | SecondaryAuthenticationProvider ]]. Related requests: https://www.mediawiki.org/wiki/Manual_talk:SessionManager_and_AuthManager#c-Legoktm-2016-10-27T08:13:00.000Z-Examples https://phabricator.wikimedia.org/T320349 Existing related content: https://www.mediawiki.org/wiki/Manual:SessionManager_and_AuthManager#SecondaryAuthenticationProvider and existing example page for SessionProvider (I find no parallel example page for AuthenticationProvider(s)) https://www.mediawiki.org/wiki/Manual:SessionManager_and_AuthManager/SessionProvider_examples
    • Task
    Both https://wikitech.wikimedia.org/wiki/Backport_windows#Doing_the_deploy and https://wikitech.wikimedia.org/wiki/Backport_windows/Deployers contain very outdated instructions about manually deploying patches, and should probably be replaced with pointers to spiderpig/scap backport?
    • Task
**Domain:** Documentation **Level:** Advanced
Pula official wiki: https://www.wikidata.org/wiki/Wikidata:PULA
Sample translated wiki: https://www.wikidata.org/wiki/Wikidata:Pula/yo
    • Task
Use any standard package to add an onboarding wizard for the mobile app. Follow the color palette and conventions.
**Type:** Coding **Level:** Intermediate
    • Task
    **Domain:** Mobile / Code Quality / Documentation **Difficulty:** Intermediate **Problem / steps to reproduce (for bugs)** - Many React Native components **lack documentation**, making it hard for new developers to use them correctly. - Props are not clearly explained, which can lead to misuse or bugs. **Expected outcome / task:** - Add **component-level documentation** for all React Native components in the mobile app. - Each component documentation should include: - **Component name** - **Purpose / description** - **Props**: name, type, description, default value (if applicable) - **Events / callbacks** (if any) - Use **JSDoc / TypeScript style comments** for components, e.g.: ```ts /** * Button component for user actions * * @param {string} label - Text to display on the button * @param {() => void} onPress - Callback fired when button is pressed * @param {boolean} [disabled=false] - Disable the button */ const Button: React.FC<ButtonProps> = ({ label, onPress, disabled = false }) => { return <TouchableOpacity onPress={onPress} disabled={disabled}><Text>{label}</Text></TouchableOpacity>; };
```
    • Task
    **Domain:** Frontend / Code Quality / Documentation **Difficulty:** Intermediate **Problem / steps to reproduce (for bugs)** - Many React components in the project **lack proper documentation**, making it harder for other developers to understand how to use them. - Props are not clearly described, and component purpose is not explained. - This can lead to misuse, bugs, and slower onboarding for new developers. **Expected outcome / task:** - Go through all React components and add **clear documentation** including: - **Component name** - **Purpose / description** of the component - **Props**: - Name - Type - Description - Default value (if applicable) - **Events / callbacks** the component emits (if any) - Use **JSDoc / TypeScript comment style** for components: ```ts /** * Button component for user actions * * @param {string} label - Text to display on the button * @param {() => void} onClick - Callback fired when button is clicked * @param {boolean} [disabled=false] - Disable the button */ const Button: React.FC<ButtonProps> = ({ label, onClick, disabled = false }) => { return <button onClick={onClick} disabled={disabled}>{label}</button>; };
```
    • Task
    **Domain:** Frontend / Backend / Code Quality **Difficulty:** Intermediate **Problem / steps to reproduce (for bugs)** - Many functions in the project currently **lack proper comments or documentation**. - This makes it difficult for other developers to understand, maintain, or contribute to the codebase. - There is no consistent pattern for function documentation across the project. **Expected outcome / task:** - Go through the entire codebase and add **clear function-level documentation** for all functions. - Each function documentation should include: - **Function name** - **Purpose / description** of what the function does - **Parameters**: names, types, and description - **Return value**: type and description - Any **side effects** if applicable - Use standard commenting patterns, e.g. **JSDoc** for JavaScript / TypeScript: ```ts /** * Calculates the sum of two numbers. * @param {number} a - The first number * @param {number} b - The second number * @returns {number} The sum of a and b */ function sum(a: number, b: number): number { return a + b; }
```
    • Task
    A documentation page for patchdemo is needed, perhaps on wikitech. Use case: As a new developer, I have worked on some code changes. I would like to know an easy way to do UAT on my code. I know nothing about patchdemo. Acceptance Criteria: * A page for patchdemo on mediawiki.org or wikitech.wikimedia.org ** Explanation on how to create a wiki, select extensions, go to the wiki, etc ** Explanation on common problems and what to do about them ** FAQ * A link to the patchdemo documentation page from some documentation for new developers, such as [[ https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker#Change_and_test_the_code | How to become a MediaWiki hacker ]] * A link to the patchdemo documentation page from patchdemo.wmcloud.org
    • Task
    **Feature summary** (what you would like to be able to do and where): Make the platform more legible and accessible for technical contributors who will write edit checks and suggestions in the future. **Benefits**: - Technical contributors can easily start new checks and offer patches to existing ones. **What does done look like?** - [ ] A brief overview of `TextMatchEditCheck`, in case the user can be directed there instead - [ ] A minimal VE + Editcheck dev setup guide for mwcli – eg a pointer to other documentation and `LocalSettings.php` changes. - [ ] One or two examples - [ ] Lifecycle documentation - What listeners exist, when they fire, what data they receive - What context data is available, eg action tags (dismissal etc), modified ranges, selection - [x] Brief guide to working in the codebase - [ ] Pointers to VE data model concepts and a brief overview - [ ] Examples of how to do the following with the data model: - Search for strings (modified or otherwise) - Identify modified nodes - Identify nodes of a particular type - [x] Notes on enabling checks for development - [ ] Pointers to the edit check configuration file and how it's laid out - [ ] It's clear how to test a check, eg, enabling experimental checks and suggestions **Timeframe** I'll seek feedback on this as I tend to underestimate how long it takes me to write text. I think it would be reasonable to target 8 hours, expecting some of the items above to be half an hour and some to be 1.5-2 hours. Generating examples feels harder to scope, I'll aim to outline one full example and the interesting bits of a second one, based on existing code.
    • Task
    Due to T409860 the [[ https://wikitech.wikimedia.org/wiki/HCaptcha | hCaptcha wikitech page ]] is very out of date. Furthermore, it should be updated to contain the following: - [ ] proxy design overview - [ ] incl. anycast setup - [ ] say the word fallback somewhere in there - [x] update relevant puppet module paths - [ ] describe how it interacts with common mediawiki userflows <-- this is probably in the trial design docs, pull it out into techwiki - [ ] runbooks - [ ] smoke test requests - [ ] how to depool a VM/site - [ ] fallback mechanism of the extension
    • Task
    **Feature summary**: Create comprehensive contributor documentation, including a CONTRIBUTING.md with code style guidelines, development workflow, and testing requirements, plus architecture diagrams showing data flow, class relationships, and cache strategy, to help new contributors understand and contribute to Paulina effectively.

**Use case(s)**:
1. A new contributor wants to submit their first pull request to Paulina
 - Current behavior: No CONTRIBUTING.md file with guidelines for code style, commit messages, or the PR process
 - Expected behavior: Clear documentation guides contributors through the entire contribution workflow
2. A developer wants to understand how data flows from Wikidata to the user
 - Current behavior: No visual diagram of the architecture or data flow exists
 - Expected behavior: Architecture diagrams provide a visual understanding of system components and data flow
3. A contributor wants to understand the complex copyright calculation logic in pdclasses.py
 - Current behavior: Complex business logic has minimal inline comments explaining the legal basis
 - Expected behavior: Detailed comments explain the copyright law rules and assumptions behind the calculations
4. A new developer encounters an installation error and searches for solutions
 - Current behavior: README.md lacks a troubleshooting section for common issues
 - Expected behavior: A troubleshooting guide helps developers resolve common setup problems
5. A contributor wants to know what code style standards to follow
 - Current behavior: No documented code style guidelines (PEP 8 compliance, naming conventions, etc.)
 - Expected behavior: CONTRIBUTING.md specifies code quality standards and linting requirements

**Benefits**:
1. **Lower Barrier to Entry**: New contributors can get started quickly with clear documentation
2. **Consistent Quality**: Code style guidelines ensure consistent, maintainable code
3. **Faster Onboarding**: Visual diagrams help developers understand the architecture faster than text alone
4. **Better Maintainability**: Well-documented code is easier to maintain and extend
5. **Reduced Support Burden**: A troubleshooting guide reduces repetitive questions
6. **Knowledge Preservation**: Documents complex domain knowledge (copyright law logic)
7. **Community Growth**: Good documentation attracts and retains contributors
    • Task
    == Background and overview
Epic umbrella task for transitioning [[ https://github.com/gruntjs/grunt | Grunt JavaScript task runner ]] tasks into npm scripts in Wikimedia deployed projects. Grunt has been in decline for many years now; we cannot keep using it indefinitely without losing modern functionality and risking security problems (read below). The last release dates back to Jan 2023. In an NPM audit in Oct 2025, **a security vulnerability** [[ https://security.snyk.io/vuln/SNYK-JS-INFLIGHT-6095116 | was identified ]]: Grunt depends on an unmaintained version of Glob. This task tracks the overall progress, while {T246326} serves as an example. Besides the purely technical transition challenges, we also need to make sure we update documentation and remove mentions of Grunt. https://www.mediawiki.org/wiki/Continuous_integration/Entry_points

== Goal
Transition projects that use Grunt tasks to npm scripts, the current technology choice.

== Acceptance criteria for done
[] Transition each of the projects below; orient on https://codesearch-beta.wmcloud.org/deployed/?q=%22grunt%22
[] Remove mentions of/replace Grunt in documentation
[] Update LibUp configuration

=== Projects (Wikimedia deployed)
**MediaWiki core**
[] MediaWiki Core
**Extensions**
From prior art tasks
[x] T206069 MobileFrontend
[x] T206462 Popups
[x] T295464 ContentTranslation
[x] T266977 MediaViewer (MMV)
Newly captured in this task
[x] #GrowthExperiments
[x] #mediawiki-extensions-readinglists T410437
**Skins**
[x] Skin:Vector {T242781}
[] Skin:Example
[] Skin:Timeless
[] Skin:MonoBook
[] Skin:Modern
Other projects

==== Prior art
{T246321}
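For orientation, a typical migration replaces the Gruntfile's registered lint/test tasks with plain npm scripts in `package.json`. A hedged sketch (the task names and tool choices below are illustrative, not taken from any specific repository in this task):

```lang=json
{
	"scripts": {
		"lint:js": "eslint --cache .",
		"lint:styles": "stylelint \"**/*.{css,less,vue}\"",
		"lint": "npm run lint:js && npm run lint:styles",
		"test": "npm run lint"
	}
}
```

After this, `npm test` replaces `grunt test`, the Gruntfile and `grunt-*` devDependencies can be dropped, and CI entry points keep working unchanged since they already invoke `npm test`.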
    • Task
    ### Description
We need to improve clarity over who should respond to errors and ultimately be responsible for incoming requests across API modules.

### Conditions of acceptance
1. Start with the Core Maintainers list as a source of truth for clearly owned services, to get a sense of what is actually maintained vs. not within the API.
 a. This includes both Action & REST API modules.
 a. Ensure that all modules are accounted for; only a subset are currently in the spreadsheet. The most complete list is likely the [[ https://en.wikipedia.org/wiki/Special:ApiSandbox#action=parse&page=Pet_door&format=json | Special:ApiSandbox ]] "action" dropdown list, if there is not a better scripting or tool alternative.
 a. Tag instances that are currently "owned" by teams where that might not make sense.
1. Review initial proposal with MWI

== Out of Scope / Follow-up tasks
1. Update the API list to include last-touch and/or conceptual owners for unmaintained modules that have APIs.
1. Create an API maintainers list proposal for REST APIs (this should be fairly straightforward)
1. Create a maintainers list proposal for Action API modules
1. Review prospective owning teams (Halley can help with this)
1. Publish 'final' proposal on-wiki

### Implementation details
* [[ https://docs.google.com/spreadsheets/d/1-fcgVYgM0elpTizqN93VEsBINgcs88Gp_6Kw-wrczIs/edit?gid=2132717274#gid=2132717274 | This spreadsheet ]] [WMF staff only for now] can be used as a starting point for documenting the list of APIs & formatting (but feel free to edit as you see fit)
* Use the [[ https://www.mediawiki.org/wiki/Developers/Maintainers#MediaWiki_core | MW maintainers list ]] as a starting point for extension and feature ownership.
* In cases where the maintainers list does not illuminate specific ownership, review recent changes in code to determine which team/person might be a good starting point.
* We should have at least some recommendation for every module/API family.
    • Task
    ```
$ echo "[http://google.com]" | php bin/parse.php
<p data-parsoid='{"dsr":[0,19,0,0]}'><a rel="mw:ExtLink nofollow" href="http://google.com" class="external autonumber" data-parsoid='{"dsr":[0,19,18,1]}'></a></p>
```
This output is not marked for localization and is expected to be localized via CSS, but we don't seem to have instructions for this anywhere yet. If we want to persist with this approach, we should provide instructions for editors and/or provide CSS for localized numbering on wikis that use localized numbers (ex: fawiki). Alternatively, Parsoid could switch these to explicit numbering, add data-mw-i18n attributes, and have the OutputTransformPipeline localize them. See this example on fawiki: [[ https://fa.wikipedia.org/wiki/%D8%B3%DB%8C%D9%85%D9%88%D9%86_%D9%BE%D8%A7%D9%BE%D9%88%DB%8C%D8%A7%D9%86?useparsoid=1 | Parsoid ]] vs [[ https://fa.wikipedia.org/wiki/%D8%B3%DB%8C%D9%85%D9%88%D9%86_%D9%BE%D8%A7%D9%BE%D9%88%DB%8C%D8%A7%D9%86?useparsoid=0 | Legacy ]].
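If the CSS route is kept, localized numbering can in principle be done with CSS counters and a predefined counter style. A sketch of what such site CSS might look like for fawiki (the selectors and counter scope are assumptions, untested against real Parsoid output):

```lang=css
/* Hypothetical: render autonumbered external links with Persian digits.
   Assumes the counter is reset once per parser output block. */
.mw-parser-output { counter-reset: mw-numbered-ext-link; }
.mw-parser-output a[rel~="mw:ExtLink"].autonumber {
	counter-increment: mw-numbered-ext-link;
}
.mw-parser-output a[rel~="mw:ExtLink"].autonumber::after {
	content: "[" counter( mw-numbered-ext-link, persian ) "]";
}
```

`persian` is one of the predefined counter styles in CSS Counter Styles Level 3, so no `@counter-style` rule should be needed for this particular case.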
    • Task
    The publication of the draft page https://www.mediawiki.org/wiki/API/Deprecation may create confusing and inconsistent information about API versioning for the impacted APIs: - The [[ https://www.mediawiki.org/wiki/Wikimedia_REST_API | Wikimedia REST API ]] claims to follow https://www.mediawiki.org/wiki/API_versioning as its versioning policy, but https://www.mediawiki.org/wiki/API/Deprecation applies to that same API. - https://www.mediawiki.org/wiki/API/Deprecation is not listed in the list of Deprecation policies in https://www.mediawiki.org/wiki/Template:Deprecation_policies - https://api.wikimedia.org/wiki/Core_REST_API claims to follow [[ https://api.wikimedia.org/wiki/Stability_policy | this stability policy ]] , which currently has no apparent connection to https://www.mediawiki.org/wiki/API/Deprecation I updated the [[ https://www.mediawiki.org/wiki/API:REST_API | landing page for the MediaWiki REST API ]] to point to the new Deprecation page from its "Versioning" section, but more work is needed here to ensure that the information provided at https://www.mediawiki.org/wiki/API_versioning, and linked to from each API's docs, is consistent. [[ https://www.mediawiki.org/wiki/API/Deprecation#How_does_this_relate_to_existing_stability_policies? | This section of the Deprecation doc ]] alludes to this, but I'm filing this task because: - we shouldn't let this inconsistent information linger for too long - we shouldn't let the new Deprecation page be marked for translation if some or all of its content should ultimately be integrated with the existing API_versioning page
    • Task
    Compare [[ https://en.wikipedia.org/wiki/List_of_EC_numbers_(EC_2)?useparsoid=1 | Parsoid ]] vs [[ https://en.wikipedia.org/wiki/List_of_EC_numbers_(EC_2)?useparsoid=0 | Legacy ]] output. If you scroll down to `EC 2.1.1.270: (+)-6a-hydroxymaackiain 3-''O''-methyltransferase` .. you will see that the link renders differently in parsoid & legacy. The actual redlink is the same but legacy's linktext converts the `''O''` to an italicized //`O`//. A number of links on this page are similarly affected. With a quick test, I cannot reproduce this difference on the CLI. So, needs some more investigation.
    • Task
    https://www.wikifunctions.org/wiki/Wikifunctions:Abstract_Wikipedia/2025_fragment_experiments needs some step-by-step instructions

**Related discussion / context**
https://abstract.wikipedia.org/wiki/Abstract_Wikipedia:Project_chat#Making_it_easier_to_contribute
    • Task
    Alerts are set up that will fire if there are no new metrics produced by Pixel. We need a runbook so everyone on the team can act on the alert and take a first look.

AC:
* Create a runbook that covers:
1. Logging into the instance and figuring out whether the problem is Pixel itself or metrics simply not being delivered
2. Restarting Pixel
3. Redeploying Pixel

@Mhurd do those actions make sense, or should other actions be explained in the runbook?
    • Task
    The https://www.mediawiki.org/wiki/Extension:ConfirmEdit documentation doesn't cover all of the config variables in the root `extension.json`; we should fix that.

See also: {T404716}
    • Task
    This document is about interfaces covered by the stable interface policy so I'm leery about changing it. The document uses several prefixes. It is unclear whether these prefixes are used in the Turtle dump files. It turns out that the Turtle dump files use data: and s:, not wdata: and wds:. There are other differences. This difference should be remedied or documented. As well, there should be a delineation of just what is stable. Is it the triples? Is it the RDF qnames, i.e., including the prefixes? Are repeated @prefix blocks allowable? The reason the last is important is that some systems that ingest Turtle files allow parallel parsing of parts of the file. This is only guaranteed to work if all the prefixes are in a block at the start of the file.
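For illustration, the prefix declarations at the top of a Turtle dump look roughly like the following (the exact URIs are recalled from memory and should be verified against a current dump, so treat them as assumptions; the point is that the dump declares `data:` and `s:`, not `wdata:` and `wds:`):

```lang=turtle
@prefix wd: <http://www.wikidata.org/entity/> .
@prefix wdt: <http://www.wikidata.org/prop/direct/> .
@prefix data: <https://www.wikidata.org/wiki/Special:EntityData/> .
@prefix s: <http://www.wikidata.org/entity/statement/> .
```

If stability is defined at the triple level (full IRIs), prefix label differences are cosmetic; if it is defined at the qname/serialization level, they are breaking, which is exactly the delineation the document should make explicit.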
    • Task
    **Feature summary** (what you would like to be able to do and where):
If you visit https://doc.wikimedia.org/mediawiki-core/1.39.7/php/classMediaWiki_1_1ResourceLoader_1_1StartUpModule.html#details it's not clear that you are reading old, outdated documentation. Could we put a banner at the top of the page that says "This documentation is outdated" and links to the latest version?

**Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution):
A work colleague of mine ended up on that page from a ResourceLoader search, which led to outdated-code problems (1.39.7 is mentioned, but only in very tiny text at the top). We already do this on mw.org with https://www.mediawiki.org/wiki/Template:Update

**Benefits** (why should this be implemented?):
- Google searches will presumably rank newer documentation higher if old documentation links to it
- Less confusion for new developers
    • Task
    I'm currently adding the first Jest tests to the CampaignEvents extension. I followed the [[https://www.mediawiki.org/wiki/Vue.js/Testing | documentation page ]] on mediawiki.org, but it hasn't received substantial updates in 5 years and some parts of it were clearly outdated, like the list of dev dependencies. I updated that because it was obvious, but I'm sure there's more. It then also says you need to create a Jest config and a Jest setup file, both with some boilerplate code. I ended up copying these from other extensions, and I suppose I'm not the first to do so; this is obviously problematic. I don't have specific preferences on how to make this easier for developers, but I do have some suggestions:
- Update the documentation
- Track the dev dependencies in LibUp for automatic updates (would mitigate the issue of the documentation not using the latest versions)
- Consider creating a package that brings in all the needed dependencies, as well as extensible config files with sensible defaults (what we do for code checkers like PHPCS, eslint, phan, stylelint)
- Create a phab tag, or a column in #mediawiki-core-tests, for reporting bugs. I'm having to tag this task semi-randomly for the time being.
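For reference, the boilerplate being copied between extensions is roughly of this shape. This is a hedged sketch of a typical `jest.config.js` for Vue code in a MediaWiki extension (the paths and option values are illustrative, not prescribed by any current documentation):

```lang=javascript
// jest.config.js — illustrative defaults, adjust per extension
module.exports = {
	moduleFileExtensions: [ 'js', 'json', 'vue' ],
	transform: {
		// @vue/vue3-jest compiles .vue single-file components for Jest
		'^.+\\.vue$': '@vue/vue3-jest',
		'^.+\\.js$': 'babel-jest'
	},
	// Components need a DOM; jsdom provides one outside the browser
	testEnvironment: 'jsdom',
	// Setup file typically stubs mw.*, $ and other MediaWiki globals
	setupFiles: [ './jest.setup.js' ],
	collectCoverage: true,
	coverageDirectory: 'coverage'
};
```

Having a shared package own this file (as suggested above for PHPCS/eslint-style tooling) would let extensions extend it instead of copying it.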
    • Task
    Me falling over it again, in {T404801}... We don't seem to document anywhere that WebAuthn needs an SSL cert, or otherwise specific configuration to work properly...
    • Task
    On our team page today, developers/teams can contact us using Slack or Phabricator. Is that enough? Let's discuss and decide whether that should be updated. T384421 touches on roughly the same thing.

AC:
* Discuss in the team how we want people to contact us.
* If necessary, update the team page
    • Task
    Our team page at https://www.mediawiki.org/wiki/Wikimedia_Testing_Platform is missing a section on what we do and what our projects are. Let's discuss that within the team and then document it on our team page.

AC:
* Define our projects, i.e. the projects that we are responsible for as a team
* Are there other areas that are not projects but are (or should be) our responsibility?
* Update the documentation page
    • Task
    Update our [[ https://www.mediawiki.org/wiki/Wikimedia_Testing_Platform | team page ]] to reflect the team's core values. But first, iterate on our core values. I think we can use Release Engineering's as a starting point: https://www.mediawiki.org/wiki/Wikimedia_Release_Engineering_Team/About

Document WIP: https://docs.google.com/document/d/1sm9J5mzPfelJzycYrXvl9MtqFDmqTnicDRAa5p03Yls/edit?tab=t.0#heading=h.c5xtg0yeoevu

Is there something in that values section that we want to change?

AC:
* Agree on what our core values are
* Update the page to reflect them
    • Task
    https://www.mediawiki.org/wiki/Extension:ConfirmEdit#hCaptcha does not cover the extra `$wg` config variables that hCaptcha brings in.
    • Task
    Please document what these various config vars in the extension mean

```lang=json
{
    "WMEReadingDepthSamplingRate": { "value": 0 },
    "WMEWebUIScrollTrackingSamplingRate": { "value": 0 },
    "WMEWebUIScrollTrackingSamplingRateAnons": { "value": 0 },
    "WMEWebUIScrollTrackingTimeToWaitBeforeScrollUp": { "value": 0 },
    "WMEStatsdBaseUri": { "value": false },
    "WMEStatsBeaconUri": { "value": false },
    "WMEEditCampaigns": { "value": [] },
    "WMESchemaEditAttemptStepSamplingRate": { "value": "0.0625" },
    "MFSchemaEditAttemptStepOversample": { "value": false },
    "WMEWikidataCompletionSearchClicks": { "value": [] },
    "WMEClientErrorIntakeURL": { "value": false },
    "WMESessionTick": { "value": false }
}
```
    • Task
    The documentation for Pixel 2 is ready: https://wikitech.wikimedia.org/wiki/Draft:Pixel2

Let someone else on the team review it and give feedback. Four eyes are better than two :) One question though: who is the audience for this documentation?

**AC**:
* Verify that the code examples in the documentation work
* Verify that the documentation is easy to read and understand
* Verify that the documentation is what's needed for its audience
* Document the feedback in this task and have a session where the document is edited/fixed.
    • Task
    The canonical documentation page for OAuth apps is https://www.mediawiki.org/wiki/OAuth/For_Developers but it's very hard to use, in part because it tries to describe three different workflows (handshake, API request, OIDC-ish) for five different app types (OAuth 1, OAuth 1 owner-only, OAuth 2, OAuth 2 non-confidential, OAuth 2 owner-only; the sixth would be OAuth 1 with RSA but we don't even document that) which all work a bit differently, and in part because it tries to be non-wiki-specific so it only handwaves the relevant URLs instead of actually providing them. It's terrible DX. We should just have MediaWiki generate the documentation (at `Special:OAuthHelp/<app id>` or something like that) that's relevant for a given app and the current wiki.
    • Task
    Hi, While configuring `MediaWiki:Are-ratings`, we noticed that the `category` field inside the JSON seems to have no visible effect. [The documentation](https://www.mediawiki.org/wiki/Extension:ArticleRatings) only briefly mentions it and points to Brickipedia as an example, but does not explain its actual purpose or how it should be defined. From what we understand, it might have been intended either: 1. to automatically place pages with a certain rating into a corresponding category, or 2. to automatically assign a rating to pages in a given category. However, in practice it seems to do neither. On Brickipedia, some of the `category` entries even point to redlinks with no pages, which makes the feature even harder to interpret. Could you please clarify what the `category` field is actually supposed to do, and ideally improve the documentation accordingly? Thanks!
    • Task
    GrowthExperiments has instructions for developer setups available at MediaWiki.org, see https://www.mediawiki.org/wiki/Extension:GrowthExperiments/developer_setup. The documentation mentions several ways to use Suggested edits, some of them use production data (useful if you import articles from production), others include full integration with the Add Link service (requiring only the language of articles to match) and other solutions mock the data from Add Link service using a subpage. Unfortunately, not all of those solutions are currently working. For example, T402698 includes a report by @SomeRandomDeveloper that the documentation suggests to use https://addlink-simple.toolforge.org/ for mocking the Add Link service output, but that tool is down. Indeed, https://addlink-simple.toolforge.org currently returns a 404, and the corresponding tool account does not currently exist in Toolforge (maybe the tool was deleted since?). Within this task, we should review the developer instructions documentation, ensure what it recommends people do is up to date and make any relevant updates (and/or create tasks for them).
    • Task
    We have information on [[https://wikitech.wikimedia.org/wiki/Catalyst|Wikitech for Catalyst]], a stub on [[https://www.mediawiki.org/wiki/Patch_demo|MediaWiki for Patch demo]], the [[https://gitlab.wikimedia.org/repos/test-platform/catalyst/patchdemo|Patch demo README]] and the [[https://gitlab.wikimedia.org/repos/test-platform/catalyst/catalyst-api|Catalyst README]]. We should think about the user flow through this documentation, and what information belongs on what page. (draft) AC: - Catalyst wiki page has catalyst information - Patch demo wiki page has information on Patch demo
    • Task
    The AuthManager-based Action API endpoints (`action=clientlogin` and `action=createaccount`; there are more, but those aren't used much) return a set of JSON objects describing what kind of credentials are expected, and then the client can submit those using dynamic parameters. This isn't obvious, isn't documented very well, and it's not clear what stability guarantees exist for the dynamic parameters (none, but we should state this explicitly).
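For illustration, a UI-stage response from `action=clientlogin` looks roughly like this (the field set is abbreviated and the exact request ids, labels, and metadata vary by wiki and configured providers, so treat the details as an assumption):

```lang=json
{
    "clientlogin": {
        "status": "UI",
        "requests": [
            {
                "id": "MediaWiki\\Auth\\PasswordAuthenticationRequest",
                "required": "primary-required",
                "fields": {
                    "username": { "type": "string", "label": "Username" },
                    "password": { "type": "password", "label": "Password" }
                }
            }
        ]
    }
}
```

The client is then expected to POST `username` and `password` back as top-level dynamic parameters, which is the undocumented and stability-ambiguous part this task is about.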
    • Task
    We are going to need runbooks to help folks figure out what to do when things like the various tofu and magnum failures from https://gitlab.wikimedia.org/repos/releng/zuul/tofu-provisioning/-/merge_requests/29 happen. There is a small temptation to do docs as code things for this, but discoverability and long term improvement of the docs are more likely if we do it on Wikitech or mediawiki.org. https://wikitech.wikimedia.org/wiki/Nova_Resource:Zuul would be a reasonable place to start building docs. We can move them later if a better home is found.
    • Task
    When making a change to a field in a MediaWiki OpenAPI spec, I want to be able to preview my change in Special:RestSandbox locally before submitting a patch. I wasn't able to find any documentation for how to do this, so it would be great to have some docs available on mediawiki.org.
    • Task
    For users: how to set up collections. For the team: something to refer to when closing tasks related to the old lists feature.
    • Task
    **Steps to replicate the issue** (include links if applicable): * Navigate to the "posts" section on the Codex docs site main page: https://doc.wikimedia.org/codex/latest/#posts **What happens?**: * Notice that each card displays a placeholder thumbnail image because the images provided have a 404 status "not found". For example, the image URL source, https://doc.wikimedia.org/homepage-images/creating-the-wikimedia-design-system.jpg, results in an error: >Failed to load resource: the server responded with a status of 404 () **What should have happened instead?**: - The image should load with `/codex/latest/` included in its base path: https://doc.wikimedia.org/codex/latest/homepage-images/creating-the-wikimedia-design-system.jpg **Software version** : Codex v2.2.0 **Other information** (browser name/version, screenshots, etc.): * Other images that load successfully on this page, such as the Codex logo, have a base path with `/codex/latest/` due to VitePress's base configuration that determines the base path in production, `base: process.env.CODEX_DOC_ROOT || '/'`. In addition, the Codex logo is a static image, and VitePress automatically processes static asset paths. As per https://vitepress.dev/guide/asset-handling#base-url, dynamic asset paths need to be wrapped with the `withBase()` helper function. * Suggestion: (1) Consider using static assets versus dynamic props, or (2) wrap the URL with `withBase()`. # Acceptance Criteria [ ] Ensure VitePress processes the asset path by either using static assets or wrapping the image URLs with `withBase()`.
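To see why the suggested fix works, here is a simplified stand-in for what VitePress's `withBase()` helper does (assumption: the real helper also handles external URLs and other edge cases; this sketch just prefixes the configured base and collapses duplicate slashes):

```lang=javascript
// Simplified stand-in for VitePress's withBase().
// In production the Codex docs are served under /codex/latest/ via CODEX_DOC_ROOT.
const base = process.env.CODEX_DOC_ROOT || '/codex/latest/';

function withBase( path ) {
	// Prefix the base path, then collapse any doubled slashes
	return ( base + path ).replace( /\/{2,}/g, '/' );
}

// A dynamic asset path from the homepage "posts" data:
console.log( withBase( '/homepage-images/creating-the-wikimedia-design-system.jpg' ) );
// → /codex/latest/homepage-images/creating-the-wikimedia-design-system.jpg
```

Because the image URLs on the homepage are dynamic props rather than static assets, VitePress never applies this prefix automatically, hence the 404s.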
    • Task
    The page at https://www.mediawiki.org/wiki/Documentation/Tools doesn't have any incoming links, nor is it linked to from https://www.mediawiki.org/wiki/Documentation/Toolkit. However, it is the [[ https://www.mediawiki.org/w/index.php?search=subpageof%3ADocumentation%2FTools&title=Special%3ASearch&profile=advanced&fulltext=1&advancedSearch-current=%7B%22fields%22%3A%7B%22subpageof%22%3A%22Documentation%2FTools%22%7D%7D&ns0=1&ns12=1&ns100=1&ns102=1&ns104=1&ns106=1 | parent page for very relevant pages ]] describing our recently-launched documentation tools. Now that we have more content (and tools!), we should update this page to cover the current state of things, and make the page easier to find.
    • Task
    https://wikitech.wikimedia.org/wiki/Portal:Toolforge/Admin/Striker or some page linked from it should contain documentation that explains how to run Striker database migrations in production. Roughly, the process is:
1. Deploy the Striker version with the database changes included, using the usual deployment process
1. Acquire the Striker admin database credentials from somewhere
1. Shell to one of the cloudweb hosts, and enter the Striker pod
1. Run manage.py, with the database credential environment variables manually overridden
    • Task
    At https://doc.wikimedia.org/codex/latest/components/demos/button.html there are flagged buttons: {F62433061} and buttons with icons: {F62433077}, but no example combining both. Such an example would look like: {F62433135}
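A combined demo could look roughly like this. This is a hedged sketch of Codex Vue usage (the `action`/`weight` values and the specific icon name are illustrative assumptions, not taken from the demo page):

```
<template>
	<!-- Hypothetical: a "flagged" (destructive, primary) button with an icon -->
	<cdx-button action="destructive" weight="primary">
		<cdx-icon :icon="cdxIconTrash"></cdx-icon> Delete
	</cdx-button>
</template>
```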
    • Task
    In https://gerrit.wikimedia.org/r/c/mediawiki/core/+/928902, the `docs/hooks.txt` file was deleted from MW core, and the hooks docs now live in `docs/Hooks.md`. There are still references to the old `docs/hooks.txt` file: https://codesearch.wmcloud.org/search/?q=hooks.txt&files=&excludeFiles=HISTORY&repos= (HISTORY file excluded), in code and also in on-wiki docs like https://www.mediawiki.org/wiki/Template:MediaWikiHook. We should probably scan for references and update them to point to the new documentation in the .md file. NOTE: https://www.mediawiki.org/wiki/Template:MediaWikiHook#Finding_a_hook's_version_and_Gerrit_ID points to a link to `hooks.txt` that is broken.
    • Task
    As a Wikibase user I want to be able to quickly navigate relevant documentation content and links on [MediaWiki](https://www.mediawiki.org/wiki/Wikibase)
    • Task
    It's not at all clear to handlers of incoming events how to handle idempotency so that handler-maintained state converges on what it should be, despite processing some things out of order or dealing with concurrent changes from other handlers or even non-handler code. I'm thinking of high-level classes of listeners to consider:
[] a) Notify users (e.g. email, echo) on page change (mentions page diff)
[] b) Flag a page for a daily digest notification (e.g. email, echo) on page change (mentions page diffs)
[] c) Flag a page as "changed since last seen" for users
[] d) Notify users when a milestone is reached (e.g. edit count reaching 1000)
[] e) Assign autopromote group to a user once a threshold is reached (e.g. "editor" for FlaggedRevs)
[] f) Purge a cache entry (e.g. in memcached) keyed on page ID
[] g) PUT/DELETE a storage object (e.g. in Swift) keyed on page ID
[] h) Purge a cache entry (e.g. in memcached) keyed on (namespace/dbkey) (strongly discouraged)
[] i) PUT/DELETE a storage object (e.g. in Swift) keyed on (namespace/dbkey) (strongly discouraged)

Each of these could use documentation on a page linked from https://www.mediawiki.org/wiki/Manual:Domain_events . This would include things like locking the page row with READ_LOCKING, checking (a) if the row is still there and (b) comparing page_latest if it is there, and using those to decide whether to no-op (e.g. when maintaining state) or continue anyway (e.g. when sending notifications). Right now, people writing handlers have to magically know this stuff. If they don't want to do these things, it should at least be a sensible tradeoff (e.g. extra cache purges that don't hurt, or the maintained state has a short TTL anyway).
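The lock-and-recheck pattern described above can be sketched like this. This is a hedged sketch only: `selectRow()` with the `FOR UPDATE` option is standard IDatabase usage, but the event accessor names (`getPageId()`, `getNewRevisionId()`) are assumptions about the domain-event object, not the actual API:

```lang=php
// Inside a handler, before touching handler-maintained state keyed on page ID:
$row = $dbw->selectRow(
	'page',
	[ 'page_latest' ],
	[ 'page_id' => $event->getPageId() ],
	__METHOD__,
	[ 'FOR UPDATE' ] // READ_LOCKING: lock the page row before deciding
);
if ( !$row ) {
	return; // page was deleted since the event was emitted: no-op
}
if ( (int)$row->page_latest !== $event->getNewRevisionId() ) {
	// A newer edit superseded this event. No-op when maintaining state
	// (a later event will converge it); notifications may still proceed.
	return;
}
// ...safe to purge caches / PUT storage objects keyed on page ID here...
```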
    • Task
    Having a field that can autocomplete from a list of users on a wiki would be useful functionality. This is doable via https://www.mediawiki.org/wiki/Manual:HTMLForm_Tutorial_3#usersmultiselect and https://www.mediawiki.org/wiki/Manual:HTMLForm_Tutorial_3#user ```lang=php $class = HTMLForm::getClassFromDescriptor( $name, $field ); ``` https://www.mediawiki.org/wiki/Extension:ContactPage#Creating_complex_forms isn't great... and requires a few clicks to find the types...
    • Task
    While not common, it does happen that extensions (and maybe skins?) have breaking (config) changes that should probably be mentioned at a higher level than just the `git log` of the extension. We should potentially have a specific section in the MediaWiki core RELEASE-NOTES file where things like this can be highlighted. Useful for things like one "fix" for {T394814}, or {T384064}, where, for basically the same issue, some specific config (`LocalSettings.php`) change is needed to avoid breaking a wiki during upgrade.
    • Task
    https://doc.wikimedia.org/codex/latest/components/demos/menu.html#menu-groups

It says "//Avoid mixing menu groups with visible titles and menu groups with visually-hidden titles.//" but the example does not actually mix anything. Both of the titles in the example are hidden, which I believe makes it a "good" example rather than a "bad" example, at least according to this guideline.

{F60317136 height=300}
{F60317162 height=300}

See also:
* {T394829}
    • Task
    Currently the documentation does not contain instructions for what to do if a wiki does not have templates. It does suggest en wiki documentation pages to go along with en wiki templates. However, these are complex templates with many dependencies, and they may be hard for small wikis to import, understand (they use Lua), and maintain, as those wikis may lack import rights. Many small wikis will try to copy-paste these templates instead of importing them, and this results in broken templates because of missing dependencies. We should include in the documentation a minimally viable config and template (+ TemplateData) that small wikis can copy-paste and that doesn't require import rights. Draft minimal config which uses the en wiki Citation template: https://www.mediawiki.org/wiki/Citoid/Minimal_config
    • Task
    While many of the config variables are defined in ORES' `extension.json`, many are not. Similarly, some are documented on https://www.mediawiki.org/wiki/Extension:ORES#Config_variables, but not all of them. Can we please document the rest? T393876#10812740 is a case where the use of one is not 100% clear.
    • Task
    For large deployments or other management tasks it's sometimes useful to set up (or otherwise modify existing) dblists in `mediawiki-config`, however this is a bit under-documented. I managed to figure out the `add`/`del` commands myself :D but got tripped up on a patch because I'd forgotten to run the `update` command, which resulted in some confusion after editing `DB_LISTS` but forgetting to update derivative arrays. It probably wouldn't hurt to add a `dblists/README.md` which mentions at least this specific fact. :D The individual command syntaxes are self-documenting; the main thing is to lay out the workflow.
    • Task
    ConfirmEdit will always need to change captcha solutions as the landscape of bots and vandals changes. To aid the implementation of new CAPTCHA technologies, we should strive to have documentation and/or an example of a provider implementation, noting key areas where customisation is likely to be needed, and key pitfalls. As it stands, there's no real documentation on MediaWiki.org for this, and code comments on some classes are generic and/or lacking.
    • Task
    Document Wikisource-Wikidata Linking on Malayalam Wikisource
    • Task
    The docs at https://wikitech.wikimedia.org/wiki/Portal:Data_Services/Admin/Quarry describe the 'bare metal' cluster from before @rook's work to deploy and manage the service within a Magnum Kubernetes cluster. There is lore that has been passed back and forth via IRC about how to access and manage parts of the new system. This lore needs to make it to the wiki. https://github.com/toolforge/quarry/blob/main/README.md will be helpful as well.
    • Task
    We have multiple sources of instructions for how to generate an SSH key; most of these pages seem to be offering the same information, but some have more helpful details, or better formatting. As a developer who was trying to deploy a tool, my workflow was slowed down by encountering these multiple pages and trying to cross-reference them to make sure I wasn't missing anything. - https://wikitech.wikimedia.org/wiki/Help:SSH -- Targeted at Cloud VPS users, but that isn't clear from the title / page location (and unclear if that group of users really needs a separate page about this) -- Has the clearest page title / namespace, but seems to be an orphan page (almost no incoming links) -- Has nice breakout of instructions for different operating systems -- Has troubleshooting that other docs lack - https://wikitech.wikimedia.org/wiki/Generate_an_SSH_Key -- The shortest and most direct set of instructions, but lacks nuance -- **Linked to in the UI for IDM** -- Linked to from key entry point docs: Help:Accessing_Cloud_VPS_instances and Help:Toolforge/Quickstart -- Includes instructions about resetting passphrase that no other docs include - https://www.mediawiki.org/wiki/SSH_keys -- This page has some troubleshooting info that others lack, and I find the explanations slightly clearer -- https://www.mediawiki.org/wiki/SSH_keys#Copy_your_SSH_Public_key doesn't clearly say where to paste the key after you copy it -- This page has been translated -- Linked to by Gerrit tutorial. I just added a link to this page also from https://www.mediawiki.org/wiki/GitLab/Workflows/Registering_an_account_on_GitLab#Add_an_SSH_key, since it was only linking to the upstream GitLab docs on this topic. 
Other pieces of information that could be consolidated by replacing them with a link to a single-source of info on this topic: - Troubleshooting: https://www.mediawiki.org/wiki/Gerrit/Troubleshooting#ssh - https://wikitech.wikimedia.org/wiki/Help:Toolforge/Quickstart#Set_up_an_SSH_client_and_a_key could be even shorter and only link to instructions, rather than duplicating some of them. Other info that might be worth including in these instructions: - How to have multiple keys https://www.mediawiki.org/wiki/Toolserver:Logging_in (deprecated doc but a topic not covered elsewhere) Out of scope: Docs for SRE audiences, or those that need production access, like https://wikitech.wikimedia.org/wiki/SRE/Production_access#Generating_your_SSH_key. It makes sense that those docs are separate.
    • Task
    For various reasons, we document a recommended Composer version to prevent churn in Composer-generated files. As part of {T367677}, and in the followup in https://gerrit.wikimedia.org/r/c/mediawiki/extensions/3D/+/1136128, there were a lot more changes than expected, primarily adding license data, making the diff much more noisy. To try and prevent that, including making sure these versions are set in any relevant images we may use, we should define at least a minimum, if not a recommended, version of NPM to use.
    • Task
    With the Kubernetes migration of [[ https://wikitech.wikimedia.org/wiki/Periodic_jobs | periodic jobs ]] and [[ https://wikitech.wikimedia.org/wiki/Videoscaling | videoscaling ]], we took the opportunity to overhaul existing documentation and more importantly write clear and extensive documentation where previously there was none. We didn't accomplish the same thing when migrating Jobrunners to mw-jobrunner. Looking "jobrunner" up on wikitech gives documentation on the [[ https://wikitech.wikimedia.org/wiki/History_of_job_queue_runners_at_WMF | history of jobrunners ]], which (while nice to have) isn't directly useful for incident response or debugging. The [[ https://wikitech.wikimedia.org/wiki/MediaWiki_JobQueue | Jobqueue overview ]] is also useful, but has a more general audience in mind. In general, how and when jobs are run is not clearly understood across SRE and much of P&T, and this documentation should work to explain the mw-jobrunner's part in this.
    • Task
    The documentation on the Commons:VideoCutTool wiki page appears to be outdated and may no longer reflect the current status or capabilities of the tool. This could confuse users who are trying to understand or use the tool. https://commons.wikimedia.org/wiki/Commons_talk:VideoCutTool
    • Task
    Placeholder task for exploring ways to scale Codex-related questions and support requests, potentially through the use of generative AI tools like domain-specific chatbots.
    • Task
    We have this page: [[https://www.mediawiki.org/wiki/Manual:How_to_debug/Login_problems|Manual:How to debug/Login problems]] but it's confusing because it's trying to be a mix of "how to debug login problems on Wikimedia wikis" and "how to debug login problems on my own wiki". And now it's also a bit outdated due to SUL3. We should update and split it to help people who are experiencing logout problems and are willing to give us detailed reports about it but aren't sure how.
    • Task
    There's no documentation whatsoever on how to add links to the upper right, such as a donate link to the left of the create account link. I have checked https://www.mediawiki.org/wiki/Manual:Interface/Sidebar, but that seems to cover only the sidebar, which is not what I want. I have also checked https://www.mediawiki.org/wiki/Manual:Hooks/SkinTemplateNavigation::Universal, but there is only example code for adding links to places other than the one I want. Example code is really needed for adding a link in the upper right corner next to the create account link, such as the donate link that the Wikipedia and MediaWiki websites have.
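    Such documentation might include a sketch along these lines. This is an unverified, assumption-laden example for LocalSettings.php: the `SkinTemplateNavigation::Universal` hook exists, but whether the upper-right personal tools are grouped under the `user-menu` key depends on the MediaWiki version and skin, and the donate URL and `pt-donate` id are placeholders:

```php
// LocalSettings.php sketch (unverified): add a "Donate" entry to the
// personal tools in the upper right via SkinTemplateNavigation::Universal.
$wgHooks['SkinTemplateNavigation::Universal'][] = static function (
	$skinTemplate, array &$links
) {
	// Assumption: recent skins group the upper-right personal tools
	// under the 'user-menu' key; older skins may use a different key.
	$links['user-menu']['donate'] = [
		'text' => 'Donate',
		'href' => 'https://donate.wikimedia.org/', // placeholder URL
		'id' => 'pt-donate', // placeholder id
	];
};
```

    If something like this is confirmed correct for current MediaWiki, it would be a good candidate for the examples section of Manual:Hooks/SkinTemplateNavigation::Universal.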
    • Task
    ## Summary Currently there is no user-facing documentation about the email auth verification code. In this task, let's create a public-facing help page (mw:Help:Extension:EmailAuth) and link to it from the verification code input page. ## Background Users may need additional context to understand the verification code workflow, to know how to find help if the code does not work, or to know what to do if they no longer have access to their email. ## User story As a user logging in, I want a link to a help page if I get stuck at the verification code step. ## Technical notes Update the auth form change hook to include a link to mw:Help:Extension:EmailAuth ## Acceptance criteria - [ ] mw:Help:Extension:EmailAuth exists and has instructions to help when 1) the token is not working or 2) the user no longer has access to their email address - [ ] The token input form links to mw:Help:Extension:EmailAuth
    • Task
    Follow up to {T390209} Deployment-prep runs a local CDN using the `role::cache::text` and `role::cache::upload` Puppet config. This is a complex stack that is mysterious to debug for a random person trying to help keep the Beta Cluster alive. Some docs on how to check each layer of HAproxy, Varnish, and ATS would be very helpful.
    • Task
    In Beta Commons, Cat-a-lot Settings doesn't work. It is broken both when the code is executed as a gadget and when it is executed as a user script. **What happens** # Enable Cat-a-lot from [[ https://commons.wikimedia.beta.wmflabs.org/wiki/Special:Preferences#mw-prefsection-gadgets|preferences ]] OR [[https://commons.wikimedia.beta.wmflabs.org/wiki/special:mypage| special:mypage/common.js]] # Go to the [[ https://commons.wikimedia.beta.wmflabs.org/w/index.php?title=Category:Test_categories | category page ]] # Open the Cat-a-lot UI from the bottom-right corner # Click the Cat-a-lot Preferences link in the bottom-right corner # Nothing happens **What should have happened instead?**: # The Settings dialog should open. **What should be done** **Step 1** - Debug why Settings doesn't work and document the reason as a comment on this ticket. No error is thrown to the JavaScript console, so the code itself needs to be inspected to determine why the button doesn't work. **Step 2** - Fix the problem when Cat-a-lot has been loaded as a user script. Cat-a-lot should be able to: - open the preferences dialog - save settings temporarily - save settings permanently to the user page You can check how the Preferences dialog is supposed to work by trying out Cat-a-lot on commons.wikimedia.org.
    • Task
    https://www.mediawiki.org/wiki/Manual:Developing_extensions#Structure states: "Good practice is to add a README file with basic info about how to install and configure the extension. You can use either plain text or Phabricator markup syntax. For example, see the Phabricator Diffusion page for the Extension:Page Forms. If markdown is used, add the file extension .md. For example, see the README.md file for Parsoid on Phabricator Diffusion." Instead of offering three options (plain text, Phabricator markup, and Markdown), choose one recommended option, and update the docs.
    • Task
    ## Background Currently, the Codex site provides instructions for [[ https://doc.wikimedia.org/codex/main/using-codex/usage.html | usage of Codex ]] focused on developers, but not for designers working with the Codex Figma library. Some guidance for designers exists on an [[ https://www.figma.com/design/KoDuJMadWBXtsOtzGS4134/Codex?node-id=9463-85611&t=zyR40jHFRiTUfkav-11 | introduction page within the Figma library ]], while the rest of the design guidelines are on the Codex site. This task aims to centralize all design guidelines on the Codex site for easier access and consistency. Instructions related to the Codex Figma library that could live on the Codex site: - Enabling the library (for all types of users, T370675) - Reusing the Figma variables and styles - Reusing components and assets - Working with modes: Light/Dark modes and Size modes (the latter relates to how to create mocks for desktop/tablet/mobile) ### Acceptance criteria (or Done) [] Decide if we want to include these instructions for designers on the Codex site [] Decide where these instructions would live in Codex [] Document and include the sections listed above
    • Task
    ### Description ### Conditions of acceptance Add the following endpoints to all MediaWiki REST API documentation solutions * /transform/wikitext/to/lint * /transform/wikitext/to/lint/{title} * /transform/wikitext/to/lint/{title}/{revision} Update MediaWiki documentation * Under the "Convert HTML to Wikitext" section, add a section for "Check Wikitext for errors" * **//Endpoint description//**: Parse the supplied wikitext and check it for lint errors. * **//Method//**: `POST` * **//Parameters//**: See those documented in [RESTbase documentation](https://en.wikipedia.org/api/rest_v1/#/Transforms/post_transform_wikitext_to_lint) * **//Payload//**: [Transform request body](https://www.mediawiki.org/wiki/API:REST_API/Reference#Transform_response_body) with `source` * **//Returns//**: Lint errors, if any exist. * Create a new response body example that reflects the expected lint error format. * Increment the MediaWiki version number to v1.44 for the transform section Update API Portal documentation (?) * Decision: We will NOT update the API Portal at this time. It is not an 'officially' launched or supported product. * NOTE: Other transform endpoints are not currently in the API Portal --> Is there an intentional reason for that? NOTE: Per Tech Writer recommendation, we need to keep the on-wiki reference documentation up to date as we create new endpoints. Eventually we will transition to making the REST Sandbox THE source of truth for REST API reference documentation and will remove/reroute the [[ https://www.mediawiki.org/wiki/API:REST_API/Reference# | existing reference docs ]]. However, until we do that, both must be kept in sync.
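    To make the new "Check Wikitext for errors" reference section concrete, an illustrative client snippet could accompany it. The following Python sketch only builds the POST request (no network call is made); the host, the `wikitext` payload field name, and the exact endpoint shape are assumptions to be verified against the RESTbase documentation linked above:

```python
import json
import urllib.request

# Sketch only: the endpoint path comes from this task; the host and the
# payload field name ('wikitext') are assumptions to confirm against the
# RESTbase docs before publishing.
host = "https://en.wikipedia.org"
title = "Sandbox"
url = f"{host}/api/rest_v1/transform/wikitext/to/lint/{title}"

payload = {"wikitext": "<b>unclosed bold"}  # wikitext with a lint error
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(req.get_method())  # -> POST
print(req.full_url)
```

    A documented example response would then show the list of lint errors RESTbase returns for the unclosed tag.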
    • Task
    In order to use dependency injection in jobs (T245900), we need to add constructor parameters. However, the `GenericParameterJob` interface defines a constructor with a single parameter (`array $params`), and therefore it cannot be overridden in subclasses. So, a job cannot be converted to DI unless we drop the `GenericParameterJob` from it first. However, this creates a situation where some (newer-style) jobs implement the `GenericParameterJob` interface and others don't. Keeping in mind the eventual goal of dropping the interface altogether, it should be made clear what the preferred approach is for the time being. For example, the interface might be made optional in favour of the `needsPage: false` config option. This already seems to be the case now, but I don't know to what extent. The interface documentation (and possibly the relevant pages on mw.org) should state this.
    • Task
    Given that most Analytics stuff is hosted on Wikitech and that's where most 'server admin logs' are, it should probably be moved. https://www.mediawiki.org/wiki/Analytics/Server_Admin_Log These edits are performed by https://www.mediawiki.org/wiki/User:Analyticslogbot
    • Task
    The [[https://commons.wikimedia.org/wiki/Help:Gadget-Cat-a-lot|Cat-a-lot]] gadget on Wikimedia Commons currently lacks detailed technical documentation for setting up its local development environment. This gap makes it difficult for new contributors to develop, test, and debug the gadget locally. **Goal:** Develop comprehensive documentation that includes: - Creation of pages related to Cat-a-lot - Testing procedures - Clearing the local cache - Debugging guidelines **Current status:** An initial draft of the developer documentation is available at: * https://commons.wikimedia.org/wiki/Help:Gadget-Cat-a-lot/Local_Development **Next steps:** [ ] Review and feedback from current Wikimedia Commons interface editors [ ] Seek reviews from contributors experienced in Cat-a-lot’s local development setup
    • Task
    The [[https://commons.wikimedia.org/wiki/Help:Gadget-Cat-a-lot|Cat-a-lot]] gadget on Wikimedia Commons currently lacks technical documentation for developers. This makes it challenging for new contributors to understand how to modify, test, and deploy changes to the gadget. **Goal:** Create comprehensive developer documentation that covers: - Development environment setup - Code modification workflow - Testing procedures - Deployment process - Best practices **Current status:** I have created an initial draft of the developer documentation at * https://commons.wikimedia.org/wiki/Help:Gadget-Cat-a-lot/Developing **Next steps:** [ ] Review and feedback from current Wikimedia Commons interface editors [ ] Review and feedback from new contributors
    • Task
    ## Background We recently replaced all images representing components with demos on the component pages in Codex to illustrate examples. Additionally, we updated the [[ https://doc.wikimedia.org/codex/latest/style-guide/using-links-and-buttons.html | Style Guide > Using Links and Buttons ]] to follow this new format. Now, we need to decide whether to update the remaining Style Guide pages by removing images and summarizing the Do's and Don'ts or keep images for certain pages where they may be necessary for clarity. These are the pages in the Style Guide section: **Style Guide** - Overview: //no images or demos needed// **Design Principles** - Overview: The images here don't represent components, so we can keep them. - Accessibility: //no images or demos needed// - Bidirectionality: We need to decide if we want to replace the images with demos. We will need to update the Do-Don't recommendations to the new format. **Visual Styles** - Colors: The images here don't represent components, so we can keep them. - Typography: The images here don't represent components, so we can keep them. However, we will need to update the Do-Don't example to the new format. - Icons: The images here don't represent components, so we can keep them. - Images: The images here don't represent components, so we can keep them. - Illustrations: The images here don't represent components, so we can keep them. **Layout Guidelines** - Content overflow: We need to decide if we want to replace the images with demos. We will need to update the Do-Don't recommendations to the new format. - Using links and buttons: Currently using demos to represent the examples. - Constructing forms: We need to decide if we want to replace the images with demos. It will be difficult in this case to create demos that clearly illustrate these forms. **Content Guidelines** - Voice and tone: We need to decide if we want to replace the images with demos. 
We will need to update the Do-Don't recommendations to the new format. - Writing for copy: We need to decide if we want to replace the images with demos. We will need to update the Do-Don't recommendations to the new format. - Additional resources: //no images or demos needed// **Platforms** - Wikipedia Apps: images are ok in this page since they cannot be replaced with demos. ### Acceptance criteria (or Done) [] Decide which pages in the Style Guide section should be updated to the new format [] Update them accordingly
    • Task
    {T385803} highlighted a low amount of institutional knowledge about manual actions typically needed following reboots of deployment-prep instances. We should invest time and energy in attempts to improve this. One starting point could be adding documentation discoverable from https://wikitech.wikimedia.org/wiki/Nova_Resource:Deployment-prep on known manual needs like: * [[https://wikitech.wikimedia.org/wiki/Nova_Resource:Deployment-prep/Databases#Read_only|enabling read-write mode on MariaDB servers]] * [[https://wikitech.wikimedia.org/wiki/Nova_Resource:Deployment-prep/Help#Secrets|arming keyholder on the deploy server]] * ...
    • Task
    == Background During the January 2025 Sprinthackular, the "Choose a component" tool was created to guide users through a survey to help them identify a component they might be looking for. === Goal We should analyze this tool and determine if it is something we would want to refine and add to the docs site. === Considerations - The [[ https://gerrit.wikimedia.org/r/c/design/codex/+/1115105 | patch ]] with the [[ https://1115105--wikimedia-codex.netlify.app/components/component-wizard.html | demo ]] was written during a hackathon and should be cleaned up before being considered for code review, even as an MVP. @SToyofuku-WMF can be consulted from an engineering perspective. - We should consider the maintenance factor of a tool like this: when we add a new component or revise specific guidance for an existing component, the tool would need to be updated. - In the future, we could also consider a more visual version that might be more helpful, but new component variants might be needed for that concept, such as a selectable Card. === Acceptance criteria [] Discuss and decide if this tool should belong on the Codex docs site [] If so, clean up the existing code to get the current patch production-ready for a proper code review
    • Task
    - See https://www.mediawiki.org/wiki/Selenium/How-to/Debug_with_browser.debug()
    • Task
    If Babel's Community Configuration integration is enabled but the migrateConfigToCommunity migration script is not executed, and someone then edits Babel's configuration on-wiki, the on-wiki configuration should most likely be honoured, regardless of what is in the server configuration. This is because the on-wiki configuration is what is actually used (and what community members would see). In this task, we should make the migration script not attempt to override an already existing configuration page.