    • Task
Steps to Reproduce: Run Special:ArticlesHome
Actual Results: Internal Error
Expected Results:
Suggestion: Change the typo cahce to cache
    • Task
When pages with many thumbnails, like https://commons.wikimedia.org/w/index.php?title=Category:Media_needing_categories_as_of_26_August_2018&filefrom=Starr-100901-8896-Dubautia+linearis-habitat-Kanaio+Natural+Area+Reserve-Maui+%2824419916404%29.jpg%0AStarr-100901-8896-Dubautia+linearis-habitat-Kanaio+Natural+Area+Reserve-Maui+%2824419916404%29.jpg#mw-category-media and https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Collection/Bavarian_State_Painting_Collections/18th_Century, haven't been visited for a while, you don't get all the thumbnails. A lot of the thumbnails will fail with an error like:
Request from 62.251.20.116 via cp3061 cp3061, Varnish XID 971825121 Upstream caches: cp3061 int Error: 429, Too Many Requests at Wed, 21 Oct 2020 16:03:11 GMT
    • Task
    Use `dev` branch to QA our current work
    • Task
    The client ID for API gateway requests seems to be always NULL, even if a request included OAuth 2.0 information.
    • Task
    FR Tech, We have received notice that there are regulatory changes that have started the blocking of anonymous prepaid cards to guard against money laundering. In order to avoid inordinate inquiries around these blocked bins, is there a way we can message or block these bins to avoid an inquiry load on DS? The list is attached. {F32410311}
    • Task
This task is a prerequisite for storing search filters in URL params (T261537), because having filter params implies that they will be recognized immediately if the user were to copy and paste the URL into a new browser or otherwise do a hard refresh. Initial results are fetched on the server and we have no guarantee that users will have JS enabled at all, so search filter URLs must be handled in PHP as well as JS.
=== Acceptance Criteria ===
- [ ] Users who arrive at Special:MediaSearch with valid filter params in the URL (like `?q=tree&mimeType=tiff`, etc.) must get filtered results back from the initial batch of hits, prior to JS initialization
- [ ] The no-JS UI must provide a way for the user to clear filters that have been set via URL param
    • Task
    We've done these tests in the past and made fixes but @cscott felt that we could do one final round of testing after tweaking and dusting off all patches before merge and deploy.
    • Task
    For starters, we are going to only look at a per-wiki configuration flag.
    • Task
prometheus-openldap-exporter currently uses Python 2, but porting it to Python 3 isn't straightforward: it uses ldaptor, which was never ported to Python 3 and has been removed in Bullseye.
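For reference, a minimal sketch of what a Python 3 rewrite might look like, assuming the exporter mainly reads counters from the cn=Monitor backend and that the ldap3 library (which supports Python 3) is an acceptable replacement for ldaptor:
```lang=python
from ldap3 import ALL, Connection, Server


def monitor_counters(uri: str) -> dict:
    """Read OpenLDAP cn=Monitor counters with ldap3 instead of ldaptor."""
    # Anonymous bind for the sketch; production would need whatever
    # ACLs/credentials the current exporter uses to reach cn=Monitor.
    conn = Connection(Server(uri, get_info=ALL), auto_bind=True)
    conn.search('cn=Monitor', '(monitorCounter=*)', attributes=['monitorCounter'])
    return {entry.entry_dn: int(entry.monitorCounter.value) for entry in conn.entries}


print(monitor_counters('ldap://localhost:389'))
```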
    • Task
    As noted in the parent task, Parsoid plans to change the <figure-inline> tag that currently wraps media images (see current spec: https://www.mediawiki.org/wiki/Specs/HTML/2.1.0#Media) to instances of <span>. Mobileapps relies on <figure-inline> in several places, so these will need to be updated accordingly.
    • Task
    While looking into T263999, we realized that the `TermLookup`/`PrefetchingTermLookup` and `LabelLookup`/`LabelDescriptionLookup` interfaces don’t seem to clearly specify whether these services apply language fallbacks or not. We should make that clearer in their documentation, maybe even renaming them.
    • Task
    As part of the project to use Parsoid everywhere, we are bridging HTML output differences between Parsoid and core parser. One of these is in the HTML generated for media output. As part of {T251641}, where we initially wanted to switch core output to <figure-inline> tag for inline media, after TechCom review and discussion, we decided to switch Parsoid back to <span> tag instead. The change in Parsoid will come with a version bump as well. Before we roll out this change in Parsoid, we need all Parsoid clients to be prepared to handle either <span> or <figure-inline> tags for a brief period. Same with any necessary CSS selectors. Presumably, this will be a simple change. I've created a common task and tagged all known Parsoid clients. Feel free to create a specific subtask for your own client, if necessary.
    • Task
As described in T266132, the site https://analytics.wikimedia.org/ is less readable and less friendly on smartphones. On the other hand, the site is more readable on tablets like an iPad. However, navigability is an issue on both smartphones and tablets. When using any browser on a mobile device, like Chrome or Firefox, a stats graph does not respond well to your fingers. I tried to read the graph by moving my finger left and right, but the graph itself just moves left and right. That is, I could not see specific monthly (or daily) stats while using my finger(s) on the touchscreen. When using Safari on an iPhone or iPad, there are issues described at T266122. If you tap either https://analytics.wikimedia.org/dashboards/browsers/ or https://analytics.wikimedia.org/dashboards/vital-signs/ in Safari on an iPhone or iPad, the navigability issues with the dashboards still persist. I'm unsure how the site works on a hybrid tablet like the Microsoft Surface Book; if anyone here uses one, testing the site on it would be nice.
    • Task
    While looking into T263999, we realized that both `CachingFallbackLabelDescriptionLookup` and `CachingPrefetchingTermLookup` use the `TermCacheKeyBuilder` trait to manage cache keys. However, it seems that the former should apply language fallbacks and the latter should not. We want to make sure that if that’s the case, they shouldn’t share the same cache keys.
    • Task
    HTML entities were replaced by the Unicode characters in these edits made with DiscussionTools: * https://he.wikipedia.org/?diff=29625893 – `&#x2126;` * https://sv.wikipedia.org/?diff=48324635 – `&#x2001;` In both diffs, there are other entities that were not affected (even on the same line of text).
    • Task
    IPInfo uses the geoip2 library, which will be added in T174553, following a security readiness review: T262963. Until then Phan CI fails, blocking new commits to the IPInfo repo, so we either need to disable Phan temporarily or use composer in CI temporarily. We should reinstate the original config once geoip2 is available in mediawiki/vendor.
    • Task
    {F32410276} For example in https://wikidata.beta.wmflabs.org/wiki/Special:Version?useskin=timeless Introduced after 1.36.0-wmf.14
    • Task
    Write draft reply on the legislative proposal
    • Task
    Invite Wikidata and Wikimedia community to give input in a virtual meeting
    • Task
The site https://analytics.wikimedia.org looks awkward on smartphones.
# Use any Android smartphone
# Open any supported browser, like Chrome, Firefox, or a supported Chromium-based browser
# Go to https://analytics.wikimedia.org/
# Examine any page, like the main page of the site or any dashboard.
# Switch Portrait and Landscape views back and forth
## In Portrait orientation, the tabs or text scramble all over
## In Landscape, the site looks better than in Portrait.
## However, the mobile user experience still leaves something to be desired.
# When on a dashboard page, the graph window on the page becomes smaller and then harder to see and navigate.
# Examine all other pages, especially for readability.
# Repeat the steps on any iPhone
# On an iPhone, use the same browser that you used on the Android device.
The site on Safari has issues as described at T266122 when using an iPhone. If using Safari on an iPhone, tap either https://analytics.wikimedia.org/dashboards/browsers/ or https://analytics.wikimedia.org/dashboards/vital-signs/ to test out a dashboard; both still load well on Safari but have the same issues as described for any other browser.
    • Task
    Quickly read through the proposal and identify important parts to reply on.
    • Task
We are receiving alarms like "Throughput of EventLogging EventError events is CRITICAL" because an external wiki is sending bursts of malformed events. The user agent used is "Fuzz Faster U Fool v1.2.0-git". Those events are already filtered out at Refine time, so they do not represent a danger to data integrity. However, they fail verification and are forwarded to the EventLogging_EventError topic, which generates the mentioned alarms.
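One possible mitigation sketch, with the caveat that the event field name and the exact alerting path are assumptions rather than how the Refine/alarm code is actually structured: exclude known fuzzer user agents before counting events toward the EventError throughput alarm.
```lang=python
# Hypothetical filter; 'userAgent' as the field name is an assumption.
FUZZER_MARKERS = ('Fuzz Faster U Fool',)


def counts_toward_alarm(event: dict) -> bool:
    """Ignore known fuzzer traffic when computing EventError throughput."""
    user_agent = event.get('userAgent', '')
    return not any(marker in user_agent for marker in FUZZER_MARKERS)


assert not counts_toward_alarm({'userAgent': 'Fuzz Faster U Fool v1.2.0-git'})
```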
    • Task
```name=Error message
Invariant failed: Bad UTF-8 (full string verification)
```
##### Notes
The bad UTF-8 can also be seen directly in the source for: https://ja.wikipedia.org/wiki/Wikipedia:%E5%89%8A%E9%99%A4%E8%A8%98%E9%8C%B2/%E9%81%8E%E5%8E%BB%E3%83%AD%E3%82%B0_2004%E5%B9%B411%E6%9C%88
This task is forked from T237467, which ended up being an issue with `Language::commafy` generating bad UTF-8. In contrast, in this task the bad UTF-8 is coming directly from the DB. As described in T237467#6566785, we need the following mitigations:
1. Bad UTF-8 is not supposed to make it past PST to get stored in the DB in the first place. So we need to track down how it got in there and clean it up; also perhaps cleaning up other articles that managed to get saved with bad UTF-8.
2. Fix core to plug this hole so that bad UTF-8 is not stored in the DB.
3. Validate wikitext source we get from the DB and fix up bad UTF-8 we get, downgrading this from a crasher to a warning. (The assertion is still appropriate if we encounter bad UTF-8 later, since that would be generated by Parsoid from valid inputs; but Parsoid operates under the assumption that all of its inputs are valid.)
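As an illustration of mitigation 3 only (Parsoid itself is not Python; this is just a sketch of the validate-and-downgrade idea), invalid byte sequences can be detected on read and replaced with U+FFFD while emitting a warning instead of crashing:
```lang=python
import warnings


def sanitize_wikitext(raw: bytes) -> str:
    """Decode wikitext bytes, downgrading bad UTF-8 from a crasher to a warning."""
    try:
        return raw.decode('utf-8')
    except UnicodeDecodeError as err:
        warnings.warn(f'Bad UTF-8 at byte {err.start}: {err.reason}; replacing it')
        return raw.decode('utf-8', errors='replace')


# b'\xe3\x81' is a truncated multi-byte sequence, as might come out of the DB.
print(sanitize_wikitext('削除記録 '.encode() + b'\xe3\x81'))
```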
    • Task
    We have two versions of the use of funds page, depending on the viewport width. This task is done after implementing the version for mobile banners.
    • Task
For mobile banners, we also want to use the new design of the overlay that explains how we use our funds.
**Acceptance Criteria**
* The design is implemented as shown in the [Figma file](https://www.figma.com/file/SoenfJJvbjeHgteEYjN0KC/wikimedia-mittelverwendung-final-02-copy-6?node-id=0%3A3).
* In the mobile device view,
  * the bar chart does not have headers.
  * the cost areas are listed below the chart, each of them is expandable by clicking/tapping on it.
**Implementation notes**
* We should put this into a different commit than {T266114}, so we can cherry-pick during mobile banner development.
    • Task
Implementation notes
* Markup and styles can likely be copied
    • Task
As far as #DBA s have been told, this was an old table of mediawiki core dropped in MW 1.35: * https://gerrit.wikimedia.org/r/c/mediawiki/core/+/545308/ * T231366 While the task was resolved, data was not removed from production. This is particularly interesting, because any data would have been lost as I believe it was created with the MEMORY engine (data lost on every restart). This has been brought to my attention while running CHECK TABLES on source backup hosts (T265866), as those tables on the MEMORY/HEAP engine cannot be checked.
    • Task
We're getting failmail on the back of errors inserting duplicate records. We need to work out why we're not skipping these. //donations_queue_consume-20201021-015401.log.civi1001.bz2// contains:
```
2020-10-21 01:54:06,161 ERROR WD wmf_common: DUPLICATE_CONTRIBUTION Contribution already exists. [error]
2020-10-21 01:54:06,161 ERROR Ignoring message.
2020-10-21 01:54:06,161 ERROR Source: array (
2020-10-21 01:54:06,161 ERROR )
2020-10-21 01:54:06,161 ERROR WD wmf_common: Aborting DB transaction. [info]
2020-10-21 01:54:06,161 ERROR WD wmf_common: Failure while processing message: [error]
2020-10-21 01:54:06,162 ERROR DUPLICATE_CONTRIBUTION Contribution already exists. Ignoring message.
2020-10-21 01:54:06,162 ERROR Source: array (
2020-10-21 01:54:06,162 ERROR )
2020-10-21 01:54:06,162 ERROR WD wmf_common: Dropping message altogether: amazon-93090454-1 [error]
```
and then an unhandled error is thrown later on:
```
2020-10-21 01:54:06,165 ERROR WD wmf_common: Beginning DB transaction [info]
2020-10-21 01:54:06,196 ERROR WD wmf_common: Aborting DB transaction. [info]
2020-10-21 01:54:06,198 ERROR WD wmf_common: UNHANDLED ERROR. Halting dequeue loop. Exception: [error]
2020-10-21 01:54:06,198 ERROR Value already exists in the database
2020-10-21 01:54:06,198 ERROR Stack Trace: #0
2020-10-21 01:54:06,198 ERROR /srv/org.wikimedia.civicrm/sites/all/modules/wmf_civicrm/wmf_civicrm.module(84):
2020-10-21 01:54:06,198 ERROR civicrm_api3('OptionValue', 'create', Array)
2020-10-21 01:54:06,198 ERROR #1
2020-10-21 01:54:06,198 ERROR /srv/org.wikimedia.civicrm/sites/all/modules/wmf_civicrm/wmf_civicrm.module(2151):
2020-10-21 01:54:06,199 ERROR wmf_civicrm_ensure_option_value_exists('appeal_20080709...',
2020-10-21 01:54:06,199 ERROR 'endowment2020')
```
    • Task
Steps to Reproduce:
1. Open https://en.wikipedia.org/wiki/Wikipedia?banner=trilogy_dsk_p1_lg_temp_wydg_modal5k&country=US
2. Select any amount to donate
3. Select Visa or Paypal payment method
4. Click Continue
5. Click Back
6. Observe missing Maybe later link {F32410116}
Actual Results: Currently the Maybe later link disappears after clicking the back button on the monthly convert step
Expected Results: All links should remain visible after clicking back
    • Task
I was testing the website https://analytics.wikimedia.org via the latest versions of Safari and Chrome on an iPhone 5s (iOS 12.4.8) and on iPadOS 14.1 (the latest version for supported iPads). It seems that, in the iOS version of Safari, the website's UI does not respond to touch interactions with the collapsed tabs "Download" and "Dashboard". Nonetheless, I can still tap on the "Contact" tab, which is not collapsible. Split from T266071
    • Task
Hello, Filing as advised by billinghurst. Apparently we have a new feature called watchlist expiration, and the permanent button seems annoying to me on Meta, not in content projects. Is there any possibility of a button I can press in, say, Beta/Watchlist/Gadgets to permanently hide the watchlist-duration option? I know there are some CSS options, but can we work it into the MediaWiki software? Link: https://meta.wikimedia.org/w/index.php?title=Meta:Babel&diff=20556707&oldid=20556289#Watchlist_expiration Thanks CM
    • Task
All banners need to use this year's campaign parameter values. We need to change the name of the included "page", so that all upcoming banners use it.
Notes:
* The file `webpack.production.js` needs to be changed.
* Copy current values to `global_banner_settings.js`
* Delete the file `wikipedia_org_js_banner_values.hbs`.
    • Task
    There are currently 17 mysql configs that set `event_scheduler`. One of them sets the value depending on the hostname (`modules/profile/templates/mariadb/mysqld_config/db_inventory.my.cnf.erb`). This should be a parameter to `mariadb::config`, so it can be set via hiera.
    • Task
We currently use a mix of setups without swap (caches, mw servers) and with swap (rest of the fleet). We should deep-dive into whether:
- this makes sense for the servers currently using swap
- we should avoid swap partitions in favour of swap files (if we retain them)
- our swap-related sysfs settings need review
    • Task
During the batch upload project [[ https://commons.wikimedia.org/wiki/Commons:UK_legislation_project | UK Legislation ]], a significant number of files are rejected by the API with 'chunk-too-small'. Is there a work-around or a fix that could be applied for this Pywikibot-based mass upload? This error has not been a problem for image mimetype uploads, but appears quite likely for document mimetypes. A possible workaround sketch is shown below the list.
* Example un-uploadable files:
*# [[ https://www.legislation.gov.uk/ukpga/Edw7/7/17/contents/enacted | Edw7-7-17 page ]]/[[ http://www.legislation.gov.uk/ukpga/1907/17/pdfs/ukpga_19070017_en.pdf | pdf link ]]
*# [[ https://www.legislation.gov.uk/uksi/1965/1559/made | UKSI 1965-1559 page ]]/[[ https://www.legislation.gov.uk/uksi/1965/1559/pdfs/uksi_19651559_en.pdf | pdf link ]]
* This may be related to T132676.
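A possible workaround sketch, assuming the Pywikibot `APISite.upload()` call still accepts a `chunk_size` argument; the target file name is just an example taken from the list above:
```lang=python
import pywikibot

site = pywikibot.Site('commons', 'commons')
page = pywikibot.FilePage(site, 'File:Ukpga 19070017 en.pdf')  # example target name

# chunk_size=0 asks Pywikibot to send the file in a single request,
# sidestepping the chunked-upload path that raises 'chunk-too-small'.
site.upload(page,
            source_filename='ukpga_19070017_en.pdf',
            comment='UK legislation batch upload',
            chunk_size=0)
```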
    • Task
There is a new design of the page explaining what we use the donations we collect for. We already introduced the banner version of it and want to change this in the donation application, too. The design and behaviour is mostly the same as in the banner version.
**Acceptance Criteria**
* There are two versions of the content depending on the viewport width:
  * [Version for mobile devices](https://www.figma.com/file/SoenfJJvbjeHgteEYjN0KC/wikimedia-mittelverwendung-final-02-copy-6?node-id=0%3A3)
  * [Version for desktop devices](https://www.figma.com/file/SoenfJJvbjeHgteEYjN0KC/wikimedia-mittelverwendung-final-02-copy-6?node-id=0%3A184)
* In the desktop device view, interacting with the bar chart (hovering, tapping) changes the text below the bar chart.
* In the mobile device view,
  * the bar chart does not have headers.
  * the cost areas are listed below the chart, each of them is expandable by clicking/tapping on it.
    • Task
    When creating an account, the inclusion of an email address is optional but there is no warning that without an email address there is no way to recover the account. Arguably, a new user does not have a clue that they could lose access to the account permanently without an email attached to it. Every now and then the Trust and Safety team also receives such PW reset requests. So I wonder if we can add a warning message similar to what we added to the ‘Password’ field so that this confusion can be rectified. E.g. “An email address is required if you ever need to reset your password.”
    • Task
There are some changes we want to make to the new design of our use of funds info overlay before including it in all campaign banners.
**Acceptance Criteria**
* Links to the sources we took the annual revenue from are changed to:
  * [Amazon](https://www.statista.com/statistics/266282/annual-net-revenue-of-amazoncom/)
  * [Facebook](https://www.statista.com/statistics/268604/annual-revenue-of-facebook/)
  * [Google](https://www.statista.com/statistics/266206/googles-annual-global-revenue/)
* The text before the link to the WMF's annual plan is changed to "Einen genauen Einblick in unsere internationalen Aktivitäten finden Sie hier:"
* The text before the link to WMDE's annual plan is changed to "Einen genauen Einblick in unsere Aktivitäten in Deutschland finden Sie hier:"
* The titles of the cost areas in the bar chart are changed as follows:
  * Technik -> Software
  * Internationales -> International und Technik
* The paragraph describing Software is changed to: "Aktuelle Daten der Welt sind das Rückgrat von Wikipedia & Co. Damit alle Wikipedia-Sprachversionen einfach auf einen gemeinsamen Datenpool zugreifen können, entwickeln unsere Softwareentwicklerinnen und -entwickler in Deutschland und weltweit eine sichere und verlässliche technische Infrastruktur. Damit wird die Ergänzung und Aktualisierung von Wikipedia-Artikeln mit neuesten Daten enorm erleichtert."
**Notes**
* The links to the annual plans of the two organizations are not final. We'd need to redeploy all banners once we have them, so we might want to discuss extracting them to a central location.
* We might want to try to make this look good on viewport widths of 600px and above.
    • Task
Monthly check-in with Opad
[] Prepare Agenda
[] Send out reminder with agenda
[] Hold meeting
[] Create any follow-up tasks
[] Create task for next meeting
[] Send out calendar invite for next meeting
    • Task
In order to test syntax highlighting, we need to pull in a couple more articles and all of their templates (incl. nested ones). Each language test instance will get an article in that language. We also need some for other testing (VE, TemplateData).
EN:
https://en.wikipedia.org/wiki/MediaWiki
https://en.wikipedia.org/wiki/The_Wizard_of_Oz_(1939_film)
DE:
https://de.wikipedia.org/wiki/Dark_(Fernsehserie)
https://de.wikipedia.org/wiki/Stromberg_(Fernsehserie)
https://de.wikipedia.org/wiki/Systemsprenger_(Film)
https://de.wikipedia.org/wiki/Das_wei%C3%9Fe_Band_%E2%80%93_Eine_deutsche_Kindergeschichte
https://de.wikipedia.org/wiki/Liste_der_Baudenkm%C3%A4ler_in_Schwelm
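One way to pull the pages together with their templates is Special:Export; a hedged sketch follows. Note that `templates=1` only pulls directly transcluded templates, so nested templates may need repeated passes or a different approach.
```lang=python
import requests

HEADERS = {'User-Agent': 'test-instance-content-importer/0.1 (example)'}


def export_with_templates(wiki: str, title: str) -> str:
    """Fetch an XML export of one page plus the templates it transcludes."""
    response = requests.get(
        f'https://{wiki}/wiki/Special:Export',
        params={'pages': title, 'templates': 1, 'curonly': 1},
        headers=HEADERS,
    )
    response.raise_for_status()
    return response.text


xml_dump = export_with_templates('en.wikipedia.org', 'MediaWiki')
print(len(xml_dump))
```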
    • Task
1. Edit a page which contains a bullet list. ([[https://fr.wikipedia.org/w/index.php?title=Festival_du_Troquet&oldid=143176912 | example on fr.wp]])
2. Place your cursor at the end of the list.
3. Press the Enter key twice to create a new paragraph after the list, then type some words.
Actual generated wikitext:
```lang=wikitext
* List item
* Last item
Added paragraph
```
Expected wikitext:
```lang=wikitext
* List item
* Last item

Added paragraph
```
Notes:
* If you create a new list, an empty new line is correctly added between the new list and the new paragraph.
* If you split a list, the same problem happens: no empty line is added, neither before nor after the added paragraph.
    • Task
    This task is for enabling the [Codehealth pipeline](https://www.mediawiki.org/wiki/Continuous_integration/Codehealth_Pipeline) with the research/mwaddlink repo.
    • Task
Steps to Reproduce: Using MW 1.35, edit a MW page via the API (I'm using mwbot for this), with AutoCreateCategoryPages enabled and MW display errors on.
Actual Results: I'm getting this message:
```
Warning: Use of undefined constant DB_SLAVE - assumed 'DB_SLAVE' (this will throw an Error in a future version of PHP) in /var/www/html/w/extensions/AutoCreateCategoryPages/AutoCreateCategoryPages.body.php on line 11
```
Expected Results: No error occurs.
What should we do: Replace the abandoned DB_SLAVE with DB_REPLICA.
    • Task
    Orchestrator supports auth via forwarded headers (https://github.com/openark/orchestrator/blob/master/docs/security.md). Ideally we can put it behind idp/cas.
    • Task
    The upcoming [[ https://gerrit.wikimedia.org/r/c/mediawiki/extensions/AbuseFilter/+/634032 | ChangeTagsManager ]] service loads tags reserved by abuse filters, including global filters, and caches the results for one minute. It uses different keys for enabled and disabled filters. The result is used (not exclusively) for hooks `ChangeTagsListActive` and `ListDefinedTags` called from core's `ChangeTags` class (which is likely to be split and deprecated soon), which wraps the result with its own cache (five minutes). When a filter using tags is changed, the caches have to be purged. `ChangeTagsManager` currently purges both abuse filter cache and core cache (`ChangeTags::purgeTagCacheAll`). However, it is only done on the same wiki. When a global filter with tags is changed, the caches are only purged on the central wiki. (So for example remote wikis are not aware of newly introduced tags of global filters for a while.) Moreover, we should consider caching tags of local and global filters separately. The global database is naturally queried more often, so its cache should perhaps be more persistent. (I think we already cache rules of global filters separately as they are run on many wikis.) One more idea: there are separate caches for enabled (active tags) and disabled (defined tags) filters. We could have only one cache where pairs `(tag, status)` would be cached.
    • Task
[X] New entry created in the [[ https://docs.google.com/document/d/1LdqapyOwl_exaz6AB3TMF6LRZob1BashTtRKkQ3cGx0/edit | reporting document ]] [AC]
[] Report reminder sent out [AC]
[] Text entries
  [] WMSE
  [] STTS
  [] KTH
[] Financial entries
  [] WMSE
  [] STTS
  [] KTH
[] Looked through by AC
[] Any issues followed up
[] Create [[https://phabricator.wikimedia.org/maniphest/task/edit/form/37/?title=Create+%3CMONTH%3E+2020+monthly+Wikispeech+report&project=WMSE-Wikispeech-Speech-Data-Collector-2019,User-LokalProfil&points=5&description=%5B%5D+New+entry+created+in+the+%5B%5B+https%3A%2F%2Fdocs.google.com%2Fdocument%2Fd%2F1LdqapyOwl_exaz6AB3TMF6LRZob1BashTtRKkQ3cGx0%2Fedit+%7C+reporting+document+%5D%5D+%5BAC%5D%0A%5B%5D+Report+reminder+sent+out+%5BAC%5D%0A%5B%5D+Text+entries%0A++%5B%5D+WMSE%0A++%5B%5D+STTS%0A++%5B%5D+KTH%0A%5B%5D+Financial+entries%0A++%5B%5D+WMSE%0A++%5B%5D+STTS%0A++%5B%5D+KTH%0A%5B%5D+Looked+through+by+AC%0A%5B%5D+Any+issues+followed+up%0A%5B%5D+Create+%5B%5B%3CXXXX%3E%7C%28deadline%29+task%5D%5D+for+next+month%2C+and+update+the+deadline-task+url+of+it|(deadline) task]] for next month, and update the deadline-task url of it
    • Task
Missing publish statement in .pipeline/config.yaml!
    • Task
We want to test the WMF's currently best performing banner text on English Wikipedia. The banners are based on **last year's control banner**.
**Acceptance Criteria**
* Both control and variant banner
  * show the progress bar.
  * include the [campaign parameter file for 2020](https://meta.wikimedia.org/wiki/MediaWiki:WMDE_Fundraising/Campaign_Parameters_2020).
* The variant banner contains the following banner text: "**To all our readers in Germany,** It might be awkward, but please don't scroll past this. This [day of the week], for the 1st time recently, we humbly ask you to defend Wikipedia's independence. 98% of our readers don't give; they simply look the other way. If you are an exceptional reader who has already donated, we sincerely thank you. **If you donate just € 5, Wikipedia could keep thriving for years.** Most people donate because Wikipedia is useful. If Wikipedia has given you € 5 worth of knowledge this year, take a minute to donate. Show the volunteers who bring you reliable, neutral information that their work matters. Thank you."
    • Task
    # Campaign | Preceded by | | Succeeded by | |---|---|---| | | [C20_WMDE_EN_Test_01](https://meta.wikimedia.org/w/index.php?title=Special:CentralNotice&subaction=noticeDetail&notice=C20_WMDE_EN_Test_01) | | ## Banners | Banner name | Respective page on meta-wiki | Link to example on production wiki | |---|---|---| | B20_WMDE_EN_Test_01_ctrl | [setup](https://meta.wikimedia.org/wiki/Special:CentralNoticeBanners/edit/B20_WMDE_EN_Test_01_ctrl) | [preview](https://de.wikipedia.org/?banner=B20_WMDE_EN_Test_01_ctrl) | | B20_WMDE_EN_Test_01_var | [setup](https://meta.wikimedia.org/wiki/Special:CentralNoticeBanners/edit/B20_WMDE_EN_Test_01_var) | [preview](https://de.wikipedia.org/?banner=B20_WMDE_EN_Test_01_var) | # Campaign settings ## General | Name | **C20_WMDE_EN_Test_01** | | Start Date | **2020-11-03** | | Start Time | **11:00 UTC** | | End Date | **2020-12-31** | | End Time | **23:00 UTC** | | Projects | Wikipedia | | Languages | **en - English** | | Geotargeted | Germany | | User bucketing | 2 | | Priority | **high** | | Limit traffic | **100%** | ## Extra features ### Legacy hiding and impression counting support | Set sample rate | {icon check} | | Sample rate | 1 | | Banners might not display | {icon check} | ### Impression diet | Identifier | **wmde-campaign-2020** | | Skip impressions | 0 | | Max impressions | 10 | | Wait time | 0 | ## Banner settings | Category | fundraising | | Display to | Anonymous users | | Display on | desktop |
    • Task
    After testing the introduction of reduced address data provision, we want to test the effect of asking a user which address data they want to provide in the banner form already. Banners are based on the **variant banner** of **desktop-de-08**. **Acceptance Criteria** * Both control and variant banner * show the progress bar. * include the [campaign parameter file for 2020](https://meta.wikimedia.org/wiki/MediaWiki:WMDE_Fundraising/Campaign_Parameters_2020). * contain the "campaign day sentence". * include the final version of the use of funds design. * The variant banner has a two-step form (similar to the variant of [desktop-de-02](https://de.wikipedia.org/?banner=B20_WMDE_02_var)). * If a user selects direct-debit, * only the full address option is available, other options are disabled. * the form shows a notice below the headline that explains why (direct debit notice). * The submit button label of the second page changes depending on the user's choice. * Both banners use the new address provision option version of the donation form. **Notes** * {T266114} should be finished before deploying these banners. **Labels** | Element | Label | | ------------------------------------------------------- | ----------------------------------------------------------- | | Second step headline | Möchten Sie Ihre Kontaktdaten angeben? | | Full address data label | Vollständige Kontaktdaten | | Full address data notice | (Für Spendenquittung per Post und Bestätigung per E-Mail) | | Only e-mail address label | Nur E-Mail-Adresse | | Only e-mail address notice | (Für Bestätigung per E-Mail) | | No address data | Gar keine Kontaktdaten | | Error message | Bitte wählen Sie aus, ob Sie Kontaktdaten angeben möchten. | | Submit button (no contact data & PayPal payment) | Weiter zu PayPal | | Submit button (no contact data & credit card payment) | Weiter zur Dateneingabe | | Submit button (no contact data & bank transfer payment) | Weiter zur Bankverbindung | | Direct debit notice | Für Lastschriften ist die Angabe eine Adresse erforderlich. | **Mockups** | form on the right | {F32409936, size=full} | | form below | {F32409937, size=full} |
    • Task
    # Campaign | Preceded by | | Succeeded by | |---|---|---| | T265251 | [C20_WMDE_Test_09](https://meta.wikimedia.org/w/index.php?title=Special:CentralNotice&subaction=noticeDetail&notice=C20_WMDE_Test_09) | | ## Banners | Banner name | Respective page on meta-wiki | Link to example on production wiki | |---|---|---| | B20_WMDE_Test_09_ctrl | [setup](https://meta.wikimedia.org/wiki/Special:CentralNoticeBanners/edit/B20_WMDE_Test_09_ctrl) | [preview](https://de.wikipedia.org/?banner=B20_WMDE_Test_09_ctrl) | | B20_WMDE_Test_09_var | [setup](https://meta.wikimedia.org/wiki/Special:CentralNoticeBanners/edit/B20_WMDE_Test_09_var) | [preview](https://de.wikipedia.org/?banner=B20_WMDE_Test_09_var) | # Campaign settings ## General | Name | **C20_WMDE_Test_09** | | Start Date | **2020-11-03** | | Start Time | **11:00 UTC** | | End Date | **2020-12-31** | | End Time | **23:00 UTC** | | Projects | Wikipedia | | Languages | de - German | | Geotargeted | Germany | | User bucketing | 2 | | Priority | **high** | | Limit traffic | **100%** | ## Extra features ### Legacy hiding and impression counting support | Set sample rate | {icon check} | | Sample rate | 1 | | Banners might not display | {icon check} | ### Impression diet | Identifier | **wmde-campaign-2020** | | Skip impressions | 0 | | Max impressions | 10 | | Wait time | 0 | ## Banner settings | Category | fundraising | | Display to | Anonymous users | | Display on | desktop |
    • Task
The items prepped / created in task above can be filled with data from Libris:
[x] Libris URI
[x] Place of publication
[x] Year of publication
[x] no. pages
[x] author (reconciled via Libris URI)
[x] author name string (no author URI in Libris post or unable to reconcile URI)
[x] title + subtitle
[x] language (always sv because svwikisource)
Query: [[ https://w.wiki/hsa | Edition-level items with svwikisource pages. ]]
    • Task
    For cases like {T255334}, I need a list of non-closed wikis using the Translate extension to determine where to run a maintenance script. This time I did it like this: * Initial list: `grep wmgUseTranslate /srv/mediawiki/wmf-config/InitialiseSettings* -A43 | grep true | cut -f2 -d"'" | xargs` * Expand wikidata manually to real values * Manually remove closed wikis I think there should be `translate.dblist` so that I could do something like `expanddblist '%% translate - closed'`. https://wikitech.wikimedia.org/wiki/Configuration_files#dblists says to check with RelEng first for approval.
    • Task
For the APG application we need a budget document. Set one up based on the 2020 framework with the following modifications:
* Costs tab: //Infrastructure// should be removed and these rows should be merged into //Administration//
* Costs tab: A new section for //Donation// work should be added
* Costs tab: Ensure //Personalkostnader// section has been removed
    • Task
Samsyn project meeting, Wednesday October 21, 13.00
* End of project, requests for funding, T233618
* Future life of the Samsynwiki, T264589
* Samsyn/Wikipedia knowledge transfer, T250991
    • Task
- Test
- Add instructions to README.md
    • Task
- Add any additional setup needed
- Test
- Write instructions in README.md
    • Task
For consistency with core, Echo should check `$wgEnableUserEmail` and `$wgEnableEmail` when creating preference options. If either is false, the "Email from other user" option should not be provided, as it will not be useful. I think it's not causing an actual problem, but it is liable to cause confusion, for example https://www.mediawiki.org/wiki/Topic:Vw4vsqcaoxeidm5q
    • Task
Hello, I would like to keep my volunteer credentials to keep contributing to the movement. Per the offer from @MoritzMuehlenhoff, I think this is possible.
    • Task
**Background:** WS Export currently [[https://github.com/wsexport/tool/blob/bb23901069b8c3c0cbe9b45534326af67863e2d8/src/Util/Api.php#L62..L67 | sort of supports]] wikibooks.org (with e.g. `?lang=en-wikibooks`). This is a whole feature that should maybe be worked on one day -- for now, it doesn't work very well (i.e. the exported books still say 'Wikisource') and it'd be nice to clean up the code and project scope. WSExport doesn't seem to work very well for Wikibooks. Meanwhile, Wikibooks users have the other options available to all the wikis: print-to-PDF and ElectronPDF. We did a global search for the regex wsexport.*\-wikibooks and got no results, which suggests there are no direct links anywhere. But after removing the slash I did find some old discussions like https://fr.wikisource.org/wiki/Discussion_utilisateur:Tpt/Archive_1#Tool_for_Wikibooks and https://it.wikibooks.org/wiki/Wikibooks:Bar/Archivio17#Wikibooks_e_le_scuole . Sounds like it didn't work then, either.
**Acceptance Criteria:**
* Drop WSExport support for Wikibooks
    • Task
Steps to Reproduce:
```
[03:16:58]tools.jjmc89-bot@tools-sgebastion-08:~> date -Iseconds
2020-10-21T03:16:59+00:00
[03:16:59]tools.jjmc89-bot@tools-sgebastion-08:~> python3
Python 3.5.3 (default, Jul 9 2020, 13:00:10)
[GCC 6.3.0 20170516] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import datetime, pywikibot
>>> datetime.datetime.utcnow()
datetime.datetime(2020, 10, 21, 3, 17, 35, 314456)
>>> datetime.datetime.now()
datetime.datetime(2020, 10, 21, 3, 17, 38, 927904)
>>> site = pywikibot.Site('commons', 'commons')
>>> site.server_time()
```
Actual Results: `Timestamp(2020, 10, 20, 7, 7, 8)`
Expected Results: `Timestamp(2020, 10, 21, 3, 17, 40)` (approximate)
Version:
```
Pywikibot: [https] r-pywikibot-core.git (5ebba28, g1, 2020/10/19, 16:57:22, stable)
Release version: 5.0.0
requests version: 2.21.0
cacerts: /etc/ssl/certs/ca-certificates.crt
certificate test: ok
Python: 3.5.3 (default, Jul 9 2020, 13:00:10)
[GCC 6.3.0 20170516]
Toolforge hostname: tools-sgebastion-08
```
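A small sketch to quantify the problem; pywikibot's `Timestamp` subclasses `datetime`, so the stale value can be compared directly against the local UTC clock:
```lang=python
import datetime

import pywikibot

site = pywikibot.Site('commons', 'commons')
drift = datetime.datetime.utcnow() - site.server_time()
# On an unaffected setup the drift is a few seconds at most; here it
# reports many hours because a cached siteinfo 'time' value is reused.
print(f'server_time() lags the local UTC clock by {drift}')
```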
    • Task
Docker should use PHP 7.3 and CI should test PHP versions 7.2, 7.3 and 7.4. **Acceptance criteria:** Bump Docker to PHP 7.3; CI should test against 7.2, 7.3, and 7.4
    • Task
    See {T265685} for context. TL;DR: #cloud-services-team needs to run `pack` in CI to verify their [[ https://buildpacks.io/docs/operator-guide/create-a-builder/ | builder configuration ]]. However, it needs access to dockerd to function. I suggested one option might be to give it access to the dockerd socket via a bind mount. Before doing that, we'll need to verify that the configuration can't contain anything that would allow for command injection.
    • Task
//Note:// I could not reproduce the issue in production although there are many small-sized images.
1. On betalabs Special:MediaSearch, enter "dark night" (without quotes) as search terms into the search field. Press 'Search' (https://commons.wikimedia.beta.wmflabs.org/wiki/Special:MediaSearch?type=bitmap&q=dark+night).
2. Only one result (https://commons.wikimedia.beta.wmflabs.org/wiki/File:Volcanic_Eruptions_on_Io_-_Discovery_Picture.j) is returned - click on it to bring up the QuickView.
The UI is displayed as follows: {F32409681}
The Console will display the following two errors:
```
TypeError: mw.util.parseImageUrl(...).resizeUrl is not a function
TypeError: Cannot read property 'focus' of undefined
```
It seems to be complaining about this one:
```
function Ue(e, t, n) {
    if (!z && !V || "undefined" == typeof console)
        throw e;
    console.error(e)
}
```
    • Task
    User story: As a user looking for direction on how to search by ID, I would like a message with more guidance, so that I can find my way to the proper format on my own. You can search for specific task IDs by leveraging a hidden "IDs" field, revealed when a search URL takes the following format: https://phabricator.wikimedia.org/maniphest/?ids=1,2,3,4#R This is a very useful feature! (Useful enough that it might warrant being visible by default, but that's a separate issue). The field will also appear if you type in, say, https://phabricator.wikimedia.org/maniphest/query/advanced/?ids=T123. However, the addition of a non-numerical value ("T", as most tasks are prefixed and might be intuited), while allowed in the field once it is revealed, will throw an error when clicking "Search": {F32409662} I have updated [[ https://www.mediawiki.org/wiki/Phabricator/Help#Search_terms | documentation ]] to mitigate the issue, but I still think it would be inclusive and friendly to change this error. :)
    • Task
    ``` Script started on 2020-10-20 23:26:49+0000 urbanecm@titanium (master u=) ~/Documents/git/gerrit/pywikibot/core $ python3 pwb.py interwikidata.py -lang:smn -family:wikipedia -clean -start:! /home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/config2.py:1060: _ConfigurationDeprecationWarning: "interwiki_contents_on_disk" present in our user-config.py is no longer a supported configuration variable and should be removed. Please inform the maintainers if you depend on it. _ConfigurationDeprecationWarning) /home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/config2.py:1060: _ConfigurationDeprecationWarning: "use_mwparserfromhell" present in our user-config.py is no longer a supported configuration variable and should be removed. Please inform the maintainers if you depend on it. _ConfigurationDeprecationWarning) Retrieving 50 pages from wikipedia:smn. >>> 1111 <<< No interlanguagelinks on [[1111]] >>> 12 Years a Slave (elleekove) <<< No interlanguagelinks on [[12 Years a Slave (elleekove)]] >>> 1647 <<< WARNING: API warning (main): Unrecognized parameter: uiprop. WARNING: API warning (query): Unrecognized value for parameter "meta": userinfo WARNING: API warning (main): Unrecognized parameter: uiprop. WARNING: API warning (query): Unrecognized value for parameter "meta": userinfo WARNING: API warning (main): Unrecognized parameter: uiprop. WARNING: API warning (query): Unrecognized value for parameter "meta": userinfo WARNING: API warning (wbgetentities): Unrecognized value for parameter "sites": smnwiki WARNING: API error param-missing: Either provide the Item "ids" or pairs of "sites" and "titles" for corresponding page 2 pages read 0 pages written 0 pages skipped Execution time: 2 seconds Read operation time: 1.0 seconds Script terminated by exception: ERROR: APIError: param-missing: Either provide the Item "ids" or pairs of "sites" and "titles" for corresponding page [messages: [{'name': 'wikibase-api-illegal-ids-or-sites-titles-selector', 'parameters': [], 'html': {'*': 'Either provide the Item "ids" or pairs of "sites" and "titles" for corresponding page'}}]; help: See https://www.wikidata.org/w/api.php for API usage. Subscribe to the mediawiki-api-announce mailing list at &lt;https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce&gt; for notice of API deprecations and breaking changes.] 
Traceback (most recent call last): File "pwb.py", line 363, in <module> if not main(): File "pwb.py", line 358, in main file_package) File "pwb.py", line 75, in run_python_file main_mod.__dict__) File "./scripts/interwikidata.py", line 245, in <module> main() File "./scripts/interwikidata.py", line 239, in main bot.run() File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/bot.py", line 1498, in run self.treat(page) File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/bot.py", line 1777, in treat self.treat_page() File "./scripts/interwikidata.py", line 86, in treat_page item = pywikibot.ItemPage.fromPage(self.current_page) File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/page/__init__.py", line 4541, in fromPage if not lazy_load and not i.exists(): File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/page/__init__.py", line 4165, in exists self.get(get_redirect=True) File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/page/__init__.py", line 4595, in get data = super().get(force, *args, **kwargs) File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/page/__init__.py", line 4203, in get data = WikibaseEntity.get(self, force=force) File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/page/__init__.py", line 3957, in get data = self.repo.loadcontent(identification) File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/site/__init__.py", line 7084, in loadcontent data = req.submit() File "/home/urbanecm/Documents/git/gerrit/pywikibot/core/pywikibot/data/api.py", line 1994, in submit raise APIError(**result['error']) pywikibot.data.api.APIError: param-missing: Either provide the Item "ids" or pairs of "sites" and "titles" for corresponding page [messages: [{'name': 'wikibase-api-illegal-ids-or-sites-titles-selector', 'parameters': [], 'html': {'*': 'Either provide the Item "ids" or pairs of "sites" and "titles" for corresponding page'}}]; help: See https://www.wikidata.org/w/api.php for API usage. Subscribe to the mediawiki-api-announce mailing list at &lt;https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce&gt; for notice of API deprecations and breaking changes.] CRITICAL: Exiting due to uncaught exception <class 'pywikibot.data.api.APIError'> urbanecm@titanium (master u=) ~/Documents/git/gerrit/pywikibot/core $ exit Script done on 2020-10-20 23:27:12+0000 ```
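A possible hardening sketch for the script (hedged; the names mirror the traceback above, and skipping the page rather than aborting is just one option): catch the APIError around `ItemPage.fromPage()` so one unrecognised site value does not kill the whole run.
```lang=python
import pywikibot
from pywikibot.data.api import APIError


def get_item_or_none(page):
    """Return the connected ItemPage, or None if the repo rejects the lookup."""
    try:
        return pywikibot.ItemPage.fromPage(page)
    except APIError as err:
        # e.g. wikibase-api-illegal-ids-or-sites-titles-selector when the
        # repo does not (yet) recognise the client wiki as a 'sites' value.
        pywikibot.warning(f'Could not resolve item for {page.title()}: {err}')
        return None
```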
    • Task
    The `image` table is the authoritative source of whether a file is local to a given wiki, so to answer the questions in the parent task we'd like to have that table sqooped monthly so it can be joined with the `imagelinks` table.
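For illustration, a hedged Spark SQL sketch of the intended join; the column names follow the MediaWiki schema (`il_to`, `img_name`), but the database and table names for a monthly `image` snapshot are assumptions, since that sqoop does not exist yet.
```lang=python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical layout: monthly sqooped snapshots in wmf_raw, partitioned by
# wiki_db and snapshot, keeping MediaWiki's native column names.
local_vs_remote = spark.sql("""
    SELECT il.il_to               AS file_title,
           i.img_name IS NOT NULL AS is_local_file
      FROM wmf_raw.mediawiki_imagelinks il
 LEFT JOIN wmf_raw.mediawiki_image i
        ON i.wiki_db = il.wiki_db
       AND i.snapshot = il.snapshot
       AND i.img_name = il.il_to
     WHERE il.wiki_db = 'enwiki'
       AND il.snapshot = '2020-10'
""")
local_vs_remote.show(10)
```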
    • Task
Tested in betalabs.
(1) QuickView often produces a blurry image in IE11
|IE11|Chrome
|---|---
|{F32409612}|{F32409616}
|{F32409614}|{F32409619}
(2) The "Load more" button is displayed on the inviting search screen (the button, when clicked, does not do anything). Steps to reproduce:
- enter a search term that produces many results (so there will be "Load more") - e.g. "rose"
- when the results are displayed, click on 'X' in the search field to dismiss the search results. The "Load more" button will be displayed. {F32409623}
    • Task
    This was spun off from an engineering sync discussion about fixing https://phabricator.wikimedia.org/T265678. UIKit semantic colors are supposed to adjust the shade of darkness depending on how high up the view hierarchy they are displayed. This would fix our occasional disappearing elements problem that we sometimes see like in the task linked above. I propose for iOS13+ we try changing some of our black theme background color values to semantic colors to gain that functionality. I'm filing this as a separate task since this is an app-wide fix and would require a lot of back & forth with design. Some documentation: https://developer.apple.com/documentation/xcode/supporting_dark_mode_in_your_interface?language=objc https://developer.apple.com/documentation/uikit/uicolor/ui_element_colors?language=objc Another comment with some screenshots on how our black mode backgrounds disappear in iPad: https://phabricator.wikimedia.org/T227123#5301614
    • Task
    To reduce duplication of content where possible, enable $wgEnableScaryTranscluding on api.wikimedia.org and api.wikimedia.beta.wmflabs.org
    • Task
Now that SkinMinerva is using SkinMustache (T256083), slowly migrate the contents of MinervaTemplate piece by piece to the new SkinMinerva class. Wherever possible, refer to and use [[ https://github.com/wikimedia/mediawiki/blob/master/includes/skins/SkinMustache.php#L74 | the core SkinMustache template variables ]] or use names/data structures that mirror the [[ https://github.com/wikimedia/Vector/blob/master/includes/SkinVector.php#L118 | template variables used in SkinVector ]]
[] Unused template variables are removed
[] Replace subtitle with data provided by SkinMinerva
[] Replace all message key variables defined inside getTemplateData, e.g. main-menu-tooltip, with definitions inside SkinMinerva::$options['message'] ...
[] When MinervaTemplate has been reduced to a getTemplateData function and private functions, move these functions to SkinMinerva and remove the MinervaTemplate class
[] Review all triple brace template variables, e.g. {{{search}}}, and replace with templates and data where possible
[] Make sure all template keys provided by Minerva have "minerva-" in their title so they can be distinguished from values coming from core.
    • Task
    All wdqs servers appear to hit an error, causing the unit to fail. Eventually the service is restarted, at which point it eventually hits the error and fails again. ``` Oct 20 20:17:32 wdqs1008 systemd[1]: Started Query Service Updater. Oct 20 20:17:32 wdqs1008 wdqs-updater[12116]: Updating via http://localhost:9999/bigdata/namespace/wdq/sparql Oct 20 20:17:33 wdqs1008 wdqs-updater[12116]: #logback.classic pattern: %d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg %mdc%n Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: 20:27:13.519 [main] ERROR org.wikidata.query.rdf.tool.Update - Error during updater run. Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: java.lang.RuntimeException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null') Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2] Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.fetchRecentChanges(WikibaseRepository.java:356) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.fetchRecentChangesByTime(WikibaseRepository.java:327) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.change.RecentChangesPoller.doFetchRecentChanges(RecentChangesPoller.java:322) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.change.RecentChangesPoller.fetchRecentChanges(RecentChangesPoller.java:314) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.change.RecentChangesPoller.batch(RecentChangesPoller.java:338) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.change.RecentChangesPoller.nextBatch(RecentChangesPoller.java:167) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.change.RecentChangesPoller.nextBatch(RecentChangesPoller.java:38) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.Updater.nextBatch(Updater.java:371) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.Updater.run(Updater.java:163) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.Update.run(Update.java:174) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.Update.main(Update.java:98) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null') Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2] Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:693) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:591) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2630) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:832) Oct 20 20:27:13 
wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:729) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:4141) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4000) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3070) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.getJson(WikibaseRepository.java:526) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.fetchRecentChanges(WikibaseRepository.java:350) Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: ... 10 common frames omitted Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: Exception in thread "main" java.lang.RuntimeException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null') Oct 20 20:27:13 wdqs1008 wdqs-updater[12116]: at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2] Oct 20 20:27:13 wdqs1008 systemd[1]: wdqs-updater.service: Main process exited, code=exited, status=1/FAILURE Oct 20 20:27:13 wdqs1008 systemd[1]: wdqs-updater.service: Unit entered failed state. Oct 20 20:27:13 wdqs1008 systemd[1]: wdqs-updater.service: Failed with result 'exit-code'. ```
    • Task
This task is a checklist for the support that CommRel will provide for the 2021 Community Wishlist Survey.
**Wishlist Schedule:**
- Stage 1 -- Submit, discuss and revise proposals: Monday, November 16 - Monday, November 30
- Stage 2 -- Community Tech reviews and organizes proposals: Monday, November 23 - Monday, Dec 7
- Stage 3 -- Vote on proposals: Tues, Dec 8 - Monday, Dec 21
- Stage 4 -- Results posted: Wednesday, Dec 23
**Checklist for this year:**
[] MassMessage relevant Village Pump equivalents
[] CentralNotice banners
[] Mailing lists
[] Old wishlist talk pages
[] Pinging users who participated in relevant categories in previous years
[] IRC/Telegram/chat
[] Tech News
[] Meta front page
[] Blog post on Diff?
[] Dedicated thread on Discuss
    • Task
    Running Doc: https://www.mediawiki.org/wiki/Wikimedia_Cloud_Services_team/Onboarding_David [] IRC [] Join core IRC channels (wikimedia-operations, wikimedia-cloud, wikimedia-cloud-admin, wikimedia-cloud-feed) [] Set enforce for irc nick (https://meta.wikimedia.org/wiki/IRC/Instructions#Register_your_nickname,_identify,_and_enforce ) [] Apply for Wikimedia cloak (https://meta.wikimedia.org/wiki/IRC/Cloaks) [] Get invites to non-public IRC channels [] Backchannel [] Add to WMCS Telegram group [] Add to Technical Engagement Telegram group [] Technical Engagement team shares [] Add to TE shared calendar (@nskaggs) [] Add to TE google team drive (@nskaggs) [] Calendar invites [] Add to WMCS weekly meeting (@nskaggs) [] Add to WMCS quarterly team practices meeting (@nskaggs) [] Add to Developer Advocacy weekly meeting (@nskaggs) [] Add to SRE weekly meeting (@nskaggs) [] Wikitech [] Create Wikimedia developer account (ldap) https://wikitech.wikimedia.org/wiki/Help:Create_a_Wikimedia_developer_account [] Add 2factor to wikitech login (which will also be toolsadmin, and horizon) [] Gerrit trusted groups [] Add to toollabs-trusted group for operations/docker-images/toollabs-images [] Mailing lists [] Add to cloud-admin mailing list (https://lists.wikimedia.org/mailman/listinfo/cloud-admin) (@nskaggs) [] Add to Technical Engagement Internal mailing list (google groups list) (@nskaggs) [] Add to Cloud Services Internal mailing list (google groups list) (@nskaggs) [] Add to ops mailing list (https://lists.wikimedia.org/mailman/listinfo/ops) [] Subscribe to cloud-announce (https://lists.wikimedia.org/mailman/listinfo/cloud-announce) [] Subscribe to cloud-l mailing list (https://lists.wikimedia.org/mailman/listinfo/cloud) [] Subscribe to wikitech-l mailing list (https://lists.wikimedia.org/mailman/listinfo/wikitech-l) [] Phabricator [] Register in phabricator (https://www.mediawiki.org/wiki/Phabricator/Help#Creating_your_account) [] Associate WMF mediawiki account with phab user account (https://phabricator.wikimedia.org/settings/user/XXXXX/page/external/) [] Add 2factor to Phabricator login [] Access to WMF-NDA protected tasks on Phabricator (https://phabricator.wikimedia.org/project/members/974/) [] trusted-contributors group https://phabricator.wikimedia.org/project/members/3104/ [] server access and responsibilities agreement https://phabricator.wikimedia.org/L3 [] add to #acl_security_management phabricator project/acl [] Add to editing acl for https://phabricator.wikimedia.org/phame/blog/view/5/ (done via #acl_wmcs-team) [] Join WMCS team [] (Optional) Watch WMCS parent project / set email notification preferences [] Cloud VPS [] Make projectadmin in "admin" project [] Make projectadmin in "tools" project [] Make projectadmin in "toolsbeta" project [] Toolforge [] Request access to Toolforge project https://toolsadmin.wikimedia.org/tools/membership/apply [] Make projectadmin for Tools project [] sudo for Toolforge [] Add as maintainer of "admin" Toolforge tool [] Shell account configuration [] wikitech static /etc/hosts entry T164290#4046628 [] New shell user process (https://wikitech.wikimedia.org/wiki/Production_shell_access#New_users) (Add to data.yaml in correct groups) [] wmf and ops ldap groups [] prod icinga contact (including cgi.cfg inclusion) (`private.git`) [] add to sms contact group (`private.git`) [] add to root@ alias in exim (`private.git` make sure to use your email username, not shell) [] Add to cloud-wide root ( see T185493#3920144 ) [] GPG key for pwstore 
(https://wikitech.wikimedia.org/wiki/PGP_Keys) [] cloud shinken contact and group as well as cloud-admin-feed list (ops/puppet:modules/nagios_common/files/contactgroups-labs.cf) [] VictorOps account [] Points of interest [] https://gerrit.wikimedia.org [] Push a patch to `ops/puppet`, have it reviewed, submit, puppet-merge [] `ssh primary.bastion.wmflabs.org` [] https://netbox.wikimedia.org (uses shell username, requires WMF LDAP) [] https://logstash.wikimedia.org [] https://puppetboard.wikimedia.org/ [] https://debmonitor.wikimedia.org/ [] https://icinga.wikimedia.org/icinga/ [] https://grafana.wikimedia.org [] https://shinken.wmflabs.org [] https://wikitech.wikimedia.org/wiki/Prometheus [] https://horizon.wikimedia.org [] https://tools.wmflabs.org/openstack-browser/project/
    • Task
As per T265771, we want to measure the vectors through which multimedia content gets added to Wikipedia articles - via Visual Editor, direct Wikitext editing, or bots. This ticket is to create edit tags so that we can do those measurements.
Acceptance criteria:
[] An edit tag `image add - VE` is applied to edits that add images to articles via Visual Editor
[] An edit tag `image add - bot` is applied to edits that add images to articles via bots
[] An edit tag `image add - Wikitext` is applied to edits that add images to articles via Wikitext
** Note #1: the Wikitext image addition tag may not be possible. Let's discuss and we can remove it from the AC if necessary.
** Note #2: the wording of the edit tags isn't final and is open to discussion!
** Open question: do we want to, and can we, differentiate between image adds via VE search and image adds via VE uploads?
    • Task
Steps to Reproduce: In the Node.js service-runner I was using these two services like this:
```
await this.runner.start({
  num_workers: 0,
  services: [{
    name: 'mcs',
    module: 'node_modules/service-mobileapp-node/app.js',
    conf: {
      port: 6927,
      mwapi_req: {
        method: 'post',
        uri: `https://{{domain}}${this.mw.apiUrl.pathname}`,
        headers: { 'user-agent': '{{user-agent}}', },
        body: '{{ default(request.query, {}) }}',
      },
      restbase_req: {
        method: '{{request.method}}',
        uri: 'http://localhost:8000/{{domain}}/v3/{+path}',
        query: '{{ default(request.query, {}) }}',
        headers: '{{request.headers}}',
        body: '{{request.body}}',
      },
    },
  }, {
    name: 'parsoid',
    module: 'node_modules/parsoid/lib/index.js',
    entrypoint: 'apiServiceWorker',
    conf: {
      timeouts: {
        // request: 4 * 60 * 1000, // Default
        request: 8 * 60 * 1000,
      },
      limits: {
        wt2html: {
          // maxWikitextSize: 1000000, // Default
          maxWikitextSize: 1000000 * 4,
          // maxListItems: 30000, // Default
          maxListItems: 30000 * 4,
          // maxTableCells: 30000, // Default
          maxTableCells: 30000 * 4,
          // maxTransclusions: 10000, // Default
          maxTransclusions: 10000 * 4,
          // maxImages: 1000, // Default
          maxImages: 1000 * 4,
          // maxTokens: 1000000, // Default
          maxTokens: 1000000 * 4,
        },
      },
      mwApis: [{
        uri: this.mw.apiUrl.href,
      }],
    },
  }],
  logging: {
    level: 'info',
  },
});
```
Then, after making some Parsoid requests, I stopped it using `runner.stop();`.
Actual Results: The process doesn't terminate; some handles are left leaked. I tried to trace them back, but there seems to be something wrong somewhere in these services only. Is there a way to kill those leaked handles using the runner?
    • Task
The class SpecialEditWatchlist is extended by extensions, but the class is not covered by the stable interface policy, which means extending it is not allowed. I am not sure what the best approach is for the affected extensions: https://codesearch.wmcloud.org/search/?q=extends%5Cs%2BSpecialEditWatchlist%5Cb&i=nope&files=&repos=
    • Task
Cloud VPS Project Tested: N/A
Site/Location: EQIAD
Number of systems: 1 VM
Service: Analytics test cluster
Networking Requirements: internal IP, Analytics VLAN
Processor Requirements: 2 vCPUs
Memory: 4GB
Disks: 150GB
Other Requirements: N/A

This is to replace an-tool1006, which is running Stretch, and we'd like to upgrade to Buster.
    • Task
    In eswiki, this tool doesn't recognize RFPP (Wikipedia:Tablón de anuncios de los bibliotecarios/Portal/Archivo/Protección de artículos/Actual), AIV (Wikipedia:Vandalismo en curso) and AfD (Wikipedia:Consultas de borrado). XTools version: 3.10.17-bed6358d
    • Task
Focus and hover states should use CSS transitions, as in OOUI. I don't know whether these widgets are defined in mediasearch or upstream in the WVUI library.
    • Task
I've learned today that some members of the performance team have some extra features when they run tests on Firefox on Android: https://github.com/acreskeyMoz/browsertime_on_android_scripts For example: they force-stop Firefox before/after a test; I should implement that too. Let's go through the code and see if there are more things we should adopt.
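For reference, a minimal sketch of what a force-stop step could look like when driven from Node. The helper name is hypothetical and browsertime's own adb handling may already provide an equivalent; the only hard assumptions are that `adb` is on PATH and the app under test is release Firefox (`org.mozilla.firefox`).

```js
// Hypothetical helper: force-stop Firefox on the connected Android device
// before/after a test run via adb.
const { execFile } = require('child_process');
const { promisify } = require('util');
const execFileAsync = promisify(execFile);

async function forceStopFirefox(packageName = 'org.mozilla.firefox') {
  // Equivalent to running: adb shell am force-stop org.mozilla.firefox
  await execFileAsync('adb', ['shell', 'am', 'force-stop', packageName]);
}
```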
    • Task
Producing to Kafka is just one way to use EventGate. It's nice to have it as the default, but building the librdkafka binary is not very lightweight, and it doesn't always build smoothly on every OS (e.g. macOS). By making node-rdkafka an optional dependency (and guarding against it being missing), we should be able to create development EventGate instances more easily, without needing to build librdkafka.
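A minimal sketch of what that guard could look like; the module layout and error message are assumptions, not EventGate's actual code. In package.json, node-rdkafka would move to optionalDependencies so a failed build of it does not fail the whole install.

```js
// Hypothetical sketch: load node-rdkafka only if it is installed, so that
// development instances without librdkafka can still start up.
let kafka = null;
try {
  kafka = require('node-rdkafka');
} catch (err) {
  // Only swallow "module not installed"; real errors should still surface.
  if (err.code !== 'MODULE_NOT_FOUND') {
    throw err;
  }
}

function createProducer(conf) {
  if (!kafka) {
    throw new Error(
      'node-rdkafka is not installed; install it or configure a non-Kafka event sink.'
    );
  }
  return new kafka.Producer(conf);
}
```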
    • Task
    Time to move these hosts up to buster. Test in the vb setup and upgrade the servers when ready.
    • Task
**Per conversation with @SNowick_WMF on October 20:** @SNowick_WMF will look into [[ https://superset.wikimedia.org/superset/dashboard/androidsearch/ | Android Search Data ]] and analyze trends for search by source and the decline in start events. This will give us more clarity about the impact of the recent search changes on Android. **Bonus fix:** In our conversation, we realized that `results` in the `Android Search Actions Weekly` chart is incorrect, as the number for `start` should be higher than `results`.
    • Task
The outcome of T253673 is to go with idea 3 - rolling restarts.

Work:
* [x] Patch Scap to always do the rolling restart (instead of only every once in a while, currently based on unreliable opcache thresholds) -- https://gerrit.wikimedia.org/r/631776
* [ ] Patch Scap to implement an emergency flag that performs this restart without ensuring live server capacity, for the case where a bad patch has taken down the site in large part. Tracked by {T243009}
* [x] Deploy updated Scap to Beta Cluster -- https://integration.wikimedia.org/ci/job/scap-beta-deb/118/
* [ ] Measure timing before and after, and report on task.
* [ ] Package Scap for the production apt repository.
* [ ] Deploy updated Scap to production.
* [ ] Measure timing before and after, and report on task.
    • Task
In Logstash there is no way to filter out an empty string. If stack_trace is an empty string, it should have a null value instead. This will make it much easier to filter out errors from user browser extensions that have no stack trace.
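A minimal sketch of the kind of normalization that could happen before the error event is sent or indexed; the field and function names here are assumptions, not the actual error-logging code.

```js
// Hypothetical normalization step: report a missing stack trace as null
// rather than an empty string, so it can be filtered in Logstash.
function normalizeErrorEvent(event) {
  return {
    ...event,
    stack_trace: event.stack_trace && event.stack_trace.trim() !== ''
      ? event.stack_trace
      : null,
  };
}

// Example: an error with no stack (typical of some browser-extension errors)
// normalizeErrorEvent({ message: 'boom', stack_trace: '' })
//   => { message: 'boom', stack_trace: null }
```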
    • Task
In {T250206} we worked on a proof of concept project to design a replacement for monitoring a subset of projects using Shinken with Prometheus. Later we realized that {T210993} means that we also need to migrate to Prometheus for basic dashboards like {T264920}.

We now need to redesign and build out the POC to scale to collecting at least basic instance health information for all instances in all projects for some reasonable amount of time (3+ months for sure, 1+ year ideally).

Features needed beyond POC:
[] Storage for metrics from 700+ instances
[] Configure the alert rules to monitor disk capacity
[] Project-local puppetmaster for secrets storage (e.g. volunteers' email addresses)
[] Refactor puppet module to merge public and private hiera config
[] IRC relay for IRC alerting per project
[] karma alert management dashboard with ability to silence alerts (T250206#6443507)
    • Task
There are a lot of opportunities for improvement when it comes to collecting data, gathering/querying it, and processing/wrangling it. In FY20-21 we could only work on 3-4 solutions to the problems identified by data analysts and data scientists who work with Wikimedia's production data. This task is about identifying the solutions that should be worked on in the following year, potentially in collaboration with teams in other departments. This step should be done by the end of Q3 so that we can engage those teams in Q4: they will have their own priorities, so if we need their help we must give them enough time to decide whether they can commit to our cause.
    • Task
Once in a while (e.g. one in every ten smoke tests) it would be good to run the tests with "Don't keep activities" enabled. This is a Developer setting in Android that basically simulates low-memory conditions on the device, and causes activities to be destroyed as soon as they are put in the background.

To enable "Don't keep activities":
* Go to Developer Settings (on the device itself, not in the app).
* The "Don't keep activities" checkbox should be near the bottom of the list.
* After enabling it, proceed to test the app as before, and look for any unusual behavior or crashes.

(And don't forget to disable "Don't keep activities" when finished testing, to put the device back to normal.)
    • Task
Need to add a value in `https://meta.wikimedia.org/wiki/Schema:MobileWikiAppAppearanceSettings` for `fontThemeChange`. Current action values are `themeChange` and `fontSizeChange`. This schema also has fields for `current_value` and `new_value`, which track the old and new sizes and themes, so we will also need to add values for `serif` and `sans-serif`. Once these are instrumented in the app to send those values to the schema, assign this ticket to me and I will add them to the schema.
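For illustration, an event using the new action might look roughly like this. This is a hypothetical payload built only from the field and value names above; the authoritative field definitions live on the schema page.

```js
// Hypothetical example event for a serif -> sans-serif switch, using the
// action and value names proposed in this task.
const exampleEvent = {
  action: 'fontThemeChange',
  current_value: 'serif',
  new_value: 'sans-serif',
};
```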