See my mediawiki.org userpage.
User Details
- User Since
- Sep 19 2014, 7:30 PM
- Roles
- Administrator
- Availability
- Available
- IRC Nick
- legoktm
- LDAP User
- Legoktm
- MediaWiki User
- Unknown
Yesterday
This was mostly discussed in the context of T154067, but my interpretation of the current consensus is that while <big> is deprecated in HTML, we are making an explicit choice not to consider it deprecated in wikitext, and that it is OK for editors to use. It has been more than the "three or four years" since T40487#3119873 (2017), and I don't think there is any interest in moving this forward. If people do want to remove it from the toolbar, that should be done on its own merits and not because the element is obsolete in HTML (which is what most of the discussion here is about).
Thu, Mar 16
Sorry, I misread: if it's been broken since February 23, it's not UBN by definition, but it is definitely API breakage that should still be fixed.
Wed, Mar 15
Not to mention that in most cases w.wiki will generate shorter URLs than enwp.org.
Tue, Mar 14
Pretty minor, so I'm just logging it here: https://en.wikipedia.org/w/index.php?title=Wikipedia:WikiProject_Arena_Football_League/Newsletter/Issue_IX&diff=prev&oldid=1144636755 and https://en.wikipedia.org/w/index.php?title=Wikipedia:WikiProject_Arena_Football_League/Newsletter/Issue_XIII&diff=prev&oldid=1144636461 removed whitespace from the beginning of a table cell even though the cell was only modified at the very bottom.
Sun, Mar 12
Created https://wikis.world/@mediawiki and documented at https://www.mediawiki.org/wiki/Project:Mastodon - please boost :)
Sat, Mar 11
It looks fixed to me as well. OK to close or is more investigation needed?
I found another outdated/mismatched entry - P45731. If I had to guess, it's not fully updating pages after they become redirects? But it's still updating their HTML?
@awight it's already on Toolforge and WMCS, see /public/dumps/public/other/enterprise_html/
The packages were initially backports of the bullseye versions, but we have a bunch of random patches on top. On T286217#8572913 I wrote/suggested:
Our current Mailman deployment is a bunch of backported and forked debs with random patches thrown on top, based on what we managed to fix upstream. It's not sustainable (as hopefully T286217#7406437 shows). Given that we need to get off buster anyway, I would suggest that we wait until the bookworm freeze gets further along, set up a lists1003 host with plain Debian packages, and after some level of testing switch lists.wm.o over to the new host. The new version will have a new set of bugs; we either learn to live with them or patch via puppet.
I mean, we're swimming in a big ocean of proprietary API & services usage already. If what you mean is that between rate limiting, authentication, and engineering the hook to the API it's just not worth all that extra effort, then sure, I agree.
For reference, anomiebot is MySQL user s51055. I can reproduce the connection error with tools.legobot, which is s51043. It appears to affect all slices. But the tool I created on Wednesday, tools.tour-nyc (s55336), is not affected and can connect just fine.
https://wikitech.wikimedia.org/wiki/Incidents/2023-03-09_mailman still a draft but mostly complete, please edit, etc.
Thu, Mar 9
Sent notification to listadmins@, of course most people won't see it until everything has caught up.
There are 2,936 emails in the out queue, it takes ~5.1 seconds to send each, so we're looking at recovery in ~250 minutes or 4.2 hours. Because new mail will keep coming in, it'll probably be closer to 5 hours I'd guess.
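A back-of-the-envelope sketch of that estimate (the queue size and per-send time are the figures quoted above; everything else is arithmetic):

```python
# Rough recovery estimate for the mailman out queue.
queue_size = 2936   # emails currently in the out queue
send_time_s = 5.1   # observed seconds per email sent

total_s = queue_size * send_time_s
minutes = total_s / 60
hours = minutes / 60
print(f"~{minutes:.0f} minutes (~{hours:.1f} hours)")  # ~250 minutes (~4.2 hours)
```

Since new mail keeps arriving while the queue drains, the real figure lands above this lower bound.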
Re-opening just for tracking while we wait for the queue to go down. Also tagging as incident worthy, I can work on the writeup tomorrow.
Using a regex was always going to have limitations; at the time I couldn't find a decent-looking parser, but https://lib.rs/crates/rfc822-like seems promising, and https://lib.rs/search?q=debian has quite a few more.
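To illustrate the kind of structure a single regex tends to mishandle, here is a minimal sketch (in Python, purely for illustration; the crates linked above do this properly) of parsing one RFC 822-style / Debian-control paragraph, including continuation lines:

```python
# Minimal, illustrative parser for one RFC 822-style (Debian control) paragraph.
# The tricky part for a regex is the continuation line: a line starting with
# whitespace belongs to the previous field's value.
def parse_paragraph(text: str) -> dict:
    fields = {}
    current = None
    for line in text.splitlines():
        if line[:1] in (" ", "\t"):           # continuation of the previous field
            if current is None:
                raise ValueError("continuation line before any field")
            fields[current] += "\n" + line.strip()
        elif ":" in line:
            current, _, value = line.partition(":")
            current = current.strip()
            fields[current] = value.strip()
    return fields

example = "Package: mailman3\nDescription: Mailing list manager\n improved from v2\n"
print(parse_paragraph(example)["Description"])
```

A real consumer should use a dedicated parser library rather than this sketch; folded fields, comments, and multi-paragraph files add more edge cases.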
Mon, Mar 6
It's the various select and selectRow functions in https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Linter/+/refs/heads/master/includes/Database.php
Sat, Mar 4
This happens because the database query needs to return two numbers: the number of enabled users and the number of active users. It abuses the namespace column to hold the number of active users; since those values are not real namespaces, we end up with Special:BadTitle.
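A toy sketch of the pattern (the column names and numbers are illustrative, not the actual schema): a counter gets smuggled through a column that normally holds a namespace ID, and generic title-formatting code later falls back to a bad title because no such namespace exists.

```python
# Illustrative only: a stats row reuses the (namespace, title) columns to
# carry two counters. When generic code later formats the row as a page
# title, the fake namespace number has no real namespace behind it.
REAL_NAMESPACES = {0: "", 1: "Talk", 2: "User"}  # abbreviated

def format_title(ns: int, title: str) -> str:
    if ns not in REAL_NAMESPACES:
        return "Special:BadTitle"     # fallback for an invalid title
    prefix = REAL_NAMESPACES[ns]
    return f"{prefix}:{title}" if prefix else title

# Hypothetical row: namespace column = active count, title column = enabled count.
stats_row = {"namespace": 42, "title": "1337"}
print(format_title(stats_row["namespace"], stats_row["title"]))  # Special:BadTitle
```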
Fri, Mar 3
Thanks for the update - filed T331087: Remove bbcrewind.co.uk exemption for Wikimedia Maps to track the removal.
Thu, Mar 2
Thanks, looking through some of those is pretty interesting. https://en.wikipedia.org/wiki/User:The_Earwig/Signature is a good example of why this is an impossible problem to fully prevent: it varies the signature based on {{REVISIONUSER}} and defaults to an empty string, which would pass length checks.
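A toy model of why a pre-expansion length check can be defeated (the expansion table here is a hypothetical stand-in for the template's conditional logic, not MediaWiki's parser):

```python
# The *stored* signature wikitext is a short, fixed template call, but what
# it renders to depends on {{REVISIONUSER}}. A default branch that yields
# the empty string makes any rendered-length check trivially pass.
STORED = "{{User:The Earwig/Signature}}"

def render(revision_user: str) -> str:
    # Hypothetical per-user expansion standing in for the template's switch:
    cases = {"The Earwig": "[[User:The Earwig|'''Earwig''']] <sup>talk</sup>"}
    return cases.get(revision_user, "")   # default branch: empty string

print(len(STORED))                  # what a naive length check would measure
print(len(render("SomeoneElse")))   # 0 -- sails past any limit
print(len(render("The Earwig")))    # the real rendered length
```

This is why the check has to run against post-expanded output to mean anything.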
Wed, Mar 1
@AntiCompositeNumber would you be able to provide some estimates on how many signatures this change would affect?
Re-opening because the limit is mostly useless if it doesn't count post-expanded length. Compared to 2012 and 2015 we now have proper signature requirements, which mostly invalidate the arguments in T12715#155577.
Sat, Feb 25
At least one Trove VM did not resume correctly when all hypervisors were shut down and restarted
Caused by T329949: [Cloud VPS] Trove dbs do not restart after a hypervisor restart. I manually started the database VM and it seems to be up; I also manually kicked off the jobs again.
Sorry, I fell behind on this; let me see...
Shouldn't we just fix T330576...?
A few doc improvements: https://www.mediawiki.org/w/index.php?diff=5794808&oldid=5576457&title=Manual:Security and https://www.mediawiki.org/w/index.php?diff=5794835&oldid=5766210&title=Manual:Configuring_file_uploads - the latter page needs more work.
@taavi suggested adding the header in core's pre-existing images/.htaccess; I'll submit a patch for that.
Fri, Feb 24
Discussed with @brennen today in #wikimedia-gitlab. Based on the existing GitLab/Phab integration (see https://wikitech.wikimedia.org/wiki/GitLab/Phabricator_integration), we can set a global webhook to receive all the events we care about, and that webhook can live in Toolforge. I'll take a first stab at the implementation.
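A rough sketch of the Toolforge-side receiver, under the assumption that it checks GitLab's shared-secret header (X-Gitlab-Token) and filters on the payload's object_kind field; the secret's environment variable name and the set of wanted event kinds are illustrative, not decided:

```python
# Hypothetical webhook receiver core: verify the shared secret GitLab sends
# in X-Gitlab-Token, then keep only the event kinds we care about.
import hmac
import json
import os

SECRET = os.environ.get("GITLAB_WEBHOOK_TOKEN", "change-me")  # illustrative name
WANTED = {"merge_request", "push", "note"}                    # illustrative set

def handle(headers: dict, body: bytes):
    if not hmac.compare_digest(headers.get("X-Gitlab-Token", ""), SECRET):
        return 403, None          # reject unauthenticated posts
    event = json.loads(body)
    kind = event.get("object_kind")
    if kind not in WANTED:
        return 204, None          # acknowledged, but ignored
    return 200, kind              # would enqueue/forward for processing

status, kind = handle({"X-Gitlab-Token": "change-me"},
                      b'{"object_kind": "push"}')
print(status, kind)  # 200 push
```

The constant-time comparison matters because the token check is the only authentication on a publicly reachable endpoint.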
This ticket mostly discusses the technical implementation; have we decided as a policy matter that e.g. GitLab issues should not be used?
Merged and released in parsoid 0.7.6 and 0.8.0-alpha.5.
Thu, Feb 23
I think this merits a backport to the 0.7.x stable series.
The JSON structure is:
{ "templatearg": { "i": 1, "params": { "1": { "wt": "" } }, "target": { "wt": "2" } } }
Tue, Feb 21
(Assuming the assignment was just part of closing the task and not planning to work on this)
The API was created a while back and is documented at https://en.wikipedia.org/api/rest_v1/#/Transforms/post_transform_wikitext_to_lint__title_