Wed, Jul 7
That was fast :)
Tue, Jul 6
I have actually removed those two print() statements (some debugging, it seems), so it doesn't produce any output.
Mon, Jul 5
It's still sending the announcement-only mail, but the cron is working now. :-) :-)
So you are trying to import https://te.wikipedia.org/wiki/%E0%B0%B8%E0%B0%BE%E0%B0%AE%E0%B1%86%E0%B0%A4%E0%B0%B2%E0%B1%81_-_%E0%B0%85 ?
Does importing another page, or not including the full history succeed?
Sun, Jul 4
And, weirdly enough, it both went through and sent back an auto-response saying it's an announcement-only mailing list.
Well, having too many things is probably part of the reason ;-)
This can't be that hard. @Legoktm do you want me to have a look at this? It doesn't seem to require any advanced permission, only on potd and the ml, so I could probably handle it.
mailman3 supports having an account with multiple emails. Requiring one of them (not necessarily the email used in the mailing list) to match the wiki one seems acceptable.
Sat, Jul 3
Sat, Jun 26
Having gerrit autobump it may actually be preferable, indeed.
Fri, Jun 25
May 14 2021
Probably more a feature request for upstream, but I think mailman3 should parse that rejection message, find out the error is actually due to the specific message it was trying to deliver, and not increment the bounce counter. Giving semantics to the error messages isn't ideal, but I'm not sure that's possible with their enhanced status codes alone. At least, those error messages are very clear on why they are rejecting. It also means hardcoding the messages used by certain vendors (to which more could be added in the future), but the prevalence of gmail/gsuite is so large, and this issue will appear often enough, that it seems worthwhile.
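A sketch of what that vendor-specific matching could look like (the patterns below are hypothetical placeholders, not Gmail's actual wording, which would need to be collected from real DSNs):

```python
import re

# Hypothetical rejection patterns per vendor; a real deployment would need
# the exact wording gmail/gsuite and others use, and a way to add more.
CONTENT_REJECTION_PATTERNS = [
    re.compile(r"this message .* (spam|suspicious)", re.I),
    re.compile(r"message content rejected", re.I),
]

def is_content_rejection(dsn_text: str) -> bool:
    # True when the bounce blames this specific message's content rather
    # than the mailbox; such bounces should not increment the bounce counter.
    return any(p.search(dsn_text) for p in CONTENT_REJECTION_PATTERNS)
```

The bounce runner would consult this before scoring the member, so a Gmail content rejection doesn't eventually unsubscribe an otherwise-deliverable address.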
Apr 8 2021
I'm not convinced by the "don't commit the result" part. The "compiled" code is still needed for tarballs and even for the developers themselves. And they must be using the same version as in prod, or they could be testing slightly different code, which would be hard to discover.
Mar 28 2021
I'm not currently running an IRC bot from Toolforge, but I have been using SASL for a long time. That code was already there 5 years ago.
Mar 1 2021
I don't think it's complicated at all. It should work fine with an sshd Match rule that only allows the git user from external networks (and, while we're at it, applies a ForceCommand there, too).
The part that may be controversial (simple, but controversial) is opening port 22 in the firewall to this machine. However, an sshd listening on an alternate port, with that port opened instead, would be equally bad should there be a fatal sshd vulnerability.
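As a sketch, the Match-based setup could look something like this in sshd_config (the network range and the forced command are placeholders):

```
# Connections coming from outside the internal network may only log in
# as the git user, and are forced into the git serving command.
Match Address *,!10.0.0.0/8
    AllowUsers git
    ForceCommand /usr/bin/git-shell -c "$SSH_ORIGINAL_COMMAND"
```

AllowUsers and ForceCommand are both valid inside a Match block, so no second sshd instance is needed.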
Jan 26 2021
(Maybe split this subthread into a new task "Connecting to prod should be easy?")
Jan 24 2021
Sorry @Peculiar_Investor, you are right in that there was a change in 1.35.1; I was thinking this was included in 1.35.0.
The related change in 1.35.0 vs 1.35.1 was that Content-Encoding: none was changed into Content-Encoding: identity (T258877).
@Peculiar_Investor I don't think that's an issue of 1.35.0 vs 1.35.1 but just that $wgDisableOutputCompression = true; doesn't work with the invisible caching which your hosting does.
Jan 14 2021
Jan 11 2021
On the topic of ssh access, there shouldn't be a "big headache of using the command line" for getting access to the cluster. I don't think anyone here with "Technical" in their role would have a problem doing that, but it shouldn't even be necessary: there are graphical tools for creating SSH keys and transferring files via ssh. And if the file to copy were on the bastion host, that would be even easier, as no jumping would be needed.
If getting access is being such a big issue (and for multiple people!), that seems a sign that the documentation is in urgent need of improvement. It should be a matter of following a number of steps with screenshots: fill in this value here, then click that button, copy the following magical settings into this file.
Dec 14 2020
@Ahonc: they want to upload Ukrainian files which are Public Domain in the US but not in Ukraine.
I agree with @Urbanecm that this seems a can of worms for contributors to Wikisource in Ukraine (whom we can fairly expect to be based in Ukraine), who would be uploading files that violate the copyright of their own country.
Dec 12 2020
I have been debugging the specific filter with @SRuizR and it wasn't a problem in the regex engine.
Nov 1 2020
The current setup seems inconsistent: private filters don't trigger a feed notification, yet an anonymous user can see on Special:AbuseLog that they were triggered, so I see no reason not to publish that through the RCFeed.
Sep 7 2020
We should get a CVE for this extension vulnerability. This code has been here since 2014, and was itself added to avoid an XSS, so basically (assuming it wasn't safe before and something changed) everyone with MobileFrontend installed would be affected.
My +2 to nray's patch
I thought it was removing links from headers, but it seems it was not doing anything ¯\_(ツ)_/¯ (other than adding a security vulnerability).
Actually removing the regex seems preferable, indeed.
However, I think this may produce links inside links, which the previous code was trying to avoid?
The basic fix I tried
probably fixed by changing to
This also happens (in Mobile) when forcing a different skin, such as monobook or vector
If the page is protected (thus no edit section link), the XSS doesn't fire
The first img doesn't really need any parameters:
== <center><img><img src=zxcv onerror=throw(document.domain)> ==
I have simplified it to
Aug 25 2020
potential security and privacy concerns with IRC surfaced
Aug 13 2020
There is certainly a lack of documentation. It would be appreciated if you could tell us the result of setting this up, or directly update https://www.mediawiki.org/wiki/Manual:$wgSMTP
Aug 12 2020
Aug 10 2020
Probably because the ids themselves would be everything needed to hide them.
What value were you using for title? 'File:Ջուդիթ Կրանց.jpg'? Or perhaps just 'Ջուդիթ Կրանց.jpg'?
Aug 8 2020
There is no need to actually proxy gravatar; we could have our own instance. Gravatar is just a service mapping an email's md5 to an uploaded image. Are people still uploading their avatars there? Didn't that stop like a decade ago? Even if some people have an image there, it seems saner to use our own "wikimedia avatars". I'm not particularly happy about using the (hashed) email as primary key, but that seems to be what they are working with.
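For reference, the whole mapping is just the md5 of the trimmed, lowercased email embedded in the avatar URL, so a self-hosted "wikimedia avatars" service would mostly mean swapping the hostname. A minimal sketch:

```python
import hashlib

def gravatar_url(email: str, size: int = 80) -> str:
    # Gravatar keys avatars on the md5 hex digest of the email,
    # trimmed of whitespace and lowercased.
    digest = hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()
    return f"https://www.gravatar.com/avatar/{digest}?s={size}"
```

The same email always hashes to the same URL regardless of case or surrounding whitespace, which is exactly the primary-key behavior discussed above.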
Aug 7 2020
It could go both ways. If, as a Hungarian with only a Hungarian credit card who is temporarily visiting the US, you are given HU options, it would succeed. OTOH, if you only had a US card, or if it persisted a US cookie after you came back, it's a failure.
Aug 2 2020
Files uploaded in 2013 with two upload entries at the same minute.
Jul 26 2020
The content coding 'identity' was added in rfc2616 with a note that it "SHOULD NOT be used in the Content-Encoding header". The transfer coding identity was removed by rfc7230. rfc7231 uses "identity" as a special value in the context of Accept-Encoding, not of Content-Encoding. Anyway, the semantics of Content-Encoding: identity are completely clear and supported, even if it makes for a redundant header.
Jul 9 2020
Jun 16 2020
$maxConcurrency was set to 50, but we had nearly one thousand operations pending.
I can disable $async on FileBackendStore::doQuickOperationsInternal(), and then it no longer fails. Understandably, that makes the process slower (Copied 969 captchas to storage in 43.6 seconds).
I found it is a file descriptor problem. ulimit -n is set to 1024. FormatJson is failing with
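For anyone reproducing this, the per-process descriptor limit can also be checked from inside Python (a sketch; the resource module is POSIX-only):

```python
import resource

# The soft limit is what actually triggers "too many open files" (EMFILE);
# the hard limit is the ceiling an unprivileged process may raise it to.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"open file limit: soft={soft} hard={hard}")
```

With soft=1024 and nearly a thousand concurrent operations each holding descriptors, exhausting the limit is expected.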
Jun 12 2020
I would try
- throwing a clearstatcache() somewhere, in case it makes it find the JSON file again
- running a program other than python to create the file externally, e.g. touch filename
Jun 8 2020
It doesn't make any sense that you can upload to phabricator, but not to commons.
I would suspect something crazy in some intermediate box, but the whole connection is encrypted.
Jun 7 2020
Note: The receiving Exim doesn't seem to be configured to accept list mail:
MX records cannot contain IP addresses. They must point to a hostname (plus a priority).
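For illustration, in zone-file terms (example.org and the addresses are placeholders):

```
example.org.       IN  MX  10  mail.example.org.  ; valid: priority + hostname
;example.org.      IN  MX  10  192.0.2.1          ; invalid: MX target must be a name
mail.example.org.  IN  A       192.0.2.1          ; the hostname then resolves to the IP
```

The indirection through the A record is what the MX syntax requires.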
May 31 2020
It might be a simple issue of changing the db charset, or adding a SET NAMES to the client.
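As a sketch of both options for MySQL/MariaDB (the database name is a placeholder, and the right charset depends on the data; utf8mb4 is the usual choice):

```sql
-- Option 1: change the database's default character set
ALTER DATABASE mywiki CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci;

-- Option 2: have the client declare its connection character set
SET NAMES utf8mb4;
```

Option 1 only affects newly created tables; existing columns keep their charset unless altered too.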
May 6 2020
Apr 23 2020
Maybe, rather than a new permission for that, something like $wgCustomModelProtection, which would allow requiring a specific right for certain models (presumably those that could be sensitive; most models shouldn't need that), rather than a one-right-fits-all approach.
Apr 22 2020
Apr 20 2020
Heh, calling it "Welcome Bot" sounded nicer, even if it would still be saying go away from GitHub ;)
I understand the app Name will be the bot name, so rather than "Wikimedia PR Closer" I would prefer something more user-friendly, such as "Wikimedia Welcome Bot for GitHub users"
Apr 17 2020
A point that is somewhat behind this is that the edit conflict page isn't too user-friendly, which makes edit conflicts more burdensome.
Apr 16 2020
https://github.com/WICG/trust-token-api seems to be another project doing basically the same thing.
If a third party were presenting our challenges, it could help make them unable to link the requestors of captchas (for whom they would have IP addresses, run js, etc.) to the actual wikipedia requests (since the token would be redeemed at a later date).
Yes, because with the behavior of the "back" button in browsers at the time, it was actually needed. It no longer is.
Apr 14 2020
Apr 11 2020
Note that MediaWiki doesn't "fail" to detect the conflict if the edit was made by the same user. It explicitly goes and does an expensive check to see if it should ignore the conflict because the edit is by the same user. This was explicitly implemented as a feature. Don't ask me why though ;)
Apr 5 2020
I don't think that graph is the right one, André. It may provide approximate data (both are emails), but I think list email is even sent from a completely different relay.
Mar 22 2020
I think it could be a good request that the tool shouldn't show it. However, the tool is completely right. The user was blocked for 1030 days.
Mar 15 2020
Why are you using that query?
Mar 14 2020
And how do you know, server-side, the number of px in 1 em for the current user?
There are file descriptors being opened that you did not capture. Some suggestions: openat, dup2, socket, accept...
Mar 13 2020
A great use case for this would be allowing a collapsed view of the history page. Currently, some page histories have barely changed for years, yet contain a lot of history entries due to edits and reverts/undos.
It would be great to have them collapsed into a single link mentioning there were 53 irrelevant edits, so that only changes actually affecting the current page are shown.
Feb 18 2020
Or simply a Michael Jackson event! (which we'd better be aware of, too)
Feb 3 2020
@Beej--phabricator can you check if the password being rejected was indeed in the large list at
Jan 14 2020
Maybe mwlog1001 will have some details about what happened?
Jan 11 2020
As an external solution, it could be added to an external mailing list archiver such as marc (they already have wikitech-l, mediawiki-l...)
Jan 3 2020
It was added by @chasemp on Apr 23 2019, 6:53 PM, along with WDoranWMF.
Dec 20 2019
The entry SBL205747 is special in that it is an entry requested by mail.com itself. When they consider that the email they are going to send is spam, they send it through an IP address listed there so that people checking the Spamhaus blacklist may block it.
Then the internal urls are really no different from the public ones. Converting them would simply mean prepending "https://upload.wikimedia.org/wikipedia/commons/thumb/"
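A sketch of that conversion (assuming the internal entries are the relative path portion, e.g. "a/ab/Foo.jpg/120px-Foo.jpg"):

```python
PUBLIC_PREFIX = "https://upload.wikimedia.org/wikipedia/commons/thumb/"

def to_public_url(internal_path: str) -> str:
    # Strip any leading slash so the join never doubles it.
    return PUBLIC_PREFIX + internal_path.lstrip("/")
```

With this, a single list of relative paths serves both purposes.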
Dec 12 2019
What do the internal swift urls look like? I'm not sure why we need two lists.
Also, while fine as a baseline, I don't expect python to be the most efficient approach. A better one could be constructed directly in C. Or it could even use a Bloom filter rather than a set.
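To illustrate the Bloom filter idea (a sketch in Python for clarity, even though a real implementation would be in C): membership tests can report rare false positives but never false negatives, using a small fixed bit array instead of storing every entry as a set does.

```python
import hashlib

class BloomFilter:
    def __init__(self, size_bits: int, num_hashes: int):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray((size_bits + 7) // 8)

    def _indexes(self, item: str):
        # Derive num_hashes indexes from a single md5 via double hashing.
        digest = hashlib.md5(item.encode("utf-8")).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:], "big")
        for i in range(self.num_hashes):
            yield (h1 + i * h2) % self.size

    def add(self, item: str) -> None:
        for idx in self._indexes(item):
            self.bits[idx // 8] |= 1 << (idx % 8)

    def __contains__(self, item: str) -> bool:
        # May rarely return True for absent items; never False for added ones.
        return all(self.bits[idx // 8] & (1 << (idx % 8))
                   for idx in self._indexes(item))
```

Sizing the bit array and hash count against the expected number of entries controls the false-positive rate.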
Dec 5 2019
I think the bzip2 API doesn't handle multistream files transparently, so tools coded using it would probably be affected.