User Details
- User Since
- Oct 24 2014, 10:10 PM (401 w, 5 d)
- Availability
- Available
- LDAP User
- Platonides
- MediaWiki User
- Platonides [ Global Accounts ]
Wed, Jun 15
May 10 2022
Instead of clearing the handlers, why not avoid registering the handlers again?
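A hypothetical sketch of what I mean, with made-up names, just to illustrate making registration idempotent instead of clearing afterwards:

```lang=php
// Sketch: guard the registration with a flag so repeated calls are no-ops,
// instead of clearing the handlers and re-adding them later.
// Class and method names are invented for illustration.
class HandlerSetup {
	private static $registered = false;

	public static function registerHandlers(): void {
		if ( self::$registered ) {
			return; // already registered, nothing to do
		}
		// ... actual handler registration goes here ...
		self::$registered = true;
	}
}
```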
Feb 2 2022
Jan 13 2022
@MoritzMuehlenhoff, did you see https://www.spinics.net/lists/stable/msg509296.html ?
Apparently upstream identified the issue as 09e856d54bda5f288ef8437a90ab2b9b3eab83d1 and reverted it on all stable trees (Debian might not have picked the revert though).
I would have expected the wikitech timeline to contain a final entry for "mx2001" back into (i.e. T297128)
Jan 12 2022
Vector was created in 2009 (became the default in 2010), so instead of vector19 it would have made more sense to call it vector2009.
If I understand this task correctly, currently the Ganeti cluster is running on stretch nodes. The VMs themselves have no explicit kvm:machine_version set, which on stretch nodes means "pc-i440fx-2.8", but on buster nodes that would default to "pc-i440fx-3.1" and thus they would be incompatible.
Jan 2 2022
Dec 28 2021
I think it should be something to set at the *BagOStuff level. MediumSpecificBagOStuff has the concept of a configurable keyspace, which is then inherited by the other classes, but MediumSpecificBagOStuff::makeGlobalKey() hardcodes the keyspace to global.
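Roughly how I read the key construction (a simplified sketch, not the real class; the actual MediumSpecificBagOStuff code differs in details like escaping and key length limits):

```lang=php
// Simplified sketch of the behaviour described above.
class SketchBagOStuff {
	/** @var string Configurable per-wiki prefix, e.g. "enwiki" */
	private $keyspace;

	public function makeKey( $class, ...$components ) {
		// Uses the configurable keyspace.
		return implode( ':', array_merge( [ $this->keyspace, $class ], $components ) );
	}

	public function makeGlobalKey( $class, ...$components ) {
		// Keyspace is hardcoded to "global": the limitation discussed here.
		return implode( ':', array_merge( [ 'global', $class ], $components ) );
	}
}
```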
Dec 7 2021
@Pealhasan.x2 one would generally mark the second report of a bug as a duplicate of the first one. However, in this case it was tracked on T293556, and only determined to be the same as this one later. Marking it as a duplicate means "this is the same problem as T293556". It makes sense to put them in this order since the work was done there, even though in this case your report happened to come earlier.
Oct 30 2021
Just noticed it again, this time on a page with an ampersand in the title: it is shown as the raw HTML entity (&amp;).
Oct 4 2021
Aug 26 2021
Aug 24 2021
If we are going to make that the default for Japanese, I think it would be best not to manually add markup forcing it into every score.
Aug 22 2021
We may submit it again if needed. I would expect findActorIdInternal to be a quick action. I guess the db connection timed out while it was doing all the slow file work so the next db action (which happened to be that findActorIdInternal) failed.
Thanks. That's weird. :/
I find that the Article & Talk links that appear are a bit out of place. I would put it as a 'Discuss' link beside the edit link (which would then require getting a suitable icon).
$wgMaxUploadSize is set in InitialiseSettings.php to 4 GB. I was under the impression that it was 1 GB, but it is set to 4 GB per 46e69532.
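In a plain LocalSettings.php the equivalent setting would look something like this (the production config expresses it per-wiki in InitialiseSettings.php):

```lang=php
// Plain LocalSettings.php form of the limit mentioned above, in bytes.
$wgMaxUploadSize = 4 * 1024 * 1024 * 1024; // 4 GB
```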
@Mojtaba-kd, please wait a few working days so that people may catch up. This is not so urgent that it needs to be done during the weekend.
What about:
It may indeed be a filesize issue.
Aug 20 2021
This seems to have been given the internal video2commons id 18bc7a39f68a4169
Aug 19 2021
(I don't know why some have a "-l" and some have not. I leave it up to the mailman admins if we should harmonize that. I'd prefer the variant without "-l".)
Aug 11 2021
It doesn't seem so. It is serving a non-expired certificate generated in July:
Not Before: 7/11/2021
Not After: 10/9/2021
Jul 7 2021
That was fast :)
Jul 6 2021
I have actually removed those two print() statements (some debugging, it seems), so it doesn't produce any output.
Jul 5 2021
It's still sending the announcement-only mail, but the cron is working now. :-) :-)
So you are trying to import https://te.wikipedia.org/wiki/%E0%B0%B8%E0%B0%BE%E0%B0%AE%E0%B1%86%E0%B0%A4%E0%B0%B2%E0%B1%81_-_%E0%B0%85 ?
Does importing another page, or not including the full history, succeed?
Jul 4 2021
And, weirdly enough, it both went through and sent back an auto-response saying it's an announcement-only mailing list.
Well, having too many things is probably part of the reason ;-)
This can't be that hard. @Legoktm do you want me to have a look at this? It doesn't seem to require any advanced permissions, only on potd and ml, so I could probably handle it.
mailman3 supports having an account with multiple emails. Requiring one of them (not necessarily the mail used in the mailing list) to match the wiki one seems acceptable.
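As a rough sketch of the check I mean (names are hypothetical; this is just the logic, not mailman code):

```lang=php
// Accept the link if any address on the mailman3 account matches the
// confirmed email of the wiki account.
function accountMatchesWikiEmail( array $mailmanAddresses, string $wikiEmail ): bool {
	$wikiEmail = strtolower( trim( $wikiEmail ) );
	foreach ( $mailmanAddresses as $address ) {
		if ( strtolower( trim( $address ) ) === $wikiEmail ) {
			return true;
		}
	}
	return false;
}
```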
Jul 3 2021
Jun 26 2021
Having gerrit autobump it may actually be preferable, indeed.
Jun 25 2021
May 14 2021
Probably more a Feature Request for upstream, but I think mailman3 should parse that rejection message, find out that the error is actually due to the specific message it was trying to deliver, and not increment the bounce counter. Giving semantics to the error messages isn't ideal, but I'm not sure that's possible with their enhanced status codes alone. At least those error messages are very clear on why they are rejecting. It also means hardcoding the messages used by certain vendors (to which more can be added in the future), but gmail/gsuite is prevalent enough, and this issue will appear often enough, to make it worthwhile.
Apr 8 2021
I'm not convinced by the "don't commit the result" part. The "compiled" code is still needed for tarballs and even for the developers themselves. And they must be using the same version as in prod, or they could be testing slightly different code, which would be hard to discover.
Mar 28 2021
I'm not currently running an IRC bot from Toolforge, but I have been using SASL for a long time. That code was already there 5 years ago.
Mar 1 2021
I don't think it's complicated at all. It should work fine with an sshd that has a Match rule only allowing the user git from external networks (and, while we're at it, ForceCommand it there, too).
The part that may be controversial (simple but controversial) is opening port 22 in the firewall to this machine. However, an sshd listening on an alternate port and opening that one instead is equally bad, should there be a fatal sshd vulnerability.
Jan 26 2021
(Maybe split this subthread into a new task "Connecting to prod should be easy?")
Jan 24 2021
Sorry @Peculiar_Investor, you are right that there was a change in 1.35.1; I was thinking this was included in 1.35.0.
The related change in 1.35.0 vs 1.35.1 was that Content-Encoding: none was changed into Content-Encoding: identity (T258877).
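In terms of what the response carries, the change amounts to something like this (a simplified before/after sketch, not the actual MediaWiki code path):

```lang=php
// 1.35.0 emitted a non-standard value that some proxies/caches mishandle:
header( 'Content-Encoding: none' );
// 1.35.1 emits the defined "no transformation applied" value instead (T258877):
header( 'Content-Encoding: identity' );
```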
I don't think they would need the IP address. If all they want are statistics on the number of requests/solves from an IP address, they could be given an HMAC of the IP address with a secret salt. Plus probably the AS and country of the IP, since I'm sure that's also part of their risk analysis. They couldn't combine requests from wmf users with those from third parties; wikimedia sites would be on their own island, but that's the goal. We have a big enough user base that I doubt combining would really be needed. That, plus proxying the actual image loads (and not letting them insert arbitrary javascript, but using a known-good copy), I think would work wrt privacy. Still not ideal from a FOSS philosophical POV, though.
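Roughly what I mean, as a sketch (the salt handling is just an example):

```lang=php
// Sketch: hand the vendor a keyed hash of the IP instead of the IP itself.
// The secret salt never leaves our side, so requests from the same address
// remain correlatable for them, but the address itself is not recoverable.
$secretSalt = getenv( 'IP_HMAC_SALT' ); // hypothetical secret kept by us
$ipToken = hash_hmac( 'sha256', $requestIp, $secretSalt );
```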
@Peculiar_Investor I don't think that's an issue of 1.35.0 vs 1.35.1 but just that $wgDisableOutputCompression = true; doesn't work with the invisible caching which your hosting does.
Jan 14 2021
Jan 11 2021
On the topic of ssh access, there shouldn't be a "big headache of using the command line" for getting access to the cluster. I don't think anyone here with "Technical" in their role would have a problem doing that, but it wouldn't even be necessary. There shouldn't be a need to use a command line at all: there are graphical tools for creating SSH keys and transferring files via ssh. And if the file to copy were on the bastion host, that would be even easier, as no jumping would be needed.
If getting access is such a big issue (and for multiple people!), that seems a sign that the documentation is in urgent need of improvement. It should be a matter of following a number of steps with screenshots: fill this value here, then click that button, copy the following magical settings into this file.
Dec 14 2020
@Ahonc: they want to upload Ukrainian files which are Public Domain in the US but not in Ukraine.
I agree with @Urbanecm that this seems like a can of worms for contributors to Wikisource in Ukraine (whom we can fairly expect to be based in Ukraine), who would be uploading files violating the copyright of their local country.
Dec 12 2020
I have been debugging the specific filter with @SRuizR and it wasn't a problem in the regex engine.
Nov 1 2020
The current setup seems inconsistent, since private filters don't trigger a feed notification, yet an anonymous user can see on Special:AbuseLog that they were triggered, so I see no reason not to publish that through the RCFeed.
Sep 7 2020
We should get a CVE for this extension vulnerability. This code has been here since 2014, and was itself added to avoid an XSS, so basically (assuming it wasn't safe before and something changed) everyone with MobileFrontend installed would be affected.
My +2 to nray's patch.
I thought it was removing links from headers, but it seems it was not doing anything ¯\_(ツ)_/¯ (other than adding a security vulnerability).
Testing it.
Actually removing the regex seems preferable, indeed.
However, I think this may produce links inside links, which the previous code was trying to avoid?
The basic fix I tried
probably fixed by changing to
This also happens (in Mobile) when forcing a different skin, such as monobook or vector
If the page is protected (thus no edit section link), the XSS doesn't fire
The first img doesn't really need any parameters:
== <center><img><img src=zxcv onerror=throw(document.domain)> ==
I have simplified it to
Aug 25 2020
potential security and privacy concerns with IRC surfaced
Aug 13 2020
There is certainly a lack of documentation. It would be appreciated if you could tell us the result of setting this up. Or directly update https://www.mediawiki.org/wiki/Manual:$wgSMTP
Aug 12 2020
MediaWiki can use the internal PHP mail() function or PEAR::Mail.
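For the SMTP route, the configuration documented on that manual page looks roughly like this (placeholder values):

```lang=php
// $wgSMTP = false (the default) uses PHP's mail(); setting an array
// makes MediaWiki send through SMTP via PEAR::Mail instead.
$wgSMTP = [
	'host'     => 'ssl://smtp.example.org', // outgoing mail server
	'IDHost'   => 'example.org',            // domain used for Message-ID headers
	'port'     => 465,
	'auth'     => true,
	'username' => 'wiki@example.org',
	'password' => 'secret',
];
```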
Aug 10 2020
Probably because the ids themselves would be everything needed to hide them.
What value were you using for title? 'File:Ջուդիթ Կրանց.jpg'? Or perhaps something like just 'Ջուդիթ Կրանց.jpg'?
Aug 8 2020
There is no need to actually proxy gravatar. We could have our own instance. Gravatar is just a service mapping an email md5 to an uploaded image. Are people still uploading their avatars there? Didn't that stop like a decade ago? Even if some people have an image there, it seems saner to use our own "wikimedia avatars". I'm not particularly happy about using the (hashed) email as primary key, but that seems to be what they are working with.
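For context, the whole service boils down to this mapping (sketch):

```lang=php
// Gravatar is essentially a lookup of an image keyed by the md5 of the
// trimmed, lowercased email address; a self-hosted "wikimedia avatars"
// service would only need to serve the same kind of URL on our own hostname.
$hash = md5( strtolower( trim( $email ) ) );
$url  = "https://www.gravatar.com/avatar/{$hash}?d=identicon&s=80";
```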
Aug 7 2020
It could go both ways. If, as a Hungarian with only a Hungarian credit card who is temporarily visiting the US, you are given HU options, it would succeed. OTOH, if you only had a US card, or if it persisted a US cookie after you came back, it's a failure.
Aug 2 2020
Files uploaded in 2013 with two upload entries at the same minute.
Jul 26 2020
The content encoding 'identity' was added in rfc2616 with a note that it "SHOULD NOT be used in the Content-Encoding header". The transfer coding identity was removed by rfc7230. rfc7231 uses "identity" as a special value in the context of Accept-Encoding, not of Content-Encoding. Anyway, the semantics of Content-Encoding: identity are completely clear and supported, even if it may make for a redundant header.
Jul 9 2020
Jul 4 2020
Jun 16 2020
$maxConcurrency was set to 50, but we had nearly one thousand operations pending.
I can disable $async on FileBackendStore::doQuickOperationsInternal(), and then it no longer fails. Understandably, that makes the process slower (Copied 969 captchas to storage in 43.6 seconds).
I found it is a file descriptor problem. ulimit -n is set to 1024. FormatJson is failing with
Jun 12 2020
I would try
- throwing a clearstatcache() somewhere, in case it makes it find the Json file again (see the sketch after this list)
- running a program other than python to create the file externally, e.g. touch filename
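For the first point, this is the kind of call I have in mind (the path variable is hypothetical):

```lang=php
// Drop PHP's cached stat info for that path so the next
// file_exists()/filesize() check really hits the filesystem again.
clearstatcache( true, $jsonFilePath ); // $jsonFilePath: path of the Json file
```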
Jun 8 2020
It doesn't make any sense that you can upload to phabricator, but not to commons.
I would suspect something crazy with some intermediate box, but the whole connection is encrypted.
Jun 7 2020
Note: The receiving Exim doesn't seem to be configured to accept list mail:
MX records cannot have IP addresses. They must be associated with a hostname (plus a priority).
May 31 2020
It might be a simple issue of changing the db charset, or adding a SET NAMES to the client.
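A sketch of what the client-side option would look like with mysqli (placeholder connection values):

```lang=php
// The "SET NAMES" option, applied from the client side.
$db = new mysqli( 'localhost', 'user', 'password', 'dbname' ); // placeholders
$db->set_charset( 'utf8mb4' );        // preferred API for the same effect
// or, equivalently, as a plain statement:
$db->query( 'SET NAMES utf8mb4' );
```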