User Details
- User Since
- Oct 24 2014, 10:10 PM (331 w, 3 d)
- Availability
- Available
- LDAP User
- Platonides
- MediaWiki User
Platonides
Yesterday
I don't think it's complicated at all. It should run fine on an sshd with a Match rule that only allows the user git from external networks (and, while we're at it, ForceCommand it there, too).
The part that may be controversial (simple, but controversial) is opening port 22 in the firewall to this machine. However, an sshd listening on an alternate port, with that port opened instead, is equally bad should there be a fatal sshd vulnerability.
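A rough sshd_config sketch of the Match idea (the network range and forced command are placeholders, not our real values):
```
# /etc/ssh/sshd_config sketch; 10.0.0.0/8 stands in for the internal networks.
# From outside the internal network, only the git user may log in.
Match Address *,!10.0.0.0/8
    AllowUsers git
# And the git user always gets a forced command (path is an example).
Match User git
    ForceCommand /usr/bin/git-shell
    PermitTTY no
```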
Jan 26 2021
(Maybe split this subthread into a new task "Connecting to prod should be easy?")
Jan 24 2021
Sorry @Peculiar_Investor, you are right that there was a change in 1.35.1; I was thinking it had been included in 1.35.0.
The related change between 1.35.0 and 1.35.1 was that Content-Encoding: none was changed into Content-Encoding: identity (T258877).
I don't think they would need the IP address. If all they want are statistics on the number of requests/solves from an IP address, they could be given an HMAC of the IP address with a secret salt. Plus probably the AS and country of the IP, since I'm sure that's also part of their risk analysis. They couldn't combine requests from wmf users with those from third parties; wikimedia sites would be on their own island, but that's the goal. We have a big enough user base that I doubt combining them would really be needed. That, plus proxying the actual image loads (and not letting them insert arbitrary javascript, but using a known-good copy), would work wrt privacy, I think. Still not ideal from a FOSS philosophical POV, though.
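A minimal sketch of that hashing idea (function name and key handling invented for illustration; the key would live only on our side):
```php
<?php
// Sketch: HMAC-SHA256 keyed with a secret salt gives a stable per-IP
// pseudonym that the third party cannot reverse without the key.
function anonymizeIp( string $ip, string $secretSalt ): string {
    return hash_hmac( 'sha256', $ip, $secretSalt );
}

// They could still count requests/solves per (pseudonymous) address:
echo anonymizeIp( '203.0.113.42', 'some-secret-salt' ), "\n";
```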
@Peculiar_Investor I don't think that's an issue of 1.35.0 vs 1.35.1, but just that $wgDisableOutputCompression = true; doesn't work with the invisible caching that your hosting does.
Jan 14 2021
Jan 11 2021
On the topic of ssh access, there shouldn't be a "big headache of using the command line" for getting access to the cluster. I don't think anyone here with "Technical" in their role would have a problem doing that, but it wouldn't even be necessary: there are graphical tools for creating SSH keys and transferring files via ssh. And if the file to copy was on the bastion host, that would be even easier, as no jumping would be needed.
If getting access is being such a big issue (and for multiple people!), that seems a sign that the documentation is in urgent need of improvement. It should be a matter of following a number of steps with screenshots: fill this value here, then click that button, copy the following magical settings into this file.
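For instance, the "magical settings" could boil down to a snippet like this in ~/.ssh/config (hostnames are placeholders, not the real ones):
```
# ~/.ssh/config sketch; hostnames here are examples.
Host bastion.example.org
    User yourshellname
Host *.prod.example.org
    # Jump through the bastion automatically; graphical sftp clients honor this too.
    ProxyJump bastion.example.org
    User yourshellname
```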
Dec 14 2020
@Ahonc: they want to upload Ukrainian files which are Public Domain in the US but not in Ukraine.
I agree with @Urbanecm that this seems a can of worms for contributors to the Ukrainian Wikisource (whom we can fairly expect to be based in Ukraine), who would be uploading files violating the copyright of their own country.
Dec 12 2020
I have been debugging the specific filter with @SRuizR and it wasn't a problem in the regex engine.
Nov 1 2020
The current setup seems inconsistent: private filters don't trigger a feed notification, yet an anonymous user can see on Special:AbuseLog that they were triggered, so I see no reason not to publish that through the RCFeed.
Sep 7 2020
We should get a CVE for this extension vulnerability. This code has been here since 2014, and was itself added to avoid an XSS, so basically (assuming it wasn't safe before and something changed) everyone with MobileFrontend installed would be affected.
My +2 to nray's patch.
I thought it was removing links from headers, but it seems it was not doing anything ¯\_(ツ)_/¯ (other than adding a security vulnerability).
Testing it.
Actually removing the regex seems preferable, indeed.
However, I think this may produce links inside links, which the previous code was trying to avoid?
The basic fix I tried
probably fixed by changing to
This also happens (in Mobile) when forcing a different skin, such as monobook or vector
If the page is protected (thus no edit section link), the XSS doesn't fire
The first img doesn't really need any parameters:
== <center><img><img src=zxcv onerror=throw(document.domain)> ==
I have simplified it to
Aug 25 2020
potential security and privacy concerns with IRC surfaced
Aug 13 2020
There is certainly a lack of documentation. It would be appreciated if you could tell us the result of setting this up. Or directly update https://www.mediawiki.org/wiki/Manual:$wgSMTP
Aug 12 2020
MediaWiki can use the internal PHP mail() function or PEAR::Mail.
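For the manual's benefit, the SMTP variant looks roughly like this (host and credentials are placeholders):
```php
<?php
// LocalSettings.php sketch: send mail through an SMTP server instead of mail().
$wgSMTP = [
    'host'     => 'ssl://smtp.example.com', // mail server (placeholder)
    'IDHost'   => 'example.com',            // used when generating Message-IDs
    'port'     => 465,
    'auth'     => true,
    'username' => 'wiki@example.com',
    'password' => 'secret',
];
// Leaving $wgSMTP = false (the default) keeps using PHP's mail() function.
```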
Aug 10 2020
Probably because the ids themselves would be everything needed to hide them.
What value were you using on title? 'File:Ջուդիթ Կրանց.jpg' ? Or perhaps something like just 'Ջուդիթ Կրանց.jpg' ?
Aug 8 2020
There is no need to actually proxy gravatar. We could have our own instance. Gravatar is just a service mapping an email's md5 to an uploaded image. Are people still uploading their avatars there? Didn't that stop like a decade ago? Even if some people have an image there, it seems saner to use our own "wikimedia avatars". I'm not particularly happy about using the (hashed) email as primary key, but that seems to be what they are working with.
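The whole scheme is roughly this (sketch; the fallback parameter is just an example):
```php
<?php
// Sketch of the Gravatar lookup scheme: md5 of the normalized email is the key.
// An own instance would just serve the same mapping from our own storage.
function gravatarUrl( string $email, int $size = 80 ): string {
    $hash = md5( strtolower( trim( $email ) ) );
    return "https://www.gravatar.com/avatar/{$hash}?s={$size}&d=identicon";
}
```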
Aug 7 2020
It could go both ways. If, as a Hungarian with only a Hungarian credit card temporarily visiting the US, you are given HU options, it would succeed. OTOH, if you only had a US card, or if it persisted a US cookie after you came back, it's a failure.
Aug 2 2020
Files uploaded in 2013 with two upload entries at the same minute.
Jul 26 2020
The 'identity' content coding was added in RFC 2616 with a note that it "SHOULD NOT be used in the Content-Encoding header". The identity transfer coding was removed by RFC 7230. RFC 7231 uses "identity" as a special value in the context of Accept-Encoding, not of Content-Encoding. Anyway, the semantics of a Content-Encoding: identity are completely clear and supported, even if it may make for a redundant header.
Jul 9 2020
Jun 16 2020
$maxConcurrency was set to 50, but we had nearly one thousand operations pending.
I can disable $async in FileBackendStore::doQuickOperationsInternal(), and then it no longer fails. Understandably, that makes the process slower (Copied 969 captchas to storage in 43.6 seconds).
I found it is a file descriptor problem. ulimit -n is set to 1024. FormatJson is failing with
Jun 12 2020
I would try:
- throwing a clearstatcache() somewhere, in case it makes it find the JSON file again (a sketch follows below)
- running a program other than python to create the file externally, e.g. touch filename
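A minimal sketch of the first suggestion (the path is hypothetical):
```php
<?php
// Sketch: PHP caches stat() results, so a file created by an external process
// (e.g. the python script) may look missing until the cache is cleared.
$path = '/tmp/output.json'; // hypothetical path
clearstatcache( true, $path );
if ( file_exists( $path ) ) {
    $data = json_decode( file_get_contents( $path ), true );
}
```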
Jun 8 2020
It doesn't make any sense that you can upload to Phabricator, but not to Commons.
I would suspect some craziness in some intermediate box, but the whole connection is encrypted.
Jun 7 2020
Note: The receiving Exim doesn't seem to be configured to accept list mail:
MX records cannot have IP addresses. They must be associated with a hostname (plus a priority).
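For example (names and addresses are placeholders):
```
; DNS zone sketch: the MX target is a hostname with its own A/AAAA record.
example.org.       IN MX 10 mail.example.org.
mail.example.org.  IN A  192.0.2.25
```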
May 31 2020
It might be a simple issue of changing the db charset, or adding a SET NAMES to the client.
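The latter is a one-liner (utf8mb4 assumed here just as an example):
```sql
-- Sketch: tell the server which charset this client connection uses.
SET NAMES utf8mb4;
```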
May 6 2020
Apr 23 2020
Maybe, rather than a new permission for that, something like $wgCustomModelProtection[], which would allow requiring a specific right for certain models (presumably those that could be sensitive; most models shouldn't need that), instead of a one-right-fits-all approach.
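I.e. something like this (everything here is hypothetical, it's just the shape I have in mind):
```php
<?php
// Hypothetical configuration sketch: map sensitive content models to the
// right required to edit pages using them. Names are invented for illustration.
$wgCustomModelProtection['sanitized-css'] = 'editsitecss';
$wgCustomModelProtection['json']          = 'edit-json-models';
// Models not listed would keep behaving as today, needing no extra right.
```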
Apr 22 2020
Apr 20 2020
Heh, calling it "Welcome Bot" sounded nicer, even if it would still be saying "go away" from GitHub ;)
I understand the app name will be the bot name, so rather than "Wikimedia PR Closer" I would prefer something more user-friendly, such as "Wikimedia Welcome Bot for GitHub users".
Apr 17 2020
A point that is somewhat behind this is that the edit conflict page isn't too user-friendly, which makes edit conflicts more burdensome.
Apr 16 2020
https://github.com/WICG/trust-token-api seems to be another project doing basically the same thing.
If a third party were presenting our challenges, it could help make them unable to link the requestors of captchas (for which they would have IP addresses, run js, etc.) with the actual wikipedia requests (since the token would be redeemed at a later date).
Yes, because with the behavior of the "back" button in browsers at the time, it was actually needed. It no longer is.
Apr 14 2020
Apr 11 2020
Note that MediaWiki doesn't "fail" to detect the conflict if the edit was made by the same user. It explicitly goes and does an expensive check to see if it should ignore the conflict because the edit is by the same user. This was explicitly implemented as a feature. Don't ask me why though ;)
Apr 5 2020
I don't think that graph is the right one, André. It may provide approximate data (both are emails), but I think list email is even sent from a completely different relay.
Mar 22 2020
I think it would be a fair request that the tool not show it. However, the tool is completely right: the user was blocked for 1030 days.
Mar 15 2020
Why are you using that query?
Mar 14 2020
And how do you know, server-side, the number of px in 1 em for the current user?
There are file descriptors being opened that you did not capture. Some suggestions: openat, dup2, socket, accept...
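Assuming strace is the tool in play, the filter could be widened along these lines:
```
# Sketch, assuming strace: capture the other fd-creating syscalls too.
strace -f -e trace=openat,dup2,socket,accept -p <pid>
```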
Mar 13 2020
A great use case for this would be allowing a collapsed view of the history page. Currently, there are page histories where the page has barely changed for years, yet there are a lot of history entries due to edits and reverts/undos.
It would be great to have them collapsed as a single link mentioning there were 53 irrelevant edits, so that only actual changes affecting the current page are shown.
Feb 18 2020
Or simply a Michael Jackson event! (something we'd better be aware of, too)
Feb 3 2020
@Beej--phabricator can you check if the password being rejected was indeed in the large list at
https://github.com/danielmiessler/SecLists/blob/aad07ff/Passwords/10_million_password_list_top_100000.txt ?
Jan 14 2020
Maybe mwlog1001 will have some details about what happened?
Jan 11 2020
As an external solution, it could be added to an external mailing list archiver such as marc (they already have wikitech-l, mediawiki-l...)
Jan 3 2020
It was added by @chasemp on Apr 23 2019, 6:53 PM, along with WDoranWMF.
Dec 20 2019
The entry SBL205747 is special in that it is an entry requested by mail.com itself. When they consider that the email they are going to send is spam, they send it through an IP address listed there so that people checking the Spamhaus blacklist may block it.
Then the internal urls are really no different than the public ones. Converting them would simply mean prepending "https://upload.wikimedia.org/wikipedia/commons/thumb/"
Dec 12 2019
What do the internal swift urls look like? I'm not sure why we need two lists.
Also, while useful as a baseline, I don't expect python to be the most efficient option. A better one could be constructed directly in C. Or even use a Bloom filter rather than a set.
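Sketching the Bloom filter direction in PHP just to show the shape (sizes and hashing are arbitrary choices, not a tuned implementation):
```php
<?php
// Toy Bloom filter: trades a small false-positive rate for far less memory
// than a set. m (bits) and k (hashes) here are arbitrary example values.
class BloomFilter {
    private string $bits;
    public function __construct(
        private int $m = 8 * 1024 * 1024, // number of bits (~1 MiB)
        private int $k = 4                // hash functions per item
    ) {
        $this->bits = str_repeat( "\0", intdiv( $this->m, 8 ) );
    }
    private function positions( string $item ): array {
        $pos = [];
        for ( $i = 0; $i < $this->k; $i++ ) {
            // Derive k cheap hash values from md5 with a varying prefix.
            $pos[] = hexdec( substr( md5( $i . $item ), 0, 8 ) ) % $this->m;
        }
        return $pos;
    }
    public function add( string $item ): void {
        foreach ( $this->positions( $item ) as $p ) {
            $byte = intdiv( $p, 8 );
            $this->bits[$byte] = chr( ord( $this->bits[$byte] ) | ( 1 << ( $p % 8 ) ) );
        }
    }
    public function mightContain( string $item ): bool {
        foreach ( $this->positions( $item ) as $p ) {
            if ( !( ord( $this->bits[ intdiv( $p, 8 ) ] ) & ( 1 << ( $p % 8 ) ) ) ) {
                return false; // definitely not present
            }
        }
        return true; // possibly present (false positives possible)
    }
}

$bf = new BloomFilter();
$bf->add( 'upload/abc.jpg' );
var_dump( $bf->mightContain( 'upload/abc.jpg' ) ); // bool(true)
```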
Dec 5 2019
I think the bzip2 api doesn't handle the multistream transparently, so tools coded using that would probably be affected.
Dec 1 2019
The error "page" is actually https://office.wikimedia.org/wiki/Main_Page#/media/File:Wikimedia_Foundation_All_Hands_2018_-_Myleen_Hollero_(edited).jpg, which is failing due to the underlying https://office.wikimedia.org/w/api.php?action=query&format=json&prop=imageinfo&titles=File%3AWikimedia_Foundation_All_Hands_2018_-_Myleen_Hollero_(edited)%2Ejpg&iiprop=url&iiurlwidth=1024&smaxage=86400&maxage=86400 erroring with:
code: readapidenied
info: You need read permission to use this module.
Nov 25 2019
I can help mentor these, @Gopavasanth.
Nov 16 2019
Nov 7 2019
The file was reuploaded by Denniss in September 2016, but the original file is still missing, TheSandDoctor.
(old comment that was waiting as draft in the browser)
Oct 12 2019
I'm not sure about the purges. Sometimes you need a null edit in order to really bypass some caches.
Looks like a partially downloaded image. Probably an interrupted download produced a truncated image that was then cached (locally, on a server?).
Oct 10 2019
For one, when reading the UI of T234537#5553996, I initially thought that "Require email address" meant requiring control of the email address (i.e. that the feature was like requiring the user to validate the change via email, in addition to knowing the current password, before password changes take effect), whereas it is actually meant to require knowing which email address is linked to the account.
Oct 6 2019
As an alternative solution, you could add an AbuseFilter that prevents non-admins from adding {{yes}}.
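A rough sketch of such a filter (untested; the pattern is kept deliberately simple):
```
/* Rough AbuseFilter sketch (untested): match when a non-admin adds {{yes}} */
!( "sysop" in user_groups ) &
( added_lines rlike "\{\{yes\}\}" )
```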
The default value of stablepages is 'exclude', since the very beginning (f798ae02d418f63301da636d1d21719ead743773).
Sep 22 2019
Similarly, on Wednesday ...@yahoo.com subscriptions were disabled for wikitech-l due to this same issue, bouncing https://lists.wikimedia.org/pipermail/wikitech-l/2019-September/092531.html, a legitimate email which seems perfectly innocuous.
See T22507 for the same issue in 2009
Yesterday, we had the same issue on biblio-es-l: delivery was automatically disabled for all subscribers using a yahoo.es email address (funnily, it was not the case for those with a yahoo.com one), as the max retry timeout for an email from Tuesday was reached.
Aug 17 2019
There is an API which allows you to look up deleted files by hash:
https://commons.wikimedia.org/w/api.php?action=query&list=filearchive&fasha1base36=eu7dlnl8z4upc36048um3aonw45w510&faprop=sha1
https://commons.wikimedia.org/w/api.php?action=query&list=filearchive&fasha1=7f08c97431182ef389ef6f09faac9ff6410b5674&faprop=sha1
Aug 10 2019
It is trying to purge objects created more than 2592000 seconds ago (30 days), with a sleep of 0.0005 seconds between each batch.
Jun 14 2019
I'm also ok with that.