Wed, Jun 21
So I guess someone is sending spam subject lines to wikimedia-gh with a forged From address of email@example.com, so that the mailing list software resends the spam in the form of a pending-moderation message. That's a really cute trick.
Mon, Jun 19
Yes it was my ip.
I'm not actually sure if it comes from the MaxMind db, and if so which db, but if it does, https://www.maxmind.com/en/geoip2-city-database-accuracy gives some accuracy numbers. Those are averages per country, though; I imagine major urban centers are much more accurate than rural places.
The GeoIP cookie already records the city (and a lat/long), so the data is already there.
Re: the file header, I'm assuming it'd be set on the original, carried over to thumbnails, and purged from all of them once the file is patrolled.
Sat, Jun 17
Fri, Jun 16
Thu, Jun 15
Maybe dupe of T121797?
A bigger question is where to store the hashes. Perceptual hashes are usually compared using Hamming distance, which is inefficient to do in a traditional MySQL database. I remember years ago manybubbles talked about how it would make sense to use Elasticsearch as the storage backend for image similarity, so it'd probably be useful to look in that direction.
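The comparison itself is trivial; the inefficiency is that no ordinary index can accelerate it, so MySQL would have to scan every row. A minimal Python sketch (the hash values are made up for illustration):

```python
def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two perceptual hash values.

    XOR leaves a 1 bit exactly where the two hashes disagree, so the
    popcount of the XOR is the Hamming distance.
    """
    return bin(h1 ^ h2).count("1")

# Two made-up 8-bit hashes that differ in exactly two bit positions:
a = 0b10101100
b = 0b10001101
print(hamming_distance(a, b))  # 2
```

Small distances (relative to the hash width) mean "probably the same image"; finding all rows within distance k of a query hash is the part a plain B-tree index can't help with.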
Wikipedia Zero traffic is tied to IP addresses, not users. So it definitely could be performant. Have MediaWiki set an unpatrolled header and purge it on patrol. Then (somehow) configure Varnish to understand WP0 IP ranges and block if the unpatrolled header is set.
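The Varnish-side decision sketched in Python (the header name and IP range are assumptions for illustration, not an actual config):

```python
import ipaddress

# Hypothetical Wikipedia Zero carrier ranges; real ranges would come
# from the zero-rating configuration, not be hardcoded like this.
WP0_RANGES = [ipaddress.ip_network("203.0.113.0/24")]

def should_block(client_ip: str, response_headers: dict) -> bool:
    """Block serving the file if the request comes from a WP0 range
    and MediaWiki marked it unpatrolled (header name is assumed)."""
    ip = ipaddress.ip_address(client_ip)
    from_wp0 = any(ip in net for net in WP0_RANGES)
    return from_wp0 and response_headers.get("X-Unpatrolled") == "1"
```

MediaWiki would purge the URL on patrol, so the cached object loses the header and the block stops applying.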
Oh, I see: the A record points to a page hosted by the registrar for .wiki (who happens to use AWS; I didn't originally think of navigating to 22.214.171.124 to find out more), so it's essentially a domain-parking service that redirects to our site.
Hmm. W.wiki seems to be similar in that we own the domain but the A record points to AWS. It has the additional interesting property that it's included in the subject alt names of our main certificate.
Wed, Jun 14
Tue, Jun 13
I thought this would have been dealt with by T133147
Mon, Jun 12
The pages referenced by cl_from in those rows are missing (i.e. have no page table entry); e.g. https://ba.wikipedia.org/w/index.php?curid=122102&uselang=en gives a badtitle error.
Is this query from a Labs replica or the actual db? In the past the Labs replicas have had replication issues related to DELETEs on the categorylinks table, causing them to retain old rows that aren't really there.
Fri, Jun 9
I know everyone is tired of discussing the code of conduct, but for non-Wikimedia repos hosted in Gerrit, I think it would have been more appropriate to have a mailing list discussion before force-merging a code of conduct doc into such repos.
ping @DFoy: does everything look good? This has been live for a couple of weeks now.
Thu, Jun 8
Do we really want to imply to the user that they can just replace the core/ directory? What if we add a new PHP entry point? Will this make (tarball) users less likely to upgrade skins/ and extensions/?
Tue, Jun 6
That's odd. Looks like the skin-specific styles for Vector are not loading.
Mon, Jun 5
Sorry this took so long. I think this is good to go.
Note that an alternative fix for this issue may come in the form of the TemplateStyles extension.
Wed, May 31
Just as an update, this is almost done and will be finished by the end of the week.
Mon, May 29
May 23 2017
May 22 2017
May 21 2017
It looks like as of the hackathon all of this is done (Yay!). I'm leaving this open because I'm going to do a quick double check of the fixes later this week, but this bug is essentially done :D
I can't for the life of me figure out how this could have happened
#0 /var/www/core/core/includes/libs/rdbms/TransactionProfiler.php(218): Wikimedia\Rdbms\TransactionProfiler->reportExpectationViolated('writes', 'query-m: REPLAC...')
Maybe it's just the localization cache interfering with the transaction profiler's no-writes-during-read-requests check.
May 20 2017
May 19 2017
$pageTitle = $rc->getTitle(); in the getDiffHistLinks method can return null, a case that isn't currently handled.
Yay, it works, but it seems silly that we now have two separate globals for showing SQL errors ($wgShowSQLErrors still seems to be needed for the API).
May 17 2017
Sorry for the delay, we will do this soon: https://gerrit.wikimedia.org/r/#/c/354113/
May 16 2017
Add a test comment?
In theory we could have a whitelist and then emit DENY or ALLOW-FROM depending on the origin, but it would have to be implemented in everything that renders/caches wiki pages (MediaWiki, Parsoid, Varnish...), which is a bit of a pain. Maybe it could be limited to authenticated page views (framing an unauthenticated view seems pretty harmless).
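The per-origin logic itself is simple; the pain is wiring it into every layer and varying the cache on it. A sketch, with an invented whitelist entry:

```python
from typing import Optional

# Hypothetical framing whitelist; real entries would be configuration.
FRAME_WHITELIST = {"https://trusted.example.org"}

def x_frame_options(origin: Optional[str]) -> str:
    """Emit ALLOW-FROM for whitelisted framing origins, DENY otherwise.

    ALLOW-FROM takes a single URI, which is exactly why the header has
    to be chosen per-request rather than set statically.
    """
    if origin in FRAME_WHITELIST:
        return "ALLOW-FROM " + origin
    return "DENY"
```

Since the value depends on the requesting origin, every cache in front (Varnish included) would need to key on it, which is where the "bit of a pain" comes from.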
unassigning from self. Community-tech has taken over working on this extension.
One handy (ab)use of PHP serialization is deep cloning.
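The same trick exists elsewhere; in Python, for example, a pickle round-trip produces a deep copy (copy.deepcopy is the idiomatic way, the serialization route is the (ab)use):

```python
import pickle

def deep_clone(obj):
    """Deep-copy obj by serializing it and deserializing the result."""
    return pickle.loads(pickle.dumps(obj))

nested = {"a": [1, 2, {"b": 3}]}
clone = deep_clone(nested)
clone["a"][2]["b"] = 99     # mutate the clone...
print(nested["a"][2]["b"])  # ...the original still prints 3
```

The caveat is the same as in PHP: it only works for objects the serializer can round-trip (no open file handles, sockets, etc.).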
This is probably the header that would improve our security the most. I've been working on this, but progress has been very slow, largely due to lack of time on my part. See https://www.mediawiki.org/wiki/Requests_for_comment/Content-Security-Policy and T135963 for more details. There are different levels of using this header, with different amounts of change required depending on how "strict" the policy is.
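As an illustration of those levels of strictness, a loose policy versus a strict one might look like the following (the directive values are invented examples, not the actual policy proposed in the RFC):

```python
# Invented example policies -- not the actual policy from the RFC.
LOOSE_CSP = ("default-src 'self'; "
             "script-src 'self' 'unsafe-inline' 'unsafe-eval'; "
             "report-uri /csp-report")
STRICT_CSP = ("default-src 'self'; "
              "script-src 'self'; "
              "report-uri /csp-report")

def csp_header(strict: bool, report_only: bool = True) -> tuple:
    """Return a (header name, value) pair.

    The Report-Only variant collects violation reports without
    breaking anything, which is the usual first deployment step.
    """
    name = ("Content-Security-Policy-Report-Only"
            if report_only else "Content-Security-Policy")
    return (name, STRICT_CSP if strict else LOOSE_CSP)
```

The stricter the policy, the more on-wiki gadgets and inline scripts have to change, which is where the differing amounts of work come from.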
It's related to whether a page is "click-jackable". For ordinary articles, that usually means whether there is a "patrol" link on the page.
May 4 2017
May 3 2017
I'm still working through all this, but some initial things:
May 2 2017
Apr 30 2017
@Legoktm: there is also a report that Debian has the wrong version as well (I haven't verified this myself).
Apr 29 2017
I sent a warning to mediawiki-l and wikitech-l (https://lists.wikimedia.org/pipermail/mediawiki-l/2017-April/046524.html). Arguably an issue of this type deserves a warning to mediawiki-announcements, but I don't have send access to that mailing list.
There's a post about this on oss-security now.
We should really issue another release for this right away. The SyntaxHighlight bug was by far the most severe of all the bugs in the last release. To avoid confusion we should probably bump the MediaWiki version number (even though that version number technically only applies to MediaWiki core, not SyntaxHighlight).
Apr 28 2017
+1 to just reverting the PdfHandler config change.
The character encoding thing looks like it's related to colours. Presumably the escape control character is being stripped by the logs.
I mean it's being sent as Content-Type: application/x-www-form-urlencoded.
Apr 27 2017
What's the error in question?
Apr 25 2017
This week is already becoming kind of insane due to events on frwiki. How about we do this on Monday?
This is more of a browser UI issue (it's the same for anything else protected by basic HTTP auth, e.g. https://logstash.wikimedia.org).