I work on the MediaWiki Security Team.
Fri, Feb 16
On enwiki there are some 3.5 million blocks of IPs in logging -- a rough guess judging by SELECT COUNT(*) FROM logging WHERE log_type = 'block' AND log_title RLIKE '^[1-9]' AND log_page = 0, and again without log_page = 0 (which additionally includes IPs that have a user page and accounts that start with a number).
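Spelled out, the two counts referenced above (the second variant simply drops the log_page condition):

```sql
-- Rough count of IP blocks: block log entries whose target title
-- starts with a digit and which have no associated page row.
SELECT COUNT(*)
FROM logging
WHERE log_type = 'block'
  AND log_title RLIKE '^[1-9]'
  AND log_page = 0;

-- Same count without the log_page condition; this additionally picks up
-- IPs that have a user page and accounts whose names start with a number.
SELECT COUNT(*)
FROM logging
WHERE log_type = 'block'
  AND log_title RLIKE '^[1-9]';
```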
It would depend on how important having ipc_timestamp in the index actually is (since log_search doesn't have that; OTOH, if there actually is a performance boost, that may be useful for other log_search use cases). Otherwise it's a pretty similar setup to the ip_changes table, and in many ways it's actually meant for a pretty similar use case (except log_search usually assumes the number of results will be very small).
Wouldn't this fail to work for the intended use case of T146628, since that task asks for the target of the block log entry, not the user associated with the log entry?
Thu, Feb 15
There's a whitelist in puppet for which log types are allowed on labs. Anything not on the list is not replicated to labs.
Ok, fixes for the failures are at: https://gerrit.wikimedia.org/r/#/c/410876/ https://gerrit.wikimedia.org/r/#/c/410869/ and https://gerrit.wikimedia.org/r/#/c/410894/ (In particular, the ImageMap one was an actual issue, in that i18n error messages were being used as raw HTML)
[I just randomly stumbled across this bug]
Get legal to sign off on this task (is it needed?)
Just to confirm, Collaboration team is planning to take this task on right away?
Wed, Feb 14
Tue, Feb 13
Overall looks good. Review passed. Some minor things:
Silly question - isn't the new index pointless? It's adding el_id on the end, but isn't that always on the end of every index, since it's the primary key?
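For context, a generic sketch (illustrative table, not the actual externallinks schema): in InnoDB, every secondary index implicitly carries the primary key columns at its end, so appending el_id explicitly changes nothing on disk:

```sql
-- Illustrative only: a minimal table with an auto-increment primary key.
CREATE TABLE example (
  el_id INT PRIMARY KEY AUTO_INCREMENT,
  el_to VARBINARY(255) NOT NULL
) ENGINE=InnoDB;

-- These two secondary indexes are effectively identical on disk,
-- because InnoDB stores the PK at the end of every secondary index:
CREATE INDEX idx_implicit ON example (el_to);        -- stored as (el_to, el_id)
CREATE INDEX idx_explicit ON example (el_to, el_id); -- redundant spelling
```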
This extension is incompatible with the ExternalStorage feature used on Wikimedia wikis, so we cannot install it.
This extension is incompatible with the ExternalStorage feature used on Wikimedia wikis, so we can't install it, and there's therefore not much point doing a security review.
Which is super weird. I guess this would happen if the mRecord property of the Revision object is null, but I'm not sure how that could happen.
BadMethodCallException from line 906 of /srv/mediawiki/php-1.31.0-wmf.20/includes/Revision.php: Call to a member function getContent() on a non-object (null)
Mon, Feb 12
To be clear - I feel it would be nice to have an independent review, although perhaps not as a hard blocker (I suppose it comes back to threat models and how "sensitive" the material being encrypted is)
Sun, Feb 11
FWIW, I think in a replicated environment the original query (with the addition of wfWaitForSlave() and splitting into separate commits) would be better than the proposed new query.
Sat, Feb 10
Fri, Feb 9
Thu, Feb 8
Given this is crypto software, it would be nice if it had published audits of its code. (The underlying openpgp.js does appear to have an audit at https://github.com/openpgpjs/openpgpjs/wiki/Cure53-security-audit )
Wed, Feb 7
For some cache cases surviving restarts is probably a good thing (e.g. sessions and maybe parser cache)
Mon, Feb 5
My take on this:
Fri, Feb 2
Thu, Feb 1
Tue, Jan 30
Given everyone's going to be at the all-hands/dev summit next week, maybe we could discuss this bug in person.
Have we considered just having it as a subdirectory of https://doc.wikimedia.org/ ? It seems like a documentation-type thing.
Mon, Jan 29
Yes, this patch looks correct, and fixes the unserialization issue
That's quite a range. They are all associated with the University of Tennessee, but does all of the University really need to be whitelisted? Including ResNet?
Sun, Jan 28
As in, I think a steward would have to add you to the group.
I think this is pretty obviously ok. Just needs someone to do it.
Sat, Jan 27
I fixed it
Fri, Jan 26
It's probably something that could be fixed in a post-processing layer on the SVG.
Wed, Jan 24
I'm not sure I would describe the English discussion as consensus.
Tue, Jan 23
Noticed the vegetarian option for lunch Tuesday was 'available on request' instead of being put out with the meaty food. This may be an issue for people.
As an aside, I think it would be more useful to users to have semi-deep. Most people want ~4 or 5 levels deep. The category tree becomes very messed up (at least on Commons) and usually gives weird results once you go 6 or more levels deep.
What's the path to choose here to get the patch finally merged?
I often like to point out that all WMF employees (AFAIK) can technically merge in MediaWiki and all extensions, even when they have no contributions to core or any extension.
I'm not sure how I feel. Leszek certainly has a lot of contributions to many extensions, and seems to be a good programmer. But I would normally want more contributions to MediaWiki core before +2 to core. (This should not be counted as a negative vote, I just need to think some about it)
So core is fixed now, so we should be all clear to go ahead with the fix in wikidata
Mon, Jan 22
ping on this. How is this going?
Sun, Jan 21
I'm asking you to trust my 9 years of Drupal experience.
Sat, Jan 20
I think a lot of us (myself included) get stuck in the trap of thinking about how much time/effort/money we have spent on a piece of software, so that the thought of abandoning that software seems crazy.
Fri, Jan 19
I thought https://firstname.lastname@example.org/unlearning-toxic-behaviors-in-a-code-review-culture-b7c295452a3c was an interesting article kind of related to this topic
Jan 18 2018
Do we actually have shared secrets?
Jan 17 2018
[Note: I am sort of away this week, so have limited availability]
Jan 15 2018
On the subject of things Google will not change - it would be nice if, while one task is waiting for review, the student were allowed to start the next task, to reduce the time where the student has nothing to do (provided they only have one task waiting for review and have successfully completed at least 2).
Jan 13 2018
I've always tried to ensure fixes are backported for extension fixes I'm involved with, but we probably need to do better.
Jan 12 2018
Ok, review done.
Jan 11 2018
I haven't finished with this yet, but the (relatively minor) concerns I have so far:
Jan 10 2018
Scraping data out of the serialized PHP in the log_params field in a MariaDB view would be difficult at best, and performance would certainly be horrible if this artificial field were then expected to be searchable with SQL.
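To illustrate the pain (the log_params value and key name here are hypothetical; real key names and formats vary by log type):

```sql
-- Suppose log_params held a serialized PHP blob like:
--   a:1:{s:8:"duration";s:8:"infinite";}   (hypothetical key/value)
-- Pulling a value out in pure SQL means string surgery like this,
-- which can never use an index, so every lookup is a full scan:
SELECT log_id,
       SUBSTRING_INDEX(SUBSTRING_INDEX(log_params, '"', 4), '"', -1)
         AS duration
FROM logging
WHERE log_type = 'block'
  AND log_params LIKE '%duration%';
```

And this particular surgery only works if the blob has exactly the quoting layout the SUBSTRING_INDEX counts assume; any other serialization shape silently returns garbage.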