The plug-in could easily be moved to its own non-core, non-MediaWiki repo. The only thing it uses MediaWiki for is checking whether the number passed to it is a non-Latin zero, which doesn't matter for PageTriage (since it's only on English Wikipedia). We could then just make it a dependency in PageTriage via Composer and remove it from core.
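For illustration, that would just be a line in PageTriage's composer.json along these lines; the package name (wikimedia/equivset) and version constraint are placeholders, since the library hasn't been published under a final name yet:

	{
		"require": {
			"wikimedia/equivset": "^1.0"
		}
	}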
@Huji: Discussing further with Legal and the Echo team. Need to figure out data retention issues.
Haven't been able to reproduce. Pinged the reporter on the thread.
Mon, Oct 16
Fri, Oct 13
Could someone elaborate on what 'purge script' we're talking about? Are we talking about eraseArchivedFile.php or something else? Who is running this currently?
Thu, Oct 12
I don't agree with € -> G and π -> R, but the rest look reasonable (although ∆ is debatable).
Wed, Oct 11
Pinged WMF Privacy/Legal/Security to get feedback on this before moving forward.
Tue, Oct 10
Some RTL bugs may have been fixed by https://gerrit.wikimedia.org/r/#/c/380678/. Let's re-test and see if there are still problems.
@Jdx: Thanks! I've informed Ops that the problem has not been resolved.
@He7d3r: I can't parse what https://pt.wikipedia.org/wiki/Special:AbuseFilter/18 is trying to do. Can you give us a simplified use case here? The use case that ccnorm_contains_any() was written to address is the one in the description:
ccnorm_contains_any( added_lines, "testing", "vandalizing" )
i.e., return true if the normalized added text contains the normalized form of any of the listed strings.
Pull request merged.
I see this bug all the time in Firefox, and it sometimes makes editing difficult. @Amire80: Any suggestions? Perhaps you could point us to the code that controls the position of the ULS icon.
I think returning null for all unsupported protocols is a good idea.
Mon, Oct 9
How can you try to move a page that is move protected?
Actually, it looks like the issue isn't when a page is move protected, but when the page can't be moved for some other reason, e.g. a page with that name already exists or the new title is invalid.
Sat, Oct 7
@dbarratt, @Legoktm: Since there's nothing MediaWiki-specific in this library and we want to advertise it for general 3rd party use, would anyone object if I renamed the repo from wikimedia/mediawiki-libs-Equivset to wikimedia/Equivset (similar to wikimedia/DeadlinkChecker which is also a general-purpose library)?
Fri, Oct 6
@dbarratt: I don't think we need to go crazy with testing the equivalencies. Basically, I just want to make sure we have the following covered (rough sketch after the list):
- Testing whatever function(s) are in the library for doing string comparison
- Making sure we test at least one equivalency that involves recursive mapping, e.g. Θ -> 0 -> O
- Making sure we test at least one equivalency that involves case change
- Testing a few random spoof strings, e.g. j1mmy w4l35 (no idea if that one even works)
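Something like this PHPUnit sketch would cover those cases. It's just an illustration, not a final test plan: it assumes the library exposes an Equivset class with normalize() and isEqual() methods, which is still up for discussion.

	<?php
	// Sketch only: the class and method names (Equivset, normalize,
	// isEqual) are assumptions about the library's eventual API.
	use PHPUnit\Framework\TestCase;
	use Wikimedia\Equivset\Equivset;

	class EquivsetTest extends TestCase {
		public function testStringComparison() {
			// The core comparison function: strings differing only by
			// equivalent characters should compare as equal.
			$equivset = new Equivset();
			$this->assertTrue( $equivset->isEqual( 'spoof', 'sp00f' ) );
		}

		public function testRecursiveMapping() {
			// Θ maps to 0, which in turn maps to O, so both inputs
			// should normalize to the same character.
			$equivset = new Equivset();
			$this->assertSame( $equivset->normalize( 'Θ' ), $equivset->normalize( 'O' ) );
		}

		public function testCaseChange() {
			// An equivalency that involves a case change.
			$equivset = new Equivset();
			$this->assertTrue( $equivset->isEqual( 'a', 'A' ) );
		}

		public function testSpoofString() {
			// A random spoof string; the expected result here is
			// unverified, per the list above.
			$equivset = new Equivset();
			$this->assertTrue( $equivset->isEqual( 'jimmy wales', 'j1mmy w4l35' ) );
		}
	}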
@Dispenser: Thanks for the update. I've pinged Ops to let them know and see where we can go from here.
Yeah, this bug is driving me crazy.
The big bug we're dealing with is parsing end dates out of the closing string. Other than that, the RfX tools are functional.
Is that bug tracked anywhere?
These have been rewritten in the new XTools, they're just a little buggy and lacking a few features. We'll get to that eventually, so I would recommend leaving these two as-is for now until the rewritten versions go live.
Does it really even make sense to have these in XTools? They are both specific to English and German Wikipedia rather than being general-purpose tools. I haven't looked at the code, though, so I don't know how integrated they are with the shared XTools architecture. Also, I'm worried that if we just leave them until the versions in the new XTools are running, we're never going to actually deprecate the old XTools.
Thu, Oct 5
Can we redirect the Article Blamer tool to http://wikipedia.ramselehof.de/wikiblame.php? We really should sunset any active functionality at http://tools.wmflabs.org/xtools/, as it's confusing having two different XTools.
OK, so it sounds like we're going with Symfony.
@Tbayer: It looks like the job queue has been back to reasonable levels since the beginning of September.
Added documentation for foreachwiki :)
@dbarratt: There are basically two ways to do this:
- SWAT deployment: You schedule the deployment in a SWAT window, and once the code is deployed, you ask the deployer to run the script for you.
- Riding the train: The maintenance script rides the deployment train and ends up on all the wikis a week after merging. Then you, or someone with access to terbium, runs the maintenance script.
Wed, Oct 4
@EddieGP: I believe there were performance concerns with that approach. My understanding is that Aaron preferred that we do something more passive (like this).
If purging is undesirable in production, which I can understand, the wikireplicas could offer a compromise: a "compatibility layer" such as a view that shows only non-expired rows, which would be transparent to wikireplica users.
Personally, I would prefer that we do the purging at the source. It's confusing to me when there are big differences between the wikireplica data and the production data, and I'd like to keep them as synced as practically possible (with appropriate differences for security and privacy). Is there a downside to purging in production?
I'm not sure. @Keegan: Can you confirm whether or not the caching problem is still occurring? Ops says the job queue should no longer be backed up, but there is disagreement about whether the job queue congestion was likely to have affected deletion requests.
Tue, Oct 3
If this is going to be done by a maintenance script and a cron job, you may want to look at how PageTriage or PageAssessments does this as an example. The maintenance script would live in the core MediaWiki repo, and the cron job that runs the maintenance script would live in the operations/puppet repo (probably in modules/mediawiki/manifests/maintenance/). @Dzahn can help with this.
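For reference, a core maintenance script generally looks like the skeleton below; the class name and the purge logic are placeholders, not the actual implementation.

	<?php
	// Placeholder skeleton for a core maintenance script; the class
	// name and the work done in execute() are hypothetical.
	require_once __DIR__ . '/Maintenance.php';

	class PurgeExpiredData extends Maintenance {
		public function __construct() {
			parent::__construct();
			$this->addDescription( 'Purge expired rows (placeholder).' );
			$this->setBatchSize( 100 );
		}

		public function execute() {
			// The actual purge work would go here, deleting expired
			// rows in batches and waiting for replication in between.
			$this->output( "Done.\n" );
		}
	}

	$maintClass = PurgeExpiredData::class;
	require_once RUN_MAINTENANCE_IF_MAIN;

The cron definition in puppet would then just invoke that script on a schedule.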
It looks like the answer (prior to ACTRIAL) is that ~36% of new articles are created by auto-patrolled users.
As Leon's manager, I approve.
Please move to the sprint planning column of the Community-Tech workboard once community consensus is established.