My conscience is a jukebox
Tue, Feb 25
This sounds amazing! I really enjoy the CLI tools provided by Addshore's docker wrapper, and adding on ways to manage extensions would be incredible too. I assume it wouldn't mimic Vagrant roles, e.g. do everything you need to do to get CentralAuth working? This is totally fine; just a helper to download the extension would still be quite nifty.
Mon, Feb 24
Thanks for the re-ping! @Aklapper We aren't actively working on this bot anymore, so I don't know how much use we'll get out of a dedicated Phab project, but we're not against the idea. Please feel free to go ahead :)
Thu, Feb 20
Note also you need to update all the count methods in WatchedItemStore, e.g. at the top of Special:Watchlist it says "You have N pages on your watchlist". That number should not include expired items.
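To illustrate the kind of change the count methods would need, here is a rough sketch using an in-memory SQLite database. The mini-schema (a `watchlist` table plus a `watchlist_expiry` table keyed on the watchlist row ID) is my assumption about the planned design, not the production schema, and the column names are illustrative:

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical mini-schema mirroring the watchlist-expiry proposal.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE watchlist (wl_id INTEGER PRIMARY KEY, wl_user INTEGER, wl_title TEXT);
CREATE TABLE watchlist_expiry (we_item INTEGER PRIMARY KEY, we_expiry TEXT);
""")
now = datetime(2020, 2, 20)
db.executemany("INSERT INTO watchlist VALUES (?, ?, ?)",
               [(1, 42, "Foo"), (2, 42, "Bar"), (3, 42, "Baz")])
# "Bar" expired yesterday; "Baz" expires next week; "Foo" never expires.
db.executemany("INSERT INTO watchlist_expiry VALUES (?, ?)",
               [(2, (now - timedelta(days=1)).isoformat()),
                (3, (now + timedelta(days=7)).isoformat())])

# The "You have N pages on your watchlist" count should skip expired rows:
(count,) = db.execute("""
    SELECT COUNT(*)
    FROM watchlist
    LEFT JOIN watchlist_expiry ON we_item = wl_id
    WHERE wl_user = 42
      AND (we_expiry IS NULL OR we_expiry > ?)
""", (now.isoformat(),)).fetchone()
print(count)  # 2 (Foo and Baz; expired Bar is excluded)
```

The `LEFT JOIN` plus `we_expiry IS NULL` keeps items that have no expiry at all, which is the common case.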
Sounds good! But the link should go somewhere on-wiki. I found https://www.wikidata.org/wiki/Wikidata:Tools/OpenRefine/Editing
Note I have not reached out to GLAM about this at all recently, but I know this was requested in the past so I believe it is a meaningful project (and I'll probably end up doing it anyway!). I think the Hackathon is a great opportunity to devote time to this, and hopefully rope in some new contributors.
Tue, Feb 18
Unassigning until T245213 is resolved.
There is now an NSFW image classifier running on Cloud Services, and from my observations it has proven to be a very effective algorithm. It would need to be in production if we wanted to use it in an extension. That is tracked at T214201: Implement NSFW image classifier using Open NSFW, but unfortunately it has lost traction and is no longer on the roadmap. Perhaps the extension itself could include the classifier. The idea is to pre-store the scores; then I suppose there's a hook we could tie into to hide the relevant imagery on page load, as opposed to retroactively hiding it, which would be much too slow. Note the extension would also need to live on Commons, since that's where most images live, and I guess there could be a configuration variable or something to disable the auto-hide feature itself, since Commons specifically would not want it.
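The pre-stored-scores idea boils down to a cheap lookup at render time. A minimal sketch, where the threshold, the config flag, and the score store are all invented names for illustration (nothing here is the actual extension design):

```python
# Hypothetical: scores are pre-computed by the classifier and stored per
# file, so deciding whether to hide an image on page load is just a lookup.
NSFW_THRESHOLD = 0.8      # assumed tunable cut-off
AUTO_HIDE_ENABLED = True  # the config flag a wiki like Commons would set to False

precomputed_scores = {    # would come from a DB table populated offline
    "File:Sunset.jpg": 0.02,
    "File:Example.png": 0.91,
}

def should_hide(file_title: str) -> bool:
    """Decide at render time whether to hide an image."""
    if not AUTO_HIDE_ENABLED:
        return False
    score = precomputed_scores.get(file_title)
    # Unscored files are shown as-is; scoring them on the fly at page
    # load would be much too slow.
    return score is not None and score >= NSFW_THRESHOLD

print(should_hide("File:Example.png"))  # True
print(should_hide("File:Sunset.jpg"))   # False
```

The point of the flag is exactly the Commons case from the comment above: the extension could be installed for score storage while auto-hide stays off.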
Mon, Feb 17
Fri, Feb 14
There's an existing cleanupWatchlist.php maintenance script.
I have added Copyvios to Community Tech's uptime monitor, so maintainers will get emailed if it goes down. This does not cover errors with the Google API, which is the subject of this task. @Earwig If you don't want the uptime emails let me know :) If you are okay with them, make sure firstname.lastname@example.org is on your contact list. Gmail in particular seems to mistake it for spam.
Thu, Feb 13
I think an acceptable compromise might be to just report views of media on Commons. What do people think?
I'm going to stop here and ask for feedback on my approach.
I was wrong! The HTML output is generated by the C++ library. It doesn't look that bad to work with, though, and I think https://gerrit.wikimedia.org/r/c/mediawiki/php/wikidiff2/+/539906 will offer some clues. So:
Wed, Feb 12
I like this idea! It is a simple query to get the current size of the backlog. However if we wanted historical data (to see how the backlog size changed over time), it would require a database schema change, and more UI changes too. I'm not sure if we have time to implement that right now, but we can at least give you a live count.
Not the full file upload path (as bd808 pointed out, the md5 chunks should not get exposed), but maybe it is not far-fetched to ask users to enter a "/wikipedia/commons/SomeFile.png" path? (given that a "project" path and a file name uniquely identify a file)
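A rough sketch of validating that kind of user input, assuming the `/<family>/<project>/<File>` shape from the example (the exact format users would enter is an open question in this thread, so the function and its return shape are hypothetical):

```python
def parse_file_path(path: str) -> dict:
    """Split a '/<family>/<project>/<file>' path into its parts.

    Illustrative only: assumes exactly three segments, with the file
    name allowed to contain further slashes.
    """
    parts = path.strip("/").split("/", 2)
    if len(parts) != 3:
        raise ValueError(f"expected /<family>/<project>/<file>, got {path!r}")
    family, project, filename = parts
    return {"family": family, "project": project, "file": filename}

print(parse_file_path("/wikipedia/commons/SomeFile.png"))
# {'family': 'wikipedia', 'project': 'commons', 'file': 'SomeFile.png'}
```

Since the md5 hash chunks are deliberately absent, the backend would still have to resolve the real storage path itself.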
Tue, Feb 11
This has been on the to-dos for quite some time. Thanks for creating a task for it!
Mon, Feb 10
Without relying on the API, you can't tell whether en.wikisource.org/wiki/File:Speed_Limit_50_Minimum_5_sign.svg refers to a file on Commons or a file on Wikisource that had 0 views in the requested period.
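With the API, one way to make that distinction is the `imagerepository` field that `prop=imageinfo` reports ("shared" for a Commons file, "local" for one uploaded to the wiki itself). A sketch over a hand-written sample response (not captured from the live API):

```python
def is_commons_file(page: dict) -> bool:
    """True if an imageinfo API page entry points at the shared repo (Commons)."""
    return page.get("imagerepository") == "shared"

# Hand-written sample shaped like query.pages from prop=imageinfo:
sample_query_pages = {
    "123": {"title": "File:Speed_Limit_50_Minimum_5_sign.svg",
            "imagerepository": "shared"},  # lives on Commons
    "456": {"title": "File:Local_scan.djvu",
            "imagerepository": "local"},   # uploaded to Wikisource itself
}

for page in sample_query_pages.values():
    print(page["title"], "->", "Commons" if is_commons_file(page) else "local")
```

So distinguishing the two cases means one extra API round-trip per file, which is presumably the cost being weighed in this task.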
Per standup I'm going to mark this as resolved. @ifried said she got the information she needed.
I think this can be closed now. I am seeing some 5xx-level errors in the logs, but this was happening on Toolforge, too. I'll create some tasks for exports that appear to be broken.
Okay both production and staging now live in the new wikisource project:
Sat, Feb 8
Just FYI, non-Vector desktop skins simply show "Watch" and "Unwatch" links. Not sure if you wanted to change the language there.
Indeed, it seems that around July 22, pageviews to Special:Contributions stopped being recorded: https://tools.wmflabs.org/pageviews/?project=en.wikipedia.org&platform=all-access&agent=user&range=latest-300&pages=Special:Contributions/MusikAnimal
Fri, Feb 7
Thu, Feb 6
(mostly copied/pasted from my findings at T242918#5824735):
The watchlist table is not replicated, so indeed we don't need to replicate watchlist_expiry.
The wikisource project has been created! I went ahead and set up the dev instance, now available at https://wsexport-test.wmflabs.org/
Wed, Feb 5
Thanks, we concur this is deserving of a dedicated project. Note I just changed the requested name to wikisource, on the off chance other related tools get bundled into the project (we also have reservations about some connotations of wsexport :)
Tue, Feb 4
New VPS project requested at T244307: Request creation of wikisource VPS project
The expected URL on the API's side has the title double URL-encoded: %C3 (ä) in the title has become %25C3 in the API URL, and %25 is the URL encoding of %.
@MusikAnimal could work around this in his client, but if it is possible to fix in the backing API, that would be nicer for other consumers.
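The double-encoding effect is easy to reproduce with `urllib.parse` (the title here is an invented example containing ä):

```python
from urllib.parse import quote, unquote

title = "Bär.jpg"  # hypothetical title containing ä

once = quote(title)   # percent-encode the UTF-8 bytes of ä
twice = quote(once)   # encoding again turns each '%' into '%25'

print(once)   # B%C3%A4r.jpg
print(twice)  # B%25C3%25A4r.jpg

assert unquote(twice) == once          # one decode only peels one layer
assert unquote(unquote(twice)) == title
```

This is also why a client-side workaround is awkward: the consumer has to know the API expects an already-encoded title and encode it a second time (or decode the response twice).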
Here are the modified queries; note the EXPLAIN results are on my local dev wiki, hence not representative of production:
I concur that this doesn't serve the use-case of AutoEdits. MassMessage edits are made only by User:MediaWiki message delivery, so there's little sense in searching for such edits on other accounts. You wouldn't want to run AutoEdits on the mass message bot either, since all of that account's edits are mass messages. Hopefully that makes sense.
Same as with T244072: Request to add OAuth Hello world (end user) to AutoEdits. I'm not sure what the value is in adding this example tool to the AutoEdits configuration.
We can add it, since in terms of performance we're not worried about having too many things to look up on testwiki, but I don't see the value. This is an example tool that has no widespread use. Presumably there are just a few such edits from each account, and probably just a handful of accounts in total. I am tempted to decline solely because of the unnecessary clutter in our configuration. Is there a use case for using AutoEdits on testwiki?
The API key has been refreshed and copyvios has been updated accordingly.
Mon, Feb 3
https://tools.wmflabs.org/mediaviews is back! I would still consider this "beta". Any feedback is most welcome. More features will be added in the near future, such as getting request counts for all media in a category.
Okay, VPS is up and running https://wsexport.wmflabs.org/, and https://tools.wmflabs.org/wsexport is being redirected there. Documentation for the instance is at https://wikitech.wikimedia.org/wiki/Tool:CommTech#Wikisource_Export. UptimeRobot is now monitoring the new location.
Increasing the memory limit on Toolforge apparently didn't do the trick.
I'll be honest, I don't know how to really answer this without doing some coding. Max's investigation says a lot: T231698#5512527. Since that time, the wikidiff2 output apparently includes section names with the relevant line numbers (T234096). This includes both the left side and the right side (see for example the REST API output), which I think alleviates Max's performance concern. If I'm understanding things correctly, the work would be limited to MediaWiki's DifferenceEngine.php to make use of the section titles. Since system administrators can swap out the diff engine (MediaWiki appears to use sebastianbergmann/diff by default), our changes will have to show section titles only when the engine provides this information.
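The fallback behavior described above can be sketched as a simple guard. The hunk structure and field names here are invented for illustration; real wikidiff2 output differs:

```python
def render_hunk_header(hunk: dict) -> str:
    """Render a diff hunk header, preferring a section title when the
    engine supplied one (hypothetical field names)."""
    section = hunk.get("left_section") or hunk.get("right_section")
    if section:
        return f"{section} (line {hunk['left_line']})"
    # Engine (e.g. a swapped-in one) gave no section info:
    # fall back to the plain line-number header.
    return f"Line {hunk['left_line']}:"

print(render_hunk_header({"left_line": 12, "left_section": "== History =="}))
# == History == (line 12)
print(render_hunk_header({"left_line": 12}))
# Line 12:
```

The key point is that the section title is strictly additive: an engine without section support degrades to today's output rather than breaking.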