Program Officer at The Wikipedia Library.
Also User:Samwalton9 (WMF).
Thanks @Arcayn, I'm going to take a look shortly!
Meta page created too.
Haven't seen any other instances of this, so I assume it was an early platform bug.
We could also give staff a way to manually trigger a new update, in case something needs to be replaced quickly.
Looks like the issue is from these commits:
Admin and sysadmin docs improved. Developer docs still to be done.
Oops, didn't mean to close this yet. This task just isn't blocked now.
This is now implemented per https://github.com/SuLab/WikidataIntegrator/issues/52#issuecomment-443003064
The list may contain sensitive data from the tool so we'd rather the archives be private.
Pull Request merged, so there are now some brief usage instructions on GitHub. I won't mark this as resolved just yet, because we should create a Meta page too.
Might require a bit of reworking of how I've set up the search page, since I may want to re-use the search form + stats box, but we should really avoid duplicating all that code in the view and template where possible.
Just noting that this is waiting on Jason to finish off some Authorization work before I tidy up the work noted in PR comments.
Ideally we would check the edit just before it's proposed to the user, and move on to another edit (deleting the suggestion) if it's no longer valid. That could add quite an overhead to the time taken to load an edit, though, especially if the tool has to go through a number of suggestions before it finds a valid one. We could instead have a background process run through all the cached edits on a regular basis, deleting suggestions that are no longer valid, but that doesn't completely solve the problem because we're still not checking at the moment the edit is proposed.
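The proposal-time option could look something like this rough sketch. The names here (`get_next_valid_suggestion`, `is_still_valid`, a plain list standing in for the suggestion cache) are hypothetical, not the tool's actual API:

```python
def get_next_valid_suggestion(suggestions, is_still_valid):
    """Validate at proposal time: walk the cached suggestions, dropping
    any that have gone stale, and return the first one that still checks
    out (or None if the cache is exhausted).

    This is the slow path described above: if many suggestions in a row
    are stale, the user waits while each one is re-checked and deleted.
    """
    while suggestions:
        suggestion = suggestions.pop(0)
        if is_still_valid(suggestion):
            return suggestion
        # Stale suggestion: it has already been removed from the cache
        # by pop(), so just keep looking.
    return None
```

A background sweeper would run the same `is_still_valid` check over the whole cache on a schedule instead, trading freshness at proposal time for a fast load.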
I'm going to flag this as resolved - we can re-open if the error shows up again.
Now backing up weekly on production.
@Aklapper Could this task be made available? My other task for this tool has now been completed :)
JSON can now be retrieved at /json, e.g. https://hashtags.wmflabs.org/json/?query=100wikidays
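For anyone wanting to hit the endpoint programmatically, a minimal sketch of building the request URL (the only parameter confirmed above is `query`; anything passed via `extra` is an assumption that other search-form fields are accepted the same way):

```python
from urllib.parse import urlencode

BASE = "https://hashtags.wmflabs.org/json/"

def hashtag_json_url(query, **extra):
    """Build a URL for the tool's JSON endpoint, e.g. for #100wikidays."""
    return BASE + "?" + urlencode({"query": query, **extra})

print(hashtag_json_url("100wikidays"))
# https://hashtags.wmflabs.org/json/?query=100wikidays
```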
Specifically, the script needs to check that the hashtags_hashtag table exists, and that migrations have finished running, before starting.
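A minimal sketch of that startup gate, with the real database checks abstracted into callables (`table_exists`, `migrations_pending` are hypothetical names; the actual script would query the database and Django's migration state):

```python
import time

def wait_for_schema(table_exists, migrations_pending, interval=5, timeout=300):
    """Poll until the hashtags_hashtag table exists and all migrations
    have finished running, then return True; give up after `timeout`
    seconds and return False so the caller can abort with an error.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        if table_exists("hashtags_hashtag") and not migrations_pending():
            return True
        time.sleep(interval)
    return False
```

The interval and timeout values are illustrative, not taken from the tool.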
Looks like this task strongly overlaps with T180101, and there's a GitHub issue for the title part at https://github.com/WikiEducationFoundation/WikiEduDashboard/issues/1514.
The app now waits for mysqladmin to return a valid ping before attempting to start and migrate (https://github.com/Samwalton9/hashtags/commit/cebaefec0f8e6bb24dca3648b652306cf4c7c77f).
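The retry loop in that commit amounts to something like the following sketch (the host name and retry parameters here are illustrative, not the commit's actual values):

```python
import subprocess
import time

def mysqladmin_ping(host="db"):
    """Return True if `mysqladmin ping` gets a valid response from MySQL."""
    result = subprocess.run(
        ["mysqladmin", "ping", "-h", host],
        capture_output=True,
    )
    return result.returncode == 0

def wait_for_db(ping=mysqladmin_ping, interval=2, max_attempts=30):
    """Retry the ping until the database answers, so the caller can then
    safely run migrations and start the app. Returns False if the
    database never comes up within max_attempts tries.
    """
    for _ in range(max_attempts):
        if ping():
            return True
        time.sleep(interval)
    return False
```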