User Details
- User Since
- Oct 23 2014, 3:02 PM (332 w, 3 d)
- Availability
- Available
- LDAP User
- Magnus Manske
- MediaWiki User
- Unknown
Fri, Mar 5
Turns out Flickr retired the secure.flickr.com domain without forwarding. I removed the "secure" subdomain but am still using https; it should work again.
Thu, Feb 25
Also, it doesn't seem to work for "has a birth date" (P569)?
https://www.wikidata.org/w/index.php?search=haswbstatement%3AP21%3DQ6581072++haswbstatement%3AP569&title=Special:Search&profile=advanced&fulltext=1&advancedSearch-current=%7B%7D&ns0=1&ns120=1
Fri, Feb 19
Restarted wikidata-todo.
Jan 28 2021
It does work, however, since there is currently not a single page on euwiki where the latest revision has a wp10 score, the results will be empty.
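For anyone wanting to verify that: a quick way to check which ORES models exist for a wiki is to ask the ORES v3 API for the context. A minimal sketch (the wp10 scores come from ORES's article quality model, nowadays called articlequality):
$ curl -s https://ores.wikimedia.org/v3/scores/euwiki/
If no wp10/articlequality model is listed for euwiki, no revision there can have a score, and the empty result is expected.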
Jan 21 2021
Jan 20 2021
Thanks, I think I fixed it now.
Dec 18 2020
Should restart now
On it
Nov 25 2020
I would also like to point out https://tabernacle.toolforge.org/ which at the moment is for editing existing items, but I could easily add the ability to create a new, blank item to edit.
Nov 23 2020
Oct 14 2020
Just saw this one. I think fixing Wikibase to add edited items to the watchlist is the way to go. I could update the OAuth permissions, but that would break everyone's server-based edits, as they would still be using the old OAuth consumer, correct?
Oct 9 2020
Should be fixed now
Oct 7 2020
I'm on it.
Jul 13 2020
Worked with 10K batches. I still think this rapid disconnect in the middle of a query is a bug.
Trying 1M chunks, first two worked, but then:
mysqlimport: Error: 2013, Lost connection to MySQL server during query, when using table: osopenuprn_202006
I'll try even smaller ones, but it's getting a bit ridiculous. Why disconnect the server in the middle of a query?
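For the record, a rough sketch of the chunked import that worked (10,000 lines per chunk; the file name, database name, host, and credentials file are placeholders for whatever the full import used):
# split the full dump into 10,000-line pieces
split -l 10000 osopenuprn_202006_full.csv part_
for f in part_*; do
  # mysqlimport derives the target table from the file name, so rename each piece first
  mv "$f" osopenuprn_202006.csv
  mysqlimport --defaults-file=$HOME/replica.my.cnf --host=tools.db.svc.eqiad.wmflabs --local s12345__mydb_p osopenuprn_202006.csv
done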
Jul 9 2020
Jun 30 2020
I have switched the configuration to use the petscan db connections only.
Jun 26 2020
Just to add, if some queries take too long, I can likely just change a parameter to fix that. Someone with access to the stats, please let me know.
The reason I use that many connections is precisely to avoid long-running ones, as I had in a previous version; those would time out or lose the database connection, so I rewrote the code to use more, but shorter, queries.
Jun 24 2020
I have "forked" it into github. Using diffusion was a mistake in the first place. Now at https://github.com/magnusmanske/quickstatements
Tried again with ssh://vcs@git-ssh.wikimedia.org/source/tool-quickstatements.git, no joy:
Permission denied (publickey,keyboard-interactive).
fatal: Could not read from remote repository.
tools.quickstatements@tools-sgebastion-07:~$ git remote remove origin
tools.quickstatements@tools-sgebastion-07:~$ git remote add origin ssh://phabricator.wikimedia.org/diffusion/2010/tool-quickstatements.git/
tools.quickstatements@tools-sgebastion-07:~$ git push
fatal: The current branch master has no upstream branch.
To push the current branch and set the remote as upstream, use
    git push --set-upstream origin master
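For reference, the eventual fix was simply to point the remote at the GitHub repository instead. A minimal sketch, assuming the existing checkout and the master branch:
$ git remote remove origin
$ git remote add origin https://github.com/magnusmanske/quickstatements.git
$ git push --set-upstream origin master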
Jun 22 2020
FWIW, I believe all my (active) tools now use the new canonical URL.
Jun 19 2020
As I wrote five years ago, I added a limit to upload only two files concurrently.
No further comment has been made since, so I consider this issue to be solved.
Jun 18 2020
Thanks, that seems perfect!
Jun 8 2020
OK thanks
As a horrible short-term fix for reasonator, I did:
$ ln -s $HOME/public_html/reasonator_types.js $HOME/public_html/_types.js
Jun 7 2020
I think this is a blocker to moving all tools to the canonical URL. T234617 ?
I cleared the error log on reasonator and switched to the previous URL schema, then back to the canonical one. No server errors in the log (some unrelated PHP ones, now cleared again).
Jun 6 2020
@LucasWerkmeister thanks that worked, for now
Same with https://geohack.toolforge.org/geohack.php. GeoHack is kinda central for Wikipedia, so please help!
Thanks @bd808 !
Other files load fine; I haven't checked them all, though.
Jun 3 2020
May 28 2020
Restarted it now.
I am also working on a watcher script to restart it automatically when it goes down.
It's my dev server; it should work for any query size. Feel free to use it when the main site is down.
May 22 2020
May 21 2020
Apr 29 2020
@Aklapper actually the repo is https://github.com/magnusmanske/petscan_rs (the bitbucket one is the old C++ version).
Apr 22 2020
Apr 20 2020
There are some PetScan outages, reason unknown so far, but outside those, the above queries all work fine.
Apr 14 2020
Still happening.
Apr 7 2020
Maybe this helps:
Yes, still seeing it. It runs for ~20 sec, so it's probably not a cache issue. I changed the JSONP callback(s) and still get the same error, so it's definitely not a content-level cache.
Mar 18 2020
Mar 17 2020
Mar 13 2020
It appears that the third-party bot framework quietly swallows this and just throws an exception with the "info" field :-(
In the error message, I mean.
Hmm, I think what tripped up my code is that there was no code:'maxlag' in the response?
Seems to clear up now
But it's all kinds of wikis that show up. Something with the dispatcher?
Mar 4 2020
QuickStatements edits as the user who submitted the job, using OAuth, which is the WMF-preferred method for tools.
Feb 25 2020
Feb 19 2020
Should now be fixed with https://bitbucket.org/magnusmanske/listeria/commits/5b53c94ffb10
Feb 4 2020
Getting this error for the Mix'n'match tool.
Jan 28 2020
Jan 26 2020
@bd808 Thanks, now running quickstatements with two replicas.
Jan 24 2020
Update: Setting server.max-request-size = 1000000 in $HOME/.lighttpd.conf and restarting the webservice did not help.
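For completeness, the attempted change amounted to this (a sketch; as far as I know, lighttpd interprets server.max-request-size in kilobytes, so 1000000 would be roughly 1 GB):
$ echo 'server.max-request-size = 1000000' >> $HOME/.lighttpd.conf
$ webservice restart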
Jan 14 2020
Dec 10 2019
For debugging:
It's not a stale cache on my side. If I request different properties (see https://www.wikidata.org/w/api.php?action=help&modules=query%2Buserinfo), I get the requested properties, but still the bogus block.
If I check the API directly in the browser, the block doesn't show. Maybe because of the OAuth login?
I seem to be getting the block placed on [[User:Doqume]] on Wikidata, but for every user.
I think I found the reason. Using the MW API to get user info, I get:
{ "id": 4420, "name": "Magnus Manske", "blockid": 15320, "blockedby": "Mahir256", "blockedbyid": 203574, "blockreason": "আপনাকে স্বয়ংক্রিয়ভাবে বাধা দেওয়া হয়েছে, কারণ আপনার আইপি ঠিকানাটি সম্প্রতি \"[[User:Doqume|Doqume]]\" ব্যবহার করেছেন। Doqume-কে বাধাদানের কারণ \"Freebald-ish behavior\"", "blockedtimestamp": "2019-12-09T16:59:04Z", "blockexpiry": "2019-12-10T16:59:04Z", "groups": [ "rollbacker", "*", "user", "autoconfirmed" ], "rights": [ "autopatrol", "editsemiprotected", "move", "autoconfirmed", "skipcaptcha", "abusefilter-log-detail", "suppressredirect", "read", "edit", "createpage", "createtalk", "writeapi", "translate", "item-term", "property-term", "item-merge", "item-redirect", "abusefilter-view", "abusefilter-log", "flow-hide", "reupload-own", "move-rootuserpages", "move-categorypages", "minoredit", "purge", "applychangetags", "changetags", "reupload", "upload", "flow-edit-post" ] }
But I am not blocked; editing works fine.
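For reference, the same fields can be requested with a plain userinfo query. A sketch (blockinfo/groups/rights are standard uiprop values; the call only reproduces the issue when made through the tool's OAuth session, not anonymously):
$ curl -s 'https://www.wikidata.org/w/api.php?action=query&meta=userinfo&uiprop=blockinfo%7Cgroups%7Crights&format=json'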
I can't create batches either, so good for testing!
I did not touch QS for weeks. No idea why this is happening.
Dec 6 2019
Fixed now.
Dec 5 2019
Dec 4 2019
Done.
Dec 3 2019
This should get them all:
on it
Nov 12 2019
To add another use case (and to ping the issue):
Sep 25 2019
Deleted access and error log files.
Sep 24 2019
As of today, QuickStatements supports MediaInfo items (Mxxx).
For now, you'll have to supply the IDs manually, which is a pain (see the example below).
I am working on a QS syntax parser in Rust, which will support
- ranks
- page/filename => ID conversion on-the-fly
This will require some more testing
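For illustration, a single V1-format (tab-separated) line adding a statement to a MediaInfo item might look like this; the M-ID and the "depicts" (P180) value are just made-up examples:
M12345	P180	Q42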
Sep 17 2019
Sep 16 2019
Bot code patched, deployed, someone please test
+1
That's true, but a reload of the batch page should bring back the STOP button, as its state is only read from the database. The bot, in turn, only checks the database (or at least it should; I suspect it doesn't).
Sep 8 2019
Another idea came to me:
What if it's not just "page lists", but any tables (general ones, of one of several pre-defined types)?
One table type would be "page title/page namespace", giving us the above lists.
Others could be, say, Mix'n'match catalogs ("external ID/url/name/description/instance of").
Sep 6 2019
Sep 5 2019
Started some design notes for such a product: https://meta.wikimedia.org/wiki/Gulp
OK, some initial thoughts and remarks on this:
- I have actually rewritten Listeria in Rust, to use the Commons Data: namespace (aka .tab files) to store the lists, and use Lua to display them.
- I think the Commons Data: namespace would technically work for a generalized "list storage", though it seems to be a bit of abandonware (will this feature be long-term supported by the WMF?)
- Commons Data: namespace, if supported, would also have the proper scaling, caching etc. that PagePile is lacking
- It should, in principle, be possible to change PagePile to write new piles to the Commons Data: namespace, and return queries from there. That would give the new list storage a running start. We can replace PagePile later.
- Drawbacks of Commons Data: namespace are (a) cell size limit (400 characters, so should work for simple page lists), and (b) total page size (thus limiting the max list length)
- If Labs were to offer a scalable, backed-up object store for tools, that might be better suited for general list management
- Much of the "average Wikimedian" integration will have to come from (user-supplied) JavaScript, such as "snapshot this category tree" or something. I doubt waiting for WMF would be a timely solution.
- Short term, we (I?) could write a slim web API on Labs that abstracts the implementation away, offering a to-be-discussed set of functions (create/amend/remove list etc). Initially, this could run on PagePile in the background, or Commons Data: namespace, or even both (large lists go to pagepile, short ones into a MySQL database or Commons Data: namespace, etc.)