Restarted it now.
I am also working on a watcher script to restart it automatically when it goes down.
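Such a watcher could look roughly like this Python sketch (the URL, the poll interval, and the use of `webservice restart` as the restart command are assumptions; adapt to the actual tool):

```python
# Hypothetical watcher sketch: restart the Toolforge webservice when the
# tool stops answering. URL and interval are placeholders.
import subprocess
import time
import urllib.request

def is_up(url: str, timeout: int = 30) -> bool:
    """True if the URL answers with an HTTP status below 400."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

def watch(url: str, interval: int = 300) -> None:
    """Poll forever; restart the webservice on every failed check."""
    while True:
        if not is_up(url):
            # `webservice restart` is the Toolforge restart command
            subprocess.run(["webservice", "restart"], check=False)
        time.sleep(interval)
```

Run in the background, this polls the tool and restarts the webservice whenever a check fails.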
Jun 6 2020
Jun 3 2020
May 28 2020
It's my dev server; it should work for any query size. Feel free to use it when the main site is down.
May 22 2020
May 21 2020
Apr 29 2020
@Aklapper actually the repo is https://github.com/magnusmanske/petscan_rs (the bitbucket one is the old C++ version).
Apr 20 2020
There are some PetScan outages, reason unknown so far, but outside those, the above queries all work fine.
Apr 14 2020
Still happening.
Apr 7 2020
Yes, still seeing it. Runs for ~20 sec, so probably not a cache. Changed the JSONP callback(s) and still get the same error, so definitely no content-level cache.
Mar 17 2020
Mar 13 2020
It appears that the third-party bot framework quietly swallows this and just throws an exception with the "info" field :-(
In the error I mean
Hmm, I think what tripped my code is that there was no code:'maxlag' in the response?
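A more defensive check would look at both fields; a minimal Python sketch, assuming the standard `{"error": {"code": ..., "info": ...}}` MediaWiki API error shape:

```python
# Sketch of a maxlag check that does not rely solely on code == "maxlag",
# since (as above) a framework may only surface the "info" text.
def is_maxlag_error(response: dict) -> bool:
    error = response.get("error")
    if not error:
        return False
    if error.get("code") == "maxlag":
        return True
    # Fallback: the human-readable info line mentions replication lag
    return "lag" in error.get("info", "").lower()
```

This is only an illustration of the failure mode described above, not the actual bot framework's logic.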
Seems to clear up now
But it's all kinds of wikis that show up. Something with the dispatcher?
Mar 4 2020
QuickStatements edits as the user who submitted the job, using OAuth, which is the WMF-preferred method for tools.
Feb 25 2020
Feb 19 2020
Should now be fixed with https://bitbucket.org/magnusmanske/listeria/commits/5b53c94ffb10
Feb 4 2020
Getting this error for mix-n-match tool.
Jan 28 2020
Jan 26 2020
@bd808 Thanks, now running quickstatements as two replicas
Jan 24 2020
Update: Setting server.max-request-size = 1000000 in $HOME/.lighttpd.conf and restarting the webservice did not help.
Jan 14 2020
Dec 10 2019
In T240316#5728232, @Lea_Lacroix_WMDE wrote:
> @Bugreporter That seems unlikely. QuickStatementsBot has been blocked since October 28th, and daily users of the tool only started reporting this issue yesterday.
For debugging:
It's not a stale cache on my side. If I request different properties from https://www.wikidata.org/w/api.php?action=help&modules=query%2Buserinfo I get the requested properties back, but still the bogus block.
If I check the API directly in the browser, it doesn't show. Maybe because OAuth login?
I seem to be getting the block for [[User:Doqume]] on Wikidata, but for every user
I think I found the reason. Using MW API to get user info, I get:
{
  "id": 4420,
  "name": "Magnus Manske",
  "blockid": 15320,
  "blockedby": "Mahir256",
  "blockedbyid": 203574,
  "blockreason": "You have been blocked automatically, because your IP address was recently used by \"[[User:Doqume|Doqume]]\". The reason for Doqume's block is \"Freebald-ish behavior\"",
  "blockedtimestamp": "2019-12-09T16:59:04Z",
  "blockexpiry": "2019-12-10T16:59:04Z",
  "groups": [
    "rollbacker",
    "*",
    "user",
    "autoconfirmed"
  ],
  "rights": [
    "autopatrol",
    "editsemiprotected",
    "move",
    "autoconfirmed",
    "skipcaptcha",
    "abusefilter-log-detail",
    "suppressredirect",
    "read",
    "edit",
    "createpage",
    "createtalk",
    "writeapi",
    "translate",
    "item-term",
    "property-term",
    "item-merge",
    "item-redirect",
    "abusefilter-view",
    "abusefilter-log",
    "flow-hide",
    "reupload-own",
    "move-rootuserpages",
    "move-categorypages",
    "minoredit",
    "purge",
    "applychangetags",
    "changetags",
    "reupload",
    "upload",
    "flow-edit-post"
  ]
}
But I am not blocked; editing works fine.
I can't create batches either, so good for testing!
I did not touch QS for weeks. No idea why this is happening.
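For reproducing the user-info call above, a plain API query sketch (the endpoint is just the Wikidata example; `blockinfo|groups|rights` is the relevant `uiprop` set for `meta=userinfo`):

```python
# Build the MediaWiki API URL that returns the current user's block info,
# groups and rights, i.e. the kind of response shown above.
import urllib.parse

def userinfo_url(api: str = "https://www.wikidata.org/w/api.php") -> str:
    params = {
        "action": "query",
        "meta": "userinfo",
        "uiprop": "blockinfo|groups|rights",
        "format": "json",
    }
    return api + "?" + urllib.parse.urlencode(params)
```

Calling this URL while logged in (e.g. via OAuth) should reproduce the block fields even though editing still works.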
Dec 6 2019
Fixed now.
Dec 5 2019
In T239036#5715194, @ROOTxDEAD wrote:
Dec 4 2019
Done.
Dec 3 2019
This should get them all:
on it
Nov 12 2019
To add another use case (and to ping the issue):
Sep 25 2019
deleted access and error log files
Sep 24 2019
As of today, QuickStatements supports MediaInfo items (Mxxx).
For now, you'll have to supply the IDs manually, which is a pain.
I am working on a QS syntax parser in Rust, which will support
- ranks
- page/filename => ID conversion on-the-fly
This will require some more testing
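As a toy illustration (in Python, not the actual Rust parser), the basic tab-separated command shape, now also accepting Mxxx entity ids, could be parsed like this; the regexes and error handling are simplifications:

```python
# Minimal sketch of parsing a QuickStatements v1 command line into
# (entity, property, value). Real QS syntax has many more forms
# (CREATE, ranks, references, qualifiers) not covered here.
import re

ENTITY_RE = re.compile(r"^[QM]\d+$")    # items (Qxxx) and MediaInfo (Mxxx)
PROPERTY_RE = re.compile(r"^P\d+$")

def parse_command(line: str):
    """Split a tab-separated 'entity<TAB>property<TAB>value' command."""
    parts = line.rstrip("\n").split("\t")
    if len(parts) != 3:
        raise ValueError(f"expected 3 fields, got {len(parts)}")
    entity, prop, value = parts
    if not ENTITY_RE.match(entity):
        raise ValueError(f"bad entity id: {entity}")
    if not PROPERTY_RE.match(prop):
        raise ValueError(f"bad property id: {prop}")
    return entity, prop, value
```

For example, `parse_command("M1234\tP180\tQ42")` yields a MediaInfo edit command (the ids here are placeholders).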
Sep 17 2019
Sep 16 2019
Bot code patched, deployed, someone please test
+1
That's true, but a reload of the batch page should restore the STOP button, as its state is only read from the database. The bot, in turn, should only check the database (though I suspect it doesn't).
Sep 8 2019
Another idea came to me:
What if it's not just "page lists", but any (general, of one of pre-defined types) tables?
One table type would be "page title/page namespace", giving us the above lists.
Others could be, say, Mix'n'match catalogs ("external ID/url/name/description/instance of").
Sep 6 2019
In T231891#5468371, @Astinson wrote:
> In T231891#5467967, @Magnus wrote:
>> Started some design notes of such a product: https://meta.wikimedia.org/wiki/Gulp
>
> Ooooh, thank you Magnus, that's a really great first pass at thinking about that. For reuse by something like Listeria or Tabernacle, would it make sense to store the associated Wikidata item (or also the Commons MediaInfo id?) with the page, so that you wouldn't have to query those pages to do things like add properties? One thing that kind of "fails" for me in the user experience of the current PetScan -> PagePile -> Tabernacle workflow (and I am thinking this might be true in other workflows as well) is that the end tool expects _only_ Wikidata items, so if I don't generate a Wikidata list first, the tool either needs code to retrieve that, or you have to generate a new list. If there was a second column with the optional Wikidata id, it would probably make lists made with one wiki in mind more portable.
In T231891#5468418, @Astinson wrote:
> In T231891#5467967, @Magnus wrote:
>> Started some design notes of such a product: https://meta.wikimedia.org/wiki/Gulp
>
> For the list data structure, in addition to or as part of the description: would it make sense to require a field for the "source" of the data (i.e. PetScan id, short URL for a query, etc.), so that anyone "seeing" the pile could go to it, recreate the query/input, and modify it? (Kind of like how folks use the ListeriaBot lists.)
In T231891#5469814, @LucasWerkmeister wrote:
> Minimum viable product
> - Import from various sources
>   - All sources offered in PagePile
> - Export to various places
>   - All consumers offered in PagePile
How is this supposed to work? As far as I can tell, these imports and exports would have to go through the PagePile tool in some form, so to me these read like requirements that can only be fulfilled by one person: the PagePile maintainer.
Sep 5 2019
Started some design notes of such a product: https://meta.wikimedia.org/wiki/Gulp
OK, some initial thoughts and remarks on this:
- I have actually rewritten Listeria in Rust, to use the Commons Data: namespace (aka .tab files) to store the lists, and use Lua to display them.
- I think the Commons Data: namespace would technically work for a generalized "list storage", though it seems to be a bit of abandonware (will this feature be long-term supported by the WMF?)
- Commons Data: namespace, if supported, would also have the proper scaling, caching etc. that PagePile is lacking
- It should, in principle, be possible to change PagePile to write new piles to the Commons Data: namespace, and return queries from there. That would give the new list storage a running start. We can replace PagePile later.
- Drawbacks of Commons Data: namespace are (a) cell size limit (400 characters, so should work for simple page lists), and (b) total page size (thus limiting the max list length)
- If Labs were to offer a scalable, backed-up object store for tools, that might be better suited for general list management
- Much of the "average Wikimedian" integration will have to come from (user-supplied) JavaScript, such as "snapshot this category tree" or something. I doubt waiting for WMF would be a timely solution.
- Short term, we (I?) could write a slim web API on Labs that abstracts the implementation away, offering a to-be-discussed set of functions (create/amend/remove list etc). Initially, this could run on PagePile in the background, or Commons Data: namespace, or even both (large lists go to pagepile, short ones into a MySQL database or Commons Data: namespace, etc.)
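To make the Commons Data: namespace idea concrete, a hedged Python sketch of the tabular JSON such a "page title / namespace" list might use (the field names are illustrative assumptions; the namespace does require a CC0 license field):

```python
# Build a minimal tabular-data document, in the shape used by Commons
# Data:*.tab pages, for a simple page list. Field names are examples.
import json

def make_tab_page(rows):
    """Return a dict representing a 'page title / namespace' list."""
    return {
        "license": "CC0-1.0",
        "description": {"en": "Example page list"},
        "schema": {
            "fields": [
                {"name": "page_title", "type": "string"},
                {"name": "page_namespace", "type": "number"},
            ]
        },
        "data": rows,
    }

doc = make_tab_page([["Douglas Adams", 0], ["Category:Physicists", 14]])
print(json.dumps(doc, indent=2))
```

A slim web API as described above could serialize such a document and write it to a Data: page, or fall back to PagePile/MySQL for large lists.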
I believe I fixed the issue in the Rust bot. I had a successful test, but please try it yourself.
Sep 4 2019
Actually, that bitbucket repo is for the _really old version_ (pre-1.0).
Sep 2 2019
In T229917#5458155, @Lydia_Pintscher wrote:
> https://www.mediawiki.org/wiki/Manual:Tags says "A complete list of all the available tags is displayed on Special:Tags. Users with the managechangetags user right – administrators by default – can use this special page to create and delete tags (see Help:Tags)."
> Does this help?
Added it for most of my tools, centrally. Works fine for distributed-game. For wdfist I get:
E1:The tag "wdfist" is not allowed to be manually applied
Now rolling the change back, until I know what tags I am allowed to use where and when.
Jul 24 2019
Because Toolforge forgot the replica.conf again, see T166949. Webservice restarted manually, yet again; works. For the next few minutes, probably.
Jul 2 2019
Everyone, I own Reasonator, including the experimental version 2 which is used here (and should be better suited than the dated V1).
Jun 27 2019
Jun 26 2019
In T149410#5284363, @Jdforrester-WMF wrote:
> In T149410#5284327, @Multichill wrote:
>> Changed back the topic. This is a huge scope change and derailing things. As far as I can see, everywhere in the API we use "claims", not "statements" (also in the functions). The only inconsistency right now is mediainfo; that should be fixed. If you want to change everything in the Wikibase API to use statements instead of claims (wbgetclaims -> wbgetstatements, etc.), file a new task so I can downvote that one as a huge waste of resources.
>
> OK, then I can just Decline this task? As established above, when Wikimedia DE wrote WBMI in early 2016 they used "statements" because all new code should use that and not "claims", but they haven't gone back to fix Wikidata to use the modern language.
Jun 25 2019
@Jdforrester-WMF Is that an official design decision (claims=>statements)? Where was this fundamentally breaking change announced to the public?
Jun 22 2019
FWIW, I have already changed my code to work with either claims or statements. Quick thoughts:
Jun 21 2019
On another note, the Reasonator example in my original post seems to load now. I'll check if the Rust code works as well now.
In T226084#5271761, @Krinkle wrote:
> @Magnus It is well known that MediaWiki currently exposes many powerful APIs that we do not support to perform well, but allow regardless as a convenience. If we were stricter about response times for all features, we'd probably just turn many of them off and limit the capabilities of those APIs until and unless the amount of resources required to make them work reliably fast is justified.
> I expect the maintainers of this API to have tested the supported and encouraged use cases and to know whether they are fast. I haven't personally looked at the p99 for this particular API, but from experience with other endpoints, it tends to be extreme cases that we'd be very unlikely to support with fast responses.
> But if they haven't in a while, it's certainly worth looking at those again from time to time.
Jun 20 2019
May I humbly suggest having a look at the consistent 2 min response time of the p99 server (in Grafana) before deciding it's a problem outside the WMF's control, no matter how convenient that may seem?
Jun 19 2019
No, sorry, issue remains.
GET /w/api.php?callback=jQuery21303406678877236998_1560936691744&action=wbgetentities&ids=P2508%7CP2631%7CP2509%7CP4276%7CP272%7CP4529%7CP5032%7CP4947%7CP5786%7CP6145%7CP1609%7CP1230%7CP2896%7CP4730%7CP2093%7CP1844%7CP1813%7CP5396%7CQ1199348%7CP435%7CP3959%7CP747%7CP1274%7CP1085%7CP5331%7CP4839%7CP4969%7CP103%7CQ49088%7CP1648%7CQ19045189%7CP3793%7CP2847%7CP3035%7CP4389%7CP5062%7CP5508%7CP4264%7CP6698%7CP6617%7CP2241%7CQ44374960%7CQ4644021%7CQ839097%7CP1268%7CQ9624%7CQ8055775%7CQ210152%7CQ4642661%7CQ635616&props=info%7Caliases%7Clabels%7Cdescriptions%7Cclaims%7Csitelinks%7Cdatatype&format=json&_=1560936691745 HTTP/1.1
Host: www.wikidata.org
User-Agent: Mozilla/5.0 (Windows NT 10.0; rv:68.0) Gecko/20100101 Firefox/68.0
Accept: */*
Accept-Language: en-GB,en;q=0.7,de;q=0.3
Accept-Encoding: gzip, deflate, br
Referer: https://tools.wmflabs.org/reasonator/?q=Q350
DNT: 1
Connection: keep-alive
Cookies redacted
Response header from one of the slow requests:
HTTP/2.0 200 OK
date: Wed, 19 Jun 2019 09:31:43 GMT
content-type: text/javascript; charset=utf-8
server: mw1341.eqiad.wmnet
x-powered-by: HHVM/3.18.6-dev
mediawiki-login-suppressed: true
cache-control: private, must-revalidate, max-age=0
content-disposition: inline; filename=api-result.js
x-content-type-options: nosniff
x-frame-options: DENY
backend-timing: D=1129973 t=1560936702635082
vary: Accept-Encoding,Treat-as-Untrusted,X-Forwarded-Proto,Cookie,Authorization,X-Seven
content-encoding: gzip
x-varnish: 774984440, 505592114, 724292664
via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
accept-ranges: bytes
age: 0
x-cache: cp1081 pass, cp3032 pass, cp3041 pass
x-cache-status: pass
server-timing: cache;desc="pass"
strict-transport-security: max-age=106384710; includeSubDomains; preload
x-analytics: ns=-1;special=Badtitle;loggedIn=1;WMF-Last-Access=19-Jun-2019;WMF-Last-Access-Global=19-Jun-2019;https=1
x-client-ip: 2001:630:206:6204:cc46:3ce1:27e1:3062
X-Firefox-Spdy: h2
Jun 6 2019
May 23 2019
Done.
May 14 2019
Never mind, it's the multilingual string!
Mar 8 2019
Happened to me as well, yesterday (2019-03-08, 08:23UTC)
Mar 7 2019
Fixed Listeria.
Don't know anything about ASammourBot.
Mar 5 2019
Feb 28 2019
Removal is running.
Update: Will remove them with QuickStatements now
So here is what happens: I create(d) lots of gene/protein items (example) for various species. For many statements, I can create references, as I get them from the upstream source. That paper is one of the often-cited ones, about a determination method.
Feb 27 2019
Feb 19 2019
I have added euwiki to the list of wikis where the bot flag is to be used.
Feb 12 2019
That did the trick, thanks!
Feb 11 2019
Tried that, also on login.tools.wmflabs.org (just to be sure). Both say "webservice is not running". Kubernetes still won't start.
Thanks, I have rebuilt and updated via npm on the Kubernetes shell.
Feb 9 2019
I have run npm update in the kubernetes shell, but no joy.
Feb 8 2019
Feb 1 2019
Try it now...