Mon, Dec 30
Dec 23 2019
Dec 10 2019
The only reason I mention Special:WhatLinksHere is that I thought it would be easier to implement, which doesn't seem to be the case.
If someone can expose a reasonable subset of haswbstatement, I would be more than happy.
Nov 19 2019
Nov 12 2019
Oct 3 2019
Looks like it's working again.
Sep 9 2019
I can reproduce all three links above on Chrome 76.0.3809.136 (Official Build) (64-bit) on ChromeOS.
Sep 5 2019
Aug 18 2019
My wdt:P statements disappear.
Aug 17 2019
I can definitely re-import mine, but the existing wdt:Pxxx statements make operating existing bots much more difficult than it should be.
Aug 16 2019
Still getting a few dozen wdt:Pxxx results; query:
Aug 13 2019
@Anomie, you were kind enough to fix T215444; maybe you can take a look at this one? We have thousands of potentially affected categories, and the problem reveals itself only on a small random subset of them. If you need any additional traces in Lua, I'm here to help :)
Aug 4 2019
Any category here except for Википедия:Страницы с ошибками скриптов, использующих Викиданные ("Wikipedia: Pages with errors in scripts that use Wikidata"). The problem is that the errors disappear after ?action=purge.
Jul 23 2019
Thank you very much!
Jul 22 2019
@Krenair, I assume that only the home directory was not created:
ghuron@tools-sgebastion-07:~$ ls /data/project/dspull/
ls: cannot access '/data/project/dspull/': No such file or directory
ghuron@tools-sgebastion-07:~$ id
uid=19300(ghuron) gid=500(wikidev) groups=500(wikidev),50380(project-tools),53381(tools.mw2sparql),53738(tools.wdml),54111(tools.dspull)
Jul 19 2019
Jul 12 2019
Apr 10 2019
Look at the query used to get missing articles for "List of articles every Wikipedia should have": https://quarry.wmflabs.org/query/26700 There are two joins:
- For the iwlinks/wb_items_per_site join I have to do either CONCAT('Q', ips_item_id) or TRIM('Q' FROM iwl_title)
- For the wb_items_per_site/page join I have to do REPLACE(ips_site_page, ' ', '_') (or vice versa) and potentially take care of non-default namespaces
Isn't it ironic that wikidatawiki_p.wb_items_per_site has two columns that are potentially joinable with Wikipedia tables, and both of them require a format transformation?
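To make the mismatch concrete, here is a minimal sketch of the two transformations in Python (the column semantics are as described above; the function names are mine, purely for illustration):

```python
def ips_to_iwl_title(ips_item_id: int) -> str:
    """iwlinks.iwl_title stores the prefixed form 'Q123', while
    wb_items_per_site.ips_item_id stores the bare number 123.
    SQL equivalent: CONCAT('Q', ips_item_id)."""
    return f"Q{ips_item_id}"

def ips_site_page_to_page_title(ips_site_page: str) -> str:
    """page.page_title uses underscores, while ips_site_page uses spaces.
    SQL equivalent: REPLACE(ips_site_page, ' ', '_')."""
    return ips_site_page.replace(" ", "_")

print(ips_to_iwl_title(42))                          # Q42
print(ips_site_page_to_page_title("Douglas Adams"))  # Douglas_Adams
```

Neither transformed expression matches an index on the other table, which is why both joins are awkward.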
Mar 1 2019
thanks, no problems now!
Still experiencing problems with hewiki_p (and sometimes eswiki_p). The rest are OK.
What exactly about the dump doesn't work for you?
Feb 28 2019
Feb 21 2019
Feb 6 2019
It looks like it depends on whether the PHP7 setting is turned on or off. When it's off, the problem disappears; when it's on, action=purge any article that uses it and you will get "false" when calling mw.ext.data.get().
Dec 11 2018
Works for ChromeOS 70.0.3538.110.
Dec 3 2018
Fixed for all DBs mentioned above.
@Aklapper I've found it in my Labs project mw2sparql (it was bgwiki_p initially, at 07:34 UTC today), but since then I've been reproducing this in Quarry (see link above). Sounds like an operational issue with the DB replicas.
Aug 21 2018
Currently it is impossible to add/remove sitelinks in the Russian UI. I would appreciate it if someone could look into this as soon as possible.
Jun 15 2018
Apr 18 2018
I'd like to explain another scenario where this functionality can be useful. Ru-wiki infoboxes sometimes add categories to the corresponding articles. Sometimes we do this based on Wikidata, but we are pretty much limited to a few cases defined in P1465, P1792, P1464, or a very special usage of P910 (e.g. if an astronomical object belongs to some constellation). If I want to automatically place categories like Q8454640 on films directed by James Marsh, I would probably need a designated property. Alternatively, I should be able to do it by extracting items that have both P971:Q2720543 and P971:Q29017630. This may sound like reinventing SPARQL queries, but getting the results of getBacklinks(Q2720543) and getBacklinks(Q29017630) and performing a list intersection in Lua would work for me.
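The intersection step described above is cheap once backlinks are available as sets. A minimal sketch (in Python rather than Lua, with getBacklinks stubbed out on toy data, since the proposed API does not exist yet):

```python
def get_backlinks(target: str) -> set[str]:
    # Stub standing in for the proposed "what links here" lookup;
    # the toy index below is invented purely for illustration.
    fake_index = {
        "Q2720543": {"Q1", "Q2", "Q3"},   # items linking to Q2720543
        "Q29017630": {"Q2", "Q3", "Q4"},  # items linking to Q29017630
    }
    return fake_index.get(target, set())

# Items that link to BOTH targets, i.e. candidates for the category.
both = get_backlinks("Q2720543") & get_backlinks("Q29017630")
print(sorted(both))  # ['Q2', 'Q3']
```

The same pattern in Lua would collect one result list into a table keyed by item ID and then filter the second list against it.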
Apr 12 2018
Mar 11 2018
Jan 22 2018
Jan 20 2018
I'm personally missing the ability to get "what links here" (see T185313).