Thu, Dec 5
@Aklapper: Oh yes, but nothing has happened for four years ... OMG
@Aklapper: please add some useful tags; I don't know which ones fit here.
Wed, Nov 20
Tue, Nov 19
Meaning: the subcategory 'Crocodylomorpha' is not counted as a subcategory of 'Crurotarsi'.
@Urbanecm: I am reopening this, since there is one more issue. Please take a look at https://de.wikipedia.org/w/index.php?title=Kategorie:Crurotarsi - it has one subcategory that is also not counted, or that is swallowed by the -1 offset.
Mon, Nov 18
@Urbanecm: Thank you, but what about all the other language wikis?
T238500: I tested several scenarios and found the following: in the database table 'category', the column 'cat_subcats' starts at -1. If a category has 2 subcategory pages, 'cat_subcats' returns 1. Please keep an eye on T238500 even though it is marked as 'closed'. Thanks!
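The off-by-one described above can be checked directly against the database. The following is a hedged sketch only, assuming the standard MediaWiki schema ('category' and 'categorylinks' tables, no table prefix); the category title 'Crurotarsi' is taken from the example above:

```sql
-- Compare the stored subcategory counter with the actual number of
-- subcategory members (cl_type = 'subcat' in categorylinks).
SELECT c.cat_title,
       c.cat_subcats       AS stored_subcats,
       COUNT(cl.cl_from)   AS actual_subcats
FROM category c
LEFT JOIN categorylinks cl
       ON cl.cl_to = c.cat_title
      AND cl.cl_type = 'subcat'
WHERE c.cat_title = 'Crurotarsi'
GROUP BY c.cat_title, c.cat_subcats;
```

If the bug report above is accurate, stored_subcats would come out one lower than actual_subcats (e.g. 1 vs. 2).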
Sun, Nov 17
@TTO: are you still working on this?
Mon, Nov 11
Nov 7 2019
Nov 6 2019
@Aklapper: Because some of these attachments are copyrighted and for private research use only. If they are copyrighted, I must not upload their content to MediaWiki.
Oct 2 2019
Could you please provide an update? Are you working on it?
Sep 24 2019
And we don't need deduplication after the fact but prevention of duplication in the first place. That has to be the first step.
And why do you use "MediaWiki message delivery" to send Tech News out?
This Monday, Tech News was spammed three times again. If it is run by a cron job, please check the crontab; maybe the cron job was triggered three times as well. I figured out the rhythm of the spam job (0-1-6 minutes): the second Tech News followed within one minute, the third within six minutes. The same thing has happened, identically, for the last three weeks.
Aug 12 2019
Yes, it's the same issue and can be merged.
Jul 4 2019
@Aklapper: can you assign this ticket please?
Mar 30 2019
- split Notifier and Veto into two routines
- merged into adt2.tcl
Mar 28 2019
Mar 27 2019
assign to TaxonBot
Mar 26 2019
Assigned to TaxonBot
Jan 10 2019
@Smalyshev What's the status of this task? There are still problems: https://www.wikidata.org/wiki/Wikidata:Request_a_query#SPARQL_query_result_erroneous
May 9 2018
@Bawolff I changed the bot password at about 10:45 UTC, May 9th. Login is successful now, but a first test with the old bot password before that failed.
May 8 2018
and how do I get authorized for this link?
but first, please reduce the login waiting time from 2 days to 0 minutes
Okay, but set the priority to High at least. The bot really is very important. ...
May 1 2018
Apr 30 2018
I think the dev team could build a script that automatically declares such defective ID properties as dead links.
Mar 18 2018
Mar 14 2018
Feb 27 2018
?item wdt:P27 wd:Q183 was my mistake; that selects the German women. I need the Swedish ones, sorry: ?item wdt:P27 wd:Q34
This is a query that runs into a timeout. It should return all Swedish women who have no sitelink to dewiki, counting the sitelinks and listing some properties for each item; it is needed for a dewiki community project. I cannot add a LIMIT because of the required double ORDER BY, so the query has to run with an unlimited result set. With a longer timeout, the query would complete successfully. It would be great if you could optimize it, if possible. Can you estimate how much time the query needs to complete?
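A query of the shape described above could look roughly like this. This is only a hedged sketch, not the original query: the selected properties, the two sort keys, and the label language are assumptions; only the P27/Q34 constraint is taken from the comments above.

```sparql
# Swedish women (P27 = Q34) without a dewiki sitelink,
# with their total sitelink count, sorted on two keys.
SELECT ?item ?itemLabel ?sitelinks WHERE {
  ?item wdt:P31 wd:Q5 ;          # instance of: human
        wdt:P21 wd:Q6581072 ;    # sex or gender: female
        wdt:P27 wd:Q34 ;         # country of citizenship: Sweden
        wikibase:sitelinks ?sitelinks .
  FILTER NOT EXISTS {
    ?dewikiArticle schema:about ?item ;
                   schema:isPartOf <https://de.wikipedia.org/> .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "de,en". }
}
ORDER BY DESC(?sitelinks) ?itemLabel
```

The double ORDER BY mentioned above is what prevents a simple LIMIT: the result must be fully sorted before any rows could be cut off.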
Feb 26 2018
Maybe a 120 s limit? I think we have to test it, but how?
Thank you, Magnus, that was my point too, but my English is not as good as yours, so ...
This will always be the case, we will never be able to serve arbitrary requests that require unlimited time to perform.
Feb 24 2018
Ah! I didn't know about that, thank you.
Feb 23 2018
"The quantity of entities in Wikidata has risen very much.
Aug 17 2017
Jul 11 2017
Jul 1 2017
Jun 28 2017
Jun 14 2017
Apr 16 2017
IMHO it does not look like the same issue ...
Feb 19 2017
Hi! I found this emoji too yesterday, but it was very late, so I couldn't report it anymore.
Feb 18 2017
@matmarex: take a look - nothing is fixed; user:Unknown_user is back again: https://de.wikipedia.org/w/index.php?title=Benutzer:Delta456/Carcassonne_(Spiel)/Eigenst%C3%A4ndige_Spiele_2&action=history
Let me try the import once more to another target and we'll see.
@matmarex: I imported it via an exported XML file.
No, you're wrong, sorry. I did it myself; I know what I've done.
Look at the two case lines in the task description. These were done !before! the import: one bot-flagged post edit and one emptying edit. Only the following revisions were the import case.
I reopened this task because something is wrong: there was not a single import edit by User:Unknown_User, although you mentioned one above. This user only made emptying edits, nothing more. Please check the details of this case in more depth.
Please note: the first one was !NOT! an import revision!!! It was also emptied by User:Unknown_user.
Jan 28 2017
Jan 26 2017
Okay, the problem was the Wikidata extension; thank you for your help.