Tue, Jun 16
@Marostegui: After this task was declined: have you considered switching to something better than MariaDB, if possible? Is our DB running at full capacity again?
Tue, Jun 9
@Aklapper: Meanwhile I cannot reproduce the issue any more. It seems to have been solved some other way.
Jun 4 2020
At least another week, possibly 2 more weeks.
Okay, great! If so, it would be very helpful to post a statement about it here.
Sorry for being a little impatient: can you say when you will be running at full capacity again? (T252209)
May 8 2020
@Marostegui: Thank you for your explanation!
@Marostegui: my client user is u4802
Apr 2 2020
@DavidBrooks and others: In the Earwig's Copyvio Detector window it's written: "We are still investigating recent performance issues/timeouts on the new infrastructure."
Apr 1 2020
Feb 22 2020
@Aklapper The tool is used frequently and is also not unimportant. Could you please assign this task and/or adjust the tags? Thanks a lot ...
Dec 5 2019
@Aklapper: Oh yes, but nothing has happened for four years ... OMG
@Aklapper: please give it useful tags; I don't know any suitable ones for this
Nov 20 2019
Nov 19 2019
Meaning: the subcategory 'Crocodylomorpha' is not counted as a subcat of 'Crurotarsi'.
@Urbanecm: I should reopen this, since there is one more issue. Please take a look at https://de.wikipedia.org/w/index.php?title=Kategorie:Crurotarsi - it has one subcategory that is not counted, or the count is off by -1.
Nov 18 2019
@Urbanecm: Thank you, but what about all the other language wikis?
I tested several cases and found out: in the database table 'categories', the column 'cat_subcats' starts at -1. With 2 subcategory pages, 'cat_subcats' returns 1.
T238500: Same finding there. Please keep an eye on T238500 even though it is marked as 'closed'. Thanks!
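The off-by-one reported above (the stored counter lagging the real subcategory count by 1) can be checked with a small script. A minimal sketch, using an in-memory SQLite table as a stand-in for the wiki's replica database (the table name, column names, and connection are assumptions for illustration):

```python
# Sketch: verify the reported off-by-one in the subcategory counter.
# An in-memory SQLite table stands in for the real replica DB (assumption).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE category (cat_title TEXT, cat_subcats INTEGER)")
# Reported state for 'Crurotarsi': one real subcategory exists, but the
# counter apparently starts at -1, so the stored value is 0.
conn.execute("INSERT INTO category VALUES ('Crurotarsi', 0)")

def subcat_discrepancy(actual_subcats: int, cat_title: str) -> int:
    """Difference between the stored counter and the real subcategory count."""
    (stored,) = conn.execute(
        "SELECT cat_subcats FROM category WHERE cat_title = ?", (cat_title,)
    ).fetchone()
    return stored - actual_subcats

# One actual subcategory (Crocodylomorpha) vs. stored value 0:
print(subcat_discrepancy(1, "Crurotarsi"))  # → -1
```

A result of -1 would confirm the "starts at -1" behavior described in the comments.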
Nov 17 2019
@TTO: are you still working on it?
Nov 11 2019
Nov 7 2019
Nov 6 2019
@Aklapper: Because some of these attachments are copyrighted and only for private research use. If copyrighted I must not upload the content of these attachments to MediaWiki.
Oct 2 2019
Could you please provide an update? Are you working on it?
Sep 24 2019
And we don't need deduplication after the fact, but prevention of duplication. That has to be the first step.
And why do you use "MediaWiki message delivery" to send the Tech News out?
This Monday the Tech News was spammed three times again. If it is run by a cron job, please check the crontab. Maybe the crontab has been triggered three times as well. I figured out the rhythm of the spam job, 0-1-6 minutes: the second Tech News followed within one minute, the third within 6 minutes. It has happened identically for the last three weeks.
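The "prevention of duplication" asked for above can be sketched simply: before the delivery job sends a message, check whether an identical message already went out within a recent window. A minimal sketch (the function name, fingerprinting by message text, and the 10-minute window are all assumptions, not the actual delivery code):

```python
# Sketch of duplicate prevention for a delivery job: refuse to send a
# message identical to one sent within the last `window_minutes`.
from datetime import datetime, timedelta

_recent = {}  # message text -> datetime of last successful send

def should_send(message: str, now: datetime, window_minutes: int = 10) -> bool:
    """Return False if the same message was already sent within the window."""
    last = _recent.get(message)
    if last is not None and now - last < timedelta(minutes=window_minutes):
        return False
    _recent[message] = now
    return True

# The observed 0-1-6 minute rhythm: only the first delivery would go out.
t0 = datetime(2019, 9, 23, 12, 0)
print(should_send("Tech News 2019-39", t0))                         # True
print(should_send("Tech News 2019-39", t0 + timedelta(minutes=1)))  # False
print(should_send("Tech News 2019-39", t0 + timedelta(minutes=6)))  # False
```

With the 0-1-6 minute spam rhythm described above, the second and third deliveries would both fall inside the window and be suppressed.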
Aug 12 2019
yes, it's the same and can be merged
Jul 4 2019
@Aklapper: can you assign this ticket please?
Mar 30 2019
- split Notifier and Veto into two routines
- merged into adt2.tcl
Mar 28 2019
Mar 27 2019
assign to TaxonBot
Mar 26 2019
Assigned to TaxonBot
Jan 10 2019
@Smalyshev What's the status of this task? There are still problems: https://www.wikidata.org/wiki/Wikidata:Request_a_query#SPARQL_query_result_erroneous
May 9 2018
@Bawolff I changed the botpassword at about 10:45 UTC, May 9th. A login succeeds now, but a first test with the old botpassword beforehand did not.
May 8 2018
and how does one get authorized for this link?
but first, please reduce the waiting time before login from 2 days to 0 minutes
okay, but priority High at least. The bot really is very important ...
May 1 2018
Apr 30 2018
I think the dev team could build a script that automatically marks such defective ID properties as dead links.
Mar 18 2018
Mar 14 2018
Feb 27 2018
?item wdt:P27 wd:Q183 was my mistake; those are the German women. I need the Swedish ones, sorry: ?item wdt:P27 wd:Q34
This is a query that runs into the timeout. It should return all Swedish women who have no sitelink to dewiki, counting the sitelinks and listing some properties for each item; it is needed for a dewiki community project. I cannot limit it because of the required double ORDER BY, so the query has to run with an unlimited result set. With a longer timeout the query would complete successfully. It would be great if you could optimize it, if possible. Can you estimate how much time the query needs to complete?
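The actual query is not included in this thread, but the shape of what is described above can be reconstructed. A hypothetical sketch, assembled as a string in Python so the structure is visible (the sex/gender triple, the sitelink-count pattern, and the ORDER BY clauses are assumptions; only P27 = wd:Q34 and the dewiki exclusion come from the comments):

```python
# Hypothetical reconstruction of the described query: Swedish women
# (P27 = Q34) with no sitelink to dewiki, with sitelink counts.
# P21 = Q6581072 (female) and the ordering are assumptions.
QUERY = """
SELECT ?item ?sitelinks WHERE {
  ?item wdt:P27 wd:Q34 .                # country of citizenship: Sweden
  ?item wdt:P21 wd:Q6581072 .           # sex or gender: female
  ?item wikibase:sitelinks ?sitelinks .
  FILTER NOT EXISTS {                   # exclude items with a dewiki article
    ?article schema:about ?item ;
             schema:isPartOf <https://de.wikipedia.org/> .
  }
}
ORDER BY DESC(?sitelinks) ?item
"""
print("wd:Q34" in QUERY and "schema:about" in QUERY)  # → True
```

Even in this reduced form, the unbounded result set plus sorting is the kind of pattern that hits the public endpoint's timeout.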
Feb 26 2018
Maybe a 120 s limit? I think we have to test it, but how?
Thank you, Magnus, that was my view as well, but my English is not as good as yours, so ...
This will always be the case, we will never be able to serve arbitrary requests that require unlimited time to perform.
Feb 24 2018
Ah! I didn't know about that, thank you
Feb 23 2018
"The quantity of entities in Wikidata has risen very much.
Aug 17 2017
Jul 11 2017
Jul 1 2017
Jun 28 2017
Jun 14 2017
Apr 16 2017
IMHO it does not look the same ...
Feb 19 2017
Hi! I found this emoji too yesterday, but it was very late, so I couldn't report it anymore.
Feb 18 2017
@matmarex: take a look, nothing is fixed; user:Unknown_user is back again: https://de.wikipedia.org/w/index.php?title=Benutzer:Delta456/Carcassonne_(Spiel)/Eigenst%C3%A4ndige_Spiele_2&action=history
Let me try the import once more with another target, and we'll see.