Wed, May 12
@cscott Are you still working on this bug and on the VisualEditor change of <references responsive> to <references responsive="">?
Mar 30 2021
@Aklapper: if there are no language subdomains, why does it work as expected when using the Go button?
Instead of clicking the "Search" button I simply press the <Enter> key.
Entering de:ABC into the search field in the upper right corner of https://commons.wikimedia.org/wiki/Main_Page?useskin=monobook redirects to https://commons.wikimedia.org/w/index.php?title=Special%3ASearch&search=de%3AABC&fulltext=Suchen&ns0=1&ns6=1&ns12=1&ns14=1&ns100=1&ns106=1 (logged in, Chrome version 89.0.4389.90 (Official Build) (64-bit))
Mar 22 2021
How do I have to change something like this in my user script to prevent it from breaking soon:
Feb 26 2021
resolved on dewiki
Feb 17 2021
@Ammarpad reviewed this change.
It doesn't work properly on dewiki; I get the same result as stated in the Description.
Feb 11 2021
Do similar problems exist with other languages and their variants? If so, the outcome should be changed for them too.
Feb 10 2021
The TaxonBot workboard has been set up to manage tags, for ticketing and planning of the TaxonBot project, which is not yet run by the taxonbot tool account but by the tools cloud project 'dwl'. That's why index.html still returns a 503 error. I'm working on a webservice tool at Toolforge, but I guess it will need some more months, so there is currently no code repository.
Nov 3 2020
Oh, it's a great idea to implement it as an option. This will be best for all the wikis using different languages, as Huji said before.
Nov 2 2020
Please explain in plain language (or in German) why the hyphen-minus should be changed to a minus sign. What are the advantages and disadvantages?
Jun 16 2020
@Marostegui: After the declining of this task: have you considered switching to something better than MariaDB, if possible? Is our DB running at full capacity again?
Jun 9 2020
@Aklapper: Meanwhile I cannot reproduce the issue any more. It seems to have been solved some other way.
Jun 4 2020
At least another week, possibly 2 more weeks.
Okay, super! If so, it would be very helpful to post a statement about it here.
Sorry for the slight impatience: can you say when you will be running at full capacity again? (T252209)
May 8 2020
@Marostegui: Thank you for your explanation!
@Marostegui: my client user is u4802
Apr 2 2020
@DavidBrooks and others: In the Earwig's Copyvio Detector window it says: "We are still investigating recent performance issues/timeouts on the new infrastructure."
Apr 1 2020
Feb 22 2020
@Aklapper The tool is used frequently and is not unimportant. Can you please assign this task and/or adjust the tags? Thanks a lot ...
Dec 5 2019
@Aklapper: Oh yeah, but nothing has happened for four years ... OMG
@Aklapper: please give it useful tags; I don't know any suitable ones for this.
Nov 20 2019
Nov 19 2019
Meaning: the subcategory 'Crocodylomorpha' is not counted as subcat of 'Crurotarsi'.
@Urbanecm: I have to reopen, as there is one more issue. Please take a look at https://de.wikipedia.org/w/index.php?title=Kategorie:Crurotarsi - it has one subcategory that is also not counted, or rather is added to the -1.
Nov 18 2019
@Urbanecm: Thank you, but what about all the other language wikis?
I tested several scenarios and found out: in database table 'categories', column 'cat_subcats' starts at -1. If there are 2 subcategory pages, 'cat_subcats' returns 1.
T238500: I tested several scenarios and found out: in database table 'categories', column 'cat_subcats' starts at -1. If there are 2 subcategory pages, 'cat_subcats' returns 1. Please keep an eye on T238500 even though it is marked as 'closed'. Thanks!
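From the two observations above (an empty category reported as -1, two subcategories reported as 1), the stored counter appears to lag the true number of subcategories by exactly one. A minimal Python sketch of that hypothesis (the function name is mine, not MediaWiki code):

```python
def reported_cat_subcats(actual_subcats: int) -> int:
    """Model of the observed off-by-one in categories.cat_subcats.

    Hypothesis from the comments above: the stored counter is
    always one below the true number of subcategories.
    """
    return actual_subcats - 1

# Observations from the bug report:
assert reported_cat_subcats(0) == -1  # empty category shows -1
assert reported_cat_subcats(2) == 1   # two subcategories show 1
```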
Nov 17 2019
@TTO: are you still working on it?
Nov 11 2019
Nov 7 2019
Nov 6 2019
@Aklapper: Because some of these attachments are copyrighted and for private research use only. If copyrighted, I must not upload their content to MediaWiki.
Oct 2 2019
Could you provide me with an update? Are you still working on it?
Sep 24 2019
And we don't need deduplication but rather prevention of duplication. That has to be the first step.
And why do you use "MediaWiki message delivery" to send the Tech News out?
This Monday the Tech News was spammed three times again. If it is run by a cron job, please check the crontab; maybe the crontab entry exists three times as well. I figured out the rhythm of the spam job: 0-1-6 minutes. The second Tech News followed within one minute, the third within six minutes. It has happened identically for the last three weeks.
Aug 12 2019
Yes, it's the same and can be merged.
Jul 4 2019
@Aklapper: can you assign this ticket please?
Mar 30 2019
- put Notifier and Veto into two routines
- merged into adt2.tcl
Mar 28 2019
Mar 27 2019
assign to TaxonBot
Mar 26 2019
Assigned to TaxonBot
Jan 10 2019
@Smalyshev What's the status of this task? There are still problems: https://www.wikidata.org/wiki/Wikidata:Request_a_query#SPARQL_query_result_erroneous
May 9 2018
@Bawolff I changed the botpassword at about 10:45 UTC, May 9th. A login is successful now, whereas a first test with the old botpassword before had failed.
May 8 2018
And how can I get authorized for this link?
but first reduce the waiting time to log in from 2 days to 0 minutes
Okay, but priority High at least. The bot really is very important. ...
May 1 2018
Apr 30 2018
I think the dev team could somehow build a script that automatically declares such defective ID properties as dead links.
Mar 18 2018
Mar 14 2018
Feb 27 2018
?item wdt:P27 wd:Q183 was my mistake; these are the German women. I need the Swedish ones, sorry: ?item wdt:P27 wd:Q34
This is one of those queries that runs into a timeout. It should return all Swedish women who have no sitelink to dewiki, counting the sitelinks and listing some properties for each item; it is needed for a dewiki community project. I cannot limit it because of the needed double ORDER BY, so the query has to run with an unlimited result. With a longer timeout the query would succeed. It would be great if you could optimize it, if possible. Can you calculate how much time the query would take to complete?
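One possible workaround for the double-ORDER-BY constraint mentioned above is to fetch the result unsorted (which may make a LIMIT usable again) and sort on the client: sorting by a tuple key in Python reproduces a two-column ORDER BY. A sketch with made-up field names, not the actual query's variables:

```python
# Each dict stands for one result binding; field names are illustrative only.
rows = [
    {"item": "Q1", "sitelinks": 5, "label": "B"},
    {"item": "Q2", "sitelinks": 9, "label": "A"},
    {"item": "Q3", "sitelinks": 5, "label": "A"},
]

# Client-side equivalent of ORDER BY DESC(?sitelinks) ?label:
# negate the numeric key for descending order, then break ties by label.
rows.sort(key=lambda r: (-r["sitelinks"], r["label"]))

assert [r["item"] for r in rows] == ["Q2", "Q3", "Q1"]
```

This only helps if the unsorted query itself finishes within the timeout; the sorting cost then moves to the client, where no time limit applies.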
Feb 26 2018
Maybe a 120 s limit? I think we have to test it, but how?
Thank you, Magnus, this was my opinion too, but my English is not as good as yours, so ...
This will always be the case; we will never be able to serve arbitrary requests that require unlimited time to perform.