Oh, it's a great idea to implement it as an option. That would be best for all the wikis using different languages, as Huji said before.
Nov 3 2020
Nov 2 2020
Please explain in simple terms, or in German, why the hyphen-minus should be changed to a minus sign! What are the advantages and disadvantages?
Jun 16 2020
@Marostegui: After this task was declined - have you considered switching to something better than MariaDB, if possible? Is our DB running at full capacity again?
Jun 9 2020
@Aklapper: In the meantime I cannot reproduce the issue any more. It seems to have been resolved some other way.
Jun 4 2020
At least another week, possibly 2 more weeks.
Okay, super! In that case it would be very helpful to post a statement about it here.
Sorry for being a little impatient: can you say when you will be running at full capacity again? (T252209)
May 8 2020
@Marostegui: Thank you for your explanation!
@Marostegui: my client user is u4802
Apr 2 2020
@DavidBrooks and others: In Earwig's Copyvio Detector window it says: "We are still investigating recent performance issues/timeouts on the new infrastructure."
Apr 1 2020
Feb 22 2020
@Aklapper: The tool is used frequently and is not unimportant. Could you please assign this task and/or adjust the tags? Thank you very much ...
Dec 5 2019
@Aklapper: Oh yes, but nothing has happened for four years ... OMG
@Aklapper: please add useful tags; I don't know which ones fit here.
Nov 20 2019
Nov 19 2019
Meaning: the subcategory 'Crocodylomorpha' is not counted as a subcategory of 'Crurotarsi'.
@Urbanecm: I have to reopen, because there is one more issue. Please take a look at https://de.wikipedia.org/w/index.php?title=Kategorie:Crurotarsi - it has one subcategory that is also not counted, or is only added to the -1.
Nov 18 2019
@Urbanecm: Thank you, but what about all the other language wikis?
I tested several scenarios and found out: in the database table 'categories', the column 'cat_subcats' starts at -1. If there are 2 subcategory pages, 'cat_subcats' returns 1.
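For illustration only - a minimal sketch (my own, not an official script) of how this mismatch could be double-checked from outside via the action API, assuming the standard modules prop=categoryinfo and list=categorymembers; the category name is just the example from this task:

```
# Sketch: compare the subcategory count MediaWiki reports (prop=categoryinfo,
# i.e. the stored cat_subcats value) with the subcategories actually listed
# via list=categorymembers. A mismatch would show the off-by-one behaviour.
import requests

API = "https://de.wikipedia.org/w/api.php"
CATEGORY = "Kategorie:Crurotarsi"  # example category from this task

def reported_subcat_count(session):
    """Subcategory count as reported by MediaWiki (backed by cat_subcats)."""
    r = session.get(API, params={
        "action": "query",
        "titles": CATEGORY,
        "prop": "categoryinfo",
        "format": "json",
    }).json()
    page = next(iter(r["query"]["pages"].values()))
    return page.get("categoryinfo", {}).get("subcats", 0)

def actual_subcats(session):
    """Subcategories actually present, listed via categorymembers."""
    r = session.get(API, params={
        "action": "query",
        "list": "categorymembers",
        "cmtitle": CATEGORY,
        "cmtype": "subcat",
        "cmlimit": "max",
        "format": "json",
    }).json()
    return [m["title"] for m in r["query"]["categorymembers"]]

if __name__ == "__main__":
    with requests.Session() as s:
        reported = reported_subcat_count(s)
        actual = actual_subcats(s)
        print(f"reported subcats: {reported}, actually listed: {len(actual)}")
        for title in actual:
            print(" -", title)
```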
@Bugreporter: IMHO this task is not a duplicate of T228585, because the problem/the bug is not mentioned there.
Nov 17 2019
@TTO: are you still working on it?
Nov 11 2019
In T237503#5651919, @Reedy wrote: In T237503#5651894, @doctaxon wrote: In T237503#5643126, @Bawolff wrote: I would be concerned about the increased potential for phishing and viruses/other malicious attachments.
Why don't you consider antivirus software as freemail providers do, too?
Not allowing attachments is much easier/saner than accepting liability for scanning/dealing with attachments and not identifying something that's problematic
In T237503#5643126, @Bawolff wrote: I would be concerned about the increased potential for phishing and viruses/other malicious attachments.
Nov 7 2019
Nov 6 2019
@Aklapper: Because some of these attachments are copyrighted and only for private research use. If they are copyrighted, I must not upload the content of these attachments to MediaWiki.
Oct 2 2019
Could you please provide me with an update? Are you working on it?
Sep 24 2019
In T232379#5519095, @Johan wrote: @doctaxon I'm not sure exactly what it is you're asking? The problems with multiple deliveries stem from the problem discussed in this task, for which a fix is being merged now. If you've got general questions about Tech News and how it's being delivered, I suggest you ask them at m:Talk:Tech/News and I'll be happy to answer.
In T232379#5518629, @Aklapper wrote: In T232379#5518455, @doctaxon wrote: And why do you use "MediaWiki message delivery" to send the Tech News out?
@doctaxon: Please ask general Tech News questions on https://meta.wikimedia.org/wiki/Talk:Tech/News - thanks. This task is about mass message.
And we don't need deduplication but prevention of duplication. That has to be the first step.
And why do you use "MediaWiki message delivery" to send the Tech News out?
This Monday the Tech News was spammed three times again. If it is run by a cron job, please check the crontab. Maybe the cron job was started three times, too. I figured out the rhythm of the spam job: 0-1-6 minutes: the second Tech News followed within one minute, the third within six minutes. It has happened identically for the last three weeks.
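Just to illustrate what I mean by prevention rather than deduplication - a rough sketch only, not the actual MassMessage / Tech News delivery code; the marker-file path and the delivery function are made up. The idea is that the job remembers which issue it already sent, so repeated cron runs become no-ops:

```
# Sketch: prevent duplicate deliveries instead of cleaning them up afterwards.
# The job records the issue it already sent and refuses to send it again,
# even if cron fires it several times in a row.
from pathlib import Path
import sys

SENT_MARKER = Path("/var/tmp/technews.last-sent")  # hypothetical marker file

def deliver_tech_news(issue: str) -> None:
    print(f"delivering Tech News {issue} ...")  # placeholder for the real job

def main(issue: str) -> int:
    if SENT_MARKER.exists() and SENT_MARKER.read_text().strip() == issue:
        print(f"issue {issue} was already delivered, skipping", file=sys.stderr)
        return 0
    deliver_tech_news(issue)
    SENT_MARKER.write_text(issue)  # remember the issue so repeat runs become no-ops
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "2019-39"))
```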
Aug 12 2019
Yes, it's the same and can be merged.
Jul 4 2019
@Aklapper: can you assign this ticket please?
Mar 30 2019
- split Notifier and Veto into two routines
- merged into adt2.tcl
Mar 28 2019
Mar 27 2019
assign to TaxonBot
Mar 26 2019
Assigned to TaxonBota
Jan 10 2019
@Smalyshev What's the status of this task? There are still problems: https://www.wikidata.org/wiki/Wikidata:Request_a_query#SPARQL_query_result_erroneous
May 9 2018
@Bawolff I changed the bot password at about 10:45 UTC, May 9th. A login is successful now, but it was not successful in a first test with the old bot password before.
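For reference, roughly how such a login test can be done - a sketch only, with a placeholder wiki URL and placeholder credentials, using the documented BotPassword flow (fetch a login token, then action=login):

```
# Sketch: test whether a BotPassword login succeeds against the action API.
# The wiki URL and the credentials below are placeholders, not the real ones.
import requests

API = "https://de.wikipedia.org/w/api.php"
USERNAME = "ExampleBot@ExampleBotPassword"  # BotPassword user name (placeholder)
PASSWORD = "generated-botpassword-here"     # BotPassword secret (placeholder)

def test_botpassword_login() -> bool:
    s = requests.Session()
    # Step 1: fetch a login token.
    token = s.get(API, params={
        "action": "query", "meta": "tokens", "type": "login", "format": "json",
    }).json()["query"]["tokens"]["logintoken"]
    # Step 2: log in with the bot password (legacy action=login, which is the
    # documented path for BotPasswords).
    result = s.post(API, data={
        "action": "login",
        "lgname": USERNAME,
        "lgpassword": PASSWORD,
        "lgtoken": token,
        "format": "json",
    }).json()["login"]["result"]
    print("login result:", result)  # "Success" or "Failed"
    return result == "Success"

if __name__ == "__main__":
    test_botpassword_login()
```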
May 8 2018
And how do I get authorized for this link?
But first, reduce the waiting time for logging in from 2 days to 0 minutes.
Okay, but priority High at least. The bot really is very important. ...
May 1 2018
Apr 30 2018
I think the dev team could build a script to automatically declare such defective ID properties as dead links.
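Roughly what I have in mind - my own sketch, not an existing dev-team tool: expand an external-ID value through the property's formatter URL (P1630) and flag values whose target no longer responds, so they could be marked as dead links. The formatter URL and ID values below are only examples:

```
# Sketch: flag external-ID values whose formatter-URL target looks dead.
import requests

FORMATTER_URL = "https://example.org/taxon/$1"  # placeholder formatter URL (P1630)
ID_VALUES = ["12345", "67890"]                  # placeholder external-ID values

def looks_dead(url: str) -> bool:
    """Treat 4xx/5xx responses and connection errors as (probably) dead."""
    try:
        r = requests.head(url, allow_redirects=True, timeout=10)
        return r.status_code >= 400
    except requests.RequestException:
        return True

if __name__ == "__main__":
    for value in ID_VALUES:
        url = FORMATTER_URL.replace("$1", value)
        status = "DEAD?" if looks_dead(url) else "ok"
        print(f"{value}: {url} -> {status}")
```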
Mar 18 2018
Mar 14 2018
In T189698#4050502, @Cyberpower678 wrote: Why did I get an email for this ticket. This doesn’t concern me.
Feb 27 2018
In T179879#4006226, @Lucas_Werkmeister_WMDE wrote: Oh, and the outer query is trying to get all the labels of the place of birth, place of death, country of citizenship, etc., in all languages. That’s also a terrible idea.
?item wdt:P27 wd:Q183 was my mistake; those are the German women. I need the Swedish ones, sorry: ?item wdt:P27 wd:Q34
In T179879#4006181, @Lucas_Werkmeister_WMDE wrote: It only returns some two thousand results, so the surrounding query shouldn’t be a problem either…
@Jonas
This is such a query that runs into a timeout. It should return all Swedish women who have no sitelink to dewiki, counting the sitelinks and listing some properties for each item; it is needed for a dewiki community project. I cannot limit it because of the required double ORDER BY, so the query has to run with an unlimited result set. With a longer timeout, the query would complete successfully. It would be great if you could optimize it, if possible. Can you estimate how much time the query will take to complete?
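Roughly the kind of query I mean - a rough reconstruction of my own, not the exact query from this task, with a LIMIT and only three label languages added here so the sketch stays cheap (following the advice above about not fetching labels in all languages):

```
# Sketch: Swedish women (P27 = Q34, P21 = Q6581072) without a dewiki sitelink,
# with their sitelink count, run against the public WDQS endpoint.
import requests

ENDPOINT = "https://query.wikidata.org/sparql"

QUERY = """
SELECT ?item ?itemLabel ?sitelinks WHERE {
  ?item wdt:P27 wd:Q34 ;          # country of citizenship: Sweden
        wdt:P21 wd:Q6581072 ;     # sex or gender: female
        wikibase:sitelinks ?sitelinks .
  FILTER NOT EXISTS {             # no sitelink to dewiki
    ?dewiki schema:about ?item ;
            schema:isPartOf <https://de.wikipedia.org/> .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "de,sv,en". }
}
ORDER BY DESC(?sitelinks) ?itemLabel
LIMIT 100
"""

if __name__ == "__main__":
    r = requests.get(ENDPOINT, params={"query": QUERY, "format": "json"},
                     headers={"User-Agent": "query-sketch/0.1 (example)"})
    for row in r.json()["results"]["bindings"]:
        print(row["itemLabel"]["value"], row["sitelinks"]["value"])
```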
Feb 26 2018
Maybe a 120 s limit? I think we have to test it, but how?
Thank you, Magnus, that was my opinion too, but my English is not as good as yours, so ...
@Smalyshev wrote:
This will always be the case, we will never be able to serve arbitrary requests that require unlimited time to perform.
Feb 24 2018
Ah! I didn't know about that, thank you.
Feb 23 2018
"The quantity of entities in Wikidata has risen very much.
Aug 17 2017
Jul 11 2017
Jul 1 2017
Jun 28 2017
Jun 14 2017
Apr 16 2017
IMHO it does not look like the same thing ...
Feb 19 2017
Hi! I found this emoji too, yesterday, but it was very late, so I could not report it any more.
Feb 18 2017
@matmarex: you can find the XML here: http://tools.wmflabs.org/taxonbot/xml1.xml