Mon, Feb 15
I think I've been facing this issue too – it has happened twice over the past 3 days, though IIRC those have been the only two occurrences this year. The bot's process remains active, but the onopen and onerror event listeners catch nothing.
T274656 is closely related to this, I guess.
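Since onerror never fires when the connection dies silently, one workaround is a watchdog that tracks the time of the last received message and forces a reconnect when the stream goes stale. This is only a sketch of that idea (the timeout values and the reconnect wiring are assumptions, not part of any particular bot library); the staleness check itself is the testable core:

```javascript
// Returns true if no message has been seen for longer than staleMs.
function isStale( lastMessageMs, nowMs, staleMs ) {
	return nowMs - lastMessageMs > staleMs;
}

// Usage sketch (assumes an EventSource-like stream `es` and a
// hypothetical reconnect() helper; both are illustrative):
// let last = Date.now();
// es.onmessage = () => { last = Date.now(); };
// setInterval( () => {
//     if ( isStale( last, Date.now(), 5 * 60 * 1000 ) ) {
//         es.close();
//         reconnect();
//     }
// }, 60 * 1000 );
```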
Sat, Feb 13
Still occurring, yes. This bot log crudely suggests a rate of about 2 in 100.
Fri, Feb 12
Jan 28 2021
@GWicke The image https://hub.docker.com/r/wikimedia/mediawiki/ is under the wikimedia account (hence looks official), but the description is misleading: it says "This image tracks latest production" even though the image was last updated 4 years ago.
Jan 25 2021
The issue has been occurring with other bot libraries too, not just pywikibot. In a few cases, the issue recurs even after waiting for 5 seconds and retrying the API request.
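For reference, the retry pattern involved is roughly the following (a generic sketch — apiRequest, the attempt count, and the delay values are placeholders, not part of any particular bot library):

```javascript
// Retry a failing request with exponential backoff: wait delayMs,
// then 2*delayMs, then 4*delayMs, ... between attempts.
async function requestWithRetry( apiRequest, attempts = 4, delayMs = 5000 ) {
	for ( let i = 0; i < attempts; i++ ) {
		try {
			return await apiRequest();
		} catch ( e ) {
			if ( i === attempts - 1 ) {
				throw e; // give up after the last attempt
			}
			await new Promise( ( resolve ) => setTimeout( resolve, delayMs * 2 ** i ) );
		}
	}
}
```

Even with backoff like this, the failure sometimes persists across retries, which is what makes the bug hard to work around client-side.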
Jan 11 2021
Jan 4 2021
Dec 30 2020
ApiQueryLanguageInfo is related. Would it be acceptable to expand this API to also provide the i18n rules data?
Dec 26 2020
Encountered this error on testwiki today. Error code is lockmanager-fail-acquirelock and error text is something like Could not acquire lock for "mwstore://local-multiwrite/local-deleted/f/a/e/fae7og4zcjppicvqojoaqgnkl8mzl6j.png".
Dec 23 2020
Nov 23 2020
PageTriage appears to be using Date.js for just the single function Date.parseExact, and only for parsing the yyyyMMddHHmmss timestamp format. This functionality can easily be provided by a utility function that extracts the date parts from the string and feeds them into Date.UTC(), after which the library can be removed.
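Such a utility could look something like this (a sketch only — the function name is illustrative, not an existing PageTriage helper):

```javascript
// Replacement for Date.parseExact( ts, 'yyyyMMddHHmmss' ): pull the
// date parts out of the 14-digit MediaWiki timestamp and feed them
// into Date.UTC(). Returns null if the string doesn't match.
function parseMwTimestamp( ts ) {
	var m = /^(\d{4})(\d{2})(\d{2})(\d{2})(\d{2})(\d{2})$/.exec( ts );
	if ( !m ) {
		return null;
	}
	// Months are zero-based in Date.UTC()
	return new Date( Date.UTC( +m[ 1 ], +m[ 2 ] - 1, +m[ 3 ],
		+m[ 4 ], +m[ 5 ], +m[ 6 ] ) );
}
```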
Oct 10 2020
I have requested a Rapid Grant to take on this project. Please see https://meta.wikimedia.org/wiki/Grants:Project/Rapid/SD0001/Twinkle_localisation.
Oct 1 2020
Sep 28 2020
T261290 is related (the numbers of active users per Special:ActiveUsers / ApiQueryAllUsers are much lower than what Special:Statistics claims).
Sep 27 2020
I have a bot that queries ORES a lot (I believe most of its requests are probably being fulfilled from the caches) and indeed it's persistently been running into 500 errors since 25 Sep.
Sep 17 2020
Aug 31 2020
A kind of reverse bug now seems to be occurring. A Deepcat search on enwiki in Category:Articles needing expert attention gives: "A warning has occurred while searching: Deep category query returned too many categories". No results are displayed at all (not even a limited set).
Aug 26 2020
Jul 11 2020
May 25 2020
May 21 2020
Apr 30 2020
Apr 19 2020
Apr 18 2020
The problem appears to be on line 207:
$editform.find( '.templatesUsed .mw-editfooter-list' ).detach().empty().append( newList ).appendTo( '.templatesUsed' );
This assumes that $editform.find( '.templatesUsed .mw-editfooter-list' ) exists. But that's true only when editing the whole page. When editing a section, .templatesUsed exists but is an empty div.
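One possible fix would be to build the list node when it is missing, so the section-edit case (empty .templatesUsed div) is handled too. This is only a sketch, not the actual patch:

```javascript
var $list = $editform.find( '.templatesUsed .mw-editfooter-list' );
if ( !$list.length ) {
	// Section edit: .templatesUsed is an empty div, so create the list
	$list = $( '<ul>' ).addClass( 'mw-editfooter-list' );
}
$list.detach().empty().append( newList )
	.appendTo( $editform.find( '.templatesUsed' ) );
```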
Apr 17 2020
It's worth noting that enwiki is currently considering making the userscript a gadget - see VPT discussion.
Mar 25 2020
Is this being fixed? This bug is affecting the work of bots on Wikipedia, so it would be great if it could be looked into. Thanks.
Mar 21 2020
Feb 9 2020
It is also very useful on Special:EditWatchlist.
Jan 26 2020
Dec 23 2019
Dec 21 2019
@Anomie I've uploaded a simple patch that adds the autoMerge and fail options. For the overwrite option, I think it would be acceptable to merely null the basetimestamp and starttimestamp fields, since it seems those two fields are used only for conflict resolution?
Dec 20 2019
@Anomie that link is no longer accessible, but I guess the if condition you were referring to is around L2242? Would changing that condition to if ( !$this->noAutoMerge && $this->mergeChangesIntoContent( $content ) ) and adding a noAutoMerge flag (set by ApiEditPage) be acceptable?
Another pseudo-duplicate: T177150: Add IP range support to list=usercontribs API
Dec 13 2019
Dec 9 2019
@JTannerWMF I have written a patch for this. It is fairly simple. Could you review it?
Dec 3 2019
Instead of adding yet another item to mw.config, it would also help if the timestamp in the page HTML were machine-readable – enclosed in a span tag with a data-ts attribute or similar, carrying the MediaWiki timestamp ("YYYYMMDDHHMM") in UTC.
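For instance, the rendered timestamp could be wrapped like this (the class and attribute names here are purely illustrative, not existing MediaWiki output):

```html
<!-- Hypothetical markup: data-ts carries the UTC MediaWiki timestamp -->
<span class="mw-comment-timestamp" data-ts="202012030945">09:45, 3 December 2020 (UTC)</span>
```

Scripts could then read the timestamp with a plain attribute lookup instead of parsing the localized display text.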
Nov 17 2019
Nov 1 2019
It looks as if PageCuration is preemptively making an API call to fetch prior AfD/MfD nomination page names on every single page (even on pages not in the new pages feed, and even in namespaces where it isn't needed). This isn't exactly good for performance, is it? Shouldn't the call be deferred until the user actually tries to mark the page for deletion?
Oct 28 2019
Oct 18 2019
Oct 5 2019
@Reedy, @ItSpiderman can you look into this again, please? While no internal error appears now, I am still unable to enable 2FA: the message "Failed to validate two-factor credentials" comes up every time. I have tried both the Google Authenticator and 2FA Authenticator apps.