Thu, Feb 13
For now. I've freed 163 GB of disk space.
Wed, Feb 12
I'm nuking everything not from 2020 right now.
Damn. I hadn't realized it had accumulated that much.
Mon, Feb 10
Disk space is now critically low.
Wed, Feb 5
This one's on you.
Jan 12 2020
Jan 11 2020
Jan 8 2020
Sadly, I don't think the VM will last that long. I can make a dump of the DB and load it into a new VM quite easily. If the storage space were made large enough to last until the new CEPH project deploys, that would be fine. I think it will last two more months at most, as new projects are being worked on that require new tables.
Jan 6 2020
Jan 3 2020
This one's on you.
Dec 27 2019
Nov 29 2019
Nov 28 2019
Nov 26 2019
Nov 23 2019
I don't think it's unreasonable for sysops to have it by default, and maybe bots.
Nov 20 2019
Someone re-enabled the bot without authorization.
Nov 19 2019
You might want to check the history again. It wasn't the bot.
Nov 14 2019
Oct 19 2019
Since you reported the links as dead, they were automatically reset, and IABot is now seeing all of them as alive. I don't see what else there is to do here.
Domain has been whitelisted.
Oct 18 2019
@Anomie I can confirm that your suggestion to normalize the UTF text does the trick. Thank you.
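Anomie's suggestion amounts to Unicode normalization: two visually identical strings can differ byte-for-byte until both are normalized to a common form. A minimal Python sketch of the idea (IABot itself is PHP, so this is illustrative only):

```python
import unicodedata

# Two visually identical strings: one precomposed, one decomposed.
precomposed = "Z\u00fcrich"        # "Zürich" with U+00FC
decomposed = "Zu\u0308rich"        # "Zürich" as 'u' + combining diaeresis

assert precomposed != decomposed   # byte-wise different before normalizing

# Normalizing both to NFC makes them compare equal.
nfc_a = unicodedata.normalize("NFC", precomposed)
nfc_b = unicodedata.normalize("NFC", decomposed)
assert nfc_a == nfc_b
```

Normalizing both sides of a comparison (or normalizing on write) is what makes lookups against stored text reliable.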
Oct 10 2019
I added dev-master to my composer json. This should fix the reported issue.
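The change was along these lines; the thread doesn't name the package, so `vendor/package` below is a placeholder, not the actual dependency:

```json
{
    "require": {
        "vendor/package": "dev-master"
    }
}
```

Requiring `dev-master` pins the dependency to the development branch rather than a tagged release, which picks up the unreleased fix.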
No further response
This was already implemented before this ticket was opened.
Because deadurl is no longer a supported alias on enwiki, and because the current definitions are drawn from the enwiki definitions, deadurl is not acknowledged by the template at all, so it is ignored.
IABot cannot be expected to accurately detect these cases. Please use https://tools.wmflabs.org/iabot/index.php?page=manageurlsingle&wiki=enwiki to adjust the URL metadata of the links it rescues. You can disassociate the archive, or associate a different one, there. Changes are applied immediately, and a bot job can be started on the applicable pages afterwards.
Please use https://tools.wmflabs.org/iabot/index.php?page=reportfalsepositive&wiki=alswiki to report false positives or simply revert the bot.
The bot correctly matched the format being used in the template and applied the same format to the new dates it added. There is no bug here.
https://tools.wmflabs.org/iabot/index.php?page=manageurlsingle&url=https%3A%2F%2Fwww.bipm.org%2Fen%2Fmeasurement-units%2F suggests that it found the source with the access date of 2018-11-23 somewhere on some wiki. IABot picked the snapshot closest to that date and saved it for future use, as it is a guaranteed working snapshot with similar, if not the same, data as the original from that time frame. If you want the bot to use a different snapshot, you can use the provided link and change the archive URL the bot should use. Changes are applied immediately.
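The snapshot selection described above boils down to picking the archive capture nearest the cited access date. A sketch in Python under that assumption; `closest_snapshot` is a hypothetical name, not IABot's actual function:

```python
from datetime import date

def closest_snapshot(access_date, snapshot_dates):
    """Pick the archive snapshot taken closest to the cited access date."""
    return min(snapshot_dates, key=lambda d: abs(d - access_date))

# Hypothetical capture dates for one URL.
snapshots = [date(2017, 5, 1), date(2018, 11, 20), date(2019, 6, 2)]
closest_snapshot(date(2018, 11, 23), snapshots)  # → date(2018, 11, 20)
```

Any capture can win if it is the nearest one, which is why the management interface exists: a human can override the choice when a closer-in-time snapshot captured the wrong content.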
Oct 8 2019
Oct 2 2019
Sep 30 2019
Sep 29 2019
Sep 18 2019
Seems to be working now on my end. :-)
Sep 16 2019
@Anomie what's the latest about this bug? I see it's awaiting review, but when will it get merged and deployed? This is sadly holding up work on my end.
Sep 15 2019
I see nothing to indicate you should be hitting a barrier. Please make sure the interface is set to the English Wikipedia, and please also report what username you are using. I found "Brown Chocolate", which is a member of "basic user", which has the permissions needed to run the bot.
Errors are collected here. Given the edit rate of the bot, and based on a random sample of what IABot broke versus what humans broke, of the millions of pages it edited, this is all it broke. We should certainly commission a bot to clean up the mess, however.
The error rate is still very, very low from what I'm seeing. I'm not using that as a reason to dismiss your concerns, I just need to look at what's happening. I'm not fond of patch jobs. I want to fix this at the root.
"Indeed, it might be best if IABot didn't edit articles where it can't predict the outcome of its own edits." <-- I'm referring to that. IABot can't know when an edit it makes might have an unpredictable outcome; that would require serious machine learning, which isn't slated for implementation until further down the road (IABot v3). That's still some years off at least.
That's also not feasible. There's no way for IABot to even know that an edit is unpredictable to begin with. The solution is to try to establish why the bot only touches one ref and ignores the other.
@Mikeblas I honestly don't know if this is worth fixing in IABot now. I don't even know what reflist is doing to trigger this error. And I can't program IABot to predict what references inside templates will do on other wikis either. This will need a different solution.
The deduping has been disabled.
Sep 14 2019
The domain is whitelisted because IABot cannot reliably assess the URL. As for it ignoring deadurl: that parameter is no longer supported and is no longer acknowledged. A different bot should go around replacing deadurl with url-status.
Looking more closely at the example, it looks like "ARIA News 28 Oct" was defined 3 times, where one had vastly different content, and the other was an identical duplicate. I don't see an error here other than IABot exposing that there was a ref error not being caught earlier.
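The dedup behavior under discussion can be sketched like this. This is a simplification, assuming the bot collapses only byte-identical repeat definitions into self-closing reuses; `dedupe_named_refs` is a hypothetical name, not IABot's actual code:

```python
import re

REF = re.compile(r'<ref name="([^"]+)"\s*>(.*?)</ref>', re.S)

def dedupe_named_refs(wikitext):
    """Collapse repeated identical <ref name="...">...</ref> definitions
    into self-closing reuses, keeping the first full definition."""
    seen = {}

    def repl(match):
        name, body = match.group(1), match.group(2)
        if seen.get(name) == body:
            # Identical duplicate: replace with a self-closing reuse.
            return f'<ref name="{name}" />'
        # First definition (or a conflicting one): keep it as written.
        seen.setdefault(name, body)
        return match.group(0)

    return REF.sub(repl, wikitext)
```

Under this sketch, a second definition with *different* content is left in place, and MediaWiki then flags the pre-existing naming conflict, which matches the "exposed a ref error not being caught earlier" reading above.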
Just looking at the first example, I'm already confused. It deduped a named reference but a different reference lit up with a red error. Can you help me understand what is going on here?
Sep 11 2019
Sep 10 2019
I couldn't say. I can't even see which page it's trying to pull in that given batch.
This isn't the first time I've run my script against IABot's contributions, so I would say it is a recent issue.
Sep 9 2019
Sep 8 2019
On-wiki TemplateData could have recently been changed; the bot reads that as well.
If it's producing a blank parameter, then somewhere in the TemplateData that parameter was marked as required.
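The effect described above, where a required flag in TemplateData yields a blank parameter, can be illustrated with a toy renderer. `build_template` and the data shape are hypothetical simplifications, not IABot's actual code:

```python
def build_template(name, values, templatedata):
    """Render a template call, emitting any parameter that TemplateData
    marks as required even when no value is available (hence blanks)."""
    parts = [name]
    for param, meta in templatedata["params"].items():
        value = values.get(param, "")
        if value or meta.get("required"):
            parts.append(f"{param}={value}")
    return "{{" + "|".join(parts) + "}}"

# Toy TemplateData: archive-url is marked required.
td = {"params": {"url": {"required": True},
                 "archive-url": {"required": True},
                 "title": {"required": False}}}
build_template("cite web", {"url": "https://example.org"}, td)
# → "{{cite web|url=https://example.org|archive-url=}}"
```

The optional `title` is dropped entirely, but the required `archive-url` is emitted blank, which is exactly the symptom being reported.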
Sep 7 2019
Sep 4 2019
Added url-access for all book references not going to specific pages, as the ones going to a specific page do not require registration to view.
Sep 3 2019
I cannot establish what is going on here. When pulling up the URL, I see one page being mentioned, and it's the same one you mentioned.
Aug 31 2019
First off, there's no such thing as a URL without a scheme (HTTP/HTTPS/FTP). The HTTPS version does not redirect anywhere.
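The scheme point can be checked mechanically; a small Python sketch using the standard library (`has_scheme` is an illustrative helper, not IABot code):

```python
from urllib.parse import urlparse

def has_scheme(url):
    """True if the URL carries one of the schemes the bot handles."""
    return urlparse(url).scheme in ("http", "https", "ftp")

has_scheme("https://example.org/page")  # True
has_scheme("example.org/page")          # False: bare host, no scheme
```

A scheme-less string like `example.org/page` is a relative reference, not a full URL, which is why it can't be fetched or archived as-is.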
Aug 29 2019
Just leaving this here as well. I have now changed my on-wiki hash.
I'm still in Germany, so I will be going to bed now. :-). Thanks @bd808
Welp, my phone got logged out somehow, so now I only have one device left with an active Phab session going.
I have 2FA enabled and my identity recently confirmed multiple times on Wikipedia, most recently on this very thread. Can I just post a 2FA reset confirm post on my talk page?