I intend to have it ready by the end of August.
Fri, Aug 23
Boldly closing this. Internet Archive actively crawls all WMF projects looking for new links and IABot handles Wikidata.
There is insufficient information in this report to effectively diagnose the issue. Unable to reproduce.
Thu, Aug 22
They are working for me. What did you try to push?
Yes. This appears to be fixed now.
This is also preventing logins. Now it really needs a quick fix.
Wed, Aug 21
Mon, Aug 19
Sun, Aug 18
Sat, Aug 17
Thu, Aug 15
Should be resolved now.
@Skalman I'm sitting at Table 30 right now. Would you like to meet me there?
Wed, Aug 14
Tue, Aug 13
I'm not sure what is being asked for here. Can you clarify?
Sat, Aug 10
No, it means it's fixed in the code, but the code isn't deployed yet.
Fri, Aug 9
It follows the most-used format found in the template and goes with it. In this case, every parameter has a space. The URL duplication is intended by design and is approved.
Tue, Aug 6
For normal users, error messages aren't seen unless debug mode is enabled. @Cirdan please append debug=true to the tool URL query string so you can continue seeing them.
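For example, if the tool URL were something like the following (the host and existing parameters here are purely illustrative, not the real tool path), the debug flag is just one more query parameter:

```
https://tools.example/iabot/?foo=bar&debug=true
```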
So if I understand this correctly: when title is not present, encyclopedia becomes title, but isn't wikilinked unless there are double square brackets within the parameter, correct? If so, this is now fixed.
Fri, Aug 2
Not within the bot's scope. A search and replace bot for this domain should be commissioned.
Thu, Aug 1
I'm classifying this as a transient issue that no longer applies.
I've been able to fix the errors, but the failure is happening in this case because the archive URL is already set to the one given, so the change is discarded.
Wed, Jul 31
No further report is needed.
Beta16 will transform ref dups into self-closing refs.
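For illustration (a hypothetical named ref, not taken from any specific article), a duplicated ref like:

```
<ref name="smith2010">Smith (2010), p. 4.</ref> ... <ref name="smith2010">Smith (2010), p. 4.</ref>
```

would be rewritten so the second occurrence reuses the first as a self-closing ref:

```
<ref name="smith2010">Smith (2010), p. 4.</ref> ... <ref name="smith2010" />
```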
Whitespace issues fixed.
Sun, Jul 28
It didn't produce the template, it did something worse. It destroyed the template. There used to be actual data in there.
Sat, Jul 27
It's an opt-in service. If you are seeing it, then you are opted in to it. You can opt back out in your preferences. As for using it, I'm allowed to as long as users consent to it.
I'm going to go ahead and close this. Assuming the one-time cleanup bot did fix this, this shouldn't be an issue going forward.
Is this still happening? I'm thinking whatever it was was just a transient issue. The URLs you have reset haven't gone back to being dead, and they've been getting checked regularly since they were reset.
With that being said, if there is no domain-level log entry present for a given URL, it happened automatically and can likely be reset. Any user setting it to that state will leave a log entry behind. Just be aware that resetting a whitelisted URL/domain can trigger the bot to re-whitelist it if it continues to think it can't read the site correctly.
I've gone ahead and de-whitelisted them, but if the bot continues to think it can't correctly assess the links, it will probably whitelist them again.
Unable to replicate
Reasons are generally reflected in the logs. If there aren't any, then the bot whitelisted it because it suspects it cannot reliably assess the given domain.
Fri, Jul 26
The links are found in the DB, but they have been whitelisted. Is it just the links, or the entire domain, that is dead? If just the links, I would suggest simply marking them as dead on-wiki.
@Graham87 unrelated question: how is the interface working for you? Does it need any accessibility improvements?
These links aren't dead, they're just blocked. They work perfectly fine outside the EU, and because Wikipedia's servers are based in the US, that's where IABot lives. If IABot gets any HTTP error of 400 or above, the link is seen as dead. If users want to get around that, they should use a VPN; a lot of them are free these days.
It is unclear what is being asked here.
So there's quite a bit of syntax malformation on the page, but I added some safeties to work around them. They should be ready when beta16 is fully released.
Jul 24 2019
Nevermind, I just reproduced it on my end.
I cannot replicate this bug. It could have been a glitch resulting from a prior, now-fixed bug. There was a template embedded in the link, which I removed.
Jul 23 2019
Nevermind. Was just a cache issue. This is now resolved.
Whoops. I missed one.
You can now control this in your settings.
Jul 20 2019
All good. :-)
How did it do something wrong? The existing parameters do nothing to the templates. IABot added archive URLs using the correct acknowledged parameters so they would render. Bad parameters are ignored by IABot. On Vcite, archive-url and archive-date are not supported. Only archiveurl and archivedate are.
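For illustration, a hypothetical Vcite citation (the URL and dates are made up) would need the unhyphenated parameters to render the archive:

```
{{vcite web |url=http://example.com/page |title=Example page |archiveurl=https://web.archive.org/web/20190720000000/http://example.com/page |archivedate=2019-07-20}}
```

The hyphenated forms `|archive-url=` and `|archive-date=` would be silently ignored by that template.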
I see your point. I will alter the Herald rule to simply subscribe me. The idea was that I'm immediately made aware of any new task or changes to tasks concerning my bot.
Please re-open if you feel you may be able to identify an issue.
There is insufficient information to establish a problem here. Dewiki appears to function just as fast as other wikis based on the queue workers.
vcite templates do not support archive-url/archive-date. As such they are invisible in the output and to the bot.
I am the only developer of the bot, and until more developers would like to join in and assist me, I see no reason to remove this rule as I'm the only one that ends up acting on them.