Thank you!
Sep 16 2019
Sep 15 2019
This is now at least the third bug introduced by the "fix" of T224344. Can we please revert that change and let IABot deal with dead links only? Fixing strange parser behavior is not within scope.
Sep 13 2019
@Mikeblas To illustrate the point I made over and over again in our discussion: this "fix" for a MediaWiki parser problem unrelated to IABot's core functionality causes a lot of disruption on dewiki, breaking perfectly fine articles:
Sep 2 2019
Sep 1 2019
Aug 17 2019
Jul 28 2019
I haven't done a lot of cleaning yet; when I find the time, I'll try to achieve consensus for a proper solution.
Jul 6 2019
According to a user report here, the Citavi-Picker plug-in is the culprit again. The user runs the latest Firefox with Ghostery and Citavi-Picker.
Jun 19 2019
In T218511#5267984, @IKhitron wrote: Well, this time the lag is not twenty minutes any more.
Ten hours and counting.
And this time the problem is also in history pages and in API.
Jun 14 2019
There are currently 46 jobs in the queue. The bot will get to them eventually, but as far as I know there are some serious issues with the job workers, requiring manual intervention by the bot's developer on a regular basis.
Jun 10 2019
I don't think we're going to reach an agreement here. IABot will not add workarounds and checks for invalid wikitext; doing so would greatly inflate the code base and therefore introduce many more errors.
Jun 7 2019
In T224344#5232733, @Mikeblas wrote: The MediaWiki parser is quite generous,
I suppose that's true, but if the parser renders the page without complaint, how can we decide if it's invalid? Is there some other authority that tells us what is "valid" or "invalid" wiki text, if not the parser itself?
Jun 6 2019
This is closely related to (and possibly a duplicate of) T92432.
Jun 5 2019
This is very likely related to T225115; there is something wrong with the FlaggedRevs configuration.
Jun 4 2019
May 31 2019
Thanks for reporting the issue with the article. However, you have somehow ended up on Wikimedia Phabricator, where staff and volunteer developers discuss the development of, and issues with, the software that powers Wikipedia and other Wikimedia projects. Specifically, this is the project space for InternetArchiveBot, a tool that automatically finds and fixes broken web links.
It is nevertheless causing users to assume that IABot removed the non-breaking space, and it obscures the fact that there is a non-breaking space in the source code. Would it be feasible for IABot to encode whitespace and other "invisible" characters within template parameters?
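To illustrate what I have in mind, here is a minimal sketch (not IABot's actual code; the character list and function name are made up for this example) that encodes such characters as HTML entities inside a template parameter value, so they stay visible in the wikitext:

```python
# Hypothetical sketch only, not IABot's actual implementation: replace
# "invisible" characters in a template parameter value with HTML entities
# so they remain visible in the wikitext source.
INVISIBLE_CHARS = {
    "\u00a0": "&nbsp;",   # non-breaking space
    "\u200b": "&#8203;",  # zero-width space
}

def encode_invisible(value: str) -> str:
    """Return the parameter value with invisible characters encoded as entities."""
    for char, entity in INVISIBLE_CHARS.items():
        value = value.replace(char, entity)
    return value

# Example: a |url= value that ends with a non-breaking space
print(encode_invisible("https://example.org/page\u00a0"))  # https://example.org/page&nbsp;
```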
May 29 2019
I would like to second this request. The bot flag is sufficient if users want to filter out IABot's edits. The minor flag should be reserved for cosmetic changes and simple find-and-replace tasks.
May 27 2019
In T224344#5215134, @Mikeblas wrote: I can't understand why you're saying that the article has invalid wikitext. The page rendered correctly and without error before InternetArchiveBot made its edits. To be clear, it was only after InternetArchiveBot made its edit that the page rendered with an error.
Thanks for the explanation!
May 26 2019
Please reopen if the problem appears again.
Can you clarify what you mean by "duplicate reference definitions"? In the first diff you linked, IABot expanded shortened URLs to their full form, and in the second diff, it added {{webarchive}} to two links.
IABot only edits links in references. The link you tried to have fixed is part of the main article text.
May 24 2019
It seems that this edit fixed the problem. I just ran the bot on this page and it did not try to make any modifications. Can you confirm that all is well now?
IABot should only touch URLs within references, unless you configured it otherwise. So I thought the messed up reference structure might have been the issue.
May 23 2019
No, Python]]<ref>{{cite web is never closed as far as I can see. I get an error on dewiki when I paste the code.
There is a missing </ref> in your code: diff.
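For illustration, here is a rough sketch (my own example, not how MediaWiki or IABot actually parse references) of how wikitext with a <ref> tag that is opened but never closed could be flagged:

```python
import re

def has_unclosed_ref(wikitext: str) -> bool:
    """Rough heuristic: more <ref> openings than </ref> closings.

    Self-closing tags like <ref name="x" /> are ignored; this is a
    simplification and not how the MediaWiki parser actually works.
    """
    opens = len(re.findall(r"<ref\b(?![^>]*/>)", wikitext))
    closes = len(re.findall(r"</ref\s*>", wikitext))
    return opens > closes

print(has_unclosed_ref("[[Python]]<ref>{{cite web |url=https://example.org}}"))        # True
print(has_unclosed_ref("[[Python]]<ref>{{cite web |url=https://example.org}}</ref>"))  # False
```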
Can you post the entire infobox/template structure around it as well, please? I'll go check the template definitions for your wiki in the meantime.
What do you suggest?
I have a hard time understanding the reason for these edits from the diffs. Could you extract the original template insertion (prior to any IABot edits) and post it here?
May 16 2019
May 15 2019
May 13 2019
May 12 2019
@Tgr Could you quickly outline how this could be fixed (e.g. where to place the check and how to perform it)? Thanks!
As far as I can see, there is currently no domain-wide status set for www.nljonline.com. I suggest we let the link checker do its work.
If a URL is whitelisted, it means that the bot ignores it. It is used to prevent the bot from messing up links where the automatic detection of the link's status fails (e.g. because the programming of the website is messy or the host blocks the bot's link check requests).
I believe it makes sense to keep google.com whitelisted. Unfortunately, whitelisting only works at the domain level; otherwise, just http://www.google.com/hostednews/ could be exempted.
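To make the limitation concrete, here is a minimal sketch (assumed behavior for illustration only, not IABot's actual code) of a domain-level whitelist check; note that it cannot distinguish between paths on the same host:

```python
from urllib.parse import urlparse

# Illustrative whitelist; domain-level only, so individual paths cannot be exempted.
WHITELISTED_DOMAINS = {"www.google.com"}

def is_whitelisted(url: str) -> bool:
    """Return True if the URL's host is on the domain-level whitelist."""
    return urlparse(url).netloc.lower() in WHITELISTED_DOMAINS

print(is_whitelisted("http://www.google.com/hostednews/some-article"))  # True
print(is_whitelisted("http://www.google.com/search?q=example"))         # True (same domain)
```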
False positives and false negatives are expected to occur regularly. Unless these mistakes clearly point to an issue with the dead link checker, there is nothing that can be done here except correcting the links in the database.
That should be possible with a change to the template configuration. I assume that @Cyberpower678 will take care of that.
Thanks for the report. I removed the whitelisting and marked the URL as dead.
Thanks for the report. I removed the domain-wide whitelisting and set the URL to dead.
@Zppix Cyberpower678 is the only developer and operator/maintainer of the bot, hence prioritization is rarely used within the InternetArchiveBot Phabricator project; quite often, tasks are never triaged at all. You can be sure that he is working towards resolving all issues around the bot whenever his time permits.