Now that the deadlink logging API is finished, we should have Cyberbot start using it.
Documentation for logging API:
https://meta.wikimedia.org/wiki/Fixing_dead_links/Deadlink_logging_app
| Status | Subtype | Assigned | Task |
|---|---|---|---|
| Resolved | | Cyberpower678 | T120433 Migrate dead external links to archives |
| Resolved | | Cyberpower678 | T125610 Epic: Create a centralized logging interface for tracking and reporting dead link fixes |
| Resolved | | Cyberpower678 | T130035 Get Cyberbot to use logging API |
@Cyberpower678: Do you have any thoughts or concerns about this task? Is this something that you would like to work on or would it be better for the Community Tech team to do the integration?
I don't mind doing it. I just need to get around to it. If you want to do it, I don't mind either. I would suggest making the actual calls in the API class and calling them from the analyzePage function.
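A minimal sketch of that shape, in Python purely for illustration (Cyberbot's own code is not Python); the endpoint URL and payload field names here are assumptions, not the logging app's confirmed interface:

```python
import requests

# Hypothetical endpoint; the real one is described on the
# Fixing_dead_links/Deadlink_logging_app page linked above.
LOGGING_ENDPOINT = "https://example.org/deadlink-logger/log"

class API:
    """Holds the actual calls to the dead-link logging service."""

    @staticmethod
    def log_dead_link_stats(wiki, page, links_analyzed, links_fixed, links_tagged):
        # Payload field names are illustrative placeholders, not the
        # logging app's confirmed schema.
        response = requests.post(LOGGING_ENDPOINT, data={
            "wiki": wiki,
            "page": page,
            "linksanalyzed": links_analyzed,
            "linksfixed": links_fixed,
            "linkstagged": links_tagged,
        }, timeout=30)
        response.raise_for_status()

def analyzePage(page):
    """Scan a page, fix or tag dead links, then report the totals."""
    # ... the real link analysis would happen here ...
    links_analyzed, links_fixed, links_tagged = 10, 7, 3  # placeholder totals
    API.log_dead_link_stats("enwiki", page, links_analyzed,
                            links_fixed, links_tagged)
```

Keeping the HTTP call inside the API class means analyzePage only hands over totals and never needs to know the logging service's wire format.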
So the bot currently does not count links that are not fixed. Given the way the bot is built, it would take a bit to get that implemented.
I already fixed this. I just added a counter to the part of the code that tags or ignores links instead of fixing them.
Number of links not fixed = Total number of reference links on the page - Number of links fixed.
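A minimal runnable sketch of that counter, with hypothetical names (Python for illustration only):

```python
def try_to_fix(link):
    """Stub: attempt to replace the link with an archive copy; True on success."""
    return link.endswith("/archived")  # placeholder logic

reference_links = ["http://a.example/archived", "http://b.example/gone"]

links_fixed = 0
links_not_fixed = 0
for link in reference_links:
    if try_to_fix(link):
        links_fixed += 1
    else:
        # The counter added where the bot tags or ignores a link
        # instead of fixing it.
        links_not_fixed += 1

# Matches the arithmetic above.
assert links_not_fixed == len(reference_links) - links_fixed
```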
Actually, numn is supposed to count dead links that are not fixed, not all links that are not fixed. Knowing how many dead links are not successfully fixed is much more useful information for the Internet Archive.
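A hedged sketch of that corrected count; numn comes from the thread above, everything else here is a placeholder:

```python
def is_dead(link):
    """Stub: whether the link checker flagged this link as dead."""
    return "gone" in link

def fixed(link):
    """Stub: whether an archive replacement succeeded for this link."""
    return "archive" in link

reference_links = [
    "http://a.example/live",          # alive: should not affect numn
    "http://b.example/gone",          # dead, not fixed
    "http://c.example/gone-archive",  # dead, fixed
]

# numn counts dead links that were NOT successfully fixed,
# not every link on the page that went untouched.
dead_links = [l for l in reference_links if is_dead(l)]
numn = sum(1 for l in dead_links if not fixed(l))  # here numn == 1
```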
Yes, I talked to Max about this and he says that the bot is correctly logging this number to the logger already.