
Internet Archive Bot apparently fails to analyse Tenerife article on enwiki
Closed, ResolvedPublic


I've tried several times this evening (UK time) to get the Internet Archive Bot to analyse the Tenerife article on the English Wikipedia. I've encountered no explicit errors, but after many minutes (in one case over 30) all browser activity ceases, leaving just a blank page displayed (url: ) and no edits made to the article (although I don't know whether any are required). It is a large article with 179 references, but I've previously run the bot successfully on pages of a similar size (all completing, with success or an explicit error, inside 10 minutes), and concurrently with the failures on Tenerife it has worked on several other, smaller articles.
If it matters, I'm using Firefox 56.0 on Linux.

Event Timeline

Restricted Application added a project: Internet-Archive. · View Herald TranscriptJan 4 2018, 12:45 AM
Reedy renamed this task from Inetert Archive Bot apparently fails to analyse Tenerife article on en.wp to Internet Archive Bot apparently fails to analyse Tenerife article on enwiki.Jan 4 2018, 1:18 AM
Cyberpower678 triaged this task as Medium priority.Jan 4 2018, 3:02 PM
Cyberpower678 moved this task from Inbox to v1.6 on the InternetArchiveBot board.

It seems the script is timing out on the webserver. This may not be fixable. The best I could do is detect a likely timeout before it happens and show the user an error message. My concern, though, is that a user would sit there for a while only to be given an error message after such a long wait.
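The pre-detection idea above could look something like the following sketch. This is hypothetical illustrative code, not actual IABot source; the function name `nearingTimeout` and the 5-second safety margin are assumptions.

```php
<?php
// Sketch: pre-detect an impending PHP execution timeout so the user gets
// an error page instead of a blank one. Hypothetical helper, not IABot code.
$startTime = time();
$limit = (int) ini_get('max_execution_time'); // 30 s on this webserver

function nearingTimeout(int $startTime, int $limit, int $margin = 5): bool {
    // A limit of 0 (e.g. CLI) means no timeout at all.
    if ($limit <= 0) {
        return false;
    }
    return (time() - $startTime) >= ($limit - $margin);
}

// Checked periodically inside a long-running loop, e.g. over an
// article's external links:
// if (nearingTimeout($startTime, $limit)) {
//     die("Analysis is taking too long; please try again later.");
// }
```

The check only works if it is reached between expensive operations; a single long `preg_match()` call, as in the trace below, cannot be interrupted this way.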

For me, seeing an error after so long would be frustrating, but still preferable to a silent timeout. If the error message suggested next steps (e.g. alternatives or workarounds, if there are any), that would reduce the frustration slightly.

2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP Fatal error:  Maximum execution time of 30 seconds exceeded in /mnt/nfs/labstore-secondary-tools-project/iabot/IABot/Parser/parse.php on line
2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP Stack trace:
2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP   1. {main}() /mnt/nfs/labstore-secondary-tools-project/iabot/public_html/index.php:0
2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP   2. analyzePage() /mnt/nfs/labstore-secondary-tools-project/iabot/public_html/index.php:151
2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP   3. Parser->analyzePage() /mnt/nfs/labstore-secondary-tools-project/iabot/public_html/Includes/actionfunctions.php:1855
2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP   4. Parser->getExternalLinks() /mnt/nfs/labstore-secondary-tools-project/iabot/IABot/Parser/parse.php:443
2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP   5. Parser->parseLinks() /mnt/nfs/labstore-secondary-tools-project/iabot/IABot/Parser/parse.php:1079
2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP   6. Parser->getNonReference() /mnt/nfs/labstore-secondary-tools-project/iabot/IABot/Parser/parse.php:1278
2018-01-06 20:24:36: (mod_fastcgi.c.2673) FastCGI-stderr: PHP   7. preg_match() /mnt/nfs/labstore-secondary-tools-project/iabot/IABot/Parser/parse.php:1458
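The trace shows the 30-second limit being exceeded inside a `preg_match()` call in `parse.php`, i.e. a heavy regex over a large article. As a hedged sketch (not a tested IABot setting), the per-request limit could be raised for this endpoint; the value 120 here is an arbitrary illustrative choice:

```php
<?php
// Sketch: raise the per-request execution limit for the analysis endpoint.
// The 30-second default matches the fatal error above; 120 is illustrative.
set_time_limit(120);                      // restarts the timer at 120 s
// or, equivalently, before the long-running work begins:
ini_set('max_execution_time', '120');
```

Note that under FastCGI the webserver (lighttpd's mod_fastcgi, per the log) may enforce its own request timeout, so raising the PHP limit alone may not be enough to avoid the blank page.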

I may have to push this to the v2.0 release, as that release will include a plethora of performance enhancements.

Cyberpower678 closed this task as Resolved.Feb 23 2019, 12:48 AM