
"500 Internal Server Error" on all non-HTML pages
Closed, Resolved (Public)

Description

Author: dr.trigon

Description:

Internal Server Error

The server encountered an internal error or misconfiguration and was unable to
complete your request.

Please contact the server administrator, mpelletier@wikimedia.org and inform
them of the time the error occurred, and anything you might have done that may
have caused the error.

More information about this error may be available in the server error log.

Additionally, a 500 Internal Server Error error was encountered while trying
to use an ErrorDocument to handle the request.

Very simple/basic cgi script: http://tools.wmflabs.org/saper/cgi-bin/simple
Essentially ALL cgi-bin scripts have stopped working; see e.g.:
http://tools.wmflabs.org/drtrigonbot/cgi-bin/sum_cat_disc.py
http://tools.wmflabs.org/drtrigonbot/cgi-bin/panel.py
etc.

Only plain html works: http://tools.wmflabs.org/saper/test.html


Version: unspecified
Severity: blocker

Details

Reference
bz59118

Event Timeline

bzimport raised the priority of this task to High. Nov 22 2014, 2:36 AM
bzimport added a project: Toolforge.
bzimport set Reference to bz59118.

I checked this with a newly created "tool" from scratch, and my simplest CGIs do not work: http://tools.wmflabs.org/saper/cgi-bin/env (simplest CGI in /bin/sh) gives 500
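
For reference, a minimal environment-dumping CGI in /bin/sh is only a few lines; this is a sketch of that kind of script, not necessarily the exact one at the URL above:

#!/bin/sh
# Minimal CGI: print the required header, a blank line, then the environment.
echo "Content-Type: text/plain"
echo
env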

Same for the PHP script stored in "public_html":

http://tools.wmflabs.org/saper/test.php

(In reply to comment #2)

Is this related to the http://icinga.wmflabs.org/icinga/ outage?

No. Icinga is hosted in the Labs project "Nagios" and isn't a dependency for Tools.

Lots of requests from Baiduspider/2.0 on tools-webproxy.

Set up robots.txt as a temporary measure to:

User-agent: *
Disallow: /

and will reinstate the spiders block in tools-webproxy's /etc/apache2/sites-available/webproxy in a jiffy.
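
The spiders block itself isn't quoted in this task; a User-Agent based block in an Apache site config looks roughly like this (a sketch, not the actual webproxy configuration):

RewriteEngine On
# Refuse requests whose User-Agent contains the crawler's name ([F] = 403 Forbidden).
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
RewriteRule ^ - [F]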

tools-webserver-01 is (very much :-)) out of memory.

scfc@tools-webserver-01:~$ sudo -i
-bash: fork: Cannot allocate memory
-bash: fork: Cannot allocate memory
scfc@tools-webserver-01:~$

Rebooting.

And it's out again:

[Mon Dec 30 12:55:41 2013] [error] [client 10.4.1.89] (12)Cannot allocate memory: couldn't create child process: /usr/lib/suphp/suphp for /data/project/ipp/public_html/npp_extern.php

Unfortunately, the bot hitting that page doesn't send a User-Agent, and before I exclude the whole world, I gotta read up on mod_rewrite (and then later re-start ipp as NewWeb). Moment, please.
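
Blocking clients that send no User-Agent at all is also a mod_rewrite one-liner; a hypothetical rule, not necessarily what was deployed:

RewriteEngine On
# An absent User-Agent header leaves HTTP_USER_AGENT empty, so it matches ^$.
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule ^ - [F]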

Yeah, the OOM was a consequence of Baidu insanely spidering some of the tools that have links to themselves with expensive parameters. Recursion for the loss.

I've blocked the spider at the network level. With a bit of luck, things should settle back down.
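
A network-level block of a crawler usually means an iptables drop rule, along these lines (illustrative only; the actual rule and address range aren't recorded in this task):

# Drop all traffic from the crawler's address range (example range, hypothetical).
sudo iptables -I INPUT -s 180.76.0.0/16 -j DROP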

Executed as local-ipp "webservice start", removed the rewrite on tools-webproxy, and tools-webserver-01 is down again. *Argl*.
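
(For the record, "as local-ipp" means running the command under the tool's own user, e.g.:)

# Run webservice start under the tool account via a login shell.
sudo -iu local-ipp webservice start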

Oh, even more fun. We have Nigma.ru also crawling and disobeying robots.txt.

(In reply to comment #11)

Oh, even more fun. We have Nigma.ru also crawling and disobeying robots.txt.

I think my rewrite rule may have prevented the named spiders from accessing robots.txt :-) (fixed now). Oh, well, I should get my IRC client going again for some synchronous communication.
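
A User-Agent block will also catch the spiders' robots.txt fetches unless that path is exempted first; a sketch of the fixed rule:

RewriteEngine On
# Let every client fetch robots.txt, then block the named crawlers everywhere else.
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{HTTP_USER_AGENT} (Baiduspider|Nigma) [NC]
RewriteRule ^ - [F]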

There seems to be a wiki at bookmarkmanagerv2?

Indeed. Disabled (it had open registration and was infested by spambots).

JFTR: Notified APPER about restart as NewWeb at [[de:Benutzer Diskussion:APPER#IP-Patrol auf Tools]].

dr.trigon wrote:

Looks like the scripts work now. Thanks! But I have issues connecting to the DB:

<class '_mysql_exceptions.OperationalError'>: (2003, "Can't connect to MySQL server on 'dewiki.labsdb' (110)")

(In reply to comment #15)

Looks like the scripts work now. Thanks! But I have issues connecting to the DB:

<class '_mysql_exceptions.OperationalError'>: (2003, "Can't connect to MySQL server on 'dewiki.labsdb' (110)")

That must be related to my reboot of tools-webserver-01.
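
Error 110 is ETIMEDOUT: the TCP connection to the database host never completed, which fits the port forwards being lost on reboot. A quick probe from the webserver (hypothetical invocation):

# Check whether the MySQL port is reachable; this times out if the forward is missing.
nc -vz -w 5 dewiki.labsdb 3306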

Coren, where are the port forwards loaded? I see identical "/etc/iptables.conf"s; on both -01 and -03, "sudo iptables -L" gives identical, yet "empty" output. However, -03 works while -01 doesn't.

That's because -L shows the /filter/ table by default, not the nat table where those rules live.
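
In other words, to actually see the port-forward (DNAT) rules you have to select the nat table explicitly:

# List the nat table, where the DNAT/port-forward rules live; -n skips DNS lookups.
sudo iptables -t nat -L -n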

(In reply to comment #17)

That's because -L shows the /filter/ table by default, not the nat table where those rules live.

D'oh! DrTrigon, working for you now?

dr.trigon wrote:

Yupp! Up and running again! Perfect, thanks to everybody involved!!! Greetings

(In reply to comment #5)

Set up robots.txt as a temporary measure to:

User-agent: *
Disallow: /

Temporary? Bug 61132.