https://gerrit.wikimedia.org/r/#/c/59797/ changed the way jobs are executed on page views: it now spawns a new PHP shell to run runJobs.php. The idea was to execute it asynchronously, so the server can send the page to the client and end the request without waiting for runJobs.php to finish. However, this doesn't work.
I've just tested it by pointing $wgPhpCli at a shell script containing only a sleep command, and no other page would load until the sleep command ended.
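For reference, the test setup was roughly this (a sketch; the script path is hypothetical):

// LocalSettings.php (test only): point $wgPhpCli at a stand-in
// script whose entire body is "#!/bin/sh" followed by "sleep 5",
// so the spawned job runner does nothing but sleep.
$wgPhpCli = '/tmp/sleep-test.sh'; // hypothetical path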
According to http://php.net/manual/en/function.passthru.php:
*Note:* If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends (!).
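The same behaviour is easy to reproduce with plain shell_exec() (a minimal sketch, assuming a POSIX shell; the mechanism should be the same for MediaWiki's wfShellExec() wrapper):

<?php
// Even with a trailing '&', shell_exec() blocks until the child
// exits, because the backgrounded process still holds PHP's
// output pipe open.
$start = microtime( true );
shell_exec( 'sleep 5 &' );
printf( "no redirect:   %.1fs\n", microtime( true ) - $start ); // ~5.0s

// Redirecting stdout/stderr lets the shell detach, so the call
// returns immediately.
$start = microtime( true );
shell_exec( 'sleep 5 > /dev/null 2>&1 &' );
printf( "with redirect: %.1fs\n", microtime( true ) - $start ); // ~0.0s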
It does this:
$cmd = wfShellWikiCmd( "$IP/maintenance/runJobs.php", array( '--maxjobs', $n ) );
wfShellExec( "$cmd &", $retVal );
But it redirects neither stdout nor stderr to another stream (/dev/null, for example), so PHP stays stuck on that command until it finishes.
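One possible fix, sketched against the snippet above (untested; the redirection may need to be applied differently depending on how wfShellExec() builds the final command):

$cmd = wfShellWikiCmd( "$IP/maintenance/runJobs.php", array( '--maxjobs', $n ) );
// Redirect stdout and stderr so the shell can detach the job runner;
// PHP then no longer waits for the child's output pipe to close.
wfShellExec( "$cmd > /dev/null 2>&1 &", $retVal );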
This could cause issues like the one described in bug 47375 comment 9.
The odd thing is that it apparently causes the entire server to hang, not just the current request. For example, with "sleep 5" in my test shell script, I opened 3 different wiki links consecutively; top showed only one copy of the script running at a time, and the last page loaded about 15 seconds (5 + 5 + 5) after I opened the first link. My environment is PHP 5.3.15 (apache2handler).
If this is really happening, it's not desirable, even if the process is executed in the background and doesn't block the request.
Version: 1.22.0
Severity: major
OS: Linux