As proposed in https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/534642/ , concurrency could be improved and connection overhead largely reduced. In testing via shell.php, hitting 1K enwiki URLs drops from 17.5s for each of two batches run back to back, down to 1.5 - 3.5s per batch. Also, netstat shows only 50 TCP connections in ESTABLISHED/TIME_WAIT rather than 2000.
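For illustration only (the actual change is the linked Gerrit patch to MultiHttpClient), here is a minimal sketch of the underlying idea using plain curl_multi: keep one multi handle alive across batches so libcurl's connection cache can reuse TCP connections instead of opening a new one per request. The connection limits and URLs below are assumptions for the sketch, not values from the patch.

// Illustrative sketch only; not the code from the Gerrit change.
$multi = curl_multi_init();
// Cap connection reuse per host and overall (these CURLMOPT_* constants
// need PHP 7.0.7+ and a reasonably recent libcurl).
curl_multi_setopt( $multi, CURLMOPT_MAX_HOST_CONNECTIONS, 50 );
curl_multi_setopt( $multi, CURLMOPT_MAX_TOTAL_CONNECTIONS, 50 );

$urls = [ 'https://en.wikipedia.org/wiki/Foo', 'https://en.wikipedia.org/wiki/Bar' ];
$handles = [];
foreach ( $urls as $url ) {
	$ch = curl_init( $url );
	curl_setopt( $ch, CURLOPT_NOBODY, true ); // HEAD request
	curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
	curl_multi_add_handle( $multi, $ch );
	$handles[] = $ch;
}

// Drive all transfers concurrently on the shared multi handle.
do {
	$status = curl_multi_exec( $multi, $active );
	if ( $active ) {
		curl_multi_select( $multi );
	}
} while ( $active && $status === CURLM_OK );

foreach ( $handles as $ch ) {
	echo curl_getinfo( $ch, CURLINFO_RESPONSE_CODE ), "\n";
	curl_multi_remove_handle( $multi, $ch );
	curl_close( $ch );
}
// Deliberately not calling curl_multi_close( $multi ): keeping the multi
// handle (and its connection cache) around lets the next batch reuse the
// already-established TCP connections.

Reusing the multi handle across batches is what keeps the ESTABLISHED/TIME_WAIT count low, versus one fresh socket per request when handles are created and torn down each time.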
The Test.php file, included from shell.php, exposes $benchmark():
// Build one HEAD request per word in the list.
$reqs = array_map(
	function ( $word ) {
		return [
			'method' => 'HEAD',
			'url' => "https://en.wikipedia.org/wiki/" . ucfirst( trim( $word ) )
		];
	},
	file( "$IP/words.list" ) // 1000 word list
);

$http = new MultiHttpClient( [] );

$benchmark = function () use ( $http, $reqs ) {
	// Run the same batch twice and time each pass.
	for ( $i = 1; $i <= 2; ++$i ) {
		$start = microtime( true );
		$reqs = $http->runMulti( $reqs );
		$real = microtime( true ) - $start;

		$codes = [];
		foreach ( $reqs as $req ) {
			$codes[] = $req['response']['code'];
		}

		echo "$real sec\n";
		var_dump( array_count_values( $codes ) );
	}
};
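For reference, a minimal way to run it, assuming the snippet above is saved as Test.php under $IP (the exact path is an assumption):

// From the shell.php prompt:
require "$IP/Test.php";
$benchmark();

Each iteration prints the wall-clock time in seconds followed by a count of the HTTP status codes returned.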