During the Apr 20th run I noticed that dumping certain page ranges took an extremely long time, even with lbzip2 for compression. Check whether this is due to a large number of revisions per page, long revision text, or some other cause. If possible, account for this when splitting jobs into page ranges so that no single job takes an abnormally long time; a sketch of one approach follows the example below.
Example slow range: wikidatawiki-20190401-pages-meta-history27.xml-p56915353-p56950553.bz2: ~35k pages, 40 GB of compressed output, over 12 hours to produce.
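A minimal sketch of one way the splitting could work, assuming we can estimate a per-page "weight" up front (e.g. revision count, or the sum of rev_len from the revision table). The function name `split_by_weight` and the weight heuristic are hypothetical, not part of the existing dump scripts:

```python
from typing import Iterable, List, Tuple

def split_by_weight(
    pages: Iterable[Tuple[int, int]], target_weight: int
) -> List[Tuple[int, int]]:
    """Group consecutive (page_id, weight) pairs into page ranges whose
    total weight is roughly target_weight, so every dump job does a
    comparable amount of work regardless of how many pages it covers.

    `weight` is a hypothetical per-page cost estimate, e.g. revision
    count or total revision text length for the page.
    """
    ranges: List[Tuple[int, int]] = []
    start = None  # first page_id of the range being built
    last = None   # last page_id seen so far
    acc = 0       # accumulated weight of the current range
    for page_id, weight in pages:
        if start is None:
            start = page_id
        acc += weight
        last = page_id
        if acc >= target_weight:
            ranges.append((start, last))
            start, acc = None, 0
    if start is not None:
        ranges.append((start, last))  # leftover tail range
    return ranges

# Example: normal pages (weight 10) with one pathological page whose
# history is enormous; the heavy page ends up in its own small range
# instead of inflating a fixed-stride 35k-page chunk.
pages = (
    [(i, 10) for i in range(1, 1001)]
    + [(1001, 50000)]
    + [(i, 10) for i in range(1002, 2002)]
)
print(split_by_weight(pages, target_weight=5000))
# [(1, 500), (501, 1000), (1001, 1001), (1002, 1501), (1502, 2001)]
```

Even a crude weight such as per-page revision count would flag ranges dominated by a few heavily edited pages and let them be cut into smaller jobs, instead of splitting purely by a fixed page-ID stride as the p56915353-p56950553 example suggests happens now.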