Pasting from email to get it out of my inbox:
node bin/roundtrip-test.js --prefix enwiki "User:Pdebee"
Ok, that's a separate issue. I didn't set a worker_heartbeat_timeout in my little service wrapper, so it's using the default. Since we're supplying a parsoidURL in rt, it didn't come up: https://github.com/wikimedia/parsoid/blob/master/tests/serviceWrapper.js#L83-L89

What's really going on here, though, is that there's a huge <include> on that page (starting around {{user citing sources}}) and we recursively parse the contents, but this time synchronously: https://github.com/wikimedia/parsoid/blob/master/lib/wt2html/pegTokenizer.pegjs.txt#L1084
That locks up the process enough to prevent the heartbeat, at least locally. This still seems somewhat unsatisfying, in that the heartbeat timeout is supposed to be 3 mins, so I'll probably look a little further tomorrow.

I want to say that the solution to this is to treat <noinclude>/<includeonly>/<onlyinclude> as extension tags (like https://gerrit.wikimedia.org/r/#/c/281076/) and tokenize them async in the extension handler. But that's just an initial thought ...
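To make the event-loop point concrete, here's a toy Node sketch (not Parsoid code; the busy loop just stands in for the recursive synchronous tokenize call) showing a timer-based heartbeat going silent while a synchronous call runs:

    // Toy repro: a long synchronous call starves a setInterval heartbeat.
    var HEARTBEAT_MS = 1000;
    var lastBeat = Date.now();

    var beatTimer = setInterval(function() {
        lastBeat = Date.now();
        console.log('heartbeat');
    }, HEARTBEAT_MS);

    // Stand-in for the synchronous recursive parse.
    function fakeSynchronousParse(durationMs) {
        var start = Date.now();
        while (Date.now() - start < durationMs) {
            // Busy-wait: nothing else on the event loop (including the
            // heartbeat interval above) can run until this returns.
        }
    }

    setTimeout(function() {
        console.log('starting "parse"');
        fakeSynchronousParse(5000);  // 5s of blocking => missed heartbeats
        console.log('heartbeat was silent for', Date.now() - lastBeat, 'ms');
        clearInterval(beatTimer);
    }, 1500);

Scale the blocking up to longer than the heartbeat timeout and the worker gets killed, which is presumably what the rt test is tripping over.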
A stopgap measure could be to put a limit on the length of wikitext we're willing to parse synchronously.
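Roughly what I have in mind, as a sketch only: the constant and the tokenizeSync / tokenizeAsync helpers below are made-up names standing in for whatever the tokenizer actually exposes, not real Parsoid identifiers.

    // Stubs so the sketch runs standalone; in Parsoid these would be the
    // real tokenizer entry points (names here are hypothetical).
    function tokenizeSync(wt) { return ['tokens for ' + wt.length + ' chars']; }
    function tokenizeAsync(wt, cb) {
        setImmediate(function() { cb(null, ['tokens for ' + wt.length + ' chars']); });
    }

    var MAX_SYNC_WT_LENGTH = 100 * 1024;  // e.g. 100 KiB; number is arbitrary

    function tokenizeIncludedContent(wikitext, cb) {
        if (wikitext.length <= MAX_SYNC_WT_LENGTH) {
            // Small include: keep the current synchronous path.
            return cb(null, tokenizeSync(wikitext));
        }
        // Big include: fall back to the async path so the event loop
        // (and the worker heartbeat) keeps running.
        tokenizeAsync(wikitext, cb);
    }

    // Usage: a ~200 KiB string takes the async branch.
    tokenizeIncludedContent(new Array(200 * 1024).join('x'), function(err, tokens) {
        console.log(tokens);
    });

That doesn't fix the underlying sync-vs-async split, but it would at least keep one pathological page from wedging a worker.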