User Details
- User Since
- Oct 6 2014, 10:34 PM (468 w, 2 d)
- Availability
- Available
- IRC Nick
- arlolra
- LDAP User
- Arlolra
- MediaWiki User
- Arlolra [ Global Accounts ]
Yesterday
Note the File:Undefined from https://it.wikipedia.org/wiki/Timor_Est?useparsoid=1 while verifying T346703#9204822
Alas, this is going to need a new release of Timestamp similar to T329594
The train has reached itwiki and the TMH player is now loaded on,
https://it.wikipedia.org/wiki/Timor_Est?useparsoid=1
An attempt at this was started in,
https://github.com/wikimedia/mediawiki-services-parsoid/commit/55da3090d0e7faa49e40a5bce0f68634fea5fc83
@phuedx Thank you
Tue, Sep 26
It looks like the question mark is forcing this as an extlink,
Mon, Sep 25
Fri, Sep 22
I actually need three of them to trigger the issue:
Thu, Sep 21
Note that you can produce the same error with,
Tue, Sep 19
Mon, Sep 18
From what I can tell, a fairly routine "The maximum execution time of 60 seconds was exceeded" RequestTimeoutException was thrown by Excimer while inside the try block creating a DateTime,
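To illustrate the failure mode, here's a minimal Python analogue (the PHP/Excimer specifics differ, and the names below are hypothetical stand-ins): the try block only anticipates a parse error from constructing the date, so a timeout exception raised mid-construction escapes the handler.

```python
class RequestTimeoutException(Exception):
    """Stand-in for Excimer's request-timeout exception (hypothetical analogue)."""

def create_datetime(raise_timeout):
    # Simulates constructing a DateTime; the timeout can interrupt at any point.
    if raise_timeout:
        raise RequestTimeoutException(
            "The maximum execution time of 60 seconds was exceeded")
    return "2023-09-18T00:00:00Z"

def guarded_parse(raise_timeout):
    try:
        return create_datetime(raise_timeout)
    except ValueError:
        # We only anticipated malformed input, not a timeout,
        # so the RequestTimeoutException escapes this handler.
        return None
```

The point being that the exception isn't a bug in the date handling itself; it's the sampler interrupting an otherwise routine code path.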
Fri, Sep 15
Post-deploy, there is no longer a fostered content lint for,
https://en.wikisource.org/w/index.php?title=User:Arlolra/sandbox&oldid=13383266
There haven't been any instances since yesterday's deploy, though there were previously days with zero occurrences as well. I'll optimistically close this and check back on it later.
Fri, Sep 8
Parsoid's output looks ok
https://en.wikipedia.org/api/rest_v1/page/html/Abbey_Road#Track_listing
Thu, Sep 7
So if I had written the following some time ago, would I also have gotten a broken result?
Wed, Sep 6
Sorry, there are two wikitext parsers in MediaWiki. The "legacy parser" is the one that still produces what you'd see in a desktop browser.
Here's a simplified test case from the reqId,
Tue, Sep 5
The crucial piece you're tripping on here is || in the file link [[Tập tin:Hellenic Army War Flag.svg|23x20px||alt=|link=]]
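A quick way to see why the || matters: splitting the file link's pipe-delimited option list yields an empty option between the size and the alt. This is just a Python sketch of the tokenization, not Parsoid's actual code:

```python
def file_link_options(wikitext):
    # Split the pipe-delimited options of a [[File:...]] link.
    # Illustrative only; Parsoid's real tokenizer is far more involved.
    inner = wikitext.strip("[]")
    return inner.split("|")[1:]  # drop the link target itself

opts = file_link_options(
    "[[Tập tin:Hellenic Army War Flag.svg|23x20px||alt=|link=]]")
# The consecutive pipes produce an empty option between '23x20px' and 'alt='
```

That empty option is the piece the round-trip has to preserve faithfully.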
Another path may be to add a later hook for that sort of case.
Fri, Sep 1
However, in the parse direction, the metrics are only emitted in ParsoidHandler::wt2html, so calls to DirectParsoidClient::getPageHtml / transformWikitext aren't recording any data. They should be moved somewhere inside HtmlOutputRendererHelper or whatever it calls.
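The structural issue can be sketched in Python (emit_timing and the function names are illustrative, not the actual MediaWiki code): when only one entry point records the metric, every other caller of the shared routine goes unmeasured; recording inside the shared helper covers both.

```python
import time

recorded = []

def emit_timing(name, ms):
    recorded.append((name, ms))

def parse_wikitext(wt):
    # Shared parsing routine; emitting the metric here covers every caller.
    start = time.monotonic()
    html = f"<p>{wt}</p>"  # stand-in for actual parsing
    emit_timing("wt2html", (time.monotonic() - start) * 1000)
    return html

def rest_handler(wt):       # analogous to ParsoidHandler::wt2html
    return parse_wikitext(wt)

def direct_client(wt):      # analogous to DirectParsoidClient::getPageHtml
    return parse_wikitext(wt)
```

With the metric in the shared layer, both rest_handler and direct_client produce a timing sample; with it only in rest_handler, the direct path records nothing, which matches the gap described above.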
...
That leaves RESTBase. It's still being populated for the other endpoints that depend on it, which continue to make calls to the ParsoidHandler, accounting for the data we see.
Thu, Aug 31
Post-deploy, data is starting to repopulate https://grafana.wikimedia.org/d/000000046/parsoid-timing-html2wt
Prior to the switch, all serialization was done on the Parsoid cluster. Now each app server is doing that work, and those servers aren't instantiating statsd with the same prefix as before (MediaWiki vs MediaWiki.Parsoid).
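Concretely, the prefix is baked into every metric name statsd emits, so a dashboard querying names under the old prefix sees no data from the new reporters. A trivial sketch (metric_name is a made-up helper, not the statsd client API):

```python
def metric_name(prefix, key):
    # How a statsd prefix shapes the final metric name (illustrative).
    return f"{prefix}.{key}"

# The Parsoid cluster reported under one prefix...
old = metric_name("MediaWiki.Parsoid", "html2wt.total")
# ...while the app servers report under another, so queries against
# the old names come back empty.
new = metric_name("MediaWiki", "html2wt.total")
```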
Interesting, that patch fails because of the test added in,
https://github.com/wikimedia/mediawiki/commit/79cc21beaf34c82f368ef88125bcdd2567a8f389
Wed, Aug 30
I tried adding equivalent queries with both prefixes to the graphs, for continuity, but because not all the data goes to zero for all graphs, it was kind of a mess. Instead, I duplicated the dashboard with a -direct suffix,
https://grafana.wikimedia.org/d/000000046/parsoid-timing-html2wt?orgId=1&refresh=30s&from=now-6M&to=now
https://grafana.wikimedia.org/d/xxRCkIzIk/parsoid-timing-html2wt-direct?orgId=1&refresh=30s&from=now-6M&to=now
At first blush, it's unclear why switching VE away from RESTBase would only affect the html2wt metrics since the DirectParsoidClient calls Parsoid directly in both directions.
Tue, Aug 29
- Add [rel=mw:referencedBy] to the legacy output (@Arlolra believes there are some issues with cut-and-paste of RDFa-containing output from legacy pages to VE, and has a proposal on how to deal with it)
Aug 29 2023
Aug 28 2023
Aug 24 2023
If I'm reading this correctly,
The breakage is because of T134469, similar to T303705#9103106 and T275475#9094307
Aug 23 2023
Similar to the discussion at T294720#8670929, we need to show that tidy would have performed this migration in the presence of a bold tag, since we're trying to detect a rendering difference, introduced by the switch to remex, that could result in a visual regression.
Aug 22 2023
Also, note that T173943 is a related task about filtering counts by content namespaces.
A request can be made to the REST API to lint arbitrary wikitext,
> curl -X POST -H "Content-Type: application/json" -d '{"wikitext":"<div>test"}' https://www.mediawiki.org/api/rest_v1/transform/wikitext/to/lint
[{"type":"missing-end-tag","dsr":[0,9,5,0],"params":{"name":"div","inTable":false}}]
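For reference, the same request is easy to build from any client. The sketch below constructs the payload and parses the response shape shown above, without making an actual network call (the endpoint and fields are taken from the curl example; nothing else is assumed):

```python
import json

def build_lint_request(wikitext):
    # Body for POST .../transform/wikitext/to/lint
    # (sent with Content-Type: application/json)
    return json.dumps({"wikitext": wikitext})

# Response shape from the curl example above, parsed for the
# fields a consumer would typically inspect
sample = ('[{"type":"missing-end-tag","dsr":[0,9,5,0],'
          '"params":{"name":"div","inTable":false}}]')
lints = json.loads(sample)
first = lints[0]
```

Each lint entry carries a type, a DSR (source range) locating the problem in the wikitext, and type-specific params such as the offending tag name.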
I imagine this works because of fixes in T261181 but, in any case, it no longer seems to be an issue.