
Make getContent() work for interwiki pages
Open, Needs Triage, Public

Description

Hi! I'm actively working on Module:Excerpt, which allows diverse content to be reused within Wikipedia. It's very useful and gaining popularity. However, it has the limitation of only working with pages on the same wiki. Would it be possible to make it work with pages from other wikis as well? If only getContent() retrieved the wikitext from other wikis, the rest could be taken care of from within the module and the sky would be the limit!
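For context, here is a minimal sketch of what the module can do today versus what this request would enable. The interwiki-prefixed call is an illustration of the desired behavior, not something Scribunto currently supports, and the page titles are hypothetical examples:

```lua
-- Works today: fetch wikitext from a page on the same wiki.
local localTitle = mw.title.new('Artificial intelligence')
local localText = localTitle and localTitle:getContent() -- string, or nil if the page is missing

-- Desired (hypothetical): fetch wikitext across wikis via an interwiki prefix.
-- Today a title like this resolves as external, and no content is available.
local remoteTitle = mw.title.new('de:Künstliche Intelligenz')
local remoteText = remoteTitle and remoteTitle:getContent()
```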

Event Timeline

Restricted Application added subscribers: Liuxinyu970226, Aklapper. · View Herald Transcript · May 31 2020, 1:35 PM
Pppery added a subscriber: Pppery. · May 31 2020, 4:24 PM
Uzume moved this task from Backlog to Transclusions on the Crosswiki board. (Edited) · Jun 21 2020, 3:23 PM
Uzume added a subscriber: Uzume.

I highly doubt this sort of functionality will arrive anytime soon. The main issue is that if a Scribunto module produces different output based on input from remote wikis, how does MediaWiki track the links and maintain the page rendering caches, so that cached output gets properly updated when a dependency changes? To accomplish this sort of dependency tracking, the link tables would have to somehow be expanded to support cross-wiki linking, so that things like [[Special:WhatLinksHere]] could list remote page transclusions, etc. (You may have read that getContent() causes the accessed page to be recorded as a transclusion; this dependency tracking is why.)

Short of Wikidata, as far as I know, the only currently supported remote data is centralized at Commons via Extension:JsonConfig and the tabular mw.ext.data.get() interface. Incidentally, Wikidata and JsonConfig are both JSON-based. There is also Commons itself for media file pages, but I doubt that is flexible enough to do what you are interested in.
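For reference, a minimal sketch of the tabular interface mentioned above, assuming a hypothetical Data:Example.tab page exists on Commons:

```lua
-- Extension:JsonConfig lets any wiki read Data: pages stored centrally on Commons.
-- 'Example.tab' is a hypothetical page name used purely for illustration.
local tab = mw.ext.data.get('Example.tab')

-- tab.schema.fields describes the columns; tab.data is an array of rows.
for _, row in ipairs(tab.data) do
    mw.log(row[1], row[2])
end
```

This works because the data lives at a single, known wiki (Commons), which sidesteps the general cross-wiki dependency-tracking problem described above.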