
Popups should not use github.com for documentation generation
Open, Needs Triage, Public

Description

For some reason I can't really figure out, running npm, it starts trying to download resources off of GitHub:

user@dev ~/g/m/c/e/Popups> cat .storybook/storybook-resources.sh 
#!/usr/bin/env bash
set -ex

curl https://raw.githubusercontent.com/wikimedia/mediawiki/master/resources/src/mediawiki.less/mediawiki.mixins.less -o .storybook/mocks/mediawiki.mixins.less
curl https://raw.githubusercontent.com/wikimedia/mediawiki/master/resources/src/mediawiki.less/mediawiki.mixins.animation.less -o .storybook/mocks/mediawiki.mixins.animation.less
curl https://raw.githubusercontent.com/wikimedia/mediawiki/master/resources/src/mediawiki.less/mediawiki.ui/variables.less -o .storybook/mocks/mediawiki.ui/variables.less
curl https://raw.githubusercontent.com/wikimedia/mediawiki/master/resources/src/mediawiki.ui/components/icons.less -o .storybook/mocks/mediawiki.ui/components/icons.less

Aside from issues like potentially not working across different branches, why can't this just use the MediaWiki clone that's already checked out? Depending upon an external, non-free service isn't really acceptable.
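A minimal sketch of what the description is asking for: copying the files from a local MediaWiki checkout instead of curling GitHub. The `MW_CORE_DIR` variable and the relative default path are assumptions about the extension layout, not anything the repo defines.

```shell
# Hypothetical replacement for the curl calls: copy the mixins from an
# existing MediaWiki core clone. MW_CORE_DIR is an assumed variable
# pointing at the core checkout (defaulting to the extension's grandparent).
copy_core_less() {
  local core="$1"
  mkdir -p .storybook/mocks
  cp "$core/resources/src/mediawiki.less/mediawiki.mixins.less" \
     .storybook/mocks/mediawiki.mixins.less
}
# copy_core_less "${MW_CORE_DIR:-../..}"
```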

Event Timeline

Legoktm created this task. Jan 21 2020, 7:58 AM
Restricted Application added a subscriber: Aklapper. Jan 21 2020, 7:58 AM
Niedzielski added a subscriber: Niedzielski.

I think we can use something like https://phabricator.wikimedia.org/source/mediawiki/browse/master/resources/src/mediawiki.less/mediawiki.mixins.less?view=raw (unless you know a better URL). Related discussion in Vector.

GitHub is only used for building documentation. We run documentation building as part of npm test as a safety guard. Access to local files wasn't possible in our release engineering pipeline, and we were advised to use curl to keep things simple. We basically use GitHub to curl about 2-5 CSS files.

We don't depend on it. This could be nonvoting if we wanted.

It doesn't have to be GitHub, but it needs to provide access to recent copies of raw files. Ideally we'd use Gerrit, but we couldn't find suitable URIs. I'm unsure whether Phabricator is the best place to access code, and there were concerns from Security about using the beta cluster.

Recommendations would be helpful.

Jdlrobson renamed this task from Popups tests rely on github to Popups should not use github.com for documentation generation. Jan 27 2020, 9:36 AM
Jdlrobson removed a project: Technical-Debt.

Not sure if Gerrit supports access to raw source on master, or am I just being dumb? https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseMediaInfo/+/master/CODE_OF_CONDUCT.md

If so I guess Phabricator is the best option?

Note: using Gerrit seems the logical choice, but it doesn't behave as I'd expect. A curl of the following: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/master/resources/src/mediawiki.less/mediawiki.mixins.animation.less?format=TEXT returns a base64-encoded string. Not sure if that's a bug or by design, but it means Gerrit is not feasible as-is.
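For what it's worth, Gitiles base64-encoding `?format=TEXT` responses appears to be by design rather than a bug, so the raw file can still be recovered by piping through `base64 --decode`. A sketch, using the URL from the comment above:

```shell
# Fetch a raw file from Gitiles and decode the base64 payload that
# ?format=TEXT returns (decoding step is the addition here).
fetch_gitiles() {
  curl -sf "$1" | base64 --decode
}
# fetch_gitiles 'https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/master/resources/src/mediawiki.less/mediawiki.mixins.animation.less?format=TEXT' \
#   > .storybook/mocks/mediawiki.mixins.animation.less
```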

Using Phabricator will work if we follow redirects, but it appears to be noticeably slower than GitHub, which will mean slower Jenkins runs.
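Following redirects just means adding `-L` to curl; `-f` additionally makes HTTP errors fail the script instead of silently saving an error page. A sketch, using the Phabricator `?view=raw` URL mentioned earlier in the thread:

```shell
# Download a raw file, following redirects (-L) and failing loudly on
# HTTP errors (-f) so a 404 page never ends up in the mocks directory.
fetch_raw() {
  curl -sfL "$1" -o "$2"
}
# fetch_raw 'https://phabricator.wikimedia.org/source/mediawiki/browse/master/resources/src/mediawiki.less/mediawiki.mixins.less?view=raw' \
#   .storybook/mocks/mediawiki.mixins.less
```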

I don't see how downloading resources over HTTP is maintainable long-term. When the repo is branched, it should be using the corresponding branch of MediaWiki, not continuing to use master. What if someone wants to rearrange the files in Git?

GitHub is only used for building documentation. We run documentation building as part of npm test as a safety guard. Access to local files wasn't possible in our release engineering pipeline, and we were advised to use curl to keep things simple. We basically use GitHub to curl about 2-5 CSS files.

Is the documentation built by Jenkins? If so, it should be straightforward to add in a MediaWiki core clone as the job runs.
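A sketch of the "add in a MediaWiki core clone" suggestion: a shallow clone keeps the fetch fast, and the files are then copied locally. The Gerrit clone URL is the canonical one; the destination path and the idea of copying every `.less` file are assumptions.

```shell
# Shallow-clone core and copy the needed LESS files into the mocks
# directory, so the doc build no longer fetches individual files over HTTP.
clone_core_less() {
  local url="$1" dest="$2"
  git clone -q --depth 1 "$url" "$dest"
  mkdir -p .storybook/mocks
  cp "$dest"/resources/src/mediawiki.less/*.less .storybook/mocks/
}
# clone_core_less https://gerrit.wikimedia.org/r/mediawiki/core /tmp/mw-core
```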

Task description

[…] why can't this just use the MediaWiki clone that's already checked out?

There isn't one. This is the standalone doc pipeline. It doesn't come with the overhead of Quibble and MediaWiki.

Task description

Depending upon an external, non-free service isn't really acceptable. […]

I would normally agree, but the service being used here doesn't seem significant to me. It is fetching one of our own static assets from a simple URL. Not very different from the millions of files we already fetch from npmjs.com and Packagist (including for code we wrote), or e.g. curl'ing code.jquery.com as the canonical source for something. All of these likely also involve non-free intermediaries while handling that request. That's not to say it doesn't matter. It does matter. But I think the time invested here isn't going to pay off, and there isn't any form of run-time dependency or logical lock-in involved.

[…] issues like potentially not working across different branches,

Yeah, it should probably use a permalink instead like /blob/hash123/path/file.ext but that does add the maintenance cost of having to update it from time to time. This is a trade-off by the team though, not something I think we should interfere with. Note that this all started as curl https://en.wikipedia.org/w/load.php?… which, looking back, seems simpler for everyone and could be restored as-is as far as I'm concerned. This isn't run-time code, it is build-time code. It only needs to work at the time of a merge when the doc is built. It is then stored indefinitely on doc1001. We provide no promise of the git repo being able to reproduce the build from a commit years later (Node.js versions change, Docker images change, and third-party URIs may change at any time). And if something does break during a build, I'm sure the team will find and fix that in a future commit at which point the doc export is replaced again. So again, no real impact.
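The permalink idea above can be sketched as a small URL builder: substitute a pinned commit SHA for master in the raw.githubusercontent.com path, and update the SHA from time to time. The helper name and the placeholder SHA are hypothetical.

```shell
# Build a raw-file URL pinned to a specific MediaWiki commit instead of
# master, so branch reorganizations can't silently change what is fetched.
pinned_url() {
  local sha="$1" path="$2"
  printf 'https://raw.githubusercontent.com/wikimedia/mediawiki/%s/%s\n' \
    "$sha" "$path"
}
# pinned_url '<commit-sha>' 'resources/src/mediawiki.less/mediawiki.mixins.less'
```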

Is the documentation built by Jenkins? If so, it should be straightforward to add in a MediaWiki core clone as the job runs.

One does not simply clone MediaWiki core; doing so would make the script unusable/untestable during local development...

See T213223#5836798 which obsoletes this issue.

FWIW, hitting GitHub seems no different from hitting npm, like you say. My initial thought was to use Gerrit precisely out of concern for a task like this emerging, but I couldn't find suitable URIs. I am happy to use a different URI if it is more in line with our values, but right now Phabricator seems to be the only option. Is that preferable, given Phabricator seems a lot less reliable than GitHub? When either service goes down, no documentation gets built and new patches will not be mergeable.

The assets we're talking about in this particular case are LESS variable and mixin files that are imported, not exposed on load.php URIs, so hitting load.php is not an option. An alternative would be to publish those on npm and maintain them there, but I don't see that happening any time soon.

Krinkle removed a subscriber: Krinkle.Apr 17 2020, 7:10 PM