- Create mediawiki/libs/Minify repo.
- Import history from mediawiki/core. I'll probably use git-filter-branch, with one or two retro renames if needed. We previously did this for CSSJanus, OOjs, OOUI, and VE, which worked out well.
- Set up the project, see https://www.mediawiki.org/wiki/Manual:Developing_libraries
- Create empty GitHub repo.
- Fiddle with GitHub repo settings and description.
- Create Phabricator tag, with a Herald rule for its steward (Performance-Team).
- Add boilerplate for new repos (gitattributes, composer.json, Doxygen, etc.), and move the code under the Wikimedia\Minify\ namespace. (A composer.json sketch follows this list.)
- Confirm boilerplate commit inits the GitHub mirror.
- Enable php-composer jobs in CI, including Doxygen and code coverage.
- Enable Packagist hook on GitHub mirror. (Usually done automatically after submission, by the org-wide GitHub "app" for Packagist.)
- Add "wikimedia" user as package co-owner in Packagist.
- Add to integration/docroot.
- Add to mw:Maintainers.
- Publish 2.0.0 with the current ES5 support and the new namespace. (We may want to do a retro 1.0 at some point for the original ES3-compliant version.)
- Add to mediawiki/vendor and switch ResourceLoader over. Keep aliases within mediawiki/core for compat. These are considered internal, so we can probably remove them within a week or two (to be verified). We could also keep an alias for one release if really needed, no problem, but we already have a public entry point that is implementation-agnostic.
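For the boilerplate step above, a minimal composer.json might look roughly like this. This is a hedged sketch: the description, license, and PHP constraint are assumptions, and the authoritative file is the one in the repository.

# Sketch only; verify metadata against the actual published package.
cat <<'EOF' > composer.json
{
	"name": "wikimedia/minify",
	"description": "Minification of JavaScript code and CSS stylesheets",
	"license": "Apache-2.0",
	"require": {
		"php": ">=7.2"
	},
	"autoload": {
		"psr-4": {
			"Wikimedia\\Minify\\": "src/"
		}
	}
}
EOF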
| Status | Assignee | Task |
| --- | --- | --- |
| Open | None | T32956 Make ResourceLoader a standalone library |
| Resolved | Krinkle | T273247 Publish wikimedia/minify as its own repo and package |
I considered using git filter-branch, which I had used numerous times before for VisualEditor, OOjs, OOUI, and CSSJanus. However, the MediaWiki core repository is at least one or two orders of magnitude larger, and I had to abort several attempts that, after 10-20 minutes of filtering, were at less than 1% progress.
Instead, I stumbled upon git filter-repo, which is recommended in the upstream Git docs, and it is amazing. Not only is it blazingly fast, it is also far more intuitive to use for these kinds of purposes (where you mainly want to specify what to preserve, as opposed to what to remove). Plus, it has numerous cool features out of the box, such as moving release tags to the nearest older commit when removing non-matching commits.
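As an illustration, the invocation ends up along these lines. The exact --path list depends on where the minifier files lived in mediawiki/core over the years, so treat these paths as assumptions:

# Hedged sketch: keep only the minifier sources (paths assumed) and move them to src/.
git filter-repo \
  --path includes/libs/CSSMin.php \
  --path includes/libs/JavaScriptMinifier.php \
  --path-rename includes/libs/:src/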
- (If you haven't already, per mw.org) Import SVN-era git-notes in your MediaWiki checkout: git fetch origin refs/notes/commits:refs/notes/commits
- (If you haven't already, per mw.org) Import Gerrit git-notes in your MediaWiki checkout: git fetch origin refs/notes/review:refs/notes/review
- Create a mediawiki-libs-Minify directory locally, as a clone of MediaWiki at first. E.g. git clone ./mediawiki/.git mediawiki-libs-Minify. This is pretty fast when you point it at your existing MediaWiki core checkout instead of a Gerrit URL.
- Copy over notes: git fetch ../mediawiki refs/notes/commits:refs/notes/commits
- Restore git-notes references (per git-filter-repo#22):
git for-each-ref refs/replace/ --format='%(objectname) %(objecttype) %(refname:lstrip=2)' \
  | while read new type old; do
      if [ "$type" != "commit" ]; then
        exit 1
      fi
      git notes copy $old $new
      git notes remove $old
    done
- Ensure that further rebases and amends, which (temporarily) change SHA1 hashes, retain the git-notes associations:
git config notes.rewrite.rebase true
git config notes.rewrite.amend true
- Make any further clean-ups as you see fit. For example, I used this to squash some ancient edits to secondary files like COPYING into the first commit that I wanted to become the initial commit ("add resourceloader to trunk"), using git rebase --root -i --committer-date-is-author-date. Here you can leave out, re-order, etc. as you see fit. When you want to squash an old commit into a newer one, what I find easiest is to remember the newer commit's SHA1, then let the interactive rebase fix it away into the older one (don't re-arrange, as that may lead to pointless merge conflicts), and then amend it with the correct message, author, and date using git commit --amend -C NEWER_TARGETS_SHA1_HERE. See the sketch below.
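To make that concrete, here is a minimal sketch of one way that sequence can look, assuming the two commits end up adjacent in the rebase todo list; the SHA1 is a placeholder:

# Placeholder SHA1; note the newer commit's hash before starting.
K_NEWER=1234abcd
git rebase --root -i --committer-date-is-author-date
# In the todo list (without re-ordering lines), change the newer commit's
# "pick" to "fixup" so it melds into the older commit above it, and add a
# "break" line after it. When the rebase stops there, restore the newer
# commit's message, author, and date, then continue:
git commit --amend -C "$K_NEWER"
git rebase --continue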
- Copy SVN and Gerrit code review URLs from git-notes into the commit message for convenience.
# Add "SVN: <url>" footer to commit message (incl empty line since SVN commits didn't have a footer yet) # # It copies from the local git-notes data that we imported already and was preserved so far. # # Before: # (bug 25546) Feed argument # # After: # (bug 25546) Feed argument # # SVN: http://mediawiki.org/wiki/Special:Code/MediaWiki/74946 # K_GITNOTE="$(git notes show || true)"; test -z "$K_GITNOTE" || (K_OLDMSG="$(git log --format=%B -n 1)"; git commit --amend -m "$(printf "$K_OLDMSG\n\nSVN: $K_GITNOTE\n")"); # Copy "Reviewed-on: <url>" field from Gerrit git-notes into the commit message. # # We can't copy this from the local notes data. I tried but association got lost (I don't recall why), # but unlike for SVN commits, with Gerrit commits we can simply use # `GIT_DIR=… git log --grep="Change-Id: …"` to get this from our local MW checkout. # This is actually surprisingly fast, even for very old commits from a repo as large as MW # # Before: # Fix typos # # Bug: T201491 # Change-Id: I25a27d11faabe2f5fa02950c7a4fb58b13fb3662 # # After: # Fix typos # # Bug: T201491 # Change-Id: I25a27d11faabe2f5fa02950c7a4fb58b13fb3662 # Reviewed-on: https://gerrit.wikimedia.org/r/452630 # git rebase $FIRST_GERRIT_COMMIT_HERE -i --committer-date-is-author-date --exec 'K_GERRITID=$(git log -1 --format=%b | grep "Change-Id:" | head -n1 || true) && test -n "$K_GERRITID" && K_REVIEWLINE=$(GIT_DIR=/Users/krinkle/Development/mediawiki/.git git log --notes=review --grep="$K_GERRITID" -1 --format=%N | grep "Reviewed-on:" || true) && test -n "$K_REVIEWLINE" && K_OLDMSG=$(git log --format=%B -n 1) && git commit --amend -m "$(printf "$K_OLDMSG\n$K_REVIEWLINE\n")"'
- Clean up
git update-ref -d refs/notes/commits
git gc --prune=all --aggressive
- Add Gerrit remote and push (temporarily set Gerrit repo permissions to allow pushes, merge commits, forged committers, and forged authors).
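For completeness, the final push might look roughly like this; the remote URL and branch name follow the usual Gerrit conventions and are assumptions here:

# USERNAME and the branch name are placeholders; verify against the new repo.
git remote add gerrit ssh://USERNAME@gerrit.wikimedia.org:29418/mediawiki/libs/Minify
git push gerrit HEAD:refs/heads/master
git push gerrit --tags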