
Retire the Not Wikilambda test wiki
Closed, Resolved · Public

Description

With today’s release of Beta Wikifunctions (or Wikifunctions Beta, apparently?), the usefulness of the Not Wikilambda test wiki has probably come to an end. We should figure out a plan for what to do with it, then implement that.

Event Timeline

In my opinion, keeping Not Wikilambda running as an actual MediaWiki install is not an option. If we keep upgrading the software, the functionality will just degrade further and further (you already get a server error today if you try to load a ZObject, apparently); meanwhile, running the old software forever presents a security risk that will turn into a proper vulnerability sooner or later.

We could try to take an HTML-level snapshot of whatever part of the wiki still exists (wget with mirroring options?), then serve that as a static website (lighttpd webservice type, probably, with PHP hopefully disabled in the config). But given the current state of ZObject pages (500 Internal Server Error), that wouldn’t preserve what was probably the most important part of the content.
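
For reference, the kind of mirroring invocation I have in mind would look roughly like this (a sketch only, assuming the wiki is still reachable at its usual toolforge.org address; the flags are just the standard ones for producing a browsable static copy):

wget --mirror --convert-links --adjust-extension --page-requisites \
    --no-parent --wait=1 \
    https://notwikilambda.toolforge.org/

That would give us a directory tree that lighttpd could serve directly from public_html/, with PHP out of the picture entirely.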

We could publish a redacted SQL dump (without the tables user, objectcache, and any WSOAuth tables – anything else?) on tools-static, and just have lighttpd serve a very simple index.html pointing to those SQL dumps; if users want to do anything useful with the content, they’ll have to deal with the dumps themselves.
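
Concretely, I'm imagining something along these lines (a sketch only; it assumes the usual database credentials for the tool, the output filename is made up, and the exact WSOAuth table names would still need to be looked up):

# denylist approach: dump everything except the sensitive tables
# (plus --ignore-table entries for whatever tables WSOAuth creates)
mysqldump \
    --ignore-table=s54524__mediawiki.user \
    --ignore-table=s54524__mediawiki.objectcache \
    s54524__mediawiki \
    > notwikilambda-redacted.sql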

Or perhaps we can combine the two? While the wiki is still live, edit the main page to point to the tools-static dumps and document the wiki’s status, add a shorter disclaimer to the footer if possible, and then do the wget mirror?

(I’ve already taken an SQL dump now, just to be safe.)

CCing @DVrandecic and @Jdforrester-WMF, since we talked about this during the recent developers-volunteers’ corner.

Also note that I’m on vacation and away from my PC for a while, so I might not actually get around to the “implement” part too soon :)

I think a redacted SQL dump is the way to go; I'd drop all the secondary tables entirely for simplicity (so essentially just the page and revision tables, and maybe the actor and comment ones if you want to be fancy).

Clearly I couldn’t be bothered to do any fancy HTML-level snapshotting so far, so let’s just do a partial database dump. Using an allowlist of tables as suggested by @Jdforrester-WMF seems simpler.

Mentioned in SAL (#wikimedia-cloud) [2022-10-30T10:04:23Z] <wm-bot> <lucaswerkmeister> stopped webservice, wiki is being retired (T314880)

Mentioned in SAL (#wikimedia-cloud) [2022-10-30T10:22:02Z] <wm-bot> <lucaswerkmeister> start webservice again (PHP 7.4 now) with new lighttpd config to rewrite everything to a very bare index.html explaining the shutdown (T314880); partial SQL dump available in public_html too
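
The custom config is nothing fancy; on Toolforge it goes into the tool’s $HOME/.lighttpd.conf, and the general shape is roughly the following (a sketch, not the literal config; the dump filename pattern is made up):

cat > ~/.lighttpd.conf <<'EOF'
# serve the explanatory page and the SQL dump as-is, rewrite everything else
server.modules += ( "mod_rewrite" )   # if not already loaded by the base config
url.rewrite-once = (
    "^(/index\.html|/.*\.sql.*)$" => "$1",
    "^/.*"                        => "/index.html"
)
EOF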

The partial SQL dump includes the following tables:

  • page
  • revision
  • text
  • actor
  • comment
  • slots
  • revision_comment_temp
  • logging
  • content
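
In other words, the dump boils down to a plain allowlist invocation, roughly like this (reconstructing it here; the output filename is made up):

mysqldump s54524__mediawiki \
    page revision text actor comment slots revision_comment_temp logging content \
    > public_html/notwikilambda-partial.sql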

For now, I’ve left the database and source code (including LocalSettings.php) in place (the source code moved from public_html/ to public_html-old/). Eventually, both should probably be deleted.

(There are no rows with rev_deleted != 0 or log_deleted != 0 in either table, by the way, so I think they should be okay to publish in full.)

Mentioned in SAL (#wikimedia-cloud) [2022-10-30T10:34:52Z] <wm-bot> <lucaswerkmeister> kubectl delete deployment function-evaluator function-orchestrator pygments-server (T314880)

Mentioned in SAL (#wikimedia-cloud) [2022-10-30T10:58:02Z] <wm-bot> <lucaswerkmeister> kubectl delete service function-evaluator function-orchestrator pygments-server # T314880

Mentioned in SAL (#wikimedia-cloud) [2022-11-01T22:03:38Z] <wm-bot> <lucaswerkmeister> deleted 1.76 GiB of npm cacache and composer cache (cc T314880)

Mentioned in SAL (#wikimedia-cloud) [2023-04-16T17:27:41Z] <wm-bot> <lucaswerkmeister> sql tools 'DROP DATABASE s54524__mediawiki' # T314880

It was supposedly taking up some 79 MB of space, btw:

MariaDB [(none)]> SELECT SUM(data_length + index_length) FROM information_schema.tables WHERE table_schema = 's54524__mediawiki';
+---------------------------------+
| SUM(data_length + index_length) |
+---------------------------------+
|                        79468868 |
+---------------------------------+

Though that might not be accurate, depending on how recently the tables had been optimized. A local xz-compressed backup takes up 3.3 MB.

LucasWerkmeister claimed this task.

I tried to reduce the disk usage of the git clones (turn them into shallow clones somehow) but couldn’t figure out how to do it. Let’s just call this done and move on.
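
(For the record, the recipe that’s usually suggested for shrinking an existing clone in place looks roughly like the following; whatever I tried along these lines didn’t pan out here, so take it as a note rather than a recommendation. The path is made up.)

cd public_html-old/mediawiki   # hypothetical path to one of the clones
git fetch --depth=1            # truncate the local history to one commit per branch
git reflog expire --expire=now --all
git gc --prune=now --aggressive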

Mentioned in SAL (#wikimedia-cloud) [2023-09-29T12:19:12Z] <wm-bot> <lucaswerkmeister> kubectl delete cronjob restart (T314880)