
Provide statistics on the size of gadgets
Open, Needs Triage, Public, Feature

Description

See also T345960 and T340705.

If statistics and some guidance (possibly averages?) are provided this could help communities and administrators to make informed decisions.

Here's a rough idea of how this might be provided; a minimal sketch of the measurement loop follows the list below. Feel free to add to or improve this task description, but please keep true to the spirit of it: to inform, not to sanction.

  1. Create a loop over a list of all project (sub)domains, e.g. "en.wikipedia.org", "es.wikipedia.org", "www.wikidata.org", etc.
  2. Read MediaWiki:Gadgets-definition, for example https://commons.wikimedia.beta.wmflabs.org/wiki/MediaWiki:Gadgets-definition?action=raw
  3. Create two lists of gadget names: one with gadgets that are default, not action-specific, and require no rights, and a superset containing all gadget names.
  4. Make a request that loads all default gadgets, e.g. https://commons.wikimedia.beta.wmflabs.org/w/load.php?lang=en&modules=ext.gadget.betaCommons%2CUploadWizard%2CImageAnnotator%2CuploadWizardMobile%2CAnonLoader%2Cswitcher&skin=vector-2022
  5. Check length of response
  6. Check gzipped length of response
  7. Create a loop to load every gadget individually like this, check length+gzipped length
  8. Turn this information into two pie charts, one for all gadgets and one for the default gadgets, using HTML and possibly also a wikicode version (https://www.mediawiki.org/wiki/Extension:Cargo can create pie charts but doesn't seem to be installed on Wikimedia)
  9. Have a bot/script upload/post this on ToolForge or somewhere on-wiki.
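
For illustration only, here's a minimal Node.js sketch of the per-gadget measurement loop (roughly steps 2 and 5–7) for a single domain. It assumes Node.js 18+ (global fetch), the Gadgets-definition parsing is deliberately naive, and `measureGadgets` is just a made-up name:

```
// Minimal sketch of the per-gadget loop (steps 2 and 5-7) for a single wiki.
// Assumes Node.js 18+ (global fetch); the Gadgets-definition parsing is naive
// and only grabs the gadget name before the first "[" or "|".
const zlib = require('zlib');

async function measureGadgets(domain) {
  const defUrl = `https://${domain}/wiki/MediaWiki:Gadgets-definition?action=raw`;
  const definition = await (await fetch(defUrl)).text();
  const names = [...definition.matchAll(/^\*\s*([^[|\s]+)/gm)].map(m => m[1]);

  const results = [];
  for (const name of names) {
    const loadUrl = `https://${domain}/w/load.php?lang=en` +
      `&modules=ext.gadget.${encodeURIComponent(name)}&skin=vector-2022`;
    const body = Buffer.from(await (await fetch(loadUrl)).arrayBuffer());
    results.push({
      gadget: name,
      bytes: body.length,                       // step 5: plain length
      gzippedBytes: zlib.gzipSync(body).length, // step 6: gzipped length
    });
  }
  return results;
}

measureGadgets('commons.wikimedia.beta.wmflabs.org').then(console.table);
```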

Event Timeline

Aklapper changed the subtype of this task from "Task" to "Feature Request". Sep 12 2023, 1:05 PM

A table with all this data would be great too. I mean something like:

{|
|-
! gadget !! size [KiB] !! zipped [KiB]
|-
...
|}
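
Just as a sketch, a small helper could fill such a table from the measurements (the `results` shape with gadget/bytes/gzippedBytes fields is only an assumption, matching the loop sketched in the description):

```
// Hypothetical helper: format measurements as the wikitext table above.
// Assumes objects shaped like { gadget, bytes, gzippedBytes }.
function toWikitable(results) {
  const kib = bytes => (bytes / 1024).toFixed(1);
  const rows = results
    .map(r => `|-\n| ${r.gadget} || ${kib(r.bytes)} || ${kib(r.gzippedBytes)}`)
    .join('\n');
  return `{|\n|-\n! gadget !! size [KiB] !! zipped [KiB]\n${rows}\n|}`;
}
```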

Note that a sum of this might not be correct:

Create a loop to load every gadget individually like this, check length+gzipped length

I mean, two gadgets might load the same library, but I don't think that's so important. Calculating it as planned would be useful too :-) (unless you can check the minified/gzipped length of individual files without complicating things too much)
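
If per-file numbers turn out to be wanted, something along these lines might work (untested; it fetches each listed gadget file raw, so no minification is involved, and the MediaWiki:Gadget- page prefix is assumed from the usual naming convention):

```
// Untested sketch of the per-file variant: fetch each listed gadget file raw
// and gzip it locally. action=raw returns the unminified source.
const zlib = require('zlib');

async function fileSizes(domain, fileNames) {
  const out = [];
  for (const file of fileNames) {
    const url = `https://${domain}/w/index.php?title=MediaWiki:Gadget-` +
      `${encodeURIComponent(file)}&action=raw`;
    const body = Buffer.from(await (await fetch(url)).arrayBuffer());
    out.push({ file, bytes: body.length, gzippedBytes: zlib.gzipSync(body).length });
  }
  return out;
}
```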

I made a proof of concept: https://en.wikipedia.org/wiki/User:Alexis_Jazz/EverybodyLikesPie.js

Tested on itwikisource (as it's at the top of T340705): https://it.wikisource.org/wiki/Utente:Alexis_Jazz/Sandbox

The pie chart doesn't look quite right because I can't seem to copy https://en.wikipedia.org/wiki/Template:Legend/styles.css properly for https://it.wikisource.org/wiki/Template:Legend. That page title is disallowed because it's a subpage, and I seemingly can't create a page with the sanitized-css content model. If anyone can help with this, please do.

@AlexisJazz Seems I found a way, if that helps. They seem to have some rules for addresses/page titles:

	wgPageContentModel:"sanitized-css"
	wgRelevantPageName:"Template:Legend/Style.css"
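
If it's only the UI that refuses, maybe the content model can be set explicitly through the edit API. A rough, untested attempt (run from the browser console on it.wikisource.org; the CSS text and edit summary are placeholders):

```
// Untested: create the styles page with an explicit sanitized-css content
// model via the edit API (requires the right to create/edit that page).
new mw.Api().postWithEditToken({
  action: 'edit',
  title: 'Template:Legend/Style.css',
  text: '/* CSS copied from en.wikipedia.org Template:Legend/styles.css */',
  contentmodel: 'sanitized-css',
  summary: 'Create TemplateStyles page for Template:Legend',
}).then(console.log, console.error);
```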

Did you ever think of using Node.js?

You could keep some of your code, but run it in an automated manner. You could loop over many pages to gather data and save it locally (as files or in a DB), then deploy the result to Wikipedia with a simple script.

I've recently discovered this JS module for doing API requests to MediaWiki: the MWN JS library.

Saving things back to Wikipedia can be done either directly with MWN or, if you generate a file, with WikiployLite, which makes things a bit easier (you only provide configuration for it). With either of those methods you can use a local Jenkins installation that just runs your scripts at set times.

You can find an example of how to use MWN here:
https://github.com/Eccenux/wiki-wd-mass-modyfication/blob/b1f8e19c89c17eed1a68bbea255a2d5b41378b96/runBot.js

You'll find that WikiBotLite is an even thinner layer over MWN than WikiployLite is:
https://github.com/Eccenux/wiki-wd-mass-modyfication/blob/b1f8e19c89c17eed1a68bbea255a2d5b41378b96/src/WikiBotLite.js
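
For a rough idea, posting a generated report with MWN could look something like this (from memory, so double-check the option names against the mwn documentation; the account name, page title and user agent below are placeholders):

```
// Rough sketch of publishing a generated report with mwn.
// Credentials would normally be a BotPassword; everything here is a placeholder.
const { Mwn } = require('mwn');

async function publishReport(wikitext) {
  const bot = await Mwn.init({
    apiUrl: 'https://en.wikipedia.org/w/api.php',
    username: 'ExampleBot@gadget-stats',      // placeholder BotPassword name
    password: 'bot-password-goes-here',       // placeholder
    userAgent: 'gadget-size-stats/0.1',
  });
  await bot.save(
    'User:ExampleBot/Gadget size statistics', // placeholder target page
    wikitext,
    'Update gadget size statistics'
  );
}
```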

Thanks, I added looking into this to my to-do list. The PoC took maybe a few hours to write+debug so starting from scratch wouldn't be a massive loss either.