compress ("minify"/obfuscate) javascript output
Closed, Declined · Public

Description

Author: alistrius

Description:
Hello,

I'm currently working on local JavaScript pages, trying to make them like a "formal garden": http://en.wikipedia.org/wiki/Formal_garden

So I remove useless bits of code, optimise them, and so on, with the aim of making them as light as possible and therefore less heavy on the servers (I think every single byte counts in the common.js and monobook.css of big projects).

But I also write documentation for the functions I optimise... so the weight I lose through code optimisation I gain back through the comments.

So I wonder whether it would be possible for the software to strip the comments from the stylesheets and JavaScript files before sending them to clients.

The full script pages should of course remain intact when requested directly.

I think this feature would save Wikimedia a huge amount of money.

Kind regards
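
To make the proposal concrete, here is a minimal sketch of the kind of comment stripping being requested, assuming a hypothetical pre-delivery step rather than any existing MediaWiki hook; direct requests for the raw page would still return the full, commented source. The sketch preserves string literals but does not handle regex literals, which is one reason production minifiers need a real tokenizer.

```python
# Minimal sketch of stripping // and /* */ comments from JavaScript.
# strip_js_comments is a hypothetical helper, not an existing MediaWiki function.
# String literals are copied verbatim; regex literals are NOT handled.
def strip_js_comments(src: str) -> str:
    out = []
    i, n = 0, len(src)
    while i < n:
        ch = src[i]
        if ch in ('"', "'"):
            # Copy a string literal verbatim, honouring backslash escapes.
            quote = ch
            out.append(ch)
            i += 1
            while i < n:
                out.append(src[i])
                if src[i] == '\\' and i + 1 < n:
                    out.append(src[i + 1])
                    i += 2
                    continue
                if src[i] == quote:
                    i += 1
                    break
                i += 1
        elif ch == '/' and src[i + 1:i + 2] == '/':
            # Line comment: skip to the end of the line (the newline is kept).
            while i < n and src[i] != '\n':
                i += 1
        elif ch == '/' and src[i + 1:i + 2] == '*':
            # Block comment: skip past the closing */.
            end = src.find('*/', i + 2)
            i = n if end == -1 else end + 2
        else:
            out.append(ch)
            i += 1
    return ''.join(out)


if __name__ == '__main__':
    # Both comments are removed; the "//" inside the string literal survives.
    sample = "/* banner */\nvar url = 'http://example.org'; // trailing note\n"
    print(strip_js_comments(sample))
```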


Version: unspecified
Severity: enhancement

bzimport added a project: MediaWiki-Parser. Via Conduit · Nov 21 2014, 9:57 PM
bzimport added a subscriber: Unknown Object (MLST).
bzimport set Reference to bz12250.
bzimport created this task. Via Legacy · Dec 8 2007, 11:43 PM
bzimport added a comment. Via Conduit · Dec 9 2007, 3:38 PM

alistrius wrote:

And besides the money consideration, it would be a nice feature.

tstarling added a comment. Via Conduit · Dec 9 2007, 3:49 PM

I'm against this. In my experience, the savings from JavaScript compressors post-gzip are so small that you could only justify it on the grounds of obfuscation. Obfuscation is not what we're about. These files are cached on the client side, so they are typically sent to a user only once per visit.

If you want something to optimise, try the 51KB of edit tools we send out every time someone clicks a red link (over 100KB on Commons).
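
One way to check the post-gzip claim above is to compare the gzipped sizes of a script with and without its comments, since gzip-compressed output is what clients actually receive. A rough sketch, assuming local copies of the files (the file names below are placeholders):

```python
import gzip

def gzipped_size(data: bytes) -> int:
    """Size after gzip compression, i.e. roughly what goes over the wire."""
    return len(gzip.compress(data, compresslevel=6))

# Placeholder file names: the original script and a comment-stripped copy of it.
with open('wikibits.js', 'rb') as f:
    original = f.read()
with open('wikibits.stripped.js', 'rb') as f:
    stripped = f.read()

print('raw saving :', len(original) - len(stripped), 'bytes')
print('gzip saving:', gzipped_size(original) - gzipped_size(stripped), 'bytes')
```

If the observation above holds, the gzip saving will be a small fraction of the raw saving.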

bzimport added a comment. Via Conduit · Dec 11 2007, 5:52 PM

alexsm333 wrote:

I remember at least three proposals on en.wp to replace Edittools with JavaScript. There was no opposition, but none of the admins was brave enough to actually implement it (sorry for going off-topic).

bzimport added a comment. Via Conduit · Dec 18 2007, 6:30 PM

ayg wrote:

I agree with Tim. We already compress it, using gzip. The benefits of any further compression are outweighed by the annoyance of anyone who actually wants to read the stuff.

cneubauer added a comment. Via Conduit · Mar 25 2008, 5:01 PM

According to YSlow (http://developer.yahoo.com/yslow/), a number of JS files from Commons are not gzipped. Try editing a page on Commons in Firefox and look at the response headers of wikibits.js, for example. YSlow also complains that the Commons logo doesn't have an Expires header. There's probably a good reason for that that's beyond me.
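
The header check YSlow performs can be reproduced in a few lines. A sketch, assuming an illustrative URL (the exact script path below is a placeholder): request the file with an Accept-Encoding header and inspect the encoding and caching headers in the reply.

```python
import urllib.request

# Placeholder URL: substitute the actual script or logo URL being inspected.
url = 'https://commons.wikimedia.org/w/skins/common/wikibits.js'

req = urllib.request.Request(url, headers={'Accept-Encoding': 'gzip'})
with urllib.request.urlopen(req) as resp:
    # A gzipped response carries Content-Encoding: gzip; a cacheable one
    # carries an Expires and/or Cache-Control header.
    print('Content-Encoding:', resp.headers.get('Content-Encoding', '(none)'))
    print('Expires:         ', resp.headers.get('Expires', '(none)'))
    print('Cache-Control:   ', resp.headers.get('Cache-Control', '(none)'))
```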

brion added a comment. Via Conduit · Mar 25 2008, 5:14 PM

Raw files are not currently compressed because they are served directly by the web server, which we have not yet configured to compress them (though we plan to do so).

Any JS served directly out of the wiki gets compressed.

This feature request though was about *obfuscation* for the purposes of making the files smaller; that's something we'll reject regardless of the use or non-use of gzip compression, as it makes debugging and customization much more difficult.

I've updated the summary to clarify.
