Sun, Sep 8
I formally object to Aklapper removing my comments as being "off topic". The fact that the WMF has spent 13 years not fixing this problem, and that we are coming up on 14 years, seems to be very much on topic. I don't see any link for accessing the edit history, so could someone please send me an email with my comments here and a record of when they were removed and by whom?
Mon, Sep 2
Re: "About potential backlash on disabling CAPTCHAs: Here stewards are arguing for better CAPTCHAs here because they have too many fake accounts they need to deal with", that's why I am not asking that CAPTCHAs be disabled. Instead I am asking that the WMF
- Assign an employee or contractor the task of fixing this problem.
- Budget some amount of money towards fixing this.
- Give us at least the start of a schedule with a rough estimate of how long it is expected to take to fix this.
- Write some sort of requirements -- however informal -- defining "done".
Re: discussing solutions on this page, we have been doing that FOR THIRTEEN YEARS and blind people are still unable to access Wikipedia.
I am having a bit of a quandary here.
Jul 8 2019
I just posted the following to [ https://meta.wikimedia.org/wiki/Talk:Wikimedia_Foundation_Board_of_Trustees ]:
Dec 13 2016
Re: "when you wrote "all our HTML is served with gzip compression", did you mean that everything uses gzip compression, or like Guy writes above, is it only pages above a certain size?":
Nov 29 2016
Sorry for not responding. So far I have not been able to get anyone at the WMF who has the ability to make this change to discuss the merits of doing this. A developer working on page weight could do a quick test in less than five minutes that would answer the gzip question using the actual Wikipedia environment. Instead, I am being asked by people who have zero ability to actually make the change to (imperfectly) duplicate Wikipedia and do my own tests, and of course if I do that I will then be told that I have not duplicated the Wikipedia environment correctly (and they would be correct). Plus, due to the overhead and latency of compression and decompression, websites typically only gzip files above a certain size threshold, so I would also have to figure out the minimum size at which Wikipedia stops compressing, estimate how many pages are below that (redirects are tiny), and in the end I still would have utterly failed to open up a discussion with an actual developer who has at least the potential of running a real-world test of my proposal.
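For what it's worth, the five-minute check I have in mind is roughly the following Python sketch: request one large article and one very small page with gzip accepted, and report whether the server actually compresses each response. The two URLs and the User-Agent string are placeholders I picked for illustration, not pages anyone here has named, and the output only shows what the servers return for these particular requests; it does not by itself reveal any size threshold in the real configuration.

```python
# Rough sketch of the quick gzip check described above.
# The URLs below are placeholder examples only.
import urllib.request

URLS = [
    "https://en.wikipedia.org/wiki/Earth",    # stands in for a large article
    "https://en.wikipedia.org/wiki/Example",  # stands in for a small page
]

for url in URLS:
    req = urllib.request.Request(url, headers={
        "Accept-Encoding": "gzip",
        "User-Agent": "gzip-threshold-check/0.1 (test script)",
    })
    with urllib.request.urlopen(req) as resp:
        body = resp.read()  # raw bytes as sent; urllib does not auto-decompress
        encoding = resp.headers.get("Content-Encoding", "none")
        print(url)
        print(f"  Content-Encoding: {encoding}, bytes on the wire: {len(body):,}")
```

If the small page comes back with "Content-Encoding: none" while the large article comes back gzipped, that would be consistent with a size threshold; a developer with access to the actual configuration could confirm the exact cutoff directly.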
Sep 3 2015
I am working on setting up a procedure (using Slackware or Windows 10) that will allow me to efficiently measure the result after gzip compression. See [ https://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29#Wikipedia.27s_HTTP_compression ]. More later when I have some actual numbers. --Guy Macon
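Pending those numbers, the shape of the measurement is something like the minimal Python sketch below: fetch a page's uncompressed HTML, then gzip it locally to estimate the size that would go over the wire. The URL is just an example, and the assumption that the standard library's gzip module at its default settings approximates what the servers do is mine; the production configuration may well use a different compression level.

```python
# Minimal sketch: estimate the gzipped size of one page's HTML.
# The URL is an example only; gzip.compress defaults to compresslevel=9,
# which may not match the servers' settings.
import gzip
import urllib.request

url = "https://en.wikipedia.org/wiki/Wikipedia"   # example page only
req = urllib.request.Request(url, headers={
    "Accept-Encoding": "identity",                # ask for uncompressed HTML
    "User-Agent": "gzip-size-measurement/0.1 (test script)",
})
with urllib.request.urlopen(req) as resp:
    html = resp.read()

compressed = gzip.compress(html)
print(f"uncompressed: {len(html):,} bytes")
print(f"gzipped:      {len(compressed):,} bytes "
      f"({100 * len(compressed) / len(html):.1f}% of the original)")
```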