https://gerrit.wikimedia.org/r/570387 (a.k.a. e4d02b5e9a896546d63cd800ce6b0d77ae8ea190) seems to have fixed this, as checked with a local override that has that patch reverted. If so, the task may belong in WikiEditor instead.
May 21 2020
Recompiling the Sass should not be too hard. Hosting it as part of the tool itself would mean losing the benefits of cdnjs as far as the theme is concerned, though. Or I might do without the theme entirely. Anyway, I'm not marking this as up for grabs, as it does not make sense for anyone else to work on the Wikidata Slicer in its current state.
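For reference, the recompile could look roughly like the sketch below. It assumes the libsass Python binding and a hypothetical theme/slicer.scss entry point; neither reflects the tool's actual layout.

```
# Minimal sketch of recompiling the theme locally, assuming the libsass
# Python binding (pip install libsass). The paths here are hypothetical.
import sass

css = sass.compile(
    filename="theme/slicer.scss",   # hypothetical Sass entry point for the theme
    output_style="compressed",      # minified output, roughly what a CDN build serves
)

with open("static/slicer.min.css", "w", encoding="utf-8") as out:
    out.write(css)
```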
Not discouraging anyone from picking the patch up or starting a new one
Not discouraging anyone from picking the patch up or starting a new one
Not going to work on this in the near future
Not going to work on this in the near future
Not going to work on this in the near future
Not discouraging anyone from picking the patch up or starting a new one
Babel AutoCreate is still blocked on many wikis. This task likely needs movement-wide clarification/reevaluation.
Dec 21 2019
Glad that the old files are still around and that the change has been documented, so browser version data can be gathered again. Hopefully, moving the logs to my local machine once every few years will help free up some disk space while complying with the privacy terms.
Oct 4 2019
I seem to recall "How we made editing Wikipedia twice as fast" having a big impact on the public perception of HHVM. Is a new post in the works? Someone from the broader community is likely interested in the what/why/how of the switch back to Zend.
Sep 2 2019
The GitHub PR by GlazerMann (https://github.com/wikimedia/mediawiki-oauthclient-php/pull/7) predates this task, but the repo seems to be using Gerrit.
Jan 9 2019
I still get dropped connections (not 414) with much longer URLs (total header size from 9442 bytes onward). Is that expected?
Jan 7 2019
@Vgutierrez my bad, I meant retrying with a shorter URL.
Thank you all for investigating. What would you recommend as the best practice w.r.t. URL length in client code? E.g. POST requests, hard-coded limits, retrying on 414-coded responses...
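For illustration, two of those options combined might look like the sketch below: a hard-coded length limit that falls back to POST, plus a single retry on a 414 response. It assumes the requests library and a made-up 8000-byte threshold, not any documented Wikimedia limit.

```
# Minimal sketch: switch to POST past a hard-coded URL length, and retry once
# as POST on a 414 response. The 8000-byte threshold is illustrative only.
import requests

API = "https://en.wikipedia.org/w/api.php"
MAX_URL_BYTES = 8000  # hypothetical client-side limit

def api_get(params):
    url = requests.Request("GET", API, params=params).prepare().url
    if len(url.encode("utf-8")) > MAX_URL_BYTES:
        # Long query: send the same parameters in the request body instead.
        return requests.post(API, data=params)
    resp = requests.get(API, params=params)
    if resp.status_code == 414:
        # URI too long after all: retry once as POST.
        resp = requests.post(API, data=params)
    return resp

resp = api_get({"action": "query", "titles": "Foo|Bar", "format": "json"})
print(resp.status_code)
```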
Nov 20 2018
Firefox 63, Chromium 70, and curl 7.58 all work fine with HTTP/1.1 but fail with HTTP/2 on Wikimedia servers. Other HTTP/2 servers seem not to be affected.
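For what it's worth, a comparison without a browser could look like the sketch below; it assumes httpx with its HTTP/2 extra installed and uses a placeholder long query, not the exact request that triggered the failure.

```
# Minimal sketch comparing the same long-URL request over HTTP/1.1 and HTTP/2.
# Assumes httpx with the h2 extra (pip install "httpx[http2]"); the query below
# is a placeholder, not the URL from this report.
import httpx

url = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "titles": "|".join(f"Page {i}" for i in range(400)),  # deliberately long query string
    "format": "json",
}

for http2 in (False, True):
    label = "HTTP/2" if http2 else "HTTP/1.1"
    try:
        with httpx.Client(http2=http2, timeout=30) as client:
            resp = client.get(url, params=params)
        print(f"{label}: {resp.http_version} -> {resp.status_code}")
    except httpx.HTTPError as exc:
        print(f"{label}: failed with {exc!r}")
```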
Nov 19 2018
In T209590#4753835, @Anomie wrote: "When I try making a variant of that request (spaces replaced with %20) using curl, I get back the proper API response."
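For the record, that %20 variant can also be built programmatically; a minimal sketch with the Python standard library (the title is a placeholder, not the one from the original request):

```
# Percent-encode spaces (and other reserved bytes) in query values;
# quote_via=quote yields %20 rather than the default +.
from urllib.parse import urlencode, quote

params = {"action": "query", "titles": "Main Page", "format": "json"}  # placeholder title
query = urlencode(params, quote_via=quote)
print("https://en.wikipedia.org/w/api.php?" + query)
# -> ...titles=Main%20Page... instead of a raw space
```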
Jul 14 2018
Thank you.