My consciousness is a jukebox
Fri, Dec 6
"Curate this page" was changed to "Open Page Curation". As far as I know, that link was never displayed on pages unless they were currently in the queue or had recently been in the queue.
Thu, Dec 5
Wed, Dec 4
Tue, Dec 3
The proxy is now set up on both Toolforge and VPS. I did the latter only because I thought Toolforge was the reason why large responses weren't working, but https://github.com/wikimedia/WhoWroteThat/pull/97 fixed that.
Merged. I'll repeat here the observation I made on the PR:
PR that fixes #1: https://github.com/wikimedia/WhoWroteThat/pull/105
Thanks for the thorough QA, Dom!
Mon, Dec 2
@Anomie Thanks for the very thorough research! It sounds like maybe this is too risky. My goal was merely to fix the list=blocks API. Maybe I could just rework it to accept all IP ranges, rather than changing User::isIP directly?
I don't know who fixed it, but it's fixed. Closing.
See T232093#5685396 regarding QA.
Sun, Dec 1
@Mrjohncummings Sorry for the late reply. The "Event summary" download should finish immediately, since that data is pre-stored. The others do have to run a bunch of queries. The "Pages Created" CSV downloaded for me in about 5 minutes. I don't think my fast internet connection explains that, because the file itself isn't that big; the slow part is the queries being run on the server. That said, I'm not sure why you're seeing the "0kb of unknown" issue :(
If you are on IRC you could use https://en.wikipedia.org/wiki/User:MusikBot/AbuseFilterIRC. If you want to subscribe to filters outside English Wikipedia and Commons, let me know and I can add whatever wiki you want. There are also live feeds of all activity for each wiki, so you could join the channel and set your client's "highlight words" to match the desired filter.
Yes, I'm going to submit a patch once I figure out what's gone wrong with my git review. Coming soon! :)
Wed, Nov 27
Tue, Nov 26
@Mrjohncummings Which report are you talking about? I tried all four, and the wikitext export worked. The pages created/improved reports did take a bit to load.
Unless WhoColor is rewritten to use Parsoid, I think these issues are bound to come up. For the short-term, we could try to detect them (perhaps with the linterrors API), and give some sort of indication that there is invalid wikitext and only partial results may be shown.
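A minimal sketch of that short-term detection idea, assuming the Linter extension's list=linterrors module mentioned above; the lnttitle filter parameter is an assumption and should be verified against the target wiki's api.php help:

```javascript
// Sketch (assumptions): builds a query URL for the Linter extension's
// list=linterrors module. The "lnttitle" parameter name is assumed and
// should be checked against the wiki's API documentation before use.
function lintErrorsUrl( apiBase, title ) {
	var params = new URLSearchParams( {
		action: 'query',
		list: 'linterrors',
		lnttitle: title,
		format: 'json',
		formatversion: '2'
	} );
	return apiBase + '?' + params.toString();
}
```

If the response lists any lint errors for the page, the WhoColor output could be flagged as possibly partial before rendering.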
There was an issue with very large responses (e.g. for [[Barack Obama]]). This was fixed by using cURL. PR at https://github.com/wikimedia/WhoWroteThat/pull/97
Mon, Nov 25
Per our discussion on GitHub, I am going to decline this since create2 doesn't include byemail. We can only link to one, so the full log seems like the best fit. Nonetheless, thanks for your work on this!
Sat, Nov 23
Fri, Nov 22
The patch won't merge due to an unrelated issue with ORES.
This will probably be very difficult, if not impossible, to QA, but we can try! See T232093#5624961 for the findings: it seems that if there's a page in the queue that's missing metadata (even if it's not the next page), the API returns a warning instead of success. So we'll need to somehow ensure a page somewhere in the queue is missing metadata. Why there are pages missing metadata is a separate problem, apparently.
Thu, Nov 21
I think this is the same as T171374: Make CodeMirror support IME functionality (see "ULS transliteration" section). I spent a lot of time trying to come up with a fix, to no avail.
Wed, Nov 20
Tue, Nov 19
Not a bug report.
Is the /api/rest_v1/feed/featured endpoint returning pre-stored data or is it doing the filtering in real-time? If the latter, I think a short-term fix is acceptable, where we don't necessarily change the definition of trending. Regardless, doing whatever you can to take out the fake items is probably still better than a feed that's missing a few genuinely trending articles.
I believe pageviews go off of the web request logs (basically after the page is viewed), so captchas may not help. I haven't checked yet, but it may also be that the pages listed at T232992 are getting distributed traffic that doesn't come from a single IP. We saw something similar for T158071.
Mon, Nov 18
Sun, Nov 17
Use mw.Api.plugin.options to save the preference, and mw.user.options.get to read it. This goes through https://en.wikipedia.org/w/api.php?action=help&modules=options. Note that for user scripts the option key must be prefixed with userjs-.
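A minimal sketch of how those pieces fit together for a user script; the setting name "fontSize" is hypothetical:

```javascript
// Sketch (assumption: a hypothetical user-script setting "fontSize").
// Options saved by user scripts must use the "userjs-" prefix, so a
// small helper keeps the prefixing in one place.
function userjsKey( name ) {
	return 'userjs-' + name;
}

// Read the current value (mw.user.options is preloaded on every page):
//   var size = mw.user.options.get( userjsKey( 'fontSize' ), '14' );
// Save a new value through the options API (mediawiki.api module):
//   new mw.Api().saveOption( userjsKey( 'fontSize' ), '16' );
```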
Yes. The API call should be fast enough, and if you find otherwise, consider caching it with mw.storage or something (but I don't think you need to). I would also use meta.wikimedia per convention but mediawiki.org works just as well, as it's part of the farm.
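If caching does turn out to be worthwhile, here is a sketch of the pattern, written against a generic get/set string store so it mirrors mw.storage's interface; the key, TTL, and fetch function are all caller-supplied assumptions:

```javascript
// Sketch (assumptions): "store" is any get/set string store shaped like
// mw.storage, "fetchFn" is a caller-supplied function returning a
// Promise. Fresh entries resolve from the cache; stale or missing
// entries fall through to a new fetch, whose result is stored.
function getCached( store, key, maxAgeMs, fetchFn ) {
	var raw = store.get( key );
	if ( raw ) {
		var entry = JSON.parse( raw );
		if ( Date.now() - entry.time < maxAgeMs ) {
			return Promise.resolve( entry.value );
		}
	}
	return fetchFn().then( function ( value ) {
		store.set( key, JSON.stringify( { time: Date.now(), value: value } ) );
		return value;
	} );
}
```

With mw.storage passed as the store, the first call hits the API and later calls within the TTL resolve from localStorage without a request.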
Sat, Nov 16
Hello @Charlotte and @Dbrant! Allow me to raise this issue once again. It seems the German Wikipedia is getting a lot of complaints through OTRS and other venues about the apparent abuse of the "trending" list (T232992).