Sun, Jan 14
Fri, Jan 12
What I'm thinking is that I'll spin up a medium or large Cloud VPS instance with local RESTBase and MCS installations and benchmark a test sample of URLs before and after the change using ab and GNU parallel, along the lines described here: https://www.simonholywell.com/post/2015/06/parallel-benchmark-many-urls-with-apachebench/
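As a sketch of that plan, something like the following could generate one ApacheBench invocation per sample URL for GNU parallel to run concurrently. The URLs, request counts, and concurrency below are placeholders, not the actual test settings:

```python
# Hypothetical benchmark driver: emit one ab command per URL, suitable
# for piping to GNU parallel, e.g.  python gen.py | parallel -j 4
import shlex

def ab_commands(urls, requests=100, concurrency=10):
    """Yield an ApacheBench command line for each URL.

    -g writes gnuplot-style timing data per URL, which makes it easy to
    compare the before/after runs later.
    """
    for i, url in enumerate(urls):
        yield f"ab -n {requests} -c {concurrency} -g run{i}.tsv {shlex.quote(url)}"

# Placeholder sample; a real run would read a larger URL list from a file.
urls = [
    "http://localhost:7231/en.wikipedia.org/v1/page/summary/Cat",
    "http://localhost:7231/en.wikipedia.org/v1/page/summary/Dog",
]
for cmd in ab_commands(urls):
    print(cmd)
```

Running the same generated command file against the instance before and after the change should give comparable per-URL timing data.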
Thu, Jan 11
Wed, Jan 10
Tue, Jan 9
Confirmed that the zeroconfig API returns an empty object for requests in configured but disabled IP ranges. However, responses in these cases will include an x-carrier header with the carrier's MCC-MNC code.
I'll test this again to be 100% sure, but if memory serves (and as the source suggests), an empty JSON object is unfortunately also returned for a carrier config that exists but is disabled. The fields are only populated if the config both exists and is enabled.
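In other words, the behavior I'm describing looks roughly like this (a minimal model, with made-up carrier data; the real lookup lives server-side in the ZeroPortal extension):

```python
# Illustrative model only: the payload is populated only when a config
# exists AND is enabled; missing and disabled configs both yield {}.
CONFIGS = {
    "250-99": {"enabled": True,  "name": "Example Carrier A"},  # hypothetical
    "297-03": {"enabled": False, "name": "Example Carrier B"},  # hypothetical
}

def zeroconfig_response(mcc_mnc):
    config = CONFIGS.get(mcc_mnc)
    if config is None or not config["enabled"]:
        return {}  # indistinguishable from "no config at all"
    return {"name": config["name"]}
```

Which is exactly why an empty response alone can't tell you whether a config is absent or merely disabled.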
@Zoranzoki21, using Orion, would you mind visiting https://en.wikipedia.org/w/api.php?action=zeroconfig&type=config and pasting the result here?
You're correct that Orion Telekom is not and appears never to have been a Zero partner.
As for T173537#3886490, I believe that configured carrier IPs are consolidated for Varnish in https://github.com/wikimedia/mediawiki-extensions-ZeroPortal/blob/master/includes/ApiZeroPortal.php and that the logic there could be updated to exclude disabled carriers. (+@dr0ptp4kt for sanity checking).
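The change I have in mind would look something like the sketch below. The data shape is invented for illustration; the real consolidation logic is PHP in ApiZeroPortal.php:

```python
# Hedged sketch of the proposed fix: when consolidating carrier IP
# ranges for Varnish, skip carriers whose config is disabled.
# The carrier records here are hypothetical.
carriers = [
    {"id": "250-99", "enabled": True,  "ranges": ["10.0.0.0/24"]},
    {"id": "297-03", "enabled": False, "ranges": ["10.0.1.0/24"]},
]

def consolidated_ranges(carriers):
    ranges = []
    for carrier in carriers:
        if not carrier["enabled"]:
            continue  # proposed change: exclude disabled carriers
        ranges.extend(carrier["ranges"])
    return ranges
```

With that filter in place, Varnish would never tag requests from disabled carriers' ranges in the first place.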
Unfortunately, I believe this is expected behavior. Separately from the specific IPs in phabbanlist.conf, all users in Zero IP ranges are currently blocked from accessing uploaded files in Phabricator via https://gerrit.wikimedia.org/r/#/c/363264 (see T168142 for background on the abuse that led to this block).
Telenor Serbia is an active Zero partner; unfortunately for that user, the patch is working as intended in that case.
I was able to reproduce this, and I think @wassan.anmol117 is correct that this is caused by https://gerrit.wikimedia.org/r/#/c/363264/. Varnish internally marks up requests in Zero IP ranges with X-CS headers, and the patch blocks requests carrying this header from requesting uploaded files from Phabricator. The problem (if I'm correct) is that the X-CS markup is applied without regard to whether the carrier is a currently active Zero participant. (It's up to clients to determine whether the carrier referred to in a Zero header is active before displaying Zero chrome.)
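To make the gap concrete, here's a sketch of the blocking rule as I understand it. The X-CS header name is real; everything else (the active-carrier set, the check-active flag) is illustrative, not the actual Varnish/Phabricator code:

```python
# Illustrative only. Current behavior: any request carrying X-CS is
# blocked from fetching uploaded files. A fix would also consult
# whether that carrier is currently an active Zero participant.
ACTIVE_CARRIERS = {"250-99"}  # hypothetical set of active partners

def blocks_file_request(headers, check_active=False):
    carrier = headers.get("X-CS")
    if carrier is None:
        return False  # not a Zero-range request; allow
    if check_active and carrier not in ACTIVE_CARRIERS:
        return False  # proposed: don't block inactive carriers' ranges
    return True       # current behavior: block on X-CS alone
```

Under current behavior a user in a long-inactive carrier's IP range is still blocked, which matches what was reported.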
Something's definitely awry. Orange Cameroon has not been an active Zero partner since March 2016. I'm looking into it.
Also, there's no concept of a primary definition in any Wiktionary as far as I'm aware, so that would have to be defined in a product spec.
I wouldn't hold the existing Wiktionary definition endpoint up as an example to emulate. Given the lack of consistency in Wiktionary page structure and content, IMO the only sane way forward on Wiktionary integration is to come up with a set of markup reflecting the elements we're interested in (definitions, examples, etc.) and then work with Wiktionary communities to incorporate that markup into pages, so that clients can reliably query for what they're interested in. There's a task for this: T138709
Mon, Jan 8
Thu, Jan 4
Thu, Dec 21
Dec 14 2017
Dec 13 2017
Per discussion in weekly RI meeting, no need for special handling of these since (unlike MediaViewer) we're not altering link behavior.
@ssastry, https://fr.wikipedia.org/api/rest_v1/page/html/Mod%C3%A8le:Donn%C3%A9es%2F6117%2F%C3%A9volution_population is an illustrative example of a page that caused an error yesterday. Relevant to your comment: the first section identified contains only a <p> and a <span> as children, with no headings. The second section contains headings, but only in child sections.
I'm also skeptical that throwing an error is the right way of going about things in the situations we do care about.
A quick and probably mostly effective fix would be to exclude sections with IDs of -1 and -2 from this checking. I'm trying to think through whether this is correct behavior in light of what those IDs represent.
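The quick fix would amount to a guard like the one below. The section shape and the meaning of IDs -1 and -2 (synthetic lead/auxiliary sections with no heading to validate) are my assumptions here, not the actual service code:

```python
# Quick-fix sketch: skip the heading check for synthetic sections.
# IDs -1 and -2 are assumed to denote generated sections that
# legitimately have no heading of their own.
SYNTHETIC_IDS = {-1, -2}

def sections_to_check(sections):
    """Return only the sections that should undergo heading validation."""
    return [s for s in sections if s.get("id") not in SYNTHETIC_IDS]
```

Whether this is actually correct depends on what those IDs represent, which is what I'm still thinking through.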
Actually, there seems to be a bigger problem here. /page/html is returning only sections of the desired page.