The following request always returns an HTTP 429 error response:
$ curl -I "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dc/Yeongnamroad2.png/220px-Yeongnamroad2.png"
HTTP/2 429
date: Mon, 28 Mar 2022 08:34:43 GMT
server: Varnish
x-cache: cp3065 int
x-cache-status: int-front
server-timing: cache;desc="int-front", host;desc="cp3065"
strict-transport-security: max-age=106384710; includeSubDomains; preload
report-to: { "group": "wm_nel", "max_age": 86400, "endpoints": [{ "url": "https://intake-logging.wikimedia.org/v1/events?stream=w3c.reportingapi.network_error&schema_uri=/w3c/reportingapi/network_error/1.0.0" }] }
nel: { "report_to": "wm_nel", "max_age": 86400, "failure_fraction": 0.05, "success_fraction": 0.0}
permissions-policy: interest-cohort=()
set-cookie: WMF-Last-Access=28-Mar-2022;Path=/;HttpOnly;secure;Expires=Fri, 29 Apr 2022 00:00:00 GMT
content-type: text/html; charset=utf-8
content-length: 1843
This happens whatever system I launch the HTTP request from. This HTTP error code should be returned only when too many requests are sent from the requesting system, but here it seems to appear in place of an HTTP 5XX error.
This is a really serious problem for the openZIM/Kiwix project, because the scraper MWoffliner slows down each time it gets this error code, to accommodate the backend. But if this kind of error always comes back, it turns a normal scrape into an endless experience!
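To illustrate why a persistent 429 is so damaging: a scraper that backs off exponentially on every 429 sees its per-request delay grow without bound unless retries are capped. Here is a minimal, hypothetical sketch (not MWoffliner's actual code; the function name and parameters are made up for illustration):

```python
def backoff_delays(base=1.0, factor=2.0, max_retries=5):
    """Yield the wait time (in seconds) before each retry of a 429'd request.

    The delay doubles on every retry. Capping the number of retries is what
    keeps a *permanently* rate-limited URL from stalling a scrape forever:
    after max_retries attempts the scraper should give up on that URL.
    """
    for attempt in range(max_retries):
        yield base * (factor ** attempt)

# With the defaults, the scraper waits 1 + 2 + 4 + 8 + 16 = 31 seconds
# in total before giving up on a single URL.
print(list(backoff_delays()))  # [1.0, 2.0, 4.0, 8.0, 16.0]
```

Without such a cap, a backend that answers 429 unconditionally (as observed above) makes every single asset request slower than the previous one, which is exactly the "endless scrape" behavior described here.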