
API doesn't seem to test for the URL's domain
Closed, Resolved · Public

Description

Hello, I've set up a script that uses the API to shorten URLs right from a spreadsheet, and I was surprised to find that $wgUrlShortenerAllowedDomains is not checked there: I could shorten just about anything, as long as it was a valid URL.

For what it's worth, here is the script (to be used from within a Google Spreadsheet):

/**
 * Returns a shortened URL for a wiki URL.
 */
function shortenURL(input) {
  // See https://developers.google.com/apps-script/reference/url-fetch/url-fetch-app#fetch(String,Object)
  var options = {
    'method' : 'post',
    'contentType': 'application/json'
  };
  var response = UrlFetchApp.fetch('https://mywiki.fr/api.php?action=shortenurl&format=json&url=' + encodeURIComponent(input), options);

  var json = response.getContentText();
  var data = JSON.parse(json);

  // Success: {"shortenurl":{"shorturl":"3perf.fr/r/7"}}
  // Failure: {"error":{"info":"Not a valid URL"}}
  Logger.log(data);

  // Check for an error *before* reading the short URL; otherwise the
  // lookup below throws a TypeError instead of the API's error message.
  if (data["error"])
    throw new Error(data["error"]["info"]);

  return "https://" + data["shortenurl"]["shorturl"];
}

Event Timeline

Ladsgroup renamed this task from API doesn't seem to test for the URL's domain.... to API doesn't seem to test for the URL's domain.Dec 9 2021, 12:48 PM

This seems to be an issue in your wiki's configuration. For example, this request does not pass in our production:
https://meta.wikimedia.org/wiki/Special:ApiSandbox#action=shortenurl&format=json&url=https%3A%2F%2Ftranslate.google.com%2F

{
    "error": {
        "code": "urlshortener-error-disallowed-url",
        "info": "URLs to domain translate.google.com are not allowed to be shortened",
        "*": "See https://meta.wikimedia.org/w/api.php for API usage. Subscribe to the mediawiki-api-announce mailing list at <https://lists.wikimedia.org/postorius/lists/mediawiki-api-announce.lists.wikimedia.org/> for notice of API deprecations and breaking changes."
    },
    "servedby": "mw1356"
}

Can you make such requests against our API and get a w.wiki URL for addresses outside of Wikimedia?

Interesting... Here's my config:

$wgUrlShortenerAllowedDomains = array(
	'(.*\.)?tripleperformance\.fr'
);

When I go through the special page Special:UrlShortener, the same URLs are refused that are allowed through the API...

I did try with https://meta.wikimedia.org/w/api.php and indeed could not reproduce.

I think it might be linked to https://phabricator.wikimedia.org/T258134

In my configuration (MW 1.35) I do not have $wgUrlShortenerDomainsWhitelist set, only $wgUrlShortenerAllowedDomains (as per the documentation). In that case, getAllowedDomainsRegex builds a regex that looks like this: /|^(.*\.)?tripleperformance\.fr$/, which matches everything (because of the empty alternative created by the | at the beginning).

I see that this has been fixed in REL1_36; I'll just update the extension, I guess (hopefully it will still be compatible with 1.35?).

Nice catch. Thanks.
The patch fixing the security issue should have been backported to 1.35. I will check.

It's backported to 1.35. I do see it as commit 85252b31f0a400fb9c06dbf1e7adda036fd03b99. Are you sure you're using the correct release?

That extension is not in the tarball so I don't think it was messed up there. How did you download this extension?

So the commit exists in branch REL1_35 in our source of truth (gerrit) but not in replicas (including diffusion and github). You just need to backport that commit in your system, I'll ask how we can fix the replication.

Ah! That must be it... I do download my extensions from GitHub. Interesting to note that there are differences, then. I'll change the git URL to point to gerrit instead.

NB: just tested again with the latest REL1_35 on GitHub and everything is now OK. Thanks for the fix @Ladsgroup!

Ladsgroup claimed this task.

Awesome. Thanks for reporting.