
Enable customizing of CSS values filter
Open, LowPublicFeature

Description

Sanitizer currently wipes out entire url() CSS values.

This is unnecessarily strict.

It should at least allow same domain (either via full URL or relative path) URLs.

On WMF sites it should allow something like "^[a-z0-9]+\.(wik(i((m|p)edia|books|news|quote|source|versity)|tionary))\.org/.*" (note: the pattern needs balanced parentheses).

Furthermore, it would be even better to have a configurable list (e.g. [[MediaWiki:Allowedurls]]) of allowed URLs, perhaps using regexps as, e.g., the blacklists do.
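The same-domain idea can be illustrated with a small sketch (Python is used here purely for illustration; Sanitizer itself is PHP, and the pattern below is a parenthesis-balanced version of the one suggested above):

```python
import re

# Balanced version of the suggested WMF domain pattern; the function
# name is hypothetical, not part of MediaWiki.
WMF_URL = re.compile(
    r'^[a-z0-9]+\.(wik(i((m|p)edia|books|news|quote|source|versity)|tionary))\.org/.*'
)

def is_allowed_wmf_url(url):
    """Return True if the scheme-stripped URL targets a WMF project."""
    return WMF_URL.match(url) is not None

print(is_allowed_wmf_url('en.wikipedia.org/wiki/Main_Page'))  # True
print(is_allowed_wmf_url('en.wiktionary.org/wiki/cat'))       # True
print(is_allowed_wmf_url('evil.example.com/steal.css'))       # False
```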


Version: unspecified
Severity: enhancement

Details

Reference
bz11106

Event Timeline

bzimport raised the priority of this task to Low. Nov 21 2014, 9:51 PM
bzimport added a project: MediaWiki-Parser.
bzimport set Reference to bz11106.
bzimport added a subscriber: Unknown Object (MLST).

The current function works this way:

static function checkCss( $value ) {
	$stripped = Sanitizer::decodeCharReferences( $value );

	// Remove any comments; IE gets token splitting wrong
	$stripped = StringUtils::delimiterReplace( '/*', '*/', ' ', $stripped );
	$value = $stripped;

	// ... and continue checks
	$stripped = preg_replace( '!\\\\([0-9A-Fa-f]{1,6})[ \\n\\r\\t\\f]?!e',
		'codepointToUtf8(hexdec("$1"))', $stripped );
	$stripped = str_replace( '\\', '', $stripped );
	if( preg_match( '/(?:expression|tps*:\/\/|url\\s*\().*/is', $stripped ) ) {
		# haxx0r
		return false;
	}

	return $value;
}
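The effect of that final blacklist can be shown with a transliteration into Python (illustrative only; the real check runs in PHP after character-reference and CSS-escape decoding, and `tps*://` is the fragment that catches `http://`, `https://`, `ftp://`, and similar schemes):

```python
import re

# Transliteration of the PHP blacklist; the trailing ".*" of the
# original pattern is irrelevant to a boolean match and is dropped.
BLACKLIST = re.compile(r'(?:expression|tps*://|url\s*\()', re.I | re.S)

def check_css(value):
    """Return True if the declaration would survive the filter."""
    return BLACKLIST.search(value) is None

print(check_css('color: red'))                   # True
print(check_css('width: expression(alert(1))'))  # False
print(check_css('background: url(image.png)'))   # False: every url() is wiped
```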

I'd suggest adding an optional variable, such as $wgStrippedCss, to LocalSettings and making the change from

if( preg_match( '/(?:expression|tps*:\/\/|url\\s*\().*/is', $stripped ) )
...

to something like this (pseudocode, just to convey the idea):

$restrictCss = ( $wgStrippedCss === null ) ? '/(?:expression|tps*:\/\/|url\\s*\().*/is' : $wgStrippedCss;
if( preg_match( $restrictCss, $stripped ) )
...

where WMF sites would have

$wgStrippedCss = '/(?:expression|tps*:\/\/(?!upload\\.wikimedia\\.org)|url\\s*\(\\s*(?:\'|")?\\s*(?!http:\/\/upload\\.wikimedia\\.org)).*/is';

in LocalSettings to allow linking to images uploaded on WMF sites.
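The shape of this proposal can be sketched in Python (names and the fallback pattern are taken from the comment above; this is an illustration of the configurable-filter idea, not the PHP implementation):

```python
import re

# Built-in default: reject expression(), any scheme, and any url().
DEFAULT = r'(?:expression|tps*://|url\s*\()'

# Hypothetical site override mirroring the proposed $wgStrippedCss:
# url() and http(s):// are allowed only for upload.wikimedia.org.
WMF_OVERRIDE = (
    r'(?:expression'
    r'|tps*://(?!upload\.wikimedia\.org)'
    r'|url\s*\(\s*(?:\'|")?\s*(?!http://upload\.wikimedia\.org))'
)

def check_css(value, restrict=None):
    """True if the value passes; None falls back to the built-in blacklist."""
    pattern = DEFAULT if restrict is None else restrict
    return re.search(pattern, value, re.I | re.S) is None

# Default behaviour: every url() is rejected.
print(check_css('background: url(http://upload.wikimedia.org/a.png)'))  # False
# With the override, same-host images get through, others do not.
print(check_css('background: url(http://upload.wikimedia.org/a.png)',
                WMF_OVERRIDE))                                          # True
print(check_css('background: url(http://evil.example/a.png)',
                WMF_OVERRIDE))                                          # False
```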

While the logic in checkCss() has changed in the meantime (see https://git.wikimedia.org/blob/mediawiki%2Fcore.git/1765227b43dea46e44094c7380d28e84e5139751/includes%2FSanitizer.php#L851 ), the underlying request here still seems valid.

Looks like the commit to Sanitizer.php that disallows the CSS url() function was meant to avoid running JavaScript in CSS with the url("javascript:..."); type of thing. (See the Mozilla bug report at https://bugzilla.mozilla.org/show_bug.cgi?id=230134 for more info.)

While I agree with the security concerns (and bad bad bad coding practice) of executing JavaScript from within a CSS file, I don't agree that the url() function should be completely disallowed.

The method above (comment1) seems like a good idea: if some global variable ($wgDisallowedCssRegex) is set, use that. If not, fallback to what is there now.

I might tackle this on the weekend if I've got time.

Change 144249 had a related patch set uploaded by Nemo bis:
Breaking out disallowed CSS into a global variable

https://gerrit.wikimedia.org/r/144249

Change 144249 merged by jenkins-bot:
Breaking out disallowed CSS into a global variable

https://gerrit.wikimedia.org/r/144249

I'm not sure I see how making this entire thing a configuration variable is a good thing. Security should not be configurable.

Another big reason why url() is forbidden is to avoid cross-domain requests being made from a wiki page (especially with regards to CSRF, DDOS, traffic sniffing, privacy policy etc.).

When additional security issues are found and fixes added to MediaWiki, existing installs that customised this filter for some silly feature will no longer be using adequate security measures.

I recommend this feature be reverted and we figure out a way to enable this other use of url() in a sane way. Whether we want that way to be allowed always or behind an opt-in flag is a separate question, but I don't think there is valid use case for making the entire thing configurable. That only complicates maintenance, security updates, and overall mobility of wikitext between sites.

I think any exceptions to CSS sanitization should be specific to the use case. I don't think we should expose a regex, since that is an implementation detail internal to Sanitizer, which may change in the future. I don't think it's secure to generally allow url() pointing to any page on a domain, since url() can be used for scary non-image things, like the behavior property.

The url() function in CSS is perfectly valid as long as there's no "javascript:" pseudo-scheme in it that executes code. There are plenty of legitimate cases where the url() function might be useful or necessary that are currently forbidden by the Sanitizer because it is overly generic.

Maybe a better fix to this problem, instead of pulling the regex out to be overridden/reset/unset by an admin, is to make sure the string "javascript:" does not appear in it.
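A bare substring check is easy to bypass with CSS hex escapes, which is presumably why checkCss() decodes escapes before matching. A minimal Python sketch of "decode, then look for the scheme" (function names are hypothetical; the decoding step mirrors the preg_replace in the PHP above):

```python
import re

def decode_css_escapes(value):
    # Decode CSS hex escapes (e.g. "\6a " -> "j"), then strip any
    # remaining backslashes, mirroring the two steps in checkCss().
    value = re.sub(r'\\([0-9A-Fa-f]{1,6})[ \n\r\t\f]?',
                   lambda m: chr(int(m.group(1), 16)), value)
    return value.replace('\\', '')

def contains_javascript_scheme(value):
    return 'javascript:' in decode_css_escapes(value).lower()

# The escaped form would slip past a naive substring check;
# decoding first catches both spellings.
print(contains_javascript_scheme('background: url(javascript:alert(1))'))     # True
print(contains_javascript_scheme('background: url(\\6a avascript:alert(1))')) # True
```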

P.S. I'm totally cool with reverting all this (as Krinkle says above) and finding another way.

(In reply to Krinkle from comment #7)

> I'm not sure I see how making this entire thing a configuration variable is
> a good thing. Security should not be configurable.

My reason for +2'ing that is that I'd prefer an admin who really wants to allow some of that CSS through to put it in their configuration, rather than doing their own hack in core, which would be less secure for them.

At the WMF, I would likely never be ok with an exception allowing any url()'s through, so if that really is Danny B.'s only motivation, someone should probably just update the title and wontfix this bug.

> I recommend this feature be reverted and we figure out a way to enable this
> other use of url() in a sane way. Whether we want that way to be allowed
> always or behind an opt-in flag is a separate question, but I don't think
> there is valid use case for making the entire thing configurable. That only
> complicates maintenance, security updates, and overall mobility of wikitext
> between sites.

I'm fine with reverting this change, since Timo and Tim seem to feel strongly about it. Also, it definitely hurts the mobility of wikitext between sites; I hadn't considered that aspect of it.

I'm honestly skeptical that we can find a sane way to support URLs in CSS, but I'm open to being surprised. And again, if that's the goal, let's change the bug to reflect what's actually wanted.

Reverted in I1dbb927997693d686b4677b9c2107be99dedd7b2

Note that an alternative fix for this issue may come in the form of the TemplateStyles extension.

> Note that an alternative fix for this issue may come in the form of the TemplateStyles extension.

TemplateStyles itself isn't planned to do anything to fix this issue, although you could work around it using TemplateStyles stylesheets.

But the css-sanitizer library developed for TemplateStyles could potentially be used to replace the existing code in Sanitizer, either directly in core or as added functionality in TemplateStyles if appropriate hooks were added to core.

Aklapper changed the subtype of this task from "Task" to "Feature Request". Feb 4 2022, 11:01 AM