Hi!
User Details
- User Since
- Oct 25 2014, 1:53 AM (598 w, 4 d)
- Roles
- Administrator
- Availability
- Available
- IRC Nick
- Bawolff
- LDAP User
- Brian Wolff
- MediaWiki User
- Bawolff
Today
@Ladsgroup a question that came up - is there an eventual plan to undeploy DynamicPageList? I know it's used on some other wikis, but it has always kind of been Wikinews that was the reason it was kept around. Wikinews editors might want to replace it with static lists prior to the closure if it will indeed eventually be removed, so that content looks right for posterity.
Yesterday
Personally I think this is such a common thing to do that a standard wrapper makes sense, but I don't feel strongly about it one way or the other.
Mon, Apr 13
I assume because libraries are loaded prior to the current page being executed, and thus there is no current frame in scope at the time the library is loaded (and the environment including libraries may be set up and shared/cloned between multiple executions).
To clarify, can this bug be closed now?
So it looks like Lua will fix the UTF-8 encoding of any text it outputs. However, if you directly pass the invalid UTF-8 to the parser via frame:extensionTag() or something like that, then the invalid UTF-8 is not repaired.
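As a rough illustration of the two paths (a Python analogue, not the actual Scribunto code): repairing invalid UTF-8 amounts to substituting U+FFFD for bad bytes, while bytes handed through untouched stay invalid.

```python
# Python analogue (for illustration only): Scribunto repairs invalid UTF-8
# in normal Lua text output, but bytes passed directly to the parser
# (e.g. via frame:extensionTag()) are not repaired.
raw = b'valid \xff\xfe bytes'   # contains bytes that are invalid UTF-8

# The "repair" path: each invalid byte becomes U+FFFD (the replacement char)
repaired = raw.decode('utf-8', errors='replace')

# The "pass-through" path: the invalid bytes survive as-is
passed_through = raw

print(repaired)   # the \xff\xfe pair shows up as two replacement characters
```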
I cannot reproduce this, as long as the Title object is in the file namespace. e.g. mw.title.new('File:Something').file.exists works.
If the new behavior is the one intended
This can't be the case here, as the file is 841 × 1,650 pixels, 21 pages, 7.42 MB.
Thu, Apr 9
0x0 means that metadata extraction failed, which probably means it took too much time or too much memory.
Tue, Mar 31
All their license agreements will be structured to avoid a loophole which would allow streaming platforms to avoid paying that fee.
Fri, Mar 27
I think descBaseUrl will be used to pull the file description page. The overriding is kind of confusing, but File::getDescriptionText() calls FileRepo::getDescriptionRenderUrl which calls FileRepo::getDescriptionUrl() [And not File::getDescriptionUrl], thus the override in the file class doesn't apply to description fetching, it only applies to things like the message telling people to go to commons.
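That dispatch can be sketched with minimal mocks (hypothetical Python stand-ins for the PHP classes, just to show why the override doesn't apply to description fetching):

```python
# Hypothetical stand-ins for the PHP classes, illustrating the dispatch:
# description *fetching* goes through the repo's own getDescriptionUrl(),
# so an override on the File class never enters the picture.

class FileRepo:
    def getDescriptionUrl(self):
        return "https://repo.example/wiki/File:X"        # repo-level URL

    def getDescriptionRenderUrl(self):
        # calls FileRepo::getDescriptionUrl(), not File::getDescriptionUrl()
        return self.getDescriptionUrl() + "?action=render"

class File:
    def __init__(self, repo):
        self.repo = repo

    def getDescriptionUrl(self):
        # the override lives here, but only things like the
        # "go to Commons" message call this method
        return "https://override.example/wiki/File:X"

    def getDescriptionText(self):
        # goes straight through the repo, bypassing the override above
        return self.repo.getDescriptionRenderUrl()

f = File(FileRepo())
print(f.getDescriptionText())   # uses the repo URL, not the override
```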
Wed, Mar 25
Sometimes files have local description pages. A common example is featured pictures on Wikipedia. I would suggest linking to Commons only if the local page does not exist.
Sun, Mar 22
Is there a significant legal or operational benefit to transcoding through this new proposed intermediary process, rather than treating it like any other format in MediaWiki that requires conversion before embedding in articles?
Mar 13 2026
Is this a dupe of T418969?
I guess what is happening is that $wgThumbnailSteps is preventing the original asset from being used.
That said, I'm not sure if a mild DoS is considered a security threat to the operators of this extension.
The author does not seem to be on Phabricator, but it looks like a Miraheze extension. Maybe @Paladox can help direct it to the right place.
Mar 11 2026
We could make the JavaScript content model parse the page and look for mw.loader.load() calls, and register a link for Special:WhatLinksHere when something is included. That would be kind of cool.
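A very rough sketch of what that scan could look like (hypothetical, in Python with a regex; a real implementation would presumably use a proper JS parser and handle more call shapes):

```python
import re

# Hypothetical sketch: scan a JS page's text for mw.loader.load() targets
# so they could be registered as links for Special:WhatLinksHere.
js = "mw.loader.load('//meta.wikimedia.org/w/index.php?title=User:X/foo.js');"

# Matches mw.loader.load('...') / mw.loader.load("...") string arguments only;
# dynamically-built URLs would slip past a regex like this.
targets = re.findall(r"mw\.loader\.load\(\s*['\"]([^'\"]+)['\"]", js)
print(targets)
```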
Mar 10 2026
03:28 < Neriah63> [12:25:40] <Neriah63> Hi
03:28 < Neriah63> [12:25:42] <Neriah63> https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1235542
03:28 < Neriah63> [12:25:49] <Neriah63> I would think this also requires a change in 1.43/1.44/1.45. Am I mistaken?
03:28 < Neriah63> [12:25:59] <Neriah63> It forced us to add handling in common.js for images that are not handled at the server level, and because of that it takes 2 seconds for those images to load...
[..]
03:33 < bawolff> Is this causing problems for people using instant commons? Because otherwise it doesn't seem that critical to backport as most people aren't using $wgThumbnailSteps for local uploads
03:35 < Neriah63> Yes, we have the problem for instant commons.

Point of interest: it appears that this vulnerability was at least attempted to be used in the wild. The script mentioned in T419143 appears to attempt to use it. Obviously in Wikimedia's case it has long since been patched; however, it appears that the script may have been copied and originally targeted at other wikis. It's hard to know when the script was originally written, but it's probable that the original target was running an outdated version of MediaWiki at the time and was vulnerable.
Mar 9 2026
Being on Meta, you can import scripts that are outside of Meta, e.g.:
https://meta.wikimedia.org/wiki/User:Nux/global.js?diff=cur#L-104
So I did a profile, and it came back that MediaWiki was spending almost all of the time in the function to deserialize a snak. I feel like, whatever deserializing entails, it shouldn't take 10 seconds to deserialize a single entity even if it's a large one.
Mar 8 2026
Did any of these scripts install a ServiceWorker which still continues to work (even after script removal and a hard cache clear)?
Mar 7 2026
YubiKeys are very cheap, especially when buying in bulk. I think we'd be talking like $15 each. One potential option is that the WMF just buys a YubiKey for every person with intadmin rights.
Mar 6 2026
This is somewhat insecure - an attacker can wait for the user to do something mildly sensitive, then take over the account and use it for something very sensitive.
That's a nice find.
Seems like this works on non-JS pages as well, so anyone can just edit the target's user space to add something evil and then trick them into clicking the link.
Experiences like 5 Mar 2026 could be avoided.
Mar 5 2026
Instead of making the edit action require 2fa when editing a js page, an alternative version might be:
Mar 4 2026
My attempt at this is https://en.wikipedia.org/wiki/Template:CineMol (it supports click-to-enlarge on the generated images; mildly hacky).
Feb 28 2026
However, I agree with @Bawolff that using Sanitizer::stripAllTags() is a much better and more "MediaWiki-native" way to handle this. It would likely resolve the order-of-operations issue.
Feb 27 2026
cc'ing @Jdlrobson-WMF for the minervaneue aspects of this.
As a minor aside, instead of doing decode & strip itself, the extension may want to consider using Sanitizer::stripAllTags() instead.
p.s. Just to clarify, under impact you wrote "The {{GETSHORTDESC}} parser function which returns the raw stored value." Are you saying you can create an XSS with the parser function? Normally parser function output is not interpreted as HTML.
If the intention was to allow HTML tags as entities (like the "< 1 mm" example), then the use of strip_tags() is logically redundant or misleading. The current implementation creates a "worst of both worlds" scenario: it fails to stop malicious HTML because of the order of operations, yet it claims to "sanitize" the input.
Just a point of clarification (because I got confused): this is about the non-Wikimedia-deployed ShortDescription extension. On Wikimedia wikis the ShortDescription parser function is provided by the Wikibase extension. Potentially the Wikibase code that implements this is similar.
The sanitize() function contains a logic flaw where strip_tags() is called before html_entity_decode(), allowing encoded HTML entities to bypass tag stripping and be stored as raw HTML in the database.
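A minimal sketch of the flaw, using Python analogues of PHP's strip_tags()/html_entity_decode() rather than the extension's actual code: with the buggy ordering, an entity-encoded tag survives stripping and then decodes into live HTML.

```python
import html
import re

def strip_tags(s):
    # crude analogue of PHP's strip_tags(): remove anything tag-shaped
    return re.sub(r'<[^>]*>', '', s)

def sanitize_buggy(s):
    # flawed order: tags are stripped before entities are decoded, so an
    # entity-encoded tag survives stripping and decodes into raw HTML
    return html.unescape(strip_tags(s))

def sanitize_fixed(s):
    # decode first, then strip, so encoded tags cannot slip through
    return strip_tags(html.unescape(s))

payload = '&lt;img src=x onerror=alert(1)&gt;'
print(sanitize_buggy(payload))   # -> <img src=x onerror=alert(1)>
print(sanitize_fixed(payload))   # -> '' (tag removed)
```

(As noted above, Sanitizer::stripAllTags() would be the MediaWiki-native way to do this; the sketch only demonstrates why the ordering matters.)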
Feb 21 2026
I thought I tried to add every known namespace a while back. Guess I missed some.
Feb 20 2026
They are testing whether the characters » « are inside of some environments. Okay. But why? Why only these environments?
Feb 18 2026
Fwiw, I think that is a fine thing to work on at the hackathon. Just keep in mind that adoption by Wikimedians would probably be an uphill battle, as they will probably view it as just some other website.
It's not entirely clear how you plan to integrate this into MediaWiki. There is currently no concept of private messaging in MediaWiki, so it's unclear how this fits into its design.
Feb 13 2026
[From gerrit]
I understand that, and I'm asking for help, as we're used to asking on any collaborative site. I am questioning whether the CURRENT test cases are actually testing anything important. And whatever they are testing, they should be failing for any wiki that is not French, yet they are not. I'm okay with you guys adding your -1's, but someone will at some point have to work on this bug. So let's continue this discussion on Phab, please.
Feb 12 2026
It sounds like a race condition. Different DB config (e.g. replicas), a different caching setup, or a different job queue setup might be affecting it.
Feb 8 2026
It feels a bit weird to apply some normalization blindly. If the TIFF file had metadata indicating it was a GeoTIFF file and not a normal black-and-white image, that would be one thing, but it seems like correcting this has the potential to do the wrong thing for normal images.
Is the issue that the depth is something like a 32-bit number, so if the range isn't normalized everything appears white?
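To illustrate that suspicion with made-up sample values (assuming a simple linear scale): if high-bit-depth samples are interpreted directly on an 8-bit scale, everything above 255 clips to white, whereas rescaling by the actual sample maximum preserves the range.

```python
# Made-up 16-bit grayscale samples, to illustrate the clipping suspicion
raw16 = [0, 1024, 32768, 65535]

# Interpreted directly on an 8-bit scale, everything above 255 clips white
clipped = [min(v, 255) for v in raw16]            # [0, 255, 255, 255]

# Rescaled by the actual sample maximum, the full range survives
scaled = [round(v * 255 / 65535) for v in raw16]  # [0, 4, 128, 255]

print(clipped, scaled)
```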
Feb 6 2026
If I recall, JXL uses the same metadata as C2PA in JPEG, which there is some interest in as well.
How close is DPL4 to something that could work on Wikimedia wikis?
Feb 5 2026
Sorry, I followed a link here from Discord and assumed this was only about page language. I agree that the short description probably shouldn't depend on page language.
Edit: referring only to the page language Uzume mentioned.
Isn't the page language being "en" correct? The Lua language is in English; all the keywords are in English.
Feb 4 2026
So that definitely has some upload.w.o resource calls, but when I check various console output via Chrome's developer tools, they all 404 for me when I load Gnome-emblem-web.svg. It seems all of those alphanum-named PNGs might not exist anymore, as I would expect to see actual CSP violation reports in Chrome's developer tools console output as opposed to 404 errors.
Feb 3 2026
Note that the version of the header prefixed with x- is only needed for browsers from before 2013. It's probably not relevant anymore.
Feb 2 2026
Given context, it seems unlikely to me that these are credentials to a real production server. Nonetheless, hopefully @apaskulin can confirm.
FWIW, I added a notification about this breaking change on https://www.mediawiki.org/wiki/API:Imageinfo - rephrase away if you think it is unclear.
Jan 31 2026
Perhaps it makes sense to just switch to strict-origin-when-cross-origin
Jan 26 2026
I found a hacky workaround: if you don't set a height, MF leaves the image alone. With that in mind, my links don't work anymore, but you can see the original in https://en.wikipedia.org/wiki/User:Bawolff/interactive_images in the "File:Trajans-Column-lower-animated.svg" section.
At the moment it looks like RevertAction::onSubmit() does not pass the timestamp of the old file to LocalFile->upload(), and thus when it eventually gets to FileRepo::recordUpload3(), which makes the log entry, that code does not know what the timestamp of the old file was (although it does record the sha1).
Jan 24 2026
Adding stuff to the user-agent part fixed it.
I would agree for individual wikis, not so much for wikifarms. For reference, we have the extension enabled globally. For users that wish to use QIC, we would have to set $wgUseQuickInstantCommons to false. Having its value be in a superstate might be problematic. While I don't think the foreign file repos definition will be changing soon, it is not ideal to have to maintain your own definition of it. I'm also open to alternative ideas, such as raising the default expiry to a week, which seems to be the WMF's desired default for (large) users.
Jan 23 2026
I think the l10n cache might not be webserver-owned if manualRecache is set, since it's generated at build time (like in the WMF config).
I like the idea, but I'm not sure comparing with index.php is the right approach. In a locked-down install, you would probably want index.php to not be owned by the same user the webserver is running as.
Jan 17 2026
After some reflection, I think my current view is:
Jan 16 2026
I'm a bit unsure about the first part. $wgUseQuickInstantCommons was meant as the easy option where no config is needed. I'm a bit hesitant to add additional config options here which just duplicate the foreign file repos config.
Jan 6 2026
I think this is generally expected. InstantCommons expects you to have the same media handler extensions installed (in this case PagedTiffHandler).
To be fair, depending on how the sanitizer works (I haven't checked), it might be hard to support data:image/svg+xml. The contents of the SVG don't need to be sanitized, but it might be hard to find the boundaries of the SVG.
Jan 5 2026
I think IP restrictions don't make sense for Toolforge, but may make sense for people privately hosting stuff on their own servers, although the benefit is probably marginal.
Jan 4 2026
Not sure if it would be better to do some of this in the CssSanitizer interfaces instead of solely in the TemplateStyles extension.
Given we now allow Scribunto to make SVGs (T405861), I don't see any reason not to do this, at least for <image> types; the security implications for <svg> type values might be different, I'm not sure.
Dec 24 2025
Would adding a sandbox="" attribute to your iframe (which should make the user logged out) fix your issue?
Dec 21 2025
So, reading the discussion via Google Translate, it sounds like many people in the arwiktionary community aren't exactly sure of the precise differences in the sorting rules. Is a demo site needed so that people can test and play with it prior to changing the setting on arwiktionary, or is that unnecessary and this change is good to go?
Dec 16 2025
Could we keep parsing and disable PST?
Dec 8 2025
Just as a historical note: the primary goal at the time was to be a test in preparation for using real CSP, which never happened. Report-only headers by themselves provide questionable value. I suppose they could be used to help enforce policy around not including external resources in common.js, as well as in compromise scenarios where off-site JS is loaded; however, there is probably a real question of whether it makes sense to continue with them if there is no plan to use the real header (I know Krinkle has had opinions on this in the past).
