Get the Data: namespace enabled on metawiki.
Yesterday
In T91154#7961505, @Quiddity wrote:Re: Tech News - What wording would you suggest as the content, and When should it be included? Thanks! (The deadline for entries in the next edition, is ~24 hours from now)
In T309310#7960630, @diegodlh wrote:Thank you very much for your interest in Web2Cit, for helping us test it, and for reporting the issues you found! It's very helpful for us.
You're welcome!
From what I see in the original revision of www.independent.ie's templates configuration file, Web2Cit was likely ignoring the translation template you had configured because it lacked the mandatory template fields itemType and title; Web2Cit ignores translation templates that do not include both of them. This is explained in the information box that pops up in the configuration file editor next to the Fields property title (though I acknowledge it may be somewhat hidden in this initial Web2Cit-Editor version).
Because of that, Web2Cit was using its default fallback template to translate the target webpage, which simply uses Citoid's response for all Web2Cit-supported fields. That's why you were not seeing a difference in the output. You can tell that Web2Cit was using the fallback template from the translation results page: "Translation result using fallback template".
Make sure you include in your translation template all the fields that you want an output for. If you want to reuse Citoid's response for a field, explicitly say so by using the Citoid selection step (provided by default). Try adding itemType and title fields to your template, using the default procedure for both. This should fix the "I don't see a difference in the output" part of the problem.
I see you then manually changed the templates file, which unfortunately made matters worse, as you ended up with an invalid JSON file (there is an extra comma at the end of the selections array). JSON files are complex and we are likely to make mistakes when manually editing them, even more so for JSON files in our main storage, which do not have the JSON editor available (see T305571). If possible, please use the configuration file editor instead. I've just removed that extra comma to make it valid JSON again.
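For illustration, a trailing comma is exactly the kind of slip that makes a hand-edited file invalid: strict JSON (unlike JavaScript array literals) rejects it. A minimal sketch using JSON.parse as a quick pre-save check (this is generic JSON validation, not Web2Cit's actual validator):

```javascript
// JSON.parse rejects trailing commas, unlike JavaScript array literals.
const invalid = '{ "selections": [ { "type": "citoid" }, ] }'; // note trailing comma
const valid   = '{ "selections": [ { "type": "citoid" } ] }';

try {
  JSON.parse(invalid);
} catch (e) {
  console.log('invalid JSON:', e.name); // SyntaxError
}
console.log(JSON.parse(valid).selections.length); // 1
```

Pasting a config into a checker like this (or any JSON linter) before saving catches this whole class of mistakes.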
If the user is about to do something stupid, you should turn the backgrounds of the relevant elements pale red (but make sure text contrast remains sufficient). If the user is about to do something really stupid, disable buttons.
Be careful here, there are four types of usernames:
- Those with only [A-Za-z0-9], like yours. Always work.
- Those with the above and one or more spaces. Usually works after converting the space to an underscore. Where inappropriate, MediaWiki typically converts the underscore back to a space. In api.php this will cause a warning, but the action still succeeds.
- Those that include non-space characters requiring percent-encoding when used in a URL, like the é in Condé Nast. Now you have to be careful to apply percent-encoding only when using the name in a URL.
- Those with really screwy characters, like a colon, which can cause confusion with namespaces (these haven't been allowed for years, but some older accounts exist, like .:Jenni:., who also has periods!).
And if you are dealing with namespaces anywhere, beware that gendered namespaces exist. Portuguese Wikipedia for example has those: Usuário(a): for unknown, Usuário: for male and Usuária: for female. Yeah, that was fun when I discovered it. Not.
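To illustrate the percent-encoding pitfall with a minimal sketch (the helper name and URL base are made up for the example; inside a gadget you'd use mw.util.getUrl instead): convert spaces to underscores first, then percent-encode only because the name goes into a URL.

```javascript
// Hypothetical helper: build a user page URL from a display username.
// Spaces become underscores first (MediaWiki's canonical title form),
// then the name is percent-encoded solely for use in the URL path.
function userPageUrl(username) {
  const name = username.replace(/ /g, '_');
  return 'https://en.wikipedia.org/wiki/User:' + encodeURIComponent(name);
}

console.log(userPageUrl('Example'));     // .../User:Example
console.log(userPageUrl('Condé Nast'));  // .../User:Cond%C3%A9_Nast
```

Applying encodeURIComponent anywhere other than URL construction (e.g. in an API `title` parameter passed as form data) would double-encode and break names like Condé Nast.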
In T309253#7959319, @TheDJ wrote:Who disables all codecs ???
In T309253#7959264, @Aklapper wrote:with a browser that can't (or refuses to) play the music.
For future reference, hints about what exactly that means are welcome.
Using Firefox 60.9.1 on mobile (I guess that's old enough), I don't see any screen flashing, but it keeps trying to load the music file, showing a status bar animation forever.
In T298427#7932716, @diegodlh wrote:With support from a Wikimedia Foundation grant we are currently developing Web2Cit, a tool to collaboratively fix cases like this, requiring much less technical skills than those required to write a Zotero translator.
That's BRILLIANT!
Wed, May 25
In T308881#7958047, @LucasWerkmeister wrote:Put the recording up here for now: https://tmp.lucaswerkmeister.de/2022-05-21%20Wikimedia%20Hackathon%202022%20concert.mp4 (note, there’s no video for the first 5 seconds, because I cut the video without re-encoding it); CC BY 4.0 if anyone wants it
The Florentiner March recording I used is here: https://commons.wikimedia.org/wiki/File:Florentiner_March_-_U.S._Air_Force_Band.ogg
"too many errors" is triggered by invisible errors when not using strict mode. The ones numerous enough to reach the ceiling are typically:
- Ironically, a Missing "use strict" statement. error for every single function.
- $ is not declared errors. Just repeating $('body'); 101 times also triggers "too many errors".
When coding sloppily there can be others. JSHint has an "assume jQuery" option; can that be enabled somehow here as well? (Putting /*globals $:false */ in the script also works, but isn't all that pretty.) Combined with raising the limit to 1000, I suspect this would be sufficient for the vast majority of scripts.
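For reference, a sketch of what those inline directives would look like at the top of a user script. The directive names (maxerr, globals) are standard JSHint options; whether the code editor's bundled linter honors them is exactly the open question above, so treat this as a configuration fragment, not a confirmed fix:

```javascript
/* jshint maxerr: 1000 */
/* globals $: false */

// With the directives above, JSHint treats `$` as a known read-only global
// and raises the error ceiling, so "too many errors" becomes far less likely.
$( 'body' ).addClass( 'example-class' );
```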
In T308679#7957890, @TheresNoTime wrote:In T308679#7957851, @PatchDemoBot wrote:Test wiki created on Patch demo by TheresNoTime using patch(es) linked to this task:
https://patchdemo.wmflabs.org/wikis/c2b4c82f0c/w/
https://patchdemo.wmflabs.org/wikis/c2b4c82f0c/w/index.php?title=User:Patch_Demo/test.js&action=edit
In T308679#7938784, @TheresNoTime wrote:Pretty sure this is provided by Extension:CodeEditor, so have tagged CodeEditor instead :)
I couldn't immediately see a config variable for this, but I feel like it's something to do with the $wgScribuntoEngineConf here?
It's definitely maxerr. Here's how to reliably reproduce this: https://commons.wikimedia.beta.wmflabs.org/wiki/User:AJ/toomanyerrors.js
In T308679#7938784, @TheresNoTime wrote:Pretty sure this is provided by Extension:CodeEditor, so have tagged CodeEditor instead :)
I couldn't immediately see a config variable for this, but I feel like it's something to do with the $wgScribuntoEngineConf here?
I don't think so. I found this issue on GitHub, which links to maxerr in ace/lib/ace/mode/javascript_worker.js: https://github.com/ajaxorg/ace/blob/master/lib/ace/mode/javascript_worker.js#L92. For us this is found in https://en.wikipedia.org/w/extensions/CodeEditor/modules/ace/worker-javascript.js. There's a window.ace object, but I can't find changeOptions in it.
Tue, May 24
In T308940#7951736, @Dzahn wrote:
In T306477#7953515, @MPhamWMF wrote:Thanks for the examples. As you may have noticed, and as Erik pointed out above, the behavior you are noticing is section title highlighting working as intended, even if it is not the best experience. We do not and cannot currently index sections as searchable documents. When section titles are highlighted in search results, this is due to part of the search query matching the specific text in that section title itself, not because of any kind of relevancy of the section's content.
With the knowledge shared by @EBernhardson I did another experiment, and strangely this is indeed how it works, also in Special:Search:
Sun, May 22
In T258803#7939426, @Dominicbm wrote:I get this error reliably when I search for "cors" as well.
Sat, May 21
In T308946#7947338, @Aklapper wrote:Cannot reproduce on https://phabricator.wikimedia.org/search/ . https://phabricator.wikimedia.org/search/query/_ndFuPQejqYu/#R lists results.
In T308947#7947285, @Legoktm wrote:Wikitech and Wikidata are not linked by SUL, so you need to make an unauthenticated request, see https://www.mediawiki.org/wiki/API:Cross-site_requests#Unauthenticated_CORS_Requests
In T308940#7947028, @RhinosF1 wrote:This should be resolved now
Fri, May 20
In T308335#7927809, @Lucas_Werkmeister_WMDE wrote:With a template that transcludes as a newline, using {{#tag:}} for this seems to work well on my local wiki:
{{#tag:syntaxhighlight|#!/bin/sh{{newline}}echo "hello"|lang=sh}}
(Note that on enwiki, Template:Newline ↳ Template:Break inserts a <br> instead. I couldn’t find an existing template that transcludes to a single newline, though I also didn’t search for too long.)
Will it be uploaded to Commons afterwards?
On https://en.wikipedia.org/wiki/Main_Page I don't see the compact list despite having it enabled. Same on dewiki.
On the dewiki homepage mw.uls contains only ActionsMenuItemsRegistry which is an empty object. This is different from, say, dewiktionary where mw.uls.getBrowserLanguage does exist.
This also happens on https://de.wikipedia.org/wiki/Wikipedia:Hauptseite.
Thu, May 19
Wed, May 18
In T91154#7937134, @Aklapper wrote:@AlexisJazz: Please either check the comment added right before your last comment, or ask a more specific question. Thanks a lot.
Tue, May 17
That surprises (and confuses) me. As there's no relation to the issue at hand I took this to https://en.wikipedia.org/wiki/User_talk:AKlapper_(WMF)#Not_paid?.
In T283646#7935515, @Aklapper wrote:Anyone is welcome to contribute a software patch. The (custom Wikimedia changes) repo seems to be operations/software/thumbor-plugins; the file wikimedia_thumbor/engine/imagemagick/imagemagick.py. See also https://wikitech.wikimedia.org/wiki/Thumbor#Updating_the_custom_Thumbor_plugins . (For completeness, https://gerrit.wikimedia.org/r/plugins/gitiles/operations/debs/python-thumbor-wikimedia/ would be the upstream Debian package.)
Thinking about it a year later: "This area is not the final size of the image" actually only applies if you set both width and height. Since there's no reason to set height, there's no problem.
EXTENT? Not AGAIN! T282385 rears its ugly head again. I provided the solution and it was professionally ignored.
Right now it works, as usual with these it was a transient error.
Sun, May 15
See T307862
In T308296#7929541, @Aklapper wrote:Also note that https://en.wikipedia.org/wiki/File:Holden_(car_brand).webp#filehistory shows that the 528px file is a re-upload. Might be related (or not).
- 727 pixels has transparency.
- 728 pixels hasn't.
- 729 pixels has.
- 730 pixels has.
- 731 pixels hasn't.
- 732 pixels has.
- 733 pixels has.
- 734 pixels hasn't.
- 735 pixels has.
- 736 pixels has.
- 737 pixels hasn't.
- 738 pixels has.
- 739 pixels hasn't.
- 740 pixels has.
Sat, May 14
In T308389#7929146, @Legoktm wrote:That request isn't solely to fetch CSRF tokens; it serves another purpose:
* Query the foreign wiki to see if we're already logged in there in the user's browser, which
* means that there's no need to query for and use the 'centralauthtoken' parameter.
*
* To avoid wasted requests, get a CSRF token at the same time.

The request scheme is described at https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/CentralAuth/+/refs/heads/master/modules/ext.centralauth.ForeignApi.js#9 - basically, if you are not logged in remotely, you need to get a short-lived centralauthtoken for each foreign request. But if you're logged in remotely, which is what the meta=userinfo is for, then we don't need centralauthtokens. And since we're making a request anyway, it makes sense to fetch the CSRF token at that time, if possible.
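A hedged sketch of that combined probe (the endpoint URL is assumed for illustration): a single api.php request can carry both meta=userinfo, which reports whether we're logged in on the foreign wiki, and meta=tokens with type=csrf, which fetches the CSRF token in the same round trip.

```javascript
// Build the combined query string: one request answers "am I logged in
// here?" and, if so, returns a CSRF token, avoiding a second round trip.
const params = new URLSearchParams({
  action: 'query',
  meta: 'userinfo|tokens',
  type: 'csrf',
  format: 'json',
});
const url = 'https://meta.wikimedia.org/w/api.php?' + params.toString();
console.log(url);
```

In real gadget code you wouldn't build this by hand; mw.ForeignApi (and CentralAuth's subclass of it) handles the scheme, including falling back to centralauthtoken when the remote session is absent.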
@cscott where are we with this?
In T308382#7928950, @hashar wrote:Thank you for taking the time to file this report. I have looked at the Gerrit server and added the stack trace to this task description.
Gerrit indeed cannot find the comment:
Unable to find comment for key CommentContextKey{project=mediawiki/extensions/WikimediaMessages, changeId=35609, id=AAADbX%2F%2F%2ByU%3D, path=5925d5e5eb8f842ba6950ebb0be07777, patchset=1, contextPadding=3}

The change https://gerrit.wikimedia.org/r/c/mediawiki/extensions/WikimediaMessages/+/35609 is from 2012. We had a few patchsets and comments lost over time due to some git garbage collection issues. This can surely be verified by looking at the change metadata in NoteDB under refs/changes/09/35609/meta.
Given it is a very old change and it is not blocking anyone, I am marking this as low priority; I might investigate later, though.
Fri, May 13
In T306181#7927825, @BTullis wrote:In T306181#7927769, @akosiaris wrote:The 50% bump in capacity didn't make any noticeable difference this time around. :-(
Thanks for trying it and it's still a useful experiment.
There was some discussion on another ticket today, which seems likely to be related: T295427: Problem with delay caused by intake-analytics.wikimedia.org
My hope is that we might be able to use those reports to identify some test cases that always (or often) cause delayed responses or errors in intake-analytics.wikimedia.org
If we can do that then I can use some mitmproxy based investigation into why it's happening.
In T308335#7927809, @Lucas_Werkmeister_WMDE wrote:With a template that transcludes as a newline, using {{#tag:}} for this seems to work well on my local wiki:
{{#tag:syntaxhighlight|#!/bin/sh{{newline}}echo "hello"|lang=sh}}
(Note that on enwiki, Template:Newline ↳ Template:Break inserts a <br> instead. I couldn’t find an existing template that transcludes to a single newline, though I also didn’t search for too long.)
In T295427#7926809, @BTullis wrote:Any evidence of requests to these endpoints timing out or resultng in 503 errors for the client, especially if they are repeatable, is something that could well provide a useful reference point for me to investigate further.
503? Reminds me of all the connection issues on the beta cluster: T289029, T303160, T303165, T302699, T300525. Probably not related, I think; I assume intake-analytics doesn't run on the same infrastructure the beta cluster runs on, but I don't really know.
No problem, thanks for fixing!
In T295427#7817708, @phuedx wrote:The latter seems more likely to me but it doesn't explain why the request to https://intake-analytics.wikimedia.org is not resolving quickly but timing out (and that timeout being on the order of minutes).
I'm wondering if the issue could be caused by an unresponsive DNS server, a proxy that refuses the beacon, or some web accelerator software.
Thu, May 12
In T308239#7924428, @TheresNoTime wrote:What does "all notifications" show, just out of interest?
I just uploaded https://commons.wikimedia.beta.wmflabs.org/wiki/File:Jason_Shaw_-_Big_Car_Theft.ogg. MP3 transcode worked. Not showing up in search yet, maybe needs more time.
Wed, May 11
In T68606#7920007, @TheDJ wrote:Solutions:
- We fix the API so it doesn't scrape ONLY the first creator from the creator box (see my patch)
Every little bit helps, but it won't be enough. FoP (Freedom of Panorama) is another issue when it comes to informing re-users.
Tue, May 10
Screw it, I've worked around it anyway.
@TheDJ just thinking out loud: for this particular purpose, maybe we shouldn't even bother (at least not initially) with device detection. If a user wants a different skin on some device/browser they own, let them indicate that themselves on the device in question, and serve the chosen skin on that device from then on. The problem to be solved (which will come up when I create a script for this) is an involuntary page reload when opening a Wikimedia site on a device that requires a different skin from the last used device. An involuntary reload several times a day as a user switches between their phone and laptop is not particularly pretty; an involuntary reload once a week as a cookie expires, meh, not a dealbreaker.
This one I don't fully understand.. we'd still want to entice people to edit, right? We want them to read a talk page message if one gets sent (and if they work)?
Yes and no, I guess. The vast majority of visitors are readers. You can shove an "edit" button in their face for 50 years and they'll never use it. There's no point in wasting their screen real estate. They'll never receive a talk page message, or if they do, it's because they share their IP with someone else.
In T307851#7916579, @Aklapper wrote:In T307851#7916457, @AlexisJazz wrote:Just stop saying it can't be done and start thinking about how it COULD be done.
@AlexisJazz: Please refrain from getting personal and keep things constructive. Thanks.
Did you already forget T307851#7912034?? (double question marks! aaaargh!) You are not helping to keep things constructive. You're not de-escalating either. Maybe you think you are, but with comments like these (and you post these on a regular basis) you're really not.
Obviously not a duplicate. Subtask at best.
In T307851#7915959, @Jdlrobson wrote:If you can serve a different skin by appending ?useskin=minerva, why can't you do it with a cookie?
In a nutshell, that query string bypasses caching altogether. I suggest you read up on https://wikitech.wikimedia.org/wiki/Caching_overview
This task is impractical.
This is not a duplicate. I'm not asking for Timeless on the mobile domain. More than that, I don't care about the mobile domain at all, the mobile domain should just die.
Mon, May 9
Some relation with T302699? @Majavah @Zabe @dom_walden?
In T307848#7915177, @Jdlrobson wrote:The Minerva overlays create a modal window so they only work with MobileFrontend installed. I think it would be fine to conditionally load this code on desktop too, as some third parties use Minerva without MobileFrontend.
Noting that on desktop modals take up the full screen and look terrible (see T146329 which was declined due to a lack of bandwidth). I'm hoping the work on Codex will provide a better path to building overlays inside the Minerva desktop experience but last time we checked Minerva desktop usage was very very low.
https://nl.wiktionary.org/w/index.php?oldid=22&uselang=en
[79d39db3-ed1e-440d-b472-e0453f2bad40] 2022-05-09 22:41:06: Fatal exception of type "MediaWiki\Revision\RevisionAccessException"