One test to perform in Scribunto on Commons:
local p = require('Module:HTMLEntities')
In T49137#2207582, @Verdy_p wrote: But I agree: we could still allow a Scribunto parser to get a list of a limited number of subpages (e.g. 200) within a range (just like when transcluding a Prefixindex). This would allow creating pages with navigation buttons to get the next or previous range, over which a script could loop.
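The range-based paging described above can be sketched generically (in Python, purely for illustration; the function name and signature are hypothetical, not any existing Scribunto API): given the full sorted list of subpage titles, return one range of at most `limit` entries plus the offsets for the previous and next ranges, which a page could expose as navigation buttons.

```python
def page_of_subpages(titles, offset=0, limit=200):
    """Return one range of subpage titles, plus the offsets of the
    previous and next ranges (None when there is no such range)."""
    chunk = titles[offset:offset + limit]
    prev_off = max(offset - limit, 0) if offset > 0 else None
    next_off = offset + limit if offset + limit < len(titles) else None
    return chunk, prev_off, next_off
```

A script can then loop over the whole set by repeatedly following `next_off` until it is None.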
Do we need to compile Lua with an implementation of the string operator '<' that calls "strcoll()", attempting a collation that never works correctly in any locale on Wikimedia servers? Can it be forced to use binary comparison instead (comparing only unsigned bytes)? That change would alter the results, sorting all ASCII before all non-ASCII or the reverse, but it does not matter much; using unsigned char would still be preferable, so that all non-ASCII bytes (i.e. the rest of the BMP encoded in UTF-8, and the other planes) sort after ASCII.
I have just discovered that table.sort(t) on a simple table of UTF-8 strings does NOT sort the table correctly (producing a nearly random order).
In fact the bug is NOT in table.sort itself, but in the local implementation of the binary operator '<' that compares two strings.
According to the Lua documentation:
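Whatever the library-level cause, the desired binary ordering is well defined: comparing UTF-8 strings byte by byte yields exactly the same order as comparing them by Unicode code point, which is the stable, locale-independent result a binary '<' would give. A minimal illustration (in Python rather than Scribunto, purely to demonstrate the property):

```python
# UTF-8 was designed so that unsigned byte-wise comparison of the
# encoded forms preserves code point order.
words = ["zebra", "éclair", "apple", "Éclair", "zèbre"]

by_bytes = sorted(words, key=lambda w: w.encode("utf-8"))  # binary order
by_codepoints = sorted(words)  # Python's default: code point order

assert by_bytes == by_codepoints
# Both give: ['apple', 'zebra', 'zèbre', 'Éclair', 'éclair']
```

A locale-dependent strcoll()-style comparison would instead interleave accented and unaccented letters (and may misbehave entirely when the server locale does not match the text), which is consistent with the nearly random order observed.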
Adding tags manually should not be necessary (it is not obvious to find and choose the proper one for new bugs, especially for the many translators who do not develop the project itself), if the URL shown on Translatewiki.net for bug reports already identified the project directly from the message group (assuming it is properly configured).
Does this markup work with spaces in sort keys? (i.e. are they kept uncompressed, and protected from being moved out of the span markup by the MediaWiki parser or the "HTML Tidy" pass?)
In T256649#7246452, @Trappist_the_monk wrote: In T256649#7246285, @Esc3300 wrote: It works for the four above. Shall we close this as done?
It is not obvious to me that anything has changed. At en.wiki:
- {{#language:sty|en}} still returns: себертатар
- {{#language:es-formal|en}} still returns: español (formal)
- {{#language:hu-formal|en}} still returns: magyar (formal)
- {{#language:nl-informal|en}} still returns: Nederlands (informeel)
There are some cases where there was no documentation at all in /qqq, and one was added just to carry a temporary "FIXME" note that should be removed once the issue is solved.
One way to avoid that limitation would be to find relevant documentation, or to keep a "FIXME" asking for the missing documentation (which should have been added when resolving the previous issue).
According to Glottolog, this is the same language that it names Akan, to encompass the following dialects: ''Agona'', ''Ahafo'', ''Akyem Bosome'', ''Asen'', ''Dankyira'', ''Fante (or Fanti)'', ''Kwawu'', as well as the group of ''Twi'' dialects. (Glottolog however does not perceive the former dialects to form a single "Fante" group).
The main problem is the color scheme used: if you're deuteranopic, you can't see the difference between non-bold "darkgreen" and "darkred" text on the greyish background; they both look exactly the same.
That color scheme is not accessible to about 15% of men (including me). To see the subtle difference, you need to zoom in the page a lot (with Ctrl+Numpad Plus).
Also, I think that the wikilink targets inserted as part of the message may not facilitate reuse of this "Global Blocking" feature outside Wikimedia, because they would point to Wikimedia's Meta-Wiki. This could conflict with Miraheze and BlueSpice wikis that have their own "Meta" support pages. The target should then probably be passed as a "$1" tvar, so that it can be tuned separately depending on the wiki farm used (if there is one; maybe it is not the same wiki, and its local "M:" interwiki could be used for something else, such as a main support site, or a generic policy page describing multiple policies in the local "Project:" or "Help:" namespaces, not specific to one policy but possibly using target anchors).
Apparently those two messages are part of two different groups:
Those groups reuse the same message ID, but only the second of these groups was recently changed in the sources to include the wikilink; so there are conflicting IDs within the same target namespace (MediaWiki:).
Note that it apparently is just impossible to place any wikilink (or external link by URL), whatever the target or the display text it contains: we always get a warning and a "fuzzy" status for the saved message.
Note that this message is displayed when using __NEWSECTIONLINK__ anywhere on any page (in any namespace, including the main namespace). It then generates a "Start a new section" button linked to "Special:NewSection/(pagename)", which starts the new editor (featuring both the visual editor and the code editor) without leaving the viewed page.
Maintenance of LiquidThreads seems to have been abandoned long ago, so this bug is unlikely to be solved soon.
Yes, but this adds many constraints on talk pages (e.g. with archiving of discussions, or people attempting to talk and posting incorrect links to categories in these talk pages, which then become hard to clean up if everything is mixed, forcing us to edit all these discussions, especially if they post content with template calls). The alternative would be to use other discussion systems, but LiquidThreads, for example, is no longer maintained and has its own problems.
Is there a way to categorize tabular data in Commons' "Data:" namespace?
There's also the problem where MediaWiki still incorrectly surrounds an <bdi></bdi> element (used as a *mixed-content* element whose content may be inline or block) by forcing its inclusion within a paragraph (a dummy and undesired <p></p> HTML element, possibly also causing another block element containing it to be terminated too early). This causes problems for content that should be purely inline, or even totally invisible in the rendered page.
Is it the cause of the current fatal "TypeError" exception on Translatewiki.net when proofreading?
Note also that the "stash on Lingualibre.org" reports the language as "eng" (English), while it should be "fra" (for the audio part, in French) or "fsl" (for the gesture video part, in French Sign Language).
Note that I am not sure whether this must be the local page title on the target wiki, and whether it is the text actually displayed for the link (in which case the namespace "Wikipedia:" may be translated), or whether it is used to derive a URL or a wikilink to the target page (in which case it should also include a language prefix like "fr:Wikipédia:Droit de disparaître", with the effective translated title used on the French Wikipedia, and not just the "Wikipedia:Courtesy vanishing" used on the English Wikipedia; but on the French Wikipedia, the effective title displayed and used for local wikilinks does not include the "fr:" prefix). Note that these pages are not pure translations in their pagename part; rather, the pages on the different wikis are interwiki-linked via Wikidata.
But there is no longer any "upstream" Phabricator, because it has been defunct for a long time; the Wikimedia branch then has to be defined by Wikimedia. And I don't know how Phorge.it is working, as Wikimedia has never been involved in that separate branch, which also depended on the former "upstream" project. If there are still other existing forks of Phabricator, they should join their efforts; but Wikimedia has many more developers for it, and the translations for Phabricator made on translatewiki.net only come from the Wikimedia branch. It is very likely that Phorge.it does not use any of these TWN messages, and that they just use a limited set of translations for a more limited set of target languages.
Is there any effort to join the efforts across these branches (e.g. in a Wikimedia meeting, or a small working group trying to reunite their efforts)?
I did not know that Phorge.it even existed; it is developed as a recent fork of Wikimedia Phabricator, but this message was present long before Wikimedia fully adopted Phabricator (when it was abandoned as open source by its former company). So now, who manages those parts that Wikimedia itself chose not to use?
And it's strange, because Wikimedia Phabricator requires a separate registration, not the Wikimedia SUL used for its wikis.