
Verdy_p (Philippe Verdy)
User

Projects

User does not belong to any projects.

User Details

User Since
Feb 23 2015, 1:39 AM (230 w, 21 h)
Availability
Available
LDAP User
Unknown
MediaWiki User
Verdy p [ Global Accounts ]

Recent Activity

Today

Verdy_p added a comment to T184664: Install Noto fonts on scaling servers for SVG rendering.

You affirmed "I don't know why", but I explained the reason. It's a fact that I was notified by Phabricator just a few minutes ago (maybe Phabricator was very late in delivering its notification email).

Mon, Jul 22, 12:21 AM · Operations, Commons, media-storage, Wikimedia-SVG-rendering
Verdy_p added a comment to T184664: Install Noto fonts on scaling servers for SVG rendering.

I got a recent update today from this channel. It was sent for "Maintenance_bot removed a project: Patch-For-Review", which just happened now.

Mon, Jul 22, 12:11 AM · Operations, Commons, media-storage, Wikimedia-SVG-rendering

Yesterday

Verdy_p added a comment to T184664: Install Noto fonts on scaling servers for SVG rendering.

Note that ALL ISO 15924 scripts marked as encoded in Unicode up to version 9.0 (including historic scripts) have a suitable Noto font (most of them a "Noto Sans <abbreviatedScriptName>", but a few are available in Serif style only). This includes all script variants and script mixes, provided you select the correct fallback for these scripts (e.g. use the default "Latn" script for "Latf" or "Latg", but for "Aran" there's a Nastaliq variant, and likewise for the "Zsye" variant).
For CJK fonts, it's best to use the "script mix" codes to map them: "Jpan", "Kore"; and for "Hans" and "Hant" you should add Bopomofo to the list.
In all cases, for CSS "font-family:" styles, the default font "Noto Sans" for Latin must be added at the end of the lists.
For symbols, there are three fonts to add, in that order: "Noto Sans Symbols", "Noto Sans Symbols2", and "Noto Sans Mono" (the last one, needed for box-drawing characters, should be listed *after* the default font "Noto Sans" for Latin/Greek/Cyrillic).
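To illustrate the fallback ordering above, here is a minimal Lua sketch of a script-to-font-list mapping; the ISO 15924 keys and the exact Noto family names chosen are illustrative assumptions, not an actual rendering-server configuration:

  -- Hypothetical mapping from ISO 15924 script codes to ordered font lists;
  -- the default Latin font "Noto Sans" closes every list, as described above.
  local notoFallback = {
    Latn = { 'Noto Sans' },
    Latf = { 'Noto Sans' },                              -- Fraktur Latin falls back to the default Latin font
    Aran = { 'Noto Nastaliq Urdu', 'Noto Sans' },        -- Nastaliq variant of Arabic
    Jpan = { 'Noto Sans CJK JP', 'Noto Sans' },          -- Japanese script mix
    Hans = { 'Noto Sans CJK SC', 'Noto Sans' },          -- Simplified Chinese
    Zsym = { 'Noto Sans Symbols', 'Noto Sans Symbols2', 'Noto Sans', 'Noto Sans Mono' },
  }
  -- e.g. table.concat(notoFallback.Jpan, ', ') yields a CSS "font-family:" value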

Sun, Jul 21, 11:52 PM · Operations, Commons, media-storage, Wikimedia-SVG-rendering

Sat, Jul 20

Verdy_p added a comment to T194125: [RFC] Future of charset and collation for mediawiki on mysql .

It's not just MySQL. Other organizations may use MSSQL, Sybase, Oracle, or Informix, all of which have their own charset support (and in all of them, installing additional charsets to support full UTF-8 is costly, as it also requires installing and maintaining collation data). Frequently, collations cannot be updated at each Unicode version, because that requires costly reindexing (but partial UTF-8 support is possible, and I think this is the reason why MySQL defined utf8 as utf8mb3, even though it also requires updating the collations when there are Unicode or CLDR updates for characters encoded in the BMP).

Sat, Jul 20, 9:36 AM · MediaWiki-Installer, MediaWiki-General, Core Platform Team (Security, stability, performance and scalability (TEC1)), MediaWiki-Database

Mon, Jul 15

Verdy_p added a comment to T228012: Disable Mediawiki parser on translations pages .

If you don't have it, then my comment is a feature request that would allow Translatewiki.net to be more useful for other projects (and would also avoid polluting Translatewiki.net with broken links and missing categories when some programming or markup language uses a "custom" placeholder syntax).

Mon, Jul 15, 9:30 PM · translatewiki.net
Verdy_p added a comment to T194125: [RFC] Future of charset and collation for mediawiki on mysql .

Your suggestion does not apply: it's not viable to convert all tables of an existing database that has other uses.
And I do not necessarily "want to support full-plane UTF-8" in a utf8mb3 config. MediaWiki should still run without problems with that config, without causing major issues because of some unsupported characters that MediaWiki never checks.
Reread what I asked: I just want MediaWiki to check the character set (a simple insert or update in the database at startup, followed by a read, can immediately detect whether non-BMP characters are safe or not; it is enough to set a flag and then let MediaWiki produce a correct preview warning the user that the edit cannot be saved "as is").
But the fact that MediaWiki continues working as if there was no problem (with nothing visible or reported, even when previewing the edited page) is unsafe.
Is it so complicated to make such a check, which has a near-zero cost on a utf8mb4 config, and would force the code to validate text before saving or previewing only if this non-UTF-8 config is detected? What is the performance impact, really?
Now you suggest that I develop a patch, but that requires me to work on MediaWiki itself (and I don't like programming in PHP). My initial bug asked some developer to consider this as a request for improvement and fixing; this old bug has been valid for years and is still valid today, it is just not solved yet. The current developers only seem to consider the needs of Wikimedia for its own wikis, but forget the needs of other wikis that have different goals (MediaWiki is not made just for Wikimedia, which has lots of WM-specific features not portable to other places that don't have its large server farm and complex storage configuration). Most wikis outside Wikimedia run on a single host running its own local database engine (and cannot support multiple engines, due to resource constraints). That's why MediaWiki has many optional plugins they don't have to support, and why MediaWiki also supports several DB engines (and I don't see why it could not support an existing utf8mb3 config, even if this means that users won't be able to post non-BMP characters; but in that config MediaWiki should still be safe to use, and for now it is not).

Mon, Jul 15, 9:06 AM · MediaWiki-Installer, MediaWiki-General, Core Platform Team (Security, stability, performance and scalability (TEC1)), MediaWiki-Database
Verdy_p added a comment to T228012: Disable Mediawiki parser on translations pages .

But even if the parser is disabled on Translatewiki.net, the message will be imported into a wiki where it will be incorrect: it will categorize the page displaying the message, but will not display the link and its text, leaving an incomplete, nonsensical sentence.
Unless the "Intuition" plugin uses its own message parser and not the wiki parser: in that case you should avoid wiki syntax for placeholders in the message (and then no fix is needed).
That message has no comment in "qqq" saying that it does not use wiki syntax, and it was imported into Translatewiki.net with the default flags saying it uses wiki syntax.
You then still need to fix the source message so that Translatewiki.net can infer the correct thing (this is what is done for messages intended for C/C++ or other programming or markup languages).
So can you state clearly that the wiki parser will not be used on wikis where the "Intuition:Catdown" extension is used? You need to check the source code and test it. If the wiki parser is effectively not used, then add the flag to the import that tells Translatewiki.net that it uses another parser, and don't forget to document it in "/qqq".
Even if we use the translate interface, the message will be stored in wiki format, and it creates nonexistent target links in the Translatewiki.net tracking categories. You should avoid that!

Mon, Jul 15, 8:16 AM · translatewiki.net

Fri, Jul 12

Verdy_p added a comment to T194125: [RFC] Future of charset and collation for mediawiki on mysql .

T135969 is an old bug, but it was recently closed abusively, even though it was reported many years ago (and was then considered perfectly valid, because it really affected several Wikimedia wikis).
Now you refuse to fix it and say this is an "invalid" bug even though it is still there. MediaWiki was designed to support multiple SQL backends (and even Wikimedia has changed and migrated its databases multiple times). But there are still installations that CANNOT make such an SQL migration (the SQL engine is also used for something else and contains other data besides the wiki, or the wiki is used as an interface for other applications needing its existing database, for which it's not viable to change the engine because apps other than MediaWiki are using it).

Fri, Jul 12, 10:37 PM · MediaWiki-Installer, MediaWiki-General, Core Platform Team (Security, stability, performance and scalability (TEC1)), MediaWiki-Database
Verdy_p added a comment to T135969: Saved edited page is truncated on supplementary characters (e.g. Emojis, or supplementary chinese, in Unicode planes 1 or 2) when your database doesn't support that.

No, this is still installed as it was always documented. The basic test I request is also on topic for "MediaWiki-Database". This is a real bug in that part of MediaWiki, which never asserts but only assumes the database is configured as you expect. Wikimedia itself has changed multiple times the way encodings are used in the DB, and changed the SQL adapters appropriately, but it forgot this case, which is very simple to test (or at least assert at startup). If you made it an assertion that stopped the engine, you would receive tons of complaints that MediaWiki now refuses to run.
It can still be easily corrected by implementing (when required) an encoding converter (using NCRs for example, or saving pairs of surrogates, if supported by the engine).

Fri, Jul 12, 10:14 PM · MediaWiki-Database
Verdy_p reopened T135969: Saved edited page is truncated on supplementary characters (e.g. Emojis, or supplementary chinese, in Unicode planes 1 or 2) when your database doesn't support that as "Open".

That's wrong. Being "capable" is just assumed; it is never checked, and there are existing wikis using SQL backends that silently drop non-BMP characters (and everything that follows them), one of them being the OpenStreetMap wiki. Maybe it's misconfigured, but MediaWiki completely forgets to check that, and this causes silent data loss when editing.

Fri, Jul 12, 9:58 PM · MediaWiki-Database

Thu, Jul 4

Verdy_p added a comment to T19160: Gender specific display text for User namespace.

I know this is old, but there's the idea of using GENDER (from the viewing user) to change the title of a namespace in which every user is neutral! Or maybe this will just apply when the "User:" prefix is used before an existing registered user (whose gender is known). But frequently we can't refer to ''any'' user just by name (in some parameter) to guess which gender should apply. But maybe MediaWiki, when it sees "User:Name", could itself change the gender of the namespace found in a link, or when viewing the user page (or one of its subpages), according to that user's preference.
In all cases, the gender forms for the "User:" namespace must be aliases on the target wiki, and this can cause problems on multilingual wikis if all gender forms in all languages must be used (examples of multilingual Wikimedia wikis are Commons and Meta, but these also have a "default language", English, which does not need any gender form, so there is no need to create aliases there).

Thu, Jul 4, 7:25 PM · I18n, MediaWiki-Internationalization

Wed, Jul 3

Verdy_p added a comment to T226312: tpt-languages-separator message is inconsistent between languages.

It's not a yes/no question, but multiple questions packed into one, to which it is impossible to reply with yes or no.

Wed, Jul 3, 10:27 AM · MediaWiki-extensions-Translate
Verdy_p added a comment to T226312: tpt-languages-separator message is inconsistent between languages.

@Aklapper: You still did not ask any yes/no question. You just mentioned me with the goal of getting explanations, and that's what I gave (in a structured way, even if that's not what you expected; but then what you expected is not what you asked for).

Wed, Jul 3, 9:04 AM · MediaWiki-extensions-Translate

Mon, Jul 1

Verdy_p added a comment to T226863: Wiki markup used in "tpt-languages-separator" translation is not interpreted by parser.

Finally, another reason is that the 4/4 icon is also very ugly: it should be dropped when the translation is complete (i.e. the four green squares), making it visible only for incomplete ones, just to signal to users that the link goes to an incomplete page.
We would then have cleaner lists once translations are completed, separated only by tiny bold middle dots and not by the bold bullet. The icons and the "big bullet" are both undesirable for standard navigation.

Mon, Jul 1, 10:31 AM · MediaWiki-extensions-Translate, MediaWiki-General
Verdy_p added a comment to T226863: Wiki markup used in "tpt-languages-separator" translation is not interpreted by parser.

Also, I don't see the point of using the very bold "standard bullet": there's already a thick separator introduced by the 4/4 colored icon.

Mon, Jul 1, 10:04 AM · MediaWiki-extensions-Translate, MediaWiki-General
Verdy_p added a comment to T226863: Wiki markup used in "tpt-languages-separator" translation is not interpreted by parser.

Also the "big bullet" is actually much too bold in horizontal lists, it obscures the text.
These big bullets are only suitable for vertical lists. The "default" value is then bad. Note that the vertical line used in some other lists of the interface (notably in categories) is also bad (the vertical line is confused with actual letters of some scripts).
The middle dot is correct ONLY if it is surrounded by spaces and distinguished for some other dors used in some scripts. Note that the middle dot may now be used in French in the middle of words (notably for the "inclusive orthography" noting masculine + feminine), making it a bit bolder (but still not the ugly very bold "bullet") and surrounded by spaces avoids all confusions and still has a good interpretation as a punctuation separator.

Mon, Jul 1, 10:00 AM · MediaWiki-extensions-Translate, MediaWiki-General
Verdy_p added a comment to T226863: Wiki markup used in "tpt-languages-separator" translation is not interpreted by parser.

No, it was made to be consistent with other lists in many places of the interface.

Mon, Jul 1, 9:54 AM · MediaWiki-extensions-Translate, MediaWiki-General

Sat, Jun 29

Verdy_p reopened T135969: Saved edited page is truncated on supplementary characters (e.g. Emojis, or supplementary chinese, in Unicode planes 1 or 2) when your database doesn't support that as "Open".
Sat, Jun 29, 4:05 PM · MediaWiki-Database
Verdy_p added a comment to T135969: Saved edited page is truncated on supplementary characters (e.g. Emojis, or supplementary chinese, in Unicode planes 1 or 2) when your database doesn't support that.

I just wanted MediaWiki to perform a basic check when it is not installed on a compliant database (this test can be extremely fast at startup: check whether non-BMP characters are supported or cause text to be truncated; in the latter case, a global boolean flag is set, and any data submission containing such a non-BMP character triggers a warning so the user is informed that these characters are not supported; text containing them is then rejected, and no unexpected truncation occurs silently). This is a basic safety feature, as many wikis cannot be reinstalled on another database without a long offline migration period, and the underlying database may possibly not support it.

Sat, Jun 29, 4:05 PM · MediaWiki-Database

Wed, Jun 26

Verdy_p added a comment to T226312: tpt-languages-separator message is inconsistent between languages.

You did not ask any question that I could answer with yes or no. You just pinged me with "maybe I would know best", so you wanted some explanation. That's exactly what I gave.

Wed, Jun 26, 7:34 AM · MediaWiki-extensions-Translate

Mon, Jun 24

Verdy_p added a comment to T226312: tpt-languages-separator message is inconsistent between languages.

There's NOTHING that Lua cannot handle, but modules have to do it themselves (and most of them don't!). We already have various helper modules that parse the wikitext in parameters (in the appropriate frame context) and convert it to stripped wikitext, or perform the full conversion to HTML, cleaning up safe HTML, normalizing, trimming, compressing, detecting equivalent values (including normalizing input numbers), and performing case folding.
However, I'm not sure that all these steps can be implemented by MediaWiki (before using the Scribunto hook) or by Scribunto itself:

Mon, Jun 24, 9:05 PM · MediaWiki-extensions-Translate
Verdy_p added a comment to T226312: tpt-languages-separator message is inconsistent between languages.

In fact, it's up to each Lua module to determine which parameter values it considers equivalent, so that it can canonicalize them first.

Mon, Jun 24, 5:56 PM · MediaWiki-extensions-Translate
Verdy_p added a comment to T226312: tpt-languages-separator message is inconsistent between languages.

Unfortunately, I was cited for a single edit made 5 years ago (not invalid at that time, and not conflicting with anyone as it had no prior history). I cannot remember exactly why I used "&nbsp;" in that case when other languages used (most probably later) "&#160;" (which is also unexplained).

Mon, Jun 24, 5:46 PM · MediaWiki-extensions-Translate

May 6 2019

Verdy_p added a comment to T4085: Add a {{USERLANGUAGE}} magic word.

I'm not convinced this is needed to parse the page, only to generate its content. But is this related to conditional code like #if and #switch, and to transclusion of Lua-generated content (which would then need to generate all linguistic versions until a language filter is applied at the end to purge the excluded sections)?

May 6 2019, 2:58 PM · Parsing-Team, Performance-Team (Radar), Patch-For-Review, MediaWiki-Parser, I18n, MediaWiki-Internationalization
Verdy_p added a comment to T4085: Add a {{USERLANGUAGE}} magic word.

And we still lack the possibility of marking a specific page (with a magic syntax generating metadata, not content, like "[[Category:...]]") as being primarily in a specific language (independent of the user language; but that should NOT be inserted in pages marked for translation with the translation tools, which are marked automatically by the Translate tool and use a specific page-naming convention with "/langcode" suffixes/subpages, or some "langcode:" prefix or namespace, as on the OpenStreetMap wiki).

May 6 2019, 2:08 PM · Parsing-Team, Performance-Team (Radar), Patch-For-Review, MediaWiki-Parser, I18n, MediaWiki-Internationalization

Mar 12 2019

Verdy_p added a comment to T217973: mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) can't work properly in Korean Hangul contents?.

Your suspicion is wrong: I used standard jamos, not compatibility ones, and standard Hangul syllables.

I gave the sample code that just uses

#( mw.ustring.toNF[K][C/D] ( teststring ) )
to test the length of the result (3 bytes per Korean character in UTF-8).
I've not been able to get any NFD decomposition from an NFC-encoded standard Korean string in Lua (with the currently deployed "mw.ustring" package) when there are precomposed Hangul LVT or LV syllables (which are used everywhere in the NFC form of almost all Korean texts). So you just tested that you got correct NFC from an NFC string, but NFD is still not working, and canonical equivalence is still not working across all forms (NFC, NFD, or other non-normalized forms).
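As a minimal illustration of that kind of test, here is a short Scribunto sketch (assuming mw.ustring is available, e.g. in the Lua debug console; the sample syllable U+AC01 is chosen only for illustration):

  -- U+AC01 (HANGUL SYLLABLE GAG) should decompose canonically to U+1100 U+1161 U+11A8.
  local s = mw.ustring.char(0xAC01)     -- precomposed NFC form, 3 bytes in UTF-8
  local d = mw.ustring.toNFD(s)         -- expected: three jamos, 9 bytes in UTF-8
  mw.log(#s, #d)                        -- 3, 9 if decomposition works; 3, 3 if it does not
  mw.log(mw.ustring.toNFC(d) == s)      -- true if canonical equivalence round-trips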

Mar 12 2019, 9:37 PM · MediaWiki-extensions-Scribunto, utfnormal

Mar 10 2019

Verdy_p updated the task description for T217973: mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) can't work properly in Korean Hangul contents?.
Mar 10 2019, 8:34 AM · MediaWiki-extensions-Scribunto, utfnormal
Verdy_p updated the task description for T217973: mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) can't work properly in Korean Hangul contents?.
Mar 10 2019, 8:34 AM · MediaWiki-extensions-Scribunto, utfnormal
Verdy_p updated the task description for T217973: mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) can't work properly in Korean Hangul contents?.
Mar 10 2019, 8:32 AM · MediaWiki-extensions-Scribunto, utfnormal
Verdy_p updated the task description for T217973: mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) can't work properly in Korean Hangul contents?.
Mar 10 2019, 8:30 AM · MediaWiki-extensions-Scribunto, utfnormal
Verdy_p updated the task description for T217973: mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) can't work properly in Korean Hangul contents?.
Mar 10 2019, 8:30 AM · MediaWiki-extensions-Scribunto, utfnormal
Verdy_p updated the task description for T217973: mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) can't work properly in Korean Hangul contents?.
Mar 10 2019, 8:24 AM · MediaWiki-extensions-Scribunto, utfnormal
Verdy_p updated the task description for T217973: mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) can't work properly in Korean Hangul contents?.
Mar 10 2019, 8:23 AM · MediaWiki-extensions-Scribunto, utfnormal
Verdy_p added a comment to T153994: Migrate MediaWiki extensions away from UtfNormal in MediaWiki core to external UtfNormal library.

This is off-topic. Please file a separate bug.

Mar 10 2019, 8:09 AM · Patch-For-Review, Google-Code-In-2016, utfnormal, good first bug, Technical-Debt

Feb 11 2019

Verdy_p added a comment to T7303: UtfNormal replacement proposal.

Note that the current implementation found in Commons does not work!

Feb 11 2019, 1:58 PM · utfnormal, MediaWiki-General
Verdy_p added a comment to T153994: Migrate MediaWiki extensions away from UtfNormal in MediaWiki core to external UtfNormal library.

mw.ustring.toNFD(str) and mw.ustring.toNFKD(str) do not work as expected for all modern Korean Hangul syllables:

  • the algorithmically composed syllables (LVT or LV forms using basic jamos) in the range U+AC00..U+D7AF are still not decomposed at all (the algorithmic decomposition is sketched below):
    • the current implementation only uses the simple decomposition mapping pairs found in the UCD;
    • most decomposition mappings for Korean are found in the UCD, except those for Hangul LVT and LV syllables using "modern simple jamos";
    • only the decompositions to "legacy jamos" (some of them of type VV; there are also some LVT or LV forms, but using legacy jamos not part of the precomposed Hangul syllable ranges) are in the UCD!
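For reference, the algorithmic decomposition that the UCD does not list explicitly is simple to compute; here is a minimal Lua sketch using the constants of the standard Hangul composition algorithm (the function name is illustrative):

  -- Decompose a precomposed Hangul syllable (U+AC00..U+D7A3) into its jamos.
  local SBase, LBase, VBase, TBase = 0xAC00, 0x1100, 0x1161, 0x11A7
  local VCount, TCount, SCount = 21, 28, 11172
  local function decomposeHangul(cp)
    local SIndex = cp - SBase
    if SIndex < 0 or SIndex >= SCount then return nil end  -- not a precomposed syllable
    local L = LBase + math.floor(SIndex / (VCount * TCount))
    local V = VBase + math.floor((SIndex % (VCount * TCount)) / TCount)
    local T = SIndex % TCount
    if T == 0 then return L, V end                         -- LV syllable
    return L, V, TBase + T                                 -- LVT syllable
  end
  -- decomposeHangul(0xAC01) --> 0x1100, 0x1161, 0x11A8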
Feb 11 2019, 1:54 PM · Patch-For-Review, Google-Code-In-2016, utfnormal, good first bug, Technical-Debt

Oct 2 2018

Verdy_p added a comment to T199106: wgULSGeoService and https://freegeoip.net/ out of service.

"Browsers seem" is not sufficient. The actual bug is open in specifications of CORS and XMLHttpRequest and there are headers to control this. But this requires developing a javascript library. So yes the bug can (and should) be resolved. Various bugs are also open in browser development to better control this. There are lot of malicious attacks using these redirects and lot of demands to control and avoid pesky redirects which are unsafe, and still maintain a website usable without causing exceptions like those caused here.

Oct 2 2018, 1:06 PM · Language-Team (Language-2018-October-December), MediaWiki Language Extension Bundle, MW-1.32-notes (WMF-deploy-2018-09-18 (1.32.0-wmf.22)), UniversalLanguageSelector
Verdy_p added a comment to T199106: wgULSGeoService and https://freegeoip.net/ out of service.

Was there also a patch in the code that handles queries to external web APIs (independently of the external service) so that they won't honor any further redirect from HTTPS to HTTP? This is still needed for security, and may affect other MediaWiki extensions: such redirects should not be followed at all by default (except with a specific authorization in the module/extension using that external HTTPS API, via an optional parameter to the support library); the global setting specific to $wgULSGeoService is not enough, as the problem is more general and we shouldn't need to multiply such global settings when each extension should have its own.

Oct 2 2018, 8:00 AM · Language-Team (Language-2018-October-December), MediaWiki Language Extension Bundle, MW-1.32-notes (WMF-deploy-2018-09-18 (1.32.0-wmf.22)), UniversalLanguageSelector

Sep 19 2018

Verdy_p added a comment to T198834: Have a simpler mapframe syntax, similar to wikitext for files.

No: it's not "confusing", as the purpose was exactly to imitate the syntax of files, but with another prefix, and notably (as stated in the proposal allow simple basic conversion by replacing "File:name" by "Mapframe:x/y/z" and keeping all the additionals parameters used in Files, including notably sizing, positioning, framing, alignment, description, and as well supporting the same for links to files (using ":" before "File:" to go to a separate full page, such as a link with "[[:File:name|text]]" just becomes "[[:Mapframe:x/y/z|text]]" with the same basic replacement.

Sep 19 2018, 7:31 PM · Maps (Kartographer)

Sep 7 2018

Verdy_p added a comment to T199106: wgULSGeoService and https://freegeoip.net/ out of service.

Anyway, this bug may reoccur at any time: I really suggest that any script or extension making HTTPS requests to a third-party site does NOT honor any 403 redirect to HTTP, but instead logs a warning or treats that redirect as a server-side error (as if it were HTTP 500, not HTTP 403). This will then allow scripts to behave correctly and not bypass the security.
I wonder if there's a way to handle that generically in a common library used by all extensions.
This would make them safer: the status can be kept as 403 instead of being replaced by the status of the redirected page. The library loading the resource should still flag the resource as being in error. The HTTP status text may be kept, but with " (error: redirect from HTTPS to HTTP forbidden)" appended. Maybe some scripts/extensions need to avoid this check; a flag could bypass it and still honor the redirect, but the HTTP status text should then have " (warning: deprecated redirect from HTTPS to HTTP)" appended.

Sep 7 2018, 11:46 AM · Language-Team (Language-2018-October-December), MediaWiki Language Extension Bundle, MW-1.32-notes (WMF-deploy-2018-09-18 (1.32.0-wmf.22)), UniversalLanguageSelector

Aug 17 2018

Verdy_p added a comment to T164658: Recursive "bdi" elements incorrectly parsed.

OK, this looks good when actually testing it on the MediaWiki wiki. The bug can be closed (once the new version is fully deployed on the other wikis).

Aug 17 2018, 2:03 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 2:02 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 2:01 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 2:01 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p added a comment to T164658: Recursive "bdi" elements incorrectly parsed.

I am testing it; I'll reply after that.

Aug 17 2018, 1:54 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 1:53 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 1:53 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 1:51 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 1:47 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 1:46 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 12:08 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 12:06 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 12:03 PM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 11:50 AM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL
Verdy_p updated the task description for T164658: Recursive "bdi" elements incorrectly parsed.
Aug 17 2018, 11:49 AM · MediaWiki-Internationalization, MediaWiki-Parser, I18n, RTL

Aug 7 2018

Verdy_p added a comment to T199106: wgULSGeoService and https://freegeoip.net/ out of service.

This breaks all wikis using the ULS (most of them). Pages refuse to load, saying that an attempt was made to load an unsecured script over HTTP (even though it was requested via HTTPS): FreeGeoIP in fact redirects the HTTPS requests to an HTTP-only error page.
As this can cause a whole wiki to break in a secure browser, you should provide a default catcher that detects such redirects from HTTPS to HTTP and does not break the browser by trying to follow the redirect to the returned error (JSON) instead of the actual result. Actually, no script that is part of MediaWiki should tolerate such unsafe redirects to another site or protocol, unless it is specifically configured to allow them.
ULS should then provide the safe checking.

Aug 7 2018, 11:30 PM · Language-Team (Language-2018-October-December), MediaWiki Language Extension Bundle, MW-1.32-notes (WMF-deploy-2018-09-18 (1.32.0-wmf.22)), UniversalLanguageSelector

Aug 4 2018

Verdy_p added a comment to T165585: Make creating a new Language project easier.

Frankly, do we really need things like "-formal" and "-informal"? They can't be recognized by browsers, as no browser thinks these are country codes.

Aug 4 2018, 8:58 AM · Release-Engineering-Team, Release-Engineering-Team-TODO, incubator.wikimedia.org, Language-strategy, I18n, Epic
Verdy_p added a comment to T165585: Make creating a new Language project easier.

@Verdy_p:

cbk-zam -> should be aliased to ???

should be renamed back to cbk, see T124657

map-bms -> aliased to "bms"

Huh? Banyumasan = Bilma Kanuri?

simple -> should be aliased to "en-x-simple"

Just en-simple, no need to use "-x-" here.

Aug 4 2018, 8:44 AM · Release-Engineering-Team, Release-Engineering-Team-TODO, incubator.wikimedia.org, Language-strategy, I18n, Epic

Aug 2 2018

Verdy_p added a comment to T165585: Make creating a new Language project easier.

stop limiting the language codes to 3 characters

The following languages with more than 3 characters already exist in production:

~/dns/templates/helpers$ cut -d\' -f2 langs.tmpl | grep -E '^[a-z-]{4,}'
(...)

bat-smg -> aliased to "sgs"
be-tarask -> conforming to BCP 47
be-x-old -> aliased to "be-tarask"
cbk-zam -> should be aliased to ???
fiu-vro -> aliased to "vro"
map-bms -> aliased to "bms"
minnan -> aliased to "nan"
nds-nl -> conforming to BCP 47
roa-rup -> aliased to "rup"
roa-tara -> should be aliased to "it-x-tara"
simple -> should be aliased to "en-x-simple"
zh-cfr -> aliased to "nan"
zh-classical -> aliased to "lzh"
zh-min-nan -> aliased to "nan"
zh-yue -> aliased to "yue"
nrm -> should be first aliased to "nrf", then the "nrm" alias deleted after (mostly) complete migration (and cleanup of Wikidata)

Aug 2 2018, 11:11 AM · Release-Engineering-Team, Release-Engineering-Team-TODO, incubator.wikimedia.org, Language-strategy, I18n, Epic
Verdy_p added a comment to T165585: Make creating a new Language project easier.

My suggestion is not just for Wikimedia wikis. This is a general need for the deployment of various wikis that would like to be more flexible in what is shared and what is not, without necessarily needing a specific domain for each wiki sharing common namespaces (notably "User:" and "User talk:", as well as user preferences for a single registration, and possibly even other namespaces like "Template:", "Template talk:", "Module:", "Module talk:", "File:", "File talk:", "Category:", "Category talk:", "Help:", "Help talk:"; with only "Project:" and "Project talk:" being specific and hosted under their own "interwiki" code).

Aug 2 2018, 11:04 AM · Release-Engineering-Team, Release-Engineering-Team-TODO, incubator.wikimedia.org, Language-strategy, I18n, Epic

Jul 31 2018

Verdy_p added a comment to T165585: Make creating a new Language project easier.

Note that there's absolutely NO need to create "temporary" domains for language codes and projects in the Incubator. We can just use the existing interwiki prefixes: they already work as rewrite rules for URLs, and they can already be resolved within the Incubator domain and its path structure.
All that is needed is to patch the rewrite rules for their language prefix and project prefix.
This way we could still use normal interwiki links across all projects, so that "lang:Articlename" in any Wikipedia edition or in any Wikipedia incubator subproject would link to "incubator:Wp/lang:Articlename"; this would also apply to Wikidata, which could then also accept Wikipedia links using "lang:Articlename" instead of "incubator:Wp/lang:Articlename".

Jul 31 2018, 9:32 AM · Release-Engineering-Team, Release-Engineering-Team-TODO, incubator.wikimedia.org, Language-strategy, I18n, Epic

Jun 25 2018

Verdy_p added a comment to T172035: Blockers for Wikimedia wiki domain renaming.

Finally,

some ot them may look like language/locale codes but are not.

What's "ot" here? fire? grass? eight?...

Jun 25 2018, 6:26 PM · Language-strategy, Wikimedia-Site-requests

Jun 20 2018

Verdy_p added a comment to T196371: Provide a multi-language user-faced warning regarding AES128-SHA deprecation.

Will that affect Wikipedia Zero, given that it is freely hosted by third-party volunteer ISPs providing their own caching proxy? Can their proxies support the new security requirement for connecting to Wikimedia sites and delivering their content (and optionally allow their users to contribute via those proxies)?

Jun 20 2018, 1:00 PM · User-notice, User-Johan, Operations, Traffic

Jun 14 2018

Verdy_p added a comment to T196371: Provide a multi-language user-faced warning regarding AES128-SHA deprecation.

Not even able to read the wiki in an enforced incognito mode (removing all private session keys, disabling some scripts, just rendering the content)?

Jun 14 2018, 1:51 PM · User-notice, User-Johan, Operations, Traffic
Verdy_p added a comment to T196371: Provide a multi-language user-faced warning regarding AES128-SHA deprecation.

Note that because of ULS, using "Wikipedia" instead of "Wikimedia" is still accurate: the secure logon will be made on other wikis simultaneously, including Wikipedia, when you are on any other Wikimedia site. (There's a small exception for some internal Wikimedia wikis that are not connected this way but use a separate logon; these internal wikis are used mostly by English-speaking users, except possibly the WM conference wikis created each year, whose users may be using another local language, and some old wiki projects whose editing has been stopped and which are kept online as read-only archives where no logon via ULS is necessary; their few administrators will very likely have a way to use decent alternate browsers if needed, or can contact another admin to perform an emergency action in case of problems.)

Jun 14 2018, 1:13 PM · User-notice, User-Johan, Operations, Traffic
Verdy_p added a comment to T196371: Provide a multi-language user-faced warning regarding AES128-SHA deprecation.

Isn"t there a way for the wiki server to autodetect those browsers that are still using the legacy TLS implementation and add some flag whose fvlue that can be used to conditionally display the warning?

Jun 14 2018, 1:04 PM · User-notice, User-Johan, Operations, Traffic

Jun 13 2018

Verdy_p added a comment to T127680: Rename Serbo-Croatian Wikipedia and Wiktionary from sh.wiki* to hbs.wiki*.

Once again, the deletion from ISO 639-1 is not relevant; the code still conforms to BCP 47, so there's no need at all to rename this one (and "sh" cannot be reallocated to any other language). We just want to conform to BCP 47, which is stable (ISO 639 is not stable and in fact ambiguous in many other cases; ISO 639-1 is no longer a normative source for BCP 47, the informative reference exists for historical purposes only, and we do not conform to ISO 639 and will never be able to; all the web standards are based on BCP 47, which explains in its RFC the differences and why not all ISO 639 codes are accepted, as it contains numerous classification errors and ISO 639 is inconsistent; ISO 639 remains used only for old bibliographic purposes, for libraries of printed books and old legal archives, not for technical tagging; many libraries have stopped using ISO 639 and have converted to BCP 47, which is consistent, stable, and much more precise).

Jun 13 2018, 2:19 PM · Wiki-Setup (Rename), Wikimedia-Language-setup, I18n

May 31 2018

Verdy_p added a comment to T135969: Saved edited page is truncated on supplementary characters (e.g. Emojis, or supplementary chinese, in Unicode planes 1 or 2) when your database doesn't support that.

Whether you like it or not, it is a fact that MediaWiki will already have been installed with the "utf8" option via the scripts long delivered for creating the initial MySQL database.
Now you say you don't support it, but this was supported as long as no one attempted to use supplementary characters (outside the BMP, needing 4 bytes and not just 3).
It is a fact that MySQL silently truncates strings when storing them; no error is returned. The preview before saving is still correct (so it is not a problem of PHP, HHVM, or OS compatibility).

May 31 2018, 3:36 PM · MediaWiki-Database

May 6 2018

Verdy_p added a comment to T176370: Migrate to PHP 7 in WMF production.

What was off-topic was the sentence I commented on: "proposals seen as an attack", which is completely wrong. There's nothing bad in making proposals, and it is not attacking anyone or any existing project. It's just about re-evaluating choices that were made years ago under assumptions which may no longer be true. Yes, we have pushed MediaWiki into creating its own isolated ecosystem with less support, and most people on wikis do not know how to use this old Lua specification (and its many limitations). Now Lua on Wikimedia servers is draining too many resources, and we all want faster wikis (and it's not just about choosing between HHVM and PHP 7, where there's only a very minor, or in fact no measurable or significant, performance improvement).
What makes a real performance difference is how the frameworks developed on top of HHVM or PHP are written, and whether they can benefit from newer versions, optimizations, and security efforts. MediaWiki tends to be more and more isolated from worldwide developments. We should not isolate ourselves from this trend: as the technology gap keeps dangerously increasing, we'll get fewer users and fewer developers, and this will eventually also affect Wikimedia sites if only a few highly specialized users can use it and end up controlling all the content and blocking its evolution. We should be more open.

May 6 2018, 12:11 PM · Core Platform Team (PHP7 (TEC4)), Patch-For-Review, TechCom-RFC (TechCom-Approved), User-ArielGlenn, HHVM, Operations
Verdy_p added a comment to T176370: Migrate to PHP 7 in WMF production.

"proposals as an attack" ? Strange attitude. I's easy to see that Lua is in fact very slow compared to modern JS engines. And it is damn more complex to program than JS. It also drains more resources (CPU and memory). There used to be some good use cases for LUA in the past that JS did not support but since JS version ES5 and even more since ES6, there's no real difference in use cases; And JS now benefits from a lot more developers for their engines and for securing the exposed APIs or checking and enforcing the conformance requirements.
I'm sure that many admins choosing Mediawiki would like to integrate JS scripting. Many wikis (notably those runnign on small servers, not like those large farms used by Wikimedia) have banned using Scribunto/Lua.. And too many mediawiki users do not want to learn another scripting language and would prefer using JS because it has many more training resources available for them (Lua looks so much "exotic" with its "metatables" and compelx handling of datatypes, default values).

May 6 2018, 11:45 AM · Core Platform Team (PHP7 (TEC4)), Patch-For-Review, TechCom-RFC (TechCom-Approved), User-ArielGlenn, HHVM, Operations

May 5 2018

Verdy_p added a comment to T176370: Migrate to PHP 7 in WMF production.

Note that PHP is only marginally faster than HHVM, and only starting with PHP 7.2. HHVM is still faster than PHP 7.1 and PHP 7.0 in terms of requests per second (this was measured for WordPress, another text-templating CMS similar to MediaWiki in many use cases).
Some tests show that PHP 7.0+ can be faster than HHVM when plugins are present, because integration of plugins within PHP is probably easier and faster than in HHVM: this could be the case here, allowing MediaWiki to coexist with other server-side extensions used by Wikimedia (notably Scribunto for Lua, where the HHVM and Lua VMs may compete for the same system resources, notably memory). However, most MediaWiki-based wikis do not even use or want Scribunto and Lua.
The main reason for getting back to PHP is probably long-term evolution (not that I think HHVM itself is suddenly terminated, or that Facebook will stop supporting it, unless Facebook considers that HHVM development is no longer needed once everything it wanted in PHP is integrated into the main branch of PHP; and PHP itself keeps improving the speed of its own VM and JIT compiler, notably with contributions made and sold by Zend). PHP will probably include new features that may take time, or would be complex, to port back into HHVM if that requires rewriting significant parts of the JIT and securing the code (think about the nightmare caused by Spectre/Meltdown: there are probably many more developers tracking resistance to timing attacks in PHP than in HHVM, simply because many more sites use PHP). HHVM is rarely proposed or supported by hosting providers, which still love PHP: they don't have to support it directly, as it has a larger community of fans and developers finding tricks and workarounds when something has to be replaced or removed. PHP is much easier to maintain and tweak to these needs, is also vastly more portable across different CPU/network/storage/OS architectures, is really easy to deploy in virtual or distributed cloud environments with fewer restrictions than HHVM, and can run on smaller systems, which is also a goal for MediaWiki, to be usable on small sites...

May 5 2018, 11:07 PM · Core Platform Team (PHP7 (TEC4)), Patch-For-Review, TechCom-RFC (TechCom-Approved), User-ArielGlenn, HHVM, Operations
Verdy_p added a comment to T184664: Install Noto fonts on scaling servers for SVG rendering.

So you just installed a subset of Noto fonts for only a few *scripts*, not even all the scripts already used in Wikimedia projects...
Why are most "minor" scripts from Asia and Africa ignored?
How do you plan to update these fonts as the Noto package continues evolving and being refined (updates to newer Unicode versions, newly supported clusters and dependent forms, updates to the OpenType specs, improved hinting, introduction of "variable" styles which allow tuning the blackness of fonts to the size and the device's color/diffusion profile better than hinting alone could, better adaptability to 3D rendering, support for more than one implicit foreground color, support for variable alpha transparency in glyphs and various effects coming from OpenGL and SVG, new table lookup formats for large fonts, font linking, new data compression schemes, and special properties for accessibility and device-dependent rendering, plus many fixes needed for better legibility and distinction or better placement of diacritics, improved kerning tables, faster processing on common OSes like Windows, macOS, Linux, Android, and iOS, improved font properties for easier font selection, and inclusion of sample texts for the languages where the font was tested)?
Unlike most free fonts and most commercial fonts, the Noto fonts are in constant evolution; it is a very active project with lots of participants, and still they are extremely stable.
These fonts form a single unified set; they are maintained in sync and are interdependent, as they share some identical basic subsets and are tweaked so that their mutual metrics are compatible (this is especially important to support multilingual texts).

May 5 2018, 8:44 PM · Operations, Commons, media-storage, Wikimedia-SVG-rendering

Apr 17 2018

Verdy_p added a comment to T55130: Categories should be displayed in a responsive table and be mobile friendly..

If using CSS multi-column layout, prefer column-width over column-count (even if the specified value of 3 is small). And please avoid specifying the width in absolute units; use font-size-relative units according to the text content (this is also needed for accessibility, as users may want larger font sizes).

Apr 17 2018, 6:35 PM · Patch-For-Review, MediaWiki-Categories
Verdy_p added a comment to T55130: Categories should be displayed in a responsive table and be mobile friendly..

Rather than CSS multi-column blocks, maybe we should use CSS flex layout (which can still fall back to multi-column) to give a consistent presentation of cells without undesired column breaks (flex layout is grid-like but with implicit rows which can be vertical or horizontal, and where whole cells can be wrapped while still maintaining their alignment).
Flex layout is widely used on mobile sites (and implemented in all mobile browsers), as it allows great flexibility of formats and better accessibility for touch. It gives the clean layout of tables without forcing a static number of columns or rows, and allows designing the UI so that it gets only one scrollbar (horizontal or vertical), never two; it is well suited to mobile phones, which can easily be rotated between landscape and portrait. It works seamlessly in the built-in UI of the OS (e.g. in config menus) and in most apps.
The only problem is that flex layout is still new on desktop browsers, and some old desktop browsers (and some old browsers for smart TVs that don't rotate and frequently have antique renderers) don't have it (but the same devices also have problems with multi-column, and their support of tables is frequently full of bugs/caveats or requires device-specific development and severe degradation of UI features, so I doubt wikis are even usable on these devices; I think users who have such devices don't really use them to browse the net and will still have more recent, decent smartphones or tablets; given that the lifetime of mobile devices is roughly 3 years, and that flex design and the modern mobile browsers implementing it have been on the market for more than 5 years, I don't think flex layout will cause compatibility problems; in fact there's probably now better support for flex layout than for multi-column, which was intended for paged devices, i.e. printing, or desktops with large screens in landscape mode).

Apr 17 2018, 6:33 PM · Patch-For-Review, MediaWiki-Categories

Mar 7 2018

Verdy_p added a comment to T187181: messages like MediaWiki:Babel/am could be wrongly rendered due to Amharic full stop (can be misread as "double colon" in sometimes).

This may be closed now; the bug is solved for this string, but maybe a TODO task can be created to check for similar problems and prevent them from occurring later in any language (double vertical bars should not exist within square brackets).

Mar 7 2018, 11:47 PM · MediaWiki-extensions-Babel

Mar 6 2018

Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

How do users without JavaScript upload files? There's a special page for that, and it should continue to offer the legacy upload form that does not require any JavaScript, using form submissions to the server with input fields for the file, the description, and a way to select the licence. All wikis have this form (or should continue to have it). The JavaScript version is just a helper which can be launched to automate various things or update the form dynamically.

Mar 6 2018, 9:30 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

Note also that this new UploadWizard requires JavaScript.

Mar 6 2018, 8:11 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

Thanks; now at least the problem is recognized and correctly solved. (Of course you'll need to maintain the code so that the generated localized URLs remain correct for the languages displayed.)

Mar 6 2018, 8:09 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

Then you should completely remove this now-misleading alert from the "/qqq" documentation:

Mar 6 2018, 5:26 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard

Mar 5 2018

Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

Initially I thought the code was running on the server, but I see here that it is implemented on the client side by loaded JavaScript, and it does not have any built-in MediaWiki parser.
From what I see, the content generation in JavaScript uses jQuery to generate HTML tags, and all other text elements are "HTML-ized" by the JavaScript msg() method (which then seems to take the message from the server by querying it for the content of "Mediawiki:message/*").

Mar 5 2018, 3:04 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

It seems that what changed was the .msg() method used in that code, which now enforces HTML-ization of the message (possibly as security against injection of malicious active HTML, including scripts and events running in the user agent). Here the parameter of msg() is directly the message coming from Translatewiki.net.

Mar 5 2018, 2:49 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard

Mar 3 2018

Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

This has never been a criticism of people, only statements about the code itself (there's no assumption at all in the addition of the adjective "your"; it is not a personal offense but means what it means: what "you" support).

Mar 3 2018, 6:55 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

"Your" code is the code that "you" want to support, even if it was written initally by someone else. It does not mean it belongs only to you or the initial author of course.
We must find a way to solve this recurring problem. There's an unseen bug, and it's definitely not in TranslateWiki.net but on incorrect assumptions about what Translatewiki.net does or does not perform or how it works. The initial developer was not aware of such caveat. This code was never tested correctly before it was deployed in January on Commons.
(and the initial solution that worked on Commons before January used HTML comments without problems).
It's not something new because the problem was already signaled multiple times last year: the project was incorrectly prepared for translatability on Translatewiki.net and it broke Commons only when the new UploadWizard was deployed there without proper tests with translations (it was visibly tested only in English).

Mar 3 2018, 6:12 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

You necessarily use some form of parsing of the string to change the wiki notation of external links (between [ ] brackets) into plain HTML links.

Mar 3 2018, 5:37 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p reopened T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end as "Open".
Mar 3 2018, 4:29 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p added a comment to T188818: German translation of CC radio button texts in UploadWizard display "<!--$2-->" at the end.

This is definitely a bug in the beta UploadWizard, which does not parse the basic wiki syntax correctly (including core HTML comments).
For some unknown reason, it parses it incompletely (to find links between [brackets]) and then HTML-izes everything else (meaning that basic HTML needed for some translations will never render correctly).
This resource is supposed to be valid wikitext and usable on all wikis (not just this beta version of the upload tool currently deployed only on Commons).

Mar 3 2018, 4:29 PM · MW-1.31-release-notes (WMF-deploy-2018-03-06 (1.31.0-wmf.24)), Multimedia, UploadWizard
Verdy_p added a comment to T132307: [[MediaWiki:And/fr]] uses encoded leading space (wrong for some East Asian languages).

I don't know why I am assigned this task, as I cannot resolve it myself in MediaWiki! I just want to be notified (i.e. to remain only a subscriber)...

Mar 3 2018, 11:02 AM · MediaWiki-Internationalization, Chinese-Sites, MediaWiki-Special-pages, I18n
Verdy_p placed T132307: [[MediaWiki:And/fr]] uses encoded leading space (wrong for some East Asian languages) up for grabs.
Mar 3 2018, 11:02 AM · MediaWiki-Internationalization, Chinese-Sites, MediaWiki-Special-pages, I18n
Verdy_p added a comment to T183465: Make Extension:ParserFunctions convert localized digits to arabic numerals in #(if)expr and #time.

I think it would be OK to support the digits of any known script that has decimal digits (i.e. the Unicode "Nd" general category).
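A minimal Lua sketch of such a digit mapping, assuming a (partial, purely illustrative) table of each script's zero code point; this is not the actual ParserFunctions implementation:

  -- Map decimal digits of a few scripts back to ASCII, given each script's zero code point.
  local zeroDigits = { 0x0660, 0x06F0, 0x0966, 0x09E6, 0x0A66, 0x0E50 }  -- Arabic-Indic, Extended Arabic-Indic, Devanagari, Bengali, Gurmukhi, Thai
  local function toAsciiDigits(s)
    local out = {}
    for i = 1, mw.ustring.len(s) do
      local cp = mw.ustring.codepoint(s, i)
      for _, zero in ipairs(zeroDigits) do
        if cp >= zero and cp <= zero + 9 then
          cp = 0x30 + (cp - zero)        -- rebase onto ASCII '0'..'9'
          break
        end
      end
      out[#out + 1] = mw.ustring.char(cp)
    end
    return table.concat(out)
  end
  -- toAsciiDigits('١٢٣') --> '123'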

Mar 3 2018, 10:35 AM · I18n, ParserFunctions
Verdy_p updated the task description for T188790: {{#ifexist:{{FULLPAGENAME}}|...|...}} incorrectly counted as costly.
Mar 3 2018, 10:15 AM · ParserFunctions

Mar 2 2018

Verdy_p created T188790: {{#ifexist:{{FULLPAGENAME}}|...|...}} incorrectly counted as costly.
Mar 2 2018, 10:57 PM · ParserFunctions

Feb 23 2018

Verdy_p added a comment to T172035: Blockers for Wikimedia wiki domain renaming.

Note that renaming a wiki or changing its internal database or domain name is not mandatory. All we need is to support the correct interwikis, and to stop polluting Wikidata with fake language codes for its translations (Wikidata should use "en-simple" even for linking to Wikipedia, and it's perfectly possible to alias the domain name). Only a minor modification of the known interwiki codes is needed so that the code points to the correct domain name even if that name is not changed.
Renaming databases is not absolutely necessary. Some Pywikibot bots will need to be updated to also know the new interwiki alias.
After years, and with other pending renames or creations, we should now know exactly where and how to centralize such maintenance for the lists of supported language codes, fallbacks, and updates to Translatewiki.net and its import bot.
And so we should also deprecate the legacy codes we still use on Translatewiki.net (we should not pollute the other non-Wikimedia projects hosted there, even if there are now warnings on that site for such legacy private codes, notably on its language portals and their associated categories), as well as all Wikidata entries in properties that are NOT links to Wikipedia. We're reaching the point where the complete cleanup can be finished. For Wikidata it's an important goal, to establish it as an important standard, as useful and powerful as CLDR.

Feb 23 2018, 2:53 AM · Language-strategy, Wikimedia-Site-requests
Verdy_p added a comment to T172035: Blockers for Wikimedia wiki domain renaming.

Note also that this initial bug really started many years ago, before that BCP 47 registration only two years ago (even before this thread existed in Phabricator, which imported all bugs from the former bug trackers).

Feb 23 2018, 2:43 AM · Language-strategy, Wikimedia-Site-requests
Verdy_p added a comment to T172035: Blockers for Wikimedia wiki domain renaming.

Sorry, I should have checked whether it had been recently registered (so only "simple" is non-conforming and breaks BCP 47 resolvers, by forcing us to implement a custom fallback to "en" instead of the standard BCP 47 fallback); various templates already convert "simple" to "en-simple", but a few may remain that use "en-x-simple" to be compliant with HTML/CSS/XML lang="*" attributes and with standard i18n libraries.

Feb 23 2018, 2:40 AM · Language-strategy, Wikimedia-Site-requests

Feb 14 2018

Verdy_p added a comment to T33097: Automatically created categories should be categorized: xx-y in xx, xx in <babel-footer-url>'s category.

Autocategorization should probably be configurable so that it will generate a category page containing a local template transclusion such as

  • {{Babel category|da|N}} (per level) or
  • {{Babel category|da}} (all levels for the language).

(This is the same solution as the exposed template used on huwikisource and implemented in https://gerrit.wikimedia.org/r/#/c/332332/6/BabelAutoCreate.class.php, except that this patch enforces the generation of a parent category in the autocreated per-level category, which may not be appropriate: "Template:Babel category" can add this itself; the Babel autocreation code does not need to do it.)

Feb 14 2018, 8:38 AM · Patch-For-Review, Google-Code-In-2016, MediaWiki-extensions-Babel
Verdy_p added a comment to T187181: messages like MediaWiki:Babel/am could be wrongly rendered due to Amharic full stop (can be misread as "double colon" in sometimes).

Can we check/lint for this kind of error in Babel resources?

  • Maybe just grepping for '||' or '::' would find format problems (Translatewiki.net already checks for unpaired brackets/parentheses).
  • There may be other errors such as unpaired HTML tags (e.g. sub/sup, which may be needed for some translations), not just for Babel but for other subprojects as well, or HTML tags paired incorrectly (<x>...<y>....</x>...</y>), which cause various problems (even in HTML5 if this occurs with block elements). A sketch of such a lint check is below.
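A minimal Lua sketch of such a lint check (the function name and the set of checked tags are illustrative assumptions):

  -- Flag the formatting problems described above in a single message string.
  local function lintBabelMessage(msg)
    local problems = {}
    if msg:find('||', 1, true) then problems[#problems + 1] = 'double vertical bar ("||")' end
    if msg:find('::', 1, true) then problems[#problems + 1] = 'doubled colon ("::")' end
    for _, tag in ipairs({ 'sub', 'sup' }) do              -- naive unpaired-tag check
      local _, opens = msg:gsub('<' .. tag .. '>', '')
      local _, closes = msg:gsub('</' .. tag .. '>', '')
      if opens ~= closes then problems[#problems + 1] = 'unpaired <' .. tag .. '> tag' end
    end
    return problems
  end
  -- lintBabelMessage('[[::Category:X|Y]]') --> { 'doubled colon ("::")' }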
Feb 14 2018, 8:26 AM · MediaWiki-extensions-Babel

Feb 13 2018

Verdy_p reopened T187181: messages like MediaWiki:Babel/am could be wrongly rendered due to Amharic full stop (can be misread as "double colon" in sometimes) as "Open".
Feb 13 2018, 1:25 PM · MediaWiki-extensions-Babel
Verdy_p added a comment to T187181: messages like MediaWiki:Babel/am could be wrongly rendered due to Amharic full stop (can be misread as "double colon" in sometimes).

It is valid because there is NO resource to translate in the Babel project on Translatewiki.net for these strings.
Yes, it is specific to that single language (am), and that's what was in the initial report.
So please reopen, and check the format of your *i18n* JSON data, which does not come from Translatewiki.net translations itself (there's no project for these strings, only for some generic strings of the Babel box).

Feb 13 2018, 1:24 PM · MediaWiki-extensions-Babel
Verdy_p updated the task description for T187181: messages like MediaWiki:Babel/am could be wrongly rendered due to Amharic full stop (can be misread as "double colon" in sometimes).
Feb 13 2018, 1:12 PM · MediaWiki-extensions-Babel
Verdy_p updated the task description for T187181: messages like MediaWiki:Babel/am could be wrongly rendered due to Amharic full stop (can be misread as "double colon" in sometimes).
Feb 13 2018, 1:12 PM · MediaWiki-extensions-Babel