There seems to be a complication, perhaps an unnecessary one: the timestamp is part of the unique index. I believe this was done so that we could store the history as well in the future. But for now we are not keeping history, and instead we do extra work to find an existing row (if any) and replace its contents, or insert a new row.
Tue, Apr 25
Lowering priority, as CX is currently back (we are monitoring and are ready to disable it, though). Finding the root cause is still important.
From what I can recall, it was made this way because the wording "in other languages" does not make sense when there are no interwiki links. Fixing it might involve changing core and/or skins so that it is done server side.
Fri, Apr 21
I captured a few (3) draft save requests on my wiki and replayed them in an endless loop. I did not see any deadlocks and queries were completing relatively quickly even under load.
Thu, Apr 20
Unfortunately I did not capture any of the queries that were long-running during the outage. However, the FOR UPDATE queries are very simple and should complete quickly:
EXPLAIN SELECT * FROM `bw_cx_corpora` WHERE cxc_translation_id = '194' AND cxc_section_id = 'mwCA' AND cxc_origin = 'user' ORDER BY cxc_timestamp DESC LIMIT 1 FOR UPDATE;
Notes so far:
- If saving fails, the UI will retry automatically every 60 seconds, up to 10 times
- If the user is actively editing, it seems we will retry every 5 seconds in the worst case (saving is debounced, so retries only happen if the user pauses typing long enough for the autosave to trigger)
- If I simulate slow queries by adding sleep( 120000 ); to the API we use, the automatic retries cause a build-up of queries, to the extent that my developer wiki went down. It seems the connection from HHVM stays open forever, even if nginx sends a timeout to the client?
- There is a cxsave ping limiter that rejects requests if there are more than 10 saves in 30 seconds. This does not prevent the build-up of queries, though
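To make the retry behaviour concrete, here is a minimal sketch of the scheduling logic described in the notes above. This is not the actual ContentTranslation code; the function name and structure are my own, and only the 60-second interval and the cap of 10 come from the notes. The important observation is that retries are driven purely by a timer, so nothing stops a new request from being sent while the previous one is still in flight.

```javascript
// Rough sketch of the retry scheduling described above (not the actual
// ContentTranslation code; names and numbers come from the notes).
// Each failed save schedules another attempt, up to a cap.
function createRetryScheduler( maxRetries = 10, intervalMs = 60000 ) {
	let attempts = 0;
	return {
		// Returns the delay before the next retry, or null when giving up.
		nextDelay() {
			attempts += 1;
			return attempts <= maxRetries ? intervalMs : null;
		},
		get attempts() {
			return attempts;
		}
	};
}

// The pile-up happens because a retry is scheduled purely on a timer:
// if the previous request is still in flight (e.g. a 120 s sleep on the
// server), the new one is sent anyway and open connections accumulate.
```

A fix along these lines would also need to cancel or await the in-flight request before scheduling the next attempt.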
Possible avenues for investigation:
- set your test wiki to read-only, keep translating, and watch whether the retry mechanism creates a pile-up of requests
- copy the production database tables to your local wiki, try to spam it with lots of the queries seen during the outage, see if they pile up, and EXPLAIN the queries to check that the query plan makes sense
The build logs for previous, failing patchsets of https://gerrit.wikimedia.org/r/#/c/348693/ are still available. Of course other patches are not failing anymore, because the fix was merged.
Wed, Apr 19
Captured the log for triage, coordination and mitigation parts in P5290
The UniversalLanguageSelectorHooks::getDefaultLanguage implementation is less than ideal in this regard. It takes the first exact match; failing that, it strips the country code away and tries again. Perhaps it should be more aggressive with that stripping.
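The matching behaviour described above can be sketched roughly like this (a simplified illustration; the real UniversalLanguageSelectorHooks::getDefaultLanguage is PHP and may differ in details such as the order of the two passes):

```javascript
// Simplified sketch of the described behaviour: first look for an exact
// match among the accepted languages, then retry with the country
// (region) code stripped. Not the real implementation.
function pickDefaultLanguage( accepted, supported ) {
	for ( const code of accepted ) {
		if ( supported.includes( code ) ) {
			return code;
		}
	}
	for ( const code of accepted ) {
		const base = code.split( '-' )[ 0 ];
		if ( supported.includes( base ) ) {
			return base;
		}
	}
	return null;
}
```

For example, with accepted [ 'en-GB' ] and supported [ 'en', 'fi' ], the first pass finds nothing and the second pass returns 'en'. Note that splitting on '-' drops all subtags at once, so 'zh-Hant-TW' falls straight to 'zh'; stripping one subtag at a time ('zh-Hant-TW' → 'zh-Hant' → 'zh') would be a different trade-off.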
It seems that this broke a few tests in ContentTranslation, which I fixed in https://gerrit.wikimedia.org/r/#/c/348693/
It sounds like your browser is configured to request pages in English, thus it functions as expected.
By script pages, do you mean user page subpages ending in .js and MediaWiki-namespace pages ending in .js, or something else?
Tue, Apr 18
If I remember correctly, the main thing cxserver does with the Parsoid output is to segment it into sentences with additional markup. I am pretty sure plain Parsoid HTML output doesn't work at all, or at least some functionality would be broken.
With the proposed public APIs, the CX extension would still need to call CXServer directly for /page. Won't this inconsistency be annoying in the short and long run?
From your list of steps you are missing https://www.mediawiki.org/wiki/Help:Extension:Translate/Page_translation_example#Step_3:_Enabling_translations
Is this a new error? If so, I believe the cause is a recent change to $wgDummyLanguageCodes in https://gerrit.wikimedia.org/r/331208
Mon, Apr 17
Sun, Apr 16
I usually do exports twice a week. I will report on Monday if I see any issues.
Fri, Apr 14
One important point about it being universal would be to support the use of LanguageConverter in interface messages. Then it could also be considered for https://www.mediawiki.org/wiki/Internationalisation_wishlist_2017#Better_support_for_formal_and_informal_variants
Wed, Apr 12
I have applied the patch. It would be nice if you also added a small description for the project to https://translatewiki.net/wiki/Group_descriptions.
This was discussed in the daily meeting yesterday, where I explained that there is a patch that works and has been tested (it has a -1 from Krinkle with suggestions to make the code better). It was decided that if someone feels it is important and doesn't need much time to review it, they can do it. But unless it is SWATed this Thursday, the next possible deployment is the SWAT on Monday the 24th. If that does not happen, then there is plenty of time to redo the patch.
Tue, Apr 11
Okay, I was wrong. It is ULS's ext.uls.compactlinks after all. It makes me wonder whether we could avoid loading it on pages which certainly don't have language links.
Mon, Apr 10
There is no fix, only work-arounds. MediaWiki limits what page titles are valid. I believe that it is far easier for you to remove one space from the message key than for me to implement message key rewriting code that
- might be slow
- is likely to be complicated, due to the complex rules about what is a valid title and what is not
- would take us further from the 1:1 mapping between the page title and the real message key in the files
Nothing needs to be done. At some point we can clean up the code and drop ES3 support.
This was reported again elsewhere. The issue is that the key has two spaces, and MediaWiki does not support that. I really recommend that you change the message key.
Fri, Apr 7
Apologies for the review taking so long.
The percentage is the primary data; color is used here just as an additional cue.
Thu, Apr 6
This causes the ULS cog to be positioned incorrectly in the sidebar. I am looking into how to fix this, but I might need help figuring out how to make the positioning more generic, or how to detect the presence of this change reliably, or we may need this change reverted before the next deployment train if no fix is ready by then.
Wed, Apr 5
Enforcing sorting across all repositories seems like a much more involved and controversial change than trying to reserve a place at the top for some of the most popular repositories. Backports could still conflict with each other, I guess?
The familiar issue. We really need to get into the habit of not using the #-selector with variable input.
return this.parentTranslationUnit.$translationSection.find( '#' + this.model.sourceDocument.id );
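A safer pattern is to escape the variable part before concatenating it into the selector. Modern code can use the standard CSS.escape() (or $.escapeSelector() in jQuery 3+); the simplified escaper below only illustrates the idea and is not a complete replacement for those APIs:

```javascript
// Characters like '.', ':' or '/' in an id change the meaning of a
// '#' + id selector. Backslash-escaping the special characters keeps
// the id treated as a literal value. (Simplified: the standard
// CSS.escape() also handles edge cases such as leading digits.)
function escapeForSelector( id ) {
	return id.replace( /([^a-zA-Z0-9_-])/g, '\\$1' );
}
```

With that, the line above would become `.find( '#' + escapeForSelector( this.model.sourceDocument.id ) )`; alternatively, selector parsing can be bypassed entirely with `document.getElementById( id )`.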
Thanks @matmarex for the patch. It's faster for me to review it than to hunt down someone else to do a code review. Verified on test.wikipedia.org.
Tue, Apr 4
It's more useful for translators to have related messages appear together. In which cases do security patches or normal deployments need to update i18n files?
Mon, Apr 3
- There should not be any more rows that are stuck in gray
- The speed of updates is slow (less than ten pages per second)
- Given that the special page has a time limit of 2 seconds for processing, and meta has ~6000 groups, it will take a while
- The API has a 10-second limit (8 by default).
- There are lots of Duplicate get(): "metawiki:SpecialLanguageStats%3A%3AmakeGroupRow:----progress-page-###-fi-uk" fetched N times messages in the logs; I have a patch in progress that uses getMulti, which will also avoid this issue.
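The duplicate-fetch pattern and the getMulti fix can be sketched like this (JavaScript for illustration only; the actual patch is PHP against MediaWiki's cache interface, and the names here are mine):

```javascript
// Instead of calling cache.get() once per row, which produces the
// "fetched N times" duplicates seen in the logs, collect the keys,
// de-duplicate them, and fetch everything in a single batched call.
function fetchAll( cache, keys ) {
	const unique = [ ...new Set( keys ) ];
	// One round trip instead of keys.length round trips.
	return cache.getMulti( unique );
}
```

Callers then serve individual rows from the returned map instead of hitting the cache again per row.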
- Profiling shows some hot spots:
- MediaWiki\Logger\Monolog\LegacyHandler::write: 4.2s (perhaps due to the log messages above?)
- StringMatcher::match: 4s, due to an inefficient implementation of MessageGroups::expandWildcards
The mobile styles for jquery.uls are not currently maintained (and like you said, they should be responsive instead).
Fri, Mar 31
I would bet there would be demand for features 2 and 3 if lack of 1 wasn't a blocker.
On 0.20.2 the positioning is completely broken (this is without changes in Translate):
Of course it is already possible with lots of clicks, but with a small effort it could likely be made much more efficient overall.
I am not sure why I left this open. There isn't a pressing need, nor a way to fix this without completely changing the way it works (i.e. DOM-based translation).
Wed, Mar 29
I don't know everything it controls. In general, people want to know if their translation was updated by FuzzyBot (this happens frequently on translatewiki.net when importing external changes).
I am closing this for lack of sufficient information to start investigating.
I see two solutions:
- enable the trailer filter on Wikimedia (which will likely be blocked for performance reasons; CleanChanges has been blocked already)
- output those links in Special:SupportedLanguages only conditionally or not at all.
It is unclear to me what, if anything, needs to be done in Translate. Could someone clarify?
Tue, Mar 28
Why was MediaWiki-extensions-Translate added again?
Mon, Mar 27
- Prefix puppet is handy (I wish I had discovered it before starting to create instances)
- Easy MediaWiki setup for each student (the focus was not on installing MediaWiki)
- Easy for me to assist students with the issues they had and to review assignments
- The wikis were public, and I could also ssh into the instances but that was not necessary
- I was able to use one instance for demonstrations in class, and students could go back to it later to check what I did
- Students now have wikitech accounts, making it easier for them to contribute in the future if they want to
Okay to go ahead on this. All instances are turned off already. Thanks for the opportunity.
Nested groups exist already, so it is a reasonable request to allow using them (that's why I reopened the task). Whatever happens here won't resolve this issue until someone comes and actually makes a patch (that's why I discouraged further discussion in my previous comment).