@WikiLucas00, both language and family are "lingualibre", so you can use
site = pywikibot.Site('lingualibre', 'lingualibre')
and yes, it is already available in the master branch.
I proposed a patch at https://gerrit.wikimedia.org/r/c/pywikibot/core/+/703625
I am clearly not an expert on the Pywikibot core, so it may not be correct, but it works for me.
In T215055#7031318, @Pamputt wrote: Does someone know how to enable translations in a JavaScript gadget such as Gadget-ExternalTools.js, so that it is possible to translate "This is not an allowed URL... It should link to PetScan or Wikidata Query." and so on?
In T215055#6875890, @Pamputt wrote: There is also this page (https://lingualibre.org/wiki/Special:Log/remoteupload) where we cannot translate (I did not find where) the sentence "These events track when someone upload a file to the remote wiki using OAuth."
That said, it would be useful to be able to translate "Remote upload log" as well (like for the other logs)
And also to be able to translate log messages such as
2 March 2021 at 19:41 Pamputt (talk | contributions | block) has uploaded a file to the remote wiki: File:LL-Q150 (fra)-Pamputt-fuselage.wav
The difference comes from the datatype of the properties: P2 (instance of) has the Item datatype, so Wikibase checks that the entered value is an item; P12/P14 have the external identifier datatype, so the a priori check cannot be done. On Wikidata, there is the constraint system, which allows checking a posteriori that the values are correct (or at least look correct).
I do not know whether we can use the constraint system on the LinguaLibre Wikibase (@VIGNERON?), but I think we cannot do better because this is how Wikibase works.
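To illustrate the a priori vs. a posteriori distinction, here is a minimal sketch of the kind of format check a constraint system performs after the fact. The property IDs reuse those mentioned above, but the regex patterns and the `check_value` helper are purely hypothetical examples, not LinguaLibre's actual constraint definitions.

```python
import re

# Hypothetical format constraints: property ID -> expected pattern.
# External-identifier properties accept any string when entered, so
# validity can only be verified a posteriori, e.g. against a regex.
FORMAT_CONSTRAINTS = {
    "P12": re.compile(r"[a-z]{3}"),      # e.g. an ISO 639-3 code
    "P14": re.compile(r"Q[1-9]\d*"),     # e.g. a Wikidata Q-id
}

def check_value(prop, value):
    """Return True if `value` satisfies the format constraint of `prop`.

    Properties without a declared constraint are accepted as-is,
    mirroring the fact that Wikibase itself does no format check
    on external identifiers at entry time.
    """
    pattern = FORMAT_CONSTRAINTS.get(prop)
    if pattern is None:
        return True  # no constraint defined: nothing to check
    return bool(pattern.fullmatch(value))
```

A bot or report could then run `check_value` over all statements periodically and flag the mismatches, which is roughly how Wikidata's constraint-violation reports work.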
I don't feel comfortable with a massive import, mainly because among all the elements that are languoids there is a mix of language families, dialects, varieties, etc., and I wonder about our means to manage all of this. So I agree to import the list of "official" languages, but I am skeptical about a massive import.
Looks to be fixed by this modification.
I have created new screenshots, so I think we can close this task, because I am not aware of any other pages where images are missing.
Oh yes, sorry. I got mixed up with the menus... No real opinion about the best place for the menu. So I am closing this report.
@Seb35, in the part "I know what I'm doing" (at the bottom), there are still some missing images (not recreated). It would be possible to recreate them and then close this task. So I would say it depends on how difficult it is for you to recover the lost images. If it takes too long, I think we should close this so that you can concentrate on other tasks.
Indeed, so it is probably related to T269885.
Indeed, it is fixed. I close this task.
This bug is solved in the new version of Lingua Libre (MediaWiki 1.35). @Seb35, I'll let you close this bug report.
@Eihel, what is the error that appears? I made a change and saved it, and I did not get any error.
Does someone know how to enable translations in a JavaScript gadget such as Gadget-ExternalTools.js, so that it is possible to translate "This is not an allowed URL... It should link to PetScan or Wikidata Query." and so on?
In the new version of the website (April 2021), there is a string in the user preferences that is not translated; it is
Actually, enabling common.js would be useful for users who want to create a generator, as explained in https://lingualibre.org/wiki/Help:Create_a_new_generator. One user wanted to test this feature.
There is also this page (https://lingualibre.org/wiki/Special:Log/remoteupload) where we cannot translate (I did not find where) the sentence "These events track when someone upload a file to the remote wiki using OAuth."
@Yug, could you elaborate a bit more? From the title, I understand that you would like the LinguaImporter gadget to be able to import such lists? If so, I disagree; I think it should be done by another gadget, or be imported by hand and/or by bot. We should not add features to LinguaImporter beyond creating an item for a language. So could you retitle?
I did it.
I have updated the LinguaImporter gadget code in order to take into account the new property.
Yug, since you are a subscriber, you get a notification (by email, for example), so there is no need to write a comment to say that you read the last message. So subscribe to all tasks you are interested in, or watch the full LinguaLibre project to be aware of all changes happening in any task.
I've created P26 to link to the Commons category related to a language.
By the way, the code is not necessarily the same as the ISO 639-3 code, because some languages/dialects do not have an ISO 639-3 code (for example, Martinique Creole). Currently, all recordings for languages with no ISO code are in Category:Lingua_Libre_pronunciation-other. The problem of the category for languages/dialects that do not have an ISO code is described in T208641. So once we have found the category name, we will be able to link to it with the new property.
If such a possibility exists, it should go further: for example, the Record Wizard should limit the number of recordings so as not to record more words than the limit.
I have created P25. So go ahead :)
Could you describe a little bit more why you want to add a limit? This is a policy that is decided on a per-project basis. For example, for now, the French Wiktionary community has decided not to limit the number of audio pronunciations. If one day the community thinks there are too many pronunciations, it will decide what to do. It is not really relevant to add a function that disables the addition of audio on a wiki page. For example, if we set a limit of 10 and there are already 10 pronunciations from the same region, it would be interesting to keep only 2 audio recordings from that region to allow recordings from other regions. To be useful, this job can only be managed by humans, that is, by the community, Wiktionary project by Wiktionary project (because the rule may differ from one project to another).
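The per-region curation described above can be sketched in a few lines. This is a hypothetical illustration of the selection rule (keep at most two recordings per region so other regions can be represented), not any actual LinguaLibre or Wiktionary code; the `region` field name is an assumption.

```python
from collections import defaultdict

def cap_per_region(recordings, per_region=2):
    """Keep at most `per_region` recordings for each region.

    `recordings` is a list of dicts with a (hypothetical) "region" key.
    The original order is preserved, so earlier recordings win, which
    is one possible tie-breaking rule; a human curator would of course
    choose on quality instead.
    """
    kept = []
    count = defaultdict(int)
    for rec in recordings:
        region = rec["region"]
        if count[region] < per_region:
            kept.append(rec)
            count[region] += 1
    return kept
```

The point of the sketch is that the mechanical part (the cap) is trivial; deciding *which* recordings to keep is the part that needs community judgment.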
I know many files hosted locally (mainly screenshots) have not been migrated during the deployment of the V2 version (see T264332). I think these files are definitely lost, but I am not sure. So I do not know whether it is linked, but if there is some mess in the file database, it would be nice to clean it all up.
What has to be done now is
This message should explain what the problem is and what to do (wait until the user has uploaded 500 files, or 1 month after the account creation).
The current message
Thanks Michael for the feedback. Let us close this report for now. Feel free to reopen if the problem shows up again.
In T233917#6835684, @Yug wrote: Before to ask someone to code this...
Is this property really needed, since it's an exact derivative of P13?
To "test", we just need to write something in MediaWiki:Sitenotice. This text should appear somewhere (usually at the top of all pages). We should test again after the deployment of MediaWiki 1.35.
Of course, because this message is visible (or should be visible) to everyone on Lingua Libre, we should not test for too long (or at least we should write a comprehensible message).
Agreed with @WikiLucas00; let us wait for the fix to be deployed in the actual code (not only in MediaWiki:Common.css).
It is not solved because even if we understand the origin of the problem, we still need to "think about a workaround or at least a clearer error message".
I think it is done. We can close it.
This would be nice indeed. Actually, I do not remember how it worked before, because the possibility of getting a list of lexemes or forms would have to be implemented in the Record Wizard (Details step). I've opened a new ticket (T274667) to ask for this feature.
I am not aware that Translatewiki is linked to the "Commons captions".
Difficult to answer precisely. I would say the best would be to ask on Wikidata talk:Lexicographical data to see what the "Wikidata lexicographers" expect from LinguaLibreBot. I think what is listed in the description is still valid, but it's worth asking the current community. I think @VIGNERON may have some opinion about this, since he is quite involved in lexicographical data.
@Bouzinac, could you point to a page on enwp that still suffers from this issue?
Solved by @WikiLucas00 by importing MediaWiki:Lang.