Sat, Sep 15
Thu, Sep 13
Mon, Sep 10
Sun, Sep 9
Fri, Sep 7
The API also returns the status of blocks that have already expired:
https://codesearch.wmflabs.org/search/?q=massage&i=nope&files=(js%7Cphp%7Cxml)%24&repos= ("massage" is a real word, but it is clearly used incorrectly here)
Thu, Sep 6
Wed, Sep 5
As a long-term solution, I propose building a similar tool in MediaWiki; see T203557: Create a Edit group extension.
Tue, Sep 4
As each sense may have some statements, we may create a new special page, Special:MergeSense, to handle that.
Mon, Sep 3
If we just want to blacklist a few sandbox entities, that is doable via https://www.wikidata.org/wiki/MediaWiki:Robots.txt; but what is now proposed is blacklisting a large number of entities, so that is not a permanent solution.
Thu, Aug 30
Probably they should.
Wed, Aug 29
It's up now; if there are no more issues, it can be closed (probably after the maintenance).
Mon, Aug 27
For normal users we already have a 90 edits/minute limit (see T56515). This is probably enough.
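As a rough illustration, a per-user edit limit like this can be expressed with MediaWiki's `$wgRateLimits` setting. This is only a hedged sketch of the configuration shape; the actual values and keys deployed for T56515 may differ.

```php
// Hypothetical sketch: a 90 edits/minute limit for ordinary logged-in users.
// $wgRateLimits entries have the form [ number of actions, window in seconds ].
$wgRateLimits['edit']['user'] = [ 90, 60 ]; // 90 edits per 60 seconds
```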
Sun, Aug 26
Hmm, these refer to two or three different aliases:
Portal: 学部 and 學部
School: 学院 and 學院 and 系
Subject: 学科 and 學科
After this and T202821: Create namespace aliases in zhwikiversity:
The Chinese versions are to become aliases.
(all of them aliases, not canonical namespace names)
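A hedged sketch of how these aliases might look in MediaWiki configuration, using `$wgNamespaceAliases`. The namespace IDs here are assumptions borrowed from enwikiversity (School = 100, Portal = 102, Topic = 104); the actual IDs for zhwikiversity are to be decided in T201675/T202821.

```php
// Hypothetical sketch for zhwikiversity; IDs assumed from enwikiversity.
$wgNamespaceAliases['学部'] = 102; // Portal
$wgNamespaceAliases['學部'] = 102;
$wgNamespaceAliases['学院'] = 100; // School
$wgNamespaceAliases['學院'] = 100;
$wgNamespaceAliases['系']   = 100;
$wgNamespaceAliases['学科'] = 104; // Topic/Subject
$wgNamespaceAliases['學科'] = 104;
```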
I have already said "Only do it when handling articles (ns0)"
Fri, Aug 24
Yes, unique entity IDs should be prefixed IDs or full URLs.
I tried to merge Q123454 -> Q168780 using Special:MergeItems, but it still does not work (merge.js does not work either).
We could implement a kind of "internal redirect": replace the merged sense with a "stub" entity containing a "symbolic" link to the other sense (indicating that the content is now located there), so existing links will not be broken.
Alternatively, we can just store a mapping of old form/sense IDs to new form/sense IDs in the page of the redirected lexeme. But the mapping should be editable in some way, at least via the API.
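A minimal sketch of what such a stored mapping might look like, expressed as a PHP array. The lexeme and sense IDs are invented for illustration; the real storage format (serialization, slot, API module) is an open design question.

```php
// Hypothetical sketch: old sense IDs of a redirected lexeme mapped to the
// sense IDs they were merged into. All IDs below are invented examples.
$senseRedirects = [
    'L123-S1' => 'L456-S2',
    'L123-S2' => 'L456-S3',
];
```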
This is much more an issue of WikibaseDataModel: see https://github.com/wmde/WikibaseDataModel/commit/1e3f1730abb03cdaf4a1ae4644672b025608f743
Note the individual wikis also have other "meta" pages like user pages; we should decide where to import them. Also note we have T127582: Convert LQT to Flow on wikimania2010wiki. In addition, should we delete and redirect the old wikis?
Search is instead handled by wgNamespacesToBeSearchedDefault.
Probably they should be added to wgContentNamespaces.
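A hedged sketch of what these two settings might look like, assuming the namespace IDs used on enwikiversity (School = 100, Portal = 102, Topic = 104); the actual IDs and namespaces to include are still to be decided.

```php
// Hypothetical sketch: mark the namespaces as content namespaces and
// include them in default search. IDs assumed from enwikiversity.
$wgContentNamespaces = array_merge( $wgContentNamespaces, [ 100, 102, 104 ] );
$wgNamespacesToBeSearchedDefault[100] = true;
$wgNamespacesToBeSearchedDefault[102] = true;
$wgNamespacesToBeSearchedDefault[104] = true;
```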
Currently, sysops cannot remove the flood flag yet.
Thu, Aug 23
This is probably not relevant any more after the limit is removed.
Aug 21 2018
This is not my intention: "Allow add or remove interface-admin group by wikidata-staff" should only apply to their own accounts, not to arbitrary accounts. However, I'm not certain whether the status quo is bad.
Aug 16 2018
In enwikiversity: School=100, Portal=102, Topic=104.
Aug 15 2018
Aug 13 2018
Personally I think we can remove the current rate limit completely, as the default 90 edits/minute limit in T56515: Apply editing rate limits for all users is probably enough.
Aug 10 2018
We should also resolve T201675: Create new namespaces in zhwikiversity first.
Aug 9 2018
We should probably do T167166: Specify the use of extended language codes in Lexemes first.
For example, we may want to find all articles using a specific template (e.g. Template:Infobox person) but missing P31=Q5 in Wikidata. (PetScan can be used in many, but not all, cases.)
I don't think we should add it until senses are available and the documentation is improved (cf. T199616).
Aug 8 2018
In addition, we should proactively review all ISO 639 codes (in batch). It is unmanageable if we need to create 7000 tasks to add every language to Wikidata or a similar database.
Note that "https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/How_to_make_new_languages_enabled_on_Wikidata" says "to add a new label language, add it to MediaWiki"; this is because new languages are said to have negative effects. However, I doubt how many negative effects they actually have (why not kill wmgExtraLanguageNames and just use language-data as the list of available languages?). TranslateWiki explicitly does not want to support any ancient language (e.g. Avestan), but Wikidata may want them (especially since lemma languages use the same set of label languages plus an optional -x-Qid suffix; we clearly should use the ISO 639 code where it exists, not mis-x-Qxxx).
Personally I think we should abolish this Wikidata-specific setting. We should just allow all languages in ULS (the same list as https://github.com/wikimedia/language-data/blob/master/data/langdb.yaml) to be used in Wikidata (as monolingual, label, and lemma languages); wmgExtraLanguageNames (in InitialiseSettings.php) should be abolished too.
Aug 6 2018
All extant pages using raw HTML: https://foundation.wikimedia.org/w/index.php?title=Special:Search&profile=all&search=insource%3A%2F%5C%3Chtml%5C%3E%2F&fulltext=1
Aug 5 2018
Other missing message files (MessagesXxx.php), excluding variants:
ais, ase, awa, bi, brh, cps, cr, hil, hyw, ik, ki, kri, krj, loz, lus, na, niu, pag, pam, pap, pih, prg, rif, sco, sei, shi, shn, sm, sma, sn, so, ss, st, ti, tn, to, tru, ts, tw, tzm, zu
Aug 2 2018
Note that I don't propose a cross-wiki search feature in this task - this task is about searching page properties of pages on one specific wiki.
Until T89213: Allow fallback to any language is fixed, it may be common to see only the Qid, as many senses may not have an English translation.
Aug 1 2018
Jul 31 2018
This probably needs T114662: RFC: Per-language URLs for multilingual wiki pages.
Wikidata Query Service is somewhat complex, especially if you also want to search plain text (see T141813: Add full-text search support to Query Service).