Thu, Sep 23
Nice work :)
Wow 10 min reaction time! 🤩
With help from Cloud Services Support I succeeded in getting https://github.com/dpriskorn/ItemSubjector/tree/prepare-batch working
Wed, Sep 22
Tue, Sep 21
Wed, Sep 15
@Hannah_Bast I could not find the date of the wikidata dump used in the service, is that available in the UI?
Mon, Sep 13
This sounds like a really good idea. We would probably need a new Sense namespace for this to work.
@KingsleyIdehen maybe you can help shed some light on the questions about Virtuoso here?
Sun, Sep 12
Sat, Sep 11
I agree, the system is working as designed. Edits on-wiki are prioritized over API edits, which are throttled when WDQS cannot keep up. The worst bottleneck at the moment is Blazegraph and copying over every single item there on every edit.
I suggest closing this ticket.
I would very much like this. I tried diving into the query API on Wikidata today and I did not like the format of the current documentation at all. Confusing to say the least. Thanks to the Telegram group I got my query working using the https://www.mediawiki.org/wiki/Help:Extension:WikibaseCirrusSearch extension, but as far as I can see its keywords are not mentioned anywhere in the generated API documentation.
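For anyone hitting the same wall: the pattern that eventually worked for me combines a CirrusSearch keyword (like haswbstatement:) with WDQS via the MWAPI service. Here is a minimal sketch in Python; the P31=Q5 filter and the `wdqs_url` helper are only illustrations, not my original query:

```python
import urllib.parse

# Sketch: query WDQS using a CirrusSearch keyword (haswbstatement:) through
# the MWAPI SERVICE. The property/value pair below is only an illustration.
SPARQL = """
SELECT ?item WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:api "Search" ;
                    wikibase:endpoint "www.wikidata.org" ;
                    mwapi:srsearch "haswbstatement:P31=Q5" .
    ?item wikibase:apiOutputItem mwapi:title .
  }
}
LIMIT 5
"""

def wdqs_url(query: str) -> str:
    """Build a GET URL for the Wikidata Query Service (hypothetical helper)."""
    return ("https://query.wikidata.org/sparql?format=json&query="
            + urllib.parse.quote(query))
```

Paste the SPARQL into query.wikidata.org to try it interactively; the helper just URL-encodes it for a plain HTTP GET.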
Thu, Sep 9
Maybe someone in the community can adopt it from Ivan Krestinin? I assume it should run fine on a VPS.
Happy to be of service, and nice to meet you too. I liked the presentation a lot. I feel like doing a PhD myself on knowledge graphs like Wikidata 😆
Wed, Sep 8
FWIW, a week-based Quarry query on semi-automated edits from the most popular tools (manually selected based on another query) is here: https://quarry.wmcloud.org/query/58473
OAuth CID: 1776 869604 aka QS
OAuth CID: 1740 13024 aka Author-Disambiguator 2.0
OAuth CID: 1768 6002 aka Mix'n'match 1.0
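In case anyone wants to work with those tags programmatically, here is a small sketch mapping the "OAuth CID:" change tags above to tool names. The CIDs and names are from the Quarry results; the helper itself is hypothetical:

```python
# Map OAuth consumer IDs, as they appear in change tags like "OAuth CID: 1776",
# to the tool names from the Quarry above. tool_for_tag is a hypothetical helper.
OAUTH_TOOLS = {
    1776: "QS",
    1740: "Author-Disambiguator 2.0",
    1768: "Mix'n'match 1.0",
}

def tool_for_tag(tag: str) -> str:
    """Return the tool name for a change tag such as 'OAuth CID: 1740'."""
    cid = int(tag.removeprefix("OAuth CID: ").split()[0])
    return OAUTH_TOOLS.get(cid, "unknown")
```

For example, `tool_for_tag("OAuth CID: 1776")` gives "QS".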
Fri, Sep 3
I tried porting this script to use the new mutation observer but failed.
For some reason, after making the replacement the entityselector resets and the replacement disappears.
Thu, Sep 2
This also affects Wikidata. I manually extracted and reuploaded this https://commons.wikimedia.org/wiki/File:Complete_English-Jewish_Dictionary,_6th_ed._-_Harkavy_-_1910.djvu-page5.jpg but that is cumbersome and introduces redundancy.
Tue, Aug 31
I totally support finding a way to avoid this redundancy. It's a terrible waste of resources and time to enter it for every language in the world / supported by the system, and it does not add any knowledge to the whole.
The dependencies for Rya are here:
Here is the report for 2021:
The dev mailing list of Rya is unfortunately very quiet. https://firstname.lastname@example.org/maillist.html
I recommend that the Search Team attend ApacheCon to gather insights from others using Big Data successfully:
See institutions already using HBase in production: http://hbase.apache.org/poweredbyhbase.html
It seems to be very widely used to handle petabytes of data.
@Tpt is it possible to run a cluster of, say, 2 Oxigraph instances on the same sled store?
Could I then manually update the sled store in some way and have the changes visible in both Oxigraph endpoints?
@Tpt thanks for the corrections!
Hi, I fail to understand the descriptions on 2 of the new dashboards.
Mon, Aug 30
Sun, Aug 29
Possible solution here https://phabricator.wikimedia.org/T289940
And they should probably be merged
Aug 26 2021
- no commits since Dec 2019 https://github.com/Merck/Halyard/commits/master
Aug 24 2021
Rya is built on Accumulo, which was never evaluated according to https://docs.google.com/spreadsheets/d/1MXikljoSUVP77w7JKf9EXN40OB-ZkMqT8Y5b2NYVKbU/edit?usp=sharing
I took a glance at Virtuoso.
Aug 23 2021
Aug 20 2021
@EBernhardson That is a very nice script! Thanks for annotating it, so I almost understand everything going on. 😅
I just ran it for a few minutes and saw lags of 70-162 seconds. The highest was for a lexeme.
Aug 16 2021
@Salgo60 has documented that Google tracks the change stream and updates its search results within 20 minutes. So I guess they are one to mention. :)
Aug 15 2021
Aug 9 2021
This would affect https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Commons_media which is a stable interface.
LibreOffice works fine with the CSV from WDQS. I have used multiple versions over the past 2 years and never had a problem.
I propose to close this as won't fix. CSV is good enough IMO.