Jan 22 2021
Nov 15 2020
The copy of Denelezh's MySQL data files is 365 GB; the denelezh schema alone is 324 GB. I'm working on setting up a local MySQL server to restore it (carefully, as this is the only backup at the moment). One idea is to export only a subset of it, making it easier to transfer to Humaniki's server and to restore.
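A subset export could be sketched with mysqldump's `--where` option, which applies a row filter to the listed tables. The table names and the `dump` date column below are hypothetical placeholders, not the actual denelezh schema:

```shell
# Sketch: export only rows belonging to one dump date, assuming the large
# tables carry a `dump` date column (hypothetical; adapt to the real schema).
# --single-transaction gives a consistent snapshot without locking InnoDB tables.
mysqldump --single-transaction \
  --where="dump = '2020-11-01'" \
  denelezh table1 table2 \
  | gzip > denelezh-subset.sql.gz
```

Restoring on the target server is then `zcat denelezh-subset.sql.gz | mysql denelezh`, which avoids shipping the full 324 GB schema.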
Oct 2 2020
On Denelezh, I installed MySQL using Oracle's apt repository: https://dev.mysql.com/downloads/repo/apt/
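For reference, the usual installation flow with Oracle's apt repository looks like the following on Debian/Ubuntu; the config-package version number changes over time, so check the download page linked above for the current one:

```shell
# Register Oracle's MySQL apt repository, then install the server.
# The 0.8.15-1 version below is an example; use the current release.
wget https://dev.mysql.com/get/mysql-apt-config_0.8.15-1_all.deb
sudo dpkg -i mysql-apt-config_0.8.15-1_all.deb
sudo apt-get update
sudo apt-get install mysql-server
```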
Sep 6 2020
Mar 17 2020
A project grant proposal was opened for this project. Feedback is of course welcome :)
Dec 12 2019
Oct 15 2019
Oct 14 2019
@Maximilianklein I sent you an email about this topic.
Oct 1 2019
Started to work on this with @Seb35. I should be able to deliver a merge request on Saturday or Sunday.
Sep 3 2019
Just a kind reminder that I reported this issue at Wikimania 2017.
Aug 28 2019
Aug 27 2019
Lightning talk at the French WikiConvention (the third one): https://meta.wikimedia.org/wiki/WikiConvention_francophone/2019/Programme/pr%C3%A9sentations_%C3%A9clair_communaut%C3%A9_et_outils
Aug 18 2019
Link to Denelezh Git: https://framagit.org/wikimedia-france/denelezh-core
Aug 7 2019
From @Magnus in Wikidata Telegram channel: "I restarted it, someone please close the phab issue".
Jul 19 2019
Thank you all! :)
Jul 7 2019
Thank you all! Using Wikidata Toolkit, I was able to load the dump generated on 2019-07-04 :)
Jul 3 2019
Sorry to reopen this bug, but it seems that the new dumps still have the .not extension:
Jun 26 2019
Jun 25 2019
Thanks for the ping! I don't use RDF dumps at the moment, and I'm fine with this change.
May 9 2019
I did not have the opportunity to test the fix before it was reverted, but I totally agree with @Nikki.
Apr 28 2019
Thanks for the ping! I don't use this table either, but it looks like a huge and well-prepared piece of work. Good luck :-)
Apr 23 2019
Thank you for your replies. A few comments / questions:
- While I understand your point, I fear that isolating some data from the main dump is only a temporary solution to its size growth. Sooner or later, it will weigh 1 TB (even compressed), and we'll have to deal with this (as producers or as consumers).
- Will the lexeme dumps contain the P namespace, or will consumers additionally have to download the complete dump to get the data about properties?
- Will there be one dump per namespace (one for P, one for Q, one for L)?
Apr 17 2019
Thank you for the reply and for having clarified that this issue has no assignee at the moment.
Glad my help was useful :) Thank you for your quick fix!
Apr 15 2019
FYI, I also get very different results for P380 (even though it is data from the dump of 2019-04-08). If you follow the "Usage history" link on Property_talk:P380, you'll see that there was no recent major change in the usage of this property.
Apr 13 2019
@thiemowmde Are you still working on this issue?
Mar 23 2019
Hello, I'm also very surprised by how things are evolving here. This task was about changing how dump generation is scheduled (weekly vs. monthly), not its frequency. The first proposals by @JAllemandou were for 4 dump generations per month. Now you are talking about reducing the frequency from 4-5 per month to only 2, on an irregular basis (sometimes with 3 weeks between two dumps, sometimes with 1 week).
Mar 17 2019
Feb 25 2019
Women in Red relies on Wikidata Human Gender Indicators (WHGI) statistics, which are generated once a week from the Wikidata dump. If not already done, you should get in touch with these communities to evaluate the impact of the change.
Oct 31 2018
Apr 26 2018
Apr 25 2018
Apr 16 2018
Mar 6 2018
Feb 5 2018
Jan 8 2018
Dec 9 2017
As a user, it's back to normal for me. Thanks!
Dec 7 2017
Everything seems slow after a few edits: page loads, auto-completion, saving, etc. It looks like a rate limit is triggered by edits and then impacts every request. After a short time without editing, it's back to normal.
Dec 6 2017
Dec 3 2017
Sorry, I didn't see this task as I wasn't added to it. The problem was solved for me a few hours/days later in August. Thanks :)
Oct 11 2017
@Dereckson Thanks! It seems that there is no IPv6 connectivity (ping6 does not work, even though I have an IPv6 address inside the network). I was also informed that the network relies on two Orange DSL routers; I hope the IPs will not change during the event...
Sep 13 2017
I've at last published a blog post: https://www.lehir.net/the-french-connection-at-the-wikimania-2017-hackathon/
Sep 2 2017
Aug 13 2017
I totally agree with Ash_Crow. Wikidata aims to be the sum of all knowledge that can be referenced, which data about Wikimedians is not. A U namespace could be a solution, but only if each Wikimedian is the only one able to edit their own item.