Jul 27 2018
Etalab (which runs the open data portal of the French government) has released a statement (in French) about the attribution requirement of its "licence ouverte" (open license), confirming that it only applies to the first re-user.
Jul 17 2018
@Chicocvenancio I agree with Yury - it makes it significantly harder to deploy Django projects.
Jun 18 2018
This would be very useful for T197588. It would make a lot of sense for Wikibase Quality Constraints in particular.
Another approach to this problem would be to accept that these manifest files are not necessarily hosted by the Wikibase instance itself: the configuration files could be user-contributed and hosted anywhere (or derived automatically from the Wikibase Registry). The downside is that this requires more work from the community (users need to maintain the manifest files themselves), but it could be necessary if we want to include things such as the URLs of external tools like QuickStatements.
A sample of what such a manifest could look like is here:
It could be served at a predictable location for each Wikibase instance - for instance,
or something similar
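To make the idea concrete, here is a rough sketch in Python of what such a manifest and its predictable location could look like. Every field name and the well-known path below are my own guesses for illustration, not an agreed-on schema:

```python
# Purely illustrative manifest: each field is a guess at what such a
# file could contain, not an agreed-on format.
example_manifest = {
    "name": "Wikibase Registry",
    "api_endpoint": "https://example.org/w/api.php",
    "sparql_endpoint": "https://example.org/query/sparql",
    "external_tools": {
        "quickstatements": "https://example.org/quickstatements/",
    },
}

def manifest_url(base_url):
    """Derive a hypothetical predictable manifest location for an instance.

    The ".well-known" path is a made-up example of a predictable location.
    """
    return base_url.rstrip("/") + "/.well-known/wikibase-manifest.json"

print(manifest_url("https://example.org/"))
```

A client tool could then discover an instance's configuration from its base URL alone, without any per-instance setup.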
Jun 15 2018
Jun 4 2018
Just noting that this prevents us from adding examples on lexeme-related properties, such as https://www.wikidata.org/wiki/Property:P5244.
Jun 2 2018
Thanks for keeping me in the loop! @RazShuty, do you mean any of the following?
- migrate the existing reconciliation service (https://tools.wmflabs.org/openrefine-wikidata/) to work on any Wikibase install
- create a Wikibase extension that already provides a reconciliation API natively, without having to create a wrapper like I did
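For context, a call to the existing wrapper looks roughly like this. The payload shape follows the OpenRefine reconciliation protocol, but the exact endpoint path is from memory, so treat the details as approximate:

```python
import json
from urllib.parse import urlencode

# Endpoint of the existing wrapper (the language-specific path is an
# assumption, not verified against the deployed service).
endpoint = "https://tools.wmflabs.org/openrefine-wikidata/en/api"

# A reconciliation batch: match the string "Douglas Adams" against
# items of type human (Q5), returning at most 3 candidates.
queries = {
    "q0": {"query": "Douglas Adams", "type": "Q5", "limit": 3},
}

# The protocol sends the batch as a JSON-encoded "queries" parameter.
url = endpoint + "?" + urlencode({"queries": json.dumps(queries)})
print(url)
```

A native Wikibase extension would serve the same protocol directly from the wiki, so no separate wrapper service would need to be deployed and maintained per instance.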
May 28 2018
I have observed this bug multiple times now (also using Firefox).
May 27 2018
May 20 2018
May 19 2018
May 18 2018
After discussion with @Tpt, for now we are just going to change Wikidata-Toolkit's behaviour to use 0 in the After parameter as well… but that's just because it's really hard to shift the default now.
Oh I meant 10:30, fixing that now
@bcampbell that would be nice! but only if it's not too much effort :)
May 17 2018
As lower-hanging fruit, we could also "run OAbot on Wikidata", which would basically mean importing the IDs to publication items. @Tpt and I started making a distributed game for that, but I think a lot of these edits could be fully automated. That's a good hackathon-style project if anybody is interested.
May 16 2018
When running software on localhost, the client needs to have OAuth consumer credentials, which are supposed to stay private. If I apply for an OAuth consumer for OpenRefine, I cannot put the credentials in OpenRefine's source code, because that would allow anyone to reuse them for any other application. So every user would need to go through OAuth consumer registration themselves (and then the OAuth login).
Note to self: for this we would need to rethink Wikidata authentication in OpenRefine and migrate it to OAuth. That would include adding OAuth support to Wikidata-Toolkit. This has not been done yet because OAuth is not well suited to open-source software that users run directly on their own machines.
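A sketch of the workaround this implies: each user registers their own consumer and keeps the key/secret in a local config file. The file name and section below are made up for illustration:

```python
import configparser
import os

def load_consumer_credentials(path="~/.openrefine-wikidata-oauth.ini"):
    """Read per-user OAuth consumer credentials from a local file.

    The consumer secret cannot be shipped in public source code (anyone
    could reuse it for another application), so each user registers
    their own consumer and stores the credentials locally. The file
    name and section layout here are hypothetical.
    """
    config = configparser.ConfigParser()
    config.read(os.path.expanduser(path))
    return (config["oauth"]["consumer_key"],
            config["oauth"]["consumer_secret"])
```

This works, but it pushes the whole consumer-registration burden onto every single user, which is exactly the friction described above.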
May 15 2018
As soon as this is supported by the Wikibase API, then it makes sense to build support for this directly in Wikidata-Toolkit. This is something that would be massively useful for many people.
May 7 2018
I won't work on this for the next 2 weeks, the floor is yours!
@Nemo_bis that's probably because the edits were cached after being generated by an earlier version
May 5 2018
It would be fantastic to have more meaningful edit summaries with wbeditentity. It's of course hard to do this in general, but it would be great to have this for some common cases where a short summary seems doable (adding multiple statements with the same property, for instance).
Both properties mentioned above have been created in the meantime:
So this constraint could be useful, I think.
May 4 2018
I am glad I got the discussion going, then: you now have one concrete example to look at (or maybe two? you did not comment on PMC). I think it is fair to say that this is not an isolated case (though I am surprised that you seem, or pretend, not to know about it; maybe for legal reasons?). How do you think these problematic uploads should be treated? If there is a drift between the practices of the community and the rules of the project, that problem should be solved.
I think there are plenty of examples of non-CC0 data being imported in Wikidata.
Apr 11 2018
Apr 9 2018
Mar 23 2018
Okay - I don't know much about the MySQL ecosystem, to be honest, so I cannot really judge (I use Postgres when I can). I haven't run into performance issues with pymysql yet - if you are worried about the impact of this, maybe we can wait and see whether my app (https://tools.wmflabs.org/editgroups/) scales fine in this state first.
Mar 22 2018
Hmmm… I am not sure I understand your reaction… Are you opposing this addition, then? Should I keep monkey-patching my libraries to use the pure-Python alternative? In that case, why is libmysqlclient-dev included in the Python 2 docker image in the first place?
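For the record, the monkey-patch in question follows the usual pattern of registering a replacement module under the name other code imports; pymysql ships this exact trick as pymysql.install_as_MySQLdb(). Here is a dependency-free sketch of the mechanism, using a stub module so it runs without any MySQL driver installed:

```python
import sys
import types

# Register a stand-in module under the name "MySQLdb", which is what
# libraries such as Django's MySQL backend import. pymysql does the
# same thing internally in install_as_MySQLdb(); the stub below only
# illustrates the mechanism.
stub = types.ModuleType("MySQLdb")
stub.connect = lambda **kwargs: ("pretend connection", kwargs)
sys.modules["MySQLdb"] = stub

import MySQLdb  # resolves to the stub registered above

conn = MySQLdb.connect(host="127.0.0.1")
print(conn[0])  # → pretend connection
```

With the real pymysql call in place of the stub, any library importing MySQLdb transparently uses the pure-Python driver, and libmysqlclient-dev is no longer needed at build time.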
Mar 21 2018
https://pypi.python.org/pypi/mysqlclient is the driver recommended as the MySQL backend for Django, and it is compatible with Python 3.
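For reference, a minimal settings fragment using that backend could look like this (database name and credentials are placeholders):

```python
# settings.py fragment: Django's MySQL backend expects a
# MySQLdb-compatible driver, which mysqlclient provides on Python 3.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "mydb",        # placeholder
        "USER": "myuser",      # placeholder
        "PASSWORD": "secret",  # placeholder
        "HOST": "127.0.0.1",
        "PORT": "3306",
    }
}
print(DATABASES["default"]["ENGINE"])
```

The same ENGINE setting also works with pymysql once it is installed as a MySQLdb replacement, which is what makes the two drivers interchangeable from Django's point of view.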
Mar 11 2018
@bd808 thanks a lot!!
Mar 7 2018
Feb 10 2018
This bug seems fairly new; it was probably introduced by a recent code change. I have just run into it and it is definitely new behavior.
Jan 28 2018
Jan 26 2018
I think in general if we have a good pipeline to parse citations, a LOT of people would be interested in that, and yes it would be sad to just throw the results away…
It would make sense, but I fear this might go against some enwp guidelines. Editors are free to choose between citation formats, but citations should be uniform within a given page, so migrating just a few citations on a page is discouraged: http://enwp.org/WP:CITEVAR.
Jan 15 2018
Note: as far as the templates are concerned, it has already been discussed a few times on wiki:
Dec 13 2017
WikiProjects seem to encode the sort of information we are after, indeed. And there are categories for them, so it might be possible to use that with the API…
@Samwalton9 wow, awesome! that's a big one.
Dec 12 2017
@Samwalton9 at some point OAbot proposed RG & Academia.edu links, so I guess that's when we introduced this regex, but no I don't think we filter anything currently.
Dec 1 2017
Hi @Samwalton9 - your refactoring is very sensible, but it switched from http://old.dissem.in to https://dissem.in. There was a bug in that API that I just fixed (I think), so you can try again… but there will be other issues down the line: for instance, this new API also returns ResearchGate URLs, which should probably be filtered out.
Thanks a lot, I had not realized scratch codes could be used in the same way as normal codes.
Nov 20 2017
At Wikicite we have also had a demo of Bilbo, which attempts to parse plain text citations to extract metadata:
I am in touch with the authors of this tool and they were interested in adapting it to wikitext (but no concrete plans yet).
Nov 15 2017
Hi, I have a similar issue. I have enabled 2FA for Wikitech but cannot figure out how to use the scratch codes to regain access to my account. Any idea?
Nov 13 2017
Yep, the issue was introduced when I deployed the new project structure - it's fixed now, I think.
This is not relevant for this project (it applies to the hashtag tool, not oabot).
Oct 30 2017
Oct 28 2017
@Nemo_bis thanks! Yeah, I get the idea, but there is quite a lot of work to go from such a prototype to something solid (for instance, in this state we would query the enwiki page every time we get a candidate edit)… Also, I think the simplest approach would be to store candidate edits in SQL directly, so that we could filter them out easily when we add a new link to the blacklist.
Oct 27 2017
Thanks a lot! It looks great to me. I will merge it in the hope that it does not go against the WMF's policy for third-party content: T166604.
Oct 24 2017
I have regenerated them with ./manage.py makemessages -l qot, which is cleaner.
Oct 23 2017
Weird. Here is what I see:
Do you spot anything wrong? I just deleted and resent the invite.
Here is a link to the invite: https://github.com/dissemin/dissemin/invitation
Note that I haven't invited translatewiki to the organization but to the repository itself.
@Nikerabbit I invited translatewiki's GitHub user to the GitHub project yesterday; should I invite any other username?
Oct 17 2017
@Ocaasi_WMF pong, done. For the favicon you can just add the <link> tag described at https://stackoverflow.com/questions/9943771/adding-a-favicon-to-a-static-html-page#9943801 to the <head></head> section.
Sep 27 2017
@Lydia_Pintscher that makes sense. Okay, thank you to you both, we are on the same page! Given all these tickets on the topic I was worried that I had missed something obvious about this issue…
@Smalyshev thanks for your quick reply! Just for clarity, I am not personally working on the PST; I was just trying to find out if there was any established way to use RDF to represent a data import. If that is the case, then other tools could use that format too (for instance, OpenRefine could export datasets to this format). I'd be happy to work on that, but I can only do it if an RDF model is agreed on.
@Lydia_Pintscher , @Smalyshev and @Tpt : is there any info about how RDF is expected to behave as an import format for Wikidata? As far as I can tell, the RDF that gets fed into the Query Service is not designed for import at all:
Sep 25 2017
Sep 23 2017