
Pintoch
User

User Details

User Since
Nov 9 2016, 7:25 PM (119 w, 3 d)
Availability
Available
LDAP User
Unknown
MediaWiki User
Pintoch [ Global Accounts ]

Recent Activity

Yesterday

Pintoch added a comment to T206392: Redesign rank icons for better visibility.

What is the protocol to go forward on this? Should we hold an RFC on-wiki to let people choose among the possible solutions above?

Fri, Feb 22, 4:41 PM · Wikidata-Frontend, Design, Wikidata

Thu, Feb 21

Pintoch updated subscribers of T204568: Extend message checker framework to support errors that prevent saving.

We have this problem in https://dissem.in/. The project is set up on Translatewiki, the code is hosted on GitHub, and Travis is used for CI. We use Django's localization system, which is based on gettext, and we compile the messages in CI to check that they are valid. Sometimes translators add incorrect translations (for instance translations that do not reuse the same variables as the msgid, or use a different format), and this breaks our build, since any incorrect translation stops the entire compilation process. It is not clear if and how the translation compilation process could be configured to ignore invalid messages.
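
Purely as a sketch of the kind of check involved (it assumes GNU gettext's msgfmt is available, as Django's compilemessages requires; the check_po_files helper is hypothetical), each .po file could be validated on its own so that one bad translation does not abort the whole CI step:

    # Hypothetical CI helper: validate each .po file separately so that a single
    # invalid translation does not abort the entire compilation step.
    # Assumes GNU gettext's msgfmt is on the PATH.
    import pathlib
    import subprocess

    def check_po_files(locale_dir="locale"):
        failures = []
        for po in sorted(pathlib.Path(locale_dir).rglob("*.po")):
            result = subprocess.run(
                ["msgfmt", "--check-format", "-o", "/dev/null", str(po)],
                capture_output=True, text=True,
            )
            if result.returncode != 0:
                failures.append((po, result.stderr.strip()))
        for po, error in failures:
            print(f"Invalid translation in {po}:\n{error}")
        return failures

    if __name__ == "__main__":
        raise SystemExit(1 if check_po_files() else 0)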

Thu, Feb 21, 2:31 PM · translatewiki.net, MediaWiki-extensions-Translate
Pintoch awarded T204568: Extend message checker framework to support errors that prevent saving a Love token.
Thu, Feb 21, 2:25 PM · translatewiki.net, MediaWiki-extensions-Translate

Tue, Feb 19

Manu1400 awarded T197587: Add WikibaseQualityConstraints to the docker image a Like token.
Tue, Feb 19, 9:43 PM · Wikibase-Quality, Wikibase-Quality-Constraints, Wikidata, Wikibase-Containers

Mon, Feb 11

Pintoch added a comment to T215789: Cannot start oabot: ImportError: liblua5.1.so.0.

Any help with finishing the migration is welcome of course, I am currently busy with dissemin but I will try to come back to this at some point.

Mon, Feb 11, 1:47 PM · OABot
Pintoch added a comment to T215789: Cannot start oabot: ImportError: liblua5.1.so.0.

@Samwalton9 yes that is due to me starting the migration… and not completing it yet!

Mon, Feb 11, 1:07 PM · OABot

Sat, Feb 2

Pintoch added a comment to T57755: Allow time values more precise than day.

I have updated the Wikibase data model docs, which incorrectly mentioned precisions of hours, minutes and seconds. I assume that they were there because they were part of an earlier design?

Sat, Feb 2, 6:36 PM · Wikidata, MediaWiki-extensions-WikibaseRepository

Fri, Jan 25

Pintoch added a comment to T206392: Redesign rank icons for better visibility.

Useful solution from Nikki: add in your common.css:

Fri, Jan 25, 3:05 PM · Wikidata-Frontend, Design, Wikidata

Jan 9 2019

Pintoch added a comment to T213012: Enable the Watchlist Messages gadget in Wikidata.

I have pinged a few interface admins on wiki to enable this.

Jan 9 2019, 10:01 AM · Wikidata-Gadgets, Wikidata

Jan 7 2019

Pintoch added a comment to T213012: Enable the Watchlist Messages gadget in Wikidata.

Oh can they? Sorry I had no idea! Thanks, I will try to enable it myself.

Jan 7 2019, 4:26 PM · Wikidata-Gadgets, Wikidata

Jan 5 2019

Pintoch added a comment to T205017: Investigation: Look at what gadgets it might make sense to pull into Wikibase.
Jan 5 2019, 11:32 PM · Wikidata-Gadgets, Wikibase-Containers, Wikidata, Federated-Wikibase-Workshops@NewYork-2018
Pintoch created T213012: Enable the Watchlist Messages gadget in Wikidata.
Jan 5 2019, 11:23 PM · Wikidata-Gadgets, Wikidata
Pintoch added a comment to T139898: Tool to facilitate property creation process.

I currently use my own custom hacky script to create properties, but having something stable and usable by anyone would be highly beneficial.

Jan 5 2019, 10:56 PM · Wikidata, Wikidata-Gadgets
Pintoch awarded T139898: Tool to facilitate property creation process a Like token.
Jan 5 2019, 10:54 PM · Wikidata, Wikidata-Gadgets

Dec 2 2018

Pintoch added a comment to T207484: API to efficiently format large numbers of entity IDs.

@Lucas_Werkmeister_WMDE thank you very much for that!

Dec 2 2018, 2:18 AM · MW-1.33-notes (1.33.0-wmf.6; 2018-11-27), Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Patch-For-Review, Wikidata
Pintoch awarded T207484: API to efficiently format large numbers of entity IDs a 100 token.
Dec 2 2018, 2:18 AM · MW-1.33-notes (1.33.0-wmf.6; 2018-11-27), Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Patch-For-Review, Wikidata
Pintoch awarded T207484: API to efficiently format large numbers of entity IDs an Orange Medal token.
Dec 2 2018, 2:17 AM · MW-1.33-notes (1.33.0-wmf.6; 2018-11-27), Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Patch-For-Review, Wikidata
Pintoch awarded T207484: API to efficiently format large numbers of entity IDs a Love token.
Dec 2 2018, 2:17 AM · MW-1.33-notes (1.33.0-wmf.6; 2018-11-27), Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Patch-For-Review, Wikidata

Nov 12 2018

Pintoch updated subscribers of T209031: Not able to scoop comment table in labs for mediawiki reconstruction process [EPIC].

I have taken the liberty of removing "Cloud Services" as a subscriber to this ticket, as I do not think every toollabs user wants to receive notifications about this.

Nov 12 2018, 3:52 PM · Patch-For-Review, Core Platform Team Backlog (Watching / External), Analytics-Kanban, DBA, Data-Services, Analytics

Nov 6 2018

Pintoch awarded T208118: Import arXiv ID (P818) and "full work available at" (P953) from unpaywall dataset a Love token.
Nov 6 2018, 9:55 AM · WikiCite, Wikistorm

Nov 5 2018

Pintoch added a comment to T194813: Find ways to integrate/improve OABot with WikiCite efforts.

As explained in T164152 I am happy to mentor anyone for this.

Nov 5 2018, 8:35 AM · Wikimedia-Hackathon-2018, WikiCite, OABot
Pintoch updated subscribers of T164152: Integrate with Wikidata.

@Daniel_Mietchen regarding https://twitter.com/EvoMRI/status/1055785761574813696 (I do not read Twitter notifications - but happily interact on open platforms such as Mastodon):

Nov 5 2018, 8:33 AM · OABot

Nov 2 2018

Pintoch awarded T199228: Define an SLO for Wikidata Query Service public endpoint and communicate it a Like token.
Nov 2 2018, 3:03 PM · Operations, Discovery-Wikidata-Query-Service-Sprint, Wikidata, Wikidata-Query-Service
Pintoch added a comment to T199228: Define an SLO for Wikidata Query Service public endpoint and communicate it.

The search interface can also be used for that, thanks to the haswbstatement command. That only gets you one id per query, so it might not be suited for all tools, and I don't know whether the lag is lower in this interface.
Retrieving items by identifiers is quite crucial in many tools, so it would be useful to have a solid interface for that instead of relying on SPARQL (which indeed feels like using a sledgehammer to crack a nut).
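
For illustration, a minimal sketch of such a lookup through the search API (assuming the requests library; P356, the DOI property, is used purely as an example identifier, and the result depends on what is currently indexed):

    # Look up the item holding a given external-identifier statement via the
    # haswbstatement search keyword. One identifier per query, as noted above.
    import requests

    def item_with_statement(prop, value):
        params = {
            "action": "query",
            "list": "search",
            "srsearch": f'haswbstatement:{prop}="{value}"',
            "format": "json",
        }
        r = requests.get("https://www.wikidata.org/w/api.php", params=params, timeout=10)
        r.raise_for_status()
        hits = r.json()["query"]["search"]
        return hits[0]["title"] if hits else None  # an item id such as "Q42", or None

    print(item_with_statement("P356", "10.1371/JOURNAL.PONE.0029797"))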

Nov 2 2018, 2:52 PM · Operations, Discovery-Wikidata-Query-Service-Sprint, Wikidata, Wikidata-Query-Service
Pintoch added a comment to T199228: Define an SLO for Wikidata Query Service public endpoint and communicate it.

@Gehel my service has been quite unstable for some time, but I haven't found the time yet to find out exactly where the problem is coming from - it could be SPARQL, the Wikidata API, redis or the webservice itself. I will add a few more metrics to understand what is going on and report back here.

Nov 2 2018, 1:54 PM · Operations, Discovery-Wikidata-Query-Service-Sprint, Wikidata, Wikidata-Query-Service

Nov 1 2018

Pintoch added a comment to T200234: Create edit groups when running Wikidata-related scripts.

@Criscod yes that would be a great idea.

Nov 1 2018, 11:50 AM · Pywikibot, Pywikibot-Wikidata

Oct 31 2018

Pintoch added a comment to T199228: Define an SLO for Wikidata Query Service public endpoint and communicate it.

Thanks for the ping, Lydia! Off the top of my head, the only uses of SPARQL in the tools I maintain are in the openrefine-wikidata interface:

  • queries to retrieve the list of subclasses of a given class - lag is not critical at all for this as the ontology is assumed to be stable. (These results are cached on my side for 24 hours, for any root class.)
  • queries to retrieve items by external identifiers or sitelinks - lag can be more of an issue for this but I would not consider it critical. (These results are not cached.)

What matters much more for this tool is getting quick results and as little downtime as possible - lag is not really a concern.
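
As a rough sketch of the first kind of query (assuming the requests library, with a plain in-process dict standing in for the 24-hour redis cache):

    # Retrieve the transitive subclasses of a root class (P279 = "subclass of")
    # and cache the result for 24 hours, as described above.
    import time
    import requests

    WDQS = "https://query.wikidata.org/sparql"
    _cache = {}  # root class QID -> (timestamp, set of subclass QIDs)

    def subclasses_of(qid, ttl=24 * 3600):
        cached = _cache.get(qid)
        if cached and time.time() - cached[0] < ttl:
            return cached[1]
        query = "SELECT ?c WHERE { ?c wdt:P279* wd:%s }" % qid
        r = requests.get(WDQS, params={"query": query, "format": "json"}, timeout=60)
        r.raise_for_status()
        subclasses = {b["c"]["value"].rsplit("/", 1)[1]
                      for b in r.json()["results"]["bindings"]}
        _cache[qid] = (time.time(), subclasses)
        return subclasses

    print(len(subclasses_of("Q3305213")))  # e.g. subclasses of "painting"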

Oct 31 2018, 7:48 PM · Operations, Discovery-Wikidata-Query-Service-Sprint, Wikidata, Wikidata-Query-Service

Oct 29 2018

Pintoch added a comment to T207839: Batch add WO II war memorials to Wikidata.

Just to let you know that the problem with the ".0" will be solved in the next version of OpenRefine.
In the meantime, you can solve the issue by transforming your column with the following expression: value.toString().replace(".0",""). Hope it helps!

Oct 29 2018, 7:56 PM · Wikidata, Wikistorm

Oct 27 2018

Pintoch awarded T206755: Understand steps to get structured data from Excel-workbook into Wikidata a Love token.
Oct 27 2018, 1:09 PM · Wikistorm
Pintoch added a comment to T206755: Understand steps to get structured data from Excel-workbook into Wikidata.

Woooohooooo!!!!

Oct 27 2018, 1:09 PM · Wikistorm
Pintoch added a comment to T208034: OpenRefine demo at Wikistorm.

So I had the opportunity to annoy a lot of people by shouting OpenRefine repeatedly in their ears over the past 48 hours.

Oct 27 2018, 1:08 PM · Wikistorm

Oct 26 2018

Pintoch added a comment to T207839: Batch add WO II war memorials to Wikidata.

Awesome! \o/ Actually OpenRefine could potentially help you already at that stage to do the matching - let me know if you want a quick demo :)

Oct 26 2018, 7:35 PM · Wikidata, Wikistorm
Pintoch moved T208034: OpenRefine demo at Wikistorm from Backlog to Session on the Wikistorm board.
Oct 26 2018, 6:22 PM · Wikistorm
Pintoch added a comment to T208076: We would like some support for OpenRefine.

I would be happy to help. I have a t-shirt with an OpenRefine logo (the blue diamond).

Oct 26 2018, 6:22 PM · Wikistorm
Pintoch created T208034: OpenRefine demo at Wikistorm.
Oct 26 2018, 11:16 AM · Wikistorm
Pintoch moved T208018: Wikicite session from Backlog to Session on the Wikistorm board.
Oct 26 2018, 9:02 AM · Wikistorm
Pintoch updated the task description for T208018: Wikicite session.
Oct 26 2018, 9:00 AM · Wikistorm
Pintoch created T208018: Wikicite session.
Oct 26 2018, 8:58 AM · Wikistorm
Pintoch added a comment to T207865: Improve Wikidata's dataset import hub.

I have left some ideas here:

Oct 26 2018, 8:15 AM · Wikistorm

Oct 24 2018

Pintoch moved T207865: Improve Wikidata's dataset import hub from Backlog to Project on the Wikistorm board.
Oct 24 2018, 5:26 PM · Wikistorm
Pintoch created T207865: Improve Wikidata's dataset import hub.
Oct 24 2018, 4:41 PM · Wikistorm
Pintoch renamed T205488: Import villages of Aruba to Wikidata from "Write script to import villages of Aruba to Wikidata" to "Import villages of Aruba to Wikidata".
Oct 24 2018, 2:47 PM · Wikistorm
Pintoch added a comment to T206755: Understand steps to get structured data from Excel-workbook into Wikidata.

I will be available to help with OpenRefine. It is designed exactly for this kind of workflow, so I hope it will be a match :)
For reconciliation help, have you seen this page?
https://github.com/OpenRefine/OpenRefine/wiki/Reconciliation

Oct 24 2018, 2:47 PM · Wikistorm
Pintoch claimed T207839: Batch add WO II war memorials to Wikidata.

I would be interested in helping with this - I can guide you through the uploading process with OpenRefine.
If you want to prepare for this, feel free to download OpenRefine and have a look at tutorials like these:

The videos at http://openrefine.org/ are also useful to get an idea of what OpenRefine does (with no reference to Wikidata).

Oct 24 2018, 2:41 PM · Wikidata, Wikistorm

Oct 19 2018

Pintoch added a comment to T207370: Statistics of number of Wikidata edits with Magnus Manske's tools.

Some of the OpenRefine edits were not tagged during development, but all edits done with a released version should be. Some OpenRefine batches are uploaded via QuickStatements, in which case they are tagged as such. (The main benefits of using QS with OpenRefine are running batches in the background and having statement matching rules applied when updating existing claims.)

Oct 19 2018, 7:06 AM · GLAM, Wikidata

Oct 13 2018

Pintoch added a comment to T205488: Import villages of Aruba to Wikidata.

Sure, happy to help any time! (Online or at the Wiki TechStorm)

Oct 13 2018, 12:28 PM · Wikistorm

Oct 12 2018

Pintoch closed T192811: Provide an OpenRefine API for matching by class and properties as Invalid.
Oct 12 2018, 9:23 AM · Wikidata, Federated-Wikibase-Workshops@Antwerp-2018
Pintoch added a comment to T192811: Provide an OpenRefine API for matching by class and properties.

I think this ticket can be closed given that we cannot figure out what it is supposed to be about.

Oct 12 2018, 9:22 AM · Wikidata, Federated-Wikibase-Workshops@Antwerp-2018

Sep 28 2018

Pintoch awarded T169666: Render partial results a Love token.
Sep 28 2018, 10:26 AM · Wikidata Query UI, Discovery, Wikidata

Sep 25 2018

Pintoch awarded T178249: Parameter for linking a new page to the Wikidata a Like token.
Sep 25 2018, 10:41 AM · Wikidata

Sep 19 2018

Pintoch added a comment to T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache.

I was thinking of the opposite: consider the violations related to the revision R of the item I to be the violations of the statements of I with respect to the state of Wikidata just before R+1 was saved.

Sep 19 2018, 5:07 PM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata
Pintoch added a comment to T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache.

@Lydia_Pintscher yes indeed! For instance the aggregation at batch-level would probably not be meaningful for inverse constraints (unless there is a way to detect all the violations added and solved by an edit, not just on the item where the edit was made). But isn't this a problem that you have anyway, even when storing only the latest violations? For instance, if I add a "subclass of (P279)" statement between two items, don't you need to recompute type violations for all items which are instances of some transitive subclass of the new subclass? I am not sure how this invalidation is done at the moment.

Sep 19 2018, 8:03 AM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata

Sep 18 2018

Pintoch added a comment to T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache.

@Lydia_Pintscher personally here is what I would concretely implement in the EditGroups tool. For each edit that is part of an edit group:

  • fetch the constraint violations before and after the edit (this fetching would happen as the edit is retrieved, so in near real-time)
  • compute the difference in constraint violations of each type (for instance, 1 new "value type constraint" violation and 2 fewer "statement required constraint" violations)
  • aggregate these statistics at the batch level and expose them in batch views (for instance, this batch added 342 new "value type constraint" violations and solved 764 "statement required constraint" violations)

Together with the number of reverted edits in a batch (which the tool already aggregates), this could potentially make it easier to spot problematic batches.
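
A minimal sketch of that aggregation (all names here are hypothetical; fetch_violations stands in for whatever API would expose the stored constraint check results of a revision):

    # Aggregate, over all edits of a batch, the constraint violations added and
    # solved, grouped by constraint type.
    from collections import Counter

    def violation_diff(before, after):
        """Violations added and solved between two lists of violation type names."""
        return Counter(after) - Counter(before), Counter(before) - Counter(after)

    def aggregate_batch(edits, fetch_violations):
        added, solved = Counter(), Counter()
        for edit in edits:
            before = fetch_violations(edit.old_revision)  # fetched in near real-time
            after = fetch_violations(edit.new_revision)
            new, fixed = violation_diff(before, after)
            added += new
            solved += fixed
        # e.g. added == {"value type constraint": 342},
        #      solved == {"statement required constraint": 764}
        return added, solved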

Sep 18 2018, 9:26 AM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata

Sep 17 2018

Pintoch updated subscribers of T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache.

This ticket is fantastic news.

Sep 17 2018, 4:19 PM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata
Pintoch awarded T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache a Love token.
Sep 17 2018, 4:11 PM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata
Pintoch awarded T202404: investigate options for regularly running constraint checks a Love token.
Sep 17 2018, 4:07 PM · User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), wikidata-tech-focus, Wikidata-Query-Service, Wikibase-Quality, Wikidata, Wikibase-Quality-Constraints
Pintoch awarded T204022: Add functionality to run QualityConstraint checks on an entity after every edit a Love token.
Sep 17 2018, 4:07 PM · MW-1.33-notes (1.33.0-wmf.14; 2019-01-22), Patch-For-Review, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Story, wikidata-tech-focus, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata

Sep 16 2018

Pintoch added a comment to T174540: Use open access URL via oaDOI in citoid response.

@martin.monperrus see my first comment in this thread.

Sep 16 2018, 2:14 PM · Citoid
Pintoch added a comment to T174540: Use open access URL via oaDOI in citoid response.

@martin.monperrus Let me emphasize that this is a significant change that should get community approval first. There has already been a lot of discussion about similar changes to the DOI template on the English Wikipedia and there is clearly a consensus against this IMHO.

Sep 16 2018, 7:44 AM · Citoid

Sep 14 2018

Pintoch added a comment to T204267: Flood of WDQS requests from wbqc.

@aborrero thanks for the ping. I do not recognize the shape of the queries as coming from this tool though. The openrefine-wikidata tool should do relatively few SPARQL queries, whose results are cached in redis. How did you determine that this tool is the source of the problem?

Sep 14 2018, 10:57 AM · Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Cloud-Services, Operations, User-Addshore, Wikibase-Quality, Wikidata, Wikidata-Query-Service, Wikibase-Quality-Constraints

Sep 5 2018

Pintoch added a comment to T202729: When creating a new Sense through wbeditentity the summary is confusing "Created a new entity".

@Lydia_Pintscher @Ladsgroup any idea how I could be notified of any new automatic edit summaries, such as the wbeditentity-create-item that this change introduced? For any such summary, I need to add it to EditGroups, especially if the new auto summary replaces a highly-used existing one, as in this case. Otherwise, this breaks the tagging of batches.

Sep 5 2018, 5:34 PM · wikidata-tech-focus, MW-1.32-notes (WMF-deploy-2018-09-04 (1.32.0-wmf.20)), Patch-For-Review, Wikidata-Senses-Iteration3, User-Ladsgroup, Lexicographical data, Wikidata
Pintoch added a comment to T186200: Rewrite Wikibase data model implementation.

I think reworking this implementation would be very welcome, because at the moment it is not pretty, to put it politely.
But I am not convinced by the alternative either. Why would Reference inherit from BaseClaim? A reference is not a claim. What would the getSnakType method mean when called on a Reference?

Sep 5 2018, 12:39 PM · Pywikibot-RfCs, Pywikibot-Wikidata, Pywikibot
Pintoch awarded T203557: Create a Edit group extension a Love token.
Sep 5 2018, 12:37 PM · MediaWiki-extension-requests
Pintoch added a comment to T200234: Create edit groups when running Wikidata-related scripts.

It might be worth giving the bot author some control over this feature:

  • there should be some opt-in / opt-out mechanism
  • there should be some control over what constitutes a batch. Some users might want to create multiple logical batches during the same run of a bot, or share the same batch id across consecutive runs of the same Python script (for instance if it is called by a bash script…); see the sketch below.
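
A sketch of what such an opt-in mechanism could look like (the class and option names are hypothetical; it assumes the existing EditGroups convention of recognising a batch through a suffix appended to the edit summary):

    # Let the bot author opt in or out and decide what constitutes a batch,
    # including reusing one batch id across consecutive runs of the same script.
    import secrets

    class EditGroupConfig:
        def __init__(self, enabled=True, tool="pwb", batch_id=None):
            self.enabled = enabled                    # opt-in / opt-out switch
            self.tool = tool
            # passing a fixed batch_id lets several runs share the same batch
            self.batch_id = batch_id or secrets.token_hex(6)

        def decorate_summary(self, summary):
            if not self.enabled:
                return summary
            return (f"{summary} "
                    f"([[:toollabs:editgroups/b/{self.tool}/{self.batch_id}|details]])")

    cfg = EditGroupConfig(batch_id="1a2b3c4d5e6f")
    print(cfg.decorate_summary("Adding P31 statements"))
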
Sep 5 2018, 11:02 AM · Pywikibot, Pywikibot-Wikidata

Jul 27 2018

Pintoch added a comment to T193728: Address concerns about perceived legal uncertainty of Wikidata .

Etalab (which runs the open data portal of the French government) has released a statement (in French) concerning the attribution requirement of their "licence ouverte", confirming that it only applies to the first re-user.
https://github.com/etalab/wiki-data-gouv#point-juridique

Jul 27 2018, 8:07 AM · WMF-Legal, Wikidata

Jul 17 2018

Pintoch added a comment to T190274: Add libmysqlclient-dev to Toolforge Kubernetes Python3 runtime image.

@Chicocvenancio I agree with Yury - it makes it significantly harder to deploy Django projects.

Jul 17 2018, 2:44 PM · Patch-For-Review, cloud-services-team, Toolforge

Jun 18 2018

Pintoch added a comment to T155155: Implement a way to view (extension) configuration options and its value on-wiki.

This would be very useful for T197588. It would make a lot of sense for Wikibase Quality Constraints in particular.

Jun 18 2018, 2:17 PM · MediaWiki-Configuration
Pintoch awarded T155155: Implement a way to view (extension) configuration options and its value on-wiki a Love token.
Jun 18 2018, 2:14 PM · MediaWiki-Configuration
Pintoch added a comment to T197588: Agree on a "manifest" format to expose the configuration of Wikibase instances.

Another approach to this problem would be to consider that these manifest files are not necessarily hosted by the Wikibase instance itself: the configuration files could be user-contributed and hosted anywhere (or derived automatically from the Wikibase Registry). The downside is that this requires more work from the community (users need to maintain the manifest files themselves), but it could be necessary if we want to include things like URLs of external tools such as QuickStatements.

Jun 18 2018, 1:40 PM · Wikidata
Pintoch added a comment to T197588: Agree on a "manifest" format to expose the configuration of Wikibase instances.

A sample of what such a manifest could look like is here:
https://gist.github.com/despens/d6ae4110c4e97944ddba29f23d78899f
It could be served at a predictable location for each Wikibase instance, for instance
https://www.wikidata.org/manifest-v0.1.json
or something similar.
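
Purely as an illustration (the field names below are invented for this sketch and are not taken from the linked gist), such a manifest might list the API and SPARQL endpoints together with external-tool URLs like QuickStatements:

    # Hypothetical shape of a Wikibase manifest, serialized as JSON.
    import json

    manifest = {
        "name": "Wikidata",
        "api_endpoint": "https://www.wikidata.org/w/api.php",
        "sparql_endpoint": "https://query.wikidata.org/sparql",
        "external_tools": {
            "quickstatements": "https://tools.wmflabs.org/quickstatements/",
        },
    }
    print(json.dumps(manifest, indent=2))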

Jun 18 2018, 1:11 PM · Wikidata
Pintoch added a project to T197587: Add WikibaseQualityConstraints to the docker image: Wikibase-Quality-Constraints.
Jun 18 2018, 12:10 PM · Wikibase-Quality, Wikibase-Quality-Constraints, Wikidata, Wikibase-Containers
Pintoch created T197588: Agree on a "manifest" format to expose the configuration of Wikibase instances.
Jun 18 2018, 12:00 PM · Wikidata
Pintoch created T197587: Add WikibaseQualityConstraints to the docker image.
Jun 18 2018, 11:54 AM · Wikibase-Quality, Wikibase-Quality-Constraints, Wikidata, Wikibase-Containers

Jun 15 2018

Pintoch awarded T168626: Check constraints before saving statements a Love token.
Jun 15 2018, 10:40 AM · Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata

Jun 4 2018

Pintoch added a comment to T195615: handle use of statements linking to Lexemes (and Forms?) more gracefully on client.

Just noting that this prevents us from adding examples on lexeme-related properties, such as https://www.wikidata.org/wiki/Property:P5244.

Jun 4 2018, 12:49 PM · MW-1.32-notes (WMF-deploy-2018-07-10 (1.32.0-wmf.12)), Wikidata-Editor-Experience-Improvements-Iteration1, Wikidata-Campsite, Patch-For-Review, User-Addshore, Wikidata-Turtles-Sprint #5, Wikidata, Lexicographical data

Jun 2 2018

Pintoch added a comment to T192811: Provide an OpenRefine API for matching by class and properties.

Thanks for adding me to the loop! @RazShuty, do you mean any of this?

  • migrate the existing reconciliation service (https://tools.wmflabs.org/openrefine-wikidata/) to work on any Wikibase install
  • create a Wikibase extension that already provides a reconciliation API natively, without having to create a wrapper like I did

  • anything else?

Jun 2 2018, 8:55 AM · Wikidata, Federated-Wikibase-Workshops@Antwerp-2018

May 28 2018

Pintoch added a comment to T195258: [weird] Wikidata SPARQL query results not the same when exported.

I have observed this bug multiple times now (also using Firefox).

May 28 2018, 9:19 AM · Patch-For-Review, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Wikidata Query UI, Wikidata

May 27 2018

Pintoch closed T194952: Introducing the EditGroups tool as Resolved.
May 27 2018, 10:13 AM · Wikimedia-Hackathon-2018, Wikidata
Pintoch closed T193875: Improving Wikidata reconciliation in OpenRefine as Resolved.
May 27 2018, 9:41 AM · Wikidata, Wikimedia-Hackathon-2018

May 20 2018

Pintoch awarded T67846: wbeditentity: try to use appropriate autocomment instead of the generic one a Love token.
May 20 2018, 1:50 PM · Patch-For-Review, Wikidata, MediaWiki-extensions-WikibaseRepository

May 19 2018

SandraF_WMF awarded T193875: Improving Wikidata reconciliation in OpenRefine a Burninate token.
May 19 2018, 12:06 PM · Wikidata, Wikimedia-Hackathon-2018
Pintoch updated subscribers of T193875: Improving Wikidata reconciliation in OpenRefine.

@Spinster @DarTar That's in 20 minutes in Sala de Projectes! QC/0011

May 19 2018, 11:41 AM · Wikidata, Wikimedia-Hackathon-2018

May 18 2018

Pintoch updated subscribers of T194869: Wikibase date datatype: default value of the "after" parameter.

After discussion with @Tpt, for now we are just going to change Wikidata-Toolkit's behaviour to use 0 in the After parameter as well… but that's just because it's really hard to shift the default now.
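
For context, a sketch of the Wikibase time value structure that this parameter belongs to, with "after" left at the default of 0 discussed here (the concrete values are an arbitrary example):

    # Example Wikibase time value (as found in the JSON serialization of items),
    # with the "after" uncertainty range left at 0.
    time_value = {
        "time": "+2018-05-18T00:00:00Z",
        "timezone": 0,
        "before": 0,
        "after": 0,
        "precision": 11,  # 11 = day precision
        "calendarmodel": "http://www.wikidata.org/entity/Q1985727",  # proleptic Gregorian
    }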

May 18 2018, 8:25 PM · Wikibase-DataModel, Wikidata
Pintoch added a comment to T194952: Introducing the EditGroups tool.

Oh I meant 10:30, fixing that now

May 18 2018, 2:57 PM · Wikimedia-Hackathon-2018, Wikidata
Pintoch added a comment to T194952: Introducing the EditGroups tool.

@bcampbell that would be nice! but only if it's not too much effort :)

May 18 2018, 2:48 PM · Wikimedia-Hackathon-2018, Wikidata
Pintoch updated subscribers of T194952: Introducing the EditGroups tool.
May 18 2018, 2:16 PM · Wikimedia-Hackathon-2018, Wikidata
Pintoch created T194952: Introducing the EditGroups tool.
May 18 2018, 2:14 PM · Wikimedia-Hackathon-2018, Wikidata

May 17 2018

Pintoch updated subscribers of T194813: Find ways to integrate/improve OABot with WikiCite efforts.

As a lower hanging fruit, we can also "run OAbot on Wikidata", which would basically mean importing the ids to publication items. @Tpt and I started making a distributed game for that but I think a lot of these could be fully automated. That's a good hackathon-style project if anybody is interested.

May 17 2018, 9:02 AM · Wikimedia-Hackathon-2018, WikiCite, OABot
Pintoch added a project to T194869: Wikibase date datatype: default value of the "after" parameter: Wikibase-DataModel.
May 17 2018, 8:18 AM · Wikibase-DataModel, Wikidata
Pintoch updated the task description for T194869: Wikibase date datatype: default value of the "after" parameter.
May 17 2018, 8:17 AM · Wikibase-DataModel, Wikidata
Pintoch updated the task description for T194869: Wikibase date datatype: default value of the "after" parameter.
May 17 2018, 8:16 AM · Wikibase-DataModel, Wikidata
Pintoch created T194869: Wikibase date datatype: default value of the "after" parameter.
May 17 2018, 8:15 AM · Wikibase-DataModel, Wikidata

May 16 2018

Pintoch added a comment to T194767: Set up OpenRefine on Cloud VPS.

When running software on localhost, the client needs to have OAuth consumer credentials, which are supposed to be private. If I apply for an OAuth consumer for OpenRefine, I cannot put the credentials in OpenRefine's source code, because it would allow anyone to reuse them for any other application. So every user would need to go through the OAuth registration themselves (and then OAuth login).

May 16 2018, 12:29 PM · Wikimedia-Hackathon-2018, Wikidata
Pintoch added a comment to T194767: Set up OpenRefine on Cloud VPS.

Note to self: for this we would need to rethink Wikidata authentication in OpenRefine, migrating it to OAuth. This would include adding OAuth support in Wikidata-Toolkit. This has not been done yet because OAuth is not suited for open source software that is run directly by the user on their own machine.

May 16 2018, 8:29 AM · Wikimedia-Hackathon-2018, Wikidata

May 15 2018

Pintoch added a comment to T194194: Add possibility to check constraints on unsaved statements.

As soon as this is supported by the Wikibase API, it makes sense to build support for it directly in Wikidata-Toolkit. This is something that would be massively useful for many people.

May 15 2018, 2:16 PM · Wikimedia-Hackathon-2018, Wikidata-Ministry-Of-Magic, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata
Pintoch awarded T194194: Add possibility to check constraints on unsaved statements a Love token.
May 15 2018, 1:43 PM · Wikimedia-Hackathon-2018, Wikidata-Ministry-Of-Magic, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata

May 7 2018

Pintoch added a comment to T187791: Strip .pdf from arxiv identifiers before adding them.

I won't work on this for the next 2 weeks, so the floor is yours!

May 7 2018, 6:28 AM · OABot
Pintoch added a comment to T187791: Strip .pdf from arxiv identifiers before adding them.

@Nemo_bis that's probably because the edits were cached and generated by an earlier version

May 7 2018, 6:01 AM · OABot

May 5 2018

Pintoch awarded T192565: Find constraint violations a Like token.
May 5 2018, 2:21 PM · Wikibase-Quality, Wikidata, Wikibase-Quality-Constraints, Epic
Pintoch awarded T191885: Reflect atomic changes in wbeditentity summary a Like token.
May 5 2018, 2:19 PM · Wikidata, Lexicographical data