
Pintoch
User

User Details

User Since
Nov 9 2016, 7:25 PM (127 w, 3 d)
Availability
Available
LDAP User
Unknown
MediaWiki User
Pintoch [ Global Accounts ]

Recent Activity

Tue, Apr 16

Pintoch added a comment to T220415: Add API module to get language information.

Let me spell out my use case in more detail then.

Tue, Apr 16, 12:42 PM · Wikimedia-Hackathon-2019, MediaWiki-API, MediaWiki-Internationalization

Mon, Apr 15

Pintoch added a comment to T87283: Wikidata dumps should have revision ID or other sequence mark.

Ok great! I'll move the field to the end and try to make Jenkins happy then.

Mon, Apr 15, 12:44 PM · Patch-For-Review, Wikidata
Pintoch added a comment to T194813: Find ways to integrate/improve OABot with WikiCite efforts.

Sure, it also makes sense to add full text URLs.

Mon, Apr 15, 11:54 AM · Wikimedia-Hackathon-2018, WikiCite, OABot

Tue, Apr 9

Pintoch updated subscribers of T87283: Wikidata dumps should have revision ID or other sequence mark.

@Lydia_Pintscher we would need your thoughts about this.

Tue, Apr 9, 3:05 PM · Patch-For-Review, Wikidata
Pintoch added a comment to T204440: analyze and visualize the identifier landscape of Wikidata.

This is nice! However, when visualizing properties by category, it seems that subclasses are not taken into account: only the properties bearing that exact category as P31 value are listed. This gives a pretty inaccurate view: it is crucial to respect the subclass hierarchy, just like the prop-explorer tool does:
https://tools.wmflabs.org/prop-explorer/

Tue, Apr 9, 12:07 PM · WMDE-Analytics-Engineering, User-GoranSMilovanovic, Wikidata
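
A minimal sketch (not part of the original thread) of the kind of query the comment above asks for: counting properties attached to a category through the P279* subclass path instead of an exact P31 match. CATEGORY_QID is a placeholder, not a real item id.

    import requests

    WDQS_ENDPOINT = "https://query.wikidata.org/sparql"
    CATEGORY_QID = "Q00000"  # placeholder: the property category being visualized

    # Count properties whose P31 value is the category itself *or* any of its
    # (transitive) subclasses, via the P279* path.
    query = f"""
    SELECT (COUNT(DISTINCT ?prop) AS ?count) WHERE {{
      ?prop wdt:P31/wdt:P279* wd:{CATEGORY_QID} .
    }}
    """

    response = requests.get(
        WDQS_ENDPOINT,
        params={"query": query, "format": "json"},
        headers={"User-Agent": "identifier-landscape-example/0.1"},
    )
    response.raise_for_status()
    print(response.json()["results"]["bindings"][0]["count"]["value"])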

Mon, Apr 8

Pintoch awarded T220415: Add API module to get language information a Like token.
Mon, Apr 8, 3:43 PM · Wikimedia-Hackathon-2019, MediaWiki-API, MediaWiki-Internationalization

Tue, Apr 2

Pintoch added a comment to T216160: Update wikidata-entities dump generation to fixed day-of-month instead of fixed weekday.

I agree with @Nicolastorzec above.

Tue, Apr 2, 11:11 PM · Patch-For-Review, WikiCite, Analytics, Dumps-Generation, Wikidata
Pintoch added a comment to T87283: Wikidata dumps should have revision ID or other sequence mark.

@Smalyshev okay! Sorry if this is not the right place: I would be happy to migrate the patch to another ticket. Indeed this only adds entity-level metadata, not dump-level metadata. I think this would be less of a breaking change, given that it does not require changing the dump structure (and of course it is more useful to me, haha!)

Tue, Apr 2, 6:59 PM · Patch-For-Review, Wikidata
Pintoch added a comment to T94019: Generate RDF from JSON.

I think Wikidata-Toolkit could be used for that:
https://github.com/Wikidata/Wikidata-Toolkit/blob/master/wdtk-rdf/src/main/java/org/wikidata/wdtk/rdf/RdfSerializer.java
Obviously it would mean making sure the RDF serialization it produces is consistent with what is being fed into WDQS at the moment.

Tue, Apr 2, 2:28 PM · Wikidata
Pintoch added a comment to T87283: Wikidata dumps should have revision ID or other sequence mark.

I am wondering what the status of this is: is more discussion needed about what version information to include, or are we simply waiting for a patch? I vote for the revision id to serve as version id (possibly with other metadata such as timestamp, as in Special:EntityData). If there is consensus for that, and if directed to the relevant part of the code, I could contribute a patch.

Tue, Apr 2, 9:14 AM · Patch-For-Review, Wikidata
Pintoch added a comment to T92961: [Story] Versioning in JSON output.

Concerning the dumps, it should be possible to add versioning information on a per-entity basis, for instance by adding the revision id in the JSON serialization of the entity, as is currently done in Special:EntityData. This would arguably be more useful than per-dump versioning, given that the dump generation process is not atomic. It would also be less of a breaking change: it would just amount to making the JSON serialization of entities more uniform. This is debated in T87283.

Tue, Apr 2, 9:12 AM · Story, Wikidata, MediaWiki-extensions-WikibaseRepository
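
A minimal sketch of what such per-entity versioning would let consumers do, assuming the revision id is exposed under a "lastrevid" key as in Special:EntityData and that the dump keeps its one-entity-per-line JSON layout (both assumptions, not confirmed in this thread):

    import gzip
    import json

    def entity_revisions(dump_path):
        """Yield (entity id, revision id) pairs from a Wikidata JSON dump."""
        with gzip.open(dump_path, "rt", encoding="utf-8") as dump:
            for line in dump:
                line = line.strip().rstrip(",")
                if line in ("[", "]", ""):
                    continue  # skip the array brackets wrapping the entities
                entity = json.loads(line)
                yield entity["id"], entity.get("lastrevid")

    # for qid, revid in entity_revisions("wikidata-all.json.gz"):
    #     ...
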
Pintoch awarded T87283: Wikidata dumps should have revision ID or other sequence mark a Love token.
Tue, Apr 2, 9:06 AM · Patch-For-Review, Wikidata

Sun, Mar 24

Pintoch added a comment to T219070: Failure of "kubectl get pods" for the editgroups project.

Just confirming that the bug has occurred today again and the proposed fix worked perfectly. Thanks again!

Sun, Mar 24, 8:40 AM · Toolforge, Kubernetes
Pintoch closed T219070: Failure of "kubectl get pods" for the editgroups project as Resolved.
Sun, Mar 24, 7:57 AM · Toolforge, Kubernetes
Pintoch added a comment to T219070: Failure of "kubectl get pods" for the editgroups project.

Woaw, thanks for the very thorough analysis! I cannot reproduce this anymore. Thanks for the GOMAXPROCS trick! I will add it to the docs.

Sun, Mar 24, 7:57 AM · Toolforge, Kubernetes

Sat, Mar 23

Pintoch created T219070: Failure of "kubectl get pods" for the editgroups project.
Sat, Mar 23, 2:08 PM · Toolforge, Kubernetes

Fri, Mar 22

Pintoch added a comment to T218779: Expose structured diffs in Wikibase API.

@Lydia_Pintscher Sure! I have just deployed a demonstration of this use case on EditGroups.

Fri, Mar 22, 7:18 PM · Wikidata, Wikidata-Campsite

Mar 20 2019

Pintoch renamed T218779: Expose structured diffs in Wikibase API from Exposed structured diffs in Wikibase API to Expose structured diffs in Wikibase API.
Mar 20 2019, 1:37 PM · Wikidata, Wikidata-Campsite
Pintoch updated the task description for T218779: Expose structured diffs in Wikibase API.
Mar 20 2019, 1:34 PM · Wikidata, Wikidata-Campsite
Pintoch added a subtask for T56328: Provide intraline diff format in API action=compare: T218779: Expose structured diffs in Wikibase API.
Mar 20 2019, 1:31 PM · MediaWiki-API
Pintoch added a parent task for T218779: Expose structured diffs in Wikibase API: T56328: Provide intraline diff format in API action=compare.
Mar 20 2019, 1:31 PM · Wikidata, Wikidata-Campsite
Pintoch created T218779: Expose structured diffs in Wikibase API.
Mar 20 2019, 1:31 PM · Wikidata, Wikidata-Campsite
Pintoch awarded T56328: Provide intraline diff format in API action=compare a Love token.
Mar 20 2019, 10:55 AM · MediaWiki-API
Pintoch added a comment to T56328: Provide intraline diff format in API action=compare.

I would also be interested in this, specifically for Wikidata where the diff structure could be exploited even further as suggested by @Yair_rand.

Mar 20 2019, 10:55 AM · MediaWiki-API

Mar 10 2019

abian awarded T197587: Add WikibaseQualityConstraints to the docker image a Like token.
Mar 10 2019, 3:25 PM · Wikidata-Campsite, Wikibase-Quality, Wikibase-Quality-Constraints, Wikidata, Wikibase-Containers
Pintoch closed T215789: Cannot start oabot: ImportError: liblua5.1.so.0 (due to migration from Trusty to Stretch) as Resolved.
Mar 10 2019, 1:59 PM · OABot
Pintoch edited projects for T215789: Cannot start oabot: ImportError: liblua5.1.so.0 (due to migration from Trusty to Stretch), added: OABot; removed Tools.
Mar 10 2019, 1:59 PM · OABot

Mar 6 2019

Pintoch created T217768: The entity suggester should return properties.
Mar 6 2019, 1:45 PM · Wikidata, Discovery-Search

Feb 27 2019

Pintoch added a comment to T217258: Language handling for adding citations with citoid in wikidata.

If you need a mapping from ISO language codes to Wikimedia ones, Wikidata-Toolkit has such a mapping: https://github.com/Wikidata/Wikidata-Toolkit/blob/3e62f93b137c25961c5a12172c7f213a720ecb67/wdtk-datamodel/src/main/java/org/wikidata/wdtk/datamodel/interfaces/WikimediaLanguageCodes.java

Feb 27 2019, 5:01 PM · Wikidata, Citoid
Pintoch updated subscribers of T217239: Expose the graph of language fallbacks in an API.

What exactly would you do with this information? i.e. what's the actual use case that makes you file this request?

Feb 27 2019, 4:40 PM · Patch-For-Review, Wikidata, MediaWiki-API, MediaWiki-Internationalization
Mvolz awarded T217239: Expose the graph of language fallbacks in an API a Like token.
Feb 27 2019, 1:47 PM · Patch-For-Review, Wikidata, MediaWiki-API, MediaWiki-Internationalization
Pintoch created T217239: Expose the graph of language fallbacks in an API.
Feb 27 2019, 1:37 PM · Patch-For-Review, Wikidata, MediaWiki-API, MediaWiki-Internationalization

Feb 22 2019

Pintoch added a comment to T206392: Redesign rank icons for better visibility.

What is the protocol to go forward on this? Should we hold a RFC on-wiki to let people choose among the possible solutions above?

Feb 22 2019, 4:41 PM · Wikidata-Frontend, Design, Wikidata

Feb 21 2019

Pintoch updated subscribers of T204568: Extend message checker framework to support errors that prevent saving.

We have this problem in https://dissem.in/ . This project is set up on Translatewiki, the code is hosted on GitHub and uses Travis for CI. We use Django's localization system which is based on gettext. We compile messages in the CI to check that they are valid. Sometimes translators add incorrect translations (such as translations not reusing the same variables as the msgid, or in a different format). This breaks our build as any incorrect translation will stop the entire compilation process. It is not clear if and how it would be possible to configure the translation compilation process to ignore invalid messages.

Feb 21 2019, 2:31 PM · Patch-For-Review, User-abi_, translatewiki.net, MediaWiki-extensions-Translate
Pintoch awarded T204568: Extend message checker framework to support errors that prevent saving a Love token.
Feb 21 2019, 2:25 PM · Patch-For-Review, User-abi_, translatewiki.net, MediaWiki-extensions-Translate

Feb 19 2019

Manu1400 awarded T197587: Add WikibaseQualityConstraints to the docker image a Like token.
Feb 19 2019, 9:43 PM · Wikidata-Campsite, Wikibase-Quality, Wikibase-Quality-Constraints, Wikidata, Wikibase-Containers

Feb 11 2019

Pintoch added a comment to T215789: Cannot start oabot: ImportError: liblua5.1.so.0 (due to migration from Trusty to Stretch).

Any help with finishing the migration is welcome, of course; I am currently busy with dissemin but I will try to come back to this at some point.

Feb 11 2019, 1:47 PM · OABot
Pintoch added a comment to T215789: Cannot start oabot: ImportError: liblua5.1.so.0 (due to migration from Trusty to Stretch).

@Samwalton9 yes that is due to me starting the migration… and not completing it yet!

Feb 11 2019, 1:07 PM · OABot

Feb 2 2019

Pintoch added a comment to T57755: Allow time values more precise than day.

I have updated the Wikibase data model docs, which incorrectly mentioned precisions of hours, minutes and seconds. I assume that they were there because they were part of an earlier design?

Feb 2 2019, 6:36 PM · Wikidata, MediaWiki-extensions-WikibaseRepository

Jan 25 2019

Pintoch added a comment to T206392: Redesign rank icons for better visibility.

Useful solution from Nikki: add the following to your common.css:

Jan 25 2019, 3:05 PM · Wikidata-Frontend, Design, Wikidata

Jan 9 2019

Pintoch added a comment to T213012: Enable the Watchlist Messages gadget in Wikidata.

I have pinged a few interface admins on wiki to enable this.

Jan 9 2019, 10:01 AM · Wikidata-Gadgets, Wikidata

Jan 7 2019

Pintoch added a comment to T213012: Enable the Watchlist Messages gadget in Wikidata.

Oh can they? Sorry I had no idea! Thanks, I will try to enable it myself.

Jan 7 2019, 4:26 PM · Wikidata-Gadgets, Wikidata

Jan 5 2019

Pintoch added a comment to T205017: Investigation: Look at what gadgets it might make sense to pull into Wikibase.
Jan 5 2019, 11:32 PM · Wikidata-Gadgets, Wikibase-Containers, Wikidata, Federated-Wikibase-Workshops@NewYork-2018
Pintoch created T213012: Enable the Watchlist Messages gadget in Wikidata.
Jan 5 2019, 11:23 PM · Wikidata-Gadgets, Wikidata
Pintoch added a comment to T139898: Tool to facilitate property creation process.

I currently use my own custom hacky script to create properties, but having something stable and usable by anyone would be highly beneficial.

Jan 5 2019, 10:56 PM · Wikidata, Wikidata-Gadgets
Pintoch awarded T139898: Tool to facilitate property creation process a Like token.
Jan 5 2019, 10:54 PM · Wikidata, Wikidata-Gadgets

Dec 2 2018

Pintoch added a comment to T207484: API to efficiently format large numbers of entity IDs.

@Lucas_Werkmeister_WMDE thank you very much for that!

Dec 2 2018, 2:18 AM · MW-1.33-notes (1.33.0-wmf.6; 2018-11-27), Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Patch-For-Review, Wikidata
Pintoch awarded T207484: API to efficiently format large numbers of entity IDs a 100 token.
Dec 2 2018, 2:18 AM · MW-1.33-notes (1.33.0-wmf.6; 2018-11-27), Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Patch-For-Review, Wikidata
Pintoch awarded T207484: API to efficiently format large numbers of entity IDs an Orange Medal token.
Dec 2 2018, 2:17 AM · MW-1.33-notes (1.33.0-wmf.6; 2018-11-27), Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Patch-For-Review, Wikidata
Pintoch awarded T207484: API to efficiently format large numbers of entity IDs a Love token.
Dec 2 2018, 2:17 AM · MW-1.33-notes (1.33.0-wmf.6; 2018-11-27), Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Patch-For-Review, Wikidata

Nov 12 2018

Pintoch updated subscribers of T209031: Not able to scoop comment table in labs for mediawiki reconstruction process [EPIC].

I have taken the liberty of removing "Cloud Services" as a subscriber to this ticket, as I do not think every toollabs user wants to receive notifications about this.

Nov 12 2018, 3:52 PM · Patch-For-Review, Core Platform Team Backlog (Watching / External), Analytics-Kanban, DBA, Data-Services, Analytics

Nov 6 2018

Pintoch awarded T208118: Import arXiv ID (P818) and "full work available at" (P953) from unpaywall dataset a Love token.
Nov 6 2018, 9:55 AM · WikiCite, Wikistorm

Nov 5 2018

Pintoch added a comment to T194813: Find ways to integrate/improve OABot with WikiCite efforts.

As explained in T164152 I am happy to mentor anyone for this.

Nov 5 2018, 8:35 AM · Wikimedia-Hackathon-2018, WikiCite, OABot
Pintoch updated subscribers of T164152: Integrate with Wikidata.

@Daniel_Mietchen regarding https://twitter.com/EvoMRI/status/1055785761574813696 (I do not read Twitter notifications - but happily interact on open platforms such as Mastodon):

Nov 5 2018, 8:33 AM · OABot

Nov 2 2018

Pintoch awarded T199228: Define an SLO for Wikidata Query Service public endpoint and communicate it a Like token.
Nov 2 2018, 3:03 PM · Operations, Discovery-Wikidata-Query-Service-Sprint, Wikidata, Wikidata-Query-Service
Pintoch added a comment to T199228: Define an SLO for Wikidata Query Service public endpoint and communicate it.

The search interface can also be used for that thanks to the haswbstatement command. That only gets you one id per query, so it might not be suited for all tools. I don't know if the lag is lower in this interface.
Retrieving items by identifiers is quite crucial in many tools so it would be useful to have a solid interface for that instead of relying on SPARQL (which feels indeed like using a sledgehammer to crack a nut).

Nov 2 2018, 2:52 PM · Operations, Discovery-Wikidata-Query-Service-Sprint, Wikidata, Wikidata-Query-Service
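
A minimal sketch, not from the ticket, of the haswbstatement lookup mentioned above, going through the regular search API instead of SPARQL (PROPERTY_ID and IDENTIFIER are placeholders):

    import requests

    API_ENDPOINT = "https://www.wikidata.org/w/api.php"
    PROPERTY_ID = "P00"   # placeholder: an external-identifier property
    IDENTIFIER = "12345"  # placeholder: the identifier value to look up

    response = requests.get(
        API_ENDPOINT,
        params={
            "action": "query",
            "list": "search",
            "srsearch": f"haswbstatement:{PROPERTY_ID}={IDENTIFIER}",
            "format": "json",
        },
        headers={"User-Agent": "haswbstatement-example/0.1"},
    )
    response.raise_for_status()
    # Titles of the matching items (normally a single item per identifier).
    print([hit["title"] for hit in response.json()["query"]["search"]])
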
Pintoch added a comment to T199228: Define an SLO for Wikidata Query Service public endpoint and communicate it.

@Gehel my service has been quite unstable for some time, but I haven't found the time yet to find out exactly where the problem is coming from - it could be SPARQL, the Wikidata API, redis or the webservice itself. I will add a few more metrics to understand what is going on and report back here.

Nov 2 2018, 1:54 PM · Operations, Discovery-Wikidata-Query-Service-Sprint, Wikidata, Wikidata-Query-Service

Nov 1 2018

Pintoch added a comment to T200234: Create edit groups when running Wikidata-related scripts.

@Criscod yes that would be a great idea.

Nov 1 2018, 11:50 AM · Pywikibot, Pywikibot-Wikidata

Oct 31 2018

Pintoch added a comment to T199228: Define an SLO for Wikidata Query Service public endpoint and communicate it.

Thanks for the ping, Lydia! Off the top of my head, the only uses of SPARQL in the tools I maintain are in the openrefine-wikidata interface:

  • queries to retrieve the list of subclasses of a given class - lag is not critical at all for this as the ontology is assumed to be stable. (These results are cached on my side for 24 hours, for any root class.)
  • queries to retrieve items by external identifiers or sitelinks - lag can be more of an issue for this but I would not consider it critical. (These results are not cached.)

What matters much more for this tool is getting quick results and as little downtime as possible - lag is not really a concern.

Oct 31 2018, 7:48 PM · Operations, Discovery-Wikidata-Query-Service-Sprint, Wikidata, Wikidata-Query-Service
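
A minimal sketch of the caching behaviour described in the first bullet above; the real tool caches in redis, so the in-memory dictionary and the fetch_subclasses callable here are stand-ins.

    import time

    CACHE_TTL = 24 * 3600   # subclass lists are assumed stable for a day
    _subclass_cache = {}    # root class id -> (fetch time, result)

    def cached_subclasses(root_qid, fetch_subclasses):
        """Return the subclasses of root_qid, refreshing at most once per day.

        fetch_subclasses is a stand-in for the actual SPARQL call.
        """
        entry = _subclass_cache.get(root_qid)
        if entry is not None and time.time() - entry[0] < CACHE_TTL:
            return entry[1]
        result = fetch_subclasses(root_qid)
        _subclass_cache[root_qid] = (time.time(), result)
        return result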

Oct 29 2018

Pintoch added a comment to T207839: Batch add WO II war memorials to Wikidata.

Just to let you know that the problem with the ".0" will be solved in the next version of OpenRefine.
In the meantime, you can solve the issue by transforming your column with the following expression: value.toString().replace(".0",""). Hope it helps!

Oct 29 2018, 7:56 PM · Wikidata, Wikistorm
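
A minimal Python sketch of the same idea as the GREL expression above, stripping only a trailing ".0" (the kind of suffix spreadsheet exports often add to integer cells):

    def strip_trailing_zero(cell):
        """Drop a spurious trailing ".0" from an exported integer cell."""
        text = str(cell)
        return text[:-2] if text.endswith(".0") else text

    # strip_trailing_zero("12345.0") -> "12345"
    # strip_trailing_zero("12345")   -> "12345"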

Oct 27 2018

Pintoch awarded T206755: Understand steps to get structured data from Excel-workbook into Wikidata a Love token.
Oct 27 2018, 1:09 PM · Wikistorm
Pintoch added a comment to T206755: Understand steps to get structured data from Excel-workbook into Wikidata.

Woooohooooo!!!!

Oct 27 2018, 1:09 PM · Wikistorm
Pintoch added a comment to T208034: OpenRefine demo at Wikistorm.

So I had the opportunity to annoy a lot of people by shouting OpenRefine repeatedly in their ears over the past 48 hours.

Oct 27 2018, 1:08 PM · Wikistorm

Oct 26 2018

Pintoch added a comment to T207839: Batch add WO II war memorials to Wikidata.

Awesome! \o/ Actually OpenRefine could potentially help you already at that stage to do the matching - let me know if you want a quick demo :)

Oct 26 2018, 7:35 PM · Wikidata, Wikistorm
Pintoch moved T208034: OpenRefine demo at Wikistorm from Backlog to Session on the Wikistorm board.
Oct 26 2018, 6:22 PM · Wikistorm
Pintoch added a comment to T208076: We would like some support for OpenRefine.

I would be happy to help. I have a t-shirt with an OpenRefine logo (the blue diamond).

Oct 26 2018, 6:22 PM · Wikistorm
Pintoch created T208034: OpenRefine demo at Wikistorm.
Oct 26 2018, 11:16 AM · Wikistorm
Pintoch moved T208018: Wikicite session from Backlog to Session on the Wikistorm board.
Oct 26 2018, 9:02 AM · Wikistorm
Pintoch updated the task description for T208018: Wikicite session.
Oct 26 2018, 9:00 AM · Wikistorm
Pintoch created T208018: Wikicite session.
Oct 26 2018, 8:58 AM · Wikistorm
Pintoch added a comment to T207865: Improve Wikidata's dataset import hub.

I have left some ideas here:

Oct 26 2018, 8:15 AM · Wikistorm

Oct 24 2018

Pintoch moved T207865: Improve Wikidata's dataset import hub from Backlog to Project on the Wikistorm board.
Oct 24 2018, 5:26 PM · Wikistorm
Pintoch created T207865: Improve Wikidata's dataset import hub.
Oct 24 2018, 4:41 PM · Wikistorm
Pintoch renamed T205488: Import villages of Aruba to Wikidata from Write script to import villages of Aruba to Wikidata to Import villages of Aruba to Wikidata.
Oct 24 2018, 2:47 PM · Wikistorm
Pintoch added a comment to T206755: Understand steps to get structured data from Excel-workbook into Wikidata.

I will be available to help with OpenRefine. It is designed exactly for this workflow, so I hope it will be a match :)
For reconciliation help, have you seen this page?
https://github.com/OpenRefine/OpenRefine/wiki/Reconciliation

Oct 24 2018, 2:47 PM · Wikistorm
Pintoch claimed T207839: Batch add WO II war memorials to Wikidata.

I would be interested in helping with this - I can guide you through the uploading process with OpenRefine.
If you want to prepare for this, feel free to download OpenRefine and have a look at tutorials like these:

The videos at http://openrefine.org/ are also useful to get an idea of what OpenRefine does (with no reference to Wikidata).

Oct 24 2018, 2:41 PM · Wikidata, Wikistorm

Oct 19 2018

Pintoch added a comment to T207370: Statistics of number of Wikidata edits with Magnus Manske's tools.

Some of the OpenRefine edits were not tagged during development, but all edits done with a released version should be. Some of the OpenRefine batches are uploaded via QuickStatements, in which case they are tagged as such. (The main benefits of using QS with OpenRefine are running batches in the background and having statement-matching rules when updating existing claims.)

Oct 19 2018, 7:06 AM · GLAM, Wikidata

Oct 13 2018

Pintoch added a comment to T205488: Import villages of Aruba to Wikidata.

Sure, happy to help any time! (Online or at the Wiki TechStorm)

Oct 13 2018, 12:28 PM · Wikistorm

Oct 12 2018

Pintoch closed T192811: Provide an OpenRefine API for matching by class and properties as Invalid.
Oct 12 2018, 9:23 AM · Wikidata, Federated-Wikibase-Workshops@Antwerp-2018
Pintoch added a comment to T192811: Provide an OpenRefine API for matching by class and properties.

I think this ticket can be closed given that we cannot figure out what it is supposed to be about.

Oct 12 2018, 9:22 AM · Wikidata, Federated-Wikibase-Workshops@Antwerp-2018

Sep 28 2018

Pintoch awarded T169666: Render partial results a Love token.
Sep 28 2018, 10:26 AM · Wikidata Query UI, Discovery, Wikidata

Sep 25 2018

Pintoch awarded T178249: Parameter for linking a new page to the Wikidata a Like token.
Sep 25 2018, 10:41 AM · Wikidata

Sep 19 2018

Pintoch added a comment to T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache.

I was thinking of the opposite: consider the violations related to the revision R of the item I to be the violations of the statements of I with respect to the state of Wikidata just before R+1 was saved.

Sep 19 2018, 5:07 PM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata
Pintoch added a comment to T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache.

@Lydia_Pintscher yes indeed! For instance the aggregation at batch-level would probably not be meaningful for inverse constraints (unless there is a way to detect all the violations added and solved by an edit, not just on the item where the edit was made). But isn't this a problem that you have anyway, even when storing only the latest violations? For instance, if I add a "subclass of (P279)" statement between two items, don't you need to recompute type violations for all items which are instances of some transitive subclass of the new subclass? I am not sure how this invalidation is done at the moment.

Sep 19 2018, 8:03 AM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata

Sep 18 2018

Pintoch added a comment to T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache.

@Lydia_Pintscher personally here is what I would concretely implement in the EditGroups tool. For each edit that is part of an edit group:

  • fetch the constraint violations before and after the edit (this fetching would happen as the edit is retrieved, so in near real time)
  • compute the difference in constraint violations of each type (for instance, 1 new "value type constraint" violation and 2 fewer "statement required constraint" violations)
  • aggregate these statistics at a batch level and expose them in batch views (for instance, this batch added 342 new "value type constraint" violations and solved 764 "statement required constraint" violations)

Together with the number of reverted edits in a batch (which the tool already aggregates), this could potentially make it easier to spot problematic batches.

Sep 18 2018, 9:26 AM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata
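
A minimal sketch of the aggregation described above; the mapping of constraint type to violation count per edit is a hypothetical data shape, not an existing API.

    from collections import Counter

    def violation_diff(before, after):
        """Per-type change in violation counts for a single edit."""
        return {t: after.get(t, 0) - before.get(t, 0) for t in set(before) | set(after)}

    def aggregate_batch(edit_pairs):
        """Sum the per-edit differences over a whole batch.

        edit_pairs: iterable of (before, after) mappings, one pair per edit.
        """
        totals = Counter()
        for before, after in edit_pairs:
            totals.update(violation_diff(before, after))
        return dict(totals)  # e.g. {"value type": 342, "statement required": -764}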

Sep 17 2018

Pintoch updated subscribers of T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache.

This ticket is fantastic news.

Sep 17 2018, 4:19 PM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata
Pintoch awarded T204024: Store WikibaseQualityConstraint check data in persistent storage instead of in the cache a Love token.
Sep 17 2018, 4:11 PM · User-Addshore, Dependency-Tracking, Operations, Core Platform Team Backlog (Designing), Cassandra, Services (designing), wikidata-tech-focus, Wikidata-Campsite, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata
Pintoch awarded T202404: investigate options for regularly running constraint checks a Love token.
Sep 17 2018, 4:07 PM · User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), wikidata-tech-focus, Wikidata-Query-Service, Wikibase-Quality, Wikidata, Wikibase-Quality-Constraints
Pintoch awarded T204022: Add functionality to run QualityConstraint checks on an entity after every edit a Love token.
Sep 17 2018, 4:07 PM · MW-1.33-notes (1.33.0-wmf.14; 2019-01-22), Patch-For-Review, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Story, wikidata-tech-focus, Wikibase-Quality-Constraints, Wikibase-Quality, Wikidata

Sep 16 2018

Pintoch added a comment to T174540: Use open access URL via oaDOI in citoid response.

@martin.monperrus see my first comment in this thread.

Sep 16 2018, 2:14 PM · Citoid
Pintoch added a comment to T174540: Use open access URL via oaDOI in citoid response.

@martin.monperrus Let me emphasize that this is a significant change that should get community approval first. There has already been a lot of discussion about similar changes to the DOI template on the English Wikipedia and there is clearly a consensus against this IMHO.

Sep 16 2018, 7:44 AM · Citoid

Sep 14 2018

Pintoch added a comment to T204267: Flood of WDQS requests from wbqc.

@aborrero thanks for the ping. I do not recognize the shape of the queries as coming from this tool though. The openrefine-wikidata tool should do relatively few SPARQL queries, whose results are cached in redis. How did you determine that this tool is the source of the problem?

Sep 14 2018, 10:57 AM · Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), Cloud-Services, Operations, User-Addshore, Wikibase-Quality, Wikidata, Wikidata-Query-Service, Wikibase-Quality-Constraints

Sep 5 2018

Pintoch added a comment to T202729: When creating a new Sense through wbeditentity the summary is confusing "Created a new entity".

@Lydia_Pintscher @Ladsgroup any idea how I could be notified of any new automatic edit summaries, such as the wbeditentity-create-item that this change introduced? For any such summary, I need to add it to EditGroups, especially if the new auto summary replaces a highly-used existing one, as in this case. Otherwise, this breaks the tagging of batches.

Sep 5 2018, 5:34 PM · wikidata-tech-focus, MW-1.32-notes (WMF-deploy-2018-09-04 (1.32.0-wmf.20)), Patch-For-Review, User-Ladsgroup, Wikidata-Senses-Iteration3, Lexicographical data, Wikidata
Pintoch added a comment to T186200: Rewrite Wikibase data model implementation.

I think reworking this implementation would be very welcome because at the moment it is not pretty, to say it politely.
But I am not convinced by the alternative either. Why would Reference inherit from BaseClaim? A reference is not a claim. What would the getSnakType method mean when called on a Reference?

Sep 5 2018, 12:39 PM · Pywikibot-RfCs, Pywikibot-Wikidata, Pywikibot
Pintoch awarded T203557: Create a Edit group extension a Love token.
Sep 5 2018, 12:37 PM · MediaWiki-extension-requests
Pintoch added a comment to T200234: Create edit groups when running Wikidata-related scripts.

It might be worth giving the bot author some control over this feature:

  • there should be some opt-in / opt-out mechanism
  • there should be some control over what constitutes a batch. Some users might want to create multiple logical batches during the same run of a bot, or share the same batch id across consecutive runs of the same Python script (for instance if it is called by a bash script…)
Sep 5 2018, 11:02 AM · Pywikibot, Pywikibot-Wikidata
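
A minimal sketch of the kind of control described above, as a hypothetical helper a bot author could use; the summary suffix is a placeholder, not the exact format EditGroups actually detects.

    import secrets

    class EditGroupTagger:
        """Hypothetical helper illustrating opt-in/opt-out and batch-id control."""

        def __init__(self, enabled=True, batch_id=None):
            self.enabled = enabled                            # opt-in / opt-out
            self.batch_id = batch_id or secrets.token_hex(6)  # pass one in to share it across runs

        def new_batch(self):
            """Start a new logical batch within the same bot run."""
            self.batch_id = secrets.token_hex(6)

        def tag(self, summary):
            if not self.enabled:
                return summary
            return f"{summary} (batch {self.batch_id})"       # placeholder suffix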

Jul 27 2018

Pintoch added a comment to T193728: Address concerns about perceived legal uncertainty of Wikidata.

Etalab (which runs the open data portal of the French government) has released a statement (in French) concerning the attribution requirement of its "licence ouverte", confirming that it only applies to the first re-user.
https://github.com/etalab/wiki-data-gouv#point-juridique

Jul 27 2018, 8:07 AM · WMF-Legal, Wikidata

Jul 17 2018

Pintoch added a comment to T190274: Add libmysqlclient-dev to Toolforge Kubernetes Python3 runtime image.

@Chicocvenancio I agree with Yury - it makes it significantly harder to deploy Django projects.

Jul 17 2018, 2:44 PM · Patch-For-Review, cloud-services-team, Toolforge

Jun 18 2018

Pintoch added a comment to T155155: Implement a way to view (extension) configuration options and its value on-wiki.

This would be very useful for T197588. It would make a lot of sense for Wikibase Quality Constraints in particular.

Jun 18 2018, 2:17 PM · MediaWiki-Configuration
Pintoch awarded T155155: Implement a way to view (extension) configuration options and its value on-wiki a Love token.
Jun 18 2018, 2:14 PM · MediaWiki-Configuration