
Pintoch (Antonin Delpeuch)
User


User Details

User Since
Nov 9 2016, 7:25 PM (161 w, 1 d)
Availability
Available
LDAP User
Unknown
MediaWiki User
Pintoch

Recent Activity

Wed, Dec 11

Pintoch added a comment to T240442: Design a continuous throttling policy for Wikidata bots.

Thanks! I think dynamically changing the maxlag value is likely to still introduce some thresholds, whereas a continuous slowdown (by retrieving the lag and computing one's edit rate based on it) should in theory reach an equilibrium point.
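A continuous slowdown of this kind could be sketched as follows. The function name, thresholds, and rates below are hypothetical, not taken from any existing bot framework; the point is only that the edit rate varies continuously with the observed lag instead of switching on and off at a threshold:

```python
def edits_per_minute(lag, max_rate=60.0, soft_lag=1.0, hard_lag=5.0):
    """Map the observed lag (seconds) to an edit rate (edits/minute).

    Below soft_lag the bot edits at full speed; above hard_lag it stops;
    in between, the rate decreases linearly, so the fleet can settle at
    an equilibrium instead of oscillating. All constants are illustrative.
    """
    if lag <= soft_lag:
        return max_rate
    if lag >= hard_lag:
        return 0.0
    # linear interpolation between the soft and hard thresholds
    return max_rate * (hard_lag - lag) / (hard_lag - soft_lag)
```

A bot would periodically re-read the lag and space its edits by `60 / edits_per_minute(lag)` seconds.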

Wed, Dec 11, 4:49 PM · Wikidata
Pintoch added a comment to T240370: Maxlag=5 for Petscan.

Thanks for the analysis! Whether this is a breaking change or not is not my concern: Petscan and other mass-editing tools based on Widar should play by the book. I can provide a simple patch which ensures maxlag=5 is applied to all Widar edits: if someone wants to do a refined version which allows specific user-triggered edits to go through without a maxlag parameter, that is great. @Magnus, what is your take on this?

Wed, Dec 11, 3:27 PM · Wikidata
Addshore awarded T240442: Design a continuous throttling policy for Wikidata bots a Manufacturing Defect? token.
Wed, Dec 11, 11:37 AM · Wikidata
Pintoch added a comment to T240442: Design a continuous throttling policy for Wikidata bots.

If clients are able to retrieve the current lag periodically (through some MediaWiki API call? which one?), then this should not require any server-side change. Clients can continue to use maxlag=5 but also throttle themselves using the proposed smoothed function.
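One candidate for such an API call is the MediaWiki siteinfo module with `siprop=dbrepllag` (`action=query&meta=siteinfo&siprop=dbrepllag&sishowalldb=1`), which reports database replication lag per server; whether it also reflects the dispatch or query-service lag discussed in this task is exactly the open question above. A minimal sketch of extracting the worst lag from such a response:

```python
def max_replication_lag(siteinfo_response):
    """Return the highest reported lag (seconds) from a decoded JSON
    response to action=query&meta=siteinfo&siprop=dbrepllag&sishowalldb=1.

    Assumes the standard response shape:
    {"query": {"dbrepllag": [{"host": ..., "lag": ...}, ...]}}
    """
    return max(entry["lag"] for entry in siteinfo_response["query"]["dbrepllag"])
```

A client could poll this every few seconds and feed the result into its throttling function.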

Wed, Dec 11, 11:25 AM · Wikidata
Pintoch created T240442: Design a continuous throttling policy for Wikidata bots.
Wed, Dec 11, 11:19 AM · Wikidata
Pintoch added a comment to T240370: Maxlag=5 for Petscan.

@Bugreporter have you got details of where this behaviour is currently implemented in PetScan? In particular, how do you request the current maxlag with the MediaWiki API?

Wed, Dec 11, 10:55 AM · Wikidata
Pintoch added a comment to T197588: Agree on a "manifest" format to expose the configuration of Wikibase instances.

@Theklan let's move your issue to a different ticket, as it does not seem to be related: T240436

Wed, Dec 11, 10:45 AM · OpenRefine, Wikidata
Pintoch added a project to T240436: Unknown command INCRBYFLOAT in OpenRefine reconciliation interface: OpenRefine.
Wed, Dec 11, 10:44 AM · OpenRefine
Pintoch created T240436: Unknown command INCRBYFLOAT in OpenRefine reconciliation interface.
Wed, Dec 11, 10:43 AM · OpenRefine

Tue, Dec 10

Pintoch added a comment to T240370: Maxlag=5 for Petscan.

I've had a quick look at the code to see if I could submit a patch for this myself, but it is not clear to me where the edits are made: I have looked in petscan_rs and wikibase_rs to no avail. Are PetScan edits perhaps made in the browser, by sending them to some Widar-like interface?

Tue, Dec 10, 9:57 PM · Wikidata
Pintoch added a comment to T240369: Chase up bot operators whose bot keeps running when the dispatch lag is higher than 5.

@Bugreporter yes indeed! I was off by one hour there. Thanks for your help! Feel free to add more bots which match that period.

Tue, Dec 10, 8:24 PM · Wikidata
Pintoch added a comment to T240371: Maxlag=5 for Author Disambiguator.

Matching pull request: https://github.com/arthurpsmith/author-disambiguator/pull/107

Tue, Dec 10, 8:21 PM · Wikidata
Pintoch added a comment to T240376: Maxlag=5 for LogainmBot.

If you have not changed anything in user-config.py then you should be good to go; it might have been a false positive on my side. Sorry for the noise!

Tue, Dec 10, 8:19 PM · Wikidata
Pintoch updated the task description for T240377: Maxlag=5 for Reinheitsgebot.
Tue, Dec 10, 8:12 PM · Wikidata
Pintoch updated the task description for T240375: Maxlag=5 for LargeDatasetBot.
Tue, Dec 10, 8:11 PM · Wikidata
Pintoch updated the task description for T240369: Chase up bot operators whose bot keeps running when the dispatch lag is higher than 5.
Tue, Dec 10, 8:09 PM · Wikidata
Pintoch created T240377: Maxlag=5 for Reinheitsgebot.
Tue, Dec 10, 8:05 PM · Wikidata
Pintoch created T240376: Maxlag=5 for LogainmBot.
Tue, Dec 10, 8:03 PM · Wikidata
Pintoch created T240375: Maxlag=5 for LargeDatasetBot.
Tue, Dec 10, 8:00 PM · Wikidata
Pintoch created T240374: Maxlag=5 for BotMultichill.
Tue, Dec 10, 7:58 PM · Wikidata
Pintoch created T240373: Maxlag=5 for Edoderoobot.
Tue, Dec 10, 7:56 PM · Wikidata
Pintoch assigned T240370: Maxlag=5 for Petscan to Magnus.
Tue, Dec 10, 7:54 PM · Wikidata
Pintoch created T240371: Maxlag=5 for Author Disambiguator.
Tue, Dec 10, 7:54 PM · Wikidata
Pintoch created T240370: Maxlag=5 for Petscan.
Tue, Dec 10, 7:53 PM · Wikidata
Pintoch created T240369: Chase up bot operators whose bot keeps running when the dispatch lag is higher than 5.
Tue, Dec 10, 7:51 PM · Wikidata
Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

I am first getting in touch with people who seem to be running bots with a maxlag greater than 5, or no maxlag parameter at all, to see if they would agree to follow @Addshore's advice to never use a maxlag greater than 5.

Tue, Dec 10, 2:27 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Mon, Dec 2

Pintoch added a comment to T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata.

There has been one attempt, I think, but it did not go very far, and the ids do not seem to have been imported so far!

Mon, Dec 2, 10:42 AM · OpenRefine, Wiki-Techstorm-2019

Fri, Nov 29

Pintoch added a comment to T238340: Import UK lakes in Wikidata.

I think it is a bit harder to extract more than 1000 records (I capped it on purpose to make it manageable for the workshop).

Fri, Nov 29, 11:27 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T238340: Import UK lakes in Wikidata.

@Jheald I don't think anyone is working on this anymore: if you are still interested in the scraped dataset, it is here: http://pintoch.ulminfo.fr/adc2c9aaba/lakes-portal.tsv

Fri, Nov 29, 7:20 AM · OpenRefine, Wiki-Techstorm-2019

Tue, Nov 26

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

OK! If you have ways to check what sort of maxlag values are used, that would be great!

Tue, Nov 26, 7:32 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Mon, Nov 25

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

Actually, some tools seem to be doing something like that already, since edits are still going through despite the lag being above 5 for more than an hour now (Author Disambiguator does this, and probably QuickStatements and Edoderoobot too). So these tools use higher (more aggressive) maxlag values than 5.

Mon, Nov 25, 11:24 AM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata
Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

One problem with the current policy (requesting all automated editing processes to use maxlag=5) is that it creates a binary threshold: either the query service lag is under the threshold, in which case bots will edit at full speed, or it is above the threshold, in which case they should all stop editing entirely. This is likely to create an oscillating behaviour, where all bots start and stop periodically. This is probably not ideal for either the infrastructure or the users.
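The contrast between the binary threshold and a continuous slowdown can be sketched with two client-side delay functions. Both names and all constants below are illustrative, not part of any existing policy:

```python
def binary_delay(lag, hard_lag=5.0):
    """Current policy: edit at full speed below the threshold,
    stop entirely above it (the oscillation-prone behaviour)."""
    return 0.0 if lag < hard_lag else float("inf")

def smooth_delay(lag, base_delay=1.0, reference_lag=5.0):
    """Hypothetical continuous alternative: the inter-edit delay
    grows quadratically with the lag, so the bot fleet slows down
    gradually instead of starting and stopping all at once."""
    return base_delay * (lag / reference_lag) ** 2
```

With the smooth variant, as more bots slow down the lag decreases, which in turn shortens the delay: the feedback loop can settle rather than oscillate.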

Mon, Nov 25, 10:40 AM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Sat, Nov 23

Pintoch awarded T237471: Import data on hospitals in Västra Götalandsregionen a Love token.
Sat, Nov 23, 9:41 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata.

We have an import here: https://tools.wmflabs.org/editgroups/b/OR/8cf42ae3c0/
Good job!

Sat, Nov 23, 11:28 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238333: Import the Museum Data Files in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Sat, Nov 23, 11:26 AM · Wiki-Techstorm-2019, OpenRefine
Pintoch moved T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Sat, Nov 23, 11:25 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Sat, Nov 23, 11:25 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238441: Import endangered alphabets in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Sat, Nov 23, 11:16 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T237615: Translate interface of OpenRefine in Dutch.

Moving to "Done" although this is not complete; we did get a lot of new translations (not just in Dutch) during the event.

Sat, Nov 23, 11:15 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T237615: Translate interface of OpenRefine in Dutch from Backlog to Done on the Wiki-Techstorm-2019 board.
Sat, Nov 23, 11:15 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238348: Import Publons journal IDs in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Sat, Nov 23, 11:14 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T238348: Import Publons journal IDs in Wikidata.

Data is getting into Wikidata!

Sat, Nov 23, 11:10 AM · OpenRefine, Wiki-Techstorm-2019

Fri, Nov 22

Pintoch added a comment to T236038: WORKSHOP: OpenRefine (10.30 - 12.15).

If you have trouble reconciling, use this reconciliation service: http://ulminfo.fr:3894/en/api

Fri, Nov 22, 2:14 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T236038: WORKSHOP: OpenRefine (10.30 - 12.15).

For people who cannot install OpenRefine on their laptops, use: http://5.135.188.139:48379/

Fri, Nov 22, 9:57 AM · OpenRefine, Wiki-Techstorm-2019

Thu, Nov 21

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

So this is what we get with an exponential back-off (1.5 factor), at the moment:

22:37:27.148 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 1000 milliseconds. (19338ms)
22:37:28.729 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 1500 milliseconds. (1581ms)
22:37:33.809 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 2250 milliseconds. (5080ms)
22:37:37.931 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 3375 milliseconds. (4122ms)
22:37:42.663 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 5062 milliseconds. (4732ms)
22:37:49.437 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 7593 milliseconds. (6774ms)
22:37:58.429 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 11389 milliseconds. (8992ms)
22:38:18.217 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 6 seconds lagged. -- pausing for 17083 milliseconds. (19788ms)
22:38:36.461 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 6 seconds lagged. -- pausing for 25624 milliseconds. (18244ms)
22:39:05.013 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 6.4666666666667 seconds lagged. -- pausing for 38436 milliseconds. (28552ms)

So it looks like this means no OpenRefine edits at all with these new rules, in the current situation.
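The pause schedule in the log above (1000, 1500, 2250, 3375, … ms) can be reproduced by repeatedly multiplying the previous pause by the 1.5 factor and truncating to whole milliseconds. A minimal sketch:

```python
def backoff_pauses(initial_ms=1000, factor=1.5, n=10):
    """Generate the exponential back-off pause sequence seen in the log:
    each retry pauses `factor` times longer than the previous one,
    truncated to whole milliseconds."""
    pauses = []
    pause = initial_ms
    for _ in range(n):
        pauses.append(pause)
        pause = int(pause * factor)
    return pauses
```

The tenth pause is already over 38 seconds, which is why a sustained lag above 5 effectively blocks all edits under these rules.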

Thu, Nov 21, 9:53 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata
Pintoch added a parent task for T238847: Create a suitable template and import into wikidata the list of major data breaches: T236038: WORKSHOP: OpenRefine (10.30 - 12.15).
Thu, Nov 21, 9:09 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a subtask for T236038: WORKSHOP: OpenRefine (10.30 - 12.15): T238847: Create a suitable template and import into wikidata the list of major data breaches.
Thu, Nov 21, 9:09 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238847: Create a suitable template and import into wikidata the list of major data breaches from Backlog to Data imports on the OpenRefine board.
Thu, Nov 21, 9:04 PM · OpenRefine, Wiki-Techstorm-2019

Wed, Nov 20

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

Thanks for the notification! I would be happy to release a new version of OpenRefine with a patch applied - I can do this in the coming days. The exponential back-off suggested by @Multichill makes sense intuitively - could WMDE confirm that this is the policy they recommend? Happy to adapt the policy as required.

Wed, Nov 20, 9:52 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Fri, Nov 15

Pintoch triaged T238454: Import list of members of the Federation of Museums and Art Galleries of Wales as Medium priority.
Fri, Nov 15, 10:13 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238454: Import list of members of the Federation of Museums and Art Galleries of Wales from Backlog to Data imports on the OpenRefine board.
Fri, Nov 15, 10:13 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238454: Import list of members of the Federation of Museums and Art Galleries of Wales.
Fri, Nov 15, 10:12 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch updated the task description for T238340: Import UK lakes in Wikidata.
Fri, Nov 15, 9:35 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238441: Import endangered alphabets in Wikidata from Backlog to Data imports on the OpenRefine board.
Fri, Nov 15, 7:57 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238441: Import endangered alphabets in Wikidata from Backlog to SPARQLstation on the Wiki-Techstorm-2019 board.
Fri, Nov 15, 7:57 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch triaged T238441: Import endangered alphabets in Wikidata as Medium priority.
Fri, Nov 15, 7:57 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238441: Import endangered alphabets in Wikidata.
Fri, Nov 15, 7:57 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch updated the task description for T236038: WORKSHOP: OpenRefine (10.30 - 12.15).
Fri, Nov 15, 6:54 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238348: Import Publons journal IDs in Wikidata up for grabs.
Fri, Nov 15, 4:31 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata up for grabs.
Fri, Nov 15, 4:31 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata up for grabs.
Fri, Nov 15, 4:31 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238340: Import UK lakes in Wikidata up for grabs.
Fri, Nov 15, 4:31 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238334: Import public libraries in Austria up for grabs.
Fri, Nov 15, 4:30 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238333: Import the Museum Data Files in Wikidata up for grabs.
Fri, Nov 15, 4:30 PM · Wiki-Techstorm-2019, OpenRefine

Thu, Nov 14

Pintoch moved T236768: Wikimedia Commons: Reconcile & upload images of old prints from Erfgoedcentrum Rozet to Commons with OpenRefine & Pattypan from Backlog to Data imports on the OpenRefine board.
Thu, Nov 14, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T237471: Import data on hospitals in Västra Götalandsregionen from Backlog to Data imports on the OpenRefine board.
Thu, Nov 14, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238333: Import the Museum Data Files in Wikidata from Backlog to Data imports on the OpenRefine board.
Thu, Nov 14, 6:19 PM · Wiki-Techstorm-2019, OpenRefine
Pintoch moved T238334: Import public libraries in Austria from Backlog to Data imports on the OpenRefine board.
Thu, Nov 14, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238340: Import UK lakes in Wikidata from Backlog to Data imports on the OpenRefine board.
Thu, Nov 14, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata from Backlog to Data imports on the OpenRefine board.
Thu, Nov 14, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238348: Import Publons journal IDs in Wikidata from Backlog to Data imports on the OpenRefine board.
Thu, Nov 14, 6:18 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata from Backlog to Data imports on the OpenRefine board.
Thu, Nov 14, 6:18 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T238348: Import Publons journal IDs in Wikidata.

@Ecritures shouldn't we rather assign tasks to the participants who decide to tackle them?

Thu, Nov 14, 6:16 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238348: Import Publons journal IDs in Wikidata.
Thu, Nov 14, 4:51 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata.
Thu, Nov 14, 4:24 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T237615: Translate interface of OpenRefine in Dutch up for grabs.
Thu, Nov 14, 4:13 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238334: Import public libraries in Austria up for grabs.
Thu, Nov 14, 4:12 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238340: Import UK lakes in Wikidata up for grabs.
Thu, Nov 14, 4:12 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata.
Thu, Nov 14, 4:12 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238340: Import UK lakes in Wikidata.
Thu, Nov 14, 3:48 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238334: Import public libraries in Austria.
Thu, Nov 14, 3:08 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a subtask for T236038: WORKSHOP: OpenRefine (10.30 - 12.15): T238333: Import the Museum Data Files in Wikidata.
Thu, Nov 14, 2:58 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a parent task for T238333: Import the Museum Data Files in Wikidata: T236038: WORKSHOP: OpenRefine (10.30 - 12.15).
Thu, Nov 14, 2:58 PM · Wiki-Techstorm-2019, OpenRefine
Pintoch created T238333: Import the Museum Data Files in Wikidata.
Thu, Nov 14, 2:57 PM · Wiki-Techstorm-2019, OpenRefine

Nov 12 2019

Pintoch added a comment to T237615: Translate interface of OpenRefine in Dutch.

I should note that we do have a Dutch translation (which is incomplete) - you should be able to find it in the language menu as "Nederlands", not "Dutch". It is the second option in the menu, just after "English".

Nov 12 2019, 12:07 PM · OpenRefine, Wiki-Techstorm-2019

Nov 11 2019

Pintoch created T238019: Error in matching heuristic.
Nov 11 2019, 7:06 PM · OABot
Pintoch added a comment to T238003: Delete openrefine01 instance.

Thanks! I am working on making OpenRefine easier to host, but it is indeed a long-term project (exciting announcement about that soon).

Nov 11 2019, 4:51 PM · Wikidata, OpenRefine, Cloud-VPS
Pintoch added a comment to T220696: [Story] Create better edit summaries for wbeditentity API endpoint.

Exciting developments! I am wondering whether that comment_data could be (or already is) exposed in any public API? Exposing such diffs would also solve T106306 in one go.

Nov 11 2019, 1:41 PM · MW-1.35-notes (1.35.0-wmf.5; 2019-11-05), Wikidata-Campsite, MW-1.34-notes (1.34.0-wmf.15; 2019-07-23), Wikidata

Nov 7 2019

Pintoch added a comment to T237615: Translate interface of OpenRefine in Dutch.

The simplest way to translate OpenRefine is to use Weblate:
https://hosted.weblate.org/projects/OpenRefine/
The UI strings are grouped into modules that can be translated independently. "Translations" is for the main features, "wikidata" is for the Wikidata-specific ones (from the Wikidata extension), "gdata" is for the Google extension, and so on.

Nov 7 2019, 10:07 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T237616: Provide file with items to be translated so Dutch language interface is possible for OpenRefine.

Yes! The simplest way to translate OpenRefine is to use Weblate:
https://hosted.weblate.org/projects/OpenRefine/
The UI strings are grouped into modules that can be translated independently. "Translations" is for the main features, "wikidata" is for the Wikidata-specific ones (from the Wikidata extension), "gdata" is for the Google extension, and so on.

Nov 7 2019, 10:05 AM · OpenRefine, Wiki-Techstorm-2019

Nov 5 2019

Pintoch updated Pintoch.
Nov 5 2019, 6:23 PM

Nov 2 2019

Pintoch added a comment to T203557: Create a Edit group extension.

I thought for a moment that there was an issue with the fact that, currently, filtering by tags only works for Special:RecentChanges (which only contains the most recent changes, not all of them). But Lucas pointed out that it is also supported by Special:Contributions, which is uncapped in time (I think?). EditGroups currently assumes that all edits in a given group are made by the same user, which I think makes sense as a constraint. So, in the current situation, if you had an edit group which corresponded to a single tag, it would be possible to retrieve all edits in it via Special:Contributions (assuming you know the user in the first place).

Nov 2 2019, 6:18 PM · WMSE-Tools-for-Partnerships-2019-Blueprinting, SDC General, Wikidata, MediaWiki-extension-requests
Pintoch added a comment to T203557: Create a Edit group extension.

Great point! I did not think about that in this way. It sounds like a very sensible route to follow.

Nov 2 2019, 3:59 PM · WMSE-Tools-for-Partnerships-2019-Blueprinting, SDC General, Wikidata, MediaWiki-extension-requests

Nov 1 2019

Pintoch added a comment to T203557: Create a Edit group extension.

Sure, I hope you don't mind us having this discussion here anyway, since this is still at a very early stage. (Unless you are already planning to work on this exact architecture?)

Nov 1 2019, 3:58 PM · WMSE-Tools-for-Partnerships-2019-Blueprinting, SDC General, Wikidata, MediaWiki-extension-requests
Pintoch added a comment to T203557: Create a Edit group extension.

Here are a few issues with the current tool:

  • If the tool goes down, this creates millions of dead links in edit summaries (which cannot be changed). Users who ran batches with the assumption that they could be undone if something goes wrong find themselves having to clean things up manually.
  • If the recent changes listener dies for a long time, for instance for longer than the EventStream or recent changes cover, edits which were missed cannot be recovered easily (this could be solved by reading the corresponding edits from the public dumps, but it is not implemented at the moment).
  • As a user, I can easily game EditGroups by imitating the edit summary of any tool, attributing edits to batches they are not actually part of. (There are some protections against this, but they are not fully bullet-proof: for instance, an edit will only be added to a batch if it was made under the account of the user who first created the batch.)
  • Bot/tool authors are reluctant to add grouping support for their bots since EditGroups is not officially part of the Wikidata infrastructure. The community is unlikely to systematically enforce edit grouping in requests for bot approvals as long as it relies on an external tool, which does not come with any sustainability guarantees, SLA, etc.
  • Reverting relies on OAuth, and OAuth tokens can expire while a large edit group is being reverted.
Nov 1 2019, 12:39 PM · WMSE-Tools-for-Partnerships-2019-Blueprinting, SDC General, Wikidata, MediaWiki-extension-requests
Pintoch updated subscribers of T203557: Create a Edit group extension.

With the introduction of Structured Data on Commons and the implementation of mass-editing tools there (AC/DC, QuickStatements), the need to generalize EditGroups to other wikis is gathering interest.

Nov 1 2019, 11:18 AM · WMSE-Tools-for-Partnerships-2019-Blueprinting, SDC General, Wikidata, MediaWiki-extension-requests
Pintoch added a member for Wiki-Techstorm-2019: Pintoch.
Nov 1 2019, 7:50 AM

Oct 31 2019

Pintoch awarded T236768: Wikimedia Commons: Reconcile & upload images of old prints from Erfgoedcentrum Rozet to Commons with OpenRefine & Pattypan a Love token.
Oct 31 2019, 1:55 PM · OpenRefine, Wiki-Techstorm-2019

Oct 28 2019

Pintoch added a comment to T220696: [Story] Create better edit summaries for wbeditentity API endpoint.

Here is a diff where one of the new fallback summaries failed to be translated to a human-readable form: https://www.wikidata.org/w/index.php?title=Property:P571&diff=1040450905&oldid=1038877052

Oct 28 2019, 7:39 PM · MW-1.35-notes (1.35.0-wmf.5; 2019-11-05), Wikidata-Campsite, MW-1.34-notes (1.34.0-wmf.15; 2019-07-23), Wikidata

Oct 27 2019

Pintoch updated the task description for T236619: Workflow to create special properties and items at boot stage.
Oct 27 2019, 1:44 PM · Wikidata, Wikibase-Containers