
Pintoch (Antonin Delpeuch)
User

User Details

User Since
Nov 9 2016, 7:25 PM (170 w, 6 d)
Availability
Available
LDAP User
Unknown
MediaWiki User
Pintoch [ Global Accounts ]

Recent Activity

Sun, Feb 16

Pintoch updated subscribers of T244847: Future of the OpenRefine Wikidata reconciliation interface.

Concerning the choice of language to make it easier to maintain / deploy in a Wikimedia context:

  • PHP seems like a pretty widespread choice, and is mandatory if the API is to be implemented as a MediaWiki extension. My understanding from our meeting with @Lokal_Profil is that it would generally be helpful to other Wikimedia organizations familiar with this stack, even if the service is not directly integrated into MediaWiki;
  • @Mvolz told me that Node.js is also used to run services at WMF (but Parsoid is moving from Node.js to PHP-only, perhaps a sign that Node.js is not a good long-term choice).
Sun, Feb 16, 3:43 PM · User-Alicia_Fagerving_WMSE, WMSE-Tools-for-Partnerships-2019-Blueprinting, OpenRefine

Thu, Feb 13

Alicia_Fagerving_WMSE awarded T244847: Future of the OpenRefine Wikidata reconciliation interface a Love token.
Thu, Feb 13, 11:19 AM · User-Alicia_Fagerving_WMSE, WMSE-Tools-for-Partnerships-2019-Blueprinting, OpenRefine

Tue, Feb 11

Pintoch updated the task description for T244847: Future of the OpenRefine Wikidata reconciliation interface.
Tue, Feb 11, 1:06 PM · User-Alicia_Fagerving_WMSE, WMSE-Tools-for-Partnerships-2019-Blueprinting, OpenRefine
Pintoch updated the task description for T244847: Future of the OpenRefine Wikidata reconciliation interface.
Tue, Feb 11, 1:04 PM · User-Alicia_Fagerving_WMSE, WMSE-Tools-for-Partnerships-2019-Blueprinting, OpenRefine
Pintoch updated the task description for T244847: Future of the OpenRefine Wikidata reconciliation interface.
Tue, Feb 11, 11:13 AM · User-Alicia_Fagerving_WMSE, WMSE-Tools-for-Partnerships-2019-Blueprinting, OpenRefine
Pintoch updated the task description for T244847: Future of the OpenRefine Wikidata reconciliation interface.
Tue, Feb 11, 11:05 AM · User-Alicia_Fagerving_WMSE, WMSE-Tools-for-Partnerships-2019-Blueprinting, OpenRefine
Pintoch created T244847: Future of the OpenRefine Wikidata reconciliation interface.
Tue, Feb 11, 11:00 AM · User-Alicia_Fagerving_WMSE, WMSE-Tools-for-Partnerships-2019-Blueprinting, OpenRefine

Thu, Jan 23

Pintoch added a comment to T240436: Unknown command INCRBYFLOAT in OpenRefine reconciliation interface.

Have you tried a more recent version (5.0 or above)? 3.3.11 is quite old and might not support INCRBYFLOAT, which is used to compute usage statistics for the endpoint.
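For reference, a minimal sketch (via redis-py, with an illustrative key name rather than whatever key the service actually uses) of the kind of call involved:

import redis

r = redis.Redis(host="localhost", port=6379)
# Atomically add a floating-point amount to a usage counter.
# This fails on servers or clients that do not know INCRBYFLOAT.
r.incrbyfloat("usage:reconcile-endpoint", 1.0)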

Thu, Jan 23, 4:22 PM · OpenRefine

Jan 18 2020

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

Indeed, this does not have anything to do with you! I was just trying to explain that I have stopped trying to help solve this issue (hence my unsubscribing from it).

Jan 18 2020, 1:14 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata
Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

I don't know - I stopped working on this task and T240369 since T240374 was declined. I don't think I can contribute to solving this problem in the current state of affairs, sorry!

Jan 18 2020, 7:41 AM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Jan 17 2020

Pintoch added a comment to T240442: Design a continuous throttling policy for Wikidata bots.

It is actually possible to retrieve the current maxlag value from the API without making any edit (see @Addshore's comment above).
So, just retrieve the current maxlag value and compute your desired edit rate for this maxlag with the function plotted above. Then sleep for the appropriate amount of time between any two edits to achieve this rate. Refresh the maxlag value from the server periodically.
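A minimal sketch of that loop in Python, under two assumptions: that the current lag can be read from the error returned by a harmless read request sent with maxlag=-1, and that edit_rate_for_lag is a hypothetical stand-in for the function plotted above.

import time
import requests

API = "https://www.wikidata.org/w/api.php"

def current_lag(session):
    # Assumption: a read request with maxlag=-1 is always rejected with a
    # "maxlag" error whose body reports the current lag in seconds.
    r = session.get(API, params={"action": "query", "meta": "siteinfo",
                                 "maxlag": -1, "format": "json"})
    return float(r.json().get("error", {}).get("lag", 0))

def edit_rate_for_lag(lag):
    # Hypothetical placeholder for the plotted function: full speed
    # (60 edits/min) below 1 second of lag, tapering to zero at 8 seconds.
    return 60 * max(0.0, min(1.0, (8 - lag) / 7))

session = requests.Session()
while True:
    # A real bot would cache the lag for a while instead of asking before
    # every single edit; this is kept simple for illustration.
    rate = edit_rate_for_lag(current_lag(session))
    if rate <= 0:
        time.sleep(60)           # fully lagged: pause before checking again
        continue
    # perform_one_edit(session)  # the bot's actual edit would go here
    time.sleep(60.0 / rate)      # space edits out to achieve the desired rate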

Jan 17 2020, 8:18 PM · Wikidata

Jan 3 2020

Pintoch closed T172750: Tool "openrefine-wikidata" loads assets from bootstrapcdn and github, a subtask of T172065: Hunt for Toolforge tools that load resources from third party sites, as Resolved.
Jan 3 2020, 11:43 AM · Privacy Engineering, Toolforge-standards-committee, Tools, Privacy
Pintoch closed T172750: Tool "openrefine-wikidata" loads assets from bootstrapcdn and github as Resolved.

The task has been resolved.

Jan 3 2020, 11:43 AM · OpenRefine, Tools

Dec 29 2019

Pintoch added a comment to T241029: EditGroups tool: Unstable connection to SQL database on Toolforge.

OK! This connection should never be idle given that bot edits on Wikidata never stop, so I am still not sure why this happens. It might be due to the specifics of how Django handles these long-running SQL connections.
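In case it is related, a hedged sketch of two Django-side mitigations (illustrative settings, not the actual EditGroups configuration): disabling persistent connections, and explicitly discarding stale connections in long-running code.

# settings.py - do not reuse database connections across requests
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "editgroups",      # illustrative name only
        "CONN_MAX_AGE": 0,
    }
}

# in a long-running consumer loop, drop connections the server has closed
from django.db import close_old_connections
close_old_connections()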

Dec 29 2019, 9:01 AM · Tools, Data-Services

Dec 24 2019

Pintoch updated the task description for T241422: Wikidata forms without statements use empty JSON array instead of empty JSON object.
Dec 24 2019, 4:32 PM · MW-1.35-notes (1.35.0-wmf.20; 2020-02-18), User-DannyS712, Patch-For-Review, Wikidata-Campsite, Wikidata, Lexicographical data
Pintoch created T241422: Wikidata forms without statements use empty JSON array instead of empty JSON object.
Dec 24 2019, 4:31 PM · MW-1.35-notes (1.35.0-wmf.20; 2020-02-18), User-DannyS712, Patch-For-Review, Wikidata-Campsite, Wikidata, Lexicographical data

Dec 18 2019

Pintoch renamed T241029: EditGroups tool: Unstable connection to SQL database on Toolforge from EditGroups tool: Unstable connection to SQL database on Toollabs to EditGroups tool: Unstable connection to SQL database on Toolforge.
Dec 18 2019, 11:24 AM · Tools, Data-Services
Pintoch created T241029: EditGroups tool: Unstable connection to SQL database on Toolforge.
Dec 18 2019, 4:39 AM · Tools, Data-Services

Dec 15 2019

Pintoch awarded T240795: Meeting about OpenRefine development a Like token.
Dec 15 2019, 4:39 PM · WMSE-Tools-for-Partnerships-2019-Blueprinting, User-LokalProfil, User-Alicia_Fagerving_WMSE

Dec 14 2019

Pintoch added a comment to T233715: Catch exception for Dissemin papers without expected author names.

This seems like a pretty important bug… I would not get held up by design worries: just make sure no exception is thrown!

Dec 14 2019, 5:33 PM · OABot

Dec 11 2019

Pintoch added a comment to T240442: Design a continuous throttling policy for Wikidata bots.

Thanks! I think dynamically changing the maxlag value is still likely to introduce some thresholds, whereas a continuous slowdown (retrieving the lag and computing one's edit rate based on it) should in theory reach an equilibrium point.

Dec 11 2019, 4:49 PM · Wikidata
Pintoch added a comment to T240370: Maxlag=5 for Petscan.

Thanks for the analysis! Whether this is a breaking change or not is not my concern: Petscan and other mass-editing tools based on Widar should play by the book. I can provide a simple patch which ensures maxlag=5 is applied to all Widar edits: if someone wants to do a refined version which allows specific user-triggered edits to go through without a maxlag parameter, that is great. @Magnus, what is your take on this?

Dec 11 2019, 3:27 PM · Wikidata
Addshore awarded T240442: Design a continuous throttling policy for Wikidata bots a Manufacturing Defect? token.
Dec 11 2019, 11:37 AM · Wikidata
Pintoch added a comment to T240442: Design a continuous throttling policy for Wikidata bots.

If clients are able to retrieve the current lag periodically (through some MediaWiki API call? which one?), then this should not require any server-side change. Clients can continue to use maxlag=5 but also throttle themselves using the smoothed function proposed.

Dec 11 2019, 11:25 AM · Wikidata
Pintoch created T240442: Design a continuous throttling policy for Wikidata bots.
Dec 11 2019, 11:19 AM · Wikidata
Pintoch added a comment to T240370: Maxlag=5 for Petscan.

@Bugreporter have you got details of where this behaviour is currently implemented in PetScan? In particular, how do you request the current maxlag with the MediaWiki API?

Dec 11 2019, 10:55 AM · Wikidata
Pintoch added a comment to T197588: Agree on a "manifest" format to expose the configuration of Wikibase instances.

@Theklan let's move your issue to a different ticket, as it does not seem to be related: T240436

Dec 11 2019, 10:45 AM · OpenRefine, Wikidata
Pintoch added a project to T240436: Unknown command INCRBYFLOAT in OpenRefine reconciliation interface: OpenRefine.
Dec 11 2019, 10:44 AM · OpenRefine
Pintoch created T240436: Unknown command INCRBYFLOAT in OpenRefine reconciliation interface.
Dec 11 2019, 10:43 AM · OpenRefine

Dec 10 2019

Pintoch added a comment to T240370: Maxlag=5 for Petscan.

I've had a quick look at the code to see if I could submit a patch for this myself, but it is not clear to me where the edits are made: I looked in petscan_rs and wikibase_rs to no avail. Are Petscan edits perhaps made in the browser, by sending them to some Widar-like interface?

Dec 10 2019, 9:57 PM · Wikidata
Pintoch added a comment to T240369: Chase up bot operators whose bot keeps running when the dispatch lag is higher than 5.

@Bugreporter yes indeed! I was off by one hour there. Thanks for your help! Feel free to add more bots which match that period.

Dec 10 2019, 8:24 PM · Wikidata
Pintoch added a comment to T240371: Maxlag=5 for Author Disambiguator.

Matching pull request: https://github.com/arthurpsmith/author-disambiguator/pull/107

Dec 10 2019, 8:21 PM · Wikidata
Pintoch added a comment to T240376: Maxlag=5 for LogainmBot.

If you have not changed anything in user-config.py then you should be good to go; it might have been a false positive on my side. Sorry for the noise!

Dec 10 2019, 8:19 PM · Wikidata
Pintoch updated the task description for T240377: Maxlag=5 for Reinheitsgebot.
Dec 10 2019, 8:12 PM · Wikidata
Pintoch updated the task description for T240375: Maxlag=5 for LargeDatasetBot.
Dec 10 2019, 8:11 PM · Wikidata
Pintoch updated the task description for T240369: Chase up bot operators whose bot keeps running when the dispatch lag is higher than 5.
Dec 10 2019, 8:09 PM · Wikidata
Pintoch created T240377: Maxlag=5 for Reinheitsgebot.
Dec 10 2019, 8:05 PM · Wikidata
Pintoch created T240376: Maxlag=5 for LogainmBot.
Dec 10 2019, 8:03 PM · Wikidata
Pintoch created T240375: Maxlag=5 for LargeDatasetBot.
Dec 10 2019, 8:00 PM · Wikidata
Pintoch created T240374: Maxlag=5 for BotMultichill.
Dec 10 2019, 7:58 PM · Wikidata
Pintoch created T240373: Maxlag=5 for Edoderoobot.
Dec 10 2019, 7:56 PM · Wikidata
Pintoch assigned T240370: Maxlag=5 for Petscan to Magnus.
Dec 10 2019, 7:54 PM · Wikidata
Pintoch created T240371: Maxlag=5 for Author Disambiguator.
Dec 10 2019, 7:54 PM · Wikidata
Pintoch created T240370: Maxlag=5 for Petscan.
Dec 10 2019, 7:53 PM · Wikidata
Pintoch created T240369: Chase up bot operators whose bot keeps running when the dispatch lag is higher than 5.
Dec 10 2019, 7:51 PM · Wikidata
Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

I am first getting in touch with people who seem to be running bots with a maxlag greater than 5, or with no maxlag parameter at all, to see if they would be willing to follow @Addshore's advice never to use a maxlag greater than 5.

Dec 10 2019, 2:27 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Dec 2 2019

Pintoch added a comment to T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata.

There has been one attempt, I think, but it did not go very far, and the IDs do not seem to have been imported so far!

Dec 2 2019, 10:42 AM · OpenRefine, Wiki-Techstorm-2019

Nov 29 2019

Pintoch added a comment to T238340: Import UK lakes in Wikidata.

I think it is a bit harder to extract more than 1000 records (I capped it on purpose to keep it manageable for the workshop).

Nov 29 2019, 11:27 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T238340: Import UK lakes in Wikidata.

@Jheald I don't think anyone is working on this anymore: if you are still interested in the scraped dataset, it is here: http://pintoch.ulminfo.fr/adc2c9aaba/lakes-portal.tsv

Nov 29 2019, 7:20 AM · OpenRefine, Wiki-Techstorm-2019

Nov 26 2019

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

OK! If you have ways to check what sort of maxlag values are being used, that would be great!

Nov 26 2019, 7:32 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Nov 25 2019

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

Actually, some tools seem to be doing something like that already, since edits are still going through despite the maxlag being above 5 for more than an hour now (Author Disambiguator does this, probably QuickStatements too, and Edoderoobot as well). So these tools use maxlag values higher (more aggressive) than 5.

Nov 25 2019, 11:24 AM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata
Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

One problem with the current policy (requesting all automated editing processes to use maxlag=5) is that it creates a binary threshold: either the query service lag is under the threshold, in which case bots will edit at full speed, or it is above the threshold, in which case they should all stop editing entirely. This is likely to create an oscillating behaviour, where all bots start and stop periodically. This is probably ideal neither for the infrastructure nor for the users.

Nov 25 2019, 10:40 AM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Nov 23 2019

Pintoch awarded T237471: Import data on hospitals in Västra Götalandsregionen a Love token.
Nov 23 2019, 9:41 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata.

We have an import here: https://tools.wmflabs.org/editgroups/b/OR/8cf42ae3c0/
Good job!

Nov 23 2019, 11:28 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238333: Import the Museum Data Files in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Nov 23 2019, 11:26 AM · Wiki-Techstorm-2019, OpenRefine
Pintoch moved T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Nov 23 2019, 11:25 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Nov 23 2019, 11:25 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238441: Import endangered alphabets in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Nov 23 2019, 11:16 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T237615: Translate interface of OpenRefine in Dutch.

Moving to "Done" although this is not complete, but we did have a lot of new translations (not just in Dutch) during the event.

Nov 23 2019, 11:15 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T237615: Translate interface of OpenRefine in Dutch from Backlog to Done on the Wiki-Techstorm-2019 board.
Nov 23 2019, 11:15 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238348: Import Publons journal IDs in Wikidata from Backlog to Done on the Wiki-Techstorm-2019 board.
Nov 23 2019, 11:14 AM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T238348: Import Publons journal IDs in Wikidata.

Data is getting into Wikidata!

Nov 23 2019, 11:10 AM · OpenRefine, Wiki-Techstorm-2019

Nov 22 2019

Pintoch added a comment to T236038: WORKSHOP: OpenRefine (10.30 - 12.15).

If you have trouble reconciling, use this reconciliation service: http://ulminfo.fr:3894/en/api

Nov 22 2019, 2:14 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T236038: WORKSHOP: OpenRefine (10.30 - 12.15).

For people who cannot install OpenRefine on their laptops, use: http://5.135.188.139:48379/

Nov 22 2019, 9:57 AM · OpenRefine, Wiki-Techstorm-2019

Nov 21 2019

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

So this is what we get with an exponential back-off (1.5 factor), at the moment:

22:37:27.148 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 1000 milliseconds. (19338ms)
22:37:28.729 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 1500 milliseconds. (1581ms)
22:37:33.809 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 2250 milliseconds. (5080ms)
22:37:37.931 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 3375 milliseconds. (4122ms)
22:37:42.663 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 5062 milliseconds. (4732ms)
22:37:49.437 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 7593 milliseconds. (6774ms)
22:37:58.429 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 5.4916666666667 seconds lagged. -- pausing for 11389 milliseconds. (8992ms)
22:38:18.217 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 6 seconds lagged. -- pausing for 17083 milliseconds. (19788ms)
22:38:36.461 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 6 seconds lagged. -- pausing for 25624 milliseconds. (18244ms)
22:39:05.013 [..baseapi.WbEditingAction] [maxlag] Waiting for all: 6.4666666666667 seconds lagged. -- pausing for 38436 milliseconds. (28552ms)

So it looks like these new rules mean no OpenRefine edits at all in the current situation.
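For context, a rough sketch of the back-off policy shown in the log above (not the actual Wikidata-Toolkit code, just the idea: a pause starting at one second and multiplied by 1.5 after each maxlag rejection):

import time

def edit_with_backoff(perform_edit, initial_pause=1.0, factor=1.5, max_attempts=15):
    # perform_edit() is assumed to return False when the API rejects the
    # edit with a maxlag error, and True once the edit goes through.
    pause = initial_pause
    for _ in range(max_attempts):
        if perform_edit():
            return True
        time.sleep(pause)        # 1.0s, 1.5s, 2.25s, 3.375s, ... as in the log
        pause *= factor
    return False                 # give up: the lag never dropped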

Nov 21 2019, 9:53 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata
Pintoch added a parent task for T238847: Create a suitable template and import into wikidata the list of major data breaches: T236038: WORKSHOP: OpenRefine (10.30 - 12.15).
Nov 21 2019, 9:09 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a subtask for T236038: WORKSHOP: OpenRefine (10.30 - 12.15): T238847: Create a suitable template and import into wikidata the list of major data breaches.
Nov 21 2019, 9:09 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238847: Create a suitable template and import into wikidata the list of major data breaches from Backlog to Data imports on the OpenRefine board.
Nov 21 2019, 9:04 PM · OpenRefine, Wiki-Techstorm-2019

Nov 20 2019

Pintoch added a comment to T221774: Add Wikidata query service lag to Wikidata maxlag.

Thanks for the notification! I would be happy to release a new version of OpenRefine with a patch applied - I can do this in the coming days. The exponential back-off suggested by @Multichill makes sense intuitively - could WMDE confirm that this is the policy they recommend? Happy to adapt the policy as required.

Nov 20 2019, 9:52 PM · MW-1.35-notes (1.35.0-wmf.8; 2019-11-26), User-Addshore, Wikidata-Campsite (Wikidata-Campsite-Iteration-∞), MW-1.34-notes (1.34.0-wmf.21; 2019-09-03), Patch-For-Review, observability, Wikidata-Query-Service, Wikidata

Nov 15 2019

Pintoch triaged T238454: Import list of members of the Federation of Museums and Art Galleries of Wales as Medium priority.
Nov 15 2019, 10:13 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238454: Import list of members of the Federation of Museums and Art Galleries of Wales from Backlog to Data imports on the OpenRefine board.
Nov 15 2019, 10:13 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238454: Import list of members of the Federation of Museums and Art Galleries of Wales.
Nov 15 2019, 10:12 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch updated the task description for T238340: Import UK lakes in Wikidata.
Nov 15 2019, 9:35 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238441: Import endangered alphabets in Wikidata from Backlog to Data imports on the OpenRefine board.
Nov 15 2019, 7:57 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238441: Import endangered alphabets in Wikidata from Backlog to SPARQLstation on the Wiki-Techstorm-2019 board.
Nov 15 2019, 7:57 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch triaged T238441: Import endangered alphabets in Wikidata as Medium priority.
Nov 15 2019, 7:57 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238441: Import endangered alphabets in Wikidata.
Nov 15 2019, 7:57 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch updated the task description for T236038: WORKSHOP: OpenRefine (10.30 - 12.15).
Nov 15 2019, 6:54 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238348: Import Publons journal IDs in Wikidata up for grabs.
Nov 15 2019, 4:31 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata up for grabs.
Nov 15 2019, 4:31 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata up for grabs.
Nov 15 2019, 4:31 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238340: Import UK lakes in Wikidata up for grabs.
Nov 15 2019, 4:31 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238334: Import public libraries in Austria up for grabs.
Nov 15 2019, 4:30 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238333: Import the Museum Data Files in Wikidata up for grabs.
Nov 15 2019, 4:30 PM · Wiki-Techstorm-2019, OpenRefine

Nov 14 2019

Pintoch moved T236768: Wikimedia Commons: Reconcile & upload images of old prints from Erfgoedcentrum Rozet to Commons with OpenRefine & Pattypan from Backlog to Data imports on the OpenRefine board.
Nov 14 2019, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T237471: Import data on hospitals in Västra Götalandsregionen from Backlog to Data imports on the OpenRefine board.
Nov 14 2019, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238333: Import the Museum Data Files in Wikidata from Backlog to Data imports on the OpenRefine board.
Nov 14 2019, 6:19 PM · Wiki-Techstorm-2019, OpenRefine
Pintoch moved T238334: Import public libraries in Austria from Backlog to Data imports on the OpenRefine board.
Nov 14 2019, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238340: Import UK lakes in Wikidata from Backlog to Data imports on the OpenRefine board.
Nov 14 2019, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata from Backlog to Data imports on the OpenRefine board.
Nov 14 2019, 6:19 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238348: Import Publons journal IDs in Wikidata from Backlog to Data imports on the OpenRefine board.
Nov 14 2019, 6:18 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch moved T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata from Backlog to Data imports on the OpenRefine board.
Nov 14 2019, 6:18 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch added a comment to T238348: Import Publons journal IDs in Wikidata.

@Ecritures shouldn't we rather assign tasks to the participants who decide to tackle them?

Nov 14 2019, 6:16 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238348: Import Publons journal IDs in Wikidata.
Nov 14 2019, 4:51 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238346: Import Californian list of chemicals known to cause cancer or reproductive harm in Wikidata.
Nov 14 2019, 4:24 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T237615: Translate interface of OpenRefine in Dutch up for grabs.
Nov 14 2019, 4:13 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238334: Import public libraries in Austria up for grabs.
Nov 14 2019, 4:12 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch placed T238340: Import UK lakes in Wikidata up for grabs.
Nov 14 2019, 4:12 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238345: Import ISO 10383 Codes for exchanges and market identification (MIC) in Wikidata.
Nov 14 2019, 4:12 PM · OpenRefine, Wiki-Techstorm-2019
Pintoch created T238340: Import UK lakes in Wikidata.
Nov 14 2019, 3:48 PM · OpenRefine, Wiki-Techstorm-2019