Software developer on the Wikidata team at Wikimedia Germany. Private account: @LucasWerkmeister.
This is now live \o/ https://commons.wikimedia.org/wiki/Template:Cc-zero
Do we want to get our data from the href or from data-attributes? (I like href, because then we can ensure correct no-JS behaviour. But they can't edit this without JS on wikidata.org either...)
Mon, Jun 17
Fri, Jun 14
Thu, Jun 13
Rough sketch with Alaa – here, DatabaseTermIdsResolver gets the extra join conditions, and DatabasePropertyTermStore would call that special method if it detects its injected TermIdsResolver is a DatabaseTermIdsResolver.
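The detection pattern could look roughly like this (a Python analogue rather than the actual PHP; all names and bodies here are hypothetical stand-ins for the classes mentioned above):

```python
# Hypothetical sketch: a store injected with a generic resolver interface
# that calls an extra, condition-aware method only when the concrete
# implementation happens to be the database-backed one.

class TermIdsResolver:
    """Generic interface: resolve term IDs to terms."""
    def resolve_term_ids(self, term_ids):
        raise NotImplementedError

class DatabaseTermIdsResolver(TermIdsResolver):
    """Database-backed resolver with a special method taking extra join conditions."""
    def resolve_term_ids(self, term_ids):
        return {tid: f"term-{tid}" for tid in term_ids}

    def resolve_term_ids_with_conditions(self, term_ids, join_condition):
        # In the real code the condition would become part of the SQL join;
        # here a predicate filter is a stand-in for that.
        return {tid: term for tid, term in self.resolve_term_ids(term_ids).items()
                if join_condition(tid)}

class DatabasePropertyTermStore:
    def __init__(self, resolver):
        self.resolver = resolver

    def get_terms(self, term_ids, join_condition=None):
        # Detect the concrete type and take the optimized path if available.
        if join_condition is not None and isinstance(self.resolver, DatabaseTermIdsResolver):
            return self.resolver.resolve_term_ids_with_conditions(term_ids, join_condition)
        return self.resolver.resolve_term_ids(term_ids)
```

The instanceof check keeps the injected dependency typed against the generic interface while still letting the database store and database resolver cooperate on the extra join.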
Yeah, that’s what I meant by this part of the task description:
Wed, Jun 12
I guess pull request building should now be turned off again.
Travis is green again :)
Well, this also still needs to be wired up so that DataAccessSettings::useNormalizedPropertyTerms() can actually return true. I’m still working on that part, so re-claiming the task.
Done – this task was only for starting the conversion, and with all the patches here being merged, I think we can close it. Hopefully we can continue with the conversion soon.
It looks like the current BufferingTermLookup is set up in two places:
This was tested in T225084: Test write logic on test node, so I think we can close the task.
I think SiteConfiguration::getSetting() uses null to mean that no setting of that name exists (though it’s not mentioned in the documentation comments), that’s probably why getAll() and extractGlobalSetting() check is_null( $value ). If we still want to allow callers to distinguish between “setting is null” and “setting is missing”, then perhaps a different signature might be necessary:
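One way to let callers make that distinction is a dedicated sentinel instead of overloading null – sketched here in Python rather than PHP, with hypothetical names; the sentinel idea is the point, not the exact signature:

```python
# Sketch: distinguish "setting exists and is null/None" from "setting is
# missing" by using a private sentinel object that no caller can pass by
# accident. All names are illustrative, not the real SiteConfiguration API.

_MISSING = object()  # unique sentinel, can never collide with a real value

class SiteConfiguration:
    def __init__(self, settings):
        self._settings = settings

    def get_setting(self, name, default=_MISSING):
        value = self._settings.get(name, _MISSING)
        if value is _MISSING:
            if default is _MISSING:
                raise KeyError(f"no setting named {name!r}")
            return default
        return value  # may legitimately be None

config = SiteConfiguration({"a": None, "b": 1})
config.get_setting("a")     # → None (the setting exists and is null)
config.get_setting("c", 0)  # → 0 (the setting is missing, default used)
```

With this shape, `is_null( $value )`-style checks in the callers would become checks against the sentinel, and a null-valued setting survives round trips.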
Great, thanks for testing!
Travis build succeeded except for the PHP 7.3 version, but that seems to be failing for unrelated reasons.
Tue, Jun 11
Thanks! I updated #101 accordingly – so far the Travis job is stuck being queued, but hopefully it’ll start soonish and tell us whether that fully fixes the problem.
And cleanup also seems to work fairly well – wbt_text rows for the old versions of the labels I edited are gone.
That makes sense to me – it also matches TermBuffer’s array $termTypes = null, array $languageCodes = null arguments. A separate method doesn’t seem necessary, and to me a cloning setter seems harder to use. And a TermIdsResolverFilter interface doesn’t really work, I think, because you can’t actually inject arbitrary instances of the interface there – the implementation has to know how to filter for it in the database. (Also, that would open the door to a MatchingPrefixTermIdsResolverFilter, i.e. a kind of search, which we don’t actually want to support as far as I understand.)
I think removing it is fine for now. Also, this means we can likely (partially) revert Ic25f4817ae: Inject ILBFactory, not ILoadBalancer, into stores, since we won’t need the transaction methods anymore. (Or should we still inject an ILBFactory? I really don’t understand the difference between the two classes, but in all the other classes we added, we inject an ILoadBalancer.)
I think so – looks like this was done in I228e201b73.
Fri, Jun 7
Okay, the Language::getLocalisationCache()->unloadAll(); doesn’t seem to make a difference. But I can keep trying more things (and adding debug code) on GitHub now that pull request building is enabled… though I don’t know how much time I’ll have for this :/
Okay, I think I’ve figured out the most important parts. Basically, you don’t want to name your instance wikibase-something; however, if you do, I’ve now set some default config that at least lets you SSH into the instance, and added some documentation at https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikidata-dev/Documentation#wikibase*.
My wikibase-dev issue turns out to be specific to that project, see T225312: role::wikibase in wikidata-dev Cloud VPS project broken (⇒ can’t SSH into wikibase-* instances), and @Samwalton9’s issue appears to have resolved itself according to discussion in #wikimedia-cloud, so I guess we can close this.
I tried this out by creating other-test-T225307 and wikibase-test-T225307 instances; other-test-T225307 worked fine, wikibase-test-T225307 doesn’t. (I’m deleting other-test-T225307 now, but wikibase-test-T225307 can stay around for a bit, in case someone wants to investigate it further.)
Now it’s building, thanks.
I'm not sure exactly what level it logs at, but it logs at least at debug level… and given that we resolve this on almost every Wikidata-related request, we should also try to avoid it for performance reasons (even if no one listens to that log).
Failed asserting that an array does not have the key 'warnings'.
It might need more optimizations, but that should be a separate task.
Thu, Jun 6
I think we should decline this task – it turns out Wikibase already defers the whole terms update, so there’s no point in deferring just the cleanup part even further IMHO. (See also T225084#5240930.)
I think I’ve figured it out. Stacktrace from log/debug-termwiki.log:
The tests are working:
I just rebased the chain because it conflicted with Iabc6aaeb76. Can we get this merged soon to avoid further conflicts?
Oh, there’s an issue with this though: in the callback, we don’t have the property ID to which the terms belong… I think it actually needs to be a second argument to acquireTermIds(), not a persistent attribute of the acquirer?
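Roughly what I mean by making it a second argument – a Python sketch with hypothetical names, not the actual Wikibase signatures:

```python
# Hypothetical sketch: the property ID travels as a per-call argument to
# acquire_term_ids(), so the callback knows which property the acquired
# term IDs belong to, instead of the acquirer holding it as state.

class TermIdsAcquirer:
    def acquire_term_ids(self, terms, callback, property_id):
        # Pretend acquisition: assign sequential IDs to the given terms.
        term_ids = list(range(1, len(terms) + 1))
        # The callback receives both the IDs and the owning property ID.
        callback(term_ids, property_id)
        return term_ids
```

Keeping the ID out of the acquirer's attributes also keeps the acquirer reusable across properties within one request.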
Well, the third patch is still waiting for gate-and-submit to finish…
I’ll go with my suggested variant of the first version – I think that’s very clean and shouldn’t be too much effort.
I don’t see why we would need to investigate anything, to be honest – as I understand it, this task isn’t about preventing those 429 errors but about showing them to the user a bit more nicely, just like we detect timeouts and show a special error message.
Wed, Jun 5
As a slight variation of the first suggestion – if there’s an addTermIdsConsumerCallback() method, I think we could remove the constructor parameter again. After all, we wouldn’t use it in the WikibaseRepo use case, as far as I understand.
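The variation could be sketched like this (Python analogue, hypothetical names): callbacks are registered through a method, so callers that don’t need one – like the WikibaseRepo use case mentioned above – simply never register anything, and the constructor parameter goes away.

```python
# Sketch: consumer callbacks registered via a method instead of a
# constructor parameter. Names are illustrative, not the real API.

class TermIdsAcquirer:
    def __init__(self):
        self._consumers = []

    def add_term_ids_consumer_callback(self, callback):
        self._consumers.append(callback)

    def acquire_term_ids(self, terms):
        # Pretend acquisition: assign sequential IDs to the given terms.
        term_ids = list(range(1, len(terms) + 1))
        # Notify every registered consumer; zero consumers is fine.
        for consumer in self._consumers:
            consumer(term_ids)
        return term_ids
```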
Damn, I also thought it would be due to that patch…