Mentioning T66214: Define an official thumb API for reference.
Declining as an RFC, since this is not proposing a technical solution. But it's certainly interesting as a problem statement.
Dropping this off the RFC board, since it's not actionable. Adding to TechCom radar, since this seems relevant to platform evolution.
Moving to backlog. Needs to be more of a technical proposal to work as an RFC. @Krinkle said he'd find a related older task and improve this.
Tue, Apr 24
The "fell back to READ_LATEST" warning seems to be triggers as follows:
Pinging TechCom. I don't think this needs an RFC, but it's worth a heads up.
prepareSave() is indeed a misnomer, but this is hard to fix. There is an isValid() method which we could call when performing an edit, but that returns a boolean. It does not provide a way to report anything helpful to the user.
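To illustrate the kind of API that would help - a hypothetical sketch (this interface does not exist in MediaWiki), using StatusValue so validation failures can carry messages for the user:

```
<?php
// Hypothetical sketch, not an existing MediaWiki interface: a validation
// method that can report localizable errors to the user, unlike the
// boolean Content::isValid().
interface ValidatableContent {
	public function validateForSave(): StatusValue;
}

class ExampleContent implements ValidatableContent {
	public function validateForSave(): StatusValue {
		$status = StatusValue::newGood();
		if ( !$this->hasRequiredFields() ) { // hypothetical invariant check
			// A message key that can be rendered for the user on edit failure.
			$status->fatal( 'examplecontent-missing-required-fields' );
		}
		return $status;
	}

	private function hasRequiredFields(): bool {
		return true; // stub for illustration
	}
}
```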
Or just allow multiple sitelinks and treat them like constraint violations.
Mon, Apr 23
I don't think the statement in T143842#4151062 is useful, or even correct. ;)
The RevisionArchiveRecord bug is fixed; the priority of the rest is not high.
@thiemowmde ah, with "local" you mean per-repo, not local to my system.
Sat, Apr 21
@daniel: Please feel free to disable this part of the sniff in your local .phpcs.xml for now.
Fri, Apr 20
This ties in with something I have been wanting for a long time for Wikidata: an easy way to discover the entity URI associated with a wikidata page, as well as different data URLs (see T161527). For example, on https://www.wikidata.org/wiki/Q64, I would want to be able to discover:
Thu, Apr 19
@Umherirrender hm, I'd like to avoid using the "Test" suffix - it's not a runnable test case after all. We use the "TestBase" or "TestCase" ending for abstract test classes for the same reason, and I have used the "Tester" suffix for helper classes that I wrote for the same purpose as the trait: asserting compliance with the contract of an interface, across implementations.
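For concreteness, a sketch of that pattern with invented names (FooStore etc. are hypothetical): a "Tester" trait holds the contract assertions, and each implementation's test case mixes it in.

```
<?php
// Illustrative sketch (hypothetical names): a trait asserting compliance
// with an interface's contract, reused across implementations.
trait FooStoreContractTester {
	// Each implementation's test case provides its own instance.
	abstract protected function newFooStore(): FooStore;

	public function testStoredValueCanBeRetrieved() {
		$store = $this->newFooStore();
		$store->put( 'key', 'value' );
		$this->assertSame( 'value', $store->get( 'key' ) );
	}
}

class SqlFooStoreTest extends \PHPUnit\Framework\TestCase {
	use FooStoreContractTester;

	protected function newFooStore(): FooStore {
		return new SqlFooStore();
	}
}
```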
It's done and deployed.
Oh! So, the problem description should be:
Folded into T174038: Implement MCR page update interface
Wed, Apr 18
Sounds like you are re-inventing what wikidata calls "constraints" - have a look at https://www.wikidata.org/wiki/Help:Property_constraints_portal
@Jarekt You can check which pages depend on which entity using Special:EntityUsage on the client wiki, e.g. https://commons.wikimedia.org/wiki/Special:EntityUsage/Q23.
@aaron My confusion is about the subtleties of the timestamp handling in RefreshLinksJob, and the interaction between HTMLCacheUpdateJob, RefreshLinksJob, and the ParserCache.
A wikidata change triggers links updates as follows (see the sketch after this list):
- ChangeHandler::handleChange calls WikiPageUpdater::scheduleRefreshLinks
- WikiPageUpdater::scheduleRefreshLinks schedules a RefreshLinksJob
- RefreshLinksJob::runForTitle() then...
- re-parses the page (hopefully - the interaction with the parser cache is somewhat complex, but since the page itself is getting re-rendered, this part seems to work)
- calls WikitextContent::getSecondaryDataUpdates, which returns a LinksUpdate
- calls LinksUpdate::doUpdate, which updates the database, including the categorylinks table
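A condensed sketch of that chain. The method names are the real ones from the list above; the bodies are simplified illustrations, not the actual implementations:

```
<?php
// Wikibase client: ChangeHandler::handleChange() ends up calling
// WikiPageUpdater::scheduleRefreshLinks(), which queues a job per title:
JobQueueGroup::singleton()->push( new RefreshLinksJob( $title, [] ) );

// Later, RefreshLinksJob::runForTitle() (simplified) re-parses the page
// and runs the content's secondary data updates:
$output = $content->getParserOutput( $title );
foreach ( $content->getSecondaryDataUpdates( $title, null, true, $output ) as $update ) {
	$update->doUpdate(); // for wikitext: LinksUpdate, writing categorylinks etc.
}
```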
Tue, Apr 17
Bumping to "high" for the RevisionArchiveRecord revision fix, to avoid incorrect revisions being created on undeletion.
Updating and running populateRevisionLength probably isn't high priority.
I'll fix RevisionArchiveRecord.
@Yurik now you lost me. "religion" isn't a unique ID, and it should be editable. Why not use a regular statement? That's what we do on wikidata. But this seems to be entirely unrelated to this ticket.
Mon, Apr 16
Oh, I just realized: you could fake this using a fake sitelink. On wikidata, an item's sitelinks point to articles in sister projects like wikipedia. It should not be hard to allow pages on non-mediawiki sites to be referenced in the same way - or even to just pretend to reference a page. Sitelinks are unique, and can even be used to address items in the API.
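To illustrate the last point, a minimal client-side sketch: a sitelink (site + title) uniquely identifies an item, so wbgetentities can resolve it without knowing the Q-id.

```
<?php
// Resolve an item by sitelink instead of by ID.
$url = 'https://www.wikidata.org/w/api.php?' . http_build_query( [
	'action' => 'wbgetentities',
	'sites' => 'enwiki',
	'titles' => 'Berlin',
	'format' => 'json',
] );
$data = json_decode( file_get_contents( $url ), true );
// $data['entities'] is keyed by the resolved item ID, e.g. 'Q64'.
```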
A good example of how to add a custom entity type is https://www.mediawiki.org/wiki/Extension:WikibaseMediaInfo. The entry point for defining an entity type is the wiring file at https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseMediaInfo/+/master/WikibaseMediaInfo.entitytypes.php. For creating entities of a new type, especially with extra requirements, a new API module should be implemented, similar to the one we introduced for Lexeme Forms: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/WikibaseLexeme/+/master/src/Api/AddForm.php.
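The rough shape of such a wiring file, modeled on the WikibaseMediaInfo example linked above; the entity type name and the serializer/deserializer classes here are hypothetical, and the full set of supported callbacks is documented in Wikibase's docs/entitytypes.md:

```
<?php
return [
	'custom-item' => [ // hypothetical entity type name
		'serializer-factory-callback' => function ( SerializerFactory $serializerFactory ) {
			return new CustomItemSerializer(); // hypothetical class
		},
		'deserializer-factory-callback' => function ( DeserializerFactory $deserializerFactory ) {
			return new CustomItemDeserializer(); // hypothetical class
		},
		// ... plus callbacks for views, content handling, ID parsing, etc.
	],
];
```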
For a 3rd party site, it would not be terribly hard to implement a new entity type that extends the Item type to have an additional immutable string ID. This would be done by a custom extension on top of Wikibase. I don't think we'd deploy such a thing on Wikidata, though - we'd end up with hundreds of such custom types that only differ by a handful of fields, each with its own slightly different logic.
Fri, Apr 13
content.content_address doesn't have to be nulled. There will just be no mechanism on labs for resolving these addresses.
Wed, Apr 11
Why is uniqueness even an issue? Just provide a way to search items by the value associated with a property. The result will be a ranked list, potentially incomplete if there are many matches. A client that wants the single "best" match can use the top hit.
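A sketch of what such a lookup can look like from the client side, using full-text search with the haswbstatement keyword (this assumes the CirrusSearch-based statement indexing is enabled; P1566 is the GeoNames ID property, used here only as an example):

```
<?php
// Find items carrying a given property/value statement.
$url = 'https://www.wikidata.org/w/api.php?' . http_build_query( [
	'action' => 'query',
	'list' => 'search',
	'srsearch' => 'haswbstatement:P1566=2950159',
	'format' => 'json',
] );
$result = json_decode( file_get_contents( $url ), true );
// A ranked list; a client that wants the single "best" match takes the first hit.
foreach ( $result['query']['search'] as $hit ) {
	echo $hit['title'], "\n";
}
```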
Tue, Apr 10
I didn't look at the Lua code, but here is my guess based on what was written above:
The "main" slot has always been somewhat special, for example it has to be present (even if empty) on every page.
Mon, Apr 9
Sun, Apr 1
Fri, Mar 30
Thu, Mar 29
Any "php error" (notice, warning, error) is turned into an exception by PHPUnit and fails the test.
While working on MCR, we noticed that having Content-specific ParserOptions doesn't make much sense anywhere, and we want to remove that option. If I understand correctly, @Anomie is pointing out a better way to achieve what we wanted: splitting the parser cache on the user language.
Wed, Mar 28
@DePiep You can mention people, and you can quote them. You can't really "reply".
Note to self: look at @Tgr's comments on https://www.mediawiki.org/wiki/Topic:U8zvaqr5vxw5d1pw
@Tgr You are right, we need fine-grained per-slot tracking to enable efficient purging. We should keep this in mind. This RFC, however, is about what to do as long as we don't have that: what behavior do we aim for with the current DB schema for tracking metadata?
I'd like to point out that Wikibase has code for unit conversion, along with conversion factors for several thousand units in the config. We currently only use this when exporting to RDF. It wouldn't be very hard to make this functionality available via Lua, though.
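A purely hypothetical sketch of what exposing this to Lua could look like as a Scribunto library. All Wikibase-side names here (UnitConversionLibrary, convertToBaseUnit, mw.wikibase.units.lua) are invented; only the Scribunto plumbing (Scribunto_LuaLibraryBase, registerInterface) is real.

```
<?php
class UnitConversionLibrary extends Scribunto_LuaLibraryBase {
	public function register() {
		$lib = [
			'convertToBaseUnit' => [ $this, 'convertToBaseUnit' ],
		];
		return $this->getEngine()->registerInterface(
			__DIR__ . '/mw.wikibase.units.lua', $lib, []
		);
	}

	public function convertToBaseUnit( $amount, $unitItemId ) {
		// Here one would call Wikibase's existing conversion logic (the
		// code currently used only for RDF export), with the conversion
		// factors from the configuration.
		return [ (float)$amount, $unitItemId ]; // stub: no actual conversion
	}
}
```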
Tue, Mar 27
@Ladsgroup I agree that it's a bug and should be fixed. But if I understand correctly, it's a bug in git.
Mar 26 2018
I feel like filing an upstream bug. "fatal: protocol error: bad pack header" is NOT a good way to say "the branch you are tracking no longer exists on the remote repo"...
Ah, thank you!
Is this somehow a new thing? I wonder why I'm running into this for the first time now.
A fresh clone seems to fix the problem. So no UBN.
[14:49] <zhuyifei1999_> worked for me
[14:49] <zhuyifei1999_> https://www.irccloud.com/pastebin/U3V1obs5/
Bumping to high. May even be UBN. I'm trying a fresh clone now.
Mar 23 2018
Mar 22 2018
Mar 21 2018
@dbarratt Creating such a wrapper project indeed does not need an RFC. Proposing the structure defined by that wrapper for new 3rd party installations, or for wmf deployments, or for the standard development environment - that would need an RFC.
A stack trace would be helpful. What's the easiest way to get one these days?
Mar 20 2018
@Tgr You are right: tracking dependencies between resources needs to become more fine-grained. That's the idea behind https://www.mediawiki.org/wiki/User:Daniel_Kinzler_(WMDE)/DependencyEngine (which needs an update). The services team has an investigation of this in their annual plan. I don't think there's an epic for this on phabricator yet - perhaps I will add one.
@kchapman No resources; unlikely to move any time soon. The discussion and conclusions are still relevant. I suppose that means it can either sit in the backlog, or drop off the board. I'm fine with either.
Sure, I can drop the table in core (I will check to make sure it has no data) and you can take care of creating it again with the correct schema?
@Marostegui If that's fine with you, that's fine with me.
External stuff can be changed, at the cost of making a breaking change. But changing the storage is more painful, from the developer's point of view.
Mar 19 2018
Also, IMO there's no need for #2B.