When a null revision is produced, such as by page protection or by file re-upload, there is currently no LinksUpdate executed as a deferred update, nor is a RefreshLinks job queued.
### Problem 1: Uploads
This came up when @EBernhardson and I investigated an issue relating to CirrusSearch seeing stale data during its updates after a file re-upload. The hypothesis was that on re-upload, MediaWiki stores the new media file in Swift, extracts metadata from the temp file (`LocalFile::recordUpload3` and `::getMetadataForDb`), writes the new metadata to MySQL, and then queues a links update, which queues a Cirrus update, which reads stale metadata from MySQL. The suspected cause is that in some cases the database host used by the job runner is slightly behind and hasn't caught up yet.
> From `#wikimedia-perf`:
> <ebernhardson> […] while for most pages our cirrus LinksUpdate job comes from the LinksUpdateComplete hook, we have a one-off that attaches to UploadComplete and creates the same jobs (not sure why, aren't all uploads also new revisions?)
> <Krinkle> it's true that reuploads insert a revision, but not a content change. It's a "null" revision akin to page protection. I don't know off-hand whether those trigger the same LinksUpdate or not.
The CirrusSearch extension uses `onUploadComplete` to synchronously call `JobQueue->push`. Given that upload requests are relatively slow (in part due to cross-table row moves, as highlighted by RFC T28741, which has awaited resourcing since 2011), and given that the `push()` here is not a `lazyPush()`, the job may sometimes end up on a jobrunner before the web response's DB transaction has been committed.
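To illustrate the distinction, here is a minimal sketch of a hook handler (hypothetical handler and job names; the actual CirrusSearch code differs, but `JobQueueGroup::push()` and `::lazyPush()` are the real MediaWiki APIs):

```php
class Hooks {
	public static function onUploadComplete( UploadBase $uploadBase ) {
		// Hypothetical job for illustration only.
		$job = new CirrusSearchLinksUpdateJob( /* ... */ );

		// push() enqueues immediately, mid-request: a jobrunner may pick
		// the job up before this request's DB transaction has committed,
		// and thus read pre-upload metadata from a lagged replica.
		JobQueueGroup::singleton()->push( $job );

		// lazyPush() instead defers the enqueue until after the request's
		// DB transaction has committed (post-send), avoiding that race.
		JobQueueGroup::singleton()->lazyPush( $job );
	}
}
```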
If null revisions queued DerivedDataUpdater the same way that other revisions do, then this workaround would not need to exist, and Cirrus could instead fully rely on LinksUpdate.
### Problem 2: Page protection
The most obvious use of null revisions that needs to trigger LinksUpdate is page protection. To my surprise, this is actually broken at the moment.
I suspect this is a regression from the WikiPage/RevisionStore refactoring, where the queueing of LinksUpdate became limited to getDerivedDataUpdater(), which in turn only runs for "normal" edits.
Test case, given:
* parser cache is enabled. E.g. `$wgParserCacheType = CACHE_DB;`.
* debug logs are enabled, e.g. as DevelopmentSettings.php and mediawiki-docker do by default.
```
<nowiki>{{PROTECTIONLEVEL:edit}}</nowiki> = {{PROTECTIONLEVEL:edit}}
{{#ifeq:{{PROTECTIONLEVEL:edit}}|sysop| [[Sysop]] [[Category:Sysop]] }}
```
Save this on a `[[Sandbox]]` page. Open separate tabs for `Special:WhatLinksHere/Sysop` and `Category:Sysop`, confirm they are empty or otherwise don't include the Sandbox page.
As an admin, protect Sandbox to be editable by administrators only. You'll see that the subsequent page view correctly reflects the new protection level. However, mw-debug-www.log shows that no LinksUpdate was deferred and no RefreshLinks job was queued. And indeed, no amount of refreshing the WhatLinksHere and Category pages results in Sandbox appearing.
The reason the page view itself is unaffected is that ParserCache naturally varies by revision ID, so updating the page view didn't require a LinksUpdate.