After a new revision is saved, secondary data, like link tables, is updated asynchronously via the DeferredUpdates mechanism. Deferred updates are executed out-of-band, after the transaction that updated the primary data (revision meta-data, etc.) has completed. Depending on setup and situation, such updates may even be pushed to the job queue, and may not be processed for minutes.
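To illustrate the timing, here is a minimal sketch of the general pattern in Python (not MediaWiki's actual PHP implementation; all names are made up):

```python
# Illustrative sketch of the timing issue, not MediaWiki code: the primary
# write "commits" immediately, while secondary updates run out-of-band later.
import queue
import threading
import time

job_queue = queue.Queue()
categorylinks_updated = {}  # stand-in for secondary data (link tables)

def save_revision(page):
    # Primary data (revision row, recentchanges entry) is committed here ...
    print(f"revision for {page} committed")
    # ... while secondary updates are merely queued, like deferred updates
    # that get pushed to the job queue.
    job_queue.put(page)

def job_runner():
    while True:
        page = job_queue.get()
        time.sleep(5)  # backlog: in production this lag can be minutes
        categorylinks_updated[page] = True
        print(f"link tables for {page} updated")

threading.Thread(target=job_runner, daemon=True).start()
save_revision("Category:Example")
# Anything reading the link tables in this window sees stale data.
print(categorylinks_updated.get("Category:Example", False))  # False
time.sleep(6)
print(categorylinks_updated.get("Category:Example", False))  # True
```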
External tools that keep track of edits on the wiki, by polling recentchanges or using some kind of live feed, can get stale data because of this. E.g. a tool that wants to replicate the category graph would query the categorylinks table (either directly or via the API) whenever a category page was edited. But the categorylinks table may not have been updated yet, so the external graph cannot reliably be kept up to date. The Wikidata Query Service is affected by this regarding the page_links table; see T145712: Statement counts from pageprops do not match actual ones (wikibase:statements and wikibase:sitelinks).
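A simplified sketch of such a consumer, polling via the action API (real tools may instead read database replicas or a live feed), hits exactly this window:

```python
# Sketch of an external tool that replicates the category graph by polling
# recent changes. The categories it reads here may still be pre-edit data,
# because the categorylinks update for the revision may not have run yet.
import requests

API = "https://en.wikipedia.org/w/api.php"  # any MediaWiki API endpoint

def recent_category_edits(session):
    r = session.get(API, params={
        "action": "query", "list": "recentchanges",
        "rcnamespace": 14,  # category pages
        "rcprop": "title|timestamp", "format": "json",
    })
    return r.json()["query"]["recentchanges"]

def categories_of(session, title):
    # Reads the categorylinks-backed prop=categories; stale if the
    # deferred update for the triggering edit has not completed yet.
    r = session.get(API, params={
        "action": "query", "prop": "categories",
        "titles": title, "format": "json",
    })
    pages = r.json()["query"]["pages"]
    return [c["title"] for p in pages.values() for c in p.get("categories", [])]

with requests.Session() as s:
    for change in recent_category_edits(s):
        print(change["title"], categories_of(s, change["title"]))
```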
There should be a mechanism for external tools to be notified when an edit has been fully processed.
Possible solution: for each edit, store the number of pending update jobs in a new field of the recentchanges table. When an update job completes, it decrements that counter. Entries in the recentchanges table that have a non-zero count of pending updates can then be ignored when desired; see the sketch below.
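A rough sketch of the idea (the column name rc_pending_updates is hypothetical, and SQLite stands in for the real database):

```python
# Hypothetical sketch of the pending-updates counter; rc_pending_updates
# is an invented column name, no such field exists today.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE recentchanges (
    rc_id INTEGER PRIMARY KEY,
    rc_title TEXT,
    rc_pending_updates INTEGER NOT NULL DEFAULT 0
)""")

def record_edit(title, n_update_jobs):
    cur = db.execute(
        "INSERT INTO recentchanges (rc_title, rc_pending_updates) VALUES (?, ?)",
        (title, n_update_jobs),
    )
    return cur.lastrowid

def on_update_job_done(rc_id):
    # Each completing update job atomically decrements the counter.
    db.execute(
        "UPDATE recentchanges SET rc_pending_updates = rc_pending_updates - 1 "
        "WHERE rc_id = ?", (rc_id,),
    )

def fully_processed_changes():
    # External tools can filter to rows whose updates have all landed.
    return db.execute(
        "SELECT rc_title FROM recentchanges WHERE rc_pending_updates = 0"
    ).fetchall()

rc = record_edit("Category:Example", n_update_jobs=2)
print(fully_processed_changes())  # [] - updates still pending
on_update_job_done(rc)
on_update_job_done(rc)
print(fully_processed_changes())  # [('Category:Example',)]
```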
Alternatively, defer writing the recentchanges entry until all other updates for the edit have completed. This would require a guaranteed order of execution for jobs, though.
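A sketch of that barrier, again with invented names; the completion counter below assumes every job runs exactly once, which is precisely the delivery and ordering guarantee the job queue would have to provide:

```python
# Sketch of the alternative: publish the recentchanges row only once every
# other update for the edit has run. A simple completion counter acts as
# the barrier; jobs that are lost or retried would break this invariant.
import threading

class EditBarrier:
    def __init__(self, n_jobs, publish_rc_entry):
        self._remaining = n_jobs
        self._lock = threading.Lock()
        self._publish = publish_rc_entry

    def job_done(self):
        with self._lock:
            self._remaining -= 1
            if self._remaining == 0:
                self._publish()  # recentchanges entry appears last

barrier = EditBarrier(2, lambda: print("rc entry written"))
barrier.job_done()  # link table update finished
barrier.job_done()  # search index update finished -> rc entry written
```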