The way the logging was built into SpamBlacklist is overly complicated and broken. To maintain it, someone has to understand the interaction of several hooks with a state variable that may or may not have race conditions. As I'm completely unfamiliar with MediaWiki development, I'm not sure whether there's a simple solution, but the current one is unmaintainable by anyone other than the original author. If they refuse to maintain it, I think the best decision would be to scrap it and start over with a much simpler approach.
Question to anyone who's listening: it seems to me the complexity is mostly premature optimization, namely the desire to reuse the "links added" and "links removed" arrays that SpamBlacklist already computes as part of its filtering. Those could be costly to recompute in a separate extension, so I understand the motivation. But how do hooks actually work? Can't I just subscribe to a hook, do that processing there, and fire the logging event without slowing down the actual save? Does a hook handler have to finish before the edit can be saved? Is that what the job queue is for? Is there good documentation on this somewhere?
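From what I can tell from the hooks manual, handlers run synchronously inside the save request, so anything heavy either needs to go through DeferredUpdates (runs in the same process after the response is sent) or the job queue (runs on a job runner later). Here is a rough sketch of the simpler approach I have in mind; all the SpamLink* names are hypothetical, I'm assuming MediaWiki 1.35+ with its PageSaveComplete hook, and someone who actually knows the codebase should sanity-check it:

```php
// extension.json would register the handler roughly like:
//   "Hooks": { "PageSaveComplete": "SpamLinkLogHooks::onPageSaveComplete" }

use MediaWiki\Revision\RevisionRecord;
use MediaWiki\Storage\EditResult;
use MediaWiki\User\UserIdentity;

class SpamLinkLogHooks {

	/**
	 * PageSaveComplete fires after the new revision is committed, so
	 * nothing here can veto or delay the edit itself; the only concern
	 * is not adding latency to the request. You would pick ONE of the
	 * two options below, not both.
	 */
	public static function onPageSaveComplete(
		WikiPage $wikiPage,
		UserIdentity $user,
		string $summary,
		int $flags,
		RevisionRecord $revisionRecord,
		EditResult $editResult
	) {
		// Option 1: defer until after the HTTP response has been sent.
		// The editor never waits on this code.
		DeferredUpdates::addCallableUpdate( static function () use ( $revisionRecord ) {
			// Hypothetical helper that re-diffs external links between
			// this revision and its parent, then writes the log entry.
			SpamLinkLogger::logLinkChanges( $revisionRecord );
		} );

		// Option 2: push a job so a job runner does the work later.
		// 'spamLinkLog' would need a Job subclass registered in
		// extension.json under "JobClasses".
		JobQueueGroup::singleton()->push(
			new JobSpecification( 'spamLinkLog', [
				'revId' => $revisionRecord->getId(),
			], [], $wikiPage->getTitle() )
		);
	}
}
```

If that's roughly right, the expensive "links added"/"links removed" computation moves entirely out of the save path, and the only remaining question is whether re-diffing the links per edit is actually too costly in practice. The docs I found so far are https://www.mediawiki.org/wiki/Manual:Hooks and https://www.mediawiki.org/wiki/Manual:Job_queue; confirmation from someone who knows the stack would be appreciated.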