Looks like I put this under "request IRC meeting" by mistake last week. @Bawolff, do you think this would benefit from a public IRC meeting soon?
Retracted per @D3r1ck01's comment.
I suggest using variable suffixes instead of arrays: old_content_model_main, old_content_model_mediainfo, etc. Not a great example, but I suppose you see what I mean.
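Roughly like this (an illustrative sketch only; the variable names and values are made up):

```php
<?php
// Hypothetical illustration: instead of an array keyed by slot role, e.g.
//   $old_content_model = [ 'main' => 'wikitext', 'mediainfo' => 'wikibase-mediainfo' ];
// use one variable per slot role, with the role name as a suffix:
$old_content_model_main = 'wikitext';
$old_content_model_mediainfo = 'wikibase-mediainfo';
```

Each variable can then be documented and type-hinted individually, instead of hiding the structure behind array keys.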
Another thing I found: SpamBlacklist causes a re-parse on every edit. Reason:
Tue, Nov 13
Another note: it seems like the
$this->revision->getUser( RevisionRecord::RAW )->getName() !== $user->getName()
check works correctly for null edits.
Another outcome of this analysis: SpamBlacklist causes a re-parse on every edit. Reason:
The first thing I just found looking into this is this bit in CategoryMembershipChangeJob:
@Joe I think it would be excellent if you could take this on, thank you!
Mon, Nov 12
Sun, Nov 11
Fri, Nov 9
@Cparle this ticket here *is* about making sure all slots are passed to cirrus. Cirrus should then also pass them on via its own hooks. Changing a hook signature isn't trivial, though; it's generally better to introduce a new hook.
Oh, this is in MediaInfo. I suppose there is no way to do better here, then. But Cirrus needs a better hook.
@aaron if I understand correctly, the intention is not to permanently record when the page was watched. The intention is instead to initialize the "last viewed" timestamp to the time the page was watched. This makes sense semantically: assuming that a page added to the watchlist was never viewed makes little sense; assuming that the user was looking at that page when they added it to the watchlist makes a lot more sense. With respect to the functionality controlled by wl_notificationtimestamp this should make no difference: either way, no notifications will be shown for the time prior to the point at which the page was added to the watchlist.
@Imarlier you wrote this RFC as the outcome of a session that you took on at the last minute. I suppose it's not fair to expect you to see this RFC through the process. Reaching consensus on this is not going to be quick or easy. Are you interested in (and do you have the capacity for) taking this on? If not, we should designate someone else to drive this.
Thu, Nov 8
Moving this to the backlog of the RFC board, indicating that this ticket needs to be fleshed out more to make it a viable RFC.
Since WatchedItemStore already exists, re-purposing this ticket to just call for isWatched, addWatch and removeWatch to be deprecated.
@Joe the only distinction I see there is "stuff we write" vs "stuff we don't write", really.
Reading through the session notes and this proposal, several aspects come to mind that I believe would be worth mentioning. I'll try to give a quick brain dump:
- Distinction between public services (that clients use instead of, or in addition to, MediaWiki) and internal services (that are used by MediaWiki).
- Distinction between stateless and stateful services. MediaWiki uses several external services to maintain state (MySQL, memcached, redis, etc).
- Distinction between long-running tasks (transcoding), async notifications (event logging), and tasks to be performed synchronously during a request "while the user waits" (like the proposed new session storage service).
- Per the above, stateless services that perform synchronous in-request tasks tend to be pointless. The same functionality can generally be implemented inline, in core or as an extension, without the communication and operational overhead of a standalone service.
- Discussion of whether HTTP is the preferred method of communication between services (probably yes, at least for synchronous calls).
- The "chattiness with the API" criterion also works the other way around: if MediaWiki has to talk to the service a lot, that's an indicator that the functionality should probably not be in a separate process.
- Treating the need to be synchronous as a counter-indicator doesn't seem useful. Several prime candidates for external services would need synchronous call semantics, e.g. an authentication service.
- Functionality invoked via shell-out or implemented by forking should be treated separately, as it has very different characteristics.
- The statement that it's "very likely" that an extension would be used for integration is misleading, as it implies that functionality that is presently in core should not be factored out into an external service. But factoring core functionality out into external services should indeed be considered, especially for security-relevant functionality.
- Anything that needs to function in a shared hosting environment (LAMP) can't be an external service (or needs to have an alternative PHP implementation).
- Anything that needs to parse wikitext or needs access to MediaWiki i18n messages (which are wikitext) should probably not be an external service.
Accepted as an RFC per today's TechCom meeting. This is considered "under discussion" for now. When the discussion has settled down, move this to the "request IRC meeting" column on the TechCom board (or to the inbox column for review).
Wed, Nov 7
As far as I can tell, this is an error raised while handling another error (an ApiUsageException). Something somewhere is writing something that contains a newline (a stack trace, perhaps) into the field that is supposed to contain an error code.
wfRandom currently uses:
Tue, Nov 6
Quite a few "features" discussed in T206075: Wikimedia Technical Conference 2018 Session - Building MediaWiki to Support Collaboration align with "missing" concepts.
Session done, tickets filed.
Analysis: SpamBlacklist uses the EditFilterMergedContent hook, to which it binds the handler SpamBlacklistHooks::filterMergedContent.
In that handler, it calls WikiPage::prepareContentForEdit(), which returns an EditInfo instance with a ParserOutput object in the $output field. This object comes from DerivedPageDataUpdater::getPreparedEdit(), which sets the $output field to the combined ParserOutput for all slots, as returned by getCanonicalParserOutput(). SpamBlacklistHooks::filterMergedContent() then calls getExternalLinks() on that ParserOutput and applies the blacklist filter to the result.
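Condensed, the call chain described above looks roughly like this (a simplified pseudocode sketch of the flow in PHP syntax, not the literal extension code; the signature is abbreviated):

```php
// Registered for the EditFilterMergedContent hook.
public static function filterMergedContent( $context, $content, $status /* , ... */ ) {
    $wikiPage = $context->getWikiPage();

    // This triggers a parse of all slots on every edit (the expensive part):
    $editInfo = $wikiPage->prepareContentForEdit( $content );

    // $editInfo->output is the combined ParserOutput for all slots,
    // produced via DerivedPageDataUpdater::getCanonicalParserOutput().
    $links = array_keys( $editInfo->output->getExternalLinks() );

    // ...match $links against the blacklist, set $status on a hit...
}
```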
Mon, Nov 5
I created several tasks for implementing the things that were discussed during the session. They currently hang off the session as sub-tasks, but that doesn't make sense semantically (they don't block the session, the session is done). As soon as the new tag proposed in T207976: Create "MediaWiki-Decoupling" phabricator project exists, they should be tagged accordingly, and this ticket should be closed.