Description
We are creating an on-wiki registration system for campaigns in the Wikimedia movement (e.g. Wiki Loves Monuments, WikiGap, #1Lib1Ref). For this iteration, we are focusing on the organisers' experience of creating and editing registration, as well as the participants' experience of signing up for campaigns.
Please find the user flows here: for campaign organisers, and for campaign participants
Additional references:
Preview environment
- As a participant, to register for a sample event: https://meta.wikimedia.beta.wmflabs.org/wiki/Event:Cool_Event
- As an organiser, to enable registration for an existing event page: https://meta.wikimedia.beta.wmflabs.org/wiki/Special:EnableEventRegistration
- As an organiser, to see all your events: https://meta.wikimedia.beta.wmflabs.org/wiki/Special:MyEvents
Which code to review
The code repository is hosted on: https://gerrit.wikimedia.org/g/mediawiki/extensions/CampaignEvents
Performance assessment
Please initiate the performance assessment by answering the questions below:
What work has been done to ensure the best possible performance of the feature?
We tried to adhere to performance best practices, making use of the abstraction layers provided by MediaWiki. In particular:
- We evaluated each feature against edge-case scenarios (e.g., what happens if the number of X records is one million times the expected average), to ensure that it was built in a scalable way.
- We use pagination or infinite scrolling when presenting potentially unlimited lists of records.
- We made sure to use DeferredUpdates where applicable (right now the only example is the handling of PageMoveCompleteHook).
- We consulted the DBAs to make sure that our queries can use indexes and remain fast (T308738); the schema was tested with a large random dataset to confirm that reality matched expectations.
- We tried bundling client-side features into fewer modules, to avoid bloating the ResourceLoader registry.
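To illustrate the pagination point above, here is a minimal, self-contained sketch of index-friendly "keyset" pagination, the general technique behind paginating a potentially unlimited list. The record shape and function name are hypothetical, not the extension's actual schema or API; an in-memory array stands in for the database.

```php
<?php
/**
 * Return up to $limit records with id greater than $afterId, plus a
 * continuation id for the next page (or null when exhausted).
 * Against a real database this maps to:
 *   SELECT ... WHERE id > :afterId ORDER BY id LIMIT :limit + 1
 * which can use the primary-key index regardless of table size.
 * $records is assumed to be sorted by id.
 */
function getPage( array $records, int $afterId, int $limit ): array {
	$page = [];
	foreach ( $records as $record ) {
		if ( $record['id'] > $afterId ) {
			$page[] = $record;
			// Fetch one extra row to know whether another page exists.
			if ( count( $page ) > $limit ) {
				break;
			}
		}
	}
	$continue = null;
	if ( count( $page ) > $limit ) {
		array_pop( $page );
		$continue = end( $page )['id'];
	}
	return [ $page, $continue ];
}

// Usage: walk a 10-record dataset in pages of 4.
$records = array_map( fn ( $i ) => [ 'id' => $i ], range( 1, 10 ) );
[ $page, $continue ] = getPage( $records, 0, 4 );
// $page contains ids 1..4; $continue === 4
```

The advantage over OFFSET-based paging is that the cost of fetching page N does not grow with N, since each query seeks directly into the index.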
What are likely to be the weak areas (e.g. bottlenecks) of the code in terms of performance?
The extension will be configured to use a central database (on the x1 cluster) in Wikimedia production. I'm not sure whether this will be slower than just using the local wiki database.
We currently do not have many features, and consequently not many potential bottlenecks. I would say the main weak area right now is database reads, especially when many records may be scanned or retrieved at once (e.g. Special:MyEvents).
Also, the fact that our extension is "global" (i.e., it uses a central DB, works with central user accounts, and references pages cross-wiki) means that some operations can be slower than usual. For instance, Special:EventDetails has an input for filtering a list of users by name. Since we only store central user IDs, we have to run an unfiltered query on the whole table, then convert the IDs to names via CentralAuth and filter the list on the PHP side. Another example: we cannot preload event pages en masse on Special:MyEvents, because LinkBatch does not support cross-wiki page references (so we only use it for local pages). Looking up a central user account also happens in many places throughout the codebase.
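The "filter on the PHP side" pattern described above can be sketched as follows. This is an illustrative stand-in, not the extension's code: `lookupNames()` is a hypothetical batch resolver playing the role of the CentralAuth lookup, and the data is hard-coded.

```php
<?php
// Hypothetical batch resolution of central user IDs to names; a real
// implementation would query CentralAuth. Returns a map of ID => name.
function lookupNames( array $centralIds ): array {
	static $db = [ 1 => 'Alice', 2 => 'Bob', 3 => 'Alicia' ];
	return array_intersect_key( $db, array_flip( $centralIds ) );
}

/**
 * Given all participant central IDs of an event, return the names matching
 * the given prefix. Note the cost highlighted above: the full ID list must
 * be fetched and resolved before any filtering can happen, so the work is
 * proportional to the total number of participants, not to the result size.
 */
function filterParticipantsByName( array $centralIds, string $prefix ): array {
	$names = lookupNames( $centralIds );
	return array_values( array_filter(
		$names,
		fn ( string $name ) => stripos( $name, $prefix ) === 0
	) );
}

// filterParticipantsByName( [ 1, 2, 3 ], 'Ali' ) → [ 'Alice', 'Alicia' ]
```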
Are there potential optimisations that haven't been performed yet?
One thing that we haven't thought about yet is caching DB records. We currently use neither in-object caching nor the object cache. We haven't looked into which places would benefit from caching, nor what the appropriate cache type would be for each of them.
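As a minimal sketch of the in-object caching option, the snippet below memoises an expensive lookup inside the service object, so repeated calls within one request hit the database only once. The class, method names, and fake DB fetch are hypothetical, purely to show the shape of the pattern; cross-request caching would instead go through MediaWiki's object cache layer.

```php
<?php
class EventLookup {
	/** @var array<int,array> In-process cache, keyed by event ID */
	private array $cache = [];
	private int $dbHits = 0;

	public function getEvent( int $id ): array {
		// Serve from the in-object cache when possible.
		if ( !isset( $this->cache[$id] ) ) {
			$this->cache[$id] = $this->fetchFromDb( $id );
		}
		return $this->cache[$id];
	}

	private function fetchFromDb( int $id ): array {
		// Stand-in for a real database read; counts hits for illustration.
		$this->dbHits++;
		return [ 'id' => $id, 'name' => "Event $id" ];
	}

	public function getDbHits(): int {
		return $this->dbHits;
	}
}

$lookup = new EventLookup();
$lookup->getEvent( 7 );
$lookup->getEvent( 7 ); // second call served from cache, no extra DB hit
```

The trade-off is staleness: an in-object cache is only safe for data that cannot change mid-request, which is part of what we'd need to evaluate per call site.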
Also, while not an optimisation per se, we do not have any performance measurement in place. We could use some help in determining which operations should be measured.
Please list which performance measurements are in place for the feature and/or what you've measured ad-hoc so far. If you are unsure what to measure, ask the Performance Team for advice: performance-team@wikimedia.org.
As explained above, no measurements are in place. The only thing we have measured, informally, is the impact of adding the proposed indexes; this was tested on a random dataset, see T308738.
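Since no instrumentation exists yet, a first ad-hoc step could look like the sketch below: wrap a critical section and record its wall-clock time. The helper name is hypothetical; in Wikimedia production the elapsed time would typically be reported to a metrics backend rather than kept locally, and that part is omitted here.

```php
<?php
/**
 * Run $section and return its result together with the elapsed wall-clock
 * time in milliseconds. Illustrative only: a real setup would aggregate
 * these timings in a metrics backend instead of returning them.
 */
function timeSection( callable $section ): array {
	$start = microtime( true );
	$result = $section();
	$elapsedMs = ( microtime( true ) - $start ) * 1000;
	return [ $result, $elapsedMs ];
}

// Usage: time a stand-in workload.
[ $result, $ms ] = timeSection( fn () => array_sum( range( 1, 1000 ) ) );
// $result === 500500; $ms is the wall-clock duration of the closure
```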