
Pre-load VE code into memory to improve performance and speed
Open, Normal, Public

Description

It is possible for us to query the local storage cache (mw.loader.store) to determine if VE has been loaded before by the client.

If it has, then about 300k out of 500k of the modules (compressed sizes) will be in local storage. The remaining modules (ext.visualEditor.core & oojs-ui-widgets) will end up in the browser cache (after the second load).

If this is the case, then we can take about 200ms off VE load time by loading all these modules into memory in the background after the page has loaded.

If we think this is too aggressive, we could consider loading them when the 'edit' button is hovered, or some other heuristic (e.g. user has edited in the past 24 hours).
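For illustration, a minimal sketch of the kind of check described above, assuming the MediaWiki client-side environment and that mw.loader.store.get() returns the cached source string (or false) as in current mediawiki.js; the module list is an illustrative subset, not the full VE dependency tree:

```
// Sketch: estimate how much of VE is already in the localStorage module
// store, to decide whether background preloading is likely to pay off.
var veModules = [ 'ext.visualEditor.core', 'oojs-ui-widgets' ]; // illustrative subset
var cached = veModules.filter( function ( name ) {
	// mw.loader.store.get() returns the stored source string, or false
	// if the module is not in the local store.
	return !!( mw.loader.store && mw.loader.store.get( name ) );
} );
// If everything is already local, a preload is mostly evaluation cost;
// otherwise it also involves a network fetch into the browser cache.
var fullyCached = cached.length === veModules.length;
```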

Event Timeline

Restricted Application added a subscriber: Aklapper. · View Herald Transcript · Aug 24 2018, 11:18 AM
Deskana triaged this task as Normal priority. · Aug 28 2018, 6:44 PM
Deskana added a subscriber: Deskana.

There are always tradeoffs: preloading makes VE work faster for people who do use it but makes things slower for those who don't, while doing nothing does the opposite. Needs some discussion!

Deskana renamed this task from Consider pre-loading VE code into memory to Consider pre-loading VE code into memory to improve performance and speed. · Aug 28 2018, 6:44 PM

Related: T176262: Consider adding mw.loader.preload / OutputPage::preloadModules()

Currently, preloading is difficult to get right in a way that won't slow down page load time and won't cause a noticeable interruption from synchronous JS execution while the user might be doing other things.

Here is my conservative checklist for getting it right without needing further sign-off from the Performance Team, and without being blocked on T176262 (see the sketch after the list):

  • Use a timeout that waits until at least 2 seconds after the page has finished loading. Or, as Ed suggested, don't preload by default at all, but only on an early interaction (e.g. hovering before clicking). If doing this, make sure it won't start listening for the hover until after window.onload (if the user clicks before then, it'd load without preload).
  • Make the preload call using mw.loader.load(), from a mw.requestIdleCallback().
  • Make sure that the modules being loaded in this way (and all their dependencies) do not perform DOM computations on arrival, e.g. they should only register classes, functions and objects. Specifically, no function calls, selector matches, event handlers, or other DOM manipulation; just downloading of source code. Other initialisation should not be part of the module's initialisation during preload, but should rather happen on-demand during the actual load.
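A minimal sketch of what this checklist could look like in code (the module names, the 2-second delay, and the #ca-ve-edit selector are illustrative assumptions; mw.loader.load() and mw.requestIdleCallback() are the APIs named above):

```
// Sketch: arm the preload only after window.onload, then trigger it either
// after a fixed delay or on an early hover over the edit tab, and hand the
// actual work to an idle callback so it never competes with user input.
function preloadVeModules() {
	mw.requestIdleCallback( function () {
		// Download (and cache) only; per the checklist, the modules must not
		// do any DOM work on arrival.
		mw.loader.load( [ 'ext.visualEditor.core', 'oojs-ui-widgets' ] );
	} );
}

window.addEventListener( 'load', function () {
	var done = false;
	function once() {
		if ( !done ) {
			done = true;
			preloadVeModules();
		}
	}
	// Heuristic A: wait at least 2 seconds after the page has finished loading.
	setTimeout( once, 2000 );
	// Heuristic B: an early interaction, e.g. hovering the edit tab
	// (#ca-ve-edit is an assumption; the listener is only attached after load).
	var editTab = document.getElementById( 'ca-ve-edit' );
	if ( editTab ) {
		editTab.addEventListener( 'mouseover', once, { once: true } );
	}
} );
```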

What about loading CSS files? Do we need to make sure they don't trigger any layout changes (they shouldn't anyway)?

@Esanders Yep, preloading CSS files is fine. I'd recommend the same approach for CSS as for JS. (Deferred, as per the above).

Having said that, we have fairly limited experience preloading large amounts of CSS; it's mostly been JS so far. I expect CSS to be notably more expensive than loading JS code, which can (if done well) be cheap to load at any time (e.g. when it only defines additional variables with literal values).

It's good for UX when CSS preloading doesn't cause a visual change in layout, but that unfortunately does not make it a cheap operation for the browser. Once something is rendered on the screen, we want to avoid the user's browser being frozen for any noticeable amount of time (e.g. unable to respond to scroll, typing, touch, or hover/click).

I hope that, with proper idle-guarding, CSS preloading can be acceptable. Certainly worth trying given the benefits!

Worst case scenario: we might find the CSS payload too big to parse and apply (gracefully) in a single step while the user is reading (e.g. freezing the browser momentarily). If that happens, we should preload less of the VE CSS payload, compromising from "preload everything" to "preload enough to be a win, but not so much that it disrupts the reading experience".

Note that once we solve T176262, we could (in theory) safely download a stylesheet of any size without risk of freezing and without needing to compromise. This would be possible because that task enables us to separate the download step from the unpack+parse step. The download step is what takes the longest from the user's perspective (and we would still do it, for all VE CSS); the unpack+parse step is what can freeze the browser momentarily, and we'd only do that once the interaction starts.
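Until that exists, one conceptual way to approximate a download-only step is a plain resource prefetch, which warms the HTTP cache without parsing or executing anything. This is a hypothetical illustration only: the URL construction is simplified, and a real implementation would have to match the exact module/version/skin/language parameters that ResourceLoader itself later requests so the cached response is actually reused:

```
// Hypothetical sketch: ask the browser to download a load.php bundle at low
// priority, without evaluating it. The unpack+parse cost is only paid later,
// when ResourceLoader actually requests and executes the modules.
var url = mw.config.get( 'wgScriptPath' ) + '/load.php?' + new URLSearchParams( {
	modules: 'ext.visualEditor.core',
	only: 'scripts',
	skin: mw.config.get( 'skin' ),
	lang: mw.config.get( 'wgUserLanguage' )
} ).toString();
var link = document.createElement( 'link' );
link.rel = 'prefetch'; // download only, do not execute
link.href = url;
document.head.appendChild( link );
```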

JTannerWMF moved this task from Q4 to Current work on the VisualEditor board. · Oct 3 2018, 4:59 PM
JTannerWMF edited projects, added VisualEditor (Current work); removed VisualEditor.
marcella claimed this task. · Oct 3 2018, 5:00 PM

Responding to a question that someone (I had the etherpad active and so couldn't see who was speaking) asked today during SoS: We don't have any current plans to work on T176262 -- it's in our backlog, but isn't something that we've chosen to allocate time to working on to this point.

Does the response from @Krinkle on Sept 5 give you enough to go on without us prioritizing that work? Or are you blocked until we're able to get to it?

> We don't have any current plans to work on T176262 -- it's in our backlog, but isn't something that we've chosen to allocate time to working on to this point.
> Does the response from @Krinkle on Sept 5 give you enough to go on without us prioritizing that work? Or are you blocked until we're able to get to it?

We aren't blocked on this and should be able to move forward with our own solution. Thanks for letting us know!

Following on from our offline discussion, I've profiled the mw.loader.using( VEModules ) step. It looks like the majority of the time is spent evaluating the code:

There's about 220ms of total time: 105ms is in "Evaluate script", and 65ms in "domEval" (most of which is appendChild for the script tags), whereas only 50ms is spent in doPropagation/execute.
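For reference, a minimal way to reproduce this kind of overall timing (VEModules is the same placeholder list as above; the finer "Evaluate script" / domEval breakdown comes from the browser's performance profiler):

```
// Rough end-to-end timing of the mw.loader.using() step.
var t0 = performance.now();
mw.loader.using( VEModules ).then( function () {
	mw.log( 'mw.loader.using took ' + Math.round( performance.now() - t0 ) + 'ms' );
} );
```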

The good part is that ResourceLoader has separate eval and execute steps so it should be possible to create a new eval-only API where modules are taken out of the local store as strings, and eval'd into the moduleRegistry. This avoids the issue around resolving dependencies, and the modules would not be executed until they are actually used.

Proposal:

  • Implement an mw.loader.evalFromStore API
  • Given a specified module (list), build a dependency list
  • Flag each module as being in an 'eval' state in the registry
  • Iterate over that list, and call domEval on each item that exists in the store. Ignore anything that isn't cached.
  • In mw.loader.implement, prevent setAndPropagate( name, 'loaded' ) from running on a module in the 'eval' state.

Someone with a better understanding of startup/mediawiki.js may know a better way to do this.
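As a very rough illustration of the proposal, written as if it lived inside mediawiki.js (evalFromStore is the proposed API; resolve(), registry, domEval() and setAndPropagate() are internal names taken from the proposal and from ResourceLoader's client code, and the real implementation details may well differ):

```
// Hypothetical: evaluate cached module source into the registry without
// executing it, so a later mw.loader.using() only pays the execute and
// propagation cost.
mw.loader.evalFromStore = function ( modules ) {
	// Expand the requested modules into their full dependency list.
	var list = resolve( modules );
	list.forEach( function ( name ) {
		var source = mw.loader.store.get( name );
		// Ignore anything that isn't in the local store.
		if ( !source ) {
			return;
		}
		// Flag the module so that mw.loader.implement() knows not to call
		// setAndPropagate( name, 'loaded' ) when this source runs.
		registry[ name ].state = 'eval';
		domEval( source );
	} );
};
```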

Krinkle added a comment. · Edited · Oct 17 2018, 3:59 PM

@Esanders That seems like a good way to implement the feature in ResourceLoader.

I rewrote this comment a few times, as I initially thought your proposal was effectively what I wrote at the aforementioned T176262 (albeit with a slightly different name), but I now see the difference.

T176262 outlines a preload feature for the downloading of VE source code.

  • Benefits users that haven't used this week's version of VisualEditor yet, as well as users that did but whose device/browser emptied the cache for whatever reason (particularly common on mobile).
  • Fairly big gains, especially on mobile, where networks are slower and caches are more aggressively emptied by the device/browser. It depends on whether the user's network is able to download the code faster than the device can execute it. I guess on high-speed networks it'll pay off more to save 150ms of evaluation (compared to e.g. 130ms of network RTT). On slower networks, it might pay off more to save 400ms to 1.5s of network time (especially on mobile connections that may require reactivating the radio first).
  • The download happens off the main thread, which means the viewport remains unfrozen.

Your proposal is for running only the two steps that T176262 wouldn't: the cache lookup and the evaluation of the implementation string.

  • This would benefit secondary edit sessions, where the user has already used the currently deployed version of the code and it remains in the cache.
  • Fairly big gains, especially on mobile, where processing speeds are lower.
  • Happens on the main thread, which means blocking the viewport for 65ms (on desktop, I presume, so more on mobile). Our aim for background tasks is "up to 50ms". I'd say 65ms would be pushing it, but it might be acceptable if it only happens when the user is hovering the Edit button (not just in general when the page is being viewed), and if it's usually less than or around 65ms.

I'd estimate both are equally difficult to implement and both benefit distinct but important use cases. I think it makes sense for ResourceLoader to provide both of these eventually - implemented in such a way that both can be used, but in separate background tasks so that there is breathing space in between for the viewport to render and handle input events.
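For example, the two halves could be scheduled as separate idle tasks, leaving room for rendering and input handling in between (veModules is a placeholder; neither API exists yet, and the names simply follow the proposals discussed above):

```
// Hypothetical scheduling of the two proposed halves as separate idle tasks.
mw.requestIdleCallback( function () {
	// Download only (proposed in T176262); happens off the main thread.
	mw.loader.preload( veModules );
} );
// ...later, e.g. once the user hovers the edit tab:
mw.requestIdleCallback( function () {
	// Main-thread work: pull cached source out of the store and evaluate it
	// (the eval-only API proposed above), still without executing the modules.
	mw.loader.evalFromStore( veModules );
} );
```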

I'd support either of these with CR as needed, and might be able to chip in afterwards to add "the other" half based on it.

marcella renamed this task from Consider pre-loading VE code into memory to improve performance and speed to Pre-load VE code into memory to improve performance and speed. · Nov 7 2018, 7:27 PM