## Conclusion
Notes: https://etherpad.wikimedia.org/p/WikiDev16-T114542
Feedback at the summit suggested that server-side compositing was preferred over a single-page application (SPA), as server-side compositing is less risky, and that lower-hanging fruit should be tackled first. {T113066} has been updated accordingly; it is now a Q3 FY2015-2016 Reading Web goal to pursue those lower-hanging fruit using conventional MediaWiki PHP and JavaScript.
## Deck
https://commons.wikimedia.org/wiki/File:Paradigm.pdf
## Definition of the problem: Can we speed up both the end user and developer experience?
Today, if you're on a slow connection, chances are you'll have trouble loading large Wikimedia pages.
And if you're a developer looking to rapidly build an API-driven application or new feature, then once you figure out how to call the APIs, you'll probably need to do a lot of extra parsing and mashup work on the client, and then figure out how to unburden your client from network contention.
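To make that mashup burden concrete, here is a minimal, hypothetical sketch of what a client today ends up writing by hand: building URLs for two separate services (RESTBase Parsoid HTML and the action API), then reshaping and stitching the responses together. The helper names (`buildUrls`, `mergePage`) are invented for illustration; the endpoint shapes are based on the public MediaWiki action API and the Wikimedia REST API.

```javascript
// Hypothetical sketch of the client-side composition work described above.
// `buildUrls` and `mergePage` are illustrative names, not real library calls.

function buildUrls(domain, title) {
  const t = encodeURIComponent(title.replace(/ /g, '_'));
  return {
    // Parsoid HTML for the article body, served via RESTBase:
    html: `https://${domain}/api/rest_v1/page/html/${t}`,
    // Page metadata (e.g., latest revision) from the action API:
    meta: `https://${domain}/w/api.php?action=query&titles=${t}` +
          '&prop=info&format=json&formatversion=2',
  };
}

// Even after both fetches succeed, the client still has to reshape each
// response and stitch them into one view model:
function mergePage(html, metaResponse) {
  const page = metaResponse.query.pages[0];
  return { title: page.title, lastRevision: page.lastrevid, html };
}
```

Every API-driven feature repeats some variant of this glue, which is the kind of composition a middleware layer could absorb.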
Can we do better, much better, for users on slow connections?
And can we unburden feature developers from API composition problems, while letting people who love middleware focus on these problems instead?
## Background information and R&D
In WMF fiscal year 2015-2016 quarter 2 (October - December), Reading engineers will prototype a [[ https://www.mediawiki.org/wiki/Reading/Web/Projects/A_frontend_powered_by_Parsoid | Parsoid-backed HTML5 web application ]] that [[ https://phabricator.wikimedia.org/T113066 | dramatically reduces first paint time ]] for users on slow connections (e.g., 2G).
## Expected outcome at the summit: Here's what we know. Proceed?
This is a two-hour session to share what we learned from this R&D, discuss the viability of the approach across multiple form factors and client (and server!) platforms, and inform a number of summit proposals on the table.
## More related tasks: Hey, that's my idea!
The likely approach looks like an [[ https://phabricator.wikimedia.org/T111588 | API-driven web frontend ]] with two-step loading (as in the Android app) of [[ https://phabricator.wikimedia.org/T55784 | Parsoid HTML ]] mediated by RESTBase (Node.js), inter-server api.php calls, and [[ https://phabricator.wikimedia.org/T106099 | ServiceWorker ]] route interception. The fallback for low/no-JavaScript clients might entail loading the first X bytes, with a user clickthrough to fetch the entire article with very low-frills styling.
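As a rough illustration of the ServiceWorker route-interception idea, the sketch below maps an article navigation (assumed here to follow the `/wiki/<Title>` pattern) onto the RESTBase Parsoid HTML endpoint. The route matcher is plain logic; the fetch handler only indicates where shell-plus-content composition would happen, and none of this is the actual Q2 prototype code.

```javascript
// Hedged sketch: intercept article routes in a ServiceWorker and serve
// Parsoid HTML from RESTBase. The `/wiki/<Title>` route is an assumption.

// Map an intercepted navigation URL to the RESTBase Parsoid HTML endpoint,
// or return null for non-article routes (which pass through untouched).
function parsoidUrlFor(requestUrl) {
  const url = new URL(requestUrl);
  const match = url.pathname.match(/^\/wiki\/(.+)$/);
  if (!match) return null;
  return `${url.origin}/api/rest_v1/page/html/${match[1]}`;
}

// Registration only makes sense in a ServiceWorker global scope,
// so guard it when this file is loaded elsewhere (e.g., under Node.js).
if (typeof self !== 'undefined' && typeof window === 'undefined') {
  self.addEventListener('fetch', (event) => {
    const target = parsoidUrlFor(event.request.url);
    if (target) {
      // In the real approach this is where the cached page shell would be
      // composed with Parsoid article HTML fetched from RESTBase.
      event.respondWith(fetch(target));
    }
  });
}
```

Keeping the matcher a pure function makes the interception rules easy to test outside the worker, while the two-step shell/content split is what enables the fast first paint.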
To keep it simple, the goal is largely to replicate the existing mobile web experience with this new, generalizable architectural approach. If this approach is successful, the plan for Q3 would be to move it aggressively to the mobile web beta channel and begin the beta-feature process for the desktop web, so that gadget, special page, and other interested maintainers can see how it works.
## I know, I know
There are several known challenge areas to explore in the Q2 work, and we should discuss these further at the summit:
- VE forward compatibility and pre-fetched Parsoid for rapid bootstrapping
- Gadgets
- Idiosyncratic Special: pages
- ResourceLoader...resource loading
- New MediaWiki API endpoints to support this approach
- Packaging for third parties
- No-frills null skin for third parties who can't run Node.js but who can benefit from basic mobile-first, multiple-form-factor CSS
- Wikipedia Zero support (is already compatible, but needs to be considered)
- Search engine findability
- Analytics
- CentralNotice
## Current status
Follow these tasks to track the current R&D progress:
- T111588 API-driven web front-end
- T113066 Barack Obama article on English Wikipedia on 2G connection should fully load in under 15s (down from 50s)