We need to track several bits of data per *title*, independently of the revisions associated with that title:
- page deletion and other protection information
- efficient and (ideally) ordered listings of all titles
- the title's rename history (so that we can reconstruct linear histories), possibly in the form of a `renamed_from` field
- the MediaWiki page_id and other bits from [the page table](https://github.com/wikimedia/mediawiki/blob/master/maintenance/tables.sql#L223-L279)
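As a rough sketch, this per-title metadata could live in a dedicated table. All table and column names below are hypothetical, assuming a Cassandra backend as used by RESTBase:

```sql
-- Hypothetical per-title metadata table; names are illustrative only.
-- Partitioning by domain and clustering by title gives an ordered
-- listing of all titles within a domain; renamed_from supports
-- reconstructing linear rename histories, and page_id mirrors the
-- corresponding field from MediaWiki's page table.
CREATE TABLE page_meta (
    domain text,            -- e.g. 'en.wikipedia.org'
    title text,             -- normalized page title
    page_id bigint,         -- MediaWiki page_id
    deleted boolean,        -- page deletion status
    restrictions set<text>, -- protection information
    renamed_from text,      -- previous title, if the page was renamed
    PRIMARY KEY (domain, title)
) WITH CLUSTERING ORDER BY (title ASC);
```

With this layout, an ordered title listing is a range scan over a single domain partition; whether that partition grows too wide for large wikis is an open design question.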
We should start supporting this properly in RESTBase, probably exposing the information at [/page/title/{title}](http://rest.wikimedia.org/en.wikipedia.org/v1/?doc#!/Page_content/page_title__title__get). This also gives us a natural resource path for page-related events like page creation or deletion.
Logically, we can then check whether a page has been deleted on each revision access. This would bring the number of queries per revision request to three for old revisions (one additional query to check for revision deletion), and two for new revisions. An extra revision metadata request currently roughly halves throughput, so there is a big advantage in retrieving all protection information in a single request. We might be able to avoid the extra page metadata request by also storing page deletion information in a static column in each key_rev_value bucket. The static column is shared between all revisions of a given domain & title, so it only needs to be updated once per content type on page deletion. Static columns can be added & removed with a schema upgrade.
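A minimal sketch of the static-column approach, assuming a key_rev_value bucket keyed by domain & title with revisions as clustering columns (the exact bucket schema here is illustrative, not RESTBase's actual one):

```sql
-- Illustrative key_rev_value-style bucket. A static column is shared
-- by every row (revision) in a (domain, title) partition, so a single
-- write on page deletion covers all revisions of that title.
CREATE TABLE data_bucket (
    domain text,
    title text,
    rev int,
    tid timeuuid,
    value blob,
    page_deleted boolean static,  -- shared across all revisions
    PRIMARY KEY ((domain, title), rev, tid)
);

-- Adding (or dropping) the static column later is a schema upgrade:
-- ALTER TABLE data_bucket ADD page_deleted boolean static;

-- One write marks the page deleted for every revision of the title:
UPDATE data_bucket SET page_deleted = true
  WHERE domain = 'en.wikipedia.org' AND title = 'Example';
```

Since each content type lives in its own bucket, the deletion flag would still need one such write per bucket, matching the once-per-content-type update described above.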