Once usage tracking is enabled on clients, and some page content makes use of "random" entities, we should benchmark the database performance of the tracking code. This could be done on the beta cluster.
We should especially benchmark two kinds of distributions: a single item used on many (>1000) pages, and a single page using many (>1000) items. Actions to benchmark are:
1) editing, blanking/reverting, and deleting/restoring a page that uses many items
2) editing and deleting items that are used on many pages
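A minimal sketch of how such a benchmark might be driven, assuming the pages exercise usage tracking via the Wikibase `{{#property:…|from=Q…}}` parser function (the exact syntax and the `n_items`/`repeats` parameters here are illustrative, not prescribed by this task). The actual edit/delete/restore actions would be supplied as callables that hit the beta cluster's API:

```python
import time
import statistics

def make_page_text(n_items: int) -> str:
    """Build wikitext that uses n_items distinct entities, to simulate
    a single page using many (>1000) items. The parser-function syntax
    is an assumption; adjust to whatever triggers usage tracking."""
    return "\n".join(
        "{{#property:P1|from=Q%d}}" % i for i in range(1, n_items + 1)
    )

def bench(action, repeats: int = 5) -> float:
    """Run a page/item action several times and return the median
    wall-clock duration in seconds."""
    samples = []
    for _ in range(repeats):
        start = time.monotonic()
        action()  # e.g. an API call that edits, blanks, or deletes
        samples.append(time.monotonic() - start)
    return statistics.median(samples)
```

For case 1 above, `action` would save `make_page_text(1000)` to a page, then blank, revert, delete, and restore it; for case 2, it would edit or delete one item referenced from >1000 such pages.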
Sean Pringle and Aaron Schulz should be kept in the loop on this.
**Whiteboard**: u=dev c=infrastructure p=0