[[ https://en.wikipedia.org/wiki/User:Community_Tech_bot/Popular_pages | Popular pages bot ]] takes about a full month to process all the WikiProjects. The community has continually come back to us asking why it takes so long, and/or why it didn't create a report at all. The fundamental issue is that it's just too slow.
Currently the implementation loops through each page in a WikiProject, one by one, and only fetches pageviews for the target page plus its redirects. This usually amounts to maybe 5–10 pages per iteration, so we're only making a handful of asynchronous requests to the pageviews API at a time when we're allowed to make up to 100.
**Proposed solutions**
* Use the database to get the list of pages per WikiProject, and also the redirects for each page. We're currently doing this through the MediaWiki API which is much slower.
* Queue up 100 or so pages and get pageviews for all of them at once, rather than in small batches of just the target article + its redirects.
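For the first bullet, a rough sketch of the replica queries, assuming the standard MediaWiki schema (`page`, `redirect`, `page_assessments`, and `page_assessments_projects` tables); the project and article names below are placeholders:

```sql
-- Articles tagged by a given WikiProject, via the page_assessments tables.
SELECT page_title
FROM page
JOIN page_assessments ON pa_page_id = page_id
JOIN page_assessments_projects ON pap_id = pa_project_id
WHERE pap_project_title = 'Medicine';

-- Redirects pointing at a given article, via the redirect table
-- (rd_namespace/rd_title describe the redirect's target; rd_from is
-- the page_id of the redirect page itself).
SELECT p.page_title
FROM redirect
JOIN page AS p ON p.page_id = rd_from
WHERE rd_namespace = 0
  AND rd_title = 'Target_article';
```

One round trip per WikiProject (or one per article for redirects) against the replicas should be far cheaper than paging through the MediaWiki API.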
If done correctly, we should see a significant performance improvement. I estimate the runtime will be cut at least in half, and probably by much more.
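The batching in the second bullet could look something like this — a minimal Python sketch, where `fetch_pageviews` is a hypothetical stub standing in for an HTTP GET against the real per-article pageviews REST endpoint:

```python
import concurrent.futures

# Real Wikimedia REST endpoint shape; it accepts one article per request,
# so "batching" here means keeping up to 100 requests in flight at once.
PAGEVIEWS_ENDPOINT = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "{project}/all-access/user/{article}/monthly/{start}/{end}"
)

def fetch_pageviews(article):
    """Hypothetical single-article fetch. A real implementation would GET
    PAGEVIEWS_ENDPOINT for this article and sum the monthly view counts."""
    return (article, 0)  # stub result for illustration

def fetch_all(articles, concurrency=100):
    """Fetch pageviews for every article, with up to `concurrency`
    requests in flight, instead of a handful per target page."""
    results = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        for article, views in pool.map(fetch_pageviews, articles):
            results[article] = views
    return results
```

The key change is that the work queue is the whole WikiProject's page list (articles and redirects together), so the pool stays saturated at the API's concurrency limit rather than draining after each target article's 5–10 pages.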