The next problem is that once the database has grown too big, generating the full set of cursors becomes slow (since the query has to jump through the whole dataset). So the next idea is: instead of generating all the cursors in the main CGI handler, the main handler generates only the first few page links and then enqueues a task queue entry to continue from there. By the time the user has read the first page and clicks on "next page", the task will have generated a few more entries. Since the set of pagination cursors lives in memcache, you can even use it on your first page if you are using AJAX (a guess, not that sure).
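A minimal sketch of that hand-off, assuming App Engine's Python runtime with webapp2 and the legacy ndb library, might look like the following. The model `Record`, the memcache key `page_cursors`, the task URL `/tasks/more_cursors`, and the page counts are all hypothetical names for illustration; the point is only the split between the eager work in the main handler and the background work in the task.

```python
# Sketch only: eager cursor generation in the main handler,
# with a task queue entry continuing the work in the background.
from google.appengine.api import memcache, taskqueue
from google.appengine.ext import ndb
import webapp2

PAGE_SIZE = 20
EAGER_PAGES = 5   # pages the main handler prepares up front


class Record(ndb.Model):  # hypothetical model
    created = ndb.DateTimeProperty(auto_now_add=True)


def extend_cursors(cursors, num_pages):
    """Append up to num_pages more page cursors and re-cache the list."""
    start = ndb.Cursor(urlsafe=cursors[-1]) if cursors else None
    query = Record.query().order(Record.created)
    for _ in range(num_pages):
        _, cursor, more = query.fetch_page(PAGE_SIZE, start_cursor=start)
        if not more:
            break
        cursors.append(cursor.urlsafe())
        start = cursor
    memcache.set('page_cursors', cursors)
    return cursors


class MainPage(webapp2.RequestHandler):
    def get(self):
        # Prepare only the first few page links, then hand off the rest.
        cursors = extend_cursors([], EAGER_PAGES)
        taskqueue.add(url='/tasks/more_cursors')
        # ... render page 1 plus links built from `cursors` ...


class MoreCursors(webapp2.RequestHandler):
    def post(self):
        # Runs in the background while the user reads the first page.
        cursors = memcache.get('page_cursors') or []
        extend_cursors(cursors, 50)


app = webapp2.WSGIApplication([
    ('/', MainPage),
    ('/tasks/more_cursors', MoreCursors),
])
```

The task handler just reads whatever list is already cached and extends it, so a "next page" request only has to index into the memcached list by page number rather than walk the dataset itself.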