Memcache is a high-performance, distributed memory object caching system, providing fast access to any cached data such as results of datastore queries.
The Memcache Pattern
Memcache is typically used with the following pattern:
- The application receives a query from the user or from another part of the application.
- The application checks whether the data needed to satisfy that query is in memcache.
- If the data is in memcache, the application uses that data.
- If the data is not in memcache, the application queries the datastore and stores the results in memcache for future requests.
The pseudocode below represents a typical memcache request:
```python
def get_data():
    # Check memcache first.
    data = memcache.get('key')
    if data is not None:
        return data
    else:
        # Cache miss: query the datastore, then cache the
        # result for 60 seconds.
        data = query_for_data()
        memcache.add('key', data, 60)
        return data
```
ndb internally uses memcache to speed up queries. However, if you wish, you can also add explicit memcache calls to gain more control over the speed-ups.
Modifying guestbook.py to use Memcache
The guestbook application in the Getting Started Guide queries the datastore on every request (via ndb, so it already gains some of the memcache speed-ups). You can modify the Guestbook application to use memcache explicitly before resorting to querying the datastore.
First we'll import the memcache module and create the method that checks memcache before running a query.
Next we'll separate out the querying and creation of the HTML for the page. When we don't hit the cache, we'll call this method to query the datastore and build the HTML string that we'll store in memcache.
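The rendering helper might look like the sketch below: it turns the query results into the HTML string that gets cached, so a cache hit skips both the query and the string building. The `render_greetings` name and the `(author, content)` tuple shape are assumptions; the original Python 2 guestbook escapes with `cgi.escape`, while this sketch uses the equivalent `html.escape`.

```python
import html

def render_greetings(greetings):
    # Build the HTML fragment once so the rendered string itself
    # can be cached, not just the raw query results.
    parts = []
    for author, content in greetings:
        if author:
            parts.append('<b>%s</b> wrote:' % html.escape(author))
        else:
            parts.append('An anonymous person wrote:')
        parts.append('<blockquote>%s</blockquote>' % html.escape(content))
    return ''.join(parts)
```

Escaping user-supplied author and content strings before caching them keeps the cached HTML safe to serve directly.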
Finally, we will update the MainPage handler to call the get_greetings() method and display some stats about the number of times the cache was hit or missed.
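The hit/miss bookkeeping can be sketched as below. On App Engine, `memcache.get_stats()` returns a dict that includes `'hits'` and `'misses'` counts; here a local counter dict stands in for it so the sketch is self-contained, and `get_page` is an illustrative name for what the MainPage handler's get() would do.

```python
# Stand-in cache plus counters shaped like the 'hits' and 'misses'
# keys of memcache.get_stats() on App Engine.
_cache = {}
_stats = {'hits': 0, 'misses': 0}

def cache_get(key):
    # Count every lookup as a hit or a miss.
    if key in _cache:
        _stats['hits'] += 1
        return _cache[key]
    _stats['misses'] += 1
    return None

def get_page():
    # Serve the greetings from cache when possible, then append
    # the cache statistics to the rendered page.
    greetings = cache_get('greetings')
    if greetings is None:
        greetings = '<p>fresh from the datastore</p>'
        _cache['greetings'] = greetings
    footer = '<hr>Cache hits: %(hits)d, misses: %(misses)d' % _stats
    return greetings + footer
```

The first request records a miss and fills the cache; subsequent requests record hits, so the displayed stats show the cache doing its job.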