Memcache Java API Overview

High-performance, scalable web applications often use a distributed in-memory data cache in front of, or in place of, robust persistent storage for some tasks. App Engine includes a memory cache service for this purpose.

  1. Caching data with the Low-Level API
  2. Caching data with JCache
  3. When to use a memory cache
  4. Safely handling concurrent memcache updates
  5. How cached data expires
  6. Sharing memcache between different programming languages
  7. Statistics
  8. Limits
  9. Configuring memcache

Caching data with the Low-Level API

The Low-Level API provides MemcacheService and AsyncMemcacheService for accessing the memcache service. This API is richer than the one provided by JCache. For more details, see Low-Level API.

// ...
    String key = ...; // key identifying the cached value
    byte[] value;

    // Using the synchronous cache
    MemcacheService syncCache = MemcacheServiceFactory.getMemcacheService();
    value = (byte[]) syncCache.get(key); // read from cache
    if (value == null) {
      // get value from other source
      // ........

      syncCache.put(key, value); // populate cache
    }

    // Using the asynchronous cache
    AsyncMemcacheService asyncCache = MemcacheServiceFactory.getAsyncMemcacheService();
    Future<Object> futureValue = asyncCache.get(key); // read from cache
    // ... do other work in parallel to cache retrieval
    value = (byte[]) futureValue.get();
    if (value == null) {
      // get value from other source
      // ........

      // asynchronously populate the cache
      // Returns a Future<Void> which can be used to block until completion
      asyncCache.put(key, value);
    }

Caching data with JCache

The App Engine Java SDK supports the JCache API. JCache provides a map-like interface to cached data. You store and retrieve values in the cache using keys. Keys and values can be of any Serializable type or class. For more details, see Using JCache.

When to use a memory cache

One use of a memory cache is to speed up common datastore queries. If many requests make the same query with the same parameters, and changes to the results do not need to appear on the web site right away, the app can cache the results in the memcache. Subsequent requests can check the memcache, and only perform the datastore query if the results are absent or expired. Session data, user preferences, and any other queries performed on most pages of a site are good candidates for caching.

Memcache may be useful for other temporary values. However, when considering whether to store a value solely in the memcache and not backed by other persistent storage, be sure that your application behaves acceptably when the value is suddenly not available. Values can expire from the memcache at any time, and may be expired prior to the expiration deadline set for the value. For example, if the sudden absence of a user's session data would cause the session to malfunction, that data should probably be stored in the datastore in addition to the memcache.
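The check-the-cache-then-fall-back pattern described above can be sketched in stand-alone Java. Here a ConcurrentHashMap stands in for memcache, and loadFromDatastore is a hypothetical loader, not an App Engine API:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CacheAside {
    // Stand-in for memcache; real code would use MemcacheService.
    private static final Map<String, String> cache = new ConcurrentHashMap<>();

    // Hypothetical slow data source (e.g. a datastore query).
    static String loadFromDatastore(String key) {
        return "value-for-" + key;
    }

    // Check the cache first; on a miss, load and populate the cache.
    static String get(String key) {
        String value = cache.get(key);
        if (value == null) {
            value = loadFromDatastore(key);
            cache.put(key, value);
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(get("greeting"));               // misses, loads, caches
        System.out.println(cache.containsKey("greeting")); // now cached
    }
}
```

Because cached values can vanish at any time, the fallback path must always be able to reconstruct the value.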

Safely handling concurrent memcache updates

The putIfUntouched and getIdentifiable methods of the memcache service provide a way to safely update a key-value pair when multiple concurrently handled requests need to update the same memcache key atomically. Without them, such scenarios are prone to race conditions.
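The read-modify-write cycle these methods enable is a compare-and-set loop. A stand-alone sketch of the same idea using java.util.concurrent.atomic (the App Engine calls themselves require the SDK, so an AtomicLong stands in for the memcache slot):

```java
import java.util.concurrent.atomic.AtomicLong;

public class CasUpdate {
    // Stand-in for one memcache slot; in the real API, IdentifiableValue
    // remembers the version of the value that was read.
    private static final AtomicLong slot = new AtomicLong(0);

    // Retry until the update applies against an untouched value, mirroring
    // getIdentifiable (read) followed by putIfUntouched (conditional write).
    static long incrementSafely() {
        while (true) {
            long seen = slot.get();                  // like getIdentifiable
            long updated = seen + 1;
            if (slot.compareAndSet(seen, updated)) { // like putIfUntouched
                return updated;
            }
            // Another request touched the value first; re-read and retry.
        }
    }

    public static void main(String[] args) {
        System.out.println(incrementSafely());
        System.out.println(incrementSafely());
    }
}
```

The retry loop is essential: a failed conditional write means another request won the race, so the value must be re-read before trying again.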

How cached data expires

By default, values stored in memcache are retained as long as possible. Values may be evicted from the cache when a new value is added to the cache if the cache is low on memory. When values are evicted due to memory pressure, the least recently used values are evicted first.

The app can provide an expiration time when a value is stored, as either a number of seconds relative to when the value is added, or as an absolute Unix epoch time in the future (a number of seconds from midnight January 1, 1970). The value will be evicted no later than this time, though it may be evicted for other reasons.
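The two ways of expressing an expiration time are interchangeable; a minimal stand-alone sketch of the conversion from a relative offset to an absolute epoch time (the helper name is illustrative, not part of the API):

```java
public class ExpirationMath {
    // Convert a relative expiration (seconds from now) into the equivalent
    // absolute Unix epoch time in seconds, given the current time in millis.
    static long absoluteEpochSeconds(long nowMillis, long relativeSeconds) {
        return nowMillis / 1000L + relativeSeconds;
    }

    public static void main(String[] args) {
        // Evict no later than 10 minutes from now.
        long expireAt = absoluteEpochSeconds(System.currentTimeMillis(), 600);
        System.out.println("Expire at epoch second: " + expireAt);
    }
}
```

Either form is only an upper bound: the value may still be evicted earlier for other reasons.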

Under rare circumstances, values may also disappear from the cache prior to expiration for reasons other than memory pressure. While memcache is resilient to server failures, memcache values are not saved to disk, so a service failure may cause values to become unavailable.

In general, an application should not expect a cached value to always be available.

You can erase an application's entire cache via the API or via the Admin Console (under Memcache Viewer).

Sharing memcache between different programming languages

An App Engine app can be factored into one or more modules and versions. Sometimes it is convenient to write modules and versions in different programming languages. You can share the data in your memcache between any of your app's modules and versions. Because the memcache API serializes its parameters, and the API may be implemented differently in different languages, you need to code memcache keys and values carefully if you intend to share them between languages.

Key Compatibility

To ensure language-independence, memcache keys should be bytes:

  • In Python use plain strings (not Unicode strings)
  • In Java use byte arrays (not strings)
  • In Go use byte arrays
  • In PHP use strings

Remember that memcache keys cannot be longer than 250 bytes, and they cannot contain null bytes.
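These key rules can be checked mechanically on the Java side. A small stand-alone validator (the 250-byte limit and null-byte restriction are as stated above; the helper name is illustrative):

```java
import java.nio.charset.StandardCharsets;

public class KeyCheck {
    // Encode a key as UTF-8 bytes and enforce the documented limits:
    // at most 250 bytes, and no null bytes.
    static byte[] toKeyBytes(String key) {
        byte[] bytes = key.getBytes(StandardCharsets.UTF_8);
        if (bytes.length > 250) {
            throw new IllegalArgumentException("key longer than 250 bytes");
        }
        for (byte b : bytes) {
            if (b == 0) {
                throw new IllegalArgumentException("key contains a null byte");
            }
        }
        return bytes;
    }

    public static void main(String[] args) {
        System.out.println(toKeyBytes("who").length); // 3 bytes for an ASCII key
    }
}
```

Using the resulting byte array, rather than the String itself, as the memcache key keeps it compatible with the other runtimes.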

Value Compatibility

For memcache values that can be written and read in all languages, follow these guidelines:

  • Byte-arrays and ASCII strings can be safely passed between languages.
  • Unicode strings are compatible, but you must encode and decode them properly in Go and PHP.
  • Be careful using integers: they are safe for increment/decrement operations, but Go does not directly support integers and PHP cannot handle 64-bit integers.
  • Avoid using floating point values and complex types like lists, maps, structs, and classes, because each language serializes them in a different way. If you need to use types like these we recommend that you implement your own language-independent serialization that uses a format such as JSON or protocol buffers.
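For example, a Unicode string can be made language-independent by storing it as UTF-8 bytes and decoding on read. A minimal sketch of that round trip in Java (the helper names are illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class ValueEncoding {
    // Store Unicode text as a byte array so every runtime sees the same bytes.
    static byte[] encode(String text) {
        return text.getBytes(StandardCharsets.UTF_8);
    }

    // Decode the bytes read back from the cache.
    static String decode(byte[] bytes) {
        return new String(bytes, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] stored = encode("héllo");    // what would go into memcache
        System.out.println(decode(stored)); // round-trips intact
        System.out.println(Arrays.equals(stored, encode(decode(stored))));
    }
}
```

The same approach generalizes to complex types: serialize to a language-neutral byte format such as JSON or protocol buffers before caching, and store only the bytes.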


The example code below operates on two memcache items in Python, Java, Go, and PHP. It reads and writes an item with the key “who” and increments an item with the key “count”. If you create a single app with separate modules using these four code snippets, you will see that the values set or incremented in one language will be read by the other languages.


Python:

self.response.headers['Content-Type'] = 'text/plain'

who = memcache.get('who')
self.response.write('Previously incremented by %s\n' % who)
memcache.set('who', 'Python')

count = memcache.incr('count', 1, initial_value=0)
self.response.write('Count incremented by Python = %s\n' % count)



Java:

byte[] whoKey = "who".getBytes();
byte[] countKey = "count".getBytes();

byte[] who = (byte[]) memcache.get(whoKey);
String whoString = who == null ? "nobody" : new String(who);
resp.getWriter().print("Previously incremented by " + whoString + "\n");
memcache.put(whoKey, "Java".getBytes());

Long count = memcache.increment(countKey, 1L, 0L);
resp.getWriter().print("Count incremented by Java = " + count + "\n");


Go:

w.Header().Set("Content-Type", "text/plain")

whoItem, err := memcache.Get(c, "who")
var who = "nobody"
if err == nil {
  who = string(whoItem.Value)
}
fmt.Fprintf(w, "Previously incremented by %s\n", who)
memcache.Set(c, &memcache.Item{
  Key:   "who",
  Value: []byte("Go"),
})

count, _ := memcache.Increment(c, "count", 1, 0)
fmt.Fprintf(w, "Count incremented by Go = %d\n", count)


PHP:

header('Content-Type: text/plain');

$who = $memcache->get('who');
echo 'Previously incremented by ' . $who . "\n";
$memcache->set('who', 'PHP');

$count = $memcache->increment('count', 1, 0);
echo 'Count incremented by PHP = ' .  $count . "\n";


Statistics

Memcache maintains statistics about the amount of data cached for an application, the cache hit rate, and the age of cache items. Note that the age of an item is reset every time it is used, whether read or written. You can view these statistics using the API or in the Administration Console, under Memcache Viewer.


Limits

The following limits apply to the use of the memcache service:

  • The maximum size of a cached data value is 1 MB minus the size of the key minus an implementation-dependent overhead of approximately 96 bytes.
  • A key cannot be larger than 250 bytes. In the Java runtime, keys that are objects or strings longer than 250 bytes will be hashed. (Other runtimes behave differently.)
  • The "multi" batch operations can have any number of elements. The total size of the call and the total size of the data fetched must not exceed 32 megabytes.
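The first limit implies a maximum payload for a given key. A stand-alone sketch of the arithmetic, taking 1 MB as 2^20 bytes and using the approximate 96-byte overhead stated above:

```java
public class SizeLimit {
    static final int ONE_MB = 1024 * 1024; // assuming 1 MB means 2^20 bytes
    static final int OVERHEAD = 96;        // approximate, per the limit above

    // Largest value, in bytes, that fits for a key of the given length.
    static int maxValueBytes(int keyLengthBytes) {
        return ONE_MB - keyLengthBytes - OVERHEAD;
    }

    public static void main(String[] args) {
        // For a 3-byte key such as "who":
        System.out.println(maxValueBytes(3));
    }
}
```

Values near this bound are risky to cache; the exact overhead is implementation-dependent, so leave headroom.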

Configuring memcache

The memcache service provides best-effort cache space by default. Apps with billing enabled may opt to use dedicated memcache, which provides a fixed cache size assigned exclusively to your app. The service is configured via the memcache settings in the Admin Console.