[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-09-04。"],[[["\u003cp\u003eThis guide explains how to configure and monitor the memcache service through the Google Cloud console, including switching between Shared and Dedicated service levels.\u003c/p\u003e\n"],["\u003cp\u003eYou can cache and retrieve values using the memcache Python API methods like \u003ccode\u003eadd()\u003c/code\u003e for caching and \u003ccode\u003eget()\u003c/code\u003e for retrieval, with the option to set expiration times.\u003c/p\u003e\n"],["\u003cp\u003eThe memcache monitoring in the Google Cloud console offers insights into the service level, hit ratio, cache items, item age, total size, and dedicated memcache allows to see "Hot keys".\u003c/p\u003e\n"],["\u003cp\u003eConcurrent write requests to the same memcache key can be handled using the compare-and-set (CAS) feature, implemented with a retry loop using \u003ccode\u003egets()\u003c/code\u003e and \u003ccode\u003ecas()\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eThis API is supported for first-generation runtimes and can be used when upgrading to the corresponding second-generation runtimes.\u003c/p\u003e\n"]]],[],null,["# Using Memcache\n\nThis page describes how to configure and monitor the memcache service for your\napplication using the Google Cloud console. It also describes how to use the App Engine memcache Python\nAPI to set and retrieve cached values and use the compare-and-set feature to\nhandle concurrent write requests to the same memcache\nkey. To learn more about memcache,\nread the [Memcache Overview](/appengine/docs/legacy/standard/python/memcache).\n| This API is supported for first-generation runtimes and can be used when [upgrading to corresponding second-generation runtimes](/appengine/docs/standard/\n| python3\n|\n| /services/access). If you are updating to the App Engine Python 3 runtime, refer to the [migration guide](/appengine/migration-center/standard/migrate-to-second-gen/python-differences) to learn about your migration options for legacy bundled services.\n\nConfiguring memcache\n--------------------\n\n1. Go to the Memcache page in the Google Cloud console. \n [Go to the Memcache page](https://console.cloud.google.com/appengine/memcache)\n2. Select the memcache service level you want to use:\n\n - **Shared** (default) - free and provides cache capacity on a best-effort basis.\n - **Dedicated** - billed by the GB-hour of cache size and provides a fixed cache capacity assigned exclusively to your application.\n\n Learn more about available service classes in [Memcache Overview](#service_levels).\n\nCaching and retrieving values\n-----------------------------\n\n### Caching a value\n\nUse `add()` to add a key's value if and only if it doesn't already exist, with\nan optional expiration time: \n\n memcache.add(key=\"[KEY]\", value=\"[VALUE]\", time=[EXPIRATION_TIME])\n\nFor example, to add the value `raining` to the key `weather_USA_98105` with an\nexpiration time of one hour from when the value is written: \n\n memcache.add(key=\"weather_USA_98105\", value=\"raining\", time=3600)\n\nLearn more about `add()` and other methods for setting values in the [memcache\nPython API\ndocumentation](/appengine/docs/legacy/standard/python/refdocs/google.appengine.api.memcache).\n\nSee other examples of caching values in [Memcache Examples](/appengine/docs/legacy/standard/python/memcache/examples).\n\n### Looking up cached values\n\nUse `get()` to look up the value of a single key: \n\n memcache.get(key=\"[KEY]\")\n\nFor example, to get the value of the key `weather_USA_98105`: \n\n memcache.get(key=\"weather_USA_98105\")\n\nLearn more about `get()` and other methods for looking up values in the\n[memcache Python API\ndocumentation](/appengine/docs/legacy/standard/python/refdocs/google.appengine.api.memcache).\n\nMonitoring memcache in the Google Cloud console\n-----------------------------------------------\n\n1. Go to the Memcache page in the Google Cloud console. \n [Go to the Memcache page](https://console.cloud.google.com/appengine/memcache) \n2. Look at the following reports:\n - **Memcache service level** : Shows if your application is using the Shared or Dedicated service level. If you are an owner of the project, you can switch between the two. Learn more about the [service levels](./#service_levels).\n - **Hit ratio**: Shows the percentage of data requests that were served from the cache, as well as the raw number of data requests that were served from the cache.\n - **Items in the cache**.\n - **Oldest item age**: The age of the oldest cached item. Note that the age of an item is reset every time it is used, either read or written.\n - **Total cache size**.\n3. You can take any of the following actions:\n\n - **New key**: Add a new key to the cache.\n - **Find a key**: Retrieve an existing key.\n - **Flush cache**: Remove all the key-value pairs from the cache.\n4. (Dedicated memcache only) Look through the list of **Hot keys**.\n\n - \"Hot keys\" are keys that receive more than 100 queries per second (QPS) in the memcache.\n - This list includes up to 100 hot keys, sorted by highest QPS.\n\nHandling concurrent writes\n--------------------------\n\nTo use the compare and set feature to handle writes from multiple requests to\nthe same memcache key:\n\n1. Instantiate a memcache `Client` object.\n2. Use a retry loop (preferably with a limit on the number of retries and using exponential backoff)\n 1. Within the retry loop, get the key using `gets()` or `get_multi()` with the `for_cas` parameter set to `True`.\n 2. Within the retry loop, update the key value using `cas()` or `cas_multi()`.\n\nThe following snippet shows one way to use the compare and set feature: \n\n def bump_counter(key):\n client = memcache.Client()\n while True: # Retry loop\n counter = client.gets(key)\n if counter is None: raise KeyError('Uninitialized counter')\n if client.cas(key, counter+1):\n break\n\nThe retry loop is necessary because without the loop this code doesn't actually\navoid race conditions, it just detects them! The memcache service guarantees\nthat when used in the pattern shown here (that is, using `gets()` and `cas()`),\nif two (or more) different client instances happen to be involved in a race\ncondition, only the first one to execute the `cas()` operation succeeds (return\n`True`), while the second one (and subsequent ones) fails (return `False`).\n\nAnother refinement you should add to this sample code is to set a limit on the\nnumber of retries, to avoid an infinite loop in worst-case scenarios where there\nis a lot of contention for the same counter. An example of when such contention\ncould occur is if there are more requests trying to update the counter than the\nmemcache service can process in real time.\n\nLearn more about compare and set in the [Memcache\nOverview](/appengine/docs/legacy/standard/python/memcache#compare_and_set).\n\nWhat's next\n-----------\n\n- Learn more about memcache in the [Memcache Overview](/appengine/docs/legacy/standard/python/memcache).\n- See code examples of using memcache in Python in [Memcache\n Examples](/appengine/docs/legacy/standard/python/memcache/examples)."]]