Fully managed in-memory Valkey, Redis*, and Memcached service that offers sub-millisecond data access, scalability, and high availability for a wide range of applications.
100% compatible with open source Valkey, Redis Cluster, Redis, and Memcached
Migrate your caching layer to the cloud with zero code changes
High availability, up to a 99.99% SLA
Benefits
Focus on building great apps
Memorystore automates complex tasks for open source Valkey, Redis Cluster, Redis, and Memcached, such as enabling high availability, failover, patching, and monitoring, so you can spend more time building your applications.
Simplified scaling
Memorystore for Valkey and Redis Cluster scale without downtime to support up to 250 nodes, terabytes of keyspace, and up to 60x more throughput than Memorystore for Redis, all with microsecond latencies.
Highly available
Memorystore for Valkey and Redis Cluster offer zero-downtime scaling, replicas automatically distributed across availability zones, and automated failover. Memorystore for Redis Cluster also offers a 99.99% SLA.
Key features
Choose from the most popular open source caching engines to build your applications. Memorystore supports Valkey, Redis Cluster, Redis, and Memcached and is fully protocol compatible. Choose the right engine that fits your cost and availability requirements.
Memorystore for Valkey and Memorystore for Redis Cluster are available with Private Service Connect (PSC) to simplify management and to offer secure, private, and granular connectivity with minimal IP consumption. All services are integrated with Cloud Monitoring and more; Memorystore is built on the best of Google Cloud.
Use Memorystore for Redis as an ultra-low latency data store for your generative AI applications. Approximate nearest neighbor (ANN) vector search (in preview) delivers fast, approximate results, which is ideal for large datasets where a close match is sufficient. Exact nearest neighbor (KNN) vector search (in preview) delivers accurate results, although it may take more time to process.
Provisioning, replication, failover, and patching are all automated, which drastically reduces the time you spend on DevOps.
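The ANN versus KNN trade-off described above can be sketched in plain Python. This is an illustration of the search semantics only, not the Memorystore API: exact (KNN) search scores every stored vector, which is what ANN indexes approximate to save time on large datasets.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def knn_search(query, vectors, k=1):
    # Exact (KNN) search scores every stored vector, so results are
    # precise but cost grows linearly with the dataset; ANN indexes
    # trade a little accuracy for skipping most of these comparisons.
    scored = sorted(
        ((cosine_similarity(query, vec), key) for key, vec in vectors.items()),
        reverse=True,
    )
    return [key for _, key in scored[:k]]

# Toy corpus of embedding vectors keyed like Redis hash keys.
docs = {
    "doc:1": [0.9, 0.1, 0.0],
    "doc:2": [0.0, 1.0, 0.0],
    "doc:3": [0.7, 0.7, 0.1],
}
print(knn_search([1.0, 0.0, 0.0], docs, k=2))  # ['doc:1', 'doc:3']
```

In Memorystore, the same choice is made per index: an exact index guarantees this ranking, while an ANN index returns a close approximation of it much faster at scale.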
Customers
What's new
Sign up for Google Cloud newsletters to receive product updates, event information, special offers, and more.
Documentation
Read about the benefits, use cases, and features of Memorystore for Valkey. The overview also provides key details about the service.
Read about the benefits, use cases, and features of Memorystore for Redis Cluster. The overview also provides key details about the service.
Read about the benefits, use cases, and features of the Memorystore for Redis standalone service. The overview also provides key details about the service.
This page introduces the Memorystore for Memcached service, including use cases, key concepts, and the advantages of using Memcached.
Learn how to create a Memorystore for Redis instance, connect to the instance, set a value, retrieve a value, and delete the instance.
Learn how to access Redis instances from Compute Engine, GKE clusters, Cloud Functions, the App Engine flexible environment, and the App Engine standard environment.
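Because Memorystore is wire-compatible, the connection flow in these guides boils down to opening a TCP socket to the instance's private IP and speaking RESP, the protocol shared by Valkey and Redis. A minimal stdlib-only sketch (the host is a placeholder for your instance's private IP; in practice you would use a client library such as redis-py):

```python
import socket

def encode_resp(*parts):
    # RESP frames a command as an array of bulk strings:
    # *<count>\r\n, then $<len>\r\n<data>\r\n for each argument.
    chunks = [b"*%d\r\n" % len(parts)]
    for part in parts:
        data = part if isinstance(part, bytes) else str(part).encode()
        chunks.append(b"$%d\r\n%s\r\n" % (len(data), data))
    return b"".join(chunks)

def set_and_get(host, port=6379):
    # Requires network reachability to a live instance; 'host' would be
    # the private IP shown in the console or by gcloud.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(encode_resp("SET", "greeting", "hello"))
        sock.recv(1024)                      # expect +OK\r\n
        sock.sendall(encode_resp("GET", "greeting"))
        return sock.recv(1024)               # expect $5\r\nhello\r\n
```

Any open source client library speaks this same protocol, which is why applications connect to Memorystore without code changes.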
All features
Choice of engines | Choose from the most popular open source caching engines to build your applications. Memorystore supports Valkey, Redis Cluster, Redis, and Memcached and is fully protocol compatible. Choose the right engine that fits your cost and availability requirements. |
Connectivity | Memorystore for Valkey and Memorystore for Redis Cluster are available with Private Service Connect (PSC) to simplify management and to offer secure, private, and granular connectivity with minimal IP consumption. Memorystore for Redis and Memcached support Private Service Access (PSA) and Direct Peering to offer connectivity using private IP. |
LangChain integration | Easily build gen AI applications that are more accurate, transparent, and reliable with LangChain integration. Memorystore for Redis has three LangChain integrations—Document loader for loading and storing information from documents, Vector stores for enabling semantic search, and Chat Messages Memory for enabling chains to recall previous conversations. Visit the GitHub repository to learn more. |
Vector search | Use Memorystore for Redis as an ultra-low latency data store for your generative AI applications. Approximate nearest neighbor (ANN) vector search (in preview) delivers fast, approximate results, which is ideal for large datasets where a close match is sufficient. Exact nearest neighbor (KNN) vector search (in preview) delivers accurate results, although it may take more time to process. |
Fully managed | Provisioning, replication, failover, and patching are all automated, which drastically reduces the time you spend on DevOps. |
Persistence | Achieve a near-zero Recovery Point Objective (RPO) through continuous write logging, or set up periodic RDB snapshots, ensuring heightened resiliency against zonal failures. |
Security | Memorystore is protected from the internet using VPC networks and private IP and comes with IAM integration—all designed to protect your data. Systems are monitored 24/7/365, ensuring your applications and data are protected. Memorystore for Redis provides in-transit encryption and Redis AUTH to further secure your sensitive data. |
Highly scalable | Memorystore for Valkey and Memorystore for Redis Cluster provide zero-downtime scaling up to 250 nodes, terabytes of keyspace, flexible node sizes from 1.4 GB to 58 GB, 10 TB+ per instance, and up to 60x more throughput with microsecond latencies. |
Monitoring | Monitor your instance and set up custom alerts with Cloud Monitoring. You can also integrate with OpenCensus to get more insights to client-side metrics. |
Highly available | Memorystore for Redis Cluster offers a 99.99% SLA with automatic failover. Shards are automatically distributed across zones for maximum availability. Standard tier Memorystore for Redis instances provide a 99.9% availability SLA with automatic failover to ensure that your instance is highly available. You also get the same availability SLA for Memcached instances. |
Migration | Memorystore is compatible with the open source protocols, which makes it easy to switch your applications with no code changes. You can use the RIOT tool to seamlessly migrate existing Redis deployments to Memorystore for Valkey or Memorystore for Redis Cluster. |
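The zero-code-change migration claim rests on the cache-aside pattern most applications already use: the client library is the only integration point, so repointing it at a Memorystore endpoint changes nothing for callers. A minimal sketch, with an in-memory dict standing in for the client (a real Redis/Valkey client exposes the same get/set shape):

```python
class DictCache:
    # In-memory stand-in exposing the same get/set shape as a
    # Redis/Valkey client (e.g. redis-py's Redis.get / Redis.set).
    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value

def fetch_product(cache, db, product_id):
    # Cache-aside read: try the cache first, fall back to the
    # authoritative store on a miss, then populate the cache.
    key = f"product:{product_id}"
    value = cache.get(key)
    if value is None:
        value = db[product_id]   # slow path (database lookup)
        cache.set(key, value)
    return value

cache, db = DictCache(), {"42": "widget"}
print(fetch_product(cache, db, "42"))  # miss: loaded from db
print(fetch_product(cache, db, "42"))  # hit: served from cache
```

Swapping `DictCache` for a client pointed at a Memorystore instance leaves `fetch_product` and every caller untouched.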
Pricing
Memorystore offers various sizes to fit any budget. Pricing varies with settings, including how much capacity, how many replicas, and which region you provision. Memorystore also offers per-second billing, and instances are easy to start and stop.
View Memorystore for Redis Cluster pricing
*Redis is a trademark of Redis Ltd. All rights therein are reserved to Redis Ltd. Any use by Google is for referential purposes only and does not indicate any sponsorship, endorsement or affiliation between Redis and Google. Memorystore is based on and is compatible with open-source Redis versions 7.2 and earlier and supports a subset of the total Redis command library.
Start building on Google Cloud with $300 in free credits and 20+ always free products.