What is Valkey?

Valkey is an in-memory key-value datastore which can be used for a variety of application-building needs, including caching, message queues, and session stores. You can also use it as a primary in-memory database, with data stored in RAM for fast read and write speeds.

As a NoSQL database, Valkey has a more flexible data model than a relational database, which stores data in strict rows and columns.

Valkey uses the simple key-value method for data storage, where a value is mapped to a unique identifier—the key. This enables incredibly fast lookups when retrieving data directly by its key. It also offers the flexibility to accommodate a range of data types, including strings, sets, lists, hashes, and streams.
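
To make the key-value model concrete, here is a minimal Python sketch using the redis-py client, which also works against Valkey because Valkey speaks the same wire protocol. The localhost address, port, and key names are illustrative assumptions, not part of any particular deployment.

  # Minimal sketch of the core data types; assumes a Valkey server on localhost:6379.
  import redis

  r = redis.Redis(host="localhost", port=6379, decode_responses=True)

  # String: map a unique key to a single value
  r.set("user:1001:name", "Ada")
  print(r.get("user:1001:name"))           # "Ada"

  # Hash: field-value pairs stored under one key
  r.hset("user:1001", mapping={"name": "Ada", "plan": "pro"})
  print(r.hgetall("user:1001"))

  # List: ordered collection, usable as a simple queue
  r.rpush("tasks", "resize-image", "send-email")
  print(r.lrange("tasks", 0, -1))

  # Set: unordered collection of unique members
  r.sadd("tags:post:42", "valkey", "caching")
  print(r.smembers("tags:post:42"))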

Want to jump straight to setting up a Valkey cluster? Learn how to get started on Memorystore.

Key takeaways

Valkey is an open source key-value datastore. It’s known for extremely low latency, making it ideal for caching, message queues, and applications that require real-time data. Here’s a quick overview of Valkey’s key features:

  • In-memory datastore: Primarily stores data in RAM for lightning-fast read and write speeds.
  • Key-value method: Stores data in key-value pairs, offering flexibility for a range of data types and quick lookups.
  • Redis alternative: Valkey evolved from the same codebase as Redis but remains open source.


Valkey FAQs

What data types does Valkey support?

Valkey can store a range of data types, including strings, hashes, lists, sets, and sorted sets. It also supports more advanced data types like streams, geospatial indexes, bitmaps, and vectors, making it a versatile tool for a wide range of applications, including AI agents.
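
As a rough illustration of two of the more advanced types, the following Python sketch (again using the Redis-compatible redis-py client against an assumed local server) appends entries to a stream and flips bits in a bitmap; the key names and values are made up for illustration.

  import redis

  r = redis.Redis(host="localhost", port=6379, decode_responses=True)

  # Stream: an append-only log of field-value entries
  r.xadd("clicks", {"page": "/pricing", "user": "1001"})
  print(r.xrange("clicks", "-", "+"))      # all entries recorded so far

  # Bitmap: set and test individual bits under one key
  r.setbit("daily_active:2024-01-15", 1001, 1)
  print(r.getbit("daily_active:2024-01-15", 1001))   # 1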


Why use an in-memory database like Valkey?

The main reason to use an in-memory database like Valkey is speed. Accessing data from RAM is faster than reading it from disk, which allows Valkey to achieve extremely low latency, often in the sub-millisecond range. This speed is critical for use cases like real-time caching and fast-moving analytics where immediate data access is essential.


How is a key-value database different from a relational database?

The main difference lies in how they structure, store, and retrieve data. A key-value database like Valkey stores data as a collection of unique keys, each paired with a value. This simple model offers fast retrieval when looking up data directly by its key. A relational database, on the other hand, stores data in tables and uses JOINs to link related data. That model is better suited to complex queries that need to combine data from multiple tables.


Benefits of using Valkey

Valkey combines speed with the flexibility, scalability, and resilience needed for demanding workloads.

High-speed in-memory database

By primarily storing data in RAM instead of disks, Valkey delivers incredibly high throughput and low latency, making it well-suited for applications where speed is paramount.

Open source

As an open source project, Valkey benefits from community contributions and the flexibility to be integrated and customized without licensing restrictions.


High availability

Valkey supports primary-replica replication for data redundancy, and with Sentinel or cluster mode it can fail over automatically, minimizing downtime and keeping the service available.
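
As a small illustration, a replica can be pointed at its primary with a couple of configuration directives; the primary's address below is a placeholder, and this sketch covers replication only (automatic failover additionally requires Sentinel or cluster mode).

  # valkey.conf sketch for a replica node; 10.0.0.5:6379 is a placeholder
  # for the primary's address.
  replicaof 10.0.0.5 6379
  replica-read-only yes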

Data persistence

While primarily an in-memory database, Valkey provides persistence options to prevent data loss. It can save data to disk through RDB (Redis database) snapshots and AOF (append-only file) logs to ensure data durability even after restarts.
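
For example, both persistence mechanisms can be switched on in the server configuration; the values below are illustrative, not tuned recommendations.

  # valkey.conf sketch enabling both persistence mechanisms
  save 900 1              # write an RDB snapshot if at least 1 key changed in 900 seconds
  appendonly yes          # enable the append-only file (AOF) log
  appendfsync everysec    # fsync the AOF roughly once per second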


Seamless horizontal scaling

Valkey is designed to distribute data efficiently across multiple nodes in a cluster. This means you can scale out to handle increasing traffic without a drop in performance.
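
As a rough sketch of how a cluster is typically bootstrapped, valkey-cli can join a set of already-running nodes into a cluster, mirroring the long-standing redis-cli --cluster workflow; the six addresses below are placeholders.

  # Sketch: form a cluster from six running nodes (three primaries, each
  # with one replica). All addresses are placeholders.
  valkey-cli --cluster create \
    10.0.0.1:6379 10.0.0.2:6379 10.0.0.3:6379 \
    10.0.0.4:6379 10.0.0.5:6379 10.0.0.6:6379 \
    --cluster-replicas 1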


Real-time communication

Valkey offers a Pub/Sub messaging system, enabling real-time communication between various applications or different parts of a single application. This can support features like live chat and real-time data feeds.
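
A minimal Pub/Sub round trip looks roughly like the Python sketch below, again using the Redis-compatible redis-py client; the channel name and message are made up, and in practice the publisher and subscriber would usually be separate processes.

  import redis

  r = redis.Redis(host="localhost", port=6379, decode_responses=True)

  # Subscriber side: register interest in a channel
  p = r.pubsub()
  p.subscribe("chat:lobby")

  # Publisher side (typically another process): push a message
  r.publish("chat:lobby", "hello, room")

  # Poll for messages; the first message is the subscribe confirmation
  for _ in range(2):
      message = p.get_message(timeout=1.0)
      if message and message["type"] == "message":
          print(message["data"])         # "hello, room"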


Common use cases of Valkey

In-memory datastores like Valkey are particularly well-suited for applications and functions that depend on extremely low latency. These can include:

  • Caching: Valkey can act as a high-speed cache layer in front of slower, disk-based databases. Frequently accessed data can be stored in Valkey, significantly reducing the load on the primary database and accelerating response times (see the cache-aside sketch after this list).
  • Message queues: Valkey's Pub/Sub capabilities make it an excellent choice as a message broker. It can be used to facilitate real-time communication between different services or microservices, supporting use cases like chat applications and notifications.
  • Real-time analytics: For applications that need to process and analyze data in real time, such as parcel trackers, navigation apps, or gaming leaderboards, Valkey’s speed makes it an ideal choice.
  • Session stores: Managing user sessions in web applications, such as e-commerce sites or social media platforms, requires quick access to user preferences, login status, and shopping cart contents. Valkey provides a fast and reliable way to store and retrieve this session data.
  • Machine learning: Valkey’s low latency and vector search capabilities also make it a powerful database for applications that use machine learning. You can store and rapidly search vector embeddings—the numerical representations of data like text or images—to perform complex similarity searches. This can be used to suggest similar products, articles, or media to users in real time.
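
To illustrate the caching pattern from the list above, here is a hedged cache-aside sketch in Python: read from Valkey first, fall back to the primary database on a miss, then cache the result with a time to live. The load_user_from_db helper, key naming, and 300-second TTL are assumptions for illustration.

  import json
  import redis

  r = redis.Redis(host="localhost", port=6379, decode_responses=True)

  def load_user_from_db(user_id: int) -> dict:
      # Hypothetical stand-in for a query against the slower primary database.
      return {"id": user_id, "name": "Ada"}

  def get_user(user_id: int) -> dict:
      key = f"user:{user_id}"
      cached = r.get(key)
      if cached is not None:
          return json.loads(cached)             # cache hit
      user = load_user_from_db(user_id)         # cache miss: query the primary DB
      r.setex(key, 300, json.dumps(user))       # cache the result for 5 minutes
      return user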

Valkey versus Redis

Valkey was initially forked from Redis 7.2, meaning it started from the same codebase and shares some fundamental characteristics and functionality. Like Valkey, Redis is an in-memory key-value datastore that offers high throughput and supports a range of data structures. Both are known for high availability and scalability.

The core difference is that Valkey is open source, released under a permissive BSD license, so developers are free to use, modify, and contribute to the software.

Valkey was created in 2024 in response to Redis Inc. switching to a more restrictive 'source available' license. To maintain an open source alternative for the community, several core Redis contributors came together and launched Valkey from the last open source release of Redis.

Valkey follows a collaborative, community-led development approach: the roadmap and new features are decided by the contributors to the Linux Foundation project, with participation from major cloud vendors, including Google Cloud.

Since Valkey 8.0, major enhancements have been made to the Valkey engine, including vector search, improved cluster management, and multi-threaded command execution, which processes commands in parallel across multiple CPU cores.

Self-manage your Valkey deployment with Google Cloud

For those who want more granular control, customization, or specific deployment configurations, Valkey can be manually deployed onto Google Cloud’s compute infrastructure. This method offers you complete oversight of the environment.

  • Compute Engine (VMs): Launch Google Compute Engine instances (VMs) and install the open-source Valkey server directly. This provides full autonomy over configuration, scaling parameters, and the underlying operating system.
  • Google Kubernetes Engine (GKE): Valkey can be readily containerized using Docker and deployed as a Pod or a stateful application on a GKE cluster. This is a well-suited approach for applications already running on Kubernetes, offering robust orchestration capabilities (a minimal manifest sketch follows this list).
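
As a rough starting point for the GKE route, the manifest below runs a single Valkey container from the community valkey/valkey image behind a ClusterIP Service; the names, image tag, and single-replica Deployment are illustrative only, and a production setup would more likely use a StatefulSet with persistent storage and replication.

  # Minimal illustrative manifest: one Valkey pod plus a Service in front of it.
  apiVersion: apps/v1
  kind: Deployment
  metadata:
    name: valkey
  spec:
    replicas: 1
    selector:
      matchLabels:
        app: valkey
    template:
      metadata:
        labels:
          app: valkey
      spec:
        containers:
          - name: valkey
            image: valkey/valkey:8.0   # community image; tag is illustrative
            ports:
              - containerPort: 6379
  ---
  apiVersion: v1
  kind: Service
  metadata:
    name: valkey
  spec:
    selector:
      app: valkey
    ports:
      - port: 6379
        targetPort: 6379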


Get fully managed service with Google Cloud Memorystore for Valkey

Memorystore for Valkey is a fully managed service that provides a high-performance, highly available, and scalable Valkey service in the cloud.

It provides the benefits of an in-memory database without the operational and financial burden of managing the underlying infrastructure.

Google Cloud takes care of the administrative workload, including patching, security, and scaling, so you can focus on building applications with real-time data.

  • Fully managed: Google handles the operational overhead of managing Valkey instances, including provisioning, patching, and failover.
  • High performance: Leveraging Google Cloud's infrastructure, Memorystore for Valkey delivers the low latency and high throughput required for demanding real-time applications—with sub-millisecond data access.
  • High availability: Memorystore automatically replicates data across multiple zones, providing up to a 99.99% SLA.
  • Scalable: Memorystore offers seamless scaling of Valkey instances without downtime to meet the changing demands of your application.
  • Secure: Memorystore is protected with robust security features, including virtual private cloud (VPC) connectivity, 24/7 monitoring, and identity and access management (IAM).
  • Perfect for AI applications: Memorystore for Valkey supports approximate nearest neighbor (ANN) vector search and exact k-nearest neighbor (KNN) vector search, making it an ideal low-latency datastore for generative AI.

Take the next step with Memorystore

Start building on Google Cloud with $300 in free credits.
