AlloyDB AI is a suite of features included with AlloyDB for PostgreSQL that let you apply the semantic and predictive power of machine learning (ML) models to your data. This page provides an overview of the ML-powered AI functions that are available through AlloyDB.
Store, index, and query vectors
The stock pgvector PostgreSQL extension is customized for AlloyDB and is referred to as vector. It supports storing generated embeddings in a vector column. The extension also adds support for scalar quantization when you create IVF indexes. You can also create an IVFFlat or HNSW index, both of which are available with stock pgvector.
For more information about storing vectors, see Store vectors.
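For example, the following sketch stores embeddings in a vector column and creates an IVFFlat or HNSW index. The table name, column names, dimensionality, and index parameters are illustrative only; adjust them for your data.

```sql
-- Enable the customized vector extension.
CREATE EXTENSION IF NOT EXISTS vector;

-- Illustrative table that stores 768-dimensional embeddings alongside source text.
CREATE TABLE documents (
  id BIGSERIAL PRIMARY KEY,
  content TEXT,
  embedding vector(768)
);

-- An IVFFlat index, as provided by stock pgvector.
CREATE INDEX documents_embedding_ivfflat
  ON documents USING ivfflat (embedding vector_cosine_ops)
  WITH (lists = 100);

-- Alternatively, an HNSW index. You typically choose one index type per column.
CREATE INDEX documents_embedding_hnsw
  ON documents USING hnsw (embedding vector_cosine_ops)
  WITH (m = 16, ef_construction = 64);
```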
In addition to the customized vector extension, AlloyDB includes the alloydb_scann extension, which implements a highly efficient nearest-neighbor index powered by the ScaNN algorithm.
For more information about creating indexes and querying vectors, see Create indexes and query vectors.
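For example, the following sketch creates a ScaNN index on the same illustrative documents table and runs a nearest-neighbor query. Verify the index method and parameters, such as num_leaves, against the alloydb_scann reference for your AlloyDB version.

```sql
-- Enable the ScaNN-based index extension.
CREATE EXTENSION IF NOT EXISTS alloydb_scann;

-- A ScaNN index using cosine distance on the illustrative documents table.
CREATE INDEX documents_embedding_scann
  ON documents USING scann (embedding cosine)
  WITH (num_leaves = 100);

-- Nearest-neighbor query ordered by cosine distance (pgvector's <=> operator);
-- $1 stands for a query embedding passed as a parameter.
SELECT id, content
FROM documents
ORDER BY embedding <=> $1
LIMIT 10;
```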
Tune your vector query performance
You can tune your indexes to balance queries per second (QPS) against recall. For more information about tuning your indexes, see Tune vector query performance.
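For example, the stock pgvector index types expose per-session parameters that trade recall against QPS. The values shown are illustrative starting points, not recommendations.

```sql
-- IVFFlat: probing more lists raises recall at the cost of QPS.
SET ivfflat.probes = 20;

-- HNSW: a larger ef_search has the same recall-versus-QPS effect.
SET hnsw.ef_search = 100;
```

The alloydb_scann extension exposes its own tuning parameters; see Tune vector query performance for the supported settings.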
Generate embeddings and text predictions
AlloyDB AI extends PostgreSQL syntax with two functions for querying models using the google_ml_integration extension:
Invoke predictions to call a model using SQL within a transaction.
Generate embeddings to have an LLM translate text prompts into numerical vectors.
You can then apply these vector embeddings as input to pgvector functions, including methods to compare and sort samples of text according to their relative semantic distance.
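For example, the following sketch generates an embedding and uses it to rank rows in the illustrative documents table by semantic distance. The model ID is illustrative; use an embedding model that is registered for your AlloyDB instance.

```sql
-- Enable the model integration extension.
CREATE EXTENSION IF NOT EXISTS google_ml_integration;

-- Generate an embedding for a text prompt (model version shown is illustrative).
SELECT embedding('textembedding-gecko@003', 'ultra-fast sports sedan');

-- Rank rows by semantic similarity: cast the generated embedding to a pgvector
-- value and order by cosine distance.
SELECT id, content
FROM documents
ORDER BY documents.embedding
         <=> embedding('textembedding-gecko@003', 'ultra-fast sports sedan')::vector
LIMIT 5;
```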
Use models in the cloud with Vertex AI
You can configure AlloyDB to work with Vertex AI. This gives your applications the following benefits:
Your applications can invoke predictions using any model stored in the Vertex AI Model Garden that they have access to.
Your applications can generate embeddings using the textembedding-gecko English-language LLMs.
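For example, the following sketch invokes a prediction against a Vertex AI endpoint from SQL. The project, region, endpoint ID, and request payload are placeholders for illustration; the request body must match the format that your model expects.

```sql
-- Invoke a prediction on a Vertex AI endpoint within a SQL statement.
SELECT ml_predict_row(
  'projects/my-project/locations/us-central1/endpoints/1234567890',
  '{"instances": [{"prompt": "Summarize AlloyDB AI in one sentence."}]}'
);
```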