This page contains information and examples for connecting to a Cloud SQL instance from a service running in Cloud Build.
Cloud SQL is a fully-managed database service that helps you set up, maintain, manage, and administer your relational databases in the cloud.
Cloud Build is a service that executes your builds on Google Cloud infrastructure.
Set up a Cloud SQL instance
- Enable the Cloud SQL Admin API in the Google Cloud project that you are connecting from, if you haven't already done so.
- Create a Cloud SQL for SQL Server instance. We recommend that you choose a Cloud SQL instance location in the same region as your Cloud Run service for better latency, to avoid some networking costs, and to reduce cross-region failure risks.
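For example, you could create the instance with the gcloud CLI along these lines (a sketch; the instance name, machine size, region, and root password are placeholders to replace with your own values):
gcloud sql instances create myinstance \
  --database-version=SQLSERVER_2019_STANDARD \
  --cpu=2 \
  --memory=8GiB \
  --region=us-central1 \
  --root-password=CHANGE_ME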
By default, Cloud SQL assigns a public IP address to a new instance. You also have the option to assign a private IP address. For more information about the connectivity options for both, see the Connecting Overview page.
Set up an Artifact Registry repository
- If you haven't already done so, enable the Artifact Registry API in the Google Cloud project that you are connecting from.
- Create a Docker Artifact Registry. To improve latency, reduce the risk of cross-region failure, and avoid additional networking costs, we recommend that you choose an Artifact Registry location in the same region as your Cloud Run service.
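For example, you could create the repository with the gcloud CLI (the repository name and location are placeholders):
gcloud artifacts repositories create my-docker-repo \
  --repository-format=docker \
  --location=us-central1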
Configure Cloud Build
The steps to configure Cloud Build depend on the type of IP address that you assigned to your Cloud SQL instance.
Public IP (default)
Make sure your Cloud Build service account has the IAM roles and permissions required to connect to the Cloud SQL instance.
The Cloud Build service account is listed on the Google Cloud console IAM page as the principal [YOUR-PROJECT-NUMBER]@cloudbuild.gserviceaccount.com.
To view this service account in the Google Cloud console, select the Include Google-provided role grants checkbox.
Your Cloud Build service account needs one of the following IAM roles:
- Cloud SQL Client (preferred)
- Cloud SQL Admin
Alternatively, you can assign it a role that includes the following IAM permissions:
- cloudsql.instances.connect
- cloudsql.instances.get
If the Cloud Build service account belongs to a different project than the Cloud SQL instance, then the Cloud SQL Admin API needs to be enabled and the IAM permissions need to be granted in both projects.
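For example, you could grant the Cloud SQL Client role to the Cloud Build service account with the gcloud CLI (replace the project ID and project number with your own values):
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:YOUR_PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/cloudsql.client"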
Private IP
To connect to your Cloud SQL instance over private IP, Cloud Build must be in the same VPC network as your Cloud SQL instance. To configure this:
- Set up a private connection between the VPC network of your Cloud SQL instance and the service producer network.
- Create a Cloud Build private pool.
Once configured, your application will be able to connect directly to your instance's private IP address on port 1433 when your build runs in the pool.
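A minimal sketch of the two configuration steps above with the gcloud CLI might look like the following; it assumes the default VPC network, and the reserved range name, pool name, region, and project ID are placeholders:
# Reserve an IP range and create the private connection to the service producer network.
gcloud compute addresses create google-managed-services-default \
  --global --purpose=VPC_PEERING --prefix-length=16 --network=default

gcloud services vpc-peerings connect \
  --service=servicenetworking.googleapis.com \
  --ranges=google-managed-services-default \
  --network=default

# Create a Cloud Build private pool peered with the same VPC network.
gcloud builds worker-pools create my-private-pool \
  --region=us-central1 \
  --peered-network=projects/YOUR_PROJECT_ID/global/networks/default
When you run the build, select the pool, for example with the --worker-pool flag of gcloud builds submit or the options.pool field of cloudbuild.yaml.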
Connect to Cloud SQL
After you configure Cloud Build, you can connect to your Cloud SQL instance.
Public IP (default)
For public IP paths, Cloud Build supports TCP sockets.
You can use the Cloud SQL Auth Proxy in a Cloud Build step to allow connections to your database. This configuration:
- Builds your container and pushes it to Artifact Registry.
- Builds a second container that copies in the Cloud SQL Auth Proxy binary. Containers built by Cloud Build don't need to be pushed to any registry, and are discarded on build completion.
- Using the second container, starts the Cloud SQL Auth Proxy and runs any migration commands.
steps:
  - id: install-proxy
    name: gcr.io/cloud-builders/wget
    entrypoint: sh
    args:
      - -c
      - |
        wget -O /workspace/cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.2/cloud-sql-proxy.linux.amd64
        chmod +x /workspace/cloud-sql-proxy
  - id: migrate
    waitFor: ['install-proxy']
    name: YOUR_CONTAINER_IMAGE_NAME
    entrypoint: sh
    env:
      - "DATABASE_NAME=${_DATABASE_NAME}"
      - "DATABASE_USER=${_DATABASE_USER}"
      - "DATABASE_PORT=${_DATABASE_PORT}"
      - "INSTANCE_CONNECTION_NAME=${_INSTANCE_CONNECTION_NAME}"
    secretEnv:
      - DATABASE_PASS
    args:
      - "-c"
      - |
        /workspace/cloud-sql-proxy ${_INSTANCE_CONNECTION_NAME} --port ${_DATABASE_PORT} & sleep 2;
        # The Cloud SQL Auth Proxy is now up and running; add your own logic below to connect
        python migrate.py # For example

options:
  dynamic_substitutions: true

substitutions:
  _DATABASE_USER: myuser
  _DATABASE_NAME: mydatabase
  _INSTANCE_CONNECTION_NAME: ${PROJECT_ID}:us-central1:myinstance
  _DATABASE_PORT: '1433'
  _DATABASE_PASSWORD_KEY: database_password
  _AR_REPO_REGION: us-central1
  _AR_REPO_NAME: my-docker-repo
  _IMAGE_NAME: ${_AR_REPO_REGION}-docker.pkg.dev/${PROJECT_ID}/${_AR_REPO_NAME}/sample-sql-proxy

availableSecrets:
  secretManager:
    - versionName: projects/$PROJECT_ID/secrets/${_DATABASE_PASSWORD_KEY}/versions/latest
      env: "DATABASE_PASS"
The Cloud Build code sample shows how you might run a hypothetical migrate.py script after deploying the previous sample app, updating its Cloud SQL database by using the Cloud SQL Auth Proxy and Cloud Build. To run this Cloud Build code sample, complete the following setup steps:
- Create a folder named sql-proxy.
- Create a Dockerfile file in the sql-proxy folder with the following single line of code for its file contents:
  FROM gcr.io/gcp-runtimes/ubuntu_20_0_4
- Create a cloudbuild.yaml file in the sql-proxy folder.
- Update the cloudbuild.yaml file:
  - Copy the previous sample Cloud Build code and paste it into the cloudbuild.yaml file.
  - Update the example code so that _DATABASE_PORT within the substitutions: block is 1433, which is the port used by SQL Server.
  - Replace the following placeholder values with the values used in your project: mydatabase, myuser, and myinstance.
- Create a secret named database_password in Secret Manager.
  - In order for the Cloud Build service account to access this secret, you must grant it the Secret Manager Secret Accessor role in IAM. See Using secrets from Secret Manager for more information.
- Create a migrate.py script file in the sql-proxy folder (a minimal sketch follows this list).
  - The script can reference the environment variables and the secret created in the cloudbuild.yaml file by using the following examples:
    os.getenv('DATABASE_NAME')
    os.getenv('DATABASE_USER')
    os.getenv('DATABASE_PASS')
    os.getenv('INSTANCE_CONNECTION_NAME')
  - To reference the same variables from a Bash script (for example, migrate.sh), use the following:
    $DATABASE_NAME
    $DATABASE_USER
    $DATABASE_PASS
    $INSTANCE_CONNECTION_NAME
- Run the following gcloud builds submit command to build a container with the Cloud SQL Auth Proxy, start the Cloud SQL Auth Proxy, and run the migrate.py script:
  gcloud builds submit --config cloudbuild.yaml
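For reference, here is a minimal sketch of what migrate.py could look like. It is not the canonical sample; it assumes that YOUR_CONTAINER_IMAGE_NAME has SQLAlchemy and the python-tds (pytds) driver installed, and the table it creates is purely illustrative:
# migrate.py (sketch): connects through the Cloud SQL Auth Proxy on localhost.
import os

import sqlalchemy

db_url = sqlalchemy.engine.url.URL.create(
    drivername="mssql+pytds",
    username=os.getenv("DATABASE_USER"),
    password=os.getenv("DATABASE_PASS"),
    host="127.0.0.1",  # the proxy started in the build step listens here
    port=int(os.getenv("DATABASE_PORT", "1433")),
    database=os.getenv("DATABASE_NAME"),
)
engine = sqlalchemy.create_engine(db_url)

with engine.begin() as conn:
    # Example migration: create a table if it doesn't already exist.
    conn.execute(sqlalchemy.text(
        "IF OBJECT_ID('votes', 'U') IS NULL "
        "CREATE TABLE votes ("
        "  vote_id INT IDENTITY PRIMARY KEY,"
        "  candidate VARCHAR(6) NOT NULL,"
        "  time_cast DATETIME NOT NULL)"
    ))

print("Migration complete.")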
Private IP
For private IP paths, your application connects directly to your instance through private pools. This method uses TCP to connect directly to the Cloud SQL instance without using the Cloud SQL Auth Proxy.
Connect with TCP
Connect using the private IP address of your Cloud SQL instance as the host and 1433 as the port.
Python
To see this snippet in the context of a web application, view the README on GitHub.
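The snippet itself is not reproduced on this page. As a rough sketch, a direct TCP connection with SQLAlchemy and the python-tds driver (both assumed to be available in your application's environment, with connection details supplied through hypothetical DB_* environment variables) might look like this:
import os

import sqlalchemy

# DB_HOST is the private IP address of the Cloud SQL instance.
engine = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL.create(
        drivername="mssql+pytds",
        username=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        host=os.environ["DB_HOST"],
        port=int(os.environ.get("DB_PORT", "1433")),
        database=os.environ["DB_NAME"],
    ),
    pool_size=5,
    max_overflow=2,
    pool_timeout=30,    # seconds to wait for a connection from the pool
    pool_recycle=1800,  # recycle connections after 30 minutes
)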
Java
To see this snippet in the context of a web application, view the README on GitHub.
Note:
- CLOUD_SQL_CONNECTION_NAME should be represented as <MY-PROJECT>:<INSTANCE-REGION>:<INSTANCE-NAME>.
- Using the argument ipTypes=PRIVATE forces the SocketFactory to connect with an instance's associated private IP.
- See the JDBC socket factory version requirements for the pom.xml file here.
Node.js
To see this snippet in the context of a web application, view the README on GitHub.
Go
To see this snippet in the context of a web application, view the README on GitHub.
C#
To see this snippet in the context of a web application, view the README on GitHub.
Ruby
To see this snippet in the context of a web application, view the README on GitHub.
PHP
To see this snippet in the context of a web application, view the README on GitHub.
You can then create a Cloud Build step to run your code directly.
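As a sketch modeled on the earlier public IP sample, a cloudbuild.yaml that runs in a Cloud Build private pool might look like the following; the worker pool name, region, and container image name are placeholders, and the options.pool field selects the private pool:
steps:
  - id: migrate
    name: YOUR_CONTAINER_IMAGE_NAME
    entrypoint: sh
    env:
      - "DATABASE_NAME=${_DATABASE_NAME}"
      - "DATABASE_USER=${_DATABASE_USER}"
      - "DATABASE_HOST=${_DATABASE_HOST}"
    secretEnv:
      - DATABASE_PASS
    args:
      - "-c"
      - |
        # The build runs inside the private pool, so it can reach the
        # instance's private IP directly; no proxy step is needed.
        python migrate.py # For example

options:
  pool:
    name: projects/YOUR_PROJECT_ID/locations/us-central1/workerPools/my-private-pool
  dynamic_substitutions: true

substitutions:
  _DATABASE_USER: myuser
  _DATABASE_NAME: mydatabase
  _DATABASE_HOST: databasehost
  _DATABASE_PASSWORD_KEY: database_password

availableSecrets:
  secretManager:
    - versionName: projects/$PROJECT_ID/secrets/${_DATABASE_PASSWORD_KEY}/versions/latest
      env: "DATABASE_PASS"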
The Cloud Build code sample above shows how you might run a hypothetical migrate script after deploying the sample app, updating its Cloud SQL database by using Cloud Build. To run this Cloud Build code sample, complete the following setup steps:
- Create a folder named sql-private-pool.
- Create a Dockerfile file in the sql-private-pool folder with the following single line of code for its file contents:
  FROM gcr.io/gcp-runtimes/ubuntu_20_0_4
- Create a cloudbuild.yaml file in the sql-private-pool folder.
- Update the cloudbuild.yaml file:
  - Copy the sample Cloud Build code above and paste it into the cloudbuild.yaml file.
  - Replace the following placeholder values with the values used in your project: mydatabase, myuser, and databasehost, in the form host:port.
- Create a secret named database_password in Secret Manager.
  - In order for the Cloud Build service account to access this secret, you must grant it the Secret Manager Secret Accessor role in IAM. See Using secrets from Secret Manager for more information.
- Create a migrate.py script file in the sql-private-pool folder.
  - The script can reference the environment variables and the secret created in the cloudbuild.yaml file by using the following examples:
    os.getenv('DATABASE_NAME')
    os.getenv('DATABASE_USER')
    os.getenv('DATABASE_PASS')
    os.getenv('DATABASE_HOST')
  - To reference the same variables from a Bash script (for example, migrate.sh), use the following:
    $DATABASE_NAME
    $DATABASE_USER
    $DATABASE_PASS
    $DATABASE_HOST
- Run the following gcloud builds submit command to build a container and run the migrate.py script, connecting to Cloud SQL over its private IP address:
  gcloud builds submit --config cloudbuild.yaml
Best practices and other information
You can use the Cloud SQL Auth Proxy when testing your application locally. See the quickstart for using the Cloud SQL Auth Proxy for detailed instructions.
You can also test using the Cloud SQL Auth Proxy in a Docker container.
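For example, a local test could start the proxy with Docker roughly like this (a sketch; the key file path, proxy version tag, and instance connection name are placeholders to replace with your own values):
docker run -d \
  -v /path/to/service-account-key.json:/config/key.json \
  -p 127.0.0.1:1433:1433 \
  gcr.io/cloud-sql-connectors/cloud-sql-proxy:2.14.2 \
  --address 0.0.0.0 --port 1433 \
  --credentials-file /config/key.json \
  MY_PROJECT:us-central1:myinstance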
Database schema migrations
By configuring Cloud Build to connect to Cloud SQL, you can run database schema migration tasks in Cloud Build using the same code you would deploy to any other serverless platform.
Using Secret Manager
You can use Secret Manager to include sensitive information in your builds.
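For example, the database_password secret used in the samples on this page could be created and made readable by Cloud Build with commands along these lines (the password value and project number are placeholders):
# Create the secret and store the database password in it.
echo -n "MY_DB_PASSWORD" | gcloud secrets create database_password --data-file=-

# Allow the Cloud Build service account to read the secret.
gcloud secrets add-iam-policy-binding database_password \
  --member="serviceAccount:YOUR_PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/secretmanager.secretAccessor"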