Running Django on the Cloud Run environment


Deploying a stateful application like Django to Cloud Run involves integrating several services so that they work together to form a cohesive project.

This tutorial assumes that you're familiar with Django web development. If you're new to Django development, it's a good idea to work through writing your first Django app before continuing.

While this tutorial demonstrates Django specifically, you can use this deployment process with other Django-based frameworks, such as Wagtail and Django CMS.

This tutorial uses Django 5, which requires at least Python 3.10.

Objectives

In this tutorial, you will:

  • Create and connect a Cloud SQL database.
  • Create and use Secret Manager secret values.
  • Deploy a Django app to Cloud Run.
  • Host static files on Cloud Storage.
  • Use Cloud Build to automate deployment.

Costs

In this document, you use the following billable components of Google Cloud:

  • Cloud Run
  • Cloud SQL
  • Cloud Storage
  • Secret Manager
  • Cloud Build

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

  4. Enable the Cloud Run, Cloud SQL, Cloud Build, Secret Manager, and Compute Engine APIs.

    Enable the APIs

  5. Install the Google Cloud CLI.
  6. To initialize the gcloud CLI, run the following command:

    gcloud init
  7. Make sure that the account you use for this tutorial has sufficient permissions.

Prepare your environment

Clone a sample app

The code for the Django sample app is in the GoogleCloudPlatform/python-docs-samples repository on GitHub.

  1. You can either download the sample as a ZIP file and extract it or clone the repository to your local machine:

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
    
  2. Go to the directory that contains the sample code:

    Linux/macOS

    cd python-docs-samples/run/django
    

    Windows

    cd python-docs-samples\run\django
    

Confirm your Python setup

This tutorial relies on Python to run the sample application on your machine. The sample code also requires you to install its dependencies.

For more details, refer to the Python development environment guide.

  1. Confirm that your Python version is at least 3.10:

     python -V
    

    You should see Python 3.10.0 or higher.

  2. Create a Python virtual environment and install dependencies:

    Linux/macOS

    python -m venv venv
    source venv/bin/activate
    pip install --upgrade pip
    pip install -r requirements.txt
    

    Windows

    python -m venv venv
    venv\scripts\activate
    pip install --upgrade pip
    pip install -r requirements.txt
    

Download the Cloud SQL Auth Proxy to connect to Cloud SQL from your local machine

When deployed, your app uses the Cloud SQL Auth Proxy that is built into the Cloud Run environment to communicate with your Cloud SQL instance. However, to test your app locally, you must install and use a local copy of the proxy in your development environment. For more details, refer to the Cloud SQL Auth Proxy guide.

The Cloud SQL Auth Proxy uses the Cloud SQL API to interact with your SQL instance. To do this, it requires application authentication through the gcloud CLI.
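If you want to confirm that these application default credentials are in place (after running the authentication command in the following step), you can optionally run a short Python check like the one below. This sketch is not part of the sample app; it assumes the google-auth package is installed, which the sample's dependencies already pull in.

# Optional check (not part of the sample app): confirm that Application
# Default Credentials are available before starting the Cloud SQL Auth Proxy.
import google.auth

credentials, project_id = google.auth.default()
print(f"Found Application Default Credentials for project: {project_id}")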

  1. Authenticate and acquire credentials for the API:

    gcloud auth application-default login
    
  2. Download and install the Cloud SQL Auth Proxy to your local machine.

    Linux 64-bit

    1. Download the Cloud SQL Auth Proxy:
      curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.0/cloud-sql-proxy.linux.amd64
      
    2. Make the Cloud SQL Auth Proxy executable:
      chmod +x cloud-sql-proxy
      

    Linux 32-bit

    1. Download the Cloud SQL Auth Proxy:
      curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.0/cloud-sql-proxy.linux.386
      
    2. If the curl command is not found, run sudo apt install curl and repeat the download command.
    3. Make the Cloud SQL Auth Proxy executable:
      chmod +x cloud-sql-proxy
      

    macOS 64-bit

    1. Download the Cloud SQL Auth Proxy:
      curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.0/cloud-sql-proxy.darwin.amd64
      
    2. Make the Cloud SQL Auth Proxy executable:
      chmod +x cloud-sql-proxy
      

    Mac M1

    1. Download the Cloud SQL Auth Proxy:
        curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.0/cloud-sql-proxy.darwin.arm64
        
    2. Make the Cloud SQL Auth Proxy executable:
        chmod +x cloud-sql-proxy
        

    Windows 64-bit

    Right-click https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.0/cloud-sql-proxy.x64.exe and select Save Link As to download the Cloud SQL Auth Proxy. Rename the file to cloud-sql-proxy.exe.

    Windows 32-bit

    Right-click https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.0/cloud-sql-proxy.x86.exe and select Save Link As to download the Cloud SQL Auth Proxy. Rename the file to cloud-sql-proxy.exe.

    Cloud SQL Auth Proxy Docker image

    The Cloud SQL Auth Proxy has different container images, such as distroless, alpine, and buster. The default Cloud SQL Auth Proxy container image uses distroless, which contains no shell. If you need a shell or related tools, then download an image based on alpine or buster. For more information, see Cloud SQL Auth Proxy Container Images.

    You can pull the latest image to your local machine using Docker by using the following command:

    docker pull gcr.io/cloud-sql-connectors/cloud-sql-proxy:2.14.0
    

    Other OS

    For other operating systems not included here, you can compile the Cloud SQL Auth Proxy from source.

    You can choose to move the download to somewhere common, such as a location on your PATH, or your home directory. If you choose to do this, when you start the Cloud SQL Auth Proxy later on in the tutorial, remember to reference your chosen location when using cloud-sql-proxy commands.

Create backing services

This tutorial uses several Google Cloud services to provide the database, media storage, and secret storage that support the deployed Django project. These services are deployed in a specific region. For efficiency between services, all services should be deployed in the same region. For more information about the closest region to you, see Products available by region.

This tutorial uses Cloud Storage to host the app's static assets and user-uploaded media.

Set up a Cloud SQL for PostgreSQL instance

Django officially supports multiple relational databases, and it offers the most complete support for PostgreSQL. Because Cloud SQL supports PostgreSQL, this tutorial uses a Cloud SQL for PostgreSQL database.

The following section describes the creation of a PostgreSQL instance, database, and database user for the app.

  1. Create the PostgreSQL instance:

    Console

    1. In the Google Cloud console, go to the Cloud SQL Instances page.

      Go to the Cloud SQL Instances page

    2. Click Create Instance.

    3. Click PostgreSQL.

    4. In the Instance ID field, enter INSTANCE_NAME.

    5. Enter a password for the postgres user.

    6. Keep the default values for the other fields.

    7. Click Create.

    It takes a few minutes to create the instance and for it to be ready for use.

    gcloud

    • Create the PostgreSQL instance:

      gcloud sql instances create INSTANCE_NAME \
          --project PROJECT_ID \
          --database-version POSTGRES_13 \
          --tier db-f1-micro \
          --region REGION
      

    Replace the following:

    • INSTANCE_NAME: the Cloud SQL instance name
    • PROJECT_ID: the Google Cloud project ID
    • REGION: the Google Cloud region

    It takes a few minutes to create the instance and for it to be ready for use.

  2. Within the created instance, create a database:

    Console

    1. Within your instance page, go to the Databases tab.
    2. Click Create database.
    3. In the Database name dialog, enter DATABASE_NAME.
    4. Click Create.

    gcloud

    • Create the database within the recently created instance:

      gcloud sql databases create DATABASE_NAME \
          --instance INSTANCE_NAME
      

      Replace DATABASE_NAME with a name for the database inside the instance.

  3. Create a database user:

    Console

    1. Within your instance page, go to the Users tab.
    2. Click Add User Account.
    3. In the Add a user account to instance dialog, under Built-in authentication:
    4. Enter the username DATABASE_USERNAME.
    5. Enter the password DATABASE_PASSWORD.
    6. Click Add.

    gcloud

    • Create the user within the recently created instance:

      gcloud sql users create DATABASE_USERNAME \
          --instance INSTANCE_NAME \
          --password DATABASE_PASSWORD
      

      Replace DATABASE_PASSWORD with a secure password.

Set up a Cloud Storage bucket

You can store Django's included static assets, as well as user-uploaded media, in highly-available object storage using Cloud Storage. The django-storages[google] package handles Django's interaction with this storage backend.

Console

  1. In the Google Cloud console, go to the Cloud Storage Buckets page.

    Go to Buckets page

  2. Click Create bucket.
  3. On the Create a bucket page, enter your bucket information. To go to the next step, click Continue.
    • For Name your bucket, enter PROJECT_ID_MEDIA_BUCKET, or another name that meets the bucket naming requirements.
    • For Location, select the REGION that you use for your other services.
    • For Choose a default storage class for your data, select the following: Standard.
    • For Choose how to control access to objects, select an Access control option.
    • For Advanced settings (optional), specify an encryption method, a retention policy, or bucket labels.
  4. Click Create.

gcloud

  • Create a Cloud Storage bucket:

    gcloud storage buckets create gs://PROJECT_ID_MEDIA_BUCKET --location=REGION
    

    Replace MEDIA_BUCKET with a suffix for the media bucket. Combined with the project ID, this creates a unique bucket name.

Store secret values in Secret Manager

Now that the backing services are configured, Django needs information about these services. Instead of putting these values directly into the Django source code, this tutorial uses Secret Manager to store this information securely.

Cloud Run and Cloud Build interact with secrets by using their respective service accounts. Service accounts are identified by an email address that contains the project number.
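For reference, the following hypothetical Python sketch shows how the two service account email addresses used later in this tutorial are formed from the project number; it only builds strings and doesn't call any Google Cloud API.

# Hypothetical sketch: build the default service account emails used later
# in this tutorial from the project number (PROJECTNUM).
project_number = "123456789012"  # replace with your project number

cloud_run_service_account = f"{project_number}-compute@developer.gserviceaccount.com"
cloud_build_service_account = f"{project_number}@cloudbuild.gserviceaccount.com"

print(cloud_run_service_account)
print(cloud_build_service_account)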

Create Django environment file as a Secret Manager secret

You store the settings required to start Django in a secured env file. The sample app uses the Secret Manager API to retrieve the secret value, and the django-environ package to load the values into the Django environment. The secret is configured to be accessible by Cloud Run and Cloud Build.

  1. Create a file called .env, defining the database connection string, the media bucket name, and a new SECRET_KEY value:

    echo DATABASE_URL=postgres://DATABASE_USERNAME:DATABASE_PASSWORD@//cloudsql/PROJECT_ID:REGION:INSTANCE_NAME/DATABASE_NAME > .env
    echo GS_BUCKET_NAME=PROJECT_ID_MEDIA_BUCKET >> .env
    echo SECRET_KEY=$(cat /dev/urandom | LC_ALL=C tr -dc '[:alpha:]'| fold -w 50 | head -n1) >> .env
    
  2. Store the secret in Secret Manager:

    Console

    1. In the Google Cloud console, go to the Secret Manager page.

      Go to the Secret Manager page

    2. Click Create secret.

    3. In the Name field, enter django_settings.

    4. In the Secret value dialog, paste the contents of your .env file.

    5. Click Create secret.

    6. In Details for django_settings, note the project number:

      projects/PROJECTNUM/secrets/django_settings
      
    7. Delete the local file to prevent local setting overrides.

    gcloud

    1. Create a new secret, django_settings, with the value of the .env file:

      gcloud secrets create django_settings --data-file .env
      
    2. To confirm the creation of the secret, check it:

      gcloud secrets describe django_settings
      
      gcloud secrets versions access latest --secret django_settings
      
    3. Get the value of the Project Number (PROJECTNUM):

      gcloud projects describe PROJECT_ID --format='value(projectNumber)'
      

      Save this value for later steps.

    4. Delete the local file to prevent local setting overrides:

      rm .env
      
  3. Configure access to the secret:

    Console

    1. Click on the Permissions tab.
    2. Click Add.
    3. In the New Members field, enter PROJECTNUM-compute@developer.gserviceaccount.com, and then press Enter.
    4. In the New Members field, enter PROJECTNUM@cloudbuild.gserviceaccount.com, and then press Enter.
    5. In the Role drop-down menu, select Secret Manager Secret Accessor.
    6. Click Save.

    gcloud

    1. Grant access to the secret to the Cloud Run service account:

      gcloud secrets add-iam-policy-binding django_settings \
          --member serviceAccount:PROJECTNUM-compute@developer.gserviceaccount.com \
          --role roles/secretmanager.secretAccessor
      
    2. Grant access to the secret to the Cloud Build service account:

      gcloud secrets add-iam-policy-binding django_settings \
          --member serviceAccount:PROJECTNUM@cloudbuild.gserviceaccount.com \
          --role roles/secretmanager.secretAccessor
      

      In the output, confirm that bindings lists the two service accounts as members.
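With the secret created and access granted, you can optionally read it back from Python the same way that the sample's settings.py does (shown later in this tutorial). This sketch assumes that the google-cloud-secret-manager package is installed and that Application Default Credentials are configured.

# Optional verification sketch: read the django_settings secret with the
# Secret Manager Python API, mirroring the settings.py code shown later.
import google.auth
from google.cloud import secretmanager

_, project_id = google.auth.default()
client = secretmanager.SecretManagerServiceClient()
name = f"projects/{project_id}/secrets/django_settings/versions/latest"
payload = client.access_secret_version(name=name).payload.data.decode("UTF-8")
print(payload)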

Create secret for Django's admin password

The Django admin user is normally created by running the interactive management command createsuperuser.

This tutorial uses a data migration to create the admin user, retrieving the admin password from Secret Manager.

Console

  1. In the Google Cloud console, go to the Secret Manager page.
  2. Click Create secret.

  3. In the Name field, enter superuser_password.

  4. In the Secret value field, enter a random, unique password.

  5. Click Create secret.

  6. In Details for superuser_password, make a note of the project number (projects/PROJECTNUM/secrets/superuser_password).

  7. Click on the Permissions tab.

  8. Click Add.

  9. In the New Members field, enter PROJECTNUM@cloudbuild.gserviceaccount.com, and then press Enter.

  10. In the Role drop-down menu, select Secret Manager Secret Accessor.

  11. Click Save.

gcloud

  1. Create a new secret, superuser_password, from a randomly generated password:

    echo -n "$(cat /dev/urandom | LC_ALL=C tr -dc '[:alpha:]'| fold -w 30 | head -n1)" | gcloud secrets create superuser_password --data-file -
    
  2. Grant access to the secret to Cloud Build:

    gcloud secrets add-iam-policy-binding superuser_password \
        --member serviceAccount:PROJECTNUM@cloudbuild.gserviceaccount.com \
        --role roles/secretmanager.secretAccessor
    

    In the output, confirm that bindings lists only the Cloud Build service account as a member.

Grant Cloud Build access to Cloud SQL

For Cloud Build to apply the database migrations, you need to grant it access to Cloud SQL.

Console

  1. In the Google Cloud console, go to the Identity and Access Management page.

    Go to the Identity and Access Management page

  2. To edit the entry for PROJECTNUM@cloudbuild.gserviceaccount.com, click Edit.

  3. Click Add another role.

  4. In the Select a role dialog, select Cloud SQL Client.

  5. Click Save.

gcloud

  1. Grant permission for Cloud Build to access Cloud SQL:

    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member serviceAccount:PROJECTNUM@cloudbuild.gserviceaccount.com \
        --role roles/cloudsql.client
    

Run the app on your local computer

With the backing services configured, you can now run the app on your computer. This setup supports local development and lets you apply database migrations. Database migrations are also applied in Cloud Build, but you need this local setup to run makemigrations.

  1. In a separate terminal, start the Cloud SQL Auth Proxy:

    Linux/macOS

    ./cloud-sql-proxy PROJECT_ID:REGION:INSTANCE_NAME
    

    Windows

    cloud-sql-proxy.exe PROJECT_ID:REGION:INSTANCE_NAME
    

    This step establishes a connection from your local computer to your Cloud SQL instance for local testing purposes. Keep the Cloud SQL Auth Proxy running the entire time you test your app locally. Running this process in a separate terminal allows you to keep working while this process runs.

  2. In the original terminal, set the Project ID locally (used by the Secret Manager API):

    Linux/macOS

    export GOOGLE_CLOUD_PROJECT=PROJECT_ID
    

    Windows

    set GOOGLE_CLOUD_PROJECT=PROJECT_ID
    
  3. Set an environment variable to indicate that you are using the Cloud SQL Auth Proxy (the code recognizes this value):

    Linux/macOS

    export USE_CLOUD_SQL_AUTH_PROXY=true
    

    Windows

    set USE_CLOUD_SQL_AUTH_PROXY=true
    
  4. Run the Django migrations to set up your models and assets:

    python manage.py makemigrations
    python manage.py makemigrations polls
    python manage.py migrate
    python manage.py collectstatic
    
  5. Start the Django web server:

    python manage.py runserver 8080
    
  6. In your browser, go to http://localhost:8080.

    If you are in Cloud Shell, click the Web Preview button, and select Preview on port 8080.

    The page displays the following text: "Hello, world. You're at the polls index." The Django web server running on your computer delivers the sample app pages.

  7. Press Ctrl/Cmd+C to stop the local web server.

Deploy the app to Cloud Run

With the backing services set up, you can now deploy the Cloud Run service.

  1. Use Cloud Build with the supplied cloudmigrate.yaml configuration to build the image, run the database migrations, and populate the static assets:

    gcloud builds submit --config cloudmigrate.yaml \
        --substitutions _INSTANCE_NAME=INSTANCE_NAME,_REGION=REGION
    

    This first build takes a few minutes to complete.

  2. When the build succeeds, deploy the Cloud Run service for the first time, setting the service region, image, and connected Cloud SQL instance:

    gcloud run deploy polls-service \
        --platform managed \
        --region REGION \
        --image gcr.io/PROJECT_ID/polls-service \
        --add-cloudsql-instances PROJECT_ID:REGION:INSTANCE_NAME \
        --allow-unauthenticated
    

    You should see output that shows the deployment succeeded, with a service URL:

    Service [polls-service] revision [polls-service-00001-tug] has been deployed
    and is serving 100 percent of traffic at https://polls-service-<hash>-uc.a.run.app
    
  3. Now that the service URL is known, update the service to set this value as an environment variable:

    SERVICE_URL=$(gcloud run services describe polls-service --platform managed \
        --region REGION --format "value(status.url)")
    
    gcloud run services update polls-service \
        --platform managed \
        --region REGION \
        --set-env-vars CLOUDRUN_SERVICE_URL=$SERVICE_URL
    
  4. To see the deployed service, go to the service URL.

  5. To log in to the Django admin, append /admin to the service URL, and then log in with the username admin and the password you set earlier.

    To retrieve the superuser password from Secret Manager:

    gcloud secrets versions access latest --secret superuser_password && echo ""
    

Updating the application

While the initial provisioning and deployment steps were complex, making updates is a simpler process:

  1. Run the Cloud Build build and migration script:

    gcloud builds submit --config cloudmigrate.yaml \
        --substitutions _INSTANCE_NAME=INSTANCE_NAME,_REGION=REGION
    
  2. Deploy the service, specifying only the region and image:

    gcloud run deploy polls-service \
        --platform managed \
        --region REGION \
        --image gcr.io/PROJECT_ID/polls-service
    

Configuring for production

You now have a working Django deployment, but there are further steps you can take to ensure that your application is production-ready.

Disable debugging

Confirm that the DEBUG variable in mysite/settings.py is set to False. This prevents detailed error pages from being displayed to users, which could leak information about your configuration.
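Because the sample reads its settings through django-environ (see the settings.py excerpt later on this page), one way to do this, assuming that pattern, is to drive DEBUG from an environment value rather than editing the source:

# Sketch, assuming the sample's django-environ pattern shown later on this
# page: DEBUG defaults to True, and adding DEBUG=False to the django_settings
# secret (or a local .env file) turns it off.
import environ

env = environ.Env(DEBUG=(bool, True))
DEBUG = env("DEBUG")  # set DEBUG=False in the environment for production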

Limit the database user privileges

Any users that are created by using Cloud SQL have the privileges associated with the cloudsqlsuperuser role: CREATEROLE, CREATEDB, and LOGIN.

To prevent the Django database user from having these permissions, manually create the user in PostgreSQL. You need the psql interactive terminal installed, or you can use Cloud Shell, which has this tool preinstalled.

Console

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

  2. In Cloud Shell, use the built-in terminal to connect to your INSTANCE_NAME instance:

    gcloud sql connect INSTANCE_NAME --user postgres
    
  3. Enter the postgres user password.

    You are now using psql. You should see the postgres=> prompt.

  4. Create a user:

    CREATE USER DATABASE_USERNAME WITH PASSWORD 'DATABASE_PASSWORD';
    

    Replace DATABASE_PASSWORD with a random, unique password.

  5. Grant full rights on the new database to the new user:

    GRANT ALL PRIVILEGES ON DATABASE DATABASE_NAME TO DATABASE_USERNAME;
    
  6. Exit psql:

    \q
    

gcloud

  1. Start a connection to the SQL instance:

    gcloud sql connect INSTANCE_NAME --user postgres
    

    Replace INSTANCE_NAME with the created Cloud SQL instance.

  2. Enter the postgres user password.

    You are now using psql. You should see the postgres=> prompt.

  3. Create a user:

    CREATE USER DATABASE_USERNAME WITH PASSWORD 'DATABASE_PASSWORD';
    
  4. Grant full rights on the new database to the new user:

    GRANT ALL PRIVILEGES ON DATABASE DATABASE_NAME TO DATABASE_USERNAME;
    
  5. Exit psql:

    \q
    

Setting minimum permissions

By default, this service is deployed with the default compute service account. However, in some cases, using the default service account can provide too many permissions. If you want to be more restrictive, you need to create your own service account and assign only the permissions that are required by your service. The permissions required can vary from service to service, depending on the resources used by a particular service.

The minimum project roles required by this service are the following:

  • Cloud Run Invoker
  • Cloud SQL Client
  • Storage Object Admin, on the media bucket
  • Secret Manager Secret Accessor, on the Django settings secret. (Access to the Django admin secret is not required by the service itself.)

To create a service account with the required permissions, and assign it to the service, run the following:

  1. In the gcloud CLI, create a service account with the required roles:

    gcloud iam service-accounts create polls-service-account
    SERVICE_ACCOUNT=polls-service-account@PROJECT_ID.iam.gserviceaccount.com
    
    # Cloud Run Invoker
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member serviceAccount:${SERVICE_ACCOUNT} \
        --role roles/run.invoker
    
    # Cloud SQL Client
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member serviceAccount:${SERVICE_ACCOUNT} \
        --role roles/cloudsql.client
    
    # Storage Object Admin, on the media bucket
    gcloud storage buckets add-iam-policy-binding gs://PROJECT_ID_MEDIA_BUCKET \
        --member=serviceAccount:${SERVICE_ACCOUNT} \
        --role=roles/storage.objectAdmin
    
    # Secret Accessor, on the Django settings secret.
    gcloud secrets add-iam-policy-binding django_settings \
        --member serviceAccount:${SERVICE_ACCOUNT} \
        --role roles/secretmanager.secretAccessor
    
  2. Deploy the service, associating it with the new service account:

    gcloud run services update polls-service \
        --platform managed \
        --region REGION \
        --service-account ${SERVICE_ACCOUNT}
    

Understand the code

Sample application

The Django sample app was created using standard Django tooling. The following commands create the project and the polls app:

django-admin startproject mysite
python manage.py startapp polls

The base views, models, and route configurations were copied from Writing your first Django app (Part 1 and Part 2).

Secrets from Secret Manager

The settings.py file contains code that uses the Secret Manager Python API to retrieve the latest version of the named secret, and pull it into the environment (using django-environ):

# SECURITY WARNING: don't run with debug turned on in production!
# Change this to "False" when you are ready for production
env = environ.Env(DEBUG=(bool, True))
env_file = os.path.join(BASE_DIR, ".env")

# Attempt to load the Project ID into the environment, safely failing on error.
try:
    _, os.environ["GOOGLE_CLOUD_PROJECT"] = google.auth.default()
except google.auth.exceptions.DefaultCredentialsError:
    pass

if os.path.isfile(env_file):
    # Use a local secret file, if provided

    env.read_env(env_file)
# ...
elif os.environ.get("GOOGLE_CLOUD_PROJECT", None):
    # Pull secrets from Secret Manager
    project_id = os.environ.get("GOOGLE_CLOUD_PROJECT")

    client = secretmanager.SecretManagerServiceClient()
    settings_name = os.environ.get("SETTINGS_NAME", "django_settings")
    name = f"projects/{project_id}/secrets/{settings_name}/versions/latest"
    payload = client.access_secret_version(name=name).payload.data.decode("UTF-8")

    env.read_env(io.StringIO(payload))
else:
    raise Exception("No local .env or GOOGLE_CLOUD_PROJECT detected. No secrets found.")

The django_settings secret stores multiple secret values, which reduces the number of separate secrets that need to be configured. While the superuser_password could have been created directly on the command line, the file-based method was used instead. If you generate a password on the command line, take care with the length of the randomly generated string (for example, by using head -c), and make sure that the value doesn't end with a newline character, which would cause problems when the password is entered into the Django admin.
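As an alternative to the shell pipeline used earlier, a short Python sketch (standard library only) can generate a fixed-length password with no trailing newline:

# Alternative sketch: generate a 30-character random password with no
# trailing newline, suitable for storing as a Secret Manager secret value.
import secrets
import string

alphabet = string.ascii_letters + string.digits
password = "".join(secrets.choice(alphabet) for _ in range(30))
print(password, end="")  # no newline, so the stored value stays clean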

CSRF configurations

Django has built-in protection against Cross Site Request Forgery (CSRF). Starting in Django 4.0, changes to the way this protection works mean that it's important to tell Django what its hosted URL is, so that it can offer the best protection for users submitting data.

You supply the app's URL as an environment variable (CLOUDRUN_SERVICE_URL, set in an earlier step). The settings.py file reads this value and uses it for the relevant settings.

# SECURITY WARNING: It's recommended that you use this when
# running in production. The URL will be known once you first deploy
# to Cloud Run. This code takes the URL and converts it to both these settings formats.
CLOUDRUN_SERVICE_URL = env("CLOUDRUN_SERVICE_URL", default=None)
if CLOUDRUN_SERVICE_URL:
    ALLOWED_HOSTS = [urlparse(CLOUDRUN_SERVICE_URL).netloc]
    CSRF_TRUSTED_ORIGINS = [CLOUDRUN_SERVICE_URL]
    SECURE_SSL_REDIRECT = True
    SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
else:
    ALLOWED_HOSTS = ["*"]
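For illustration, with a hypothetical Cloud Run URL, the conversion works as follows: urlparse().netloc keeps only the hostname for ALLOWED_HOSTS, while CSRF_TRUSTED_ORIGINS receives the full origin.

# Illustration with a hypothetical service URL (not a real deployment).
from urllib.parse import urlparse

service_url = "https://polls-service-abc123-uc.a.run.app"  # hypothetical
print(urlparse(service_url).netloc)  # polls-service-abc123-uc.a.run.app
print([service_url])                 # value used for CSRF_TRUSTED_ORIGINS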

Local secret overrides

If a .env file is found on the local filesystem, it is used instead of the value from Secret Manager. Creating a .env file locally can help with local testing (e.g. local development against a SQLite database, or other local settings).

Database connection

The settings.py file contains the configuration for your SQL database. It uses the env.db() helper from django-environ to load the connection string set in DATABASE_URL into the DATABASES setting.

When running the application locally and using the Cloud SQL Auth Proxy to access the hosted database, the USE_CLOUD_SQL_AUTH_PROXY flag adjusts the database settings to use the proxy.

# Use django-environ to parse the connection string
DATABASES = {"default": env.db()}

# If the flag has been set, configure to use the proxy
if os.getenv("USE_CLOUD_SQL_AUTH_PROXY", None):
    DATABASES["default"]["HOST"] = "127.0.0.1"
    DATABASES["default"]["PORT"] = 5432

Cloud-stored static assets

The settings.py file also uses django-storages to integrate the Cloud Storage media bucket directly into the project:

# Define static storage via django-storages[google]
GS_BUCKET_NAME = env("GS_BUCKET_NAME")
STATIC_URL = "/static/"
STORAGES = {
    "default": {
        "BACKEND": "storages.backends.gcloud.GoogleCloudStorage",
    },
    "staticfiles": {
        "BACKEND": "storages.backends.gcloud.GoogleCloudStorage",
    },
}
GS_DEFAULT_ACL = "publicRead"
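With this configuration, Django's default file storage writes to the bucket instead of the local filesystem. As a small illustration (with a hypothetical file name, run from a Django shell such as python manage.py shell):

# Illustrative sketch: with the STORAGES configuration above, uploads go to
# the Cloud Storage bucket and are served from storage-generated URLs.
from django.core.files.base import ContentFile
from django.core.files.storage import default_storage

path = default_storage.save("uploads/example.txt", ContentFile(b"hello"))  # hypothetical file
print(default_storage.url(path))  # URL served from the Cloud Storage bucket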

Automation with Cloud Build

The cloudmigrate.yaml file performs not only the typical image build steps (creating the container image and pushing it to the container registry), but also the Django migrate and collectstatic commands. These steps require access to the database, which is provided by the app-engine-exec-wrapper, a helper for the Cloud SQL Auth Proxy:

steps:
  - id: "build image"
    name: "gcr.io/cloud-builders/docker"
    args: ["build", "-t", "gcr.io/${PROJECT_ID}/${_SERVICE_NAME}", "."]

  - id: "push image"
    name: "gcr.io/cloud-builders/docker"
    args: ["push", "gcr.io/${PROJECT_ID}/${_SERVICE_NAME}"]

  - id: "apply migrations"
    name: "gcr.io/google-appengine/exec-wrapper"
    args:
      [
        "-i",
        "gcr.io/$PROJECT_ID/${_SERVICE_NAME}",
        "-s",
        "${PROJECT_ID}:${_REGION}:${_INSTANCE_NAME}",
        "-e",
        "SETTINGS_NAME=${_SECRET_SETTINGS_NAME}",
        "--",
        "python",
        "manage.py",
        "migrate",
      ]

  - id: "collect static"
    name: "gcr.io/google-appengine/exec-wrapper"
    args:
      [
        "-i",
        "gcr.io/$PROJECT_ID/${_SERVICE_NAME}",
        "-s",
        "${PROJECT_ID}:${_REGION}:${_INSTANCE_NAME}",
        "-e",
        "SETTINGS_NAME=${_SECRET_SETTINGS_NAME}",
        "--",
        "python",
        "manage.py",
        "collectstatic",
        "--verbosity",
        "2",
        "--no-input",
      ]

substitutions:
  _INSTANCE_NAME: django-instance
  _REGION: us-central1
  _SERVICE_NAME: polls-service
  _SECRET_SETTINGS_NAME: django_settings

images:
  - "gcr.io/${PROJECT_ID}/${_SERVICE_NAME}"

This configuration uses substitution variables. If you change the values directly in the file, you can omit the --substitutions flag when you run the build.

In this configuration, only existing migrations are applied. You need to create migrations locally by using the Cloud SQL Auth Proxy method described in "Run the app on your local computer". You can extend this template to run other manage.py commands as required.

To extend the Cloud Build configuration so that it also performs the deployment, which removes the need to run two commands, see Continuous deployment from git using Cloud Build. This requires IAM changes, as described there.

Superuser creation with data migrations

The Django management command createsuperuser is an interactive command: the user enters information in response to prompts. While you could run it by combining the Cloud SQL Auth Proxy with a local Docker setup, another approach is to create the superuser as a data migration:

import os

from django.contrib.auth.models import User
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps

import google.auth
from google.cloud import secretmanager


def createsuperuser(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
    """
    Dynamically create an admin user as part of a migration
    Password is pulled from Secret Manager (previously created as part of the tutorial)
    """
    if os.getenv("TRAMPOLINE_CI", None):
        # We are in CI, so just create a placeholder user for unit testing.
        admin_password = "test"
    else:
        client = secretmanager.SecretManagerServiceClient()

        # Get project value for identifying current context
        _, project = google.auth.default()

        # Retrieve the previously stored admin password
        PASSWORD_NAME = os.environ.get("PASSWORD_NAME", "superuser_password")
        name = f"projects/{project}/secrets/{PASSWORD_NAME}/versions/latest"
        admin_password = client.access_secret_version(name=name).payload.data.decode(
            "UTF-8"
        )

    # Create a new user using acquired password, stripping any accidentally stored newline characters
    User.objects.create_superuser("admin", password=admin_password.strip())


class Migration(migrations.Migration):

    initial = True
    dependencies = []
    operations = [migrations.RunPython(createsuperuser)]

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.

Delete the project

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

What's next