Resource location restrictions

This page shows how to configure resource location restrictions so that the data stored by Cloud Composer is kept within the locations you specify.

Respecting location restrictions

Location restrictions for Cloud Composer are determined by the organization policy applied to the project in which the Cloud Composer environment is created (whether the policy is set on the project itself or inherited from the organization). In particular, you cannot create an environment in a region that the policy prohibits (on the Deny list, or not on the Allow list). Keep in mind that Cloud Composer is region based, so for an environment to be created successfully the whole region (e.g. europe-west3, not a specific zone) must be allowed by the policy.
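
For example, you can inspect the effective resource locations policy on a project with the Resource Manager CLI (a quick check; example-project is a placeholder for your project ID):

    gcloud resource-manager org-policies describe gcp.resourceLocations \
      --project example-project --effective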

Location restrictions are checked at:

  • Environment creation
  • Environment upgrade, if it leads to creation of any additional resources
  • Environment update, for older environments that don't enforce location restrictions on Cloud Composer dependencies.

All environments check the location restrictions as described above. In addition, for the Preview of this feature, customers can enforce those restrictions on Cloud Composer dependencies.

This includes:

  • Storing user-customized Airflow images in regional Artifact Registry repositories
  • Bypassing the use of Cloud Build (which is a global service)
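
If you want to confirm where these repositories live, you can list the Artifact Registry repositories in your environment's region (example-project and europe-west3 are placeholders; the repository created for your environment may have a Composer-generated name):

    gcloud artifacts repositories list \
      --project example-project --location europe-west3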

Those changes are applied only to Cloud Composer environments with version 1.13.1 or later, and in projects with location restrictions enabled. Because this feature is released as a Preview, you must use the Cloud Composer Beta API or gcloud beta commands to access it.
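
For example, a minimal creation command through the beta CLI might look like the following (example-environment is a placeholder, and the image version is only illustrative; choose a version of 1.13.1 or later that is available in your region):

    gcloud beta composer environments create example-environment \
      --location europe-west3 \
      --image-version composer-1.13.1-airflow-1.10.10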

Installing a Python dependency in a private IP environment with resource location restrictions

Keeping your project in line with the location restrictions prohibits the use of some tools. In particular, Cloud Build cannot be used for package installation, preventing direct access to repositories on the public internet.

To install Python dependencies for a private IP Composer environment when your location restrictions do not allow the US multiregion (used by Cloud Build), you have several options:

  1. Use a private PyPI repository hosted in your VPC network.
  2. Use a proxy server in your VPC network to connect to a PyPI repository on the public internet. Specify the proxy address in the /config/pip/pip.conf file in the Cloud Storage bucket, as shown in the sketch after this list.
  3. If your security policy permits access to external IP addresses from your VPC network, you can enable installation from the public internet by configuring Cloud NAT.
  4. Vendor the Python dependencies into the dags folder in the Cloud Storage bucket to install them as local libraries. This may not be a good option if the dependency tree is large.
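
A minimal pip.conf for options 1 and 2 might look like the following (the repository URL and proxy address are placeholders; include only the lines that apply to your setup):

    [global]
    index-url = https://pypi.example.internal/simple/
    proxy = http://proxy.example.internal:3128

Upload the file to your environment's bucket (${environment_bucket} is a placeholder for its name) so that it is used during package installation:

    gsutil cp pip.conf gs://${environment_bucket}/config/pip/pip.conf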

Restricting locations for Cloud Composer logs

If you expect your logs to contain sensitive data, you can also choose to redirect Cloud Composer logs to a regional Cloud Storage bucket through the Logs Router. This prevents your logs from being sent to Logging. If you need support from Google Cloud Support, you will need to grant Google support engineers access to the Cloud Composer logs stored in Cloud Storage.

gcloud

  1. Create a new Cloud Storage bucket (e.g. composer-logs-${location}-${envname}).

    gsutil mb -l ${location} gs://${bucket_name}
    
  2. Create a new log sink.

    gcloud logging sinks create composer-log-sink-${envname} storage.googleapis.com/${bucket_name} \
      --log-filter "resource.type=cloud_composer_environment AND resource.labels.environment_name=${envname} AND resource.labels.location=${location}"
    
  3. Grant the appropriate role to the sink's service account (shown in the output of the previous command) so that it can write logs to this bucket.

    gcloud projects add-iam-policy-binding ${project} --member="serviceAccount:${serviceAccountNumber}@gcp-sa-logging.iam.gserviceaccount.com" --role='roles/storage.objectCreator' --condition=None
    
  4. Exclude the logs for your new environment from Logging.

    gcloud beta logging sinks update _Default --add-exclusion name=${envname}-exclusion,filter="resource.type=cloud_composer_environment AND resource.labels.environment_name=${envname} AND resource.labels.location=${location}"
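
    # Optional sanity check (not part of the documented steps): confirm that
    # the sink and the exclusion are in place.
    gcloud logging sinks describe composer-log-sink-${envname}
    gcloud logging sinks describe _Default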
    

Preview scope and limitations

  • Audit logs cannot be excluded. They are always sent to the default storage location.
  • Cloud Composer does not yet enforce location restrictions on user-customized Airflow images in the following regions:

    • asia-northeast3
    • us-west3
    • us-west4