On September 15, 2026, all Cloud Composer 1 and Cloud Composer 2 version 2.0.x environments will reach their planned end of life and you will no longer be able to use them. We recommend that you plan your migration to Cloud Composer 3.
This page shows how to configure
resource location restrictions
so that the data stored by Cloud Composer is kept within the locations that you specify.
How location restrictions work
Location restrictions for Cloud Composer are determined by
the organization policy that applies to the project where
the Cloud Composer environment is created. This policy is either assigned
within the project or inherited from the organization.
With location restrictions enabled, it is not possible to create
an environment in a region that the policy prohibits. If a region
is listed in the Deny list, or is not listed in the Allow list, you cannot
create environments in that region.
To allow environments to be created, the policy must allow the whole region,
not just a specific zone within that region. For example, the europe-west3 region
must be allowed by the policy in order to create
Cloud Composer environments in that region.
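As an illustrative sketch only, assuming the standard gcp.resourceLocations constraint and a placeholder PROJECT_ID (the project ID is not defined on this page), you could inspect the effective policy and allow an entire region with gcloud:

# View the resource locations policy that applies to the project.
gcloud resource-manager org-policies describe \
    constraints/gcp.resourceLocations \
    --project=PROJECT_ID --effective

# Allow the whole europe-west3 region by using its value group.
gcloud resource-manager org-policies allow \
    constraints/gcp.resourceLocations \
    in:europe-west3-locations --project=PROJECT_ID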
Cloud Composer checks location restrictions at:
Environment creation (a minimal creation command follows this list).
Environment upgrade, if any additional resources are created during the operation.
Environment update, for older environments that do not enforce location restrictions on Cloud Composer dependencies.
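As a sketch, with a placeholder environment name and all optional flags omitted, the --location value of the creation command is what gets validated against the policy:

gcloud composer environments create example-environment \
    --location=europe-west3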
In addition to checking location restrictions, Cloud Composer
does the following:
Stores user-customized Airflow images in regional Artifact Registry repositories (see the listing command after this list). For example, such images are created when you install
custom PyPI packages in your environment.
If the US multi-region is explicitly prohibited by the policy, the use of Cloud Build is disabled. In this case, user-customized Airflow images are built in your environment's cluster.
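As a quick check, assuming a placeholder PROJECT_ID and the region from the earlier example, you can list the regional Artifact Registry repositories where these images are kept:

gcloud artifacts repositories list \
    --location=europe-west3 --project=PROJECT_ID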
Install a Python dependency to a private IP environment with resource location restrictions
If you set resource location restrictions for your project, then Cloud Build can't be used to install Python packages. As a consequence,
direct access to repositories on the public internet is disabled.
To install Python dependencies in a private IP environment when your location
restrictions don't allow the US multi-region, use
one of the following options:
Use a private PyPI repository hosted in your VPC network.
Use a
proxy server
in your VPC network to connect to a PyPI repository on the public
internet. Specify the proxy address in the /config/pip/pip.conf file in the Cloud Storage bucket (see the sketch after this list).
If your security policy permits access to your VPC network from external IP addresses, you can configure Cloud NAT.
Store the Python dependencies in the dags folder in the Cloud Storage bucket to install them as local libraries.
This might not be a good option if the dependency tree is large.
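A minimal sketch of the proxy option, assuming a hypothetical proxy at proxy.example.internal:3128 and the placeholder bucket name BUCKET_NAME; the only requirement stated here is that the proxy address appears in /config/pip/pip.conf:

# pip.conf
[global]
proxy = http://proxy.example.internal:3128

# Upload the file to the expected path in the environment's bucket.
gcloud storage cp pip.conf gs://BUCKET_NAME/config/pip/pip.conf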
Restrict locations for Cloud Composer logs
If your Cloud Composer logs contain sensitive data, we recommend that you redirect them to a regional Cloud Storage bucket. To do so, use
a log sink. After you redirect the logs to
a Cloud Storage bucket, they are no longer sent to Cloud Logging.
Create a new Cloud Storage bucket.

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION

Replace:
LOCATION with the region where the environment is located.
BUCKET_NAME with the name of the bucket. For example,
composer-logs-us-central1-example-environment.
Create a new log sink.
gcloud logging sinks create \
    composer-log-sink-ENVIRONMENT_NAME \
    storage.googleapis.com/BUCKET_NAME \
    --log-filter "resource.type=cloud_composer_environment AND \
    resource.labels.environment_name=ENVIRONMENT_NAME AND \
    resource.labels.location=LOCATION"
Replace:
ENVIRONMENT_NAME with the name of the environment.
BUCKET_NAME with the name of the bucket.
LOCATION with the region where the environment is located.
The output of the previous command contains the service account number. Grant the Storage Object Creator role to this service account:

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:SA_NUMBER@gcp-sa-logging.iam.gserviceaccount.com" \
    --role='roles/storage.objectCreator' \
    --condition=None

Replace:
PROJECT_ID with the Project ID.
SA_NUMBER with the service account number provided by the gcloud logging sinks create command in the previous step.
Exclude the logs for your environment from Logging. Note that audit logs cannot be excluded; they are always sent to the default storage.
gcloud logging sinks update _Default \
    --add-exclusion name=ENVIRONMENT_NAME-exclusion,filter=\
    "resource.type=cloud_composer_environment AND \
    resource.labels.environment_name=ENVIRONMENT_NAME AND \
    resource.labels.location=LOCATION"
Replace:
ENVIRONMENT_NAME with the name of the environment.
LOCATION with the region where the environment is located.
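As an optional sanity check, using the sink and bucket names from the steps above, you might confirm the sink and the exclusion and then look for new log objects in the bucket:

gcloud logging sinks describe composer-log-sink-ENVIRONMENT_NAME
gcloud logging sinks describe _Default
gcloud storage ls gs://BUCKET_NAME/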