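Authenticate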
To use the OpenAI Python libraries, install the OpenAI SDK:
pip install openai
To authenticate with the Chat Completions API, you can either modify your client setup or change your environment configuration to use Google authentication and a Vertex AI endpoint. Choose whichever method is easier, and follow the setup steps depending on whether you want to call Gemini models or self-deployed Model Garden models.
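Certain models in Model Garden and supported Hugging Face models must first be deployed to a Vertex AI endpoint before they can serve requests. When calling these self-deployed models from the Chat Completions API, you need to specify the endpoint ID. To list your existing Vertex AI endpoints, use the gcloud ai endpoints list command.

Client setup

To programmatically get Google credentials in Python, you can use the google-auth Python SDK:

pip install google-auth requests

Python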
import openai

from google.auth import default
import google.auth.transport.requests

# TODO(developer): Update and un-comment below lines
# project_id = "PROJECT_ID"
# location = "us-central1"

# Programmatically get an access token
credentials, _ = default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(google.auth.transport.requests.Request())
# Note: the credential lives for 1 hour by default (https://cloud.google.com/docs/authentication/token-types#at-lifetime); after expiration, it must be refreshed.

##############################
# Choose one of the following:
##############################

# If you are calling a Gemini model, set the ENDPOINT_ID variable to use openapi.
ENDPOINT_ID = "openapi"

# If you are calling a self-deployed model from Model Garden, set the
# ENDPOINT_ID variable and set the client's base URL to use your endpoint.
# ENDPOINT_ID = "YOUR_ENDPOINT_ID"

# OpenAI Client
client = openai.OpenAI(
    base_url=f"https://{location}-aiplatform.googleapis.com/v1/projects/{project_id}/locations/{location}/endpoints/{ENDPOINT_ID}",
    api_key=credentials.token,
)
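By default, access tokens last for 1 hour. You can extend the life of your access token or periodically refresh your token and update the openai.api_key variable.

Environment variables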
Install the Google Cloud CLI. The OpenAI library can read the OPENAI_API_KEY and OPENAI_BASE_URL environment variables to change the authentication and endpoint in its default client.
Set the following variables:
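$ export PROJECT_ID=PROJECT_ID
$ export LOCATION=LOCATION
$ export OPENAI_API_KEY="$(gcloud auth application-default print-access-token)"

To call a Gemini model, set the MODEL_ID variable and use the openapi endpoint:

$ export MODEL_ID=MODEL_ID
$ export OPENAI_BASE_URL="https://${LOCATION}-aiplatform.googleapis.com/v1beta1/projects/${PROJECT_ID}/locations/${LOCATION}/endpoints/openapi"

To call a self-deployed model from Model Garden, set the ENDPOINT variable and use that in your URL instead:

$ export ENDPOINT=ENDPOINT_ID
$ export OPENAI_BASE_URL="https://${LOCATION}-aiplatform.googleapis.com/v1beta1/projects/${PROJECT_ID}/locations/${LOCATION}/endpoints/${ENDPOINT}"

Next, initialize the client:

client = openai.OpenAI()

As a minimal sketch (not part of the original sample), assuming the variables above are exported and the openapi (Gemini) endpoint is used, a request through the environment-configured client might look like this:

import openai

# The client reads OPENAI_API_KEY and OPENAI_BASE_URL from the environment.
client = openai.OpenAI()

response = client.chat.completions.create(
    model="google/gemini-2.0-flash-001",  # assumed Gemini model ID; adjust to your MODEL_ID
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response)

The Gemini Chat Completions API uses OAuth to authenticate with a short-lived access token. By default, access tokens last for 1 hour. You can extend the life of your access token or periodically refresh your token and update the OPENAI_API_KEY environment variable.

Refresh your credentials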
The following example shows how to refresh your credentials automatically as needed:
Python
from typing import Any

import google.auth
import google.auth.transport.requests
import openai


class OpenAICredentialsRefresher:
    def __init__(self, **kwargs: Any) -> None:
        # Set a placeholder key here
        self.client = openai.OpenAI(**kwargs, api_key="PLACEHOLDER")
        self.creds, self.project = google.auth.default(
            scopes=["https://www.googleapis.com/auth/cloud-platform"]
        )

    def __getattr__(self, name: str) -> Any:
        if not self.creds.valid:
            self.creds.refresh(google.auth.transport.requests.Request())

            if not self.creds.valid:
                raise RuntimeError("Unable to refresh auth")

            self.client.api_key = self.creds.token

        return getattr(self.client, name)


# TODO(developer): Update and un-comment below lines
# project_id = "PROJECT_ID"
# location = "us-central1"

client = OpenAICredentialsRefresher(
    base_url=f"https://{location}-aiplatform.googleapis.com/v1/projects/{project_id}/locations/{location}/endpoints/openapi",
)

response = client.chat.completions.create(
    model="google/gemini-2.0-flash-001",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)

print(response)
What's next
See examples of calling the Chat Completions API with the OpenAI-compatible syntax.
See examples of calling the Inference API with the OpenAI-compatible syntax.
See examples of calling the Function Calling API with the OpenAI-compatible syntax.
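Learn more about the Gemini API.
Learn more about migrating from Azure OpenAI to the Gemini API.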