Delete an online prediction model

This page describes the process to delete an online prediction model and all the resources associated with it.
Before you begin
To get the permissions that you need to access Online Prediction, ask your Project IAM Admin to grant you the Vertex AI Prediction User (vertex-ai-prediction-user) role. For information about this role, see Prepare IAM permissions.

Additionally, to get the permissions that you need to delete objects in a bucket, ask your Project IAM Admin to grant you the Project Bucket Object Admin (project-bucket-object-admin) role in the project.
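
If you want to confirm your access before you start, a minimal check on the prediction cluster is sketched below. It assumes that the granted role is enforced as Kubernetes RBAC on that cluster and that the custom resource's lowercase plural is deployedmodels; both details are assumptions, so adjust them to your installation.

    # Check whether the current credentials can delete the custom resource type
    # used in the steps on this page. The resource name "deployedmodels" is an
    # assumed lowercase plural and can differ in your installation.
    kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG auth can-i delete deployedmodels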
Delete resources
If you want to delete an online prediction model and all the resources associated with it, perform the following steps:

1. Delete the DeployedModel custom resource associated with your model on the prediction cluster:

       kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG delete -f DEPLOYED_MODEL_NAME.yaml

   Replace the following:

   - PREDICTION_CLUSTER_KUBECONFIG: the path to the kubeconfig file in the prediction cluster.
   - DEPLOYED_MODEL_NAME: the name of the DeployedModel definition file.

   For a filled-in sketch of this step and the next one, with hypothetical file names, see the example after this procedure.
2. Edit the Endpoint custom resource in one of the following ways:

   - If the endpoint that the DeployedModel uses doesn't host other models, delete the Endpoint custom resource on the prediction cluster:

         kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG delete -f ENDPOINT_NAME.yaml

     Replace ENDPOINT_NAME with the name of the Endpoint definition file.

   - If the endpoint that the DeployedModel uses hosts other models, perform the following steps:

     1. Update the Endpoint custom resource on the prediction cluster:

            kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG edit -f ENDPOINT_NAME.yaml

        Replace ENDPOINT_NAME with the name of the Endpoint definition file.

     2. In the YAML file, manually delete the serviceRef object that contains the DeployedModel reference you deleted previously.

     3. Save the changes to the YAML file.
3. Delete your model from the storage bucket. For more information about how to delete objects from storage buckets, see Delete storage objects in projects. An illustrative client-side sketch follows this procedure.
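
The following is a minimal, filled-in sketch of steps 1 and 2 for the case in which the endpoint hosts other models. The file names my-deployed-model.yaml and my-endpoint.yaml are hypothetical, and the grep call is only a convenience for locating the serviceRef entry before you edit; it is not part of the documented procedure.

    # Step 1: delete the DeployedModel custom resource (hypothetical file name).
    kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG delete -f my-deployed-model.yaml

    # Step 2: locate the serviceRef entries in the Endpoint definition so you
    # remove only the reference that belonged to the deleted DeployedModel.
    grep -n serviceRef my-endpoint.yaml

    # Open the Endpoint resource for editing, delete that serviceRef object,
    # and save the file to apply the change.
    kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG edit -f my-endpoint.yaml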
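
For step 3, follow the procedure in Delete storage objects in projects. Purely as an illustration, the sketch below assumes the bucket exposes an S3-compatible endpoint and that you already have credentials for it; the client, bucket name, object prefix, and endpoint URL are all hypothetical.

    # Remove the exported model artifacts with a generic S3-compatible client.
    # Bucket, prefix, and endpoint are placeholders; use the values from your project.
    aws s3 rm s3://my-project-models/my-model/ --recursive \
        --endpoint-url https://objectstorage.example.com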
[[["Fácil de entender","easyToUnderstand","thumb-up"],["Meu problema foi resolvido","solvedMyProblem","thumb-up"],["Outro","otherUp","thumb-up"]],[["Difícil de entender","hardToUnderstand","thumb-down"],["Informações incorretas ou exemplo de código","incorrectInformationOrSampleCode","thumb-down"],["Não contém as informações/amostras de que eu preciso","missingTheInformationSamplesINeed","thumb-down"],["Problema na tradução","translationIssue","thumb-down"],["Outro","otherDown","thumb-down"]],["Última atualização 2025-09-04 UTC."],[[["\u003cp\u003eOnline Prediction is a Preview feature not recommended for production environments and lacks service-level agreements or technical support.\u003c/p\u003e\n"],["\u003cp\u003eDeleting an online prediction model involves removing the associated \u003ccode\u003eDeployedModel\u003c/code\u003e custom resource from the prediction cluster using \u003ccode\u003ekubectl\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eDepending on whether the \u003ccode\u003eEndpoint\u003c/code\u003e hosts other models, you must either delete the entire \u003ccode\u003eEndpoint\u003c/code\u003e custom resource or edit it to remove the deleted \u003ccode\u003eDeployedModel\u003c/code\u003e's \u003ccode\u003eserviceRef\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eAfter removing the \u003ccode\u003eDeployedModel\u003c/code\u003e and adjusting the \u003ccode\u003eEndpoint\u003c/code\u003e, the final step is to delete the model from its storage bucket.\u003c/p\u003e\n"]]],[],null,["# Delete an online prediction model\n\n| **Preview:** Online Prediction is a Preview feature that is available as-is and is not recommended for production environments. Google provides no service-level agreements (SLA) or technical support commitments for Preview features. For more information, see GDC's [feature stages](/distributed-cloud/hosted/docs/latest/gdch/resources/feature-stages).\n\nThis page describes the process to delete an online prediction model and all the\nresources associated with it.\n\nBefore you begin\n----------------\n\nTo get the permissions that you need to access Online Prediction,\nask your Project IAM Admin to grant you the Vertex AI\nPrediction User (`vertex-ai-prediction-user`) role.\n\nFor information about this role, see\n[Prepare IAM permissions](/distributed-cloud/hosted/docs/latest/gdch/application/ao-user/vertex-ai-ao-permissions).\n\nAdditionally, to get the permissions that you need to delete objects in a\nbucket, ask your Project IAM Admin to grant you the Project Bucket Object Admin\n(`project-bucket-object-admin`) role in the project.\n\nDelete resources\n----------------\n\nIf you want to delete an online prediction model and all the resources\nassociated with it, perform the following steps:\n\n1. Delete the `DeployedModel` custom resource associated with your model\n on the prediction cluster:\n\n kubectl --kubeconfig \u003cvar translate=\"no\"\u003ePREDICTION_CLUSTER_KUBECONFIG\u003c/var\u003e delete -f \u003cvar translate=\"no\"\u003eDEPLOYED_MODEL_NAME\u003c/var\u003e.yaml\n\n Replace the following:\n - \u003cvar translate=\"no\"\u003ePREDICTION_CLUSTER_KUBECONFIG\u003c/var\u003e: the path to the kubeconfig file in the prediction cluster.\n - \u003cvar translate=\"no\"\u003eDEPLOYED_MODEL_NAME\u003c/var\u003e: the name of the `DeployedModel` definition file.\n2. 
Edit the `Endpoint` custom resource in one of the following ways:\n\n - If the endpoint that the `DeployedModel` uses doesn't host other models,\n delete the `Endpoint` custom resource on the prediction cluster:\n\n kubectl --kubeconfig \u003cvar translate=\"no\"\u003ePREDICTION_CLUSTER_KUBECONFIG\u003c/var\u003e delete -f \u003cvar translate=\"no\"\u003eENDPOINT_NAME\u003c/var\u003e.yaml\n\n Replace \u003cvar translate=\"no\"\u003eENDPOINT_NAME\u003c/var\u003e with the name of the\n `Endpoint` definition file.\n - If the endpoint that the `DeployedModel` uses hosts other models,\n perform the following steps:\n\n 1. Update the `Endpoint` custom resource on the prediction cluster:\n\n kubectl --kubeconfig \u003cvar translate=\"no\"\u003ePREDICTION_CLUSTER_KUBECONFIG\u003c/var\u003e edit -f \u003cvar translate=\"no\"\u003eENDPOINT_NAME\u003c/var\u003e.yaml\n\n Replace \u003cvar translate=\"no\"\u003eENDPOINT_NAME\u003c/var\u003e with the name of the\n `Endpoint` definition file.\n 2. On the YAML file, manually delete the `serviceRef` object containing\n the `DeployedModel` reference you deleted previously.\n\n 3. Save the changes on the YAML file.\n\n3. Delete your model from the storage bucket. For more information about how to\n delete objects from storage buckets, see [Delete storage objects in projects](/distributed-cloud/hosted/docs/latest/gdch/application/ao-user/delete-storage-objects)."]]