Delete an online prediction model

Preview: Online Prediction is a Preview feature that is available as-is and is not recommended for production environments. Google provides no service-level agreements (SLA) or technical support commitments for Preview features. For more information, see GDC's feature stages.

This page describes the process to delete an online prediction model and all the resources associated with it.
Before you begin
To get the permissions that you need to access Online Prediction, ask your Project IAM Admin to grant you the Vertex AI Prediction User (vertex-ai-prediction-user) role. For information about this role, see Prepare IAM permissions.
Additionally, to get the permissions that you need to delete objects in a bucket, ask your Project IAM Admin to grant you the Project Bucket Object Admin (project-bucket-object-admin) role in the project.
Delete resources
If you want to delete an online prediction model and all the resources associated with it, follow these steps:
1. Delete the DeployedModel custom resource associated with your model on the prediction cluster:

       kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG delete -f DEPLOYED_MODEL_NAME.yaml

   Replace the following:

   - PREDICTION_CLUSTER_KUBECONFIG: the path to the kubeconfig file in the prediction cluster.
   - DEPLOYED_MODEL_NAME: the name of the DeployedModel definition file.

2. Edit the Endpoint custom resource in one of the following ways:

   - If the endpoint that the DeployedModel uses doesn't host other models, delete the Endpoint custom resource on the prediction cluster:

         kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG delete -f ENDPOINT_NAME.yaml

     Replace ENDPOINT_NAME with the name of the Endpoint definition file.

   - If the endpoint that the DeployedModel uses hosts other models, perform the following steps:

     1. Update the Endpoint custom resource on the prediction cluster:

            kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG edit -f ENDPOINT_NAME.yaml

        Replace ENDPOINT_NAME with the name of the Endpoint definition file.

     2. In the YAML file, manually delete the serviceRef object that contains the reference to the DeployedModel you deleted previously (an illustrative sketch of this edit follows these steps).

     3. Save the changes to the YAML file.

3. Delete your model from the storage bucket. For more information about how to delete objects from storage buckets, see Delete storage objects in projects. An illustrative command-line sketch also follows these steps.
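The exact schema of the Endpoint custom resource isn't shown on this page, so the following manifest is only a hypothetical sketch of the edit in step 2: an Endpoint whose destinations list references two deployed models through serviceRef objects. The API group, the field names other than serviceRef, and all resource names are assumptions for illustration, not documented values.

    # Hypothetical ENDPOINT_NAME.yaml before the edit. The apiVersion, the
    # destinations field, and all names are illustrative assumptions.
    apiVersion: prediction.aiplatform.gdc.goog/v1  # assumed API group/version
    kind: Endpoint
    metadata:
      name: my-endpoint        # placeholder endpoint name
      namespace: my-project    # placeholder project namespace
    spec:
      destinations:
        - serviceRef:
            name: deployed-model-a   # model that keeps serving on this endpoint
        - serviceRef:                # remove this whole entry: it references the
            name: deployed-model-b   # DeployedModel deleted in step 1

Removing the second serviceRef entry and saving the file inside the kubectl edit session applies the change to the prediction cluster.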
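Delete storage objects in projects is the authoritative reference for step 3. As a minimal sketch only, assuming the project bucket exposes an S3-compatible endpoint and that credentials for it are already configured, removing the model artifacts could look like the following command; the bucket name, object prefix, and endpoint URL are placeholders.

    # Illustrative only: assumes an S3-compatible endpoint and preconfigured credentials.
    # Bucket name, prefix, and endpoint URL are placeholders, not documented values.
    aws s3 rm s3://my-project-models/my-model/ \
        --recursive \
        --endpoint-url https://OBJECT_STORAGE_ENDPOINT

Any S3-compatible client can perform the same operation; the goal is simply to remove every object that belongs to the deleted model from the bucket.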