Fine-tune an image classification model with custom data on Vertex AI Pipelines
This tutorial shows you how to use Vertex AI Pipelines to run an end-to-end ML workflow, including the following tasks:

- Import and transform data.
- Fine-tune an image classification model from TFHub using the transformed data.
- Import the trained model to Vertex AI Model Registry.
- Optional: Deploy the model for online serving with Vertex AI Inference.

Before you begin

1. Ensure that you've completed steps 1-3 in Set up a project (/vertex-ai/docs/start/cloud-environment#set_up_a_project).
2. Create an isolated Python environment and install the Vertex AI SDK for Python (/vertex-ai/docs/start/install-sdk).
3. Install the Kubeflow Pipelines SDK:

   python3 -m pip install "kfp<2.0.0" "google-cloud-aiplatform>=1.16.0" --upgrade --quiet

Run the ML model training pipeline

The sample code does the following:

- Loads components from a component repository (https://github.com/GoogleCloudPlatform/vertex-ai-samples/tree/main/community-content/pipeline_components) to be used as pipeline building blocks.
- Composes a pipeline by creating component tasks and passing data between them using arguments.
- Submits the pipeline for execution on Vertex AI Pipelines. See Vertex AI Pipelines pricing (/vertex-ai/pricing#pipelines).

Copy the following sample code into your development environment and run it.
Image classification
    # python3 -m pip install "kfp<2.0.0" "google-cloud-aiplatform>=1.16.0" --upgrade --quiet
    from kfp import components
    from kfp.v2 import dsl

    # %% Loading components
    upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Upload_Tensorflow_model/component.yaml')
    deploy_model_to_endpoint_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Deploy_to_endpoint/component.yaml')
    transcode_imagedataset_tfrecord_from_csv_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/transcode_tfrecord_image_dataset_from_csv/component.yaml')
    load_image_classification_model_from_tfhub_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/b5b65198a6c2ffe8c0fa2aa70127e3325752df68/community-content/pipeline_components/image_ml_model_training/load_image_classification_model/component.yaml')
    preprocess_image_data_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/preprocess_image_data/component.yaml')
    train_tensorflow_image_classification_model_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/train_image_classification_model/component.yaml')


    # %% Pipeline definition
    def image_classification_pipeline():
        class_names = ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
        csv_image_data_path = 'gs://cloud-samples-data/ai-platform/flowers/flowers.csv'
        deploy_model = False

        image_data = dsl.importer(
            artifact_uri=csv_image_data_path, artifact_class=dsl.Dataset).output

        image_tfrecord_data = transcode_imagedataset_tfrecord_from_csv_op(
            csv_image_data_path=image_data,
            class_names=class_names,
        ).outputs['tfrecord_image_data_path']

        loaded_model_outputs = load_image_classification_model_from_tfhub_op(
            class_names=class_names,
        ).outputs

        preprocessed_data = preprocess_image_data_op(
            image_tfrecord_data,
            height_width_path=loaded_model_outputs['image_size_path'],
        ).outputs

        trained_model = (train_tensorflow_image_classification_model_op(
            preprocessed_training_data_path=preprocessed_data['preprocessed_training_data_path'],
            preprocessed_validation_data_path=preprocessed_data['preprocessed_validation_data_path'],
            model_path=loaded_model_outputs['loaded_model_path'])
            .set_cpu_limit('96')
            .set_memory_limit('128G')
            .add_node_selector_constraint('cloud.google.com/gke-accelerator', 'NVIDIA_TESLA_A100')
            .set_gpu_limit('8')
            .outputs['trained_model_path'])

        vertex_model_name = upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op(
            model=trained_model,
        ).outputs['model_name']

        # Deploying the model might incur additional costs over time
        if deploy_model:
            vertex_endpoint_name = deploy_model_to_endpoint_op(
                model_name=vertex_model_name,
            ).outputs['endpoint_name']

    pipeline_func = image_classification_pipeline

    # %% Pipeline submission
    if __name__ == '__main__':
        from google.cloud import aiplatform
        aiplatform.PipelineJob.from_pipeline_func(pipeline_func=pipeline_func).submit()
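The training step in the sample configures its compute resources through chained calls (set_cpu_limit, set_memory_limit, add_node_selector_constraint, set_gpu_limit). This fluent style works because each setter returns the task object itself. The following is a minimal plain-Python sketch of that builder pattern; the TaskConfig class is hypothetical and for illustration only, not part of the KFP API:

```python
class TaskConfig:
    """Hypothetical container mimicking KFP's chainable resource setters."""

    def __init__(self):
        self.limits = {}
        self.node_selector = {}

    def set_cpu_limit(self, cpu):
        self.limits['cpu'] = cpu
        return self  # returning self is what enables chaining

    def set_memory_limit(self, memory):
        self.limits['memory'] = memory
        return self

    def add_node_selector_constraint(self, label, value):
        self.node_selector[label] = value
        return self

    def set_gpu_limit(self, gpu):
        self.limits['gpu'] = gpu
        return self


# Mirror the resource configuration used in the sample pipeline.
task = (TaskConfig()
        .set_cpu_limit('96')
        .set_memory_limit('128G')
        .add_node_selector_constraint('cloud.google.com/gke-accelerator',
                                      'NVIDIA_TESLA_A100')
        .set_gpu_limit('8'))
print(task.limits)
```

Because every setter returns the same object, the order of the calls doesn't matter and the chain can be broken across lines freely.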
Note the following about the sample code provided:
- A Kubeflow pipeline is defined as a Python function.
- The pipeline's workflow steps are created using Kubeflow pipeline components. By using the outputs of a component as an input of another component, you define the pipeline's workflow as a graph. For example, the preprocess_image_data_op component task depends on the tfrecord_image_data_path output of the transcode_imagedataset_tfrecord_from_csv_op component task.
- You create a pipeline run on Vertex AI Pipelines using the Vertex AI SDK for Python.
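This output-to-input wiring is what turns a flat list of component calls into a directed acyclic graph: a task can only run once every task producing one of its inputs has finished. The scheduling idea can be sketched in plain Python with a topological sort; the task names below are shorthand for the sample pipeline's steps, and this is an illustration of the concept, not how KFP is implemented:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Each task maps to the set of tasks whose outputs it consumes,
# mirroring the dependencies in the sample pipeline.
dependencies = {
    'importer': set(),
    'transcode': {'importer'},
    'load_model': set(),
    'preprocess': {'transcode', 'load_model'},
    'train': {'preprocess', 'load_model'},
    'upload': {'train'},
}

# static_order() yields a valid execution order: every task appears
# after all of its dependencies.
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```

Any valid ordering places preprocess after both transcode and load_model, just as Vertex AI Pipelines waits for both upstream tasks before starting preprocessing.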
Monitor the pipeline

In the Google Cloud console, in the Vertex AI section, go to the Pipelines page and open the Runs tab.

Go to Pipeline runs: https://console.cloud.google.com/vertex-ai/pipelines/runs

What's next

To learn more about Vertex AI Pipelines, see Introduction to Vertex AI Pipelines (/vertex-ai/docs/pipelines/introduction).
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-09-03 (UTC).
[[["Facile à comprendre","easyToUnderstand","thumb-up"],["J'ai pu résoudre mon problème","solvedMyProblem","thumb-up"],["Autre","otherUp","thumb-up"]],[["Difficile à comprendre","hardToUnderstand","thumb-down"],["Informations ou exemple de code incorrects","incorrectInformationOrSampleCode","thumb-down"],["Il n'y a pas l'information/les exemples dont j'ai besoin","missingTheInformationSamplesINeed","thumb-down"],["Problème de traduction","translationIssue","thumb-down"],["Autre","otherDown","thumb-down"]],["Dernière mise à jour le 2025/09/03 (UTC)."],[],[],null,["# Fine-tune an image classification model with custom data on Vertex AI Pipelines\n\nThis tutorial shows you how to use Vertex AI Pipelines to run an end-to-end ML workflow, including the following tasks:\n\n- Import and transform data.\n- Fine-tune an [image classification model from TFHub](https://tfhub.dev/s?module-type=image-classification) using the transformed data.\n- Import the trained model to Vertex AI Model Registry.\n- **Optional**: Deploy the model for online serving with Vertex AI Inference.\n\nBefore you begin\n----------------\n\n1. Ensure that you've completed steps 1-3 in [Set up a project](/vertex-ai/docs/start/cloud-environment#set_up_a_project).\n\n2. Create an isolated Python environment and install the\n [Vertex AI SDK for Python](/vertex-ai/docs/start/install-sdk).\n\n3. Install the Kubeflow Pipelines SDK:\n\n python3 -m pip install \"kfp\u003c2.0.0\" \"google-cloud-aiplatform\u003e=1.16.0\" --upgrade --quiet\n\nRun the ML model training pipeline\n----------------------------------\n\nThe sample code does the following:\n\n- Loads components from a [component repository](https://github.com/GoogleCloudPlatform/vertex-ai-samples/tree/main/community-content/pipeline_components) to be used as pipeline building blocks.\n- Composes a pipeline by creating component tasks and passing data between them using arguments.\n- Submits the pipeline for execution on Vertex AI Pipelines. 
See [Vertex AI Pipelines pricing](/vertex-ai/pricing#pipelines).\n\nCopy the following sample code into your development environment and run it. \n\n### Image classification\n\n # python3 -m pip install \"kfp\u003c2.0.0\" \"google-cloud-aiplatform\u003e=1.16.0\" --upgrade --quiet\n from kfp import components\n from kfp.v2 import dsl\n\n # %% Loading components\n upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Upload_Tensorflow_model/component.yaml')\n deploy_model_to_endpoint_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Deploy_to_endpoint/component.yaml')\n transcode_imagedataset_tfrecord_from_csv_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/transcode_tfrecord_image_dataset_from_csv/component.yaml')\n load_image_classification_model_from_tfhub_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/b5b65198a6c2ffe8c0fa2aa70127e3325752df68/community-content/pipeline_components/image_ml_model_training/load_image_classification_model/component.yaml')\n preprocess_image_data_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/preprocess_image_data/component.yaml')\n train_tensorflow_image_classification_model_op = 
components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/train_image_classification_model/component.yaml')\n\n\n # %% Pipeline definition\n def image_classification_pipeline():\n class_names = ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']\n csv_image_data_path = 'gs://cloud-samples-data/ai-platform/flowers/flowers.csv'\n deploy_model = False\n\n image_data = dsl.importer(\n artifact_uri=csv_image_data_path, artifact_class=dsl.https://cloud.google.com/python/docs/reference/aiplatform/latest/google.cloud.aiplatform_v1.types.Dataset.html).output\n\n image_tfrecord_data = transcode_imagedataset_tfrecord_from_csv_op(\n csv_image_data_path=image_data,\n class_names=class_names\n ).outputs['tfrecord_image_data_path']\n\n loaded_model_outputs = load_image_classification_model_from_tfhub_op(\n class_names=class_names,\n ).outputs\n\n preprocessed_data = preprocess_image_data_op(\n image_tfrecord_data,\n height_width_path=loaded_model_outputs['image_size_path'],\n ).outputs\n\n trained_model = (train_tensorflow_image_classification_model_op(\n preprocessed_training_data_path = preprocessed_data['preprocessed_training_data_path'],\n preprocessed_validation_data_path = preprocessed_data['preprocessed_validation_data_path'],\n model_path=loaded_model_outputs['loaded_model_path']).\n set_cpu_limit('96').\n set_memory_limit('128G').\n add_node_selector_constraint('cloud.google.com/gke-accelerator', 'NVIDIA_TESLA_A100').\n set_gpu_limit('8').\n outputs['trained_model_path'])\n\n vertex_model_name = upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op(\n model=trained_model,\n ).outputs['model_name']\n\n # Deploying the model might incur additional costs over time\n if deploy_model:\n vertex_endpoint_name = deploy_model_to_endpoint_op(\n model_name=vertex_model_name,\n ).outputs['endpoint_name']\n\n pipeline_func = image_classification_pipeline\n\n # %% 
Pipeline submission\n if __name__ == '__main__':\n from google.cloud import aiplatform\n aiplatform.https://cloud.google.com/python/docs/reference/aiplatform/latest/google.cloud.aiplatform_v1.types.PipelineJob.html.from_pipeline_func(pipeline_func=pipeline_func).submit()\n\nNote the following about the sample code provided:\n\n- A Kubeflow pipeline is defined as a Python function.\n- The pipeline's workflow steps are created using Kubeflow pipeline components. By using the outputs of a component as an input of another component, you define the pipeline's workflow as a graph. For example, the `preprocess_image_data_op` component task depends on the `tfrecord_image_data_path` output from the `transcode_imagedataset_tfrecord_from_csv_op` component task.\n- You create a pipeline run on Vertex AI Pipelines using the Vertex AI SDK for Python.\n\nMonitor the pipeline\n--------------------\n\nIn the Google Cloud console, in the Vertex AI section, go to the\n**Pipelines** page and open the **Runs** tab.\n\n[Go to Pipeline runs](https://console.cloud.google.com/vertex-ai/pipelines/runs)\n\nWhat's next\n-----------\n\n- To learn more about Vertex AI Pipelines, see [Introduction to Vertex AI Pipelines](/vertex-ai/docs/pipelines/introduction)."]]