[[["わかりやすい","easyToUnderstand","thumb-up"],["問題の解決に役立った","solvedMyProblem","thumb-up"],["その他","otherUp","thumb-up"]],[["わかりにくい","hardToUnderstand","thumb-down"],["情報またはサンプルコードが不正確","incorrectInformationOrSampleCode","thumb-down"],["必要な情報 / サンプルがない","missingTheInformationSamplesINeed","thumb-down"],["翻訳に関する問題","translationIssue","thumb-down"],["その他","otherDown","thumb-down"]],["最終更新日 2025-09-04 UTC。"],[],[],null,["# Fine-tune an image classification model with custom data on Vertex AI Pipelines\n\nThis tutorial shows you how to use Vertex AI Pipelines to run an end-to-end ML workflow, including the following tasks:\n\n- Import and transform data.\n- Fine-tune an [image classification model from TFHub](https://tfhub.dev/s?module-type=image-classification) using the transformed data.\n- Import the trained model to Vertex AI Model Registry.\n- **Optional**: Deploy the model for online serving with Vertex AI Inference.\n\nBefore you begin\n----------------\n\n1. Ensure that you've completed steps 1-3 in [Set up a project](/vertex-ai/docs/start/cloud-environment#set_up_a_project).\n\n2. Create an isolated Python environment and install the\n [Vertex AI SDK for Python](/vertex-ai/docs/start/install-sdk).\n\n3. Install the Kubeflow Pipelines SDK:\n\n python3 -m pip install \"kfp\u003c2.0.0\" \"google-cloud-aiplatform\u003e=1.16.0\" --upgrade --quiet\n\nRun the ML model training pipeline\n----------------------------------\n\nThe sample code does the following:\n\n- Loads components from a [component repository](https://github.com/GoogleCloudPlatform/vertex-ai-samples/tree/main/community-content/pipeline_components) to be used as pipeline building blocks.\n- Composes a pipeline by creating component tasks and passing data between them using arguments.\n- Submits the pipeline for execution on Vertex AI Pipelines. See [Vertex AI Pipelines pricing](/vertex-ai/pricing#pipelines).\n\nCopy the following sample code into your development environment and run it. 
### Image classification

    # python3 -m pip install "kfp<2.0.0" "google-cloud-aiplatform>=1.16.0" --upgrade --quiet
    from kfp import components
    from kfp.v2 import dsl

    # %% Loading components
    upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Upload_Tensorflow_model/component.yaml')
    deploy_model_to_endpoint_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Deploy_to_endpoint/component.yaml')
    transcode_imagedataset_tfrecord_from_csv_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/transcode_tfrecord_image_dataset_from_csv/component.yaml')
    load_image_classification_model_from_tfhub_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/b5b65198a6c2ffe8c0fa2aa70127e3325752df68/community-content/pipeline_components/image_ml_model_training/load_image_classification_model/component.yaml')
    preprocess_image_data_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/preprocess_image_data/component.yaml')
    train_tensorflow_image_classification_model_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/train_image_classification_model/component.yaml')


    # %% Pipeline definition
    def image_classification_pipeline():
        class_names = ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
        csv_image_data_path = 'gs://cloud-samples-data/ai-platform/flowers/flowers.csv'
        deploy_model = False

        # Import the CSV listing of images and labels as a pipeline artifact.
        image_data = dsl.importer(
            artifact_uri=csv_image_data_path, artifact_class=dsl.Dataset).output

        image_tfrecord_data = transcode_imagedataset_tfrecord_from_csv_op(
            csv_image_data_path=image_data,
            class_names=class_names
        ).outputs['tfrecord_image_data_path']

        loaded_model_outputs = load_image_classification_model_from_tfhub_op(
            class_names=class_names,
        ).outputs

        preprocessed_data = preprocess_image_data_op(
            image_tfrecord_data,
            height_width_path=loaded_model_outputs['image_size_path'],
        ).outputs

        trained_model = (train_tensorflow_image_classification_model_op(
            preprocessed_training_data_path=preprocessed_data['preprocessed_training_data_path'],
            preprocessed_validation_data_path=preprocessed_data['preprocessed_validation_data_path'],
            model_path=loaded_model_outputs['loaded_model_path']).
          set_cpu_limit('96').
          set_memory_limit('128G').
          add_node_selector_constraint('cloud.google.com/gke-accelerator', 'NVIDIA_TESLA_A100').
          set_gpu_limit('8').
          outputs['trained_model_path'])

        vertex_model_name = upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op(
            model=trained_model,
        ).outputs['model_name']

        # Deploying the model might incur additional costs over time
        if deploy_model:
            vertex_endpoint_name = deploy_model_to_endpoint_op(
                model_name=vertex_model_name,
            ).outputs['endpoint_name']

    pipeline_func = image_classification_pipeline

    # %% Pipeline submission
    if __name__ == '__main__':
        from google.cloud import aiplatform
        aiplatform.PipelineJob.from_pipeline_func(pipeline_func=pipeline_func).submit()
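The submission step in the sample uses `aiplatform.PipelineJob.from_pipeline_func`, which compiles and submits the pipeline in one call using your SDK defaults. If you want to be explicit about the project, region, and staging location for the run, a minimal sketch might look like the following; it reuses `pipeline_func` from the sample above, and the project ID, region, and bucket values are placeholders that you must replace:

    # A minimal sketch (not part of the sample above): submit the same pipeline
    # to an explicit project and region, and optionally wait for the run to finish.
    # PROJECT_ID, REGION, and the bucket path are placeholders; replace them with
    # your own values.
    from google.cloud import aiplatform

    aiplatform.init(
        project='PROJECT_ID',                             # your Google Cloud project ID
        location='REGION',                                # for example, 'us-central1'
        staging_bucket='gs://YOUR_BUCKET/pipeline-root',  # assumed staging location for pipeline outputs
    )

    job = aiplatform.PipelineJob.from_pipeline_func(pipeline_func=pipeline_func)
    job.submit()   # creates the run and returns; execution continues on Vertex AI Pipelines
    job.wait()     # optional: block until the pipeline run completes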
Note the following about the sample code provided:

- A Kubeflow pipeline is defined as a Python function.
- The pipeline's workflow steps are created using Kubeflow pipeline components. By using the outputs of a component as an input of another component, you define the pipeline's workflow as a graph. For example, the `preprocess_image_data_op` component task depends on the `tfrecord_image_data_path` output from the `transcode_imagedataset_tfrecord_from_csv_op` component task.
- You create a pipeline run on Vertex AI Pipelines using the Vertex AI SDK for Python.

Monitor the pipeline
--------------------

In the Google Cloud console, in the Vertex AI section, go to the
**Pipelines** page and open the **Runs** tab.

[Go to Pipeline runs](https://console.cloud.google.com/vertex-ai/pipelines/runs)

You can also check the status of pipeline runs from Python with the Vertex AI SDK; see the sketch at the end of this page.

What's next
-----------

- To learn more about Vertex AI Pipelines, see [Introduction to Vertex AI Pipelines](/vertex-ai/docs/pipelines/introduction).
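As a complement to the console **Runs** page above, the following minimal sketch (an illustrative example, not part of the original sample) lists the pipeline runs in your project and prints their states; it assumes `aiplatform.init()` has already been called with your project and region, as in the submission sketch earlier on this page:

    # A minimal sketch: inspect pipeline runs from Python instead of the console.
    # Assumes aiplatform.init() has been called with your project and region.
    from google.cloud import aiplatform

    # List pipeline runs and print each run's display name and state.
    for job in aiplatform.PipelineJob.list():
        print(job.display_name, job.state)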