[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["이해하기 어려움","hardToUnderstand","thumb-down"],["잘못된 정보 또는 샘플 코드","incorrectInformationOrSampleCode","thumb-down"],["필요한 정보/샘플이 없음","missingTheInformationSamplesINeed","thumb-down"],["번역 문제","translationIssue","thumb-down"],["기타","otherDown","thumb-down"]],["최종 업데이트: 2025-09-04(UTC)"],[],[],null,["# Fine-tune an image classification model with custom data on Vertex AI Pipelines\n\nThis tutorial shows you how to use Vertex AI Pipelines to run an end-to-end ML workflow, including the following tasks:\n\n- Import and transform data.\n- Fine-tune an [image classification model from TFHub](https://tfhub.dev/s?module-type=image-classification) using the transformed data.\n- Import the trained model to Vertex AI Model Registry.\n- **Optional**: Deploy the model for online serving with Vertex AI Inference.\n\nBefore you begin\n----------------\n\n1. Ensure that you've completed steps 1-3 in [Set up a project](/vertex-ai/docs/start/cloud-environment#set_up_a_project).\n\n2. Create an isolated Python environment and install the\n [Vertex AI SDK for Python](/vertex-ai/docs/start/install-sdk).\n\n3. Install the Kubeflow Pipelines SDK:\n\n python3 -m pip install \"kfp\u003c2.0.0\" \"google-cloud-aiplatform\u003e=1.16.0\" --upgrade --quiet\n\nRun the ML model training pipeline\n----------------------------------\n\nThe sample code does the following:\n\n- Loads components from a [component repository](https://github.com/GoogleCloudPlatform/vertex-ai-samples/tree/main/community-content/pipeline_components) to be used as pipeline building blocks.\n- Composes a pipeline by creating component tasks and passing data between them using arguments.\n- Submits the pipeline for execution on Vertex AI Pipelines. See [Vertex AI Pipelines pricing](/vertex-ai/pricing#pipelines).\n\nCopy the following sample code into your development environment and run it. 

### Image classification

    # python3 -m pip install "kfp<2.0.0" "google-cloud-aiplatform>=1.16.0" --upgrade --quiet
    from kfp import components
    from kfp.v2 import dsl

    # %% Loading components
    upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Upload_Tensorflow_model/component.yaml')
    deploy_model_to_endpoint_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Deploy_to_endpoint/component.yaml')
    transcode_imagedataset_tfrecord_from_csv_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/transcode_tfrecord_image_dataset_from_csv/component.yaml')
    load_image_classification_model_from_tfhub_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/b5b65198a6c2ffe8c0fa2aa70127e3325752df68/community-content/pipeline_components/image_ml_model_training/load_image_classification_model/component.yaml')
    preprocess_image_data_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/preprocess_image_data/component.yaml')
    train_tensorflow_image_classification_model_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/train_image_classification_model/component.yaml')


    # %% Pipeline definition
    def image_classification_pipeline():
        class_names = ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
        csv_image_data_path = 'gs://cloud-samples-data/ai-platform/flowers/flowers.csv'
        deploy_model = False

        # Import the CSV file that lists image URIs and labels as a Dataset artifact.
        image_data = dsl.importer(
            artifact_uri=csv_image_data_path, artifact_class=dsl.Dataset).output

        # Transcode the CSV-listed images into TFRecord format.
        image_tfrecord_data = transcode_imagedataset_tfrecord_from_csv_op(
            csv_image_data_path=image_data,
            class_names=class_names,
        ).outputs['tfrecord_image_data_path']

        # Load a pre-trained image classification model from TFHub.
        loaded_model_outputs = load_image_classification_model_from_tfhub_op(
            class_names=class_names,
        ).outputs

        # Preprocess the image data to the input size expected by the loaded model.
        preprocessed_data = preprocess_image_data_op(
            image_tfrecord_data,
            height_width_path=loaded_model_outputs['image_size_path'],
        ).outputs

        # Fine-tune the model on the preprocessed data, using GPUs.
        trained_model = (train_tensorflow_image_classification_model_op(
            preprocessed_training_data_path=preprocessed_data['preprocessed_training_data_path'],
            preprocessed_validation_data_path=preprocessed_data['preprocessed_validation_data_path'],
            model_path=loaded_model_outputs['loaded_model_path'])
            .set_cpu_limit('96')
            .set_memory_limit('128G')
            .add_node_selector_constraint('cloud.google.com/gke-accelerator', 'NVIDIA_TESLA_A100')
            .set_gpu_limit('8')
            .outputs['trained_model_path'])

        # Import the trained model to Vertex AI Model Registry.
        vertex_model_name = upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op(
            model=trained_model,
        ).outputs['model_name']

        # Deploying the model might incur additional costs over time
        if deploy_model:
            vertex_endpoint_name = deploy_model_to_endpoint_op(
                model_name=vertex_model_name,
            ).outputs['endpoint_name']

    pipeline_func = image_classification_pipeline


    # %% Pipeline submission
    if __name__ == '__main__':
        from google.cloud import aiplatform
        aiplatform.PipelineJob.from_pipeline_func(pipeline_func=pipeline_func).submit()
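The sample submits the pipeline directly from the pipeline function. If you'd rather produce a reusable pipeline definition, the following is a minimal sketch (not part of the sample above) that compiles the pipeline to a job spec file and submits that file instead; the `pipeline.json` path and the display name are assumed placeholder values:

    # Sketch: compile the pipeline to a job spec file, then submit the file.
    # 'pipeline.json' and the display name are placeholder values.
    from kfp.v2 import compiler
    from google.cloud import aiplatform

    compiler.Compiler().compile(
        pipeline_func=image_classification_pipeline,
        package_path='pipeline.json',
    )

    aiplatform.PipelineJob(
        display_name='image-classification-pipeline',  # assumed display name
        template_path='pipeline.json',
    ).submit()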
Note the following about the sample code provided:

- A Kubeflow pipeline is defined as a Python function.
- The pipeline's workflow steps are created using Kubeflow pipeline components. By using the outputs of a component as an input of another component, you define the pipeline's workflow as a graph. For example, the `preprocess_image_data_op` component task depends on the `tfrecord_image_data_path` output from the `transcode_imagedataset_tfrecord_from_csv_op` component task.
- You create a pipeline run on Vertex AI Pipelines using the Vertex AI SDK for Python.

Monitor the pipeline
--------------------

In the Google Cloud console, in the Vertex AI section, go to the
**Pipelines** page and open the **Runs** tab.

[Go to Pipeline runs](https://console.cloud.google.com/vertex-ai/pipelines/runs)

What's next
-----------

- To learn more about Vertex AI Pipelines, see [Introduction to Vertex AI Pipelines](/vertex-ai/docs/pipelines/introduction).
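As a complement to console-based monitoring, you can also inspect a run's status from Python. This is a minimal sketch, assuming the Vertex AI SDK is initialized and `pipeline_func` is defined as in the sample above; `PipelineJob.from_pipeline_func` returns a job handle whose state you can read after submission:

    # Minimal sketch: inspect the pipeline run from Python.
    # Assumes 'pipeline_func' is the pipeline defined in the sample above.
    from google.cloud import aiplatform

    job = aiplatform.PipelineJob.from_pipeline_func(pipeline_func=pipeline_func)
    job.submit()

    print(job.resource_name)  # fully qualified name of the pipeline run
    print(job.state)          # current run state, e.g. PIPELINE_STATE_RUNNING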