Copy the following sample code into your development environment and run it.
Image classification
# python3 -m pip install "kfp<2.0.0" "google-cloud-aiplatform>=1.16.0" --upgrade --quiet
from kfp import components
from kfp.v2 import dsl

# %% Loading components
upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Upload_Tensorflow_model/component.yaml')
deploy_model_to_endpoint_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/399405402d95f4a011e2d2e967c96f8508ba5688/community-content/pipeline_components/google-cloud/Vertex_AI/Models/Deploy_to_endpoint/component.yaml')
transcode_imagedataset_tfrecord_from_csv_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/transcode_tfrecord_image_dataset_from_csv/component.yaml')
load_image_classification_model_from_tfhub_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/b5b65198a6c2ffe8c0fa2aa70127e3325752df68/community-content/pipeline_components/image_ml_model_training/load_image_classification_model/component.yaml')
preprocess_image_data_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/preprocess_image_data/component.yaml')
train_tensorflow_image_classification_model_op = components.load_component_from_url('https://raw.githubusercontent.com/GoogleCloudPlatform/vertex-ai-samples/main/community-content/pipeline_components/image_ml_model_training/train_image_classification_model/component.yaml')


# %% Pipeline definition
def image_classification_pipeline():
    class_names = ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
    csv_image_data_path = 'gs://cloud-samples-data/ai-platform/flowers/flowers.csv'
    deploy_model = False

    image_data = dsl.importer(
        artifact_uri=csv_image_data_path, artifact_class=dsl.Dataset).output

    image_tfrecord_data = transcode_imagedataset_tfrecord_from_csv_op(
        csv_image_data_path=image_data,
        class_names=class_names
    ).outputs['tfrecord_image_data_path']

    loaded_model_outputs = load_image_classification_model_from_tfhub_op(
        class_names=class_names,
    ).outputs

    preprocessed_data = preprocess_image_data_op(
        image_tfrecord_data,
        height_width_path=loaded_model_outputs['image_size_path'],
    ).outputs

    trained_model = (train_tensorflow_image_classification_model_op(
        preprocessed_training_data_path=preprocessed_data['preprocessed_training_data_path'],
        preprocessed_validation_data_path=preprocessed_data['preprocessed_validation_data_path'],
        model_path=loaded_model_outputs['loaded_model_path'])
        .set_cpu_limit('96')
        .set_memory_limit('128G')
        .add_node_selector_constraint('cloud.google.com/gke-accelerator', 'NVIDIA_TESLA_A100')
        .set_gpu_limit('8')
        .outputs['trained_model_path'])

    vertex_model_name = upload_Tensorflow_model_to_Google_Cloud_Vertex_AI_op(
        model=trained_model,
    ).outputs['model_name']

    # Deploying the model might incur additional costs over time
    if deploy_model:
        vertex_endpoint_name = deploy_model_to_endpoint_op(
            model_name=vertex_model_name,
        ).outputs['endpoint_name']

pipeline_func = image_classification_pipeline

# %% Pipeline submission
if __name__ == '__main__':
    from google.cloud import aiplatform
    aiplatform.PipelineJob.from_pipeline_func(pipeline_func=pipeline_func).submit()
Note the following about the sample code provided:

- A Kubeflow pipeline is defined as a Python function.
- The pipeline's workflow steps are created using Kubeflow pipeline components. By using the outputs of one component as an input of another component, you define the pipeline's workflow as a graph. For example, the preprocess_image_data_op component task depends on the tfrecord_image_data_path output of the transcode_imagedataset_tfrecord_from_csv_op component task.
- You create a pipeline run on Vertex AI Pipelines using the Vertex AI SDK for Python.
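To make the graph-building idea concrete, here is a minimal, self-contained sketch in plain Python (not KFP code): each task lists the tasks whose outputs it consumes, mirroring the data flow of the sample pipeline above, and the standard-library graphlib module (Python 3.9+) derives a valid execution order from those output-to-input edges. The task names are taken from the sample; the dictionary structure is purely illustrative.

```python
# Toy illustration of how passing one task's output as another task's input
# induces a dependency graph with a valid execution order.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks whose outputs it consumes.
dependencies = {
    'transcode_imagedataset_tfrecord_from_csv': {'importer'},
    'load_image_classification_model_from_tfhub': set(),
    'preprocess_image_data': {
        'transcode_imagedataset_tfrecord_from_csv',
        'load_image_classification_model_from_tfhub',
    },
    'train_tensorflow_image_classification_model': {
        'preprocess_image_data',
        'load_image_classification_model_from_tfhub',
    },
    'upload_Tensorflow_model_to_Google_Cloud_Vertex_AI': {
        'train_tensorflow_image_classification_model',
    },
}

# static_order() yields every task after all of its prerequisites.
order = list(TopologicalSorter(dependencies).static_order())

# preprocess_image_data can only run after both of its upstream tasks.
assert order.index('preprocess_image_data') > order.index(
    'transcode_imagedataset_tfrecord_from_csv')
```

Vertex AI Pipelines performs the same kind of scheduling at run time: tasks without unmet input dependencies can run in parallel, and downstream tasks wait for their upstream outputs.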
Monitor the pipeline
In the Google Cloud console, in the Vertex AI section, go to the
Pipelines page and open the Runs tab.
Last updated 2025-09-04 UTC.