Vertex AI V1 API - Class Google::Cloud::AIPlatform::V1::TrainingPipeline (v0.33.0)

Reference documentation and code samples for the Vertex AI V1 API class Google::Cloud::AIPlatform::V1::TrainingPipeline.

The TrainingPipeline orchestrates tasks associated with training a Model. It always executes the training task and, optionally, may also export data from Vertex AI's Dataset (which becomes the training input), upload the Model to Vertex AI, and evaluate the Model.
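
As a rough orientation, here is a minimal sketch of building a TrainingPipeline message and submitting it through the PipelineService client. The project, location, dataset ID, schema URI, and task inputs are illustrative placeholders, not values prescribed by this class.

    require "google/cloud/ai_platform/v1"
    require "google/protobuf/well_known_types"

    # Minimal sketch: assemble a TrainingPipeline and submit it. All resource
    # names, the schema URI, and the task inputs below are placeholders.
    client = ::Google::Cloud::AIPlatform::V1::PipelineService::Client.new

    pipeline = ::Google::Cloud::AIPlatform::V1::TrainingPipeline.new(
      display_name:             "flowers-training",
      training_task_definition: "gs://google-cloud-aiplatform/schema/trainingjob/definition/automl_image_classification_1.0.0.yaml",
      training_task_inputs:     ::Google::Protobuf::Value.from_ruby(
        { "modelType" => "CLOUD", "budgetMilliNodeHours" => 8_000 }
      ),
      input_data_config: { dataset_id: "1234567890" },
      model_to_upload:   { display_name: "flowers-classifier" }
    )

    response = client.create_training_pipeline(
      parent:            "projects/my-project/locations/us-central1",
      training_pipeline: pipeline
    )
    puts response.name # resource name of the newly created TrainingPipeline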

Inherits

  • Object

Extended By

  • Google::Protobuf::MessageExts::ClassMethods

Includes

  • Google::Protobuf::MessageExts

Methods

#create_time

def create_time() -> ::Google::Protobuf::Timestamp
Returns
  • (::Google::Protobuf::Timestamp) — Output only. Time when the TrainingPipeline was created.

#display_name

def display_name() -> ::String
Returns
  • (::String) — Required. The user-defined name of this TrainingPipeline.

#display_name=

def display_name=(value) -> ::String
Parameter
  • value (::String) — Required. The user-defined name of this TrainingPipeline.
Returns
  • (::String) — Required. The user-defined name of this TrainingPipeline.

#encryption_spec

def encryption_spec() -> ::Google::Cloud::AIPlatform::V1::EncryptionSpec
Returns
  • (::Google::Cloud::AIPlatform::V1::EncryptionSpec) — Customer-managed encryption key spec for a TrainingPipeline. If set, this TrainingPipeline will be secured by this key.

#encryption_spec=

def encryption_spec=(value) -> ::Google::Cloud::AIPlatform::V1::EncryptionSpec
Parameter
  • value (::Google::Cloud::AIPlatform::V1::EncryptionSpec) — Customer-managed encryption key spec for a TrainingPipeline. If set, this TrainingPipeline will be secured by this key.
Returns
  • (::Google::Cloud::AIPlatform::V1::EncryptionSpec) — Customer-managed encryption key spec for a TrainingPipeline. If set, this TrainingPipeline will be secured by this key.
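
A hedged sketch of attaching a customer-managed encryption key follows; the KMS key resource name is a hypothetical placeholder.

    # Sketch: secure the pipeline with a customer-managed encryption key.
    # The KMS key resource name is a placeholder.
    pipeline = ::Google::Cloud::AIPlatform::V1::TrainingPipeline.new display_name: "flowers-training"
    pipeline.encryption_spec = ::Google::Cloud::AIPlatform::V1::EncryptionSpec.new(
      kms_key_name: "projects/my-project/locations/us-central1/keyRings/my-ring/cryptoKeys/my-key"
    )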

#end_time

def end_time() -> ::Google::Protobuf::Timestamp
Returns
  • (::Google::Protobuf::Timestamp) — Output only. Time when the TrainingPipeline entered any of the following states: PIPELINE_STATE_SUCCEEDED, PIPELINE_STATE_FAILED, PIPELINE_STATE_CANCELLED.

#error

def error() -> ::Google::Rpc::Status
Returns
  • (::Google::Rpc::Status) — Output only. Only populated when the pipeline's state is PIPELINE_STATE_FAILED or PIPELINE_STATE_CANCELLED.
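
For example, a small sketch of inspecting a finished pipeline's outcome; the get_training_pipeline call is made on a PipelineService client, and the resource name is a placeholder.

    require "google/cloud/ai_platform/v1"

    # Sketch: check whether a finished pipeline failed or was cancelled.
    # The TrainingPipeline resource name is a placeholder.
    client = ::Google::Cloud::AIPlatform::V1::PipelineService::Client.new
    pipeline = client.get_training_pipeline(
      name: "projects/my-project/locations/us-central1/trainingPipelines/1234567890"
    )
    if [:PIPELINE_STATE_FAILED, :PIPELINE_STATE_CANCELLED].include? pipeline.state
      puts "Pipeline did not succeed: #{pipeline.error&.code} #{pipeline.error&.message}"
    end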

#input_data_config

def input_data_config() -> ::Google::Cloud::AIPlatform::V1::InputDataConfig
Returns
  • (::Google::Cloud::AIPlatform::V1::InputDataConfig) — Specifies Vertex AI owned input data that may be used for training the Model.

#input_data_config=

def input_data_config=(value) -> ::Google::Cloud::AIPlatform::V1::InputDataConfig
Parameter
  • value (::Google::Cloud::AIPlatform::V1::InputDataConfig) — Specifies Vertex AI owned input data that may be used for training the Model.
Returns
  • (::Google::Cloud::AIPlatform::V1::InputDataConfig) — Specifies Vertex AI owned input data that may be used for training the Model.

#labels

def labels() -> ::Google::Protobuf::Map{::String => ::String}
Returns
  • (::Google::Protobuf::Map{::String => ::String}) — The labels with user-defined metadata to organize TrainingPipelines.

    Label keys and values can be no longer than 64 characters (Unicode codepoints), can only contain lowercase letters, numeric characters, underscores and dashes. International characters are allowed.

    See https://goo.gl/xmQnxf for more information and examples of labels.

#labels=

def labels=(value) -> ::Google::Protobuf::Map{::String => ::String}
Parameter
  • value (::Google::Protobuf::Map{::String => ::String}) — The labels with user-defined metadata to organize TrainingPipelines.

    Label keys and values can be no longer than 64 characters (Unicode codepoints), can only contain lowercase letters, numeric characters, underscores and dashes. International characters are allowed.

    See https://goo.gl/xmQnxf for more information and examples of labels.

Returns
  • (::Google::Protobuf::Map{::String => ::String}) — The labels with user-defined metadata to organize TrainingPipelines.

    Label keys and values can be no longer than 64 characters (Unicode codepoints), can only contain lowercase letters, numeric characters, underscores and dashes. International characters are allowed.

    See https://goo.gl/xmQnxf for more information and examples of labels.
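
A brief sketch of attaching labels follows; the keys and values are arbitrary examples subject to the limits described above.

    # Sketch: attach user-defined labels (subject to the length and character limits above).
    pipeline = ::Google::Cloud::AIPlatform::V1::TrainingPipeline.new display_name: "flowers-training"
    pipeline.labels["team"] = "vision"
    pipeline.labels["env"]  = "experimental"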

#model_id

def model_id() -> ::String
Returns
  • (::String) — Optional. The ID to use for the uploaded Model, which will become the final component of the model resource name.

    This value may be up to 63 characters, and valid characters are [a-z0-9_-]. The first character cannot be a number or hyphen.

#model_id=

def model_id=(value) -> ::String
Parameter
  • value (::String) — Optional. The ID to use for the uploaded Model, which will become the final component of the model resource name.

    This value may be up to 63 characters, and valid characters are [a-z0-9_-]. The first character cannot be a number or hyphen.

Returns
  • (::String) — Optional. The ID to use for the uploaded Model, which will become the final component of the model resource name.

    This value may be up to 63 characters, and valid characters are [a-z0-9_-]. The first character cannot be a number or hyphen.
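
For instance, a sketch of choosing an explicit model ID that satisfies these rules; the name itself is arbitrary.

    # Sketch: pick the ID the uploaded Model will receive.
    # Lowercase letters, digits, "_" and "-" only; no leading digit or hyphen.
    pipeline = ::Google::Cloud::AIPlatform::V1::TrainingPipeline.new display_name: "flowers-training"
    pipeline.model_id = "flowers-classifier-v1"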

#model_to_upload

def model_to_upload() -> ::Google::Cloud::AIPlatform::V1::Model
Returns
  • (::Google::Cloud::AIPlatform::V1::Model) — Describes the Model that may be uploaded (via ModelService.UploadModel) by this TrainingPipeline. The TrainingPipeline's training_task_definition should make clear whether this Model description should be populated, and if there are any special requirements regarding how it should be filled. If nothing is mentioned in the training_task_definition, it should be assumed that this field should not be filled and that the training task either uploads the Model without needing this information, or does not support uploading a Model as part of the pipeline. When the pipeline's state becomes PIPELINE_STATE_SUCCEEDED and the trained Model has been uploaded into Vertex AI, the model_to_upload's resource name is populated. The Model is always uploaded into the Project and Location in which this pipeline runs.

#model_to_upload=

def model_to_upload=(value) -> ::Google::Cloud::AIPlatform::V1::Model
Parameter
  • value (::Google::Cloud::AIPlatform::V1::Model) — Describes the Model that may be uploaded (via ModelService.UploadModel) by this TrainingPipeline. The TrainingPipeline's training_task_definition should make clear whether this Model description should be populated, and if there are any special requirements regarding how it should be filled. If nothing is mentioned in the training_task_definition, it should be assumed that this field should not be filled and that the training task either uploads the Model without needing this information, or does not support uploading a Model as part of the pipeline. When the pipeline's state becomes PIPELINE_STATE_SUCCEEDED and the trained Model has been uploaded into Vertex AI, the model_to_upload's resource name is populated. The Model is always uploaded into the Project and Location in which this pipeline runs.
Returns
  • (::Google::Cloud::AIPlatform::V1::Model) — Describes the Model that may be uploaded (via ModelService.UploadModel) by this TrainingPipeline. The TrainingPipeline's training_task_definition should make clear whether this Model description should be populated, and if there are any special requirements regarding how it should be filled. If nothing is mentioned in the training_task_definition, it should be assumed that this field should not be filled and that the training task either uploads the Model without needing this information, or does not support uploading a Model as part of the pipeline. When the pipeline's state becomes PIPELINE_STATE_SUCCEEDED and the trained Model has been uploaded into Vertex AI, the model_to_upload's resource name is populated. The Model is always uploaded into the Project and Location in which this pipeline runs.
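
As an illustration, a sketch of reading back the uploaded Model's resource name once the pipeline succeeds; it assumes `pipeline` was fetched with the PipelineService client's get_training_pipeline, as in the earlier sketch.

    # Sketch: after success, the uploaded Model's resource name is populated.
    # Assumes `pipeline` was fetched via client.get_training_pipeline.
    if pipeline.state == :PIPELINE_STATE_SUCCEEDED
      puts pipeline.model_to_upload.name # e.g. "projects/.../locations/.../models/..."
    end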

#name

def name() -> ::String
Returns
  • (::String) — Output only. Resource name of the TrainingPipeline.

#parent_model

def parent_model() -> ::String
Returns
  • (::String) — Optional. When this field is specified, the model_to_upload will not be uploaded as a new model; instead, it will become a new version of this parent_model.

#parent_model=

def parent_model=(value) -> ::String
Parameter
  • value (::String) — Optional. When this field is specified, the model_to_upload will not be uploaded as a new model; instead, it will become a new version of this parent_model.
Returns
  • (::String) — Optional. When this field is specified, the model_to_upload will not be uploaded as a new model; instead, it will become a new version of this parent_model.
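
A short sketch of uploading the trained Model as a new version of an existing model; the parent model resource name is a hypothetical placeholder.

    # Sketch: make the trained Model a new version of an existing model.
    # The parent model resource name is a placeholder.
    pipeline = ::Google::Cloud::AIPlatform::V1::TrainingPipeline.new display_name: "flowers-training-v2"
    pipeline.parent_model = "projects/my-project/locations/us-central1/models/flowers-classifier"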

#start_time

def start_time() -> ::Google::Protobuf::Timestamp
Returns
  • (::Google::Protobuf::Timestamp) — Output only. Time when the TrainingPipeline for the first time entered the PIPELINE_STATE_RUNNING state.

#state

def state() -> ::Google::Cloud::AIPlatform::V1::PipelineState
Returns
  • (::Google::Cloud::AIPlatform::V1::PipelineState) — Output only. The detailed state of the pipeline.
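
A hedged sketch of polling until the pipeline reaches a terminal state; the resource name is a placeholder and the sleep interval is arbitrary.

    require "google/cloud/ai_platform/v1"

    # Sketch: poll the pipeline until it reaches a terminal state.
    # The resource name is a placeholder; the polling interval is arbitrary.
    client = ::Google::Cloud::AIPlatform::V1::PipelineService::Client.new
    name = "projects/my-project/locations/us-central1/trainingPipelines/1234567890"
    terminal = [:PIPELINE_STATE_SUCCEEDED, :PIPELINE_STATE_FAILED, :PIPELINE_STATE_CANCELLED]
    pipeline = client.get_training_pipeline name: name
    until terminal.include?(pipeline.state)
      sleep 30
      pipeline = client.get_training_pipeline name: name
    end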

#training_task_definition

def training_task_definition() -> ::String
Returns
  • (::String) — Required. A Google Cloud Storage path to the YAML file that defines the training task, which is responsible for producing the model artifact and may also include additional auxiliary work. The definition files that can be used here are found in gs://google-cloud-aiplatform/schema/trainingjob/definition/. Note: The URI given on output will be immutable and probably different (including the URI scheme) from the one given on input. The output URI will point to a location where the user only has read access.

#training_task_definition=

def training_task_definition=(value) -> ::String
Parameter
  • value (::String) — Required. A Google Cloud Storage path to the YAML file that defines the training task, which is responsible for producing the model artifact and may also include additional auxiliary work. The definition files that can be used here are found in gs://google-cloud-aiplatform/schema/trainingjob/definition/. Note: The URI given on output will be immutable and probably different (including the URI scheme) from the one given on input. The output URI will point to a location where the user only has read access.
Returns
  • (::String) — Required. A Google Cloud Storage path to the YAML file that defines the training task, which is responsible for producing the model artifact and may also include additional auxiliary work. The definition files that can be used here are found in gs://google-cloud-aiplatform/schema/trainingjob/definition/. Note: The URI given on output will be immutable and probably different (including the URI scheme) from the one given on input. The output URI will point to a location where the user only has read access.
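
For example, a sketch of pointing the pipeline at one of the published schema files; which file is appropriate depends on the training job type, and the URI shown is illustrative.

    # Sketch: reference a published training task schema.
    # The specific schema file depends on the training job type.
    pipeline = ::Google::Cloud::AIPlatform::V1::TrainingPipeline.new display_name: "custom-training"
    pipeline.training_task_definition =
      "gs://google-cloud-aiplatform/schema/trainingjob/definition/custom_task_1.0.0.yaml"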

#training_task_inputs

def training_task_inputs() -> ::Google::Protobuf::Value
Returns
  • (::Google::Protobuf::Value) — Required. The training task's parameter(s), as specified in the training_task_definition's inputs.

#training_task_inputs=

def training_task_inputs=(value) -> ::Google::Protobuf::Value
Parameter
  • value (::Google::Protobuf::Value) — Required. The training task's parameter(s), as specified in the training_task_definition's inputs.
Returns
  • (::Google::Protobuf::Value) — Required. The training task's parameter(s), as specified in the training_task_definition's inputs.
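
Because the field is a google.protobuf.Value, a plain Ruby hash can be converted with Google::Protobuf::Value.from_ruby, as in this sketch; the keys and values are illustrative and depend on the chosen training_task_definition schema.

    require "google/cloud/ai_platform/v1"
    require "google/protobuf/well_known_types"

    # Sketch: convert a Ruby hash into the google.protobuf.Value this field expects.
    # The keys and values are illustrative for a custom training task.
    pipeline = ::Google::Cloud::AIPlatform::V1::TrainingPipeline.new display_name: "custom-training"
    pipeline.training_task_inputs = ::Google::Protobuf::Value.from_ruby(
      {
        "workerPoolSpecs" => [
          {
            "machineSpec"   => { "machineType" => "n1-standard-4" },
            "replicaCount"  => 1,
            "containerSpec" => { "imageUri" => "gcr.io/my-project/trainer:latest" }
          }
        ]
      }
    )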

#training_task_metadata

def training_task_metadata() -> ::Google::Protobuf::Value
Returns
  • (::Google::Protobuf::Value) — Output only. The metadata information as specified in the training_task_definition's metadata. This metadata is auxiliary runtime and final information about the training task. While the pipeline is running, this information is populated only on a best-effort basis. Only present if the pipeline's training_task_definition contains the metadata object.
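
A small sketch of reading this metadata, assuming `pipeline` was fetched with get_training_pipeline as in the earlier sketch; Google::Protobuf::Value#to_ruby (from google/protobuf/well_known_types) converts it back into plain Ruby data.

    # Sketch: inspect runtime metadata when the task definition provides any.
    # Assumes `pipeline` was fetched via client.get_training_pipeline.
    meta = pipeline.training_task_metadata
    puts meta.to_ruby(true).inspect unless meta.nil?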

#update_time

def update_time() -> ::Google::Protobuf::Timestamp
Returns
  • (::Google::Protobuf::Timestamp) — Output only. Time when the TrainingPipeline was most recently updated.