Class PipelineJob (1.1.1)

PipelineJob(
    display_name: str,
    template_path: str,
    job_id: Optional[str] = None,
    pipeline_root: Optional[str] = None,
    parameter_values: Optional[Dict[str, Any]] = None,
    enable_caching: Optional[bool] = True,
    encryption_spec_key_name: Optional[str] = None,
    labels: Optional[Dict[str, str]] = None,
    credentials: Optional[google.auth.credentials.Credentials] = None,
    project: Optional[str] = None,
    location: Optional[str] = None,
)

Retrieves a PipelineJob resource and instantiates its representation.

Parameters

Name Description
display_name str

Required. The user-defined name of this Pipeline.

template_path str

Required. The path of PipelineJob JSON file. It can be a local path or a Google Cloud Storage URI. Example: "gs://project.name"

job_id str

Optional. The unique ID of the job run. If not specified, the pipeline name plus a timestamp is used.
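The default job ID behavior described above can be sketched in pure Python. The exact format the SDK uses is an implementation detail, so `make_default_job_id` and its `"{name}-{timestamp}"` layout here are illustrative assumptions, not the SDK's actual code.

```python
import datetime
import re

def make_default_job_id(display_name: str) -> str:
    """Illustrative sketch: derive a job ID from the display name plus a
    timestamp, as described above. The exact format is an assumption."""
    # Job IDs are restricted to lowercase letters, digits, and hyphens,
    # so normalize the display name first.
    name = re.sub(r"[^a-z0-9-]", "-", display_name.lower())
    timestamp = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
    return f"{name}-{timestamp}"

job_id = make_default_job_id("My Training Pipeline")
# e.g. "my-training-pipeline-20240101120000"
```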

pipeline_root str

Optional. The root of the pipeline outputs. Defaults to the staging bucket.

parameter_values Dict[str, Any]

Optional. The mapping from runtime parameter names to their values, which controls the pipeline run.

enable_caching bool

Optional. Whether to turn on caching for the run. Defaults to True.

encryption_spec_key_name str

Optional. The Cloud KMS resource identifier of the customer-managed encryption key used to protect the job. Has the form: projects/my-project/locations/my-region/keyRings/my-kr/cryptoKeys/my-key. The key needs to be in the same region as where the compute resource is created. If this is set, all resources created by the PipelineJob will be encrypted with the provided encryption key. Overrides encryption_spec_key_name set in aiplatform.init.
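The expected key format can be checked with a small regex before submitting a job. This validator is a hypothetical helper for illustration, not part of the SDK.

```python
import re

# Pattern for a Cloud KMS CryptoKey resource name, per the format above:
# projects/<project>/locations/<region>/keyRings/<ring>/cryptoKeys/<key>
_KMS_KEY_PATTERN = re.compile(
    r"^projects/[^/]+/locations/[^/]+/keyRings/[^/]+/cryptoKeys/[^/]+$"
)

def is_valid_kms_key_name(name: str) -> bool:
    """Hypothetical helper: True if `name` matches the CryptoKey format."""
    return bool(_KMS_KEY_PATTERN.match(name))

is_valid_kms_key_name(
    "projects/my-project/locations/my-region/keyRings/my-kr/cryptoKeys/my-key"
)  # True
```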

labels Dict[str,str]

Optional. The user-defined metadata to organize the PipelineJob.

credentials auth_credentials.Credentials

Optional. Custom credentials to use to create this PipelineJob. Overrides credentials set in aiplatform.init.

project str

Optional. Project to retrieve PipelineJob from. If not set, project set in aiplatform.init will be used.

location str

Optional. Location to create PipelineJob. If not set, location set in aiplatform.init will be used.
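Several of the parameters above (project, location, credentials, encryption_spec_key_name) share the same fallback rule: an explicit argument overrides the value set in aiplatform.init. The resolver below is a minimal pure-Python sketch of that pattern; `_GLOBAL_CONFIG` and `resolve` are illustrative stand-ins, not the SDK's implementation.

```python
from typing import Optional

# Hypothetical stand-in for values configured via aiplatform.init().
_GLOBAL_CONFIG = {"project": "init-project", "location": "us-central1"}

def resolve(name: str, explicit: Optional[str]) -> Optional[str]:
    """Sketch of the override rule: prefer the explicit argument,
    fall back to the globally initialized value."""
    return explicit if explicit is not None else _GLOBAL_CONFIG.get(name)

resolve("project", None)        # "init-project" (falls back to init value)
resolve("project", "override")  # "override" (explicit argument wins)
```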

Inheritance

builtins.object > google.cloud.aiplatform.base.VertexAiResourceNoun > google.cloud.aiplatform.base.FutureManager > google.cloud.aiplatform.base.VertexAiResourceNounWithFutureManager > PipelineJob

Properties

has_failed

Returns True if the pipeline has failed, False otherwise.

state

Current pipeline state.

Methods

cancel

cancel()

Starts asynchronous cancellation on the PipelineJob. The server makes a best effort to cancel the job, but success is not guaranteed. On successful cancellation, the PipelineJob is not deleted; instead it becomes a job with state set to CANCELLED.

Exceptions
Type Description
RuntimeError If this PipelineJob has not started running.
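The precondition in the table above can be sketched as a guard: cancellation is only meaningful once the job has started. The class below is a minimal illustration of that contract; the `_has_run` flag and state strings are assumptions, not the SDK's internals.

```python
class PipelineJobSketch:
    """Minimal sketch of cancel()'s precondition, not the SDK class."""

    def __init__(self):
        self._has_run = False
        self.state = None

    def run(self):
        self._has_run = True
        self.state = "RUNNING"

    def cancel(self):
        # Mirrors the documented behavior: cancelling before the job has
        # started raises RuntimeError; otherwise cancellation is requested
        # (best effort) and the job ends with state set to CANCELLED.
        if not self._has_run:
            raise RuntimeError("PipelineJob has not started running.")
        self.state = "CANCELLED"

job = PipelineJobSketch()
try:
    job.cancel()
except RuntimeError:
    pass  # raised, as documented, because the job never started
job.run()
job.cancel()  # job.state is now "CANCELLED"
```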

run

run(
    service_account: Optional[str] = None,
    network: Optional[str] = None,
    sync: Optional[bool] = True,
)

Run this configured PipelineJob.

Parameters
Name Description
service_account str

Optional. The service account to use as the workload run-as account. Users submitting jobs must have act-as permission on this run-as account.

network str

Optional. The full name of the Compute Engine network to which the job should be peered. For example, projects/12345/global/networks/myVPC. Private services access must already be configured for the network. If left unspecified, the job is not peered with any network.
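The full network name format can likewise be checked up front with a regex. This is a hypothetical helper for illustration, not part of the SDK.

```python
import re

# Full Compute Engine network name, per the format above:
# projects/<project-number>/global/networks/<network-name>
_NETWORK_PATTERN = re.compile(r"^projects/[^/]+/global/networks/[^/]+$")

def is_valid_network_name(name: str) -> bool:
    """Hypothetical helper: True if `name` is a full network resource name."""
    return bool(_NETWORK_PATTERN.match(name))

is_valid_network_name("projects/12345/global/networks/myVPC")  # True
is_valid_network_name("myVPC")  # False (short names are not the full form)
```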

sync bool

Optional. Whether to execute this method synchronously. If False, this method returns immediately and the run is executed in a concurrent Future.
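The sync behavior, blocking when True and returning immediately into a concurrent Future when False, can be sketched with the standard library. The `submit_run` helper and `_do_run` body are illustrative assumptions, not the SDK's API.

```python
from concurrent.futures import Future, ThreadPoolExecutor
import time

_executor = ThreadPoolExecutor(max_workers=1)

def _do_run() -> str:
    time.sleep(0.1)  # stand-in for the actual pipeline run
    return "SUCCEEDED"

def submit_run(sync: bool = True):
    """Illustrative sketch: block when sync=True, otherwise return a Future."""
    if sync:
        return _do_run()
    return _executor.submit(_do_run)

submit_run(sync=True)            # blocks until the run finishes
future = submit_run(sync=False)  # returns immediately with a Future
future.result()                  # blocks only when the result is needed
```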