Google Cloud Dataproc V1 Client - Class OrderedJob (3.2.2)

Reference documentation and code samples for the Google Cloud Dataproc V1 Client class OrderedJob.

A job executed by the workflow.

Generated from protobuf message google.cloud.dataproc.v1.OrderedJob

Methods

__construct

Constructor.

Parameters
Name: data
Type: array

Optional. Data for populating the Message object.

↳ step_id string

Required. The step id. The id must be unique among all jobs within the template. The step id is used as a prefix for the job id, as the job's goog-dataproc-workflow-step-id label, and in the prerequisiteStepIds field of other steps. The id must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), and hyphens (-). It cannot begin or end with an underscore or hyphen, and must consist of between 3 and 50 characters.

↳ hadoop_job Google\Cloud\Dataproc\V1\HadoopJob

Optional. Job is a Hadoop job.

↳ spark_job Google\Cloud\Dataproc\V1\SparkJob

Optional. Job is a Spark job.

↳ pyspark_job Google\Cloud\Dataproc\V1\PySparkJob

Optional. Job is a PySpark job.

↳ hive_job Google\Cloud\Dataproc\V1\HiveJob

Optional. Job is a Hive job.

↳ pig_job Google\Cloud\Dataproc\V1\PigJob

Optional. Job is a Pig job.

↳ spark_r_job Google\Cloud\Dataproc\V1\SparkRJob

Optional. Job is a SparkR job.

↳ spark_sql_job Google\Cloud\Dataproc\V1\SparkSqlJob

Optional. Job is a SparkSql job.

↳ presto_job Google\Cloud\Dataproc\V1\PrestoJob

Optional. Job is a Presto job.

↳ labels array|Google\Protobuf\Internal\MapField

Optional. The labels to associate with this job. Label keys must be between 1 and 63 characters long, and must conform to the following regular expression: [\p{Ll}\p{Lo}][\p{Ll}\p{Lo}\p{N}-]{0,62}

Label values must be between 1 and 63 characters long, and must conform to the following regular expression: [\p{Ll}\p{Lo}\p{N}-]{0,63}

No more than 32 labels can be associated with a given job.

↳ scheduling Google\Cloud\Dataproc\V1\JobScheduling

Optional. Job scheduling configuration.

↳ prerequisite_step_ids array

Optional. The list of prerequisite job step_ids. If not specified, the job will start at the beginning of the workflow.
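
As a minimal sketch (not part of the generated reference), an OrderedJob can be populated through the constructor's data array. The step id, Cloud Storage URIs, and prerequisite step id below are placeholder values.

use Google\Cloud\Dataproc\V1\HadoopJob;
use Google\Cloud\Dataproc\V1\OrderedJob;

// Placeholder step id, jar URI, and prerequisite; substitute values from your own template.
$job = new OrderedJob([
    'step_id' => 'count-words',
    'hadoop_job' => new HadoopJob([
        'main_jar_file_uri' => 'gs://my-bucket/wordcount.jar',
        'args' => ['gs://my-bucket/input/', 'gs://my-bucket/output/'],
    ]),
    'prerequisite_step_ids' => ['prepare-input'],
]);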

getStepId

Required. The step id. The id must be unique among all jobs within the template.

The step id is used as a prefix for the job id, as the job's goog-dataproc-workflow-step-id label, and in the prerequisiteStepIds field of other steps. The id must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), and hyphens (-). It cannot begin or end with an underscore or hyphen, and must consist of between 3 and 50 characters.

Generated from protobuf field string step_id = 1 [(.google.api.field_behavior) = REQUIRED];

Returns
Type: string

setStepId

Required. The step id. The id must be unique among all jobs within the template.

The step id is used as a prefix for the job id, as the job's goog-dataproc-workflow-step-id label, and in the prerequisiteStepIds field of other steps. The id must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), and hyphens (-). It cannot begin or end with an underscore or hyphen, and must consist of between 3 and 50 characters.

Generated from protobuf field string step_id = 1 [(.google.api.field_behavior) = REQUIRED];

Parameter
Name: var
Type: string

Returns
Type: $this
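
A short sketch of the step id accessors; the id used here is an arbitrary example that satisfies the documented constraints.

use Google\Cloud\Dataproc\V1\OrderedJob;

$job = new OrderedJob();
// 3-50 characters, letters, numbers, underscores, and hyphens; no leading or trailing '_' or '-'.
$job->setStepId('ingest-raw-data');
echo $job->getStepId(); // "ingest-raw-data"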

getHadoopJob

Optional. Job is a Hadoop job.

Generated from protobuf field .google.cloud.dataproc.v1.HadoopJob hadoop_job = 2 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\HadoopJob|null

hasHadoopJob

setHadoopJob

Optional. Job is a Hadoop job.

Generated from protobuf field .google.cloud.dataproc.v1.HadoopJob hadoop_job = 2 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\HadoopJob

Returns
Type: $this
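
The job-type fields (hadoop_job, spark_job, pyspark_job, and so on) belong to a single job_type oneof in the underlying proto, so setting one of them replaces any previously set job type. A sketch with placeholder class names and a placeholder jar URI:

use Google\Cloud\Dataproc\V1\HadoopJob;
use Google\Cloud\Dataproc\V1\OrderedJob;
use Google\Cloud\Dataproc\V1\SparkJob;

$job = new OrderedJob();
// Setters return $this, so calls can be chained.
$job->setStepId('hadoop-step')
    ->setHadoopJob(new HadoopJob([
        'main_class' => 'org.example.WordCount',
        'jar_file_uris' => ['gs://my-bucket/wordcount.jar'],
    ]));

var_dump($job->hasHadoopJob()); // bool(true)

// Assigning a different job type clears the Hadoop job because the fields share a oneof.
$job->setSparkJob(new SparkJob(['main_class' => 'org.example.SparkApp']));
var_dump($job->hasHadoopJob()); // bool(false)
var_dump($job->hasSparkJob());  // bool(true)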

getSparkJob

Optional. Job is a Spark job.

Generated from protobuf field .google.cloud.dataproc.v1.SparkJob spark_job = 3 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\SparkJob|null

hasSparkJob

setSparkJob

Optional. Job is a Spark job.

Generated from protobuf field .google.cloud.dataproc.v1.SparkJob spark_job = 3 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\SparkJob

Returns
Type: $this

getPysparkJob

Optional. Job is a PySpark job.

Generated from protobuf field .google.cloud.dataproc.v1.PySparkJob pyspark_job = 4 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\PySparkJob|null

hasPysparkJob

setPysparkJob

Optional. Job is a PySpark job.

Generated from protobuf field .google.cloud.dataproc.v1.PySparkJob pyspark_job = 4 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\PySparkJob

Returns
Type: $this

getHiveJob

Optional. Job is a Hive job.

Generated from protobuf field .google.cloud.dataproc.v1.HiveJob hive_job = 5 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\HiveJob|null

hasHiveJob

setHiveJob

Optional. Job is a Hive job.

Generated from protobuf field .google.cloud.dataproc.v1.HiveJob hive_job = 5 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\HiveJob

Returns
Type: $this

getPigJob

Optional. Job is a Pig job.

Generated from protobuf field .google.cloud.dataproc.v1.PigJob pig_job = 6 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\PigJob|null

hasPigJob

setPigJob

Optional. Job is a Pig job.

Generated from protobuf field .google.cloud.dataproc.v1.PigJob pig_job = 6 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\PigJob

Returns
Type: $this

getSparkRJob

Optional. Job is a SparkR job.

Generated from protobuf field .google.cloud.dataproc.v1.SparkRJob spark_r_job = 11 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\SparkRJob|null

hasSparkRJob

setSparkRJob

Optional. Job is a SparkR job.

Generated from protobuf field .google.cloud.dataproc.v1.SparkRJob spark_r_job = 11 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\SparkRJob

Returns
Type: $this

getSparkSqlJob

Optional. Job is a SparkSql job.

Generated from protobuf field .google.cloud.dataproc.v1.SparkSqlJob spark_sql_job = 7 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\SparkSqlJob|null

hasSparkSqlJob

setSparkSqlJob

Optional. Job is a SparkSql job.

Generated from protobuf field .google.cloud.dataproc.v1.SparkSqlJob spark_sql_job = 7 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\SparkSqlJob

Returns
Type: $this
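
A sketch of configuring a Spark SQL step; the query file URI is a placeholder.

use Google\Cloud\Dataproc\V1\OrderedJob;
use Google\Cloud\Dataproc\V1\SparkSqlJob;

$job = new OrderedJob([
    'step_id' => 'nightly-report',
    'spark_sql_job' => new SparkSqlJob([
        // Placeholder URI of a file containing Spark SQL statements.
        'query_file_uri' => 'gs://my-bucket/queries/report.sql',
    ]),
]);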

getPrestoJob

Optional. Job is a Presto job.

Generated from protobuf field .google.cloud.dataproc.v1.PrestoJob presto_job = 12 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\PrestoJob|null

hasPrestoJob

setPrestoJob

Optional. Job is a Presto job.

Generated from protobuf field .google.cloud.dataproc.v1.PrestoJob presto_job = 12 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\PrestoJob

Returns
Type: $this

getLabels

Optional. The labels to associate with this job.

Label keys must be between 1 and 63 characters long, and must conform to the following regular expression: [\p{Ll}\p{Lo}][\p{Ll}\p{Lo}\p{N}-]{0,62}

Label values must be between 1 and 63 characters long, and must conform to the following regular expression: [\p{Ll}\p{Lo}\p{N}-]{0,63}

No more than 32 labels can be associated with a given job.

Generated from protobuf field map<string, string> labels = 8 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Protobuf\Internal\MapField

setLabels

Optional. The labels to associate with this job.

Label keys must be between 1 and 63 characters long, and must conform to the following regular expression: [\p{Ll}\p{Lo}][\p{Ll}\p{Lo}\p{N}-]{0,62}

Label values must be between 1 and 63 characters long, and must conform to the following regular expression: [\p{Ll}\p{Lo}\p{N}-]{0,63}

No more than 32 labels can be associated with a given job.

Generated from protobuf field map<string, string> labels = 8 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: array|Google\Protobuf\Internal\MapField

Returns
Type: $this
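
setLabels accepts either a plain associative array or a MapField; the keys and values below are arbitrary examples that follow the rules above.

use Google\Cloud\Dataproc\V1\OrderedJob;

$job = new OrderedJob();
// A plain PHP array is coerced to a MapField by the generated setter.
$job->setLabels([
    'env'  => 'staging',
    'team' => 'data-platform',
]);

// getLabels() returns an iterable MapField.
foreach ($job->getLabels() as $key => $value) {
    echo "$key=$value\n";
}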

getScheduling

Optional. Job scheduling configuration.

Generated from protobuf field .google.cloud.dataproc.v1.JobScheduling scheduling = 9 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Cloud\Dataproc\V1\JobScheduling|null

hasScheduling

clearScheduling

setScheduling

Optional. Job scheduling configuration.

Generated from protobuf field .google.cloud.dataproc.v1.JobScheduling scheduling = 9 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\JobScheduling

Returns
Type: $this
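
A sketch of attaching a scheduling policy; max_failures_per_hour is an existing JobScheduling field, but the value used here is arbitrary.

use Google\Cloud\Dataproc\V1\JobScheduling;
use Google\Cloud\Dataproc\V1\OrderedJob;

$job = new OrderedJob();
// Allow the job driver to be restarted up to 3 times per hour before the step fails.
$job->setScheduling(new JobScheduling(['max_failures_per_hour' => 3]));

if ($job->hasScheduling()) {
    echo $job->getScheduling()->getMaxFailuresPerHour(); // 3
}

// clearScheduling() removes the message again, after which getScheduling() returns null.
$job->clearScheduling();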

getPrerequisiteStepIds

Optional. The list of prerequisite job step_ids.

If not specified, the job will start at the beginning of the workflow.

Generated from protobuf field repeated string prerequisite_step_ids = 10 [(.google.api.field_behavior) = OPTIONAL];

Returns
Type: Google\Protobuf\Internal\RepeatedField

setPrerequisiteStepIds

Optional. The list of prerequisite job step_ids.

If not specified, the job will start at the beginning of the workflow.

Generated from protobuf field repeated string prerequisite_step_ids = 10 [(.google.api.field_behavior) = OPTIONAL];

Parameter
Name: var
Type: string[]

Returns
Type: $this
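
A sketch with placeholder step ids; the setter accepts a plain string array, and the getter returns a RepeatedField that can be iterated like an array.

use Google\Cloud\Dataproc\V1\OrderedJob;

$job = new OrderedJob(['step_id' => 'publish-results']);
// These ids must match step_id values of other jobs in the same template.
$job->setPrerequisiteStepIds(['ingest-raw-data', 'nightly-report']);

foreach ($job->getPrerequisiteStepIds() as $stepId) {
    echo $stepId . "\n";
}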

getJobType

Returns
Type: string
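
For generated PHP oneof accessors of this kind, getJobType is expected to return the name of the job-type field that is currently set (for example "pyspark_job"), or an empty string when no job type is set. The sketch below illustrates that expected behavior with a placeholder Python file URI.

use Google\Cloud\Dataproc\V1\OrderedJob;
use Google\Cloud\Dataproc\V1\PySparkJob;

$job = new OrderedJob();
echo $job->getJobType(); // "" (no job type set yet)

$job->setPysparkJob(new PySparkJob([
    'main_python_file_uri' => 'gs://my-bucket/job.py', // placeholder
]));
echo $job->getJobType(); // "pyspark_job"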