Google Cloud Dataproc V1 Client - Class HadoopJob (3.5.1)

Reference documentation and code samples for the Google Cloud Dataproc V1 Client class HadoopJob.

A Dataproc job for running Apache Hadoop MapReduce jobs on Apache Hadoop YARN.

Generated from protobuf message google.cloud.dataproc.v1.HadoopJob

Methods

__construct

Constructor.

Parameters
Name: data
Type: array

Optional. Data for populating the Message object.

↳ main_jar_file_uri string

The HCFS URI of the jar file containing the main class. Examples: 'gs://foo-bucket/analytics-binaries/extract-useful-metrics-mr.jar' 'hdfs:/tmp/test-samples/custom-wordcount.jar' 'file:///home/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar'

↳ main_class string

The name of the driver's main class. The jar file containing the class must be in the default CLASSPATH or specified in jar_file_uris.

↳ args array

Optional. The arguments to pass to the driver. Do not include arguments, such as -libjars or -Dfoo=bar, that can be set as job properties, since a collision may occur that causes an incorrect job submission.

↳ jar_file_uris array

Optional. Jar file URIs to add to the CLASSPATHs of the Hadoop driver and tasks.

↳ file_uris array

Optional. HCFS (Hadoop Compatible Filesystem) URIs of files to be copied to the working directory of Hadoop drivers and distributed tasks. Useful for naively parallel tasks.

↳ archive_uris array

Optional. HCFS URIs of archives to be extracted in the working directory of Hadoop drivers and tasks. Supported file types: .jar, .tar, .tar.gz, .tgz, or .zip.

↳ properties array|Google\Protobuf\Internal\MapField

Optional. A mapping of property names to values, used to configure Hadoop. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/hadoop/conf/*-site and classes in user code.

↳ logging_config Google\Cloud\Dataproc\V1\LoggingConfig

Optional. The runtime log config for job execution.
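
The snippet below is a minimal, illustrative sketch of populating a HadoopJob through the constructor's data array using the keys documented above; the bucket, jar, and argument values are hypothetical placeholders.

use Google\Cloud\Dataproc\V1\HadoopJob;

// Sketch only: the array keys match the constructor fields documented above;
// all URIs and values are hypothetical placeholders.
$hadoopJob = new HadoopJob([
    'main_jar_file_uri' => 'gs://example-bucket/jars/wordcount.jar',
    'args' => ['gs://example-bucket/input/', 'gs://example-bucket/output/'],
    'jar_file_uris' => ['gs://example-bucket/jars/helper.jar'],
    'properties' => ['mapreduce.job.reduces' => '4'],
]);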

getMainJarFileUri

The HCFS URI of the jar file containing the main class.

Examples: 'gs://foo-bucket/analytics-binaries/extract-useful-metrics-mr.jar' 'hdfs:/tmp/test-samples/custom-wordcount.jar' 'file:///home/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar'

Returns
Type: string

hasMainJarFileUri

Indicates whether the main_jar_file_uri field of the driver oneof is set.

setMainJarFileUri

The HCFS URI of the jar file containing the main class.

Examples: 'gs://foo-bucket/analytics-binaries/extract-useful-metrics-mr.jar' 'hdfs:/tmp/test-samples/custom-wordcount.jar' 'file:///home/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar'

Parameter
Name: var
Type: string

Returns
Type: $this
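
As an illustrative sketch (the URI is a placeholder), the setter, getter, and presence check can be used together as follows.

use Google\Cloud\Dataproc\V1\HadoopJob;

$job = new HadoopJob();
$job->setMainJarFileUri('gs://example-bucket/jars/wordcount.jar'); // placeholder URI
$uri = $job->getMainJarFileUri();   // 'gs://example-bucket/jars/wordcount.jar'
$isSet = $job->hasMainJarFileUri(); // true once this oneof field is set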

getMainClass

The name of the driver's main class. The jar file containing the class must be in the default CLASSPATH or specified in jar_file_uris.

Returns
Type: string

hasMainClass

Indicates whether the main_class field of the driver oneof is set.

setMainClass

The name of the driver's main class. The jar file containing the class must be in the default CLASSPATH or specified in jar_file_uris.

Parameter
Name: var
Type: string

Returns
Type: $this

getArgs

Optional. The arguments to pass to the driver. Do not include arguments, such as -libjars or -Dfoo=bar, that can be set as job properties, since a collision may occur that causes an incorrect job submission.

Returns
Type: Google\Protobuf\Internal\RepeatedField

setArgs

Optional. The arguments to pass to the driver. Do not include arguments, such as -libjars or -Dfoo=bar, that can be set as job properties, since a collision may occur that causes an incorrect job submission.

Parameter
Name: var
Type: string[]

Returns
Type: $this
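
A short sketch of the distinction drawn above: plain data paths go in args, while -D style settings belong in properties. The values are hypothetical.

use Google\Cloud\Dataproc\V1\HadoopJob;

$job = new HadoopJob();
// Driver arguments: positional values only, not -libjars or -Dfoo=bar settings.
$job->setArgs(['gs://example-bucket/input/', 'gs://example-bucket/output/']);
// Settings that could collide with driver arguments go in properties instead.
$job->setProperties(['mapreduce.map.memory.mb' => '2048']);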

getJarFileUris

Optional. Jar file URIs to add to the CLASSPATHs of the Hadoop driver and tasks.

Returns
Type: Google\Protobuf\Internal\RepeatedField

setJarFileUris

Optional. Jar file URIs to add to the CLASSPATHs of the Hadoop driver and tasks.

Parameter
Name: var
Type: string[]

Returns
Type: $this

getFileUris

Optional. HCFS (Hadoop Compatible Filesystem) URIs of files to be copied to the working directory of Hadoop drivers and distributed tasks. Useful for naively parallel tasks.

Returns
Type: Google\Protobuf\Internal\RepeatedField

setFileUris

Optional. HCFS (Hadoop Compatible Filesystem) URIs of files to be copied to the working directory of Hadoop drivers and distributed tasks. Useful for naively parallel tasks.

Parameter
Name: var
Type: string[]

Returns
Type: $this

getArchiveUris

Optional. HCFS URIs of archives to be extracted in the working directory of Hadoop drivers and tasks. Supported file types: .jar, .tar, .tar.gz, .tgz, or .zip.

Returns
Type: Google\Protobuf\Internal\RepeatedField

setArchiveUris

Optional. HCFS URIs of archives to be extracted in the working directory of Hadoop drivers and tasks. Supported file types: .jar, .tar, .tar.gz, .tgz, or .zip.

Parameter
Name: var
Type: string[]

Returns
Type: $this
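
The three URI lists (jar_file_uris, file_uris, archive_uris) can be set together. A hedged sketch with placeholder URIs; the corresponding getters return a RepeatedField, which iterates like an array.

use Google\Cloud\Dataproc\V1\HadoopJob;

$job = new HadoopJob();
$job->setJarFileUris(['gs://example-bucket/jars/extra-lib.jar']);     // added to the CLASSPATH
$job->setFileUris(['gs://example-bucket/data/lookup.csv']);           // copied to the working directory
$job->setArchiveUris(['gs://example-bucket/archives/resources.zip']); // extracted in the working directory

foreach ($job->getArchiveUris() as $uri) {
    echo $uri, PHP_EOL;
}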

getProperties

Optional. A mapping of property names to values, used to configure Hadoop.

Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/hadoop/conf/*-site and classes in user code.

Returns
Type: Google\Protobuf\Internal\MapField

setProperties

Optional. A mapping of property names to values, used to configure Hadoop.

Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/hadoop/conf/*-site and classes in user code.

Parameter
Name: var
Type: array|Google\Protobuf\Internal\MapField

Returns
Type: $this
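
A brief sketch with hypothetical property values: setProperties accepts a plain PHP array, and getProperties returns a MapField that can be iterated like an associative array.

use Google\Cloud\Dataproc\V1\HadoopJob;

$job = new HadoopJob();
$job->setProperties([
    'mapreduce.job.reduces'    => '8',
    'mapreduce.map.memory.mb'  => '2048',
]);

foreach ($job->getProperties() as $name => $value) {
    echo "$name=$value", PHP_EOL;
}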

getLoggingConfig

Optional. The runtime log config for job execution.

Returns
Type: Google\Cloud\Dataproc\V1\LoggingConfig|null

hasLoggingConfig

Indicates whether the logging_config field is set.

clearLoggingConfig

Clears the logging_config field.

setLoggingConfig

Optional. The runtime log config for job execution.

Parameter
Name: var
Type: Google\Cloud\Dataproc\V1\LoggingConfig

Returns
Type: $this
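
A hedged sketch of attaching a log config: it assumes LoggingConfig exposes setDriverLogLevels (or the equivalent driver_log_levels constructor key) with a nested Level enum, as defined in the google.cloud.dataproc.v1 protobuf; adjust to the actual LoggingConfig API if it differs.

use Google\Cloud\Dataproc\V1\HadoopJob;
use Google\Cloud\Dataproc\V1\LoggingConfig;
use Google\Cloud\Dataproc\V1\LoggingConfig\Level;

// Assumption: LoggingConfig carries a driver_log_levels map of package => Level.
$loggingConfig = new LoggingConfig([
    'driver_log_levels' => ['org.apache.hadoop' => Level::WARN],
]);

$job = new HadoopJob();
$job->setLoggingConfig($loggingConfig);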

getDriver

Returns the name of the field that is currently set in the driver oneof ("main_jar_file_uri" or "main_class"), or an empty string if neither is set.

Returns
Type: string
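
A short sketch of the driver oneof behaviour described above (the class name is a placeholder): exactly one of main_jar_file_uri or main_class is set at a time, and getDriver reports which.

use Google\Cloud\Dataproc\V1\HadoopJob;

$job = new HadoopJob(['main_class' => 'org.example.WordCount']); // placeholder class name
var_dump($job->hasMainClass());      // bool(true)
var_dump($job->hasMainJarFileUri()); // bool(false)
echo $job->getDriver();              // "main_class"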