A Dataproc job for running `Apache Spark <http://spark.apache.org/>`__
applications on YARN.

The specification of the main method to call to drive the job. Specify
either the jar file that contains the main class or the main class name.
To pass both a main jar and a main class in that jar, add the jar to
``CommonJob.jar_file_uris``, and then specify the main class name in
``main_class``.
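For illustration, the sketch below constructs a ``SparkJob`` message both ways; the bucket path and class name are placeholder assumptions, not values taken from this API.

.. code-block:: python

    from google.cloud import dataproc_v1beta2

    # Option 1: point at a jar whose manifest declares the main class
    # (placeholder Cloud Storage path).
    spark_job = dataproc_v1beta2.types.SparkJob(
        main_jar_file_uri="gs://my-bucket/jars/my-app.jar",
    )

    # Option 2: name the main class and add the jar that contains it
    # to jar_file_uris (placeholder class and path).
    spark_job = dataproc_v1beta2.types.SparkJob(
        main_class="com.example.MyApp",
        jar_file_uris=["gs://my-bucket/jars/my-app.jar"],
    )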
.. attribute:: main_jar_file_uri

    The HCFS URI of the jar file that contains the main class.
.. attribute:: args

    Optional. The arguments to pass to the driver. Do not include
    arguments, such as ``--conf``, that can be set as job properties,
    since a collision may occur that causes an incorrect job submission.

.. attribute:: file_uris

    Optional. HCFS URIs of files to be copied to the working directory
    of Spark drivers and distributed tasks. Useful for naively parallel
    tasks.

.. attribute:: properties

    Optional. A mapping of property names to values, used to configure
    Spark. Properties that conflict with values set by the Dataproc API
    may be overwritten. Can include properties set in
    /etc/spark/conf/spark-defaults.conf and classes in user code.
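As a minimal sketch of how these fields fit together, the example below sets ``args``, ``file_uris``, and ``properties`` and submits the job with ``JobControllerClient.submit_job``; the project, region, cluster name, and Cloud Storage paths are placeholder assumptions.

.. code-block:: python

    from google.cloud import dataproc_v1beta2

    spark_job = dataproc_v1beta2.types.SparkJob(
        main_class="com.example.WordCount",
        jar_file_uris=["gs://my-bucket/jars/wordcount.jar"],
        # Plain application arguments; Spark settings such as --conf
        # belong in `properties` instead.
        args=["gs://my-bucket/input/", "gs://my-bucket/output/"],
        # Files copied to the working directory of drivers and executors.
        file_uris=["gs://my-bucket/config/lookup.csv"],
        # Spark configuration, e.g. values otherwise set in
        # /etc/spark/conf/spark-defaults.conf.
        properties={"spark.executor.memory": "4g"},
    )

    job = dataproc_v1beta2.types.Job(
        placement=dataproc_v1beta2.types.JobPlacement(cluster_name="my-cluster"),
        spark_job=spark_job,
    )

    client = dataproc_v1beta2.JobControllerClient()
    submitted = client.submit_job(
        project_id="my-project", region="us-central1", job=job
    )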
Inheritance
builtins.object > google.protobuf.pyext._message.CMessage > builtins.object > google.protobuf.message.Message > SparkJob

Classes

PropertiesEntry
    API documentation for ``dataproc_v1beta2.types.SparkJob.PropertiesEntry`` class.