Reference documentation and code samples for the Cloud Dataproc V1 API class Google::Cloud::Dataproc::V1::SparkRJob.
A Dataproc job for running Apache SparkR applications on YARN.
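A minimal sketch of constructing this message and submitting it through the JobController client, assuming an existing cluster; all resource names (bucket, cluster, project, region) are placeholders:

```ruby
require "google/cloud/dataproc/v1"

# Build the SparkRJob message; the field names match the accessors
# documented below.
spark_r_job = Google::Cloud::Dataproc::V1::SparkRJob.new(
  main_r_file_uri: "gs://my-bucket/jobs/analysis.R",      # required driver script
  args:            ["--input", "gs://my-bucket/data/"],   # application arguments
  properties:      { "spark.executor.memory" => "4g" }    # SparkR configuration
)

# Wrap the payload in a Job and submit it to an existing cluster.
job = Google::Cloud::Dataproc::V1::Job.new(
  placement:   Google::Cloud::Dataproc::V1::JobPlacement.new(cluster_name: "my-cluster"),
  spark_r_job: spark_r_job
)

client = Google::Cloud::Dataproc::V1::JobController::Client.new do |config|
  # Dataproc uses regional endpoints; match the region passed to submit_job.
  config.endpoint = "us-central1-dataproc.googleapis.com"
end
client.submit_job project_id: "my-project", region: "us-central1", job: job
```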
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
Methods
#archive_uris
def archive_uris() -> ::Array<::String>
Returns
- (::Array<::String>) — Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
#archive_uris=
def archive_uris=(value) -> ::Array<::String>
Parameter
- value (::Array<::String>) — Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
Returns
- (::Array<::String>) — Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
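Continuing the construction sketch above, the getter returns a repeated field that can be appended to in place; the archive URI is a placeholder:

```ruby
# Hypothetical bundle of R package dependencies; it is extracted into
# each executor's working directory before the job runs.
spark_r_job.archive_uris << "gs://my-bucket/deps/r-packages.tar.gz"
```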
#args
def args() -> ::Array<::String>
Returns
- (::Array<::String>) — Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
#args=
def args=(value) -> ::Array<::String>
Parameter
- value (::Array<::String>) — Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
Returns
- (::Array<::String>) — Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
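To illustrate the warning about --conf: flags that Spark would otherwise read from the command line belong in #properties, while #args carries only application-level arguments. Continuing the sketch above, with placeholder values:

```ruby
# Application arguments read by the R driver script.
spark_r_job.args << "--input"
spark_r_job.args << "gs://my-bucket/data/2024/"

# Anything you would have passed via --conf goes into properties instead,
# so it cannot collide with settings applied by the Dataproc API.
spark_r_job.properties["spark.dynamicAllocation.enabled"] = "false"
```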
#file_uris
def file_uris() -> ::Array<::String>
Returns
- (::Array<::String>) — Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
#file_uris=
def file_uris=(value) -> ::Array<::String>
Parameter
- value (::Array<::String>) — Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
Returns
- (::Array<::String>) — Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
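Because listed files land in each executor's working directory, the R code can open them by bare filename. A short sketch continuing from above, with a placeholder path:

```ruby
# Lookup table shipped to every executor; readable in R as "lookup.csv".
spark_r_job.file_uris << "gs://my-bucket/reference/lookup.csv"
```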
#logging_config
def logging_config() -> ::Google::Cloud::Dataproc::V1::LoggingConfig
Returns
- (::Google::Cloud::Dataproc::V1::LoggingConfig) — Optional. The runtime log config for job execution.
#logging_config=
def logging_config=(value) -> ::Google::Cloud::Dataproc::V1::LoggingConfig
Parameter
- value (::Google::Cloud::Dataproc::V1::LoggingConfig) — Optional. The runtime log config for job execution.
Returns
- (::Google::Cloud::Dataproc::V1::LoggingConfig) — Optional. The runtime log config for job execution.
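A sketch of attaching a log config, continuing from above: driver_log_levels maps logger names to levels, and the logger names and levels shown are illustrative:

```ruby
spark_r_job.logging_config = Google::Cloud::Dataproc::V1::LoggingConfig.new(
  # Quiet the root logger, keep Spark's own logging at INFO.
  driver_log_levels: { "root" => :WARN, "org.apache.spark" => :INFO }
)
```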
#main_r_file_uri
def main_r_file_uri() -> ::String
Returns
- (::String) — Required. The HCFS URI of the main R file to use as the driver. Must be a .R file.
#main_r_file_uri=
def main_r_file_uri=(value) -> ::String
Parameter
- value (::String) — Required. The HCFS URI of the main R file to use as the driver. Must be a .R file.
Returns
- (::String) — Required. The HCFS URI of the main R file to use as the driver. Must be a .R file.
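For example, with a placeholder bucket and path, the driver is given as an HCFS URI (such as a gs:// or hdfs:// path) ending in .R:

```ruby
spark_r_job.main_r_file_uri = "gs://my-bucket/jobs/analysis.R"
```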
#properties
def properties() -> ::Google::Protobuf::Map{::String => ::String}
Returns
- (::Google::Protobuf::Map{::String => ::String}) — Optional. A mapping of property names to values, used to configure SparkR. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
#properties=
def properties=(value) -> ::Google::Protobuf::Map{::String => ::String}
Parameter
- value (::Google::Protobuf::Map{::String => ::String}) — Optional. A mapping of property names to values, used to configure SparkR. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
Returns
- (::Google::Protobuf::Map{::String => ::String}) — Optional. A mapping of property names to values, used to configure SparkR. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
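The keys use the same names as spark-defaults.conf entries; the getter returns a protobuf Map, so individual properties can be set in place. A sketch with illustrative keys, continuing from above:

```ruby
spark_r_job.properties["spark.executor.memory"] = "4g"
spark_r_job.properties["spark.executor.cores"]  = "2"
```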