Google Cloud Dataproc: Node.js Client

Google Cloud Dataproc API client for Node.js

A comprehensive list of changes in each version may be found in the CHANGELOG.

Read more about the client libraries for Cloud APIs, including the older Google APIs Client Libraries, in Client Libraries Explained.

Table of contents:

  • Quickstart
      • Before you begin
      • Installing the client library
      • Using the client library
  • Samples
  • Supported Node.js Versions
  • Versioning
  • Contributing
  • License

Quickstart

Before you begin

  1. Select or create a Cloud Platform project.
  2. Enable billing for your project.
  3. Enable the Google Cloud Dataproc API.
  4. Set up authentication with a service account so you can access the API from your local workstation; one way to pass those credentials explicitly in code is sketched after this list.
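
If you would rather not rely on the GOOGLE_APPLICATION_CREDENTIALS environment variable, the generated clients also accept the standard google-auth options in their constructors. A minimal sketch, assuming a downloaded service account key file; the key path and region below are placeholders:

// Minimal sketch: explicit credentials via a service account key file.
// Application Default Credentials work equally well and need no extra code.
const dataproc = require('@google-cloud/dataproc');

const clusterClient = new dataproc.v1.ClusterControllerClient({
  apiEndpoint: 'us-central1-dataproc.googleapis.com', // placeholder region
  keyFilename: '/path/to/service-account-key.json', // placeholder key path
});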

Installing the client library

npm install @google-cloud/dataproc

Using the client library

// This quickstart sample walks a user through creating a Dataproc
// cluster, submitting a PySpark job from Google Cloud Storage to the
// cluster, reading the output of the job and deleting the cluster, all
// using the Node.js client library.

'use strict';

function main(projectId, region, clusterName, jobFilePath) {
  const dataproc = require('@google-cloud/dataproc');
  const {Storage} = require('@google-cloud/storage');

  // Create a cluster client with the endpoint set to the desired cluster region
  const clusterClient = new dataproc.v1.ClusterControllerClient({
    apiEndpoint: `${region}-dataproc.googleapis.com`,
    projectId: projectId,
  });

  // Create a job client with the endpoint set to the desired cluster region
  const jobClient = new dataproc.v1.JobControllerClient({
    apiEndpoint: `${region}-dataproc.googleapis.com`,
    projectId: projectId,
  });

  async function quickstart() {
    // Create the cluster config
    const cluster = {
      projectId: projectId,
      region: region,
      cluster: {
        clusterName: clusterName,
        config: {
          masterConfig: {
            numInstances: 1,
            machineTypeUri: 'n1-standard-2',
          },
          workerConfig: {
            numInstances: 2,
            machineTypeUri: 'n1-standard-2',
          },
        },
      },
    };

    // Create the cluster
    const [operation] = await clusterClient.createCluster(cluster);
    const [response] = await operation.promise();

    // Output a success message
    console.log(`Cluster created successfully: ${response.clusterName}`);

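    // Define the PySpark job config, pointing at the main Python file in Cloud Storage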
    const job = {
      projectId: projectId,
      region: region,
      job: {
        placement: {
          clusterName: clusterName,
        },
        pysparkJob: {
          mainPythonFileUri: jobFilePath,
        },
      },
    };

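    // Submit the job as a long-running operation and wait for it to complete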
    const [jobOperation] = await jobClient.submitJobAsOperation(job);
    const [jobResponse] = await jobOperation.promise();

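    // The driver output is written to Cloud Storage; extract the bucket and object prefix from the gs:// URI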
    const matches =
      jobResponse.driverOutputResourceUri.match('gs://(.*?)/(.*)');

    const storage = new Storage();

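    // Download the first shard (.000000000) of the driver output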
    const output = await storage
      .bucket(matches[1])
      .file(`${matches[2]}.000000000`)
      .download();

    // Output a success message.
    console.log(`Job finished successfully: ${output}`);

    // Delete the cluster once the job has terminated.
    const deleteClusterReq = {
      projectId: projectId,
      region: region,
      clusterName: clusterName,
    };

    const [deleteOperation] =
      await clusterClient.deleteCluster(deleteClusterReq);
    await deleteOperation.promise();

    // Output a success message
    console.log(`Cluster ${clusterName} successfully deleted.`);
  }

  quickstart();
}

const args = process.argv.slice(2);

if (args.length !== 4) {
  console.log(
    'Incorrect number of parameters provided. Please make sure a ' +
      'PROJECT_ID, REGION, CLUSTER_NAME and JOB_FILE_PATH are provided, in this order.'
  );
  process.exit(1);
}

main(...args);
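
Assuming the sample above is saved as quickstart.js (the file name here is only an example), it can be run with node quickstart.js <PROJECT_ID> <REGION> <CLUSTER_NAME> <JOB_FILE_PATH>, where JOB_FILE_PATH is a Cloud Storage URI (for example gs://your-bucket/your-script.py) pointing to the PySpark script to submit.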

Samples

Samples are in the samples/ directory. Each sample's README.md has instructions for running its sample.

Sample | Source Code | Try it
Autoscaling_policy_service.create_autoscaling_policy | source code | Open in Cloud Shell
Autoscaling_policy_service.delete_autoscaling_policy | source code | Open in Cloud Shell
Autoscaling_policy_service.get_autoscaling_policy | source code | Open in Cloud Shell
Autoscaling_policy_service.list_autoscaling_policies | source code | Open in Cloud Shell
Autoscaling_policy_service.update_autoscaling_policy | source code | Open in Cloud Shell
Batch_controller.create_batch | source code | Open in Cloud Shell
Batch_controller.delete_batch | source code | Open in Cloud Shell
Batch_controller.get_batch | source code | Open in Cloud Shell
Batch_controller.list_batches | source code | Open in Cloud Shell
Cluster_controller.create_cluster | source code | Open in Cloud Shell
Cluster_controller.delete_cluster | source code | Open in Cloud Shell
Cluster_controller.diagnose_cluster | source code | Open in Cloud Shell
Cluster_controller.get_cluster | source code | Open in Cloud Shell
Cluster_controller.list_clusters | source code | Open in Cloud Shell
Cluster_controller.start_cluster | source code | Open in Cloud Shell
Cluster_controller.stop_cluster | source code | Open in Cloud Shell
Cluster_controller.update_cluster | source code | Open in Cloud Shell
Job_controller.cancel_job | source code | Open in Cloud Shell
Job_controller.delete_job | source code | Open in Cloud Shell
Job_controller.get_job | source code | Open in Cloud Shell
Job_controller.list_jobs | source code | Open in Cloud Shell
Job_controller.submit_job | source code | Open in Cloud Shell
Job_controller.submit_job_as_operation | source code | Open in Cloud Shell
Job_controller.update_job | source code | Open in Cloud Shell
Node_group_controller.create_node_group | source code | Open in Cloud Shell
Node_group_controller.get_node_group | source code | Open in Cloud Shell
Node_group_controller.resize_node_group | source code | Open in Cloud Shell
Session_controller.create_session | source code | Open in Cloud Shell
Session_controller.delete_session | source code | Open in Cloud Shell
Session_controller.get_session | source code | Open in Cloud Shell
Session_controller.list_sessions | source code | Open in Cloud Shell
Session_controller.terminate_session | source code | Open in Cloud Shell
Session_template_controller.create_session_template | source code | Open in Cloud Shell
Session_template_controller.delete_session_template | source code | Open in Cloud Shell
Session_template_controller.get_session_template | source code | Open in Cloud Shell
Session_template_controller.list_session_templates | source code | Open in Cloud Shell
Session_template_controller.update_session_template | source code | Open in Cloud Shell
Workflow_template_service.create_workflow_template | source code | Open in Cloud Shell
Workflow_template_service.delete_workflow_template | source code | Open in Cloud Shell
Workflow_template_service.get_workflow_template | source code | Open in Cloud Shell
Workflow_template_service.instantiate_inline_workflow_template | source code | Open in Cloud Shell
Workflow_template_service.instantiate_workflow_template | source code | Open in Cloud Shell
Workflow_template_service.list_workflow_templates | source code | Open in Cloud Shell
Workflow_template_service.update_workflow_template | source code | Open in Cloud Shell
Quickstart | source code | Open in Cloud Shell

The Google Cloud Dataproc Node.js Client API Reference documentation also contains samples.
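
As a quick illustration of the pattern these samples follow, here is a minimal sketch that lists the clusters in a single region; the project ID and region values are placeholders:

// Minimal sketch: list the Dataproc clusters in one region.
const dataproc = require('@google-cloud/dataproc');

async function listClusters() {
  // The regional endpoint must match the region being queried.
  const clusterClient = new dataproc.v1.ClusterControllerClient({
    apiEndpoint: 'us-central1-dataproc.googleapis.com', // placeholder region
  });

  const [clusters] = await clusterClient.listClusters({
    projectId: 'my-project', // placeholder project ID
    region: 'us-central1',
  });

  for (const cluster of clusters) {
    console.log(`${cluster.clusterName}: ${cluster.status.state}`);
  }
}

listClusters();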

Supported Node.js Versions

Our client libraries follow the Node.js release schedule. Libraries are compatible with all current active and maintenance versions of Node.js. If you are using an end-of-life version of Node.js, we recommend that you update as soon as possible to an actively supported LTS version.

Google's client libraries support legacy versions of Node.js runtimes on a best-efforts basis with the following warnings:

  • Legacy versions are not tested in continuous integration.
  • Some security patches and features cannot be backported.
  • Dependencies cannot be kept up-to-date.

Client libraries targeting some end-of-life versions of Node.js are available, and can be installed through npm dist-tags. The dist-tags follow the naming convention legacy-(version). For example, npm install @google-cloud/dataproc@legacy-8 installs client libraries for versions compatible with Node.js 8.

Versioning

This library follows Semantic Versioning.

This library is considered to be stable. The API surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against stable libraries are addressed with the highest priority.

More Information: Google Cloud Platform Launch Stages

Contributing

Contributions welcome! See the Contributing Guide.

Please note that this README.md, the samples/README.md, and a variety of configuration files in this repository (including .nycrc and tsconfig.json) are generated from a central template. To edit one of these files, make an edit to its template in the central templates directory instead.

License

Apache Version 2.0

See LICENSE