Use R with BigQuery

This page describes how to load data from BigQuery into an R data frame by using the bigrquery R package.

Before you begin

Before you begin, create a new AI Platform Notebooks instance for R.

Create a BigQuery service account key

To create a BigQuery service account key:

  1. Go to the APIs & Services Credentials page in the Google Cloud Platform Console.

    Go to the APIs & Services Credentials page

  2. Select Create credentials -> Service account key.

    Create a credential

  3. In the Service account drop-down menu, select an existing credential that has BigQuery access, or select New service account to create a new one.

    To create a new service account, enter a Service account name. In the Role drop-down menu, select BigQuery, and then select the role that you want to assign, such as BigQuery Admin.

    Create a service account

  4. Select JSON as the key type.

    Select JSON

  5. Click Create. Your browser downloads a JSON file that contains the service account key. This file allows access to your resources, so store it securely.

  6. Click Close to close the Private key saved to your computer message.

Load data from BigQuery using R

To load data from BigQuery with the bigrquery R package, follow these steps:

  1. Go to the AI Platform Notebooks page in the Google Cloud Platform Console.

    Go to the AI Platform Notebooks page

  2. Select Open JupyterLab for the R instance that you want to open.

    Open JupyterLab

  3. Click the Upload Files button.

    The Upload Files button

  4. Browse to the JSON file that you created, select it, and then click Open to upload it.

  5. Select File -> New -> Notebook, and then select the R kernel.

    Add an R notebook

  6. In the notebook's first code cell, enter the following: library("bigrquery").

    Enter code in the first cell

  7. Click the run button to run the command. R loads the bigrquery package.

    The run button

  8. To add a code cell to the notebook, click the notebook's + button.

    The + button

  9. In the new code cell, enter the following code sample. Replace path-name.json with the path to your notebook instance's service account key JSON file, and replace project-id with your Google Cloud Platform project ID.

    To copy the path for your JSON file, right-click (Command + click on Mac) the filename in the JupyterLab file browser, and then select Copy Path.

    To get your project ID, click the drop-down arrow next to your project name in the Google Cloud Platform Console.

    # Provide authentication through the JSON service account key
    path <- "path-name.json"
    set_service_token(path)
    
    # Store the project ID
    projectid <- "project-id"
    
    # Set your query
    sql <- "SELECT year, month, day, weight_pounds FROM `publicdata.samples.natality` LIMIT 100"
    
    # Run the query and store the result in a data frame
    tb <- bq_project_query(x = projectid, query = sql)
    df <- bq_table_download(tb)
    
    # View the query result
    df
    
  10. Run the cell to view 100 rows of data from one of BigQuery's public datasets.
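The steps above run a raw SQL string through bq_project_query. The bigrquery package also exposes a DBI backend, so the same query can be written with dplyr verbs instead of SQL. The following is a minimal sketch, assuming the bigrquery, DBI, and dplyr packages are installed; "path-name.json" and "project-id" are the same placeholders used in the code sample above.

```r
# Authenticate with the same service account key as before
library(bigrquery)
library(DBI)
library(dplyr)

set_service_token("path-name.json")

# Connect to the public dataset; your project is billed for the query
con <- dbConnect(
  bigrquery::bigquery(),
  project = "publicdata",
  dataset = "samples",
  billing = "project-id"
)

# Build the query lazily with dplyr, then collect() to download the rows
natality <- tbl(con, "natality")

df <- natality %>%
  select(year, month, day, weight_pounds) %>%
  head(100) %>%
  collect()

df
```

With this approach, dplyr translates the pipeline into SQL and nothing is downloaded until collect() runs, which can be convenient when you want to filter or aggregate in BigQuery before pulling data into R.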

What's next

Read the bigrquery documentation to learn more about how you can use BigQuery data in your R notebooks.
