Managing datasets

This document describes how to manage datasets in BigQuery. After creating a dataset, you can manage the dataset in the following ways:

  • Update a dataset's:
    • Description
    • Default expiration time for new tables
    • Default expiration time for partitions in new partitioned tables
    • Access controls
    • Labels
  • Rename (recreate) a dataset
  • Copy (recreate) a dataset
  • Delete a dataset

For information on creating and using datasets, including listing datasets, getting information about datasets, and setting dataset access controls, see Creating and Using Datasets.

Updating dataset properties

You can update a dataset's:

  • Description
  • Default expiration time for new tables
  • Default expiration time for partitions in new partitioned tables
  • Access controls
  • Labels

Required permissions

To update dataset properties, you must have OWNER access at the dataset level, or you must be assigned a project-level IAM role that includes bigquery.datasets.update permissions (for example, the predefined bigquery.dataOwner or bigquery.admin roles).

In addition, because the bigquery.user role has bigquery.datasets.create permissions, a user assigned to the bigquery.user role can update any dataset that user creates. When a user assigned to the bigquery.user role creates a dataset, that user is given OWNER access to the dataset. OWNER access to a dataset gives the user full control over it.

For more information on IAM roles and permissions in BigQuery, see Access Control. For more information on dataset-level roles, see Primitive roles for datasets.

Updating dataset descriptions

To update a dataset's description, you can use the BigQuery web UI, the bq update CLI command, or the datasets.patch API method.

To update a dataset's description:

Web UI

  1. In the navigation pane, select your dataset.

  2. On the Dataset Details page, in the Description section: if the dataset has no description, click Describe this dataset to open the description box; otherwise, click the existing description text.

  3. Enter a description in the box or edit the existing description. When you click away from the box, the text is saved.

CLI

Issue the bq update command with the --description flag. If you are updating a dataset in a project other than your default project, add the project ID to the dataset name in the following format: [PROJECT_ID]:[DATASET].

bq update --description "[DESCRIPTION]" [PROJECT_ID]:[DATASET]

Where:

  • [DESCRIPTION] is the description text, enclosed in quotes.
  • [PROJECT_ID] is your project ID.
  • [DATASET] is the name of the dataset you're updating.

Examples:

Enter the following command to change the description of mydataset to "Description of mydataset." mydataset is in your default project.

bq update --description "Description of mydataset" mydataset

Enter the following command to change the description of mydataset to "Description of mydataset." The dataset is in myotherproject, not your default project.

bq update --description "Description of mydataset" myotherproject:mydataset

API

Call datasets.patch and use the description property to apply your dataset description. Because the datasets.update method replaces the entire dataset resource, the datasets.patch method is preferred.

Go

Before trying this sample, follow the Go setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Go API reference documentation.

ds := client.Dataset(datasetID)
meta, err := ds.Metadata(ctx)
if err != nil {
	return err
}
// Update the description. Passing the ETag ensures the dataset
// has not been modified since the metadata was read.
update := bigquery.DatasetMetadataToUpdate{
	Description: "Updated Description.",
}
if _, err = ds.Update(ctx, update, meta.ETag); err != nil {
	return err
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Java API reference documentation.

Create a Dataset.Builder instance from an existing Dataset instance with the Dataset.toBuilder() method. Configure the dataset builder object. Build the updated dataset with the Dataset.Builder.build() method, and call the BigQuery.update() method to send the update to the API.

Dataset oldDataset = bigquery.getDataset(datasetName);
DatasetInfo datasetInfo = oldDataset.toBuilder().setDescription(newDescription).build();
Dataset newDataset = bigquery.update(datasetInfo);

Python

Before trying this sample, follow the Python setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Python API reference documentation.

Configure the Dataset.description property and call Client.update_dataset() to send the update to the API.

# from google.cloud import bigquery
# client = bigquery.Client()
# dataset_ref = client.dataset('my_dataset')
# dataset = client.get_dataset(dataset_ref)  # API request

assert dataset.description == 'Original description.'
dataset.description = 'Updated description.'

dataset = client.update_dataset(dataset, ['description'])  # API request

assert dataset.description == 'Updated description.'

Updating default table expiration times

To update a dataset's default table expiration time, use the BigQuery web UI, the bq update CLI command, or the datasets.patch API method.

You can set a default table expiration time at the dataset level, or you can set a table's expiration time when the table is created. If you set the expiration when the table is created, the dataset's default table expiration is ignored. If you do not set a default table expiration at the dataset level, and you do not set a table expiration when the table is created, the table never expires and you must delete the table manually.
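
For example, a table-level expiration can be set when the table is created; the following minimal sketch (using the Python client library, with placeholder dataset and table names) creates a table whose explicit expiration overrides the dataset default:

# from google.cloud import bigquery
# import datetime
# client = bigquery.Client()

# A table-level expiration set at creation overrides the dataset's
# default table expiration.
table_ref = client.dataset('my_dataset').table('my_table')
table = bigquery.Table(table_ref)
table.expires = (datetime.datetime.now(datetime.timezone.utc)
                 + datetime.timedelta(hours=2))
table = client.create_table(table)  # API request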

When you update a dataset's default table expiration setting:

  • If you change the value from Never to a defined expiration time, any tables that already exist in the dataset will not expire unless the expiration time was set on the table when it was created.
  • If you are changing the value for the default table expiration, any tables that already exist expire according to the original table expiration setting. Any new tables created in the dataset have the new table expiration setting applied unless you specify a different table expiration on the table when it is created.

The value for default table expiration is expressed differently depending on where the value is set. Use the method that gives you the appropriate level of granularity:

  • In the BigQuery web UI, expiration is expressed in days.
  • In the command-line tool, expiration is expressed in seconds.
  • In the API, expiration is expressed in milliseconds.
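
For example, a one-day default expiration is expressed as follows (a quick sketch; the variable names are illustrative):

one_day_days = 1                      # web UI: days
one_day_seconds = 24 * 60 * 60        # CLI: 86400 seconds
one_day_ms = one_day_seconds * 1000   # API: 86400000 milliseconds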

To update the default expiration time for a dataset:

Web UI

To update the default expiration time using the web UI:

  1. In the navigation pane, select your dataset.

  2. On the Dataset Details page, in the Details section, to the right of Default Table Expiration, click Edit.

  3. In the Update Expiration dialog, for Data expiration, click In and enter the expiration time in days. The default value is Never.

CLI

To update the default expiration time for newly created tables in a dataset, enter the bq update command with the --default_table_expiration flag. If you are updating a dataset in a project other than your default project, add the project ID to the dataset name in the following format: [PROJECT_ID]:[DATASET].

bq update --default_table_expiration [INTEGER] [PROJECT_ID]:[DATASET]

Where:

  • [INTEGER] is the default lifetime (in seconds) for newly created tables. The minimum value is 3600 seconds (one hour). Specify 0 to remove the existing expiration time. Any table created in the dataset is deleted [INTEGER] seconds after its creation time. This value is applied if you do not set a table expiration when the table is created.
  • [PROJECT_ID] is your project ID.
  • [DATASET] is the name of the dataset you're updating.

Examples:

Enter the following command to set the default table expiration for new tables created in mydataset to two hours (7200 seconds) from the current time. The dataset is in your default project.

bq update --default_table_expiration 7200 mydataset

Enter the following command to set the default table expiration for new tables created in mydataset to two hours (7200 seconds) from the current time. The dataset is in myotherproject, not your default project.

bq update --default_table_expiration 7200 myotherproject:mydataset

API

Call datasets.patch and use the defaultTableExpirationMs property to apply your default table expiration in milliseconds. Because the datasets.update method replaces the entire dataset resource, the datasets.patch method is preferred.

Go

Before trying this sample, follow the Go setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Go API reference documentation.

ds := client.Dataset(datasetID)
meta, err := ds.Metadata(ctx)
if err != nil {
	return err
}
// Set the default expiration for new tables in the dataset to 24 hours.
update := bigquery.DatasetMetadataToUpdate{
	DefaultTableExpiration: 24 * time.Hour,
}
if _, err := ds.Update(ctx, update, meta.ETag); err != nil {
	return err
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Java API reference documentation.

Create a Dataset.Builder instance from an existing Dataset instance with the Dataset.toBuilder() method. Configure the dataset builder object. Build the updated dataset with the Dataset.Builder.build() method, and call the BigQuery.update() method to send the update to the API.

Configure the default expiration time with the Dataset.Builder.setDefaultTableLifetime() method.

Long beforeExpiration = dataset.getDefaultTableLifetime();

Long oneDayMilliseconds = 24 * 60 * 60 * 1000L;
DatasetInfo.Builder builder = dataset.toBuilder();
builder.setDefaultTableLifetime(oneDayMilliseconds);
bigquery.update(builder.build());  // API request.

Python

Before trying this sample, follow the Python setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Python API reference documentation.

Configure the Dataset.default_table_expiration_ms property and call Client.update_dataset() to send the update to the API.

# from google.cloud import bigquery
# client = bigquery.Client()
# dataset_ref = client.dataset('my_dataset')
# dataset = client.get_dataset(dataset_ref)  # API request

assert dataset.default_table_expiration_ms is None
one_day_ms = 24 * 60 * 60 * 1000  # in milliseconds
dataset.default_table_expiration_ms = one_day_ms

dataset = client.update_dataset(
    dataset, ['default_table_expiration_ms'])  # API request

assert dataset.default_table_expiration_ms == one_day_ms

Updating default partition expiration times

To update a dataset's default partition expiration, use the bq update CLI command or the datasets.patch API method. Setting or updating a dataset's default partition expiration is not currently supported by the BigQuery web UI.

You can set a default partition expiration time at the dataset level that affects all newly created partitioned tables, or you can set a partition expiration time for individual tables when the partitioned tables are created. If you set both a default partition expiration and a default table expiration at the dataset level, new partitioned tables have only a partition expiration: the default partition expiration overrides the default table expiration.

If you set the partition expiration time when the partitioned table is created, that value overrides the dataset-level default partition expiration if it exists.

If you do not set a default partition expiration at the dataset level, and you do not set a partition expiration when the table is created, the partitions never expire and you must delete the partitions manually.

When you set a default partition expiration on a dataset, the expiration applies to all partitions in all partitioned tables created in the dataset. When you set the partition expiration on a table, the expiration applies to all partitions created in the specified table. Currently, you cannot apply different expiration times to different partitions in the same table.

When you update a dataset's default partition expiration setting:

  • If you change the value from Never to a defined expiration time, any partitions that already exist in partitioned tables in the dataset will not expire unless the partition expiration time was set on the table when it was created.
  • If you are changing the value for the default partition expiration, any partitions in existing partitioned tables expire according to the original default partition expiration. Any new partitioned tables created in the dataset have the new default partition expiration setting applied unless you specify a different partition expiration on the table when it is created.

The value for default partition expiration is expressed differently depending on where the value is set. Use the method that gives you the appropriate level of granularity:

  • In the command-line tool, expiration is expressed in seconds.
  • In the API, expiration is expressed in milliseconds.

To update the default partition expiration time for a dataset:

Web UI

Updating a dataset's default partition expiration is not currently supported by the BigQuery web UI.

CLI

To update the default partition expiration time for a dataset, enter the bq update command with the --default_partition_expiration flag. If you are updating a dataset in a project other than your default project, add the project ID to the dataset name in the following format: [PROJECT_ID]:[DATASET].

bq update --default_partition_expiration [INTEGER] [PROJECT_ID]:[DATASET]

Where:

  • [INTEGER] is the default lifetime (in seconds) for partitions in newly created partitioned tables. This flag has no minimum value. Specify 0 to remove the existing expiration time. Any partition in a newly created partitioned table is deleted [INTEGER] seconds after the partition's date. This value is applied if you do not set a partition expiration on the table when it is created.
  • [PROJECT_ID] is your project ID.
  • [DATASET] is the name of the dataset you're updating.

Examples:

Enter the following command to set the default partition expiration for new partitioned tables created in mydataset to 26 hours (93,600 seconds). The dataset is in your default project.

bq update --default_partition_expiration 93600 mydataset

Enter the following command to set the default partition expiration for new partitioned tables created in mydataset to 26 hours (93,600 seconds). The dataset is in myotherproject, not your default project.

bq update --default_partition_expiration 93600 myotherproject:mydataset

API

Call datasets.patch and use the defaultPartitionExpirationMs property to apply your default partition expiration in milliseconds. Because the datasets.update method replaces the entire dataset resource, the datasets.patch method is preferred.
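
The client library tabs shown for the other properties do not cover this one, so here is a minimal sketch, assuming the google-api-python-client discovery client, that calls datasets.patch directly; the project and dataset IDs are placeholders:

from googleapiclient import discovery

# Build the BigQuery v2 service using application default credentials.
service = discovery.build('bigquery', 'v2')

one_day_ms = 24 * 60 * 60 * 1000
service.datasets().patch(
    projectId='my-project',    # placeholder project ID
    datasetId='my_dataset',    # placeholder dataset ID
    body={'defaultPartitionExpirationMs': str(one_day_ms)},
).execute()  # API request

The API represents defaultPartitionExpirationMs as an int64, which is carried as a string in JSON; hence str(one_day_ms).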

Updating dataset access controls

The process for updating a dataset's access controls is very similar to the process for assigning access controls to a dataset. Access controls cannot be applied during dataset creation using the BigQuery web UI or the command-line tool. You must create the dataset first and then update the dataset's access controls. The API allows you to update dataset access controls by calling the datasets.patch method.

Required permissions

To assign or update dataset access controls, you must have OWNER access at the dataset level, or you must be assigned a project-level IAM role that includes bigquery.datasets.update permissions (for example, the predefined bigquery.dataOwner or bigquery.admin roles).

In addition, because the bigquery.user role has bigquery.datasets.create permissions, a user assigned to the bigquery.user role can update any dataset that user creates. When a user assigned to the bigquery.user role creates a dataset, that user is given OWNER access to the dataset. OWNER access to a dataset gives the user full control over it.

For more information on IAM roles and permissions in BigQuery, see Access Control. For more information on dataset-level roles, see Primitive roles for datasets.

Updating dataset access controls

When you update access controls on a dataset, you can modify access for the following users and groups:

  • User by e-mail - Gives an individual Google account access to the dataset
  • Group by e-mail - Gives all members of a Google group access to the dataset
  • Domain - Gives all users and groups in a Google domain access to the dataset
  • All Authenticated Users - Gives all Google account holders access to the dataset (makes the dataset public)
  • Project Owners - Gives all project owners access to the dataset
  • Project Viewers - Gives all project viewers access to the dataset
  • Project Editors - Gives all project editors access to the dataset
  • Authorized View - Gives a view access to the dataset

To update access controls on a dataset:

Web UI

  1. Click the drop-down arrow to the right of the dataset and choose Share Dataset.

  2. In the Share Dataset dialog, to modify existing entries:

    • Remove existing entries by clicking the X icon to the right of the user, group, or service account.
    • Change permissions for a user, group, or service account by clicking the permissions button and choosing an appropriate access level: Is owner (OWNER), Can edit (WRITER), or Can view (READER). For more information on dataset-level roles, see Primitive roles for datasets.
  3. In the Share Dataset dialog, to add new entries:

    1. Click the drop-down to the left of the Add People field and choose the appropriate option.

    2. Type a value in the text box. For example, if you chose User by e-mail, type the user's email address.

    3. To the right of the Add People field, click Can view and choose the appropriate role from the list.

    4. Click Add.

  4. When you are done adding, deleting, or modifying your access controls, click Save changes.

  5. Verify your access controls by clicking the drop-down arrow to the right of the dataset and choosing Share Dataset. You can confirm the settings in the Share Dataset dialog.

Command-line

  1. Write the existing dataset information (including access controls) to a JSON file using the show command. If the dataset is in a project other than your default project, add the project ID to the dataset name in the following format: [PROJECT_ID]:[DATASET].

    bq show --format=prettyjson [PROJECT_ID]:[DATASET] > [PATH_TO_FILE]
    

    Where:

    • [PROJECT_ID] is your project ID.
    • [DATASET] is the name of your dataset.
    • [PATH_TO_FILE] is the path to the JSON file on your local machine.

      Examples:

      Enter the following command to write the access controls for mydataset to a JSON file. mydataset is in your default project.

      bq show --format=prettyjson mydataset > /tmp/mydataset.json

      Enter the following command to write the access controls for mydataset to a JSON file. mydataset is in myotherproject.

      bq show --format=prettyjson myotherproject:mydataset > /tmp/mydataset.json

  2. Make your changes to the "access" section of the JSON file. You can add or remove any of the specialGroup entries: projectOwners, projectWriters, projectReaders, and allAuthenticatedUsers. You can also add, remove, or modify any of the following: userByEmail, groupByEmail, and domain.

    For example, the access section of a dataset's JSON file would look like the following:

    {
     "access": [
      {
       "role": "READER",
       "specialGroup": "projectReaders"
      },
      {
       "role": "WRITER",
       "specialGroup": "projectWriters"
      },
      {
       "role": "OWNER",
       "specialGroup": "projectOwners"
      },
      {
       "role": "READER",
       "specialGroup": "allAuthenticatedUsers"
      },
      {
       "role": "READER",
       "domain": "[DOMAIN_NAME]"
      },
      {
       "role": "WRITER",
       "userByEmail": "[USER_EMAIL]"
      },
      {
       "role": "READER",
       "groupByEmail": "[GROUP_EMAIL]"
      }
     ]
    }
    

  3. When your edits are complete, use the update command and include the JSON file using the --source flag. If the dataset is in a project other than your default project, add the project ID to the dataset name in the following format: [PROJECT_ID]:[DATASET].

    bq update --source [PATH_TO_FILE] [PROJECT_ID]:[DATASET]
    

    Where:

    • [PATH_TO_FILE] is the path to the JSON file on your local machine.
    • [PROJECT_ID] is your project ID.
    • [DATASET] is the name of your dataset.

      Examples:

      Enter the following command to update the access controls for mydataset. mydataset is in your default project.

      bq update --source /tmp/mydataset.json mydataset

      Enter the following command to update the access controls for mydataset. mydataset is in myotherproject.

      bq update --source /tmp/mydataset.json myotherproject:mydataset

  4. To verify your access control changes, enter the show command again without writing the information to a file.

    bq show --format=prettyjson [DATASET]

    or

    bq show --format=prettyjson [PROJECT_ID]:[DATASET]

API

Call the datasets.patch method and use the access property to update your access controls. For more information, see Datasets.

Because the datasets.update method replaces the entire dataset resource, datasets.patch is the preferred method for updating access controls.

Go

Before trying this sample, follow the Go setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Go API reference documentation.

ds := client.Dataset(datasetID)
meta, err := ds.Metadata(ctx)
if err != nil {
	return err
}
// Append a new access control entry to the existing access list.
update := bigquery.DatasetMetadataToUpdate{
	Access: append(meta.Access, &bigquery.AccessEntry{
		Role:       bigquery.ReaderRole,
		EntityType: bigquery.UserEmailEntity,
		Entity:     "sample.bigquery.dev@gmail.com"},
	),
}

// Use the ETag on the update to ensure the dataset has not been
// modified since the metadata was originally read.
if _, err := ds.Update(ctx, update, meta.ETag); err != nil {
	return err
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Java API reference documentation.

Create a Dataset.Builder instance from an existing Dataset instance with the Dataset.toBuilder() method. Configure the dataset builder object. Build the updated dataset with the Dataset.Builder.build() method, and call the BigQuery.update() method to send the update to the API.

Configure the access controls with the Dataset.Builder.setAcl() method.

List<Acl> beforeAcls = dataset.getAcl();

// Make a copy of the ACLs so that they can be modified.
ArrayList<Acl> acls = new ArrayList<>(beforeAcls);
acls.add(Acl.of(new Acl.User("sample.bigquery.dev@gmail.com"), Acl.Role.READER));
DatasetInfo.Builder builder = dataset.toBuilder();
builder.setAcl(acls);

bigquery.update(builder.build());  // API request.

Python

Before trying this sample, follow the Python setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Python API reference documentation.

Configure the Dataset.access_entries property with the access controls for a dataset, and call Client.update_dataset() to send the update to the API.

# from google.cloud import bigquery
# client = bigquery.Client()
# dataset = client.get_dataset(client.dataset('my_dataset'))

entry = bigquery.AccessEntry(
    role='READER',
    entity_type='userByEmail',
    entity_id='sample.bigquery.dev@gmail.com')
assert entry not in dataset.access_entries
entries = list(dataset.access_entries)
entries.append(entry)
dataset.access_entries = entries

dataset = client.update_dataset(dataset, ['access_entries'])  # API request

assert entry in dataset.access_entries

Renaming datasets

Currently, you cannot change the name of an existing dataset, and you cannot copy a dataset and give it a new name. If you need to change the dataset name, follow these steps to recreate the dataset:

  1. Create a new dataset and specify the new name.

  2. Copy the tables from the old dataset to the new one (see the sketch after this list).

  3. Recreate the views in the new dataset.

  4. Delete the old dataset to avoid additional storage costs.
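
As a sketch of step 2, each table could be copied with the Python client library's copy_table() method; the dataset and table names below are placeholders:

# from google.cloud import bigquery
# client = bigquery.Client()

# Copy one table from the old dataset to the new one.
source_ref = client.dataset('old_dataset').table('mytable')
dest_ref = client.dataset('new_dataset').table('mytable')

copy_job = client.copy_table(source_ref, dest_ref)  # API request
copy_job.result()  # Waits for the copy job to complete.

The bq cp command performs the same step from the command line.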

Copying datasets

Currently, you cannot copy a dataset. Instead, follow these steps to recreate it:

  1. Create a new dataset. Because dataset names must be unique per project, you must assign a new name to the dataset if you are recreating it in the same project.

  2. Copy the tables from the old dataset to the new one, as in the sketch in the preceding section.

  3. Recreate the views in the new dataset.

  4. Delete the old dataset to avoid additional storage costs.

Deleting datasets

You can delete a dataset using the BigQuery web UI, the bq rm CLI command, or the datasets.delete API method.

Required permissions

To delete a dataset, you must have OWNER access at the dataset level, or you must be assigned a project-level IAM role that includes bigquery.datasets.delete permissions. If the dataset contains tables, bigquery.tables.delete is also required. The predefined bigquery.dataOwner and bigquery.admin project-level IAM roles include both bigquery.datasets.delete and bigquery.tables.delete permissions.

In addition, because the bigquery.user role has bigquery.datasets.create permissions, a user assigned to the bigquery.user role can delete any dataset that user creates. When a user assigned to the bigquery.user role creates a dataset, that user is given OWNER access to the dataset. OWNER access to a dataset gives the user full control over it.

For more information on IAM roles and permissions in BigQuery, see Access Control. For more information on dataset-level roles, see Primitive roles for datasets.

Deleting a dataset

When you delete a dataset using the web UI, tables in the dataset (and the data they contain) are deleted. When you delete a dataset using the CLI, you must use the -r flag to delete the dataset's tables.

After you delete a dataset, it cannot be recovered, restored, or undeleted. Deleting a dataset is permanent.

To delete a dataset:

Web UI

  1. Click the down arrow icon next to your dataset name in the navigation pane, and then click Delete dataset.

  2. In the Delete Dataset dialog:

    • For Dataset ID, enter the name of the dataset to delete.
    • Click OK.

Command-line

Use the bq rm command with the (optional) --dataset or -d shortcut flag to delete a dataset. When you use the CLI to remove a dataset, you must confirm the command. You can use the -f flag to skip confirmation.

In addition, if the dataset contains tables, you must use the -r flag to remove all tables in the dataset. If you are deleting a dataset in a project other than your default project, add the project ID to the dataset name in the following format: [PROJECT_ID]:[DATASET].

bq rm -r -f -d [PROJECT_ID]:[DATASET]

Where:

  • [PROJECT_ID] is your project ID.
  • [DATASET] is the name of the dataset you're deleting.

Examples:

Enter the following command to remove mydataset and all the tables in it from your default project. The command uses the optional -d shortcut.

bq rm -r -d mydataset

When prompted, type y and press Enter.

Enter the following command to remove mydataset and all the tables in it from myotherproject. The command does not use the optional -d shortcut. The -f flag is used to skip confirmation.

bq rm -r -f myotherproject:mydataset

API

Call the datasets.delete method to delete the dataset and set the deleteContents parameter to true to delete the tables in it.

Go

Before trying this sample, follow the Go setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Go API reference documentation.

if err := client.Dataset(datasetID).Delete(ctx); err != nil {
	return fmt.Errorf("failed to delete dataset: %v", err)
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Java API reference documentation.

DatasetId datasetId = DatasetId.of(projectId, datasetName);
boolean deleted = bigquery.delete(datasetId, DatasetDeleteOption.deleteContents());
if (deleted) {
  // the dataset was deleted
} else {
  // the dataset was not found
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Node.js API reference documentation.

// Imports the Google Cloud client library
const BigQuery = require('@google-cloud/bigquery');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = "your-project-id";
// const datasetId = "my_dataset";

// Creates a client
const bigquery = new BigQuery({
  projectId: projectId,
});

// Creates a reference to the existing dataset
const dataset = bigquery.dataset(datasetId);

// Deletes the dataset
dataset
  .delete()
  .then(() => {
    console.log(`Dataset ${dataset.id} deleted.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

PHP

Before trying this sample, follow the PHP setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery PHP API reference documentation.

use Google\Cloud\BigQuery\BigQueryClient;

/** Uncomment and populate these variables in your code */
// $projectId = 'The Google project ID';
// $datasetId = 'The BigQuery dataset ID';

$bigQuery = new BigQueryClient([
    'projectId' => $projectId,
]);
$dataset = $bigQuery->dataset($datasetId);
$dataset->delete();
printf('Deleted dataset %s' . PHP_EOL, $datasetId);

Python

Before trying this sample, follow the Python setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Python API reference documentation.

# from google.cloud import bigquery
# client = bigquery.Client()

# Delete a dataset that does not contain any tables
# dataset1_id = 'my_empty_dataset'
dataset1_ref = client.dataset(dataset1_id)
client.delete_dataset(dataset1_ref)  # API request

print('Dataset {} deleted.'.format(dataset1_id))

# Use the delete_contents parameter to delete a dataset and its contents
# dataset2_id = 'my_dataset_with_tables'
dataset2_ref = client.dataset(dataset2_id)
client.delete_dataset(dataset2_ref, delete_contents=True)  # API request

print('Dataset {} deleted.'.format(dataset2_id))

Ruby

Before trying this sample, follow the Ruby setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Ruby API reference documentation.

require "google/cloud/bigquery"

def delete_dataset dataset_id = "my_empty_dataset"
  bigquery = Google::Cloud::Bigquery.new

  # Delete a dataset that does not contain any tables
  dataset = bigquery.dataset dataset_id
  dataset.delete
  puts "Dataset #{dataset_id} deleted."
end
