Google Cloud console

Use the Google Cloud console to perform simple storage management tasks for Cloud Storage. Some typical uses for the console include:

  • Enabling the Cloud Storage API for a project.
  • Creating and deleting buckets.
  • Uploading, downloading, and deleting objects.
  • Managing Identity and Access Management (IAM) policies.

This page provides an overview of the console, including the tasks you can accomplish with it to manage your data. For more advanced tasks, use the gsutil command-line tool or any of the client libraries that support Cloud Storage.

Try it for yourself

If you're new to Google Cloud, create an account to evaluate how Cloud Storage performs in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.


Access to the console

The console requires no setup or installation, and you can access it directly in a browser. Depending on your use case, you access the console in slightly different ways. If you are:

A user granted access to a project


A current project owner can give you access to the entire project, which applies equally to all buckets and objects defined in the project. For more information, see Adding a principal to a project.

A user granted access to a bucket


In this use case, a project owner gives you access to an individual bucket within a larger project. The owner sends you the bucket name, which you substitute into the console URL for the bucket. You can work only with objects in the specified bucket. This is useful for users who don't have access to the full project but need to access a particular bucket. When you access the URL, you are prompted to authenticate with a Google account if you are not already signed in.

A variation of this use case is when a project owner grants the allUsers principal permission to read objects in a bucket. This makes the bucket's contents publicly readable. For more information, see Setting permissions and metadata below.

A user granted access to an object


In this use case, a project owner gives you access to individual objects within a bucket and sends you URLs for accessing those objects. When you access the URLs, you are prompted to authenticate with a Google account if you are not already signed in.

Note that the form of an authenticated URL is different from the URL for objects that are shared publicly. When you share a link publicly, the URL has the form https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME. This public URL does not require the recipient to authenticate with Google and can be used for unauthenticated access to an object.
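The distinction between the two URL styles can be sketched as follows. The hosts are the documented console and storage endpoints; the helper function names are illustrative, not part of any Cloud Storage library:

```python
# Sketch of the two URL styles: an authenticated console URL (recipient must
# sign in to Google) and a public URL (no sign-in required). The helper names
# here are illustrative.
from urllib.parse import quote

def console_url(bucket_name):
    """Console URL for browsing a bucket; requires Google authentication."""
    return f"https://console.cloud.google.com/storage/browser/{quote(bucket_name)}"

def public_url(bucket_name, object_name):
    """Public URL for an object shared with allUsers; no authentication needed."""
    return f"https://storage.googleapis.com/{quote(bucket_name)}/{quote(object_name)}"

print(console_url("my-bucket"))
# https://console.cloud.google.com/storage/browser/my-bucket
print(public_url("my-bucket", "pets/cat.jpeg"))
# https://storage.googleapis.com/my-bucket/pets/cat.jpeg
```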

Tasks you can perform with the Google Cloud console

The console enables you to perform basic storage management tasks with your data using a browser. To use the console, you must authenticate with Google and have appropriate permission to complete a given task. If you are the account owner who created the project, it is likely you already have all the permissions you need to complete the tasks below. Otherwise, you can be granted access to a project or be given permission to perform actions on a bucket.

Creating a bucket

Cloud Storage uses a flat namespace to store your data, but you can use the console to create folders and mimic a folder hierarchy. Your data isn't physically stored in a hierarchical structure, but it is displayed that way in the console.

Because Cloud Storage has no notion of folders, the folder suffix and object name delimiters are visible when you view your folders using gsutil or other command-line tools that work with Cloud Storage.
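As a hypothetical illustration, assuming a bucket with one console-created folder, a flat listing would show names like these; the "folder" separators are part of each object's name:

```python
# Illustrative object names in a flat namespace. The delimiters are part of
# each object name, so flat listings show them verbatim; the console may also
# create a zero-byte placeholder object ending in "/" for a folder.
flat_listing = [
    "pets/",            # placeholder object for the console folder "pets"
    "pets/cat.jpeg",    # an object "inside" the folder: the prefix is its name
    "notes.txt",
]

# Every entry, including the trailing-slash placeholder, is an ordinary object.
print([name for name in flat_listing if name.endswith("/")])  # ['pets/']
```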

See Creating Storage Buckets for a step-by-step guide to creating buckets using the console.

Uploading data to a bucket

You can upload data to your bucket by uploading one or more files or a folder containing files. When you upload a folder, the console maintains the folder's hierarchical structure, including all of the files and folders it contains. You can track the progress of uploads in the console's upload progress window, which you can minimize while you continue working with your bucket.

See Uploading Objects for a step-by-step guide to uploading objects to your buckets using the console.

You can also upload objects by dragging and dropping files and folders from your desktop or file manager to a bucket or sub-folder in the console.

Downloading data from a bucket

See Downloading Objects for a step-by-step guide to downloading objects from your buckets using the console.

You can also view details of an object by clicking it. If the object can be displayed, the details page includes a preview of the object itself.

Creating and using folders

Because the Cloud Storage system has no notion of folders, folders created in the console are a convenience to help you organize objects in a bucket. As a visual aid, the console shows folders with a folder icon image to help you distinguish folders from objects.

Objects added to a folder appear to reside within the folder in the console. In reality, all objects exist at the bucket level, and simply include the directory structure in their name. For example, if you create a folder named pets and add a file cat.jpeg to that folder, the console makes the file appear to exist in the folder. In reality, there is no separate folder entity: the file simply exists in the bucket and has the name pets/cat.jpeg.

Unlike buckets, folders don't have to be globally unique. That is, while a bucket name can only be used if there are no buckets already in existence with that name, folder names can be used repeatedly so long as they don't reside in the same bucket or sub-folder.
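The way tools simulate folders over a flat namespace can be sketched as grouping object names at the first delimiter after a prefix, roughly as a delimiter-based listing does. The function name and object names below are illustrative:

```python
# Sketch of folder simulation over a flat namespace: group object names by the
# first "/" after a prefix. Names are illustrative.
def list_dir(names, prefix=""):
    """Return (files, subfolders) directly under `prefix`."""
    files, folders = [], set()
    for name in names:
        if not name.startswith(prefix) or name == prefix:
            continue
        rest = name[len(prefix):]
        if "/" in rest:
            # Nested names collapse into a single simulated subfolder entry.
            folders.add(prefix + rest.split("/", 1)[0] + "/")
        else:
            files.append(name)
    return files, sorted(folders)

names = ["pets/cat.jpeg", "pets/dog.jpeg", "pets/toys/ball.png", "notes.txt"]
print(list_dir(names))            # (['notes.txt'], ['pets/'])
print(list_dir(names, "pets/"))   # (['pets/cat.jpeg', 'pets/dog.jpeg'], ['pets/toys/'])
```

Note how the same subfolder name (here `toys`) could also appear under a different prefix or in a different bucket without conflict, since only the full object name must be unique within a bucket.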

When navigating folders in the console, you can access higher levels of the hierarchy by clicking the desired folder or bucket name in the breadcrumb trail above the file list.

When you use other tools to work with your buckets and data, folders may be presented differently than in the console. For more information on how different tools, such as gsutil, simulate folders in Cloud Storage, see Folders.

Filtering buckets or objects to view

In the console list of buckets for a project, you can filter the buckets you see by using the Filter buckets text box.

  • You can always filter by the bucket name prefix.

  • For projects with fewer than 1,000 buckets, you can always filter by additional criteria, such as the Location of buckets.

  • For projects with more than 1,000 buckets, you must enable additional-criteria filtering by using the dropdown that appears next to the filter text box. Note, however, that projects with thousands of buckets experience degraded filtering performance when additional-criteria filtering is enabled.

In the console list of objects for a bucket, you can filter the objects you see by specifying a prefix in the Filter by object or folder name prefix... text box, located above the list of objects. This filter displays objects beginning with the specified prefix. The prefix only filters objects in your current bucket view: it does not select objects contained in folders.

Setting object metadata

You can configure an object's metadata in the console. Object metadata controls aspects of how requests are handled, including what type of content your data represents and how your data is encoded. Use the console to set metadata on one object at a time. Use gsutil setmeta to set metadata on multiple objects simultaneously.

See Viewing and Editing Object Metadata for a step-by-step guide to viewing and editing an object's metadata.

Deleting objects, folders, and buckets

You can delete any bucket, folder, or object in the Google Cloud console by selecting the checkbox next to it, clicking the Delete button, and confirming you want to proceed with the action. When you delete a folder or bucket, you also delete all objects inside it, including any objects marked as Public.

See Deleting Objects for a step-by-step guide to removing objects from your buckets using the console.

See Deleting Buckets for a step-by-step guide to deleting buckets from your project using the console.

Sharing your data publicly

When you share an object publicly, a link icon appears in the object's public access column. Clicking this icon reveals a public URL for accessing the object.

The public URL is different from the link you get by right-clicking an object directly. Both links provide access to the object, but the public URL works without signing in to a Google account. See Request Endpoints for more information.

See Accessing Public Data for ways to access a publicly shared object.

You can stop sharing an object publicly by removing any permission entries that have allUsers or allAuthenticatedUsers as principals.

Using the public access column

Both buckets and objects in the console have a public access column that indicates when resources are shared publicly.

Bucket-level public access column

A bucket's public access column can have the following values: Public to internet, Not public, or Subject to object ACLs.

A bucket is Public to internet if it has an IAM role that meets these criteria:

  • The role is granted to the principal allUsers or allAuthenticatedUsers.
  • The role has at least one storage permission that is not storage.buckets.create or storage.buckets.list.

If no role meets these criteria, the bucket is either Not public or Subject to object ACLs:

  • Not public: No IAM role grants public access to the objects in the bucket, and uniform bucket-level access is enabled for the bucket.

  • Subject to object ACLs: No IAM role grants public access to the objects in the bucket, but Access Control Lists (ACLs) may still be granting public access to individual objects within the bucket. Check each object's permissions to see if they are public. To use IAM exclusively, enable uniform bucket-level access.
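The bucket-level decision described above can be sketched as follows; IAM bindings are modeled here as (principal, permissions) pairs, and the function name and structure are illustrative:

```python
# Sketch of the bucket-level public access column logic described above.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}
EXEMPT_PERMISSIONS = {"storage.buckets.create", "storage.buckets.list"}

def bucket_public_access(bindings, uniform_bucket_level_access):
    """Classify a bucket given (principal, permissions) IAM bindings."""
    for principal, permissions in bindings:
        # Public if a role grants a public principal any non-exempt permission.
        if principal in PUBLIC_PRINCIPALS and set(permissions) - EXEMPT_PERMISSIONS:
            return "Public to internet"
    # Otherwise the answer depends on whether object ACLs are still in play.
    return "Not public" if uniform_bucket_level_access else "Subject to object ACLs"

print(bucket_public_access([("allUsers", ["storage.objects.get"])], True))
# Public to internet
print(bucket_public_access([("user:a@example.com", ["storage.objects.get"])], False))
# Subject to object ACLs
```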

Object-level public access column

An object is considered public if either of these conditions is true:

  1. The Access Control List (ACL) for the object includes an entry for allUsers or allAuthenticatedUsers.

  2. The bucket containing the object has an IAM role that meets these criteria:

    • The role is granted to the principal allUsers or allAuthenticatedUsers.
    • The role has at least one of the following storage permissions: storage.objects.get, storage.objects.getIamPolicy, storage.objects.setIamPolicy, storage.objects.update.

If either of these conditions is true, the public access column for the object reads Public to internet.

If neither condition is true, the public access column for the object reads Not public.
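The object-level decision can be sketched the same way: public either via the object's own ACL or via a qualifying IAM role on the containing bucket. The data structures and function name are illustrative:

```python
# Sketch of the object-level public access column logic described above.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}
OBJECT_PERMISSIONS = {"storage.objects.get", "storage.objects.getIamPolicy",
                      "storage.objects.setIamPolicy", "storage.objects.update"}

def object_public_access(acl_entries, bucket_bindings):
    """Classify an object given its ACL entries and the bucket's IAM bindings."""
    acl_public = any(entry in PUBLIC_PRINCIPALS for entry in acl_entries)
    iam_public = any(principal in PUBLIC_PRINCIPALS and set(perms) & OBJECT_PERMISSIONS
                     for principal, perms in bucket_bindings)
    return "Public to internet" if acl_public or iam_public else "Not public"

print(object_public_access(["allUsers"], []))
# Public to internet
print(object_public_access(["user-a@example.com"], []))
# Not public
```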

Setting bucket permissions

You can control access to a Cloud Storage bucket by using IAM permissions. For example, you can set a bucket's permissions to allow an entity such as a user or group to view or create objects in your bucket. You might do this when it isn't appropriate to add a user at the project level. The entity specified in the IAM permission must authenticate by signing in to Google when accessing the bucket. Share the bucket's console URL, which has the form https://console.cloud.google.com/storage/browser/BUCKET_NAME, with those users.

Setting object permissions

You can easily and uniformly control access to objects in a bucket by using IAM permissions in the console. If you want to customize access for individual objects within a bucket, use Signed URLs or Access Control Lists (ACLs) instead.

See Using IAM Permissions for step-by-step guides to viewing and editing IAM permissions.

To view or change permissions for individual objects, see Changing ACLs.

Giving users project-level roles

When you create a project, you are given the Owner IAM role. Other entities, such as collaborators, must be given their own roles in order to work with your project's buckets and objects.

Once you have been given a role for the project, the project name appears in your list of projects. If you are an existing project owner, you can grant a principal access to the project. See Using IAM with projects for step-by-step guides to adding and removing access at the project level.

Working with Object Versioning

You can enable Object Versioning to retain noncurrent versions of an object in case of accidental deletion or replacement; however, enabling Object Versioning increases storage costs. You can mitigate these costs by also adding Object Lifecycle Management conditions when you enable Object Versioning. These conditions automatically delete noncurrent object versions or move them to a cheaper storage class, based on settings you specify. The configuration example for deleting objects gives one possible set of conditions for this use case.
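One possible shape of such a configuration, sketched here as a Python dict following the Cloud Storage lifecycle configuration format (the condition values are illustrative, not a recommendation):

```python
# Sketch of a lifecycle configuration that limits versioning costs: delete a
# noncurrent version once two newer versions of the object exist. The key names
# follow the Cloud Storage lifecycle configuration format; values are examples.
lifecycle_config = {
    "rule": [
        {
            "action": {"type": "Delete"},
            "condition": {
                "isLive": False,        # apply only to noncurrent versions
                "numNewerVersions": 2,  # keep at most 2 newer versions around
            },
        }
    ]
}

print(lifecycle_config["rule"][0]["action"]["type"])  # Delete
```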

Noncurrent versions of an object are listed and managed in the Version history tab of the object.

Scanning buckets with Cloud Data Loss Prevention

Cloud Data Loss Prevention (Cloud DLP) is a service that helps you identify and protect sensitive data in your buckets. Cloud DLP can help you meet compliance requirements by finding and redacting information such as:

  • Credit card numbers
  • IP addresses
  • Other forms of personally identifiable information (PII)

For a list of the types of data Cloud DLP detects, see the Infotype detector reference.

You can initiate a Cloud DLP scan for a bucket by clicking the three-dot menu for the bucket and selecting Scan with Cloud Data Loss Prevention. For a guide to performing a Cloud DLP scan on a bucket, see Inspecting a Cloud Storage location.