Google Cloud Platform Console

Use the Google Cloud Platform Console to perform simple storage management tasks for Cloud Storage. Some typical uses for the GCP Console include:

  • Enabling the Cloud Storage API for a project.
  • Creating and deleting buckets.
  • Uploading, downloading, and deleting objects.
  • Managing Identity and Access Management (IAM) policies.

This page provides an overview of the GCP Console, including the tasks you can accomplish with it to manage your data. For more advanced tasks, use the gsutil command-line tool or any of the client libraries that support Cloud Storage.

Access to the GCP Console

The GCP Console requires no setup or installation, and you can access it directly in a browser. Depending on your use case, you access GCP Console in slightly different ways. If you are:

A user granted access to a project


In order to use Google Cloud Platform Console as a project member, you must be added to the project’s member list. A current project owner can give you access, which applies to all buckets and objects defined in the project. For more information, see Adding a member to a project.

A user granted access to a bucket


In this use case, a project owner gives you access to an individual bucket within a larger project. The owner then sends you the bucket name, which you substitute into the GCP Console URL for the bucket. You can work only with objects in the specified bucket. This is useful for users who are not project members but who need to access a bucket. When you access the URL, you are prompted to authenticate with a Google account if you are not already signed in.

A variation of this use case is when a project owner grants All Users permission to read objects in a bucket. This creates a bucket whose contents are publicly readable. For more information, see Setting permissions and metadata below.

A user granted access to an object


In this use case, a project owner gives you access to individual objects within a bucket and sends you the URLs for those objects. When you access the URLs, you are prompted to authenticate with a Google account if you are not already signed in.

Note that a GCP Console URL is different from the URL for objects that are shared publicly. When you share a link publicly, the URL has the form: https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]. This public URL does not require the recipient to authenticate with Google and can be used for unauthenticated access to an object.
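As a sketch of how such a public URL is assembled, the snippet below builds one from a bucket name and an object name. The bucket and object names are hypothetical, and the helper is for illustration only; percent-encoding details may differ for unusual object names.

```python
from urllib.parse import quote

def public_url(bucket_name: str, object_name: str) -> str:
    """Build the public (unauthenticated) URL for an object.

    Object names can contain slashes (simulated folders), so slashes
    are left intact and only unsafe path characters are percent-encoded.
    """
    return f"https://storage.googleapis.com/{bucket_name}/{quote(object_name)}"

print(public_url("my-bucket", "pets/cat.jpeg"))
# https://storage.googleapis.com/my-bucket/pets/cat.jpeg
```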

Tasks you can perform with the Google Cloud Platform Console

The GCP Console enables you to perform basic storage management tasks with your data using a browser. To use the GCP Console, you must authenticate with Google and have appropriate permission to complete a given task. If you are the account owner who created the project, it is likely you already have all the permissions you need to complete the tasks below. Otherwise, you can be added as a project member (see Adding a member to a project) or be given permission to perform actions on a bucket (see Setting bucket permissions).

Creating a bucket

Cloud Storage uses a flat namespace to store your data, but you can use the GCP Console to create folders and mimic a folder hierarchy. Your data isn't physically stored in a hierarchical structure, but it is displayed that way in the GCP Console.

Because Cloud Storage has no notion of folders, the folder suffix and object name delimiters are visible when you view your folders using gsutil or other command-line tools that work with Cloud Storage.

See Creating Storage Buckets for a step-by-step guide to creating buckets using the GCP Console.

Uploading data to a bucket

You can upload data to your bucket by uploading one or more files or a folder containing files. When you upload a folder, the GCP Console maintains the same hierarchical structure of the folder, including all of the files and folders it contains. You can track the progress of uploads to the GCP Console using the upload progress window. You can minimize the progress window and continue working with your bucket.

See Uploading Objects for a step-by-step guide to uploading objects to your buckets using the GCP Console.

You can also upload objects by dragging and dropping files and folders from your desktop or file manager onto a bucket or sub-folder in the GCP Console.

Downloading data from a bucket

See Downloading Objects for a step-by-step guide to downloading objects from your buckets using the GCP Console.

You can also view details of an object by clicking it. If the object can be displayed, the details page includes a preview of the object itself.

Creating and using folders

Because the Cloud Storage system has no notion of folders, folders created in the GCP Console are a convenience to help you organize objects in a bucket. As a visual aid, the GCP Console shows folders with a folder icon image to help you distinguish folders from objects.

From within a bucket (or a folder in a bucket), you can create a new folder by clicking the Create Folder button. Unlike bucket names, folder names don't have to be globally unique: while a bucket name can be used only if no existing bucket already has that name, a folder name can be reused as long as the duplicates don't reside in the same bucket or sub-folder.

Objects added to a folder appear to reside within the folder in the GCP Console. In reality, all objects exist at the bucket level, and simply include the directory structure in their name. For example, if you create a folder named pets and add a file cat.jpeg to that folder, the GCP Console makes the file appear to exist in the folder. In reality, there is no separate folder entity: the file simply exists in the bucket and has the name pets/cat.jpeg.
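The flat-namespace behavior described above can be sketched in a few lines: "folders" are just shared name prefixes, and listing a folder means matching against that prefix. The object names below are hypothetical examples.

```python
# All objects live at the bucket level; "folders" are just name prefixes.
objects = ["pets/cat.jpeg", "pets/dog.jpeg", "readme.txt"]

def list_folder(objects, folder):
    """Return the objects that appear to reside inside a simulated folder."""
    prefix = folder.rstrip("/") + "/"
    return [name for name in objects if name.startswith(prefix)]

print(list_folder(objects, "pets"))
# ['pets/cat.jpeg', 'pets/dog.jpeg']
```

There is no separate entity for the pets folder here: deleting both pets/ objects would make the folder disappear, which mirrors how Cloud Storage behaves.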

When navigating folders in the GCP Console, you can access higher levels of the directory by clicking the desired folder or bucket name in the breadcrumb trail above the file lists.

Working with folders in gsutil

When you use other tools to work with your buckets and data, folders may be presented differently than in the GCP Console. For example, to see how gsutil interprets folders, see How Subdirectories Work.

Filtering objects to view

In the GCP Console, you can filter the objects you see by specifying a prefix in the Filter by prefix... text box located above the list of objects. This filter displays objects beginning with the specified prefix. The prefix only filters objects in your current bucket view: it does not select objects contained in folders.
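The filtering behavior can be modeled as a simple prefix match over the items in the current view. This is a sketch under the assumption that the current view contains only top-level entries (objects plus folder entries); the item names are made up.

```python
def filter_by_prefix(entries, prefix):
    """Mimic the console's 'Filter by prefix...' box: keep only items in
    the current view whose names begin with the given prefix."""
    return [e for e in entries if e.startswith(prefix)]

# The current bucket view lists top-level items only; "pets/" is a folder
# entry, so objects inside it (e.g. "pets/notes.txt") are not considered.
view = ["archive.zip", "notes.txt", "pets/"]
print(filter_by_prefix(view, "no"))
# ['notes.txt']
```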

Setting object metadata

You can configure an object's metadata in the GCP Console. Object metadata controls aspects of how requests are handled, including what type of content your data represents and how your data is encoded. Use the GCP Console to set metadata on one object at a time. Use gsutil setmeta to set metadata on multiple objects simultaneously.
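As an illustration of batch metadata edits, the hypothetical helper below assembles a gsutil setmeta command line that applies the same headers to many objects at once; the bucket name and header values are examples, not recommendations.

```python
def setmeta_command(metadata: dict, targets: list) -> str:
    """Build a gsutil setmeta invocation (hypothetical helper) that sets
    the same metadata headers on every matching object."""
    headers = " ".join(f'-h "{key}:{value}"' for key, value in metadata.items())
    return f"gsutil setmeta {headers} " + " ".join(targets)

cmd = setmeta_command(
    {"Content-Type": "text/html", "Cache-Control": "public, max-age=3600"},
    ["gs://my-bucket/*.html"],
)
print(cmd)
# gsutil setmeta -h "Content-Type:text/html" -h "Cache-Control:public, max-age=3600" gs://my-bucket/*.html
```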

See Viewing and Editing Object Metadata for a step-by-step guide to viewing and editing an object's metadata.

Deleting objects, folders, and buckets

You can delete any bucket, folder, or object in the Google Cloud Platform Console by selecting the checkbox next to it, clicking the Delete button, and confirming you want to proceed with the action. When you delete a folder or bucket, you also delete all objects inside it, including any objects marked as Public.

See Deleting Objects for a step-by-step guide to removing objects from your buckets using the GCP Console.

See Deleting Buckets for a step-by-step guide to deleting buckets from your project using the GCP Console.

Sharing your data publicly

When you share an object publicly, a link icon appears in the object's public access column. Clicking on this link reveals a public URL for accessing the object.

See Making Data Public for step-by-step guides to sharing your objects with others by making them publicly accessible.

See Accessing Public Data for ways to access a publicly shared object.

To stop sharing an object publicly, remove any permission entries that have allUsers or allAuthenticatedUsers as members.
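The removal step can be sketched as a filter over role bindings. The policy shape below is a simplification for illustration (a list of role/member dicts), and the role and member values are hypothetical.

```python
PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def remove_public_members(bindings):
    """Strip allUsers/allAuthenticatedUsers from each role binding and
    drop any binding left with no members."""
    result = []
    for binding in bindings:
        members = [m for m in binding["members"] if m not in PUBLIC_MEMBERS]
        if members:
            result.append({"role": binding["role"], "members": members})
    return result

policy = [
    {"role": "roles/storage.objectViewer",
     "members": ["allUsers", "user:alice@example.com"]},
]
print(remove_public_members(policy))
# [{'role': 'roles/storage.objectViewer', 'members': ['user:alice@example.com']}]
```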

Using the public access column

Both buckets and objects in the GCP Console have a public access column that indicates when resources are shared publicly.

Bucket-level public access column

A bucket is considered public if it has an IAM role that meets these criteria:

  • The role contains the member allUsers or allAuthenticatedUsers.
  • The role has at least one storage permission that is not storage.buckets.create or storage.buckets.list.

If these conditions are true, the public access column for the bucket reads Public.

If these conditions are not true, the public access column for the bucket reads Per object. This is because it's still possible that individual objects within the bucket are publicly accessible, depending on their individual Access Control Lists (ACLs).
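The bucket-level criteria above can be expressed as a small predicate. This sketch represents each role binding by its member list and the permissions the role grants, which is a simplification; the example bindings are hypothetical.

```python
# Permissions that do not, by themselves, make a bucket public.
NON_PUBLIC_PERMS = {"storage.buckets.create", "storage.buckets.list"}
PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def bucket_access_label(bindings):
    """Return the bucket's public access column value, given bindings as
    [{'members': [...], 'permissions': [...]}, ...]."""
    for binding in bindings:
        has_public_member = any(m in PUBLIC_MEMBERS for m in binding["members"])
        has_broad_perm = any(p not in NON_PUBLIC_PERMS for p in binding["permissions"])
        if has_public_member and has_broad_perm:
            return "Public"
    return "Per object"

print(bucket_access_label([{"members": ["allUsers"],
                            "permissions": ["storage.objects.get"]}]))
# Public
```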

Object-level public access column

An object is considered public if either of these conditions is true:

  1. The Access Control List (ACL) for the object includes an entry for allUsers or allAuthenticatedUsers.

  2. The bucket containing the object has an IAM role that meets these criteria:

    • The role contains the member allUsers or allAuthenticatedUsers.
    • The role has at least one of the following storage permissions: storage.objects.get, storage.objects.getIamPolicy, storage.objects.setIamPolicy, storage.objects.update.

If either of these conditions is true, the public access column for the object reads Public.

If neither condition is true, the public access column for the object reads Not public.
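The object-level rule combines the object's ACL with the bucket's IAM bindings, and can be sketched the same way. As before, representing a role by the permissions it grants is a simplification, and the example entries are hypothetical.

```python
PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}
# Bucket-level permissions that make contained objects public.
PUBLIC_OBJECT_PERMS = {
    "storage.objects.get",
    "storage.objects.getIamPolicy",
    "storage.objects.setIamPolicy",
    "storage.objects.update",
}

def object_access_label(acl_entries, bucket_bindings):
    """Return the object's public access column value from its ACL entries
    and the containing bucket's IAM bindings."""
    if any(entry in PUBLIC_MEMBERS for entry in acl_entries):
        return "Public"
    for binding in bucket_bindings:
        if (any(m in PUBLIC_MEMBERS for m in binding["members"])
                and any(p in PUBLIC_OBJECT_PERMS for p in binding["permissions"])):
            return "Public"
    return "Not public"

print(object_access_label(["allUsers"], []))
# Public
```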

Setting bucket permissions

You can control access to a Cloud Storage bucket by using Identity and Access Management (IAM) permissions. For example, you can set a bucket's permissions to allow an entity such as a user or group to view or create objects in your bucket. You might do this in cases when it isn't appropriate to add a user at the project level. The entity specified in the IAM permission must authenticate by signing in to Google when accessing the bucket. Share the bucket's GCP Console URL, which ends in [BUCKET_NAME]/, with the users.

Setting object permissions

You can easily and uniformly control access to objects in a bucket by using Identity and Access Management (IAM) permissions in the GCP Console. If you want to customize access for individual objects within a bucket, use Signed URLs or Access Control Lists (ACLs) instead.

See Using IAM Permissions for step-by-step guides to viewing and editing IAM permissions.

To view or change permissions for individual objects, see Changing ACLs.

Giving users project-level roles

When you create a project, you are given the Owner IAM role. Other entities, such as collaborators, must be given their own roles in order to work with your project's buckets and objects.

Once you have been given a role for the project, the project name appears in your list of projects. If you are an existing project owner, you can add a member to your project. See Using IAM with projects for step-by-step guides to adding and removing access at the project level.

Scanning buckets with Cloud Data Loss Prevention

Cloud Data Loss Prevention (Cloud DLP) is a service that helps you identify and protect sensitive data in your buckets. Cloud DLP can help you meet compliance requirements by finding and redacting information such as:

  • Credit card numbers
  • IP addresses
  • Other forms of personally identifiable information (PII)

For a list of the types of data Cloud DLP detects, see the Infotype detector reference.

You can initiate a Cloud DLP scan for a bucket by clicking the three-dot menu for the bucket and selecting Scan with Cloud Data Loss Prevention. For a guide to performing a Cloud DLP scan on a bucket, see Inspecting a Cloud Storage location.
