Google Cloud console

Use the Google Cloud console to perform simple storage management tasks for Cloud Storage. Some typical uses for the Google Cloud console include:

  • Enabling the Cloud Storage API for a project.
  • Creating and deleting buckets.
  • Uploading, downloading, and deleting objects.
  • Managing Identity and Access Management (IAM) policies.

This page provides an overview of the Google Cloud console, including the tasks you can accomplish with it to manage your data. For more advanced tasks, use the Google Cloud CLI or any of the client libraries that support Cloud Storage.

Try it for yourself

If you're new to Google Cloud, create an account to evaluate how Cloud Storage performs in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.

Try Cloud Storage free

Access to the Google Cloud console

The Google Cloud console requires no setup or installation, and you can access it directly in a browser. Depending on your use case, you access the Google Cloud console in slightly different ways. If you are:

A user granted access to a project

Use: https://console.cloud.google.com/.

A current project owner can give you access to the entire project, which applies equally to all buckets and objects defined in the project.

A user granted access to a bucket

Use: https://console.cloud.google.com/storage/browser/BUCKET_NAME.

In this use case, a project owner gives you access to an individual bucket within a larger project. The owner then sends you the bucket name, which you substitute into the URL above. You can work only with objects in the specified bucket. This is useful for users who don't have access to the full project but who need to access a bucket. When you access the URL, you authenticate if you are not already signed in.

A variation of this use case is when a project owner grants the allUsers principal permission to read objects in a bucket. This makes the bucket's contents publicly readable. For more information, see Setting permissions and metadata.

A user granted access to an object

Use: https://console.cloud.google.com/storage/browser/_details/BUCKET_NAME/OBJECT_NAME

In this use case, a project owner gives you access to individual objects within a bucket and sends you the URLs to access them. When you access the URLs, you are prompted to authenticate with a user account if you are not already signed in.

Note that the form of the URL above is different from the URL for objects that are shared publicly. When you share a link publicly, the URL is of the form: https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME. This public URL does not require a recipient to authenticate with Google and can be used for non-authenticated access to an object.
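The three URL forms above can be sketched as simple string templates. This is an illustrative sketch, not a client-library API; the bucket and object names are placeholders, and object names containing special characters must be percent-encoded when embedded in a URL.

```python
# Sketch of the console and public URL forms described above.
# Illustrative only; not part of any Cloud Storage client library.
from urllib.parse import quote

def console_bucket_url(bucket):
    # Console view of a single bucket's contents.
    return f"https://console.cloud.google.com/storage/browser/{bucket}"

def console_object_url(bucket, obj):
    # Console details page for a single object; requires authentication.
    return ("https://console.cloud.google.com/storage/browser/_details/"
            f"{bucket}/{quote(obj, safe='/')}")

def public_object_url(bucket, obj):
    # Unauthenticated access; works only if the object is shared publicly.
    return f"https://storage.googleapis.com/{bucket}/{quote(obj, safe='/')}"

print(public_object_url("my-bucket", "pets/cat.jpeg"))
# https://storage.googleapis.com/my-bucket/pets/cat.jpeg
```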

Tasks you can perform with the Google Cloud console

The Google Cloud console enables you to perform basic storage management tasks with your data using a browser. To use the Google Cloud console, you must authenticate with Google and have appropriate permission to complete a given task. If you are the account owner who created the project, it is likely you already have all the permissions you need to complete the tasks below. Otherwise, you can be granted access to a project or be given permission to perform actions on a bucket.

Creating a bucket

Cloud Storage uses a flat namespace to store your data, but you can use the Google Cloud console to create folders and mimic a folder hierarchy. Your data isn't physically stored in a hierarchical structure, but it is displayed that way in the Google Cloud console.

For a step-by-step guide to bucket creation, see Create buckets.

Uploading data to a bucket

You can upload data to your bucket by uploading one or more files or a folder containing files. When you upload a folder, the Google Cloud console maintains the folder's hierarchical structure, including all of the files and folders it contains. You can track the progress of uploads in the upload progress window, which you can minimize while continuing to work with your bucket.

See Upload objects for a step-by-step guide to uploading objects to your buckets using the Google Cloud console.

You can also upload objects through the Google Cloud console by dragging and dropping files and folders from your desktop or file manager to a bucket or sub-folder in the Google Cloud console.

Downloading data from a bucket

See Download objects for a step-by-step guide to downloading objects from your buckets using the Google Cloud console.

You can also view details of an object by clicking it. If the object can be displayed, the details page includes a preview of the object itself.

Creating and using folders

Because a Cloud Storage bucket exists as a flat namespace, folders created using the Google Cloud console are a simulated convenience to help you organize objects. The Google Cloud console represents these simulated folders with a folder icon and makes objects added to a simulated folder appear to reside within it. In reality, all objects exist at the bucket level; an object appears to be contained in a simulated folder when the folder's name is part of the object's overall name.

For example, if you create a simulated folder named pets and add a file cat.jpeg to that folder, the Google Cloud console makes the file appear to exist in the folder. In reality, there is no separate folder entity: the file simply exists in the bucket and has the name pets/cat.jpeg.

Simulated folders created by the Google Cloud console exist as 0-byte objects within the bucket. You cannot use the Google Cloud console to set metadata or access permissions on these 0-byte objects. Note that there is a separate Cloud Storage feature, called managed folders, that lets you control access to a group of objects that share a common prefix.
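The flat-namespace behavior above can be modeled in a few lines. This is a minimal sketch, not any Cloud Storage API: the bucket is a plain dictionary, and the helper names are hypothetical.

```python
# Minimal model of a flat bucket namespace, illustrating how the console's
# simulated folders work. Illustrative only; not a Cloud Storage client API.

bucket = {}  # object name -> content; the namespace is flat

def create_folder(name):
    # The console creates a 0-byte object whose name ends with "/".
    bucket[name.rstrip("/") + "/"] = b""

def upload(folder, filename, data):
    # "Adding a file to a folder" just prepends the folder name to the
    # object's name; no separate folder entity exists.
    bucket[f"{folder}/{filename}"] = data

create_folder("pets")
upload("pets", "cat.jpeg", b"\xff\xd8")  # placeholder JPEG bytes

print(sorted(bucket))  # ['pets/', 'pets/cat.jpeg']
```

Both entries live directly in the bucket; the folder is only the 0-byte `pets/` object plus the shared name prefix.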

When navigating simulated folders in the Google Cloud console, you can move to higher levels of the hierarchy by clicking the desired folder or bucket name in the breadcrumb trail above the list of objects.

When you use other tools to work with your buckets and data, the presentation of folders might differ from how they appear in the Google Cloud console. For more information on how different tools, such as the gcloud CLI, simulate folders in Cloud Storage, see Simulated folders.

Filtering buckets or objects to view

In the Google Cloud console list of buckets for a project, you can filter the buckets you see by using the Filter buckets text box.

  • You can always filter by the bucket name prefix.

  • For projects with fewer than 1,000 buckets, you can always filter by additional criteria, such as the Location of buckets.

  • For projects with more than 1,000 buckets, you must enable additional-criteria filtering by using the dropdown that appears next to the filtering text box. Note, however, that projects with thousands of buckets experience degraded filtering performance when additional-criteria filtering is enabled.

In the Google Cloud console list of objects for a bucket, you can filter the objects you see by specifying a prefix in the Filter by object or folder name prefix... text box, located above the list of objects. This filter displays objects beginning with the specified prefix. The prefix only filters objects in your current bucket view: it does not select objects contained in folders.
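The prefix-filtering behavior can be sketched as follows. This is an assumption-laden illustration of the described behavior, not console code: names that continue past the current level with a `/` are treated as folders rather than matching objects.

```python
# Sketch of prefix filtering at one level of a bucket view.
# Illustrative only; models the behavior described above, not a real API.

def filter_view(names, prefix):
    """Return (matching objects, folders) at the current level."""
    objects, folders = [], set()
    for name in names:
        if not name.startswith(prefix):
            continue
        rest = name[len(prefix):]
        if "/" in rest:
            # The match continues into a simulated folder: report the
            # folder instead of the object it contains.
            folders.add(prefix + rest.split("/", 1)[0] + "/")
        else:
            objects.append(name)
    return objects, sorted(folders)

names = ["cat.jpeg", "catalog.csv", "pets/cat.jpeg", "dog.png"]
print(filter_view(names, "cat"))  # (['cat.jpeg', 'catalog.csv'], [])
```

Note that filtering on `cat` does not match `pets/cat.jpeg`, because the prefix applies only to names at the current level of the view.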

Setting object metadata

You can configure an object's metadata in the Google Cloud console. Object metadata controls aspects of how requests are handled, including what type of content your data represents and how your data is encoded. Use the Google Cloud console to set metadata on one object at a time. Use gcloud storage objects update to set metadata on multiple objects simultaneously.

See View and edit object metadata for a step-by-step guide to viewing and editing an object's metadata.

Deleting objects, folders, and buckets

You can delete any bucket, folder, or object in the Google Cloud console by selecting the checkbox next to it, clicking the Delete button, and confirming you want to proceed with the action. When you delete a folder or bucket, you also delete all objects inside it, including any objects marked as Public.

See Delete objects for a step-by-step guide to removing objects from your buckets using the Google Cloud console.

See Delete buckets for a step-by-step guide to deleting buckets from your project using the Google Cloud console.

Sharing your data publicly

When you share an object publicly, a link icon appears in the object's public access column. Clicking this icon reveals a public URL for accessing the object.

The public URL is different from the link you get by right-clicking an object. Both links provide access to the object, but the public URL works without signing in to a user account. See Request endpoints for more information.

See Access public data for ways to access a publicly shared object.

To stop sharing an object publicly, remove any permission entries that have allUsers or allAuthenticatedUsers as principals.

Using the public access column

Both buckets and objects in the Google Cloud console have a public access column that indicates when resources are shared publicly.

Bucket-level public access column

A bucket's public access column can have the following values: Public to internet, Not public, or Subject to object ACLs.

A bucket is Public to internet if it has an IAM role that meets these criteria:

  • The role is granted to the principal allUsers or allAuthenticatedUsers.
  • The role has at least one storage permission that is not storage.buckets.create or storage.buckets.list.

If these criteria are not met, the bucket is either Not public or Subject to object ACLs:

  • Not public: No IAM role grants public access to the objects in the bucket, and uniform bucket-level access is enabled for the bucket.

  • Subject to object ACLs: No IAM role grants public access to the objects in the bucket, but Access control lists (ACLs) may still be granting public access to individual objects within the bucket. Check each object's permissions to see if they are public. To use IAM exclusively, enable uniform bucket-level access.
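The bucket-level rules above can be sketched as a small decision function. This is a simplified illustration, not the console's implementation: an IAM policy is modeled here as `(principal, permissions)` pairs, and the expansion of roles into permissions is assumed to have already happened.

```python
# Sketch of the bucket-level public access rules described above.
# Illustrative only; real IAM policies grant roles, not raw permissions.

PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}
EXEMPT = {"storage.buckets.create", "storage.buckets.list"}

def bucket_public_status(bindings, uniform_access):
    for principal, permissions in bindings:
        # Public if a public principal holds any storage permission
        # other than the two exempt ones.
        if principal in PUBLIC_PRINCIPALS and set(permissions) - EXEMPT:
            return "Public to internet"
    return "Not public" if uniform_access else "Subject to object ACLs"

bindings = [("allUsers", ["storage.objects.get"])]
print(bucket_public_status(bindings, uniform_access=True))
# Public to internet
```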

Object-level public access column

An object is considered public if either of these conditions is true:

  1. The Access control list (ACL) for the object includes an entry for allUsers or allAuthenticatedUsers.

  2. The bucket containing the object has an IAM role that meets these criteria:

    • The role is granted to the principal allUsers or allAuthenticatedUsers.
    • The role has at least one of the following storage permissions: storage.objects.get, storage.objects.getIamPolicy, storage.objects.setIamPolicy, storage.objects.update.

If either of these conditions is true, the public access column for the object reads Public to internet.

If neither condition is true, the public access column for the object reads Not public.
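The two object-level conditions can likewise be sketched as a function. This mirrors the rules as stated above, under simplifying assumptions: ACL entries are modeled as a list of principals, and bucket IAM grants as `(principal, permissions)` pairs rather than roles.

```python
# Sketch of the object-level public access rules described above.
# Illustrative only; not a client-library API.

PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}
PUBLIC_OBJECT_PERMISSIONS = {
    "storage.objects.get", "storage.objects.getIamPolicy",
    "storage.objects.setIamPolicy", "storage.objects.update",
}

def object_is_public(acl_principals, bucket_bindings):
    # Condition 1: the object's ACL includes a public principal.
    if PUBLIC_PRINCIPALS & set(acl_principals):
        return True
    # Condition 2: the bucket grants a public principal one of the
    # listed object permissions.
    for principal, permissions in bucket_bindings:
        if (principal in PUBLIC_PRINCIPALS
                and PUBLIC_OBJECT_PERMISSIONS & set(permissions)):
            return True
    return False

print(object_is_public([], [("allUsers", ["storage.objects.get"])]))  # True
```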

Setting bucket permissions

You can control access to a Cloud Storage bucket by using IAM permissions. For example, you can set a bucket's permissions to allow an entity such as a user or group to view or create objects in your bucket. You might do this in cases when it isn't appropriate to add a user at the project level. The entity specified in the IAM permission must authenticate by signing in to Google when accessing the bucket. Share the bucket URL with those users as https://console.cloud.google.com/storage/browser/BUCKET_NAME/.

Setting object permissions

You can easily and uniformly control access to objects in a bucket by using IAM permissions in the Google Cloud console. If you want to customize access for individual objects within a bucket, use Signed URLs or Access control lists (ACLs) instead.

See Use IAM permissions for step-by-step guides to viewing and editing IAM permissions.

To view or change permissions for individual objects, see Changing ACLs.

Giving users project-level roles

When you create a project, you are given the Owner IAM role. Other entities, such as collaborators, must be granted their own roles to work with your project's buckets and objects.

Once you have been given a role for the project, the project name appears in your list of projects. If you are an existing project owner, you can grant a principal access to the project. See Manage access to projects, folders, and organizations for step-by-step guides to adding and removing access at the project level.

Working with Object Versioning

You can enable Object Versioning to retain noncurrent versions of an object in case of accidental deletion or replacement; however, enabling Object Versioning increases storage costs. You can mitigate costs by also adding Object Lifecycle Management conditions when you enable Object Versioning. These conditions automatically delete or down-class older object versions based on settings you specify. The configuration example for deleting objects gives one possible set of conditions for this use case.
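As one possible example of such conditions, a lifecycle configuration like the following deletes noncurrent versions once an object has more than two newer versions. The threshold of 2 is an arbitrary illustration; choose a value that fits your retention needs.

```json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"isLive": false, "numNewerVersions": 2}
    }
  ]
}
```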

Noncurrent versions of an object are listed and managed in the Version history tab of the object.

Scanning buckets with Sensitive Data Protection

Sensitive Data Protection is a service that helps you identify and protect sensitive data in your buckets. It can help you meet compliance requirements by finding and redacting information such as:

  • Credit card numbers
  • IP addresses
  • Other forms of personally identifiable information (PII)

For a list of the types of data Sensitive Data Protection detects, see the Infotype detector reference.

You can initiate a Sensitive Data Protection scan for a bucket by clicking the three-dot menu for the bucket and selecting Scan with Sensitive Data Protection. For a guide to performing a Sensitive Data Protection scan on a bucket, see Inspecting a Cloud Storage location.