Use the Google Cloud console to perform simple storage management tasks for Cloud Storage. Some typical uses for the Google Cloud console include:
- Enabling the Cloud Storage API for a project.
- Creating and deleting buckets.
- Uploading, downloading, and deleting objects.
- Managing Identity and Access Management (IAM) policies.
This page provides an overview of the Google Cloud console, including the tasks you can accomplish with it to manage your data. For more advanced tasks, use the Google Cloud CLI or any of the client libraries that support Cloud Storage.
Access to the Google Cloud console
The Google Cloud console requires no setup or installation, and you can access it directly in a browser. Depending on your use case, you access the Google Cloud console in slightly different ways. If you are:
- A user granted access to a project
Use:
https://console.cloud.google.com/
A current project owner can give you access to the entire project, which applies equally to all buckets and objects defined in the project.
- A user granted access to a bucket
Use:
https://console.cloud.google.com/storage/browser/BUCKET_NAME
In this use case, a project owner gives you access to an individual bucket within a larger project. The owner then sends you the bucket name, which you substitute into the URL above. You can work only with objects in the specified bucket. This is useful for users who don't have access to the full project but who need to access a particular bucket. When you access the URL, you are prompted to authenticate if you are not already signed in.
A variation of this use case is when a project owner grants the allUsers principal permission to read objects in a bucket. This creates a bucket whose contents are publicly readable. For more information, see Setting permissions and metadata.
- A user granted access to an object
Use:
https://console.cloud.google.com/storage/browser/_details/BUCKET_NAME/OBJECT_NAME
In this use case, a project owner gives you access to single objects within a bucket and sends you the URL to access the objects. When you access the URLs, you are prompted to authenticate with a user account if you are not already signed in.
Note that the form of the URL above is different from the URL for objects that are shared publicly. When you share a link publicly, the URL is of the form:
https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME
This public URL does not require a recipient to authenticate with Google and can be used for unauthenticated access to an object.
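The console URLs above, and the public URL, differ only in their paths. As a minimal sketch (BUCKET_NAME and OBJECT_NAME stand in for your own resource names; these helpers are illustrative, not part of any Google API), the URLs can be built like this:

```python
# Build the console and public URL forms described above.
# The bucket and object names passed in are placeholders, not real resources.
CONSOLE = "https://console.cloud.google.com"

def bucket_url(bucket: str) -> str:
    """Console URL for a user granted access to a single bucket."""
    return f"{CONSOLE}/storage/browser/{bucket}"

def object_url(bucket: str, obj: str) -> str:
    """Console URL for a user granted access to a single object."""
    return f"{CONSOLE}/storage/browser/_details/{bucket}/{obj}"

def public_url(bucket: str, obj: str) -> str:
    """Public URL; requires no Google authentication."""
    return f"https://storage.googleapis.com/{bucket}/{obj}"

print(object_url("my-bucket", "pets/cat.jpeg"))
# → https://console.cloud.google.com/storage/browser/_details/my-bucket/pets/cat.jpeg
```

Only the public URL form works without signing in; the console forms prompt for authentication if you aren't already signed in.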
Tasks you can perform with the Google Cloud console
The Google Cloud console enables you to perform basic storage management tasks with your data using a browser. To use the Google Cloud console, you must authenticate with Google and have appropriate permission to complete a given task. If you are the account owner who created the project, it is likely you already have all the permissions you need to complete the tasks below. Otherwise, you can be granted access to a project or be given permission to perform actions on a bucket.
Creating a bucket
Cloud Storage uses a flat namespace to store your data, but you can use the Google Cloud console to create folders and mimic a folder hierarchy. Your data isn't physically stored in a hierarchical structure, but it is displayed that way in the Google Cloud console.
For a step-by-step guide to bucket creation, see Create buckets.
Uploading data to a bucket
You can upload data to your bucket by uploading one or more files or a folder containing files. When you upload a folder, the Google Cloud console maintains the same hierarchical structure of the folder, including all of the files and folders it contains. You can track the progress of uploads to the Google Cloud console using the upload progress window. You can minimize the progress window and continue working with your bucket.
See Upload objects for a step-by-step guide to uploading objects to your buckets using the Google Cloud console.
You can also upload objects by dragging and dropping files and folders from your desktop or file manager tool to a bucket or sub-folder in the Google Cloud console.
Downloading data from a bucket
See Download objects for a step-by-step guide to downloading objects from your buckets using the Google Cloud console.
You can also view details of an object by clicking it. If the object can be displayed, the details page includes a preview of the object itself.
Creating and using folders
Because a Cloud Storage bucket exists as a flat namespace, folders created using the Google Cloud console are a simulated convenience to help you organize objects. The Google Cloud console represents these simulated folders with a folder icon image, and the Google Cloud console makes objects added to a simulated folder appear to reside within that folder. In reality, all objects exist at the bucket level. The Google Cloud console makes objects appear to be contained in a simulated folder when the folder's name is part of the object's overall name.
For example, if you create a simulated folder named pets and add a file cat.jpeg to that folder, the Google Cloud console makes the file appear to exist in the folder. In reality, there is no separate folder entity: the file simply exists in the bucket and has the name pets/cat.jpeg.
Simulated folders created by the Google Cloud console exist as 0-byte objects within the bucket. You cannot use the Google Cloud console to set metadata or access permissions on these 0-byte objects. Note that there is a separate Cloud Storage feature, called managed folders, that lets you control access to a group of objects that share a common prefix.
When navigating simulated folders in the Google Cloud console, you can move to higher levels of the hierarchy by clicking the desired folder or bucket name in the breadcrumb trail above the file list.
When you use other tools to work with your buckets and data, the presentation of folders might be different than as presented in the Google Cloud console. For more information on how different tools, such as the gcloud CLI, simulate folders in Cloud Storage, see Simulated folders.
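The flat-namespace behavior described above can be sketched without any cloud calls. In the sketch below (an assumption-laden model, not the Cloud Storage API), a bucket is just a set of object names, and a console-style folder view is produced by grouping names on the first `/` after the current prefix:

```python
# Objects in a bucket form a flat namespace; "folders" are only name prefixes.
objects = {
    "pets/",            # 0-byte placeholder the console creates for the folder
    "pets/cat.jpeg",    # appears "inside" pets/ only because of its name
    "pets/dog.jpeg",
    "readme.txt",
}

def list_folder_view(names, prefix=""):
    """Mimic the console's folder view: (subfolders, files) under prefix."""
    folders, files = set(), set()
    for name in names:
        if not name.startswith(prefix) or name == prefix:
            continue
        rest = name[len(prefix):]
        if "/" in rest:
            # Everything up to the next "/" is shown as a simulated folder.
            folders.add(prefix + rest.split("/", 1)[0] + "/")
        else:
            files.add(name)
    return sorted(folders), sorted(files)

print(list_folder_view(objects))           # (['pets/'], ['readme.txt'])
print(list_folder_view(objects, "pets/"))  # ([], ['pets/cat.jpeg', 'pets/dog.jpeg'])
```

Note that `pets/cat.jpeg` is a single flat object name throughout; only the grouping step makes it look nested, which mirrors how the console and other tools simulate folders.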
Filtering and sorting lists of buckets and objects
In the Google Cloud console list of buckets for a project, you can filter the buckets you see by using the Filter buckets field, and you can sort the buckets by clicking the heading of the column you want to sort by.
You can always filter by the bucket name prefix.
For projects with fewer than 1,000 buckets, you can always filter by additional criteria, such as the Location of buckets, and you can always sort by column.
For projects with more than 1,000 buckets, you must enable additional criteria filtering and sorting by using the drop-down that appears next to the filtering field. Note, however, that projects with thousands of buckets experience degraded filtering and sorting performance when this is enabled.
In the Google Cloud console list of objects for a bucket, you filter the objects you see by entering a prefix in the Filter objects and folders field. This filter displays objects and folders beginning with the entered prefix.
Filtering and sorting only applies to objects and folders in the current path being displayed. For example, if you're viewing the top-level of a bucket, filtering and sorting don't return objects contained in folders.
To enable sorting, as well as additional filtering options, click the Filter by name prefix only drop-down, and select Sort and filter.
Once you have enabled sorting, sort the objects by clicking the heading of the column you want to sort by.
Buckets with large numbers of objects and folders in the current path experience degraded performance when sorted or when filtered with criteria other than the name prefix.
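The "current path only" behavior of prefix filtering can be sketched as follows (a local simulation under the assumption that object names are plain strings, not a console API):

```python
# Prefix filtering matches only entries directly under the current path;
# objects inside subfolders of that path are not returned.
objects = ["archive/cats.zip", "cats.jpeg", "cows.jpeg", "dogs.jpeg"]

def filter_current_level(names, prefix, path=""):
    """Return objects directly under `path` whose names begin with `prefix`."""
    hits = []
    for name in names:
        if not name.startswith(path):
            continue
        rest = name[len(path):]
        if "/" in rest:        # lives in a subfolder of the current path
            continue
        if rest.startswith(prefix):
            hits.append(name)
    return hits

# Filtering at the top level does not surface archive/cats.zip:
print(filter_current_level(objects, "ca"))  # ['cats.jpeg']
```

To find `archive/cats.zip` you would first navigate into the `archive/` folder and filter there, which is the behavior the console exhibits.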
Showing and hiding columns
To show or hide columns for a list of buckets or objects, click View column, then select the columns you want to see or hide.
Setting object metadata
You can configure an object's metadata in the Google Cloud console. Object metadata controls aspects of how requests are handled, including what type of content your data represents and how your data is encoded. Use the Google Cloud console to set metadata on one object at a time. Use gcloud storage objects update to set metadata on multiple objects simultaneously.
See View and edit object metadata for a step-by-step guide to viewing and editing an object's metadata.
Deleting objects, folders, and buckets
You can delete any bucket, folder, or object in the Google Cloud console by selecting the checkbox next to it, clicking the Delete button, and confirming you want to proceed with the action. When you delete a folder or bucket, you also delete all objects inside it, including any objects marked as Public.
See Delete objects for a step-by-step guide to removing objects from your buckets using the Google Cloud console.
See Delete buckets for a step-by-step guide to deleting buckets from your project using the Google Cloud console.
Sharing your data publicly
When you share an object publicly, a link icon appears in the object's public access column. Clicking on this link reveals a public URL for accessing the object.
The public URL is different from the link you get by right-clicking an object directly. Both links provide access to the object, but the public URL works without having to sign in to a user account. See Request endpoints for more information.
See Access public data for ways to access a publicly shared object.
To stop sharing an object publicly, remove any permission entries that have allUsers or allAuthenticatedUsers as principals:
- For buckets where you share only certain objects publicly, edit the ACL of the individual object.
- For buckets where you share all objects publicly, remove the IAM access granted to allUsers.
Using the public access column
Both buckets and objects in the Google Cloud console have a public access column that indicates when resources are shared publicly.
Bucket-level public access column
A bucket's public access column can have the following values: Public to internet, Not public, or Subject to object ACLs.
A bucket is Public to internet if it has an IAM role that meets these criteria:
- The role is granted to the principal allUsers or allAuthenticatedUsers.
- The role has at least one storage permission other than storage.buckets.create or storage.buckets.list.
If these conditions are not met, the bucket is either Not public or Subject to object ACLs:
- Not public: No IAM role grants public access to the objects in the bucket, and uniform bucket-level access is enabled for the bucket.
- Subject to object ACLs: No IAM role grants public access to the objects in the bucket, but access control lists (ACLs) might still grant public access to individual objects within the bucket. Check each object's permissions to see whether it is public. To use IAM exclusively, enable uniform bucket-level access.
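The bucket-level rule above is a simple predicate. As a minimal sketch, modeling role bindings as (principal, set-of-permissions) pairs (an assumption of this sketch, not the shape of the IAM API):

```python
# Determine whether a bucket counts as "Public to internet" per the
# criteria above: a public principal holds some storage permission
# other than storage.buckets.create or storage.buckets.list.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}
EXEMPT_PERMS = {"storage.buckets.create", "storage.buckets.list"}

def bucket_is_public(bindings):
    """bindings: iterable of (principal, set_of_permission_strings)."""
    for principal, perms in bindings:
        if principal in PUBLIC_PRINCIPALS and any(
            p.startswith("storage.") and p not in EXEMPT_PERMS for p in perms
        ):
            return True
    return False

# A binding granting allUsers object read access makes the bucket public:
print(bucket_is_public([("allUsers", {"storage.objects.get"})]))   # True
# Granting only storage.buckets.list does not:
print(bucket_is_public([("allUsers", {"storage.buckets.list"})]))  # False
```

In practice the console evaluates the bucket's actual IAM policy; this sketch only restates the two criteria as code.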
Object-level public access column
An object is considered public if either of these conditions is true:
The Access control list (ACL) for the object includes an entry for allUsers or allAuthenticatedUsers.
The bucket containing the object has an IAM role that meets these criteria:
- The role is granted to the principal allUsers or allAuthenticatedUsers.
- The role has at least one of the following storage permissions: storage.objects.get, storage.objects.getIamPolicy, storage.objects.setIamPolicy, or storage.objects.update.
If either of these conditions is true, the public access column for the object reads Public to internet.
If neither condition is true, the public access column for the object reads Not public.
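The object-level rule can be sketched the same way (again modeling ACL entries as principal strings and bucket bindings as (principal, permissions) pairs, which is an assumption of this sketch):

```python
# An object is public if its ACL names a public principal, or if the
# containing bucket grants a public principal one of the listed permissions.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}
PUBLIC_OBJECT_PERMS = {
    "storage.objects.get",
    "storage.objects.getIamPolicy",
    "storage.objects.setIamPolicy",
    "storage.objects.update",
}

def object_is_public(acl_entries, bucket_bindings):
    """acl_entries: principals in the object's ACL;
    bucket_bindings: (principal, set_of_permissions) pairs on the bucket."""
    if PUBLIC_PRINCIPALS & set(acl_entries):
        return True
    return any(
        principal in PUBLIC_PRINCIPALS and PUBLIC_OBJECT_PERMS & perms
        for principal, perms in bucket_bindings
    )

print(object_is_public(["allUsers"], []))                                  # True
print(object_is_public([], [("allAuthenticatedUsers",
                             {"storage.objects.get"})]))                   # True
print(object_is_public(["user:a@example.com"], []))                        # False
```

Either path alone is enough to flip the column to Public to internet, which is why removing a public ACL entry may not be sufficient if the bucket's IAM policy still grants public access.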
Setting bucket permissions
You can control access to a Cloud Storage bucket by using IAM permissions. For example, you can set a bucket's permissions to allow an entity such as a user or group to view or create objects in your bucket. You might do this when it isn't appropriate to add a user at the project level. The entity specified in the IAM permission must authenticate by signing in to Google when accessing the bucket. Share the bucket URL with those users as:
https://console.cloud.google.com/storage/browser/BUCKET_NAME/
Setting object permissions
You can easily and uniformly control access to objects in a bucket by using IAM permissions in the Google Cloud console. If you want to customize access for individual objects within a bucket, use Signed URLs or Access control lists (ACLs) instead.
See Use IAM permissions for step-by-step guides to viewing and editing IAM permissions.
To view or change permissions for individual objects, see Changing ACLs.
Giving users project-level roles
When you create a project, you are given the Owner IAM role. Other entities, such as collaborators, must be given their own roles in order to work with your project's buckets and objects.
Once you have been given a role for the project, the project name appears in your list of projects. If you are an existing project owner, you can grant a principal access to the project. See Manage access to projects, folders, and organizations for step-by-step guides to adding and removing access at the project level.
Working with Object Versioning
You can enable Object Versioning to retain noncurrent versions of an object in case of accidental deletion or replacement; however, enabling Object Versioning increases storage costs. You can mitigate costs by also adding Object Lifecycle Management conditions when you enable Object Versioning. These conditions automatically delete older object versions or change their storage class based on settings you specify. The configuration example for deleting objects gives one possible set of conditions for this use case.
Noncurrent versions of an object are listed and managed in the Version history tab of the object.
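One way to cap versioning costs is a lifecycle rule that deletes noncurrent versions once enough newer versions exist. A sketch of such a configuration document follows; the threshold value here is illustrative, not a recommendation, and you should check the current lifecycle configuration reference before applying anything like it:

```python
import json

# Illustrative lifecycle configuration: delete a noncurrent object version
# once two newer versions of the same object exist.
lifecycle_config = {
    "rule": [
        {
            "action": {"type": "Delete"},
            "condition": {
                "isLive": False,        # applies only to noncurrent versions
                "numNewerVersions": 2,  # example threshold, not a default
            },
        }
    ]
}

print(json.dumps(lifecycle_config, indent=2))
```

Saved as a JSON file, a document of this shape can be applied to a bucket with a tool such as the gcloud CLI.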
Cross-product integrations
The following integrations with other Google Cloud products are available in the Objects tab of a bucket:
Large scale data transfers to and from the bucket using Storage Transfer Service.
Storage Transfer Service is a service that lets you transfer large volumes of data between your bucket and other storage options, such as your on-premises file system, other buckets, or other cloud providers.
You can initiate a transfer by clicking the Transfer data drop-down in the Objects tab, selecting either Transfer data in or Transfer data out, and following the instructions.
Scanning the bucket for sensitive data using Sensitive Data Protection.
Sensitive Data Protection is a service that lets you identify and protect sensitive data in your buckets, such as credit card numbers, IP addresses, and other forms of personally identifiable information (PII).
For a list of the types of data Sensitive Data Protection detects, see the Infotype detector reference.
You can initiate a Sensitive Data Protection scan for a bucket by clicking the Other services drop-down in the Objects tab, selecting Inspect for sensitive data, and following the instructions. For a guide to performing a Sensitive Data Protection scan on a bucket, see Inspecting a Cloud Storage location.
Exporting data from the bucket to Pub/Sub.
Pub/Sub is a messaging service that lets you notify subscribers about events that occur for your Google Cloud resources. Pub/Sub supports receiving event records that are stored as text files in your bucket and publishing them to a Pub/Sub topic.
You can create an export job for a bucket by clicking the Other services drop-down in the Objects tab, selecting Export data to Pub/Sub, and following the instructions. For more information, see Cloud Storage text to Pub/Sub (Batch) template.
Processing data in the bucket using Cloud Run functions.
Cloud Run functions is a service that lets you specify code that should run when certain events occur within the bucket. For example, you could create a Java function that runs every time an object in the bucket is deleted.
You can define a function for a bucket by clicking the Other services drop-down in the Objects tab, selecting Process data, and following the instructions.