Pricing and support
- Where can I find pricing information?
- Read the Pricing page for detailed information on pricing, including how Cloud Storage calculates bandwidth and storage usage.
- What are my support options?
- See the Cloud Storage Getting Support page for information about support options.
- Does Cloud Storage offer a service level agreement (SLA)?
- Yes. You are covered under the Cloud Storage Service Level Agreement.
- How do I notify Google of SLA Financial Credit eligibility?
- Use the SLA Financial Credit Eligibility form.
- How do I give product feedback?
- From the Cloud Storage documentation, click "Send feedback" near the top right of the page to open a feedback form. The Cloud Storage team reviews your comments.
- When do I need to activate Cloud Storage and enable billing?
- If you want to create buckets, store data, or control who can access your data, you must activate Cloud Storage and enable billing.
- How do I sign up?
- Sign up for Cloud Storage by turning on the Cloud Storage service in the Google Cloud Console.
- Do I need to activate Cloud Storage and turn on billing if I was granted access to someone else's bucket?
No. In this case, another individual has already set up a Google Cloud project and either added you as a project team member or granted you permissions on their buckets and objects. Once you authenticate, typically with your Google account, you can read or write data according to the access that you were granted.
For information about adding project team members, see Adding a member to a project.
- I am just trying to download or access some data that is freely available to the public. How can I do that?
Simply follow the Accessing public data guide, which offers several methods for accessing freely available, public data that is stored in Cloud Storage. Depending on the method you use, you do not need to turn on billing, create credentials, or authenticate to Cloud Storage.
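As a sketch, a publicly readable object can be downloaded anonymously over HTTPS or with the gsutil tool; the bucket and object names below are placeholders:

```shell
# Anonymous download of a public object over HTTPS
# (no billing, credentials, or authentication needed).
curl -O https://storage.googleapis.com/my-public-bucket/data.csv

# Equivalent download using the gsutil command-line tool.
gsutil cp gs://my-public-bucket/data.csv .
```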
- What tools and libraries are available for Cloud Storage?
The browser-based Google Cloud Console performs basic operations on buckets and objects.
The gsutil command-line tool provides a command-line interface to Cloud Storage.
The Cloud Storage Client Libraries provide programmatic support for a number of programming languages, including Java, Python, and Ruby.
You can find additional, third-party tools and libraries by searching the Internet.
- I'm developing a library or tool for Cloud Storage and I want to sell it on the Internet. Is this okay?
- How do I cancel my Cloud Storage account?
To cancel your Cloud Storage account, take the following steps:
Delete all your buckets and objects.
For step-by-step instructions, see Deleting Buckets.
Disable the Cloud Storage services for your project.
For the desired project, open the list of enabled APIs in the Google Cloud Console. In the list of APIs, click Disable for Google Cloud Storage and Google Cloud Storage JSON API.
Disable billing (optional).
You do not incur any new Cloud Storage charges after you perform the above steps, but you can disable billing to stop receiving statements. For step-by-step instructions, see Disable billing for a project. You will receive one last bill for any remaining charges incurred between the beginning of the billing cycle and when you disabled billing.
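The steps above can be sketched on the command line as follows; the bucket name is a placeholder, and the service names should be verified against the API list in the Cloud Console:

```shell
# 1. Delete all buckets and the objects they contain.
#    WARNING: this permanently deletes data.
gsutil rm -r gs://my-bucket

# 2. Disable the Cloud Storage services for the project
#    (service names assumed; confirm them in the Cloud Console).
gcloud services disable storage-component.googleapis.com
gcloud services disable storage-api.googleapis.com

# 3. Optionally, disable billing for the project in the
#    Cloud Console to stop receiving statements.
```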
Storage and content policy
- How durable is my data in Cloud Storage?
Cloud Storage is designed for 99.999999999% (11 9's) annual durability, which is appropriate for even primary storage and business-critical applications. This high durability level is achieved through erasure coding that stores data pieces redundantly across multiple devices located in multiple availability zones.
Objects written to Cloud Storage must be redundantly stored in at least two different availability zones before the write is acknowledged as successful. Checksums are stored and regularly revalidated to proactively verify the integrity of all data at rest and to detect corruption of data in transit. If required, corrections are automatically made using redundant data. Customers can optionally enable object versioning to add protection against accidental deletion.
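As a sketch, object versioning can be turned on per bucket with gsutil; the bucket name is a placeholder:

```shell
# Enable object versioning so overwritten or deleted objects
# are retained as noncurrent versions rather than lost.
gsutil versioning set on gs://my-bucket

# Confirm whether versioning is enabled for the bucket.
gsutil versioning get gs://my-bucket
```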
- How can I maximize the availability of my data?
Consider storing your data in a multi-regional or dual-regional bucket location if high availability is a top requirement. This ensures that your data is stored in at least two geographically separated regions, providing continued availability even in the rare event of a region-wide outage, including ones caused by natural disasters. All data, regardless of storage class, is stored geo-redundantly in these types of locations, which allows you to use storage lifecycle management without giving up high availability.
- What other advantages does Cloud Storage provide for disaster recovery scenarios?
Cloud Storage always provides strongly consistent object listings from a single bucket, even for buckets with data replicated across multiple regions. This means a recovery time objective (RTO) of zero in most circumstances for geo-redundant storage locations. In the unlikely case of a region-wide outage, the existing bucket remains available, with no need to change storage paths. Note that keeping a service running when a particular region goes offline frequently also requires setting up geo-redundant compute instances.
- Where is my data stored?
Where Cloud Storage stores your data depends on the location of the bucket in which your data resides. For information on available locations and implications for choosing a location, see the Bucket Locations page.
- How do I protect myself from accidental data deletion?
Cloud Storage offers several different ways for you to protect your data from accidental deletion. See the best practices topic for an overview of each.
- Can I delete a Cloud Storage object that I accidentally uploaded to a locked, retention-enabled bucket?
No. You can only delete such an object after it has fulfilled its retention period.
If you have not locked the bucket, you can temporarily remove the retention policy, remove the object, and then reinstate the retention policy.
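For an unlocked bucket, the temporary-removal workflow can be sketched with gsutil's retention commands; the bucket name, object name, and 30-day period are placeholders:

```shell
# Works only if the retention policy has NOT been locked.
# Temporarily clear the policy, delete the object, then
# reinstate the policy.
gsutil retention clear gs://my-bucket
gsutil rm gs://my-bucket/accidental-object
gsutil retention set 30d gs://my-bucket
```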
- I believe some content hosted on your service is inappropriate. How do I report it?
Certain types of content are not allowed on this service; please refer to the Terms of Service and Platform Policies for details. If you believe a piece of content is in violation of our policies, report it here (select See more products, then Google Cloud Storage & Cloud Bigtable).
- What is the default bucket location if I do not specify a location constraint?
- The default bucket location is within the US. If you do not specify a location constraint, then your bucket and data added to it are stored on servers in the US.
- Can I move buckets from one location to another or change the project that the bucket is associated with?
- Changing a bucket's location or project is not intrinsically provided by Cloud Storage; a bucket remains in the location and project that you set during bucket creation. If you want to change either of these parameters, you have to delete the bucket and recreate it.
- How can I get a summary of space usage for a Cloud Storage bucket?
- You can use Cloud Monitoring for daily monitoring of your bucket's byte count, or you can use the gsutil du command to get the total bytes in your bucket at a given moment. For more information, see Determining a Bucket's Size.
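As a sketch, the bucket name below is a placeholder:

```shell
# Report the total size of a bucket at this moment.
# -s summarizes the total; -h prints human-readable sizes
# such as "1.5 GiB" instead of raw byte counts.
gsutil du -s -h gs://my-bucket
```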
- I created a bucket, but don't remember which project I created it in. How can I find it?
For most common Cloud Storage operations, you only need to specify the relevant bucket's name, not the project associated with the bucket. In general, you only need to specify a project identifier when creating a bucket or listing buckets in a project. For more information, see When to specify a project.
To find which project contains a specific bucket:
- If you are searching over a moderate number of projects and buckets, use the Google Cloud Console, select each project, and view the buckets it contains.
- Otherwise, go to the storage.buckets.get page in the API Explorer and enter the bucket's name in the bucket field. When you click Authorize and Execute, the associated project number appears as part of the response. To get the project name, use the project number in the following terminal command:
gcloud projects list | grep [PROJECT_NUMBER]
Using with other Google services
- Can I use Cloud Storage to upload files to services in G Suite, such as Google Drive?
- No, Cloud Storage is not integrated with G Suite.
- Can I use Cloud Storage with my G Suite account or Cloud Identity domain?
- Yes, you can use Cloud Storage with either.
- Does Google offer other unstructured storage options?
- Yes, Google offers several storage options for unstructured data, such as Google Drive. For an overview of Google storage options, including a video explaining the differences between the options, see Storing Your Data.
- Can charges associated with accessing data be billed to the user who accesses the data?
- Yes. You can use the Requester Pays feature to require that requesters include a billing account project in their requests. The requester's project is then billed for access charges instead of the owner of the accessed bucket.
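As a sketch, both sides of the Requester Pays workflow can be expressed with gsutil; bucket, object, and project names are placeholders:

```shell
# As the bucket owner: enable Requester Pays on the bucket.
gsutil requesterpays set on gs://my-bucket

# As a requester: supply the project to bill with the -u flag
# when accessing data in the bucket.
gsutil -u my-billing-project cp gs://my-bucket/data.csv .
```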
- Does Cloud Storage provide any acceleration capabilities for uploads and downloads?
- Yes. Cloud Storage allows customers to use a global DNS name for uploads and downloads. Google uses its private network to transfer data between the bucket and the point of presence (POP) closest to the user, which generally results in significantly higher transfer performance than would be possible over the public Internet. This functionality is included with all Cloud Storage buckets at no additional charge.
- I want to let someone download an individual object. How do I do that?
- There are several ways that you can share an individual object. You can use a signed URL, which gives time-limited access to anyone in possession of the signed URL. See V4 signing with Cloud Storage tools for instructions to create a signed URL. Alternatively, you can use the resource.name IAM condition to selectively grant access to objects in a bucket. See Using IAM conditions on buckets for instructions to apply an IAM condition.
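As a sketch, gsutil can create a V4 signed URL from a service account key; the key file path, bucket, and object names are placeholders:

```shell
# Create a V4 signed URL that grants anyone who has it
# read access to the object for 10 minutes (-d 10m).
gsutil signurl -d 10m service-account-key.json gs://my-bucket/report.pdf
```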