Troubleshooting

This page describes troubleshooting methods for common errors you may encounter while using Cloud Storage.

Logging raw requests

When using tools such as gsutil or the Cloud Storage client libraries, much of the request and response information is handled by the tool. However, it is sometimes useful to see details to aid in troubleshooting. Use the following instructions to return request and response headers for your tool:

Console

Viewing request and response information depends on the browser you're using to access the Google Cloud Console. For the Google Chrome browser:

  1. Click Chrome's main menu button (⋮).

  2. Select More Tools.

  3. Click Developer Tools.

  4. In the pane that appears, click the Network tab.

gsutil

Use the top-level -D flag in your request. For example:

gsutil -D ls gs://my-bucket/my-object

Client libraries

C++

  • Set the environment variable CLOUD_STORAGE_ENABLE_TRACING=http to get the full HTTP traffic.

  • Set the environment variable CLOUD_STORAGE_ENABLE_CLOG=yes to get logging of each RPC.
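
For example, to enable both when running your application from a shell (a minimal sketch; ./my_program is a placeholder for your own binary):

CLOUD_STORAGE_ENABLE_TRACING=http CLOUD_STORAGE_ENABLE_CLOG=yes ./my_program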

C#

Add a logger via ApplicationContext.RegisterLogger, and set logging options on the HttpClient message handler. For more information, see the FAQ entry.

Go

Set the environment variable GODEBUG=http2debug=1. For more information, see the Go package net/http.
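
For example, to enable this logging for a single run of your program (a sketch; main.go is a placeholder for your own entry point):

GODEBUG=http2debug=1 go run main.go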

If you want to log the request body as well, use a custom HTTP client.

Java

  1. Create a file named "logging.properties" with the following contents:

    # Properties file which configures the operation of the JDK logging facility.
    # The system will look for this config file to be specified as a system property:
    # -Djava.util.logging.config.file=${project_loc:googleplus-simple-cmdline-sample}/logging.properties
    
    # Set up the console handler (uncomment "level" to show more fine-grained messages)
    handlers = java.util.logging.ConsoleHandler
    java.util.logging.ConsoleHandler.level = CONFIG
    
    # Set up logging of HTTP requests and responses (uncomment "level" to show)
    com.google.api.client.http.level = CONFIG
  2. Use logging.properties with Maven:

    mvn -Djava.util.logging.config.file=path/to/logging.properties insert_command

For more information, see Pluggable HTTP Transport.

Node.js

Set the environment variable NODE_DEBUG=https before calling the Node script.
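
For example (a sketch; app.js is a placeholder for your own script):

NODE_DEBUG=https node app.js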

PHP

Provide your own HTTP handler to the client using httpHandler and set up middleware to log the request and response.

Python

Use the logging module. For example:

import logging
import http.client

# Show debug-level log records, including HTTP request details, on the console.
logging.basicConfig(level=logging.DEBUG)

# Print low-level connection information, such as request and response headers.
http.client.HTTPConnection.debuglevel = 5

Ruby

At the top of your .rb file after require "google/cloud/storage", add the following:

Google::Apis.logger.level = Logger::DEBUG

Error codes

The following are common HTTP status codes you may encounter.

301: Moved Permanently

Issue: I'm setting up a static website, and accessing a directory path returns an empty object and a 301 HTTP response code.

Solution: If your browser downloads a zero-byte object and you get a 301 HTTP response code when accessing a directory, such as http://www.example.com/dir/, your bucket most likely contains an empty object of that name. To check that this is the case and fix the issue:

  1. In the Google Cloud Console, go to the Cloud Storage Browser page.

  2. Click the Activate Cloud Shell button at the top of the Google Cloud Console.
  3. Run gsutil ls -R gs://www.example.com/dir/. If the output includes http://www.example.com/dir/, you have an empty object at that location.
  4. Remove the empty object with the command: gsutil rm gs://www.example.com/dir/

You can now access http://www.example.com/dir/ and have it return that directory's index.html file instead of the empty object.

400: Bad Request

Issue: While performing a resumable upload, I received this error and the message Failed to parse Content-Range header.

Solution: The value you used in your Content-Range header is invalid. For example, Content-Range: */* is invalid and instead should be specified as Content-Range: bytes */*. If you receive this error, your current resumable upload is no longer active, and you must start a new resumable upload.
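
As an illustration, a correctly formed header is used when checking the status of a resumable upload; the following is a sketch in which SESSION_URI is a placeholder for the session URI returned when you initiated the upload and the total object size is unknown:

curl -i -X PUT -H "Content-Length: 0" -H "Content-Range: bytes */*" "SESSION_URI"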

401: Unauthorized

Issue: Requests to a public bucket directly, or via Cloud CDN, are failing with an HTTP 401: Unauthorized and an Authentication Required response.

Solution: Check that your client, or any intermediate proxy, is not adding an Authorization header to requests to Cloud Storage. Any request with an Authorization header, even if empty, is validated as if it were an authentication attempt.
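
One way to see exactly which headers your client sends is to repeat the request with curl in verbose mode and confirm that no Authorization header appears in the output (a sketch; BUCKET_NAME and OBJECT_NAME are placeholders):

curl -v -o /dev/null "https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME"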

403: Account Disabled

Issue: I tried to create a bucket but got a 403 Account Disabled error.

Solution: This error indicates that you have not yet turned on billing for the associated project. For steps for enabling billing, see Enable billing for a project.

If billing is turned on and you continue to receive this error message, you can reach out to support with your project ID and a description of your problem.

403: Access Denied

Issue: I tried to list the objects in my bucket but got a 403 Access Denied error and/or a message similar to Anonymous caller does not have storage.objects.list access.

Solution: Check that your credentials are correct. For example, if you are using gsutil, check that the credentials stored in your .boto file are accurate. Also, confirm that gsutil is using the .boto file you expect by using the command gsutil version -l and checking the config path(s) entry.

If you are using the correct credentials, check whether your requests are being routed through a proxy that uses HTTP instead of HTTPS. Such proxies are sometimes configured to remove the Authorization header from requests; if that is the case for your proxy, use HTTPS for your requests instead.

403: Forbidden

Issue: I am downloading my public content from storage.cloud.google.com, and I receive a 403: Forbidden error when I use the browser to navigate to the public object:

https://storage.cloud.google.com/BUCKET_NAME/OBJECT_NAME

Solution: Using storage.cloud.google.com to download objects is known as authenticated browser downloads; it always uses cookie-based authentication, even when objects are made publicly accessible to allUsers. If you have configured Data Access logs in Cloud Audit Logs to track access to objects, one of the restrictions of that feature is that authenticated browser downloads cannot be used to access the affected objects; attempting to do so results in a 403 response.

To avoid this issue, do one of the following:

  • Use direct API calls, which support unauthenticated downloads, instead of using authenticated browser downloads (see the example after this list).
  • Disable the Cloud Storage Data Access logs that are tracking access to the affected objects. Be aware that Data Access logs are set at or above the project level and can be enabled simultaneously at multiple levels.
  • Set Data Access log exemptions to exclude specific users from Data Access log tracking, which allows those users to perform authenticated browser downloads.
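
For example, a direct, unauthenticated API call to download a public object can be as simple as the following (BUCKET_NAME and OBJECT_NAME are placeholders):

curl -O "https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME"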

409: Conflict

Issue: I tried to create a bucket but received the following error:

409 Conflict. Sorry, that name is not available. Please try a different one.

Solution: The bucket name you tried to use (e.g. gs://cats or gs://dogs) is already taken. Cloud Storage has a global namespace, so you cannot give a bucket the same name as an existing bucket. Choose a name that is not being used.

Diagnosing Google Cloud Console errors

Issue: When using the Google Cloud Console to perform an operation, I get a generic error message. For example, I see an error message when trying to delete a bucket, but I don't see details for why the operation failed.

Solution: Use the Google Cloud Console's notifications to see detailed information about the failed operation:

  1. Click the Notifications button in the Google Cloud Console header.

    A dropdown displays the most recent operations performed by the Google Cloud Console.

  2. Click the item you want to find out more about.

    A page opens up and displays detailed information about the operation.

  3. Click on each row to expand the detailed error information.

    For example, the detailed information for a failed bucket deletion operation might explain that a bucket retention policy prevented the deletion of the bucket.

gsutil errors

The following are common gsutil errors you may encounter.

gsutil stat

Issue: I tried to use the gsutil stat command to display object status for a subdirectory and got an error.

Solution: Cloud Storage uses a flat namespace to store objects in buckets. While you can use slashes ("/") in object names to make it appear as if objects are in a hierarchical structure, the gsutil stat command treats a trailing slash as part of the object name.

For example, if you run the command gsutil -q stat gs://my-bucket/my-object/, gsutil looks up information about the object my-object/ (with a trailing slash), as opposed to operating on objects nested under my-bucket/my-object/. Unless you actually have an object with that name, the operation fails.

For subdirectory listing, use the gsutil ls command instead.
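
For example, to list the objects under that prefix:

gsutil ls gs://my-bucket/my-object/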

gcloud auth

Issue: I tried to authenticate gsutil using the gcloud auth command, but I still cannot access my buckets or objects.

Solution: Your system may have both the stand-alone and Cloud SDK versions of gsutil installed on it. Run the command gsutil version -l and check the value for using cloud sdk. If False, your system is using the stand-alone version of gsutil when you run commands. You can either remove this version of gsutil from your system, or you can authenticate using the gsutil config command.
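
For example, on a Unix-like system you can check just the relevant line of the output:

gsutil version -l | grep "using cloud sdk"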

Static website errors

The following are common issues that you may encounter when setting up a bucket to host a static website.

HTTPS serving

Issue: I want to serve my content over HTTPS without using a load balancer.

Solution: You can serve static content through HTTPS using direct URIs such as https://storage.googleapis.com/my-bucket/my-object. For other options to serve your content through a custom domain over SSL, you can:

  • Use a third-party Content Delivery Network with Cloud Storage.
  • Serve your static website content from Firebase Hosting instead of Cloud Storage.

Domain verification

Issue: I can't verify my domain.

Solution: Normally, the verification process in Search Console directs you to upload a file to your domain, but you may not have a way to do this without first having an associated bucket, which you can only create after you have performed domain verification.

In this case, verify ownership using the Domain name provider verification method. See Ownership verification for steps to accomplish this. This verification can be done before the bucket is created.

Inaccessible page

Issue: I get an Access denied error message for a web page served by my website.

Solution: Check that the object is shared publicly. If it is not, see Making Data Public for instructions on how to do this.

If you previously uploaded and shared an object, but then upload a new version of it, then you must reshare the object publicly. This is because uploading a new version replaces the object's access settings with the bucket's defaults, so the public permission is not carried over.
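
For example, one way to re-share an object publicly with gsutil, assuming the bucket uses fine-grained ACLs rather than uniform bucket-level access (my-bucket and my-object are placeholders):

gsutil acl ch -u AllUsers:R gs://my-bucket/my-object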

Permission update failed

Issue: I get an error when I attempt to make my data public.

Solution: Make sure that you have the setIamPolicy permission for your object or bucket. This permission is granted, for example, in the Storage Admin role. If you have the setIamPolicy permission and you still get an error, your bucket might be subject to public access prevention, which does not allow access to allUsers or allAuthenticatedUsers. Public access prevention might be set on the bucket directly, or it might be enforced through an organization policy that is set at a higher level.

Content download

Issue: I am prompted to download my page's content, instead of being able to view it in my browser.

Solution: If you specify a MainPageSuffix as an object that does not have a web content type, then instead of serving the page, site visitors are prompted to download the content. To resolve this issue, update the content-type metadata entry to a suitable value, such as text/html. See Editing object metadata for instructions on how to do this.
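
For example, you can update the entry with gsutil (my-bucket and my-page are placeholders):

gsutil setmeta -h "Content-Type:text/html" gs://my-bucket/my-page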

Proxy servers

Issue: I'm connecting through a proxy server. What do I need to do?

Solution: To access Cloud Storage through a proxy server, you must allow access to these domains:

  • accounts.google.com for creating OAuth2 authentication tokens via gsutil config
  • oauth2.googleapis.com for performing OAuth2 token exchanges
  • *.googleapis.com for storage requests

If your proxy server or security policy doesn't support whitelisting by domain and instead requires whitelisting by IP network block, we strongly recommend that you configure your proxy server for all Google IP address ranges. You can find the address ranges by querying WHOIS data at ARIN. As a best practice, you should periodically review your proxy settings to ensure they match Google's IP addresses.

We do not recommend configuring your proxy with individual IP addresses you obtain from one-time lookups of oauth2.googleapis.com and storage.googleapis.com. Because Google services are exposed via DNS names that map to a large number of IP addresses that can change over time, configuring your proxy based on a one-time lookup may lead to failures to connect to Cloud Storage.

If your requests are being routed through a proxy server, you may need to check with your network administrator to ensure that the Authorization header containing your credentials is not stripped out by the proxy. Without the Authorization header, your requests are rejected and you receive a MissingSecurityHeader error.
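
If you use gsutil behind a proxy, you can also point gsutil at the proxy in the [Boto] section of your .boto configuration file. A minimal sketch, where proxy.example.com and 8080 are placeholders for your proxy's host and port:

[Boto]
proxy = proxy.example.com
proxy_port = 8080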

What's next