
App Engine and Google Cloud Storage Sample

This tutorial shows everything you need to do to enable your App Engine Python app to access Cloud Storage. This non-interactive sample shows how to create, write, read, and list files in a Cloud Storage bucket.

The tutorial assumes that you are familiar with Python, and that you've already gone through the Quickstart for Python App Engine Standard Environment.

When this sample runs, it executes a script and writes the output to the browser. The script demonstrates the following features of the Cloud Storage client library:

  • Creating a file and writing the file to a bucket.
  • Reading the file and obtaining its file metadata.
  • Creating several files in the bucket.
  • Listing the files just added to the bucket.
  • Reading that same set of files.
  • Deleting that set of files.
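
For orientation, here is a condensed sketch of the client-library calls behind those steps; it is not the demo code itself, and the bucket and object names are placeholders:

import cloudstorage as gcs

filename = '/your-bucket/demo-testfile'  # the library addresses objects as /bucket/object

# Create and write a file.
gcs_file = gcs.open(filename, 'w', content_type='text/plain')
gcs_file.write('some content\n')
gcs_file.close()

# Read the file back and obtain its metadata.
gcs_file = gcs.open(filename)            # default mode is read
contents = gcs_file.read()
gcs_file.close()
file_stat = gcs.stat(filename)

# List the objects in the bucket.
for stat in gcs.listbucket('/your-bucket/'):
  pass  # each stat describes one object

# Delete the file.
gcs.delete(filename)

The rest of this tutorial walks through how the sample app packages these calls into a deployable App Engine application.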


In this tutorial, you will:

  • Walk through the Python project to view the required layout and files.
  • Understand the code for connecting to Cloud Storage.
  • Understand the code for creating, writing, reading, listing, and deleting files.
  • Understand the code for retries.
  • Build and test the app in your local development server.
  • Deploy the app to production on Google App Engine.


App Engine has a free level of usage. If your total usage of App Engine is less than the limits specified in the App Engine free quota, there is no charge for doing this tutorial.

Before you begin

Before you can run this sample, you need a project ID, the gcloud command-line tool, and a Cloud Storage bucket:

  1. Create a new Cloud Platform Console project or retrieve the project ID of an existing project from the Google Cloud Platform Console:

    Go to the Projects page

    Tip: Retrieve a list of your existing project IDs by running gcloud projects list.

  2. Install and then initialize the Google Cloud SDK:

    Download the SDK

  3. Activate the default Cloud Storage bucket.

Cloning the tutorial project and copying the library

To clone the project:

  1. Clone the client library and sample (demo) app to your local machine:

    git clone https://github.com/GoogleCloudPlatform/appengine-gcs-client.git gcs-client

    Alternatively, you can download the sample as a zip file and extract it.

  2. Change directory to python in the cloned or downloaded project:

    cd gcs-client/python
  3. Install the Cloud Storage client library into the demo project. You need this to run the sample, both locally and in production on App Engine:

    pip install GoogleAppEngineCloudStorageClient -t demo/lib

Viewing the project layout and files

The cloned project (or download) contains two subdirectories:

  • python, which contains the Cloud Storage client library and sample for Python.
  • java, which contains the Cloud Storage client library and sample for Java.

This is what the project looks like:

(Figure: sample code layout)

Here are some of the files you'll be learning about:

  • __init__.py: Normal init file to make Python treat this directory as containing a package.
  • app.yaml: Used to specify your application's configuration.
  • blobstore.py: Ignore this file. It's supplied as a convenience for developers who need to support an older technology. Blobstore is superseded by Cloud Storage.
  • main.py: The main code file that does all the application work.
  • cloudstorage/: Directory containing the Cloud Storage client library. Required to run locally and in deployment.

Running locally

To run the sample locally:

  1. In the project subdirectory gcs-client/python/, run the app by invoking the command:

    dev_appserver.py demo
  2. Wait for the success message, which looks something like this:

    INFO     2016-04-12 21:33:35,446 api_server.py:205] Starting API server at: http://localhost:36884
    INFO     2016-04-12 21:33:35,449 dispatcher.py:197] Starting module "default" running at: http://localhost:8080
    INFO     2016-04-12 21:33:35,449 admin_server.py:116] Starting admin server at: http://localhost:8000
  4. Visit this URL in your browser:

    http://localhost:8080

    The application executes on page load, displaying output in the browser that indicates what it has executed.

  4. Terminate the development server by pressing Control-C.

app.yaml walkthrough

The app.yaml file specifies application configuration details:

runtime: python27
api_version: 1
threadsafe: yes

handlers:
- url: /blobstore.*
  script: blobstore.app

- url: /.*
  script: main.app

For more information about the configuration options available in this file, see the app.yaml reference.

Imports walkthrough

The main.py file contains the typical imports used for accessing Cloud Storage via the client library:

import logging
import os
import cloudstorage as gcs
import webapp2

from google.appengine.api import app_identity

You need the os module and the app_identity API to get the default bucket name at runtime. You'll need this bucket name for all Cloud Storage access operations.

The sample also uses the webapp2 web framework.
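
The tutorial goals also mention retries. In the client library, retry behavior is controlled through gcs.RetryParams. As a minimal sketch, library-wide defaults can be set near the top of main.py after the imports; the parameter values below are illustrative, not necessarily the demo's exact settings:

# Illustrative retry defaults; individual calls can still override them.
my_default_retry_params = gcs.RetryParams(initial_delay=0.2,
                                          max_delay=5.0,
                                          backoff_factor=2,
                                          max_retry_period=15)
gcs.set_default_retry_params(my_default_retry_params)

A retry_params argument passed to an individual call, such as the write_retry_params used later in create_file, overrides these defaults for that particular file handle.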

Specifying the Cloud Storage bucket

Before doing any operations in Cloud Storage, you need to supply the bucket name. The easiest way to do this is to use the default bucket for your project, which can be obtained as follows:

def get(self):
  bucket_name = os.environ.get('BUCKET_NAME',
                               app_identity.get_default_gcs_bucket_name())

  self.response.headers['Content-Type'] = 'text/plain'
  self.response.write('Demo GCS Application running from Version: '
                      + os.environ['CURRENT_VERSION_ID'] + '\n')
  self.response.write('Using bucket name: ' + bucket_name + '\n\n')
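
The get method above belongs to a webapp2 request handler. The following is a minimal sketch, with an illustrative handler name and route, of how such a handler is wired up to the main.app object that app.yaml routes to, and of how the bucket name is turned into the /bucket/object paths that the client library expects:

class MainPage(webapp2.RequestHandler):  # illustrative name; the demo's handler may differ
  def get(self):
    bucket_name = os.environ.get('BUCKET_NAME',
                                 app_identity.get_default_gcs_bucket_name())
    # The client library addresses objects as '/bucket_name/object_name'.
    filename = '/' + bucket_name + '/demo-testfile'
    self.response.headers['Content-Type'] = 'text/plain'
    self.response.write('Using file name: ' + filename + '\n')

# app.yaml's "script: main.app" resolves to this WSGI application object.
app = webapp2.WSGIApplication([('/', MainPage)], debug=True)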

Writing a file to Cloud Storage

The following sample shows how to write to the bucket:

def create_file(self, filename):
  """Create a file.

  The retry_params specified in the open call will override the default
  retry params for this particular file handle.

  Args:
    filename: filename.
  """
  self.response.write('Creating file %s\n' % filename)

  write_retry_params = gcs.RetryParams(backoff_factor=1.1)
  gcs_file = gcs.open(filename,
                      'w',
                      content_type='text/plain',
                      options={'x-goog-meta-foo': 'foo',
                               'x-goog-meta-bar': 'bar'},
                      retry_params=write_retry_params)
  gcs_file.write('abcde\n')
  gcs_file.write('f'*1024*4 + '\n')
  gcs_file.close()
  self.tmp_filenames_to_clean_up.append(filename)

Notice that in the call to open the file for write, the sample specifies certain Cloud Storage headers that write custom metadata for the file; this metadata can be retrieved using cloudstorage.stat. You can find the list of supported headers in the cloudstorage.open reference.

Notice also that the x-goog-acl header is not set. That means the default Cloud Storage ACL of public read is going to be applied to the object when it is written to the bucket.

Finally, notice the call to close the file after you finish the write. If you don't do this, the file is not written to Cloud Storage. Be aware that after you call close, you cannot append to the file. If you need to modify a file, you'll have to open the file again in write mode, which does an overwrite, not an append.
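
As a quick illustration of reading that metadata back (this snippet is not part of the demo app, and the object path is a placeholder), cloudstorage.stat returns a GCSFileStat object whose metadata attribute includes the x-goog-meta-* headers written above:

file_stat = gcs.stat('/your-bucket/demo-testfile')       # placeholder object path
logging.info('content type: %s', file_stat.content_type)
logging.info('size in bytes: %d', file_stat.st_size)
logging.info('custom metadata: %r', file_stat.metadata)  # the x-goog-meta-* values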

Reading a file from Cloud Storage

The following sample shows how to read a file from the bucket:

def read_file(self, filename):
  self.response.write('Abbreviated file content (first line and last 1K):\n')

  gcs_file = gcs.open(filename)
  self.response.write(gcs_file.readline())
  gcs_file.seek(-1024, os.SEEK_END)
  self.response.write(gcs_file.read())
  gcs_file.close()

The sample shows how to display selected parts of the file being read: in this case, the first line and the last 1 KB, using seek.

Notice that no mode is specified in the code above when the file is opened for read. The default for open is read-only mode.
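
If you prefer to be explicit, the read mode and per-call retry parameters can also be passed to open; a minimal sketch with a placeholder object path:

read_retry_params = gcs.RetryParams(initial_delay=0.2, max_retries=3)
gcs_file = gcs.open('/your-bucket/demo-testfile',  # placeholder object path
                    'r',                           # same as the default mode
                    retry_params=read_retry_params)
first_line = gcs_file.readline()
gcs_file.close()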

Listing bucket contents

The sample code shows how to page through a bucket with a large number of files, using the marker and max_keys parameters to page through a list of the contents of the bucket:

def list_bucket(self, bucket):
  """Create several files and paginate through them.

  Production apps should set page_size to a practical value.

  Args:
    bucket: bucket.
  """
  self.response.write('Listbucket result:\n')

  page_size = 1
  stats = gcs.listbucket(bucket + '/foo', max_keys=page_size)
  while True:
    count = 0
    for stat in stats:
      count += 1
      self.response.write(repr(stat))
      self.response.write('\n')

    if count != page_size or count == 0:
      break
    stats = gcs.listbucket(bucket + '/foo', max_keys=page_size,
                           marker=stat.filename)

Note that the complete file name is displayed as one string without directory delimiters. If you want to display the file with its more recognizable directory hierarchy, set the delimiter parameter to the directory delimiter you want to use.
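
As a minimal sketch of that directory-style listing (the bucket path is a placeholder, and this snippet is not the demo's own listing code), pass delimiter to listbucket; entries that represent a common prefix have is_dir set to True:

stats = gcs.listbucket('/your-bucket/', delimiter='/')
for stat in stats:
  if stat.is_dir:
    logging.info('prefix: %s', stat.filename)
  else:
    logging.info('object: %s', stat.filename)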

Deleting files

The following code sample shows how to delete files; in this case, it deletes all of the files added during application execution. You wouldn't typically do this in your own code; it's included here only so that the sample cleans up after itself:

def delete_files(self):
  self.response.write('Deleting files...\n')
  for filename in self.tmp_filenames_to_clean_up:
    self.response.write('Deleting file %s\n' % filename)
    try:
      gcs.delete(filename)
    except gcs.NotFoundError:
      pass

Deploying the sample

To deploy and run the demo sample on App Engine:

  1. Upload the demo sample app by running the following command from within the gcs-client/python/demo directory where the app.yaml file is located:

    gcloud app deploy

    Optional flags:

    • Include the --project flag to specify an alternate Cloud Platform Console project ID from the one you initialized as the default in the gcloud tool. Example: --project [YOUR_PROJECT_ID]
    • Include the -v flag to specify a version ID, otherwise one is generated for you. Example: -v [YOUR_VERSION_ID]

    Tip: If you specify a version ID of an application that was previously uploaded, your deployment will overwrite the existing version on App Engine. This isn't always desirable, especially if the version on App Engine is serving traffic. To avoid disrupting traffic to your application, you can deploy your application with a different version ID and then move traffic to that version. For more information about moving traffic, see traffic splitting.

  2. After the deployment process completes, you can view the application at https://[YOUR_PROJECT_ID].appspot.com by running the following command:

    gcloud app browse

    The demo app executes on page load, just as it did when you ran it locally. However, now the app will actually write to and read from your Cloud Storage bucket.

To learn more about deploying your app from the command line, see Deploying a Python App.
