App Engine and Google Cloud Storage Sample

Learn how to enable Cloud Storage access for your App Engine Python app and how to create, write, read, and list files in a Cloud Storage bucket.

The tutorial assumes that you are familiar with Python and have set up your development environment.

When this sample runs, it executes a script and writes the output to the browser. The script demonstrates the following features of the Cloud Storage client library:

  • Creating a file and writing the file to a bucket.
  • Reading the file and obtaining its file metadata.
  • Creating several files in the bucket.
  • Listing the files just added to the bucket.
  • Reading that same set of files.
  • Deleting that set of files.

In this tutorial, you will do the following:
  • Walk through the Python project to view the required layout and files.
  • Understand the code for connecting to Cloud Storage.
  • Understand the code for creating, writing, reading, listing, and deleting files.
  • Understand the code for retries.
  • Build and test the app in your local development server.
  • Deploy the app to production on Google App Engine.


App Engine has a free level of usage. If your total usage of App Engine is less than the limits specified in the App Engine free quota, there is no charge for doing this tutorial.

Before you begin

Before you can run this sample, you need a project ID, the gcloud command-line tool, and a Cloud Storage bucket:

  1. Create a new Google Cloud console project or retrieve the project ID of an existing project from the Google Cloud console:

    Go to the Projects page

  2. Install and then initialize the Google Cloud CLI:

    Download the SDK

  3. Activate the default Cloud Storage bucket.

Cloning the tutorial project

To clone the project:

  1. Clone the client library and sample (demo) app to your local machine.

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git

    Alternatively, you can download the sample as a zip file and extract it.

  2. Navigate to the appropriate directory in the cloned or downloaded project:

    cd python-docs-samples/appengine/standard/storage/appengine-client

Installing dependencies

The virtualenv tool lets you create a clean Python environment on your system. For App Engine development, this helps ensure the code that you test locally is similar to the environment that your code will be deployed to. To learn more, read Using third-party libraries.

To install virtualenv and the sample's dependencies:

Mac OS / Linux

  1. If you don't have virtualenv, install it system-wide using pip.
    sudo pip install virtualenv
  2. Create an isolated Python environment:
    virtualenv env
    source env/bin/activate
  3. If you're not already in the directory that contains the sample code, navigate to the python-docs-samples/appengine/standard/storage/appengine-client directory. Then install dependencies:
    pip install -t lib -r requirements.txt

Windows

If you have installed the Google Cloud CLI, you should already have Python 2.7 installed, typically in C:\python27_x64\ (on 64-bit systems). Use PowerShell to run your Python commands.

  1. Locate your installation of PowerShell.
  2. Right-click on the shortcut to PowerShell and start it as an administrator.
  3. Try running the python command. If it's not found, add your Python folder to your environment's PATH.
    $env:Path += ";C:\python27_x64\"
  4. If you don't have virtualenv, install it system-wide using pip:
    python -m pip install virtualenv
  5. Create an isolated Python environment.
    python -m virtualenv env
    . env\Scripts\activate
  6. If you're not already in the directory that contains the sample code, navigate to the python-docs-samples/appengine/standard/storage/appengine-client directory. Then install dependencies:
    python -m pip install -t lib -r requirements.txt

The sample code that you cloned or downloaded already contains an appengine_config.py file, which is required to instruct App Engine to load dependencies from the lib folder, both locally and when deployed.
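For reference, the standard vendoring pattern for that file, per App Engine's third-party library documentation, looks like the following. Treat this as a sketch of the pattern rather than the sample's exact file contents:

```python
# appengine_config.py
from google.appengine.ext import vendor

# Add any libraries installed in the "lib" folder to the import path.
vendor.add('lib')
```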

Running locally

To run the sample locally:

  1. In the project subdirectory python-docs-samples/appengine/standard/storage/appengine-client, run the app in the local development server:

    python3 CLOUD_SDK_ROOT/bin/dev_appserver.py .
  2. Wait for the success message, which looks something like this:

    INFO     2016-04-12 21:33:35,446] Starting API server at: http://localhost:36884
    INFO     2016-04-12 21:33:35,449] Starting module "default" running at: http://localhost:8080
    INFO     2016-04-12 21:33:35,449] Starting admin server at: http://localhost:8000
  3. Visit this URL in your browser:

    http://localhost:8080

    The application executes on page load, writing output to the browser to indicate what it has executed.


  4. Stop the development server by pressing Control-C.

app.yaml walkthrough

The app.yaml file specifies application configuration details:

runtime: python27
api_version: 1
threadsafe: yes

handlers:
- url: /blobstore.*
  script: blobstore.app

- url: /.*
  script: main.app
For more information about the configuration options available in this file, see the app.yaml reference.

Imports walkthrough

The main.py file contains the typical imports used for accessing Cloud Storage via the client library:

import os

import cloudstorage
from google.appengine.api import app_identity

import webapp2

You need the os module and the app_identity API to get the default bucket name at runtime. You'll need this bucket name for all Cloud Storage access operations.

The sample also uses the webapp2 web framework.

Specifying the Cloud Storage bucket

Before doing any operations in Cloud Storage, you need to supply the bucket name. The easiest way to do this is to use the default bucket for your project, which can be obtained as follows:

def get(self):
    bucket_name = os.environ.get(
        'BUCKET_NAME', app_identity.get_default_gcs_bucket_name())

    self.response.headers['Content-Type'] = 'text/plain'
    self.response.write(
        'Demo GCS Application running from Version: {}\n'.format(
            os.environ['CURRENT_VERSION_ID']))
    self.response.write('Using bucket name: {}\n\n'.format(bucket_name))
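The environment-variable fallback used above can be exercised on its own, outside App Engine; a minimal sketch (the bucket names here are placeholders, not values from the sample):

```python
import os

def resolve_bucket(default_bucket):
    # Prefer an explicit BUCKET_NAME override; otherwise fall back
    # to the supplied default (on App Engine, the project's default bucket).
    return os.environ.get('BUCKET_NAME', default_bucket)

os.environ.pop('BUCKET_NAME', None)
default_result = resolve_bucket('my-project.appspot.com')

os.environ['BUCKET_NAME'] = 'override-bucket'
override_result = resolve_bucket('my-project.appspot.com')

print(default_result)   # the fallback value
print(override_result)  # the explicit override
```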

Writing a file to Cloud Storage

The following sample shows how to write to the bucket:

def create_file(self, filename):
    """Create a file."""

    self.response.write('Creating file {}\n'.format(filename))

    # The retry_params specified in the open call will override the default
    # retry params for this particular file handle.
    write_retry_params = cloudstorage.RetryParams(backoff_factor=1.1)
    with cloudstorage.open(
        filename, 'w', content_type='text/plain', options={
            'x-goog-meta-foo': 'foo', 'x-goog-meta-bar': 'bar'},
            retry_params=write_retry_params) as cloudstorage_file:
        cloudstorage_file.write('abcde\n')
        cloudstorage_file.write('f'*1024*4 + '\n')
    self.tmp_filenames_to_clean_up.append(filename)
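The backoff_factor passed to RetryParams controls how quickly the delay between retries grows. A rough illustration of that schedule follows; the function and parameter names here are illustrative, not the library's internals:

```python
def backoff_delays(initial_delay, backoff_factor, max_delay, max_retries):
    # Yield the wait before each retry: geometric growth, capped at max_delay.
    delay = initial_delay
    for _ in range(max_retries):
        yield min(delay, max_delay)
        delay *= backoff_factor

delays = list(backoff_delays(initial_delay=0.1, backoff_factor=2.0,
                             max_delay=1.0, max_retries=5))
print(delays)  # [0.1, 0.2, 0.4, 0.8, 1.0]
```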

Notice that in the call to open the file for write, the sample specifies certain Cloud Storage headers that write custom metadata for the file; this metadata can be retrieved using cloudstorage.stat. You can find the list of supported headers in the reference.

Notice also that the x-goog-acl header is not set. That means the default Cloud Storage ACL of public read is going to be applied to the object when it is written to the bucket.
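If you did want a different ACL, you could set the x-goog-acl header yourself in the same options dictionary passed to the open call. A sketch of such a dictionary; 'project-private' is one of Cloud Storage's predefined (canned) ACL values, and the metadata key is illustrative:

```python
# Options dict for a write that pins an explicit canned ACL instead of
# relying on the bucket's default ACL.
options = {
    'x-goog-acl': 'project-private',
    'x-goog-meta-foo': 'foo',
}
print(options['x-goog-acl'])
```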

Finally, notice the call to close the file after you finish the write. If you don't do this, the file is not written to Cloud Storage. Be aware that after you call close, you cannot append to the file. If you need to modify a file, you'll have to open the file again in write mode, which does an overwrite, not an append.
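Ordinary local files behave the same way, which makes the overwrite semantics easy to see. A minimal sketch using plain open, standing in for the Cloud Storage file handle:

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'demo.txt')

with open(path, 'w') as f:   # first write
    f.write('first version\n')

# Reopening in 'w' mode truncates the file: an overwrite, not an append.
with open(path, 'w') as f:
    f.write('second\n')

with open(path) as f:
    final = f.read()

print(final)  # only the second write survives
```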

Reading a file from Cloud Storage

The following sample shows how to read a file from the bucket:

def read_file(self, filename):
    self.response.write(
        'Abbreviated file content (first line and last 1K):\n')

    with cloudstorage.open(filename) as cloudstorage_file:
        self.response.write(cloudstorage_file.readline())
        cloudstorage_file.seek(-1024, os.SEEK_END)
        self.response.write(cloudstorage_file.read())

The sample shows how to display selected portions of the file being read: in this case, the opening line and the last 1 KB, using seek.
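The same readline/seek pattern can be tried against an in-memory buffer; a minimal sketch using io.BytesIO in place of a Cloud Storage file handle:

```python
import io
import os

# File contents: a short first line, then ~4 KB of filler.
data = b'header line\n' + b'f' * 4096 + b'\n'
f = io.BytesIO(data)

first_line = f.readline()     # read the opening line
f.seek(-1024, os.SEEK_END)    # jump to 1 KB before the end of the file
tail = f.read()               # read the final kilobyte

print(first_line)   # b'header line\n'
print(len(tail))    # 1024
```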

Notice that no mode is specified in the code above when the file is opened for read. The default for open is read-only mode.

Listing bucket contents

The sample code shows how to page through a bucket that contains a large number of files, using the marker and max_keys parameters of listbucket:

def list_bucket(self, bucket):
    """Create several files and paginate through them."""

    self.response.write('Listbucket result:\n')

    # Production apps should set page_size to a practical value.
    page_size = 1
    stats = cloudstorage.listbucket(bucket + '/foo', max_keys=page_size)
    while True:
        count = 0
        for stat in stats:
            count += 1
            self.response.write(repr(stat))
            self.response.write('\n')

        if count != page_size or count == 0:
            break
        stats = cloudstorage.listbucket(
            bucket + '/foo', max_keys=page_size, marker=stat.filename)

Note that the complete file name is displayed as one string without directory delimiters. If you want to display the file with its more recognizable directory hierarchy, set the delimiter parameter to the directory delimiter you want to use.
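The marker-based paging loop above can be mimicked with a plain list; a minimal sketch of the semantics, where each page returns up to max_keys names sorting strictly after the marker:

```python
def list_page(names, max_keys, marker=None):
    # Return up to max_keys names that sort strictly after `marker`,
    # mirroring listbucket's marker/max_keys paging semantics.
    names = sorted(names)
    if marker is not None:
        names = [n for n in names if n > marker]
    return names[:max_keys]

objects = ['foo/a.txt', 'foo/b.txt', 'foo/c.txt']

page1 = list_page(objects, max_keys=2)
page2 = list_page(objects, max_keys=2, marker=page1[-1])

print(page1)  # ['foo/a.txt', 'foo/b.txt']
print(page2)  # ['foo/c.txt']
```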

Deleting files

The code sample shows how to delete files: in this case, all of the files added during the application's execution. You typically wouldn't do this in your own code; it's a cleanup feature of this sample:

def delete_files(self):
    self.response.write('Deleting files...\n')
    for filename in self.tmp_filenames_to_clean_up:
        self.response.write('Deleting file {}\n'.format(filename))
        try:
            cloudstorage.delete(filename)
        except cloudstorage.NotFoundError:
            pass

Deploying the sample

To deploy and run the sample on App Engine:

  1. Upload the sample app by running the following command from within the python-docs-samples/appengine/standard/storage/appengine-client directory where the app.yaml file is located:

    gcloud app deploy

    Optional flags:

    • Include the --project flag to specify an alternate Google Cloud console project ID from the one you initialized as the default in the gcloud CLI. Example: --project [YOUR_PROJECT_ID]
    • Include the -v flag to specify a version ID; otherwise one is generated for you. Example: -v [YOUR_VERSION_ID]
  2. After the deployment process completes, you can view the application in your browser by running the following command:

    gcloud app browse

    The demo app executes on page load, just as it did when you ran it locally. However, now the app will actually write to and read from your Cloud Storage bucket.

To learn more about deploying your app from the command line, see Deploying a Python 2 App.

What's next