Constructor

File

new File(bucket, name, options)

A File object is created from your Bucket object using Bucket#file.

Parameter

bucket

Bucket

The Bucket instance this file is attached to.

name

string

The name of the remote file.

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

encryptionKey

Optional

string

A custom encryption key.

generation

Optional

number

Generation to scope the file to.

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');
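
The options object shown above is passed as the second argument to Bucket#file. A minimal sketch of scoping a File to a specific generation (the generation number below is only a placeholder):

var fileAtGeneration = myBucket.file('my-file', {
  generation: 1234567890123456
});

// Requests made through `fileAtGeneration` now target that generation of the
// object rather than the latest one.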

Property

acl

Cloud Storage uses access control lists (ACLs) to manage object and bucket access. ACLs are the mechanism you use to share objects with other users and allow other users to access your buckets and objects.

An ACL consists of one or more entries, where each entry grants permissions to an entity. Permissions define the actions that can be performed against an object or bucket (for example, READ or WRITE); the entity defines who the permission applies to (for example, a specific user or group of users).

The acl object on a File instance provides methods to get the ACLs defined on your file, as well as to set, update, and delete them.

Mixes in
Acl
See also

About Access Control lists

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');
//-
// Make a file publicly readable.
//-
var options = {
  entity: 'allUsers',
  role: storage.acl.READER_ROLE
};

file.acl.add(options, function(err, aclObject) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.acl.add(options).then(function(data) {
  var aclObject = data[0];
  var apiResponse = data[1];
});
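
The mixed-in Acl methods can also read entries back. A minimal sketch, assuming the standard Acl#get form from the mixin:

file.acl.get(function(err, aclObjects) {
  // `aclObjects` is an array of the ACL entries currently set on the file.
});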

Methods

copy

copy(destination, options, callback) returns Promise containing CopyResponse

Copy this file to another file. By default, this will copy the file to the same bucket, but you can choose to copy it to another Bucket by providing a Bucket or File object or a URL starting with "gs://".

Parameter

destination

(string, Bucket, or File)

Destination file.

options

Optional

object

Configuration options. See an Object resource.

Values in options have the following properties:

Parameter

token

Optional

string

A previously-returned rewriteToken from an unfinished rewrite request.

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

CopyCallback

Callback function.

See also

Objects: copy API Documentation

Throws

Error 

If the destination file is not provided.

Returns

Promise containing CopyResponse 

Example

var storage = require('@google-cloud/storage')();

//-
// You can pass in a variety of types for the destination.
//
// For all of the below examples, assume we are working with the following
// Bucket and File objects.
//-
var bucket = storage.bucket('my-bucket');
var file = bucket.file('my-image.png');

//-
// If you pass in a string for the destination, the file is copied to its
// current bucket, under the new name provided.
//-
file.copy('my-image-copy.png', function(err, copiedFile, apiResponse) {
  // `my-bucket` now contains:
  // - "my-image.png"
  // - "my-image-copy.png"

  // `copiedFile` is an instance of a File object that refers to your new
  // file.
});

//-
// If you pass in a string starting with "gs://" for the destination, the
// file is copied to the other bucket and under the new name provided.
//-
var newLocation = 'gs://another-bucket/my-image-copy.png';
file.copy(newLocation, function(err, copiedFile, apiResponse) {
  // `my-bucket` still contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-image-copy.png"

  // `copiedFile` is an instance of a File object that refers to your new
  // file.
});

//-
// If you pass in a Bucket object, the file will be copied to that bucket
// using the same name.
//-
var anotherBucket = storage.bucket('another-bucket');
file.copy(anotherBucket, function(err, copiedFile, apiResponse) {
  // `my-bucket` still contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-image.png"

  // `copiedFile` is an instance of a File object that refers to your new
  // file.
});

//-
// If you pass in a File object, you have complete control over the new
// bucket and filename.
//-
var anotherFile = anotherBucket.file('my-awesome-image.png');
file.copy(anotherFile, function(err, copiedFile, apiResponse) {
  // `my-bucket` still contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-awesome-image.png"

  // Note:
  // The `copiedFile` parameter is equal to `anotherFile`.
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.copy(newLocation).then(function(data) {
  var newFile = data[0];
  var apiResponse = data[1];
});
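
The options.token parameter is for resuming a copy whose underlying rewrite was interrupted. A hedged sketch, assuming you saved the rewriteToken returned by the earlier, unfinished request (the value below is only a placeholder):

var previousToken = 'rewrite-token-from-earlier-attempt';

file.copy(newLocation, { token: previousToken }, function(err, copiedFile, apiResponse) {
  // The rewrite continues from where the earlier attempt stopped rather than
  // starting over.
});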

Another example:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the source bucket, e.g. "my-bucket"
// const srcBucketName = "my-bucket";

// The name of the source file, e.g. "file.txt"
// const srcFilename = "file.txt";

// The destination bucket, e.g. "my-other-bucket"
// const destBucketName = "my-other-bucket";

// The destination filename, e.g. "file.txt"
// const destFilename = "file.txt";

// Instantiates a client
const storage = Storage();

// Copies the file to the other bucket
storage
  .bucket(srcBucketName)
  .file(srcFilename)
  .copy(storage.bucket(destBucketName).file(destFilename))
  .then(() => {
    console.log(
      `gs://${srcBucketName}/${srcFilename} copied to gs://${destBucketName}/${destFilename}.`
    );
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

createReadStream

createReadStream(options) returns ReadableStream

Create a readable stream to read the contents of the remote file. It can be piped to a writable stream or listened to for 'data' events to read a file's contents.

In the unlikely event there is a mismatch between what you downloaded and the version in your Bucket, your error handler will receive an error with code "CONTENT_DOWNLOAD_MISMATCH". If you receive this error, the best recourse is to try downloading the file again.

For faster crc32c computation, you must manually install fast-crc32c:

$ npm install --save fast-crc32c

NOTE: Readable streams will emit the end event when the file is fully downloaded.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

validation

Optional

(string or boolean)

Possible values: "md5", "crc32c", or false. By default, data integrity is validated with a CRC32c checksum. You may use MD5 if preferred, but that hash is not supported for composite objects. An error will be raised if MD5 is specified but is not available. You may also choose to skip validation completely; however, this is not recommended.

start

Optional

number

A byte offset to begin the file's download from. Default is 0. NOTE: Byte ranges are inclusive; that is, options.start = 0 and options.end = 999 represent the first 1000 bytes in a file or object. NOTE: when specifying a byte range, data integrity is not available.

end

Optional

number

A byte offset to stop reading the file at. NOTE: Byte ranges are inclusive; that is, options.start = 0 and options.end = 999 represent the first 1000 bytes in a file or object. NOTE: when specifying a byte range, data integrity is not available.

Returns

ReadableStream 

Example

//-
// <h4>Downloading a File</h4>
//
// The example below demonstrates how we can reference a remote file, then
// pipe its contents to a local file. This is effectively creating a local
// backup of your remote data.
//-
var storage = require('@google-cloud/storage')();
var bucket = storage.bucket('my-bucket');

var fs = require('fs');
var remoteFile = bucket.file('image.png');
var localFilename = '/Users/stephen/Photos/image.png';

remoteFile.createReadStream()
  .on('error', function(err) {})
  .on('response', function(response) {
    // Server connected and responded with the specified status and headers.
  })
  .on('end', function() {
    // The file is fully downloaded.
  })
  .pipe(fs.createWriteStream(localFilename));

//-
// To limit the downloaded data to only a byte range, pass an options object.
//-
var logFile = bucket.file('access_log');
logFile.createReadStream({
    start: 10000,
    end: 20000
  })
  .on('error', function(err) {})
  .pipe(fs.createWriteStream('/Users/stephen/logfile.txt'));

//-
// To read a tail byte range, specify only `options.end` as a negative
// number.
//-
var logFile = bucket.file('access_log');
logFile.createReadStream({
    end: -100
  })
  .on('error', function(err) {})
  .pipe(fs.createWriteStream('/Users/stephen/logfile.txt'));
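
The validation option can also be set per read; for example, to validate with MD5 instead of CRC32c, or to skip validation entirely (not recommended). A short sketch using the objects defined above:

remoteFile.createReadStream({ validation: 'md5' })
  .on('error', function(err) {})
  .pipe(fs.createWriteStream(localFilename));

remoteFile.createReadStream({ validation: false })
  .on('error', function(err) {})
  .pipe(fs.createWriteStream(localFilename));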

createResumableUpload

createResumableUpload(options, callback) returns Promise containing CreateResumableUploadResponse

Create a unique resumable upload session URI. This is the first step when performing a resumable upload.

See the Resumable upload guide for more on how the entire process works.

Note

If you are just looking to perform a resumable upload without worrying about any of the details, see File#createWriteStream. Resumable uploads are performed by default.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

metadata

Optional

object

Metadata to set on the file.

origin

Optional

string

Origin header to set for the upload.

predefinedAcl

Optional

string

Apply a predefined set of access controls to this object.

Acceptable values are:
- **`authenticatedRead`** - Object owner gets `OWNER` access, and
  `allAuthenticatedUsers` get `READER` access.

- **`bucketOwnerFullControl`** - Object owner gets `OWNER` access, and
  project team owners get `OWNER` access.

- **`bucketOwnerRead`** - Object owner gets `OWNER` access, and project
  team owners get `READER` access.

- **`private`** - Object owner gets `OWNER` access.

- **`projectPrivate`** - Object owner gets `OWNER` access, and project
  team members get access according to their roles.

- **`publicRead`** - Object owner gets `OWNER` access, and `allUsers` get
  `READER` access.

private

Optional

boolean

Make the uploaded file private. (Alias for options.predefinedAcl = 'private')

public

Optional

boolean

Make the uploaded file public. (Alias for options.predefinedAcl = 'publicRead')

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

CreateResumableUploadCallback

Callback function.

See also

Resumable upload guide

Returns

Promise containing CreateResumableUploadResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');
file.createResumableUpload(function(err, uri) {
  if (!err) {
    // `uri` can be used to PUT data to.
  }
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.createResumableUpload().then(function(data) {
  var uri = data[0];
});
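
The session URI can later be handed to File#createWriteStream through its uri option (documented below) so the upload continues against the same session. A minimal sketch, assuming a local file at a path of your choosing:

var fs = require('fs');

file.createResumableUpload(function(err, uri) {
  if (err) {
    return;
  }

  fs.createReadStream('/local/path/to/data')
    .pipe(file.createWriteStream({ uri: uri }))
    .on('error', function(err) {})
    .on('finish', function() {
      // The upload that reused the pre-created session URI is complete.
    });
});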

createWriteStream

createWriteStream(options) returns WritableStream

Create a writable stream to overwrite the contents of the file in your bucket.

A File object can also be used to create files for the first time.

Resumable uploads are automatically enabled and must be shut off explicitly by setting options.resumable to false.

There is some overhead when using a resumable upload that can cause noticeable performance degradation while uploading a series of small files. When uploading files smaller than 10MB, it is recommended that the resumable feature be disabled.

For faster crc32c computation, you must manually install fast-crc32c:

$ npm install --save fast-crc32c

NOTE: Writable streams will emit the finish event when the file is fully uploaded.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

gzip

Optional

boolean

Automatically gzip the file. This will set options.metadata.contentEncoding to gzip.

metadata

Optional

object

See the examples below or Objects: insert request body for more details.

offset

Optional

number

The starting byte of the upload stream, for resuming an interrupted upload. Defaults to 0.

predefinedAcl

Optional

string

Apply a predefined set of access controls to this object.

Acceptable values are:
- **`authenticatedRead`** - Object owner gets `OWNER` access, and
  `allAuthenticatedUsers` get `READER` access.

- **`bucketOwnerFullControl`** - Object owner gets `OWNER` access, and
  project team owners get `OWNER` access.

- **`bucketOwnerRead`** - Object owner gets `OWNER` access, and project
  team owners get `READER` access.

- **`private`** - Object owner gets `OWNER` access.

- **`projectPrivate`** - Object owner gets `OWNER` access, and project
  team members get access according to their roles.

- **`publicRead`** - Object owner gets `OWNER` access, and `allUsers` get
  `READER` access.

private

Optional

boolean

Make the uploaded file private. (Alias for options.predefinedAcl = 'private')

public

Optional

boolean

Make the uploaded file public. (Alias for options.predefinedAcl = 'publicRead')

resumable

Optional

boolean

Force a resumable upload. NOTE: When working with streams, the file format and size are unknown until the stream is completely consumed. Because of this, it's best to be explicit about what makes sense given your input.

uri

Optional

string

The URI for an already-created resumable upload. See File#createResumableUpload.

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

validation

Optional

(string or boolean)

Possible values: "md5", "crc32c", or false. By default, data integrity is validated with a CRC32c checksum. You may use MD5 if preferred, but that hash is not supported for composite objects. An error will be raised if MD5 is specified but is not available. You may also choose to skip validation completely; however, this is not recommended.

See also

Upload Options (Simple or Resumable)

Objects: insert API Documentation

Returns

WritableStream 

Example

var fs = require('fs');
var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

//-
// <h4>Uploading a File</h4>
//
// Now, consider a case where we want to upload a file to your bucket. You
// have the option of using {@link Bucket#upload}, but that is just
// a convenience method which will do the following.
//-
fs.createReadStream('/Users/stephen/Photos/birthday-at-the-zoo/panda.jpg')
  .pipe(file.createWriteStream())
  .on('error', function(err) {})
  .on('finish', function() {
    // The file upload is complete.
  });

//-
// <h4>Uploading a File with gzip compression</h4>
//-
fs.createReadStream('/Users/stephen/site/index.html')
  .pipe(file.createWriteStream({ gzip: true }))
  .on('error', function(err) {})
  .on('finish', function() {
    // The file upload is complete.
  });

//-
// Downloading the file with `createReadStream` will automatically decode the
// file.
//-

//-
// <h4>Uploading a File with Metadata</h4>
//
// One last case you may run into is when you want to upload a file to your
// bucket and set its metadata at the same time. Like above, you can use
// {@link Bucket#upload} to do this, which is just a wrapper around
// the following.
//-
fs.createReadStream('/Users/stephen/Photos/birthday-at-the-zoo/panda.jpg')
  .pipe(file.createWriteStream({
    metadata: {
      contentType: 'image/jpeg',
      metadata: {
        custom: 'metadata'
      }
    }
  }))
  .on('error', function(err) {})
  .on('finish', function() {
    // The file upload is complete.
  });
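
As noted above, for files smaller than about 10MB it can be preferable to skip the resumable machinery entirely; a short sketch:

fs.createReadStream('/Users/stephen/Photos/birthday-at-the-zoo/panda.jpg')
  .pipe(file.createWriteStream({ resumable: false }))
  .on('error', function(err) {})
  .on('finish', function() {
    // The simple (non-resumable) upload is complete.
  });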

delete

delete(options, callback) returns Promise containing DeleteFileResponse

Delete the file.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

DeleteFileCallback

Callback function.

See also

Objects: delete API Documentation

Returns

Promise containing DeleteFileResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');
file.delete(function(err, apiResponse) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.delete().then(function(data) {
  var apiResponse = data[0];
});

Another example:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the file to delete, e.g. "file.txt"
// const filename = "file.txt";

// Instantiates a client
const storage = Storage();

// Deletes the file from the bucket
storage
  .bucket(bucketName)
  .file(filename)
  .delete()
  .then(() => {
    console.log(`gs://${bucketName}/${filename} deleted.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

download

download(options, callback) returns Promise containing DownloadResponse

Convenience method to download a file into memory or to a local destination.

Parameter

options

Optional

object

Configuration options. The arguments match those passed to File#createReadStream.

Values in options have the following properties:

Parameter

destination

Optional

string

Local file path to write the file's contents to.

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

DownloadCallback

Callback function.

Returns

Promise containing DownloadResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

//-
// Download a file into memory. The contents will be available as the second
// argument in the demonstration below, `contents`.
//-
file.download(function(err, contents) {});

//-
// Download a file to a local destination.
//-
file.download({
  destination: '/Users/me/Desktop/file-backup.txt'
}, function(err) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.download().then(function(data) {
  var contents = data[0];
});

Another example:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the remote file to download, e.g. "file.txt"
// const srcFilename = "file.txt";

// The path to which the file should be downloaded, e.g. "./local/path/to/file.txt"
// const destFilename = "./local/path/to/file.txt";

// Instantiates a client
const storage = Storage();

const options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,
};

// Downloads the file
storage
  .bucket(bucketName)
  .file(srcFilename)
  .download(options)
  .then(() => {
    console.log(
      `gs://${bucketName}/${srcFilename} downloaded to ${destFilename}.`
    );
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

Example of downloading an encrypted file:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the remote file to download, e.g. "file_encrypted.txt"
// const srcFilename = "file_encrypted.txt";

// The path to which the file should be downloaded, e.g. "./file.txt"
// const destFilename = "./file.txt";

// Instantiates a client
const storage = Storage();

const options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,
};

// Decrypts and downloads the file. This can only be done with the key used
// to encrypt and upload the file.
storage
  .bucket(bucketName)
  .file(srcFilename)
  .setEncryptionKey(Buffer.from(key, 'base64'))
  .download(options)
  .then(() => {
    console.log(`File ${srcFilename} downloaded to ${destFilename}.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

Example of downloading a file where the requester pays:

// Imports the Google Cloud client library
const Storage = require(`@google-cloud/storage`);

// The project ID to bill from
// const projectId = process.env.GCLOUD_PROJECT;

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the remote file to download, e.g. "file.txt"
// const srcFilename = "file.txt";

// The path to which the file should be downloaded, e.g. "./local/path/to/file.txt"
// const destFilename = "./local/path/to/file.txt";

// Creates a client
const storage = Storage();

const options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,

  // The project to bill from, if requester-pays requests are enabled
  userProject: projectId,
};

// Downloads the file
storage
  .bucket(bucketName)
  .file(srcFilename)
  .download(options)
  .then(() => {
    console.log(
      `gs://${bucketName}/${srcFilename} downloaded to ${destFilename} using requester-pays requests.`
    );
  })
  .catch(err => {
    console.error(`ERROR:`, err);
  });

exists

exists(options, callback) returns Promise containing FileExistsResponse

Check if the file exists.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

FileExistsCallback

Callback function.

Returns

Promise containing FileExistsResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

file.exists(function(err, exists) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.exists().then(function(data) {
  var exists = data[0];
});

get

get(options, callback) returns Promise containing GetFileResponse

Get a file object and its metadata if it exists.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

GetFileCallback

Callback function.

Returns

Promise containing GetFileResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

file.get(function(err, file, apiResponse) {
  // `file.metadata` has been populated.
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.get().then(function(data) {
  var file = data[0];
  var apiResponse = data[1];
});

getMetadata

getMetadata(options, callback) returns Promise containing GetFileMetadataResponse

Get the file's metadata.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

GetFileMetadataCallback

Callback function.

See also

Objects: get API Documentation

Returns

Promise containing GetFileMetadataResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

file.getMetadata(function(err, metadata, apiResponse) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.getMetadata().then(function(data) {
  var metadata = data[0];
  var apiResponse = data[1];
});

Another example:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the file to access, e.g. "file.txt"
// const filename = "file.txt";

// Instantiates a client
const storage = Storage();

// Gets the metadata for the file
storage
  .bucket(bucketName)
  .file(filename)
  .getMetadata()
  .then(results => {
    const metadata = results[0];

    console.log(`File: ${metadata.name}`);
    console.log(`Bucket: ${metadata.bucket}`);
    console.log(`Storage class: ${metadata.storageClass}`);
    console.log(`ID: ${metadata.id}`);
    console.log(`Size: ${metadata.size}`);
    console.log(`Updated: ${metadata.updated}`);
    console.log(`Generation: ${metadata.generation}`);
    console.log(`Metageneration: ${metadata.metageneration}`);
    console.log(`Etag: ${metadata.etag}`);
    console.log(`Owner: ${metadata.owner}`);
    console.log(`Component count: ${metadata.component_count}`);
    console.log(`Crc32c: ${metadata.crc32c}`);
    console.log(`md5Hash: ${metadata.md5Hash}`);
    console.log(`Cache-control: ${metadata.cacheControl}`);
    console.log(`Content-type: ${metadata.contentType}`);
    console.log(`Content-disposition: ${metadata.contentDisposition}`);
    console.log(`Content-encoding: ${metadata.contentEncoding}`);
    console.log(`Content-language: ${metadata.contentLanguage}`);
    console.log(`Metadata: ${metadata.metadata}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

getSignedPolicy

getSignedPolicy(options, callback) returns Promise containing GetSignedPolicyResponse

Get a signed policy document to allow a user to upload data with a POST request.

Parameter

options

object

Configuration options.

Values in options have the following properties:

Parameter

equals

Optional

(array or array of arrays)

Array of request parameters and their expected value (e.g. [['$<field>', '<value>']]). Values are translated into equality constraints in the conditions field of the policy document (e.g. ['eq', '$<field>', '<value>']). If only one equality condition is to be specified, options.equals can be a one-dimensional array (e.g. ['$<field>', '<value>']).

expires

any type

A timestamp when this policy will expire. Any value given is passed to new Date().

startsWith

Optional

(array or array of arrays)

Array of request parameters and their expected prefixes (e.g. [['$<field>', '<value>']]). Values are translated into starts-with constraints in the conditions field of the policy document (e.g. ['starts-with', '$<field>', '<value>']). If only one prefix condition is to be specified, options.startsWith can be a one-dimensional array (e.g. ['$<field>', '<value>']).

acl

Optional

string

ACL for the object from possibly predefined ACLs.

successRedirect

Optional

string

The URL to which the user client is redirected if the upload is successful.

successStatus

Optional

string

The status of the Google Storage response if the upload is successful (must be a string).

contentLengthRange

Optional

object

contentLengthRange.min

Optional

number

Minimum value for the request's content length.

contentLengthRange.max

Optional

number

Maximum value for the request's content length.

callback

Optional

GetSignedPolicyCallback

Callback function.

See also

Policy Document Reference

Throws

Error 

If an expiration timestamp from the past is given.

Error 

If options.equals has an array with fewer or more than two members.

Error 

If options.startsWith has an array with fewer or more than two members.

Returns

Promise containing GetSignedPolicyResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');
var options = {
  equals: ['$Content-Type', 'image/jpeg'],
  expires: '10-25-2022',
  contentLengthRange: {
    min: 0,
    max: 1024
  }
};

file.getSignedPolicy(options, function(err, policy) {
  // policy.string: the policy document in plain text.
  // policy.base64: the policy document in base64.
  // policy.signature: the policy signature in base64.
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.getSignedPolicy(options).then(function(data) {
  var policy = data[0];
});
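
The returned document is typically embedded in an HTML form that POSTs directly to Cloud Storage. The sketch below is only illustrative; check the Policy Document Reference above for the exact form fields, and note that the GoogleAccessId value (your service account email) is a placeholder here:

file.getSignedPolicy(options, function(err, policy) {
  if (err) {
    return;
  }

  // Hypothetical fields for a browser-based POST upload form.
  var formFields = {
    key: file.name,
    GoogleAccessId: 'service-account@your-project.iam.gserviceaccount.com',
    policy: policy.base64,
    signature: policy.signature
  };
});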

getSignedUrl

getSignedUrl(config, callback) returns Promise containing GetSignedUrlResponse

Get a signed URL to allow limited time access to the file.

Parameter

config

object

Configuration object.

Values in config have the following properties:

Parameter

action

string

"read" (HTTP: GET), "write" (HTTP: PUT), or "delete" (HTTP: DELETE).

cname

Optional

string

The cname for this bucket, e.g., "https://cdn.example.com".

contentMd5

Optional

string

The MD5 digest value in base64. If you provide this, the client must provide this HTTP header with this same value in its request.

contentType

Optional

string

If you provide this value, the client must provide this HTTP header set to the same value.

expires

any type

A timestamp when this link will expire. Any value given is passed to new Date().

extensionHeaders

Optional

object

If these headers are used, the server will check to make sure that the client provides matching values.

promptSaveAs

Optional

string

The filename to prompt the user to save the file as when the signed url is accessed. This is ignored if config.responseDisposition is set.

responseDisposition

Optional

string

The response-content-disposition parameter of the signed url.

responseType

Optional

string

The response-content-type parameter of the signed url.

callback

Optional

GetSignedUrlCallback

Callback function.

See also

Signed URLs Reference

Throws

Error 

If an expiration timestamp from the past is given.

Returns

Promise containing GetSignedUrlResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

//-
// Generate a URL that allows temporary access to download your file.
//-
var request = require('request');

var config = {
  action: 'read',
  expires: '03-17-2025'
};

file.getSignedUrl(config, function(err, url) {
  if (err) {
    console.error(err);
    return;
  }

  // The file is now available to read from this URL.
  request(url, function(err, resp) {
    // resp.statusCode = 200
  });
});

//-
// Generate a URL to allow write permissions. This means anyone with this URL
// can send a PUT request with new data that will overwrite the file.
//-
file.getSignedUrl({
  action: 'write',
  expires: '03-17-2025'
}, function(err, url) {
  if (err) {
    console.error(err);
    return;
  }

  // The file is now available to be written to.
  var writeStream = request.put(url);
  writeStream.end('New data');

  writeStream.on('complete', function(resp) {
    // Confirm the new content was saved.
    file.download(function(err, fileContents) {
      console.log('Contents:', fileContents.toString());
      // Contents: New data
    });
  });
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.getSignedUrl(config).then(function(data) {
  var url = data[0];
});
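
The download-related options described above can shape how the signed URL behaves when it is visited; a hedged sketch that prompts the browser to save the file under a different name:

file.getSignedUrl({
  action: 'read',
  expires: '03-17-2025',
  promptSaveAs: 'my-file-backup.txt'
}, function(err, url) {
  // Visiting `url` should trigger a download named "my-file-backup.txt".
});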

Another example:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the file to access, e.g. "file.txt"
// const filename = "file.txt";

// Instantiates a client
const storage = Storage();

// These options will allow temporary read access to the file
const options = {
  action: 'read',
  expires: '03-17-2025',
};

// Get a signed URL for the file
storage
  .bucket(bucketName)
  .file(filename)
  .getSignedUrl(options)
  .then(results => {
    const url = results[0];

    console.log(`The signed url for ${filename} is ${url}.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

makePrivate

makePrivate(options, callback) returns Promise containing MakeFilePrivateResponse

Make a file private to the project and remove all other permissions. Set options.strict to true to make the file private to only the owner.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

strict

Optional

boolean

If true, set the file to be private to only the owner user. Otherwise, it will be private to the project.

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

MakeFilePrivateCallback

Callback function.

See also

Objects: patch API Documentation

Returns

Promise containing MakeFilePrivateResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

//-
// Set the file private so only project maintainers can see and modify it.
//-
file.makePrivate(function(err) {});

//-
// Set the file private so only the owner can see and modify it.
//-
file.makePrivate({ strict: true }, function(err) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.makePrivate().then(function(data) {
  var apiResponse = data[0];
});

makePublic

makePublic(callback) returns Promise containing MakeFilePublicResponse

Set a file to be publicly readable and maintain all previous permissions.

Parameter

callback

Optional

MakeFilePublicCallback

Callback function.

See also

ObjectAccessControls: insert API Documentation

Returns

Promise containing MakeFilePublicResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

file.makePublic(function(err, apiResponse) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.makePublic().then(function(data) {
  var apiResponse = data[0];
});

Another example:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the file to make public, e.g. "file.txt"
// const filename = "file.txt";

// Instantiates a client
const storage = Storage();

// Makes the file public
storage
  .bucket(bucketName)
  .file(filename)
  .makePublic()
  .then(() => {
    console.log(`gs://${bucketName}/${filename} is now public.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

move

move(destination, options, callback) returns Promise containing MoveResponse

Move this file to another location. By default, this will rename the file and keep it in the same bucket, but you can choose to move it to another Bucket by providing a Bucket or File object or a URL beginning with "gs://".

Warning: There is currently no atomic move method in the Cloud Storage API, so this method is a composition of File#copy (to the new location) and File#delete (from the old location). While unlikely, it is possible that an error returned to your callback could be triggered from either one of these API calls failing, which could leave a duplicate file lingering.

Parameter

destination

(string, Bucket, or File)

Destination file.

options

Optional

object

Configuration options. See an Object resource.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

MoveCallback

Callback function.

See also

Objects: copy API Documentation

Throws

Error 

If the destination file is not provided.

Returns

Promise containing MoveResponse 

Example

var storage = require('@google-cloud/storage')();
//-
// You can pass in a variety of types for the destination.
//
// For all of the below examples, assume we are working with the following
// Bucket and File objects.
//-
var bucket = storage.bucket('my-bucket');
var file = bucket.file('my-image.png');

//-
// If you pass in a string for the destination, the file is moved to its
// current bucket, under the new name provided.
//-
file.move('my-image-new.png', function(err, destinationFile, apiResponse) {
  // `my-bucket` no longer contains:
  // - "my-image.png"
  // but contains instead:
  // - "my-image-new.png"

  // `destinationFile` is an instance of a File object that refers to your
  // new file.
});

//-
// If you pass in a string starting with "gs://" for the destination, the
// file is copied to the other bucket and under the new name provided.
//-
var newLocation = 'gs://another-bucket/my-image-new.png';
file.move(newLocation, function(err, destinationFile, apiResponse) {
  // `my-bucket` no longer contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-image-new.png"

  // `destinationFile` is an instance of a File object that refers to your
  // new file.
});

//-
// If you pass in a Bucket object, the file will be moved to that bucket
// using the same name.
//-
var anotherBucket = storage.bucket('another-bucket');

file.move(anotherBucket, function(err, destinationFile, apiResponse) {
  // `my-bucket` no longer contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-image.png"

  // `destinationFile` is an instance of a File object that refers to your
  // new file.
});

//-
// If you pass in a File object, you have complete control over the new
// bucket and filename.
//-
var anotherFile = anotherBucket.file('my-awesome-image.png');

file.move(anotherFile, function(err, destinationFile, apiResponse) {
  // `my-bucket` no longer contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-awesome-image.png"

  // Note:
  // The `destinationFile` parameter is equal to `anotherFile`.
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.move('my-image-new.png').then(function(data) {
  var destinationFile = data[0];
  var apiResponse = data[1];
});

Another example:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the file to move, e.g. "file.txt"
// const srcFilename = "file.txt";

// The destination path for the file, e.g. "moved.txt"
// const destFilename = "moved.txt";

// Instantiates a client
const storage = Storage();

// Moves the file within the bucket
storage
  .bucket(bucketName)
  .file(srcFilename)
  .move(destFilename)
  .then(() => {
    console.log(
      `gs://${bucketName}/${srcFilename} moved to gs://${bucketName}/${destFilename}.`
    );
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

save

save(data, options, callback) returns Promise

Write arbitrary data to a file.

This is a convenience method which wraps File#createWriteStream.

Parameter

data

any type

The data to write to a file.

options

Optional

object

See File#createWriteStream's options parameter.

callback

Optional

SaveCallback

Callback function.

Returns

Promise 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');
var contents = 'This is the contents of the file.';

file.save(contents, function(err) {
  if (!err) {
    // File written successfully.
  }
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.save(contents).then(function() {});
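
Because options are passed through to File#createWriteStream, content type, custom metadata, and resumable behavior can all be set in the same call; a minimal sketch:

file.save(contents, {
  metadata: {
    contentType: 'text/plain'
  },
  resumable: false
}, function(err) {
  // `my-file` now holds `contents` and is marked as text/plain.
});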

setEncryptionKey

setEncryptionKey(encryptionKey) returns File

The Storage API allows you to use a custom key for server-side encryption.

Parameter

encryptionKey

(string or Buffer)

An AES-256 encryption key.

See also

Customer-supplied Encryption Keys

Returns

File 

Example

var crypto = require('crypto');
var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var encryptionKey = crypto.randomBytes(32);

var fileWithCustomEncryption = myBucket.file('my-file');
fileWithCustomEncryption.setEncryptionKey(encryptionKey);

var fileWithoutCustomEncryption = myBucket.file('my-file');

fileWithCustomEncryption.save('data', function(err) {
  // Try to download with the File object that hasn't had
  // `setEncryptionKey()` called:
  fileWithoutCustomEncryption.download(function(err) {
    // We will receive an error:
    //   err.message === 'Bad Request'

    // Try again with the File object we called `setEncryptionKey()` on:
    fileWithCustomEncryption.download(function(err, contents) {
      // contents.toString() === 'data'
    });
  });
});

Example of uploading an encrypted file:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the local file to upload, e.g. "./local/path/to/file.txt"
// const srcFilename = "./local/path/to/file.txt";

// The path to which the file should be uploaded, e.g. "file_encrypted.txt"
// const destFilename = "file.txt";

// Instantiates a client
const storage = Storage();

const options = {
  // The path to which the file should be uploaded, e.g. "file_encrypted.txt"
  destination: destFilename,
  // Encrypt the file with a customer-supplied key, e.g. "my-secret-key"
  encryptionKey: Buffer.from(key, 'base64'),
};

// Encrypts and uploads a local file, e.g. "./local/path/to/file.txt".
// The file will only be retrievable using the key used to upload it.
storage
  .bucket(bucketName)
  .upload(srcFilename, options)
  .then(() => {
    console.log(
      `File ${srcFilename} uploaded to gs://${bucketName}/${destFilename}.`
    );
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

Example of downloading an encrypted file:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the bucket to access, e.g. "my-bucket"
// const bucketName = "my-bucket";

// The name of the remote file to download, e.g. "file_encrypted.txt"
// const srcFilename = "file_encrypted.txt";

// The path to which the file should be downloaded, e.g. "./file.txt"
// const destFilename = "./file.txt";

// Instantiates a client
const storage = Storage();

const options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,
};

// Decrypts and downloads the file. This can only be done with the key used
// to encrypt and upload the file.
storage
  .bucket(bucketName)
  .file(srcFilename)
  .setEncryptionKey(Buffer.from(key, 'base64'))
  .download(options)
  .then(() => {
    console.log(`File ${srcFilename} downloaded to ${destFilename}.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

setMetadata

setMetadata(metadata, options, callback) returns Promise containing SetFileMetadataResponse

Merge the given metadata with the current remote file's metadata. This will set metadata if it was previously unset or update previously set metadata. To unset previously set metadata, set its value to null.

You can set custom key/value pairs in the metadata key of the given object; however, the other properties outside of this object must adhere to the official API documentation.

See the examples below for more information.

Parameter

metadata

Optional

object

The metadata you wish to update.

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

SetFileMetadataCallback

Callback function.

See also

Objects: patch API Documentation

Returns

Promise containing SetFileMetadataResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

var metadata = {
  contentType: 'application/x-font-ttf',
  metadata: {
    my: 'custom',
    properties: 'go here'
  }
};

file.setMetadata(metadata, function(err, apiResponse) {});

// Assuming current metadata = { hello: 'world', unsetMe: 'will do' }
file.setMetadata({
  metadata: {
    abc: '123', // will be set.
    unsetMe: null, // will be unset (deleted).
    hello: 'goodbye' // will be updated from 'hello' to 'goodbye'.
  }
}, function(err, apiResponse) {
  // metadata should now be { abc: '123', hello: 'goodbye' }
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.setMetadata(metadata).then(function(data) {
  var apiResponse = data[0];
});

setStorageClass

setStorageClass(storageClass, options, callback) returns Promise containing SetStorageClassResponse

Set the storage class for this file.

Parameter

storageClass

string

The new storage class. (multi_regional, regional, nearline, coldline)

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

SetStorageClassCallback

Callback function.

See also

Per-Object Storage Class

Storage Classes

Returns

Promise containing SetStorageClassResponse 

Example

file.setStorageClass('regional', function(err, apiResponse) {
  if (err) {
    // Error handling omitted.
  }

  // The storage class was updated successfully.
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.setStorageClass('regional').then(function() {});