Constructor

File

new File(bucket, name, options)

A File object is created from your Bucket object using Bucket#file.

Parameter

bucket

Bucket

The Bucket instance this file is attached to.

name

string

The name of the remote file.

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

encryptionKey

Optional

string

A custom encryption key.

generation

Optional

number

Generation to scope the file to.

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');

Property

acl

Cloud Storage uses access control lists (ACLs) to manage object and bucket access. ACLs are the mechanism you use to share objects with other users and allow other users to access your buckets and objects.

An ACL consists of one or more entries, where each entry grants permissions to an entity. Permissions define the actions that can be performed against an object or bucket (for example, READ or WRITE); the entity defines who the permission applies to (for example, a specific user or group of users).

The acl object on a File instance provides methods to get the ACL entries defined on your file, as well as to set, update, and delete them.

Mixes in
Acl
See also

About Access Control lists

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');
//-
// Make a file publicly readable.
//-
var options = {
  entity: 'allUsers',
  role: storage.acl.READER_ROLE
};

file.acl.add(options, function(err, aclObject) {});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.acl.add(options).then(function(data) {
  var aclObject = data[0];
  var apiResponse = data[1];
});

Methods

copy

copy(destination, options, callback) returns Promise containing CopyResponse

Copy this file to another file. By default, the file is copied within the same bucket, but you can copy it to another bucket by providing a Bucket or File object, or a URL starting with "gs://".

Parameter

destination

(string, Bucket, or File)

Destination file.

options

Optional

object

Configuration options. See an Object resource.

Values in options have the following properties:

Parameter

token

Optional

string

A previously-returned rewriteToken from an unfinished rewrite request.

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

CopyCallback

Callback function.

See also

Objects: copy API Documentation

Throws

Error 

If the destination file is not provided.

Returns

Promise containing CopyResponse 

Example

var storage = require('@google-cloud/storage')();

//-
// You can pass in a variety of types for the destination.
//
// For all of the below examples, assume we are working with the following
// Bucket and File objects.
//-
var bucket = storage.bucket('my-bucket');
var file = bucket.file('my-image.png');

//-
// If you pass in a string for the destination, the file is copied to its
// current bucket, under the new name provided.
//-
file.copy('my-image-copy.png', function(err, copiedFile, apiResponse) {
  // `my-bucket` now contains:
  // - "my-image.png"
  // - "my-image-copy.png"

  // `copiedFile` is an instance of a File object that refers to your new
  // file.
});

//-
// If you pass in a string starting with "gs://" for the destination, the
// file is copied to the other bucket and under the new name provided.
//-
var newLocation = 'gs://another-bucket/my-image-copy.png';
file.copy(newLocation, function(err, copiedFile, apiResponse) {
  // `my-bucket` still contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-image-copy.png"

  // `copiedFile` is an instance of a File object that refers to your new
  // file.
});

//-
// If you pass in a Bucket object, the file will be copied to that bucket
// using the same name.
//-
var anotherBucket = storage.bucket('another-bucket');
file.copy(anotherBucket, function(err, copiedFile, apiResponse) {
  // `my-bucket` still contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-image.png"

  // `copiedFile` is an instance of a File object that refers to your new
  // file.
});

//-
// If you pass in a File object, you have complete control over the new
// bucket and filename.
//-
var anotherFile = anotherBucket.file('my-awesome-image.png');
file.copy(anotherFile, function(err, copiedFile, apiResponse) {
  // `my-bucket` still contains:
  // - "my-image.png"
  //
  // `another-bucket` now contains:
  // - "my-awesome-image.png"

  // Note:
  // The `copiedFile` parameter is equal to `anotherFile`.
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.copy(newLocation).then(function(data) {
  var newFile = data[0];
  var apiResponse = data[1];
});

Another example:

// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// The name of the source bucket, e.g. "my-bucket"
// const srcBucketName = "my-bucket";

// The name of the source file, e.g. "file.txt"
// const srcFilename = "file.txt";

// The destination bucket, e.g. "my-other-bucket"
// const destBucketName = "my-other-bucket";

// The destination filename, e.g. "file.txt"
// const destFilename = "file.txt";

// Instantiates a client
const storage = Storage();

// Copies the file to the other bucket
storage
  .bucket(srcBucketName)
  .file(srcFilename)
  .copy(storage.bucket(destBucketName).file(destFilename))
  .then(() => {
    console.log(
      `gs://${srcBucketName}/${srcFilename} copied to gs://${destBucketName}/${destFilename}.`
    );
  })
  .catch(err => {
    console.error('ERROR:', err);
  });

createReadStream

createReadStream(options) returns ReadableStream

Create a readable stream to read the contents of the remote file. It can be piped to a writable stream or listened to for 'data' events to read a file's contents.

In the unlikely event there is a mismatch between what you downloaded and the version in your Bucket, your error handler will receive an error with code "CONTENT_DOWNLOAD_MISMATCH". If you receive this error, the best recourse is to try downloading the file again.

For faster crc32c computation, you must manually install fast-crc32c:

$ npm install --save fast-crc32c

NOTE: Readable streams will emit the end event when the file is fully downloaded.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

validation

Optional

(string or boolean)

Possible values: "md5", "crc32c", or false. By default, data integrity is validated with a CRC32c checksum. You may use MD5 if preferred, but that hash is not supported for composite objects. An error will be raised if MD5 is specified but is not available. You may also choose to skip validation completely, however this is not recommended.

start

Optional

number

A byte offset to begin the file's download from. Default is 0. NOTE: Byte ranges are inclusive; that is, options.start = 0 and options.end = 999 represent the first 1000 bytes in a file or object. NOTE: when specifying a byte range, data integrity is not available.

end

Optional

number

A byte offset to stop reading the file at. NOTE: Byte ranges are inclusive; that is, options.start = 0 and options.end = 999 represent the first 1000 bytes in a file or object. NOTE: when specifying a byte range, data integrity is not available.

Returns

ReadableStream 

Example

//-
// <h4>Downloading a File</h4>
//
// The example below demonstrates how we can reference a remote file, then
// pipe its contents to a local file. This is effectively creating a local
// backup of your remote data.
//-
var storage = require('@google-cloud/storage')();
var bucket = storage.bucket('my-bucket');

var fs = require('fs');
var remoteFile = bucket.file('image.png');
var localFilename = '/Users/stephen/Photos/image.png';

remoteFile.createReadStream()
  .on('error', function(err) {})
  .on('response', function(response) {
    // Server connected and responded with the specified status and headers.
   })
  .on('end', function() {
    // The file is fully downloaded.
  })
  .pipe(fs.createWriteStream(localFilename));

//-
// To limit the downloaded data to only a byte range, pass an options object.
//-
var logFile = bucket.file('access_log');
logFile.createReadStream({
    start: 10000,
    end: 20000
  })
  .on('error', function(err) {})
  .pipe(fs.createWriteStream('/Users/stephen/logfile.txt'));

//-
// To read a tail byte range, specify only `options.end` as a negative
// number.
//-
var logFile = bucket.file('access_log');
logFile.createReadStream({
    end: -100
  })
  .on('error', function(err) {})
  .pipe(fs.createWriteStream('/Users/stephen/logfile.txt'));

createResumableUpload

createResumableUpload(options, callback) returns Promise containing CreateResumableUploadResponse

Create a unique resumable upload session URI. This is the first step when performing a resumable upload.

See the Resumable upload guide for more on how the entire process works.

Note

If you are just looking to perform a resumable upload without worrying about any of the details, see File#createWriteStream. Resumable uploads are performed by default.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

metadata

Optional

object

Metadata to set on the file.

origin

Optional

string

Origin header to set for the upload.

predefinedAcl

Optional

string

Apply a predefined set of access controls to this object.

Acceptable values are:
- **`authenticatedRead`** - Object owner gets `OWNER` access, and
  `allAuthenticatedUsers` get `READER` access.

- **`bucketOwnerFullControl`** - Object owner gets `OWNER` access, and
  project team owners get `OWNER` access.

- **`bucketOwnerRead`** - Object owner gets `OWNER` access, and project
  team owners get `READER` access.

- **`private`** - Object owner gets `OWNER` access.

- **`projectPrivate`** - Object owner gets `OWNER` access, and project
  team members get access according to their roles.

- **`publicRead`** - Object owner gets `OWNER` access, and `allUsers` get
  `READER` access.

private

Optional

boolean

Make the uploaded file private. (Alias for options.predefinedAcl = 'private')

public

Optional

boolean

Make the uploaded file public. (Alias for options.predefinedAcl = 'publicRead')

userProject

Optional

string

If this bucket has requesterPays functionality enabled (see Bucket#enableRequesterPays), set this value to the project which should be billed for this operation.

callback

Optional

CreateResumableUploadCallback

Callback function.

See also

Resumable upload guide

Returns

Promise containing CreateResumableUploadResponse 

Example

var storage = require('@google-cloud/storage')();
var myBucket = storage.bucket('my-bucket');

var file = myBucket.file('my-file');
file.createResumableUpload(function(err, uri) {
  if (!err) {
    // `uri` can be used to PUT data to.
  }
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.createResumableUpload().then(function(data) {
  var uri = data[0];
});

createWriteStream

createWriteStream(options) returns WritableStream

Create a writable stream to overwrite the contents of the file in your bucket.

A File object can also be used to create files for the first time.

Resumable uploads are automatically enabled and must be shut off explicitly by setting options.resumable to false.

There is some overhead when using a resumable upload that can cause noticeable performance degradation while uploading a series of small files. When uploading files smaller than 10 MB, it is recommended that the resumable feature be disabled.

For faster crc32c computation, you must manually install fast-crc32c:

$ npm install --save fast-crc32c

NOTE: Writable streams will emit the finish event when the file is fully uploaded.

Parameter

options

Optional

object

Configuration options.

Values in options have the following properties:

Parameter

gzip

Optional

boolean

Automatically gzip the file. This will set options.metadata.contentEncoding to gzip.

metadata

Optional

object

See the examples below or Objects: insert request body for more details.

offset

Optional

number

The starting byte of the upload stream, for resuming an interrupted upload. Defaults to 0.

predefinedAcl

Optional

string