This page shows you how to upload objects to your Cloud Storage bucket from your local file system. An uploaded object consists of the data you want to store along with any associated metadata. For a conceptual overview, including how to choose the optimal upload method based on your file size, see Uploads and downloads.
For instructions on uploading from memory, see Upload objects from memory.
Required roles
To get the permissions that you need to upload objects to a bucket, ask your administrator to grant you the Storage Object User (roles/storage.objectUser) IAM role on the bucket. This predefined role contains the permissions required to upload an object to a bucket. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
- storage.objects.create
- storage.objects.delete: only required for uploads that overwrite an existing object.
- storage.objects.get: only required if you plan on using the Google Cloud CLI to perform the tasks on this page.
- storage.objects.list: only required if you plan on using the Google Cloud CLI to perform the tasks on this page, or if you want to use the Google Cloud console to verify the objects you've uploaded.
If you plan on using the Google Cloud console to perform the tasks on this page, you'll also need the storage.buckets.list permission, which is not included in the Storage Object User (roles/storage.objectUser) role. To get this permission, ask your administrator to grant you the Storage Admin (roles/storage.admin) role on the project.
You can also get these permissions with other predefined roles or custom roles.
For information about granting roles on buckets, see Use IAM with buckets.
Upload an object to a bucket
Complete the following steps to upload an object to a bucket:
Console
- In the Google Cloud console, go to the Cloud Storage Buckets page.
- In the list of buckets, click the name of the bucket that you want to upload an object to.
- In the Objects tab for the bucket, do either of the following:
  - Drag files from your desktop or file manager to the main pane in the Google Cloud console.
  - Click the Upload Files button, select the files you want to upload in the dialog that appears, and click Open.
To learn how to get detailed error information about failed Cloud Storage operations in the Google Cloud console, see Troubleshooting.
Command line
Use the gcloud storage cp command:

gcloud storage cp OBJECT_LOCATION gs://DESTINATION_BUCKET_NAME

Where:
- OBJECT_LOCATION is the local path to your object. For example, Desktop/dog.png.
- DESTINATION_BUCKET_NAME is the name of the bucket to which you are uploading your object. For example, my-bucket.
If successful, the response looks like the following example:
Completed files 1/1 | 164.3kiB/164.3kiB
You can set fixed-key and custom object metadata as part of your object upload by using command flags.
Client libraries
C++
For more information, see the Cloud Storage C++ API reference documentation.
To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
C#
For more information, see the Cloud Storage C# API reference documentation.
To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
Go
For more information, see the Cloud Storage Go API reference documentation.
To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
Java
For more information, see the Cloud Storage Java API reference documentation.
To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
Samples are available that upload an individual object, upload multiple objects concurrently, and upload all objects with a common prefix concurrently.
Node.js
For more information, see the Cloud Storage Node.js API reference documentation.
To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
Samples are available that upload an individual object, upload multiple objects concurrently, and upload all objects with a common prefix concurrently.
PHP
For more information, see the Cloud Storage PHP API reference documentation.
To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
Python
For more information, see the Cloud Storage Python API reference documentation.
To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
Samples are available that upload an individual object, upload multiple objects concurrently, and upload all objects with a common prefix concurrently.
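As a sketch of the single-object case, assuming the google-cloud-storage package is installed and Application Default Credentials are configured (the function name and all bucket, file, and object names below are placeholders, not part of any official sample):

```python
def upload_blob(bucket_name: str, source_file_name: str, destination_blob_name: str) -> None:
    """Upload a local file to a Cloud Storage bucket."""
    # Imported inside the function so it can be defined even when the
    # optional google-cloud-storage dependency is not installed.
    from google.cloud import storage

    storage_client = storage.Client()  # uses Application Default Credentials
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)

# Example call with placeholder names:
# upload_blob("my-bucket", "Desktop/dog.png", "pets/dog.png")
```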
Ruby
For more information, see the Cloud Storage Ruby API reference documentation.
To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
Terraform
You can use a Terraform resource to upload an object. Either content or source must be specified.
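As a minimal sketch, a google_storage_bucket_object resource might look like the following; the resource label, bucket name, and file path are placeholders:

```hcl
# Uploads a local file as an object in an existing bucket.
resource "google_storage_bucket_object" "default" {
  name   = "new-object"            # the name the object will have in the bucket
  bucket = "my-bucket"             # an existing bucket
  source = "path/to/local/file"    # or set `content` with inline data instead
}
```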
REST APIs
JSON API
The JSON API distinguishes between media uploads, in which only object data is included in the request, and JSON API multipart uploads, in which both object data and object metadata are included in the request.
Media upload (a single-request upload without object metadata)
Have the gcloud CLI installed and initialized, which lets you generate an access token for the Authorization header.

Use cURL to call the JSON API with a POST Object request:

curl -X POST --data-binary @OBJECT_LOCATION \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: OBJECT_CONTENT_TYPE" \
  "https://storage.googleapis.com/upload/storage/v1/b/BUCKET_NAME/o?uploadType=media&name=OBJECT_NAME"

Where:
- OBJECT_LOCATION is the local path to your object. For example, Desktop/dog.png.
- OBJECT_CONTENT_TYPE is the content type of the object. For example, image/png.
- BUCKET_NAME is the name of the bucket to which you are uploading your object. For example, my-bucket.
- OBJECT_NAME is the URL-encoded name you want to give your object. For example, pets/dog.png, URL-encoded as pets%2Fdog.png.
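The URL-encoded object name can be produced with Python's standard library; passing safe="" to quote makes it percent-encode the slash as well:

```python
from urllib.parse import quote

object_name = "pets/dog.png"
encoded_name = quote(object_name, safe="")  # encode "/" as %2F too
print(encoded_name)  # pets%2Fdog.png
```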
JSON API multipart upload (a single-request upload that includes object metadata)
Have the gcloud CLI installed and initialized, which lets you generate an access token for the Authorization header.

Create a multipart/related file that contains the following information:

--BOUNDARY_STRING
Content-Type: application/json; charset=UTF-8

OBJECT_METADATA
--BOUNDARY_STRING
Content-Type: OBJECT_CONTENT_TYPE

OBJECT_DATA
--BOUNDARY_STRING--

Where:
- BOUNDARY_STRING is a string you define that identifies the different parts of the multipart file. For example, separator_string.
- OBJECT_METADATA is metadata you want to include for the file, in JSON format. At a minimum, this section should include a name attribute for the object, for example {"name": "myObject"}.
- OBJECT_CONTENT_TYPE is the content type of the object. For example, text/plain.
- OBJECT_DATA is the data for the object.

For example:

--separator_string
Content-Type: application/json; charset=UTF-8

{"name":"my-document.txt"}
--separator_string
Content-Type: text/plain

This is a text file.
--separator_string--
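A body with this shape, along with the byte count used for the Content-Length header in the next step, can be assembled programmatically. The build_multipart_body helper below is an illustrative sketch, not part of any Google library:

```python
import json


def build_multipart_body(boundary: str, metadata: dict, content_type: str, data: bytes) -> bytes:
    """Assemble a multipart/related body for a JSON API multipart upload."""
    head = (
        f"--{boundary}\r\n"
        "Content-Type: application/json; charset=UTF-8\r\n\r\n"
        f"{json.dumps(metadata)}\r\n"
        f"--{boundary}\r\n"
        f"Content-Type: {content_type}\r\n\r\n"
    ).encode("utf-8")
    tail = f"\r\n--{boundary}--".encode("utf-8")
    return head + data + tail


body = build_multipart_body(
    "separator_string",
    {"name": "my-document.txt"},
    "text/plain",
    b"This is a text file.",
)
content_length = len(body)  # value for the Content-Length header
```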
Use cURL to call the JSON API with a POST Object request:

curl -X POST --data-binary @MULTIPART_FILE_LOCATION \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: multipart/related; boundary=BOUNDARY_STRING" \
  -H "Content-Length: MULTIPART_FILE_SIZE" \
  "https://storage.googleapis.com/upload/storage/v1/b/BUCKET_NAME/o?uploadType=multipart"

Where:
- MULTIPART_FILE_LOCATION is the local path to the multipart file you created in step 2. For example, Desktop/my-upload.multipart.
- BOUNDARY_STRING is the boundary string you defined in step 2. For example, my-boundary.
- MULTIPART_FILE_SIZE is the total size, in bytes, of the multipart file you created in step 2. For example, 2000000.
- BUCKET_NAME is the name of the bucket to which you are uploading your object. For example, my-bucket.
If the request succeeds, the server returns the HTTP 200 OK status code along with the file's metadata.
XML API
Have the gcloud CLI installed and initialized, which lets you generate an access token for the Authorization header.

Use cURL to call the XML API with a PUT Object request:

curl -X PUT --data-binary @OBJECT_LOCATION \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: OBJECT_CONTENT_TYPE" \
  "https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME"

Where:
- OBJECT_LOCATION is the local path to your object. For example, Desktop/dog.png.
- OBJECT_CONTENT_TYPE is the content type of the object. For example, image/png.
- BUCKET_NAME is the name of the bucket to which you are uploading your object. For example, my-bucket.
- OBJECT_NAME is the URL-encoded name you want to give your object. For example, pets/dog.png, URL-encoded as pets%2Fdog.png.
You can set additional object metadata as part of your object upload in the headers of the request, in the same way the previous example sets Content-Type. When working with the XML API, metadata can only be set at the time the object is written, such as when uploading, copying, or replacing the object. For more information, see Editing object metadata.
What's next
- Learn about naming requirements for objects.
- Learn about using folders to organize your objects.
- Transfer objects from your Compute Engine instance.
- Transfer data from cloud providers or other online sources, such as URL lists.
- Control who has access to your objects and buckets.
- View your object's metadata, including the URL for the object.