This page shows how to back up data from a local machine to Cloud Storage using Cloud Tools for PowerShell. Unlike most resources, Cloud Tools for PowerShell provides two ways to access Cloud Storage: cmdlets and a PowerShell provider.
The provider allows you to access Storage buckets and objects like a file system, using the file system commands you are already familiar with. The provider has some limitations, however. Not all legal object names convert to legal provider paths. You can't use the provider to manage ACLs. For these advanced cases, you can use the cmdlets. See the Cloud Tools for PowerShell cmdlet reference to learn more about Cloud Storage cmdlets.
Uploading data
Data in Cloud Storage is organized into buckets. Create a new bucket as follows:
Cmdlets
Use the New-GcsBucket cmdlet to create a new bucket:
$bucket = "my-gcs-bucket"
New-GcsBucket $bucket
Provider
Buckets are folders at the root of the gs:\ drive. Creating a new item at that level creates a new bucket.
cd gs:\
$bucket = "my-gcs-bucket"
mkdir $bucket
Upload files to a bucket
You can upload a single file or an entire directory to your bucket:
Cmdlets
Use the New-GcsObject cmdlet. It requires a destination bucket and an object name as parameters. Where the new Storage object's contents come from depends on which parameter set you use.
You can upload the contents of a local file to Cloud Storage by using the -File parameter and specifying a file path. Alternatively, you can pass the object's contents as a string via the PowerShell pipeline, or you can use the -Value parameter.
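For example, a minimal sketch of both of these approaches, assuming a bucket named "widget" and hypothetical local file and object names:

```powershell
# Upload a local file as a new object (file path and names are hypothetical).
New-GcsObject -Bucket "widget" -ObjectName "log-000.txt" -File "C:\logs\log-000.txt"

# Create an object directly from a literal string with the -Value parameter.
New-GcsObject -Bucket "widget" -ObjectName "hello.txt" -Value "Hello, world!"
```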
You can upload an entire directory from the local disk to Cloud Storage by using the -Folder parameter and specifying the folder path. If you do not want the folder to be uploaded directly to the root of the Cloud Storage bucket, use -ObjectNamePrefix to specify a prefix that will be applied to every object uploaded.
# Upload the folder LogFiles and its content to the root of the widget bucket.
New-GcsObject -Bucket "widget" -Folder "C:\inetpub\logs\LogFiles"

# Upload the folder LogFiles and its content to directory Test in the widget bucket.
New-GcsObject -Bucket "widget" -Folder "C:\inetpub\logs\LogFiles" -ObjectNamePrefix "Test"
Provider
Use the New-Item cmdlet. It requires a path to the item being created, which can be an absolute path or a relative path. The contents of the new Storage object can be specified either as a string to the -Value parameter or by specifying a file path to the -File parameter.
New-Item gs:\my-gcs-bucket\new-object -File $file
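The -Value form can be sketched similarly, assuming hypothetical bucket and object names:

```powershell
# Create an object whose contents come from a literal string.
New-Item gs:\my-gcs-bucket\hello.txt -Value "Hello, world!"
```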
The following snippet uploads an entire directory from the local disk to Cloud Storage.
cd $folder
# Collect files only; directories cannot be passed to New-Item's -File parameter.
$files = Get-ChildItem -Recurse -File
$data = @()
foreach ($file in $files) {
    # Record each file's path relative to $folder to use as its object path.
    $objectPath = $file | Resolve-Path -Relative
    $data += @{file = $file; objectPath = $objectPath}
}
cd gs:\my-gcs-bucket
foreach ($element in $data) {
    Write-Host "`t$($element.objectPath)"
    New-Item $element.objectPath -File $element.file
}
Searching data
You can search data with the cmdlets, or with the provider by using the common file-search cmdlets.
Cmdlets
You can search through a bucket's objects using the Get-GcsObject cmdlet. This can be useful when combined with the Out-GridView cmdlet to visualize your data:
Get-GcsObject $bucket | Select Name, Size | Out-GridView
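Because Get-GcsObject writes ordinary objects to the pipeline, you can also filter them with standard PowerShell cmdlets; a sketch, assuming a hypothetical name pattern:

```powershell
# List only the objects whose names end in .log.
Get-GcsObject $bucket | Where-Object Name -Like "*.log"
```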
Provider
You can use Get-ChildItem or one of its aliases: dir, ls, or gci. You can use the -Recurse parameter to look within all of the logical folders:
cd gs:\my-gcs-bucket
ls -Recurse
Reading data
To read data through the provider, use the standard Get-Content cmdlet. Alternatively, use the Read-GcsObject cmdlet.
Cmdlets
To read the contents of a Cloud Storage object, use the Read-GcsObject cmdlet. By default, it reads the object's contents as a string and writes it to the PowerShell pipeline. You can specify the -OutFile parameter to download the object's contents to the local disk instead:
Read-GcsObject $bucket "timestamp.txt" | Write-Host
Read-GcsObject $bucket "logo.png" `
    -OutFile "$Env:UserProfile\pictures\logo.png"
Provider
To read the contents of a Cloud Storage object, use the Get-Content cmdlet, or one of its aliases: cat, gc, or type.
cd gs:\my-gcs-bucket
cat my-object-name
Deleting data
To delete data through the provider, use the standard Remove-Item cmdlet. Alternatively, use the Remove-GcsObject cmdlet.
Cmdlets
To remove any data in Cloud Storage, use the Remove-GcsObject cmdlet:
Get-GcsObject $bucket | Remove-GcsObject
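Note that the pipeline form above deletes every object in the bucket. To delete a single object, you can instead name it directly; a sketch, assuming the bucket and object name are accepted positionally and the object name is hypothetical:

```powershell
# Remove one object by name.
Remove-GcsObject $bucket "timestamp.txt"
```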
Provider
To remove data in Cloud Storage, use the Remove-Item cmdlet, or one of its aliases: del, rm, or erase:
cd gs:\my-gcs-bucket
rm my-object-name