For example, you can review which instances use the n2-standard-4
machine type and how long each
instance has been running. You can also review the storage space of a persistent
disk, and information about other Compute Engine features.
Usage reports do not provide billing or activity information, such as information about API requests. For billing information, see the Billing Export feature. For activity logs, see Activity logs.
Before you begin
- If you want to use the command-line examples in this guide:
- Install or update to the latest version of the gcloud command-line tool.
- Set a default region and zone.
- If you want to use the API examples in this guide, set up API access.
Overview
When you enable usage reports, Compute Engine delivers two types of reports to the Google Cloud Storage bucket you specify:
Daily usage reports
These reports are delivered daily and include usage data from the preceding day; each report is a separate file. Data in these reports is immutable, meaning that Compute Engine does not update or rewrite a report file if it contains inaccuracies. Instead, the data is corrected in the next report that is delivered to the bucket.
Daily usage reports have the following name format:
<bucket>/<report_prefix>_<numeric_project_id>_<YYYYMMDD>.csv
Monthly rollup report
A single monthly rollup report is delivered daily; it contains monthly usage data for that project up to, but not including, that day. The monthly usage report is overwritten each day with new data that reflects the monthly usage of resources up to that date. There is only one monthly usage data file per project, per month.
Monthly rollup reports have the following name format:
<bucket>/<report_prefix>_<numeric_project_id>_<YYYYMM>.csv
The daily and monthly report files look very similar, except for the difference in date format: monthly rollup reports are dated using the year and month (YYYYMM), while daily usage reports are dated using the year, month, and day (YYYYMMDD).
All usage reports are delivered in comma-separated values (CSV) format, and usage report files are prefixed using <report_prefix>, a customizable value that you choose. If you don't specify a report prefix, the default prefix usage_gce is used. All times are given in Pacific Time (PST).
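As a sketch of how these naming rules compose, the following hypothetical helper (classify_report is not part of any Google library) distinguishes daily from monthly report file names by the length of their date suffix:

```python
import re

# File-name pattern from the formats above:
# <report_prefix>_<numeric_project_id>_<YYYYMMDD|YYYYMM>.csv
REPORT_RE = re.compile(
    r"(?P<prefix>.+)_(?P<project_id>\d+)_(?P<date>\d{6}(?:\d{2})?)\.csv$"
)

def classify_report(object_name):
    """Return report metadata, or None if the name is not a usage report."""
    match = REPORT_RE.match(object_name)
    if not match:
        return None
    # 8 date digits (YYYYMMDD) means a daily report; 6 (YYYYMM) a monthly rollup.
    kind = "daily" if len(match.group("date")) == 8 else "monthly"
    return {"kind": kind,
            "prefix": match.group("prefix"),
            "project_id": match.group("project_id"),
            "date": match.group("date")}
```

For example, classify_report("usage_gce_1234567890_20131230.csv") reports a daily file, while a 6-digit date suffix is classified as a monthly rollup.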
Prerequisites
Before you can start using Compute Engine usage export:
- Sign up for Google Cloud Storage if you haven't already.
- You must have an existing bucket to store usage logs.
Setting up usage export
When you first enable the usage export feature, the first report is delivered the following day and details the previous day's usage. After that, you receive reports at 24-hour intervals.
To set up the usage export feature, enable it in the gcloud compute tool. When you enable this feature, you must define two properties:
The Google Cloud Storage bucket where you would like your reports to be delivered.
You can select any Cloud Storage bucket that you own, including buckets from other projects. The bucket must exist before you can start exporting reports, and you must have owner access to it. Google Cloud Storage charges for usage, so review the Cloud Storage price sheet for information on how you might incur charges for the service.
Any user who has read access to the Cloud Storage bucket can view the usage reports in the bucket. Any user who has write access to the bucket can create, view, and modify existing files in the bucket. For more information, see the Access control section.
The report prefix for your files.
You can specify the report prefix to use for your usage reports; report file names contain this prefix. For example, specifying "my-cool-project-report" as your report prefix results in a file name similar to my-cool-project-report_1234567890_20131230.csv. If you do not specify a report prefix, the default prefix usage_gce is used.
After you decide on these two properties, you can enable the usage export feature in the following ways:
Console
Go to the Compute Engine Settings page.
Check the Enable usage export box.
Fill in the Bucket name field. Optionally, provide a Report prefix. If you leave the report prefix empty, the default prefix usage_gce is used; all usage reports delivered to the bucket are named with this prefix.
Click Save.
gcloud
In gcloud compute, use the gcloud compute project-info set-usage-bucket command to enable this feature:
gcloud compute project-info set-usage-bucket --bucket [BUCKET_NAME] [--prefix [PREFIX]]
Where:
- [BUCKET_NAME] is the name of an existing bucket to receive the usage reports, in the format gs://<bucket-name> or https://storage.googleapis.com/<bucket-name>. The user running this command must be an owner of the bucket.
- [PREFIX] is an optional prefix for the usage report names. If not specified, the default prefix usage_gce is used.
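For illustration, the bucket-name rule above can be expressed as a small validation helper; parse_bucket_arg is a hypothetical name and not part of the gcloud tooling:

```python
# Hypothetical helper: accept only the two bucket formats the command
# documents (gs://<bucket-name> or https://storage.googleapis.com/<bucket-name>)
# and return the bare bucket name.
def parse_bucket_arg(bucket):
    for scheme in ("gs://", "https://storage.googleapis.com/"):
        if bucket.startswith(scheme):
            name = bucket[len(scheme):].strip("/")
            if name:
                return name
    raise ValueError("bucket must be gs://<bucket-name> or "
                     "https://storage.googleapis.com/<bucket-name>")
```

Both parse_bucket_arg("gs://usage-export-sample") and parse_bucket_arg("https://storage.googleapis.com/usage-export-sample") resolve to the same bucket name.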
API
Client libraries
Using the client libraries, call the setUsageExportBucket() method in the Projects collection to enable usage export. For example, in the Python Client Library, you can enable the feature like so:
def setUsageExport(gce_service, auth_http):
    # Bucket (and optional report prefix) where Compute Engine delivers reports.
    body = {
        "bucketName": "https://storage.googleapis.com/usage-export-sample",
        "reportNamePrefix": "exampleprefix"
    }
    request = gce_service.projects().setUsageExportBucket(
        project=PROJECT_ID, body=body)
    response = request.execute(http=auth_http)
    print(response)
HTTP
You can also make an HTTP request directly to the setUsageExportBucket method. The following example uses the httplib2 library to enable usage export:
#!/usr/bin/python
import argparse
import logging
import sys
from json import dumps

import httplib2
from oauth2client import tools  # provides the shared argparser used below
...

PROJECT_ID = "myproject"
API_VERSION = "v1"
API_URL = "https://www.googleapis.com/compute/" + API_VERSION + "/projects/" + PROJECT_ID
OAUTH_FILE = "oauth-dev.dat"

def main(argv):
    logging.basicConfig(level=logging.INFO)
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter,
        parents=[tools.argparser])
    # Parse the command-line flags.
    flags = parser.parse_args(argv[1:])
    http = httplib2.Http()
    # Add code to authenticate to the service; the resulting OAuth access
    # token must be supplied in listOfHeaders as an Authorization header.
    ...

    url = API_URL + "/setUsageExportBucket"
    body = {
        "bucketName": "https://storage.googleapis.com/usage-export-sample",
        "reportNamePrefix": "exampleprefix"
    }
    formattedBody = dumps(body)
    resp, content = http.request(url,
                                 "POST",
                                 headers=listOfHeaders,
                                 body=formattedBody)
    print(content)

if __name__ == "__main__":
    main(sys.argv)
For more information, see the API reference documentation.
Downloading usage export reports
After you start receiving usage reports in your bucket, download your reports like you would download other objects from Cloud Storage. For more information, see Download objects.
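As a minimal sketch, assuming you have already listed the object names in your export bucket (for example, with the Cloud Storage API or gsutil), a hypothetical helper like the following could pick out every report for one month, including the monthly rollup:

```python
# Hypothetical helper: given object names from the export bucket, select
# all usage reports for one month. Daily files are named
# <prefix>_<project_id>_YYYYMMDD.csv and the rollup <prefix>_<project_id>_YYYYMM.csv,
# so both share the same YYYYMM stem.
def reports_for_month(object_names, prefix, project_id, yyyymm):
    stem = "%s_%s_%s" % (prefix, project_id, yyyymm)
    return sorted(n for n in object_names
                  if n.startswith(stem) and n.endswith(".csv"))
```

Once selected, each object can be downloaded like any other Cloud Storage object.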
Supported metrics
Daily usage reports provide usage information about the following resources:
- Virtual machines
- Persistent disks
- Images
- Snapshots
- Static IP addresses
- Load balancers
Each resource is described using the following metrics:
Metric Name | Metric Properties |
---|---|
Report Date | The date of the reported usage, in MM/DD/YYYY format. |
MeasurementId | The ID of the resource type being measured, such as com.google.cloud/services/compute-engine/VmimageE2Standard_2. |
Quantity | The amount of usage for the measurement. |
Unit | The unit of measurement for the quantity, such as seconds. |
Resource URI | The URI of the resource that the usage applies to. |
ResourceId | The numeric ID of the resource. |
Location | The zone or region of the resource, such as us-central1-a. |
An example entry in the report looks like the following:
Report Date | MeasurementId | Quantity | Unit | Resource URI | Resource ID | Location |
---|---|---|---|---|---|---|
02/13/2019 | com.google.cloud/services/compute-engine/VmimageE2Standard_2 | 86400 | seconds | https://compute.googleapis.com/compute/v1/projects/myproject/zones/us-central1-a/instances/my-instance | 16557630484 | us-central1-a |
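To illustrate how such an entry might be consumed, the following sketch parses a report with the columns shown above (the CSV header names are assumptions based on the example entry) and converts per-VM usage from seconds to hours:

```python
import csv
import io

# Sample report content; column names are assumed from the example entry above.
SAMPLE = """Report Date,MeasurementId,Quantity,Unit,Resource URI,ResourceId,Location
02/13/2019,com.google.cloud/services/compute-engine/VmimageE2Standard_2,86400,seconds,https://compute.googleapis.com/compute/v1/projects/myproject/zones/us-central1-a/instances/my-instance,16557630484,us-central1-a
"""

def usage_hours(report_csv):
    """Sum hours of usage per MeasurementId for rows measured in seconds."""
    hours = {}
    for row in csv.DictReader(io.StringIO(report_csv)):
        if row["Unit"] == "seconds":
            measurement = row["MeasurementId"]
            hours[measurement] = hours.get(measurement, 0.0) + float(row["Quantity"]) / 3600
    return hours
```

For the example entry, 86400 seconds of e2-standard-2 usage works out to 24 hours, i.e. one full day of runtime.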
Access control
When you enable the usage export feature for a Cloud Storage bucket, Compute Engine automatically adds itself to the bucket with write access in order to deliver usage reports. As long as Compute Engine has access to the bucket and the usage export feature is enabled, Compute Engine will continue to export usage reports to the specified Cloud Storage bucket.
You can identify that Compute Engine has access to a bucket if you see the following identity added to the bucket access control list:
cloud-cluster-analytics-export@google.com
Any user who is an owner of the project has full access to the Google Cloud Storage bucket. Other users, such as writers and readers, have different degrees of access to the bucket. To learn about ACLs for a bucket, read the Cloud Storage access control documentation.
If you disable the usage export feature, Compute Engine automatically removes its write access to the bucket. If you modify the permissions on the cloud-cluster-analytics-export@google.com account and then disable the usage export feature, Compute Engine disables the feature but doesn't remove the account from the project access list. You can remove the account manually if desired.
Checking if usage reports are enabled
You can check on a project's usage export settings by getting information about the project:
gcloud compute project-info describe
Look for the usageExportLocation
field:
+-------------------------+----------------------------------------------------+
| name                    | myproject                                          |
| description             |                                                    |
| creation-time           | 2019-10-18T16:31:52.308-07:00                      |
| usage                   |                                                    |
|   snapshots             | 1.0/1000.0                                         |
|   networks              | 2.0/2.0                                            |
|   firewalls             | 3.0/10.0                                           |
| ...                     |                                                    |
| usageExportLocation     |                                                    |
|   bucketName            | https://storage.googleapis.com/usage-export-sample |
|   reportNamePrefix      |                                                    |
+-------------------------+----------------------------------------------------+
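A hypothetical helper for checking these settings programmatically might look like the following; it assumes a project resource shaped like the API's projects().get() response and applies the usage_gce default when no prefix is set:

```python
# Hypothetical helper: inspect a project resource (as returned by
# projects().get() or `gcloud compute project-info describe --format=json`)
# and report the usage export settings, or None if the feature is disabled.
def usage_export_settings(project):
    loc = project.get("usageExportLocation")
    if not loc or not loc.get("bucketName"):
        return None
    # An empty reportNamePrefix means the default usage_gce prefix is used.
    return (loc["bucketName"], loc.get("reportNamePrefix") or "usage_gce")
```

For the project shown above, this returns the sample bucket paired with the default usage_gce prefix, since reportNamePrefix is empty.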
Disabling usage reports
When you disable usage reports, Compute Engine automatically removes write access for Compute Engine to your Cloud Storage bucket and discontinues sending any new reports.
Console
Go to the Compute Engine Settings page.
Uncheck the Enable usage export box to disable usage export.
gcloud
In gcloud compute, disable the usage export feature by running the gcloud compute project-info set-usage-bucket command with the --no-bucket flag:
gcloud compute project-info set-usage-bucket --no-bucket
API
Client libraries
To use the client libraries, make a request to the setUsageExportBucket() method with an empty bucket name. For example, in the Python Client Library, you can disable the feature like so:
def disableUsageExport(gce_service, auth_http):
    # An empty bucket name disables the usage export feature.
    body = {
        "bucketName": ""
    }
    request = gce_service.projects().setUsageExportBucket(
        project=PROJECT_ID, body=body)
    response = request.execute(http=auth_http)
    print(response)
HTTP
You can also make an HTTP request directly to the setUsageExportBucket method. The following example uses the httplib2 library to disable the usage export bucket:
#!/usr/bin/python
import argparse
import logging
import sys
from json import dumps

import httplib2
from oauth2client import tools  # provides the shared argparser used below
...

PROJECT_ID = "myproject"
API_VERSION = "v1"
API_URL = "https://www.googleapis.com/compute/" + API_VERSION + "/projects/" + PROJECT_ID
OAUTH_FILE = "oauth-dev.dat"

def main(argv):
    logging.basicConfig(level=logging.INFO)
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter,
        parents=[tools.argparser])
    # Parse the command-line flags.
    flags = parser.parse_args(argv[1:])
    http = httplib2.Http()
    # Add code to authenticate to the service; the resulting OAuth access
    # token must be supplied in listOfHeaders as an Authorization header.
    ...

    url = API_URL + "/setUsageExportBucket"
    # An empty bucket name disables the usage export feature.
    body = {
        "bucketName": ""
    }
    formattedBody = dumps(body)
    resp, content = http.request(url,
                                 "POST",
                                 headers=listOfHeaders,
                                 body=formattedBody)
    print(content)

if __name__ == "__main__":
    main(sys.argv)
What's next
- Turn on the Billing Export feature to view your billing logs.
- Track the activity in your project using Activity Logs.
- Learn more about Cloud Storage buckets.
- Learn more about Compute Engine pricing.
- Use the pricing calculator to get an estimated price.