Reverse Geocoding of Geolocation Telemetry in the Cloud Using the Maps API

This tutorial shows how to use Google Cloud Platform to build an app that receives telemetric data about geolocation, processes it, and then stores the processed and transformed data for further analysis.

This tutorial illustrates the concepts described in Building a Scalable Geolocation Telemetry System in the Cloud with the Maps API. It uses data from the San Diego, California, freeways public dataset, which was captured from actual automobile journeys recorded by road sensors. You can register for an account to get access to the full dataset if you want to run your own experiments with additional data.

The tutorial:

  • Starts with traffic data stored in CSV files.
  • Processes messages in a Google Cloud Pub/Sub queue.
  • Reverse geocodes latitude and longitude to convert the coordinates to a street address.
  • Calculates the elevation above sea level.
  • Converts from Coordinated Universal Time (UTC) to local time by querying the time zone of each location.
  • Writes the data, with the added geographic contextual information, to a BigQuery dataset for your analysis.
  • Visualizes the data as heat maps superimposed over a map of the San Diego metro area.

The following diagram illustrates the major components and how the data moves between them:

Tutorial pipeline

Objectives

  • Creating credentials.
  • Setting up Cloud Pub/Sub and BigQuery.
  • Loading and analyzing the data in BigQuery.
  • Visualizing the data on a web page.
  • Understanding the code.

Costs

This tutorial uses billable components of Google Cloud Platform, including:

  • Google BigQuery (5 GB storage, 5 GB streaming inserts)
  • Google Cloud Pub/Sub (< 200k operations)
  • Google Maps API

The cost of running this tutorial varies depending on run time. Use the pricing calculator to generate a cost estimate based on your projected usage.

New Cloud Platform users might be eligible for a free trial.

The Maps API standard plan offers a free quota and pay-as-you-go billing after the quota has been exceeded. If you have an existing license for the Maps API or have a Maps APIs Premium Plan, see the documentation first for some important notes. You can purchase a Maps APIs Premium Plan for higher quotas.

You must have a Google Maps license for any application that restricts access, such as behind a firewall or on a corporate intranet. For more details about Google Maps API pricing and plans, see the online documentation.

Before you begin

Set up Google Cloud Platform

  1. Select or create a Google Cloud Platform Console project.
  2. Enable billing for your project.
  3. Click the following button to enable the required Cloud Platform APIs. If prompted, be sure to select the project you created in step 1.

    Enable APIs

These APIs include:

  • BigQuery API
  • Google Cloud Pub/Sub API
  • Google Cloud Storage
  • Google Maps Geocoding API
  • Google Maps Elevation API
  • Google Maps Time Zone API
  • Google Maps JavaScript API

Set up your development environment

  1. Install the Cloud SDK.
  2. Initialize and authenticate the gcloud command-line tool:

    gcloud init
    
  3. Set up your Python development environment. Follow these instructions to install on Linux, OS X, or Windows.

    On Debian/Ubuntu, be sure to install build-essential:

    sudo apt-get install build-essential libssl-dev libffi-dev python-dev
    
  4. Download the code and other files to your local computer by cloning the GitHub repository with git:

    git clone https://github.com/GoogleCloudPlatform/bigquery-reverse-geolocation
    

    Alternatively, you can download and unzip the archive.

  5. For Debian-based systems, install additional prerequisites:

    1. Change to the resources directory from the directory where you downloaded the code:

      cd resources
      
    2. If you don't have it already, install pip:

      wget https://bootstrap.pypa.io/get-pip.py
      sudo python get-pip.py
      
    3. Install the prerequisites:

      sudo pip install -r requirements.txt
      
    4. Change directory back to the root of the project:

      cd ..
      

Creating credentials

For this tutorial, you'll need the following credentials:

  • An OAuth 2.0 client ID.
  • A Google API server key.
  • A Google API browser key.

Creating an OAuth 2.0 client ID

Create a client ID that you can use to authenticate end-user requests to BigQuery. Follow these steps:

  1. Find the IPv4 address of your computer. For example, in your browser, view a page such as http://ip-lookup.net. If your computer is on a corporate intranet, you need to get the address from your operating system. For example, run ifconfig on Linux or ipconfig /all on Windows.
  2. Click the following button to open the Credentials page in the Cloud Platform Console. If you have more than one project, you might be prompted to select a project.

    Create credentials

  3. Click Create credentials > OAuth client ID.

  4. Select Web application.
  5. Enter "Maps API client ID" in the Name field.
  6. Add the following two origin URLs in the Restrictions section, in Authorized JavaScript origins. Replace [YOUR_IP_ADDRESS] with the IPv4 address of your computer.

    http://[YOUR_IP_ADDRESS]:8000
    https://[YOUR_IP_ADDRESS]:8000

    Adding these URLs enables an end user to access BigQuery data through JavaScript running in a browser. You need this authorization for an upcoming section of the tutorial, when you display a visualization of data on a map in your web browser.

  7. Click Create to generate the new client ID.

  8. Click OK to close the dialog box that shows your new client ID and secret.

Creating a server key

  1. In the Cloud Platform Console, click Create credentials > API key.
  2. Click Restrict key.
  3. Name the key "Maps tutorial server key".
  4. In the Key restriction section, select IP addresses.
  5. In the Accept requests from these server IP addresses field, enter the IPv4 address of your computer, which you noted in the previous section.
  6. Click Save.
  7. Stay on the page.

Creating a browser key

The browser key is a requirement for using the Maps JavaScript API. Follow these steps:

  1. Click Create credentials > API key.
  2. Click Restrict key.
  3. Name the key "Maps tutorial browser key".
  4. Click Save.

Setting up Cloud Pub/Sub

Cloud Pub/Sub is the messaging queue that moves the data from the CSV files to BigQuery. You need to create a topic, to which the messages are published, and a subscription, from which the messages are pulled.

Create a Cloud Pub/Sub topic

Messages are published to the topic. Follow these steps to create the topic:

  1. Browse to the Cloud Pub/Sub topic list page in the Cloud Platform Console:

    Open the Pub/Sub page

  2. Click Create a topic. A dialog box opens.

  3. Add "traffic" to the end of the path that is provided for you in the Name field. The path is determined by the system. You can provide only the name for the topic.


  4. Click Create. The dialog box closes.

  5. Stay on the page.

Create a Cloud Pub/Sub subscription

Messages published to the topic are received through the subscription. Follow these steps to create the subscription:

  1. In the topic list, in the row that contains the traffic topic, click the downward arrow on the right-hand end of the row.
  2. Click New subscription to open the Create a new subscription page.
  3. Add "mysubscription" to the end of the path that is provided for you in the Subscription name field.


  4. Select Pull for the Delivery Type.

  5. Click Create.
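If you prefer the command line, you can create the same topic and subscription with the gcloud tool instead of the console. The following commands are a sketch; depending on your Cloud SDK version, the pubsub command group might be available only as gcloud beta pubsub:

    gcloud pubsub topics create traffic
    gcloud pubsub subscriptions create mysubscription --topic=traffic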

Setting up BigQuery

To prepare BigQuery to receive the data that you want to analyze, you must create a dataset, which is the logical container for data tables, and then add a table to the dataset. The table defines the schema and stores the data.

  1. In your terminal, change to the resources directory:

    cd resources
    
  2. Create the empty dataset, named sandiego_freeways:

    bq mk sandiego_freeways
    

    The terminal shows a confirmation message:

    Dataset '[YOUR_PROJECT_ID]:sandiego_freeways' successfully created.

  3. Add the table, named geocoded_journeys, to the dataset:

    bq mk --schema geocoded_journeys.json sandiego_freeways.geocoded_journeys
    

    The terminal prints a confirmation message:

    Table '[YOUR_PROJECT_ID]:sandiego_freeways.geocoded_journeys' successfully created.

    This command creates a table that conforms to the schema defined in geocoded_journeys.json.
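The schema file maps each CSV field, plus the values added by reverse geocoding, to a BigQuery column. The following is a sketch of what geocoded_journeys.json looks like, with column names taken from the row object used later in this tutorial; the types shown here are assumptions, and the file in the repository is authoritative:

    [
      {"name": "VehicleID", "type": "STRING"},
      {"name": "UTCTime", "type": "STRING"},
      {"name": "Offset", "type": "INTEGER"},
      {"name": "Address", "type": "STRING"},
      {"name": "Zipcode", "type": "STRING"},
      {"name": "Speed", "type": "FLOAT"},
      {"name": "Bearing", "type": "FLOAT"},
      {"name": "Elevation", "type": "FLOAT"},
      {"name": "Latitude", "type": "FLOAT"},
      {"name": "Longitude", "type": "FLOAT"}
    ]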

Viewing the schema

To view the schema in the BigQuery console, follow these steps:

  1. Open the BigQuery Console:

    BigQuery console

  2. Near the top of the left-hand panel, locate your project name. You should see sandiego_freeways listed below the project name.

  3. Click sandiego_freeways to expand the node.
  4. Click geocoded_journeys.

You should see the schema displayed in the right-hand panel of the console:

BigQuery schema can be viewed in a table.

Loading the data into BigQuery

Now you can perform the steps that import the data from the CSV files, transcode the data, and load it into a BigQuery table.

Pushing the data to the topic

To push the data to the topic, first modify the setup file and then run the Python script.

Modifying the setup file

  1. From the resources directory, use your preferred text editor to edit setup.yaml.

    env:
    # Change to your project ID
        PROJECT_ID: 'your-project-id'
    # Change to  datasetid
        DATASET_ID: 'sandiego_freeways'
    # Change to tableid
        TABLE_ID: 'geocoded_journeys'
    # Change this to your pubsub topic
        PUBSUB_TOPIC: 'projects/your-project-id/topics/traffic'
    # Change the following to your rootdir
        ROOTDIR: '/tmp/creds/data'
    # Change the following to your pull subscription    
        SUBSCRIPTION: 'projects/your-project-id/subscriptions/mysubscription'
    # Change to your Google Maps API Key, see https://developers.google.com/maps/web-services/
        MAPS_API_KEY: 'Your-server-key'

  2. For PROJECT_ID, replace your-project-id with your project ID. Keep the single quotation marks in this and all other values that you replace.

  3. For DATASET_ID, don't change sandiego_freeways.
  4. For TABLE_ID, don't change geocoded_journeys.
  5. For PUBSUB_TOPIC, replace your-project-id with your project ID.
  6. For ROOTDIR, replace the provided path with resources/data.
  7. For SUBSCRIPTION, replace your-project-id with your project ID.
  8. For MAPS_API_KEY, replace Your-server-key with the server key you created and named "Maps tutorial server key". You can see your credentials by clicking the following button:

    View credentials

  9. Save and close the file.
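Both the push and pull scripts read this file and look up values under the env key, as the cfg["env"] references later in this tutorial show. A minimal sketch of how the configuration might be loaded with PyYAML (the repository's scripts may differ in details):

    import yaml

    # Load the tutorial configuration from setup.yaml in the current directory.
    with open("setup.yaml") as config_file:
        cfg = yaml.safe_load(config_file)

    pubsub_topic = cfg["env"]["PUBSUB_TOPIC"]
    subscription = cfg["env"]["SUBSCRIPTION"]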

Running the push script

From the bigquery-reverse-geolocation root directory, run the Python script that populates the Cloud Pub/Sub topic.

  1. Change directory to the root of the project:

    cd ..
    
  2. Run the script:

    python config_geo_pubsub_push.py
    

You should see repeated lines of output like this one:

Vehicle ID: 1005, location: 33.2354833333, -117.387343333; speed: 44.698 mph, bearing: 223.810 degrees

This output confirms that each line of data has been pushed. If you don't see many lines of such output, double-check setup.yaml to ensure that you provided all the correct path and name information.

It can take some time to push all the data to Cloud Pub/Sub.

Understanding the push script

The code in config_geo_pubsub_push.py performs some straightforward tasks. First, the code creates a Cloud Pub/Sub client.

def create_pubsub_client(http=None):
    credentials = oauth2client.GoogleCredentials.get_application_default()
    if credentials.create_scoped_required():
        credentials = credentials.create_scoped(PUBSUB_SCOPES)
    if not http:
        http = httplib2.Http()
    credentials.authorize(http)
    return discovery.build('pubsub', 'v1', http=http)

Next, the code finds a CSV data file and opens it.

with open(myfile) as data_file:
    reader = csv.reader(data_file)
    for line in reader:
        line_count += 1

        if line_count > 1:
            # Convert NMEA GPS format to decimal degrees.
            # See http://www.gpsinformation.org/dale/nmea.htm#position for NMEA GPS format details.
            lat = float(line[3][0:2])
            lng = float(line[5][0:3])
            lng_minutes = float(line[5][3:])/60
            lat_minutes = float(line[3][2:])/60
            latitude = lat + lat_minutes
            longitude =  0 - (lng + lng_minutes)
            ts = create_timestamp(line[1],line[9])
            msg_attributes = {'timestamp': ts}
            print "Vehicle ID: {0}, location: {1}, {2}; speed: {3} mph, bearing: {4} degrees".format(vehicleID, latitude,longitude, line[7], line[8])
            proc_line =  "{0}, {1}, {2}, {3} ,{4} ".format(vehicleID, latitude,longitude, line[7], line[8])
            publish(client, pubsub_topic, proc_line, msg_attributes)

For each line in the CSV file, the script performs a basic conversion on the latitude and longitude values to format them in units of degrees. Then, the code formats a timestamp based on the time information in the CSV file, and saves the timestamp in the msg_attributes variable. After logging the values to the terminal window, the code formats the data into a line and publishes the data to the Cloud Pub/Sub topic.

def publish(client, pubsub_topic, data_line, msg_attributes=None):
    """Publish to the given pubsub topic."""
    data = base64.b64encode(data_line)
    msg_payload = {'data': data}
    if msg_attributes:
        msg_payload['attributes'] = msg_attributes
    body = {'messages': [msg_payload]}
    resp = client.projects().topics().publish(
        topic=pubsub_topic, body=body).execute(num_retries=NUM_RETRIES)
    return resp
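As a quick check of the coordinate conversion performed in the loop above: the NMEA latitude field stores degrees and minutes together as ddmm.mmmm. For example, a value of 3314.129 (chosen here to match the sample output shown earlier) converts like this:

    # Illustrative NMEA-to-decimal-degrees conversion; not part of the script.
    nmea_lat = "3314.129"                 # ddmm.mmmm: 33 degrees, 14.129 minutes
    degrees = float(nmea_lat[0:2])        # 33.0
    minutes = float(nmea_lat[2:]) / 60    # 14.129 / 60 = 0.2354833...
    latitude = degrees + minutes          # 33.2354833..., as in the sample output

Longitudes are handled the same way, except that the script negates the result because the data is in the western hemisphere.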

Pulling data from the topic

To pull the data from the topic and load it into your BigQuery table, you run another Python script. This script uses the same setup.yaml file that you already modified.

Running the pull script

Run the Python script that pulls data from Cloud Pub/Sub and loads it into BigQuery:

python config_geo_pubsub_pull.py

You should see repeated output like this:

Appended one row to BigQuery.
Address: Vandegrift Blvd, Oceanside, CA 92058, USA
Elevation: 29.9088001251 metres
Timezone: America/Los_Angeles

This output confirms that each line of data has been loaded into your BigQuery table.

It can take some time to pull all the data from the topic. When it's done, the terminal window will stop showing lines of output as it waits for further data. You can exit the process at any time by pressing Control+C.

Understanding the pull script

When you ran config_geo_pubsub_pull.py, the script performed some important work, so it's worth taking a moment to review it and understand what happened.

First, the code creates a Cloud Pub/Sub client object, exactly like the push script did. The code also sets up some configuration values, such as the size of a message batch and some limits for geocoding operations to stay within daily quotas.

def main(argv):

    client = create_pubsub_client()

    # You can fetch multiple messages with a single API call.
    batch_size = 100

    # Options to limit number of geocodes e.g to stay under daily quota.
    geocode_counter = 0
    geocode_limit = 10

    # Option to wait for some time until daily quotas are reset.
    wait_timeout = 2

Next, the code creates an instance of the Google Maps API client and builds the HTTP POST body for the Cloud Pub/Sub pull requests. The following code also retrieves the name of the subscription from the setup.yaml file.

# Create a Google Maps API client.
gmaps = googlemaps.Client(key=cfg["env"]["MAPS_API_KEY"])
subscription = cfg["env"]["SUBSCRIPTION"]

# Create a POST body for the Cloud Pub/Sub request.
body = {
    # Setting returnImmediately to False instructs the API to wait
    # until it has collected up to maxMessages messages, or until
    # the timeout expires.
    'returnImmediately': False,
    'maxMessages': batch_size,
}

The code then enters a loop that runs until you terminate the process, such as by pressing Control+C. This loop pulls messages using the previously cached Cloud Pub/Sub subscription name:

while running_proc:
    # Pull messages from Cloud Pub/Sub
    resp = client.projects().subscriptions().pull(
        subscription=subscription, body=body).execute()

    received_messages = resp.get('receivedMessages')

The code processes each message. The key point here is that the code uses the Google Maps API to reverse geocode the latitude and longitude into a street address. Click View on GitHub to see the rest of the data extraction and the transcoding of the data into the format stored in BigQuery.

# Extract latitude,longitude for input into Google Maps API calls.
latitude = float(data_list[1])
longitude = float(data_list[2])

# Construct a row object that matches the BigQuery table schema.
row = {'VehicleID': data_list[0], 'UTCTime': None, 'Offset': 0,
       'Address': "", 'Zipcode': "", 'Speed': data_list[3],
       'Bearing': data_list[4], 'Elevation': None,
       'Latitude': latitude, 'Longitude': longitude}

# Maps API Geocoding has a daily limit - this lets us limit API calls during development.
if geocode_counter <= geocode_limit:

    # Reverse geocode the latitude, longitude to get street address, city, region, etc.
    address_list = reverse_geocode(gmaps, latitude, longitude)

The code saves the row of data to BigQuery.

# save a row to BigQuery
result = stream_row_to_bigquery(bq, row)

Finally, the code sends an acknowledgement for the original message and then repeats the loop.
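For reference, here is a sketch of what those two steps involve with the API client libraries used in this tutorial. The constants and the received_messages list mirror names used elsewhere in the scripts; the repository's stream_row_to_bigquery function and acknowledgement code may differ in details such as batching and retries.

    def stream_row_to_bigquery(bq, row):
        # Stream one row into the table with the BigQuery tabledata.insertAll method.
        body = {'rows': [{'json': row}]}
        return bq.tabledata().insertAll(
            projectId=PROJECT_ID,
            datasetId=DATASET_ID,
            tableId=TABLE_ID,
            body=body).execute(num_retries=NUM_RETRIES)

    # Acknowledge the pulled messages so that Cloud Pub/Sub does not redeliver them.
    ack_ids = [msg.get('ackId') for msg in received_messages]
    client.projects().subscriptions().acknowledge(
        subscription=subscription, body={'ackIds': ack_ids}).execute()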

Analyzing the data

Now that you have transcoded and loaded the data into BigQuery, you can use BigQuery to gain insights. This section of the tutorial shows you how to use the BigQuery console to run a few simple queries against this data.

  1. Open the BigQuery Console:

    BigQuery Console

  2. Select the sandiego_freeways dataset.

  3. Click the Compose Query button.
  4. In the New Query text box, enter the following query that produces average speed by zip code:

    SELECT AVG(Speed) avg_speed, Zipcode FROM [sandiego_freeways.geocoded_journeys]
    WHERE Zipcode <> ''
    GROUP BY Zipcode ORDER BY avg_speed DESC
    

You should see results like this:

BigQuery shows results in a table.

Here are two more queries you can try:

Average speed by street name

SELECT AVG(Speed) as avg_speed FROM [sandiego_freeways.geocoded_journeys]
WHERE Address CONTAINS('Vandegrift Blvd')

Worst speeding places

SELECT Speed, VehicleID, Address, Zipcode FROM [sandiego_freeways.geocoded_journeys]
WHERE Speed > 65
ORDER BY Speed DESC

Visualizing the data

You can use Google Maps to visualize the data you stored in BigQuery. This part of the tutorial shows you how to superimpose a heat map visualization onto a map of the region. The heat map shows the volume of traffic activity captured in the data in BigQuery.

To keep the tutorial straightforward, the provided example uses OAuth 2.0 to authenticate the user for the BigQuery service. You could choose another approach that might be better-suited for your scenario. For example, you could export query results from BigQuery and create a static map layer that doesn’t require the end user to authenticate against BigQuery, or you could set up authentication by using a service account and a proxy server.

To show the data visualization, follow these steps.

Modify bqapi.html

The file named bqapi.html is the web page source file. It requires some changes to work with your data. For these modifications, you need to use keys and credentials you created earlier. You can see these values in the Cloud Platform Console on the Credentials page.

Open Credentials

  1. Make a copy of the file named bqapi.html. You can find the file in the following directory where you installed the source code:

    bigquery-reverse-geolocation/web/
    
  2. Open the file in a text editor.

  3. In the following script element, in the src attribute, replace Your-Maps-API-Key with your Google Maps API browser key:

    <script src="https://maps.googleapis.com/maps/api/js?libraries=visualization,drawing&key=Your-Maps-API-Key">
    </script>

  4. For the clientId variable, replace Your-Client-ID with the OAuth 2.0 client ID you created earlier.

  5. For the projectId variable, replace Your-Project-ID with your project ID.
  6. Save the file.

Viewing the web page

You can serve the web page from the Python simple HTTP server. Follow these steps:

  1. In your terminal window, navigate to the bigquery-reverse-geolocation/web directory where you cloned the source code.
  2. Run the web server (Python 2; see the note after these steps if you use Python 3):

    python -m SimpleHTTPServer
    
  3. From your web browser, browse to the following URL. Replace [YOUR_IP_ADDRESS] with the address of your computer. Recall that you found this IP address in a previous step and used it to set the origin URL for your OAuth 2.0 client ID.

    http://[YOUR_IP_ADDRESS]:8000/bqapi.html
    

    If your browser has a pop-up blocker, you must disable it for traffic on localhost:8000 and then refresh the page.

  4. Click Allow in the OAuth 2.0 authentication pop-up dialog. You won't have to repeat this flow in this session if, for example, you reload the web page.

  5. After the map has loaded, select the rectangle tool in the upper-left corner of the map.
  6. Use the tool to draw a rectangle around the entire currently visible land mass on the map.
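Note for step 2 above: the SimpleHTTPServer module exists only in Python 2. If python on your system runs Python 3, serve the page with the built-in http.server module instead:

    python3 -m http.server 8000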

The page shows a heat map, similar to the following map. Exactly where the heat map regions display on the map depends on the data you loaded into BigQuery.

Google Maps can superimpose a heat map.

Enter Control+C in your terminal window to stop serving the web page.

Understanding the web page script

The web page uses the Google Maps JavaScript API to perform its work. You saw how the page sets up some of the configuration, such as how it references the visualization library when you added your browser key. In this section, you take a deeper look at how the page authorizes the user, retrieves data, and renders the heat-map regions.

Authorizing the user

The following functions handle authentication and authorization through OAuth 2.0. The function named authorise makes the request for authorization. The function named handleAuthResult receives a callback from the OAuth 2.0 library with the result of the request. If the result is successful, the function named loadAPI loads the BigQuery API.

function authorise(event) {
  gapi.auth.authorize({client_id: clientId, scope: scopes, immediate: false}, handleAuthResult);
  return false;
}

// If authorized, load BigQuery API.
function handleAuthResult(authResult) {
  if (authResult && !authResult.error) {
    loadApi();
  } else {
    console.log("Sorry, you are not authorised to access BigQuery.")
  }
}

// Load BigQuery client API.
function loadApi(){
  gapi.client.load('bigquery', 'v2').then(
    function() {
      console.log('BigQuery API loaded.');
      createMap();
    }
  );
}

Fetching the data

Recall that you draw a rectangle around the region of the map where you want to see the heat maps. The rectangle you draw defines a set of coordinate boundaries that restrict the subset of data to retrieve from BigQuery. The function named setUpDrawingTools adds an event listener that notifies your code when the rectangle is drawn.

google.maps.event.addListener(drawingManager, 'rectanglecomplete', function (rectangle) {
  currentShape = rectangle;
  rectangleQuery(rectangle.getBounds());
});

The callback is handled by the rectangleQuery function:

function rectangleQuery(latLngBounds){
  var queryString = rectangleSQL(latLngBounds.getNorthEast(), latLngBounds.getSouthWest());
  sendQuery(queryString);
}

The rectangleQuery function calls rectangleSQL, which constructs a SQL string based on the boundaries of the rectangle.

// Construct the SQL for a rectangle query.
function rectangleSQL(ne, sw){
  var queryString = "SELECT Latitude, Longitude "
  queryString +=  "FROM [" + projectId + ":" + datasetId + "." + table_name + "]"
  queryString += " WHERE Latitude > " + sw.lat();
  queryString += " AND Latitude < " + ne.lat();
  queryString += " AND Longitude > " + sw.lng();
  queryString += " AND Longitude < " + ne.lng();
  queryString += " LIMIT " + recordLimit;
  return queryString;
}

You can see that this function uses the southwest and northeast corners of the rectangle to define the latitude and longitude boundaries for the query. For example, the longitude returned by sw.lng() coincides with the left-hand vertical edge of the rectangle. Any longitude in the dataset that is greater than this value lies to the right of that edge, and is therefore inside the boundary of the rectangle. Similar logic applies to the other three sides of the rectangle.

The sendQuery function executes the query through the BigQuery API by using the Google API Client Library for JavaScript:

function sendQuery(queryString){
  var request = gapi.client.bigquery.jobs.query({
      "query": queryString,
      "timeoutMs": 30000,
      "datasetId": datasetId,
      "projectId": projectId
  });
  request.execute(function(response) {
      console.log(response);
      checkJobStatus(response.jobReference.jobId);
  });
}

The client library produces a URL and POST body similar to the following examples:

URL:

https://content.googleapis.com/bigquery/v2/projects/[YOUR_PROJECT_ID]/queries?alt=json

POST body:

{
"query":"SELECT Latitude, Longitude FROM [YOUR_PROJECT_ID]:sandiego_freeways.geocoded_journeys WHERE Latitude > 32.685041939169665 AND Latitude < 32.85536439443039 AND Longitude > -117.31063842773438 AND Longitude < -117.05451965332031 LIMIT 10000"
}

BigQuery responds with a jobId, which the script uses to poll the API and check the job status until the results are ready to retrieve:

GET https://content.googleapis.com/bigquery/v2/projects/[YOUR_PROJECT_ID]/queries/[JOB_ID]

The checkJobStatus function shows how to check the status of a job periodically, calling the get method with the jobId returned by the original query request. The function in the sample uses a 500 millisecond timeout.

function checkJobStatus(jobId){
  var request = gapi.client.bigquery.jobs.get({
    "projectId": projectId,
    "jobId": jobId
  });
  request.execute(function(response){
    if(response.status.errorResult){
      console.log(response.status.error);
    } else {
      if(response.status.state == 'DONE'){
        //get the results
        clearTimeout(jobCheckTimer);
        getQueryResults(jobId);
      } else {
        // No error, not finished, check again in a moment.
        console.log("Job running, waiting 0.5 seconds...");
        jobCheckTimer = setTimeout(checkJobStatus, 500, [jobId]);
      }
    }
  });
}

The checkJobStatus function calls getQueryResults. This function uses the jobs.getQueryResults method to get the results and then passes them to the doHeatMap function.

function getQueryResults(jobId){
  var request = gapi.client.bigquery.jobs.getQueryResults({
    "projectId": projectId,
    "jobId": jobId
  });
  request.execute(function(response){
    // Draw a heatmap from the list of rows returned.
    doHeatMap(response.result.rows);
  })
}

Showing the heat map visualization

It's important to understand that the amount of data that can be returned from a BigQuery dataset can be huge, sometimes amounting to petabytes of data. You must be careful to aggregate such data in a way that makes it usable so that it can be processed and displayed in a reasonable amount of time. For example, trying to plot the location of every row of traffic data would be untenable in this scenario. Fortunately, the Maps API provides the visualization.HeatmapLayer object, which is well-suited for this purpose. There are more details in the Maps API Too Many Markers developer guide.

The doHeatMap function creates the heat map and then superimposes the visualization onto the map that is displayed in the browser.

function doHeatMap(rows){
  // Remove the user drawing.
  if(currentShape){
    currentShape.setMap(null);
  }
  var heatmapData = [];
  if(heatmap!=null){
    heatmap.setMap(null);
  }
  if(rows){
    for (var i = 0; i < rows.length; i++) {
        var f = rows[i].f;
        var coords = { lat: parseFloat(f[0].v), lng: parseFloat(f[1].v) };
        var latLng = new google.maps.LatLng(coords);
        heatmapData.push(latLng);
    }
    heatmap = new google.maps.visualization.HeatmapLayer({
        data: heatmapData
    });
    heatmap.setMap(map);
  }
}

Additional tips

If you’re working with very large tables, your query might return too many rows to display efficiently on a map. You can limit the results by adding a WHERE clause or a LIMIT statement to the SQL query.

BigQuery scans every row of each column referenced in a query. To optimize your BigQuery quota usage, select only the columns you need in your query.
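For example, the following query selects only the two columns needed for a heat map, filters by ZIP code, and caps the number of rows returned (illustrative; adjust the filter and limit to your own data):

    SELECT Latitude, Longitude
    FROM [sandiego_freeways.geocoded_journeys]
    WHERE Zipcode = '92058'
    LIMIT 10000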

Queries run faster if you store latitude and longitude as float rather than string.

There are other ways to use SQL to run spatial queries against data in BigQuery, including queries that approximate a bounding circle, and User Defined Functions that can be used to construct more-advanced geometry operations. There are examples of bounding-box and circle-radius queries in the Advanced Examples section of the BigQuery reference.

Cleaning up

After you've finished the reverse-geocoding tutorial, you can clean up the resources you created on Google Cloud Platform so you won't be billed for them in the future. The following sections describe how to delete or turn off these resources.

Deleting the project

The easiest way to clean up most Cloud Platform resources is to delete the Cloud Platform Console project.

  1. In the Cloud Platform Console, go to the Projects page.

    Go to the Projects page

  2. In the project list, select the checkbox next to the project you want to delete, and then click Delete project.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Deleting data stored in BigQuery

To delete stored data, follow these steps:

  1. Open the BigQuery Console:

    BigQuery console

  2. In the left-hand panel, point to the dataset name and then click the downward-facing arrow.
  3. Click Delete dataset.
  4. Follow the instructions to confirm the deletion.

Deleting the Cloud Pub/Sub topic and subscription

To delete the Cloud Pub/Sub components:

  1. Open the Cloud Pub/Sub topic list page in the Cloud Platform Console:

    Open the Pub/Sub page

  2. In the topic list, select the checkbox for the topic.
  3. Click Delete and confirm the operation.

What's next

  • Try out other Google Cloud Platform features for yourself. Have a look at our tutorials.
