title: AVRO/CSV Import to BigQuery from Cloud Storage with a Cloud Function
description: Use this Cloud Function to import AVRO or CSV files into BigQuery from Google Cloud Storage.
author: mikekahn
tags: Cloud Functions, BigQuery, Cloud Storage
date_published: 2018-05-06
This tutorial demonstrates using a Cloud Function to create a serverless, event-driven import for data management or data science workflows. A common use case is a third party that uploads data to a Cloud Storage bucket in your GCP project on a regular basis. Instead of manually importing the CSV or AVRO files into BigQuery each day, you can deploy a Cloud Function with an `object.finalize` trigger on that bucket. Whenever a CSV or AVRO file is uploaded to the bucket, the function imports the file into a new BigQuery table in the specified dataset.
Here is how to set it up:
- Enable the Cloud Functions, Cloud Storage, and BigQuery APIs in the GCP Console.
- Open Cloud Shell in the GCP Console.
- Download the zip with all files.
- Update the `--trigger-resource` flag with your source Cloud Storage bucket (replace `avro-import-source`).
- Deploy the function:
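A deploy command along these lines should work from the directory containing the function source (the function name `avro_import` and the runtime are assumptions; keep the `--trigger-resource` value in sync with your bucket):

```shell
# Deploy a Cloud Function that fires whenever an object is finalized
# in the source bucket. Replace avro-import-source with your bucket.
gcloud functions deploy avro_import \
    --runtime nodejs8 \
    --trigger-resource avro-import-source \
    --trigger-event google.storage.object.finalize
```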
- Verify the function is running in the GCP Console.
- Upload an AVRO file to the source Cloud Storage bucket you specified in the `--trigger-resource` flag.
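After uploading a file, the import can be checked from Cloud Shell; for example (the function name `avro_import` and the dataset name are assumptions from this sketch):

```shell
# Inspect recent executions of the function
gcloud functions logs read avro_import --limit 20

# List tables in the destination dataset to confirm the import
bq ls avro_import
```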
Once deployed, the Cloud Function waits for new objects to be finalized in the source Cloud Storage bucket. When a new AVRO file is uploaded to the source bucket, the function uses the BigQuery API to load the data into a new table.

Note: This creates a new table for each import. You might need to update the function to replace previous tables instead.
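One way to make each import replace the previous table, assuming the function builds a `LoadJobConfig` with the google-cloud-bigquery client, is to set the load job's write disposition (the default for load jobs appends to an existing table):

```python
from google.cloud import bigquery

job_config = bigquery.LoadJobConfig()
job_config.source_format = "AVRO"
# WRITE_TRUNCATE overwrites the destination table's contents on each load,
# instead of the default WRITE_APPEND behavior.
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
```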