This tutorial demonstrates using a Cloud Function to create a serverless, scheduled import for data management or data science workflows. One such use case is when a third party uploads data to a Cloud Storage bucket in your GCP project on a regular basis. Instead of manually importing the CSV or AVRO files to BigQuery each day, you can use a Cloud Function with a trigger on `object.finalize` for a given bucket. That way, whenever a CSV or AVRO file is uploaded to that bucket, the function imports the file into a new BigQuery table in the specified dataset.
Here is how to set it up:
- Enable the Cloud Functions, Cloud Storage, and BigQuery APIs in the GCP Console.
- Open Cloud Shell in the GCP Console.
- Download the zip with all files.
- Update the `--trigger-resource` flag with your source Cloud Storage bucket (replace `avro-import-source`).
- Deploy the function.
- Verify the function is running in the GCP Console.
- Upload an AVRO file to the source Cloud Storage bucket you specified in the `--trigger-resource` flag.
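The exact commands ship with the tutorial's zip; as a rough sketch of the deploy-and-test steps above (the function name `import_avro`, the Python runtime, and the entry point are placeholder assumptions, and `avro-import-source` should be replaced with your own bucket), they might look like:

```shell
# Deploy the function with an object.finalize trigger on the source bucket.
# Function name, runtime, and entry point here are assumptions, not the
# tutorial's actual values.
gcloud functions deploy import_avro \
  --runtime python39 \
  --entry-point import_avro \
  --trigger-resource avro-import-source \
  --trigger-event google.storage.object.finalize

# Verify the deployment, then upload a test AVRO file to fire the trigger.
gcloud functions describe import_avro
gsutil cp sample.avro gs://avro-import-source/
```

Binding `--trigger-event` to `google.storage.object.finalize` means the function fires only after an upload completes, so it never reads a half-written object.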
Once deployed, the Cloud Function waits for new objects to be finalized on the source Cloud Storage bucket. When a new AVRO file is uploaded there, the function uses the BigQuery API to load the data into a new table.
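As a minimal sketch of what such a function could look like in Python (the dataset name, the table-naming helper, and the entry-point name are illustrative assumptions, not the tutorial's actual code):

```python
# Hypothetical Cloud Function sketch: triggered by google.storage.object.finalize
# on the source bucket, it loads the uploaded AVRO file into a new BigQuery table.
import os

# Target dataset; the name "avro_imports" is an assumed default.
DATASET = os.environ.get("BQ_DATASET", "avro_imports")


def table_name_from_object(object_name):
    """Derive a BigQuery table name from the uploaded object's name,
    e.g. 'sales/2024-01-01.avro' -> 'sales_2024_01_01'."""
    base = object_name.rsplit(".", 1)[0]
    # BigQuery table names allow only letters, digits, and underscores.
    return "".join(c if c.isalnum() else "_" for c in base)


def import_avro(event, context):
    """Background-function entry point; `event` carries the GCS object metadata."""
    # Imported lazily so cold starts stay fast and the pure helper above
    # stays testable without GCP credentials.
    from google.cloud import bigquery

    client = bigquery.Client()
    uri = f"gs://{event['bucket']}/{event['name']}"
    table_id = f"{client.project}.{DATASET}.{table_name_from_object(event['name'])}"
    job_config = bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.AVRO)
    # Block until the load job finishes so errors surface in the function logs.
    client.load_table_from_uri(uri, table_id, job_config=job_config).result()
```

Because AVRO files embed their own schema, no explicit schema needs to be passed to the load job, which is what makes this hands-off import practical.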