Open Credo

April 5, 2018 | Cloud, DevOps, Hashicorp, Terraform Provider

Google Cloud Functions with Terraform

Google Cloud Functions is the Google Cloud Platform (GCP) function-as-a-service offering. It allows you to execute your code in response to event triggers – HTTP, PubSub and Storage. While it currently only supports Node.js code for execution, it has proved very useful for running low-frequency operational tasks and other batch jobs in GCP.


When using Hashicorp Terraform to create your infrastructure, it was previously not possible to create Google Cloud Functions. This was painful, as you were required to create Cloud Functions using the gcloud tool, losing the benefits of Terraform such as declarative syntax and convergence onto state. At best, you ended up with two codebases (Terraform and gcloud) rather than one.

But times have changed…! I recently implemented Cloud Functions in the Terraform Google Cloud provider, and will demonstrate here how you can now create Google Cloud Functions using Terraform.

Cloud Function Resource

So let’s look at the Terraform Cloud Function resource definition:

resource "google_cloudfunctions_function" "test" {
  name                  = "[FunctionName]"
  entry_point           = "helloGET"
  available_memory_mb   = 128
  timeout               = 61
  project               = "[GCPProjectName]"
  region                = "us-central1"

  # N.B. Specify only one of the following three trigger parameters.
  trigger_http          = true
  trigger_topic         = "[PubSubTopic]"
  trigger_bucket        = "[StorageBucketName]"

  source_archive_bucket = "${google_storage_bucket.bucket.name}"
  source_archive_object = "${google_storage_bucket_object.archive.name}"

  labels {
    deployment_name = "test"
  }
}

As usual with Terraform, this resource definition is straightforward and declarative, but the parameters deserve a closer look.

You specify the function name, the function which acts as entry_point on invocation of the Cloud Function, memory available to the function call in MB (available_memory_mb), how long the Cloud Function has to execute in seconds (timeout), the GCP project name (project) and the GCP region (region).

Some less clear parameters are:

  • trigger_http configures the Cloud Function to be triggered via HTTP GET.
  • trigger_topic configures the Cloud Function to be triggered by a PubSub topic.
  • trigger_bucket configures the Cloud Function to be triggered by changes to a GCP Storage bucket.
  • (N.B. You should only specify one of the previous three parameters.)
  • Your function code needs to be delivered to the Cloud Function infrastructure. source_archive_bucket and source_archive_object specify the GCP Storage bucket and object that Cloud Functions can pull the function code from. Your code must be delivered into Cloud Storage as a zipped code archive. Here I specify these fields by referencing other Terraform resource definitions…
resource "google_storage_bucket" "bucket" {
  name = "cloudfunction-deploy-test1"
}

data "archive_file" "http_trigger" {
  type        = "zip"
  output_path = "${path.module}/files/http_trigger.zip"

  source {
    content  = "${file("${path.module}/files/http_trigger.js")}"
    filename = "index.js"
  }
}

resource "google_storage_bucket_object" "archive" {
  name       = "http_trigger.zip"
  bucket     = "${google_storage_bucket.bucket.name}"
  source     = "${path.module}/files/http_trigger.zip"
  depends_on = ["data.archive_file.http_trigger"]
}

Here I am creating a Storage bucket, zipping up the function code and delivering it into the GCP Storage bucket.
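The same wiring works for the other trigger types. As a sketch, a PubSub-triggered variant of the function resource would set only trigger_topic (the function and topic names here are placeholder values, and the source archive fields reference the bucket and object resources defined above):

```hcl
# Sketch of a PubSub-triggered function: exactly one trigger parameter is set,
# and trigger_http / trigger_bucket are omitted entirely.
resource "google_cloudfunctions_function" "pubsub_example" {
  name                  = "[FunctionName]"
  entry_point           = "helloPubSub"
  available_memory_mb   = 128
  timeout               = 60
  project               = "[GCPProjectName]"
  region                = "us-central1"
  trigger_topic         = "[PubSubTopic]"
  source_archive_bucket = "${google_storage_bucket.bucket.name}"
  source_archive_object = "${google_storage_bucket_object.archive.name}"
}
```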

Sample Function

This sample function, called helloGET (remember to set this as the entry_point), looks like this…

/**
 * HTTP Cloud Function.
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloGET = function helloGET (req, res) {
    res.send(`Hello ${req.query.name || 'World'}!`);
};
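Once this HTTP-triggered function is deployed, Terraform can also report where to invoke it. A minimal sketch, assuming the provider exports the https_trigger_url attribute on the google_cloudfunctions_function resource (it is only meaningful when trigger_http is set):

```hcl
# Expose the HTTPS endpoint of the deployed function as a Terraform output.
output "function_url" {
  value = "${google_cloudfunctions_function.test.https_trigger_url}"
}
```

After terraform apply, the URL appears in the command output and can be retrieved later with terraform output function_url.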


Here we showed how to get started creating Google Cloud Functions with Terraform. In general, Terraform makes defining infrastructure-as-code easy and intuitive, and deploying functions is a great fit for it. Your next steps might be to see what else you can define in Terraform. Alternatively, for more info about this resource, please see the documentation.


This blog is written exclusively by the OpenCredo team. We do not accept external contributions.


