
Deploying a Node.js API with Cloud Functions and Terraform

November 01, 2020

TL;DR: Here is the repository with the code

In this tutorial we are going to build a simple Node.js API with multiple routes and deploy it as a single function on Google Cloud Platform (GCP).

The main idea is to spend as little money as possible on infrastructure, ideally none. Cloud Functions have a pay-as-you-go pricing model with the first 2 million invocations per month free, which makes them a great candidate.

We also don’t want things to get too complex, so we’re going with a single Cloud Function whose routes are managed by a Node.js HTTP framework - in this case Fastify, but feel free to plug in Express or any other.

I use this setup on personal projects. It’s great when you want to deploy something quickly, at no cost, and have it ready to use.

The tutorial assumes a basic knowledge of Terraform - something you can pick up from a couple of introductory videos on YouTube.

What you will need

GCP account. Create one at cloud.google.com. You will also need to activate billing, but the whole setup should cost you close to zero.

A few command line tools. Make sure you have installed terraform, gcloud and node. There should be a Homebrew formula for each one of these.
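If you are on macOS, installing them might look like this (the package names are my assumption of the current Homebrew ones, so double-check if they fail):

brew install terraform node
brew install --cask google-cloud-sdk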

Setting up GCP

With gcloud installed and authenticated, run this command to create a new project:

gcloud projects create PROJECT_ID --name="My App"

Note: replace PROJECT_ID and My App with values of your choice.

Yes, project creation and setup can be automated with Terraform as well, but we are doing it manually in this tutorial.

Now set your project as the default, to make it easier to work with the gcloud CLI:

gcloud config set project PROJECT_ID

We are going to use Cloud Functions and Cloud Storage, two GCP services that require a billing account to be associated with the project. To do that, run:

# List billing accounts available
gcloud beta billing accounts list

# Link a billing account to project
gcloud beta billing projects link PROJECT_ID --billing-account=BILLING_ACCOUNT_ID

And for the last part of our manual setup, we will create a Cloud Storage bucket to store the Terraform .tfstate files:

gsutil mb gs://PROJECT_ID-tfstate

gsutil comes bundled with the Google Cloud SDK.
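One optional but recommended extra: enable object versioning on the state bucket, so older versions of the state are kept around in case something goes wrong:

gsutil versioning set on gs://PROJECT_ID-tfstate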

Why create a bucket? If you are working with a team (more than one person) the code will typically be hosted in a VCS (like Git, for example). Terraform stores all the infrastructure state - the .tfstate files - locally by default. Committing the state files could be one option to share state, but the good practice is to host them in remote storage like a Cloud Storage bucket.

Project structure

Create all the files so your repository looks like this:

.
├── terraform
│   ├── modules
│   │   └── function
│   │       ├── main.tf
│   │       ├── outputs.tf
│   │       └── variables.tf
│   ├── main.tf
│   ├── backend.tf
│   ├── outputs.tf
│   ├── terraform.tfvars # Do not commit this one
│   └── variables.tf
├── src
│   └── index.js
└── package.json

The way it works is:

  • terraform/ for all the infrastructure code
  • src/ for all the Node.js code with the API

Writing our API code

Since the focus here is on the infrastructure, let’s write some very simple code for our API using Fastify.

First, create a package.json and install fastify:

npm init

npm i fastify

Now add this code on src/index.js:

// src/index.js

const fastify = require("fastify")
const app = fastify({ logger: true })

// In-memory data, just for demonstration purposes
const heroes = [
  { id: 1, name: "Iron Man" },
  { id: 2, name: "Thor" },
  { id: 3, name: "Black Widow" },
  { id: 4, name: "Hulk" },
]

// Health check route
app.get("/", async (req, res) => {
  return { works: true }
})

app.get("/heroes", async (req, res) => {
  return heroes
})

app.get("/heroes/:id", async (req, res) => {
  const { id } = req.params
  const hero = heroes.find(h => h.id === Number(id))

  // Returning undefined would make Fastify throw,
  // so answer with a 404 when the hero doesn't exist
  if (!hero) {
    res.code(404)
    return { error: "Hero not found" }
  }

  return hero
})

// Cloud Functions entry point: wait until Fastify is ready,
// then hand the incoming request over to its HTTP server
exports.app = async (req, res) => {
  await app.ready()
  app.server.emit("request", req, res)
}
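If you want to try the function locally before deploying, one option (an extra, not required by this setup) is Google’s Functions Framework, which emulates the Cloud Functions runtime:

# Serve the exported "app" function on http://localhost:8080
npx @google-cloud/functions-framework --target=app

# In another terminal
curl http://localhost:8080/heroes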

A few tweaks

Add your app entry point and the Node.js version to your package.json file:

// package.json

{
// ...
	"main": "./src/index.js",
	"engines": {
		"node": "12"
	}
}

This tells Cloud Functions which file to run and which Node.js version to use (v12 is the latest one supported as of this writing).

Terraform infrastructure

First we’re going to write a Terraform backend. The word “backend” in this context can be a bit confusing since we usually associate backends with REST APIs. This is not the case. A Terraform backend is nothing more than a way to tell Terraform how we want to manage our state - where to store the state files, how to apply that state, etc.

Backends are completely optional, but strongly recommended (by me, heh) in real-world projects.

Let’s write a backend for our state files:

# terraform/backend.tf

terraform {
  backend "gcs" {
    bucket = "PROJECT_ID-tfstate"
  }
}

Remember the bucket we created at the beginning? This is where we use it.

See here for a list of available backends.

As of today, Terraform backend blocks cannot reference variables, so we are left with hardcoding the bucket name there.
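A partial workaround, if you prefer not to hardcode it, is to leave the bucket out of backend.tf and pass it when initializing (we will run terraform init later on):

terraform init -backend-config="bucket=PROJECT_ID-tfstate"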

Now let’s write two simple variables that we’re going to use from now on:

# terraform/variables.tf

variable "project" {}
variable "region" {}

And in our terraform.tfvars we will add values for these variables:

# terraform/terraform.tfvars

project = "PROJECT_ID"
region  = "us-central1"

And in terraform/main.tf, the entry point for all our Terraform commands, we will declare our provider:

# terraform/main.tf

provider "google" {
  project = var.project
  region  = var.region
}

# More code here soon!!!

We are going to go back to this file soon.

Writing the Function module

First, let’s remember a couple of Terraform definitions:

Resources. They are infrastructure objects. For example, a resource can be a storage bucket, a virtual network or, in our case, a function.

Modules. Multiple resources can be combined to create something more meaningful and complex. This is done using Modules. They also provide many advantages such as organization, encapsulation and reusability.

To create a Cloud Function we will need the following:

  • A storage bucket, to store the code that will be executed by the function
  • The function itself
  • An IAM policy, to allow users to invoke the function

We already have all the file structure defined for our module. All the necessary files live inside terraform/modules/function, so let’s add the code to them:

In terraform/modules/function/variables.tf we add the arguments that will be needed every time we use the module. Think of them as function parameters:

# terraform/modules/function/variables.tf

variable "project" {}
variable "name" {}
variable "entry_point" {}

In terraform/modules/function/main.tf we add all the logic for creating the actual infrastructure:

⚠️ This file is a bit dense, so follow the comments to get an idea of what’s happening.

# terraform/modules/function/main.tf

locals {
  timestamp = formatdate("YYMMDDhhmmss", timestamp())
}

# Compress source code
data "archive_file" "source" {
  type        = "zip"
  source_dir  = "${path.module}/../../.." # repo root, resolved relative to this module
  output_path = "/tmp/function-${local.timestamp}.zip"
}

# Create bucket that will host the source code
resource "google_storage_bucket" "bucket" {
  name = "${var.project}-function"
}

# Add source code zip to bucket
resource "google_storage_bucket_object" "zip" {
  # Append file MD5 to force bucket to be recreated
  name   = "source.zip#${data.archive_file.source.output_md5}"
  bucket = google_storage_bucket.bucket.name
  source = data.archive_file.source.output_path
}

# Enable Cloud Functions API
resource "google_project_service" "cf" {
  project = var.project
  service = "cloudfunctions.googleapis.com"

  disable_dependent_services = true
  disable_on_destroy         = false
}

# Enable Cloud Build API
resource "google_project_service" "cb" {
  project = var.project
  service = "cloudbuild.googleapis.com"

  disable_dependent_services = true
  disable_on_destroy         = false
}

# Create Cloud Function
resource "google_cloudfunctions_function" "function" {
  name    = var.name
  runtime = "nodejs12"

  available_memory_mb   = 128
  source_archive_bucket = google_storage_bucket.bucket.name
  source_archive_object = google_storage_bucket_object.zip.name
  trigger_http          = true
  entry_point           = var.entry_point

  # Make sure the APIs are enabled before creating the function
  depends_on = [google_project_service.cf, google_project_service.cb]
}

# Create IAM entry so all users can invoke the function
resource "google_cloudfunctions_function_iam_member" "invoker" {
  project        = google_cloudfunctions_function.function.project
  region         = google_cloudfunctions_function.function.region
  cloud_function = google_cloudfunctions_function.function.name

  role   = "roles/cloudfunctions.invoker"
  member = "allUsers"
}

Notice we are using the variables we declared before in terraform/modules/function/variables.tf to make it somewhat dynamic and allow us to create multiple functions just by playing with the values.

And lastly, we will want to know the URL of the newly created function. We can get it by defining outputs for our module. When running terraform apply, outputs are printed at the end of the CLI output.

Outputs can also be used to pass data between modules.

In terraform/modules/function/outputs.tf we ask for the https_trigger_url of the function, which will be known after we apply the configuration:

# terraform/modules/function/outputs.tf

output "function_url" {
  value = google_cloudfunctions_function.function.https_trigger_url
}

Putting it together

Since modules work like functions, we now need to call them. Let’s go back to our entry point file in terraform/main.tf and call the module:

# terraform/main.tf

provider "google" {
  project = var.project
  region  = var.region
}

+ module "function" {
+   source      = "./modules/function"
+   project     = var.project
+   name        = "my-function"
+   entry_point = "app"
+ }

In this code, we tell Terraform where to find the local module code via source and declare the variables required by the module - project, name and entry_point.
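Because name and entry_point are parameters, a second function is, in principle, just another module block. A sketch (my-other-function is a hypothetical name; note that the module as written would also need the bucket name parameterized, since each instance currently tries to create the same ${var.project}-function bucket):

module "other_function" {
  source      = "./modules/function"
  project     = var.project
  name        = "my-other-function"
  entry_point = "app"
}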

The entry_point must match the name of the exported function in our Node.js code. You will find an exports.app = … in the API file.
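One detail that is easy to miss: Terraform only prints outputs defined in the root module, so to actually see the URL we need to re-export the module output. That is what the terraform/outputs.tf from our project structure is for:

# terraform/outputs.tf

output "function_url" {
  value = module.function.function_url
}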

To create the infrastructure, you can run the following commands:

# Make sure you are in the terraform folder
cd terraform

# Initialize Terraform: download the providers and set up the GCS backend
terraform init

# Plan the configuration
terraform plan

# Create all the resources
terraform apply

If everything goes well, you should see the following output at the end:

Apply complete! Resources: 6 added, 0 changed, 0 destroyed.

Outputs:

function_url = https://us-central1-my-project-1234567.cloudfunctions.net/my-function

And that’s the URL for your API!
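You can give it a quick try with curl (using your own URL):

curl https://us-central1-my-project-1234567.cloudfunctions.net/my-function/heroes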

Updating the function

When you make changes to the API you will want to redeploy the code. To do that, simply run terraform apply again and it will package your current code and upload the new version to the bucket.

There’s a catch: Terraform is not smart enough to notice that the ZIP file is different from the previous one and that the function needs to be updated with the new code (see this issue). We got around this with a nifty workaround in terraform/modules/function/main.tf:

# terraform/modules/function/main.tf
# ...
resource "google_storage_bucket_object" "zip" {
  name   = "source.zip#${data.archive_file.source.output_md5}"
	# ...
}

By giving the source code object a name different from the previous one (appending the file’s MD5 in this case), we force the function to be updated whenever we run terraform apply.

Going further

Different environments. You might be wondering how we can manage development, staging and production environments with this setup. Well, there are many ways to do that with Terraform. In the repository with the final code I added development and production by playing a bit with the folder structure. Check it out!
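As a rough idea, here is one common layout (a sketch, not necessarily the exact structure of the repository): one folder per environment, each with its own state and variable values, all reusing the same modules:

terraform/
├── environments
│   ├── development
│   │   └── main.tf
│   └── production
│       └── main.tf
└── modules
    └── function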

Continuous deployment. Automating the deployment of our function would be a great addition. Ideally, we wouldn’t be running terraform plan and terraform apply from our local machines and this would be done by a CI/CD solution such as Cloud Build or GitHub Actions.

Add a database. If you need to persist data, you can go ahead and add a database such as Cloud Firestore.
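As a taste of what that could look like on the Fastify side, here is a minimal sketch. It assumes a heroes collection already exists in Firestore and skips the API enablement and IAM setup that Firestore requires:

// Somewhere in src/index.js
const { Firestore } = require("@google-cloud/firestore")
const db = new Firestore()

app.get("/heroes-from-db", async (req, res) => {
  // Read every document from the (hypothetical) "heroes" collection
  const snapshot = await db.collection("heroes").get()
  return snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() }))
})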

Thanks for reading this far! If you wanna get in touch send me an email or a DM on Twitter.
