Exporting GCP Projects to Terraform

By Rohan Paithankar | @intelia | November 29

Have you ever found yourself in a situation where resources you created in a project through ClickOps eventually became your production environment? If yes, this article is for you!

In my consulting journey, I have come across some unique challenges that organisations face. One such challenge was that of a client that had created a ‘production environment’ through ClickOps and found themselves in a pickle trying to maintain these resources. In an environment like this, engineers often feared touching anything, because one wrong move could mean their last day on the job. On top of this came rising concerns about the costs incurred by the many unknown resources running somewhere in the project.

You see, psychological safety is often under-appreciated, but it is an important ingredient for building high-performing teams.

Watching the team’s turmoil with this so-called ‘production environment’ forced me to explore ways to bring the situation under control. And voila! Terraform-ing the resources was the most favourable solution. Why?

    • Infrastructure-as-Code (IaC) would first help us audit the resources that had been deployed, and thereby monitor the costs incurred.
    • IaC would help us re-build the infrastructure if needed.
    • IaC would help the team deploy new resources and, most importantly, move away from ClickOps.

Having said that, I was now left with only half the answer. The second part of the challenge was to export the existing resources into Terraform. I had only ever seen this uni-directional flow:

[Image: flow from Terraform code to deployed GCP resources]

But what I wanted was:

[Image: the reverse flow, from deployed GCP resources back to Terraform code]

How did I achieve it?

Note: This process only works on Linux/macOS. The user exporting the resources requires elevated permissions in the project; alternatively, service account impersonation can be used.

Here are the steps:

  • Set up a new Git repository and clone it to your local machine.
  • In the repository, create a Terraform backend file, backend.tf, that keeps the remote state file in a GCS bucket (see the sketch below).

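A minimal backend.tf would look something like this (the bucket name and prefix are placeholders to replace with your own):

    # backend.tf: keep the Terraform state file in a GCS bucket
    terraform {
      backend "gcs" {
        bucket = "my-tf-state-bucket"  # placeholder: your state bucket
        prefix = "clickops-export"     # placeholder: path for the state file
      }
    }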

  • Authenticate with GCP and configure the target project (service account impersonation can be used here as well), for example:

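A typical sequence looks like this (MY_PROJECT_ID and the service account email are placeholders):

    # Authenticate and point gcloud at the target project
    gcloud auth login
    gcloud config set project MY_PROJECT_ID

    # Optional: impersonate a service account instead of user credentials
    gcloud config set auth/impersonate_service_account terraform-export@MY_PROJECT_ID.iam.gserviceaccount.com
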
  • Following the steps in the GCP documentation, install GCP’s config-connector component and run the gcloud CLI command to export the required resources to Terraform scripts, as sketched below.

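The commands were along these lines (output/ is a placeholder for the export directory):

    # Install the config-connector component used by the export
    gcloud components install config-connector

    # Bulk-export the project's supported resources as Terraform scripts
    gcloud beta resource-config bulk-export \
      --project=MY_PROJECT_ID \
      --resource-format=terraform \
      --path=output/
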
The export is arranged in the format:

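Roughly like the tree below, grouped by project and resource type (reconstructed for illustration; the exact layout varies by gcloud version):

    output/
    └── projects/
        └── MY_PROJECT_ID/
            ├── ComputeInstance/
            │   └── my-instance.tf
            └── StorageBucket/
                └── my-bucket.tf
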
Along with the bulk export of resources, the tool also provides the ‘terraform import’ commands needed to import the resources into the state file. Once the scripts are exported, run ‘terraform init’ followed by the ‘terraform import’ commands, for example:

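    # Initialise the backend and provider plugins
    terraform init

    # One import per exported resource; the bucket here is a hypothetical example
    terraform import google_storage_bucket.my_bucket my-bucket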

The Catch-22

The time taken to export grows linearly with the number of resources running in the project. In my case, a huge number of BigQuery datasets and tables took ages to export when combined with all the other resources. To solve this, I first listed all the resource types supported for export using:

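    gcloud beta resource-config list-resource-types
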
I then created two separate lists: one for BigQuery resources and another for non-BigQuery resources (a sketch for generating both files follows the lists below).

BigQuery resources (resourceTypes_BQ.txt):

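For illustration, the BigQuery kinds reported by the command above include:

    BigQueryDataset
    BigQueryTable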

Non-BigQuery resources (resourceTypes_exceptBQ.txt):

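This file holds every remaining supported kind. Rather than hand-maintaining the two lists, they can be generated straight from the list-resource-types output; a rough sketch (it assumes each supported kind appears on its own line of the output):

    # Sketch: split the supported resource types into BigQuery / non-BigQuery lists
    # (you may need to strip any header lines from the output first)
    gcloud beta resource-config list-resource-types > all_types.txt
    grep    'BigQuery' all_types.txt > resourceTypes_BQ.txt
    grep -v 'BigQuery' all_types.txt > resourceTypes_exceptBQ.txt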

I then wrote a simple parameterised bash script to export the BigQuery and non-BigQuery resources separately.

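A reconstruction of that script would look roughly like this (the file names and output paths are assumptions):

    #!/usr/bin/env bash
    # Export GCP resources to Terraform, split into BigQuery / non-BigQuery runs.
    # Usage: ./export.sh PROJECT_ID [bq]
    set -euo pipefail

    PROJECT_ID="${1:?Usage: $0 PROJECT_ID [bq]}"

    # Optional second argument: 'bq' exports only BigQuery resources;
    # omitting it exports every non-BigQuery resource type.
    if [[ "${2:-}" == "bq" ]]; then
      TYPES_FILE="resourceTypes_BQ.txt"        # assumed file name from above
      OUTPUT_DIR="export/bigquery"             # assumed output directory
    else
      TYPES_FILE="resourceTypes_exceptBQ.txt"
      OUTPUT_DIR="export/other"
    fi

    gcloud beta resource-config bulk-export \
      --project="${PROJECT_ID}" \
      --resource-format=terraform \
      --resource-types-file="${TYPES_FILE}" \
      --path="${OUTPUT_DIR}"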

The script accepts the GCP project ID as a command-line argument and can be executed as follows:

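Assuming the script is saved as export.sh:

    ./export.sh my-project-id        # exports all non-BigQuery resources
    ./export.sh my-project-id bq     # exports only the BigQuery resources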

The second argument, ‘bq’, is optional. If specified, the script exports the BigQuery resources; if omitted, it exports all non-BigQuery resources.