How to Create a Terraform Webhook Stage in Spinnaker

What you’ll learn:

  1. How to configure and set up a Terraform stage in Spinnaker as a custom webhook stage

  2. How to execute Terraform operations through the custom webhook stages

Prerequisites:

  1. Understanding of Spinnaker and Spinnaker custom webhooks.

  2. A running Spinnaker installation.

  3. A running TerraSpin REST service.

    • For detailed information, see the Spinnaker Custom Webhook documentation.

    • At OpsMx we have developed the open-source TerraSpin microservice, which integrates seamlessly with Spinnaker through the Spinnaker custom webhook. It creates three stages in Spinnaker ( TSPlanRest, TSApplyRest, and TSDestroyRest ). Each stage has its own input fields and output.

Configuration Steps:

Execute the steps below to configure Spinnaker with the TerraSpin Custom Webhook service.

  1. Create a file ‘artifactaccounts.json’, add the content below, and replace the values according to your artifact account. The artifactaccounts.json file contains the account details from which the TerraSpin service pulls Terraform code.

    • accountname: This can be any descriptive name without spaces.

    • artifacttype: GitHub, S3, or any other supported artifact repository.

    • Username and Password: These are the credentials for accessing the repo that will be specified in the stage input (shown later).

Note: Ensure the Git username and password do not contain ‘@’.

artifactaccounts.json Available here
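Based on the fields described above, an artifactaccounts.json entry might look like the following sketch. The exact key names, casing, and overall structure (including whether accounts are listed in an array) are assumptions here; refer to the sample file linked above for the authoritative format, and substitute your own values.

```json
[
  {
    "accountname": "my-github-account",
    "artifacttype": "github",
    "username": "<git-username>",
    "password": "<git-password-or-token>"
  }
]
```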

2. Different ways to run the TerraSpin REST service:

  • Running on a local machine: Create the directory opsmx/app/config/ under the home directory of the machine where you want to run the TerraSpin service, and put artifactaccounts.json in it. Here you will find the GitHub readme file describing how to run the TerraSpin REST service.

Note: If you run the TerraSpin service locally, the Terraform tool must be installed on that local system, and the Terraform version should be v0.12.12 or later.

  • Running on Kubernetes: Create a ConfigMap from which the TerraSpin Deployment will read the information provided in artifactaccounts.json. The following command ( kubectl create cm terraspinbackendconfig --from-file=artifactaccounts.json -n default ) creates the ConfigMap. Here you will find the Kubernetes TerraSpin deployment and service manifest YAML.
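For reference, the TerraSpin Deployment then mounts this ConfigMap so the service can read artifactaccounts.json. The snippet below is only a sketch: the container name and mount path (chosen to mirror the opsmx/app/config/ path used for local runs) are assumptions; treat the provided manifest YAML as the source of truth.

```yaml
# Sketch of the volume wiring in the TerraSpin Deployment (names/paths are assumptions)
spec:
  template:
    spec:
      containers:
        - name: terraspin                       # assumed container name
          volumeMounts:
            - name: terraspin-config
              mountPath: /opsmx/app/config/     # assumed to mirror the local-run path
      volumes:
        - name: terraspin-config
          configMap:
            name: terraspinbackendconfig        # created by the kubectl command above
```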

3. Create a file “orca-local.yml” with the following contents and replace ALL occurrences of:

Host & Port: Under the url section of each stage, replace these with the actual host and port of the machine where the TerraSpin REST service is running.

Orca-local.yml Available here
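Custom webhook stages are defined in orca-local.yml under the webhook.preconfigured section. A structural sketch for one of the three stages is shown below; the endpoint path and payload details are placeholders, so copy the actual stage definitions from the orca-local.yml linked above and substitute only your host and port.

```yaml
webhook:
  preconfigured:
    - label: TSPlanRest
      type: TSPlanRest
      enabled: true
      description: Runs terraform init and terraform plan via TerraSpin
      method: POST
      # Replace <terraspin-host>:<port> with the machine where the TerraSpin
      # REST service is running. The path shown is a placeholder; use the path
      # from the provided orca-local.yml.
      url: http://<terraspin-host>:<port>/<terraspin-plan-endpoint>
      # ...stage parameters (Artifact account, Terraform plan, Override file,
      #    State repo, UUId) are defined in the provided orca-local.yml
```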

4. Navigate to ~/.hal/default/profiles in the Halyard pod/machine and copy the orca-local.yml file created above into that directory. If an orca-local.yml already exists, append the contents as appropriate.

5. Execute hal deploy apply (in the Halyard pod or on the Halyard machine).
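Steps 4 and 5 can be run from the Halyard machine (or inside the Halyard pod) roughly as follows; the profile path shown is the Halyard default for the "default" deployment and may differ in your installation.

```
# Copy the stage definitions into Halyard's Orca profile directory
cp orca-local.yml ~/.hal/default/profiles/

# Apply the configuration; Spinnaker services will redeploy
hal deploy apply
```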

At this point, the Spinnaker configuration for the Terraform Custom Webhook stage is complete. Please wait for all Spinnaker services/pods to restart and stabilize.

Executing Terraform Scripts via the custom webhook stage

Once the Spinnaker configuration for the TerraSpin Custom Webhook stage is complete, the following stages should be available in the pipeline configuration:

  1. TSPlanRest

  2. TSApplyRest

  3. TSDestroyRest

An example set of screenshots with valid inputs is provided towards the end of this document.

TSPlanRest:

This stage performs the initial formal run of the Terraform infra code (essentially terraform init and terraform plan). It has the following five inputs:

  1. Artifact account: This must be one of the “accountname” values defined in artifactaccounts.json.

  2. Terraform plan:

    • This is the location of the Terraform script.

    • Provide the location in the form of ‘username/repo-name.git//folder’.

    Note

    The two slashes (//) separate the repo from the folder containing the Terraform root module script, e.g. for GitHub: OpsMx/staging-terraform-pm.git//azure/k8cluster. The credentials for accessing this repo were provided in artifactaccounts.json.

  3. Override file (optional): If present, the file specified here will be applied on the root module. A possible use case is providing a tfvars file.

  4. State repo: This is the repo where the Terraform state files are stored and retrieved across multiple stages, such as between plan and apply, or apply and destroy. This is mandatory for the Terraform custom webhook stage to function.

    Note

    As state information can contain credentials in plain text, access to this repo should be controlled. The same account name and its credentials will be used to access this repo (e.g. opsmx/staging-terraform-states.git).

  5. UUId: This can be any unique string of the user's choice to identify the Terraform state across multiple stages. It is not mandatory to have all the stages (TSPlan, TSApply, TSDestroy) in the same pipeline; however, they must all use the same UUId.
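Putting the five inputs together, a filled-in TSPlanRest stage might look like the listing below. The values are illustrative only: the artifact account must be an accountname defined in your artifactaccounts.json, the repo paths reuse the examples from above, and the UUId is an arbitrary string you choose.

```
Artifact account : my-github-account                          (an accountname from artifactaccounts.json)
Terraform plan   : OpsMx/staging-terraform-pm.git//azure/k8cluster
Override file    : (optional, e.g. a tfvars file)
State repo       : opsmx/staging-terraform-states.git
UUId             : demo-k8cluster-001
```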

Output: This stage will show terraform init and plan command output.

TSApplyRest:

The functionality of this stage is to create the Terraform infra code ( terraform apply ). The stage output will have properties with the Terraform infra-code output values in key-value format, so that the user can use those values in subsequent stages of the pipeline. This stage has four inputs:

1. Artifact account.

2. Override file.

3. State repo.

4. UUId.

Output: This stage will show terraform apply command output.
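Because the apply output values are exposed as key-value properties on the stage, a later stage can read them with a Spinnaker pipeline expression. For example (the output key cluster_endpoint is hypothetical; use a key actually produced by your Terraform outputs):

```
${ #stage("TSApplyRest")["outputs"]["cluster_endpoint"] }
```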

TSDestroyRest:

The functionality of this stage is to destroy the Terraform infra code ( terraform destroy ). This stage has three inputs:

1. Artifact account.

2. State repo.

3. UUId.

Output: This stage will show the terraform destroy command output.