How to Create Terraform Webhook Stage in Spinnaker
What you’ll learn:
How to configure and set up the Terraform stage in Spinnaker as a custom webhook stage
How to execute Terraform operations through these custom webhook stages in a pipeline
Prerequisites:
Understanding of Spinnaker and Spinnaker custom webhooks.
A running Spinnaker installation.
A running TerraSpin REST service.
For detailed information, see the Spinnaker Custom Webhook documentation.
At OpsMx we have developed TerraSpin, an open-source microservice that integrates seamlessly with Spinnaker through the Spinnaker custom webhook mechanism. It adds three stages to Spinnaker (TSPlanRest, TSApplyRest, and TSDestroyRest), each with its own input fields and output.
Configuration Steps:
Execute the steps below to configure Spinnaker with the TerraSpin custom webhook service.
1. Create a file ‘artifactaccounts.json’, add the content below, and replace the values according to your artifact account. The artifactaccounts.json file contains the details of the accounts from which the TerraSpin service pulls Terraform code. The details are:
accountname: This can be any descriptive name without spaces.
artifacttype: GitHub, S3, or any of the supported artifact repositories.
username and password: These are the credentials for accessing the repo that will be specified during the stage input (shown later).
Note: Ensure the Git username and password do not contain an ‘@’ character.
A sample artifactaccounts.json is available here.
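For reference, a minimal sketch of what artifactaccounts.json can look like is shown below. The field names follow the descriptions above, but the exact schema is defined by the TerraSpin service, so treat the linked sample as the authoritative template.

```json
{
  "artifactaccounts": [
    {
      "accountname": "my-github-account",
      "artifacttype": "github",
      "username": "git-user",
      "password": "git-password-or-token"
    }
  ]
}
```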
2. Different ways to run the TerraSpin REST service:
Running on a local machine: Create the directory opsmx/app/config/ under the home directory of the machine where you want to run the TerraSpin service and place artifactaccounts.json in it. Refer to the GitHub README file for instructions on running the TerraSpin REST service.
Note: If you run the TerraSpin service locally, the Terraform CLI must be installed on that system, and the Terraform version must be v0.12.12 or later.
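The local setup amounts to a few shell commands, for example (assuming the service runs under your home directory):

```sh
# Create the config directory TerraSpin reads from and copy the account file into it
mkdir -p ~/opsmx/app/config
cp artifactaccounts.json ~/opsmx/app/config/

# Verify the Terraform CLI is installed and is v0.12.12 or later
terraform version
```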
Running on Kubernetes: Create a ConfigMap from which the TerraSpin Deployment will read the information provided in artifactaccounts.json. To create it, run the following command:
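A typical command is shown below; the ConfigMap name and namespace are placeholders, so match them to what the TerraSpin Deployment manifest expects. Since the file contains credentials, a Secret created with kubectl create secret generic --from-file works the same way if you prefer.

```sh
kubectl create configmap terraspin-config \
  --from-file=artifactaccounts.json \
  -n <terraspin-namespace>
```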
The Kubernetes TerraSpin Deployment and Service manifest YAML files are available here.
3. Create a file “orca-local.yml” with the following contents and replace all occurrences of:
Host & Port: In the URL section of each stage, replace the host and port with those of the machine where the TerraSpin REST service is running.
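The orca-local.yml shipped with TerraSpin is the authoritative version; purely as an illustration, a preconfigured webhook stage entry in that file follows Spinnaker's standard format, roughly like the sketch below (the endpoint path and parameter names here are placeholders):

```yaml
webhook:
  preconfigured:
    - label: TSPlanRest
      type: TSPlanRest
      enabled: true
      description: Runs terraform init and terraform plan through the TerraSpin service
      method: POST
      # Replace <terraspin-host> and <terraspin-port> with the machine running the TerraSpin REST service
      url: http://<terraspin-host>:<terraspin-port>/api/v1/<plan-endpoint>
      customHeaders:
        Content-Type:
          - application/json
      parameters:
        - label: Tf script account
          name: tfscriptaccount
          type: string
```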
4. Navigate to ~/.hal/default/profiles in the Halyard pod or machine and copy the orca-local.yml file created above into it. If an orca-local.yml already exists, append the contents as appropriate.
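If Halyard runs in Kubernetes, the copy can be done with kubectl cp; the pod name, namespace, and home directory below are assumptions (the /home/spinnaker path is typical for Halyard containers), so adjust them to your installation:

```sh
kubectl cp orca-local.yml <spinnaker-namespace>/<halyard-pod>:/home/spinnaker/.hal/default/profiles/orca-local.yml
```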
5. Execute hal deploy apply (in the Halyard pod or on the Halyard machine).
At this point, the Spinnaker configuration for the Terraform custom webhook stages is complete. Please wait for all Spinnaker services/pods to restart and stabilize.
Executing Terraform Scripts via the custom webhook stage
Once the Spinnaker configuration for the TerraSpin custom webhook stages is complete, the following stages should be available in the pipeline configuration:
TSPlanRest
TSApplyRest
TSDestroyRest
TSPlanRest:
This stage performs the initial run of your Terraform infrastructure code (terraform init and terraform plan). It has the following six inputs:
1. Tf script account: This must be one of the account names defined in artifactaccounts.json. Choose the account where your Terraform script is present.
2. Terraform plan: This is the location of the Terraform script. Provide the location in the form ‘username/repo-name.git//folder’.
Note: The two slashes (//) separate the repo from the folder containing the Terraform root module script, e.g. for GitHub: OpsMx/staging-terraform-pm.git//azure/k8cluster. The credentials for accessing this repo were provided in artifactaccounts.json.
3. Override file (optional): If present, the file specified here will be applied to the root module. A possible use case is providing a tfvars file.
Note: If you want to skip this option, ensure the field is empty by removing the help text shown in it.
4. Tf state account: This must be one of the account names defined in artifactaccounts.json. Choose the account where you want the Terraform state to be stored.
5. State repo: This is the repo where the Terraform state files are stored and retrieved across stages, for example between plan and apply, or apply and destroy. This is mandatory for the Terraform custom webhook stage to function.
Note: As state information can contain credentials in plain text, access to this repo should be restricted. The Tf state account name and its credentials are used to access this repo (e.g. opsmx/staging-terraform-states.git).
6. UUId: This can be any unique string of your choice to identify the Terraform state across multiple stages. It is not mandatory to have all the stages (TSPlan, TSApply, TSDestroy) in the same pipeline; however, they must all use the same UUId.
Output: This stage will show terraform init and plan command output.
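To illustrate how these inputs fit together, here is a sketch of what a TSPlanRest stage could look like in a pipeline's JSON. The parameter keys shown are hypothetical placeholders (the real keys come from the orca-local.yml shipped with TerraSpin); only the values follow the descriptions above:

```json
{
  "type": "TSPlanRest",
  "name": "Terraform Plan",
  "parameters": {
    "tfscriptaccount": "my-github-account",
    "planlocation": "OpsMx/staging-terraform-pm.git//azure/k8cluster",
    "overridefile": "",
    "tfstateaccount": "my-github-account",
    "staterepo": "opsmx/staging-terraform-states.git",
    "uuid": "k8cluster-dev-001"
  }
}
```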
TSApplyRest:
The functionality of this stage is to create the Terraform infrastructure (terraform apply). The stage output includes properties with the Terraform output values in key-value format, so those values can be used in subsequent stages of the pipeline. This stage has five inputs:
1. Tf script account
2. Override file
3. Tf state account
4. State repo
5. UUId
Output: This stage will show terraform apply command output.
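As an example of consuming those output values downstream, a later stage can reference them with a Spinnaker pipeline expression. The stage name ('Terraform Apply') is whatever you named the TSApplyRest stage, and 'vpc_id' is a hypothetical Terraform output; the exact path depends on how TerraSpin publishes the values into the stage outputs:

```
${#stage('Terraform Apply')['outputs']['vpc_id']}
```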
TSDestroyRest:
The functionality of this stage is to destroy the Terraform infrastructure (terraform destroy). This stage has three inputs:
1. Tf state account
2. State repo
3. UUId
Output: This stage will show terraform destroy command output.