# How to Create Terraform Custom Job in Spinnaker

## Configure and use Terraform Custom Job Stage <a href="#configure-and-use-terraform-custom-job-stage" id="configure-and-use-terraform-custom-job-stage"></a>

#### What you’ll learn: <a href="#what-youll-learn" id="what-youll-learn"></a>

1. How to configure and set up the Terraform stage in Spinnaker as a custom job stage.
2. How to execute Terraform operations through that stage.

#### Prerequisites: <a href="#prerequisites" id="prerequisites"></a>

1. Understanding of Spinnaker and Spinnaker custom job stages.
2. A running Spinnaker instance with a Kubernetes cluster account configured in it.
   * Get detailed information about [Spinnaker Custom Job](https://www.spinnaker.io/guides/operator/custom-job-stages/)
   * At OpsMx we have developed the open-source TerraSpin micro-service, which integrates seamlessly with Spinnaker through the Spinnaker custom job mechanism. It creates three stages in Spinnaker (TSPlanJob, TSApplyJob, and TSDestroyJob). Each stage has its own input fields and output.

### Configuration Steps: <a href="#configuration-steps" id="configuration-steps"></a>

Execute the below steps to configure Spinnaker with TerraSpin Custom Job.

1. Create a file `artifactaccounts.json`, add the content below, and replace the values according to your artifact account. The `artifactaccounts.json` file contains the account details from which the job pulls Terraform code. The details are:
   * accountname: Any descriptive name without spaces.
   * artifacttype: Github, S3, or any of the supported artifact repositories.
   * Username and Password: The credentials for accessing the repo that will be specified during the stage input.

{% hint style="info" %}
[artifactaccounts.json Available here](https://github.com/OpsMx/terraform-stage/blob/master/spinterrajob-core/container/artifactaccounts.json)
{% endhint %}
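As an illustration of the fields described above, an entry could look like the sketch below. The key names and values here are assumptions based on the field list; the linked sample file is authoritative.

```json
[
  {
    "accountname": "github-tf-scripts",
    "artifacttype": "github",
    "username": "my-git-user",
    "password": "my-git-token"
  }
]
```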

**Note**: Ensure the GIT username and password do not contain the ‘@’ character.

2\. Create a ConfigMap from which the TerraSpin Job will read the information provided in `artifactaccounts.json`:

```
kubectl create cm terraspinbackendconfig --from-file=artifactaccounts.json -n default
```

The `default` namespace here should be the namespace you plan to use for the TerraSpin Jobs. Ensure that access to this namespace is restricted, as credentials are stored in it.

3\.  Create a file `orca-local.yml` with the following contents and replace ALL occurrences of:

* account: Name of the spinnaker kubernetes account.
* application: Name of the application in Spinnaker.
* credentials: Provide the same Kubernetes Account Name provided above.
* namespace (under metadata): Namespace where the Terraform Job should run. This should be the same namespace in which the `terraspinbackendconfig` ConfigMap was created in step 2 above.

{% hint style="info" %}
[Orca-local.yml Available here](https://github.com/OpsMx/terraform-stage/blob/master/spinterrajob-core/container/orca-local.yml)
{% endhint %}
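For orientation, a preconfigured Kubernetes custom job entry in `orca-local.yml` generally follows the shape sketched below. All values here are placeholders (the stage `type` in particular is a guess), and the linked file remains the authoritative template:

```yaml
# Illustrative shape only - all values are placeholders; use the linked
# orca-local.yml from the TerraSpin repo as the authoritative template.
job:
  preconfigured:
    kubernetes:
      - label: TSPlanJob            # stage name shown in the Spinnaker UI
        type: tsPlanJob             # placeholder stage type
        cloudProvider: kubernetes
        account: my-k8s-account     # your Spinnaker Kubernetes account name
        credentials: my-k8s-account # same Kubernetes account name as above
        application: my-application # your Spinnaker application name
        waitForCompletion: true
        parameters: []              # stage input fields are declared here
        manifest:
          apiVersion: batch/v1
          kind: Job
          metadata:
            namespace: default      # namespace holding terraspinbackendconfig
          # spec: container image, ConfigMap volume, etc. (see linked file)
```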

4\. Navigate to the following location:

```
~/.hal/default/profiles in halyard pod/machine
```

Copy the `orca-local.yml` file created above into this location. If an `orca-local.yml` already exists, append the contents as appropriate.

5\. Execute the following in halyard pod or on halyard machine:

```
hal deploy apply 
```

At this point, the Spinnaker configuration for the Terraform Custom Job stage is complete. Please wait for all pods to restart and stabilize.

### Executing Terraform Scripts via the custom job stage <a href="#executing-terraform-scripts-via-the-custom-job-stage" id="executing-terraform-scripts-via-the-custom-job-stage"></a>

Once the Spinnaker configuration for the Terraform Custom Job stage is complete, the following stages should be available in the pipeline configuration:

1. TSPlanJob
2. TSApplyJob
3. TSDestroyJob

#### TSPlanJob: <a href="#tsplanjob" id="tsplanjob"></a>

This stage performs the initial formal run of your Terraform infra code (essentially `terraform init` and `terraform plan`). This stage has the following six inputs.

1. Tf Script Account: This must be one of the `accountname` values defined in `artifactaccounts.json`. Choose the account where your Terraform script is present.
2. Terraform plan:

   * This is the location of the Terraform script.
   * Provide the location in the form ‘`username/repo-name.git//folder`’.

   **Note:** The two slashes (`//`) separate the repo from the folder containing the Terraform root module script, e.g. for GitHub: `OpsMx/staging-terraform-pm.git//azure/k8cluster`. The credentials for accessing this repo were provided in `artifactaccounts.json`.
3. Override file (optional): If present, the file specified here will be applied on the root module. A possible use-case is providing a tfvars file.

   **Note:** If you would like to skip this option, ensure the field is empty by removing the help text in it.
4. Tf state account: This must be one of the `accountname` values defined in `artifactaccounts.json`. Choose the account where you want to store the Terraform state.
5. State repo: This is the repo where the Terraform state files are stored and retrieved across multiple stages, such as between plan and apply, or apply and destroy. This is mandatory for the Terraform Custom Job stage to function.

   **Note:** As state information can contain credentials in plain text, access to this repo should be controlled. The same account name and credentials will be used to access this repo (e.g. `opsmx/staging-terraform-states.git`).
6. UUId: This can be any unique string of the user's choice, used to identify the Terraform state across multiple stages. It is not mandatory to have all the stages (TSPlan, TSApply, TSDestroy) in the same pipeline; however, they all should use the same UUId.

![](https://2047464521-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MBEa1hoX6SqpDj-ymNs%2Fuploads%2FdDjMrC9NEnisDtVv3fdE%2Fimage.png?alt=media\&token=6cc914bb-6b94-42d6-8c95-d789245e9d82)

Output: This stage will show terraform init and plan command output.

#### TSApplyJob: <a href="#tsapplyjob" id="tsapplyjob"></a>

The functionality of this stage is to apply the Terraform infra code (`terraform apply`). The stage output will have properties with the Terraform output values in key-value format, so that the user can use those values in subsequent stages of the pipeline. This stage has five inputs.

1. Artifact account
2. Override file
3. Tf State Account
4. State repo
5. UUId
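The output properties from this stage can be consumed in later stages using standard Spinnaker pipeline expressions. For example, assuming the apply stage is named `TSApplyJob` and Terraform emitted an output called `cluster_endpoint` (a hypothetical output name):

```
${ #stage("TSApplyJob")["outputs"]["cluster_endpoint"] }
```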

![](https://2047464521-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MBEa1hoX6SqpDj-ymNs%2Fuploads%2FH0DzQ8aiLjVyLQpu1C9w%2Fimage.png?alt=media\&token=f28c440d-6855-4bf6-93c4-f86a6e4bdf4d)

Output: This stage will show terraform apply command output.

#### TSDestroyJob: <a href="#tsdestroyjob" id="tsdestroyjob"></a>

The functionality of this stage is to destroy the Terraform-managed infrastructure (`terraform destroy`). This stage has three inputs.

1. Tf State Account.
2. State repo.
3. UUId.

![](https://2047464521-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MBEa1hoX6SqpDj-ymNs%2Fuploads%2FhVWErq9wawryvUoRrD0G%2Fimage.png?alt=media\&token=883467b4-ca39-4c7a-9622-d5aef01b274b)

Output: This stage will show the terraform destroy command output.

