Terraform Libraries for Azure — Blog 2

Steve Dillon
4 min read · Jan 11, 2021

In this blog we will work through the basics of using the Terraform-AzureRM repository.

We have created a companion repository called Terraform-AzureRM-Samples with usage examples of the main module.

This article assumes that you have some mileage with Terraform and Git under your belt. I’ll cover a couple of basics, but it is assumed that “terraform plan”, “terraform apply”, and “state file” are all part of your vocabulary.

Environment Setup

One of the problems with writing this blog is the setup of the client machine. You need Terraform, the Azure CLI, and Git, and you need to be logged into Azure with ‘az login’. On top of that, your audience may be using macOS, Linux, or Windows (in PowerShell, CMD, the Windows Subsystem for Linux, or Git Bash). The common ground I am going to use for this series of blogs is Azure Cloud Shell (bash), which comes with the graphical editor ‘code’ pre-installed. In Azure Cloud Shell, typing ‘code .’ opens a decent editor that everybody should be able to navigate.

Using Azure Cloud Shell will get newer users up and running faster. If you are all set up on macOS or Linux and are very familiar with the prerequisites, you can probably follow along on your own machine.

Pulling Code with Git-Submodule

There are a few ways to pull a remote repository into your terraform code. Our recommended method of including Terraform-AzureRM is to use Git’s submodule support. This allows you to pull in all of our modules in one go, and the module source code will be easily visible and not buried in the .terraform folder.

Depending on your version of Git, submodule processing may be automatic, but this series of commands will download the sample code as well as the modules you need to complete this blog:

steve_dillon@Azure:~$ git clone git://github.com/persistentsystems/terraform-azurerm-samples.git
Cloning into 'terraform-azurerm-samples'...
steve_dillon@Azure:~$ cd terraform-azurerm-samples/
steve_dillon@Azure:~/terraform-azurerm-samples$ git submodule init
Submodule 'submodules/terraform-azurerm' (git://github.com/persistentsystems/terraform-azurerm.git) registered for path 'submodules/terraform-azurerm'
steve_dillon@Azure:~/terraform-azurerm-samples$ git submodule update
Cloning into '/home/steve_dillon/terraform-azurerm-samples/submodules/terraform-azurerm'...
Submodule path 'submodules/terraform-azurerm': checked out '0fe1907fc7bb48ea989893715a1532f5b5879421'
steve_dillon@Azure:~/terraform-azurerm-samples$ cd submodules/terraform-azurerm/
steve_dillon@Azure:~/terraform-azurerm-samples/submodules/terraform-azurerm$ git fetch --all --tags
Fetching origin
steve_dillon@Azure:~/terraform-azurerm-samples/submodules/terraform-azurerm$ git tag -n

v0.1.3 bulk conversion, removing all v1.x and move lastest version to V1, we will version with tags
v0.1.4 fix internal reference
v0.1.5 trival change to show tagging
steve_dillon@Azure:~/terraform-azurerm-samples/submodules/terraform-azurerm$ git checkout v0.1.5

Versioning

The terraform-azurerm module is versioned with semantic versioning, applied via Git tags. We create versioned releases of the module package and release all the modules together as one package. We don’t want to create a testing nightmare by letting you pick and choose any module version and mix and match; there are interdependencies between module inputs and outputs, so we version the entire package as a whole. To pick a version, check out the repository at a specific tagged version.

cd terraform-azurerm-samples/submodules/terraform-azurerm
steve_dillon@Azure:~/terraform-azurerm-samples/submodules/terraform-azurerm$ git tag -n

v0.1.3 bulk conversion, removing all v1.x and move lastest version to V1, we will version with tags
v0.1.4 fix internal reference
v0.1.5 trival change to show tagging
steve_dillon@Azure:~/terraform-azurerm-samples/submodules/terraform-azurerm$ git checkout v0.1.5

Remote State

All of the sample code uses local state files. This is not production ready, or even development-team ready; local state files should only be used for the most basic demonstrations. There are many ways of storing state remotely, and we do not cover the choice of remote state backend in this series of articles.
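If you do switch to remote state later, the change is typically a backend block in your Terraform configuration. Below is a minimal sketch using the azurerm backend; every name in it is a placeholder, not something from the sample repository.

```hcl
terraform {
  backend "azurerm" {
    # All of these values are placeholders; substitute your own
    # pre-created state storage resources.
    resource_group_name  = "rg-tfstate"
    storage_account_name = "sttfstate001"
    container_name       = "tfstate"
    key                  = "coreinfra.terraform.tfstate"
  }
}
```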

You now have the terraform-azurerm repository checked out at your desired revision in the submodules/terraform-azurerm folder of the sample repo. To play with the editor, cd into the folder and run ‘code .’.
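With the submodule checked out, referencing one of the library’s modules is just a relative source path. A hypothetical sketch (the subfolder name is illustrative, not taken from the repository):

```hcl
module "example" {
  # Relative path into the Git submodule; the "example-module"
  # subfolder name is hypothetical.
  source = "./submodules/terraform-azurerm/example-module"
}
```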

Creating an Azure Landing Zone

In our opinionated code, everything should be securely installed with appropriate logging. Our first demonstration lays down the infrastructure to make that happen. In 01-coreinfra we deploy:

  • Application Insights
  • A log storage account
  • A resource group

Most of our modules require you to pass in an observability_settings object, which carries the identifiers for the logging aspects of our deployment. The object is defined as below; it is created in the file observability.tf and can be passed in bulk to any module.

variable "observability_settings" {
  type = object({
    instrumentation_key = string
    # log analytics
    workspace_id    = string
    storage_account = string
    retention_days  = number
  })
}
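As a sketch of how this variable might be populated and passed in bulk, here is a hypothetical wiring; the resource names and module path are illustrative, not taken from the samples:

```hcl
# Hypothetical wiring: collect the logging identifiers into one
# object and hand the whole thing to a module.
locals {
  observability_settings = {
    instrumentation_key = azurerm_application_insights.main.instrumentation_key
    workspace_id        = azurerm_log_analytics_workspace.main.id
    storage_account     = azurerm_storage_account.logs.id
    retention_days      = 30
  }
}

module "example" {
  source                 = "./submodules/terraform-azurerm/example-module" # illustrative
  observability_settings = local.observability_settings
}
```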

Context variable

Similar to observability_settings is the context variable, which is also required by all of our modules. Its definition is shown below.

variable "context" {
  type = object({
    application_name    = string # the name of what we are deploying
    environment_name    = string # dev, prod, test, etc.
    resource_group_name = string
    location            = string # EastUS, WestUS, Azure region name
    location_suffix     = string # us-east, us-west (used in naming)
  })
}

The context provides the what and the where of our deployment. These are very commonly used parameters in the deployment of almost every resource. Some of them (resource_group_name, location) are used directly in calls to Azure resources; the rest are used in naming and tagging resources. We have a well-defined naming convention: “<app>-<environment>-descriptive-name-<location_suffix>”.
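As an illustration of that convention, a context value might look like the following (all values are hypothetical):

```hcl
locals {
  context = {
    application_name    = "myapp" # hypothetical application name
    environment_name    = "dev"
    resource_group_name = "myapp-dev-rg-us-east"
    location            = "EastUS"
    location_suffix     = "us-east"
  }
}
```

A module given this context would then name resources along the lines of myapp-dev-descriptive-name-us-east.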

Let’s Deploy It

In Azure Cloud Shell, Terraform and the Azure CLI are installed and pre-configured to authenticate against the subscription from which you launched Cloud Shell. So, to deploy the landing zone:

cd terraform-azurerm-samples/samples/scenarios/01-coreinfra
terraform init
terraform plan
terraform apply

This will create the landing zone. After you have deployed it and looked around, make sure to run ‘terraform destroy’, as there will be charges for the created resources.

Wrapping Up

There was a lot of setting up shop in this article. We deployed the basic landing zone that future blog articles about our module library will build on.


Steve Dillon

Cloud Architect and Automation specialist. Specializing in AWS, Hashicorp and DevOps.