I am delighted to announce that I have been awarded the Microsoft Most Valuable Professional (MVP) award for 2021-2022, my 7th consecutive year. This is my second award in the Azure category (the first one, and my MVP path, are briefly described here):
Dear , It is with great pride we announce that Roman Levchenko has been awarded as a Microsoft® Most Valuable Professional (MVP) for 7/1/2021 – 7/1/2022. The Microsoft MVP Award is an annual award that recognizes exceptional technology community leaders worldwide who actively share their high quality, real world expertise with users and Microsoft. All of us at Microsoft recognize and appreciate Roman’s extraordinary contributions and want to take this opportunity to share our appreciation with you.
As always, here is a summary of my activities over the previous year:
Multiple blog posts and social network activity (group moderation, etc.)
Terraform-based deployment of almost all Azure Data Services (default deployment settings are shown in parentheses; a sample resource sketch follows the list):
Azure Service Bus (Standard, namespace, topic, subscription, auth. rules)
Azure Data Lake Storage (ZRS, Hot, Secured, StandardV2)
Azure Data Factory (w/Git or without)
Azure Data Factory linked with Data Lake Storage
Azure Data Factory Pipeline
Azure Databricks Workspace (Standard)
Azure EventHub (Standard, namespace)
Azure Functions (Dynamic, LRS storage, Python, w/App.Insights or without)
Azure Data Explorer (Kusto, Standard_D11_v2, 2 nodes)
Azure Analysis Services (backup-enabled, S0, LRS, Standard)
Azure Event Grid (domain, EventGridSchema)
Azure SQL Server (version 12.0)
Azure SQL Database (ElasticPool SKU name, 5 GB max data size)
Azure SQL Elastic Pool (StandardPool, LicenseIncluded, 50 eDTU/50GB)
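To give an idea of what the defaults above translate to, here is a minimal sketch of the Data Lake Storage part (an ADLS Gen2-capable storage account with ZRS, Hot tier, StandardV2 and HTTPS enforced). The names and resource group below are placeholders, not the script's actual values:
# Sketch only: ADLS Gen2-capable storage account matching the listed defaults
# (StorageV2, Standard/ZRS, Hot, HTTPS-only); names/RG are placeholders.
resource "azurerm_storage_account" "adls" {
  name                      = "exampledatalakesa" # placeholder, must be globally unique
  resource_group_name       = "rg-data-example"   # placeholder resource group
  location                  = "westeurope"        # placeholder location
  account_kind              = "StorageV2"
  account_tier              = "Standard"
  account_replication_type  = "ZRS"
  access_tier               = "Hot"
  is_hns_enabled            = true                # hierarchical namespace = Data Lake Gen2
  enable_https_traffic_only = true                # the "Secured" part
}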
Properties and content
Over 1,000 lines and 26 Terraform resources in total
Almost every line has a comment, each resource uses multiple conditions, and variable validation checks values before deployment. So it's flexible, not hardcoded, and lets you create infrastructure with your own set of resources.
Written a few years ago, updated once since then to fix deprecated features
June 2021 update: added SQL Server, SQL Database and Elastic Pool; added variable validation (for example, the SQL password must be longer than 8 characters and contain upper-case letters, digits, and special characters); added a sensitive variable (just as a sample); adopted new Terraform 0.15.5 syntax/features; multiple minor changes
Tested with the latest Terraform 0.15.5 and Azure provider 2.62.0 (the first version of the script worked fine with Terraform >= 0.12 and AzureRM >= 1.35; just check the syntax and try it out)
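For reference, pinning these versions could look like the following block (a sketch assuming the standard hashicorp/azurerm source; adjust the constraints to whatever you actually run):
# Sketch: version constraints matching the tested versions mentioned above
terraform {
  required_version = ">= 0.15.5"
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 2.62"
    }
  }
}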
auth.tf – provider authentication and version settings
main.tf – the desired Azure infrastructure
terraform.tfvars – controls deployment settings
variables.tf – variables list
outputs.tf – outputs useful information
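As a rough idea of what auth.tf holds, a Service Principal configuration of the AzureRM provider might look like the sketch below; the variable names are hypothetical, not the script's actual ones (see the Requirements section for the values you need):
# Sketch: Service Principal authentication for the AzureRM provider;
# variable names are placeholders, not taken from the script.
provider "azurerm" {
  features {}
  subscription_id = var.az_subscription_id
  client_id       = var.az_client_id
  client_secret   = var.az_client_secret
  tenant_id       = var.az_tenant_id
}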
Deployment settings (excerpt)
#--------------------------------------------------------------
# What should be deployed?
#--------------------------------------------------------------
servicebus = true # Azure Service Bus
datafactory = true # Azure Data Factory
datafactory_git = false # Enable GIT for Data Factory? (don't forget to set Git settings in the Data Factory section)
databricks = true # Azure DataBricks
eventhub = true # Azure EventHub
functions = true # Azure Functions
functions_appins = true # Integrate App.Insights with Azure Functions?
eventgrid = true # Azure EventGrid
kusto = true # Azure Data Explorer (kusto)
analysis = true # Azure Analysis Server
sqlserver = true # Azure SQL Server
sqlep = true # Azure SQL Elastic Pool
sqldb = true # Azure SQL Database
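Each true/false flag above toggles the matching resources in main.tf, most likely via a count expression (a common Terraform idiom; the sketch below uses placeholder names and may differ from the script's exact implementation):
# Sketch: a boolean flag from terraform.tfvars gates resource creation
resource "azurerm_databricks_workspace" "example" {
  count               = var.databricks ? 1 : 0  # created only when databricks = true
  name                = "example-databricks-ws" # placeholder name
  resource_group_name = "rg-data-example"       # placeholder resource group
  location            = "westeurope"            # placeholder location
  sku                 = "standard"
}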
variable "az_sqlserver_password" {
type = string
description = "Azure SQL Server Admin's Password"
validation {
condition = length(var.az_sqlserver_password) > 8 && can(regex("(^.*[A-Z0-9].*[[:punct:]].*$)", var.az_sqlserver_password)) # meets Azure SQL password's policy
error_message = "SQL Server Admin's password must contain more than 6 symbols (lowercase + upper-case and special/punctuation characters!)."
}
}
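The June 2021 update also mentions a sample sensitive variable. The mechanism is simply the sensitive argument on a variable block (available since Terraform 0.14); a hypothetical example, not the script's exact code:
# Sketch (hypothetical variable): a sensitive value is redacted in plan/apply
# output and can be supplied via the matching TF_VAR_... environment variable
# instead of being stored in terraform.tfvars.
variable "example_secret" {
  type      = string
  sensitive = true
}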
Usage guide
Open the terraform.tfvars file
Find the “What should be deployed?” section
Use true/false to set your desired configuration
Check or change Azure service settings in the appropriate sections (naming convention (prefix/suffix), location, SKUs, etc.)
Run terraform init to get required Terraform providers
Run terraform plan to initiate pre-deployment check
Run terraform apply to start a deployment
(optional) terraform destroy to delete Azure resources
Requirements
The script uses Service Principal authentication, so define the subscription ID, client ID, tenant ID, and principal secret in auth.tf (or use another authentication type, such as Managed Identity, if your CI runs on Azure VMs, for instance)
If you are going to deploy Azure Analysis Services (enabled by default), provide valid Azure AD user UPN(s) to set them as administrators of the Analysis Services instance (the az_ansrv_users variable in terraform.tfvars)
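For example (assuming the variable accepts a list of UPNs; the address below is a placeholder):
# terraform.tfvars
az_ansrv_users = ["admin@yourtenant.onmicrosoft.com"]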
Result
P.S. feel free to share/commit/fork/slam/sell/copy and do anything that your conscience allows you 🙂