Preliminary Configurations in Azure DevOps for WSO2 CI/CD Pipelines

How we automated a WSO2 deployment with Azure DevOps CI/CD — part 01

Mahesh Chinthaka
7 min read · Feb 17, 2021

A bit of background first: there was a WSO2 deployment where everything deployment related, including product upgrades and artefact upgrades, was done manually. I was assigned to automate the whole deployment process.

The deployment was on an in-house (on-premise) OpenShift cluster, and the customer had chosen Azure DevOps as the CI/CD tool. After completing this task, Azure DevOps became one of my favourites (I would say the favourite) in the CI/CD domain. The thing I loved most about Azure DevOps is its documentation: very informative and comprehensive, yet simple and easy to learn from.

This deployment had 3 main environments: Dev, Test and Prod. There were 7 WSO2 product components in each environment, and for each WSO2 component/product we had to create a pipeline to build the Docker image and a pipeline to deploy the OpenShift artefacts:

  1. WSO2 API Manager
  2. WSO2 API Manager — Analytics
  3. WSO2 Identity Server as Key Manager (wso2is-km)
  4. WSO2 Enterprise Integrator — BPS profile
  5. WSO2 Enterprise Integrator — Integrator profile
  6. WSO2 Enterprise Integrator — MB profile
  7. WSO2 Enterprise Integrator — Analytics profile

Furthermore, there were 6 Maven multi-module projects (Carbon Applications), 6 EI custom mediator projects (Java) and 3 Angular projects. We created a pipeline for each. In the end we had created 30+ pipelines to automate the whole deployment process. It was actually fun; I learned a lot and enjoyed every bit of the automation.

Since this will certainly be a lengthy post, I'm going to break it into several pieces and publish them as separate posts. I will write it as a journey, so each subsection/subsequent post covers a task or challenge we accomplished throughout the automation journey.

Links to the other posts can be found at the bottom of this post.

Source Version Control

Built in Git Repositories in Azure Devops

We moved all our source code to Azure DevOps. It supports multiple source repository types; we created Git repositories under Azure Repos. Each repository had:

  1. 3 main branches (Develop, Test, Master). Each branch maps to an actual environment in the deployment; the Master branch maps to the Production environment.
  2. Direct commits allowed only to the Develop branch (you can set this under repository settings).
  3. Developers can create feature branches, bug-fix branches etc. and merge them to the Develop branch first.
  4. Merging PRs to the Test and Master branches requires a minimum number of code reviews.
  5. An azure-pipeline.yml file in the root, with the same content in each branch.

We used this branching strategy to promote changes from one environment to another. There are many other approaches to maintaining code for different environments in a deployment and promoting changes from lower environments to higher ones, but I found that the above branching strategy fulfilled all our requirements.

Maintain environment specific values

Variable Groups + Azure Key Vault

Library → Variable Groups

We created variable groups, defined parameters in them and assigned values accordingly. These parameter names are placed as tokens in the source. During pipeline execution, each token is replaced with its relevant value; we use the Replace Tokens extension in the pipeline for this task.
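
As an illustration only (this assumes the Replace Tokens extension is installed; the task version, directory and file pattern here are assumptions, not our exact setup), the token replacement step in the pipeline YAML can look like this:

steps:
- task: replacetokens@3
  displayName: Replace environment specific tokens
  inputs:
    rootDirectory: $(Build.SourcesDirectory)   # assumed location of the templated config files
    targetFiles: '**/*.xml'                    # illustrative pattern for files containing tokens
    tokenPrefix: '#{'                          # e.g. a #{EI_HOSTNAME}# token in the source
    tokenSuffix: '}#'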

Something to note here: the combination of Variable Group name + Variable name should identify both the environment and the product, but the variable name itself cannot contain any environment-specific keywords, because this parameter name lives in the source code. If you keep environment-specific variable names in the source, you may get merge conflicts, and merging the code from one branch to another won't be straightforward.

e.g. Variable Group name = PROD, Variable name = EI_HOSTNAME

In this case you will have variable groups called DEV, TEST and PROD, and inside each of them the same variable names but with different values.

If the variable is specific to a product, prefix the variable with the product name, e.g. EI_HOSTNAME, APIM_HOSTNAME etc.

If the variable is common to all products in that environment, use just the parameter name without any prefix, e.g. DOCKER_REGISTRY_HOST, OPENSHIFT_USERNAME.

You can import variables from Azure Key Vault, and you can also lock (mark as secret) individual variables if needed. This way you can store passwords inside variables too, as they are kept secure.
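
A minimal sketch of pulling such a group into a pipeline (the group and variable names are the ones from the examples above; in practice you would reference the group matching the branch/environment being built):

variables:
- group: PROD                                  # variable group, optionally linked to an Azure Key Vault

steps:
- script: echo "EI hostname is $(EI_HOSTNAME)" # non-secret values can be echoed; secret values are masked
  displayName: Show resolved hostname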

Certificates and Keystores

Secure Files

Library → Secure Files

We uploaded all the .jks files and certificates to Secure Files in Azure DevOps.

There is a built-in task called Download Secure File that you can use in your pipeline to retrieve a secure file for the build.

Unfortunately, grouping the secure files wasn't possible, so we had to add an environment prefix to every file name when uploading; within the pipeline, after retrieving the file, we rename it back to the generic name.

e.g. prod_wso2carbon.jks, test_wso2carbon.jks and dev_wso2carbon.jks all ended up as wso2carbon.jks in the Docker container during each build for its respective branch/environment.
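
A minimal sketch of that retrieve-and-rename step, assuming the production keystore was uploaded as prod_wso2carbon.jks (names taken from the example above):

steps:
- task: DownloadSecureFile@1
  name: carbonKeystore                         # lets later steps reference the downloaded path
  displayName: Download prod keystore
  inputs:
    secureFile: prod_wso2carbon.jks
- script: cp "$(carbonKeystore.secureFilePath)" wso2carbon.jks   # rename to the generic name the build expects
  displayName: Rename keystore to generic name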

Securing Deployments

Every change made in a lower environment had to go through at least two checkpoints before reaching a higher one.

Code reviews — we set a minimum number of code reviews for the Test and Master (Prod) branches before code can be merged.

You can do this by going to Project Settings → Repositories → Policies → Branch Policies.

Deployment approvals — for each environment created in Azure DevOps, we added Approvals and Checks to hold the pipeline until an authorised person approves pushing the artefacts to OpenShift (a deployment-job sketch follows below). Read more on Approvals and Checks in the Azure DevOps documentation.

(Screenshots: environments in Azure DevOps, and Approvals and Checks for an environment.)
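
To make these approvals apply, the deploy stage targets the Azure DevOps environment by name. A minimal sketch, assuming an environment called Prod and an oc-based deploy step (both illustrative):

jobs:
- deployment: DeployToOpenShift
  displayName: Deploy OpenShift artefacts
  environment: Prod              # Approvals and Checks configured on this environment gate the run
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "oc apply -f openshift/"   # placeholder for the actual deployment commands
          displayName: Apply OpenShift artefacts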

Writing Azure Pipelines

We created a YAML file called azure-pipeline.yml in the root of each repository and selected that file during pipeline creation:

Pipelines → New Pipeline → Select repository location → Select repository → Existing Azure pipelines YAML file


I would say learning to write Azure pipelines is like learning a new programming language: there are Azure-specific syntaxes, coding practices and flows. I will try to cover everything related to this WSO2 deployment automation.

I would recommend reading the Azure documentation to understand the basics and the pipeline syntax. Broadly, a pipeline consists of different Stages, a Stage can have multiple Jobs, and a Job can have multiple Steps and tasks.
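
As a bare-bones illustration of that structure (the stage, job and step names here are placeholders, not our actual pipeline):

stages:
- stage: Build
  jobs:
  - job: BuildDockerImage
    steps:
    - script: echo "build the WSO2 Docker image here"
      displayName: Build image
- stage: Deploy
  dependsOn: Build
  jobs:
  - job: DeployArtefacts
    steps:
    - script: echo "deploy the OpenShift artefacts here"
      displayName: Deploy artefacts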

Add Triggers to the Pipeline

Creating the pipeline YAML file and adding it to Azure Pipelines isn't enough; we need to add triggers to the pipeline. These triggers define for which events, and on which branches, the pipeline will be executed.

We added Continuous Integration (CI) triggers and Pull Request (PR) triggers covering all 3 branches. This helped us avoid executing the pipeline when committing or merging code to minor branches like bug-fix and feature branches.


But if executing the pipeline for minor branches is required, we can have that too; it's just a matter of adding the branch name or pattern to the include: section. If you want to exclude a branch or branch pattern, add it to the exclude: section. See below:

trigger:
  branches:
    include:
    - master
    - bug-fix/*
    - feature/*
    exclude:
    - releases/old*
    - feature/*-working
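
The PR triggers follow the same pattern under the pr: keyword. A sketch covering our main target branches (names from the branching strategy above) could look like:

pr:
  branches:
    include:
    - Develop
    - Test
    - master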

Use Variables and Variable Groups in Pipeline

We defined some variables inside the pipeline YAML file itself, because changing such a value means committing to the file, which in turn triggers the pipeline, and that is exactly what we wanted.

e.g. we keep the WUM update timestamp as a variable in the pipeline.

We also defined some global variables which are common to all the Stages in the pipeline, and some local variables that are specific only to a particular Stage, as sketched below.
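
A sketch of how that looks in the YAML (the variable names and values are illustrative):

variables:                          # global variables, visible to every stage
  WUM_TIMESTAMP: '2021-02-17'       # changing this value (a commit) triggers a new run

stages:
- stage: Build
  variables:                        # stage-level variables, visible only within this stage
    IMAGE_TAG: wso2ei-$(WUM_TIMESTAMP)
  jobs:
  - job: BuildImage
    steps:
    - script: echo "Building image with tag $(IMAGE_TAG)"
      displayName: Build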

Links to other posts in this series…

Part 01

Part 02

Part 03

Part 04
