
Automated Application Deployment Is Not Enough! 3 Reasons Why You Absolutely Still Need To Validate Your Releases


Despite the widespread use of deployment automation software today, change management and release validation still play a critical role in maintaining optimum availability and performance and preventing harmful downtime.

So why spend the extra time and trouble to validate releases? After all, you can automate your deployments, and everything is supposed to run as planned. No surprises, right?

Well, not exactly. There are several angles from which to look at this and see where deployment automation still falls short.

#1 Environment Modeling Only Maps What We Already Know, NOT Unknowns

When you use automation tools, you create models of an environment's configuration. Each tool builds these models in different ways, but the goal is the same: to model the environment that is going to be created.

The problem with creating these models is that you are only modeling what you know, not what you don't know.

What do I mean? 

Let's say you are deploying to Microsoft IIS, and you want to change the connection timeout parameter. So you define the settings that you intend to change. But there are hundreds of parameters that can be changed in IIS. Do you know which parameters exist, and what their values are supposed to be? The way you deploy in one environment may not be the same as in another, and there are aspects of IIS configuration that are affected by its native deployment rather than by your model.

Is this model of the environment comprehensive? Does the model include everything? 

The answer is no. As a result of this modeling, you end up with an environment that you don't have 100% control over; you control only those parameters that you actually defined, leaving the deployment exposed to failure as it moves through environments.
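To make the gap concrete, here is a minimal sketch (in Python, with hypothetical parameter names and values, not any particular tool's model format) of how a model that declares only the settings you chose to manage compares against the full configuration of the live environment:

    # The deployment model declares only the parameters we decided to manage.
    declared_model = {
        "connectionTimeout": 120,
    }

    # A snapshot of the deployed IIS site (illustrative names and values only).
    actual_config = {
        "connectionTimeout": 120,
        "maxBandwidth": 4294967295,
        "maxConnections": 4294967295,
        "appPool.idleTimeout": 20,
        "appPool.recyclingSchedule": "03:00",
    }

    # Everything the model never mentions is outside your control.
    unmanaged = {k: v for k, v in actual_config.items() if k not in declared_model}

    print(f"Parameters under the model's control: {len(declared_model)}")
    print(f"Parameters outside the model: {len(unmanaged)}")
    for name, value in unmanaged.items():
        print(f"  unmanaged: {name} = {value}")

Anything in that unmanaged set can drift between environments without the deployment tool ever noticing.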

#2 Automated Deployment Tools Are Complex and Not Fully Automatic

Despite being designed to streamline the release management effort, automated deployment tools don't function completely automatically. You need an operator to configure these tools and to ensure that the results will be error-free. This is the same situation as in software development, where bugs can occur in the coding process. In the same way, a deployment automation setup can contain bugs, even after you have used it to deploy your release. The fact that you first released to a test environment and then released to production won't ensure that the releases are consistent, because there are many different kinds of configurations and dependencies specific to each of these environments.

Since these tools can contain such errors, release validation plays a crucial role in stability. 
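As a simple illustration of why a test release doesn't guarantee a consistent production release, here is a hypothetical Python sketch: the same deployment definition is applied on top of environment-specific baselines, and the resulting configurations still differ. All names and values are invented for the example:

    # The same deployment definition is applied to both environments.
    deployment_definition = {"app.version": "2.4.1", "db.pool.max": "50"}

    # Each environment has its own baseline of dependencies (illustrative values).
    test_baseline = {"os.patch_level": "2023-09", "tls.version": "1.3", "db.driver": "10.2"}
    prod_baseline = {"os.patch_level": "2023-04", "tls.version": "1.2", "db.driver": "10.1"}

    # The deployed result is the baseline plus whatever the tool lays down.
    test_result = {**test_baseline, **deployment_definition}
    prod_result = {**prod_baseline, **deployment_definition}

    inconsistent = sorted(k for k in test_result if test_result[k] != prod_result.get(k))
    print("Settings that differ between test and production:", inconsistent)

The deployment itself ran identically in both places; the inconsistency comes from everything the tool never touched.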

#3 Don't Believe That You've Achieved 100% Automation

Alongside all of the automated changes that are implemented, many manual actions still take place.

Automated deployment tools very rarely cover 100% of deployment and maintenance activities.

It is not as if, now that you have automated the deployment, you never need to touch the system again. Sometimes, after a deployment, you will discover that you need to change configurations. Changing configurations individually creates a discrepancy; that is to say, you end up with an environment that includes both automated and manual changes. Some changes are defined outside the automated tools and can fall through the cracks in the deployment process.

What is the impact? Let's say that I am in a production environment, I go and change an image or a parameter, and then I forget about it. What happens? The automated deployment tool comes along and wipes out all the manual changes that were made, making the situation worse! Or take the opposite case: I automate the deployment on one server but forget to deploy on another. This results in serious discrepancies throughout the environment, impacting stability and reducing performance.
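Catching that second case comes down to comparing configurations across servers. Here is a minimal, hypothetical Python sketch of that kind of cross-server comparison (server names, keys and values are invented; this is not any vendor's implementation):

    # Compare two servers' configurations and report every mismatch.
    def diff_configs(a, b):
        keys = set(a) | set(b)
        return {k: (a.get(k, "<missing>"), b.get(k, "<missing>"))
                for k in keys if a.get(k) != b.get(k)}

    server1 = {"app.version": "2.4.1", "cache.size": "512", "logo.image": "v2.png"}
    server2 = {"app.version": "2.4.1", "cache.size": "256"}  # the deploy was skipped here

    for key, (v1, v2) in sorted(diff_configs(server1, server2).items()):
        print(f"DRIFT {key}: server1={v1}  server2={v2}")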

Not by Automated Deployment Alone: Why You Need Release Validation

  • Automated deployment tools only define part of the environment, leaving you without full assurance that the environment you launched is functioning the way you planned it to.
  • Automated deployment tools are manually configured and can contain bugs, so you need to verify that mistakes and errors don't impact deployments.
  • Not everything can be automated; in many cases, manual changes take place either before or after deployment, and these can introduce discrepancies that destabilize the environment.

How does release validation complement automated application deployment tools?

Independent Release Validation is Critical

Since what you don't put into the system won't define itself, independent release validation is critical. Just as in software development you still need to validate the code independently, the same is true for deployments: the automated deployment can run perfectly yet never be verified against all configuration settings. Automation also complicates troubleshooting, because when a problem occurs, where do you look for the root cause?

Release Validation Checks for Manual Changes

When you run a performance testing environment, your application deployment or software deployment tool handles the rollout, and then you follow up by checking the deployment. How do you check? You tune. You improve here and there. You make some configuration management changes. And you see an improvement. Then, let's say, of the 10 changes you made, you remember about 8 of them, add them to your deployment tool, and redeploy. The release doesn't work. Why? There are still 2 other changes that you forgot to include in the redeployment. So how do you catch this?

Release validation compares and verifies what you had already checked and changed, identifying, in this case, any changes that were not carried over into the automated deployment tool.
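In code terms, the "8 out of 10" scenario above boils down to comparing the changes observed in the tuned environment against the changes recorded in the deployment tool. Here is a minimal Python sketch, with entirely hypothetical setting names and values:

    # All 10 changes actually made while tuning the performance environment.
    tuned_environment_changes = {
        "jvm.heap": "4g", "db.pool.max": "50", "cache.ttl": "300",
        "gzip.enabled": "true", "thread.count": "64", "log.level": "WARN",
        "session.timeout": "30", "keepalive": "true",
        "tmp.dir": "/fastdisk/tmp", "stats.sampling": "0.1",
    }

    # Only 8 of them were remembered and added to the deployment tool's model.
    deployment_tool_model = {
        k: tuned_environment_changes[k] for k in list(tuned_environment_changes)[:8]
    }

    # Release validation flags whatever was left out of the redeployment.
    forgotten = {k: v for k, v in tuned_environment_changes.items()
                 if deployment_tool_model.get(k) != v}
    print("Changes missing from the redeployment:", forgotten)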

Find out More

Learn more about how Evolven's groundbreaking solution takes on the dynamics and complexity of the modern data center and cloud, in a way that was never available before.


About the Author
Sasha Gilenson
Sasha Gilenson enjoyed a long and successful career at Mercury Interactive (acquired by HP), where he led the company's QA organization, participated in establishing Mercury's Software as a Service (SaaS) offering, and led a Business Unit in Europe and Asia.

Sasha played a key role in the development of Mercury's worldwide Business Technology Optimization (BTO) strategy and drove field operations of the Wireless Business Unit, all while taking on the duties of Mercury's top "guru" in the domain of quality processes and IT practices. In this capacity, Sasha has advised numerous Fortune 500 companies on technology and process optimization, and in turn acquired a comprehensive and rare knowledge of the market and industry practices.

Sasha holds an M.Sc. in Computer Science from Latvian University and an MBA from London Business School.