continuous-deployments

Frequency

book-continuous-delivery#frequent-releases

More frequent releases lower the risk of any particular release, because the amount of change between releases goes down. If you release every change as soon as it is ready, the risk of each release is limited to the risk inherent in that one change. Continuous deployment is therefore a powerful way to reduce per-release risk.
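The risk argument can be made concrete with a toy calculation (my illustration, not from the book): if each change independently carries some small chance of causing an incident, the chance that a batched release contains at least one bad change grows quickly with batch size.

```python
# Toy model (illustrative assumption): each change independently has a
# 1% chance of introducing a defect.
P_BAD = 0.01

def release_failure_probability(batch_size: int) -> float:
    """Probability that a release containing `batch_size` changes
    includes at least one defective change."""
    return 1 - (1 - P_BAD) ** batch_size

print(f"batch of   1: {release_failure_probability(1):.1%}")    # 1.0%
print(f"batch of 100: {release_failure_probability(100):.1%}")  # 63.4%
```

Under this (hypothetical) 1% assumption, releasing one change at a time keeps per-release risk at 1%, while a 100-change batch is more likely than not to contain a defect, and isolating the culprit then requires extra work.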

 
book-site-reliability-engineering#very-small-sets-of-changes

Simple releases are generally better than complicated releases. It is much easier to measure and understand the impact of a single change than of a batch of changes released simultaneously. If we release 100 unrelated changes to a system at the same time and performance gets worse, understanding which changes impacted performance, and how they did so, will take considerable effort or additional instrumentation. If the release is performed in smaller batches, we can move faster and with more confidence, because each code change can be understood in isolation in the larger system. This approach to releases can be compared to gradient descent in machine learning, in which we find an optimal solution by taking one small step at a time and considering whether each change results in an improvement or a degradation.
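The gradient-descent analogy can be sketched as a loop that applies one small change at a time, measures a health metric, and rolls back any change that degrades it. The state, change list, and metric below are all hypothetical stand-ins for illustration.

```python
def deploy_incrementally(baseline, changes, measure):
    """Apply changes one at a time, keeping each only if the measured
    metric (lower is better, like a loss function) does not degrade."""
    state = baseline
    best = measure(state)
    for change in changes:
        candidate = change(state)   # release a single small change
        score = measure(candidate)  # observe its effect in isolation
        if score <= best:           # improvement (or neutral): keep it
            state, best = candidate, score
        # else: the degradation is attributable to exactly this change,
        # so it can be rolled back without disturbing the others
    return state

# Illustrative usage: "state" is a number, changes nudge it, and the
# metric is distance from a target value.
target = 10
changes = [lambda s: s + 4, lambda s: s - 7, lambda s: s + 2]
result = deploy_incrementally(0, changes, lambda s: abs(s - target))
```

With a 100-change batch, a worsened metric tells you almost nothing about which change caused it; here, every rejection points at exactly one change.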

 

Forces good process

book-continuous-delivery#force-you-to-do-right-thing

Perhaps most importantly, continuous deployment forces you to do the right thing (as Fitz points out in his blog post). You can't do it without automating your entire build, deploy, test, and release process. You can't do it without a comprehensive, reliable set of automated tests. You can't do it without writing system tests that run against a production-like environment.
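The fully automated build/deploy/test/release process the passage demands can be sketched as a gated sequence of stages, where any failure halts the pipeline before a change reaches production. The stage names and functions below are hypothetical placeholders; a real pipeline would shell out to build tools, test runners, and deployment scripts.

```python
def run_pipeline(stages):
    """Run stages in order; the first failure stops the pipeline so a
    broken change never reaches production."""
    for name, stage in stages:
        if not stage():
            print(f"{name} failed; stopping pipeline")
            return False
        print(f"{name} passed")
    return True

# Hypothetical stages mirroring the text: build, automated tests, and
# system tests against a production-like environment gate the deploy.
ok = run_pipeline([
    ("build",        lambda: True),
    ("unit tests",   lambda: True),
    ("system tests", lambda: True),  # run against a production-like env
    ("deploy",       lambda: True),
])
```

The point of the passage is that continuous deployment makes these gates mandatory: without comprehensive automated tests, the pipeline has nothing trustworthy to gate on.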

 

A big change

book-continuous-delivery#shift-in-way-of-doing-things

Continuous deployment takes frequent, fully automated releases to their logical conclusion. It should be taken seriously, because it represents a paradigm shift in the way software is delivered.