A recent 2i blog, Going with the Flow, from Principal Consultant Greg McKenna, explains DevOps from a people-and-operations perspective, showing how performance can be improved through a methodology built around the headline concept of 'velocity flow': a measurable, manageable quality system that increases productivity.
The headline insight is that embedding testing throughout the DevOps life-cycle also establishes quality reporting and improvement feedback loops; indeed, it is logically the best way to do so. Quality programmes seek to identify and reduce the error rates of production lines; for software, that means fewer bugs.
Greg describes how 'Value Stream Mapping' can be used to document and chart the flow of work across an organisation, and how the principles of 'Flow' can guide the building of one continuous, silo-free system of work from idea to production. Process improvements then embed better quality controls through better use of testing, such as Test Driven Development and automated unit and acceptance tests.
This allows work to be assessed as 'done' or fed back through a review cycle, and surfaces global-level insights for improving overall throughput, such as reducing batch sizes. Flow principles focus on common problem points, like hand-offs between teams or reliance on manual methods, within a global view so they can be prioritised and addressed according to how much of an overall constraint they cause.
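The Test Driven Development practice mentioned above can be illustrated with a minimal sketch: the test is written first and defines what 'done' means for the code. The apply_discount function and its rules here are hypothetical, purely for illustration.

```python
# Minimal TDD-style sketch: the test below was notionally written first,
# and the implementation exists only to make it pass.

def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent; rejects invalid percentages."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount():
    # The test expresses the acceptance criteria: correct maths,
    # the zero-discount edge case, and rejection of bad input.
    assert apply_discount(100.0, 20) == 80.0
    assert apply_discount(59.99, 0) == 59.99

test_apply_discount()
```

In a real pipeline such tests run automatically on every commit, which is how they become the quality feedback loop the flow model relies on.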
In an in-depth article, McKinsey explores the dynamics and successes of the 'Digital Factory' model: a new approach to organising digital teams as a self-organising structure rather than a rigid departmental hierarchy. It is being pioneered by the likes of Skyscanner in Scotland, who operate a culture of 'tribes and squads', and McKinsey describes the KPI improvements the Digital Factory enables:
“We see reductions in management overhead of 50 percent for technology teams in the DF, 70 percent in the number of business analysts needed to write technology requirements, and, as test automation becomes the norm, a drop of 90 percent in the number of testers. Finally, we see top engineering talent performing at eight times the level of their peers, as measured with metrics such as code commits.”
In short, McKinsey describes how a development team that improves quality and finds efficiencies is a direct by-product of one that leverages the latest tools and team methods to produce better digital products, faster. Organisations should seek both a reduction in error rates and this better throughput, as one enables the other.
Greg describes the role technologies play in these new environments, notably cloud computing, containers and code deployment, with automation as the headline theme. 'DevOps Flow Automation' means identifying DevOps flows and KPIs, then applying automation tools to improve performance against those metrics.
2i offers an Automation Maturity Review, an assessment that follows DevOps Flow value stream mapping to identify the optimal target areas for automation. These can then be addressed via various automation solutions, applied at different levels of scope and in niche functional areas.
At a functional level, 2i specialises in testing tools from vendors like Postman, a popular toolset for testing APIs. Postman recently announced a new feature release that makes the platform available via a browser, making it easier to share and inspect APIs.
APIs are the building blocks of modern digital businesses, so their robust testing is fundamental to successful deployments, especially in this new 'Cloud Native' era. As Postman describes in news about its 'State of the API' report, "53.9% said microservices is the most exciting technology for developers in the next year, while 45.5% said containers and 44.0% said serverless architecture."
More cloud and more microservices mean more APIs and more testing, so capacity and tools need to keep pace. Previously, Postman required DevOps teams to download a desktop application to access the Postman platform; it now provides a browser interface that distributed DevOps teams working remotely can access. In today's Covid world, it is important that such functions can be operated and scaled this way.
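The kind of API test a tool like Postman automates can be sketched in a few lines: assert on the status code and on the response contract. This is an illustrative sketch in Python rather than Postman's own scripting, and the stubbed payload and check_user_response function are hypothetical, kept local so the example is self-contained rather than calling a live endpoint.

```python
import json

def check_user_response(status: int, body: str) -> bool:
    """Return True if the response matches the expected API contract:
    HTTP 200 with a JSON object containing an integer id and string name."""
    if status != 200:
        return False
    data = json.loads(body)
    return isinstance(data.get("id"), int) and isinstance(data.get("name"), str)

# Stubbed response payload standing in for a real API call.
stub = json.dumps({"id": 42, "name": "Ada"})

assert check_user_response(200, stub)        # contract satisfied
assert not check_user_response(404, stub)    # wrong status fails the test
```

In practice such checks run against live endpoints on every deployment, which is what makes API testing a continuous quality gate rather than a one-off activity.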
Cloud Native GitOps
Automation can also be applied at a larger, organisation-wide capability level, for example through popular tools like Jenkins, which automates much of the software build and deployment life-cycle. Newer innovations include 'GitOps': using a Git repository (for example on GitHub) as the central source of truth for Jenkins et al, keeping Kubernetes-based production systems in line with source code through automated monitoring and reconciliation.
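The core GitOps mechanism is a reconciliation loop: compare the desired state declared in version control with the live state of the cluster, and apply the difference. Real tools such as Argo CD or Flux do this against Kubernetes; the sketch below is a hedged illustration in which both states are plain dictionaries and the reconcile function is hypothetical.

```python
def reconcile(desired: dict, live: dict) -> dict:
    """Return the changes needed to bring live state in line with the
    desired state held in Git: updates for drifted resources, None to
    prune resources that were removed from the repository."""
    changes = {}
    for name, spec in desired.items():
        if live.get(name) != spec:
            changes[name] = spec      # create or update drifted resources
    for name in live:
        if name not in desired:
            changes[name] = None      # prune resources deleted from Git
    return changes

# Desired state as declared in the repo vs. what is actually running.
desired = {"web": {"image": "web:1.2", "replicas": 3}}
live = {"web": {"image": "web:1.1", "replicas": 3},
        "debug": {"image": "tools:1"}}

changes = reconcile(desired, live)
# 'web' needs updating to the new image; 'debug' is not in Git, so prune it.
assert changes == {"web": {"image": "web:1.2", "replicas": 3}, "debug": None}
```

Running this comparison continuously, rather than on demand, is what keeps production from drifting away from what the repository says should be deployed.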
The need for this type of larger-scale automation is clear when you consider the ultimate digital platform businesses, like Netflix, which operates a planet-scale media distribution system. Netflix applies testing from top to bottom and start to finish across its entire environment, including but not limited to its software development life-cycle.
Given the principle of 'infrastructure as code', they know that failures can occur at any point within the overall environment, not just in the code they write. They operate a global infrastructure spanning multiple AWS regions and running thousands of inter-operating microservices, so they also test at this level, for example by purposely taking an entire AWS region offline to ensure their infrastructure can fail over gracefully.
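This style of failover testing can be sketched in miniature: take one region offline and assert that traffic is still served from the survivors. The Router class, its methods, and the region names below are illustrative assumptions, not Netflix's actual tooling.

```python
import random

class Router:
    """Toy multi-region router: tracks healthy regions and serves
    traffic only from those that remain."""

    def __init__(self, regions):
        self.healthy = set(regions)

    def fail(self, region):
        # Simulate a whole region being decommissioned or lost.
        self.healthy.discard(region)

    def route(self):
        if not self.healthy:
            raise RuntimeError("total outage: no healthy regions")
        return random.choice(sorted(self.healthy))

router = Router(["eu-west-1", "us-east-1", "ap-south-1"])
router.fail("us-east-1")                      # the failover experiment
assert router.route() in {"eu-west-1", "ap-south-1"}
```

The point of running such experiments deliberately, and in production-like conditions, is that graceful failover becomes a tested property of the system rather than a hope.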
Cloud guru David Linthicum makes the point that test automation is an essential success factor in this evolution to Cloud Native, and 2i's Automation Maturity Review services can help you pinpoint which combination of tools and methods will enable your organisation to undertake the same journey successfully.