Application programming interfaces (APIs) have clearly been a boon to IT service providers and their customers alike. In fact, a new report produced by Apigee (an API management firm recently acquired by Google) and a Massachusetts Institute of Technology digital fellow suggests that companies that employ APIs see annual revenue increases of 13.5 percent on average.
But APIs are only the beginning of a much larger trend that is transforming how software is built, delivered, and managed. The general availability of APIs has given rise to microservices architectures that better isolate specific software functions. Rather than building and trying to support monolithic applications, many IT organizations are starting to employ container technologies such as Docker to change the way they build and manage applications.
How containers change the game
Docker containers change the way applications are built and run because a Docker container consists of a complete filesystem that includes the code, runtime, system tools, and system libraries an application needs to run. That capability not only makes it simpler to package application code in a portable way; it also enables IT organizations of all sizes to adopt a microservices approach to IT that makes applications simultaneously more robust and more flexible.
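In practice, that complete filesystem is declared in an image definition. The sketch below is a minimal, hypothetical Dockerfile for a small Python service; the base image, file names, and port are illustrative assumptions, not details from the article:

```dockerfile
# Hypothetical Dockerfile for a small Python service; names and
# versions are illustrative only.
FROM python:3.11-slim          # base image supplies the runtime and system libraries
WORKDIR /app
COPY requirements.txt .        # dependencies travel with the image
RUN pip install --no-cache-dir -r requirements.txt
COPY . .                       # application code is baked into the filesystem
EXPOSE 8080
CMD ["python", "service.py"]   # one process per container
```

Because everything the service needs ships inside the image, the same container runs unchanged on a laptop, a virtual machine, or a production cluster.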
Just as importantly, containers change the way applications are upgraded. Instead of essentially patching a monolithic application to add new functionality, developers add or replace containers that provide new functionality. Developers can even group containers to create a set of logically connected functions in software.
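Grouping containers into a set of logically connected functions is commonly expressed in a composition file. The sketch below is a hypothetical Docker Compose definition pairing an API container with a cache; the service names and images are assumptions for illustration:

```yaml
# Hypothetical docker-compose.yml: two logically connected services.
# Upgrading the API means replacing its image tag, not patching a monolith.
version: "3.8"
services:
  api:
    image: example/orders-api:2.1   # swap the tag to roll out new functionality
    ports:
      - "8080:8080"
    depends_on:
      - cache
  cache:
    image: redis:7-alpine           # a supporting function in its own container
```

Replacing `example/orders-api:2.1` with a newer tag and redeploying swaps out that one function while the rest of the application keeps running.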
The impact these changes are having on how IT is managed is already substantial. A recent survey of more than 500 IT professionals conducted by Docker Inc. finds that more than half report running at least one containerized application in a production environment, and fully 90 percent say they are employing Docker containers in application development today. On average, these organizations report a 13-fold increase in the rate at which applications are released inside their organizations.
A full 85 percent say that, following the adoption of Docker containers, they are seeing improvements in their overall approach to IT operations. Some IT organizations have even moved entire legacy applications into containers as a first step toward eventually decomposing them into sets of microservices that promise to be simpler to manage.
Overcoming microservices challenges
At the same time, Docker containers dramatically increase the potential for contention within the IT environment. With hundreds of containers running on top of the same physical infrastructure, one key to success is making sure IT operations teams have the tools they need to discover issues that emerge when deploying containers at scale. That challenge has already given rise to a host of IT monitoring tools optimized for containers.
Today, containers can be deployed on virtual machines, in a platform-as-a-service (PaaS) environment, or on a physical machine. Because of a lack of tools optimized specifically for containers, though, most containers currently run on top of virtual machines or in a PaaS environment. But as container technology continues to mature rapidly, more containers will wind up running directly on physical machines as a lighter-weight alternative to virtual machines.
Clearly, containers and microservices are going to have a profound impact on how DevOps needs to be approached. Instead of developers throwing code over the wall for IT operations teams to manage, developers are now a core part of the overall application management lifecycle.
Naturally, containers will take some getting used to for both internal IT organizations and IT service providers, but at this juncture there is no going back. IT service providers will need to adapt quickly to a different approach to managing IT that fundamentally changes almost every assumption they now take for granted.
Photo by Greg Rakozy