This week, all major industry players joined forces to come up with a common standard format for Linux containers. This is great news because imho containers are the future™ of deploying code. They are a key enabler of the DevOps methodology and will drive IT efficiency improvements in many organizations.
Containers are lightweight virtualization environments
Linux Containers (LXC) is a lightweight virtualization technology. It allows running multiple isolated Linux systems on a single Linux host. Compared to traditional virtual machines, containers are much more lightweight because they are technically just operating system processes. They don’t need to bring their own kernel, drivers, or much of the other bulk that a full operating system requires. As a result, containers boot in only a few seconds. In fact, the overhead of starting a container is so small that most people spin up a dedicated container for each app instance.
Being a virtualization technology, containers are well isolated from each other; they can communicate with each other only over a virtual network, or with the host through file mounts.
As the name already suggests, a container bundles up an application with all of its dependencies. Just think about real-world shipping containers for a second. They are designed to fit on a wide range of vehicles, including ships, trucks and trains. This standardization was a massive enabler for global trade. The same principle applies to their virtual counterparts – containers can run on development laptops or data center clusters alike. Basically, developers can work in the very same execution environment as the production system. Although this is not a new idea, it is an effective relief for dependency hell.
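To make this concrete, here is a minimal sketch of how such a bundle is described with a Dockerfile. The app, file names and base image are hypothetical – assumptions for illustration, not part of the announcement:

```dockerfile
# Start from an official Node.js base image (choice of runtime is an assumption)
FROM node

# Copy the app into the image and install its dependencies there,
# so the image carries everything the app needs to run
WORKDIR /app
COPY package.json /app/
RUN npm install
COPY . /app

# The same image then runs unchanged on a laptop or in a data center
CMD ["node", "server.js"]
```

Because every dependency is baked into the image, the container behaves the same on a developer laptop as on a production cluster.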
Currently, the most popular container format is Docker. However, standardization is key in this game. This week’s announcement of a common format supported by all major players is a big step towards success.
Developers own the whole lifecycle of their applications
Developers can now configure the target environment of their applications themselves. This increases efficiency because handoffs to operations are reduced. Deployments can happen in a matter of seconds, many times a day. This is fantastic news for anybody who wants to do Continuous Delivery. It also creates a deeper ownership of the application: developers become largely responsible for how the app runs and are in charge when things go wrong.
On the other hand, operations can focus on building and maintaining the infrastructure. In theory, they don’t need to know about the apps that are running on their systems. To make this process work, automation is key: developers need an API to deploy their code efficiently – and containers are the exchangeable data format.
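As a sketch of what that automated handoff can look like with the Docker CLI – the image name and registry host below are hypothetical:

```shell
# Build an image from the app's Dockerfile (tag is an example)
docker build -t registry.example.com/myapp:1.0 .

# Push it to a registry that operations maintains
docker push registry.example.com/myapp:1.0

# Deployment then boils down to pulling and running the image
docker run -d --name myapp registry.example.com/myapp:1.0
```

The registry acts as the interface between the two roles: developers publish images, operations runs whatever lands there, without needing to know what is inside.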
This may sound familiar because it is the very concept of cloud service providers (IaaS and PaaS). Amazon AWS, Google Compute Engine and Microsoft Azure already support containers for running code on their infrastructure. This strengthens the case for outsourcing IT operations. On top of that, public clouds offer virtually unlimited scaling and worldwide coverage. Many successful startups like Dropbox or Airbnb didn’t even bother setting up their own servers. However, most of the efficiency improvements also apply to private clouds.
DevOps will drive IT speed in many organizations
Being fast and agile is what it takes these days for a business to become successful or stay relevant. This is widely acknowledged across many industries. Companies like Netflix or Spotify have already profited massively from that competitive advantage.
Reducing deployment times from hours or days to seconds implies a drastically lower cost of change. It is one aspect of achieving greater speed. Paired with development teams’ ownership of their applications, it will change how businesses deliver software. This is sometimes referred to as the DevOps methodology. Containers are a technical enabler for it, and companies should take advantage of that.
Most of these ideas and concepts have been discussed before. If you’d like to learn more I’d highly recommend these resources:
- Adrian Cockcroft: State of the Art in Microservices (YouTube)
- Adrian Cockcroft: Faster, cheaper, safer (slides)
- Martin Fowler: Continuous Delivery
- Gene Kim et al.: The Phoenix Project (Book)
- Open Container Project
Title photo by Kevin Talec (CC BY-SA 2.0)