2013 was the year that Docker arrived on the scene, and software container technology has advanced significantly since then. Like any new technology, containers would not be possible without an advanced environment in which to run: what I refer to as “the modern cloud”.

Containers are still a relatively new concept in the software development world, so here I’ll give a broad introduction to the technology, its relation to what came before, the benefits of containers, and the modern cloud infrastructure that they rely on to be used effectively.

What does the term ‘container’ really mean?

Containers are a means of software abstraction used by developers. The days of ‘bare metal’ servers and software designed specifically for them are behind us. Flexibility and agility are extremely important to modern developers, and virtual machines (VMs) were created as the next step in a chain of technologies that has led to containers as we know them today.


VMs rely on software known as a hypervisor to abstract their workloads away from the underlying hardware. Hypervisors replicate hardware capabilities such as CPU, networking and storage, enabling more tasks to run simultaneously across multiple VMs on each physical device. However, running VMs and replicating them across numerous devices can be a significant drain on resources.

It can be helpful to see containers as a lightweight version of VMs: they share the same basic function of abstracting processing work away from the underlying hardware, but because they share the host’s operating system kernel, they do not require a virtual copy of the host hardware or a full guest operating system to be fully operational.

What this means in practical terms is that a developer can fit far more containers on a single server than would be possible with VMs, resulting in more power and more flexibility – the developer can move faster and deploy to the cloud with greater ease.
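To illustrate just how lightweight this can be, here is a minimal sketch of a Dockerfile that packages a single small application without bundling a full operating system (the script name and base image here are illustrative assumptions, not taken from any particular project):

```dockerfile
# Start from a minimal base image (Alpine-based images are only a few megabytes)
FROM python:3.12-alpine

# Copy a single application script into the image
WORKDIR /app
COPY app.py .

# The container runs just this one process; no full guest OS is booted
CMD ["python", "app.py"]
```

Images built this way typically weigh in at tens of megabytes, compared with the multi-gigabyte disk images a VM requires, which is why a single server can host far more containers than VMs.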

More agile and more secure

One of the most appealing aspects of containers is the promise that they can run virtually anywhere. The technology has the capacity to scale from a single developer on a laptop all the way up to an entire production cluster. Containers are much more portable than previous software development approaches and allow developers to work with greater flexibility across complex applications.

Containers could also increase application security: before containers we had the ‘monolithic model’ of software development – when code had to be dealt with as a single, complex entity. If there were an error or an issue then the development team would have to analyse all their code, determine where the issue was located and remedy it without breaking any dependencies – a time-consuming process for even the most skilled developers.

Containerised software is more reliable and more secure. Issues can be isolated, removed and replaced with minimal disruption to the overall application. In addition, container technology supports the use of multiple coding languages in the same application – this means that cross-compatibility issues are minimised and different teams can work together more effectively.
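As a sketch of how this works in practice, a Docker Compose file can run services written in different languages side by side, each isolated in its own container (the service names and image names below are illustrative assumptions):

```yaml
services:
  # A web front end written in one language...
  web:
    image: my-node-frontend:latest   # hypothetical Node.js image
    ports:
      - "8080:3000"

  # ...talking to an API written in another language
  api:
    image: my-python-api:latest      # hypothetical Python image
    expose:
      - "5000"
```

Each service can be rebuilt, replaced or rolled back independently, so an issue in one container can be isolated and fixed with minimal disruption to the rest of the application.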

A note on the modern cloud

It all started in 2006 with the release of Amazon’s Elastic Compute Cloud (EC2). Before then, low-cost and developer-friendly VMs were hard to come by – only the most forward-thinking companies with advanced, internal cloud functionality could access them.

Tech giants such as Amazon stepped in and started to do much of the heavy lifting on the infrastructure side, allowing smaller companies with specialised knowledge to build features that were really relevant for their customers.

This supply of cheap, quick VMs allowed teams to move faster as they could rapidly spin up new VMs without having to manage the infrastructure requirements themselves.

Containers take only a few seconds to start, whereas VMs can take minutes. Containers are also more flexible than VMs – which are often locked in to a particular cloud provider. It is therefore faster to scale workloads in response to demand and, if required, to migrate to another cloud provider using containers – something which can be highly challenging with VMs.

Rather than a hypervisor, containers require a scheduling tool to be managed within the framework of the modern cloud. Containers, and their orchestration tools, can span multiple cloud infrastructures, a step closer to the end goal of ‘build once, run anywhere’.
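To make the idea of a scheduling tool concrete, here is a minimal sketch of a Kubernetes Deployment manifest (the application name and image are illustrative assumptions); the scheduler, not the developer, decides which machines in the cluster run each of the requested replicas:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app          # hypothetical application name
spec:
  replicas: 3             # the scheduler places these across available nodes
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
        - name: demo-app
          image: my-registry/demo-app:1.0   # hypothetical container image
```

Because the same image and manifest can be applied to any conformant Kubernetes cluster, the workload is not tied to a single cloud provider.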

What is next for containers?

Container technology is thriving: 81% of businesses surveyed earlier this year suggested that they would increase their investment in the container space.

Container technology has uses across a wide variety of industries, some of which may come as a surprise. Goldman Sachs, the American investment bank, has invested around $95m into Docker and plans to move the bulk of its workload onto the platform over the next two years. Tech companies such as Amazon, Microsoft and Google are some of the other high-profile advocates of Docker technology.

Containers allow developers to compartmentalise and manage complex code – a step towards full software development automation. Adoption of the technology has been widespread in the developer community, and the next step is for larger companies and enterprises to begin using the technology en masse. Container technology, when used in conjunction with schedulers such as Kubernetes on modern cloud infrastructure, has the potential to help automate more and more aspects of developers’ working lives.