Back in 1998, a small company called VMware was the first to successfully virtualize the x86 architecture, and in 2004 it became part of EMC. Today, virtualization is as commonplace as copies of The Phoenix Project on the bookshelves of IT workers. Many organizations and service providers rely on virtualized environments to run their workloads, and VMware’s technology has saved its customers huge sums of money they would otherwise have spent on additional hardware. In the last 16 years, the technology has come a long way, bringing with it new requirements to deploy applications quickly across any device, platform, or cloud infrastructure.
Enter Docker. Launched in 2013 as a side project at dotCloud, Docker allows developers to “containerize” their applications and run processes in isolation. That means an app no longer needs to carry a full operating system, dedicated disks, and so on. Containers still share the host’s Linux kernel, but everything else is completely isolated and independent of the operating system. Docker now runs across many public cloud providers, including AWS, Azure, and Rackspace.
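To make the idea concrete, a containerized app is typically described by a Dockerfile. The sketch below is a hypothetical minimal example, not from the original: the base image choice and the script name app.py are assumptions for illustration.

```dockerfile
# Minimal sketch (assumed names): package a hypothetical Python script into a container.
FROM python:3           # base image providing a Python runtime
WORKDIR /app            # working directory inside the container
COPY app.py /app/       # copy the (assumed) application script into the image
CMD ["python", "app.py"]  # process to run, isolated from the host OS
```

Building this with `docker build` and running it with `docker run` produces the same process on a laptop, an on-premises server, or a public cloud, which is the portability the article describes.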
What does all this mean for us? For one thing, it makes applications even more portable: developers and DevOps engineers can quickly and easily move applications between public and private clouds, from on-premises servers to someone’s laptop. Second, it makes fast application deployment easier to engineer. Third, it allows even more resource optimization, taking the next step on the path VMware started down 16 years ago.
Are we living in an app-driven world? Absolutely. And Docker, valued at around $400m, is living proof.