Google is putting its considerable weight behind an open source technology that’s already one of the hottest new ideas in the world of cloud computing.
This technology is called Docker. You can think of it as a shipping container for things on the internet–a tool that lets online software makers neatly package their creations so they can rapidly move them from machine to machine to machine. On the modern internet–where software runs across hundreds or even thousands of machines–this is no small thing. Google sees Docker as something that can change the way we think about building software, making it easier for anyone to instantly tap massive amounts of computing power. In other words, Google sees Docker as something that can help everyone else do what it has been doing for years.
“Google and Docker are a very natural fit,” says Eric Brewer, a kind of über-engineer inside Google. “We both have the same vision of how applications should be built.”
On Tuesday, with a keynote speech at a conference in San Francisco, Brewer is set to unveil new ways that Google will combine Docker with its cloud computing services, Google App Engine and Google Compute Engine. For the company, this is a way of fueling interest in these services as it strives to challenge Amazon’s dominance in the burgeoning cloud market. But considering Google’s widely recognized knack for building its own massive internet applications, from Google Search to Gmail, Brewer’s speech will also provide an enormous boost for Docker.
The news will carry a particular weight because it’s coming from Brewer. You can think of him as the patron saint of modern internet architecture. From Google and Amazon to Facebook and Twitter, today’s tech giants run their web services across thousands of dirt-cheap computer servers, using sweeping software tools to transform so many tiny machines into one massive whole. It’s a bit like building computers the size of warehouses. It’s the only viable way of dealing with the ever-increasing demands of modern web services. And it all began with Eric Brewer.
In the mid-1990s, as a professor of computer science at the University of California, Berkeley, Brewer built Inktomi, the first web search engine to run on a vast network of cheap machines, as opposed to one enormously powerful–and enormously expensive–computer server. And as the Googles and the Amazons and the Facebooks took this idea to new extremes over the next two decades, they leaned on Brewer’s most famous bit of computing philosophy: the CAP theorem, a kind of guide to how these massive systems must be built. “He is the grandfather of all the technologies that run inside Google,” says Craig McLuckie, a longtime product manager for Google’s cloud services.
Now, none too surprisingly, Brewer is also a key cog in the Google machine, part of the team of elite engineers that oversee the design of the company’s entire online empire. What this means is that, after reshaping the net the first time around, the slick-bald computing guru is bringing the next wave of new ideas to the realm of online architecture.
It’s not just that he’s helping to refine Google’s global network of data centers, the most advanced operation on the net. Like Amazon and Microsoft and so many others, Google is now offering cloud computing services that let anyone else build and run software atop its vast infrastructure, and Brewer is among those working to impart Google’s particular expertise to all the companies that can benefit from these cloud offerings. Today’s cloud computing services can simplify life for developers–letting them build online software without setting up their own hardware in their own data centers–but in backing Docker, Brewer hopes to make things even easier.
Brewer says that Docker mirrors the sort of thing that Google has done for years inside its own data centers, providing a better way of treating hundreds of machines like a single computer, and he believes it represents the future of software development on the net.
The Super Container
Built by a tiny startup in San Francisco, Docker is open source software that’s freely available to the world at large. At first blush, it may seem like a small thing, but among Silicon Valley engineers, it’s all the rage. “If you believe that what makes life easier for developers is where things are moving, then this containerization thing is where things are moving,” eBay developer Ted Dziuba told us this past fall. According to Docker, over 14,000 applications are now using its containers, and Brewer says a developer technology hasn’t taken off so quickly and so enormously since the rise of the Ruby on Rails programming framework eight or nine years ago.
That said, the importance of Docker can be hard for even seasoned developers to grasp. For one thing, it’s based on technologies that have been around for years. The open source Linux operating system–the bedrock of today’s online services–has long offered “containers” that isolate various tasks on a computer server, preventing them from interfering with one another. Google runs its vast empire atop containers like these, having spent years honing the way they work. But Docker has made it easier to move such containers from one machine to another. “They’ve done a very nice job of making it easy to package up your software and deploy it in a regularized way,” Brewer says. “They’re making the container a more effective container.”
This can help developers in multiple ways. It means that if they build a software application on a laptop, they can immediately move it onto a cloud service and run it–without making changes. But the hope is that it will also let them more easily move applications wherever they want to run them, whether that’s their own data centers or Google’s cloud services or Amazon’s or a combination of all three. “It can make machines fungible,” says Solomon Hykes, the chief technology officer at Docker and the driving force behind the company’s open source project. This has always been the promise of cloud computing–that we could treat the internet like one giant computer–but we’re nowhere near that reality. Due to the vagaries of different operating systems and different cloud services, it can be quite hard to move software from place to place.
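To make the packaging idea concrete: a developer describes an application and everything it needs in a short text file, and Docker turns that description into a container image that runs the same way wherever a Docker-equipped Linux machine is available. The sketch below is illustrative, not drawn from Google’s or Docker’s announcements; the file names and base image are hypothetical.

```dockerfile
# Illustrative Dockerfile (hypothetical app and file names):
# package a small Python web app so the same container image
# runs unchanged on a laptop, in a private data center, or on a cloud.

# Start from a public base image that already includes the runtime
FROM python:2.7

# Copy the application code into the image
COPY app.py /app/app.py

# The command the container runs when it starts
CMD ["python", "/app/app.py"]
```

From there, `docker build -t my-app .` packages the application once, and `docker run my-app` launches it on any machine with Docker installed, no per-machine setup required. That is the fungibility Hykes describes.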
The Bigger Effect
Granted, Docker can’t change this overnight. First off, in order to run Docker containers, each machine must be equipped with a small sliver of additional software. And though this software is designed to operate in the same way on any version of Linux, Brewer says this isn’t always the case. “It’s not perfect yet. This is an area where both Google and the community have some work to do,” he says. “A container running on one OS may not run on another.”
But if the big operating system makers and the other big cloud services get behind the technology too, we can bootstrap a new world of cloud computing that behaves more like it should, where we can treat all cloud services as a single playground. The good news is that Google isn’t the only one on board. Cloud services from Amazon, Rackspace, and DigitalOcean have also backed the technology, at least in small ways.
You might think that this grand vision would end up hurting Google’s cloud business–a business it is deeply interested in expanding. In theory, Docker will make it easier for developers and companies to move their operations off the Google cloud. But the company also realizes that Docker will encourage more people to use its cloud. This will be the bigger effect–the much bigger effect. “It’s OK for them to make it so that payloads can be more easily moved from Google to somewhere else,” says Hykes, “because they’re betting that more payloads will flow in than out.”
For Brewer, containers are all about creating a world where developers can just build software, where they don’t have to think about the infrastructure needed to run that software. This, he says, is how cloud computing will continue to evolve. Developers will worry less about the thousands of machines needed to run their application and more about the design of the application itself. “The container is more of an application-level view of what you’re doing, versus a machine-level view,” he says, “and it’s pretty clear that the application view is more natural and will win in the long term.”
So many others are saying the same thing. But they’re not Eric Brewer.