Cloud infrastructure services: find a niche or die?

Back in May it was reported that Morgan Stanley had been appointed to explore options for the sale of hosted services provider Rackspace. Business Week reported the story on May 16th under the headline “Who Might Buy Rackspace? It’s a Big List”. 24/7 Wall St. reported analysis from Credit Suisse that narrowed this to three potential suitors: Dell, Cisco and HP.

To cut a long story short, Rackspace sees a tough future competing with the big three in the utility cloud market: Amazon, Google and Microsoft. Rackspace could be attractive to Dell, Cisco, HP and other traditional IT infrastructure vendors that see their core business being eroded by the cloud and need to build out their own offerings (as does IBM, which has already made significant acquisitions).

Quocirca sees another question that needs addressing. If Rackspace, one of the most successful cloud service providers, sees the future as uncertain in the face of competition from the big three, then what of the myriad of smaller cloud infrastructure providers? For them the options are twofold.

Be acquired or go niche

First, achieve enough market penetration to become an attractive acquisition target for the larger established vendors that want to bolster their cloud portfolios. As well as the IT infrastructure vendors, this includes communications providers and system integrators.

Many have already been acquisitive in the cloud market. For example, the US number-three carrier CenturyLink bought Savvis, AppFog and Tier 3, while NTT’s system integrator arm Dimension Data added to its existing cloud services with OpSource and BlueFire. Other cloud service providers have merged to beef up their presence, for example Claranet and Star.

The second option for smaller providers is to establish a niche where the big players will find it hard to compete. A number of cloud providers are already doing quite well at this, relying on a mix of geographic, application or industry specialisation. Here are some examples:

Exponential-E – highly integrated network and cloud services

Exponential-E’s background is as a UK-focussed virtual private network provider, using its own cross-London metro network and services from BT. In 2010 the vendor moved beyond networking to provide infrastructure-as-a-service. Its differentiator is to embed this into its own network services at Layer 2 (switching, etc.) rather than at higher levels. Its customers get the security and performance that would be expected from internal WAN-based deployments, which cannot be achieved by cloud services accessed over the public internet.

City Lifeline – in finance latency matters

City Lifeline’s data centre is shoe-horned into an old building near Moorgate in central London. Its value proposition is low latency: its proximity to the big City institutions allows it to charge a premium over out-of-town premises.

Eduserv – governments like to know who they are dealing with

For reasons of compliance, ease of procurement and security of tenure, government departments in any country like to have some control over their suppliers; this includes the procurement of cloud services. Eduserv is a not-for-profit long-term supplier of consultancy and managed services to the UK government and charity organisations. To help its customers deliver better services, Eduserv has developed cloud infrastructure offerings out of its own data centre in the central-south UK town of Swindon. As a UK G-Cloud partner it has achieved IL3 security accreditation, enabling it to host official government data. Eduserv provides value-added services to help customers migrate to the cloud, including cloud adoption assessments, service designs, and ongoing support and management.

Firehost – performance and security for payment processing

Considerable rigour needs to go into building applications that process highly sensitive data for sectors such as financial services and healthcare. This rigour must also extend to the underlying platform. Firehost has built an IaaS platform to target these markets. In the UK, its infrastructure is co-located with Equinix, ensuring access to multiple high-speed carrier connections. Within such facilities, Firehost applies its own cage-level physical security. Whilst the infrastructure is shared, it maintains the feel of a private cloud, with enhanced security through protected VMs with a built-in web application firewall, DDoS protection, IP reputation filtering and two-factor authentication for admin access.

Even for these providers, the big three do not disappear. In some cases their niche capability may simply see them bolted on to bigger deployments, for example a retailer off-loading its payment application to a more secure environment. In other cases, existing providers are starting to offer enhanced services around the big three to extend in-house capability; for example, UK hosting provider Attenda now offers services around Amazon Web Services (AWS).

For many IT service providers, the growing dominance of the big three cloud infrastructure providers, along with the strength of software-as-a-service providers such as salesforce.com, NetSuite and ServiceNow, will turn them into service brokers. This is how Dell positioned itself at its analyst conference last week; of course, that may well change if it were to buy Rackspace.

http://www.computerweekly.com/blogs/quocirca-insights/2014/06/cloud-infrastructure-services.html

Pivotal’s head of products: We’re moving to a multi-cloud world

The enterprise IT world is moving toward an open, multi-cloud model, but a clear definition of just what multi-cloud means is still a question mark.

Speaking at Gigaom Structure 2014 in San Francisco on Thursday, Pivotal president and head of products Scott Yara acknowledged that the big tech industry at-large is still trying to converge on what this definition might be.

Yara suggested that the future of multi-cloud starts with looking more closely at cloud-based computers around the world and unlocking the power presented by new, modern hardware.

Platform-as-a-Service, he continued, serves as the foundation for a common abstraction for cloud-based computers.

When Pivotal meets with customers big and small, Yara said a lot of them want to take advantage of cloud-based computing but still aren’t sure how to do it.

“A lot of these cloud-based data services are going to need to be built close to where the data is generated,” Yara predicted, positing how the multi-cloud approach simply provides businesses with more options.

The end result is to enable enterprises to build up to the data or build down to it, depending on what the use case and requirements might be.

It is here where Pivotal comes in, he suggested, framing it as a “unique company” in that it was spun out from VMware and its parent company EMC more than a year ago.

The San Francisco-based software firm launched with a big data suite combining the powers of the Greenplum analytic database system, Hadoop-based data platforms, and GemFire, a scale-out data transaction database with real-time processing.

“The interesting thing about Pivotal is just how new it is,” Yara remarked. “We’re still very early in the journey of building this company.”

Nevertheless, Yara reiterated Pivotal’s commitment to the open source community, touting the achievements by other industry players. He also described how we’re seeing everyone from Amazon to Google to Rackspace, among others, innovate in order to unlock “big pools of computing,” such as memory and storage.

Pointing toward an industry competitor but also a partner on General Electric’s Industrial Internet cloud, Yara praised Amazon especially for doing a great job in elevating the cloud-based computing market, noting that the “breadth of AWS capabilities is broad”.

At the same time, Yara stressed that the open source Platform-as-a-Service Cloud Foundry is getting us to the open cloud world “that we all want.”

http://www.zdnet.com/pivotals-head-of-products-were-moving-to-a-multi-cloud-world-7000030737/

Adobe Creative Cloud 2014: It’s all about apps and APIs

In a recent briefing to prepare us for the annual barrage of Adobe announcements, one of Adobe’s technology development people said of the company’s app development, "These early experiments…are one of the first stages of constructing the core products. Sooner or later, mouse and keyboard won’t be enough in Photoshop, Illustrator and InDesign. We’re trying to prepare ourselves for when that entire market gets disrupted."

To me, that’s the theme of this year’s Creative Cloud extravaganza. Yes, Adobe’s offering up the usual updates to the desktop applications, major and minor. But we’re also seeing a company whose success has been built on monolithic, computationally intensive applications struggling to address the changing needs of its core base of imaging professionals — whose need for lightweight mobile apps is growing rapidly.

The launch of CC in 2011 was the first step down that road, building an infrastructure to connect all the as-yet-unforeseen tools. This year’s announcements display a strategic commitment to underlying technologies, but in my opinion, its actual mobile apps still convey a sense of confusion rather than clear purpose.

The company’s delivering its first hardware products plus two new drawing apps (Ink and Slide), a brand-new Photoshop app variation (Photoshop Mix), an iPhone version of Lightroom, and a brand-new Creative software development kit (SDK) designed to help third parties hook into remote processing for advanced imaging capabilities. In general, the apps and hardware fall rather flat for me, and mostly seem to serve the purpose of attracting developers to its little corner of the world.

I think part of the lack of cohesion I sense is a result of the development incentives of the subscription system. During the briefing for Ink and Slide, the result of the company’s Projects Mighty and Napoleon, the assembled group asked questions about possible new features or extensions that we might see for these products. And all of the answers began with, "Well, if it’s successful…" So I asked: what’s your metric for gauging success? The blunt reply: if it drives subscriptions. What I see is Adobe throwing apps at the wall to see if any stick.

And they’re all iOS apps. On one hand, it makes sense for the company to devote its resources to a platform that most of its existing users are on. But as a multiplatform user — I have an iPad and an Android phone — I feel the loss. There’s now a Creative Cloud app for iOS that lets you manage your profile and assets similar to the way you can with the desktop application.

http://www.cnet.com/news/adobe-creative-cloud-2014-its-all-about-apps-and-apis/

Telecom NZ buys cloud desktop specialist Appserv

Telecom New Zealand has inked a conditional agreement to buy cloud services specialist Appserv.

Telecom chief executive Simon Moutter said that the acquisition provides a major new piece in the "cloud jigsaw" after significant investment across the business in cloud services.

“We see this as a strategic business move giving the Telecom group a very strong New Zealand cloud services portfolio. The acquisition of Appserv is an excellent complement to the purchase in 2013 of Revera and the ongoing investment we are making in data centres,” Moutter said.

Telecom Retail CEO Chris Quin said thousands of businesses already supported by Telecom and its ICT services arm, Gen-i, want to take advantage of the cloud but need a hand to get there.

Auckland-based Appserv has capability in providing cloud and desktop ‘as-a-service’ offerings, with customers across New Zealand, Telecom said.

“Revera is a cloud computing specialist that provides cloud solutions predominantly for government and corporates, while Appserv brings specialist cloud desktop-as-a-service solutions for a range of business sizes, particularly small to medium sized," Moutter said.

Telecom bought Revera for NZ$96.5 million last year.

Moutter said Appserv adds to Telecom’s existing strengths in cloud infrastructure, mobility, managed ICT and platform as a service.

"Gen-i is making big strides in all these areas. We have secured a number of significant new IT as-a-service contracts. We have added new or imminent data centre business totalling in excess of 250 racks since the beginning of 2014. Revera is continuing to grow strongly and perform well," he said.

Revera opened a new datacentre in Wellington last year that Moutter said is now 100% allocated; a third centre is due to open in October. In Christchurch, a new greenfields centre opened in August and is over 30% full, while in Auckland a new high-resiliency datacentre in Takanini is due to open in October, with 38% of capacity already allocated.

Telecom is about to embark on a brand change to "Spark". Auditions are being held for a major TV ad campaign and agencies across the country are preparing to bid for work on the supporting campaign.

http://www.zdnet.com/telecom-nz-buys-cloud-desktop-specialist-appserv-7000030648/

Enterprise spend on public cloud IaaS to reach $650m by 2018

Twice as many enterprises adopted infrastructure-as-a-service (IaaS) in 2013 as in the previous year, according to a new study by Telsyte.

The Telsyte Australian Infrastructure and Cloud Computing Market Study 2014 showed that more than half of all organisations with more than 20 employees are now using public cloud IaaS for at least some part of their IT infrastructure, indicating that cloud computing within Australian enterprises is "skyrocketing".

The study forecasts the total market value for public cloud infrastructure services to reach AU$650 million by 2018, up from AU$305 million in 2014.
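Those two figures imply a compound annual growth rate of roughly 21 percent. As a quick check of the arithmetic (the dollar figures are Telsyte's; the calculation below is ours), a short Python sketch:

    # Implied compound annual growth rate (CAGR) behind Telsyte's forecast:
    # AU$305m in 2014 growing to AU$650m by 2018, i.e. over four years.
    start, end, years = 305.0, 650.0, 4
    cagr = (end / start) ** (1 / years) - 1
    print(f"{cagr:.1%}")  # prints 20.8%, i.e. roughly 21 percent per year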

Telsyte senior analyst Rodney Gedda said that, with cloud services presenting a low barrier to entry for IT infrastructure, penetration among organisations is growing strongly.

"There are a few key things driving this. One is the plain volume of organisations that will subscribe to cloud services that aren’t already. The second thing is cloud penetration within the organisation will increase, so organisations that are already using cloud services will spend more," he said.

"The third thing is the value of enterprise services in the cloud will expand. For example, if a company is running database workloads and billing systems on-premise, to move them into the cloud might be more of an exercise than just spinning up a VPS (virtual private server) with a credit card, so it’s higher value sell."

The study also sets out to dispel the ongoing perception that Australian organisations are being prevented by corporate and legal policies from using offshore cloud services. The Telsyte research indicates two-thirds of businesses that use the cloud are already using an offshore provider. Furthermore, 46 percent of CIOs said they are not subject to any restrictions on the use of offshore cloud services.

"The multinationals have made inroads into the local cloud market and local providers will need to compete on features and service levels, and not simply the fact that data is hosted in Australia," said Gedda.

The research identified a growing trend in the adoption of a hybrid cloud model, which is expected to be in use by some 30 percent of enterprises by 2018. But on-premise IT is still a favoured choice. In fact, the study showed more organisations are implementing and considering private clouds, using virtualisation technology to implement a cloud architecture under their control.

"We can’t forget most of IT still runs on-premise, so enterprise IT is already an existing investment in service, storage, and infrastructure," Gedda said.

"For an organisation that has quite important workloads, they can’t just toss everything out and start again in the cloud, so the hybrid approach is a midpoint between organisations who want to keep key applications within the company’s boundaries, and between looking at moving some workload that may deem to be not as business critical, but they need to take advantage of the cloud benefits which are flexibility and agility."

Gedda concluded that there is now a "healthy range" of private cloud management options available to organisations that are looking to replicate the "scalability and manageability of public clouds with their own servers".

http://www.zdnet.com/enterprise-spend-on-public-cloud-iaas-to-reach-650m-by-2018-7000030557/

Microsoft Announces Azure ML, Cloud-based Machine Learning Platform That Can Predict Future Events


Microsoft has been on quite a cloud roll lately, and today it announced a new cloud-based machine learning platform called Azure ML, which enables companies to use the power of the cloud to build applications and APIs based on big data and predict future events instead of looking backwards at what happened.

The product is built on the machine learning capabilities already available in several Microsoft products, including Xbox and Bing. Using predefined templates and workflows, it is designed to help companies launch predictive applications much more quickly than traditional development methods allow, and it even lets customers publish APIs and web services on top of the Azure ML platform.

Joseph Sirosh, the corporate vice president at Microsoft in charge of Azure ML, who spent years at Amazon before joining Microsoft to lead this effort, said the platform enables customers and partners to build big data applications that predict, forecast and change future outcomes.

He says this ability to look forward instead of back is what really stands out in this product.

“Traditional data analysis lets you predict the future. Machine learning lets you change the future,” Sirosh explained. He says that by detecting patterns you can forecast demand, predict disease outbreaks, anticipate when elevators need maintenance before they break, and even predict and prevent crime, to give just a few examples.

Sirosh says the cloud really changes the dynamic here because it provides the ability to scale, and the service takes care of much of the heavy lifting that would have taken weeks or months for companies trying to do it themselves in-house in a data center.

“The cloud solves the last mile problem,” Sirosh explained. Before a service like this, you needed data scientists to identify the data set, and then have IT build an application to support that. This last part often took weeks or months to code and engineer at scale. He says Azure ML takes that process and provides a way to build the same application in hours.

What’s more, it supports more than 300 packages from R, the popular open source project used by many data scientists.

Sirosh says the hope is that as more people use the platform and generate APIs and applications, it will create what he called “a virtuous cycle between data and APIs”. “People have data. They bring it to [Azure ML] to create APIs. People hook into applications, then feed data back to the cloud and fuel more APIs,” he explained.
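To make the idea of publishing APIs concrete, a predictive model published as a web service would be consumed as an authenticated REST endpoint. The Python sketch below shows roughly what a calling application could look like; the endpoint URL, API key, input schema and demand-forecasting model are hypothetical placeholders, not details from Microsoft's announcement:

    import json
    import urllib.request

    # Hypothetical values: a real endpoint and key would be issued when a
    # trained model is published as a web service on the platform.
    ENDPOINT = "https://example-workspace.azureml.example/score"
    API_KEY = "YOUR-API-KEY"

    def score(features):
        """POST one row of input features to the scoring endpoint and return the prediction."""
        body = json.dumps({"Inputs": {"input1": [features]}}).encode("utf-8")
        request = urllib.request.Request(
            ENDPOINT,
            data=body,
            headers={
                "Content-Type": "application/json",
                "Authorization": "Bearer " + API_KEY,
            },
        )
        with urllib.request.urlopen(request) as response:
            return json.loads(response.read().decode("utf-8"))

    # Example: ask the (hypothetical) demand-forecasting model for a prediction.
    print(score({"store_id": 42, "week": 25, "promotion": True}))

In that division of labour the data scientist's job ends at publishing the endpoint, and the application developer needs only a few lines of HTTP, which is where the claimed weeks-to-hours speed-up would come from.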

The product is currently in confidential preview, but Microsoft did mention a couple of examples, including Max 451, a Microsoft partner working with large retailers to help predict which products customers are most likely to purchase, allowing them to stock their stores ahead of demand.

Carnegie Mellon University is working with Azure ML to help reduce energy costs in campus buildings by predicting and mitigating activities to reduce overall energy usage and cost.

Microsoft is not alone in this space, however. IBM launched Watson as a cloud service last winter for similar types of machine learning application building and just last week a startup called Ersatz Labs also launched a deep learning artificial intelligence cloud platform.

Azure ML goes into public preview next month. There is no word yet on the official launch date.

http://techcrunch.com/2014/06/16/microsoft-announces-azure-ml-cloud-based-machine-learning-platform-that-can-predict-future-events/

Even Early Adopters See Major Flaws in the Cloud

The C.I.A. isn’t afraid of the cloud.

Amazon, a relative baby in the field of technology services, triumphed over stalwart IBM to gain a $600 million C.I.A. contract, but the most remarkable part of the deal was that the agency was a cloud convert in the first place. The fact that a tech company could warehouse data involving the government’s spies is the clearest signal yet that cloud computing is having its moment.

Somewhat like outsourcing a decade ago, cloud computing is the coming technology destined to sweep away all before it. Amazon Web Services is the fastest-growing part of fast-growing Amazon, and analysts expect it someday to be the dominant part of the company. Google, IBM, Verizon, Microsoft and a host of smaller players are competing for a part of the action. Global spending on public cloud services alone is forecast to hit $210 billion in 2016, up 172 percent from 2010.

And yet outsourcing provides a cautionary tale of how enthusiasm can be derailed by reality. Outsourcing advocates said every customer service call, every information technology fix, even the creation of routine legal documents was destined to be done in India or the Philippines. They said this would be cheaper and more efficient. American companies would be hollowed out, with only executives and their aides left on the payroll.

It didn’t happen quite that way. While many companies outsourced routine tasks, some moved them back after complaints of poor service. Others never outsourced. Outsourcing was ultimately a segment of the market rather than becoming the market.


Cloud computing is already confronting similar issues.

Members of the Open Data Center Alliance, a consortium of global information technology companies like Infosys, Disney, Deutsche Telekom and SAP, are cloud enthusiasts. But in a recent alliance survey, two-thirds of the members said concerns about data security were delaying their move to the cloud. That was down from the 80 percent of respondents who expressed a concern about security the previous year.

Other results, however, are headed away from cloud computing. Fifty-six percent of members now say regulatory issues will limit their adoption, up from 47 percent. And 47 percent worry about being tied to one vendor, up from 39 percent.

Cloud computing “is kind of in the first wave,” said Michael Masterson, director of cloud solutions for Compuware, which helps clients improve the performance of their applications. “I don’t know if I’d call it immature, but it’s definitely Version 1.”

“Immature” is exactly what Roman Stanek would call it. Mr. Stanek founded GoodData in 2007 with the mission of disrupting business intelligence. Amazon had just started Amazon Web Services, and GoodData became a client. It was not an entirely happy experience, Mr. Stanek said.

“Imagine if your electric company didn’t know whether it would be up or down — if they told you, ‘No guarantees, but we believe it will be mostly up,’ ” he said. “Maybe that works for some clients.”

Amazon Web Services had well-publicized failures in October 2012 and two months later. But reliability is not a problem specific to Amazon. Mr. Masterson pointed out that Hewlett-Packard, which publishes its service agreements on the web, says it has 99.95 percent availability, which works out to about four hours of trouble a year. The service agreement also says a failure only counts if it lasts for more than six minutes.

“Imagine a company selling a premium new car whose warranty includes 2M piston revolutions, 10k door latch cycles, and 20k window open and closes,” Mr. Masterson wrote in a recent blog post. “And even then, with 99.5 percent availability, you might still be unable to start the car two days a year, or during winter there might be two weeks where the doors won’t unlock until the sun melts the ice in the door locks. Ready to buy?”
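The arithmetic behind both availability figures is easy to verify. A minimal sketch (the SLA percentages are from the article; the conversion to hours is ours):

    # Convert an availability SLA percentage into allowed downtime per year.
    HOURS_PER_YEAR = 365 * 24  # 8,760 hours

    def downtime_hours(availability_percent):
        """Hours per year a service can be down while still meeting the SLA."""
        return HOURS_PER_YEAR * (1 - availability_percent / 100)

    print(downtime_hours(99.95))  # ~4.4 hours: the "about four hours of trouble a year"
    print(downtime_hours(99.5))   # ~43.8 hours: roughly the "two days a year" in the car analogy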

GoodData grew with Amazon. It has raised more than $75 million and has nearly 300 employees. But in the first quarter, the company left Amazon. It moved to a private cloud hosted by Rackspace, which is based near San Antonio. With Rackspace, it had more control.

The only thing worse than a company offering unreliable service appears to be a company whose very existence is in doubt. Nirvanix was a cloud company based in San Diego with impressive backers, including Intel Capital and Khosla Ventures, and impressive hype. It was going to take on the big boys, Amazon Web Services and Google.

Last Sept. 16, Nirvanix warned customers they had two weeks to retrieve their data. On Oct. 1, it filed for bankruptcy. Apparently all customers got their data out, but it was a near miss. Then last month Rackspace, whose stock has dropped by half since the beginning of 2013, said it was hiring Morgan Stanley to advise it on a possible sale or merger. While profitable, Rackspace faces increasing competition.

Companies that fail or are sold would matter less if data were more portable.

“It’s still difficult to tap into Rackspace and change your mind, or tap into AWS and change to something else,” said James Staten, an analyst with Forrester Research. “We’re a long way from sufficient standards where that’s a possibility.”

Mr. Staten said the last companies to go to the cloud would be those that had no experience with it — hospitals, medical device makers, and architecture and construction companies. “The reason they should go last is they don’t yet know what they don’t know,” he said. “They’ll start with applications that do not involve compliant data or customer data.”

In general, he said, “it’s hard to make a case for organizations that should not go to cloud at all.” It is not like outsourcing, which faltered over communications issues and rising prices in the host countries, he said.

GoodData expects to ultimately go full circle as competition increases. “This used to be very much Amazon’s monopoly, and a one-horse race was not good,” Mr. Stanek said. “That’s why I’m happy to see Google get in. In a couple of years, you will be able to go to Google or Amazon and say, ‘Give me the features I need, the service agreements and a good price, and I will have no reason to build my own cloud.’ ”

http://bits.blogs.nytimes.com/2014/06/11/even-early-adopters-see-major-flaws-in-the-cloud/

Savvy businesses looking at ways cloud can make money

The jury is still out on whether cloud can save you money, says Gartner research director, Michael Warrilow.

While we wait for the jury to come back in on the cost-saving question, Warrilow says savvy businesses are looking to make money from cloud adoption. Businesses in the gaming (gambling), digital media, and media sectors are taking advantage of cloud’s scaling potential to meet peaks in demand for services, he adds.

Australian and New Zealand (ANZ) businesses have started to ramp up adoption of cloud in the last 12 months, in contrast to the situation a year ago when there was still a lot of resistance to cloud. "It’s like a switch has been turned and a light has gone off," says Warrilow.

Warrilow was talking to ZDNet Cloud TV about what’s happening with cloud in the APAC region.

The entry of some of the larger global cloud players into the local market has played a part in the growing acceptance of cloud in ANZ, Warrilow explains.

He says some of the traditional heavy users of IT have not necessarily been the ones that have moved first with cloud. In the government sector, for instance, he has seen different approaches ranging from "complete rejection to wholesale adoption." By contrast, organisations in the services and retail sectors are moving ahead with cloud.

Cloud has a higher priority in China than in any other economy, according to Gartner’s 2014 global CIO research study. Other North Asian countries like Taiwan and Hong Kong are also moving ahead with cloud while Japan is moving less quickly, although local cloud services are springing up. South East Asian countries are also moving to cloud, but at a slower clip.

Warrilow cautions businesses against "falling into cloud", where small projects expand into larger initiatives and where the proper due diligence isn’t done.

Some of the more prevalent use cases for cloud right now are software development and testing, and disaster recovery. Software-as-a-service (SaaS) also provides opportunities, with Warrilow emphasising there are "a lot of discrete applications… better off handled by a 3rd party."

http://www.zdnet.com/zdnet-cloud-tv-savvy-businesses-looking-at-ways-cloud-can-make-money-highlights-video-7000030348/

Cloud: Not New, Just A Big Disruption To How We Communicate

This article is the final part of a four-part follow-up to the earlier published 4 Trends Transforming the Way We Communicate.

It seems like, over the past couple of years, the term cloud has found itself at the center of attention after a long spell of being of interest only to meteorologists and folks with nothing important to talk about.

Of course, this is because over the past few years “Cloud” as a means of computing has piqued the interest of so many, from tech-savvy consumers to enterprise CIOs.

Perhaps the most interesting thing about Cloud Computing is that it isn’t new. In fact, the earliest attribution of cloud computing goes back to the 1960s and a little-known man named Joseph Carl Robnett Licklider, who is believed to be its inventor. However, it really wasn’t until the mid-2000s, when Amazon launched its storage cloud, that cloud computing started gaining wide acceptance among technologists, and not really until the past two or three years, with the growth of Apple’s iCloud, that adoption grew rapidly among consumers.

While the origins, adoption and acceptance can be debated, one thing that cannot is the amount of impact cloud is starting to have on our lives and the way we communicate.

It Seems Like Everyone Uses Cloud

Does anyone remember when email seemed like the second mailbox? It was by no means an everyday part of our lives and the process of exchanging email was still pretty cumbersome. With storage at a premium and transfer rates slower than molasses, email was limited mainly to simple text communication.

How about now, though? Is email still your second box? As a professional or even in your personal life, do you spend a lot of time in your inbox? Given that over 200 million emails are sent per minute, it would seem pretty likely that email has become our first means of written correspondence. What is more interesting is that the cloud is what made this possible.

For businesses, the use of an email client and server was an early deployment of private cloud; for the average consumer, thinking back to your days of Prodigy, AOL or Hotmail accounts, all of those were driven by the cloud. Those were just early iterations of private/public clouds, but nonetheless they were a very good indication of things to come.

Cloud Spreads Its Wings For Productivity, Applications and Communication

Today, cloud is a widespread means for all types of business and personal productivity tools.

Many of the leading business tools such as accounting packages, CRM systems, VoIP and ERP platforms all ride on the back of cloud computing. These tools, which drive the entire business ecosystem, are often hosted thousands of miles away from where they are being used.

This trend of hosting our most important applications extends straight into our personal computing habits. Our social networks like Facebook and Twitter are both accessed in the cloud. Many of our important documents are sitting in Dropbox folders or Google Drive accounts, both built on the cloud. For many of us, our favorite games or lifestyle management tools for health, diet and entertainment are all cloud driven. Heck, even our music is now powered by cloud applications such as Spotify, Pandora or iTunes Radio.

Communication Disrupted

As more and more technology can continuously be crammed into smaller and smaller products, the cloud will continue to play a bigger and bigger part in all that we do.

Storage will find itself more and more removed from the device as thin clients and ubiquitous Internet access give us endless accessibility to our information from anywhere we are.

Applications will become more robust and more evolutionary as changes can be made with minimal disruption while always offering users availability and improved experience.

Communication will continue to evolve at breakneck speed, with large files, videos and documents available for on-demand delivery and consumption. Today we can capture, transform and stream video to a mobile device anywhere in the world in seconds, basically making any business or individual with a camera a media outlet, all because of the cloud.

A buzzword perhaps, and most certainly an idea that gets no shortage of attention, the cloud and what it offers to businesses and consumers is going to continue to shift the ever-changing landscape of how we communicate. It has certainly disrupted my life. The question is: how has it disrupted yours?

http://www.forbes.com/sites/danielnewman/2014/06/10/cloud-not-new-just-a-big-disruption-to-how-we-communicate/

Google Embraces Docker, the Next Big Thing in Cloud Computing

Google is putting its considerable weight behind an open source technology that’s already one of the hottest new ideas in the world of cloud computing.

This technology is called Docker. You can think of it as a shipping container for things on the internet–a tool that lets online software makers neatly package their creations so they can rapidly move them from machine to machine to machine. On the modern internet–where software runs across hundreds or even thousands of machines–this is no small thing. Google sees Docker as something that can change the way we think about building software, making it easier for anyone to instantly tap massive amounts of computing power. In other words, Google sees Docker as something that can help everyone else do what it has been doing for years.

“Google and Docker are a very natural fit,” says Eric Brewer, a kind of über-engineer inside Google. “We both have the same vision of how applications should be built.”

On Tuesday, with a keynote speech at a conference in San Francisco, Brewer is set to unveil new ways that Google will combine Docker with its cloud computing services, Google App Engine and Google Compute Engine. For the company, this is a way of fueling interest in these services as it strives to challenge Amazon’s dominance in the burgeoning cloud market. But considering Google’s widely recognized knack for building its own massive internet applications, from Google Search to Gmail, Brewer’s speech will also provide an enormous boost for Docker.

The news will carry a particular weight because it’s coming from Brewer. You can think of him as the patron saint of modern internet architecture. From Google and Amazon to Facebook and Twitter, today’s tech giants run their web services across thousands of dirt-cheap computer servers, using sweeping software tools to transform so many tiny machines into one massive whole. It’s a bit like building computers the size of warehouses. It’s the only viable way of dealing with the ever increasing demands of modern web services. And it all began with Eric Brewer.

In the mid-1990s, as a professor of computer science at the University of California, Berkeley, Brewer built Inktomi, the first web search engine to run on a vast network of cheap machines, as opposed to one enormously powerful–and enormously expensive–computer server. And as the Googles and the Amazons and the Facebooks took this idea to new extremes over the next two decades, they leaned on Brewer’s most famous bit of computing philosophy: the CAP theorem, a kind of guide to how these massive systems must be built. “He is the grandfather of all the technologies that run inside Google,” says Craig McLuckie, a longtime product manager for Google’s cloud services.

Now, none too surprisingly, Brewer is also a key cog in the Google machine, part of the team of elite engineers that oversee the design of the company’s entire online empire. What this means is that, after reshaping the net the first time around, the slick-bald computing guru is bringing the next wave of new ideas to the realm of online architecture.

It’s not just that he’s helping to refine Google’s global network of data centers, the most advanced operation on the net. Like Amazon and Microsoft and so many others, Google is now offering cloud computing services that let anyone else build and run software atop its vast infrastructure, and Brewer is among those working to impart Google’s particular expertise to all the companies that can benefit from these cloud offerings. Today’s cloud computing services can simplify life for developers–letting them build online software without setting up their own hardware in their own data centers–but in backing Docker, Brewer hopes to make things even easier.

Brewer says that Docker mirrors the sort of thing that Google has done for years inside its own data centers, providing a better way of treating hundreds of machines like a single computer, and he believes it represents the future of software development on the net.

The Super Container

Built by a tiny startup in San Francisco, Docker is open source software that’s freely available to the world at large. At first blush, it may seem like a small thing, but among Silicon Valley engineers, it’s all the rage. “If you believe that what makes life easier for developers is where things are moving, then this containerization thing is where things are moving,” eBay developer Ted Dziuba told us this past fall. According to Docker, over 14,000 applications are now using its containers, and Brewer says a developer technology hasn’t taken off so quickly and so enormously since the rise of the Ruby on Rails programming framework eight or nine years ago.

That said, the importance of Docker can be hard for even seasoned developers to grasp. For one thing, it’s based on technologies that have been around for years. The open source Linux operating system–the bedrock of today’s online services–has long offered “containers” that isolate various tasks on a computer server, preventing them from interfering with one another. Google runs its vast empire atop containers like these, having spent years honing the way they work. But Docker has made it easier to move such containers from one machine to another. “They’ve done a very nice job of making it easy to package up your software and deploy it in a regularized way,” Brewer says. “They’re making the container a more effective container.”

This can help developers in multiple ways. It means that if they build a software application on a laptop, they can immediately move it onto a cloud service and run it–without making changes. But the hope is that it will also let them more easily move applications wherever they want to run them, whether that’s their own data centers or Google’s cloud services or Amazon’s or a combination of all three. “It can make machines fungible,” says Solomon Hykes, the chief technology officer at Docker and the driving force behind the company’s open source project. This has always been the promise of cloud computing–that we could treat the internet like one giant computer–but we’re nowhere near that reality. Due to the vagaries of different operating systems and different cloud services, it can be quite hard to move software from place to place.
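To make the packaging idea concrete, here is a minimal, hypothetical Dockerfile that containerises a small Python application; the file names and base image are illustrative assumptions rather than an example taken from Docker or Google:

    # Minimal, hypothetical Dockerfile: package an app and its dependencies
    # into a single portable image. Names and versions are illustrative.

    # Start from a stock Python base image.
    FROM python:3

    # Do all further work inside /app in the image.
    WORKDIR /app

    # Install dependencies first, so this layer is cached between builds.
    COPY requirements.txt .
    RUN pip install -r requirements.txt

    # Copy in the application source.
    COPY . .

    # The command the container runs when it starts.
    CMD ["python", "app.py"]

Building the image once and then running it with the docker command-line tool yields the same environment on a laptop, in a private data centre, or on a cloud VM, which is exactly the portability described above.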

The Bigger Effect

Granted, Docker can’t change this overnight. First off, in order to run Docker containers, each machine must be equipped with a small sliver of additional software. And though this software is designed to operate in the same way on any version of Linux, Brewer says this isn’t always the case. “It’s not perfect yet. This is an area where both Google and the community have some work to do,” he says. “A container running on one OS may not run on another.”

But if the big operating system makers and the other big cloud services get behind the technology too, we can bootstrap a new world of cloud computing that behaves more like it should, where we can treat all cloud services as a single playground. The good news is that Google isn’t the only one getting behind the technology. Cloud services from Amazon, Rackspace, and DigitalOcean have also backed the technology, at least in small ways.

You might think that this grand vision would end up hurting Google’s cloud business–a business it is deeply interested in expanding. In theory, Docker will make it easier for developers and companies to move their operations off the Google cloud. But the company also realizes that Docker will encourage more people to use its cloud. This will be the bigger effect–the much bigger effect. “It’s OK for them to make it so that payloads can be more easily moved from Google to somewhere else,” says Hykes, “because they’re betting that more payloads will flow in than out.”

For Brewer, containers are all about creating a world where developers can just build software, where they don’t have to think about the infrastructure needed to run that software. This, he says, is how cloud computing will continue to evolve. Developers will worry less about the thousands of machines needed to run their application and more about the design of the application itself. “The container is more of an application-level view of what you’re doing, versus a machine-level view,” he says, “and it’s pretty clear that the application view is more natural and will win in the long term.”

So many others are saying the same thing. But they’re not Eric Brewer.

http://www.wired.com/2014/06/eric-brewer-google-docker/