Does Cloud Really Hamper Server Sales?

Common sense suggests that mainstream cloud adoption will gut server sales, but that’s not really happening, analysts say. Here’s why.

8 Data Centers For Cloud’s Toughest Jobs

It’s often said that tablets are eviscerating traditional PC sales. That’s an overstatement, but the same logic could be applied to traditional servers: As companies move the workloads once assigned to on-premises infrastructure to the cloud, they’ll buy fewer servers. Right?

Again, there’s some partial truth there. There are businesses, especially new startups and other small companies, that do just about everything online. There are plenty of others that have opted to move at least part of their infrastructure to the cloud, rather than maintaining their own data centers. And there’s a bevy of stats to support the notion that cloud adoption will push down corporate capital expenditures on new servers.

Most research firms reported modest gains in server shipments in 2013, but revenue actually declined. Gartner reported a 2.1% bump in shipments and a 4.5% drop in revenue for the year; IDC reported a 4.4% revenue decline. Yet it wasn’t all bad news: by IDC’s figures, shipments grew 3.2% to a record 9 million units. The firm also said it expected evidence of a refresh cycle to appear in 2014.

So where’s the server market really headed next? Don’t expect a bunch of "cloud killed the server" sound bites, said Techaisle analyst Anurag Agrawal.

[SAP business apps are coming to Azure. Read Microsoft Brings SAP Apps To Azure Cloud.]

The "fall in revenue is primarily due to tepid demand for behemoth expensive Unix/RISC-based servers, as those were usually underutilized," Agrawal said in an email interview. "With the rapid emergence of cloud and virtualization, those workloads have started to move towards cloud. On the other hand, demand for smaller, right-sized servers continues to rise."

There’s a reason that Lenovo bought IBM’s server business for $2.3 billion, he said, just as there’s a reason Dell keeps rolling out new server models and Intel’s datacenter business grew in the first quarter of this year. Servers aren’t going away; they’re just going to change. He also said original design manufacturers in Taiwan are shaking up the server establishment by offering custom-built servers to spec.

So what’s in store for servers? Agrawal sees five themes emerging:

1. Hybrid cloud: the dominant model
Agrawal expects hybrid cloud — a mix of public cloud, private cloud, and traditional on-premises infrastructure — to be the favored IT strategy in the long term. Nearly one-third of midsized businesses (100-1,000 employees) already use a hybrid approach, according to Techaisle data. Cloud spending will take up a greater slice of the pie in those businesses, but that doesn’t mean on-premises servers will evaporate. Recent Techaisle research shows 83% of midsized businesses that use or plan to use cloud platforms also plan to buy servers, for example.

"While storage and data backup workloads may migrate to cloud, server workloads may still remain on-premise as most mid-market businesses and enterprises register a high rate of concern regarding the difficulty of integrating operational systems across hybrid traditional/cloud-based systems, security of applications and corporate data, and about control over data, users and applications," Agrawal said.

2. First server purchases: plenty left
The first-server market isn’t saturated — 1.5 million small and midsized companies in the US alone have yet to purchase their first server, and that number is much higher globally, according to Techaisle data.

"It is a fallacy to assume that all first-server businesses will migrate to a cloud server," Agrawal said. "With security, server configuration, and managed services the cost of using a cloud server could easily exceed $2,000 per month, which over a long period exceeds the cost of a new server substantially."
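Agrawal’s arithmetic can be made concrete with a rough break-even calculation. The $2,000-per-month cloud figure is his; the on-premises purchase price and running costs below are illustrative assumptions, not numbers from the article:

```python
# Rough break-even sketch for Agrawal's cloud-vs.-purchased-server comparison.
# CLOUD_MONTHLY comes from the article; the other two figures are assumed.

CLOUD_MONTHLY = 2_000     # managed cloud server, per Agrawal's estimate
SERVER_PURCHASE = 6_000   # assumed one-time hardware cost
ONPREM_MONTHLY = 500      # assumed power, admin, and maintenance

def cumulative_cost(months: int) -> tuple[int, int]:
    """Total spend after `months` for the cloud and on-premises options."""
    cloud = CLOUD_MONTHLY * months
    onprem = SERVER_PURCHASE + ONPREM_MONTHLY * months
    return cloud, onprem

# First month in which cumulative cloud spending exceeds the purchased server.
breakeven = next(m for m in range(1, 121)
                 if cumulative_cost(m)[0] > cumulative_cost(m)[1])
print(breakeven)  # 5
```

Under these assumptions the cloud option overtakes the purchased server within the first year, which is the long-period effect Agrawal describes; real comparisons hinge on the actual hardware, staffing, and managed-service costs involved.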

3. Collaboration and mobility: "interesting dichotomies"
Agrawal elaborated on those dichotomies: "On the one hand, there can be a completely SaaS application-based approach; but on the other, a more robust deployment has been on-premise or hybrid deployments. And mobility is essentially seamless and secure delivery of applications to multiple screens. This usually requires the deployment" of virtual desktop infrastructure or desktop-as-a-service. "And businesses from small to large that use VDI usually have to upgrade servers, storage, and network bandwidth. This opportunity may be in the form of tower servers, rack servers, and blade servers."

4. Internet of things (IoT): server innovation catalyst?
Wait, what? Isn’t the IoT all-cloud, all the time?

Yes, by definition, the IoT is kind of an online thing. But Agrawal expects the business of IoT to actually drive demand for new kinds of servers. "Granted, cloud is an important component for IoT, but with exceptional levels of security requirements and large amounts of proprietary data being collected, collated, and analyzed, it is difficult to imagine all implementation to be on cloud-based servers."

5. Big data: big server and storage needs
Another trendy topic, big data, is poised to force infrastructure upgrades as more companies give green lights to data-related projects.

"Big data initiatives in large enterprises put pressure on infrastructure and forces server [and] storage upgrades," Agrawal said. "As proof-of-concepts get completed and move on to become full projects, businesses will spend more on compute and storage platforms as big data project deployments will require better and updated storage, servers, and other analytical solutions."

The bottom line: Cloud computing is changing the traditional server, but it’s not eliminating it. "Businesses will continue to purchase servers, smaller-sized, energy-efficient performance servers with or without integrated storage and networking capabilities," Agrawal said. "Adoption of cloud servers will continue to increase but only for some workloads. The decision will come down to cost, security, comfort, [and] business objective."

Private clouds are moving rapidly from concept to production. But some fears about expertise and integration still linger. Also in the Private Clouds Step Up issue of InformationWeek: The public cloud and the steam engine have more in common than you might think.

Kevin Casey is a writer based in North Carolina who writes about technology for small and mid-size businesses.

How cloud computing is like getting a rental car

Though it’s commonly associated with free storage providers, like Dropbox, or online word processors, like Google Docs, cloud computing can involve more advanced areas of technology, too.

That’s part of the reason that talking about "the cloud" can get confusing for a lot of people. Especially when you’re trying to figure out a sensible cloud strategy for your small business.

Operating an information technology system in the cloud is like renting a car. When you rent a car, you expect to be able to jump in and drive off, safe in the knowledge that the car will work. You also expect the rental company to take care of all necessary maintenance, repairs and breakdown assistance.

The same is true of the "cloud." When you sign up, you get to use the software without worrying about installation, maintenance, updates or security. You also don’t need a server or any of the other additional IT investments that larger suites of software used to require. All that is taken care of by your cloud service provider.

Renting a car doesn’t require the significant upfront investment that buying a car does. Also, by renting, you can always stay in a current model, versus buying a car that depreciates in value as soon as you drive it off the lot. Cloud services also don’t require an upfront cost. Your business pays a monthly flat-rate per user fee, and if your business grows and you hire new staff, you can switch on new licenses, and similarly turn them off as needed.

There are many benefits to small businesses that wish to leverage cloud computing capabilities. For many small businesses, it’s helpful to start with something simple — like email.

Hosted email through Microsoft or Google, which are the two biggest players in the market right now, is a great (and safe) place to start your cloud strategy. By hosting your email, calendars, contacts and chat through one of these providers, you don’t have to purchase servers, license software or upgrade your infrastructure — shifting email costs entirely to your operating budget.

There are questions that you need to ask before you consider moving a critical line of business software applications to the cloud, and it is important to work with a trusted IT adviser who can help answer these tough questions. This adviser can work with your software vendor to understand the various dynamics of what moving into the cloud actually entails and lay those practical considerations out to you in a way that’s easy to understand.

James Fields is owner and president of IT service provider Concept Technology and IT staffing company Scout Staffing.

Something to consider: why you wouldn’t want to move an internal software application into the cloud

Security. When moving business applications to the cloud, you’re at the mercy of your cloud provider when answering the question: Is your data secure? You can’t control the provider’s diligence, and if the provider is not doing its job to secure the application, your data can be directly compromised.

Vendor support. Does your business software vendor support having your system in the cloud? Some vendors still expect it to be hosted on an internal server.

Bandwidth. The more cloud space you "rent," the more bandwidth you require. Before moving an entire application offsite, make sure you have enough available bandwidth to support the move.
Microsoft, Salesforce Unveil Cloud-Computing Partnership

Microsoft Corp. (MSFT) and Inc. (CRM) agreed to make some of their business-software products work better together, signaling a thaw between two longtime rivals.

Salesforce’s customer-management programs will become available for Microsoft’s Windows and Windows Phone operating systems, and will work with Office 365 online productivity software, the companies said in a statement today.

The agreement marks a shift in what has sometimes been a fractious relationship. In 2010, Microsoft sued Salesforce for patent infringement, setting off a countersuit before the companies settled later that year. Microsoft in 2005 also announced plans to “give Salesforce a very effective run for their money” with a competing product and in 2010 ran an anti-Salesforce ad campaign with the tagline “Don’t Get Forced.”

Under new Chief Executive Officer Satya Nadella, Redmond, Washington-based Microsoft has been looking to bolster its Internet-based cloud software and corporate programs. Software makers are looking for more ways to let applications, even from rivals, work together as customers seek to use multiple products and share information.

“This announcement is really about putting our customers first,” Salesforce CEO Marc Benioff said on a conference call.

Terms of the accord weren’t disclosed. Microsoft and San Francisco-based Salesforce both also agreed to use some of each other’s programs, including expanding Salesforce’s use of Microsoft’s database and Azure cloud software.

Developing Apps

Microsoft has been asking Salesforce to deliver apps for Windows and Windows Phone for at least a year. Kendall Collins, an executive vice president at Salesforce, said in a 2013 interview that Microsoft had talked to Benioff about developing apps for the products. At the time, the company wasn’t interested because it didn’t think there was enough customer demand, though Collins said Salesforce would monitor the situation.

Benioff cited Nadella’s ascent to the CEO job for giving him an opportunity to improve Salesforce’s relationship with Microsoft, a company he referred to as the “evil empire” just a few years ago.

Taiwan’s Acer launches cloud computing drive in shift from PC reliance

Taiwan’s Acer Inc detailed its long-touted push into cloud computing on Thursday, as the struggling computer maker responds to a shrinking PC market by pitting itself against cloud leaders Inc and Google Inc.

The world’s fourth-biggest producer of personal computers (PCs) aims to start making software and offering online computing services under the heading Build Your Own Cloud (BYOC).

Acer announced BYOC with few details at the end of 2013 when the company booked a third straight loss after the global PC market shrank 10 percent. PCs have been losing out to tablet computers and sidestepped by the cloud, where users store files remotely and run applications over the internet.

"The computer is still our foundation, but BYOC is a new platform for integration, cross-compatibility and convenience," company founder and chairman Stan Shih said at a news conference.

Acer is promoting BYOC as the future of cloud computing by focusing on the so-called Internet of Things, which allows for remote connectivity across a range of devices. In a promotional video, Acer detailed how BYOC will allow users to operate home appliances or automobiles, for example, using smartphones.

Shih said, without elaborating, that he will help Acer find partners for BYOC initiatives beyond his previously announced retirement in June.

But with BYOC, Acer will enter a fledgling market already so competitive that in March Amazon and Google dropped their prices. On either side of those announcements, Cisco Systems Inc and Hewlett-Packard Co each revealed cloud investments of $1 billion.

Acer is therefore likely to have difficulty differentiating BYOC, but the company may benefit from its strength in manufacturing and hardware cost management, said analyst James Lin of KGI Securities.

"Acer has proven itself good at supply-chain integration, so it may be able to exert better cost control over its data centers than players who have less hardware experience," Lin said before Acer’s Thursday announcement.

Acer has been one of the more notable casualties of a decline in the global PC market. Acer’s PC shipments fell 20.2 percent in the first quarter of 2014 compared with an overall market decline of 4.4 percent, showed data from researcher IDC.

That led to a January-March net profit of only T$1 million ($33,200), continuing Acer’s trend of booking a meager profit or loss in every quarter since early 2011.

Over that time frame, Acer has fallen to the world’s No.4 PC vendor from No.2, according to researcher Gartner. The company has also had three chief executives, and has had to contend with two employees being investigated for insider trading.

Shares of Acer have fallen almost 80 percent since January 2011. They were trading 0.8 percent lower after Thursday’s BYOC announcement, compared with a 0.2 percent decline in the broader TAIEX index.

Forget Price. The Real Cloud War Is About Features

Amazon Web Services (AWS), Google, and Microsoft have been consistently dropping their prices in the battle to become the low-cost provider of compute, network, and storage cloud services. You can bet that if one of them drops their price, the other two will announce price reductions within days. The reality is that the consumption of these core cloud services has become commoditized. The real differentiator is the features that provide agility to developers. IaaS (Infrastructure as a Service) public cloud providers are in a war to create the best platform for developers to quickly build and launch applications. There are two key categories of features that they are focusing on to win the hearts and minds of enterprises: hybrid clouds and PaaS (Platform as a Service) capabilities.

Building the bridge between public and private clouds

For startups and small or midsized companies, public clouds are a no brainer. These companies don’t have large investments in data centers or the people needed to run them. SMBs also have more limited IT budgets. Enterprises, on the other hand, have complex heterogeneous environments with years of legacy, tons of existing infrastructure, and cultures that are used to owning and controlling infrastructure. In order to get a foot in the door of large enterprises, public cloud service providers are implementing new services that allow enterprises to build hybrid clouds.

AWS has led the way while its competitors try to catch up. AWS offers a collection of APIs to assist in moving workloads between private datacenters and the AWS public cloud. Direct Connect provides a private, dedicated network connection between your datacenter and the AWS cloud. AWS also offers the AWS Storage Gateway to provide seamless integration for securely storing data in a hybrid cloud architecture. AWS documentation is also extensive, including numerous published best practices for using its APIs to provide disaster recovery capabilities.

Microsoft has made strides to keep up with AWS, recently announcing ExpressRoute, its equivalent of AWS’s Direct Connect, and releasing Azure Files, its shared file system service. Microsoft also announced major enhancements to Azure Site Recovery, a hybrid disaster recovery solution.

Google’s public IaaS offering, Google Compute Engine (GCE), launched this January. Expect Google to announce APIs in the near future to match the hybrid capabilities that AWS and Microsoft offer.

One advantage that AWS has over its competitors is its partnership with Eucalyptus. Eucalyptus is a private cloud solution that has AWS compatible APIs. The combination of AWS and Eucalyptus gives developers a common development platform to write applications that work seamlessly whether the work loads are running in public or private clouds.

IaaS providers are morphing into PaaS providers

There are many pure play PaaS companies that abstract the underlying physical infrastructure like IaaS providers, but also abstract the application stack made up of web servers, application servers, database servers, and a collection of programming languages. The power of PaaS is to hide all of the complex “IT plumbing” work required to scale, secure, failover, and meet all of those hard technical requirements that take developer time away from building business functionality.

IaaS providers have been adding many PaaS-like APIs to their portfolios to complement their network, compute, and storage APIs. Without officially announcing it, these IaaS players are gradually becoming powerful platforms. AWS, Microsoft, and Google offer APIs that target gaming, mobile, streaming media, and database as a service capabilities to name a few.

Each of the big three has a different strategy for growing its service offerings. AWS prefers to build all of its APIs internally, and it is able to innovate quickly and bring new APIs to market faster than anyone. Microsoft has a build-and-partner approach: it has built a number of APIs to try to keep up with AWS and recently partnered with Apprenda, a pure-play private PaaS provider. This partnership gives Microsoft a hybrid PaaS offering, something that AWS and Google currently don’t have.

Google has taken the build-and-buy approach. On top of delivering several unique APIs in the areas of big data and mobile, and integrating with its own PaaS, Google App Engine, Google recently purchased Stackdriver, a cloud monitoring service. Expect Google to play catch-up by purchasing more companies that fill the gaps in its offering.


The battle for enterprise dollars has moved past price and is now all about features and speed to market. The public IaaS cloud provider that can make the transition to hybrid clouds simple while adding a suite of APIs for developers to build applications faster will be in the best position to win. While AWS has a huge lead in the depth and breadth of APIs it offers to developers, I predict Google and Microsoft will continue to buy and partner with companies in an effort to catch up.

Dimension Data’s latest private cloud service builds on Microsoft platform

Dimension Data is adding to its private cloud infrastructure services with a new offering built on the Microsoft Cloud Platform. The service is based on Windows Server 2012 R2 with Hyper-V, System Center 2012 R2 and Windows Azure Pack.

Even though the $6 billion integrator already has its own private cloud service, called Private Compute-as-a-Service (CaaS), the new offering is meant for businesses and organizations that are seeking to move Microsoft workloads between on-premise, Microsoft Azure and Dimension Data cloud environments. Dimension Data is a Microsoft Certified Gold Partner and a Microsoft Cloud OS Network partner.

The new service can support 32-bit and 64-bit Windows and Linux environments for mission-critical business and Web applications, software development and testing scenarios, and virtual desktop services.

It’s just another example of Dimension Data’s quest to support its clients with cloud services in pretty much whatever form they need or want.

Dimension Data’s cloud services presence is extensive: the company will deliver its new service either within client data centers or in any of its 10 Managed Cloud Platform locations, including Santa Clara, Calif.; Ashburn, Va.; Amsterdam; London; Sydney; Melbourne; Tokyo; Hong Kong; and Johannesburg.

Queensland orders agencies to buy cloud as default

Queensland agencies will be asked to justify any non-cloud IT purchases going forward, under a new policy that will see cloud solutions formally preferenced ahead of on-premise competitors in government tenders.

The policy makes Queensland the first Australian government to take the plunge and commit fully to a ‘cloud-first’ approach, following in the footsteps of the UK, Denmark and New Zealand.

The policy, released today, states that agencies “must consider first cloud-based solutions in preference to traditional ICT investments”.

As a result cloud will become the “default ICT-as-a-service solution unless a sound business case exists for a contrary solution”.

A spokesperson for the Department of Science, IT, Innovation and the Arts said the state’s position should not be considered a "’cloud-only’ policy".

"Some ICT systems may not be suitable or available in a cloud environment," she said. "The Directors-General Council will endorse non cloud-based investment proposals only after full and careful consideration of all aspects including cost/benefit analysis, risk mitigation strategies and whole-of-government potential and implications."

The state first flagged its intent to go down the cloud-first path a year ago, but has remained silent on the practical implications of the procurement stance until today.

Launching the policy, plus a suite of cloud implementation guidelines, IT Minister Ian Walker called the release “one of the most robust suites of tools to be delivered by any Australian government” which will help to “revitalize frontline services by giving agencies the agility to respond to changing business needs and priorities.”

“The strategy and implementation model is an update to the Queensland Government’s ICT strategy and is critical to progress our ICT as a service policy, allowing us to only pay for what we use,” he said in a statement.

Queensland’s fellow states have so far eschewed this kind of strong-arm approach to technology buying, with both NSW and Victoria favoring more nuanced approaches.

Just last week the Victorian Government cemented its own cloud-buying policy, which encourages agencies to include at least one cloud option in their procurement considerations – but makes no attempt to influence the final outcome of a tender.

Speaking to iTnews at the time, Grantly Mailes, the state’s chief technology advocate said he would not call the Victorian stance "cloud-first".

“Cloud-first implies that there is a burden of proof on agencies to show there is no viable cloud offering they could select before going with a non-cloud product,” he said.

Rather, the Victorian policy serves as guidance for agencies to investigate cloud options on the basis that an increasingly mature software-as-a-service market can offer a number of benefits to government buyers. The government does not plan to enforce or monitor applicable agency procurements.

The NSW Government – and for the time being the Federal Government – have adopted much the same policy position, but there have been indications that Canberra could follow Queensland down the cloud-first path in the near future.

Queensland has also clarified its stance on offshore data hosting for the benefit of government buyers.

Before hosting data in the cloud, agencies will be required to classify the sensitivity of the information into a ‘classified’, ‘cabinet-in-confidence’, ‘protected’ or ‘below-protected’ category.

The government will prohibit anything classified above ‘protected’ from being hosted offshore.

However, with the approval of their chief executive, agencies will have the option of hosting ‘protected’ and ‘cabinet-in-confidence’ information in overseas technology environments as long as they have satisfied the requirements of a data sovereignty and security risk assessment.
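The decision rules described above can be sketched as a small function. This is an illustrative encoding of the policy as reported, not any official implementation; the function name and parameters are this article's invention:

```python
# Sketch of Queensland's offshore-hosting rules as reported: anything above
# 'protected' is prohibited offshore; 'protected' and 'cabinet-in-confidence'
# need chief-executive approval plus a completed risk assessment;
# 'below-protected' data may be hosted offshore freely.

def may_host_offshore(category: str,
                      ceo_approved: bool = False,
                      risk_assessed: bool = False) -> bool:
    if category == "below-protected":
        return True
    if category in ("protected", "cabinet-in-confidence"):
        # Requires chief-executive sign-off and a data sovereignty
        # and security risk assessment.
        return ceo_approved and risk_assessed
    return False  # 'classified' and anything above 'protected'

print(may_host_offshore("below-protected"))                    # True
print(may_host_offshore("protected", ceo_approved=True))       # False: no risk assessment
print(may_host_offshore("cabinet-in-confidence", True, True))  # True
```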

ServiceNow: Unleashing The Real Power Of The Cloud

Like most buzzwords, the “cloud” is way overused. It’s actually useless – if not misleading. Somehow the belief is that the cloud is only about delivering software. That’s it.

But for those companies that are making a huge difference in today’s competitive tech market, this view is an anachronism, almost laughable.

OK, so then what is the cloud really about? Well, to get some perspective on this, I reached out to Frank Slootman, who is the CEO of ServiceNow. The company’s technology creates a single system of record for IT (information technology) environments by automating manual tasks, standardizing processes and consolidating systems.

And yes, the financial results for ServiceNow have been stellar. In the latest quarter, it booked $139 million in revenues, up 62% over the past year. The installed base is over 2,100 customers, more than 400 of them in the Global 2000. In fact, in the quarter, there were nine new transactions with annual contract values over $1 million, and two were in excess of $10 million!

“The first stage of the cloud was about modernizing technology,” said Frank. “Now we are in the period of transformation. It’s about changing the way people are doing things.”

First of all, this means that the cloud can make technology truly useful to anyone in an organization. For example, ServiceNow has its own app development platform, which allows for creative customization.

“There are no programming skills required, so you are liberated from the high priests of software development,” said Frank. “The net effect is that orders of magnitude more people are now engaged and involved — which expands the market greatly.”

He also likes to refer to something he calls “lights out, light speed.” In other words, the cloud offers tremendous automation capabilities, which are disintermediating the traditional approach to service delivery.

Frank points to as an example.

Let’s face it: is there a phone number on the site? Do you need one? Not really. Yet Amazon is known for its standout service.

Finally, the cloud provides lots of potential for globalization. “Everyone connects from where they are,” said Frank, “touching the exact same systems and experience that everybody else does. Cloud systems are designed with the latency of the internet in mind, such that systems can massively consolidate, standardize and globalize service processes.”

How Cloud Startups Are Changing The Face Of Innovation

After more than a decade of hype and billions of dollars of value creation, it would be reasonable to expect investors to start losing interest in cloud-related startups. In the venture business, me-too investors reap ever-diminishing returns compared to early entrants in a new sector. And while the sweeping transformation that began with the migration of non-critical applications from the data center to the cloud has proven massive indeed, it’s certainly no longer early days.

Indeed, even the second wave of cloud opportunity — essentially reinventing the infrastructure underlying cloud applications to meet the growing demands for performance and scalability in an increasingly mobile, app-centric world — has begun to level off in terms of new opportunity. The enabling technologies that were hand-sewn by cloud pioneers like Google or Facebook are becoming more and more commercially available; one by one, obstacles are being eliminated and problems are being solved, primarily by startups.

A number of startups addressing cloud scalability challenges for massively scalable web applications have already become market leaders (Fusion-io and Palo Alto Networks) or have achieved outstanding exits by being acquired by market leaders (Nicira and XtremIO). There’s still plenty of room for innovation, especially around mobile bandwidth and real-time interaction with diverse data sets, and it’s still early in the cycle as far as value creation and exits. But the days of low-hanging fruit are well past.

Yet cloud computing touches every software-related company that we invest in today in one way or another. Why? If the first wave of cloud computing was all about applications (SaaS), and the second wave was all about fixing everything that the cloud breaks (networking, storage, security, etc.), then what’s left?

A lot, as it turns out. Because cloud didn’t just change everything — it also enabled an entirely new opportunity set – “de novo” applications that could not have existed prior to cloud computing. So on the downslope of the first wave of the cloud transformation, the underlying architecture was a severely limiting factor. But that second wave doesn’t just allow us to fully exploit the first; it also fueled a third: agile development of cloud applications, utilizing cloud APIs, application assembly and other techniques to rapidly enhance developer productivity.

NoSQL databases are fundamental components of the agile development movement, allowing a team to start coding an application immediately and forego the long drawn out process of planning a database schema. In general, the hopper is loaded with great future exits for companies that play into application assembly: security and authentication services (e.g., Stormpath); communication (Twilio); integration between cloud and on-premises applications (MuleSoft); application deployment and management (Docker); analytics (; and application delivery controllers (NGINX).
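The schema-less flexibility that makes NoSQL databases a fit for agile development can be sketched with plain Python dicts standing in for a document store; a real project would use MongoDB, CouchDB or similar, and the records here are invented:

```python
# Documents with different shapes coexist in one collection, so a team can
# add fields as the application evolves instead of planning a schema up front.

collection: list[dict] = []

# Early iteration: a user record with just two fields.
collection.append({"name": "Ada", "email": ""})

# Later iteration: new fields appear with no migration step.
collection.append({"name": "Grace", "email": "",
                   "roles": ["admin"], "last_login": "2014-05-29"})

# Queries simply tolerate missing fields.
admins = [doc["name"] for doc in collection
          if "admin" in doc.get("roles", [])]
print(admins)  # ['Grace']
```

The trade-off, which the relational camp is quick to point out, is that the schema discipline skipped at write time has to be handled somewhere, usually in application code at read time.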

This next-gen development wave is still ramping up and “application aware” is quickly becoming the new catch-phrase in technology infrastructure.

“Big data,” a hype cycle unto itself, starts earning its keep in this third wave. Now that we know how to process, serve and store the massive volumes of data being generated, we can finally start doing something with it. Next-generation cloud applications are dynamically data driven; the data consumed by a given application and how it reacts to that data will be key differentiators. The application will leverage programming interfaces based on open web standards to ingest and crunch this data from multiple sources, while also exposing its own set of services to others with open APIs.

What about the underlying infrastructure technology? It too looks different going forward.

The technology will operate at hyper scale. It will no longer be just about virtual compute, but also about virtual networking and storage technologies. And the data layer will embrace a new wave of technologies offered by next-gen infrastructure companies such as MapR and Hadoop for data warehousing or Databricks and Spark for real-time analytics.

This new wave of opportunities, likely to drive venture returns well into the next decade, combines modern technologies across all layers of the stack — from the very top access tier down through the business logic and data persistence layer — with the scale of cloud computing to deliver something that legacy technology could never achieve: data-driven, mobile-first, personalized cloud solutions, leveraging open APIs and operating at hyper-scale in real time.

The stack was the great divide between the past and future of cloud, and it makes for a fairly effective sniff test when looking at new cloud investments. Similar to the way customer acquisition cost ratios, “magic number” and ROI became the de facto metrics for screening SaaS businesses, investors looking for de novo cloud opportunities need to assess the extent to which the company has embraced the modern cloud stack.

The former criteria have become so prevalent in cloud entrepreneur parlance that we rarely get five minutes into a presentation before someone mentions unit economics. Today, we’re just as eager to assess the pervasiveness of cloud at every layer of their business, from the way the application is designed to the underlying guts of the infrastructure powering the application.

The cloud has suffused the innovation landscape; it’s intrinsic to how we think about technology today and a building block for everything that comes next.

And from this VC’s perspective, it’s got a pretty long runway.

Note: Some of the companies referenced (Fusion-io, MuleSoft, Nicira, NGINX, and Stormpath) are NEA investments.

Companies and the Clouds They Keep

After co-founding Salesforce.com (CRM) in 1999, Marc Benioff spent years preaching the gospel of its radical new business model, cloud computing. The days of the hard sell are long gone, Benioff says. Just this month, the Salesforce chief executive officer played host to News Corp. (NWSA) Chairman Rupert Murdoch and Wal-Mart Stores (WMT) CEO Doug McMillon, who’d traveled to Silicon Valley to hear how Salesforce could help them cut costs and increase sales.

Murdoch and McMillon didn’t come just to hear about the customer-relationship-management software that’s Salesforce’s bread and butter. According to Benioff, they wanted to learn more about Salesforce’s cloud “platform,” a mélange of programming tools, data centers, and partnerships that let companies quickly and cheaply create cloud services of their own. “The world’s most important CEOs realize that their companies also need to become cloud platforms, and that’s why they’ve become so interested in our services,” says Benioff.

Salesforce is not alone in recognizing that the competition for cloud dominance is moving in this direction. Phase 1 was the “software as a service” craze, during which hundreds of companies followed Salesforce’s lead and rushed to create cheaper, easier-to-use online versions of traditional programs. Among the startups that rose to prominence are Workday (WDAY), which creates software for human resources departments, and Dropbox, which specializes in document storage.

Then, in 2006, Amazon.com (AMZN) pioneered “infrastructure as a service” by renting out use of its vast data centers so companies didn’t need to build their own. Booming demand enticed others to enter what is now an $8 billion-a-year market, according to Synergy Research Group. Amazon’s rivals have been cutting prices furiously in the hopes of whittling down its commanding 43 percent share. In March, Google (GOOG) announced it was slashing the cost to rent storage by 68 percent and the price of computing power by 32 percent. Microsoft (MSFT) and others are following suit.

Now these companies—along with enterprise software titans such as IBM (IBM), Oracle (ORCL), and SAP (SAP)—are focusing on building out soup-to-nuts cloud platforms. All include programming tools, some designed so that even Luddites can quickly create new cloud services. They also handle tedious tasks such as applying security patches and adding more servers when traffic spikes. As a result, companies can roll out cloud services faster and more frequently.


Coca-Cola Enterprises (CCE), a large Coke bottler in Europe, has used various platforms to create new apps. One it recently installed on the phones of hundreds of salespeople has cut the time needed to tally inventory on store visits by more than 30 percent, says Chief Information Officer Esat Sezer. Rather than count cans and bottles the old-fashioned way, all they do is snap a picture of cooler shelves and let the software do the work. Sezer says it took just a few months from concept to deployment. “In the past this would have taken years, literally,” he says.

Salesforce launched an online store for cloud software applications called AppExchange in 2006 (that’s two years before Apple (AAPL) opened up its own App Store for iPhone users). In 2007 it made the technology it used to write its flagship customer-relationship-management product available to developers through its site; app makers pay Salesforce a cut of their sales. Last year it repackaged all of this into a broader suite called Salesforce1, which also includes tools to adapt developers’ apps for smartphones and other mobile devices. This month, Benioff says, it will add tools so developers on its platform can also have their apps run on wearable devices—a salesman out on his morning jog could counter a competitor’s discount using his smartwatch.

By themselves, these platform services are not huge moneymakers: They contributed about $400 million of Salesforce’s $4.1 billion in revenue last year, according to Gartner (IT) analyst Yefim Natis. Yet ultimately, the cloud platform war could be every bit as consequential to the future of computing as the Mac vs. Windows clash of operating systems in the 1980s and ’90s. By making it easier and more lucrative for companies to create Windows-based applications, Microsoft engineered a de facto monopoly. The same will hold true in the cloud, says Natis—though it’s unlikely that one company will achieve a comparable level of dominance. “The one with the most software developers wins,” he says.

To help foster an ecosystem around Salesforce’s cloud platform, Benioff has sprinkled an undisclosed sum of money over more than 100 startups. Many are focused on areas outside Salesforce’s wheelhouse. The primary goal of these investments, says John Somorjai, Salesforce’s senior vice president for corporate development and strategy, is to create a service that’s not available from Amazon, Google, or the others.

Salesforce is currently the market leader in the $4.6 billion-a-year “platform-as-a-service market,” with a 20 percent share, according to Synergy. But after years as the clear front-runner, the company is now neck-and-neck with Microsoft and Amazon. When it first rolled out its Azure cloud offering in 2010, Microsoft required developers to use only Windows-related technologies to write their code. When few agreed, it followed Amazon’s lead and focused on renting out its data centers. Only recently has the company begun layering in more platform-oriented features, this time in a way that doesn’t require use of its in-house tools, says Microsoft technology fellow Mark Russinovich.

IBM bought a data center operator called SoftLayer a year ago and has since been introducing pieces of a platform it calls Bluemix that’s designed to help longtime corporate customers easily create cloud services that work with their existing setups, according to Steve Robinson, general manager of IBM’s Cloud Platform Services division.

Amazon has been mostly content to focus on renting its raw data center capacity and offers only rudimentary tools for software developers, says IDC analyst Larry Carvalho. Yet the company, with its vast customer base and aggressive pricing philosophy, would make fast headway if it chose to do more. While IBM, Microsoft, and Oracle are famous for cranking up prices to support their rich profit margins, Amazon last year introduced a Trusted Advisor feature that automatically scans cloud customers’ usage to find ways to lower their monthly bill. Says Carvalho: “They’re one of the only companies I know of that tells customers, ‘Hey, you’re paying us too much.’ ”

Benioff’s devotion to cloud computing has its downsides. Most companies still have some key applications they deem too important to run from someone else’s data center. And all of the cloud platform wannabes could be threatened by open-source alternatives offered by a range of upstarts, including Pivotal, a subsidiary of EMC (EMC). “Marc was a game-changer, but that was a decade ago,” says Sezer, who says open-source services may end up being a 10th of the cost of platform providers such as Salesforce. “His next level of innovation and service offerings cannot wait a decade. It has to be a matter of months, not years.”