Databricks Snags $33M In Series B And Debuts Cloud Platform For Processing Big Data

Databricks, the commercial entity created by the developers of the open source Apache Spark project, announced $33M in Series B funding today and the launch of a new cloud product, their first one as a company.

There is little doubt that big data is a big deal these days and companies are popping up to help customers process the data. Databricks hopes to simplify the entire matter by moving it to the cloud to reduce management headaches, while speeding it up by using Apache Spark to drive the platform.

First, let’s look at the funding, which is led by New Enterprise Associates (NEA) with a contribution from previous investor Andreessen Horowitz. It brings the total funding to date to $47M.

The latest round gives the company a huge financial boost, and CEO Ion Stoica says they hope to increase headcount and expand rapidly.

In addition to the funding, the company also announced a new cloud platform called Databricks Cloud that Stoica says has been designed to simplify big data processing by bringing the entire process under one cloud umbrella.

The cloud solution consists of three pieces: the Databricks Platform, Spark and the Databricks workspace. The idea behind the product, Stoica says, is to provide a single place to process data without having to worry about managing a Hadoop cluster. It’s all done instead in a managed cloud environment.

After you add your data to a project, you can begin working with it immediately. The product has several core concepts starting with Notebooks, which provide a way to interact with the data and build graphs. As you discover ways of displaying your data, you can begin to build dashboards to monitor certain types of data. Finally, the platform includes a job launcher, which enables users to schedule Apache Spark jobs to run at certain times.

The product has been designed to allow customers to access and plug in third-party Spark applications, so if they have additional requirements not available in the base Databricks platform, they can use existing third-party applications to take advantage of whatever those tools have to offer.

The company believes that by providing a set of tools built in the cloud, they will remove much of the pain and complexity involved with a typical big data processing project where so much time is spent simply getting the right tool set in place before any work even gets done.

Initially, Stoica told TechCrunch, the product will run on AWS, but they are looking to expand to Google Compute Engine and Microsoft Azure, and their large infusion of cash should help facilitate that.

The company was born out of the Apache Spark project, which Stoica and fellow researchers at the University of California, Berkeley originally developed in 2009. They were looking for something faster than Hadoop, so they built Spark, which they open sourced in 2010.

Stoica says it’s faster for a number of reasons, including that it requires less code to process a job and it runs entirely in-memory rather than relying on disk reads, which slow down processing.
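The in-memory advantage is easiest to see with iterative workloads. The toy sketch below, in plain Python rather than actual Spark code, contrasts re-reading input on every pass (the Hadoop pattern) with parsing once and reusing a cached in-memory copy (the Spark pattern); all names and data are invented for illustration.

```python
# Toy analogy for Spark-style in-memory caching (not actual Spark code).
# A "disk read" is simulated by re-parsing raw text on every access.

RAW = "3 1 4 1 5 9 2 6\n" * 1000  # pretend this lives on disk

def read_from_disk():
    """Simulate an expensive disk read plus parse on every call."""
    return [int(tok) for tok in RAW.split()]

def iterate_with_disk(passes):
    """Hadoop-style: each iteration re-reads the input."""
    total = 0
    for _ in range(passes):
        data = read_from_disk()      # repeated I/O cost
        total += sum(data)
    return total

def iterate_with_cache(passes):
    """Spark-style: parse once, cache in memory, reuse across iterations."""
    data = read_from_disk()          # one read, then cached
    total = 0
    for _ in range(passes):
        total += sum(data)           # pure in-memory work
    return total

assert iterate_with_disk(10) == iterate_with_cache(10)
```

Both loops compute the same answer; the cached version simply skips the repeated parse, which is the intuition behind Spark's speedup on multi-pass algorithms.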

While the company continues to support the open source project, last year as Spark was gaining traction, they decided to create a commercial entity on top of that and got $13.7M from Andreessen Horowitz to build the product and the company.

Today marks their debut product. Stoica indicated the platform is available to a limited set of customers today, but they will be expanding that gradually in the coming months.

VMware Horizon 6 is Here!

By: Courtney Burry, Director of Product Marketing, End-User Computing, VMware

It’s here—right in the middle of the World Cup! VMware Horizon 6 is officially shipping as of 20th June 2014.

With this release, IT can now deliver RDS hosted applications alongside virtual desktops through a single platform. And these desktops and applications—inclusive of packaged ThinApp, SaaS apps, RDS hosted apps and even Citrix XenApp applications—can all be accessed by end-users through a unified workspace.


This is a big deal for customers looking for a cost-effective alternative to supporting end-users who may just need access to a few applications across mobile devices or remote locations.

What’s more, VMware has been hard at work to ensure that end-user performance with PCoIP is better than ever before. In fact, with Horizon 6, new bandwidth management enhancements and changes to the PCoIP default values improve user experience over wireless networks and reduce bandwidth usage by up to 30% out of the box. (Look for more details on this one in a subsequent blog post).

But this release is not just about apps and making end-user access a whole lot easier, it’s also about ensuring that IT organizations everywhere can streamline and automate desktop and application management and ultimately drive down costs.

To help with this, Horizon 6 includes Mirage, which takes a layered approach to image management, ensuring that IT can simply patch, update and manage images for physical, full-clone virtual and even BYO PCs.

Horizon 6 also makes it easier to set up desktops, with the Horizon vCenter Orchestrator (vCO) plug-in. By using this plug-in with vCloud Automation Center (vCAC), IT can approve self-service requests from end-users for applications and desktops, and end-users can refresh and recycle their desktops as well. Delegated administrators can also do these things on behalf of end-users or simply manage entitlements and assignments for specific pools. So if you have end-users and admins across multiple locations, you now have the ability to empower them without losing control along the way.

VMware is also making it easier to monitor virtual desktop environments to ensure end-users have the best possible user experience. With Horizon 6, vCenter Operations Manager for Horizon now supports monitoring for both desktops and hosted RDS sessions and applications. This allows organizations to monitor all Horizon components, including desktops and apps, the network, the virtualization stack, and host and application servers.

Now many customers need to provide five nines of availability to their end-users, particularly in healthcare, financial services, retail and government. And when a desktop that resides on a server in the datacenter goes down, it often means that end-user goes down too; that employee may as well go and watch the World Cup, because they can’t get to the apps and data they need to do their job. With Horizon 6 we have introduced a Cloud Pod Architecture that allows organizations to scale Horizon 6 across multiple datacenters and sites. This architecture supports an active-active configuration to ensure high availability, so if an end-user’s desktop in Site A goes down, they can easily be entitled to another desktop in Site B without any downtime.

All of these management enhancements will help drive down the day-to-day costs of supporting end-users. But how does Horizon 6 help you maximize the investments in VMware you have already made? (And what I am really talking about here are those investments in vSphere and the software-defined datacenter.)

Well, for starters, Horizon 6 comes with vSphere and leverages vCenter to manage and set up pools of users, making it a no-brainer for anyone on vSphere looking to take advantage of desktop and application virtualization in their environment.

Customers can leverage vSphere with Horizon 6 to deliver great 3D graphics, inclusive of software-rendered graphics, shared graphics acceleration and dedicated graphics. This is not new. But Horizon 6 isn’t just about optimizing workloads for vSphere; we have also optimized our virtual desktops and apps for VMware Virtual SAN. And Virtual SAN, along with vSphere and vCenter, all come bundled with Horizon 6.

With Virtual SAN, organizations can now use local storage and get the performance you’d expect from an all-SSD storage system. Not only will this save you money, it also makes storage a lot easier to manage. It literally takes three clicks to install Virtual SAN through the vSphere web client. What’s more, once you have it set up, you no longer need to manage it on a day-to-day basis. This goes a long way in making storage management actually manageable.


Now whether you are turning to desktop and application virtualization for security and compliance reasons, to streamline desktop management, to provide end-users with anytime, anywhere access or to provide a level of disaster recovery, you should also have the flexibility to deliver those services on premises or through the cloud. With Horizon 6 and Horizon DaaS, you do. Both of these offerings leverage PCoIP, ensuring that end-users get the same experience whether the desktops are served up from the cloud with Horizon DaaS or from your datacenter with Horizon 6.


We are excited about Horizon 6. And based on the “book” I have just written above, you can see that there is a lot in this release. In fact, this is by far the biggest release the desktop team at VMware has pulled together to date. So between World Cup matches (go team USA!) be sure to find out more:

Get some hands on experience with Horizon 6 in the Hands On Labs

Get a full overview of Horizon 6 through the EUC Insights Virtual Event

Download a free evaluation of Horizon 6 today – available at 8pm PST.


Meet the ambitious ‘cloud’ startup that plans to destroy sync

While the product works for individuals, the startup is courting the enterprise market. But at least one bigger player is skeptical of its chances to strike major deals.

“This may be great for personal use or for the 5 to 25 seat type companies, but would never make its way into an enterprise-type deployment,” Vineet Jain, CEO of cloud storage company Egnyte, told VentureBeat. “I say ‘never’ loosely, but it is highly unlikely.

“I see this as somewhat of an Apple product-type answer to the more complex [network-attached storage] devices that already exist on the market right now,” said Jain.

The startup is currently running a Kickstarter campaign for the “Sherlybox,” a network-attached storage device that works with its software. But the software also works for Mac and PC, with Linux support coming in a few weeks, and mobile support after that. Marciniak emphasized that the Sherlybox is a cool product, but the company’s primary focus is its software, which it’s currently beta-testing with around 2,000 users. The company expects to make the software generally available this fall.

“My goal is to introduce a better and faster way to handle data,” said Marciniak. “I believe it is within our reach.” The startup, which raised $550,000 from a seed-stage investment firm in Poland, will soon discover if that vision appeals to Silicon Valley’s financiers. Marciniak is currently traveling around the Valley seeking American backers.

Why Google Cloud Dataflow is no Hadoop killer

Unveiled earlier this week, Google’s Cloud Dataflow service clearly competes against Amazon’s streaming-data processing service Kinesis and big data products like Hadoop — particularly since Cloud Dataflow is built on technology that Google claims replaces the algorithms behind Hadoop.

But on closer look, Cloud Dataflow is better thought of as a way for Google Cloud users to enrich the applications they develop — and the data they deposit — with analytics components. A Hadoop killer? Probably not.

Google bills the service as "the latest step in our effort to make data and analytics accessible to everyone," with an emphasis on the application you’re writing rather than the data you’re manipulating.

Significantly, Google Cloud Dataflow is meant to replace MapReduce, the software at the heart of Hadoop and other big data processing systems. MapReduce was originally developed by Google and later open-sourced, but Urs Hölzle, senior vice president of technical infrastructure, declared in the Google I/O keynote on Wednesday that "we [at Google] don’t really use MapReduce anymore."

In place of MapReduce, Google uses two other projects, Flume and MillWheel, that apparently influenced Dataflow’s design. The former lets you manage parallel pipelines for data processing, which MapReduce didn’t provide on its own. The latter is described as "a framework for building low-latency data-processing applications" and has apparently been in wide use at Google for some time.
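The pipeline idea can be sketched in plain Python with chained generators. This is only an analogy for how Flume-style stages compose into one logical job, with invented stage names and data; it is not the actual Flume or Dataflow API.

```python
# Minimal sketch of composable pipeline stages (plain Python generators;
# FlumeJava/Cloud Dataflow expose a similar idea through their own APIs).

def read(lines):
    for line in lines:
        yield line.strip()

def parse(records):
    for rec in records:
        user, action = rec.split(",")
        yield {"user": user, "action": action}

def filter_clicks(events):
    for ev in events:
        if ev["action"] == "click":
            yield ev

def count_by_user(events):
    counts = {}
    for ev in events:
        counts[ev["user"]] = counts.get(ev["user"], 0) + 1
    return counts

# Stages compose into a single logical pipeline rather than
# separate, independently scheduled map and reduce jobs.
log = ["alice,click", "bob,view", "alice,click", "bob,click"]
result = count_by_user(filter_clicks(parse(read(log))))
```

With MapReduce alone, each of these stages would typically be a separate job with intermediate output written to disk; the pipeline framework's contribution is letting the system optimize across stage boundaries.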

Most prominently, Cloud Dataflow is touted as superior to MapReduce in the amount of data that can be processed efficiently. Hölzle claimed MapReduce’s performance degrades once the amount of data reaches the multipetabyte range. For perspective, Facebook claimed in 2012 that it had a 100-petabyte Hadoop cluster, although the company did not go into detail about how much custom modification was used or even whether MapReduce itself was still in operation.

Ovum analyst Tony Baer sees Google Cloud Dataflow as "part of an overriding trend where we are seeing an explosion of different frameworks and approaches for dissecting and analyzing big data. Where once big data processing was practically synonymous with MapReduce," he said in an email, "you are now seeing frameworks like Spark, Storm, Giraph, and others providing alternatives that allow you to select the approach that is right for the analytic problem."

Hadoop itself seems to be tilting away from MapReduce in favor of more advanced (if demanding) processing algorithms, such as Apache Spark. "Many problems do not lend themselves to the two-step process of map and reduce," explained InfoWorld’s Andy Oliver, "and for those that do, Spark can do map and reduce much faster than Hadoop can."
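The two-step model under discussion is easy to sketch in plain Python; the word count below is the canonical illustration (illustrative only, not Hadoop or Spark code).

```python
# The classic two-step MapReduce model, sketched in plain Python:
# a map phase emits (key, 1) pairs, a reduce phase sums per key.
from functools import reduce

docs = ["the cat sat", "the cat ran"]

# Map: each document becomes a stream of (word, 1) pairs.
mapped = [(word, 1) for doc in docs for word in doc.split()]

# Shuffle + reduce: aggregate counts per key.
def reducer(acc, pair):
    word, n = pair
    acc[word] = acc.get(word, 0) + n
    return acc

counts = reduce(reducer, mapped, {})
```

Word counting fits the two-step mold perfectly; iterative algorithms like PageRank or k-means do not, which is the mismatch Oliver describes and one reason engines like Spark have gained ground.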

Baer concurs: "From the looks of it, Google Cloud Dataflow seems to have a resemblance to Spark, which also leverages memory and avoids the overhead of MapReduce."

The single greatest distinction between Hadoop and Google Cloud Dataflow, though, lies in where and how each is most likely to be deployed. Data tends to be processed where it sits, and for that reason Hadoop has become a data store as much as a data processing system. Those eying Google Cloud Dataflow aren’t likely to migrate petabytes of data into it from an existing Hadoop installation. It’s more likely Cloud Dataflow will be used to enhance applications already written for Google Cloud, ones where the data already resides in Google’s system or is being collected there. That’s not where the majority of Hadoop projects, now or in the future, are likely to end up.

"I don’t see this as a migration play," said Baer.

Some Cheerful Truths About Cloud Storage

Yes, Jon William Toigo makes some excellent points about lack of cloud storage management tools in his recent post, "Some Depressing But Hard Truths About Cloud Storage," but there has been some good news lately: dropping cloud storage prices.

Sure, the ongoing cloud storage price wars among companies such as Google, Apple and Microsoft might be consumer-oriented, but cheaper cloud storage for the public at large surely translates into lower costs for enterprises.

The latest volley came this week from Microsoft, which increased the amount of free storage on its OneDrive (formerly SkyDrive) service and slashed prices for paid storage. The move follows similar initiatives from competitors.

In March, Google cut prices for its Google Drive service.

Shortly after, Amazon also got in on the act, posting lower prices effective in April.

Enterprise-oriented Box at about the same time decreased the cost of some of its services, such as its Content API.

Earlier this month, Apple announced price reductions for its iCloud storage options and previewed a new iCloud Drive service coming later this year with OS upgrades.

Just a couple of weeks ago, IBM cut the price of object storage in its SoftLayer cloud platform.

Interestingly, "Dropbox refuses to follow Amazon and Google by dropping prices," CloudPro reported recently. Coincidentally, The Motley Fool today opined that "Microsoft May Have Just Killed Dropbox."

In addition to price wars, the cloud storage rivals are also improving other aspects of their services in order to remain competitive. For example, in Microsoft’s announcement a couple days ago, the company also increased the free storage limit from 7 GB to 15 GB. And if you subscribe to Office 365, you get a whopping 1 TB of free storage. In paid storage, the new monthly prices will be $1.99 for 100 GB (previously $7.49) and $3.99 for 200 GB (previously $11.49), the company said.

And, of course, Amazon and Google both last week announced dueling solid-state drive (SSD) offerings.

So as Jon William Toigo ably explained, enterprise cloud storage management is a headache and it’s not getting enough attention. But if these consumer trends continue apace in the enterprise arena, lower TCO and increased ROI should lessen the pain.

Salesforce Takes Its Cloud Model to Health Care

Salesforce, the cloud pioneer known for customer management software, is going into health care.

On Thursday, Salesforce and Philips, the Dutch electronics maker, jointly announced what they call an “open cloud-based, health care platform.” That means, they say, health care software developers, producers of medical devices, health care providers and insurance companies will be able to link to the Salesforce health cloud.

The foray into health care is a significant step by Salesforce into a specific industry, as opposed to supplying offerings that span industries, like customer relationship management software as a service. It is part of the growth strategy at Salesforce, and is being led by Vivek Kundra, a former chief information officer of the federal government in the Obama administration, who joined Salesforce two years ago. His title is executive vice president of industries.

There have been other high-profile cloud entries in health care, notably Google Health, a personal health records initiative, which opened in 2008 and was shut in 2011. But Salesforce is an enterprise cloud company and it is taking a very different approach. Its initial move, with Philips, is to focus on a specific target in health care — using technology to manage chronic ailments.

The statistics on chronic illnesses are well-known and sobering. According to the Centers for Disease Control and Prevention, chronic diseases account for seven of every 10 deaths, and for 75 percent of health care costs in America. The five major chronic conditions, in the CDC list, are heart disease, cancer, diabetes, arthritis and obesity.

Jeroen Tas, chief executive of health care informatics at Philips, points to another set of figures — on the “1 percent” in American health care. Mr. Tas cites a study that found the treatment for 1 percent of the patients in the United States represents 21 percent of the nation’s health care expenses, averaging about $88,000 a patient a year.

The two companies have worked together for the last six months on the health care offering. They are beginning with a couple of applications from Philips built on the Salesforce technology, which allow hospitals and care givers to remotely monitor patients with chronic conditions. “We start where you need the most collaboration, and the health need is greatest,” Mr. Tas said.

The applications work, Mr. Tas explained, by linking patient sensors with smart software over the cloud. Philips is a large producer of digital patient-sensing devices, including blood pressure cuffs, weight scales, fall detectors and activity trackers.

The data from such devices, Mr. Tas said, is then sent over the Internet, aggregated and analyzed by Philips software to assess which patient among the many being tracked by a nurse practitioner should be the priority — the one in greatest need of advice or assistance and so the first one to be called.
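As a rough sketch of that triage idea, the Python below scores patient readings and picks the patient most in need of a call. The field names, weights and thresholds are all hypothetical; the article gives no detail on Philips' actual algorithm.

```python
# Hypothetical triage sketch: pick the patient most in need of a call.
# Scoring rule and field names are invented for illustration; this is
# not Philips' actual logic, which the article does not describe.

def risk_score(reading):
    """Crude score: deviation of vitals from nominal targets."""
    score = 0.0
    score += abs(reading["systolic_bp"] - 120) / 120     # blood pressure drift
    score += abs(reading["weight_change_kg"]) / 2.0      # sudden weight change
    score += 1.0 if reading["fall_detected"] else 0.0    # fall is a red flag
    return score

def next_patient_to_call(readings):
    """Return the patient ID with the highest risk score."""
    return max(readings, key=risk_score)["patient_id"]

readings = [
    {"patient_id": "p1", "systolic_bp": 125, "weight_change_kg": 0.2, "fall_detected": False},
    {"patient_id": "p2", "systolic_bp": 170, "weight_change_kg": 2.5, "fall_detected": False},
    {"patient_id": "p3", "systolic_bp": 118, "weight_change_kg": 0.0, "fall_detected": True},
]
```

The value of doing this in the cloud is that one nurse practitioner's worklist can be ranked continuously across every monitored patient, rather than per device.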

Chronic care management programs are being pursued by most of America’s large and integrated health care providers and can reduce costly hospital admissions and complications. The promise of Salesforce’s cloud approach is that it could reduce the expense of such programs by freeing hospital groups from having to do much of the technical work themselves.

If the Salesforce health platform succeeds, Mr. Kundra said, it will “unleash a wave of innovation” in health care technology as developers write the “next generation of software apps” in the industry.

For the last few years, Mr. Kundra said, the focus of policy and the industry has been on moving the health care system from paper records to electronic medical records, an effort supported by federal subsidies. “We’re moving,” Mr. Kundra said, “into a post-EMR world,” in which innovation in health care information technology accelerates.

Some Banks Are Heading To The Cloud — More Are Planning To

A recent survey of 29 senior technology executives at financial institutions found that cloud computing holds the potential to redefine the relationship between corporate tech departments and financial institution business units. More important, the change is coming at a time when costs and regulatory compliance are high priorities, according to the Boston-based Aité Group.

The different demands enable banks to choose from several types of cloud applications such as private clouds, for the more sensitive data, and public clouds to store other information. More frequently, banks are going with a hybrid model that combines the two, Aité Group analyst David Albertazzi said.

“I think the new solutions are much better in terms of technology … and therefore they are being viewed as less risky,” he said.

The Aité Group found that 50 percent of those surveyed responded that they were likely or highly likely to use private clouds in the next 24 months.

Cloud certainly has risen to top of mind. In July 2013, PricewaterhouseCoopers LLP reported that 71 percent of financial services respondents said they would invest more in cloud-based technologies this year, compared with only 18 percent the year before.

In the U.S., cloud computing for banking tends to lag, unless you count service bureaus like FIS, Fiserv and Jack Henry as cloud providers. While they do provide remote processing, they don’t offer the dynamic reconfiguration of true cloud service providers, which can allocate computer resources by the hour for applications, testing and development.

European cloud banking got a boost last summer when De Nederlandsche Bank (DNB), the Netherlands’ national banking regulator, cleared Amazon Web Services for a range of banking services including websites, mobile applications, retail banking platforms, high performance computing and credit risk analysis.

Amazon Web Services (AWS) has scooped up significant business from financial services firms which have been early adopters of cloud.

Robeco Direct N.V., a Dutch bank offering savings products, investment funds, mortgages and other services, currently manages over €8 billion in assets. It recently moved its entire retail banking platform to the cloud. To reduce complexity and costs while improving agility, the bank selected a banking platform from Ohpen, a banking technology provider based in Amsterdam.

The sixth largest bank in Spain, Bankinter, uses the Amazon cloud to run credit risk simulations in 20 minutes, down from 23 hours before. A major American investment firm uses AWS to run credit risk simulations for its long-term debt and equity-backed securities in its London branch. By using AWS it can scale up vast amounts of technology infrastructure on demand and pay only for what it uses. Having such vast compute capacity available on demand is extremely important in the capital markets industry, where milliseconds can mean millions in profit, the company has said.
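Credit risk simulation of this kind is a classic embarrassingly parallel workload, which is why on-demand cloud capacity suits it: every trial is independent, so trials can be spread across as many instances as a deadline requires. The sketch below is a toy Monte Carlo estimate of portfolio value-at-risk in plain Python, with all portfolio parameters invented for illustration.

```python
# Toy Monte Carlo credit-risk sketch. Each trial is independent, so the
# trials parallelize trivially across cloud instances. All portfolio
# numbers (default probability, exposures, loan count) are invented.
import random

def simulate_portfolio_loss(default_prob, exposure, n_loans, rng):
    """One trial: total loss if each loan independently defaults."""
    return sum(exposure for _ in range(n_loans) if rng.random() < default_prob)

def value_at_risk(trials, percentile, default_prob=0.02, exposure=1.0, n_loans=500):
    """Estimate the loss at the given percentile across many trials."""
    rng = random.Random(42)  # fixed seed for reproducibility
    losses = sorted(
        simulate_portfolio_loss(default_prob, exposure, n_loans, rng)
        for _ in range(trials)
    )
    return losses[int(percentile * trials)]

var_99 = value_at_risk(trials=2000, percentile=0.99)
```

Cutting a 23-hour run to 20 minutes is essentially a matter of running more of these independent trials at once on rented machines, then releasing the capacity when the batch finishes.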

An English wealth management group decided to build all of its new systems on AWS. New functionality included data warehousing and an electronic business processing system to replace a paper-based application system. It has now started to move legacy systems to the cloud which can accommodate its fast growth — nearly 50 percent year on year.

Australia has also been a strong market for AWS. Suncorp Bank Australia placed an emphasis on innovation and launched a working virtual private cloud and virtual data center in under three months with plans to move 2,000 applications to the AWS Cloud along with large parts of their core banking. Commonwealth Bank of Australia said moving storage to the cloud cut costs in half while it achieved similar savings in app testing and development. Expanding technology has also become easier and cheaper; while before it took the bank eight weeks and several thousand dollars to stand up a new server, now it takes eight minutes and 25 cents to do the same thing in the cloud, making the bank much more responsive to changing customer demands.

In Africa, Zitouna Bank in Tunisia has selected IBM’s cloud capabilities to host its Temenos banking platform.

IBM said the project will support Zitouna Bank’s objectives to open up to 18 new branches per year and roll out new mobile and Internet banking services.

“Benefits are expected to be reduced waiting times for customers, a greater choice of banking channels and services, and extending services to the nation’s unbanked and underbanked citizens,” IBM said.

Adopting a cloud model will provide the bank with a platform that is secure, scalable and can handle mission-critical workloads while offering greater flexibility and performance, the company said.


“Our aim is to provide a diversified portfolio of modern banking services to enterprises and individuals, and establish ourselves as a leader in Tunisian banking,” said Lasaad Jaziri, CIO for Zitouna Bank. “IBM’s cloud capabilities will help us roll out a wider range of services and products, such as new mobile and Internet banking services, while also improving efficiency of internal banking processes and reducing our operational footprint.”

While the US has lagged in cloud adoption in banking, that could be changing, according to a publication from Unisys consultants.

“The cloud conversation at many banks has been reenergized by regulators’ latest demand for higher bank capital levels,” wrote Robert Olson, Colin Lacey and Ashwini Almad in BAI’s Banking Strategies. “Cloud offers the tempting opportunity to shrink capital expenditures on technology and shift them to operating expenditures.

“Agility and time-to-market also augur for a more aggressive cloud strategy. When a competitor ups the ante on your mobile application and your bank wants to quickly match it to avoid losing customers, saying ‘months or weeks’ is unacceptable. If cloud can answer ‘hours and minutes,’ cloud is where that application is going.

“And cost remains a powerful force behind cloud. Banks’ information technology (IT) spending as a percent of total costs (14.3%) is the highest of all industries. Measured another way, as a percentage of revenues, banks’ IT costs (7.3%) are about twice the average across all industries surveyed (3.7%).”

Aereo’s Supreme Court Smackdown Won’t Harm Apple iCloud Or Others In The Cloud Computing Industry

The cloud computing industry is breathing a sigh of relief.

Although the U.S. Supreme Court has declared internet streaming startup Aereo illegal, this decision will not harm cloud computing technology, as many in that industry feared it would.

That’s because SCOTUS specifically said that the ruling doesn’t apply to other technologies "not before them" such as cloud computing.

The ruling only covers Aereo’s business model and technology. Aereo’s tech captured over-the-air broadcast TV signals and sent them over the internet to paid subscribers who could watch the shows, or record them on a cloud DVR to watch later. Those who create TV shows, and charge a lot of money to cable companies to carry them, argued that Aereo violated copyright law. The justices just agreed.

But the fear among other companies was that if Aereo lost, it could have unintended consequences for similar technology, such as cloud storage for songs and movies.

Cablevision was one of many companies that filed briefs with the Court, Engadget reported. It wrote that if Aereo lost, the decision could "attack the legal underpinning of all cloud-based services, everything from the Apple iCloud to Cablevision’s own remote storage DVR service."

But not only did the court say that cloud computing was not covered, it also said that there’s a difference between storing files in the cloud that a consumer has already "lawfully acquired" and paying Aereo to watch broadcast TV on the Internet.

Here’s the part of the Supreme Court ruling that exempts cloud computing from being impacted:

Further, we have interpreted the term “the public” to apply to a group of individuals acting as ordinary members of the public who pay primarily to watch broadcast television programs, many of which are copyrighted. We have said that it does not extend to those who act as owners or possessors of the relevant product. And we have not considered whether the public performance right is infringed when the user of a service pays primarily for something other than the transmission of copyrighted works, such as the remote storage of content. See Brief for United States as Amicus Curiae 31 (distinguishing cloud-based storage services because they “offer consumers more numerous and convenient means of playing back copies that the consumers have already lawfully acquired” (emphasis in original)). In addition, an entity does not transmit to the public if it does not transmit to a substantial number of people outside of a family and its social circle.

… We cannot now answer more precisely how the Transmit Clause or other provisions of the Copyright Act will apply to technologies not before us. We agree with the Solicitor General that “[q]uestions involving cloud computing, [remote storage] DVRs, and other novel issues not before the Court, as to which ‘Congress has not plainly marked [the] course,’ should await a case in which they are squarely presented.”

IBM’s (Not So) Secret Weapon: Hybrid Cloud Computing

Big Blue has been singing the blues a lot lately. As other tech names woo Wall Street with shiny gadgets and myriad online services, IBM (IBM) sputters.

IBM, perhaps the biggest player in the “Old Tech” space, saw its stock dip 2.1% last year just as the broad markets posted a blockbuster performance that included stellar results by a wide variety of technology stocks.

This year hasn’t been much better for the Armonk, NY-based heavyweight, as the firm nurses a year-to-date loss of 3.4% against 6% gains for U.S. equities. In fact, the stock has underperformed the S&P 500 every year since 2012 after beating the benchmark barometer by a wide margin in 2011, according to data from FactSet.

Brokerage houses aren’t exactly enthusiastic about IBM, either: Out of 21 analysts who have recently reviewed the company, 17 logged a “hold” rating, three notched a “buy” rating, and one suggested selling the stock, FactSet data show. On average, analysts expect sales to contract by 2.4% this year to $97.4 billion, compared to forecasts for revenue growth of 5.4% for tech companies included in the S&P 500.

New Entrants Cloud IBM’s Server Business

But IBM and its nearly half-a-million employees have a plan to get the company’s "cool" back. It all revolves around one somewhat wonky phrase: Hybrid cloud computing.

For many years, “cloud” has been the hip term in the computing world. It refers to the trend of storing and processing information across the Internet so it’s available at the snap (or tap) of a finger from just about any device. The exponential growth of smartphones and tablets, which have mostly uninterrupted data service but fairly shallow computing power, has helped the sector take off.

Companies big and small have looked to tap into the revolution. A few notable examples are Google (GOOGL), with its widely used cloud-based email and office apps; Amazon (AMZN) and Salesforce (CRM), with their enterprise platforms; and Akamai Technologies (AKAM), a lesser-known name that delivers 15% to 30% of Internet traffic.

The bottom line: the migration from firm-based IT assets to networks that increasingly tap shared computing power spread globally across the Internet has been a major headwind for enterprise-equipment producers.

Indeed, IBM’s systems and technology hardware sales plummeted 23% on a year-over-year basis to $2.4 billion in the first quarter, causing Big Blue to log a pre-tax loss of $660 million for the segment. At the same time, analysts at tech consultancy Gartner reckon IBM’s server market share narrowed to 19.8% on a revenue basis in the first quarter, from 25.5% in the same three months of 2013.

Conversely, IBM’s cloud sales swelled more than 50%, although the company didn’t offer specifics beyond saying the annual run rate was $2.3 billion. Cloud revenue totaled $4.4 billion in 2013, meaning it was a relatively small sales driver for the massive company.

Lofty Ambitions

But IBM is making a big play to gobble up a much bigger share of the burgeoning marketplace.

Phil Guido, general manager for IBM North America, who oversees the firm’s consulting, technology services, software, hardware and client financing portfolios, said Big Blue has already spent $10 billion on its cloud division. That includes a whopping $7 billion in acquisitions and $1.2 billion in datacenter rollouts.

The company sports some 40 datacenters around the world and employs 40,000 cloud-computing experts. Guido said data is “the next natural resource.” In the same way companies drill for oil and refine it into petroleum products, IBM thinks companies will want to mine data, store it and analyze it.

“We’re at an inflection point as the markets come together,” he said.

At face value, that doesn’t seem so different from what most of the market is pushing for: Life, increasingly, on the cloud. But what makes IBM’s pitch distinctive, and well tailored to a long-time leader in enterprise equipment, is the “hybrid” component.

Enter: Lance Crosby, CEO of SoftLayer, a company IBM bought in 2013 in a deal reportedly worth about $2 billion.

“Unlike our competitors, we don’t think everything is going to the cloud,” he said. “We think 50% to 80% will be going to the cloud.”

Crosby specifically targeted rivals like Google and Amazon, which he said are focused too closely on the consumer cloud and are missing opportunities in the enterprise space. It’s not a surprising stance for a company that has for many years been one of the biggest global server vendors.

He said many companies have already invested large amounts of money and resources in their own networking infrastructure, and are wary of moving everything to the cloud. Guido and Crosby cited a slew of concerns clients have brought to their attention, including: reliability, regulatory hurdles, security, and transparency.

IBM’s pitch to companies is this: Why not keep your most valuable and sensitive resources on a “private cloud” and move customer-facing applications, number-crunching programs, and anything that must be easily accessible to clients across the world to datacenters IBM manages?

Crosby said this style of cloud computing particularly resonates in the financial and health-care arenas – two sectors where data regulations are especially tight. Banks, for example, can continue processing transactions internally, so they maintain complete control over those highly secure systems, but can then move social-media projects, apps and other lower-security installations to the cloud, where it is much more cost-efficient to reach customers around the globe.
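The hybrid split Guido and Crosby describe amounts to a simple placement policy. As a loose illustration only – this is not IBM’s actual logic, and the function and field names here are hypothetical – it might be sketched like this:

```python
def place_workload(workload: dict) -> str:
    """Route a workload to the private or public cloud by sensitivity.

    Hypothetical sketch of the hybrid-cloud split described in the
    article: regulated or sensitive systems stay in-house, while
    customer-facing, lower-security work moves to managed datacenters.
    """
    if workload["regulated"] or workload["sensitivity"] == "high":
        return "private cloud"  # e.g. a bank's transaction processing
    return "public cloud"       # e.g. social-media apps, web frontends

# A bank's core ledger stays private; its marketing app goes public.
print(place_workload({"regulated": True, "sensitivity": "high"}))
print(place_workload({"regulated": False, "sensitivity": "low"}))
```

The point of the sketch is only that the routing decision is driven by regulation and sensitivity, not by technology: the same application could land on either side depending on the data it touches.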

Through its acquisition of SoftLayer, and other services, IBM has also devised systems that make it easy to connect the public and private clouds. For example, so-called APIs let companies easily apply cloud-computing technology to their existing programs.

IBM has also set out to offer a high level of transparency, security and reliability in its datacenters. Crosby said the datacenters are broken into “pods” of 5,000 servers, clustered in such a way that if there were a “catastrophic” event, it would affect only certain segments instead of cascading.
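Crosby’s pod design is essentially a blast-radius argument: partition the fleet so a failure stays contained. A minimal sketch of the idea – the 5,000-server figure comes from the article, but everything else here is illustrative, not SoftLayer’s actual scheme:

```python
POD_SIZE = 5_000  # servers per pod, per the figure Crosby cites

def pod_of(server_id: int) -> int:
    """Map a server to its pod; a pod-level failure leaves other pods intact."""
    return server_id // POD_SIZE

# Servers 0-4999 share pod 0, so a "catastrophic" event in pod 0
# touches none of the servers from 5000 upward (pod 1 and beyond).
print(pod_of(4_999))  # last server of pod 0
print(pod_of(5_000))  # first server of pod 1
```

Clustering by pod caps the damage of any single event at one partition of the fleet, which is the “certain segments instead of cascading” behavior described above.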

From a transparency perspective, Crosby said customers can access information right down to the serial number of specific hard-disk drives. “Our competitors are this black box where you don’t know what’s going on behind the scenes,” he explained. “We open up everything so you can see inside the cloud.”

So far, SoftLayer has seen at least some level of success under IBM’s umbrella. Crosby said the segment has added 3,000 customers since joining the blue-chip firm last year.

In the near term, IBM expects its cloud division to be a $7 billion unit by next year. It will likely also invest another $10 billion in its quest to control the cloud.

Still, Crosby has his sights set quite a bit higher: “We plan to be the biggest global player in cloud,” he said, adding that he thinks only five firms will ultimately control the macro-cloud landscape, with other companies creating more niche products.

Competitive Field

Donna Scott, a vice president and distinguished analyst at Gartner Research, agreed with IBM’s assessment that hybrid cloud computing is a big opportunity for the technology space as a whole, noting many of the established server vendors like Hewlett-Packard (HPQ), IBM and Dell are “definitely competing for new business” on that front.

“IBM has a whole lot of different entry points and as a large company they’re trying to exploit them all,” she added.

Part of that strategy, she said, involves trying to garner the attention of the developers and IT managers who will ultimately be utilizing the systems most directly. As an example, Scott pointed to an IBM system called Bluemix that enables programmers to create enterprise software without crafting the complex architecture that underlies the cloud.

IBM is also a leading member of a growing open-source cloud-computing software foundation called OpenStack. The group, founded by Rackspace Hosting (RAX) and NASA, aims to “enable any organization to create and offer cloud computing services running on standard hardware.”

Basically, OpenStack is looking to lower the barriers to entry for developers producing highly scalable cloud software. The organization boasts 9,500 individual members and sponsorships from hundreds of companies, including H-P and AT&T (T).

Scott said the standard could help lure developers, and thereby businesses, away from Amazon’s highly popular platform, Amazon Web Services, which has become one of the big services to beat.

Given that cloud, especially hybrid cloud, is so new, Scott said it remains to be seen who will emerge as the leader.

IBM, for its part, has its fair share of fans on the Street despite tepid analyst ratings. Indeed, Berkshire Hathaway chairman and CEO Warren Buffett told FOX Business in April that his sole vehicle for cloud exposure is Big Blue.

“That’s a fight that’s not over with yet,” Buffett, whose firm owns 6.8% of IBM and is its biggest shareholder, said, noting that “it’s not easy to figure out the effect [cloud] will have” on the industry as a whole.

Chris Grisanti, co-founder at registered investment advisor Grisanti Capital Management, said he also sees green ahead for Big Blue. Grisanti, whose main investment vehicle holds 19 stocks for an average duration of three years, said he doesn’t agree with the lukewarm marks most analysts pin on the stock.

“IBM is a great brand, it’s a great company,” he said.

Joyent Rolls Out Its Latest Private-Cloud-In-A-Box Offering. One To Add To The Thousand Or So Other Players

Joyent is an interesting company. While private cloud products from initiatives like OpenStack and CloudStack get lots of market buzz, and Amazon Web Services (AWS)-clone Eucalyptus flies the flag for an AWS-compliant private cloud, Joyent quietly goes about delivering services far and wide. And they really do go wide: Joyent offers compute, networking and appliance-based solutions, as well as Joyent Manta, a scalable, distributed object storage service. On top of all this, Joyent is the corporate backer of Node.js, the fast-growing open source server-side JavaScript project.

But today the news is all about private cloud software as Joyent rolls out version 7 of its SmartDataCenter product. SmartDataCenter is firmly positioned as the best of both worlds – it offers the compelling economics of more modular solutions from the likes of OpenStack, but with the benefits that a “one-stop shop” can deliver. Of course Joyent isn’t the only vendor thinking holistically. Red Hat, in an attempt to become the… errr, Red Hat of the cloud, has been on something of a spending spree of late. As well as investing heavily in its OpenStack offering, Red Hat has also acquired storage companies Gluster and Inktank to round out its cloud offerings.

Anyway, back to Joyent. SmartDataCenter is designed around lightweight, container-based virtualization. It’s also not an untested product – SmartDataCenter was born of Joyent’s use of the same software to power its own public cloud offering. In rolling it out as a product, Joyent is aggressively critiquing the other private cloud offerings. According to Bryan Cantrill, CTO at Joyent:

…private cloud hasn’t meaningfully made its way into the data center because the options are limited. Operators have to choose between a kit cloud that requires the team to spend months researching every decision before they see a result, or legacy enterprise software that views ‘cloud’ as a mere re-branding opportunity.

SmartDataCenter, if it wasn’t obvious before, is pitched as a complete solution. As well as the obvious cloud operating components, it includes such peripheral services as SSH key management, LDAP directory services, full identity and role-based management, and built-in firewall capabilities. And given that the same product already runs Joyent’s public cloud, SmartDataCenter is completely consistent between the public and private offerings. Thus the release of the product, as well as generating private cloud revenue, will conceivably increase Joyent’s public cloud market share. An interesting hybrid story thus emerges, one that OpenStack, a natural enough party to deliver a similar story, simply cannot yet match given the significant differences between the various OpenStack distributions.

Joyent is relatively unknown, at least to the vast majority of cloud adopters who are busy comparing Microsoft (MSFT), Google (GOOGL) and IBM SoftLayer with AWS. The release of SmartDataCenter could very well change that, especially for organizations looking for a rock-solid and completely seamless hybrid cloud story.