Recognizing the changing cloud

When early cloud adopters bought into massive, on-demand scale, they also assumed the responsibility of managing failure-prone commodity hardware. They had to adapt their applications to run in the cloud in addition to monitoring a complex new system. While curing some IT headaches, the public cloud created new ones, like:

Unreliability – Cheap, failure-prone commodity hardware forced businesses to overprovision cloud resources to plan for server failure.
Management – Businesses had to recruit experienced cloud engineers to tweak and tune cloud resources, handle monitoring and alert response, and manage the applications and workloads running on top of those resources.
Cloud sprawl – To get the additional services needed to run workloads, secure deployments and handle other cloud-related tasks, businesses needed multiple providers, leading to shadow IT, platform lock-in, billing confusion and service compatibility glitches.

The DIY nature of public cloud didn’t work for everybody. Some organizations wanted the speed-to-launch and flexibility of on-demand cloud resources, but they also needed round-the-clock support and architecture expertise. As Gartner describes in the Magic Quadrant for Cloud-Enabled Managed Hosting North America report, “Cloud-enabled managed hosting brings cloudlike consumption and provisioning attributes to the traditional managed hosting market.” Those early cloud adopters quickly found that managing a cloud can cause as many problems as it solves, but the benefits were too enticing to pass up. To mitigate the risks of the new technology, businesses sought out providers who could offer not only raw infrastructure, but also:

Support and expertise to manage and grow the environment in accordance with business strategy
Full-stack management that runs the underlying cloud resources and complex workloads, such as databases, ecommerce software, email, and other critical applications
Shorter lead times to deployment by integrating on-demand resources and implementing DevOps automation methods

Find out how this category change could impact your business and why Rackspace Managed Cloud was positioned the furthest for completeness of vision and ability to execute in the Leaders Quadrant in the North American and European categories.

Cloud Computing: IT’s Driving Again

Verizon’s second annual "State of the Market: Enterprise Cloud" report, released Tuesday, finds that 71% of businesses expect to have public-facing applications in the cloud by 2017.

One value of the Verizon report is its ability to track changes from year to year. For example, in 2013 it reported cloud storage was growing at a rate 2.5 times that of virtual machines deployed. Memory capacity, it said in 2013, was growing even faster, at 2.9 times the rate of VMs deployed, reflecting perhaps an increasing concern for performance. In the 12 months ended July 2014, storage per VM grew at six times the rate of VMs deployed, and memory capacity had grown at "a staggering 12 times" the rate of VMs deployed.

The Verizon report also is broader than vendor surveys that primarily reference their own customer base. While Verizon draws data from its own enterprise-level cloud customers, which produced 181 responses for this year’s report, it also included 451 Research’s 988 responses from cloud users (through May this year); The Harvard Business Review’s Analytic Services report, "Business Agility in the Cloud" in July, with 527 respondents; and Forrester Research, Accenture, IDC, and Gartner reports on cloud computing. As such, it tends to offer more of a composite picture than some surveys and reports.


And if it was once true that non-IT departments led a company’s decision-making on adopting cloud, that predominance of shadow IT is no longer the case. Not only has enterprise spend on cloud grown by 38% over the last year, but IT’s share of that spending now reaches 80% of the total. "Over half is managed directly by the CIO," said the report, borrowing that conclusion from a study by the 451 Research Group earlier this year. So much for concern about the chief marketing officer taking over the CIO’s budget.

Market researcher IDC supported that finding. Not only is spending on the cloud growing, but it represents a rapidly increasing part of the overall IT budget. Organizations adopting cloud services expect to spend "54% of their IT budget on cloud in the next two years," Verizon said, citing a recent IDC report.

The debate over public versus private cloud "is now moot," Verizon reported, without necessarily making it clear exactly what the outcome had been. Debating the merits of one versus the other "is insufficient to describe the complex decisions that companies are facing when selecting a delivery model," it said. In other words, companies don’t expect all their needs to be met by one or other. If anything, they’re mapping out a future that will include both. "We’ve seen the rigor of the selection process increase noticeably in the last 12 months," Verizon concluded, which suggests companies have a clearer idea of what they’re looking for as they venture into the public cloud.

Verizon also noted increased use of the term virtual private cloud, "indicating a blurring of the line separating public from private cloud." Virtual private cloud, however, usually means a more private implementation of enterprise servers inside a public cloud provider, although not necessarily in a multi-tenant environment.

While infrastructure-as-a-service is viewed as an implicitly efficient way of running computing, only 14% of those contacted said that saving money was a primary motive for using cloud computing. Rather, 32% said the primary perceived benefit of the cloud was agility, according to the report.

Commodity cloud is a highly visible subject in the marketplace, but "the most interesting work is happening on the various types of enterprise-class cloud available."

That appears to be Verizon’s own conclusion, as no source was named for it. Verizon earlier this year said it would continue to offer commodity cloud services, but it was putting more emphasis on enterprise-level cloud services. For example, in late April, it announced Secure Cloud Interconnect to give enterprise customers the ability to connect workloads in different clouds via a private-line IP service.

A surprising 41% of cloud users say they are now relying on public cloud services to run mission-critical workloads, whereas in the past they emphasized public-facing websites and informational services. By mission-critical, respondents mean things such as e-commerce for retailers.

CIOs Broker Cloud Services, Thanks to Shadow IT

More CIOs, emboldened by stronger internal processes and a maturing market for rented software, are working with their business peers to select cloud technology, says Verizon Communications Inc. Credit shadow IT with helping to trigger this shift, with CIOs realizing they’d better become proactive in choosing cloud services before the business beats them to it.

Seventy-one percent of 1,000 IT leaders surveyed said IT was primarily responsible for decisions about deploying cloud, with 80% saying they directed spending on it, according to research conducted by 451 Research for Verizon. But 38% of respondents said the business line was involved during the planning phase, suggesting that IT and the business are collaborating, said the report, which Verizon published Tuesday.

The evolution has been painful for IT leaders. In a Wall Street Journal story Monday, one IT executive compared the task of tracking unsanctioned cloud services to playing “Whac-A-Mole.”

Ironically, it was “shadow IT,” the term describing the implementation of cloud apps by business line leaders or rank-and-file employees without IT department consent, that helped spur CIOs’ adoption of cloud. CIOs previously avoided cloud because they were concerned about selecting a new third party, let alone entrusting their data to it, said Siki Giunta, a Verizon vice president of cloud services.

Shadow IT pushed CIOs to do their homework and become as proficient at selecting and vetting vendors as they are in purchasing hardware and software to run internally, said Ms. Giunta. CIOs have adapted their vendor-vetting procedures for the cloud, she said. They have learned to ask whether cloud services are appropriately configured, patched, and monitored, and how workloads are protected by firewalls. They want to know which of a vendor’s employees may access their data and how they will be indemnified in the event of a data loss.

Many organizations started to build this discipline in IT because they knew what to look for when consuming infrastructure and operational services, said Jeff Shipley, CIO of Blue Cross Blue Shield Kansas City, who built a private cloud and purchases software-as-a-service for human resources, credit card processing, and pharmacy benefits.

Vendors, eager to prove themselves in a burgeoning cloud market that IDC said could top $100 billion this year, are scrambling to meet CIOs’ requests. “CIOs have become… much more comfortable with being in a multi-source environment [of several cloud services],” said Ms. Giunta.

Now CIOs are also working with their business peers, such as chief marketing officers, to establish policies and procedures so they can jointly evaluate, select, implement and manage cloud services to meet their corporate priorities, said Jeff Kaplan, managing director of Thinkstrategies Inc., a boutique research firm that consults CIOs on cloud computing.

CIOs are creating IT service catalogs from which business leaders may select technology services, said Ms. Giunta. “Shadow IT… drove the comprehension of the new dynamic in the market,” said Ms. Giunta. “CIOs realize they can accelerate the delivery of IT services.”

IBM and SAP teaming up in the cloud

Services moving to the cloud need reliable infrastructure to run on, even if they’re owned by an industry behemoth like SAP. This week, the company announced that it has selected IBM as a premier strategic provider of cloud infrastructure services for its business-critical applications.

“We look forward to extending one of the longest and most successful partnerships in the IT industry,” said Bill McDermott, CEO of SAP. “The demand for SAP HANA and SAP Business Suite on SAP HANA in the cloud is tremendous and this global agreement with IBM heralds a new era of cloud collaboration. We anticipate customers will benefit from this collaboration and expansion of SAP HANA Enterprise Cloud.”

The partnership leverages IBM’s 32,000 SAP consultants worldwide, said Malcolm Smith, IBM Canada’s director of cloud computing. It will allow customers to quickly deploy SAP products in a pay-as-you-go model. SAP HANA and the SAP Business Suite are available now on the SAP HANA Enterprise Cloud.

“It lets organizations start small and scale up,” said Mr. Smith. “It’s an attractive model; it lets clients be more agile.”

“The unique point of the deal between IBM and SAP is its size,” noted Charles King, Principal Analyst, Pund-IT, Inc. “Both companies are major players in the enterprise IT space — some would say the biggest in their relative markets — and share numerous large customers that stand to benefit from the SAP HANA Enterprise Cloud. Given the magnitude of the agreement, plus the ongoing success of its SoftLayer cloud infrastructure services, IBM can rightfully claim to be cloud platform of choice for global enterprises.”

For companies concerned about the location of their data, the companies plan to have one or two instances in every country IBM operates in, Mr. Smith said. IBM has already deployed HANA in the cloud, so the appropriate infrastructure is already built. “One of the attributes of cloud that clients like is the ability to act quickly, and to scale up and down,” he said. “It means that IBM must demonstrate value and commitment on a daily basis.”

He said that as customers start moving HANA into the cloud, they have the opportunity to invest in other SAP applications as well. It is, he said, a “watershed moment” for companies who aren’t sure about HANA in the cloud.

“If a CIO’s organization is already a SAP customer or has any interest in trying out HANA, the new service is probably a no-brainer,” Mr. King added. “That’s because it offers an easy way to experiment with HANA and get many SAP applications and other workloads up and running on HANA with far less risk and CAPEX costs compared to buying/deploying a dedicated HANA system. If a CIO isn’t already working with SAP, it would be wise to look into SAP HANA Enterprise Cloud and see how its costs and benefits compare to his or her organization’s existing solutions.”

Microsoft sales beat Street hopes, cloud profits up

(Reuters) – Microsoft Corp (MSFT.O) reported higher-than-expected quarterly revenue, helped by stronger sales of its phones, Surface tablets and cloud-computing products for companies, while keeping its profit margins largely intact.

The results on Thursday allayed fears of investors in recent days that the industry shift toward lower-margin cloud services was proving hard for established technology leaders to master.

Microsoft shares, which have climbed 33 percent over the past year, rose another 3 percent in after-hours trading to $46.36.

"In light of recent negative earnings results from tech bellwethers Oracle, IBM, SAP, VMware, and EMC, Microsoft is bucking the trend and we would label these September results as a solid accomplishment," said Daniel Ives, an analyst at FBR Capital Markets.

Investors were keenly watching Microsoft after harsh warnings from International Business Machines Corp (IBM.N) and SAP (SAPG.DE) about operating profits as they make tentative inroads into the cloud, which generally yields thinner margins than technology companies are used to.

Microsoft did not disclose its cloud-based revenue for the fiscal first quarter, but said commercial cloud sales rose 128 percent, while sales of services based on its Azure cloud platform rose 121 percent.

Perhaps more importantly, it said gross profit margin in the unit that includes Azure rose 194 percent, despite rising infrastructure costs, which includes the huge expense of building and operating datacenters.

In the last four years, Microsoft’s gross profit margin has drifted down to about 65 percent from above 80 percent, largely due to its move into the less profitable business of making tablets and phones, but accelerated by the move to the cloud.

Nomura analyst Rick Sherlund figures Microsoft is on track to hit $6 billion a year in cloud revenue soon, which would make it the industry’s largest cloud vendor by his calculations. That represents only about 6 percent of overall expected revenue this fiscal year, but investors are highly sensitive to a business they see as key to the future.

"We’re the only company with cloud revenue at our scale that is growing at triple digit rates," said Chief Executive Satya Nadella on a conference call with analysts.

Nadella was keen to stress that Microsoft is more focused on selling higher-margin services via the cloud to its commercial customers rather than just storage and computing power. "Our premium services on Azure create new monetization opportunities in media, data, machine learning, fast analytics, and enterprise mobility," he said.


Microsoft’s fiscal first-quarter profit actually fell 13 percent, largely due to an expected $1.1 billion charge related to mass layoffs announced in July, which lopped 11 cents per share off earnings.

Including that charge, the world’s largest software company reported profit of $4.5 billion, or 54 cents per share, compared with $5.2 billion, or 62 cents per share, in the year-ago quarter.

Still, it easily beat Wall Street’s forecast of 49 cents per share, including the charge, according to Thomson Reuters I/B/E/S.

The charge resulted from Microsoft’s plan, launched in July, to cut 18,000 jobs, or about 14 percent of its workforce, with most of those cuts coming from its newly acquired Nokia phone business.

Revenue rose 25 percent to $23.2 billion, helped by the phone business it bought from Nokia in April, handily exceeding analysts’ average estimate of $22 billion.

Sales of its Lumia smartphones hit 9.3 million in the first full quarter since the close of the Nokia deal. Sales of the Surface tablet more than doubled to $908 million from $400 million in the year-ago quarter.

Coming of Age in Cloud Computing

Cloud computing isn’t merely changing the way much of the technology business works. Now it is changing itself, and putting even more computing power in more places.

On Monday, Microsoft, which operates one of the biggest so-called “public clouds,” or large and flexible computing systems available for remote rental, announced several changes to its data storage and processing services that will make them more powerful.

Microsoft also announced a partnership with Dell to sell a kind of “cloud in a box,” or hardware and software that creates a mini-version of Microsoft’s cloud, called Azure, inside a company.

The idea is that a company could work with its own version of Azure, then easily move up to the giant version Microsoft runs to handle big workloads. Hewlett-Packard may be after something similar with its effort to create a private-public cloud business based on the HP cloud, which is built on open source software.

What all of this means is that cloud computing, which makes it easier to tie more things to computers and to manage software, is starting to appear in even more forms and types. Across the major providers, including Google and Amazon as well as Microsoft, HP, and others, there appears to be an increasing trend toward offering more flexibility. Generally it’s done by abstracting what were functions of specialized hardware into more easily altered software.

Microsoft’s announcements this week came after the news last week that it would offer Windows Server technology on Docker, a fast-moving open source project (and start-up company of the same name) that takes cloud-type software abstractions even farther. Docker’s so-called “containers,” which were previously available on the Linux operating system, make it possible to build, deploy and update a software application anywhere in the world.

Adding to this confusing paradise of computing power, flexibility, and global software deployment, on Wednesday a company called Bracket Computing announced technology that makes it possible to run high-performance corporate computing systems across several public clouds at the same time.

While it now works only with different geographic locations inside the global cloud of Amazon Web Services, Bracket hopes eventually to enable companies to securely manage their computing across several public clouds at once. This kind of brokering, if successful, could mean further competition among the public clouds, either on price or service.

People who had worked with the Bracket system were impressed. “Even just with A.W.S., this is powerful,” said Frank Palase, senior vice president of strategy at DirecTV. “Abstracting over several cloud providers would mean we could have high levels of performance with no fear of outages,” since one system could be brought up if another failed.

It’s also possible that Bracket’s Computing Cell could hold containers, like Docker, inside its system.

For all the new terminology and hand-waving around these developments, at least one thing is clear: The cheap and easy cloud is also catching up in areas like reliability and management ease, where it has been criticized. Like all big computing trends, it has started rough, but it appears to be stabilizing and getting bigger.

IBM struggles to reinvent itself in an age of cloud

(Reuters) – When IBM Corp CEO Ginni Rometty was asked recently for her tips on how to transform companies, she spoke of "relentless reinvention" and not protecting the past.

Applying those precepts to IBM is proving particularly tricky as many of the company’s old-line businesses have already been shut down or sold, while the units that are supposed to push future growth, such as cloud and security, face stiff competition.

Earlier on Monday, IBM reported a marked slowdown in business in September and abandoned its 2015 operating earnings target.

The company also announced it will hive off its loss-making semiconductor unit to contract chipmaker Globalfoundries Inc.

Analysts and investors agree that relentless reinvention is something to strive for at Big Blue, but some of the moves tried by other old line technology companies – such as a split or a spinoff of weaker businesses – might not be tenable for IBM, which successfully reinvented itself into a services provider in the early 2000s.

"I don’t expect to see anything of that magnitude coming from IBM in part because they have systematically divested some of the businesses that were a drag on earnings in the last decade," said Charles King, principal analyst at research firm Pund-It in California.

Rometty, who took over as CEO nearly three years ago, has accelerated divestitures with the sale of IBM’s low-end server business to Lenovo Group Inc and its chip-making business to Globalfoundries Inc. Those moves eliminated two areas that were a drag on profits, but other problems remain.

Its storage and server hardware and enterprise software sectors are slumping, and the company faces growing competition in cloud computing from companies such as Inc.

"They’re on the wrong side of the IT spending food chain. All the growth is in the cloud," said Dan Ives, analyst at FBR, adding that mature companies have struggled the most with the shift to cloud.

International Business Machines Corp’s faster-growing cloud computing, mobile, business analytics, social, and security services contribute 25 percent of its revenue.

"It is not that they are making a lot of bad choices," said Scott Kessler at S&P Capital IQ.

"It is just that they are so big and so far along one path, that even if they make some good decisions in terms of investments and acquisitions, it seems like it is too little too late in many contexts."

Future spin-offs will likely be centered on its storage unit, analysts said.

The company is also facing criticism that it pursued buybacks at the expense of investment in new technology. IBM spent $13.5 billion to repurchase stock in the first nine months of the year, more than double its net income.

"The company has been using its cash to repurchase stocks and support its earnings per share, but that has severely weakened its balance sheet, which used to be one of the true strengths of the company," said Kessler.

The disappointing third-quarter results also raise questions about investors’ patience with Rometty. The CEO, who joined IBM in 1981, recently dispensed advice such as "never define yourself as a product" and "never protect the past" in a Fortune Magazine video interview.

Making such ideals into reality is proving tricky at IBM.

"When you miss expectations and come in with results the market is unhappy with, then that puts much more pressure on her, but she bought herself more time by selling off the micro-electronics business," said Rob Enderle, chief analyst at Enderle Group in California.

"The market today is different than the market that existed a decade ago. She is struggling to readjust to an ever more mobile and cloud business, which caught most of the industry by surprise," he said.

Cloud May Be The New Outsourcing, But The Same Due Diligence Must Apply

In times gone by, when enterprises sought to shave costs off non-core processes, or add capabilities not immediately available in their organizations, they turned to third-party providers. Now, they’re more likely to first look at cloud services before bringing aboard live partners. But this doesn’t mean it’s okay to sit back and let automation take over. Cloud is simply outsourcing in a new form.

This is not lost on the outsourcing industry itself, which is shifting its business model to cloud delivery. The cloud and outsourcing worlds have merged into one. (Jimmy Harris and Gavin Michael of Accenture pointed this out in an article a couple of years back.) Many outsourcing providers now offer strong cloud or SaaS offerings. ADP, for example, offers many key services such as payroll over the cloud. Tata Group’s Tata Communications division offers an infrastructure-as-a-service (IaaS) cloud computing service for IT shops. Accenture offers its cloud platform to clients. Infosys recently announced it has signed a partnership agreement with China’s Huawei Technologies Co Ltd to offer customers more cloud services, on the heels of recently expanding partnerships with Microsoft Corp and Hitachi Ltd. to expand its cloud offerings. HP’s services arm (incorporating the former EDS) offers HP Helion, a portfolio of cloud products and services that enable organizations to build, manage and consume workloads in hybrid IT environments.

Yes, it has become significantly easier and faster to sign on new services (literally with the swipe of a credit card) in the cloud era, versus the previous era when potential suppliers had to be sought through word of mouth, professional directories, and endless phone calls. Cloud services can be small-scale tactical resources meant to fill a small piece of a process. But the immediate accessibility of cloud services doesn’t mean organizations should be less careful about the reliability, security and after-sales service associated with cloud services they procure. The IBM Center for Applied Insights just published a brief study on the movement from traditional IT outsourcing arrangements to cloud-based interactions. What’s noteworthy is that aside from a few tweaks, the same due diligence applied to traditional outsourcing engagements should be applied to cloud engagements as well.

As the IBM study’s authors, Marsha Trant and Romala Ravi, put it, there are five rules of the road worth keeping in mind:

1) Know your traveling partners well
2) Protect your passengers
3) Keep everyone on course
4) Chart the itinerary together
5) Plan for arrival

It sounds very much like the best practices learned in the IT and business process outsourcing worlds. A company with experience in the outsourcing world should apply the same principles to cloud partners as well. For companies with little outsourcing experience, there are lessons that can be learned. Cloud engagements are, for all intents and purposes, electronic outsourcing.

CIOs I have spoken to who have satisfactory arrangements with cloud service providers take pains to get to know their partners, in terms of financial stability, staff members they will be dealing with, and future roadmaps. Some have even taken things a step further and have gotten involved with user councils or boards, to help move their cloud vendor in the direction they want to see. These are all the best practices essential for outsourcing arrangements, or for major software acquisitions. Cloud does not diminish or change any of these requirements — if anything, it makes them more essential. Not only do cloud partners need to be responsive with service calls or requests, but they also need to maintain 24×7 uptime and availability via their data centers.

If anything, Trant and Ravi point out, the move from traditional outsourcing to cloud means paying attention to more aspects of the arrangement. For starters, there is an abundance of choice in the cloud world. “Be prepared to invest more time and effort evaluating an ever-expanding roster of cloud providers and point solutions. Find out about potential providers – their platforms, development roadmaps, contingency plans and more. Talk to current vendors and pure cloud players, as well as to their clients.” Remember too that the cloud vendor engagement may not be limited to just one department. “Cloud providers are now courting the whole enterprise,” the study points out. This means IT leaders need to play a highly proactive role in serving as advisor and partner to various business units looking to adopt cloud services.

Be proactive about security as well — don’t rely on cloud vendors to deliver assurances. “Require audit reports and security assessments; confirm that security protocols are in place,” Trant and Ravi advise. “And leaving nothing to chance, set up ad hoc checkpoints.”

Expanding cloud-storage’s health-focused capabilities will help everyone

What cloud-based file-storage company Box set out to do for businesses in the mid-2000s, a little startup named MedXT wants to do for hospitals and clinics.

The two companies joined cloud-powered forces when Box acquired MedXT last week and are now working together on bringing the health industry to the cloud. The two companies have actually been working together for a year now, but Box’s recent announcement that it wants to get serious about industry bundles made it clear that the two companies should get hitched.

MedXT, which started less than two years ago, is “a big photo-sharing app for healthcare,” as co-founder Cody Ebberson told VentureBeat in an interview. In short, it has created rendering and viewing technology that supports two of the most popular file formats for medical imaging, DICOM and HL7. With MedXT’s tools, doctors and even patients can easily open and view these files, just as they would open a JPEG photo file, for example.

So what exactly does this marriage mean?

Well, obviously MedXT’s technology will be integrated into Box’s storage and sharing services. Although Ebberson declined to share too many details as to how exactly it will work, this will more or less mean that doctors and hospitals that use Box will be able to open, view, and work with medical images saved there, just as they do with traditional, non-cloud based viewers.

This will also open more doors for telemedicine — the diagnosis and treatment via telecommunications tools. Specialists miles and miles away can open MRI scans through Box and diagnose a patient from wherever they are. Remote villages can access expert opinions through some simple file-sharing. Smaller clinics that don’t have a full-time specialist can still help patients in need.


Actually, it could someday mean that all facilities outsource this to a specialized center, not only making facilities and processes more efficient, but also cutting costs — and everyone wins when you cut costs.

But on the most basic and immediate level, it could decrease the number of providers and programs that facilities have to purchase, use, and manage. That’s pretty much any IT head’s biggest wish.

Patients hugely benefit from these technologies as well. The most obvious gain is that they can now access their own medical imaging files, and much more easily than before. They can say goodbye to the films they don’t have any way to view or the CDs that tend to scratch.

But second opinions are perhaps one of the biggest doors that cloud-based, consumer-friendly technology like MedXT’s opens. There are virtually no barriers to patients and additional doctors being able to take a look at these digital medical images. Ebberson added that computer vision — computers processing and “understanding” images on their own — could eventually add to second opinions.

On a bigger scale, making medical images available through consumer-friendly cloud storage is one more step toward making patient records available to them. The Health Information Technology for Economic and Clinical Health Act, passed in 2009 by the U.S. Congress, has created an incentive for eligible doctors and hospitals to digitize their patient records by 2015 and receive “meaningful use” checks for doing so. After that, they’ll be penalized.

But of course, there are still hurdles and kinks. HIPAA compliance — the specific security measures for digital documents and communications that the health industry must take — can be tricky to understand and implement, especially for smaller practices. Not all cloud-based tools are currently HIPAA compliant either, making it harder for organizations to outfit their practices with new tools.

Even the general acceptance and adoption of cloud technology by the health care industry still has a long way to go. Still, Ebberson is optimistic and says it’s come a long way in the last couple of years.

But new tools and technologies are moving in, and digital files and data are getting more and more attention from the health industry and even patients, a movement we’ll be discussing at HealthBeat a couple of weeks from now.

Wave Analytics Cloud: Pros & Cons

Will Salesforce’s new Wave Analytics Cloud cause ripples in the enterprise tech sector, or will it be a tsunami?

Make no mistake: Wave is a big bet for Salesforce, and it’s not some lightweight tool that was thrown together as brochureware for Dreamforce. At least two years in the making, Wave is a cloud-based data platform as well as a data-analysis front end, and it’s designed to analyze not just Salesforce sales, service, and marketing data, but also any third-party app data, desktop data, or public data you care to bring into the mix.

Salesforce isn’t coming out of nowhere with this platform. Alex Dayon, the company’s president of products, was a co-founder of BusinessObjects, and more than two years ago he hired Keith Bigelow, a BusinessObjects veteran, to start work on an analytics cloud. Things really got rolling in June 2013 when Salesforce acquired EdgeSpring, an innovative analytics startup, and hired its CEO and founder, Vijay Chakravarthy, to become chief product officer for analytics.

There are dozens of BI and analytics vendors that have been trying to democratize this technology for a lot longer than two years. But with more than 100,000 customers and roughly 25 million users, Salesforce is better positioned than any startup, and many incumbents, to succeed in what is really a new market for analytics and BI — one in which the data is often managed in the cloud and analyses are heavily consumed through mobile devices.

Given the platform behind it, the Wave Analytics Cloud already has plenty of pent-up demand. It certainly didn’t hurt to be launched at a Dreamforce event that Salesforce says had 150,000 attendees and another 3 million tuning in online. But the ultimate success of Wave will depend on how well it exploits and overcomes these six pros and cons.

Pro: It’s a secure, cloud-based platform.
Lots of competitors are now casting aspersions on Wave, summing it up as a simple data-visualization tool that’s just for sales data. That’s dead wrong. This cloud-based platform encompasses a back-end data-management service plus developer/power user and end-user-facing query-and-analysis "lenses" for data exploration and dashboards for persisted reporting and key performance indicators.

As a data platform, Wave’s starting point encompasses all the sales, service, and marketing data that Salesforce customers generate. It can take advantage of Chatter collaboration, enrichment data, and Radian6 social data. The system picks up its security-and-access controls hierarchies from the Salesforce platform.

According to GE Capital’s Eric Johnson, a VP of commercial sales who joined a Wednesday keynote session at Dreamforce, Wave was able to meet rigorous data-security and data-privacy requirements. That included the ability to analyze confidential data that had to remain on GE’s side of the firewall, he said.

Pro: The underlying database is flexible.
Wave is based on a key-value-store, NoSQL database. Because there are no predefined schemas or cubes or requirements to conform all data to a fixed model, you can quickly bring any third-party app data, public data, or desktop data into the data store. The hardest part is extracting data from legacy systems and filtering or transforming it, if required. Salesforce does not have its own data-integration tools, but there’s a long list of integration partners, including Dell Boomi, IBM CastIron, Informatica, and others.

This schema-on-read approach, which is widely being adopted in big data circles, gives analysts and end users the ability to ask any question, not just those baked into a predefined data model that takes months to develop and days or weeks to change after the fact.
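To make the schema-on-read idea concrete, here is a minimal conceptual sketch in Python. The records and field names are hypothetical, and this illustrates only the general approach — not Wave’s actual storage engine: data with differing shapes is stored as-is, and structure is imposed only when a question is asked.

```python
# Conceptual sketch of schema-on-read: heterogeneous records are stored
# without a predefined schema, and structure is applied at query time.
# Hypothetical data; not Wave's actual implementation.
records = [
    {"source": "crm",    "account": "Acme",  "amount": 50000, "stage": "closed"},
    {"source": "csv",    "account": "Beta",  "amount": 12000},   # no "stage" field
    {"source": "public", "account": "Gamma", "region": "EMEA"},  # no "amount" field
]

# Ask a question that was never modeled up front: total amount per account,
# tolerating records that lack the field entirely.
totals = {}
for rec in records:
    totals[rec["account"]] = totals.get(rec["account"], 0) + rec.get("amount", 0)

print(totals)  # {'Acme': 50000, 'Beta': 12000, 'Gamma': 0}
```

With a fixed schema, the third record would have been rejected or forced into a rigid model months earlier; here it simply contributes nothing to a measure it doesn’t carry.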

Pro: It’s mobile first.
Wave was designed first and foremost for smartphone and tablet interaction, but of course there are also designer- and power-user-oriented Web interfaces for laptops and desktops. Device support will follow Salesforce’s overall pattern, with hybrid apps blending native and HTML5 capabilities for iOS, Android, and (eventually) Windows devices.

BI vendors have been noodling around with mobile BI for years, and it looks like Salesforce is standing on the shoulders of their successes (as well as its own experience with other apps) rather than just shrinking the same interfaces down to smaller form factors. Wave demos featured distinct experiences on smartphones, tablets, and desktops that exploited the strengths of each form factor.

To give you a flavor of the analyses, GE Capital is already using Wave to give its salespeople and executives mobile, tablet, and desktop dashboards and visualizations detailing lending trends, loan cycle times, and loan-conversion ratios, by time, by region, and by salesperson, and they can also do root-cause analysis of lost deals, Johnson said. Blue Cross/Blue Shield, another development partner, has deployed Wave-based analyses to a sales team that has grown 25 times over the last two years. Executives get analyses of cost-per-sale, distribution mix, and administrative expense ratios.

Con: It’s expensive.
Salesforce says Wave "Explorer" business-user subscriptions will be $125 per user, per month, while "Builder" admin/power-user subscriptions will be $250 per user, per month. That’s the front-end expense. Customers will also pay $40,000 per company, per month for the back-end infrastructure (that part wasn’t mentioned during my first briefing on Wave). Salesforce co-founder Parker Harris and products president Alex Dayon said this pricing is based on focus-group feedback and market analysis, but there’s no way these prices will fly.

More than likely these are trial-balloon figures that won’t even be close to what large enterprises will pay. Big companies are the real target for Wave, at least initially, and they tend to negotiate all-you-can-eat enterprise deals. But if Wave is truly about democratizing BI and analytics and bringing it to every employee, it can’t cost more than the Sales Cloud and Service Cloud services combined. I’m already hearing off-the-record comments from midlevel Salesforce executives who hint that analytic apps and services will somehow be exposed in a more affordable way.

Con: Data-analysis capabilities aren’t well known.
With NoSQL on the back end of Wave, it’s not entirely clear just what latitude customers will have to support the analyses they need. Salesforce says its home-grown database has a Salesforce Analytic Query Language (SAQL) available for developers and administrators to set up analyses for users. But if SAQL is anything like the other SQL-like languages seen in the big data world, it will have to mature considerably to approach the capabilities that SQL has accumulated over 30 years.

It’s true that sometimes simpler is better, and the saved lenses and dashboard views and ad-hoc "group, measure, filter, view," and "action" options may cover a lot of needs. But this is a V1 product, and it will undoubtedly take some time to mature.
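The "group, measure, filter" exploration pattern described above can be sketched generically. This is a hedged illustration in Python with hypothetical field names and data — it is not SAQL, and the `explore` helper is invented for this example — but it shows why a handful of composable operations can cover a surprising range of ad-hoc questions.

```python
# Generic sketch of the "group, measure, filter" exploration pattern.
# Field names, data, and the explore() helper are hypothetical; not SAQL.
deals = [
    {"region": "West", "rep": "Ana", "amount": 120, "won": True},
    {"region": "West", "rep": "Bo",  "amount": 80,  "won": False},
    {"region": "East", "rep": "Cy",  "amount": 200, "won": True},
    {"region": "East", "rep": "Dee", "amount": 50,  "won": True},
]

def explore(rows, group_by, measure, predicate=lambda r: True):
    """Filter rows, group by a field, and sum a measure per group."""
    out = {}
    for r in rows:
        if predicate(r):
            out[r[group_by]] = out.get(r[group_by], 0) + r[measure]
    return out

# "Won revenue by region" asked ad hoc, with no predefined cube:
print(explore(deals, "region", "amount", lambda r: r["won"]))
# {'West': 120, 'East': 250}
```

Swapping the grouping field, measure, or filter reframes the question instantly — the essence of the lens-style exploration Wave is pitching.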

Con: It will roll out sloooooowly.
Wave was demonstrated live at Dreamforce, it can be downloaded and demoed from the Apple App Store, and it will become generally available, technically, on October 20. But don’t count on using Wave next week, next month, or, if you’re from a small or midsized company, next year. Salesforce is still rolling out back-end infrastructure for Wave across its data centers. Harris and Dayon said Salesforce will bring customers onto Wave in a methodical, phased approach — starting with large enterprises — to make sure that the platform lives up to performance expectations.

Once you’re up and running on Wave, Dayon suggested customers will simply redirect existing ETL processes and point them at the Wave API. "Off you go; you’ll have your users on mobile overnight with the flip of a switch." I doubt it will be quite that easy, and for the bulk of Salesforce customers, it’s a safe bet that Wave won’t be broadly available anytime soon.

The impact of Wave is hard to gauge at this early stage, but there’s no doubt it will ultimately be a very good thing for customers, delivering new capabilities and stoking competition. Partners like Aptus, C9, FinancialForce, Fliptop, Informatica, Kenandy, Snap Logic, and Xactly are counting on extending the Wave platform. Industry challengers including Adaptive Insights, Attensity, Birst, Data Hero, and BeyondCore are throwing cold water on Wave (some with false and misguided information), which leads me to believe it’s a big threat.

As for the BI incumbents, from SAP, IBM, Oracle, and Microsoft to Tableau and Qlik, Wave will be less of an immediate threat, but those players better not stand still. Wave just might stimulate improvements in data-management flexibility, data-analysis simplicity, and application-embedded decision support that are long overdue. If customers see a better way, they’ll use it or demand it from their incumbent vendors. Everybody wins if we finally get faster, simpler, data-driven insight for all.