Alphabet studies how deep learning could predict heart disease risk | Healthcare Dive

Researchers at Google parent Alphabet’s research arm used a deep learning algorithm to speed assessments of patients’ cardiovascular risk factors from eye scans.

  • Researchers at Alphabet and its research arm Verily Life Sciences have found a way to predict a person’s cardiovascular risk factors using eye scans and deep learning, according to a study published this week in Nature Biomedical Engineering.
  • By analyzing scans of the retinal fundus, the tissue at the back of a patient’s eye, the company’s software can tease out data such as blood pressure, age and whether an individual smokes — all potential risk factors for having a major cardiac event.
  • The algorithm could speed assessments of patients’ cardiovascular risk, but more testing is required before it can be used in a clinical setting.

Dive Insight:

“Using deep learning algorithms trained on data from 284,335 patients, we were able to predict CV risk factors from retinal images with surprisingly high accuracy for patients from two independent data sets of 12,026 and 999 patients,” Dr. Lily Peng, product manager of the Google Brain Team and a co-author of the study, wrote in the Google Research Blog.

“For example, our algorithm could distinguish the retinal images of a smoker from that of a non-smoker 71% of the time, compared to a ~50% (i.e. random) accuracy by human experts,” she said.
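The “71% of the time” figure is best read as an AUC-style ranking statistic: the probability that the model scores a randomly chosen smoker’s retinal image above a randomly chosen non-smoker’s, where random guessing gives 0.5. A minimal sketch of that statistic, using made-up scores rather than anything from the study:

```python
# AUC as a pairwise ranking probability: the fraction of
# (positive, negative) pairs the model orders correctly.
# All scores below are synthetic, for illustration only.

def auc(pos_scores, neg_scores):
    """Fraction of (pos, neg) pairs ranked correctly; ties count as half."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos_scores
        for n in neg_scores
    )
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical "probability of smoker" outputs for known smokers/non-smokers.
smokers = [0.9, 0.7, 0.6, 0.4]
non_smokers = [0.8, 0.5, 0.3, 0.2]

print(auc(smokers, non_smokers))  # → 0.75: better than chance
print(auc(smokers, smokers))      # → 0.5: indistinguishable groups
```

An AUC of 0.5 corresponds to the ~50% “random” human-expert baseline the article mentions; 0.71 means the model ranks a smoker above a non-smoker about 71% of the time.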

The results also show “strong gender differences” in the fundus images that could help guide research on the differences in male and female eyes, as well as how cardiovascular disease or risk factors affect retinal health, according to the study.

Last year, Verily began recruiting 10,000 volunteers to help build a comprehensive database of biometric data. The company is also working with French drugmaker Sanofi to develop tools for diabetes management and with 3M on solutions for population health.

This latest study could have implications for treating patients. Cardiology has been trending toward population health, and tools such as this new algorithm could help identify and inform care paths for patients.

Source: Alphabet studies how deep learning could predict heart disease risk | Healthcare Dive

Practice Fusion wants to start charging doctors, sources say

Practice Fusion is scrapping free software model after agreeing to sell to Allscripts

Practice Fusion is planning to start charging doctors to use its software, sources say. The change comes weeks after Practice Fusion agreed to a disappointing $100 million sale to Allscripts. Practice Fusion has struggled to build a growing business model based on ads.

After more than a decade in the market with a free product, Practice Fusion has plans to start charging doctors.

Six weeks after Practice Fusion agreed to sell itself to Allscripts for a fraction of its prior valuation, the medical software company is scrapping the business model that propelled it to unicorn status.

Practice Fusion gained traction by offering free electronic health records software to doctors — as an alternative to the expensive systems from big vendors — and the company made money by serving relevant pharmaceutical ads to its users.

But Practice Fusion recently started notifying customers that, beginning this summer, the service will convert to subscription payments and cost $100 per physician per month, according to two sources familiar with the matter who asked not to be named because the change hasn’t been made public.

It’s a massive shift for a company whose founder and ex-CEO preached about the virtues of a free product and promised that it would never cost money for users. Ryan Howard, who was ousted in 2015 after the company missed financial targets, told Medgadget two years earlier that “Practice Fusion will always be free.”

The product proved to be a particular favorite among small physician groups, like primary care doctors and dermatologists, and the company said that its user base has grown to 100,000 health-care professionals. One industry publication called it the “poster child” of free platforms.

In a statement to CNBC, a Practice Fusion spokesperson said that as part of its mission the company has “been offering some features and services to our customers at no cost while other solutions and services offered do involve reasonable prices,” and that a change is on the way next month.

“We have a product announcement upcoming in early March, and we look forward to sharing it further with you and all of our stakeholders very soon,” the company said.

Practice Fusion has had a rough start to 2018. In January, the company said it was being acquired by Allscripts for $100 million. That’s about one-fifteenth its expected valuation in 2016, when it reportedly hired J.P. Morgan to explore an IPO.

Soon after the acquisition was announced, CNBC reported that top executives pulled in millions of dollars as part of a pre-arranged deal, while common shareholders were wiped out.

‘Evaluate their options’

During its growth years, Practice Fusion benefited from legislation passed in 2009 that incentivized the medical community to move from paper to digital records.

The market exploded with dozens of medical records vendors, but most charged subscription fees for the service and additional expenses to upgrade. Epic and Cerner have captured the top end of the market, which includes academic teaching hospitals, while Practice Fusion and a handful of others compete for the smaller physician groups.

Industry experts including Ken Comee, CEO of rival CareCloud, said the change could be a boon for other vendors that target independent practices.

“Maintaining the customer base could be a challenge because they’re charging for something that was once free,” Comee told CNBC. “It might encourage doctors to evaluate their options.”

Source: Practice Fusion wants to start charging doctors, sources say

Cloud infrastructure services, find a niche or die?

Back in May it was reported that Morgan Stanley had been appointed to explore options for the sale of hosted services provider Rackspace. Business Week reported the story on May 16th under the headline “Who Might Buy Rackspace? It’s a Big List”. 24/7 WALLST reported analysis from Credit Suisse that narrowed this to three potential suitors: Dell, Cisco and HP.

To cut a long story short, Rackspace sees a tough future competing with the big three in the utility cloud market: Amazon, Google and Microsoft. Rackspace could be attractive to Dell, Cisco, HP and other traditional IT infrastructure vendors that see their core business being eroded by the cloud and need to build out their own offerings (as does IBM, which has already made significant acquisitions).

Quocirca sees another question that needs addressing. If Rackspace, one of the most successful cloud service providers, sees the future as uncertain in the face of competition from the big three, then what of the myriad of smaller cloud infrastructure providers? For them the options are twofold.

Be acquired or go niche

First achieve enough market penetration to become an attractive acquisition target for the larger established vendors that want to bolster their cloud portfolios. As well as the IT infrastructure vendors this includes communications providers and system integrators.

Many have already been acquisitive in the cloud market. For example, the US number three carrier CenturyLink bought Savvis, AppFog and Tier-3, and NTT’s system integrator arm Dimension Data added to its existing cloud services with OpSource and BlueFire. Other cloud service providers have merged to beef up their presence, for example Claranet and Star.

The second option for smaller providers is to establish a niche where the big players will find it hard to compete. A number of cloud providers are already doing quite well at this, relying on a mix of geographic, application or industry specialisation. Here are some examples:

Exponential-E – highly integrated network and cloud services

Exponential-E’s background is as a UK-focused virtual private network provider, using its own cross-London metro network and services from BT. In 2010 the vendor moved beyond networking to provide infrastructure-as-a-service. Its differentiator is to embed this into its own network services at network layer 2 (switching, etc.) rather than at higher layers. Its customers get the security and performance expected from internal WAN-based deployments, which cannot be achieved for cloud services accessed over the public internet.

City Lifeline – in finance latency matters

City Lifeline’s data centre is shoe-horned into an old building near Moorgate in central London. Its value proposition is low latency: it charges a premium over out-of-town premises for its proximity to the big city institutions.

Eduserv – governments like to know who they are dealing with

For reasons of compliance, ease of procurement and security of tenure, government departments in any country like to have some control over their suppliers; this includes the procurement of cloud services. Eduserv is a not-for-profit long-term supplier of consultancy and managed services to the UK government and charity organisations. To help its customers deliver better services, Eduserv has developed cloud infrastructure offerings out of its own data centre in the central-south UK town of Swindon. As a UK G-Cloud partner it has achieved IL3 security accreditation, enabling it to host official government data. Eduserv provides value-added services to help customers migrate to cloud, including cloud adoption assessments, service designs and ongoing support and management.

Firehost – performance and security for payment processing

Considerable rigour needs to go into building applications that process highly secure data for sectors such as financial services and healthcare. This rigour must also extend to the underlying platform. Firehost has built an IaaS platform to target these markets. In the UK its infrastructure is co-located with Equinix, ensuring access to multiple high-speed carrier connections. Within such facilities, Firehost applies its own cage-level physical security. Whilst the infrastructure is shared, it maintains the feel of a private cloud, with enhanced security through protected VMs with a built-in web application firewall, DDoS protection, IP reputation filtering and two-factor authentication for admin access.

Even for these providers the big three do not disappear. In some cases their niche capability may simply see them bolted on to bigger deployments, for example a retailer off-loading its payment application to a more secure environment. In other cases, existing providers are starting to offer enhanced services around the big three to extend in-house capability; for example, UK hosting provider Attenda now offers services around Amazon Web Services (AWS).

For many IT service providers, the growing dominance of the big three cloud infrastructure providers, along with the strength of software-as-a-service providers such as NetSuite and ServiceNow, will turn them into service brokers. This is how Dell positioned itself at its analyst conference last week; of course, that may well change if it bought Rackspace.

Intel expands custom chip work for big cloud providers

The enterprise IT world is moving toward an open, multi-cloud model, but a clear definition of just what multi-cloud means is still a question mark.

Speaking at Gigaom Structure 2014 in San Francisco on Thursday, Pivotal president and head of products Scott Yara acknowledged that the tech industry at large is still trying to converge on what this definition might be.

Yara suggested that the future of multi-cloud starts with looking closer at cloud-based computers around the world and unlocking the power presented by new, modern hardware.

Platform-as-a-Service, he continued, serves as the foundation for a common abstraction for cloud-based computers.

When Pivotal meets with customers big and small, Yara said a lot of them want to take advantage of cloud-based computing but still aren’t sure how to do it.

“A lot of these cloud-based data services are going to need to be built close to where the data is generated,” Yara predicted, positing how the multi-cloud approach simply provides businesses with more options.

The end result is to enable enterprises to build up to the data or build down to it, depending on what the use case and requirements might be.

It is here where Pivotal comes in, he suggested, framing it as a “unique company” given that it was spun out from VMware and its parent company EMC more than a year ago.

The San Francisco-based software firm launched with a big data suite combining the powers of the Greenplum analytic database system, Hadoop-based data platforms, and GemFire, a scale-out data transaction database with real-time processing.

“The interesting thing about Pivotal is just how new it is,” Yara remarked. “We’re still very early in the journey of building this company.”

Nevertheless, Yara reiterated Pivotal’s commitment to the open source community, touting the achievements by other industry players. He also described how we’re seeing everyone from Amazon to Google to Rackspace, among others, innovate in order to unlock “big pools of computing,” such as memory and storage.

Pointing toward an industry competitor but also a partner on General Electric’s Industrial Internet cloud, Yara praised Amazon especially for doing a great job in elevating the cloud-based computing market, noting that the “breadth of AWS capabilities is broad.”

At the same time, Yara stressed that the open source Platform-as-a-Service Cloud Foundry is getting us to the open cloud world “that we all want.”

Adobe Creative Cloud 2014: It’s all about apps and APIs

In a recent briefing to prepare us for the annual barrage of Adobe announcements, one of Adobe’s technology development people said of the company’s app development, "These early experiments…are one of the first stages of constructing the core products. Sooner or later, mouse and keyboard won’t be enough in Photoshop, Illustrator and InDesign. We’re trying to prepare ourselves for when that entire market gets disrupted."

To me, that’s the theme of this year’s Creative Cloud extravaganza. Yes, Adobe’s offering up the usual updates to the desktop applications, major and minor. But we’re also seeing a company whose success has been built on monolithic, computationally intensive applications struggling to address the changing needs of its core base of imaging professionals — whose need for lightweight mobile apps is growing rapidly.

The launch of CC in 2011 was the first step down that road, building an infrastructure to connect all the as-yet-unforeseen tools. This year’s announcements display a strategic commitment to underlying technologies, but in my opinion, its actual mobile apps still convey a sense of confusion rather than clear purpose.

The company’s delivering its first hardware products plus two new drawing apps (Ink and Slide), a brand-new Photoshop app variation (Photoshop Mix), an iPhone version of Lightroom, and a brand-new Creative software development kit (SDK) designed to help third parties hook into remote processing for advanced imaging capabilities. In general, the apps and hardware fall rather flat for me, and mostly seem to serve the purpose of attracting developers to its little corner of the world.

I think part of the lack of cohesion I sense is a result of the development incentives of the subscription system. During the briefing for Ink and Slide, the result of the company’s Projects Mighty and Napoleon, the assembled group asked questions about possible new features or extensions that we might see for these products. And all of the answers began with, "Well, if it’s successful…" So I asked: what’s your metric for gauging success? The blunt reply: if it drives subscriptions. What I see is Adobe throwing apps at the wall to see if any stick.

And they’re all iOS apps. On one hand, it makes sense for the company to devote its resources to a platform that most of its existing users are on. But as a multiplatform user — I have an iPad and an Android phone — I feel the loss. There’s now a Creative Cloud app for iOS that lets you manage your profile and assets similar to the way you can with the desktop application.

Telecom NZ buys cloud desktop specialist Appserv

Telecom New Zealand has inked a conditional agreement to buy cloud services specialist Appserv.

Telecom chief executive Simon Moutter said that the acquisition provides a major new piece in the "cloud jigsaw" after significant investment across the business in cloud services.

“We see this as a strategic business move giving the Telecom group a very strong New Zealand cloud services portfolio. The acquisition of Appserv is an excellent complement to the purchase in 2013 of Revera and the ongoing investment we are making in data centres,” Moutter said.

Telecom Retail CEO Chris Quin said thousands of businesses already supported by Telecom and its ICT services arm, Gen-i, want to take advantage of the cloud but need a hand to get there.

Auckland-based Appserv has capability in providing cloud and desktop ‘as-a-service’ offerings, with customers across New Zealand, Telecom said.

“Revera is a cloud computing specialist that provides cloud solutions predominantly for government and corporates, while Appserv brings specialist cloud desktop-as-a-service solutions for a range of business sizes, particularly small to medium sized," Moutter said.

Telecom bought Revera for NZ$96.5 million last year.

Moutter said Appserv adds to Telecom’s existing strengths in cloud infrastructure, mobility, managed ICT and platform as a service.

"Gen-i is making big strides in all these areas. We have secured a number of significant new IT as-a-service contracts. We have added new or imminent data centre business totalling in excess of 250 racks since the beginning of 2014. Revera is continuing to grow strongly and perform well," he said.

Revera opened a new datacentre in Wellington last year that Moutter said is now 100% allocated. A third centre is due to open in October. In Christchurch, a new green-fields centre opened in August and is over 30% full while in Auckland, a new high resiliency datacentre in Takanini is due to open in October, with 38% of capacity already allocated.

Telecom is about to embark on a brand change to "Spark". Auditions are being held for a major TV ad campaign and agencies across the country are preparing to bid for work on the supporting campaign.

Enterprise spend on public cloud IaaS to reach $650m by 2018

Twice as many enterprises adopted infrastructure-as-a-service (IaaS) in 2013 as in the previous year, according to a new study by Telsyte.

The Telsyte Australian Infrastructure and Cloud Computing Market Study 2014 showed that more than half of all organisations with more than 20 employees are now using public cloud IaaS for at least some part of their IT infrastructure, indicating that cloud computing within Australian enterprises is "skyrocketing".

The study forecasts the total market value for public cloud infrastructure services to reach AU$650 million by 2018, up from AU$305 million in 2014.
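Those two figures imply roughly 21% compound annual growth; a quick back-of-the-envelope check using only the numbers quoted above (AU$ millions):

```python
# Implied compound annual growth rate (CAGR) for the Telsyte forecast:
# AU$305m in 2014 growing to AU$650m in 2018, i.e. over four years.
start, end, years = 305.0, 650.0, 4

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # → 20.8% per year, compounded
```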

Telsyte senior analyst Rodney Gedda said that, with cloud services presenting a low barrier to entry for IT infrastructure, penetration among organisations is growing strongly.

"There are a few key things driving this. One is the plain volume of organisations that will subscribe to cloud services that aren’t already. The second thing is cloud penetration within the organisation will increase, so organisations that are already using cloud services will spend more," he said.

"The third thing is the value of enterprise services in the cloud will expand. For example, if a company is running database workloads and billing systems on-premise, to move them into the cloud might be more of an exercise than just spinning up a VPS (virtual private server) with a credit card, so it’s a higher-value sell."

The study also sets out to dispel ongoing discussions that Australian organisations are being prevented by corporate and legal policies from using offshore cloud services. The Telsyte research indicates two-thirds of businesses that use the cloud are already using an offshore provider. Furthermore, 46 percent of CIOs said they are not subject to any restrictions on the use of offshore cloud services.

"The multinationals have made inroads into the local cloud market and local providers will need to compete on features and service levels, and not simply the fact that data is hosted in Australia," said Gedda.

The research identified there’s a growing trend in the adoption of a hybrid cloud model, which is expected to be in use by some 30 percent of enterprises by 2018. But on-premise IT is still considered a favoured choice. In fact, the study showed more organisations are implementing and considering private clouds, and are using virtualisation technology to implement a cloud architecture under their control.

"We can’t forget most of IT still runs on-premise, so enterprise IT is already an existing investment in service, storage, and infrastructure," Gedda said.

"For an organisation that has quite important workloads, they can’t just toss everything out and start again in the cloud, so the hybrid approach is a midpoint between organisations who want to keep key applications within the company’s boundaries, and between looking at moving some workload that may deem to be not as business critical, but they need to take advantage of the cloud benefits which are flexibility and agility."

Gedda concluded that there is now a "healthy range" of private cloud management options available to organisations that are looking to replicate the "scalability and manageability of public clouds with their own servers".

Microsoft Announces Azure ML, Cloud-based Machine Learning Platform That Can Predict Future Events


Microsoft has been on quite a cloud roll lately and today it announced a new cloud-based machine learning platform called Azure ML, which enables companies to use the power of the cloud to build applications and APIs based on big data and predict future events instead of looking backwards at what happened.

The product is built on the machine learning capabilities already available in several Microsoft products, including Xbox and Bing. Using predefined templates and workflows, it has been designed to help companies launch predictive applications much more quickly than traditional development methods allow, even letting customers publish APIs and web services on top of the Azure ML platform.

Joseph Sirosh, the corporate vice president at Microsoft in charge of Azure ML, who spent years at Amazon before joining Microsoft to lead this effort, said the platform enables customers and partners to build big data applications to predict, forecast and change future outcomes.

He says this ability to look forward instead of back is what really stands out in this product.

“Traditional data analysis lets you predict the future. Machine learning lets you change the future,” Sirosh explained. He says by allowing you to detect patterns, you can forecast demand, predict disease outbreaks, anticipate when elevators need maintenance before they break and even predict and prevent crime, as just a few examples.

Sirosh says the cloud really changes the dynamic here because it provides the ability to scale, and the service takes care of much of the heavy lifting that would have taken weeks or months for companies trying to do it themselves in-house in a data center.

“The cloud solves the last mile problem,” Sirosh explained. Before a service like this, you needed data scientists to identify the data set and then have IT build an application to support it. This last part often took weeks or months to code and engineer at scale. He says Azure ML takes that process and provides a way to build the same application in hours.

What’s more, it supports more than 300 packages from R, the popular open source project used by many data scientists.

Sirosh says the hope is that as more people use the platform and generate APIs and applications, they create what he called “a virtuous cycle between data and APIs.” “People have data. They bring it to [Azure ML] to create APIs. People hook into applications, then feed data back to the cloud and fuel more APIs,” he explained.

The product is currently in confidential preview, but Microsoft did mention a couple of examples, including Max 451, a Microsoft partner working with large retailers to help predict which products customers are most likely to purchase, allowing them to stock their stores ahead of demand.

Carnegie Mellon University is working with Azure ML to help reduce energy costs in campus buildings by predicting and mitigating activities to reduce overall energy usage and cost.

Microsoft is not alone in this space, however. IBM launched Watson as a cloud service last winter for similar types of machine learning application building and just last week a startup called Ersatz Labs also launched a deep learning artificial intelligence cloud platform.

Azure ML goes into public preview next month. There is no word yet on the official launch date.

Even Early Adopters See Major Flaws in the Cloud

The C.I.A. isn’t afraid of the cloud.

Amazon, a relative baby in the field of technology services, triumphed over stalwart IBM to gain a $600 million C.I.A. contract, but the most remarkable part of the deal was that the agency was a cloud convert in the first place. The fact that a tech company could warehouse data involving the government’s spies is the clearest signal yet that cloud computing is having its moment.

Somewhat like outsourcing a decade ago, cloud computing is the coming technology destined to sweep away all before it. Amazon Web Services is the fastest-growing part of fast-growing Amazon, and analysts expect it someday to be the dominant part of the company. Google, IBM, Verizon, Microsoft and a host of smaller players are competing for a part of the action. Global spending on public cloud services alone is forecast to hit $210 billion in 2016, up 172 percent from 2010.

And yet outsourcing provides a cautionary tale of how enthusiasm can be derailed by reality. Outsourcing advocates said every customer service call, every information technology fix, even the creation of routine legal documents was destined to be done in India or the Philippines. They said this would be cheaper and more efficient. American companies would be hollowed out, with only executives and their aides left on the payroll.

It didn’t happen quite that way. While many companies outsourced routine tasks, some moved them back after complaints of poor service. Others never outsourced. Outsourcing was ultimately a segment of the market rather than becoming the market.

Cloud computing is already confronting similar issues.

Members of the Open Data Center Alliance, a consortium of global information technology companies like Infosys, Disney, Deutsche Telekom and SAP, are cloud enthusiasts. But in a recent alliance survey, two-thirds of the members said concerns about data security were delaying their move to the cloud. That was down from the 80 percent of respondents who expressed a concern about security the previous year.

Other results, however, are headed away from cloud computing. Fifty-six percent of members now say regulatory issues will limit their adoption, up from 47 percent. And 47 percent worry about being tied to one vendor, up from 39 percent.

Cloud computing “is kind of in the first wave,” said Michael Masterson, director of cloud solutions for Compuware, which helps clients improve the performance of their applications. “I don’t know if I’d call it immature, but it’s definitely Version 1.”

“Immature” is exactly what Roman Stanek would call it. Mr. Stanek founded GoodData in 2007 with the mission of disrupting business intelligence. Amazon had just started Amazon Web Services, and GoodData became a client. It was not an entirely happy experience, Mr. Stanek said.

“Imagine if your electric company didn’t know whether it would be up or down — if they told you, ‘No guarantees, but we believe it will be mostly up,’ ” he said. “Maybe that works for some clients.”

Amazon Web Services had well-publicized failures in October 2012 and two months later. But reliability is not a problem specific to Amazon. Mr. Masterson pointed out that Hewlett-Packard, which publishes its service agreements on the web, says it has 99.95 percent availability, which works out to about four hours of trouble a year. The service agreement also says a failure only counts if it lasts for more than six minutes.

“Imagine a company selling a premium new car whose warranty includes 2M piston revolutions, 10k door latch cycles, and 20k window open and closes,” Mr. Masterson wrote in a recent blog post. “And even then, with 99.5 percent availability, you might still be unable to start the car two days a year, or during winter there might be two weeks where the doors won’t unlock until the sun melts the ice in the door locks. Ready to buy?”
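The availability arithmetic above is easy to check: annual downtime is simply (1 − availability) × hours in a year. A small sketch using the figures quoted in the article:

```python
# Annual downtime implied by an availability percentage.
HOURS_PER_YEAR = 365 * 24  # 8760

def downtime_hours(availability):
    """Hours per year a service may be unavailable at a given availability."""
    return (1 - availability) * HOURS_PER_YEAR

print(round(downtime_hours(0.9995), 2))  # → 4.38, HP's "about four hours"
print(round(downtime_hours(0.995), 2))   # → 43.8, roughly two days a year
```

The 99.5 percent figure in Mr. Masterson’s quote maps to the car that won’t start “two days a year”; 99.95 percent allows one-tenth the downtime, but still more than four hours of outage annually.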

GoodData grew with Amazon. It has raised more than $75 million and has nearly 300 employees. But in the first quarter, the company left Amazon. It moved to a private cloud hosted by Rackspace, which is based near San Antonio. With Rackspace, it had more control.

The only thing worse than a company offering unreliable service appears to be a company whose very existence is in doubt. Nirvanix was a cloud company based in San Diego with impressive backers, including Intel Capital and Khosla Ventures, and impressive hype. It was going to take on the big boys, Amazon Web Services and Google.

Last Sept. 16, Nirvanix warned customers they had two weeks to retrieve their data. On Oct. 1, it filed for bankruptcy. Apparently all customers got their data out, but it was a near miss. Then last month Rackspace, whose stock dropped by half since the beginning of 2013, said it was hiring Morgan Stanley to advise it on a possible sale or merger. While profitable, Rackspace faces increasing competition.

Companies that fail or are sold would matter less if data was more portable.

“It’s still difficult to tap into Rackspace and change your mind, or tap into AWS and change to something else,” said James Staten, an analyst with Forrester Research. “We’re a long way from sufficient standards where that’s a possibility.”

Mr. Staten said the last companies to go to the cloud would be those that had no experience with it — hospitals, medical device makers, and architecture and construction companies. “The reason they should go last is they don’t yet know what they don’t know,” he said. “They’ll start with applications that do not involve compliant data or customer data.”

In general, he said, “it’s hard to make a case for organizations that should not go to cloud at all.” It is not like outsourcing, which faltered over communications issues and rising prices in the host countries, he said.

GoodData expects to ultimately go full circle as competition increases. “This used to be very much Amazon’s monopoly, and a one-horse race was not good,” Mr. Stanek said. “That’s why I’m happy to see Google get in. In a couple of years, you will be able to go to Google or Amazon and say, ‘Give me the features I need, the service agreements and a good price, and I will have no reason to build my own cloud.’ ”

Savvy businesses looking at ways cloud can make money

The jury is still out on whether cloud can save you money, says Gartner research director, Michael Warrilow.

While we wait for the jury to come back in on the cost-saving question, Warrilow says savvy businesses are looking to make money from cloud adoption. Businesses in the gaming (gambling), digital media, and media sectors are taking advantage of cloud’s scaling potential to meet peaks in demand for services, he adds.

Australian and New Zealand (ANZ) businesses have started to ramp up adoption of cloud in the last 12 months, in contrast to the situation a year ago when there was still a lot of resistance to cloud. "It’s like a switch has been turned and a light has gone off," says Warrilow.

Warrilow was talking to ZDNet Cloud TV about what’s happening with cloud in the APAC region.

The entry of some of the larger global cloud players into the local market has played a part in the growing acceptance of cloud in ANZ, Warrilow explains.

He says some of the traditional heavy users of IT have not necessarily been the ones that have moved first with cloud. In the government sector, for instance, he has seen different approaches ranging from "complete rejection to wholesale adoption." By contrast, organisations in the services and retail sectors are moving ahead with cloud.

Cloud has a higher priority in China than in any other economy, according to Gartner’s 2014 global CIO research study. Other North Asian countries like Taiwan and Hong Kong are also moving ahead with cloud while Japan is moving less quickly, although local cloud services are springing up. South East Asian countries are also moving to cloud, but at a slower clip.

Warrilow cautions businesses against "falling into cloud", where small projects expand into larger initiatives and where the proper due diligence isn’t done.

Some of the more prevalent use cases for cloud right now are software development and testing, and disaster recovery. Software-as-a-service (SaaS) also provides opportunities, with Warrilow emphasising there are "a lot of discrete applications… better off handled by a 3rd party."