Taking data from insurance to insight

Cohesity’s Douglas Ko explains how the firm’s offerings can transform data back-ups from a ‘worst-case scenario’ solution into an active source of business intelligence

Like car or home insurance, data back-ups have always been a necessary ingredient and cost of an effective IT environment. But until recently, that is all they have been good for. When something goes wrong, whether it be an accidental deletion, server crash, or a malicious attack, organisations use the back-up to recover data. The back-up sits idle, waiting for disaster to strike, just like insurance. 

Cohesity is helping its customers move their backed-up data from a purely defensive role to a competitive asset, one that goes beyond data recovery and actively helps enterprises do more.  

“With Cohesity’s modern data management platform there is a whole other area where you could start using your data,” says Douglas Ko, director of product marketing at Cohesity. “Once you have your data all backed up, there is a lot you can do with it.” 

“Legacy back-up systems were designed to collect, store and protect all of a company’s critical data. This often entails bundling the data together and dumping it on some hardware-based storage, at which point it becomes entirely opaque. It sits dark until a recovery is needed, which requires that someone find the necessary data in a long, onerous process. The performance and visibility of data were limited.” 

Cohesity has changed the game in this respect. Its modern data management solution allows data to be stored on a software-defined parallel file system which provides the scale and performance to back up, search and recover data quickly. 

The firm has also increased its efforts in leveraging data with its agile development testing capabilities and the Cohesity Marketplace. Agile development testing enables users to create thin zero-cost clones of data for development, testing, and analysis. Still under construction, the marketplace uses the Cohesity platform to bring applications and artificial intelligence (AI) to the aggregated data, rather than bringing the data to the applications. 

“Traditionally if you were to do any type of data work such as training AI models or business intelligence, you would take your production data and copy it to the AI or analytics app for use,” explains Ko. “There’s a lot of friction involved in this: you must first find that data, then collect it, then move it to the app. With Cohesity, we have already done all that for you; the data is backed up, consolidated, indexed and searchable. Because we are not just a storage destination, we can run apps on the platform, which removes further complications.” 

According to Ko, the marketplace can also be used to improve data security. “Having the data consolidated means that you can scan it for any type of security threat, whether it be ransomware, viruses or vulnerabilities such as out-of-date system updates,” he says. It also helps with compliance by scanning for personally identifiable information. 
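To make the compliance idea concrete, here is a minimal sketch of scanning consolidated text data for personally identifiable information. The function name and the two regular-expression patterns are illustrative assumptions, not Cohesity’s actual detection logic, which would be far more sophisticated.

```python
import re

# Toy PII patterns -- illustrative assumptions only, not a real
# compliance ruleset. Production scanners handle many more formats
# and use validation beyond simple regular expressions.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_for_pii(text):
    """Return a list of (kind, match) pairs found in the text."""
    hits = []
    for kind, pattern in PII_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((kind, match))
    return hits
```

Because the data is already consolidated and indexed on one platform, a scan like this can run across the whole backed-up estate instead of being repeated against every production system.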

The platform and marketplace have been designed to keep costs and complications low from the inside out. Cohesity’s file system is designed to be immutable through a ‘write once, read many’ model, which protects data from being changed – while at the same time allowing for an unlimited number of snapshots through which applications can access the data. “When you bring the app to the data, it uses a mirror image of the data set for analysis, which is based on pointers to the original data,” Ko explains. “When you do things to the data that might change or overwrite it, it does it with pointers again so that it doesn’t actually alter the original data. This keeps the number of data copies low and reduces storage costs, as well as enhancing the integrity and security of data.”  
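The pointer mechanism Ko describes is essentially copy-on-write cloning. The sketch below, with invented class and method names, shows the general principle: a clone shares block pointers with the original snapshot, and writing to the clone allocates a fresh block rather than altering shared data.

```python
# Copy-on-write cloning sketch -- names are illustrative, not
# Cohesity's actual implementation.

class BlockStore:
    """Append-only storage: blocks are written once, read many times."""
    def __init__(self):
        self.blocks = []

    def write(self, data):
        self.blocks.append(data)
        return len(self.blocks) - 1     # block id

class Snapshot:
    """A view over the store: just a list of pointers to blocks."""
    def __init__(self, store, pointers):
        self.store = store
        self.pointers = list(pointers)  # cheap copy of pointers only

    def read(self, i):
        return self.store.blocks[self.pointers[i]]

    def write(self, i, data):
        # Redirect this snapshot's pointer to a new block; the
        # original block, and any other snapshot pointing at it,
        # stays untouched.
        self.pointers[i] = self.store.write(data)

    def clone(self):
        return Snapshot(self.store, self.pointers)

store = BlockStore()
base = Snapshot(store, [store.write(b"v1")])
clone = base.clone()                    # zero-cost: copies pointers, not data
clone.write(0, b"v2")
assert base.read(0) == b"v1"            # original data is unchanged
assert clone.read(0) == b"v2"           # clone sees its own version
```

This is why clones are effectively ‘zero-cost’: creating one copies a small pointer list, not the underlying data, and only modified blocks ever consume new storage.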

Ko believes that the Cohesity platform and marketplace could kickstart a shift in how organisations in all industries leverage their backed-up data for additional value, especially in the cloud, where new data analysis, AI and machine learning services are being built. “Seismic data and large medical files are often stored on-premises, so we have created a capability to transparently move those files to the cloud in their original formats, which can be accessed through an application programming interface.” 

“There was a study published recently in the journal Nature about AI detecting cancer cells in medical images more accurately than radiologists. This is absolutely the type of use case where our technology can come into play.” 

This article was originally published in the Spring 2020 issue of The Record.
