IT teams also create silos through their expertise and work processes, Nijmeijer added.
“Backup is a completely different scale than managing virtualization, so backup teams are very specialized,” he said.
Silos also form during duplication and backup: multiple copies of data in the primary architecture may become invisible once the data goes to backup.
“You get this data sprawl,” Nijmeijer said. “And you might store a lot more data than you actually need.”
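The sprawl Nijmeijer describes is, at its core, redundant copies that nobody is tracking. A minimal sketch of how such copies can be detected is to group files by content hash; everything here (the function name, the directory-walking approach) is an illustration, not a Nutanix tool.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicate_copies(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by content hash; any group with more
    than one path is a redundant copy contributing to data sprawl."""
    by_hash: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    # Keep only the hashes that appear more than once.
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

Real storage platforms do this at block level and at far larger scale, but the principle is the same: you cannot delete data you do not need until you know how many copies of it exist.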
Public cloud platforms enable IT teams to scale rapidly in lockstep with new demand or seasonal business. In hybrid environments, primary and secondary architectures inevitably expand (or contract) on a scale that affects data lifecycle management.
“The moment you give me that infrastructure, I'm going to start generating data — new PowerPoints as an IT worker, new code, whatever it is,” Prasad said. That requires more attention to cybersecurity and data protection.
As the world adapted to the COVID-19 pandemic, many IT teams scaled up overnight with virtual desktop infrastructure (VDI) and desktop-as-a-service (DaaS) to serve their remote workforces. These technologies use file shares that require specific data-lifecycle approaches, according to Nijmeijer.
“You have to make sure those file shares are protected and can be easily restored and can scale out as well,” Nijmeijer said. “If you’re growing from 100 users to 1,000 users, then your file services need to scale out very, very efficiently as well.”
Managing Backup and Disaster Recovery
Before the advent of low-cost cloud computing, tape drives were the go-to medium for data backup.
“In the old world, tape was an order of magnitude cheaper than primary storage,” Nijmeijer said. “But the moment you go to tape, you lose both data and visibility of the data.”
Recovering data from tape can take hours, days or weeks. Nijmeijer said hybrid cloud backup and disaster recovery can make backup data available much faster and at dramatically lower cost.
But there may be trade-offs in hybrid clouds.
“Now you need a cloud admin, somebody with an AWS (Amazon Web Services) account, et cetera, et cetera,” Nijmeijer said. “You also lose data visibility. What’s out there? How do I index it? How do I know how many copies of my data are there?”
Another piece is the orchestration of data recovery, which pulls everything back together when restoring systems from backups. Software and automation are increasingly crucial to data lifecycle management because IT systems and architectures have become so complex.
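The heart of orchestration is ordering: dependencies must come back online before the systems that need them. A minimal sketch, using Python's standard-library topological sorter and an invented dependency map (the system names are illustrative, not any product's API):

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: each system lists what must be
# restored before it can come back online.
restore_deps = {
    "dns": [],
    "database": ["dns"],
    "app-server": ["database", "dns"],
    "web-frontend": ["app-server"],
}

def recovery_order(deps: dict[str, list[str]]) -> list[str]:
    """Return an order in which systems can be restored so that every
    dependency is online before the systems that rely on it."""
    return list(TopologicalSorter(deps).static_order())
```

A production orchestrator layers much more on top of this, such as health checks, retries and parallel restores of independent systems, but the dependency ordering is the piece that automation gets right and humans under pressure often do not.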
Tools for Simplifying Data Lifecycle Management
Data lifecycle management is a straightforward concept. Applications, sensors and computing devices give life to data. At some point, data gets copied, analyzed and stored on a hard disk or memory chip. When it’s deleted, new data takes its place.
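The stages described above can be sketched as a simple policy that maps a data item's age to a lifecycle stage. The thresholds below are assumptions chosen for illustration; real retention policies are set per workload and per regulation, not hard-coded.

```python
from datetime import date, timedelta
from enum import Enum, auto

class Stage(Enum):
    HOT = auto()      # actively read and written on primary storage
    WARM = auto()     # copied and analyzed, still on fast media
    ARCHIVE = auto()  # long-term, low-cost storage
    DELETE = auto()   # past retention; space reclaimed for new data

def stage_for(created: date, today: date) -> Stage:
    """Illustrative retention policy: the day counts are assumptions,
    not a standard."""
    age = today - created
    if age < timedelta(days=30):
        return Stage.HOT
    if age < timedelta(days=365):
        return Stage.WARM
    if age < timedelta(days=7 * 365):
        return Stage.ARCHIVE
    return Stage.DELETE
```

The logic is trivial for one file; the difficulty at enterprise scale is applying a policy like this consistently across billions of objects spread over primary storage, backups and multiple clouds.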
But things get tricky in an ever-expanding universe of data-driven workloads in hybrid cloud environments. At enterprise scale, it’s all but hopeless without automation.
That’s the challenge Prasad, Nijmeijer and their colleagues are taking on at Nutanix, one of the many tech companies developing sophisticated tools for data lifecycle management in hybrid multi-cloud IT systems. Nutanix pioneered the concept of hyperconverged infrastructure (HCI), which uses software to virtualize compute, storage and networking in a single management plane, streamlining processes and reducing total cost of ownership. The move to software-defined infrastructure is shining new light on how to best manage and move data.
A Nutanix service called Mine, for instance, leverages automation to unify backup and recovery in an easy-to-use interface that emulates the HCI model. That makes it much easier for IT pros to manage the data lifecycle in hybrid clouds.
“We bring value by simplifying complex data life-cycle management operations,” Nijmeijer said. “We make it easy to manage data end-to-end, from creation to long-term archival, all under one platform. This saves IT people a lot of time. Time that they typically spend keeping the lights on in the data center.”
Applying new approaches and automation to data lifecycles is increasingly critical as companies move to hybrid multi-cloud IT systems.