As data analytics, automation and artificial intelligence (AI) capabilities become more critical to business success, many strategists view data as the lifeblood of modern enterprises, with some describing it as the “new oil of the digital economy.” The Economist dubbed data the world’s most valuable resource. But deciding where to put it, how to protect it and when to move or remove it requires new strategies and capabilities.
Market researcher IDC predicts that cloud IT solutions will surpass on-premises infrastructure offerings as the primary location where operational data is stored for the majority of enterprises in many geographies by 2025. It also notes that cloud technologies designed to help facilitate data management and mobility are among the three core pillars defining the future of digital infrastructure frameworks. With over 2.5 quintillion bytes of data being generated every day, companies are now operating anywhere from hundreds to tens of thousands of databases at any given time. Managing them all is no easy feat.
Data management and data operations have become cornerstones of an increasingly software-defined and cloud-powered business world. Different data regulations around the world and a preference for hybrid multicloud capabilities that allow data and applications to move and run across different infrastructure types are driving many to rebalance their center of gravity by prioritizing the location of data.
“Data is the core of every organization,” said Tobias Ternström, vice president of engineering and product at Nutanix. He leads Nutanix Database Service, which automates database needs in private and public clouds.
“It’s critical to step back, focus on the right strategy and build from there, then optimize.”
Ternström said customers are making sure their data is secured, highly available and protected from disasters. This is complex, especially when IT teams need to manage different database engines.
“You run cache, messaging and search systems, and a large organization can run thousands of apps with each app running multiple of these services,” he said. “Our focus is to just help customers manage and run all of this complexity at scale, making sure that they live up to the business requirements that they have.”
All of the data a company gathers and uses has gravity. It accumulates quickly, and regulations require some of it to be stored in specific locations while remaining secure and retrievable. This is challenging and expensive, so IT leaders are exploring how data and applications can work better together from development to deployment and beyond.
Many are turning to so-called DataOps approaches and tools for better collaboration between data engineers and analysts to improve real-time insights and decision-making. This area is expected to grow significantly as businesses generate more data, use more cloud services and embrace AI and IT automation.
DataOps is growing in importance as organizations seek to use data to drive business decisions and gain a competitive advantage. The DataOps methodology oversees the ecosystem of vendors and products across the data platform and data pipeline, according to Stephen Catanzano, senior analyst at TechTarget's Enterprise Strategy Group.
“Only 4% of organizations said they get immediate data insights,” Catanzano wrote, referring to findings from his research firm’s 2022 “Cloud Analytics Trends” report. “In comparison, an additional 45% of organizations reported a week or longer to get quality insights into their data.”
He sees smaller companies able to implement the same capabilities as Fortune 500 companies -- available in public clouds and at cloud scale -- to better use their data for decision-making.
“We predict a high adoption rate in 2023 of these technologies and for real-time data insights to improve significantly,” he wrote.
When it comes to tackling data management and DataOps issues, one could argue that no IT role is more important than the database administrator (DBA), said Jeff Kelly, senior product marketing manager for Nutanix’s Database Service.
“Almost every application needs a database to store and manage its associated data,” Kelly said. “Developers need to be able to spin up new database instances quickly and they need access to different types of databases… to build new, innovative applications and features that today’s customers expect.”
He said database-as-a-service solutions are helping firms scale more rapidly, operate more efficiently and otherwise manage exponentially growing volumes of data in complex or hybrid cloud environments.
“Database administrators are just overloaded managing all of these things, keeping them updated and protected,” said Ternström.
He said new services reduce this burden so DBAs can focus on strategic work rather than patching and maintenance. These services also simplify and speed up their ability to scale database operations, especially as businesses embrace open source databases, which can add more components to manage on a regular basis.
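To make the database-as-a-service idea concrete, the sketch below shows what “spinning up a new database instance” might look like from a developer’s side. The endpoint URL, payload fields and response handling are illustrative assumptions for this example, not a real Nutanix Database Service API.

```python
import json
import urllib.request

# Hypothetical DBaaS-style REST endpoint -- an assumption for illustration,
# not an actual vendor interface.
DBAAS_URL = "https://dbaas.example.com/api/v1/databases"

def build_provision_payload(engine: str, version: str, size_gb: int) -> dict:
    """Describe the managed database instance a developer wants spun up."""
    return {
        "engine": engine,            # e.g. "postgres", "mysql", "mongodb"
        "version": version,
        "storage_gb": size_gb,
        "high_availability": True,   # replicated for failover by the service
        "backup_policy": "daily",    # automated snapshots, no DBA patching
    }

def provision_database(token: str, payload: dict) -> dict:
    """POST the request; the service handles placement, patching and backups."""
    request = urllib.request.Request(
        DBAAS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

The point of the abstraction is that the request describes *what* the application needs (engine, size, availability), while the service decides *where* it runs and keeps it patched and backed up.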
Industry research firm Gartner and other experts provide these tips for collecting and managing data:
- Assess dependencies between applications, networks and data… then design interconnecting IT solutions around them
- Employ a multicloud data management strategy and distributed-by-design IT architecture
- Optimize network operations for cloud-native applications and databases
- Leverage edge computing solutions to enhance data availability and performance
- Consolidate information sources and workflows wherever possible
- Reduce IT infrastructure complexity and simplify controls
- Partner with cloud providers that can align computing and storage capacity and then streamline the flow of information between applications and systems
- Create and enact robust and resilient data management and data governance strategies
- Identify datasets that generate the most gravity (and associated applications) and host these items where they can most easily scale
- Use analytics and automation tools to analyze large volumes of data in real-time to sift through and delete unnecessary information
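The last tip above, using automation to sift through and delete unnecessary information, can be sketched as a simple retention sweep. The record shape, the 90-day window and the legal-hold flag are assumptions chosen for this example, not recommendations from the article.

```python
from datetime import datetime, timedelta, timezone

# Assumed retention window for the example.
RETENTION = timedelta(days=90)

def partition_by_retention(records, now=None):
    """Split records into (keep, delete) lists based on age.

    Each record is assumed to be a dict with a timezone-aware
    "last_accessed" datetime and an optional "legal_hold" flag.
    """
    now = now or datetime.now(timezone.utc)
    keep, delete = [], []
    for record in records:
        expired = now - record["last_accessed"] > RETENTION
        # Regulated data must stay put regardless of age.
        if expired and not record.get("legal_hold", False):
            delete.append(record)
        else:
            keep.append(record)
    return keep, delete
```

A real pipeline would run a sweep like this on a schedule and archive rather than hard-delete anything subject to regulatory retention rules.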
Hybrid cloud database systems enable companies to “run the database where the business and data dictates, rather than where the technology dictates,” according to Neil Carson, CEO of Yellowbrick Data.
“Using database-as-a-service tools, hyperconverged infrastructures (HCI), and other cutting-edge technology offerings can help drastically reduce cost, overhead, and IT complexity,” said Taylor Tresatti, head of industry research for BIZDEV: The International Association for Business Development.
“These helpful solutions effectively allow you to virtualize your database and analytics solutions, and keep data from piling up in silos, making it easier to access information where it’s needed, when it’s needed, as demands arise.”
The way IT leaders are embracing agile DevOps and DataOps approaches and hybrid multicloud strategies reflects how data is moving to the center of gravity for more businesses. New services are helping them find better, more efficient ways to extract value from their data and put it to good use.