If 2023 was generative AI’s breakout year, the years that followed could be characterized as a period of experimentation and implementation of new AI capabilities.
The 2025 Enterprise Cloud Index (ECI) report by Nutanix found that nearly 85% of surveyed companies had a GenAI deployment strategy in place and 55% were actively implementing it. As the path forward unfolds, IT teams are reassessing their strategies and systems.
“To successfully unlock ROI with GenAI projects, organizations need to take a holistic approach to modernizing applications and infrastructure and embrace containerization,” said Lee Caswell, senior vice president of product and solutions marketing at Nutanix.
Even organizations that moved fast and strategically struggle to measure success. In its October 2025 AI ROI report, Deloitte identified a paradox of rising investment and elusive returns: organizations are pouring in investment, yet returns are slow to materialize and hard to measure. For generative AI, ROI is most often assessed on efficiency and productivity gains. For agentic AI, measurement is likely to focus on cost savings, process redesign, risk management and longer-term transformation.
IT veteran and computer engineer Alex Almeida told The Forecast it’s critical to continually assess how data is managed, especially as organizations embrace artificial intelligence, which unleashes another deluge of data.
“I can’t just throw every single piece of data I have at my AI instance,” explained Almeida, senior product marketing manager for Nutanix Unified Storage, a software-defined data services platform that consolidates data management and protection into a single, unified solution.
“That’s the reaction that everybody has right now. They think, ‘The more data you feed AI, the better it’s going to come out.’ But that’s not necessarily the case.”
Instead of a “more data, better results” mindset, organizations need a “right data, right results” toolset, Almeida suggests.
“Organizations must ask themselves, ‘What data is it, and is it something that is valuable to be input into a large language model or RAG-type workflow to make your results coming out of AI that much more effective for your business?’”
Almeida said that enterprises seeking a return on AI investments must first get their data in order. That requires a commitment to managing both data quantity and data quality.
“You need tools that will help you classify your data and analyze it to determine its effectiveness,” he said.
Almeida likened the present moment in AI to the early days of IT virtualization.
“What we’re seeing now is typical in any sort of IT wave,” he said.
“When virtual machines and hyperconverged infrastructure (HCI) first came out, they were used a certain way. Now, people have found much more efficient ways of deploying them. It will be the same with AI. Over time, enterprises will learn how to better leverage the data.”
Almeida said that implementing AI without a data management strategy is like building a house with a broken foundation. The good news is, once the foundation is fixed, a beautiful house awaits.
“For a lot of IT administrators, there was this initial promise and hype that AI was going to transform their business,” he said. “And it will. You just have to keep going. You may not have the ROI you wanted now, but it will come.”
Strategic data management for AI can build on data backup and protection processes.
“You need to protect your data, but do you protect everything?” Almeida asked. “Maybe you just need to protect the database that’s essential for your transactional workflow, for your sales and for your revenue. The classification of data in that instance is important.”
Data classification can be equally important in the context of AI.
“The rate of data that’s being generated because of AI is going to continue to grow exponentially,” explained Almeida, who said enterprises will have to decide what data from AI is essential to their business. They must leverage data that differentiates them from competitors and find ways for that data to generate revenue.
"When we look at the results and outcomes you get from prompting AI, you have to consider, is that intellectual property that needs to be protected too?”
If it is, the same classification tools that help decide what data to protect can help decide what data to act upon. For example, he explained, they can help determine what data is worth feeding back into a large language model (LLM) as part of a retrieval-augmented generation (RAG) workflow.
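To make that idea concrete, here is a minimal, hypothetical sketch in Python of what Almeida describes: using labels already attached to documents by a classification or governance tool to decide which ones are worth indexing for a RAG workflow. The record fields, label names and thresholds are illustrative assumptions, not a specific Nutanix or vendor API.

```python
from dataclasses import dataclass

@dataclass
class Document:
    """A piece of enterprise data with labels from a classification tool."""
    doc_id: str
    text: str
    sensitivity: str       # e.g. "public", "internal", "restricted"
    business_value: float  # 0.0-1.0 relevance score from an analytics pass

def select_for_rag(docs, max_sensitivity="internal", min_value=0.6):
    """Keep only documents that are both safe to expose to an LLM
    and judged valuable enough to improve retrieval results."""
    rank = {"public": 0, "internal": 1, "restricted": 2}
    limit = rank[max_sensitivity]
    return [
        d for d in docs
        if rank.get(d.sensitivity, 2) <= limit and d.business_value >= min_value
    ]

# Hypothetical usage: only the filtered subset would be embedded and indexed.
corpus = [
    Document("q3-sales-db-export", "…", "internal", 0.9),
    Document("old-marketing-draft", "…", "public", 0.2),
    Document("payroll-records", "…", "restricted", 0.8),
]
for doc in select_for_rag(corpus):
    print(f"index for RAG: {doc.doc_id}")  # an embedding/indexing step would go here
```

The point of the sketch is the gating step itself: classification metadata, not raw volume, decides what reaches the model.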
“There’s a lot of technology under the hood that allows us to classify data and analyze it for cybersecurity purposes,” he said.
“The question has to be asked: why couldn’t that be used for other things? Why couldn’t you do more data analytics to make it easier to determine whether or not data needs to go into an LLM?”
Almeida cited video surveillance as one of many exciting potential use cases.
“Imagine having the ability to capture video from different sites and then feeding those recordings into an analytics engine,” he explained.
“The surveillance data moves into an intelligent workflow that extracts insights. The data management practices determine how quickly data from the edge sites gets accessed at the analytics site, which provides insights that may help the business.”
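As a rough illustration of that kind of edge-to-analytics workflow, the hypothetical sketch below tags surveillance clips captured at edge sites with simple metadata and orders them so the most useful footage reaches a central analytics site first. The site names, fields and prioritization rule are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    """A video segment recorded at an edge site."""
    site: str
    path: str
    motion_events: int  # detections flagged by an on-site analytics pass
    hours_old: float

def prioritize_for_analytics(clips, recent_hours=24.0):
    """Order clips so recent, event-heavy footage is replicated to the analytics site first."""
    return sorted(
        clips,
        key=lambda c: (c.hours_old > recent_hours, -c.motion_events, c.hours_old),
    )

clips = [
    Clip("warehouse-A", "/edge/a/cam1/0900.mp4", motion_events=12, hours_old=2.0),
    Clip("store-B", "/edge/b/cam3/0300.mp4", motion_events=0, hours_old=30.0),
    Clip("warehouse-A", "/edge/a/cam2/1100.mp4", motion_events=4, hours_old=1.0),
]
for clip in prioritize_for_analytics(clips):
    print(f"replicate next: {clip.site} {clip.path}")  # a real pipeline would copy the file to the analytics site
```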
Increased speed and better classification require the right storage infrastructure.
“If you have a hodgepodge of data storage solutions, and you’re feeding your database with a typical three-tier architecture of storage, compute and networking, there’s a silo of data. And how well does that talk to your next-generation HCI implementation?” asked Almeida.
“Having a smart data management process makes it easier to manage all of that disaggregated storage and feed it into AI classification tools, and then into LLMs and things like that.”
“IT teams that leverage new tools and best practices can push storage into the next generation, and this will help with the ROI of enterprise AI.”
Editor’s note: Learn about Nutanix Unified Storage, a software-defined solution that consolidates file, object and block storage while offering rich data services such as analytics, lifecycle management, cybersecurity, and strong data protection.
Ken Kaplan is Editor in Chief for The Forecast by Nutanix. Find him on X @kenekaplan and LinkedIn.
Matt Alderton contributed to this story. Find him on LinkedIn.
© 2025 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.