Technology

AI is Hungry for Data, But Can IT Infrastructures Keep Up?

IT analysts explain why bottlenecks in data storage, system latency and observability stymie big investments in AI.

October 23, 2025

From the call center to the factory floor, AI is billed as a lever for accelerating services, sparking innovation, and fine-tuning operations. But when enterprises put those promises to the test, they often struggle with latency, higher costs, and subpar results.

Contrary to common perception, the choke points aren’t just immature language models. They lie in how organizations collect, aggregate, store, and feed data to their AI applications, according to Brad Shimmin, vice president and lead for data and analytics at The Futurum Group. Too many companies are still preparing data for AI models on infrastructure built before ChatGPT’s emergence, and it isn’t working.

“Enterprises have had a data problem for quite a while,” Shimmin told The Forecast. “What we’re seeing right now is that with the emphasis on AI, these issues are coming to the foreground.”

The cause isn’t difficult to pinpoint. Organizations keep hitting the same choke points: storage, latency, security, and observability, he said.

As a result, despite the tens of billions of dollars invested in generative AI, 95% of organizations report no measurable return on investment, and only 5% of pilots make it into production, a recent MIT study found. Most AI tools, meanwhile, fail due to “brittle workflows, lack of contextual learning, and misalignment with day-to-day operations,” according to the report.

“Agentic systems are only going to make these problems worse,” Shimmin said. “They expose how little companies really understand their data: where it is, what it is, and who has access to it.”

Storage Under Strain

Shimmin sees storage as one of the most pressing problems for ensuring AI models have the data they need. Legacy systems weren’t built to handle the massive, unstructured datasets that feed generative AI, and scaling them often drives up costs. Enterprises spend heavily to expand capacity yet still struggle to get data into usable form quickly.

One of the worst bottlenecks, according to Shimmin, is the costly work of embedding and indexing that unstructured information so models can use it.

“If you have a lot of data that changes, that becomes a very difficult procedure in terms of cost and time,” he said.
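
Consider what that re-embedding burden looks like in practice. The Python sketch below is our illustration, not anything Shimmin prescribes: it fingerprints each document with a content hash so that only changed text is re-embedded. The `embed` function is a hypothetical stand-in for a real embedding model or service, which is where the cost and time actually accrue.

```python
import hashlib

def embed(text: str) -> list[float]:
    # Hypothetical stand-in for a real embedding model call --
    # the expensive step this sketch tries to avoid repeating.
    return [b / 255 for b in hashlib.sha256(text.encode()).digest()[:8]]

def fingerprint(text: str) -> str:
    """Content hash used to detect whether a document actually changed."""
    return hashlib.sha256(text.encode()).hexdigest()

def refresh_index(docs: dict[str, str], index: dict[str, dict]) -> int:
    """Re-embed only documents whose content hash has changed.

    docs:  document id -> current text
    index: document id -> {"hash": ..., "vector": ...}
    Returns how many documents had to be re-embedded.
    """
    updated = 0
    for doc_id, text in docs.items():
        h = fingerprint(text)
        entry = index.get(doc_id)
        if entry is None or entry["hash"] != h:
            index[doc_id] = {"hash": h, "vector": embed(text)}
            updated += 1
    # Drop vectors for documents that no longer exist.
    for stale in set(index) - set(docs):
        del index[stale]
    return updated

index: dict[str, dict] = {}
docs = {"a": "quarterly report", "b": "support ticket log"}
print(refresh_index(docs, index))   # 2: everything is new
docs["b"] = "support ticket log, updated"
print(refresh_index(docs, index))   # 1: only the changed doc is re-embedded
```

Trivial as it looks, skipping unchanged content is the difference between re-embedding an entire estate on every refresh and paying only for what moved.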

Latency and Security Create Headaches

Latency is another pain point, often worsened by the sprawl of enterprise data estates. Moving information across systems or providers adds friction.

“Every single chink that you put in the armor of performance, whether that is moving data or managing data, that is added cost and trouble,” Shimmin said.

That friction multiplies as data sources expand. Without close attention to where data lives and how it moves, enterprises pay the price in slower performance and higher bills.

Security measures can also sap performance. Firewalls, encryption layers, identity checks, and data loss prevention tools all introduce additional steps into data flows. Shimmin said solutions with security and compliance built in can help reduce those performance hits.

Observability: Seeing is Knowing

Observability is emerging as a make-or-break factor for AI. Traditional monitoring tools were built for static applications, not adaptive systems that change behavior based on data inputs and probabilistic outputs. According to Sanjeev Mohan, founder of the consultancy SanjMo and a former Gartner analyst, to make AI work, enterprises need visibility into how data flows and whether it is fit for use.

“If I cannot observe my data, I cannot know if it is accurate, timely, or even available for my AI,” Mohan told The Forecast. “Observability is what tells me if my data estate is truly ready. Without it, I am just guessing.”

That visibility helps enterprises catch drift, hallucinations, and other anomalies before they spiral into failures. It also shows leaders whether their infrastructure can handle the workloads they want to run. Without observability, organizations are flying blind, Mohan said.
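
Reduced to its essentials, that kind of readiness check can be very simple. The Python sketch below is our illustration, not a feature of any particular observability product: it gates a pipeline on data freshness and field completeness, with thresholds invented for the example.

```python
from datetime import datetime, timedelta, timezone

# Illustrative thresholds; real values depend on the workload's tolerance.
MAX_AGE = timedelta(hours=24)     # data older than this is considered stale
MAX_NULL_RATE = 0.05              # more than 5% missing values fails the check

def check_freshness(last_updated: datetime) -> bool:
    """Is the dataset recent enough to feed a model?"""
    return datetime.now(timezone.utc) - last_updated <= MAX_AGE

def check_completeness(rows: list[dict], required: list[str]) -> bool:
    """Are the required fields populated often enough?"""
    if not rows:
        return False
    missing = sum(1 for r in rows for f in required if r.get(f) in (None, ""))
    return missing / (len(rows) * len(required)) <= MAX_NULL_RATE

rows = [{"id": 1, "amount": 9.5}, {"id": 2, "amount": None}]
ready = check_freshness(datetime.now(timezone.utc)) and \
        check_completeness(rows, ["id", "amount"])
print("fit for AI use" if ready else "hold the pipeline")  # fails: 25% nulls
```

A production observability layer adds schema checks, drift detection, and lineage on top, but the principle is the same: measure the data before trusting it.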

Waiting for Perfect Data is a Trap

Even so, Mohan argues that too many companies let data quality concerns stall their AI plans. He advises leaders to accept that data will never be flawless and to use AI itself to help improve it.

“Data is always going to be imperfect no matter how hard we try,” Mohan noted.

That mindset also applies to architecture. Mohan stressed that companies don’t need to scrap their existing systems and start over. Instead, they should treat data architecture as the foundation and enhance it with AI, taking advantage of the most up-to-date capabilities from their providers.

Get Unstuck

Both experts offered practical steps for moving forward:

  • Inventory the estate. Catalog both structured and unstructured data, along with the metadata that describes it, so the organization knows what it has, where it resides, and how it maps to business goals.

  • Keep data and compute together. Many enterprises separate them, forcing data to move across systems. Every transfer adds latency and cost. Running workloads where the data already resides reduces friction and improves performance.

  • Prioritize integrated stacks. Use the observability and governance features built into core platforms before adding extra tools. Extend only when gaps emerge.

  • Design for flexibility. Adopt open standards and modular approaches to swap components as needs evolve.

  • Lighten the query layer. Where possible, use smaller, faster query tools instead of standing up heavyweight databases for every task (see the sketch after this list).

  • Measure value continuously. Pilot use cases with clear metrics. If an initiative saves time or reduces ticket volume, document the benefit to justify further investment.
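
As an illustration of the second and fifth points, an in-process engine such as DuckDB (our choice of example; neither analyst endorses a specific tool) can query files where they already sit, with no load step, no copy across systems, and no standing database server.

```python
import duckdb  # pip install duckdb

con = duckdb.connect()  # in-process engine; no server to stand up

# A tiny inline table keeps the sketch runnable. Against a real estate this
# would be read_parquet('s3://bucket/tickets/*.parquet'), querying the files
# in place rather than moving them into a separate database first.
con.execute("""
    CREATE TABLE tickets AS
    SELECT * FROM (VALUES
        (1, 'open',   'storage'),
        (2, 'closed', 'latency'),
        (3, 'open',   'latency')
    ) AS t(id, status, topic)
""")

result = con.execute("""
    SELECT topic, count(*) AS open_tickets
    FROM tickets
    WHERE status = 'open'
    GROUP BY topic
""").fetchall()
print(result)  # e.g. [('storage', 1), ('latency', 1)] (row order may vary)
```

Running the compute where the data already resides is exactly the friction-reduction both experts describe: every transfer avoided is latency and cost avoided.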

Embracing Unified Storage

Another way to overcome data hurdles is through software-defined unified storage, said Alex Almeida, senior product marketing manager at Nutanix. A unified storage system helps deliver data for AI more quickly and efficiently by consolidating data on a single platform, he said.

Almeida said unified storage not only breaks down silos and speeds up data access for model training and real-world applications but also scales more easily across environments.

“Successful AI adoption starts with strong data management,” he said. “By establishing a solid data foundation with a unified data management and services platform, organizations can cost-efficiently accelerate AI pipelines and achieve faster, more efficient outcomes.”

Looking Ahead

To succeed with enterprise AI, Shimmin believes it’s critical to treat data delivery as a core competency. While choosing the right AI models matters, the ability to manage storage, latency, and observability at scale is what separates the 5% who see value from the 95% who don’t.

For IT leaders, that means more than technical upgrades. It requires prioritizing data flow and reliability over chasing experimental models. Without that shift, the promise of AI will remain out of reach, Shimmin explained.

“If you don’t have your data estate together, in terms of both the quality of that data and the availability of that data, you’re in trouble,” he said. “You’re not going to achieve those supposed results from your investment in AI.”

David Rand is a business and technology reporter whose work has appeared in major publications worldwide. He specializes in spotting and investigating what’s next and helping executives in organizations of all sizes determine what to do about it.

