Podcast

Survey Shows Speed of AI Innovation Strains IT Control

In this Tech Barometer podcast, analyst Steve McDowell and cloud native technology expert Dan Ciruli discuss top topics from the 2026 Enterprise Cloud Index, a survey of IT professionals, which revealed tension between the need for IT governance and the reality of easy-to-build-and-deploy containerized apps. Demand for AI capabilities is driving up shadow IT use, forcing IT teams to manage more risks.
  • Key Play:Enterprise Ai
  • Nutanix-Newsroom:Article, Podcast
  • Use Cases:Cloud Native

March 31, 2026

There’s no escape from the AI push and pull in the world of enterprise IT. Onboarding new AI capabilities is bringing tradewinds that IT teams must navigate. Data from the 2026 Enterprise Cloud Index (ECI), a global survey of IT leaders, reveals the progress and some of the biggest challenges IT teams face as AI moves from experimentation to implementation.

“AI is running a bit faster than the policies within the enterprise,” Steve McDowell, principal of NAND Research, told The Forecast.

“This stuff has real value, and your employees are going to use it,” he said. And this makes it tough for IT teams to manage.

According to ECI findings based on a survey of 1,600 cloud, IT and engineering executives, shadow IT is on the rise despite IT leaders’ need for oversight over new AI applications in their environments. AI tools that accelerate innovation are widely available, and employees often spin them up without asking for permission. 

The ECI reported a broad unease about corporate employees experimenting with AI without oversight from central IT. It also noted that virtualization tools accelerating cloud native AI development have achieved mainstream adoption.

While enterprise personnel drive AI demand, cloud native technologies such as containerized applications and container orchestration tools like Kubernetes are supporting AI supply, according to ECI findings. The report indicates containers are becoming foundational to application strategy. Over the next three years:

  • 87% of respondents expect application containerization to increase at their companies.
  • 85% think AI is meaningfully accelerating adoption of containers for their organization.
  • 83% are building new applications in containers.

“Containers in general and Kubernetes in particular are the de facto standard for developing and deploying new applications,” said Dan Ciruli, vice president and general manager for cloud native with Nutanix, a pioneer in software virtualization technologies.

“Now we're in the phase that everybody needs to be comfortable talking about it, using it, deploying it, running things on it.”

AI Enters the Agentic Era

The ECI identified trends that have broad implications for enterprise IT today as the use of AI evolves. The rise in use of AI applications and the cloud native technologies used to create them are forcing IT teams to reassess their technologies and strategies. The survey asked IT leaders about the AI applications they expect to use over the next three years. Generative AI (GenAI) came out on top (58% of responses), followed by agentic (56%), chatbot (49%), predictive (48%) and computer vision (46%).  

While GenAI has made huge inroads since the release of ChatGPT in November 2022, the ECI showed that today’s hottest AI trend in corporations is “agentic” workflows that use machine intelligence to simplify manual processes.

“That's booming right now,” McDowell said.

Agentic workflows are different from autonomous AI agents, which execute commands based on predictive analysis without human intervention. Figuring out how central IT will oversee these agents is a work in progress.

Rise of Shadow AI

Most AI apps are developed and run in containers that can operate across different IT infrastructures, including public cloud services and private datacenters. The Kubernetes platform emerged to meet this need by enabling rapid experimentation with AI models and capabilities. The ECI showed that widespread use of containers and the rise of AI applications are increasing the risks of apps spun up and managed outside the oversight of central IT.
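To make the pattern concrete, here is a minimal sketch of the kind of containerized AI service an employee can spin up quickly. This is purely illustrative: the endpoint shape, the port, and the stand-in "model" are assumptions, not details from the survey.

```python
# Minimal sketch of a containerized "AI" service: one HTTP endpoint that
# wraps a model call. The predict() stand-in just counts words; a real
# service would load an actual model here.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(text: str) -> dict:
    # Stand-in for model inference (illustrative scoring only).
    return {"input": text, "score": len(text.split())}

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        result = predict(json.loads(body)["text"])
        payload = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # In a container image, this would be the entrypoint; 8080 is a
    # common (assumed) port choice.
    HTTPServer(("", 8080), Handler).serve_forever()
```

Packaged into a container image, a service like this can move unchanged between a laptop, a private datacenter and a public cloud, which is exactly why it is so easy to deploy without asking central IT first.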

Ciruli likened this to what happened when cloud services first became available and people inside organizations began using their credit cards to pay for compute capabilities rather than go through their own IT department. This opened a flood of shadow IT projects that eventually needed to be tamped down or better managed to avoid runaway costs and business risks. 

“I think Kubernetes started the same way, where a team was like, ‘Hey, I don't want to deploy in VMs. I'm just going to start using Kubernetes.’ Then another team and another team and at some point, someone higher up and more central in the IT organization said, ‘Wait, there's way too much of this going on.’”

Shadow AI is widespread and largely unmanaged

He said IT leaders want to centralize the use of containerized applications. 

“We need to standardize this and we need to do this in a way that does give the teams what they want, but does it in a way that doesn't put us at risk financially or from a security perspective," he said, describing how IT team leaders are taking action.

The ECI survey’s respondents showed widespread concern about corporate staff running AI applications unsupervised by central IT:

  • 79% said they know of non-IT staff implementing AI applications or agents.
  • 87% cited the business risks of using AI tools and agents without official oversight.
  • 82% said silos between business units and IT make it tougher to execute technology projects.

McDowell noted that enterprise IT leaders are still figuring out which policies they need to address shadow AI. But he added that people will adopt AI on their own no matter what IT leaders decide. 

“My advice to enterprise IT is to set up the guardrails and make it easy for your employees to adopt it and then enforce the policies however you will,” McDowell said.
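McDowell's guardrails advice can be pictured as a simple allowlist check that runs before an employee's request reaches an AI tool. This is a hedged sketch only; the tool names and policy fields below are invented for illustration, not any real product's API.

```python
# Hedged sketch of AI-usage guardrails: requests are checked against a
# central allowlist before an employee's tool call is permitted.
# All tool names and policy fields here are hypothetical.
APPROVED_TOOLS = {
    "internal-llm": {"pii_allowed": True},   # vetted, runs in-house
    "vendor-chat": {"pii_allowed": False},   # approved, but no sensitive data
}

def check_request(tool: str, contains_pii: bool) -> tuple[bool, str]:
    """Return (allowed, reason) for a proposed AI tool call."""
    policy = APPROVED_TOOLS.get(tool)
    if policy is None:
        return False, f"'{tool}' is not an approved AI tool"
    if contains_pii and not policy["pii_allowed"]:
        return False, f"'{tool}' may not receive personal data"
    return True, "allowed"
```

The point is the workflow, not the code: approve tools centrally, make the approved path easy for employees, and deny or redirect everything else.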

It’s still not clear who owns all AI workloads, he said. That makes it difficult to assign oversight responsibilities. Enterprise cloud providers have secure environments designed for trustworthy storage and governance of sensitive data, he added. Though data sovereignty creates specific compliance implications, the core challenge is trustworthiness.

“It's not that I need to run it on-prem, it's that I need to trust where my data's going,” McDowell said. “And right now we haven't done a lot of that vetting.”

Containers Accelerate Innovation

Containers are far beyond the experimentation phase, according to Ciruli. Kubernetes isn’t a specialized discipline only for the few anymore. He said containers arose to help software developers get their work into production quickly. But because different projects and people implement containers in slightly different ways, confusion and duplication of effort can be pervasive when running containers across an enterprise.

“As you adopt this technology wholesale across the organization, then you have to think about, well, how do we standardize?” Ciruli said. He explained that a mainstream platform like Kubernetes lets IT teams enforce standards and operate from a central location.

Containers are becoming foundational to application strategy

The ECI survey found that 71% of respondents are running AI-enabled applications on a mix of traditional apps in virtual machines (VMs) and modern apps in containers on VMs, while 14% are running their AI-enabled apps directly on bare metal servers.

These findings point to an enduring challenge: Kubernetes and similar platforms may give innovation a boost, especially with AI, but they can’t replace large portfolios of conventional VMs in enterprise environments. IT teams must manage those environments while embracing Kubernetes and containers.

Companies are looking for ways to oversee these infrastructure variants from a single platform that creates common policies for security, networking and backup/recovery. 

“All of those can be consistent, and it becomes much easier to interoperate between the new and the old,” Ciruli said.
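One way to picture "common policies for security, networking and backup/recovery" is a single policy definition validated against both VM and container workloads. The sketch below is an assumption-laden illustration; the workload descriptors and policy fields are made up, not any vendor's schema.

```python
# Sketch: one policy applied uniformly to VM and container workloads.
# Workload descriptors and policy fields are illustrative assumptions.
POLICY = {
    "backup_required": True,
    "allowed_networks": {"corp", "dmz"},
}

def violations(workload: dict) -> list[str]:
    """Check a workload (VM or container alike) against the shared policy."""
    problems = []
    if POLICY["backup_required"] and not workload.get("backup"):
        problems.append("no backup configured")
    if workload.get("network") not in POLICY["allowed_networks"]:
        problems.append(f"network {workload.get('network')!r} not allowed")
    return problems

vm_app = {"kind": "vm", "backup": True, "network": "corp"}
new_agent = {"kind": "container", "backup": False, "network": "corp"}
```

Because the check never asks whether a workload is a VM or a container, old and new apps end up with the same security, networking and backup posture.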

Ciruli acknowledged that Kubernetes requires specific training and experience. That poses a question: What can be done to make it easier to run a Kubernetes cluster? AI is taking on more grunt work from the folks running container environments. 

“This is a case where AI already is making their lives easier,” Ciruli said.

What’s Ahead

Global IT leaders expect to integrate AI agents into their business strategies, according to the ECI report. Top responses included enhancing customer or employee experiences (61%); improving productivity and efficiency (58%); and creating new products, services, or revenue streams (57%).

For all this optimism, AI’s future seems uncertain. 

“We're still waiting for the killer app from the end user perspective,” McDowell said.

“It's one of these technologies that is going to change everything, but often under the covers,” McDowell said. 

He said AI already is embedded across Microsoft’s user platform, for instance. Google Workspace and Zoom use it, too. It’s found in many apps for translation, transcription and image analysis.

“This is the year it's going to start to pervade through the enterprise,” McDowell added.

Current AI implementations give a sense of where things are going. For instance, McDowell said, railways use edge-computing devices that scan passing railcar wheels for signs of early wear or failure. In retail, AI at the edge answers key logistics questions.

“These stores have cameras pointed at the shelves,” McDowell said. “If I'm out of Kellogg's Frosted Flakes, it sends an alert. The camera, the AI knows.”

Podcast transcript:

Jason Lopez: This is the Tech Barometer podcast. I'm Jason Lopez. Nutanix released its eighth annual Enterprise Cloud Index this month, and two of the key findings stood out. Containers are rapidly becoming the foundation of how modern applications are built and run, and shadow AI is spreading through organizations largely unmanaged. We talked about this issue of shadow AI with NAND Research chief analyst Steve McDowell.

Steve McDowell: You're like, "You know what? These IT guys, they're out of their mind. I need to use this because it's going to help me and I'm just going to pull the trigger and make the decision." Yeah, sure. The employee's getting a lot of benefit from this engagement, but you have no idea it's happening. I don't know where it's happening and I don't know what data's being exposed and where that data's going to. I think we see this kind of every big technology transition. Users are going to make their own decisions. Your employees are going to make their own decisions about the technology they use. It may or may not intersect with your corporate guidelines for IT. I remember when smartphones entered the world, that caused a lot of consternation among enterprise IT teams because how do I manage these devices? And they even coined a word, right?

BYOD, bring your own device. That was a hot topic for several years until Apple came out with kind of enterprise management tools and things smoothed over. And CloudWorld kind of did the same thing a half a decade later. We're doing that now with AI. I mean, AI is so beneficial. Now, scrolling through LinkedIn and reading all the AI-generated slop, it's not clear everybody knows how to use it, but they're using it.

Jason Lopez: The Enterprise Cloud Index is a snapshot of an industry in the middle of a profound transition, and it raises a question worth asking. What exactly is driving all of this and where does it lead? Dan Ciruli has watched this transformation from the inside. As the cloud native product leader at Nutanix, he spends his days talking to organizations navigating this shift firsthand. Ken Kaplan, editor-in-chief of The Forecast, interviewed him and asked him to step back and survey the arc of technology he's lived with for more than a decade. And what he's hearing now from the people on the front lines.

Dan Ciruli: I think it is safe to say at this point that containers in general and Kubernetes in particular are, I'm going to say it out loud, the de facto standard for developing and deploying new applications. It is no longer something that is a science experiment, which it was eight years ago. It's no longer merely a viable option, which it might have been three or four years ago. I would say these days, building your application to be packaged in containers and deployed on Kubernetes is the de facto standard for how applications are being developed. And I'm pretty comfortable saying that out loud.

Ken Kaplan: What are you seeing the success and struggles now in this new period compared to the early days? How have things changed?

Dan Ciruli: One thing that people are wrestling with is that when it was in that phase that it was more of a science experiment or just some of the new applications, you had certain developers who would lean in and certain developers who were not affected by it. And now I think we're reaching the stage that everybody needs to be comfortable with it. Everybody needs to be ready for whatever application you're working on the next time they ask you for a thing like, "Okay, it better be packaged up in a container. We better have some Kubernetes running wherever it is. We need to deploy that application." In becoming the de facto standard, that means it's no longer just a pocket of the organization doing Kubernetes. Now we're in the phase that everybody needs to be comfortable talking about it, using it, deploying it, running things on it, SREing on it.

Ken Kaplan: You're talking to more customers probably than ever before. You would like to talk to more. What are you learning from them?

Dan Ciruli: One of the things we're trying to help our customers with is as they approach that transition is how they standardize. When you have pockets of people leaning into a technology, then you have pockets of people leaning into a technology and some might lean in a slightly different direction than others. Some people might be making a certain architectural choice, a certain security choice using a certain technology, which works for that team, but might not work for the organization as a whole. And as you adopt this technology wholesale across the organization, then you have to think about, well, how do we standardize? What policies do we want to set that apply to everybody? How do we centralize this so that this can be run by a centralized team rather than by individual teams? So it's a big transition. And as I say, it's become the de facto standard.

And I say that with confidence, but for many organizations, they are still figuring out how that will affect them and how they, as an organization, do that in an efficient way that gives them the benefits that developers want, that give them the ability to innovate quickly, deploy frequently and scale as needed.

Ken Kaplan: And do you see a correlation or it's similar to what happened with cloud and people could go out and use a credit card and get compute services for the first time. And then there was all these projects going on and IT didn't know about it. Do you see some similarities?

Dan Ciruli: I had never thought about it that way, and I think that is a fantastic way to look at it. Cloud was very much the same way. I was part of a team that got frustrated with internal IT and the length of time it was taking them to get us literally just VMs and the amount they were going to charge us back. And we said, "Somebody, on your corporate card, go to Amazon and start using this stuff." Yes. So I think Kubernetes started the same way, where a team was like, "Hey, I don't want to deploy in VMs. I'm just going to start using Kubernetes." And then another team did, and another team, and at some point, someone higher up and more central in the IT organization said, "Wait, there's way too much of this going on. This is not efficient. We've got different teams.

We're spending too much money on this. We're duplicating effort all over the place. And from a security posture, we're at risk." So someone centrally said, "We need to, for all of those reasons, we need to centralize this. We need to standardize this and we need to do this in a way that does give the teams what they want, but does it in a way that doesn't put us at risk financially or from a security perspective?" That's an excellent analogy.

Ken Kaplan: Yeah. You hear about smart cloud strategy now. It used to be cloud first, now smart, but those people who have lived through that transition and are getting smart about their use of cloud, those are probably lessons that they can be applying to Kubernetes and containers. Why would the companies benefit now from saying, "That stuff's useful. Let's do it here together on this platform that can do the old and the new."

Dan Ciruli: So the interesting thing about me saying very strongly, this is the de facto standard for how applications are written. That doesn't change history and it doesn't change the fact that at essentially every enterprise, they have decades' worth of applications running in virtual machines that are for the most part going to stay in virtual machines. That isn't changing. And while the new stuff is all containerized, the old stuff was virtualized and will continue to be virtualized. Very few of those apps will be rewritten to be containerized, which means as you centralize operations for all of this, now you've got one IT team that is going to be, for the next decade, two decades, responsible for running tons of VMs and a growing number of containers that at some point will outnumber the VMs, but those VMs will still be there.

Companies have a choice. They can build up two separate groups of people, pieces of hardware, networking strategies, security technologies to manage those, in which case they're building literal silos in data centers of which hardware can run which applications, building silos in their organizations of which people can work on which applications, and actually hampering integration. You might have a new application that needs to get at an old database, an old piece of data that is in a virtual machine. And if you're building those things entirely separately, that's a challenge every time you do it.

Whereas companies that say, "Let's combine these on one," they allow themselves to run essentially any application. It doesn't matter if it's virtualized or containerized, you can run it in the same locations. It means the same people who work on running the infrastructure can run that infrastructure, whether it is running virtualized or any combination of containerized applications. And it means you can do things like set security policies, set backup policies, set disaster recovery and networking policies. All of those can be consistent and it becomes much easier to interoperate between the new and the old. So I think it's a really strong driver for companies to invest in ways to run all of their both containerized and virtualized applications in a way that makes sense.

Ken Kaplan: And when they're doing that, are the apps that are running on VMs and containers, are they able to tap into the same storage or databases or different aspects of the system? Or is it you have this ability to manage them both, but they're still kind of separate?

Dan Ciruli: I mean, I talked to some organizations who have their VMware environment and their OpenShift environment. They are literally building separate teams, buying separate hardware, run by separate people. From an integration standpoint, calling back and forth, it might as well be separate companies because you've got different networking, you've got different security, right? And what we advocate for is running a hypervisor on all of that hardware that you buy. And then within that, some of those, it's just an app running in a VM and some of those it's Kubernetes running in those, but there's the same networking underneath. And with Flow, you can write a networking policy that says, "This containerized application needs to talk to this virtualized application." That we feel is a tremendous advantage. You don't end up with hardware silos. Hardware silos are always a bad idea because inevitably one team has more hardware than they need.

We bought a huge cluster and we're only using 60% of it for our, say, containerized applications. Meanwhile, we've got this other cluster that's running virtualized applications. It's 100% full, but we have another virtualized application that we need to run. In that case, the only way to do that is to go procure more hardware. Whereas when you're running things more homogeneously, one system that can handle either, you can say, "Oh, no problem. We'll run more VMs on it." So the hardware siloing is a really big deal. And that same thing feeds down into the networking: "I want this application to talk to this application." Well, that's complicated when you've got completely different networking solutions on the two sides.

Ken Kaplan: In the report, it said, "Currently, 71% are running their AI-enabled applications on a mix of traditional apps in virtual machines and modern apps in containers on VMs," while 14% are running their AI-enabled apps directly on bare metal servers. Is that significant and is it something you see might be changing?

Dan Ciruli: I think what that's pointing to is AI is, in some cases, it's an experiment. Companies are buying hardware specifically to run these AI applications. So we do see people doing that, but what most enterprises will quickly realize is that you want AI embedded in all of your applications. I'm firmly convinced that in three years we won't differentiate between applications and AI applications. We'll just call them applications. It doesn't matter if it's just a business process workflow, if it's your email, if it's your sales database, you want AI everywhere. As I said before, some of those applications aren't going to move out of a VM, which means you need to be able to figure out how to, maybe it's an API call out that is going to hit a new service, which is an agent running in a container, but you need that logic embedded in that traditional application. So there is some experimentation going on where people are just going to buy some new hardware and run AI there. In the long run, you're going to want to run your containerized and your virtualized applications together. You will also want that to be, I'll call it, AI enabled. You will want to be able to, from anywhere, embed logic that depends on maybe an agent or maybe an LLM that is independent of whether or not that original application is deployed in a VM or deployed in a container.

Ken Kaplan: Since we're on this topic, like AI agents, you just think of AI agents as another app that's probably cloud native.

Dan Ciruli: Almost certainly. As I said before, it's become the de facto ... containerization has become the de facto standard for writing and deploying new applications. The word agentic is only about 18 months old. All of these agents are new applications. They're all being written to be deployed in containers. And so yeah, all of that. And what I think of as agents is these are AI that can do things. That's how I think of it. An LLM, you ask it a question and it answers; an agent is actually doing things. It might be validating someone else's answer. It might be going to actually interact with a system, but agents are where AI actually does things. And yes, we have already moved from the phase of AI where I just wanted it to help me answer a question to: I want it to go do a thing for me.

Ken Kaplan: Yeah. I'm going to set up a playbook and have them run my playbook. And yeah, it's happened pretty quickly. All right. Here's my last question here. Does it take specialized skills to manage containers and Kubernetes and is that changing IT teams?

Dan Ciruli: It still does take additional knowledge to run containerized applications, to run Kubernetes. There are still things that you do need to learn. There's no doubt about that. In part, I think that this gets solved through education and people learning to use new tools. If you think about how the data center evolved: at some point, Linux became, I won't say a de facto standard, but super, super common. And people had to learn how to navigate in Linux, right? At least people in infrastructure and operations. At some point, virtualization went from being a science experiment to the standard way things were run, and people had to learn the concepts of virtualization. The same thing is happening with the Kubernetes landscape and more and more people are having to learn these concepts.

But the other thing I think that is happening at the same time is that as an industry, and this is someplace that we are leaning in as a company, how do we make that process easier? How do we not just make that easier by giving lots and lots of education, but how do we make it easier for the average operator to run a Kubernetes cluster? How do we make the tools easy to use? How do we make the tools smart so that the tools are doing more and more of the grunt work? And I think this is a case where AI already is making their lives easier. The analogy I like to use with this is when I learned to drive in the 1980s, it was very common that you knew how to change the oil in your car, certainly check the oil level. You had to know where your dipstick was. You were going to be doing that. You probably knew where your carburetor was. You definitely knew where your spark plugs were. You knew how to clean a spark plug, you knew how to replace a spark plug.

Not every driver did, but most did. That was just common, right? When was the last time anyone did any of those things in their car? The car has gotten better. It is tuning itself constantly. It is so much better at operating. The car in effect has gotten mechanically smarter, and now in some cases, actually computationally smarter. Well, our systems are doing the same thing. As vendors, we are trying to make it so that, just as the car is continually tuning itself, Kubernetes is continually tuning itself. We don't need to teach everybody in the world to be a mechanic. We want to make this machine easier to operate. And that's a big part of where we're investing: how do we make it easier for those operations teams who, for all the reasons we discussed, are having to run Kubernetes? How do we make it so that doesn't feel like a burden for them, but it's just another tool in their toolbox?

Jason Lopez: Dan Ciruli is the general manager of cloud native at Nutanix. He's also co-founder of the OpenAPI Initiative, which established the universal standard for how software systems describe and communicate with each other. Ken Kaplan is the editor-in-chief of The Forecast. The Forecast produced this interview. It's part of the Tech Barometer podcast series. I'm your host, Jason Lopez. You can find articles and other podcasts at The Forecast. Just go to The Forecast by Nutanix. That's all one word, theforecastbynutanix.com.

Tom Mangan is a contributing writer. He is a veteran B2B technology writer and editor, specializing in cloud computing and digital transformation. Contact him on his website or LinkedIn.

Jason Lopez is executive producer of Tech Barometer, the podcast outlet for The Forecast. He’s the founder of Connected Social Media. Previously, he was executive producer at PodTech and a reporter at NPR.
