The Disruptive Force of Cloud Native

How developing and running applications in the cloud is accelerating business success.

  • Article: Technology
  • Key Play: Enterprise AI
  • Nutanix-Newsroom: Article
  • Products: Nutanix Kubernetes Platform (NKP)
  • Use Cases: Cloud Native

November 15, 2025

The AI revolution is here, and enterprises are relying more heavily on cloud native technologies to accelerate their pace. According to the Nutanix 2025 Enterprise Cloud Index (ECI), a majority of organizations (80%) have implemented a GenAI strategy. The report states that organizations are primarily using GenAI for customer support and experience right now, to improve productivity, automation and efficiency.

The rapid move toward AI, with most applications developed using cloud native technologies, is making Kubernetes and containers common tools for building and managing AI applications. The Nutanix ECI showed that 94% of developers say that cloud native architectures, in combination with containerization tools, have become the “gold standard” for deploying and supporting GenAI and other modern AI applications at scale.

“I think AI is going to be on premise,” said Dan Ciruli, vice president and general manager of the cloud native product team at Nutanix, in a recent article published on The New Stack.

“It’s going to be in the cloud. It’s going to be everywhere. It’s going to be on a mobile phone — it’s already there. All cloud apps are eventually going to be AI apps, right? With AI applications, it’s good to build them on containers because that gives you scalability and the portability that you would need.”

RELATED Orthogonal Advantages of Cloud Native Technologies
In this video interview, Nutanix cloud native technology expert Dan Ciruli describes the trends and technologies powering an explosion in new applications, particularly those with AI capabilities.
  • Nutanix-Newsroom: Article, Video

August 19, 2025

In a video interview with The Forecast, Ciruli said GenAI is driving the shift toward cloud-native applications and containerization. He explained that new GenAI applications are written to deploy in this cloud-native way. Virtually everything that’s being done in the AI space is happening in Kubernetes, which is very scalable, said Ciruli.

“Ten years from now, we'll be blown away at where container-based applications are running and what those things are accomplishing,” Ciruli said. 

However, he noted that implementing AI is challenging due to the complexity of Kubernetes. Because AI requires these containers as a foundation, Ciruli said many companies are now starting to think about managing these containers at scale, often for the first time. He explained that many organizations still lack the basics, including microservices and cloud-native architecture, as well as knowledge of how to implement Kubernetes and cloud services at scale.

“The problem is that Kubernetes is too hard to use,” he told The Forecast.

“AI is going to be one of the things that solves that. And when we get to what we call intelligent infrastructure that is using ML to understand that here's a problem that maybe hasn't happened yet but is about to happen, and here's what we do to solve it. 

“And you can imagine a day, and we're already seeing this, we've already got AI chatbots built into our software that allow you to say, in plain English, why am I seeing this crash loop error, and what can I do about it? And have AI help to solve that problem. So AI will help us bridge that skills gap by making the technology easier to use.”
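The kind of diagnosis Ciruli describes starts from pod status data that Kubernetes already exposes. As a minimal sketch (not Nutanix's implementation), the dictionary below mimics the shape of `kubectl get pod -o json` output; field names follow the Kubernetes Pod API, but the values are invented for illustration:

```python
# Illustrative sketch: spotting a crash-looping container from pod status data.
# The sample dict mimics the Kubernetes Pod API's containerStatuses shape;
# the values themselves are made up for this example.

def find_crash_loops(pod_status: dict, restart_threshold: int = 3) -> list:
    """Return names of containers that appear to be crash looping."""
    suspects = []
    for cs in pod_status.get("containerStatuses", []):
        waiting = cs.get("state", {}).get("waiting", {})
        if (waiting.get("reason") == "CrashLoopBackOff"
                or cs.get("restartCount", 0) >= restart_threshold):
            suspects.append(cs["name"])
    return suspects

sample_status = {
    "containerStatuses": [
        {"name": "web", "restartCount": 0, "state": {"running": {}}},
        {"name": "worker", "restartCount": 7,
         "state": {"waiting": {"reason": "CrashLoopBackOff"}}},
    ]
}

print(find_crash_loops(sample_status))  # ['worker']
```

An AI assistant layered on top of signals like these could then explain, in plain English, why the container keeps restarting and suggest a fix.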

How Cloud AI Changes Application Development

Mark Lavi, former DevOps solutions architect at Nutanix, told The Forecast in 2019 that cloud native requires developers to take a new approach to development by creating applications built with services packaged within containers. He explained that containers, which enable developers to bundle software into a single executable package, do not include operating system images, meaning they require fewer system resources than traditional development environments do.

“Developers generally just want to code, build, deploy and test on their laptop rather than deal with infrastructure and operations teams,” said Lavi. “Utilizing widely deployed cloud native applications such as distributed databases, message queues, storage, etc., allows standardization on proven technologies.”
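The packaging Lavi describes can be pictured concretely: an OCI-format container image is a stack of filesystem layers, each addressed by a SHA-256 digest and listed in a manifest, which is why identical layers can be cached and shared instead of shipping a full OS image. A minimal sketch, with toy byte strings standing in for real tarred layers:

```python
# Illustrative sketch of container image content-addressing: an OCI image is
# a stack of layers, each identified by a SHA-256 digest, plus a manifest
# listing those digests. Toy strings stand in for real tarred filesystems.
import hashlib
import json

def digest(data: bytes) -> str:
    """Content address for a layer, in OCI's 'sha256:<hex>' form."""
    return "sha256:" + hashlib.sha256(data).hexdigest()

layers = [
    b"base runtime files",   # stand-in for a minimal runtime layer
    b"app dependencies",     # stand-in for installed libraries
    b"application code",     # stand-in for the app itself
]

manifest = {
    "schemaVersion": 2,
    "layers": [{"digest": digest(layer), "size": len(layer)} for layer in layers],
}
print(json.dumps(manifest, indent=2))
```

Because each layer is addressed by its content, unchanged base layers are reused across images, keeping containers far lighter than shipping an operating system image per application.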

RELATED How Ivy League Dartmouth College Moved to a Future-Ready IT Platform
Dartmouth College Director of IT Infrastructure Services Ty Peavey explains how moving away from VMware software to the Nutanix Cloud Platform helped his team manage virtual machines and container orchestration across hybrid cloud resources, enabling the University to adapt quickly to rising needs for enterprise AI capabilities.
  • Article: Industry
  • Nutanix-Newsroom: Article

August 7, 2025

Lavi said developers can also tap into community expertise, find efficiencies and focus on the value-differentiated part of their work. Combining code builds and cloud-native services with container delivery creates a multiplying effect, making it possible to code more efficiently. Lavi said that because developers can rely on artifact reuse and rapid delivery, they can code on their laptops without worrying about infrastructure operations.

In an article published in Cloud Native Now titled “Prepare for the Second Wave of Container Management,” Lee Caswell, senior vice president of product and solutions marketing at Nutanix, explained why containerization and Kubernetes have been blessings for developers. He said the previously nitty-gritty work of developing, testing, deploying and scaling applications can now be automated.

“Instead of spending their time meticulously packaging an application with all of its dependencies (libraries, databases, APIs, etc.), developers can focus on writing good code that is segmented into individual microservices and essentially hand it over to Kubernetes, which automates testing, deployment and scaling,” said Caswell.
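One concrete piece of the automation Caswell describes is horizontal scaling. Kubernetes' Horizontal Pod Autoscaler documents its decision rule as desiredReplicas = ceil(currentReplicas × currentMetric / targetMetric); a minimal sketch of that formula:

```python
# Illustrative sketch of the scaling rule documented for Kubernetes'
# Horizontal Pod Autoscaler:
#   desired = ceil(current * (currentMetric / targetMetric))
import math

def desired_replicas(current: int, current_metric: float, target_metric: float) -> int:
    """Replica count the autoscaler would aim for, per the documented formula."""
    return math.ceil(current * (current_metric / target_metric))

# Example: 4 replicas averaging 90% CPU against a 60% target scale out to 6.
print(desired_replicas(4, 90.0, 60.0))  # 6
```

Developers never run this arithmetic themselves; they declare a target utilization, and the platform continuously adds or removes replicas to meet it.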

According to the Nutanix ECI, 96% of government organizations are actively containerizing their applications, significantly outpacing adoption rates across most private industries. However, organizations quickly run into hidden costs when deploying Kubernetes in the cloud, said Ciruli.

Enterprises often have to “bolt on” extra storage capacity. That can mean using cloud storage, which limits portability, or buying on-premises storage, which introduces yet another vendor, he said. When organizations take this “less-than-duct-tape” approach to extra storage, Ciruli said, the result is often fragile integration, which risks data inconsistencies.

“While cloud providers focus primarily on Kubernetes orchestration, enterprises still need to integrate additional capabilities to make Kubernetes enterprise-grade, including observability, networking, service mesh and other tools needed alongside Kubernetes,” said Ciruli.

Second Wave of Kubernetes Providing Efficient Management

The challenge with first-wave Kubernetes platforms is that individual DevOps teams ran them, wrote Caswell. As a result, organizations ended up with many platforms with different components and upgrade cycles, he said. Additionally, Caswell said it was very challenging to detect vulnerabilities in every unique Kubernetes cluster. However, the second wave provides a virtual machine-like experience, allowing customers to manage new container-based Kubernetes applications alongside existing VM-based apps.

“These second-wave Kubernetes platforms will provide centralized management of containers running on-premises, in the cloud and at the edge,” said Caswell.

RELATED AI’s Rising Tide Confounds IT Decision Makers
In a video interview, Nutanix’s Lee Caswell tells how IT customers face changing needs as they embrace AI capabilities.
  • Nutanix-Newsroom: Article, Video

July 25, 2025

“Retailers that want to adopt containers to run in-store applications, or manufacturers that want to deploy containerized apps in a new facility, for example, can count on a second-wave Kubernetes platform to help deploy those environments and manage them centrally using the tools and processes they’re already used to using.”

By enabling automation and developer collaboration throughout the development process, these platforms free highly prized staff from much of the drudgery of old-school development to focus on more strategic tasks. These include building more intuitive, customer-facing applications, hopefully ahead of the competition, which in turn drives more revenue and profit.

Jennifer Goforth Gregory updated this article, which was originally published on May 3, 2019 and written by Bill Laberis, a veteran IT writer and former editor in chief of Computerworld.

© 2025 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.
