
The Arc of Cloud Native Transformation

IT industry experts discuss the paradigm shifts driven by over a decade of cloud native computing innovation.

March 12, 2026

Many believe the term "cloud native" first hit the scene in 2010 as a way to describe applications that "behave well in the cloud." Cloud native innovation is now an essential part of nearly every new enterprise application, having evolved from containers and container orchestration to microservices. As organizations grapple with managing AI workloads, security demands and cost-control mandates today, cloud native technologies are poised to proliferate for years to come, according to industry experts.

“We're moving from cloud native to AI native, and the companies that understand this shift will define the next decade of enterprise IT,” Rob Enderle, principal analyst at Enderle Group, told The Forecast.

Docker's promise of “build once, run anywhere” matured into sophisticated orchestration platforms like Kubernetes that enabled organizations to deploy applications at unprecedented scale and velocity. Rapid innovation and widespread use of these cloud native technologies have altered how businesses operate and build for the future.

Eighty-seven percent of respondents to the 2026 Enterprise Cloud Index (ECI) expect to increase their use of containers for applications over the next three years. Of 1,600 cloud, IT, and engineering executives surveyed from around the world, 85% of respondents believe AI is accelerating their container adoption.


"It is safe to say at this point that containers in general and Kubernetes in particular are the de facto standard for developing and deploying new applications," said Dan Ciruli, vice president and general manager of Cloud Native at Nutanix. 

He traced the technology's evolution from experimental eight years ago, to a viable option a few years later, to the default choice for application development today.


“The word agentic is only about 18 months old,” Ciruli told The Forecast. “All of these agents are new applications. They’re all being written to be deployed in containers.”

"Ten years from now, we'll be blown away at where container-based applications are running and what those things are accomplishing."

The Container Revolution

The containerization revolution began around 2013, when Docker offered developers an elegant solution to the age-old challenge of application portability across different IT infrastructures. By packaging applications with their dependencies into lightweight, portable containers, Docker eliminated the friction between development and production environments. This breakthrough enabled consistent deployment across diverse infrastructure, from developer laptops to production cloud platforms.
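As an illustration of that packaging model, a minimal Dockerfile (the base image, file names, and entry point here are hypothetical) bundles an application together with its runtime and dependencies into one portable image:

```dockerfile
# Hypothetical example: package a small Python app with its dependencies.
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself.
COPY . .

# The resulting image runs identically on a laptop or a production cluster.
CMD ["python", "app.py"]
```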

The impact was immediate and profound. Development teams could iterate faster, operations teams could deploy more reliably, and organizations could leverage infrastructure more efficiently. Containers provided the abstraction layer that cloud native applications needed, separating application logic from underlying infrastructure concerns. This decoupling proved essential for the microservices revolution that followed.

Orchestration and the Rise of Kubernetes

As container adoption accelerated, a critical challenge emerged: managing containers at scale. Early container deployments revealed gaps in scheduling, service discovery, load balancing, and automated rollouts. Organizations running hundreds or thousands of containers needed sophisticated orchestration to maintain operational sanity.

Kubernetes emerged as the standard for container orchestration, providing automated deployment, scaling, and management of containerized applications. Originally developed at Google and open-sourced in 2014, Kubernetes brought battle-tested practices from Google’s internal systems to the broader market. The platform’s declarative configuration model and robust ecosystem enabled enterprises to operate container infrastructure at previously impossible scales.
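Kubernetes' declarative model means operators describe the desired state and the platform continuously reconciles reality to match it. A minimal Deployment manifest (the application name, image, and replica count are illustrative) might look like this:

```yaml
# Hypothetical example: declare "three replicas of this container should run";
# Kubernetes schedules, restarts, and rebalances pods to keep that true.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
        - name: web
          image: registry.example.com/web-frontend:1.4.2
          ports:
            - containerPort: 8080
```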

The Kubernetes ecosystem matured rapidly, with contributions from major technology vendors and a vibrant open-source community. This collaborative approach accelerated innovation and prevented vendor lock-in, making Kubernetes the Switzerland of cloud infrastructure. Today, every major cloud provider offers managed Kubernetes services, and the platform has become synonymous with cloud native computing.

Microservices Architecture

Container orchestration platforms like Kubernetes enabled organizations to fully embrace microservices architectures. Rather than building monolithic applications where all functionality resides in a single codebase, development teams decomposed applications into smaller, independently deployable services. Each microservice handles a specific business capability and communicates with other services through well-defined APIs.
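The pattern can be sketched in a few lines of Python: a single-purpose service that owns one business capability (a hypothetical "price" service) and exposes it to other services over a small HTTP API, using only the standard library.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceHandler(BaseHTTPRequestHandler):
    """Hypothetical microservice owning one capability: product pricing."""

    def do_GET(self):
        if self.path == "/price/sku-123":
            body = json.dumps({"sku": "sku-123", "price": 19.99}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Bind to port 0 so the OS picks a free port for this demo.
server = HTTPServer(("127.0.0.1", 0), PriceHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Another service would consume this capability over the network API:
with urllib.request.urlopen(f"http://127.0.0.1:{port}/price/sku-123") as resp:
    data = json.loads(resp.read())
print(data)
server.shutdown()
```

In a real deployment each such service would live in its own container and be scaled independently; the well-defined API is what lets teams change one service without touching the others.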

This architectural shift delivered tangible business benefits. Teams could develop, test, and deploy services independently, accelerating release cycles and reducing blast radius when problems occurred. Organizations gained flexibility to choose optimal technologies for each service while maintaining overall system coherence. The ability to scale individual services independently improved resource utilization and cost efficiency.


However, microservices introduced complexity. Distributed systems presented new challenges around service discovery, inter-service communication, distributed tracing, and failure handling. The operational overhead of managing dozens or hundreds of services required new tooling and practices. Despite these challenges, the benefits of microservices proved compelling for organizations building complex, rapidly evolving applications.

Cloud Native Business Impact

The cloud native transformation has delivered measurable business value. Organizations report faster time to market, improved operational efficiency, and greater agility in responding to changing requirements. The ability to leverage commodity infrastructure and open-source tooling has democratized access to sophisticated capabilities previously available only to tech giants.


Industry analyst Rob Enderle of the Enderle Group observes that cloud native technologies have leveled the playing field. 

"The barrier to entry for sophisticated digital services has dropped dramatically,” Enderle said. “A startup can now deploy globally scalable infrastructure that would have required millions in capital investment just a decade ago."

However, the technology landscape continues evolving. The emergence of artificial intelligence and machine learning workloads presents new infrastructure requirements. Enderle explained that traditional cloud native architectures, optimized for stateless web applications and microservices, must adapt to support data-intensive AI training and inference workloads.

The AI-Native Infrastructure Challenge

Ciruli sees AI as the next inflection point for infrastructure evolution.

"We're entering the AI-native era, where infrastructure must be designed from the ground up to support machine learning workloads,” Ciruli said. “This requires rethinking storage, compute, and network architectures to handle the unique demands of AI applications.”


As enterprises navigate these changes, Ciruli emphasized the importance of balancing innovation with operational reality. IT teams are embracing containers and AI for new development while maintaining robust support for existing virtualized workloads, all while making these increasingly complex systems easier to manage. He said having one unifying platform can simplify and fortify operations, and remove silos.

"Companies that combine these on one platform allow themselves to run essentially any application, it doesn't matter if it's virtualized or containerized," he said. 

This approach enables consistent security policies, backup procedures, disaster recovery protocols, and networking configurations across both environments.

Looking ahead, Ciruli sees AI becoming deeply embedded across all enterprise applications rather than remaining a separate category.


Brust said enterprises should also resist the urge to wait for a turnkey solution.

“There isn’t really anything to buy quite yet,” he said. “Customers are going to have to articulate what they need.”

In practice, that means treating early agent systems as supervised tools rather than autonomous workers. Human review, approval, and oversight remain essential.

Hybrid infrastructure adds another layer of complexity. Most enterprises operate across on-premises systems and multiple clouds, and agent platforms must function consistently in all of them. Fragmented governance in those environments only increases operational friction. For those reasons, some infrastructure providers, including Nutanix, are positioning their hybrid platforms as foundations for running these workloads.

"You want AI everywhere, whether in business process workflows, email systems, or sales data,” he said.

“In the long run, you're going to want to run your containerized and your virtualized applications together,” Ciruli said.  

“You will also want that to be AI-enabled. You will want that to be able to run from anywhere, whether or not that original application is deployed in the VM or deployed in a container.”

Cloud Native Innovation Moves Ahead

The trajectory from containers to Kubernetes to microservices demonstrates technology's capacity for rapid evolution. Each innovation builds on previous breakthroughs, creating new possibilities while solving emergent challenges. The shift toward AI-native infrastructure and operational efficiency represents the next chapter in this ongoing story, according to Enderle.

He said that with the rise of enterprise AI, the cloud native principles of automation, abstraction, and operational excellence remain relevant. They're simply being applied to a new generation of workloads and challenges.


“The fundamental promise of cloud native computing (agility, scalability, and efficiency) hasn't changed," Enderle noted.

"What's changing is the workload mix and the infrastructure requirements. Organizations that can adapt their cloud native practices to support AI workloads while maintaining sustainability commitments will thrive in the next decade."

The journey from Docker containers to AI-native infrastructure drove organizations to rethink their approaches, invest in new capabilities, and embrace change. The path forward demands similar adaptability, as enterprises balance the opportunities of AI with the imperatives of sustainability and operational excellence.

Scott Steinberg is a business strategist, award-winning professional speaker, trend expert and futurist. He’s the bestselling author of Think Like a Futurist; Make Change Work for You: 10 Ways to Future-Proof Yourself, Fearlessly Innovate, and Succeed Despite Uncertainty; and Fast >> Forward: How to Turbo-Charge Business, Sales, and Career Growth. He’s the president and CEO of BIZDEV: The International Association for Business Development and Strategic Partnerships™. Learn more at www.FuturistsSpeakers.com and LinkedIn

Ken Kaplan, Editor in Chief for The Forecast by Nutanix, contributed to this story. Find him on X @kenekaplan and LinkedIn.

© 2026 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.
