How Kubernetes Catalyzes Enterprise IT

Dan Ciruli, vice president and general manager of cloud native technologies at Nutanix, explains how Kubernetes started as a tool for managing containers but has rapidly evolved into the foundation for modern cloud-native computing across data centers and the edge.

January 8, 2026

Kubernetes began as a tool designed to simplify the lives of software developers. Now it’s poised to become omnipresent across the computing landscape.

“Essentially, all new applications are written to run on Kubernetes,” said Dan Ciruli, Nutanix’s vice president and general manager for cloud native technologies, in an interview with The Forecast.

“We'll be using it for everything eventually.”

Ciruli recalled Kubernetes’ evolution from a cloud-native development tool to a must-have platform for every kind of computing: cloud infrastructures, on-premises data centers, AI/ML apps and devices connected at the network edge. His perspective addresses some of the key computing issues that IT leaders and software developers will face in the next few years. 

Enterprise: From Cloud to On-Prem

Ciruli has been closely following the rise of Kubernetes since its arrival a decade ago. The platform helps developers deploy microservices-based architectures, in which each microservice typically runs in its own container: a portable package that bundles an application with everything it needs to run.

"As a developer, it's so much easier to put your app in a container,” Ciruli said. “It makes testing and deployment really, really simple.”
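To illustrate how simple that packaging can be, here is a minimal, hypothetical Dockerfile for a small Python web app (the file names, base image and port are illustrative assumptions, not details from the article):

```dockerfile
# Start from a small official Python base image
FROM python:3.12-slim

# Copy the app and install its dependencies inside the image
WORKDIR /app
COPY . .
RUN pip install --no-cache-dir -r requirements.txt

# Document the port the app listens on and define how to start it
EXPOSE 8080
CMD ["python", "app.py"]
```

Once built, the resulting image runs identically on a laptop, in a test pipeline or in production, which is the testing and deployment simplicity Ciruli describes.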

Kubernetes and similar container orchestration tools help developers weave clusters of microservices together and keep them running reliably. If one microservice crashes, for instance, the others keep operating. Kubernetes automatically detects failed containers within a cluster and restarts or replaces them to restore the desired state. This is a distinct advantage over monolithic applications, where losing a single process can crash the entire app.
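As a sketch of that self-healing behavior, a Kubernetes Deployment manifest like the one below (the names, image and health-check endpoint are illustrative assumptions) declares three replicas of a service; if a container stops responding to its liveness probe or its node fails, Kubernetes restarts or reschedules it to match the declared state:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app            # hypothetical service name
spec:
  replicas: 3               # Kubernetes keeps three copies running at all times
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
        - name: demo-app
          image: registry.example.com/demo-app:1.0   # assumed image location
          livenessProbe:               # failed probes trigger an automatic restart
            httpGet:
              path: /healthz           # assumed health endpoint
              port: 8080
            periodSeconds: 10
```

The operator declares the desired state; the cluster continuously reconciles reality toward it, which is why losing one replica never takes down the whole service.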

Cloud native development is growing increasingly mainstream. The Cloud Native Computing Foundation (CNCF) notes that 15.6 million developers worldwide build apps in the cloud. Precedence Research, meanwhile, expects the market for cloud native development to surge from $50.3 billion in 2025 to $172.5 billion in 2034.


Ciruli said that many enterprises are deploying cloud native applications on-premises across their own data centers. For many, that’s because certain data can’t be hosted in the cloud for compliance reasons. For example, he said government agencies and global corporations might want containerized apps in an air-gapped environment unplugged from the cloud.

“You should be able to run your application where you want to,” Ciruli said.

Simplifying On-Prem for Cloud-Native Apps

Putting cloud-native apps to work in on-prem environments is appealing in principle but complicated in practice. 

“Using Kubernetes in the cloud is solved,” Ciruli said. “But using Kubernetes on-prem can still be difficult.”

He said the vast majority of IT teams rely on virtual machines to handle workloads, and getting Kubernetes to run on virtual machines can be challenging. These challenges motivated Nutanix, a pioneer of hyperconverged infrastructure that virtualizes compute, storage and networking, to build a single control plane that runs Kubernetes anywhere. The Nutanix Kubernetes Platform (NKP) simplifies microservices management in on-prem data centers, in the cloud and on edge devices.

“NKP Full Stack lets enterprises run any combination of virtualized and containerized applications,” Ciruli said. 

He said Nutanix works with partners to provide additional capabilities. Traefik Labs, for example, adds a unified application intelligence layer that orchestrates application programming interfaces (APIs), routing some traffic to VMs and other traffic to containers, depending on which best suits an enterprise’s needs.

“It’s just making sure enterprises have everything they need in the world we'll be living in for the next 20 years, in which some applications or services are running in VMs and others are running in containers,” Ciruli said.

“We bring the application layer, Nutanix brings the rest of the infrastructure, and that's why it's a beautiful partnership,” said Sudeep Goswami, CEO of Traefik Labs, in an interview at KubeCon + CloudNativeCon North America 2025.

AI/ML: Narrowing the Kubernetes Skills Gap

“Kubernetes is the de facto platform for AI infrastructure,” Ciruli told The Forecast. Multiple data points bear this out: microservices and containers suit a wide range of AI functions. Research commissioned by Portworx, for instance, suggests 54% of AI/ML workloads run on Kubernetes. Sysdig’s 2025 Kubernetes and Cloud-Native Security Report, meanwhile, noted that AI/ML workloads increased 500% in the past year.


For all its appeal to developers, Kubernetes has a steep learning curve because it touches on pretty much every aspect of coding, testing, deploying, securing and maintaining applications. Thus, finding seasoned Kubernetes practitioners is a global challenge.  

Could AI help close these gaps?

“I have seen some pretty concrete examples of people using ML or AI to make running Kubernetes at scale easier,” Ciruli said. 

Ciruli is skeptical, however, of claims that large language models (LLMs) will usher in massive job losses. 

“I do believe that intelligent systems can be easier to run, meaning someone doesn't have to get five years of experience before they can run a Kubernetes cluster,” he said.

Completing the Circle: Cloud-Native at the Edge

“Right now, Kubernetes is in the middle of taking over the edge,” Ciruli said. Assembly lines and distribution centers, for instance, are installing smart edge devices to monitor performance and take automation into realms like robotics.


Ciruli sees containers orchestrated by Kubernetes becoming the default way workloads are deployed to physical devices.

“Containerization and Kubernetes have given a single model that lets developers deploy applications anywhere – from the cloud to the datacenter to the edge. We see it in vehicles on the road, on the sea, and in the air. It simplifies the lives of developers and makes operations much more consistent. And in the future, who knows where else we’ll see it.” 

Tom Mangan is a contributing writer. He is a veteran B2B technology writer and editor, specializing in cloud computing and digital transformation. Contact him on his website or LinkedIn.

© 2026 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.
