Business

Data Center and Cloud Cost Control Crucial in Enterprise AI Era

As AI increases demand for more power and cloud resources, organizations are finding ways to manage the cost of energy consumption.

May 12, 2025

Managing computing resources was already challenging enough over the past decade, with digital transformation moving full steam ahead and public cloud services becoming broadly accessible. Then came the fast-moving wave of cloud native and AI innovation, ratcheting up complexity beyond what was previously imaginable. Now more than ever, a relentless focus on data system efficiency is a business imperative and differentiator.

As demand from AI workloads grows, data center providers and managers who deliver performance while reducing their environmental footprint will have a competitive advantage, according to Andrew Schaap, CEO of Aligned Data Centers, writing for the Forbes Technology Council.

“The sustainability imperative extends well beyond innovations like liquid cooling technology that requires less power and water,” he wrote in Five Trends That Will Define the Data Center Industry in 2025.

He pointed to more energy-efficient electrical infrastructure and modular construction to help shrink carbon emissions.

“We aim to bake sustainability into every aspect of data center operations, whether it’s waste management or reclamation of brownfield sites for new facilities. Providers are also looking at clean, renewable energy from wind and other sources.” 

Aside from raising sustainability concerns, today’s dense, next-generation data centers squeeze every bit of computing efficiency and performance out of the hardware inside them to combat energy’s rising price tag.

“Data centers are no longer just about storing data,” Jae Ro, head of marketing at SIGNAL+POWER, a manufacturer of power cords and electrical plug adapters for business customers — including data centers — told The Forecast.

“They are about providing the massive computing resources that AI models demand,” Ro said. “These facilities are evolving to handle the increased computational load while maintaining high levels of performance.”

Zoom Into Every Aspect of Data Centers to Find Energy Efficiency

Major public cloud companies, including AWS, Microsoft and Google, now invest in nuclear power. In the U.S., government officials are exploring whether the companies driving the data center boom should be treated not only as technology providers but also as designated energy companies.

But there’s more to it. While keeping a constant eye on the big picture, data center managers of all shapes and sizes must dive deeper into the minutiae of workload and code efficiency, says Harmail Chatha, senior director of cloud operations at Nutanix.

“Companies have to start really zooming into their environments and ask what does the workload look like, and how do they measure the emissions of that workload,” Chatha told The Forecast.

“We're just kind of touching the surface on scope one and scope two emissions. How do you really measure scope three, which is the most challenging one? Scope three is all-encompassing, so how do you get to embodied emissions of a server, VMs, workloads, that single little cable within the system and measure the emissions of them?”
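
To make Chatha’s point concrete, here is a minimal sketch of how a server’s embodied (Scope 3) emissions might be amortized across the VMs running on it. This is not a Nutanix tool or methodology; the figures, names and vCPU-share allocation rule are illustrative assumptions.

```python
# Illustrative sketch: amortizing a server's embodied (Scope 3) emissions
# across the VMs it hosts. All figures are hypothetical placeholders.

SERVER_EMBODIED_KGCO2E = 1300.0        # assumed cradle-to-gate emissions for one server
SERVER_LIFETIME_HOURS = 5 * 365 * 24   # assumed 5-year service life

def embodied_emissions_per_vm(vm_vcpus: int, total_vcpus: int, hours_running: float) -> float:
    """Share of the server's embodied carbon attributed to one VM,
    proportional to its vCPU share and the hours it runs."""
    hourly_embodied = SERVER_EMBODIED_KGCO2E / SERVER_LIFETIME_HOURS
    return hourly_embodied * (vm_vcpus / total_vcpus) * hours_running

# Example: a 4-vCPU VM on a 64-vCPU host, running for a 30-day month
print(round(embodied_emissions_per_vm(4, 64, 30 * 24), 3), "kg CO2e")
```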

RELATED Bracing Data Centers for Wave of AI Workloads
In this video interview, Harmail Chatha, senior director of cloud computing operations at Nutanix, describes the growing challenges of managing data centers as business demands for enterprise AI applications climb.

April 25, 2025

He said the industry isn’t there yet but is working on it feverishly, because it must for new innovations like AI to remain feasible going forward.

“There's a lot of effort that's going to go into measuring, and there are so many companies, new startups coming out, that are starting to just touch the surface of how to measure emissions in itself.”

Managing Cloud Services and New AI Workloads

The cloud’s carbon footprint surpassed that of the entire aviation industry, according to MIT researchers in 2022. Cloud service providers have only grown since then, spiking upward with the rapid expansion of artificial intelligence projects. According to data from the U.S. Census Bureau, spending on new data center facilities rose at a compounded annual rate of 40% between 2021 and early 2025, reaching an annual rate of approximately $34.8 billion in February 2025.

Keep in mind that today just one data center can require the same amount of electricity needed to power 50,000 homes. Goldman Sachs reported that today’s data centers consume about 1-2% (some say more) of the world’s total power and conservatively estimated that that figure could double by the decade's end. Scientists at MIT Lincoln Laboratory expect that 1-2% figure to skyrocket, given rising AI demands, potentially hitting 21% by 2030, when costs related to delivering AI to consumers are factored in.

As demand for AI applications continues to rise, data center operators are squeezing more computing power than ever into server racks, which is driving up energy consumption and generating more heat.

RELATED Managing Enterprise AI Sprawl
CIOs must create a cohesive strategy for managing enterprise AI applications and data, which requires establishing a set of validated use cases, drafting policies and frameworks to govern use of AI tools, and centralizing oversight of the technology, says Nutanix CIO Rami Mazid.

March 13, 2025

Enterprises increasingly seek ways to reduce their energy consumption and carbon footprint, both on-premises and in the cloud. Still, energy demand continues to far outpace the results of those efforts.

“I don't think AI is necessarily pushing the limits within data centers yet, but it will very soon if data centers don't start to adapt to new technologies, new cooling infrastructure, new power densities that are required as well,” said Chatha. “As more and more consumption goes in and customers identify workloads that they're going to be running with AI, I definitely see it hitting a limit.”

According to the whitepaper Hybrid Multicloud Deployment Choices Can Increase ROI Via Sustainability Benefits, IDC predicts that “By 2026, 60% of enterprises will implement sustainable AI frameworks, leveraging data-driven decisions to scale AI operations across datacenter locations while meeting decarbonization goals.”

As organizations continue to launch ever more aggressive sustainability initiatives, some might be missing an often-overlooked – but incredibly crucial – area of concern: the cloud.

Data Centers All the Way Down

The cloud has revolutionized the business world, and since its beginning one of its most touted advantages was that organizations could reduce the need for on-premises hardware and management. When sustainability began to emerge a few years ago as a high-priority concern across the industry, that reduced need for on-prem hardware also came with the benefit of reducing an organization’s carbon footprint.

RELATED A New Generation of Data Centers Spreads Use of Enterprise AI
Forward-looking AI data centers are prioritizing memory, efficiency and sustainability.

April 24, 2025

While moving data and applications to private or public cloud infrastructure can reduce an organization’s carbon footprint and energy consumption, that data still has to live somewhere: in the company’s own data center or in facilities owned by hyperscalers, buildings that can span millions of square feet and house thousands of servers and many miles of networking cable and gear.

Hyperscalers continually work hard to reduce their data centers' carbon footprint and energy consumption. But even with their sustainability measures, when an organization’s data resides in a cloud data center, how that organization uses available cloud services still directly affects global carbon emissions and energy usage.

Any data center, from a small business’s on-premises servers to the largest hyperscalers’ server farms, consumes a lot of electricity. In fact, simply cooling those servers can account for up to 40% of the total power they consume, according to MIT research.

RELATED What’s Driving IT Decisions Around Enterprise AI and Cloud Native Technologies
In this Tech Barometer podcast, go behind the findings of the 2025 Enterprise Cloud Index with Nutanix AI and cloud native technology experts, who explain current trends and challenges impacting CIOs and IT decision makers.

March 19, 2025

Data centers consume other types of resources, such as water, and contribute significantly to electronic waste. Concern over resources other than power is growing, said Chatha.

“Many enterprises already use dashboards and tools that give them insight into energy consumption, but over the next couple of years, organizations will also demand more clarity on water consumption and waste as well,” he said.

Within Nutanix’s Prism Central application, IT teams can measure the electrical consumption of a node, Chatha said.

“We're going a step deeper versus just being holistic at a data center level now, but where we ultimately need to do more work to get to the VM level. Once you can measure the VM, then you got to get to the workload level. And that's when you're going to be able to make smart, intelligent decisions on what a workload consumption looks like, correlate that back to the emission factor, and then intelligently, you're able to move those applications around to more sustainable data centers that might have lower PUEs, use more renewable energy as well.”
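
A minimal sketch of the reasoning Chatha describes: estimate a workload’s emissions from its measured IT energy, the facility’s PUE and the local grid’s carbon intensity, then rank candidate data centers. The site names, PUE values and carbon intensities below are assumed for illustration, not measured figures or Nutanix data.

```python
# Estimate a workload's emissions and compare candidate data centers.
# All numbers below are hypothetical, not measured values.

def workload_emissions_kgco2e(it_energy_kwh: float, pue: float, grid_kgco2e_per_kwh: float) -> float:
    """Facility energy = IT energy * PUE; emissions = facility energy * grid intensity."""
    return it_energy_kwh * pue * grid_kgco2e_per_kwh

candidates = {
    # site: (PUE, grid carbon intensity in kg CO2e per kWh) -- assumed figures
    "site_a": (1.6, 0.45),
    "site_b": (1.2, 0.30),
    "site_c": (1.1, 0.05),  # e.g., a site running largely on renewables
}

it_energy = 1000.0  # kWh consumed by the workload's VMs in a month (assumed)
ranked = sorted(candidates.items(), key=lambda kv: workload_emissions_kgco2e(it_energy, *kv[1]))
for site, (pue, intensity) in ranked:
    print(site, round(workload_emissions_kgco2e(it_energy, pue, intensity), 1), "kg CO2e")
```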

He said Nutanix deploys data centers in locations that offer renewable energy and have a lower PUE. 

“It's voluntary, but it's a passion project where we’re caring for what we currently have, but also preserving the future as well,” Chatha said.

Cloud Costs Are Directly Tied to Sustainability

As more organizations closely manage their cloud costs and overall data center energy use, they will rely on new tools and skills, including FinOps and GreenOps. FinOps focuses on monitoring cloud usage, analyzing cloud expenses and optimizing resources to keep the cloud cost-effective for an organization. GreenOps applies technology and industry best practices to make an organization’s use of the cloud more efficient while reducing its environmental impact. The two are complementary approaches.

With a combined FinOps/GreenOps approach, organizations can identify unneeded cloud usage and resources, reduce waste, keep operational costs aligned with sustainability goals, and make decisions that offer the optimal balance between cost-effectiveness and reduction of environmental impact. Organizations can also expand their ecological outlook by adopting green cloud practices, using renewable energy, and working with hyperscalers that offer more resource-efficient data centers.
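
As a rough illustration of how a combined FinOps/GreenOps pass might work, the sketch below flags cloud instances with very low utilization and totals the monthly cost and estimated emissions that rightsizing them could recover. The inventory fields, threshold and numbers are hypothetical and not drawn from any specific tool.

```python
# Hypothetical FinOps/GreenOps-style pass over a cloud inventory:
# flag likely-idle instances, then total the cost and emissions
# that rightsizing or retiring them could recover.

inventory = [
    # (instance_id, avg_cpu_utilization, monthly_cost_usd, monthly_kgco2e) -- assumed data
    ("web-01", 0.62, 220.0, 38.0),
    ("batch-07", 0.04, 410.0, 71.0),
    ("dev-test-3", 0.02, 95.0, 16.0),
]

IDLE_THRESHOLD = 0.05  # assumed cutoff for "likely unneeded"

idle = [r for r in inventory if r[1] < IDLE_THRESHOLD]
print("Candidates to rightsize or retire:", [r[0] for r in idle])
print("Recoverable cost: $", sum(r[2] for r in idle), "per month")
print("Avoidable emissions:", sum(r[3] for r in idle), "kg CO2e per month")
```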

RELATED Enterprise AI Reality Check: Implementing Practical Solutions
In this interview, Induprakas Keri, senior vice president and general manager for hybrid multicloud at Nutanix, explains how IT teams can optimize infrastructure, control costs and deliver measurable business outcomes as enterprise AI kicks into gear.

March 7, 2025

One example of FinOps and GreenOps contributing to more informed decision making is the selection of a cloud data center. Companies can invest in or deploy business applications in a region that facilitates the purchase or development of renewable energy. Relying on computing resources in the Nordic region, for example, can help reduce data center energy costs because cold outside air can cool servers and other hardware. That benefit would be weighed against the cost of storing and accessing data there.
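
A back-of-the-envelope version of that trade-off might look like the sketch below: a cooler region trims the cooling share of the energy bill, but storing and accessing data there adds cost. All inputs are assumed placeholders, not real pricing.

```python
# Illustrative region comparison: energy bill scaled by a cooling overhead
# factor, plus data storage/access costs. All figures are assumptions.

def monthly_total(compute_energy_cost: float, cooling_factor: float, data_access_cost: float) -> float:
    """Total = energy spend scaled by cooling overhead, plus data-related costs."""
    return compute_energy_cost * cooling_factor + data_access_cost

home_region = monthly_total(compute_energy_cost=10_000, cooling_factor=1.35, data_access_cost=500)
nordic_region = monthly_total(compute_energy_cost=10_000, cooling_factor=1.10, data_access_cost=2_200)

print("Home region:   $", home_region)
print("Nordic region: $", nordic_region)  # worthwhile only if energy savings exceed the added data costs
```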

Stunning Growth in Cloud Carbon Emissions Largely Due to AI

It’s early days for AI technology, but the promise of its many benefits has already captivated enterprises around the world. That could be great for revolutionizing business practices and operations, but it could also become an enormous issue for the environment.

GenAI can help organizations get more visibility and insight into their cloud costs and cloud emissions, but it also requires a lot of energy and water.

“AI queries need about 10x the power of traditional searches while generating AI-made music, photos, and videos requires much more,” according to an article published on Carbon Credits. That means querying ChatGPT, for instance, uses ten times more electricity than a search on Google.

RELATED Guiding Enterprise IT Hardware Buyers into the AI Future
In this Tech Barometer podcast, David Kanter, co-founder of ML Commons, talks about intellectual curiosity and how it led him to the forefront of the enterprise AI revolution.

April 23, 2025

Beth Kindig, in a recent Forbes article, identified GPUs as the main culprit behind AI’s energy consumption, with each new generation delivering dramatic increases in both performance and power draw. Kindig found that Nvidia’s H100 GPU consumes up to 75% more power than the earlier-generation A100.

“The 75% increase in GPU power consumption happened rapidly, within two brief years, across one generation of GPUs,” Kindig wrote. 

Nvidia’s Blackwell generation boosts power consumption even further, with a 300% increase across a single generation of GPUs.

While each generation of Nvidia GPUs is more power-efficient than its predecessor, the increase in total power demand still easily outstrips those efficiency gains.

As individual GPUs consume more energy, AI projects are also growing in size. Today’s LLMs can run to hundreds of billions of parameters, and tech experts say the industry is working toward models with trillions of parameters, hence the need for ever more powerful GPUs that consume ever more energy.

The number of GPUs needed for those larger LLMs is also increasing. Kindig wrote: “Nvidia and other industry executives have laid out a path for GPU clusters in data centers to scale from the tens of thousands of GPUs per cluster to the hundred-thousand-plus range, even up to the millions of GPUs by 2027 and beyond.”
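
To put those cluster sizes in perspective, a rough, illustrative calculation follows. The per-GPU power figure (including host and networking overhead) and the PUE are assumptions, not vendor specifications.

```python
# Rough arithmetic for the cluster sizes quoted above.
# Per-GPU power and PUE are assumed values for illustration only.

PER_GPU_KW = 1.0   # assumed ~1 kW per accelerator including supporting hardware
PUE = 1.3          # assumed facility overhead for cooling and power delivery

for gpus in (10_000, 100_000, 1_000_000):
    facility_mw = gpus * PER_GPU_KW * PUE / 1000
    print(f"{gpus:>9,} GPUs -> ~{facility_mw:,.0f} MW of facility power")
```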

As the tech industry works to solve its challenges around growing consumption of energy and other resources, organizations can still take positive steps to cut cloud costs, reduce carbon emissions and promote sustainable IT.

Ken Kaplan is Editor in Chief for The Forecast by Nutanix. Find him on X @kenekaplan and LinkedIn.

Jason Lopez contributed to this story.

© 2025 Nutanix, Inc. All rights reserved.
