Across the Purdue University campus in West Lafayette, Indiana, students, employees, professors and researchers rely on artificial intelligence to help them perform a wide range of tasks. Computer science majors learn to develop code with Copilot, for example. Professors use AI tools to help create lesson plans. Researchers utilize the college’s GPU HPC clusters for machine learning and cutting-edge AI research. Even the school’s IT department uses AI, drawing on its penchant for pattern recognition to analyze campus IT infrastructure and networks for monitoring, scaling and security.
Sundeep Rao, Purdue’s senior director of information technology, thinks AI in higher education is here to stay. However, to meet its potential for all manner of higher-ed users, AI requires the power of modern GPUs to run cutting-edge models and algorithms. That means AI simply cannot run on the aging legacy technology that dominates many university campuses.
Rao suggested that colleges and universities have only one choice if they want to continue using AI to support education, research, and operations: They have to modernize their IT infrastructure.
“Bringing in modernized infrastructure for AI has a twofold benefit — one looking at the past and one with an eye to the future,” Rao noted.
“The first is that modernizing brings an end to technical debt that may have accumulated over the past few decades. Second, universities can then push the envelope of what researchers need for their work — work that is often years ahead of commercial products.”
Jason Duggan, CEO of Thesis Elements, provider of a cloud-based student information system for smaller colleges and universities, wrote in a June 2025 article for University Business that many small and mid-sized institutions still operate on administrative infrastructure that predates the iPhone.
“These legacy systems … have become liabilities,” Duggan said. “They no longer support the complexity of modern operations or the digital expectations of today’s students.”
It’s not just hardware. eCampus News wrote that universities often use legacy software to perform campus functions, such as student information systems (SIS), learning management systems (LMS) and administrative systems. In an April 2025 article about the hidden cost of legacy systems in higher ed, it pointed out that legacy systems can be more expensive to operate, exacerbate data silos, make processes redundant and create a fragmented user experience.
“When campus researchers use on-premises IT hardware to support their work, it can take some time to get the infrastructure set up — and when they inevitably encounter the ‘error’ part of trial and error, the wait for additional resources can delay their groundbreaking work,” author Calvin Hennick wrote in a 2024 article for EdTech.
The issue with legacy systems has been compounding for decades, according to Rao. Because universities like Purdue were early adopters of technology, he said, disparate systems proliferated over the course of many years. For example, when a new research project received funding, the researchers in charge of it would often set up a brand-new, separate lab that did not communicate or connect to other systems.
Rao said that modernizing an entire campus can be an expensive and complicated endeavor because of the vast number of legacy systems and isolated ecosystems.
“Modernization of legacy systems is happening in increments, much like a Tetris game. When everything aligns, one piece of the tech stack revs up,” he explained. “The slower pace of modernization is because of size, complexity and budget, not because any university wants to live in the Paleolithic times.”
Duggan wrote in his University Business piece that many universities frame digital transformation as disruptive or overwhelming. However, cloud-native platforms are built to reduce that friction, he explained.
“They offer lower total cost of ownership, continuous updates and scalable configurations that don’t require armies of consultants,” Duggan said.
“More importantly, they allow institutions to retire the fragmented point solutions that crept in over time — each one solving a narrow problem while contributing to overall data sprawl and workflow inconsistency.”
According to Duggan, St. Elizabeth University in New Jersey is one university that has experienced firsthand the benefits of cloud-native platforms.
“By modernizing their core infrastructure, St. Elizabeth not only reduced the operational drain caused by outdated systems, they also became more responsive to student needs,” he said.
“With fewer platforms to manage, IT teams could focus on delivering services rather than maintaining software. Administrative processes became more streamlined. Students experienced fewer friction points. The institution, as a whole, became more agile and capable of adapting to change.”
While Purdue University has numerous departments and groups relying on cloud-native platforms, the Civil Engineering JTRP (Joint Transportation Research Program) group is one of the campus’s biggest adopters of cloud-based technology, according to Rao. The group regularly analyzes hyperlocalized data about weather, construction, traffic, and law enforcement to make decisions such as whether and when to deploy salt trucks in the winter.
“While the costs may be cheaper on-prem, their need to process and analyze data from many different sources, as well as share the data back to those entities, makes cloud one of the only viable options,” Rao said.
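To make the kind of multi-source analysis Rao describes more concrete, the short Python sketch below combines hypothetical weather, traffic and construction readings into a simple salt-truck decision rule. The feed names, fields and thresholds here are illustrative assumptions, not JTRP’s actual data sources or models.

```python
# Hypothetical sketch of a salt-truck decision rule built from several data
# feeds, in the spirit of the workflow described above. All field names and
# thresholds are made up for illustration.
from dataclasses import dataclass


@dataclass
class RoadSegmentReading:
    segment_id: str
    forecast_low_f: float      # overnight low (degrees F) from a weather feed
    precip_probability: float  # 0.0-1.0 chance of precipitation
    traffic_volume: int        # vehicles per hour from a traffic feed
    active_construction: bool  # flag from a construction feed


def needs_salting(reading: RoadSegmentReading,
                  freeze_point_f: float = 32.0,
                  precip_threshold: float = 0.4,
                  traffic_threshold: int = 500) -> bool:
    """Flag a segment for pre-treatment when freezing temperatures, likely
    precipitation, and meaningful traffic (or construction) coincide."""
    likely_ice = (reading.forecast_low_f <= freeze_point_f
                  and reading.precip_probability >= precip_threshold)
    worth_treating = (reading.traffic_volume >= traffic_threshold
                      or reading.active_construction)
    return likely_ice and worth_treating


# Example usage with made-up readings:
readings = [
    RoadSegmentReading("US-231-N", 28.0, 0.6, 900, False),
    RoadSegmentReading("CR-500-W", 35.0, 0.7, 120, False),
]
to_treat = [r.segment_id for r in readings if needs_salting(r)]
print("Deploy salt trucks to:", to_treat)  # -> ['US-231-N']
```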
As higher education institutions move toward cloud-native infrastructure, IT departments are using multiple strategies to provide the AI-ready environment their communities expect. In particular, schools like Dartmouth College are modernizing with Kubernetes-based container platforms.
Over the past decade, Dartmouth College added more than 400 containers spread across four Kubernetes clusters and is currently migrating all of them to the Nutanix Kubernetes Platform (NKP) solution, according to Ty Peavey, director of infrastructure services for Dartmouth.
“We’ll always have a fair amount of full virtual machines, but off-the-shelf software is starting to normalize the deployment of applications in containers. As that number starts to grow, we’ll see our Kubernetes stack grow even more,” Peavey told The Forecast in an interview at Nutanix’s 2025 .NEXT event in Washington, DC.
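Counting what is actually running in each cluster is a typical first step before a migration like the one Peavey describes. The short Python sketch below shows one way to take such an inventory across every cluster listed in a local kubeconfig. It is a hypothetical illustration built on the open-source Kubernetes Python client, not Dartmouth’s or Nutanix’s actual migration tooling.

```python
# A minimal sketch, assuming the official `kubernetes` Python client is
# installed and each cluster is reachable as a context in the local
# kubeconfig. Illustrative only.
from kubernetes import client, config


def count_workloads(context_name: str) -> tuple[int, int]:
    """Return (Deployment count, container-spec count) for one cluster."""
    api_client = config.new_client_from_config(context=context_name)
    apps = client.AppsV1Api(api_client)
    deployments = apps.list_deployment_for_all_namespaces().items
    containers = sum(len(d.spec.template.spec.containers) for d in deployments)
    return len(deployments), containers


if __name__ == "__main__":
    contexts, _active = config.list_kube_config_contexts()
    for ctx in contexts:
        name = ctx["name"]
        deployments, containers = count_workloads(name)
        print(f"{name}: {deployments} Deployments, {containers} container specs")
```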
A hybrid cloud strategy also provides the performance and flexibility universities need to create an AI-ready environment. Because institutions pay only for the cloud resources they use, they gain flexibility for new use cases while making smart budget decisions. According to the 2024 CDW Cloud Research Report, 88% of higher education institutions have deployed at least 25% of their applications into the cloud. The majority — 79% — reported that the cloud has met or surpassed expectations.
In an article for Education Technology Insights, Russell M. Kaurloto, vice president and CIO at Clemson University, recommended engaging the technical team — including infrastructure, applications, security, networking, customer service and project management — early in the process for brainstorming and whiteboarding. He suggested asking about technical roadblocks and then documenting the responses to discover opportunities for business process improvements. He also underscored the importance of conducting a strengths-and-weaknesses analysis before moving to the cloud for modernization.
“At a high level, discern precisely what your organization is good at,” Kaurloto said.
“What are you best positioned to provide that virtually no one else can? With these differentiators in mind, you can focus your team and talents on the things that will add value to the university function and reveal which opportunities are right to take a fresh look at potentially moving to a cloud platform.”
According to Peavey, modernization and AI readiness aren’t just about technology and infrastructure. They’re also about evolving the IT department’s role in the university’s future. He cited infrastructure management as an example.
Peavey said that historically, infrastructure management roles were highly siloed, with team members dedicated to specific environments like Linux and Windows. By using modernized technology for infrastructure, team members can now be generalists instead of specialists, which makes the entire team more integrated, flexible, and adaptable.
Indeed, the simplicity offered by Nutanix’s hyperconverged infrastructure has allowed Peavey’s department to completely rethink how IT professionals do their jobs. His department now focuses on building “T-shaped” team members who have broad expertise across a wide range of IT technologies plus deeper knowledge in specific areas of interest. The team no longer relies on dedicated systems administrators for each environment, a shift Peavey said makes him very proud. A developer on his team may work on Kubernetes in the morning and on VMs in the afternoon, all using Nutanix Cloud Platform software.
“We’ve found that it brings really good job satisfaction,” Peavey said.
“People like what they’re doing. It is challenging at times, but a good challenge. We’re right in that sweet spot, where we have time to learn and understand how AI is changing the environment, but yet we have a little bit of time to play and have fun with it and learn how to really incorporate it into our work. It’s a way to do things easier, faster, better.”
Jennifer Goforth Gregory is a contributing writer. Find her on X @byJenGregory.