When It's Time to Decommission Legacy IT Systems

IT observers explain what it takes to weed out aging technologies.

By Bill Laberis | May 3, 2019

Few people rely on a PC they bought back in the 1980s for their everyday work. Yet many enterprise-class organizations still operate large-scale systems developed decades ago, despite the clear advantages of upgrading to contemporary technology.

Decommissioning legacy IT systems helps liberate businesses to pursue innovation and move forward with digital transformation. But unplugging can be a cumbersome task that carries some governance and compliance concerns, especially in the highly regulated healthcare and financial sectors, according to Deloitte Consulting in its 2017 report, “Legacy Systems and Modernization.”

The firm found that legacy systems, including older mainframe- and minicomputer-based infrastructures, are rife with problems: dependence on outdated languages, aging databases and a dwindling supply of programmers who can maintain them. These issues frequently render the systems incompatible with current technologies such as cloud, analytics and mobility, according to the report.

Keep the Data, Ditch the System

“The trouble is that you can’t move forward with digital transformation if you’re putting a lot of energy into old systems,” Jim Hall, founder and CEO of IT Mentor Group, a St. Paul, Minn.-based IT consultancy, said in an interview.

Hall has seen technology still in use that includes a 1980s-vintage Digital Equipment Corp. VAX minicomputer, MS-DOS PCs, and Windows XP servers – all technologies no longer supported by the manufacturers.

A good decommissioning program will both retain the data and retire the old system, according to Jon Russell, CIO at John Muir Health System, and Tony Paparella, president of MediQuant, Inc., in a presentation they gave at the 2017 Healthcare Information and Management Systems Society (HIMSS) annual conference.

Russell and Paparella estimated that savings can equal up to 80% of the cost of maintaining the legacy systems.

Who Owns the System?

Despite decommissioning’s advantages, IT leaders often struggle to get the job done, according to Hall. He oversaw numerous technology phaseouts in the five years he spent on campus as CIO at the University of Minnesota.

“One problem is that the CIO is often not the owner of the legacy system – that’s a department head somewhere else in the enterprise,” Hall said. “That person is often concerned about the cost of moving the system and the effort required to migrate data to a new platform.”

Hall indicated that it’s easy to fall behind to the point where it’s nearly impossible to bring older systems up to spec.

“These systems became out of date because they were simply ignored,” he said. “The apps ‘ran fine’ and [those] with pinched budgets didn’t want to invest precious resources to replace a working system.”

Hall said there’s a strong temptation to offload the hosting of those legacy applications and workloads to the cloud.

“But these legacy systems often require particular operating environments that may not be available in PaaS [platform-as-a-service] offerings.”

The Costs of Not Modernizing

CIOs clearly see the dangers of not replacing aging systems. In a recent survey of more than 900 CIOs conducted by Vanson Bourne for Rimini Street, 76% of respondents said that legacy systems hinder innovation. Similarly, 77% said they spend too much just “keeping the lights on” with old technologies, leaving fewer resources for innovation and competing more effectively.

In heavily regulated industries, decommissioning efforts often run smack into legacy data concerns, according to Mary Beth Haugen, CEO and president of Haugen Consulting Group, and Jeff Pelot, the CIO of Denver Health, in “Power Down,” a best-practices article on decommissioning they coauthored for the American Health Information Management Association (AHIMA) Journal.

They contend that data retention raises complex issues, like what data to keep or destroy and how to archive legacy data effectively while maintaining compliance.

“It’s easy to assume a simplistic approach: shut legacy systems down, do a final data dump in a data warehouse and be done with it,” the authors wrote. “Sounds easy, but it doesn’t work that way.”

Haugen and Pelot suggest establishing an enterprise information governance program – a data lifecycle management approach – to guide retention activities and ensure that decommissioning efforts stay in sync with data compliance mandates and existing vendor contracts.

What’s Driving Decommissioning?

The Deloitte study and a Forrester Consulting study commissioned by New Relic highlighted the key drivers encouraging businesses to unplug aging systems:

  • The need for systems better aligned with product strategies and objectives; most notably, bringing products and services to market more rapidly with the flexibility to swiftly modify service offerings as market conditions change.
  • The ability to gain greater technology relevance, with systems that are cloud-ready and can support modern languages, architectures, analytics tools and overall digital transformation efforts.
  • Being able to dovetail more effectively with emerging mobile environments, which are increasingly the primary customer touch points.

Rajiv Mirani, CTO of cloud platforms at Nutanix, pointed out other challenges of holding on to legacy infrastructure for too long.

“If your IT staff is spending 90% of its time just making sure the infrastructure is updated and the data security patches are applied – things that add little value to the business but must be done – they’re never going to get the time to learn new skills,” he said. “People can be hesitant to change. They've gotten used to that model of working and they don't necessarily want to learn new skills. But it's inevitable that they will have to.”

Best Practices

Hall stressed the importance of making a strong business case for decommissioning and advised against billing the effort as simply a system upgrade. Rather, he suggested pitching decommissioning as a chance to rethink how technology can be a “force multiplier” that can change business processes to boost competitiveness, drive efficiency and increase speed to market.

Haugen and Pelot suggest initiating decommissioning efforts by building a multidisciplinary team with representatives from across the enterprise to develop data retention policies and procedures. From there, make information governance a top priority to define proper and seamless policies for data conversion, access, retrieval, storage and disposition, they advise.

The authors also advocate measuring for success, suggesting a dashboard and tracking tools that use data analysis to monitor all decommissioning activities and gauge progress against predetermined goals.

No Room for Nostalgia

While IT professionals might feel nostalgic for technologies whose time has come and gone, digital transformation initiatives and competitive market forces leave little room for keeping them operational. The digital era is all about change, and while embracing it can be painful, doing so is necessary.

Haugen and Pelot advise that, before changing to a new system, it is critical to have a documented plan covering which data to retain, where it will reside, how it will be used and for how long. From there, moving on can free up maintenance budgets and reduce infrastructure costs, savings that can be reinvested in new revenue-generating initiatives and improved processes.

Bill Laberis is a veteran IT writer and for 10 years was editor in chief of Computerworld.

Feature photo by Pixabay from Pexels.
