Where applications and data run is critical for many reasons – optimal performance and regulatory requirements, for example – but increasingly businesses are using innovative tools to move and manage them across their own data centers, remote locations or multiple cloud services. They are eager to learn how to run applications and manage data across private and public cloud and edge computing locations, especially as AI moves to the edge. IT teams must know when to use a central data center, a remote location or a public cloud, and when to switch between them. In many cases, it means interconnecting hub to edge to cloud, which requires visibility and simple control across all of those different IT infrastructures.
Manufacturing, retail and other industries that are adopting a hybrid multicloud approach are learning how to combine centralized, distributed and edge computing.
HPE describes it as edge to cloud, where enterprise data isn’t confined to the data centers anymore. Data is also increasingly generated at the edge, processed and stored in the cloud, and used by an increasingly distributed global workforce.
Multicloud IT systems are now the norm in many enterprises. Over the past decade or so, these enterprises have modernized legacy IT infrastructures by moving to software-defined and private, public and hosted cloud services or deploying a hybrid IT architecture that enables agility, mobility of applications and data, availability, and rapid scaling on demand.
Along with cloud technology, there have been significant advances in the areas of AI, ML, IoT and data analytics. The way these technologies work necessitates powerful and varying amounts of compute, memory and storage at the node level, which often reside in geographically remote locations that aren’t well connected to the central data center or cloud or supported by onsite IT staff.
This is where edge computing comes in. For workloads that require data to be processed – in many cases, in real time – or stored at the same place where it’s generated, the edge holds an edge over the cloud (or gives the cloud an edge).
Red Hat explains cloud computing as the act of running workloads within clouds, while edge computing is the act of running workloads on edge devices.
To understand which one is better suited to particular needs, it helps to explore key differences and synergies between cloud and edge computing.
There were 11 billion connected IoT devices worldwide at the end of 2021, according to a Statista study. This number is expected to reach 19 billion by 2025. Managing and processing the deluge of data these devices generate can be a nightmare if it has to be transmitted to the cloud first, given that much of the data stored in the cloud loses significance over time.
“Edge computing can apply to anything that involves placing service provisioning, data and intelligence closer to users and devices,” said Gordon Haff, Technology Evangelist at Red Hat.
Basically, edge computing becomes a necessity when:
Some real world scenarios that edge computing is certainly better suited to than cloud computing are:
Global airwaves and cables are already stressed with the humongous amounts of data generated and streamed across the internet. In situations where data can’t be moved closer to the data center, the edge moves the data center itself.
“Things that require real-time performance are going to tend to be done at the edge,” said Adam Drobot, Board Chairman at OpenTechWorks, Inc.
“Edge computing will take its place in a spectrum of computing and communications technologies as another place where system architects can place computing workloads.”
The benefits of edge computing over the cloud and data centers include:
The edge by no means replaces the public or private cloud. Rather, it is a more efficient alternative for workloads and business cases that stretch the cloud’s abilities beyond its inherent strengths.
“A centralized utility for IT was never a realistic expectation,” said Haff.
“Even public clouds themselves have evolved to offer provider-specific differentiated services, rather than competing as commoditized utilities. But more generally, edge computing is a recognition that enterprise computing is heterogeneous and doesn’t lend itself to limited and simplistic patterns.”
Both are good depending on the use case, workload and business need, said Greg White, senior director of product solutions marketing at Nutanix.
“Being able to do both without creating separate management, cost structure, and employee knowledge silos is important and valuable,” White said.
He said organizations should strive for an IT infrastructure that is flexible enough to easily leverage the data center, cloud and edge.
“That way they can adapt without ripping out the plumbing every time there’s a new need,” said White.
Many enterprises, as well as public cloud providers, are looking into ways to deploy both edge and cloud selectively.
“I think it’s going to be very rare that an application will live only in edge computing,” said Dalia Adib, Director, Edge Computing Consulting at STL Partners.
“It’s going to need to communicate and interact with other workloads that are in the cloud or in an enterprise data center or on another device.”
Operations that can be executed faster and better at the end device can be assigned to the edge while applications that aggregate data from multiple sources and perform large-scale operations can stay in the cloud. The right combination of edge and cloud systems will help companies match their IT infrastructure to their business model, organizational structure, workflow and hierarchy.
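That division of labor can be sketched as a simple placement heuristic. The following Python sketch is purely illustrative – the `Workload` class, thresholds and function names are invented for this example, not taken from any real product:

```python
# Illustrative sketch: routing workloads between edge and cloud.
# All names and thresholds here are hypothetical, chosen for the example.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: int      # how quickly results are needed
    data_size_mb: float      # how much raw data the task consumes

EDGE_LATENCY_BUDGET_MS = 50   # assumed threshold for "real-time" work
CLOUD_UPLOAD_LIMIT_MB = 500   # assumed bandwidth/cost ceiling per task

def place(workload: Workload) -> str:
    """Return 'edge' or 'cloud' based on simple placement heuristics."""
    if workload.max_latency_ms <= EDGE_LATENCY_BUDGET_MS:
        return "edge"    # real-time: process where the data is generated
    if workload.data_size_mb > CLOUD_UPLOAD_LIMIT_MB:
        return "edge"    # too costly to ship the raw data upstream
    return "cloud"       # aggregation and large-scale work stays central

print(place(Workload("defect-detection", max_latency_ms=20, data_size_mb=2)))        # edge
print(place(Workload("quarterly-analytics", max_latency_ms=60000, data_size_mb=100)))  # cloud
```

Real placement decisions weigh many more factors – data sovereignty, connectivity, onsite hardware – but the same pattern applies: latency-sensitive, data-heavy work goes to the edge, aggregate analysis to the cloud.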
To meet AI scalability and performance requirements, organizations will need to adopt the distributed approach to architecture that edge computing provides, according to Steve McDowell, chief analyst and founder at NAND Research.
“The reason we push AI to the edge is because that's where the data is," McDowell said, summing up the trend in an interview with The Forecast.
McDowell published Taming the AI-enabled Edge with HCI-based Cloud Architectures, a 2024 report commissioned by hybrid multicloud software company, Nutanix. The report explores the impact of extending IT resources to the edge and the driving force of AI, particularly in areas like image recognition for retail, manufacturing and other industries.
"We've always defined edge as any resources that live outside the confines of your data center," he said. "And there's some definitions that say the extension of data center resources to a location where there are no data center personnel. It's remote."
Once IT teams start putting AI into edge computing environments, they need the ability to process that AI, often with the use of GPUs or other kinds of AI accelerators, he explained. Older embedded systems were fairly locked down and didn't need updating as often as today's systems that power what he calls "living workflow" applications like AI.
"If I'm doing image processing for manufacturing, for example, for quality assurance, I want to update those models continuously to make sure I've got the latest and the greatest," McDowell said.
Managing edge infrastructure as a connected part of the overall IT infrastructure allows teams to manage patches and to find and fix vulnerabilities. Software-defined IT operations built on hyperconverged infrastructure make the edge look like part of the data center.
"It gives me kind of a consistent control plane across my infrastructure," he said. "The power of Nutanix [software] is it allows me to extend outside of my traditional infrastructure into the edge without changing my management models."
Dipti Parmar is a marketing consultant and contributing writer to Nutanix. She’s a columnist for major tech and business publications such as IDG’s CIO.com, Adobe’s CMO.com, Entrepreneur Mag, and Inc. Follow Dipti on Twitter @dipTparmar or connect with her on LinkedIn for little specks of gold-dust-insights.
© 2022 Nutanix, Inc. All rights reserved. For additional legal information, please go here.