How Next-Gen Computing Is Enabling Next-Gen Physics

The Large Hadron Collider – the world's largest and most powerful particle accelerator – uses cloud computing to make groundbreaking new discoveries about the universe.

By Jennifer Roland Cadiente

November 17, 2022

Just outside of Geneva, Switzerland, mere steps from the Swiss border with France, is the European Organization for Nuclear Research, otherwise known as CERN. There, in a massive tunnel 100 meters belowground, rests a most magnificent machine: the Large Hadron Collider (LHC), data from which is revolutionizing the field of physics.

Comprising more than 1,200 car-length magnets that stretch for 27 kilometers alongside nearly 400 smaller magnets, the ring-shaped LHC is the largest particle accelerator in the world. Commissioned in 2008, its job sounds simple enough yet is incredibly complex — to catapult and supercharge trillions of protons until they’re traveling at nearly the speed of light, then collide them into each other so that scientists can observe what happens and, in so doing, uncover new laws of physics.

For nearly 15 years, scientists at CERN have used the LHC to make groundbreaking physics discoveries. Even the most powerful machines on Earth, however, eventually need to rest. The machine, therefore, runs in fits and starts. Its first run began in 2009 and ended in 2013, after which it experienced a two-year shutdown for upgrades and maintenance. Its second run began in 2015 and ended in 2018. Three years later, it’s finally back to work for a third run with the help of cloud computing.

Making a ‘Big Bang’ with Cloud

Theoretical physicists who study the behavior of subatomic particles use the LHC to test their hypotheses. In 2012, they struck physics gold when they discovered the Higgs boson. Popularly nicknamed the “God particle,” it is the particle associated with the Higgs field, which is believed to give other fundamental particles their mass. Studying it and other particles could help scientists develop a greater understanding of how stars, galaxies and planets form in the universe.

“The Higgs boson can't be ‘discovered’ by finding it somewhere, but has to be created in a particle collision,” CERN explains on its website. 

“Once created, it transforms — or ‘decays’ — into other particles that can be detected in particle detectors.” 

Physicists look for traces of these particles in data collected by the detectors. 

“The challenge is that these particles are also produced in many other processes, plus the Higgs boson only appears in about one in a billion LHC collisions,” explained CERN.

Although the Higgs boson is a needle in a haystack, CERN scientists found it with the help of advanced computing.

“The experiments [on the LHC] generate around a petabyte of data every second, but we don’t have the capacity to analyze all of that, and most of it is noise anyway. Complex filtering systems slow down the capture rate to tens of gigabytes per second. Even then, we keep adding a huge amount of data every year,” Ricardo Rocha, a computing engineer at CERN, told Google Cloud, which has been working with CERN since 2019 to explore the use of public cloud for LHC experiments.
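The filtering Rocha describes is a staged reduction: successive selection stages each discard most events, so only a small fraction of the raw collision data is ever stored. A minimal sketch of that idea follows; the thresholds, event fields and selection functions are invented for illustration and are not CERN's actual trigger logic.

```python
# Illustrative multi-stage event filter: each stage discards most events,
# so only a small fraction of raw collision data survives to storage.
# All thresholds and event fields here are invented for the example.
import random

def stage1_hardware(event):
    # Fast, coarse cut: keep only high-energy events (illustrative threshold)
    return event["energy_gev"] > 100

def stage2_software(event):
    # Slower, finer cut: also require an interesting signature
    return event["n_tracks"] >= 4

def filter_events(events):
    """Run events through both stages and return the survivors."""
    return [e for e in events if stage1_hardware(e) and stage2_software(e)]

random.seed(0)
events = [
    {"energy_gev": random.uniform(0, 200), "n_tracks": random.randint(0, 10)}
    for _ in range(10_000)
]
kept = filter_events(events)
print(f"kept {len(kept)} of {len(events)} events "
      f"({100 * len(kept) / len(events):.1f}%)")
```

In the real system the first stage runs in custom hardware and the later stages in software farms, but the shape of the computation — cheap cuts first, expensive cuts on what remains — is the same.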

“The resource-intensive tasks we’re exploring, such as machine learning, often require specialized hardware … [Cloud computing] opens up new options for our IT infrastructure to access GPUs and other types of accelerator hardware.”

That increased computing power could help physicists discover even more new particles, according to CERN theoretical physicist Sophie Renner. 

“If these exist, they can answer some of the questions that are so far puzzling us,” she said earlier this year in a video interview.

“For example, a big one is: What is the dark matter that seems to be holding galaxies together? If this is a type of particle, then it could be closely related to some long-lived particles that we might be able to see at the LHC.”

Data Deluge

The LHC, which officially began its third run in July 2022, will search for “long-lived particles” using detectors that have been upgraded to make more precise measurements.

But scientists still need to be able to ingest and manage those calculations, according to Mélissa Gaillard, communications officer for CERN’s IT department, who said the team at CERN used the LHC’s three-year hiatus to “get ready to store and analyze more than 600 petabytes of data … equivalent to over 20,000 years of 24/7 HD video recording.” That’s more data than the LHC produced in its two previous runs combined.
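Gaillard's comparison is easy to sanity-check. Assuming an HD video stream of roughly 1 MB per second (about 8 Mbit/s, a typical HD bitrate), a back-of-the-envelope calculation lands in the same ballpark:

```python
# Back-of-the-envelope check: how many years of continuous HD video
# would 600 petabytes hold? The bitrate is an assumed typical HD figure.
data_bytes = 600e15            # 600 PB (decimal petabytes)
hd_bytes_per_second = 1e6      # ~1 MB/s, i.e. ~8 Mbit/s HD stream (assumption)

seconds = data_bytes / hd_bytes_per_second
years = seconds / (3600 * 24 * 365.25)
print(f"{years:,.0f} years of 24/7 HD video")  # roughly 19,000 years
```

A slightly lower assumed bitrate pushes the figure past 20,000 years, so the quoted comparison holds up.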

CERN uses the cloud to turn this massive data store into meaningful insights: After it collects data, Gaillard explained, it sends it to the CERN Data Center for initial reconstruction and backup. Next, data goes to the Worldwide LHC Computing Grid, a cloud-based network encompassing hundreds of thousands of computers in 170 data centers in 42 countries. From there, more than 12,000 physicists around the world can access the data in real time to complete experiments and models.
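The flow Gaillard describes — detector to CERN Data Center for reconstruction and backup, then out to the grid for analysis — can be sketched as a simple tiered pipeline. The stage names mirror the article; the record format and site names are invented for illustration.

```python
# Illustrative sketch of the tiered LHC data flow described above.
# The stages mirror the article; the record format is invented.
from dataclasses import dataclass, field

@dataclass
class EventRecord:
    event_id: int
    raw: bytes
    reconstructed: bool = False
    backed_up: bool = False
    grid_sites: list = field(default_factory=list)

def cern_data_center(record: EventRecord) -> EventRecord:
    """Initial reconstruction and backup at CERN."""
    record.reconstructed = True
    record.backed_up = True
    return record

def distribute_to_grid(record: EventRecord, sites: list) -> EventRecord:
    """Replicate to Worldwide LHC Computing Grid sites for analysis."""
    record.grid_sites.extend(sites)
    return record

record = EventRecord(event_id=1, raw=b"\x00\x01")
record = cern_data_center(record)
record = distribute_to_grid(record, ["site-eu-1", "site-us-3"])
print(record.reconstructed, record.backed_up, record.grid_sites)
```

The design point is that analysis happens on replicas spread across many sites, not on the single archival copy at CERN, which is what lets thousands of physicists work on the data concurrently.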

While LHC data is physically stored and backed up, scientists would be unable to use it without the power of a virtualized network, according to Gaillard, who said the Worldwide LHC Computing Grid currently performs more than 2 million computing tasks per hour. And she expects that number to continue growing.

Although some scientists are denied access to LHC data for political reasons — in June, for example, CERN announced plans to end data sharing with Russia and Belarus because of the war in Ukraine — decentralized access ensures that participating scientists around the world can utilize LHC data regardless of their local computing resources.

What’s Next?

Combined with advanced cloud technology, CERN’s commitment to open scientific collaboration ensures that the best is yet to come. In fact, CERN is already working on the LHC’s successor: the High Luminosity Large Hadron Collider (HL-LHC), which is expected to come online in 2029 and produce at least 15 million Higgs bosons per year, compared to around 3 million from the LHC in 2017.

“We are investigating ways to rapidly boost capacity when required, for a variety of different workloads,” Rocha told Google Cloud.

“We’ve discovered the Higgs boson, but more work is needed to understand it. As we explore new frontiers of high-energy physics, public cloud has the potential to boost CERN’s resources and therefore play a role in helping us ensure that we have the right tools to keep learning about the nature of the universe.”


Jennifer Roland Cadiente is a full-time freelance writer focused on technology and financial institutions and host of the Grow Your Side Hustle podcast.

© 2022 Nutanix, Inc. All rights reserved. For additional legal information, please go here.
