The World-Class Computing Resources Behind the DOE’s Fermilab: America’s Particle Physics and Accelerator Laboratory

TL;DR: Fermilab, a DOE-sponsored particle physics and accelerator laboratory, is raising the bar on innovative and cost-effective computing solutions that help researchers explore high-energy physics. As a repository for massive sets of scientific data, the national laboratory is at the forefront of new computing approaches, including HEPCloud, a new paradigm for provisioning computing resources.

It’s common knowledge that Tim Berners-Lee invented the World Wide Web in 1989. But if you’re not a particle physicist, you may be surprised to learn that he accomplished the feat while working at the European Organization for Nuclear Research (CERN), a prominent scientific organization that operates the largest particle physics lab on the globe.

“It was the field of high-energy physics for which the web was started to provide a way for physicists to exchange documents,” said Marc Paterno, Assistant Head for R&D and Architecture at Fermilab, a premier national laboratory for particle physics and accelerator research that serves as the American counterpart to CERN.

Marc told us the particle physics field as a whole has been testing the limits of large-scale data analysis since it first gained access to high-throughput computational resources. Furthermore, the high-energy physics community is responsible for developing some of the first software and computing tools suitable to meet the demands of the field.

Fermilab is a national particle physics and accelerator laboratory outside Batavia, Illinois.

“Of course, Google has now surpassed us in that its data is bigger than any particular set of experimental data; but even a small experiment at Fermilab produces tens of terabytes of data, and the big ones we are involved with produce hundreds of petabytes of data over the course of the experiment,” Marc said. “Then there are a few thousand physicists wanting to do analysis on that data.”

The lab is named after Nobel Prize winner Enrico Fermi, who made significant contributions to quantum theory and created the world’s first nuclear reactor. Located near Chicago, Fermilab is one of 17 U.S. Department of Energy national laboratories across the country. Though many DOE-funded labs serve multiple purposes, Marc said Fermilab works toward a single mission: “To bring the world together to solve the mysteries of matter, energy, space, and time.”

And that mission, he said, is made possible through high-powered computing. “For scientists to understand the huge amounts of raw information coming from particle physics experiments, they must process, analyze, and compare the information to simulations,” Marc said. “To accomplish these feats, Fermilab hosts high-performance computing, high-throughput (grid) computing, and storage and networking systems.”

In addition to leveraging high-performance computing systems to analyze complex datasets, Fermilab is a repository for massive sets of priceless scientific data. With plans to change the way computing resources are used to produce experimental results through HEPCloud, Fermilab is continuing to deploy innovative computing solutions to support its overarching scientific mission.

Pushing the Envelope on High-Throughput Computing

While Fermilab wasn’t built to develop computational resources, Marc told us “nothing moves forward in particle physics without computing.” That wasn’t always the case: When the lab was founded, bubble chambers were used to detect electrically charged particles.

“They were analyzed by looking at pictures of the bubble chamber, taking a ruler, and measuring curvatures of trails to figure out what the particles were doing inside of a detector,” he said. “Now, detectors are enormous, complicated contraptions that cost tens of millions to billions of dollars to make.”

Experiments at Fermilab typically involve massive datasets.

Marc said Fermilab possesses substantial computing resources and is heavily involved with CERN’s Compact Muon Solenoid (CMS), a general-purpose detector at the world’s largest and most powerful particle accelerator, the Large Hadron Collider (LHC). The CMS has an extensive physics agenda ranging from researching the Standard Model of particle physics to searching for extra dimensions and particles that possibly make up dark matter. “Fermilab provides one of the largest pools of resources for the CMS experiment and their worldwide collection,” Marc said.

Almost every experiment at Fermilab includes significant international involvement from universities and laboratories in other countries. “Fermilab’s upcoming Deep Underground Neutrino Experiment (DUNE) for neutrino science and proton decay studies, for example, will feature contributions from scientists in dozens of countries,” Marc said.

These international particle physics collaborations require Fermilab to move large amounts of data around the globe quickly and to process them with high-throughput computing. To that end, Fermilab maintains 100 Gbit/s connectivity with local, national, and international networks, empowering researchers to process these data quickly and facilitate scientific discoveries.
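To give a rough sense of what that bandwidth means in practice, here is a minimal back-of-the-envelope sketch in Python; the dataset size and the assumption of a fully utilized link are illustrative, not Fermilab figures:

    # Rough transfer-time estimate for a petabyte-scale dataset (illustrative only).
    link_gbit_per_s = 100                    # assumed 100 Gbit/s network link
    dataset_petabytes = 1                    # assumed 1 PB dataset

    bytes_total = dataset_petabytes * 1e15   # 1 PB = 10^15 bytes
    bytes_per_s = link_gbit_per_s * 1e9 / 8  # convert Gbit/s to bytes/s
    hours = bytes_total / bytes_per_s / 3600

    print(f"~{hours:.0f} hours at full line rate")  # roughly 22 hours

In practice, protocol overhead and shared links make real transfers slower, which is why sustained high-throughput networking matters as much as raw link capacity.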

A Repository for Large Sets of Valuable Scientific Data

Marc told us Fermilab also has mind-boggling storage capacity. “We’re the primary repository for all the data for all of the experiments here at the laboratory,” he said.

Fermilab’s tape libraries, fully automated and served by robotic arms, provide more than 100 petabytes of storage capacity for data from particle physics and astrophysics experiments. “This includes a copy of the entire CMS experiment dataset and a copy of the dataset for every Fermilab experiment,” Marc said.

Fermilab also houses the entire dataset of the Sloan Digital Sky Survey (SDSS), a collaborative international effort to build the most detailed 3D map of the universe in existence. The data-rich project has measured compositions and distances of more than 3 million stars and galaxies and captured multicolor images of one-third of the sky.

The lab’s data management capabilities protect precious scientific data.

“SDSS was the first time there was an astronomical survey in which all data were digitized, much bigger than any survey done before,” Marc said. “In fact, even though the data collection has stopped, people are still actively using that dataset for current analysis.”

Marc said much of the particle physics research is done in concert with the academic community and can be a lengthy process.

“For example, the DUNE experiment is a worldwide collaboration that researchers have been developing for more than 10 years,” he said. “We are starting on the facility where the detector will go. The lifetime of a big experiment these days is measured in tens of years; even a small experiment with 100 collaborators easily takes 10 years to move forward.”

HEPCloud: A New Paradigm for Provisioning Computing Resources

Particle physics has historically required extensive computing resources from sources such as local batch farms, grid sites, private clouds, commercial clouds, and supercomputing centers — plus the knowledge required to access and use the resources efficiently. Marc told us all that changes with HEPCloud, a new paradigm Fermilab is pursuing in particle physics computing. The HEPCloud facility will allow Fermilab to provision computing resources through a single managed portal efficiently and cost-effectively.

“HEPCloud is a significant initiative to both simplify how we use these systems and make the process more cost-effective,” Marc said. “Here at Fermilab, trying to provision enough resources to meet demand peaks is just too expensive, and when we’re not on peak, there’d be lots of unused resources.”

The technology will change the way physics experiments use computing resources by elastically expanding resource pools on short notice — for example, by renting temporary resources on commercial clouds. This will allow the facility to respond to peaks without over-provisioning local resources.

HEPCloud will enable scientists to put computing resources to better use.

“HEPCloud is not a cloud provider,” Marc said. “It’s an intelligent brokerage system that can take a request for a certain amount of resources with a certain amount of data; a portal to use cloud resources, the open science grid, and even supercomputing centers such as the National Energy Research Scientific Computing Center (NERSC).”
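To make the brokerage idea concrete, here is a purely illustrative sketch in Python of the kind of placement decision such a portal has to make; the capacity figure, cloud price, and function name are assumptions for the example, not HEPCloud’s actual interfaces:

    # Toy resource broker: split a request between local and rented cloud capacity.
    # Purely illustrative; HEPCloud's real logic and interfaces differ.
    LOCAL_FREE_SLOTS = 10_000           # assumed idle slots on the local batch farm
    CLOUD_COST_PER_SLOT_HOUR = 0.05     # assumed commercial-cloud price in USD

    def place_request(slots_needed: int, hours: int) -> dict:
        """Run as much as possible locally; burst the overflow to the cloud."""
        local = min(slots_needed, LOCAL_FREE_SLOTS)
        burst = slots_needed - local
        return {
            "local_slots": local,
            "cloud_slots": burst,
            "estimated_cloud_cost_usd": burst * hours * CLOUD_COST_PER_SLOT_HOUR,
        }

    # Example: a peak-demand request for 60,000 slots over 12 hours.
    print(place_request(60_000, 12))
    # -> 10,000 slots stay local, 50,000 are rented temporarily (~$30,000 estimated)

The point of the real system is exactly this kind of elasticity: local hardware stays sized for typical demand, and peaks are absorbed by temporary capacity instead of permanently over-provisioned machines.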

Marc said the DOE funds a number of supercomputing sites across the country, and Fermilab’s goal is to make better use of those resources. “It’s not feasible for us to keep on growing larger with traditional computing resources,” Marc said. “So a good deal of our applied computing research is looking at how to do the kind of analysis we need to do on those machines.”

At the end of the day, Marc recognizes the importance of letting the public know how scientists, engineers, and programmers at Fermilab are tackling today’s most challenging computational problems. “This is taxpayer money, and we ought to be able to provide evidence that what we are doing is valuable and should be supported,” he said.

Ultimately, Fermilab’s solutions will help America stay at the forefront of innovation.
