[Image: Frontier supercomputer server racks]

What Lies Ahead for the World's Fastest Supercomputers?

Shihaam Isaacs

At an estimated construction cost of $600,000,000, the world's first exascale supercomputer is capable of over 1,100,000,000,000,000,000 operations per second.

Pioneering scientists have started running experiments on the world's first official exascale machine, Frontier. In the meantime, leaders at similar institutions across the globe have begun building their own exascale supercomputers to join this elite group.

It's hard to truly grasp the computational capability of the world's fastest supercomputer. One of the easiest ways to picture it, as University of Tennessee computer scientist Jack Dongarra has put it, is that if every person on the planet performed one mathematical calculation per second, it would take humanity more than four years to do what the supercomputer can do in a single second.
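If you want to check that figure yourself, the back-of-envelope arithmetic is simple. The sketch below assumes a world population of roughly eight billion and uses Frontier's reported 1.1 exaflops; the population estimate and the Python framing are assumptions of mine, not figures from the article.

```python
# Back-of-envelope check of Dongarra's analogy.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

frontier_ops_per_second = 1.1e18   # 1.1 exaflops, as reported for Frontier
world_population = 8e9             # assumed: roughly eight billion people
human_ops_per_second = world_population * 1.0  # one calculation per person per second

years_needed = frontier_ops_per_second / human_ops_per_second / SECONDS_PER_YEAR
print(f"Humanity would need roughly {years_needed:.1f} years")  # ~4.4 years
```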

The cutting-edge machine goes by the name Frontier and was first unveiled to the world in May 2022. Roughly the size of half an Olympic swimming pool, it is housed at Oak Ridge National Laboratory in the hills of eastern Tennessee.

Check out some of these fun specifications: Frontier cost somewhere in the region of $600,000,000 to construct. It draws around 20,000,000 watts of power, whereas an average laptop draws only around 65 watts. And while a typical laptop processor has somewhere between 16 and 24 cores, Frontier uses close to 50,000 processors.
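For a sense of scale, here is a quick comparison based on the figures above; the 65-watt draw and the 24-core count are ballpark assumptions for an "average" laptop rather than precise measurements.

```python
# Rough comparison of Frontier against a typical laptop, using the figures quoted above.
frontier_power_watts = 20_000_000   # ~20 megawatts
laptop_power_watts = 65             # assumed draw of an average laptop

frontier_processors = 50_000        # close to 50,000 processors
laptop_cores = 24                   # high end of the 16-24 core range

print(f"Frontier draws as much power as ~{frontier_power_watts / laptop_power_watts:,.0f} laptops")
print(f"Frontier has ~{frontier_processors / laptop_cores:,.0f} times as many processors as a laptop has cores")
```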

As soon as Frontier went online, the world of computing entered an exciting new age of discovery in what has been referred to as exascale computing. This supercomputer has the power to carry out an exaflop – one quintillion (10^18) floating-point operations per second. At 1.1 exaflops, Frontier is the world's fastest supercomputer. Since its launch, scientists have been preparing to develop more of these state-of-the-art machines, and by 2024 both Europe and the US should have several exascale computers online.

Scientific researchers aren't only looking for speed. They are also building supercomputers to tackle important scientific and engineering problems that have previously been impossible to address, in fields such as astronomy, climate science, and biology, to name a few. Over the coming years, researchers will use Frontier's computing capabilities to carry out some of the most complex computer simulations humankind has ever conceived. The scientists involved will try to answer long-standing questions about the world around us and will use what they find to help develop new technologies in fields from medicine to transportation.

One such researcher, Evan Schneider of the University of Pittsburgh, is using the supercomputer to run simulations of how the Milky Way has evolved through the eons. She is particularly fascinated by how gases flow in and out of our galaxy. Much like living beings, galaxies need to breathe: gas flows into a galaxy and is pulled together by gravity to form stars, and it also flows back out – for example, when a supernova releases matter. One of the things Schneider will be looking at closely is the complex process behind this galactic exhalation. What has been observed in the real world can then be compared with the supercomputer simulations to determine whether they are getting the physics right.

Using Frontier, the computational astrophysicist and her team can now build a computer model of our galaxy with enough resolution to zoom in on individual exploding stars. That means the simulation must capture the large-scale properties of the Milky Way, which spans roughly 100,000 light-years, while also resolving supernovas that measure only around ten light-years across. Schneider pointed out that this hasn't been done before. To grasp the kind of resolution involved, imagine creating a precise physical model of a can of lager that includes the individual yeast cells inside it, as well as the interactions at every scale in between.
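The numbers behind that analogy line up surprisingly well. The sketch below compares the two scale ratios; the can height and yeast-cell size are ballpark assumptions of mine, not figures from the article.

```python
# Comparing the dynamic range of the galaxy simulation with the beer-can analogy.
galaxy_diameter_ly = 100_000   # Milky Way scale, as quoted above
supernova_size_ly = 10         # supernova scale, as quoted above

can_height_m = 0.12            # assumed: a typical beverage can, ~12 cm tall
yeast_cell_m = 10e-6           # assumed: a typical yeast cell, ~10 micrometers

print(f"Galaxy-to-supernova ratio: {galaxy_diameter_ly / supernova_size_ly:,.0f}")  # 10,000
print(f"Can-to-yeast-cell ratio:   {can_height_m / yeast_cell_m:,.0f}")             # 12,000
```

Both comparisons span roughly four orders of magnitude, which is what makes the simulation so demanding.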

At GE Research, the research and development division of General Electric, senior engineer Stephan Priebe is also taking advantage of Frontier's computing power, using it to simulate designs for the next generation of airplanes. One area he is researching is a new engine design called 'open fan architecture', intended to increase fuel efficiency. Jet engines use fans to generate thrust, and bigger fans are more efficient. To increase the size of the fans, researchers have suggested getting rid of the outer structural casing, called the nacelle, leaving the blades uncovered, much like a child's pinwheel toy. Supercomputer models give scientists a much better view of aerodynamic performance in the early design phases. The simulations also help engineers work out the most efficient fan blade shapes and, for example, how to reduce the amount of noise they produce.

Another area of Priebe's work where Frontier will help the most is the study of turbulence – the unpredictable flow of a disturbed fluid, in this case the air surrounding the fan. Turbulence happens around us all the time, from the swirl of smoke rising off a just-extinguished matchstick to the crashing of ocean waves. Yet researchers still grapple with predicting the exact flow of a turbulent fluid. The main reason is that a fluid's motion responds to both microscopic influences, such as the friction between individual nitrogen molecules as they collide, and macroscopic ones, such as changes in temperature and pressure. The way these forces interact makes the motion extraordinarily complicated. Our understanding of turbulence remains far from complete – it is one of the classic unsolved problems in physics, and it's often said that anyone who claims to truly understand turbulence isn't telling the truth.

The research being carried out by these scientists and engineers illustrates the power and potential of supercomputers, particularly their ability to simulate physical systems at many scales simultaneously. Many other applications reflect the same idea. Frontier allows for more reliable climate models, which must simulate different types of weather at various scales, at any location on Earth, over both short and long periods of time. Physicists can also use Frontier to simulate the turbulent process of nuclear fusion that happens in the sun, where atomic nuclei fuse to form heavier elements – a process researchers hope to harness as a clean energy technology. Although supercomputers have run multi-scale simulations before, Oak Ridge National Laboratory's machine can incorporate a much wider range of scales than was previously possible.

To gain access to Frontier's supercomputing abilities, approved researchers, engineers, and scientists log in remotely and submit their jobs online. To ensure everyone's time is used well, the facility plans to keep somewhere in the region of 90% of Frontier's 50,000 processors running computations at all times. Any data the researchers generate is stored securely on the premises, with capacity for up to 700 petabytes – roughly as much as nearly three-quarters of a million portable hard drives can hold.
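That hard-drive comparison is easy to verify. The sketch below assumes a typical portable drive holds about one terabyte, which is my assumption rather than a figure from the article.

```python
# Sanity check of the storage comparison.
TERABYTES_PER_PETABYTE = 1_000

facility_storage_pb = 700      # up to 700 petabytes, as quoted above
drive_capacity_tb = 1          # assumed: one portable hard drive holds ~1 TB

drives_needed = facility_storage_pb * TERABYTES_PER_PETABYTE / drive_capacity_tb
print(f"Equivalent portable drives: about {drives_needed:,.0f}")  # ~700,000
```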

Frontier is the first of many exascale supercomputers. Others are expected to be available to researchers over the coming years. Researchers in the United States are already in the process of installing El Capitan at Lawrence Livermore National Laboratory in California and Aurora at Argonne National Laboratory in Illinois. When fully operational, both supercomputers will be capable of over two exaflops.

Starting from next year, researchers will be using Aurora to hunt for compounds that could simplify specific industrial processes, such as the production of fertilizer. It will also be used to create maps of the neurons inside the human brain. El Capitan is likely to come online around the same time as Aurora and will be used for tasks such as simulating nuclear weapons, enabling the government to maintain its stockpile without carrying out explosive tests. Shortly after these two come online, Europe's first exascale supercomputer, Jupiter, is also expected to be deployed.

According to many experts in the field, China also has operational exascale supercomputers. However, the government has yet to release results from the benchmark tests used to measure their performance, which is why its machines haven't featured in the semiannual list of the fastest supercomputers known as the TOP500. One likely reason for this reluctance is concern that the US government could impose further sanctions restricting China's access to certain technological components.

Now that exascale computers are a reality rather than science fiction, there is a new thirst to develop even more powerful machines. Bronson Messer, an astrophysicist and the director of science at the Oak Ridge Leadership Computing Facility, where Frontier is housed, said the facility has already started considering the next generation of supercomputers, which are likely to be anywhere from three to five times more powerful than Frontier. The biggest obstacle standing in the way is the extraordinary energy and carbon footprint such a machine would have. Messer pointed out that Frontier already consumes enough electricity to power several thousand households – and that's just when it's idling. Building bigger and bigger machines like this, he stressed, is simply not sustainable.

With each new supercomputer the facility has developed, its engineers have looked for ways to make the machines as efficient as possible, with pioneering cooling methods being just one example. Before Frontier there was Summit, which is still operational at Oak Ridge; around one-tenth of the total energy Summit uses goes just to keeping it cool. For Frontier, that figure is around 3% to 4%, largely because engineers cool the machine with room-temperature water rather than chilled water.
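To put those percentages in context, here is a rough, hedged calculation. Applying both fractions to Frontier's roughly 20-megawatt draw is a simplification of mine (Summit actually draws less power than Frontier), but it shows the scale of the savings.

```python
# Illustrative comparison of cooling overhead at Frontier's ~20 MW power draw.
frontier_power_mw = 20.0

summit_style_cooling_fraction = 0.10   # ~one-tenth of the energy, as with Summit
frontier_cooling_fraction = 0.035      # midpoint of the 3-4% quoted for Frontier

print(f"Cooling at 10%:  {frontier_power_mw * summit_style_cooling_fraction:.1f} MW")   # 2.0 MW
print(f"Cooling at 3.5%: {frontier_power_mw * frontier_cooling_fraction:.2f} MW")       # 0.70 MW
```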

More advanced supercomputers built in the future will be able to unite even more scales within a single simulation. With Frontier, for example, Schneider's galaxy simulation has brought the resolution down to tenths of a light-year, but that still isn't fine enough to capture individual exploding stars, so researchers have to simulate individual supernovas separately. A more powerful machine will hopefully be able to bring all of these scales together.

Supercomputers will continue to push the boundaries of science by creating models that represent the complexity of technology and nature more accurately than ever. Hyperrealistic models of galaxies put the sheer enormity of the universe at scientists' fingertips. Building a precise model of the turbulent air flowing around a plane would otherwise require a hugely expensive wind tunnel. And improved climate simulations give scientists more accurate predictions of what might happen to our planet in the future. Supercomputers like Frontier give experts in many fields an invaluable new tool to prepare for an uncertain future.