Obama Wants To Build The Most Powerful Supercomputer In The World By 2025


On July 29, the White House made a pretty big announcement for the computing world: by 2025, the U.S. hopes to have on its hands a supercomputer so powerful that it can perform one quintillion, or 10^18, operations each second. For those baffled by that unfathomably large number, it’s a billion billion.

The plan comes via an Executive Order, issued by President Obama, creating an ambitious scheme called the National Strategic Computing Initiative (NSCI). The ultimate aim for the next decade is to maintain the U.S.’s current position as a world leader in the development of High-Performance Computing (HPC) systems.

Marrying incredibly powerful computers with vast amounts of storage, supercomputers have been pivotal in a variety of research fields, helping scientists to model an array of physical systems and natural phenomena. For example, researchers can simulate what happens when galaxies collide, or when molecules interact with one another. They can even help us to predict short- and longer-term trends involving different scenarios, such as the potential impacts of climate change, or what could happen during a disease outbreak.

Although the performance of these computers can be assessed in a variety of ways, a common measure is “flops,” the number of floating-point calculations that can be performed each second. As it stands, the speediest supercomputer is China’s Tianhe-2, which runs at 33.86 petaflops, or tens of quadrillions of calculations per second, BBC News reports. But Obama wants to reach the next level, stepping into the exascale realm. Exascale computers are those capable of at least one exaflop, which is about a thousand times the speed of today’s petascale systems. So that’s a pretty big leap.
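To put those numbers side by side, here is a quick back-of-the-envelope sketch (using only the figures quoted above: Tianhe-2’s 33.86 petaflops and the 1-exaflop target) of how much faster an exascale machine would be:

```python
# Back-of-the-envelope comparison of petascale vs. exascale performance.
# Figures are those quoted in the article: Tianhe-2 at 33.86 petaflops,
# and the 2025 target of 1 exaflop (10**18 operations per second).

PETAFLOP = 10**15  # operations per second
EXAFLOP = 10**18   # operations per second

tianhe2_flops = 33.86 * PETAFLOP  # China's Tianhe-2, today's fastest
target_flops = 1 * EXAFLOP        # the NSCI's exascale goal

# An exaflop is exactly 1,000 petaflops...
print(EXAFLOP // PETAFLOP)  # 1000

# ...so the target machine would be roughly 30x faster than Tianhe-2.
speedup = target_flops / tianhe2_flops
print(round(speedup, 1))  # 29.5
```

In other words, "a thousand times faster than petascale" translates to roughly a thirty-fold jump over the current record holder.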

But the potential of such systems could be huge. With the capacity to rapidly analyze such vast amounts of data, scientists could make significant progress in fields such as personalized medicine. By processing huge medical databases, scientists could better predict how certain drugs may interact with different populations of people or individuals with certain genetic variations, identifying those that may be particularly susceptible or unsuitable for various therapies. Furthermore, the storage and processing capacities could allow scientists to combine real data with simulations, which could allow for better predictions of weather or climate change, for example.  

Although there seems to have been a focus on speed, the President’s Council of Advisors on Science and Technology has pointed out that high-performance computing isn’t just about flops, but also about capability. In order for the field to progress, researchers need to create systems that can keep up with huge amounts of rapidly growing and changing data. There is clearly a lot of work to be done, but the ambitious goal does not sound unachievable.

[Via BBC News and Wired]
