The next generation of computing is on the horizon, and it is super. No, literally: this field of computer science and engineering is called supercomputing, and several new machines may just smash all the records... with two nations neck and neck in a race to see who will get there first.

Supercomputers are pretty different from something like your laptop. They can take up whole BUILDINGS, and are used to solve some of the most complicated problems in the world. Just by looking at them, they may not seem that different from a machine like the ENIAC, the first ever programmable digital computer. The ENIAC was capable of about 400 FLOPS. FLOPS stands for floating-point operations per second, which tells us how many calculations the computer can do each second, so measuring FLOPS is a way of quantifying computing power.

The ENIAC was sitting at 400 FLOPS in 1945, and in the ten years it was operational, it may have performed more calculations than all of humanity had up until that point in time. That was the kind of leap digital computing gave us. From that 400 FLOPS we upgraded to 10,000 FLOPS, then a million, a billion, a trillion, a quadrillion FLOPS. That's petascale computing, and that's the level of today's most powerful supercomputers.

But what's coming next is exascale computing. That's, let's see... 18 zeroes: 1 quintillion operations per second. Exascale computers will perform a thousand times better than the petascale machines we have now. Or, to put it another way: if you wanted to do by hand the same number of calculations that an exascale computer can do in ONE second... you'd be doing math for over 31 billion years.

So... what the heck do we need that kind of computing power for? Large-scale phenomena like climate change have so many moving parts, each affected by minute changes in all the other variables, and the effects of these changes need to be projected forward in time.
That's a really complex situation to simulate. On the other end of the spectrum, molecular interactions between cells and drug compounds are also extremely complex, just on the nanoscale, and computer models of these interactions allow us to see the actual mechanisms of how diseases make us sick and how different medicines could interrupt those interactions. Exascale computing will provide us with more power, speed, specificity, and accuracy than we've ever had before. It'll be like looking at the world through a new pair of prescription glasses, bringing into sharper focus everything from chemistry to genetics, aircraft design to nuclear physics, even energy grid planning.

But increased performance comes with increased cost. Exascale systems have price tags in the hundreds of millions of dollars, and they require huge amounts of electricity to run. And just like with humans, running makes computers hot, so computing facilities consume even more energy (and cold water) to cool the computers down and keep them at optimum performance.

Computers that are unrivaled in their power are also unrivaled in their complexity. Exascale machines will, for lack of a better word, 'think' differently than their predecessors, so we're going to need to connect their processors in a different way. Not only that, but exascale processors have to connect to memory and storage in a different way too, and both of these will have to hold unprecedented amounts of information. From the software side, you essentially have to 'talk' to these computers in a different way than you do to petascale machines, so if you want to take codes that were designed to run on petascale computers and now run them on an exascale machine... you gotta do some major code overhaul.

Which all means... the dawn of exascale requires huge innovations in everything from the physical architecture of the hardware, to software programming, to engineering the buildings these computers will live in.
So, when can we expect to see these mega machines? Well, the first exascale machine in the U.S. was slated to arrive at Argonne National Lab sometime in 2021, but has been delayed. That supercomputer is called Aurora, and its team plans to use Intel GPU computer chips, the slow development of which seems to be holding things up. So, the machine that was supposed to come online second has now moved into first place: the Frontier supercomputer, which may come online this year at Oak Ridge National Lab and will clock in at 1.5 exaflops. And in 2023, Frontier will be followed by El Capitan at Lawrence Livermore National Lab, a machine capable of 2 whole exaflops. That's a heck of a lot of power.

But it remains to be seen whether the U.S. will actually get to exascale computing first, because China is also bringing three new exascale machines into the spotlight... and may very well get there before anyone else. Even though the U.S. and China are leading the pack, many other countries, from Japan to places in Europe, also have exascale machines in the works.

Again, the machine hardware itself is really just the skeleton of exascale computing. To actually bring that maximum power to bear on some of the most complex problems scientists are trying to untangle today, there's a whole lot more going on behind the scenes. So, software engineers: now's your time to shine.

If you want more on boundary-breaking computing innovations, check out our video on 'hot' quantum computing chips here. And if you have other computational news you want us to cover, let us know in the comments below. Make sure you subscribe to Seeker for all your coverage of bits and bytes, and as always, thanks for watching. I'll see ya in the next one.
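The scaling figures quoted in the transcript (ENIAC's roughly 400 FLOPS, exascale's 10^18 operations per second, the thousand-fold jump over petascale, and the "31 billion years" comparison) can be sanity-checked with a few lines of arithmetic. This is an illustrative sketch, assuming a human doing one calculation per second:

```python
EXAFLOP = 10**18        # 1 quintillion operations per second (18 zeroes)
PETAFLOP = 10**15       # 1 quadrillion, the scale of today's top machines
ENIAC_FLOPS = 400       # ENIAC's approximate rate in 1945
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25

# A human doing one calculation per second would need 10**18 seconds
# to match ONE second of exascale work.
years = EXAFLOP / SECONDS_PER_YEAR
print(f"{years / 1e9:.1f} billion years")     # 31.7 billion years

# Exascale is a thousand times petascale...
print(f"{EXAFLOP // PETAFLOP}x petascale")    # 1000x petascale

# ...and an enormous leap from ENIAC.
print(f"{EXAFLOP / ENIAC_FLOPS:.1e}x ENIAC")  # 2.5e+15x ENIAC
```

The exact figure depends on how you count a year (365.25 days here), but it lands at about 31.7 billion years either way, matching the "over 31 billion years" claim.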
The World’s Most Powerful Supercomputer Is Almost Here (posted 2021/05/03)