Intel, DOE Announce First-Ever Exascale Supercomputer ‘Aurora’
Intel and the Department of Energy have announced plans to deploy the first supercomputer with a sustained performance of one exaflop by 2021. That delivery date is later than earlier roadmaps projected; in fact, it means Horst Simon should win the bet he made in 2013 that supercomputers wouldn't hit exascale performance until after 2020.
“Today is an important day not only for the team of technologists and scientists who have come together to build our first exascale computer – but also for all of us who are committed to American innovation and manufacturing,” said Bob Swan, Intel CEO. “The convergence of AI and high-performance computing is an enormous opportunity to address some of the world’s biggest challenges and an important catalyst for economic opportunity.”
Aurora will deploy a future Intel Xeon CPU, Intel's Optane DC Persistent Memory (we've covered how DC PM could change the high-performance computing market before), Intel's Xe compute architecture, and Intel's One API software. In short, this is an Intel win from top to bottom. That's actually fairly surprising: in the last few years, we've seen far more companies opt for hybrid Intel-Nvidia deployments rather than going all-in with Intel. Betting on Intel Xe before any consumer or HPC hardware is even on the market implies Intel showed off some impressive expected performance figures.
Aurora is intended to push the envelope in a number of fields, including simulation-based computational science, machine learning, cosmological simulation, and other emerging fields. The system will be built in partnership with Cray, using that company’s next-generation supercomputing hardware platform, codenamed Shasta, and its high-performance scalable interconnect, codenamed Slingshot.
Exascale has more than symbolic importance. The level of compute capability in the human brain at the neural level has been estimated to be in the ballpark of exascale computing, though I can't say strongly enough that such estimates are an incredible simplification of the differences between how the human brain performs computations versus how computers do. And simply hitting exascale (that's 10^18 FLOPS) doesn't actually help us use those transistors to build a working model of a human brain. Having the theoretical computational capability of a brain doesn't actually equal a working whole-brain computer model, any more than having a huge heap of concrete, steel, and enriched uranium is equivalent to a functional nuclear reactor. It's how you put the thing together that dictates its function, and we're a long way from that.
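For a sense of where that "ballpark of exascale" figure comes from, here is a rough back-of-envelope sketch. Every number in it (neuron count, synapses per neuron, firing rate, operations per synaptic event) is an assumed, commonly cited approximation rather than a measurement, and the calculation is only an order-of-magnitude illustration:

# Back-of-envelope estimate of the brain's "compute" at the neural level.
# Every figure here is a rough approximation; the point is only that
# plausible assumptions land within a few orders of magnitude of
# 10^18 operations per second (one exaflop).

NEURONS = 8.6e10               # ~86 billion neurons
SYNAPSES_PER_NEURON = 7e3      # ~7,000 synapses per neuron, order of magnitude
EXAFLOP = 1e18                 # 10^18 floating-point operations per second

def brain_ops_per_second(firing_rate_hz, ops_per_synaptic_event):
    """Total operations/second if every synaptic event costs a fixed op count."""
    synaptic_events_per_second = NEURONS * SYNAPSES_PER_NEURON * firing_rate_hz
    return synaptic_events_per_second * ops_per_synaptic_event

# A "cheap synapse" scenario and a "detailed synapse model" scenario.
low = brain_ops_per_second(firing_rate_hz=1, ops_per_synaptic_event=1)       # ~6e14
high = brain_ops_per_second(firing_rate_hz=100, ops_per_synaptic_event=100)  # ~6e18

print(f"Low estimate:  {low:.1e} ops/s ({low / EXAFLOP:.4f} exaflops)")
print(f"High estimate: {high:.1e} ops/s ({high / EXAFLOP:.1f} exaflops)")

Depending on which assumptions you plug in, the answer swings across roughly four orders of magnitude, which is part of why such estimates say little about what an exascale machine could actually do with a brain model.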
But hitting exascale computing levels is one critical component of how we get to the point of running those simulations, and running them at scale. This is not to downplay the difficulty of simulating a human brain; open-source projects like OpenWorm are still working to simulate the nematode C. elegans, which has only about a thousand cells in its entire body. Booting up a digital human consciousness is quite some ways away. But with exascale computers, we're moving into new frontiers of complexity, and new discoveries undoubtedly await.