The project, known as the National Strategic Computing Initiative, aims to speed up the development of an "exascale computing system" -- a supercomputer that can process a billion billion operations per second. (That's not a typo; there are two billions there.) It's a system that is hard to fathom, but one that could revolutionize the way scientists measure climate change, discover new materials, and study the human brain.
"I think somewhere along the lines of a hundred million to a billion modern-day laptops would represent the early stages of exascale computing," said Thomas Sterling, a professor at Indiana University's School of Informatics and Computing and chief scientist at its Center for Research in Extreme Scale Technologies.
But just what this computer would look like or exactly how long it will take to create is still unclear.
"The truth is that if you go back to the 1960s, the technology for a moonshot was largely known. It was largely an engineering effort, albeit one of tremendous scale," said J. Steve Binkley, the associate director of the Department of Energy's Office of Advanced Scientific Computing Research. "In the case of exascale, there are a couple of areas where we still need to do active research."
Binkley believes the initiative can hit the exascale target in roughly a decade, but that depends on funding and overcoming some technical challenges.
And the Obama administration is doubling down. "This national response will require a cohesive, strategic effort within the Federal Government and a close collaboration between the public and private sectors," according to Obama's executive order, which was signed on Wednesday.
The initiative came out of a two-year inter-agency working group focused on figuring out how high-powered computing could help national security, economic competitiveness and scientific achievement, Binkley said.
The Department of Energy, the Department of Defense, and the National Science Foundation will lead the initiative -- designing systems that could be used by even more parts of government, including the National Aeronautics and Space Administration and the Federal Bureau of Investigation.
An exascale computer would allow the government to run detailed models of some of the world's most difficult problems, simulating solutions in ways that wouldn't be possible without massive amounts of processing power. One key area for this is climate change and alternative energy sources, said Binkley. Such a system would also be valuable for dealing with massive scientific data sets, he said.
But while America is the land of tech giants, such as Google and IBM, China is currently leading the supercomputing arms race.
The Tianhe-2 supercomputer, developed by China's National University of Defense Technology, is the most powerful system in the world, with a peak performance of around 54.9 petaflops -- the level below exascale -- according to TOP500, a project that tracks the performance of supercomputers.
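The scales involved can be checked with quick arithmetic: a petaflop is 10^15 operations per second, and an exaflop is 10^18 -- the article's "billion billion." The per-laptop figure below (~1 gigaflop) is an order-of-magnitude assumption for illustration, not a number from the article:

```python
# Unit arithmetic for the performance figures mentioned in the article.
PETAFLOP = 10**15  # operations per second
EXAFLOP = 10**18   # operations per second ("a billion billion")

tianhe2_peak = 54.9 * PETAFLOP  # Tianhe-2 peak performance, per TOP500
exascale_target = 1 * EXAFLOP

# How many times faster an exascale machine would be than today's leader:
speedup = exascale_target / tianhe2_peak
print(round(speedup, 1))  # roughly 18x

# Assumed: a modern laptop sustains on the order of 1 gigaflop (10**9 ops/s).
laptop_flops = 10**9
laptops_needed = exascale_target // laptop_flops
print(laptops_needed)  # on the order of a billion laptops
```

Under that assumption, the result lands at the upper end of Sterling's "hundred million to a billion laptops" estimate.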
"We've allowed much of our technology to go overseas and be exported -- so now we have to buy back technology we've developed," said Sterling. "If we're not careful, we will lose further ground."
Yet the United States is still in the race. The Titan supercomputer at the Department of Energy's Oak Ridge National Laboratory is currently the world's second most powerful. And earlier this year, the agency struck a deal with Intel and supercomputing company Cray to deliver a system capable of 180 petaflops by 2018.
But supercomputing comes with its own set of challenges, including how efficiently the machines operate. Some existing supercomputer systems essentially waste the vast majority of their processing power when attempting to complete tasks, said Sterling, and that wasted processing power has real-world energy costs.
"If you scale current technology up to exascale levels, it would be up to the range of a nuclear power plant just to run one computer," Binkley said. Addressing that problem will be one of the focuses of the initiative, he said.
And even if a machine with this sort of processing power is created, there remains the challenge of getting it to do what researchers want, said Sterling. "The overwhelming challenge is how do you program these machines," he said.
But the growth in processing power needed for supercomputers may be reaching its limits. The rapid development of such technology has largely followed Moore's Law -- which holds that computing power approximately doubles every two years as transistors shrink. Now, Sterling said, we are rapidly approaching the nanoscale, where fundamental constraints like atomic granularity and the speed of light will make Moore's Law obsolete.
And the government, too, acknowledges that a major change is on the horizon. Among the other goals of the initiative is to establish a "viable path forward" for the future of supercomputing in a "post-Moore's Law era" over the next 15 years.
Even with the challenges posed by the end of Moore's Law, developing the exascale supercomputer is still largely achievable with the science we know now, Binkley said. But the next generation may require turning to other technologies, he said.
That's one reason Sterling says he is excited to see the Obama administration focusing on the challenge. In a way, he said, this is more than a moonshot because "no one's even decided what the rocket is going to look like."