In other words, bigger is better - and not just for the scientific bragging rights. Having a top-ranked supercomputer on American soil "demonstrates global competitiveness and attracts brainpower," says Jack Wells. Take Jeremy Smith, director of Oak Ridge's Center for Molecular Biophysics, who used to work at the University of Heidelberg in Germany. "I found out that Oak Ridge would have this nice toy to play with," he says, "so I nipped across the pond." (Smith's research on biofuels began on Jaguar and will continue on Titan.)
Many of the smart people that Titan attracts will use the supercomputer to chart the future of supercomputing itself. So-called petascale machines like Titan and Sequoia can accomplish amazing feats of simulation, like screening millions of potential drug compounds against a target molecule in a single day. But researchers like Jeremy Smith want to do even more.
They envisage an "exascale" computer - a thousand times more powerful than Titan and able to do one quintillion calculations per second (a quintillion is a one with 18 zeroes after it). A machine like this "would have enough computing power to screen tens of millions of drug compounds against all known living protein classes," Smith says. "That means we'll be able to predict if the drug will work and what all the side effects will be - not only generically, but for individual people, based on their own genetic sequences. This is amazing potential."
The trouble with building an exascale machine, however, is the amount of energy required to get there. "If we just scaled up what we're doing today, it would take a couple of nuclear power plants to power," says Buddy Bland. But Wu Feng, who curates an annual list of the world's most energy-efficient supercomputers, is less pessimistic. "The trends indicate that we'll be able to get to the exascale for 50 megawatts," he says. That's about half as much power as Apple and Google's data centers in North Carolina are estimated to use.
But government-funded scientific institutions don't have tech companies' bottomless bank accounts. The DoE wants an exascale computer by 2020 that can run on 20 megawatts of electricity or less. Reaching that goal will require entirely new chip designs that draw even less power than GPU-accelerated systems like Titan do.
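To make those power budgets concrete, the arithmetic implied by the figures above can be sketched out. This is a back-of-envelope calculation, not anything from the article itself: it assumes "exascale" means 10^18 operations per second, and the function and variable names are ours.

```python
# Back-of-envelope efficiency targets implied by the article's figures.
# Assumption: "exascale" = 1e18 floating-point operations per second.

EXAFLOP = 1e18  # operations per second


def gflops_per_watt(ops_per_second, watts):
    """Operations per second delivered per watt, expressed in gigaflops."""
    return ops_per_second / watts / 1e9


# Wu Feng's projection: an exascale machine running on 50 megawatts.
feng_target = gflops_per_watt(EXAFLOP, 50e6)

# The DoE goal: the same machine on 20 megawatts or less.
doe_target = gflops_per_watt(EXAFLOP, 20e6)

print(f"Feng projection: {feng_target:.0f} GFLOPS per watt")
print(f"DoE goal:        {doe_target:.0f} GFLOPS per watt")
```

In other words, Feng's 50-megawatt projection works out to 20 gigaflops per watt, while the DoE's 20-megawatt goal demands 50 gigaflops per watt, two and a half times more efficient still.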
Mobile devices, most of which use chip designs from the UK firm Arm, could offer a way forward. "You've probably noticed that when you put a smartphone in your pocket it doesn't burn through your pants," says Jack Wells. "The same design principle is going to be used in high-performance computing to get to the exascale." Jack Dongarra, a computer scientist at the University of Tennessee whose Top500 list ranks the world's fastest supercomputers, ran benchmarking software on an iPad 2 and found that the tablet was equivalent to some of the fastest supercomputers of the mid-1990s. "That's incredible computing power in your hand," he says. "The Arm processor is clearly capable."
Still, simply lashing together thousands of low-power processors - whether they come from smartphones, gaming consoles, or laptops - does not a supercomputer make. Passing data between all those chips creates bandwidth bottlenecks that limit the total speed of the system. "It's like having two hemispheres of your brain on opposite sides of the room connected by a wire," says Feng. An exascale computer will have to speed up its entire internal network - perhaps by using fibre optic connections between racks of chips, accelerators on every piece of silicon, or both.
Meanwhile, says Buddy Bland, jockeying for the title of "world's fastest supercomputer" will continue, and no single interconnect design or chip architecture is "best." "Whoever has the biggest budget is likely to be in the top spot," he says wryly. "But a healthy diversity in architectures is a wonderful thing because certain applications can run well on one, and others well on another."
What's indisputable is that supercomputing has become the "third pillar" of doing science, alongside theory and experimentation. The best way to grasp the power of Titan, says Bronson Messer, a computational astrophysicist at Oak Ridge, is not to compare it to a Formula 1 racing car or a turbocharged engine, but to the Large Hadron Collider. "Titan is like the particle accelerator, and the simulations and applications that we run on Titan are like the detectors that discovered the Higgs boson," Messer says. "The size or power of these machines isn't what pushes science forward. It's the people using them, who know what to look for."