
Supercomputers crunching potato chips, proteins and nuclear bombs

Story Highlights

• Fast-calculating computers needed to solve big problems
• Computer simulations used to test nuclear stockpile
• NASA designing new moon ship with supercomputer
• Global warming focus of Japan's Earth Simulator
By Peggy Mihelich
CNN

ATLANTA, Georgia (CNN) -- The chess match between Garry Kasparov and IBM's Deep Blue in 1997 was the showdown of man vs. machine: the world's greatest chess player versus the world's greatest chess-playing computer.

Deep Blue, a supercomputer that could calculate more than 200 million moves a second, defeated Kasparov 2 games to 1, with three games ending in a draw.

Deep Blue's match win was the first by a chess-playing computer in a traditional format over a reigning world champion.

Fast-forward nine years and supercomputers -- systems with multiple processors, huge memories and storage, and special software for performing the world's most complex calculations -- are doing far more than checkmating grandmasters.

Today's supercomputers are ensuring the reliability of the nation's nuclear stockpile, forecasting weather, designing safer, more fuel-efficient cars, mapping DNA, exploring the cosmos and even helping create potato chips.

"Pringles potato chips are designed using [supercomputing] capabilities -- to assess their aerodynamic features so that on the manufacturing line they don't go flying off the line," said Dave Turek, vice president of deep computing at IBM.

Supercomputers allow researchers to do in real time -- meaning days, weeks or months -- what could not be done during a lifetime with a single personal computer.

"The amount of data that some of these supercomputers [produce] would be, maybe as much as 100,000 times more data than what you can put on a hard drive on a normal PC," Turek said.

The world's top supercomputers can produce data at a rate equal to putting out one Library of Congress every few seconds, said Bruce Goodwin, associate director for defense and nuclear technologies at Lawrence Livermore National Laboratory in California.

An average PC can perform in the tens or hundreds of megaflops -- millions of calculations per second. A supercomputer like Purple at Livermore can calculate 100 teraflops -- 100 million million calculations per second.
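
To make those rates concrete, here is a small, purely illustrative Python sketch. The machine speeds are the round figures quoted above, not measured benchmarks, and the workload size is an arbitrary assumption chosen only for illustration.

```python
# Illustrative arithmetic only, using the round figures quoted above.
# A "flop" is one floating-point calculation per second.

PC_FLOPS = 100e6        # an average PC: on the order of 100 megaflops
PURPLE_FLOPS = 100e12   # ASC Purple: about 100 teraflops

# Hypothetical workload: one quintillion (10^18) calculations.
workload = 1e18

pc_years = workload / PC_FLOPS / (3600 * 24 * 365)
purple_hours = workload / PURPLE_FLOPS / 3600

print(f"Average PC: about {pc_years:,.0f} years")     # roughly 317 years
print(f"Purple:     about {purple_hours:.1f} hours")  # under 3 hours
print(f"Speedup:    {PURPLE_FLOPS / PC_FLOPS:,.0f}x") # 1,000,000x
```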

Testing nukes in a computer

Using this ability to think faster, Purple can simulate the explosion of a nuclear weapon -- from the moment the button is pressed to the point when the bomb detonates.

In just a few billionths of a second, many complex systems interact to create a nuclear explosion. To replicate that process accurately, Purple must calculate very fast.

In 1994 it would have taken the world's fastest computer 6,000 years to complete the highly classified "button to bang" simulation, Goodwin said. It took Purple about six weeks.
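
The same back-of-the-envelope arithmetic shows the speedup implied by those two figures. The Python snippet below is illustrative only and uses just the 6,000-year and six-week numbers quoted above.

```python
# Illustrative only: the speedup implied by the figures quoted above.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
SECONDS_PER_WEEK = 7 * 24 * 3600

time_1994 = 6_000 * SECONDS_PER_YEAR   # 1994's fastest computer: ~6,000 years
time_purple = 6 * SECONDS_PER_WEEK     # Purple: ~6 weeks

print(f"Implied speedup: about {time_1994 / time_purple:,.0f}x")
# Roughly 52,000x -- about a dozen years of supercomputer progress.
```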

Purple was conceived by the Department of Energy and built by IBM at a cost of $290 million to test the nation's nuclear stockpile.

In 1996 the United States signed the Comprehensive Test Ban Treaty. Although the Senate has never ratified the treaty, which bans all testing of nuclear weapons, including underground explosions, the government continues to adhere to the moratorium.

To ensure the security and reliability of the nation's aging nuclear stockpile (anywhere from 15 to 35 years old) without performing an actual test, the Department of Energy turned to computer simulation.

"It's a little bit like taking care of a 1958 Buick in Havana. At some point it becomes very difficult to be sure the car's going to start," Goodwin said.

"We have to constantly be checking them to make sure that they still work the way they were suppose to."

This same technology can cut huge costs for the manufacturing industry in the long run, Goodwin said. Future cars and planes will be designed and tested on a supercomputer before they are ever built. And they "will work the first time every time," he said.

The next generation

Purple is the newest supercomputer at the national laboratory, which also houses IBM's BlueGene/L system -- ranked the fastest computer in the world by Top500, a project that ranks the world's most powerful supercomputers and publishes its list twice a year. Purple ranks fourth in the world.

But these rankings will likely change.

In September the Department of Energy contracted with IBM to build a next generation supercomputer, one capable of sustaining a speed of 1,000 trillion calculations per second, or one petaflop.

Dubbed "Roadrunner," it will cost the government $110 million over three years and will be housed at the Los Alamos National Laboratory in New Mexico.

Goodwin said all this speed is necessary.

"The ability to do this kind of seamless integration all the way from the molecular level all the way up to the full functioning of a technical object -- like a nuclear weapon, airplane, spacecraft -- is an industrial competitiveness for the country."

It's an edge NASA needs in its space race with China. NASA's new moon ship, Orion, which is projected to ferry astronauts to the moon by 2018, is being designed with the help of the Columbia supercomputer.

Columbia, housed at Ames Research Center in California, is also used to aid engineers working on the space shuttle. It was used during the space shuttle's return-to-flight mission in August 2005.

Damage to a thermal blanket on Discovery posed a concern: the blanket could tear away during re-entry into Earth's atmosphere and strike the orbiter. NASA had to decide whether an in-flight repair was needed, and turned to Columbia for real-time answers.

Plugging in scenarios and running wind tunnel tests, engineers at Ames concluded in about 48 hours that the chance of the blanket damaging the orbiter was remote. NASA heeded their conclusion, and Discovery landed safely.

"What [supercomputing] has done is move NASA from a situation where we can do post-mortem assessment of damage to one where we can come up with likely outcomes in time where we can do something about it," said Columbia project manager, Bill Thigpen.

"Not losing a craft, not losing lives is a huge advantage," he said.

Hunting mutant genes

For biologist Jeffrey Skolnick at the Georgia Institute of Technology, saving lives here on Earth is what his research is all about.

He's employing an $8 million "poor man's" supercomputer for biomedical research. Poor in the sense that it's smaller and slower than the likes of Purple and Columbia, but rich in possible results -- drugs that could treat some of today's most life-threatening diseases.

"Having one of these computers allows you to ask a lot of the 'what if' questions," Skolnick said.

Skolnick is working with proteins -- chains of amino acids that play a key role in the functioning of living cells. He uses a supercomputer to isolate mutant proteins that can cause diseases such as Alzheimer's, Parkinson's or mad cow disease.

"We have all these genes that are normal and abnormal. We want to understand, what do they do. What is their molecular function, what pathways are they associated with, what's working, and what's not working," Skolnick said.

If his research can find patterns or holes in the proteins, and ways to repair them, then drugs can be developed.

The future of the Earth and universe, today

Climate researchers use NEC's Earth Simulator supercomputer to study global warming. The Earth Simulator generates climate models that can forecast conditions 50 to 100 years into the future.

"We are looking for temperature change, density of the air, cloud formation, wind speed and rainfall," says professor Tetsuya Sato, director-general of the Earth Simulator Center in Yokohama, Japan.

Climate data is collected from satellites and ocean buoys and fed into the supercomputer. Results are turned into animations that help climatologists understand what is happening to the planet, Sato said.

Animations also help move the heavens for professor Joel Primack at the University of California at Santa Cruz.

He feeds digitally enhanced pictures taken by ground-based telescopes into a supercomputer and turns them into movies.

"All that telescopes give us are snapshots. They see one moment in the billion [year] evolution of a galaxy and we never see it a billion years earlier or later -- we see it as it was when the light left it," Primack said.

With a supercomputer he can see what happens between the snapshots.

"It's a whole new world to be able to use such a powerful machine," Primack said.



Supercomputers are used in the design of Pringles potato chips.

FACT BOX

WORLD'S FASTEST COMPUTERS

1. BlueGene/L
Builder: IBM; Location: Lawrence Livermore National Laboratory

2. Red Storm
Builder: Cray; Location: Sandia National Laboratories

3. BGW
Builder: IBM; Location: IBM Thomas J. Watson Research Center

4. ASC Purple
Builder: IBM; Location: Lawrence Livermore National Laboratory

5. MareNostrum
Builder: IBM; Location: Barcelona Supercomputing Center

Source: Top500 (November 2006 rankings)
