September 30, 2014

Your College Gets a Supercomputer! And Yours, and Yours!

John Beale

Shawn T. Brown, of the Pittsburgh Supercomputing Center, helps build software to run simulations using controllers from Wii video games. His Buckyball bowling (above), for example, is used to teach biology.

It's no longer a big deal for a college to own a supercomputer. Nearly every college, regardless of size, may soon have one.

Just 10 years ago, supercomputers cost millions of dollars each and were installed only at elite universities and government labs. The machines were used in just a couple of fields, like astrophysics, or for sensitive military research. Today supercomputers, generally defined as machines among the fastest available at any given time, run simulations that help researchers understand weather, outer space, oceans, economies, human biology, and more. And they can be built on the cheap.

Consider the new supercomputer that Monmouth College, which has about 1,300 students, cobbled together from dozens of old high-end computers bought on eBay for about $200 each. It was assembled by Christopher G. Fasano, a professor of physics, who figures the machines retailed for $30,000 apiece a few years ago, when they were new. He uses the homemade device for research on protein folding, in hopes that his simulations of the behavior of living cells can one day help medical researchers cure diseases.

Online auctions are not the only source for cheap computing power. The U.S. government has spent hundreds of millions of dollars building a series of shared supercomputers that any college can use over the Internet—though many professors and administrators are unaware that the system exists. And leaders of high-performance computing efforts have recently worked to make supercomputers more user-friendly and to encourage scientists in more fields to try them.

As supercomputing reaches a wider array of colleges and disciplines, though, a new set of challenges is emerging.

Mr. Fasano worries about "a new kind of digital divide," putting small institutions that don't devote time and money to supercomputing at a competitive disadvantage when it comes to attracting the best researchers and students.

Case in point: Charlie Peck, an associate professor of computer science at Earlham College, a small liberal-arts institution, got a call from the academic dean one day asking him if the college had a supercomputer. "The chemist that I just interviewed needs one to come to Earlham," Mr. Peck remembers the dean saying of a job candidate he was wooing. The sought-after researcher, who relied on supercomputers for his research, was considering only institutions that could meet his need for speed.

"Yep," Mr. Peck told his boss, the college had one up and running. Mr. Peck had scored a grant to put one together.

Mr. Fasano points out that students, too, increasingly need exposure to supercomputing. "If you want your students to go off to grad school in highly technical fields, it's really important for them to have this background," he says.

The upside is that all kinds of colleges—big or small—can now participate in attacking the largest and most socially relevant research questions.

"We're getting to the point where a lot of the problems we want to solve are hard, and they require large amounts of computing power and the ability to access and process very large amounts of data," says Amy W. Apon, director of the Arkansas High Performance Computing Center.

Why Supercomputers?

It's not as if every researcher needs a supercomputer, of course. They are definitely overkill if you simply want to run a Web browser or a word processor. But they are essential to create computer models of complex phenomena.

Mr. Peck, of Earlham College, explains that for one chemistry professor on his campus, it took about a week on a conventional computer for her model of organic compounds to return an answer when she plugged in a given set of parameters. That's a lot of time to twiddle thumbs while the status bar ticks away. The college's supercomputer can answer the same question in about an hour.
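The speedup Mr. Peck describes comes from parallelism: a cluster splits many independent model runs across processors instead of working through them one after another. A minimal sketch of the idea in Python, where `simulate` is a hypothetical stand-in for one expensive chemistry run (not the Earlham code itself):

```python
# Illustrative only: divide independent parameter sets among worker
# processes, the same way a cluster divides runs among its nodes.
from multiprocessing import Pool

def simulate(params):
    # Stand-in for one expensive model run; here just a toy computation.
    return sum(p * p for p in params)

if __name__ == "__main__":
    parameter_sets = [(i, i + 1, i + 2) for i in range(8)]

    # Serial: one run after another.
    serial = [simulate(p) for p in parameter_sets]

    # Parallel: the same runs spread over 4 worker processes.
    with Pool(4) as pool:
        parallel = pool.map(simulate, parameter_sets)

    assert serial == parallel  # same answers, less wall-clock time
```

On a real cluster the runs are spread across many machines with tools such as MPI rather than a single machine's processes, but the principle is the same: independent work divided among workers finishes in a fraction of the serial time.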

"I like to say it's reducing the time to science," Mr. Peck says. That means researchers not only can cover more ground but also have time to ask more creative questions—to play 'what if?' with their material—and know that taking chances won't put them weeks behind schedule.

Supercomputers are also key to many projects that present high-resolution pictures of data, sometimes on giant displays that take up an entire wall of a room. Such pictures can help researchers spot patterns or details they might not otherwise have noticed.

Increasing access to supercomputers is crucial to keeping U.S. research competitive. That's the argument of a 2006 report by the National Science Foundation that lays out the agency's vision for the future of U.S. supercomputing, which it now calls cyberinfrastructure (a term covering both high-powered computers and the high-speed networks that connect them). "Problems of national need," the report says, including protecting the electric-power grid, improving health-care delivery, developing sustainable energy, and speeding transportation, will not be solved unless more institutions and more professors can apply supercomputers to their toughest problems.

The Power of Sharing

A year ago, several colleges started a program to spread the word about supercomputing and help spur that expansion. The program, called Campus Champions, designates a professor at each participating campus as an evangelist and technical-support contact for the TeraGrid, a network of supercomputers supported by the National Science Foundation and formed in 2000.

If you've never heard of the TeraGrid, you're not alone—hence the evangelical push.

Many scientists who do know about it mistakenly think they are not eligible to use these computers, says Edward Seidel, director of the National Science Foundation's office of cyberinfrastructure.

"People think you have to be a member of a certain project, or have a certain type of grant, to use them, which is not true," he says. Researchers have to fill out a request for time on the systems, but Mr. Seidel says most requests are granted, and without much of a wait.

In the past, getting time on the TeraGrid was more of a challenge, he says, but in recent years powerful new machines have been added. "In two years we've increased the capacity by a factor of 10," he says.

So the TeraGrid could actually use more users, and that's where Campus Champions comes in. The program is now at 43 colleges and universities around the country, including campuses of many types: Navajo Technical College has a champion, as do Elizabeth City State University, the University of Hawaii, and Yale University. Mr. Fasano is a champion for Monmouth, as is Mr. Peck for Earlham. They signed up to help when their home-built supercomputers were not fast enough for a problem at hand, since the TeraGrid machines are some of the fastest in the country.

Using a TeraGrid computer, or even a supercomputer on your own campus, isn't as easy as just flipping the "on" switch, though. In most cases, researchers have to adapt their simulations or other programs to run on a supercomputer—and that can be a research project in itself. Klaus J. Schulten, a professor of physics at the University of Illinois at Urbana-Champaign, has helped build one of the most popular software environments that run on supercomputers, called VMD (for visual molecular dynamics). He says the team has worked to try to make it more user-friendly. It's free and open source, meaning others can modify it as well. He regularly travels to give training lectures on how to use it, and the audience is not just computer geeks. "The people who are being trained are not computational experts—they're normal researchers," he says. Mr. Schulten says that VMD has been downloaded more than 130,000 times, and that the number is growing steadily.

Mr. Peck, of Earlham, admits that supercomputing software is "nowhere near as user-friendly" as most commercial software for traditional PCs, so there's definitely a learning curve.

Shawn T. Brown, a senior scientific specialist for the Pittsburgh Supercomputing Center, may have put the friendliest face yet on supercomputers. He and his colleagues built software to run supercomputing simulations using controllers from the Wii video-game system. With his "Buckyball bowling" software, for instance, players simply wave the wireless Wii controller and the spherical model of a carbon molecule moves in the same direction.

Mr. Brown says he was inspired to create the system after seeing how easily his 5-year-old daughter took to the Wii system: "It seemed like a really good way to give an accessible interface to supercomputing." He has given demonstrations of his playful hookup to middle- and high-school students.

And if students get exposed to supercomputers in grade school, they will definitely be looking for them when they enter the college-campus gates.

College 2.0 explores how new technologies are changing colleges. Please send ideas to jeff.young@chronicle.com.

Comments

1. abbiatti - August 10, 2009 at 06:30 pm

Well said Jeff! We are experiencing the potential that resides in scaling our high performance computing and high speed networking resources across the K-20 curriculum and beyond! Mike Abbiatti, Executive Director Arkansas Research and Education Optical Network

2. princeton67 - August 12, 2009 at 02:03 pm

The terabyte computers will generate petabyte problems, which will lead to exabyte research, which will result in zettabyte requirements in yottabyte networks... ad infinitum. Then we'll have to create new prefixes.

3. labronx - August 12, 2009 at 04:30 pm

Supercomputers are obsolete. Just network some linux machines and install MPI. Or, heck, use Windows. Besides, in a few more years, all computing will be done on remote servers and displayed on the web.
