Berkeley -- Two national supercomputing
centers were chosen Friday (3/28) by the National Science Foundation to
spearhead a nationwide effort to make high-performance supercomputers more
widely available to the country's scientists and engineers.
UC San Diego and its San Diego Supercomputer
Center will lead one of the partnerships, which involves 37 national research
institutions including UC Berkeley. UC San Diego will soon begin negotiations to forge a cooperative agreement with NSF valued at approximately $170 million over five years.
"Until now the national supercomputing
labs have primarily been in the business of providing 'cycles,' that is,
providing the research community with access to high-performance computers,"
says Susan L. Graham, chief computer scientist for the UC San Diego-led
partnership and professor of computer science at UC Berkeley. "With
this new partnership, we now will also help the user community to be early
users of emerging high-performance technologies."
The group led by UC San Diego -- called
NPACI, for National Partnership for Advanced Computational Infrastructure
-- will concentrate on numerous scientific problems requiring intensive
computing. These problems range from climate and weather prediction to the
design of drugs and new materials.
The UC Berkeley collaborators will concentrate
on the following areas:
- UC Berkeley computer scientists have
created a new kind of parallel computer by linking 100 inexpensive, off-the-shelf
workstations into a network that has the capabilities of a supercomputer.
Computer science professor David Culler will make this one-of-a-kind network
-- dubbed NOW, for Network of Workstations -- available to NPACI partners,
and in the process get feedback to help refine the system (a rough sketch
of the idea appears after this list).
- With numerous partners, UC Berkeley computer
scientists Susan Graham, Katherine Yelick, James Demmel and Phillip Colella
hope to develop components of a common runtime system and a common parallel
linear algebra library that will help to coordinate the use of computers
scattered around the country -- a technique known as metacomputing (see
the second sketch following this list). The effort builds in part on
separately funded work in UC Berkeley's Titanium group, which is developing
compiler and language support for parallel programming of distributed
memory multiprocessors, and the ScaLAPACK group, which is developing
scalable parallel numerical software libraries.
- While most supercomputer users prize these machines' capacity for fast,
repetitive calculation, a growing number need the ability to crunch large
amounts of data, such as the huge streams of data transmitted from
satellites. As an accompaniment to his Digital Library project, computer
science professor Robert Wilensky will work with 19 partners to develop
tools for data-intensive computing when the data is stored at many
locations around the world (see the third sketch following this list).
- James Demmel, Robert Wilensky and various
collaborators will work to improve atmospheric and ocean modeling, create
better visualization systems for such models, and develop ways to make
the data available through an Earth systems digital library.
- Several UC Berkeley faculty hope to provide
NPACI partners with supercomputing software solutions to investigate complicated
engineering problems. Gregory Fenves of civil engineering plans to study
vehicle crashworthiness; Andrew Neureuther of electrical engineering
and computer science will attempt to simulate optical phenomena in photolithography;
and David Bogy of mechanical engineering will model air-bearing sliders
for hard disk drives.
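
The NOW system described in the first item above rests on a simple idea:
message passing lets many commodity workstations cooperate on a single
computation. NOW's own software layers are not shown here; what follows is
only a minimal sketch of the general idea, written in Python with the
modern mpi4py library (an assumption for illustration; the hostfile name
and launch command are likewise hypothetical).

```python
# Minimal sketch: treating a pool of networked workstations as one
# parallel machine, in the spirit of NOW. Uses mpi4py, an assumption
# for illustration; NOW's own communication layer is not shown.
# Launch across the machines listed in a (hypothetical) hostfile:
#   mpiexec -n 100 -hostfile workstations.txt python now_sketch.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this workstation's id
size = comm.Get_size()   # number of cooperating workstations

N = 10_000_000           # total work, split evenly across the cluster

# Each node integrates f(x) = 4/(1+x^2) over its own share of [0, 1];
# the shares together form a midpoint-rule estimate of pi.
xs = (np.arange(rank, N, size) + 0.5) / N
local = np.sum(4.0 / (1.0 + xs * xs)) / N

# Combine the partial sums on node 0, much as a supercomputer's
# internal network performs a reduction.
total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(f"pi ~= {total:.10f} using {size} workstations")
```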
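
The metacomputing item above mentions a common parallel linear algebra
library. The sketch below shows the kind of operation such a library
coordinates: a matrix-vector multiply in which no single machine ever
holds the whole matrix. The real ScaLAPACK library is Fortran code built
on BLACS process grids and block-cyclic data layouts; this row-block
Python/mpi4py version is an illustrative assumption, not NPACI software.

```python
# Minimal sketch of distributed linear algebra: y = A @ x where the
# rows of A are spread across machines. Real ScaLAPACK uses Fortran,
# BLACS grids and block-cyclic layouts; this is a simplified stand-in.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 8 * size                 # global dimension, divisible by size
rows = n // size             # rows of A owned by each process

# Each process holds only its own rows of A, never the whole matrix,
# plus a full copy of the vector x replicated everywhere.
rng = np.random.default_rng(seed=rank)
A_local = rng.standard_normal((rows, n))
x = np.arange(n, dtype=float)          # identical on every process

y_local = A_local @ x                  # purely local computation

# Gather the pieces of y onto process 0.
y = np.empty(n) if rank == 0 else None
comm.Gather(y_local, y, root=0)

if rank == 0:
    print("first entries of y = A @ x:", y[:4])
```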
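
Finally, for the data-intensive tools mentioned in the third item, the
central problem is moving and combining data that lives at many sites. The
sketch below fetches pieces from several servers at once before reducing
them locally; the URLs are hypothetical placeholders, not real NPACI or
Digital Library services.

```python
# Minimal sketch of data-intensive computing over distributed data:
# fetch chunks from several (hypothetical) sites in parallel, then
# reduce them locally. Real tools would stream, checksum and schedule.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

SITES = [  # placeholder URLs, assumptions for illustration
    "https://data.site-a.example/stream/chunk0",
    "https://data.site-b.example/stream/chunk1",
    "https://data.site-c.example/stream/chunk2",
]

def fetch(url: str) -> bytes:
    """Pull one remote chunk of data."""
    with urlopen(url, timeout=30) as resp:
        return resp.read()

def main() -> None:
    # Overlap the slow network transfers instead of fetching serially.
    with ThreadPoolExecutor(max_workers=len(SITES)) as pool:
        chunks = list(pool.map(fetch, SITES))
    total = sum(len(c) for c in chunks)
    print(f"fetched {len(chunks)} chunks, {total} bytes in all")

if __name__ == "__main__":
    main()
```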