[Source: ScienceDaily] - In less time than the blink of an eye, the Translational Genomics Research Institute's new supercomputer at Arizona State University can perform one operation for every dollar in the recent Wall Street bailout.
That would be 700 billion computations in less than 1/60th of a second, says Dan Stanzione, director of the High Performance Computing Initiative at ASU's Ira A. Fulton School of Engineering.
The "Saguaro 2" supercomputer, housed on the first floor of ASU's Barry M. Goldwater Center for Science and Engineering, is capable of 50 trillion mathematical operations per second.
"That's the equivalent of taking a calculator and doing one operation per second, by hand, continuously for the next one and a half million years," Stanzione said.
Although the computing world changes daily and such measurements depend on numerous factors, Stanzione said, for some functions ASU's new computer may be among the top five in the world.
TGen will need that speed as it continues its research into a variety of human diseases through the use of data-rich DNA sequencing, genotyping, microarrays and bioinformatics.
"This is really a remarkable testament," to the cooperative efforts of ASU and TGen, said Dr. Jeffrey Trent, President and Scientific Director of TGen, especially in a tight funding environment.
The new supercomputer will help TGen's efforts in translational biomedicine, developing new therapies targeted for individual patients suffering from Alzheimer's, autism, diabetes, coronary heart disease, melanoma, pancreatic cancer, prostate cancer, colon cancer, multiple myeloma, and breast cancer.
Dr. Edward Suh, TGen's Chief Information Officer, said a joint TGen-ASU computer support team is being assembled, and he urged the creation of more partnerships between TGen and ASU.
"I am confident this new supercomputer system will help the ASU and TGen scientists expedite their research, and accelerate innovation in biomedical and engineering research," Suh said. "It is my hope to see this supercomputer system, and a supporting informatics program which Dan and I are putting together, bring the ASU and TGen scientists closer than before for even greater success."
Saguaro 2 – a partially water-cooled set of 7-foot-tall black monolith computer racks, each with as many as 512 processor cores, linked by ultra-high-speed InfiniBand cables – was funded in part by a nearly $2 million grant awarded in July by the National Institutes of Health. The grant was in response to a wide range of scientific activities proposed by TGen, the Ira A. Fulton School of Engineering, and ASU's Biodesign Institute.
The new system doubles the capabilities of ASU's High Performance Computing Initiative (HPCI). The system consists of Intel microprocessors, servers from Dell, storage from Data Direct Networks, and components from a number of other partners, including fiber optic cables from Phoenix-based Zarlink.
More importantly for TGen, the new system has 20 times the previous computational power available to TGen researchers, said James Lowey, director of TGen's High Performance Biocomputing Center.
The new supercomputer also adds to the storage capacity of the HPCI, bringing the total to 1.5 quadrillion bytes, or 1.5 petabytes (1,500,000,000,000,000 bytes). That's enough space to hold the contents of more than 300,000 single-layer DVDs.
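The DVD comparison follows from simple division. A quick check, assuming the standard 4.7 GB single-layer disc:

```python
TOTAL_BYTES = 1.5e15   # 1.5 petabytes of HPCI storage
DVD_BYTES = 4.7e9      # capacity of a standard single-layer DVD

print(f"{TOTAL_BYTES / DVD_BYTES:,.0f} DVDs")  # ~319,149
```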
The HPCI storage will be used to store a vast array of data from TGen's sequencers and simulations, as well as other large datasets from ASU researchers, including a high resolution mapping of the moon to be performed in 2009 by NASA's Lunar Reconnaissance Orbiter.
"As we move in science into the nano scale of materials and molecular design and diagnostics, or into the macro scale of global climate or the motion of the galaxies, experimentation becomes more expensive and difficult, and simulation becomes invaluable," Stanzione said. "The speed of those simulations determine the speed of progress."
The computational speed of Saguaro 2 is especially critical to the work of TGen. "In 2009, more genome sequence data will be generated than all the words spoken by humans in all of history. Teasing meaningful understanding from this avalanche of data is also the role of HPC (high performance computing)," Stanzione said.