Computational Chemistry Research Highlights
News Release
August 30, 2005
RICHLAND, Wash. —
Researchers associated with the Department of Energy's Pacific Northwest National Laboratory presented the following results at the 230th national meeting of the American Chemical Society, Aug. 28 through Sept. 1 in Washington, D.C.
Pacific Northwest National Laboratory researcher Tjerk Straatsma presented his results Monday, Aug. 29.
Killer microbe may be a lifesaver after all
Advances in the molecular modeling and simulation of complex biological systems are enabling researchers to study how certain microbial systems may play an important role in the remediation of contaminated soils. One target is Pseudomonas aeruginosa, a common microbe in sediments and the subsurface. This bacterium is also an important opportunistic pathogen that can cause fatal infections in people with weakened immune systems.
T.P. Straatsma is leading a team of researchers modeling the lipopolysaccharide outer membrane of P. aeruginosa to learn how the membrane responds to its environment. The research addresses how this microbe adsorbs to mineral surfaces and by what mechanism it takes up and reduces heavy metals. The answers have significant implications for bioremediation: if radioactive metals can be reduced to an insoluble form, further spread of the contamination can be prevented.
In another project, the team is also addressing health-related issues concerning this microbe. Again focusing on the outer membrane, Straatsma and his coworkers are studying the role of a range of proteins embedded in the membrane, as well as the mechanism of action of certain antibiotics that are effective in treating the P. aeruginosa infections that plague cystic fibrosis patients, burn victims and patients with compromised immune systems.
Pacific Northwest National Laboratory researcher Theresa Windus presented her results Tuesday, Aug. 30.
Supersizing the supercomputers: What's next?
Supercomputers excel at highly calculation-intensive tasks, such as molecular modeling and large-scale simulations, and have enabled significant scientific breakthroughs.
Yet supercomputers themselves are subject to technological advancements and redesigns that allow them to keep pace with the science they support.
The current vision of future supercomputers calls for them to be highly heterogeneous. Rather than a central processing unit (CPU) with its own memory, disk and interconnect, for example, the CPU will contain many smaller processor cores making up a larger whole, and systems will include different types of processors, such as vector processors and field-programmable gate arrays (FPGAs). The location and type of memory will become more complex as well.
High-performance components, encapsulated chunks of software that perform specific tasks, will be coupled to a dynamic framework that allows scientists and the software itself to determine, at run time, which algorithms or modifications to algorithms will perform well on a particular architecture.
Multiple levels of parallelism will be explored, including parallelism at the component level, parallelism within a component, parallelism within a subroutine and threading.
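These nested levels can be made concrete with a small sketch. The following Python fragment is an illustration written for this article, not software from the presentation, and all names in it are hypothetical; it nests two of the levels named above, with independent components running as separate processes and each component threading over its own subtasks.

    # A minimal sketch (hypothetical, not from the talk) of two nested
    # levels of parallelism: processes across components, threads within one.
    from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

    def work_item(block: int) -> float:
        # Fine-grained subtask inside one component (a stand-in numerical kernel).
        return sum(1.0 / (i + 1) for i in range(block * 1000, (block + 1) * 1000))

    def component(cid: int) -> float:
        # One "component": it parallelizes its own subtasks with threads.
        blocks = range(cid * 8, (cid + 1) * 8)
        with ThreadPoolExecutor(max_workers=4) as threads:
            return sum(threads.map(work_item, blocks))

    if __name__ == "__main__":
        # Component-level parallelism: independent components run as processes.
        with ProcessPoolExecutor(max_workers=4) as procs:
            totals = list(procs.map(component, range(4)))
        print(f"combined result: {sum(totals):.6f}")

On a real machine the outer level might instead be MPI ranks spread across nodes, but the structure, coarse parallelism outside and fine-grained threading inside, is the same.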
These supercomputers of the future will provide orders of magnitude more computing power, but their increasing complexity will also require experts in computational science, mathematics and computer science to work together to develop the software the science demands.
University of Alabama, Tuscaloosa researcher David Dixon presented his results Monday, Aug. 29. Dixon conducted his research as a user of the Environmental Molecular Sciences Laboratory located at Pacific Northwest National Laboratory.
High-performance computing may improve combustion efficiency
Rising oil prices have revved up efforts to develop more efficient combustion systems. Reaching that goal, however, requires a deeper understanding of the complex chemical reactions involved in combustion.
In one of the largest simulations ever brought to bear on this problem, researchers at Pacific Northwest National Laboratory performed quantum chemical calculations to accurately predict the heat of formation of octane, a key component of gasoline.
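One standard thermochemical route, shown here for context rather than quoted from the presentation, obtains the heat of formation from the molecule's calculated total atomization energy \( \Sigma D_0 \):

\[
\Delta H_f(\mathrm{C_8H_{18}}) \;=\; 8\,\Delta H_f(\mathrm{C},g) \;+\; 18\,\Delta H_f(\mathrm{H},g) \;-\; \Sigma D_0 ,
\]

where the gas-phase atomic heats of formation are taken from experiment and \( \Sigma D_0 \) comes from the quantum chemical calculation.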
The calculation, performed on 1,400 parallel processors, took only 23 hours to complete and achieved a sustained efficiency of 75 percent, compared with the 5 to 10 percent efficiency of most codes. For comparison, the best single-processor desktop computer would have required three and a half years and 2.5 terabytes of memory to run the calculation.
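A rough back-of-the-envelope check, ours rather than the presenters', shows the two figures are consistent:

\[
1400 \;\text{processors} \times 23 \;\text{hours} \;=\; 32{,}200 \;\text{processor-hours} \;\approx\; 3.7 \;\text{years},
\]

in line with the quoted three and a half years on a single processor.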
These pioneering calculations also helped identify the level of theory needed for subsequent efforts to reliably predict the heat of formation of larger alkanes in diesel fuel, for which there is very little experimental data, and the heat of formation of key reactive intermediates, such as alkyl and alkoxy radicals, for which there is no experimental data.
University of California, San Diego researcher John Weare presented his results Tuesday, Aug. 30. Weare conducted his research as a user of the Environmental Molecular Sciences Laboratory at Pacific Northwest National Laboratory.
Getting down to basics: new technology will make it possible
The goal is to produce large-scale, first-principles simulations of ion hydration and phosphoryl transfer signaling reactions, two fundamental processes that occur, respectively, in the environment and in the human body, yet are poorly understood.
The two processes are otherwise dissimilar; normally they would not be discussed at the same conference, let alone in the same presentation. But they share one thing: scale.
"This is as big as it gets in modeling," says John Weare, with 500-plus atoms to be scrutinized in any one simulation. By contrast, projects undertaken a few years ago may have modeled 20 or 30 atoms in a simulation.
Advancements in computational capabilities have made such monumental tasks possible. The central idea, Weare continues, is that "real life is large" and these multiscale projects illustrate "what we can do now that we couldn't do before."
In addition to simulating complex behaviors with "many, many particles," Weare's team devotes about 40 percent of its effort to developing algorithms and code for a new generation of high-performance machines and architectures still on the horizon.
"There's a new wind blowing in science," Weare reports. "New equipment means new solutions are possible-we can get computers to solve really hard chemical problems, and that's changed how we approach theory. It's a different paradigm."
Tags: Environment, Fundamental Science, Computational Science, Chemistry