Recent advances in computing technology have brought unprecedented understanding of the complexities of fluid dynamics. Researchers at the APS March Meeting demonstrated how they have been employing the world's fastest supercomputers to better model how dynamic systems behave in a variety of contexts.
“Despite its intrinsic chaotic nature, there is also a level of order, some kind of organization, and we take advantage of that,” said Paolo Padoan of UC San Diego. “There is a sort of universality, and in turbulent flows there are universal statistical properties.”
To better understand the formation of stars, Padoan developed a program to model the granular flow of astrophysical dust clouds. Using one of the world's fastest supercomputers, at NASA's Ames Research Center, he has been able to create virtual stellar nebulae up to several light-years across. The program tracks the evolution of these turbulent nebulae over millennia as gravity pulls and twists interstellar dust into star-forming tendrils.
Padoan's program can finely model these clouds by breaking down each cloud's physical behavior into different levels of detail based on its density. In closely packed regions, a secondary physics engine kicks in to better simulate the more complicated interactions leading up to star formation. In areas dense enough that dust and gas collapse and stellar fusion begins, the program taps a third physics engine to detail the actual star formation.
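The density-based switching described above can be sketched in a few lines. This is a purely illustrative toy, not Padoan's code; the function name and the density thresholds are invented placeholders.

```python
# Hypothetical sketch of density-based level-of-detail switching between
# physics engines, loosely analogous to the approach described above.

def pick_physics_level(density, rho_dense=1e4, rho_collapse=1e8):
    """Choose a physics model from local gas density (arbitrary units).

    The thresholds are illustrative placeholders, not published values.
    """
    if density >= rho_collapse:
        return "star_formation"   # third engine: collapse and fusion onset
    if density >= rho_dense:
        return "dense_region"     # second engine: complicated interactions
    return "bulk_turbulence"      # base engine: granular turbulent flow

# Classify three sample cells spanning diffuse to collapsing gas.
levels = [pick_physics_level(rho) for rho in (1e2, 5e4, 3e9)]
```

The point of the dispatch is economy: the expensive physics runs only where the density justifies it, leaving the cheap turbulence model in charge of the vast diffuse bulk of the cloud.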
Using the program, Padoan found that star-forming tendrils emerge in patterns consistent with unexpectedly weak magnetic fields. In this way, Padoan and his team have primarily used the program to understand the underlying dynamic properties of interstellar clouds.
“What we do is highly idealized,” Padoan said. “We don’t try to reproduce the shape of a molecular cloud.” He added that a colleague of his was using the program to better understand the formation of the first stars out of the primordial gas cloud.
Recreating existing complex clouds is exactly what Fuqing Zhang of Penn State University is doing. He has been taking Doppler radar information on the paths of hurricanes crossing the Gulf of Mexico in hopes of predicting where they’ll hit the coast. This approach to forecasting relies primarily on probabilities derived from estimated cloud turbulence models.
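Probabilistic forecasting of this kind typically works by running many slightly perturbed simulations and summarizing the spread. The sketch below is a schematic illustration of that idea only, with invented numbers; it is not Zhang's data-assimilation system.

```python
# Illustrative ensemble-forecast sketch: run many perturbed landfall
# forecasts and summarize them as a mean estimate plus an uncertainty.
# Purely schematic; positions and error sizes are made-up numbers.
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def forecast_landfall(observed_position_km, perturbation_km=15.0):
    """One ensemble member: observed position plus random model/obs error."""
    return observed_position_km + random.gauss(0.0, perturbation_km)

members = [forecast_landfall(250.0) for _ in range(1000)]
mean_landfall = sum(members) / len(members)
spread = (sum((m - mean_landfall) ** 2 for m in members) / len(members)) ** 0.5
```

The ensemble mean gives the most likely landfall point, while the spread quantifies how much confidence the forecast deserves; better radar data narrows that spread.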
“This will be the future of hurricane predictions,” Zhang said. “With better data and a better way to get the data into the model we could potentially make a big difference.”
Zhang is able to produce a working forecast within seven hours of storm chasers’ data collection. In 2008, with Hurricane Ike bearing down on the Gulf states, his team at the Texas Advanced Computing Center predicted within a few miles where the storm was going to hit.
Zhang’s team calculated the hurricane’s path more accurately than the National Weather Service, largely because of the finer resolution his model afforded. The computing power at TACC allowed it to resolve the chaotic hurricane at roughly 1.5-km resolution, rather than the 5-km resolution available to NOAA. To further improve predictions, Zhang stressed, a better understanding of the turbulent dynamics of individual clouds is needed.
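The cost of that finer grid can be estimated with simple arithmetic: covering the same area at 1.5 km instead of 5 km multiplies the number of horizontal grid points by the square of the resolution ratio. A back-of-the-envelope sketch (my calculation, not a figure from the talk):

```python
# Back-of-the-envelope scaling: a horizontal grid over the same domain
# needs (coarse/fine)**2 times more points in two dimensions.

def gridpoint_ratio(coarse_km, fine_km, dims=2):
    """How many times more grid points the finer spacing requires."""
    return (coarse_km / fine_km) ** dims

ratio = gridpoint_ratio(5.0, 1.5)  # roughly 11x more points at 1.5 km vs 5 km
```

And since a finer grid also demands proportionally smaller time steps, the real computational cost grows faster still, which is why resolution like this takes a supercomputing center.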
Hurricanes can be thought of as massive yet inefficient natural engines using heat from the sun to transfer moisture across great distances. Jacqueline Chen of Sandia National Laboratories has been working on the underlying science needed to improve the efficiency of car engines by using some of the world’s fastest computers to model ethylene-air jet flames.
“What our group has been doing for the past decade is we use high-performance computing to directly simulate some of the gas-based chemical interactions,” Chen said.
Chen and her team use over 1.3 billion individual data points to map a lifted autoigniting turbulent jet flame. Taking advantage of petascale computing power, Chen’s team has been able to glean numerous insights into reactive turbulent mixing and its effect on finite-rate chemistry. Engineers designing the next generation of energy-efficient cars will be able to adapt these simulations to develop engines that burn fuel efficiently with lower emissions.
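A sense of why such a simulation needs petascale hardware comes from a quick memory estimate. The variable count below is an illustrative assumption, not a detail of Chen's setup:

```python
# Back-of-the-envelope memory footprint of one simulation snapshot with
# 1.3 billion points. Ten double-precision variables per point is an
# assumed figure for illustration, not Chen's actual configuration.

def snapshot_bytes(points, variables_per_point, bytes_per_value=8):
    """Bytes needed to hold one full snapshot of the simulated fields."""
    return points * variables_per_point * bytes_per_value

gib = snapshot_bytes(1.3e9, 10) / 2**30  # roughly 97 GiB per snapshot
```

That is a single time step; tracking the flame's evolution multiplies it by thousands of steps, which is why this class of direct simulation lives on the largest machines available.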
“We’re at the fundamental science end of the spectrum rather than applied engineering,” Chen said, adding that even small boosts in engine efficiency would have a tremendous impact. Currently, combustion accounts for roughly 85 percent of the energy used in the US, while transportation accounts for more than 65 percent of the petroleum consumed.
In many ways, the human circulatory system is one of the most complex and efficient feats of engineering. Twenty-four hours a day, the heart circulates the body’s roughly 5.5 liters of blood through an intricate network of veins and arteries more than 60,000 miles long. Until now, predicting the effect of medications such as blood thinners in so complex a system has been extremely difficult. To better understand their effects, George Karniadakis of Brown University is helping to map the blood flow of the entire human circulatory system.
“The job is very complex,” Karniadakis said. “We have to simulate everything, both the large scale but also the very small scale, from the arteries down to the very small capillaries.”
Blood flows through a complicated network, ranging from massive arteries as wide as a roll of quarters down to capillaries only big enough to let a single blood cell through at a time. To best model the many different sizes and types of junctions, Karniadakis split the work among separate computing labs across the country. Each lab modeled the fluid flow through one type of arterial intersection. Separately, another team, part of the Human Physiome Project, built a complete three-dimensional map of the entire circulatory system. Karniadakis is now plugging each junction’s virtual flow into the detailed circulatory map to create a complete simulation of blood flow in the body.
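The composition step described above, plugging per-junction models into a larger network, can be sketched abstractly. Everything here (the junction types, the even flow split, the function names) is an invented toy, not the labs' actual fluid solvers:

```python
# Hypothetical sketch of stitching per-junction flow models into a network,
# analogous to the composition step described above. All models are toys.

def bifurcation_flow(q_in):
    """Toy model: a symmetric bifurcation splits incoming flow evenly."""
    return [q_in / 2.0, q_in / 2.0]

def straight_flow(q_in):
    """Toy model: a straight vessel segment passes flow through unchanged."""
    return [q_in]

# Registry mapping a junction type to the lab-supplied model for it.
JUNCTION_MODELS = {"bifurcation": bifurcation_flow, "straight": straight_flow}

def propagate(q_in, junction_types):
    """Push a flow through successive junction layers, collecting outlets."""
    flows = [q_in]
    for jtype in junction_types:
        flows = [q for f in flows for q in JUNCTION_MODELS[jtype](f)]
    return flows

# 100 units of flow through a segment and two bifurcation layers:
# four outlets of 25.0 each, with total flow conserved.
outlets = propagate(100.0, ["straight", "bifurcation", "bifurcation"])
```

The design mirrors the division of labor in the article: each lab only has to get one junction model right, and the registry lets the network-level map look each model up by type.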
To test the computer simulation’s accuracy, Karniadakis’ partners at Ghent University built a mockup of a highly simplified circulatory system using lengths of flexible tubing and a motorized pump. By simulating this mockup and comparing predicted pressures with actual readings taken from it, Karniadakis was able to experimentally verify the model’s accuracy.
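A validation step like this boils down to comparing two lists of numbers against a tolerance. The sketch below illustrates such a check with entirely made-up pressures and an arbitrary 5 percent criterion; it does not reflect the actual Ghent measurements:

```python
# Illustrative validation check: compare simulated pressures against bench
# readings using a maximum-relative-error criterion. All values are invented.

def max_relative_error(predicted, measured):
    """Worst-case relative disagreement between paired readings."""
    return max(abs(p - m) / abs(m) for p, m in zip(predicted, measured))

predicted_mmHg = [118.0, 96.5, 74.2]   # hypothetical simulation output
measured_mmHg = [120.0, 95.0, 75.0]    # hypothetical mockup sensor readings

ok = max_relative_error(predicted_mmHg, measured_mmHg) < 0.05  # within 5%?
```

Passing a check like this on the simplified mockup builds confidence that the same numerical machinery can be trusted on the full circulatory map, where no direct measurement is possible.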