Latest Developments in Computational Physics Showcased at PC 1995
Some of the latest research results and developments in computational physics were featured during Physics Computing 1995, the biannual meeting of the APS Division of Computational Physics, held 5-7 June in Pittsburgh, Pennsylvania. With more than 40 invited and plenary sessions and over 60 contributed sessions, the conference included minisymposia on lattice gauge theory, density functional methods, advanced computational materials science, computers in education, nonlinear dynamics, and time series analysis.
Using Monte Carlo Techniques To Explore Phase Transitions. Monte Carlo simulations have become a powerful tool for the study of phase transitions in a wide range of systems, according to David P. Landau of the University of Georgia's Center for Simulation Physics. Speaking at Monday morning's plenary session, Landau reviewed several sophisticated Monte Carlo methods and described a number of new analysis techniques he has successfully used to extract more information from the data collected from simulation experiments.
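Landau's specific algorithms and analysis techniques are not detailed in the talk summary; as a minimal illustration of the kind of simulation involved, the sketch below applies the standard Metropolis Monte Carlo algorithm to the two-dimensional Ising model, the classic testbed for phase-transition studies. All function names and parameters here are illustrative, not drawn from the talk.

```python
import math
import random

def ising_metropolis(L=16, T=2.0, sweeps=200, seed=42):
    """Metropolis Monte Carlo for the 2-D Ising model (J = 1) on an
    L x L periodic lattice; returns the mean |magnetization| per spin,
    sampled after discarding the first half of the sweeps."""
    rng = random.Random(seed)
    spin = [[1] * L for _ in range(L)]   # cold start: all spins up
    mags = []
    for sweep in range(sweeps):
        for _ in range(L * L):           # one sweep = L*L attempted flips
            i, j = rng.randrange(L), rng.randrange(L)
            # Sum of the four nearest neighbors (periodic boundaries).
            nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
                  + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
            dE = 2.0 * spin[i][j] * nb   # energy cost of flipping this spin
            # Metropolis rule: always accept downhill, else with prob e^(-dE/T).
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spin[i][j] = -spin[i][j]
        if sweep >= sweeps // 2:         # discard equilibration sweeps
            m = sum(map(sum, spin)) / (L * L)
            mags.append(abs(m))
    return sum(mags) / len(mags)

# Below the critical temperature (T_c ~ 2.269) the lattice stays
# strongly magnetized; well above it the magnetization is small.
print(ising_metropolis(T=1.5))
print(ising_metropolis(T=5.0))
```

Sweeping such a simulation across temperatures, and applying finite-size scaling analysis to the sampled magnetization and energy, is the usual route to locating the transition.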
Nonlinear Dynamics. Nonlinear dynamics forecasting can be used to extract messages from chaotic communication systems, according to the University of New Hampshire's Kevin Short, who spoke at a Monday afternoon session. One-step prediction methods can occasionally reveal the presence of hidden messages as well as their frequency content, but Short believes multi-step prediction methods have the potential to improve message extraction. One possible approach uses the frequency information from one-step predictions to determine a block size to use for multi-step predictions.
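Short's multi-step scheme and his block-size heuristic are not spelled out in the session summary; the following sketch shows only the building block his approach starts from: one-step nearest-neighbor prediction on a chaotic time series, here generated by the logistic map. The names, the embedding dimension `m`, and the neighbor count `k` are illustrative choices, not details from the talk.

```python
def logistic_series(n, x0=0.4, r=3.9):
    """Chaotic test signal from the logistic map x_{t+1} = r x_t (1 - x_t)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def one_step_predict(series, m=2, k=3):
    """Predict the value that follows `series`: find the k past delay
    vectors closest to the current one and average what came next."""
    target = series[-m:]                   # current m-point delay vector
    candidates = []
    for t in range(m, len(series)):
        past = series[t - m:t]             # delay vector ending at t-1
        d = sum((a - b) ** 2 for a, b in zip(past, target))
        candidates.append((d, series[t]))  # series[t] followed `past`
    candidates.sort()
    # Guard against an exact duplicate of the current vector.
    nearest = [v for d, v in candidates if d > 0][:k]
    return sum(nearest) / len(nearest)

xs = logistic_series(2000)
pred = one_step_predict(xs[:-1])   # predict the held-out last point
print(abs(pred - xs[-1]))          # small one-step prediction error
```

Because nearby states of a deterministic chaotic system evolve similarly over short times, the one-step error stays small even though long-range forecasting is impossible; that residual structure is what makes hidden messages detectable.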
At the same session, Tim Sauer (George Mason University) considered the possible existence of true trajectories of chaotic dynamical systems lying close to computer-generated trajectories, and described several new computational techniques for verifying the existence of so-called "shadowing trajectories." Salman Habib of Los Alamos National Laboratory described a recently developed method for calculating the Lyapunov exponents of dynamical systems that avoids the renormalization and reorthogonalization procedures required by standard techniques, using the exponential representation of symplectic matrices.
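Habib's symplectic-matrix method is not described in enough detail here to reproduce; for orientation, the sketch below shows the quantity being computed in the simplest setting, a one-dimensional map, where no renormalization or reorthogonalization is needed at all: the Lyapunov exponent is just the trajectory average of log|f'(x)|. Function names and parameters are illustrative.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.3, n=50000, burn=100):
    """Lyapunov exponent of the logistic map x_{t+1} = r x (1 - x):
    the average of log|f'(x)| = log|r (1 - 2x)| along a trajectory."""
    x = x0
    for _ in range(burn):                  # discard the initial transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n

# At r = 4 the exact value is known analytically: ln 2 ~ 0.693.
print(lyapunov_logistic())
```

In higher dimensions the analogous tangent-space products grow and align numerically, which is why standard techniques interleave renormalization and reorthogonalization steps; the method Habib described sidesteps that bookkeeping.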
Parallel Algorithms. Canopy is a software framework developed and used at Fermilab to facilitate coding grid-oriented problems. According to Fermilab's Mark Fischler, who spoke at a Tuesday morning session, the goal is to allow scientists to use massively parallel systems for a broad class of applications, without requiring expertise in any particular system, or in parallel programming techniques. Although initially designed for lattice QCD calculations, Canopy has recently been implemented on the laboratory's Cray T3D supercomputer, making it available for broader application. Michael Uchima, also of Fermilab, said that the efficiency of the applications was tested using the five algorithms most heavily used at the laboratory, and all scaled well to large numbers of processors.
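Canopy's actual API is not reproduced here; the sketch below merely illustrates, on a single machine, the pattern such grid-oriented frameworks hide from the application scientist: decompose the grid across processors, exchange ghost (halo) values at chunk boundaries each step, then update every chunk independently. All names are illustrative, and the "processors" are simulated.

```python
def relax_serial(u, steps):
    """Jacobi relaxation on one 1-D grid: each interior point becomes
    the average of its two neighbors; the endpoints are held fixed."""
    u = list(u)
    for _ in range(steps):
        u = [u[0]] + [(u[i - 1] + u[i + 1]) / 2
                      for i in range(1, len(u) - 1)] + [u[-1]]
    return u

def relax_decomposed(u, nproc, steps):
    """The same sweep on nproc chunks with an explicit ghost-cell
    exchange each step -- the bookkeeping a grid framework automates."""
    size = len(u) // nproc
    chunks = [list(u[p * size:(p + 1) * size]) for p in range(nproc)]
    for _ in range(steps):
        # "Communication" phase: read every neighbor's edge value first.
        halos = [(chunks[p - 1][-1] if p > 0 else None,
                  chunks[p + 1][0] if p < nproc - 1 else None)
                 for p in range(nproc)]
        # "Computation" phase: each chunk updates independently.
        for p, (left, right) in enumerate(halos):
            c = chunks[p]
            ext = ([c[0] if left is None else left] + c
                   + [c[-1] if right is None else right])
            new = [(ext[i - 1] + ext[i + 1]) / 2
                   for i in range(1, len(ext) - 1)]
            if left is None:               # global boundary: hold fixed
                new[0] = c[0]
            if right is None:
                new[-1] = c[-1]
            chunks[p] = new
    return [x for c in chunks for x in c]

u0 = [0.0] * 8 + [16.0] + [0.0] * 7
print(relax_decomposed(u0, 4, 10) == relax_serial(u0, 10))  # True
```

Because the exchange happens before any chunk is updated, the decomposed sweep reproduces the serial result exactly; a framework like Canopy lets the scientist write only the per-site update while it manages the decomposition and communication.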
Space Plasma Simulation. Unstructured meshes are commonly used in structural analysis and, more recently, in the gas dynamics and aerodynamics research communities, but these grids and their associated techniques are only now making inroads in the area of space plasma simulations. According to Steven Zalesak (NASA Goddard Space Flight Center), who spoke on Wednesday afternoon, these methods have two strong advantages over traditional structured-mesh techniques: extremely complex geometries can be easily accommodated, and adaptive mesh refinement can be implemented without altering the grid definition. While finite volume algorithms defined on unstructured meshes are less accurate than their finite element counterparts, their superior numerical qualities make it easier for researchers to implement new kinds of physics, as well as such numerical enhancements as shock-capturing techniques.
Supercomputer Applications. A special Saturday session on supercomputer applications in astronomy was also held in conjunction with the American Astronomical Society. According to session organizers David Spergel (Princeton University) and Regina Schulte-Ladbeck (University of Pittsburgh), increasingly powerful supercomputers have enabled significant conceptual advances in many fields of astronomy, which in turn demand the development of new tools to maximize their power. Computer vendors require demonstrable performance from large scale applications to market their machines to industry and to aid them in designing the next generation. Astrophysical applications provide robust, scalable problems that are widely applicable for marketing, yet detailed and demanding enough to aid continuing development efforts.