Cosmological N-body Simulation

**Session 52 -- Grand Challenges in Computational Astrophysics Part II**
*Oral presentation, Wednesday, 1, 1994, 2:00-5:30*

## [52.02] Cosmological N-body Simulation

*George Lake (University of Washington)*
The "N" in N-body calculations
has doubled every year for the last two decades.
To continue this trend,
the UW N-body group is working on
algorithms for the fast evaluation
of gravitational forces on parallel computers and establishing
rigorous standards for the computations.
In these algorithms, the computational cost per
time step is $\sim 10^3$ pairwise forces per particle.
A new adaptive time integrator enables us to
perform high-quality integrations that are fully
temporally and spatially adaptive.
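As a sketch of what temporal adaptivity means in practice, here is a minimal kick-drift-kick leapfrog with a per-step timestep criterion. The softened point-mass potential and the criterion $\Delta t = \eta\sqrt{\epsilon/|a|}$ are common textbook placeholders, not the specific integrator described in this abstract:

```python
import numpy as np

EPS = 0.1  # Plummer softening length (illustrative value)

def accel(x):
    """Acceleration from a unit point mass at the origin, Plummer-softened."""
    r2 = np.dot(x, x) + EPS**2
    return -x / r2**1.5

def adaptive_kdk(x, v, t_end, eta=0.05):
    """Kick-drift-kick leapfrog with a per-step adaptive timestep.
    The criterion dt = eta*sqrt(EPS/|a|) is a common textbook choice,
    not the integrator described in the abstract."""
    t = 0.0
    while t < t_end:
        a = accel(x)
        dt = min(eta * np.sqrt(EPS / np.linalg.norm(a)), t_end - t)
        v = v + 0.5 * dt * a           # half kick
        x = x + dt * v                 # full drift
        v = v + 0.5 * dt * accel(x)    # half kick
        t += dt
    return x, v
```

Shrinking the step where the acceleration is large is what lets a single run resolve both dense halo cores and slowly evolving voids.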
SPH (smoothed particle hydrodynamics) will be added
to simulate the effects of dissipative gas and
magnetic fields.
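For context on what the SPH addition involves, here is a minimal sketch of the core SPH operation, the kernel-smoothed density estimate. The 1D cubic-spline kernel and all numbers are standard illustrative choices, not taken from this work:

```python
def w_cubic_1d(r, h):
    """Standard cubic-spline SPH kernel in 1D (normalization 2/(3h))."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def sph_density(x_i, positions, masses, h):
    """SPH density estimate: rho_i = sum_j m_j * W(|x_i - x_j|, h)."""
    return sum(m * w_cubic_1d(x_i - x, h) for x, m in zip(positions, masses))
```

Each particle's density is a kernel-weighted sum over its neighbors, which is what lets a particle method represent a continuous fluid.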

The importance of these calculations is two-fold. First, they
determine the nonlinear consequences of theories for the structure
of the Universe. Second, they are essential for the interpretation
of observations. Every galaxy has
six phase-space coordinates: three of position and three of velocity. Observations
determine only two sky coordinates and a line-of-sight
velocity, which bundles
the universal expansion (distance)
together with a random velocity created by
the mass distribution. Simulations are needed to determine the underlying
structure and masses. Simulations have moved
from providing *ex post facto*
explanations to being an integral
part of planning large observational programs.
I will show why high-quality simulations with "large N"
are essential to accomplish our scientific goals.
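The velocity decomposition described above can be made concrete with a toy calculation. The Hubble constant and all numbers here are assumed for illustration, not values from the abstract:

```python
# Toy decomposition of an observed line-of-sight velocity into
# Hubble flow plus peculiar velocity. H0 and the numbers below are
# illustrative assumptions.
H0 = 70.0  # km/s/Mpc (assumed)

def peculiar_velocity(v_observed, distance_mpc, h0=H0):
    """v_observed = h0 * distance + v_peculiar, solved for v_peculiar."""
    return v_observed - h0 * distance_mpc

# A galaxy at 100 Mpc observed receding at 7500 km/s:
print(peculiar_velocity(7500.0, 100.0))  # 500.0 km/s of peculiar motion
```

The observed redshift alone cannot separate the two terms; only a model of the mass distribution, tested by simulation, pins down the peculiar component.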

This year, our simulations have $N \gtrsim 10^7$. This
is sufficient to tackle some niche problems, but well short
of our five-year goal: simulating *The
Sloan Digital Sky Survey* using a
few billion particles (a Teraflop-year simulation).
Extrapolating past trends, we would have to "wait" seven years
for this hundred-fold improvement.
As with past gains, these advances require significant changes in
the computational methods.
I will describe new algorithms, algorithmic hacks, and a dedicated computer
to perform billion-particle simulations.
Finally, I will describe research that can be enabled by Petaflop computers.
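The seven-year "wait" above follows directly from the yearly doubling trend; a one-line check:

```python
import math

# If N doubles every year, a hundred-fold gain takes log2(100) years.
years_to_wait = math.log2(100)
print(round(years_to_wait, 1))  # 6.6 -- i.e. about 7 years
```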

This research is supported by the NASA HPCC/ESS program.
