P. Beiersdorfer (LLNL), K. R. Boyce (GSFC), G. V. Brown (LLNL), H. Chen (LLNL), K. C. Gendreau (GSFC), J. Gygax (GSFC), S. M. Kahn (Columbia), R. L. Kelley (GSFC), F. S. Porter (GSFC), C. K. Stahle (GSFC), A. E. Szymkowiak (GSFC), D. Thorn (LLNL), E. Träbert (LLNL)
The ionization balance of young supernova remnants typically lags behind that expected from the electron temperature. The deviation from ionization equilibrium is characterized by the ionization parameter \eta = n_e t, i.e., the product of the electron density and the elapsed time. In cases where the age of the remnant is known, the value of \eta provides a measure of the electron density. Spatially resolved measurements of \eta can thus be used to map the density and, together with the total emission measure, the depth of an observed region. In order to simulate the non-equilibrium spectral emission from young supernova remnants, we have implemented the engineering-model XRS detector from the Astro-E mission at the Livermore EBIT-2 source. The XRS comprises 32 independent pixels that cover the entire X-ray band from the carbon K edge to about 10 keV with a resolution of 8 eV at 1 keV and 11 eV at 6 keV. Operating EBIT-2 in the Maxwellian mode, we recorded the time evolution of the iron L-shell and K-shell emission as a function of \eta in plasmas of known temperature. The results are a series of spectra that evolve from very low to high stages of ionization and provide stringent tests of non-equilibrium SNR emission models.
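The density and depth inferences described above follow from two simple relations: n_e = \eta / t when the remnant's age t is known, and L = EM / n_e^2 when the emission measure per unit area is taken as EM = n_e^2 L. The following minimal sketch illustrates the arithmetic; all function names and numerical values are illustrative assumptions, not results from the experiment described here.

```python
# Hedged sketch of the density/depth inference from the ionization
# parameter eta = n_e * t. Function names and numbers are illustrative
# placeholders, not measurements from the EBIT-2/XRS experiment.

def electron_density(eta_cm3_s: float, age_s: float) -> float:
    """n_e = eta / t (cm^-3), applicable when the remnant's age is known."""
    return eta_cm3_s / age_s

def depth(emission_measure_cm5: float, n_e_cm3: float) -> float:
    """Line-of-sight depth L = EM / n_e^2, assuming the emission measure
    per unit area is EM = n_e^2 * L (units of cm^-5)."""
    return emission_measure_cm5 / n_e_cm3**2

# Illustrative example: eta = 3e11 cm^-3 s and an age of ~1000 yr
# (about 3.156e10 s) give an electron density of order 10 cm^-3.
n_e = electron_density(3e11, 1000 * 3.156e7)
L = depth(1e19, n_e)  # assumed emission measure per unit area
```

The same two relations, applied pixel by pixel with the XRS's spatial resolution, yield the density and depth maps mentioned in the abstract.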
Work at UC-LLNL was performed under the auspices of the DOE under contract W-7405-ENG-48 and supported by NASA Space Astrophysics Research and Analysis grants to LLNL, GSFC, and Columbia University.