Suppression of Shock X-ray Emission in Novae from Turbulent Mixing with Cool Gas
Brian D. Metzger, Lachlan Lancaster, Rebecca Diesing
Abstract

Shock interaction in classical novae occurs when a fast outflow from the white dwarf (≳1000 km s⁻¹) collides with a slower, cooler shell of gas released earlier in the outburst. The shocks radiate across the electromagnetic spectrum, from radio synchrotron to GeV gamma-rays. The hot shocked gas also emits ≳keV thermal X-rays, typically peaking weeks after the eruption, once the ejecta becomes transparent to photoelectric absorption. However, the observed hard X-ray luminosities are typically more than four orders of magnitude smaller than would be naively expected given the powerful shocks implied by the gamma-rays. We argue that a key missing piece to this puzzle is turbulence behind the shock, driven, e.g., by thin-shell and/or thermal instabilities. Turbulence efficiently mixes the hot X-ray-emitting gas with cooler gas, sapping the hot gas of energy faster than it can directly radiate. Using analytic arguments motivated by numerical simulations, we show that energy losses due to turbulent mixing can easily balance shock heating, greatly reducing the volume of the hot gas and suppressing the X-ray luminosity. Equating the characteristic thickness of the X-ray-emitting region to the minimum outer length scale of the turbulence capable of cooling the hot gas through mixing, we obtain X-ray luminosities consistent with nova observations if only ~1% of the shock's kinetic power goes into turbulent motions. A similar process may act to suppress thermal X-rays from other shock-powered transients, such as interacting supernovae.
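As a rough illustration of the luminosity gap described in the abstract (not material from the paper itself), the short sketch below evaluates the standard strong-shock post-shock temperature, kT ≈ (3/16) μ m_p v_sh², and the shock kinetic power, L_sh ≈ ½ Ṁ v_sh², for nominal nova parameters. The shock velocity, mean molecular weight, mass flux, and observed X-ray luminosity used here are illustrative assumptions, not values taken from the paper.

```python
# Order-of-magnitude sketch; all numerical inputs are illustrative assumptions,
# NOT values quoted in the paper.

m_p = 1.6726e-24   # proton mass [g]
k_B = 1.3807e-16   # Boltzmann constant [erg/K]
keV = 1.6022e-9    # 1 keV in erg

v_sh = 2.0e8       # assumed shock velocity: 2000 km/s, in cm/s
mu   = 0.6         # assumed mean molecular weight of ionized gas
Mdot = 1.0e21      # assumed mass flux through the shock [g/s]
                   # (roughly 1e-5 Msun swept up over a few months)

# Strong-shock (Rankine-Hugoniot) post-shock temperature: kT = (3/16) mu m_p v_sh^2
kT = (3.0 / 16.0) * mu * m_p * v_sh**2      # erg
T  = kT / k_B                               # K

# Shock kinetic power: L_sh ~ (1/2) Mdot v_sh^2
L_sh = 0.5 * Mdot * v_sh**2                 # erg/s

print(f"post-shock temperature: kT ~ {kT/keV:.1f} keV  (T ~ {T:.1e} K)")
print(f"shock kinetic power:    L_sh ~ {L_sh:.1e} erg/s")

# Hard X-ray luminosities observed from novae are of order 1e33-1e34 erg/s
# (representative values, assumed here), i.e. orders of magnitude below L_sh --
# the strong suppression the abstract attributes to turbulent mixing with cool gas.
L_X_obs = 1.0e33                            # erg/s (assumed)
print(f"L_X,obs / L_sh ~ {L_X_obs / L_sh:.1e}")
```

For these nominal inputs the sketch gives kT of a few keV (consistent with the ≳keV thermal X-rays mentioned above) and a shock power of order 10³⁷ erg/s, so an assumed observed luminosity of ~10³³ erg/s sits several orders of magnitude below the available shock power, which is the scale of the discrepancy the paper seeks to explain.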