Cluster optical depth and pairwise velocity estimation using machine learning

This paper is a preprint and has not been certified by peer review.


Authors

Yulin Gong, Rachel Bean

Abstract

We apply two machine learning methods, a CNN deep-learning model and a gradient-boosting decision tree, to estimate individual cluster optical depths from observed properties derived from multiple complementary datasets. The models are trained and tested with simulated N-body derived halo catalogs and synthetic full-sky CMB maps designed to mirror data from the DESI and Simons Observatory experiments. Specifically, the thermal Sunyaev-Zel'dovich (tSZ) signal and CMB lensing convergence, along with cluster virial mass estimates, are used as features to train the machine learning models. The predicted optical depths are combined with kinematic Sunyaev-Zel'dovich (kSZ) measurements to estimate individual cluster radial peculiar velocities. The method is shown to recover an unbiased estimate of the pairwise velocity statistics of the simulated cluster sample. The model's efficacy is demonstrated for halos in the mass range $10^{13} M_{\odot} < M_{200} < 10^{15} M_{\odot}$ over the redshift range $0<z<1$, and validated in the presence of primary CMB, instrument noise, lensing convergence noise, and potential uncertainties in halo virial mass estimates. We apply the method to ACT CMB data, using ACT DR4 component-separated maps for tSZ and CMB lensing and ACT DR5 maps for kSZ, in conjunction with galaxy clusters observed in the SDSS DR15 spectroscopic survey. We demonstrate that the machine learning approach is effective for analyzing data from current and upcoming CMB experiments, such as Simons Observatory and CCAT, and galaxy surveys, such as DESI and Roman, for which the pairwise velocity statistics can provide valuable insights into the properties of neutrinos and gravity on cosmic scales.
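The pipeline described in the abstract, a regressor mapping cluster observables to optical depth, followed by inversion of the kSZ temperature decrement ($\Delta T_{\rm kSZ}/T_{\rm CMB} = -\tau\, v_r/c$) to obtain radial velocities, can be sketched as follows. This is a minimal illustrative toy on synthetic data, not the authors' implementation: the feature set (tSZ amplitude, lensing convergence, virial mass), the toy $\tau$–mass scaling, and all noise levels are assumptions for demonstration only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

C_KM_S = 299792.458   # speed of light [km/s]
T_CMB_UK = 2.7255e6   # CMB monopole temperature [micro-Kelvin]

rng = np.random.default_rng(0)
n_train, n_test = 1500, 500
n = n_train + n_test

# Synthetic cluster catalog (illustrative scalings, not from the paper):
mass = rng.uniform(1e13, 1e15, n)                 # virial mass [M_sun]
tau_true = 1e-4 * (mass / 1e14) ** 0.6            # toy tau-mass relation
tsz = tau_true * rng.normal(1.0, 0.1, n)          # tSZ amplitude, correlated with tau
kappa = tau_true * rng.normal(1.0, 0.2, n)        # lensing convergence proxy
X = np.column_stack([tsz, kappa, np.log10(mass)])

# Step 1: regress optical depth from the observed features.
model = GradientBoostingRegressor(n_estimators=200, max_depth=3, random_state=0)
model.fit(X[:n_train], tau_true[:n_train])
tau_hat = model.predict(X[n_train:])

# Step 2: invert the kSZ decrement to recover radial peculiar velocities.
v_true = rng.normal(0.0, 300.0, n_test)           # [km/s]
dT_ksz = -T_CMB_UK * tau_true[n_train:] * v_true / C_KM_S   # mock kSZ signal [uK]
v_hat = -C_KM_S * dT_ksz / (T_CMB_UK * tau_hat)   # estimated velocities [km/s]
```

In this toy setup the velocity error is purely multiplicative in the ratio $\tau_{\rm true}/\hat\tau$, which is why an unbiased optical-depth estimator translates directly into unbiased pairwise velocity statistics.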
