We conducted an in-depth exploration of different machine learning (ML) regression algorithms, including Linear, Ridge, LASSO, Bayesian Ridge, Decision Tree, and a variety of Deep Neural Network (DNN) architectures, to estimate the slope of the high-frequency feature (HFF), a prominent emergent feature in the gravitational-wave (GW) signals of core-collapse supernovae (CCSNe). We created a data set of CCSN GW signals generated by an analytical model that mimics the characteristics of signals obtained from numerical simulations, particularly the HFF. This enabled us to simulate a wide range of HFF slope values and analyze their properties. Because the quantity to be estimated is the slope of the HFF, we adopted supervised ML regression techniques to analyze the data set. This class of methods is well suited to the task: it can learn the mapping between input and output data, handles high-dimensional inputs, and produces accurate results at low computational cost. We evaluated the efficiency and performance of the ML algorithms with a set of metrics that measure how accurately each one predicts the HFF slope within the data set. The results show that a DNN regression algorithm achieves the highest accuracy in estimating the slope of the HFF.
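The comparison described above can be sketched with a minimal, illustrative pipeline. This is not the authors' code: the synthetic "signal → slope" data, the slope range, and the model hyperparameters below are all assumptions for illustration, using scikit-learn implementations of the named regressors (with an MLP standing in for the DNN).

```python
# Illustrative sketch (assumed data and hyperparameters, not the paper's setup):
# compare several scikit-learn regressors on synthetic "signal -> slope" data.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, BayesianRidge
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)

# Toy stand-in for the data set: each sample is a noisy linear
# time-frequency track whose slope is the regression target.
n_samples, n_points = 2000, 64
t = np.linspace(0.0, 1.0, n_points)
slopes = rng.uniform(1000.0, 4000.0, n_samples)  # Hz/s, illustrative range only
X = slopes[:, None] * t[None, :] + rng.normal(0.0, 50.0, (n_samples, n_points))
y = slopes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "Linear": LinearRegression(),
    "Ridge": Ridge(alpha=1.0),
    "LASSO": Lasso(alpha=0.1),
    "Bayesian Ridge": BayesianRidge(),
    "Decision Tree": DecisionTreeRegressor(max_depth=8, random_state=0),
    "DNN (MLP)": MLPRegressor(hidden_layer_sizes=(64, 32),
                              max_iter=500, random_state=0),
}

# Evaluate each model with the same held-out metrics (RMSE and R^2).
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name:14s} RMSE={rmse:8.2f}  R2={r2_score(y_te, pred):.3f}")
```

A shared train/test split and a common metric table are what make the cross-model comparison meaningful; in practice the DNN would be a deeper network trained on the full simulated signal set rather than this toy trend.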