Abstract

The estimation of spectroscopic and photometric redshifts (spec-z and photo-z) is crucial for future cosmological surveys, since it directly affects several powerful probes of the Universe, such as weak lensing and galaxy clustering. In this work, we explore the spec-z and photo-z accuracies that can be achieved by the China Space Station Optical Surveys, a next-generation space survey, using neural networks. A one-dimensional Convolutional Neural Network (CNN) and a Multi-Layer Perceptron (MLP, the simplest form of artificial neural network) are employed to derive the spec-z and photo-z, respectively. The mock spectral and photometric data used to train and test the networks are generated from the COSMOS catalog. The networks are trained on noisy data produced by Gaussian random realizations, which reduces statistical effects and yields similar redshift accuracy for data with both high and low signal-to-noise ratios. The probability distribution functions (PDFs) of the predicted redshifts are likewise derived via Gaussian random realizations of the testing data, from which the best-fit redshifts and 1σ errors can be obtained. We find that our networks provide excellent redshift estimates, with accuracies of ∼0.001 for spec-z and ∼0.01 for photo-z. Compared to existing photo-z codes, our MLP achieves similar accuracy but trains more efficiently. The fraction of catastrophic redshifts, or outliers, is dramatically suppressed relative to the ordinary template-fitting method. This indicates that the neural network method is feasible and powerful for spec-z and photo-z estimation in future cosmological surveys.
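To make the noise-augmented training scheme concrete, the following is a minimal sketch, not the paper's implementation. The `PhotoZMLP` architecture, the band count `N_BANDS = 7`, the hidden-layer width, the learning rate, and the `noisy_realizations` helper are all assumptions made here for illustration; the actual networks are trained on mock CSS-OS photometry derived from the COSMOS catalog, whereas this toy example uses random placeholder fluxes.

```python
import numpy as np
import torch
import torch.nn as nn

# Hypothetical band count; a CSS-OS-like survey has several optical/NUV bands.
N_BANDS = 7

class PhotoZMLP(nn.Module):
    """A small multi-layer perceptron mapping band fluxes to a redshift estimate
    (an assumed architecture, not the paper's exact one)."""
    def __init__(self, n_bands=N_BANDS, n_hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_bands, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def noisy_realizations(flux, flux_err, n_real, rng):
    """Draw Gaussian random realizations of each object's photometry:
    flux_real ~ N(flux, flux_err^2), one draw per realization."""
    return rng.normal(loc=flux, scale=flux_err, size=(n_real,) + flux.shape)

# --- Toy training loop on synthetic placeholder data ---
rng = np.random.default_rng(0)
n_obj = 1024
flux = rng.uniform(0.1, 1.0, size=(n_obj, N_BANDS)).astype(np.float32)
flux_err = 0.05 * flux                                   # assumed per-band 1-sigma errors
z_true = rng.uniform(0.0, 4.0, size=n_obj).astype(np.float32)

model = PhotoZMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):
    # A fresh noise realization each epoch, so the network sees many noisy
    # versions of each object rather than its noiseless fluxes.
    x = noisy_realizations(flux, flux_err, 1, rng)[0].astype(np.float32)
    pred = model(torch.from_numpy(x))
    loss = loss_fn(pred, torch.from_numpy(z_true))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The same idea carries over to the spec-z case, where a 1D CNN replaces the MLP and the inputs are noisy spectra rather than band fluxes.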
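The PDF step can be sketched in the same spirit, continuing the code above (it reuses `model`, `noisy_realizations`, and the toy `flux`/`flux_err` arrays). Again, this is an illustration under assumptions: the helper name `predict_pdf`, the number of realizations `n_real = 200`, and summarizing each sampled distribution by its mean and standard deviation (as the best-fit redshift and 1σ error) are choices made here for brevity, not necessarily the paper's exact procedure.

```python
def predict_pdf(model, flux, flux_err, n_real=200, rng=None):
    """Estimate a per-object redshift PDF by pushing many Gaussian
    realizations of the test photometry through the trained network."""
    rng = rng or np.random.default_rng()
    reals = noisy_realizations(flux, flux_err, n_real, rng).astype(np.float32)
    with torch.no_grad():
        # Shape (n_real, n_obj): one predicted z per realization per object.
        z_samples = np.stack([model(torch.from_numpy(r)).numpy() for r in reals])
    # Summarize each object's sampled PDF: mean as the best-fit redshift,
    # standard deviation as the 1-sigma error (an assumed summary statistic).
    return z_samples.mean(axis=0), z_samples.std(axis=0)

z_best, z_sigma = predict_pdf(model, flux, flux_err)
```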
