Abstract

Galaxy morphology classification is essential for studying the formation and evolution of galaxies. However, previous studies based on Convolutional Neural Networks (CNNs) have focused mainly on the design of the convolutional layers without exploring the design of the fully connected layers. This paper therefore trains and compares the performance of CNNs with four types of fully connected layers on the Galaxy10 DECaLS dataset. Each type of fully connected layer contains one dropout layer, and dropout rates from 0% to 90% in increments of 10% are tested to investigate how the dropout rate in the fully connected layers affects the overall performance of CNNs. These models use EfficientNetB0 and DenseNet121 as their feature extraction networks. During training, feature-wise standardization, morphological operations, and data augmentation are used for preprocessing. Techniques including class weights, exponential learning rate decay, and early stopping are applied to improve model performance. Saliency maps and Grad-CAM are also used to interpret model behaviour. Results show that the architecture of the fully connected layers has a significant effect on overall model performance. With the same dropout rate and convolutional layers, models using global average pooling followed by an additional dense layer outperform the others in most cases. The best model achieves an accuracy of 85.23% on the test set. The experiments on dropout further indicate that a dropout layer can reduce the effect of the fully connected architecture on the overall performance of some CNNs, yielding better performance with fewer parameters.
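The paper's text does not include code, but the best-performing head it describes (global average pooling followed by an additional dense layer with one dropout layer, on an EfficientNetB0 backbone) can be sketched in TensorFlow/Keras. This is a minimal illustration under stated assumptions: the 224x224 input size, the dense width of 256, the 20% dropout rate, and the use of ImageNet pretraining are placeholders, not values confirmed by the abstract.

```python
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 10  # Galaxy10 DECaLS has 10 morphology classes

# EfficientNetB0 backbone with its original classification head removed;
# ImageNet pretraining is an assumption, not stated in the abstract
backbone = keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3)
)

inputs = keras.Input(shape=(224, 224, 3))
x = backbone(inputs)
x = layers.GlobalAveragePooling2D()(x)       # global average pooling
x = layers.Dense(256, activation="relu")(x)  # additional dense layer (width is a guess)
x = layers.Dropout(0.2)(x)                   # one dropout layer; rates 0%-90% were swept
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = keras.Model(inputs, outputs)
```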

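The training techniques named in the abstract (class weights, exponential learning rate decay, early stopping) also map onto standard Keras utilities. The sketch below assumes the model from the previous snippet; all hyperparameter values, the uniform class weights, and the random placeholder data are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from tensorflow import keras

# Exponential learning rate decay (decay_steps/decay_rate are placeholders)
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.9
)
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=lr_schedule),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Early stopping: halt training once validation loss stops improving
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True
)

# Class weights counter class imbalance; uniform values here are placeholders
class_weight = {i: 1.0 for i in range(10)}

# Random placeholder data; in practice these come from Galaxy10 DECaLS
x_train = np.random.rand(32, 224, 224, 3).astype("float32")
y_train = np.random.randint(0, 10, size=32)

model.fit(
    x_train, y_train,
    validation_split=0.2,
    epochs=100,
    callbacks=[early_stop],
    class_weight=class_weight,
)
```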