Vegetation segmentation plays a crucial role in accurately monitoring and analyzing vegetation cover, growth patterns, and changes over time, which in turn supports environmental studies, land management, and assessments of the impact of climate change. This study explores the potential of a multi-scale convolutional neural network (MSCNN) design for object classification, with a specific focus on vegetation detection. The MSCNN integrates multi-scale feature extraction and attention mechanisms, combining convolutional layers with varying kernel sizes (3 × 3, 5 × 5, and 7 × 7) so that the model can capture both fine and coarse vegetation patterns, which is vital for identifying diverse vegetation across varied landscapes. Vegetation detection is demonstrated on three diverse datasets: the CamVid dataset, the FloodNet dataset, and the multispectral RIT-18 dataset. These datasets present a range of challenges common in real-world scenarios, including variations in illumination, shadows, occlusion, scale differences, and cluttered backgrounds. By integrating information from multiple scales, the MSCNN architecture facilitates the detection of diverse vegetation types under varying conditions. The performance of the proposed method is rigorously evaluated against state-of-the-art techniques, and comprehensive experiments demonstrate its robustness in accurately segmenting and classifying vegetation even in complex environments. The results indicate that the MSCNN design significantly outperforms traditional methods, achieving a global accuracy and boundary F1 score (BF score) of up to 98%.
This superior performance underscores the MSCNN’s capability to enhance vegetation detection in imagery, making it a promising tool for applications in environmental monitoring and land use management.
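The abstract describes parallel convolutions with 3 × 3, 5 × 5, and 7 × 7 kernels feeding a multi-scale feature representation. A minimal sketch of one such block is given below, assuming a PyTorch implementation; the class name `MultiScaleBlock`, the channel widths, and the 1 × 1 fusion convolution are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


class MultiScaleBlock(nn.Module):
    """Hypothetical multi-scale feature-extraction block: parallel
    convolutions with 3x3, 5x5, and 7x7 kernels, whose outputs are
    concatenated and fused (an assumed design, not the paper's exact one)."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # Same-padding (k // 2) keeps spatial size constant in every
        # branch so the three feature maps can be concatenated.
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size=k, padding=k // 2)
            for k in (3, 5, 7)
        )
        # A 1x1 convolution fuses the three scales back to out_ch channels.
        self.fuse = nn.Conv2d(3 * out_ch, out_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.fuse(feats)


# Example: a 64x64 RGB patch passes through unchanged in spatial size.
block = MultiScaleBlock(in_ch=3, out_ch=16)
y = block(torch.randn(1, 3, 64, 64))
print(tuple(y.shape))  # (1, 16, 64, 64)
```

Stacking such blocks, optionally interleaved with attention modules as the abstract indicates, would let a segmentation network aggregate fine texture (small kernels) and coarse context (large kernels) at each depth.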