Abstract

In recent years, deep convolutional neural networks (DCNNs) have become the predominant approach to building extraction. Generally, the performance of a DCNN depends on the quality of its parameter learning. To improve performance, many approaches enlarge the network or attach additional plug-in modules. However, parameter learning in neural networks is highly redundant. We argue that methods relying solely on parameter learning cannot fully exploit the useful information in the given data, and that the features they learn are not comprehensive enough. In this paper, we propose an efficient memory module, named EMM, to enhance the learning ability of DCNNs in building extraction. In our method, we encode each batch of data into memory key-value pairs and store them in memory banks, where each memory is dynamically updated during training. Inspired by the non-local module, we adopt a tri-branch design for more efficient memory retrieval. Trained with our memory module, DCNNs learn more holistic and discriminative features efficiently and effectively. We conduct extensive experiments on three challenging public datasets. Compared with popular segmentation networks on all three datasets, our method achieves competitive performance with relatively fewer parameters.
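The abstract's core mechanism — a key-value memory bank that is updated each batch and queried with non-local (attention-style) retrieval — can be illustrated with a minimal NumPy sketch. All names, shapes, and the momentum-update rule here are assumptions for illustration only, not the paper's actual EMM implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for the affinity weights.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MemoryBank:
    """Toy key-value memory bank (illustrative, not the paper's EMM).

    Stores (key, value) pairs, blends in new batch features with a
    momentum update, and retrieves via a non-local-style weighted sum.
    """

    def __init__(self, num_slots, dim, momentum=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.keys = rng.standard_normal((num_slots, dim))
        self.values = rng.standard_normal((num_slots, dim))
        self.momentum = momentum

    def update(self, batch_keys, batch_values):
        # Dynamically fold the current batch into the stored memory
        # (a hypothetical momentum rule; the paper's update may differ).
        n = min(len(batch_keys), len(self.keys))
        m = self.momentum
        self.keys[:n] = m * self.keys[:n] + (1 - m) * batch_keys[:n]
        self.values[:n] = m * self.values[:n] + (1 - m) * batch_values[:n]

    def retrieve(self, queries):
        # Non-local-style retrieval: query-key affinities, softmax
        # normalisation, then a weighted sum over the stored values.
        affinity = queries @ self.keys.T          # (batch, num_slots)
        weights = softmax(affinity, axis=-1)
        return weights @ self.values              # (batch, dim)
```

In this sketch the retrieval branch plays the role of the "tri-branch" attention (query, key, value), and `update` stands in for the dynamic per-batch memory refresh described in the abstract.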
