Abstract

Remote sensing object counting is finding applications in a growing number of fields. Global regression is a long-overlooked approach to object counting, even though it requires far fewer manual annotations than the alternatives. This work revisits global regression and improves it in two ways: by replacing the single regressor with a deep ensemble, and by decomposing global regression into two smaller, easier sub-problems, namely learning to rank (L2R) and linear transformation. To support this design, we conduct a PAC-Bayesian analysis of regression ensembles and derive an upper bound on their generalization error, offering new theoretical insight into ensemble learning. We also adapt a ranking metric optimization scheme to object counting, so that the L2R problem is handled elegantly by gradient descent. Moreover, building on our theoretical perspective, we propose a novel way of constructing deep regression ensembles on which an ambiguity constraint is imposed. By incorporating L2R into such a deep ensemble, we obtain a new counting model, the ensemble of first-rank-then-estimate networks (eFreeNet). Extensive evaluation on six benchmarks shows that eFreeNet delivers compelling performance across the board while being more annotation-efficient than other methods. Our source code is publicly available at https://github.com/huangyongbobo/eFreeNet.
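
The abstract only sketches the first-rank-then-estimate idea, so the following is a minimal PyTorch illustration under stated assumptions: each ensemble member produces a scalar ranking score from backbone features (the L2R part) and a learnable linear transformation maps that score to a count, with member predictions averaged. The class names RankThenEstimate and Ensemble, the feature dimension, and the averaging rule are hypothetical and are not taken from the released eFreeNet code.

# Minimal sketch of "first rank, then estimate" (hypothetical names, not the
# released eFreeNet implementation).
import torch
import torch.nn as nn


class RankThenEstimate(nn.Module):
    """One ensemble member: features -> ranking score -> linear count."""

    def __init__(self, feat_dim: int = 512):
        super().__init__()
        # Ranking head: a scalar score whose ordering should match the
        # ordering of ground-truth counts (the L2R sub-problem).
        self.rank_head = nn.Sequential(
            nn.Linear(feat_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 1),
        )
        # Linear transformation: maps the ranking score to a count.
        self.scale = nn.Parameter(torch.ones(1))
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        score = self.rank_head(features).squeeze(-1)   # shape (B,)
        return self.scale * score + self.bias          # predicted counts


class Ensemble(nn.Module):
    """Average the counts predicted by several ensemble members."""

    def __init__(self, num_members: int = 5, feat_dim: int = 512):
        super().__init__()
        self.members = nn.ModuleList(
            RankThenEstimate(feat_dim) for _ in range(num_members)
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        preds = torch.stack([m(features) for m in self.members], dim=0)
        return preds.mean(dim=0)


if __name__ == "__main__":
    model = Ensemble()
    feats = torch.randn(8, 512)   # dummy backbone features for 8 images
    print(model(feats).shape)     # torch.Size([8]) predicted counts

In this sketch the ranking head would be trained with a ranking (L2R) loss and the scale/bias with a regression loss; the paper's actual training scheme, ambiguity constraint, and backbone are described in the full text and repository.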
