This paper presents an effective method for integrating multiple modalities, such as depth, color, and reflectance, for place categorization. To achieve better performance from integrated multi-modal data, we introduce a novel descriptor, local N-ary patterns (LNP), which enables robust discrimination for place categorization. In this paper, the LNP descriptor is applied to a combination of two modalities, depth and reflectance, provided by a laser range finder; however, it can be easily extended to a larger number of modalities. The proposed LNP describes the relationships between the multi-modal values of a pixel and those of its neighboring pixels. Because it captures these multi-modal relationships, our method achieves clearly better classification results than any individual modality alone. We carried out experiments on the Kyushu University Indoor Semantic Place Dataset, which is publicly available and comprises five indoor categories: corridors, kitchens, laboratories, study rooms, and offices. We confirmed that our proposed method outperforms previous uni-modal descriptors.
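To illustrate the general idea of a local N-ary pattern over two modalities, the sketch below generalizes the familiar local binary pattern: each neighbor of a pixel contributes one base-4 digit, whose two bits record the depth comparison and the reflectance comparison against the center pixel, and the histogram of the resulting codes serves as the image descriptor. This is only an illustrative reading of the abstract, assuming a 4-neighborhood and simple thresholded comparisons; the paper's exact LNP definition may differ.

```python
import numpy as np

def lnp_codes(depth, refl):
    """Per-pixel N-ary codes over two modalities (illustrative sketch).

    Each of the 4 axis-aligned neighbors contributes one base-4 digit:
      bit 0 = (neighbor depth       >= center depth)
      bit 1 = (neighbor reflectance >= center reflectance)
    With 4 neighbors, codes range over 4**4 = 256 values.
    """
    h, w = depth.shape
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    # Compare against the interior (border pixels lack full neighborhoods).
    c_d = depth[1:-1, 1:-1]
    c_r = refl[1:-1, 1:-1]
    codes = np.zeros((h - 2, w - 2), dtype=np.int32)
    for k, (dy, dx) in enumerate(offsets):
        n_d = depth[1 + dy : h - 1 + dy, 1 + dx : w - 1 + dx]
        n_r = refl[1 + dy : h - 1 + dy, 1 + dx : w - 1 + dx]
        digit = (n_d >= c_d).astype(np.int32) + 2 * (n_r >= c_r).astype(np.int32)
        codes += digit * (4 ** k)
    return codes

def lnp_histogram(depth, refl):
    """Normalized histogram of LNP codes, usable as a place descriptor."""
    codes = lnp_codes(depth, refl)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

Because each digit jointly encodes both modalities, the code distinguishes, for example, a neighbor that is farther but less reflective from one that is farther and more reflective, which a pair of independent uni-modal histograms cannot do.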