Abstract

Jensen-Shannon divergence (JSD) is an effective method for measuring the distance between two probability distributions, but it is not a metric. In particular, JSD does not satisfy the triangle inequality, raising concerns about its robustness as a valid distance measure. The square root of JSD is a metric and preserves all other characteristics of JSD; however, it does not provide adequate separation when the difference between input distributions is subtle. We extend the metric version of JSD by reformulating it using alternative mean operators that provide different robustness properties. Furthermore, we prove a number of important mathematical properties for this extension. Finally, we propose a family of new kernels based on metric JSD and the Chisini mean. We explore the utility of the proposed technique, Metric-Chisini-Jensen-Shannon Divergence (M-CJSD), in SVM classification of IoT sensor data, whose nuances make it difficult to discriminate between distributions with only subtle differences. Additionally, we build a deep neural network (DNN) as a performance benchmark and explore an optimization approach to improve k-means clustering. We found that, while clustering was unable to detect visually identifiable clusters, M-CJSDs captured the underlying regularity of the data and gave a statistically significant improvement over the DNN model.
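The abstract does not spell out the M-CJSD construction itself, so the following is only a rough sketch of the ingredients it names: standard JSD with its arithmetic-mean midpoint, the metric square root of JSD, a hypothetical Chisini-mean variant (here using a renormalized geometric-mean midpoint as one illustrative Chisini mean), and an RBF-style kernel built on the resulting distance. The names `chisini_jsd` and `mcjsd_kernel` are illustrative, not the paper's API, and the paper's actual formulation may differ.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def jsd(p, q):
    """Jensen-Shannon divergence with the usual arithmetic-mean midpoint."""
    m = 0.5 * (np.asarray(p, dtype=float) + np.asarray(q, dtype=float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def metric_jsd(p, q):
    """sqrt(JSD) satisfies the triangle inequality and hence is a metric."""
    return np.sqrt(jsd(p, q))

def chisini_jsd(p, q):
    """Hypothetical variant: swap the arithmetic-mean midpoint for another
    Chisini mean (here the geometric mean, renormalized to a distribution).
    The paper's actual M-CJSD construction and its metric properties may differ."""
    g = np.sqrt(np.asarray(p, dtype=float) * np.asarray(q, dtype=float))
    g = g / g.sum()
    return np.sqrt(0.5 * kl(p, g) + 0.5 * kl(q, g))

def mcjsd_kernel(p, q, gamma=1.0):
    """One plausible member of a kernel family built on the metric divergence,
    in the style of an RBF kernel; not necessarily the paper's kernels."""
    d = chisini_jsd(p, q)
    return np.exp(-gamma * d ** 2)

p = np.array([0.50, 0.30, 0.20])
q = np.array([0.48, 0.32, 0.20])  # a subtle difference, as in the IoT setting
print(metric_jsd(p, q), chisini_jsd(p, q), mcjsd_kernel(p, q))
```

A kernel of this form could, in principle, be plugged into an SVM via a precomputed Gram matrix; whether it is positive definite depends on the specific Chisini mean chosen, which is one of the properties the paper's proofs presumably address.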
