As higher-spatiotemporal-resolution tactile sensing systems are developed for prosthetics, wearables, and other biomedical applications, they demand faster sampling rates and generate larger data streams. Sparsifying transformations can alleviate these requirements by enabling compressive sampling and efficient data storage through compression. However, research into the best sparsifying transforms for tactile interactions is lagging. In this work, we construct a library of orthogonal and biorthogonal wavelet transforms as sparsifying transforms for tactile interactions and compare their tradeoffs in compression and sparsity. We tested the sparsifying transforms on a publicly available high-density tactile object-grasping dataset (a 548-sensor tactile glove grasping 26 objects). In addition, we investigated which wavelet transform dimensionality (1D, 2D, or 3D) best compresses these tactile interactions. Our results show that wavelet transforms are highly efficient at compressing tactile data and can yield very sparse, compact tactile representations. The 1D transforms achieve the sparsest representations, followed by 3D and then 2D. Overall, the best wavelet for coarse approximation is Symlets 4 applied along the temporal dimension, which can sparsify tactile recordings to 0.5% sparsity and compress 10-bit tactile data to an average of 0.04 bits per pixel. Future studies can leverage these results for compressive sampling of large tactile arrays, freeing computational resources for real-time processing on computationally constrained mobile platforms such as neuroprosthetics.