Interactive image segmentation algorithms rely on the user to provide annotations as guidance. When interactive segmentation is performed on a small touchscreen device, the requirement of providing precise annotations can be cumbersome for the user. We design a new interaction mechanism that actively queries seeds to guide the user's labeling. Our method enforces sparsity and diversity criteria on the selection of query seeds, so that at each round of interaction the user is presented with only a small number of informative query seeds that are far apart from one another. The user merely has to swipe over the query seeds to indicate which of them lie inside the region of interest (ROI). This interaction is easy to perform, since swipe gestures are commonly used on touchscreens. As a result, we derive a user-friendly annotation mechanism for small touchscreen devices. The performance of our algorithm is evaluated on six publicly available datasets. The evaluation results show that our algorithm achieves high segmentation accuracy while requiring short computation time and little user feedback.
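The abstract does not spell out how the sparsity and diversity criteria are enforced; as a rough illustration only, the following minimal Python sketch shows one plausible greedy scheme: rank candidate seeds by an informativeness score and keep the top-scoring ones that stay at least a minimum distance apart. All names here (`select_query_seeds`, `min_dist`, the score source) are hypothetical, not taken from the paper.

```python
import numpy as np

def select_query_seeds(candidates, scores, k=5, min_dist=40.0):
    """Greedily pick at most k sparse, mutually distant query seeds.

    candidates: (N, 2) array of pixel coordinates of candidate seeds.
    scores:     (N,) informativeness scores (e.g., model uncertainty);
                assumed to be supplied by the segmentation model.
    k:          number of seeds shown per interaction round (sparsity).
    min_dist:   minimum pairwise distance in pixels (diversity).
    """
    order = np.argsort(scores)[::-1]  # most informative candidates first
    chosen = []
    for i in order:
        p = candidates[i]
        # Diversity criterion: reject seeds too close to an already chosen one.
        if all(np.linalg.norm(p - candidates[j]) >= min_dist for j in chosen):
            chosen.append(i)
        if len(chosen) == k:          # Sparsity criterion: stop at k seeds.
            break
    return candidates[chosen]
```

In each interaction round, the selected seeds would be displayed on the touchscreen, and the user's swipe would mark which of them fall inside the ROI; those labels would then update the model's scores before the next round of querying.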