Abstract

The increasing availability of pen-based tablets and pen-based interfaces has opened the way for computer graphics applications that use sketch recognition technologies for natural interaction. This has led to growing interest in sketch recognition algorithms within the computer graphics community. However, a key obstacle to building accurate sketch recognizers has been the need to create large amounts of annotated training data. Several authors have attempted to address this issue by generating synthetic data or by building easy-to-use annotation tools. In this paper, we take a different approach and demonstrate that active learning can reduce the amount of manual annotation required to achieve a target recognition accuracy. In particular, we show that by annotating a few carefully selected examples, we can surpass the accuracies achievable with an equal number of arbitrarily selected examples. This work is the first comprehensive study on the use of active learning for sketch recognition. We present results of extensive analyses and show that the utility of active learning depends on a number of practical factors that require careful consideration. These factors include the choice of informativeness measure, the batch selection strategy, the seed size, and domain-specific factors such as the feature representation and the choice of database. Our results show that the margin-based informativeness measure consistently outperforms the other measures. We also show that active learning brings definitive advantages on challenging databases when paired with powerful feature representations.
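As a rough illustration of the margin-based informativeness measure discussed above (a minimal sketch, not the authors' implementation), margin sampling scores each unlabeled example by the gap between the classifier's two highest class probabilities and selects a batch of examples with the smallest gaps:

```python
import numpy as np

def margin_informativeness(probs):
    """Margin of each example: difference between the two highest
    predicted class probabilities. Smaller margin = the classifier is
    less certain, so the example is more informative to annotate.
    probs: (n_examples, n_classes) array of class probabilities."""
    sorted_probs = np.sort(probs, axis=1)
    return sorted_probs[:, -1] - sorted_probs[:, -2]

def select_batch(probs, batch_size):
    """Simple batch selection: the batch_size unlabeled examples
    with the smallest margins (most informative first)."""
    margins = margin_informativeness(probs)
    return np.argsort(margins)[:batch_size]
```

Selected examples would then be labeled by the annotator, added to the training set, and the classifier retrained before the next selection round.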
