Abstract

We organized a challenge in "Unsupervised and Transfer Learning": the UTL challenge (http://clopinet.com/ul). We made available large datasets from various application domains: handwriting recognition, image recognition, video processing, text processing, and ecology. The goal was to learn data representations that capture regularities of an input space for re-use across tasks. The representations were evaluated on supervised learning "target tasks" unknown to the participants. The first phase of the challenge was dedicated to "unsupervised transfer learning": the competitors were given only unlabeled data. The second phase was dedicated to "cross-task transfer learning": the competitors were provided with a limited amount of labeled data from "source tasks" distinct from the "target tasks". The analysis indicates that learned data representations yield significantly better results than those obtained from the original data or from data preprocessed with standard normalizations and functional transforms.
