Abstract
The learners considered here process data in cycles and maintain, as long-term memory, a string which provides all internal data the learner can use in the next cycle. These strings are usually updated by either recursive or automatic learners. The present work looks at transduced learners, which sit in between. The results include that transduced learners can learn all learnable automatic families with memory exponential in the size of the longest input seen so far. Furthermore, there is a hierarchy based on the memory allowance: if n is the size of the largest datum seen so far, then for all k ≥ 1, memory n^(k+1) allows one to learn more automatic families than memory n^k. Further results shed light on when transduced learners can be required to be consistent, conservative or iterative. The main result of this kind is that every learnable automatic family has a consistent and conservative transduced learner.