Abstract
The first part of this article surveys different current trends in mathematical learning theory. The main divisions of the subject covered are stimulus-response theory, language learning, formal learning theory, perceptrons, cellular automata, and neural networks. The second part is concerned with extending the ideas of stimulus-response theory to universal computation. This is done by using register machines rather than Turing machines. The main theorem is that any partial recursive function can be asymptotically learned by a register learning model. In the discussion of this result the emphasis is on the need for a carefully organized hierarchy of concepts in order to have a rate of learning that is realistic for either organisms or machines.

Keywords: Current Direction, Cellular Automaton, Turing Machine, Relative Clause, Finite Automaton