Abstract
Recent advances have unveiled physical neural networks as promising machine learning platforms, offering faster and more energy-efficient information processing. Compared with extensively studied optical neural networks, the development of mechanical neural networks remains nascent and faces significant challenges, including heavy computational demands and learning with approximate gradients. Here, we introduce the mechanical analogue of in situ backpropagation to enable highly efficient training of mechanical neural networks. We theoretically prove that the exact gradient can be obtained locally, enabling learning from locally available information, and we experimentally demonstrate this backpropagation, obtaining gradients with high precision. Using this gradient information, we showcase the successful training of networks in simulations for behavior learning and machine learning tasks, achieving high accuracy in regression and classification experiments. Furthermore, we demonstrate the retrainability of networks after task switching and damage, highlighting their resilience. Our findings, which integrate the theory of training mechanical neural networks with experimental and numerical validation, pave the way for mechanical machine learning hardware and autonomous self-learning material systems.
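To illustrate the flavor of adjoint-style (backpropagation-like) gradient computation in a mechanical network, the following is a minimal sketch, not the paper's implementation: it assumes a linear spring network in static equilibrium, with hypothetical names (`K_of`, `loss_and_grad`, the toy incidence matrix) introduced purely for illustration. The gradient with respect to each spring stiffness reduces to a product of that spring's elongation in the forward state and in the adjoint state, i.e., a locally available quantity.

```python
# Hypothetical sketch of adjoint-based gradients for a linear spring network.
# Assumptions: static equilibrium K(k) u = f, stiffness K = A^T diag(k) A with
# incidence matrix A (springs x nodes), and loss L = 0.5 * ||u - u_target||^2.
import numpy as np

def K_of(k, A):
    # Stiffness matrix of the spring network.
    return A.T @ np.diag(k) @ A

def loss_and_grad(k, A, f, u_target):
    K = K_of(k, A)
    u = np.linalg.solve(K, f)              # forward solve: equilibrium displacements
    dL_du = u - u_target                   # dL/du for the quadratic loss
    lam = np.linalg.solve(K.T, dL_du)      # adjoint ("backward") solve
    e_u = A @ u                            # spring elongations, forward state
    e_lam = A @ lam                        # spring elongations, adjoint state
    grad_k = -e_u * e_lam                  # dL/dk_i = -(A u)_i (A lam)_i: a local product
    return 0.5 * float(dL_du @ dL_du), grad_k

# Toy usage: a two-spring chain (node 0 fixed, nodes 1 and 2 free).
A = np.array([[1.0, 0.0],    # spring 0: ground -> node 1
              [-1.0, 1.0]])  # spring 1: node 1 -> node 2
k = np.array([2.0, 1.0])
f = np.array([0.0, 1.0])
u_target = np.array([0.3, 1.0])
L, g = loss_and_grad(k, A, f, u_target)
```

In this sketch, one forward and one adjoint solve yield exact gradients for all stiffness parameters at once, which is the general appeal of adjoint/backpropagation approaches; the specific in situ protocol of the paper is described in its main text.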