Abstract

An extended multilayer perceptron (MLP) model based on reduced geometric algebra (RGA), termed RGA-MLP, is proposed for multi-dimensional signal processing. The RGA-MLP model treats multi-dimensional signals as multivectors in RGA space, and all neuronal parameters and operators, including inputs, connection weights, activation functions, and outputs, are encoded in RGA. An RGA-based back-propagation (BP) algorithm is also provided. Thanks to the commutative property of RGA, multi-dimensional signals can be processed in a holistic manner, which avoids losing the relationships among the dimensions. The experiments demonstrate that the RGA-MLP model outperforms the traditional real-valued MLP model and the quaternion-based MLP model (QMLP), with a faster convergence rate, higher classification accuracy, and lower computational complexity.
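To make the idea of an RGA-valued neuron concrete, the sketch below (Python/NumPy) shows the data flow of a single neuron whose inputs, weights, and bias are all multivector coefficient arrays processed as whole objects. The names rga_mul, rga_neuron, and DIM, and the tanh activation, are hypothetical placeholders: the actual commutative multiplication table and activation function are those defined in the paper and in Shen et al. [32], so this is only a minimal structural sketch under stated assumptions, not the paper's implementation.

    import numpy as np

    DIM = 4  # hypothetical number of RGA coefficients per multivector (set by the chosen algebra)

    def rga_mul(a, b):
        # Placeholder commutative product between multivector coefficient arrays.
        # The true product follows the RGA multiplication rules of Shen et al. [32];
        # a component-wise product is used here only so the sketch runs, and it shares
        # the commutativity the model relies on: rga_mul(a, b) == rga_mul(b, a).
        return a * b

    def rga_neuron(x, w, theta):
        # One RGA-valued neuron, y = f( sum_i w_i * x_i + theta ), with
        #   x:     (num_inputs, DIM) multivector inputs
        #   w:     (num_inputs, DIM) multivector weights
        #   theta: (DIM,)            multivector bias
        s = theta + sum(rga_mul(w[i], x[i]) for i in range(len(x)))
        return np.tanh(s)  # split-type activation applied coefficient-wise (an assumption)

    # Example: three multivector inputs feeding one neuron.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((3, DIM))
    w = rng.standard_normal((3, DIM))
    theta = rng.standard_normal(DIM)
    print(rga_neuron(x, w, theta))

Because rga_mul is commutative, the order of the weight and input factors in the accumulation does not matter; this is the property the abstract credits for holistic processing with lower computational complexity.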

Highlights

  • The multilayer perceptron, also known as a multi-layer feed-forward neural network, is the most well-known artificial neural network (ANN) model

  • To address the problem of high computational complexity owing to non-commutative multiplication, Shen et al. [32] presented a novel theory of reduced geometric algebra (RGA) with commutative multiplication rules, together with a novel vector-valued sparse representation model for color images using RGA

  • We present an RGA version of the back-propagation (BP) algorithm to train the network (a generic form of the weight update is sketched after this list)

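The RGA-BP derivation itself belongs to the LEARNING ALGORITHM section of the paper. As a hedged reminder, any such derivation instantiates the generic gradient-descent update on the RGA-valued weights, taken coefficient-wise:

$$E = \frac{1}{2}\sum_{k}\lVert d_k - y_k\rVert^2, \qquad w_{ij} \leftarrow w_{ij} - \eta\,\frac{\partial E}{\partial w_{ij}},$$

where $d_k$ and $y_k$ are the desired and actual multivector outputs, $\eta$ is the learning rate, and the partial derivative is taken with respect to each RGA coefficient of $w_{ij}$. The specific RGA chain-rule terms are those derived in the paper, not reproduced here.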

Summary

INTRODUCTION

The multilayer perceptron, also known as a multi-layer feed-forward neural network, is the most well-known artificial neural network (ANN) model. To address the problem of high computational complexity owing to non-commutative multiplication, Shen et al. [32] presented a novel theory of reduced geometric algebra (RGA) with commutative multiplication rules, together with a novel vector-valued sparse representation model for color images using RGA. Inspired by the recent progress of GA-based models in various fields of multi-dimensional signal processing, and by the advantages of RGA theory, we present an extended multilayer perceptron model using reduced geometric algebra, which treats multi-dimensional signals as multivectors in RGA space and simplifies the computation of the networks. The proposed RGA-MLP model is capable of achieving state-of-the-art performance with lower computational complexity for multi-dimensional signal processing. Throughout, any multivector $H \in (G_n)^{M \times N}$ in GA space is understood entrywise, as recalled below.
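For reference, the standard basis-blade expansion of a multivector in $G_n$ (the general GA setting on which the RGA reduction of [32] is built; the notation here is the usual GA convention, not necessarily the paper's) reads, for each entry of $H$,

$$H_{mn} = h_{mn,0} + \sum_{i} h_{mn,i}\,e_i + \sum_{i<j} h_{mn,ij}\,e_i e_j + \cdots + h_{mn,1\cdots n}\,e_1 e_2 \cdots e_n,$$

so each entry carries $2^n$ real coefficients (e.g., 8 coefficients in $G_3$). The commutative multiplication rules that distinguish RGA from this standard GA setting are those defined by Shen et al. [32].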

MULTILAYER PERCEPTRON MODEL
THE PROPERTIES OF REDUCED GEOMETRIC ALGEBRA
LEARNING ALGORITHM
EXPERIMENTS AND ANALYSIS
Findings
CONCLUSION
