Abstract

The perceptron is an essential element in neural network (NN)-based machine learning; however, the effectiveness of its various circuit implementations is rarely demonstrated through chip testing. This paper presents measured silicon results for analog perceptron circuits fabricated in a 0.6 μm/±2.5 V complementary metal oxide semiconductor (CMOS) process, comprising digital-to-analog converter (DAC)-based multipliers and phase shifters. The measurement results confirm that our implementation attains the correct function and good performance. Furthermore, we propose a multi-layer perceptron (MLP) built from these analog perceptrons, in which the structure, the number of neurons, and the weights can all be flexibly configured. As an example, we design a 2-3-4 MLP circuit with rectified linear unit (ReLU) activation, consisting of 2 input neurons, 3 hidden neurons, and 4 output neurons. In this experimental case, the simulated performance achieves a power dissipation of 200 mW, a working frequency range from 0 to 1 MHz, and an error ratio within 12.7%. Finally, to demonstrate the feasibility and effectiveness of configuring an MLP from our analog perceptron, seven more analog-based MLPs designed with the same approach are analyzed through simulation with respect to various specifications, and two of them are compared against digital counterparts with the same structures.

Highlights

  • Artificial neural network (ANN)-based machine learning, known as a promising technology, has been widely researched to make electronic devices more intelligent and efficient [1,2,3,4]

  • We summarize simulation results with respect to various specifications and compare them against field programmable gate array (FPGA)-based multi-layer perceptrons (MLPs) with the same structures, demonstrating the feasibility and effectiveness of constructing an MLP-based neural network from our analog perceptron

  • Two digital-to-analog converters (DACs) are used in a DAC-based multiplier: one DAC is inserted at the input of the negative feedback of the operational amplifier (OPAMP) circuit, while the other is placed in the feedback loop

Introduction

Artificial neural network (ANN)-based machine learning, known as a promising technology, has been widely researched to make electronic devices more intelligent and efficient [1,2,3,4]. The perceptron is an essential element of an ANN; it has multiple inputs and a single output. It is commonly represented by the following mathematical model: each input is multiplied by a weight, all weighted inputs are summed at the output, and the resulting sum is passed through an activation function. ANN-based machine learning fits the transfer function of the system via a training process in which input-output pairs are iteratively presented while the variable parameters (weights) are adjusted.
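The perceptron model described above can be sketched numerically as follows. This is a minimal illustration of the mathematical model only, not of the paper's analog circuit; the weight and input values are hypothetical, and ReLU is chosen as the activation to match the paper's 2-3-4 MLP example.

```python
def relu(x):
    """Rectified linear unit: max(0, x)."""
    return x if x > 0.0 else 0.0

def perceptron(inputs, weights, bias=0.0):
    """Multiply each input by its weight, sum the products
    (plus an optional bias), and apply the activation function."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return relu(s)

# Example with two inputs, as in the paper's 2-3-4 MLP input layer.
# Values are illustrative only.
y = perceptron([0.5, -1.0], [0.8, 0.3], bias=0.1)
# 0.5*0.8 + (-1.0)*0.3 + 0.1 ≈ 0.2
```

During training, the weights (and bias) would be the adjustable parameters updated from iteratively presented input-output pairs.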

