Abstract

Due to energy-efficiency requirements, computational systems are now being implemented using noisy nanoscale semiconductor devices whose reliability depends on the energy consumed. We study circuit-level energy-reliability limits for deep feedforward neural networks (multilayer perceptrons) built from such devices, and en route also establish the same limits for formulas (boolean tree-structured circuits). To obtain energy lower bounds, we extend Pippenger’s mutual information propagation technique for characterizing the complexity of noisy circuits, since small circuit complexity need not imply low energy. Many device technologies require all gates to operate at the same electrical operating point; in circuits of such uniform gates, we show that the minimum energy required to achieve any non-trivial reliability scales superlinearly with the number of inputs. Circuits implemented in emerging device technologies like spin electronics can, however, have gates operate at different electrical points; in circuits of such heterogeneous gates, we show energy scaling can be linear in the number of inputs. Building on our extended mutual information propagation technique and using crucial insights from convex optimization theory, we develop an algorithm to compute energy lower bounds for any given boolean tree under heterogeneous gates. This algorithm runs in time linear in the number of gates, and is therefore practical for modern circuit design. As part of our development we find a simple procedure for energy allocation across circuit gates with different operating points, and across neural-network layers operating at different points.
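To make the mutual information propagation idea concrete, the following is a minimal sketch (not the paper's actual analysis) that models each noisy gate output as passing through a binary symmetric channel (BSC) with crossover probability p; cascading such channels can only degrade the mutual information between input and output, which is the intuition behind bounding what deep noisy circuits can compute. All function names here are illustrative.

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(p):
    """I(X;Y) for a BSC with crossover p and uniform input: 1 - h2(p)."""
    return 1.0 - h2(p)

def cascade_crossover(p, depth):
    """Effective crossover of `depth` identical BSCs in series,
    via the recursion p_eff' = p*(1 - p_eff) + (1 - p)*p_eff."""
    p_eff = 0.0
    for _ in range(depth):
        p_eff = p * (1 - p_eff) + (1 - p) * p_eff
    return p_eff

# Mutual information shrinks monotonically with depth
# (a data-processing-inequality effect).
mis = [bsc_mutual_info(cascade_crossover(0.05, d)) for d in range(1, 6)]
```

Running this with, say, `p = 0.05` shows `mis` strictly decreasing toward 0, i.e., deeper noisy paths carry less information unless per-gate reliability (and hence energy) is increased.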

Highlights

  • As neural networks become larger and more prevalent, their energy requirements are becoming of key concern [3]

  • We focus on deep feedforward networks, which are directed acyclic graphs (DAGs)

  • We again build on the mutual information propagation technique to obtain energy bounds that yield insights into the structural and connectivity requirements for reliable operation of nanoscale feedforward neural networks


Summary

INTRODUCTION

As neural networks become larger and more prevalent, their energy requirements are becoming of key concern [3]. A major challenge in using nanoscale devices is that they can be very unreliable, especially when operated at low energy [11]. This has renewed interest in the study of reliable circuit design using unreliable components, both digital and analog [12]–[15], a problem first addressed by von Neumann through a modular redundancy approach [16]. Determining the best design strategies is useful not just as a proof technique, but also for informing practical circuit design and for explaining the nature of biological neural networks in sensory cortex, as we detail in separate works [8], [26]. We extend Pippenger’s mutual information propagation technique and use crucial insights from convex optimization theory to determine energy limits for reliable nanoscale boolean trees and feedforward neural networks with unreliable components.
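As a toy illustration of how a linear-time energy bound over a boolean tree might be computed, the sketch below assumes a hypothetical exponential energy-failure function ε(e) = exp(−e) (not necessarily the one used in the paper), splits an output failure budget δ equally across the n gates via a union bound, and inverts ε per gate. A single post-order traversal suffices, so the computation is O(n) in the number of gates. All names and the tuple encoding of trees are assumptions for this sketch.

```python
import math

def count_gates(tree):
    """Post-order count of internal gates in a tuple-encoded tree:
    a leaf is a string, a gate is (name, left_subtree, right_subtree)."""
    if isinstance(tree, str):
        return 0
    _, left, right = tree
    return 1 + count_gates(left) + count_gates(right)

def energy_lower_sketch(tree, delta, failure_inv=lambda eps: math.log(1 / eps)):
    """Illustrative bound: give each of the n gates a failure budget
    delta/n (union bound) and invert the assumed energy-failure
    function eps(e) = exp(-e), so each gate needs e = ln(n/delta).
    One traversal over the tree => linear time in the gate count."""
    n = count_gates(tree)
    per_gate_energy = failure_inv(delta / n)
    return n * per_gate_energy

# A depth-2 formula on four inputs: (x1 AND x2) OR (x3 AND x4).
tree = ("or", ("and", "x1", "x2"), ("and", "x3", "x4"))
e_total = energy_lower_sketch(tree, 0.01)  # 3 gates, each needing ln(300)
```

The equal-split allocation is only the simplest choice; the paper's convex-optimization insight is precisely that, with heterogeneous gates, an unequal allocation across operating points can do better.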

Contributions
Circuit Graphs
Mutual Information Propagation
Energy-Failure Functions
BOOLEAN TREE CIRCUITS FROM HOMOGENEOUS GATES
Computation Energy per Input Bit
Energy Bounds for Device Technologies
BOOLEAN TREE CIRCUITS FROM HETEROGENEOUS GATES
Minimum Energy Requirement
Minimum Energy for Device Technologies
Maximum Reliability for Device Technologies
FEEDFORWARD NEURAL NETWORKS
Homogeneous Neurons
Heterogeneous Neurons
NUMERICAL EXAMPLES AND PRACTICAL INSIGHTS
CONCLUSION AND FUTURE WORK