Abstract

The analog AI core concept is appealing for deep learning (DL) because it combines computation and memory functions in a single device. Yet significant challenges, such as noise and weight drift, will impact large‐scale analog in‐memory computing. Here, the effects of flicker noise and drift on large DL systems are explored using a new flicker‐noise model with memory, which preserves temporal correlations, together with a flicker‐noise figure of merit (FOM) that quantifies the impact on system performance. Flicker noise is characterized for GST‐based phase‐change memory (PCM) cells, revealing a read‐noise asymmetry tied to the shape asymmetry of mushroom cells. This experimental read‐polarity dependence is consistent with Pirovano's trap‐activation and defect‐annihilation model in an asymmetric GST cell. The impact of flicker noise and resistance drift of analog PCM synaptic devices on deep‐learning hardware is assessed for six large‐scale deep neural networks (DNNs) used for image classification: the top‐1 inference accuracy is found to degrade with the accumulated device flicker noise and with resistance drift, characterized by the drift coefficient ν. These negative impacts can be mitigated with a new hardware‐aware (HWA) (pre)training of the DNNs, applied before the weights are programmed into the analog arrays.
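The two device non-idealities named above are commonly modeled as follows: resistance drift in PCM follows the well-established power law R(t) = R0·(t/t0)^ν, and flicker (1/f) noise can be approximated by superposing many correlated sources with time constants spread over several octaves. The sketch below illustrates these two standard models in Python; it is not the authors' specific model (their flicker-noise model with memory and their FOM are not reproduced here), and the function names and default ν are illustrative assumptions.

```python
import numpy as np

def drift_resistance(r0, t, t0=1.0, nu=0.05):
    """Standard PCM drift power law: R(t) = R0 * (t / t0) ** nu.
    nu is the drift coefficient; nu ~ 0.05 is a typical amorphous-GST
    value used here purely for illustration."""
    return r0 * (t / t0) ** nu

def one_over_f_noise(n_samples, n_octaves=16, rng=None):
    """Approximate flicker (1/f) noise by summing AR(1) processes whose
    correlation times span n_octaves octaves (a Lorentzian superposition).
    Unlike white noise, successive samples remain temporally correlated."""
    rng = np.random.default_rng() if rng is None else rng
    noise = np.zeros(n_samples)
    for k in range(n_octaves):
        tau = 2.0 ** k               # correlation time of this source
        alpha = np.exp(-1.0 / tau)   # AR(1) memory coefficient
        x = np.zeros(n_samples)
        for i in range(1, n_samples):
            x[i] = alpha * x[i - 1] + np.sqrt(1 - alpha ** 2) * rng.standard_normal()
        noise += x / np.sqrt(n_octaves)  # equal power per octave -> ~1/f spectrum
    return noise
```

In an inference simulation, such models would be applied to each programmed conductance at read time, which is why accuracy degrades as noise and drift accumulate between programming and inference.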
