Malte J. Rasch, Tayfun Gokmen, et al.
Frontiers in Neuroscience
A critical bottleneck for the training of large neural networks (NNs) is communication with off-chip memory. A promising mitigation consists of integrating crossbar arrays of analogue memories in the Back-End-Of-Line to store the NN parameters and efficiently perform the required synaptic operations. The "Tiki-Taka" algorithm was developed to facilitate NN training in the presence of device nonidealities. However, a resistive switching device exhibiting all the fundamental Tiki-Taka requirements, namely many programmable states, a centered symmetry point, and low programming noise, has not yet been demonstrated. Here, a complementary metal-oxide-semiconductor (CMOS)-compatible resistive random access memory (RRAM) showing more than 30 programmable states with low noise and a symmetry point with only 5% skew from the center is presented for the first time. These results enable generalization of Tiki-Taka training from small fully connected networks to larger long short-term memory (LSTM) types of NN.
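The abstract above names the device requirements (many states, a centered symmetry point, low programming noise) without spelling out why Tiki-Taka needs them. A minimal NumPy sketch of the general Tiki-Taka idea may help: two arrays are kept per layer, an auxiliary array `A` that absorbs noisy rank-1 gradient updates and drifts toward its device symmetry point, and a weight array `C` that periodically accumulates the contents of `A`. All names, rates, and the toy task here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hedged sketch of Tiki-Taka-style training (illustrative simplification,
# not the authors' implementation). A receives noisy rank-1 gradient
# updates and decays toward the device symmetry point; A is periodically
# transferred into C, which holds the actual weights. If the symmetry
# point sits at zero, the decay averages the gradient instead of biasing it.

rng = np.random.default_rng(0)

def device_update(A, dW, noise=0.05, symmetry_point=0.0, asym=0.05):
    """One noisy analogue programming step: the intended update dW, plus
    write noise proportional to the update size, plus a pull toward the
    device symmetry point (the nonideality Tiki-Taka exploits)."""
    pull = -asym * (A - symmetry_point)
    return A + dW + noise * rng.standard_normal(A.shape) * np.abs(dW) + pull

def tiki_taka_step(A, C, x, err, lr=0.1, transfer=False, transfer_lr=0.1):
    """Rank-1 gradient update onto A; optionally read A and add it to C."""
    A = device_update(A, -lr * np.outer(err, x))
    if transfer:
        C = C + transfer_lr * A
    return A, C

# Toy usage: learn y = W x with a single linear "tile".
W_true = rng.standard_normal((4, 3))
A = np.zeros((4, 3))
C = np.zeros((4, 3))
for step in range(2000):
    x = rng.standard_normal(3)
    err = C @ x - W_true @ x          # prediction error of the weight array
    A, C = tiki_taka_step(A, C, x, err, transfer=(step % 10 == 9))
```

Under this assumed setup, `C` settles near `W_true` despite the write noise on `A`, because the decay toward a centered symmetry point turns `A` into a leaky gradient average; an off-center symmetry point (the 5% skew bound in the abstract) would instead inject a constant bias into every transfer.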
G. Karunaratne, M. Hersche, et al.
ESSDERC/ESSCIRC 2022
Abhishek Moitra, Arkapravo Ghosh, et al.
MLSYS 2025
Corey Liam Lammie, A. Vasilopoulos, et al.
ISCAS 2024