Publication
DeepLearn 2024
Talk

Oscillating Chemical Reaction Networks as a Novel Energy-Efficient System for Artificial Neural Networks

Abstract

It is well established that Moore’s law no longer holds, as transistors approach their physical limits. This stands in contrast to the growing demand for AI, which requires more computational power than ever before [1]. Optimizing an AI’s code to lower the energy needed for computation has inherent limits; an alternative is to design novel computing architectures that consume less energy. Our current research investigates the use of the Belousov-Zhabotinsky (BZ) reaction [2] as the medium for a novel computing hardware: a chemical computer. The BZ reaction is a non-linear chemical oscillator. Recent successful implementations of the BZ reaction as a medium for language recognition [3] and information processing [4] serve as proofs of concept for a chemical computing device. Our current research focuses on miniaturizing the architecture to the micrometer scale and on demonstrating the scalability of the system size. The BZ reactants are filled into micro-compartments with volumes in the femtoliter range. These compartments are diffusively connected and can therefore transmit an oscillation peak in the form of a reaction-diffusion wave front. The architecture can be modified in situ by illuminating either the channels or the reactors, breaking a connection or inhibiting the oscillations of a compartment, respectively. This novel architecture not only operates at the energy efficiency of molecular processes but also overcomes the von Neumann bottleneck by co-locating memory and processing in the same structures. This approach presents an opportunity to take inspiration from biology and use local circuit motifs (winner-take-all, working memory, or larger canonical cortical microcircuits) to perform computational tasks. It is a promising alternative to conventional computing architectures and a possible answer to the ever-increasing computational demands of AI.
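
To make the compartment picture concrete, the following is a minimal numerical sketch, assuming the two-variable photosensitive Oregonator model as a stand-in for the BZ kinetics: a short chain of diffusively coupled compartments in which an illumination term phi inhibits an individual compartment, loosely mirroring the in-situ reconfiguration described above. The model choice, the function names, and all parameter values are illustrative assumptions, not the implementation presented in the talk.

    import numpy as np

    # Sketch of diffusively coupled BZ micro-oscillators, using the
    # two-variable photosensitive Oregonator model. Parameter values are
    # classic Oregonator scalings in the oscillatory regime, chosen for
    # illustration only.
    EPS, F, Q = 0.04, 1.0, 0.002

    def step(u, v, phi, k, dt):
        """One forward-Euler step for a chain of compartments.

        u, v: activator / oxidized-catalyst concentrations per compartment
        phi:  light-induced inhibition per compartment (phi > 0 = illuminated)
        k:    diffusive coupling strength between neighbouring compartments
        """
        # Discrete Laplacian with no-flux boundaries models the diffusive channels.
        lap = np.zeros_like(u)
        lap[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]
        lap[0] = u[1] - u[0]
        lap[-1] = u[-2] - u[-1]
        du = (u - u**2 - (F * v + phi) * (u - Q) / (u + Q)) / EPS + k * lap
        dv = u - v
        return u + dt * du, v + dt * dv

    def simulate(phi, k=0.5, t_end=30.0, dt=2e-4):
        """Simulate a chain of len(phi) compartments; returns the u-traces."""
        phi = np.asarray(phi, float)
        u = np.full(phi.shape, 0.05)
        u[0] = 0.5                      # perturb compartment 0 to start a wave
        v = np.full(phi.shape, 0.05)
        steps = int(t_end / dt)
        trace = np.empty((steps, phi.size))
        for s in range(steps):
            u, v = step(u, v, phi, k, dt)
            trace[s] = u
        return trace

    def count_peaks(x, threshold=0.3):
        """Count oscillation peaks (local maxima above a threshold)."""
        mid = x[1:-1]
        return int(np.sum((mid > x[:-2]) & (mid > x[2:]) & (mid > threshold)))

    if __name__ == "__main__":
        # Dark chain: both compartments oscillate and synchronize diffusively.
        dark = simulate(phi=[0.0, 0.0])
        # Illuminating compartment 1 raises phi there, stabilizing its steady
        # state and thereby rewiring the network in situ.
        lit = simulate(phi=[0.0, 0.1])
        print("dark peaks per compartment:", [count_peaks(dark[:, i]) for i in range(2)])
        print("lit  peaks per compartment:", [count_peaks(lit[:, i]) for i in range(2)])

Running the sketch shows the qualitative behaviour described in the abstract: in the dark both compartments oscillate, while illuminating one compartment suppresses its peaks without silencing its neighbour, i.e. the network topology is controlled by light rather than by wiring.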
