Guo-Jun Qi, Charu Aggarwal, et al.
IEEE TPAMI
Glial cells account for between 50% and 90% of all human brain cells, and serve a variety of important developmental, structural, and metabolic functions. Recent experimental efforts suggest that astrocytes, a type of glial cell, are also directly involved in core cognitive processes such as learning and memory. While it is well established that astrocytes and neurons are connected to one another in feedback loops across many timescales and spatial scales, there is a gap in understanding the computational role of neuron–astrocyte interactions. To help bridge this gap, we draw on recent advances in AI and astrocyte imaging technology. In particular, we show that neuron–astrocyte networks can naturally perform the core computation of a Transformer, a particularly successful type of AI architecture. In doing so, we provide a concrete, normative, and experimentally testable account of neuron–astrocyte communication. Because Transformers are so successful across a wide variety of task domains, such as language, vision, and audition, our analysis may help explain the ubiquity, flexibility, and power of the brain’s neuron–astrocyte networks.
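The "core computation of a Transformer" referenced above is scaled dot-product self-attention. As a point of reference, here is a minimal NumPy sketch of that computation; the matrix names (`Wq`, `Wk`, `Wv`) and dimensions are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (tokens, dim). Project into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Similarity of every token to every other, scaled by sqrt(key dim).
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Each row of A is a probability distribution over tokens.
    A = softmax(scores, axis=-1)
    # Output: attention-weighted mixture of the value vectors.
    return A @ V

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))           # 4 tokens, embedding dim 8
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)       # shape (4, 8)
```

The paper's claim, roughly, is that feedback loops between neurons and astrocytes can implement this attention-weighted mixing operation.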
Nicolae Dobra, Jakiw Pidstrigach, et al.
NeurIPS 2025
Chen-Chia Chang, Wan-Hsuan Lin, et al.
ICML 2025
Hong-Linh Truong, Maja Vukovic, et al.
ICDH 2024