Compression for data archiving and backup revisited
Corneliu Constantinescu
SPIE Optical Engineering + Applications 2009
The theory of associative networks is increasingly providing tools to interpret the update rules of artificial neural networks. At the same time, deriving neural learning rules from a solid theory remains a fundamental challenge. We take some steps in this direction by considering general energy-based associative networks of continuous neurons and synapses that evolve on multiple timescales. We use the separation of these timescales to recover a limit in which the activation of the neurons, the energy of the system, and the neural dynamics can all be recovered from a generating function. By allowing the generating function to depend on memories, we recover the conventional Hebbian modeling choice for the interaction strength between neurons. Finally, we propose and discuss a dynamics of memories that enables us to include learning in this framework.
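The "conventional Hebbian modeling choice" mentioned above can be illustrated with a classical Hopfield-style associative network, where the interaction strength between neurons $i$ and $j$ is built from stored memories as $J_{ij} = \frac{1}{N}\sum_\mu \xi_i^\mu \xi_j^\mu$ and retrieval descends an energy function. The following is a minimal, self-contained sketch of that standard construction; all names and sizes are illustrative and not taken from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 5                          # neurons, stored memories
xi = rng.choice([-1, 1], size=(P, N))  # random binary memory patterns

# Hebbian interaction matrix: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)               # no self-coupling

def energy(s):
    # Hopfield energy E(s) = -1/2 s^T J s
    return -0.5 * s @ J @ s

def recall(s, steps=10):
    # Deterministic update: align each neuron with its local field,
    # which (at low memory load) descends the energy toward a stored pattern.
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1
    return s

# Corrupt a stored memory, then let the dynamics retrieve it.
probe = xi[0].copy()
flip = rng.choice(N, size=20, replace=False)
probe[flip] *= -1
recovered = recall(probe)
overlap = (recovered @ xi[0]) / N      # close to 1.0 on successful recall
```

At this low load (P much smaller than N), the corrupted probe is attracted back to the stored pattern; the learning dynamics proposed in the paper would instead make the memories `xi` themselves evolve in time.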