Chaos is generic in strongly coupled recurrent networks of model neurons, and is thought to be an easily accessible dynamical regime in the brain. While neural chaos is typically seen as an impediment to robust computation, we show how such chaos might play a functional role in allowing the brain to learn and sample from generative models. We construct architectures that combine a classic model of neural chaos either with a canonical generative modeling architecture or with energy-based models of neural memory. We show that these architectures have appealing properties for sampling, including easy, biologically plausible control of sampling rates via overall gain modulation.
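The "classic model of neural chaos" referred to above is presumably the random recurrent rate network of Sompolinsky, Crisanti, and Sommers, in which a single gain parameter switches the dynamics between a quiescent and a chaotic regime. Below is a minimal sketch of that transition under this assumption; the function name, network size, and integration parameters are illustrative, not taken from the paper.

```python
import numpy as np

def simulate_chaotic_rnn(g, N=200, T=50.0, dt=0.05, seed=0):
    """Euler-integrate the random rate network
        dx/dt = -x + g * J @ tanh(x),
    with couplings J_ij ~ Normal(0, 1/N).
    For g < 1 the quiescent state x = 0 is stable and activity decays;
    for g > 1 the network enters a chaotic regime, so the overall gain g
    acts as a single control knob on the dynamics.
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 1.0, size=N)  # random initial condition
    for _ in range(int(T / dt)):
        x = x + dt * (-x + g * (J @ np.tanh(x)))
    return x

# Subcritical gain: activity decays toward zero.
x_low = simulate_chaotic_rnn(g=0.5)
# Supercritical gain: activity remains irregular and large.
x_high = simulate_chaotic_rnn(g=1.5)
```

Comparing the final-state norms for the two gains illustrates the transition that the proposed architectures exploit: modulating `g` globally changes how vigorously the chaotic dynamics explore state space, which is the handle used to control sampling rates.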