This document provides a correction to the proof of the theorem establishing the exponential speed of convergence of the locally competitive algorithm (LCA) in the paper "Convergence and Rate Analysis of Neural Networks for Sparse Approximation."
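For context, the LCA dynamics that the corrected theorem concerns are typically stated as follows. This is a sketch following the common presentation of the LCA (soft-threshold activation for the l(1) case), with s the input, Phi the dictionary, u the internal states, a the outputs, and lambda the threshold; the correction itself addresses the proof of the exponential-rate theorem, not these definitions.

% Standard LCA formulation: internal states u(t) evolve so that the
% thresholded outputs a(t) descend the l1-regularized objective.
\begin{align}
  \tau \dot{u}(t) &= -u(t) + \Phi^\top s - (\Phi^\top \Phi - I)\,a(t), \\
  a(t) &= T_\lambda\big(u(t)\big), \qquad
  T_\lambda(u) = \operatorname{sign}(u)\,\max(|u| - \lambda,\, 0),
\end{align}
whose fixed points minimize
\begin{equation}
  E(a) = \tfrac{1}{2}\,\lVert s - \Phi a \rVert_2^2 + \lambda \lVert a \rVert_1 .
\end{equation}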
Sparse approximation is a hypothesized coding strategy in which a population of sensory neurons (e.g., in V1) encodes a stimulus using as few active neurons as possible. We present the Spiking LCA (locally competitive algorithm), a rate-encoded spiking neural network (SNN) of integrate-and-fire neurons that calculates sparse approximations. The Spiking LCA is designed to be equivalent to the non-spiking LCA, an analog dynamical system that converges exponentially to l(1)-norm sparse approximations. We show that the firing rate of the Spiking LCA converges to the same solution as the analog LCA, with an error inversely proportional to the sampling time. We simulate in NEURON a network of 128 neuron pairs that encode 8x8-pixel image patches, demonstrating that the network converges to nearly optimal encodings within 20 ms of biological time. We also show that when the neurons use more biophysically realistic parameters, the gain function encourages additional l(0)-norm sparsity in the encoding, relative both to ideal neurons and to digital solvers.
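To make the rate-coding error claim concrete, here is a minimal numerical sketch, not the paper's NEURON model: a non-leaky integrate-and-fire neuron with hypothetical current and threshold values, whose spike-count rate estimate over a window of length T deviates from the analog rate by at most 1/T.

# Minimal sketch (hypothetical parameters, not the paper's NEURON model):
# a non-leaky integrate-and-fire neuron with constant input current mu and
# threshold theta fires at the analog rate mu/theta; the spike count over a
# window of length T misses at most one partial inter-spike interval, so the
# rate estimate has error at most theta/T (theta = 1 here).
mu, theta = 3.0, 1.0           # input current and firing threshold (assumed)
analog_rate = mu / theta       # rate the spike train encodes

def spike_count(T, dt=1e-3):
    """Count threshold crossings of a perfect integrator over [0, T]."""
    v, count = 0.0, 0
    for _ in range(int(round(T / dt))):
        v += mu * dt
        if v >= theta:
            v -= theta         # reset by subtraction preserves the exact rate
            count += 1
    return count

for T in (0.1, 1.0, 10.0, 100.0):
    err = abs(spike_count(T) / T - analog_rate)
    print(f"T = {T:6.1f} s    rate error = {err:.4f}")
# The printed error shrinks roughly as 1/T, mirroring the abstract's claim
# that the firing-rate error is inversely proportional to the sampling time.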
We present an analysis of the Locally Competitive Algorithm (LCA), a Hopfield-style neural network that efficiently solves sparse approximation problems (e.g., approximating a vector from a dictionary using just a few nonzero coefficients). This class of problems plays a significant role both in theories of neural coding and in signal-processing applications. However, the LCA's convergence properties have lacked analysis, and previous results on neural networks for nonsmooth optimization do not apply to the specifics of the LCA architecture. We show that the LCA has desirable convergence properties, such as stability and global convergence to the optimum of the objective function when it is unique. Under some mild conditions, the support of the solution is also proven to be reached in finite time. Furthermore, some restrictions on the problem specifics allow us to characterize the convergence rate of the system: we show that the LCA converges exponentially fast, with an analytically bounded convergence rate. We support our analysis with several illustrative simulations.
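As an illustrative (not authoritative) companion to the exponential-rate result, the sketch below integrates the LCA ODE with forward Euler on a small randomly generated problem; every size and constant is an assumption chosen for the demo, not taken from the paper's simulations. Printing the distance to the final iterate shows the roughly linear decay of its logarithm once the correct support is found.

import numpy as np

rng = np.random.default_rng(0)

# Assumed toy problem: unit-norm random dictionary, k-sparse target, small noise.
m, n, k, lam = 32, 64, 4, 0.1
Phi = rng.standard_normal((m, n))
Phi /= np.linalg.norm(Phi, axis=0)
a_true = np.zeros(n)
a_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
s = Phi @ a_true + 0.01 * rng.standard_normal(m)

def soft(u, lam):
    """Soft-threshold activation T_lambda for the l1-norm objective."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

# Forward-Euler integration of  tau * du/dt = -u + Phi^T s - (Phi^T Phi - I) a.
tau, dt, steps = 1.0, 0.01, 4000
b, G = Phi.T @ s, Phi.T @ Phi - np.eye(n)
u = np.zeros(n)
outputs = []
for _ in range(steps):
    a = soft(u, lam)
    u += (dt / tau) * (-u + b - G @ a)
    outputs.append(a)

a_star = outputs[-1]     # use the final iterate as a proxy for the optimum
for t in range(0, steps, 800):
    print(f"t = {t * dt:5.1f} tau   ||a(t) - a*|| = "
          f"{np.linalg.norm(outputs[t] - a_star):.2e}")
# Successive printed errors drop by roughly constant factors, i.e. the
# trajectory converges exponentially fast, consistent with the analysis.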