We study the role of complex neurodynamics in learning and associative memory using a neural network model of the olfactory cortex. By varying the noise level and a control parameter, corresponding to the level of neuromodulator or arousal, we analyze the resulting nonlinear dynamics during learning and recall of constant and oscillatory input. Point attractor, limit cycle, and strange attractor dynamics occur at different values of the control parameter. We show that oscillations and chaos-like behavior can give shorter recall times and more robust memory states than in static cases. In particular, we show that the recall time can reach a minimum for additive and multiplicative noise. Noise-induced state transitions and noise-induced chaos-like behavior are also demonstrated.
A query-reply system based on a Bayesian neural network is described. Strategies for generating questions which make the system both efficient and highly fault tolerant are presented. This involves having one phase of question generation intended to quickly reach a hypothesis, followed by a phase where verification of the hypothesis is attempted. In addition, both phases have strategies for detecting and removing inconsistencies in the replies from the user. Also described is an explanatory mechanism which gives information related to why a certain hypothesis is reached or a certain question is asked. Specific examples of the system's behavior as well as the results of a statistical evaluation are presented.
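The abstract does not detail the question-generation strategy, but the hypothesis-reaching phase can be illustrated with a standard device: pick the yes/no question with the largest expected entropy reduction over the current hypothesis distribution. The following is a minimal sketch under that assumption; all names, hypotheses, and probabilities are hypothetical, not taken from the described system.

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution (zero terms skipped)."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def posterior(prior, likelihood, answer):
    """Bayes update: P(h | answer) is proportional to P(answer | h) P(h)."""
    un = {h: prior[h] * (likelihood[h] if answer else 1 - likelihood[h])
          for h in prior}
    z = sum(un.values())
    return {h: v / z for h, v in un.items()} if z > 0 else prior

def best_question(prior, questions):
    """Choose the question with the largest expected entropy reduction."""
    h0 = entropy(prior)
    def expected_entropy(lik):
        p_yes = sum(prior[h] * lik[h] for h in prior)
        return (p_yes * entropy(posterior(prior, lik, True)) +
                (1 - p_yes) * entropy(posterior(prior, lik, False)))
    return max(questions, key=lambda q: h0 - expected_entropy(questions[q]))

# Toy example: two hypotheses, two yes/no questions.
prior = {"flu": 0.5, "cold": 0.5}
questions = {
    "fever?":  {"flu": 0.9, "cold": 0.2},   # P(yes | hypothesis): informative
    "sneeze?": {"flu": 0.5, "cold": 0.5},   # uninformative
}
print(best_question(prior, questions))  # prints: fever?
```

The same posterior machinery supports the verification phase: an answer that drives the leading hypothesis's probability sharply down is a cue that either the hypothesis or one of the earlier replies is inconsistent.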
This article shows how discrete derivative approximations can be defined so that scale-space properties hold exactly also in the discrete domain. Starting from a set of natural requirements on the first processing stages of a visual system, the visual front end, it gives an axiomatic derivation of how a multiscale representation of derivative approximations can be constructed from a discrete signal, so that it possesses an algebraic structure similar to that possessed by the derivatives of the traditional scale-space representation in the continuous domain. A family of kernels is derived that constitute discrete analogues to the continuous Gaussian derivatives. The representation has theoretical advantages over other discretizations of the scale-space theory in the sense that operators that commute before discretization commute after discretization. Some computational implications of this are that derivative approximations can be computed directly from smoothed data and that this will give exactly the same result as convolution with the corresponding derivative approximation kernel. Moreover, a number of normalization conditions are automatically satisfied. The proposed methodology leads to a scheme of computations of multiscale low-level feature extraction that is conceptually very simple and consists of four basic steps: (i) large support convolution smoothing, (ii) small support difference computations, (iii) point operations for computing differential geometric entities, and (iv) nearest-neighbour operations for feature detection. Applications demonstrate how the proposed scheme can be used for edge detection and junction detection based on derivatives up to order three.
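The four-step scheme can be sketched with the discrete analogue of the Gaussian, T(n; t) = e^{-t} I_n(t) (I_n the modified Bessel function), followed by small-support central differences. This is an illustrative sketch, not the article's implementation; circular (wrap) boundaries are assumed so the convolutions commute exactly, which lets us check the stated property that differencing smoothed data gives exactly the same result as smoothing differenced data.

```python
import numpy as np
from scipy.special import ive      # exponentially scaled Bessel: ive(n, t) = exp(-t) * I_n(t)
from scipy.ndimage import convolve1d

def discrete_gaussian(t, radius):
    """Discrete analogue of the Gaussian, T(n; t) = e^{-t} I_n(t),
    truncated to [-radius, radius] and renormalised."""
    n = np.arange(-radius, radius + 1)
    kernel = ive(np.abs(n), t)
    return kernel / kernel.sum()

def conv(a, w, axis):
    # wrap (circular) boundaries so all convolutions commute exactly
    return convolve1d(a, w, axis=axis, mode="wrap")

rng = np.random.default_rng(0)
img = rng.random((64, 64))
g = discrete_gaussian(t=4.0, radius=12)
dx = np.array([0.5, 0.0, -0.5])    # small-support central difference

# Step (i): large-support smoothing (separable in y and x).
smoothed = conv(conv(img, g, 0), g, 1)

# Step (ii): small-support differencing of the smoothed data ...
Lx = conv(smoothed, dx, 1)
Ly = conv(smoothed, dx, 0)

# ... equals smoothing of the differenced data (operators commute).
Lx_alt = conv(conv(conv(img, dx, 1), g, 0), g, 1)
assert np.allclose(Lx, Lx_alt)

# Step (iii): a point operation computing a differential geometric entity.
grad_mag = np.sqrt(Lx**2 + Ly**2)
```

Step (iv), nearest-neighbour feature detection, would then scan `grad_mag` (or a higher-order entity) for local extrema or zero crossings.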
This article presents: (i) a multiscale representation of grey-level shape called the scale-space primal sketch, which makes explicit both features in scale-space and the relations between structures at different scales, (ii) a methodology for extracting significant blob-like image structures from this representation, and (iii) applications to edge detection, histogram analysis, and junction classification demonstrating how the proposed method can be used for guiding later-stage visual processes. The representation gives a qualitative description of image structure, which allows for detection of stable scales and associated regions of interest in a solely bottom-up data-driven way. In other words, it generates coarse segmentation cues, and can hence be seen as preceding further processing, which can then be properly tuned. It is argued that once such information is available, many other processing tasks can become much simpler. Experiments on real imagery demonstrate that the proposed theory gives intuitive results.
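The scale-space primal sketch itself is not reproduced here, but the idea of extracting significant blob-like structures together with a stable scale can be illustrated with a simpler, standard device: local extrema, over space and scale, of the scale-normalised Laplacian magnitude. All parameter values below are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, maximum_filter

def blob_candidates(img, sigmas, rel_thresh=0.1):
    """Blob candidates as local maxima, over space and scale, of the
    scale-normalised Laplacian magnitude sigma^2 * |LoG|."""
    stack = np.stack([s**2 * np.abs(gaussian_laplace(img.astype(float), s))
                      for s in sigmas])
    local_max = stack == maximum_filter(stack, size=3)
    keep = local_max & (stack > rel_thresh * stack.max())
    return [(sigmas[i], y, x) for i, y, x in zip(*np.nonzero(keep))]

# Synthetic image: a single Gaussian blob of width 4 centred at (32, 32).
yy, xx = np.mgrid[:64, :64]
img = np.exp(-((yy - 32.0)**2 + (xx - 32.0)**2) / (2 * 4.0**2))

cands = blob_candidates(img, sigmas=[2.0, 4.0, 8.0])
```

For this synthetic blob, the normalised response is maximal at the matching scale, so the candidate list contains the blob's position with the selected scale 4, a bottom-up "stable scale and region of interest" in miniature.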
MIN PB is the class of minimization problems whose objective functions are bounded by a polynomial in the size of the input. We show that there exist several problems which are MIN PB-complete with respect to an appro...
This work is concerned with the question of how a population of neurons responds to tonic and transient synaptic input from other similar populations. Because of the methodological problems involved in measuring and manipulating the firing properties of a large set of real neurons simultaneously, another strategy is used here: the experiments are carried out as a series of simulations using a population of realistic model neurons. The steady-state response of this particular model neuron is found to be similar to that used in abstract nonspiking models. The transient response, however, reveals that even though each individual neuron merely changes its firing frequency moderately, the population can respond quickly and with damped oscillations. These oscillations are due to spike synchronization caused by systematic phase shifts induced by synchronous changes in the input.
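As a much-simplified stand-in for the realistic model neurons used in the work, the mechanism can be sketched with a population of leaky integrate-and-fire units receiving a shared tonic input with a transient step, plus private noise; all parameter values are illustrative assumptions.

```python
import numpy as np

# Highly simplified stand-in for realistic model neurons: leaky
# integrate-and-fire units with a shared input and private noise.
rng = np.random.default_rng(1)
N, dt = 500, 0.1                      # population size, time step (ms)
tau, v_th, v_reset = 10.0, 1.0, 0.0   # membrane time constant (ms), spike threshold, reset
steps = 3000                          # 300 ms total

v = rng.random(N)                     # random initial membrane potentials
rate = np.zeros(steps)                # population firing rate (Hz)
for i in range(steps):
    I = 0.5 if i < 1000 else 1.5      # tonic input, stepped up at t = 100 ms
    v += dt / tau * (I - v) + 0.05 * np.sqrt(dt) * rng.standard_normal(N)
    spiked = v >= v_th
    v[spiked] = v_reset
    rate[i] = spiked.mean() / (dt * 1e-3)
```

Before the step the input is subthreshold and only noise drives occasional spikes; the synchronous step phase-aligns many units at once, so the population rate responds quickly and rings with damped oscillations before the private noise desynchronises the units again, even though each unit only shifts its own firing frequency.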
The dynamic behavior of cortical structures can change significantly in character under the influence of different types of neuromodulators. We simulate such effects in a neural network model of the olfactory cortex and analyze the resulting nonlinear dynamics of this system. The model uses simple network units and a network connectivity which closely resembles that of the real cortex. The input-output relation of populations of neurons is represented as a sigmoid function, with a single parameter determining threshold, slope, and amplitude of the curve. This parameter is taken to correspond to the level of neuromodulator and is correlated with the level of arousal of an animal. By varying this "gain parameter" we show that the model can give point attractor, limit cycle attractor, and strange chaotic or nonchaotic attractor behavior. We also display "transient chaos", which begins with chaos-like behavior but eventually goes to a limit cycle. We show how these complex dynamics are related to learning and associative memory in our system and discuss their biological significance.
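The single-parameter sigmoid can be illustrated with one form commonly used in Freeman-type olfactory models (the exact function of this model is not given in the abstract, so treat this as an assumed example): g(u) = Q(1 - exp(-(e^u - 1)/Q)). The one gain parameter Q jointly sets the amplitude (the curve saturates at Q), the maximal slope, and the threshold (the inflection sits near u = ln Q).

```python
import numpy as np

def freeman_sigmoid(u, Q):
    """Assumed Freeman-type single-parameter sigmoid: the gain Q jointly
    sets amplitude (saturates at Q), maximal slope, and threshold (~ln Q)."""
    return Q * (1.0 - np.exp(-(np.exp(u) - 1.0) / Q))

u = np.linspace(-2.0, 6.0, 2001)
for Q in (1.0, 5.0, 20.0):
    y = freeman_sigmoid(u, Q)
    slope = np.gradient(y, u)
    # amplitude approaches Q; the point of maximal slope sits near u = ln Q
    print(f"Q={Q}: amplitude~{y[-1]:.2f}, inflection~{u[slope.argmax()]:.2f}")
```

Because one parameter moves threshold, slope, and amplitude together, sweeping Q alone can carry the network through qualitatively different dynamical regimes, which is what makes it a natural stand-in for neuromodulator level.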
Probabilistic neural networks can approximate class conditional densities in optimal (Bayesian) pattern classifiers. In natural pattern recognition applications, the size of the training set is always limited, making the approximation task difficult. Invariance constraints can significantly simplify the task of density approximation. A technique is presented for learning invariant representations, based on a statistical approach to grounding invariance. An iterative method is developed formally for computing the maximum likelihood estimate of the parameters of an invariant mixture model. The method can be interpreted as a competitive training strategy for a radial basis function (RBF) network. It can be used for self-organizing formation of both invariant templates and features.
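The invariant mixture model itself is not specified in the abstract; as a minimal sketch of the iterative maximum-likelihood machinery, here is plain EM for a spherical Gaussian mixture (the non-invariant special case). The E-step acts as soft competition between RBF-like units and the M-step updates their weights, centres, and widths, which is the sense in which such a method reads as competitive RBF training.

```python
import numpy as np

def em_spherical_gmm(X, k, iters=50, seed=0):
    """Plain EM for a spherical Gaussian mixture (non-invariant special
    case): E-step = soft competition between RBF-like units, M-step =
    maximum-likelihood update of weights, centres, and widths."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Greedy farthest-point initialisation of the centres.
    mu = [X[rng.integers(n)]]
    for _ in range(k - 1):
        d2min = ((X[:, None] - np.array(mu)[None])**2).sum(-1).min(1)
        mu.append(X[d2min.argmax()])
    mu = np.array(mu)
    var = np.full(k, X.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities (soft competition between units).
        d2 = ((X[:, None] - mu[None])**2).sum(-1)
        logp = np.log(pi) - 0.5 * d * np.log(var) - d2 / (2 * var)
        r = np.exp(logp - logp.max(1, keepdims=True))
        r /= r.sum(1, keepdims=True)
        # M-step: maximum-likelihood parameter updates.
        nk = r.sum(0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        d2 = ((X[:, None] - mu[None])**2).sum(-1)
        var = (r * d2).sum(0) / (d * nk)
    return pi, mu, var

# Two well-separated spherical clusters; EM places one centre in each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (200, 2)),
               rng.normal(4.0, 0.3, (200, 2))])
pi, mu, var = em_spherical_gmm(X, k=2)
```

An invariant variant would, roughly, replace the Euclidean distance `d2` with a distance minimised over a transformation group (e.g. shifts or rotations of a template), so each unit learns an invariant template rather than a single point.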
We describe a robot vision system that achieves complex object recognition with two layers of behaviors, performing the tasks of planning and object recognition, respectively. The recognition layer is a pipeline in wh...