Summary form only given. The authors have developed a neural network which consists of cooperatively interconnected Grossberg on-center off-surround subnets and which can be used to optimize a function related to the log likelihood function for decoding convolutional codes or for more general FIR signal deconvolution problems. Connections in the network are confined to neighboring subnets, and the network is representative of the types of networks which lend themselves to VLSI implementation. Analytical and experimental results for convergence and stability of the network have been found. The structure of the network can be used for distributed representation of data items while allowing for fault tolerance and replacement of faulty units.
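The abstract does not reproduce the subnet dynamics. For orientation, the sketch below simulates a single Grossberg on-center off-surround shunting subnet with illustrative parameters; the cooperative coupling between subnets that carries the decoding metric is not modelled, and the decay rate, activity ceiling, and feedback function are assumptions rather than values from the paper.

import numpy as np

def shunting_subnet_step(x, I, dt=0.01, A=1.0, B=1.0,
                         f=lambda v: np.maximum(v, 0.0) ** 2):
    # One Euler step of a shunting on-center off-surround subnet:
    #   dx_i/dt = -A*x_i + (B - x_i)*(I_i + f(x_i)) - x_i * sum_{j!=i} (I_j + f(x_j))
    s = I + f(x)                     # excitatory signal from each node
    total = s.sum()
    dx = -A * x + (B - x) * s - x * (total - s)
    return x + dt * dx

# Example: a 4-node subnet converging to a contrast-enhanced version of its input
# (the inputs here stand in for hypothetical branch metrics).
x = np.zeros(4)
I = np.array([0.2, 1.0, 0.4, 0.1])
for _ in range(2000):
    x = shunting_subnet_step(x, I)
print(np.round(x, 3))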
The growth and development of our goals in space utilization will reach out to utilize every possible technology. Artificial Neural Systems (ANS) are a newly emerging technology which has already indicated a potential solution to many space engineering problems. A particularly interesting feature of ANSs is their ability to construct vital generalizations or inferences from sample data without the need for conventional programming. In order to evaluate ANSs, the Artificial Intelligence Section is conducting several initial projects: implementing ANSs, developing a dedicated ANS workstation, and developing applications to assist with the migration of this technology. This paper describes the neural net based speech synthesis project. The novelty is that the reproduced speech was extracted from actual voice recordings. In essence, the neural network (NN) learns the timing, pitch fluctuations, connectivity between individual sounds, and speaking habits unique to that person. The parallel distributed processing network used for this project is the generalized backward propagation network, which has been modified to also learn sequences of actions or states given a particular plan.
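The abstract gives no architectural details. The sketch below shows one common way a back-propagation network can be made to reproduce a sequence: train it to predict the next parameter frame from the current one, then run it on its own output. The synthetic frames, layer sizes, and learning rate are all assumptions for illustration and are not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical speech-parameter frames (e.g., pitch/energy features per frame);
# real recordings are not available here, so synthetic data is used.
T, D, H = 200, 8, 32
frames = np.sin(np.linspace(0, 20, T)[:, None] * rng.uniform(0.5, 2.0, D))

W1 = rng.normal(0, 0.1, (D, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, D)); b2 = np.zeros(D)
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

# Back-propagation on the frame-prediction task: frame[t] -> frame[t+1].
for epoch in range(500):
    h, y = forward(frames[:-1])
    err = y - frames[1:]
    gW2 = h.T @ err / len(err); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = frames[:-1].T @ dh / len(err); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# Replay: feed the net its own predictions to regenerate the learned sequence.
x = frames[0]
for _ in range(5):
    _, y = forward(x[None, :])
    x = y[0]
print(np.round(x, 3))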
A description is given of a parallel distributed processing (PDP) network simulator that is implemented on a multiprocessor transputer system. Two approaches are considered for the implementation of the simulator: an array of processors onto which the distributed network is mapped, and a pipeline of processors that perform the calculations for a single unit. The performance of the chosen topology is compared for several network models and sizes, and the predicted performance is calculated for a simulator with a greater number of processors. The simulations show that it is a communication-dominated system. One method that would improve performance would be to allocate natural groupings of units to individual processors, with only a small number of links connecting any two units on separate processors. The results show that the simulator is useful for analyzing PDP networks that are sparsely connected or have a structure which forms dense groups.
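To make the communication argument concrete, the sketch below counts cross-processor links for two hypothetical ways of assigning units to processors: a round-robin mapping that ignores network structure, and a mapping that places each dense group of units on its own processor. The network, group sizes, and processor count are assumptions, not figures from the paper.

import itertools
import random

random.seed(1)

# Hypothetical unit-level network with dense groups and sparse links between
# groups (the structure the paper says suits the simulator best).
GROUPS, UNITS_PER_GROUP, PROCESSORS = 8, 16, 8
units = list(range(GROUPS * UNITS_PER_GROUP))
edges = []
for g in range(GROUPS):
    members = range(g * UNITS_PER_GROUP, (g + 1) * UNITS_PER_GROUP)
    edges += list(itertools.combinations(members, 2))      # dense within a group
for _ in range(40):                                        # sparse between groups
    edges.append((random.choice(units), random.choice(units)))

def cross_processor_links(assign):
    # Links whose endpoints sit on different processors, i.e. the traffic
    # the transputer links must carry on every update cycle.
    return sum(assign[a] != assign[b] for a, b in edges)

round_robin = {u: u % PROCESSORS for u in units}           # ignores structure
grouped = {u: u // UNITS_PER_GROUP for u in units}         # one group per processor

print("round-robin mapping:", cross_processor_links(round_robin))
print("grouped mapping:    ", cross_processor_links(grouped))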
The authors investigate the pattern-processing capabilities of associative networks which consist of a large number of nodes (processors) communicating with each other through connections. Nodes and connections correspond to neurons and synapses, respectively, in a neural network. In the model studied here, the nodes are of an on/off TLU (threshold logic unit) type, and a node being on corresponds to a neuron that fires a train of nerve impulses. Information is stored in the distributed connectivity of the network. The strength of connections between nodes may change according to learning rules in response to input to the system ('experience'). There are also mechanisms for the formation of complex nodes (feature detectors) that respond to combinations of activity in simple nodes. The associative network model presented demonstrates good performance in storage capacity tasks and also solves the XOR, PARITY, and ADDER problems satisfactorily.
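The abstract notes that complex nodes responding to combinations of simple-node activity let the network solve XOR. The hand-wired sketch below illustrates that representation with threshold logic units; the weights are chosen by hand purely for illustration and do not reproduce the paper's learning rules.

def tlu(inputs, weights, threshold):
    # On/off threshold logic unit: fires (1) when the weighted input
    # activity reaches the threshold, stays off (0) otherwise.
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def xor(a, b):
    # A "complex node" detects the combination (a AND b); the output node
    # fires for (a OR b) unless the complex node is active.
    complex_node = tlu([a, b], [1, 1], 2)        # feature detector for a AND b
    return tlu([a, b, complex_node], [1, 1, -2], 1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))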
Two major aspects of network models are considered. The first involves network topology, i.e., the particular structure of the graphs that underlie a network model. The second is the effect of distributed failures in a network of processing nodes. A special type of network topology that has been studied extensively by the author is used. It is based on the fab graph, which is a type of directed graph with labels on its nodes and edges. Graphical folding transformations are applied to a fab graph to produce a graphical topology; each folding transformation changes one fab graph into another. The variety of graphical topologies that result, and their organization, are described. In applications, the goal is to find the best of the graphical topologies that can be generated from folding. The author shows some example fab graphs that arise from applications to large-scale queuing networks. These applications come from simulating complex manufacturing areas.
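The abstract does not define the folding transformation precisely. As a purely illustrative reading, the sketch below represents a small labelled directed graph and folds it by merging all nodes that share a label, turning one labelled graph into another, smaller one. The node and edge labels (machines and buffers, as in a manufacturing simulation) are invented for the example.

# A labelled directed graph in the spirit of the paper's fab graph:
# nodes carry labels, edges carry labels.
nodes = {"m1": "machine", "m2": "machine", "b1": "buffer", "b2": "buffer"}
edges = [("m1", "b1", "put"), ("b1", "m2", "get"), ("m2", "b2", "put")]

def fold_by_label(nodes, edges):
    # Merge every node with the same label into a single node; an edge between
    # two original nodes becomes an edge between their merged labels.
    folded_nodes = sorted(set(nodes.values()))
    folded_edges = sorted({(nodes[src], nodes[dst], lab) for src, dst, lab in edges})
    return folded_nodes, folded_edges

print(fold_by_label(nodes, edges))
# Two folded nodes remain, and the two distinct labelled edges between them.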
We have developed a neural network which consists of cooperatively interconnected Grossberg on-center off-surround subnets and which can be used to optimize a function related to the log likelihood function for decoding convolutional codes or more general FIR signal deconvolution problems. Connections in the network are confined to neighboring subnets, and it is representative of the types of networks which lend themselves to VLSI implementation. Analytical and experimental results for convergence and stability of the network have been found. The structure of the network can be used for distributed representation of data items while allowing for fault tolerance and replacement of faulty units.
Distributed processing in a network of computational cells which realise simple Boolean functions is used as a model for biological processing in neural cell assemblies. Recent work in establishing the criteria for determining stable behavior is briefly summarized and the implications in characterizing an adaptivity mechanism for such a network are discussed. Empirical results are presented to indicate the deficiencies of typical adaptation algorithms in ensuring stability, and the paper finally points the discussion toward a theory of adaptation which emphasizes topological features of a network rather than individual cellular functions.
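The abstract does not specify the cells or the stability criteria. The sketch below sets up a small random network of Boolean cells, updates it synchronously, and reports when the global state falls into a cycle, which is one simple operational reading of stable behaviour; the network size, fan-in, and random cell functions are assumptions for illustration.

import random

random.seed(2)

# A small network of cells, each realising a simple Boolean function
# of K other cells' outputs (wiring and functions chosen at random).
N, K = 12, 3
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    # Synchronous update: each cell applies its Boolean function to the
    # current outputs of its input cells.
    new = []
    for cell in range(N):
        idx = sum(state[j] << bit for bit, j in enumerate(inputs[cell]))
        new.append(tables[cell][idx])
    return tuple(new)

# Iterate until a previously seen global state recurs and report the cycle.
state = tuple(random.randint(0, 1) for _ in range(N))
seen = {}
t = 0
while state not in seen:
    seen[state] = t
    state = step(state)
    t += 1
print("cycle of length", t - seen[state], "reached after", seen[state], "steps")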
Primate vision with its system of interacting subsystems is a prime example of a distributed processing neural network. The available neurophysiological evidence indicates that the functional subsystem concerned with optimal pattern vision sequentially filters the neural image corresponding to retinal input, and the filter functions can be described using linear systems analysis.
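As an illustration of the linear systems description, the sketch below applies a standard difference-of-Gaussians receptive-field profile, used here only as a stand-in for the filter functions the abstract refers to, to a one-dimensional luminance edge in two sequential stages.

import numpy as np

# Illustrative linear-systems view of one filtering stage: a classic
# difference-of-Gaussians profile (widths chosen arbitrarily).
x = np.linspace(-5, 5, 201)
dog = np.exp(-x ** 2 / (2 * 0.5 ** 2)) - 0.5 * np.exp(-x ** 2 / (2 * 1.5 ** 2))

# A one-dimensional luminance edge as the "retinal input".
stimulus = (x > 0).astype(float)

# Sequential filtering = repeated convolution; two stages shown.
stage1 = np.convolve(stimulus, dog, mode="same")
stage2 = np.convolve(stage1, dog, mode="same")
print("peak response, stage 1:", round(stage1.max(), 3))
print("peak response, stage 2:", round(stage2.max(), 3))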