We present discrete stochastic optimization algorithms that adaptively learn the Nernst potential in membrane ion channels. The proposed algorithms dynamically control both the ion channel experiment and the resulting Hidden Markov Model (HMM) signal processor and can adapt to time-varying behaviour of ion channels. One of the most important properties of the proposed algorithms is their self-learning capability: they spend most of the computational effort at the global optimizer (the Nernst potential). Numerical examples illustrate the performance of the algorithms on computer-generated synthetic data.
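The paper's own estimator is not reproduced here; the following is a minimal sketch of a generic discrete stochastic comparison scheme over a finite grid of clamp potentials, illustrating the self-learning property described above (the chain's occupation counts concentrate at the global optimizer). The objective `noisy_cost`, the constants, and the grid are assumptions for a toy example, not the paper's HMM-based signal processor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical noisy objective: squared mean current measured when the membrane
# is clamped at `potential_mv`; its expectation is minimised at the Nernst potential.
TRUE_NERNST_MV = -85.0

def noisy_cost(potential_mv, n_samples=50):
    current = 0.02 * (potential_mv - TRUE_NERNST_MV) + rng.normal(0.0, 0.5, n_samples)
    return np.mean(current) ** 2

# Finite candidate grid of clamp potentials (discrete search space, assumed).
candidates = np.arange(-120.0, -40.0, 5.0)

counts = np.zeros(len(candidates))          # occupation counts per candidate
state = rng.integers(len(candidates))       # current candidate index

for _ in range(2000):
    # Propose a random neighbouring candidate and keep whichever looks better
    # on a fresh batch of noisy measurements.
    proposal = min(max(state + rng.choice([-1, 1]), 0), len(candidates) - 1)
    if noisy_cost(candidates[proposal]) < noisy_cost(candidates[state]):
        state = proposal
    counts[state] += 1                      # the chain spends most of its time near the optimizer

estimate = candidates[np.argmax(counts)]
print(f"Estimated Nernst potential: {estimate:.1f} mV")
```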
In this article we present an application of Kalman filtering in Artificial Intelligence, where nonlinear Kalman filters were used as learning algorithms for feed-forward neural networks. In the first part of this a...
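The abstract is truncated here. As one hedged illustration of the general idea (not necessarily the authors' formulation), the weights of a small feed-forward network can be treated as the state of an extended Kalman filter, with the network output as the noisy measurement. The toy network, the noise variances `R` and `Q`, and the sin(x) regression task below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def net(w, x):
    """Tiny 1-hidden-layer network with 3 tanh units and scalar output."""
    W1, b1, W2, b2 = w[:3], w[3:6], w[6:9], w[9]
    h = np.tanh(W1 * x + b1)
    return float(W2 @ h + b2)

def jacobian(w, x, eps=1e-6):
    """Numerical Jacobian of the network output with respect to the weights."""
    f0 = net(w, x)
    J = np.zeros_like(w)
    for i in range(len(w)):
        wp = w.copy()
        wp[i] += eps
        J[i] = (net(wp, x) - f0) / eps
    return J

# EKF state: weight vector w with covariance P; R, Q are assumed noise variances.
w = rng.normal(0.0, 0.3, 10)
P = np.eye(10)
R, Q = 0.05, 1e-5

# Train on a toy regression task y = sin(x).
for _ in range(20):
    for x in np.linspace(-2.0, 2.0, 40):
        y = np.sin(x)
        H = jacobian(w, x)                          # measurement Jacobian (1 x n)
        S = H @ P @ H + R                           # innovation variance (scalar)
        K = (P @ H) / S                             # Kalman gain
        w = w + K * (y - net(w, x))                 # weight (state) update
        P = P - np.outer(K, H @ P) + Q * np.eye(10) # covariance update with process noise

print("fit at x=1.0:", net(w, 1.0), "target:", np.sin(1.0))
```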
Semi-supervised learning (SSL) provides a powerful framework for leveraging unlabeled data when labels are limited or expensive to obtain. Approaches based on deep neural networks have recently proven successful on st...
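The abstract is truncated here. The sketch below is a minimal self-training loop, a common SSL baseline rather than the deep-network approach the abstract refers to; the choice of classifier, `threshold`, and `rounds` is arbitrary.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_lab, y_lab, X_unlab, threshold=0.95, rounds=5):
    """Self-training: repeatedly add high-confidence pseudo-labels to the labeled pool."""
    X, y = X_lab.copy(), np.asarray(y_lab).copy()
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    for _ in range(rounds):
        if len(X_unlab) == 0:
            break
        proba = clf.predict_proba(X_unlab)
        confident = proba.max(axis=1) >= threshold
        if not confident.any():
            break
        # Move confident unlabeled points into the labeled pool with their pseudo-labels.
        pseudo = clf.classes_[proba[confident].argmax(axis=1)]
        X = np.vstack([X, X_unlab[confident]])
        y = np.concatenate([y, pseudo])
        X_unlab = X_unlab[~confident]
        clf = LogisticRegression(max_iter=1000).fit(X, y)
    return clf
```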
ISBN: (Print) 9781581130577
Most of the work which attempts to give bounds on the generalization error of the hypothesis generated by a learning algorithm is based on methods from the theory of uniform convergence. These bounds are a priori bounds that hold for any distribution of examples and are calculated before any data is observed. In this paper we propose a different approach for bounding the generalization error after the data has been observed. A self-bounding learning algorithm is an algorithm which, in addition to the hypothesis that it outputs, outputs a reliable upper bound on the generalization error of this hypothesis. We first explore the idea in the statistical query learning framework of Kearns. After that we give an explicit self-bounding algorithm for learning algorithms that are based on local search.
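As a hedged illustration of the interface only (not the paper's statistical-query or local-search construction), a self-bounding learner returns both a hypothesis and a data-dependent upper bound on its error. In the sketch below the bound is a simple held-out Hoeffding bound with confidence parameter `delta`; the classifier and split ratio are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def self_bounding_fit(X, y, delta=0.05):
    """Return (hypothesis, data-dependent upper bound on its generalization error).

    Illustrative only: the bound here is a held-out Hoeffding bound, not the
    paper's construction, which bounds the learner trained on all the data.
    """
    X_tr, X_ho, y_tr, y_ho = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
    holdout_err = np.mean(clf.predict(X_ho) != y_ho)
    # With probability >= 1 - delta over the held-out sample:
    #   true error <= holdout error + sqrt(ln(1/delta) / (2 m)).
    bound = holdout_err + np.sqrt(np.log(1.0 / delta) / (2.0 * len(y_ho)))
    return clf, bound
```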
Machine learning algorithms can be viewed as stochastic transformations that map training data to hypotheses. Following Bousquet and Elisseeff, we say that such an algorithm is stable if its output does not depend too...
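For reference, one common formalization in the spirit of Bousquet and Elisseeff is uniform stability: an algorithm $A$ is $\beta$-uniformly stable if removing any single training example changes the loss at every point by at most $\beta$ (the truncated abstract may work with a variant of this definition):

\[
\sup_{z}\,\bigl|\ell(A_{S},z)-\ell(A_{S^{\setminus i}},z)\bigr|\;\le\;\beta
\qquad \text{for all } S\in Z^{m},\ i\in\{1,\dots,m\}.
\]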
Traffic crashes are a severe issue worldwide, being a root cause of numerous deaths, injuries, and disabilities, as well as substantial financial losses every year. An effective model to deduce the severity of...
This research investigates road sign recognition using deep learning methods, comparing them to traditional approaches and emphasizing the potential of simplified convolutional neural networks. Evaluations were conduc...
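The excerpt does not spell out the architecture; the following is a generic sketch of a simplified convolutional network for 32x32 RGB sign crops, where the layer sizes and the placeholder class count `num_classes=43` are assumptions.

```python
import torch
import torch.nn as nn

class SimpleSignCNN(nn.Module):
    """Generic small CNN for 32x32 RGB road-sign crops (architecture assumed)."""
    def __init__(self, num_classes: int = 43):   # 43 is a placeholder class count
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 8 -> 4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Shape check on a dummy batch.
model = SimpleSignCNN()
print(model(torch.randn(8, 3, 32, 32)).shape)   # torch.Size([8, 43])
```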
In this paper we present our approach to solving the PAN 2016 Author Profiling Task. It involves classifying users' gender and age using social media posts. We used SVM classifiers and neural networks on TF-IDF and verbosity features. Results showed that SVM classifiers are better for English datasets and neural networks perform better for Dutch and Spanish datasets.
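Below is a hedged sketch of the kind of pipeline described (TF-IDF features combined with simple verbosity statistics and fed to a linear SVM), not the authors' exact configuration; the `verbosity_features` helper, the n-gram range, and the toy data are assumptions.

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

def verbosity_features(docs):
    """Hypothetical verbosity features: word count and mean word length per document."""
    rows = [[len(d.split()), np.mean([len(w) for w in d.split()] or [0])] for d in docs]
    return csr_matrix(np.array(rows, dtype=float))

docs = ["short post", "a much longer and more verbose social media post about everything"]
labels = ["female", "male"]   # placeholder gender labels

tfidf = TfidfVectorizer(ngram_range=(1, 2), min_df=1)
X = hstack([tfidf.fit_transform(docs), verbosity_features(docs)])
clf = LinearSVC().fit(X, labels)

new = ["another short post"]
print(clf.predict(hstack([tfidf.transform(new), verbosity_features(new)])))
```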
Text classification is an essential and well-known task in Natural Language Processing (NLP), a discipline of Artificial Intelligence. Because of the abundance of textual documents in Bangla, text classifi...
With the rapid growth of computer technology and the arrival of the era of big data, people have increasingly demanding requirements for information processing speed. Therefore, this design adopted a programming approach based on Pyth...