Complex learning rate schedules have become an integral part of deep learning. We find empirically that common fine-tuned schedules decay the learning rate after the weight norm bounces. This leads to the proposal of ...
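As a rough illustration of the kind of schedule described here, the sketch below monitors the weight norm during training and decays the learning rate once the norm stops shrinking and starts growing again (a "bounce"). The decay factor and patience are assumed values, not taken from the paper.

```python
import torch

def weight_norm(model):
    """L2 norm of all parameters, used as the bounce signal."""
    return sum(p.detach().norm() ** 2 for p in model.parameters()).sqrt().item()

class BounceDecaySchedule:
    """Illustrative schedule (assumed hyperparameters): decay the LR
    once the weight norm has been rising for `patience` steps."""
    def __init__(self, optimizer, decay=0.1, patience=3):
        self.optimizer = optimizer
        self.decay = decay
        self.patience = patience
        self.prev_norm = float("inf")
        self.rising_steps = 0

    def step(self, model):
        norm = weight_norm(model)
        # Count consecutive steps on which the weight norm grew.
        self.rising_steps = self.rising_steps + 1 if norm > self.prev_norm else 0
        if self.rising_steps >= self.patience:
            for group in self.optimizer.param_groups:
                group["lr"] *= self.decay
            self.rising_steps = 0
        self.prev_norm = norm
```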
In real-world applications, the process generating the data might suffer from nonstationary effects (e.g., due to seasonality, faults affecting sensors or actuators, and changes in the users' behaviour). These cha...
In this paper we present a novel training algorithm for Spiking Neural Networks, which we call SNN-ART. Typically, supervised learning is used to train SNNs. Error-based learning is based on reducing the t...
The teaching–learning-based optimization algorithm (TLBO) is an efficient optimizer. However, it has several shortcomings such as premature convergence and stagnation at local optima. In this paper, the strengthened ...
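For context, the sketch below implements the teacher phase of the canonical TLBO update, in which learners move toward the best solution (the teacher) relative to the class mean with a random teaching factor. It is not the strengthened variant the paper proposes; the objective function and population layout are assumptions.

```python
import numpy as np

def tlbo_teacher_phase(population, fitness, objective):
    """One teacher phase of canonical TLBO on a (pop_size, dim) array.
    Candidates are accepted greedily if they improve the objective."""
    teacher = population[np.argmin(fitness)]
    mean = population.mean(axis=0)
    new_pop, new_fit = population.copy(), fitness.copy()
    for i in range(len(population)):
        tf = np.random.randint(1, 3)              # teaching factor, 1 or 2
        r = np.random.rand(population.shape[1])   # per-dimension step
        candidate = population[i] + r * (teacher - tf * mean)
        f = objective(candidate)
        if f < fitness[i]:                        # greedy acceptance
            new_pop[i], new_fit[i] = candidate, f
    return new_pop, new_fit
```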
We study learning Censor Markov Random Fields (abbreviated CMRFs). These are Markov Random Fields where some of the nodes are censored (not observed). We present an algorithm for learning high temperature CMRFs within...
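As a toy illustration of the censoring in the definition above, the brute-force sketch below computes the marginal probability of the observed nodes in a small Ising-style MRF by summing out the censored nodes. The coupling matrix and inverse temperature are assumptions, and this is not the paper's learning algorithm.

```python
import itertools
import numpy as np

def censored_marginal(J, observed_idx, x_obs, beta=0.5):
    """Marginal probability of observed spins x_obs at the indices
    observed_idx in a tiny Ising model with couplings J, obtained by
    summing out the censored spins exhaustively (only feasible for
    small n).  beta is an assumed (high-temperature) inverse temperature."""
    n = J.shape[0]

    def weight(x):
        return np.exp(beta * x @ J @ x)

    num, Z = 0.0, 0.0
    for full in itertools.product([-1.0, 1.0], repeat=n):
        x = np.array(full)
        Z += weight(x)
        if all(x[i] == v for i, v in zip(observed_idx, x_obs)):
            num += weight(x)
    return num / Z
```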
Machine learning techniques are becoming a fundamental tool for scientific and engineering progress. These techniques are applied in contexts as diverse as astronomy and spam filtering. However, correctly applying the...
This paper considers the problem of decentralized, personalized federated learning. For centralized personalized federated learning, a penalty that measures the deviation between the local model and its average is often...
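A minimal sketch of such a penalized local objective, assuming a PyTorch model and a quadratic penalty (lam/2)·||w_i − w̄||² toward the averaged parameters; the hyperparameters lam and lr are illustrative, not the paper's.

```python
import torch

def personalized_local_step(local_model, average_model, batch, loss_fn,
                            lam=0.1, lr=0.01):
    """One local SGD step on f_i(w_i) + (lam/2) * ||w_i - w_bar||^2,
    where w_bar are the averaged (global) parameters."""
    inputs, targets = batch
    loss = loss_fn(local_model(inputs), targets)
    # Quadratic penalty pulling the local weights toward the average.
    for w, w_bar in zip(local_model.parameters(), average_model.parameters()):
        loss = loss + 0.5 * lam * (w - w_bar.detach()).pow(2).sum()
    loss.backward()
    with torch.no_grad():
        for w in local_model.parameters():
            w -= lr * w.grad
            w.grad = None
    return loss.item()
```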
Trusted execution environments (TEEs) such as Intel SGX facilitate the secure execution of an application on untrusted machines. Sadly, such environments suffer from serious limitations and performance overheads in te...
When minimizing the empirical risk in binary classification, it is a common practice to replace the zero-one loss with a surrogate loss to make the learning objective feasible to optimize. Examples of well-known surro...
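For concreteness, the snippet below compares the zero-one loss with two common convex surrogates (hinge and logistic) as functions of the margin y·f(x); these are standard examples, not necessarily the surrogates studied in the paper.

```python
import numpy as np

def zero_one_loss(margin):
    """The target loss: 1 if the margin y*f(x) is non-positive."""
    return (margin <= 0).astype(float)

def hinge_loss(margin):
    """Convex surrogate used by SVMs: max(0, 1 - y*f(x))."""
    return np.maximum(0.0, 1.0 - margin)

def logistic_loss(margin):
    """Smooth convex surrogate used by logistic regression: log(1 + exp(-y*f(x)))."""
    return np.log1p(np.exp(-margin))

margins = np.array([-1.0, 0.0, 0.5, 2.0])
print(zero_one_loss(margins))   # [1. 1. 0. 0.]
print(hinge_loss(margins))      # [2.  1.  0.5 0. ]
print(logistic_loss(margins))   # smooth upper bound on the zero-one loss
```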
Labels are costly and sometimes unreliable. Noisy label learning, semi-supervised learning, and contrastive learning are three different strategies for designing learning processes requiring less annotation cost. Semi...