ISBN:
(Print) 9783031755422; 9783031755439
Deep neural networks (DNNs), particularly convolutional neural networks (CNNs), have garnered significant attention in recent years for addressing a wide range of challenges in image processing and computer vision. Neural architecture search (NAS) has emerged as a crucial field aiming to automate the design and configuration of CNN models. In this paper, we propose a novel strategy to speed up the performance estimation of neural architectures by gradually increasing the size of the training set used for evaluation as the search progresses. We evaluate this approach using the CGP-NASV2 model, a multi-objective NAS method, on the CIFAR-100 dataset. Experimental results demonstrate a notable acceleration in the search process, achieving a speedup of 4.6 times compared to the baseline. Despite using limited data in the early stages, our proposed method effectively guides the search towards competitive architectures. This study highlights the efficacy of leveraging lower-fidelity estimates in NAS and paves the way for further research into accelerating the design of efficient CNN architectures.
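To make the idea concrete, here is a minimal Python sketch of evaluating candidates on progressively larger training subsets as the generations advance. The linear growth schedule, the function names, and the placeholder fitness are our own assumptions for illustration, not the actual CGP-NASV2 implementation.

```python
import random

def subset_fraction(generation, max_generations, start=0.1, end=1.0):
    """Fraction of the training set used at a given generation.

    Linear growth is an assumed schedule; the paper's exact schedule may differ.
    """
    t = generation / max(1, max_generations - 1)
    return start + (end - start) * t

def evaluate(architecture, train_set, fraction):
    """Low-fidelity fitness estimate: train and score on a random subset.

    `architecture` and the score are placeholders; in practice this would
    train a CNN decoded from the genotype and return validation accuracy.
    """
    k = max(1, int(len(train_set) * fraction))
    subset = random.sample(train_set, k)
    return sum(subset) / k  # placeholder score

if __name__ == "__main__":
    train_set = list(range(50_000))          # stand-in for CIFAR-100 images
    population = [f"arch_{i}" for i in range(8)]
    max_generations = 20
    for g in range(max_generations):
        frac = subset_fraction(g, max_generations)
        scores = {a: evaluate(a, train_set, frac) for a in population}
        # ...selection, crossover, and mutation would follow here...
        print(f"gen {g:02d}: using {frac:.0%} of the training data")
```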
Evolutionary neural architecture search (ENAS) algorithms benefit from the non-convex optimization capability of evolutionary computation. However, most ENAS algorithms use a genetic algorithm (GA) with fixed parameter settings, which limits the algorithm's search performance. Furthermore, the information generated during the evolutionary process is often ignored, even though it can help guide the evolutionary direction of the population. This paper proposes a novel ENAS algorithm based on adaptive parameter control and gene potential contribution (AG-ENAS) to evolve neural networks efficiently. Firstly, an adaptive parameter adjustment mechanism is designed based on population diversity and fitness, enabling better-informed adaptation of the genetic operators' parameters. Secondly, a mutation operator guided by the potential contribution of genes tends to produce better offspring. The gene potential contribution reflects the positive effect of the current gene on fitness; it guides evolution by weighting the more valuable genes with a distribution index matrix. Finally, the concept of aging is introduced into environmental selection to offer more opportunities to the younger generation and alleviate premature convergence. The proposed algorithm has been evaluated on eight different datasets and compared against 44 state-of-the-art algorithms from the literature. The experimental results show that the networks designed by AG-ENAS obtain higher classification accuracy than manually designed networks such as SENet and DenseNet, as well as other ENAS algorithms such as Large-Evo and AE-CNN.
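As a rough illustration of adaptive parameter control driven by population diversity and fitness (the diversity measure and update rules below are assumptions, not the formulas used in AG-ENAS), mutation and crossover rates might be adjusted as follows:

```python
import statistics

def population_diversity(genomes):
    """Mean pairwise Hamming distance, normalised to [0, 1].

    A simple diversity proxy; AG-ENAS may use a different measure.
    """
    n, length = len(genomes), len(genomes[0])
    total, pairs = 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += sum(a != b for a, b in zip(genomes[i], genomes[j]))
            pairs += 1
    return total / (pairs * length) if pairs else 0.0

def adapt_rates(genomes, fitnesses, base_mut=0.05, base_cx=0.9):
    """Raise mutation when the population converges, lower it when diverse.

    The linear update rules are assumptions for illustration only.
    """
    div = population_diversity(genomes)
    spread = statistics.pstdev(fitnesses) / (abs(statistics.mean(fitnesses)) + 1e-9)
    mutation_rate = min(0.5, base_mut * (1.0 + (1.0 - div)))  # more mutation when diversity is low
    crossover_rate = min(1.0, base_cx * (0.8 + 0.2 * div))    # slightly more crossover when diverse
    return mutation_rate, crossover_rate, div, spread

if __name__ == "__main__":
    genomes = [[0, 1, 1, 0, 2], [0, 1, 1, 0, 2], [1, 1, 0, 0, 2], [0, 2, 1, 0, 1]]
    fitnesses = [0.71, 0.72, 0.69, 0.74]
    print(adapt_rates(genomes, fitnesses))
```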
Recently, neural architecture search (NAS) has gained a lot of attention as a tool for constructing deep neural networks automatically. NAS methods have successfully found convolutional neural networks (CNNs) that exceed human expert-designed networks on image classification in computer vision. However, there are growing demands for semantic segmentation in several areas, including remote sensing image analysis. In this paper, we introduce an evolutionary NAS method for semantic segmentation of high-resolution aerial images. The proposed method leverages the complementary strengths of gene expression programming and cellular encoding to develop an encoding scheme, called symbolic linear generative encoding (SLGE), for evolving cells (directed acyclic graphs) as building blocks to construct modularized encoder-decoder CNNs via an evolutionary process. SLGE can evolve cells with multi-branch and shortcut connections, similar to Inception-ResNet-like modules, which can improve training and inference performance in deep neural networks. In experiments, we demonstrate the effectiveness of the proposed method on the challenging ISPRS Vaihingen, Potsdam and UAVid semantic segmentation benchmarks. Compared with recent state-of-the-art systems, our network, dubbed SLGENet, improves the overall accuracy on Vaihingen and Potsdam, and achieves a competitive overall accuracy on UAVid using fewer parameters. Our method achieves promising results in only 2.5 GPU days.
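A minimal sketch of the cell-as-DAG idea follows: a linear genotype is decoded into a graph that can contain multi-branch and shortcut connections. The genotype layout and the toy operation set are illustrative assumptions rather than the actual SLGE grammar.

```python
# Toy stand-ins for real tensor operations; in practice these would be
# convolution, identity, etc. applied to feature maps.
OPS = {
    "conv3x3": lambda xs: sum(xs) + 1,
    "conv1x1": lambda xs: sum(xs) + 0.5,
    "identity": lambda xs: sum(xs),   # enables shortcut connections
}

def decode_and_run(genotype, cell_input=1.0):
    """Evaluate a cell DAG encoded as [(op_name, [input_node_ids]), ...].

    Node 0 is the cell input; node i (i >= 1) is the i-th gene's output.
    Multi-branch structure arises when several genes read the same node.
    """
    values = [cell_input]
    for op_name, inputs in genotype:
        assert all(i < len(values) for i in inputs), "inputs must be earlier nodes (DAG)"
        values.append(OPS[op_name]([values[i] for i in inputs]))
    return values[-1]   # last node is the cell output

if __name__ == "__main__":
    # two branches off the input, merged by a 1x1 conv, plus a shortcut
    cell = [
        ("conv3x3", [0]),
        ("conv1x1", [0]),
        ("conv1x1", [1, 2]),
        ("identity", [0, 3]),   # shortcut from the cell input to the output
    ]
    print(decode_and_run(cell))
```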
Evolutionary neural network architecture search (ENAS) has attracted the attention of many experts due to its global optimization capability to automatically search for convolutional neural network architectures based on the target task. The current search space for ENAS is not to design a fully structured network, but to search for smaller cell architectures to reduce search costs. However, blind search strategies do not effectively utilize the potential experience of the population. In order to utilize the potential experience learned by the current population to guide its evolutionary search, we propose a similarity-guided neural network architecture search algorithm based on cell architectures (SAGNAS), which utilizes the similarity between pairwise architectures in the population as empirical knowledge learned by the population. Our algorithm provides a novel method for calculating architecture similarity, computing it separately for the cell and the macro-structure. We then decouple the connections and operations in the cell and calculate connection and operation similarity separately. In addition, we propose adaptive similarity selection and binary tournament selection strategies to enhance the algorithm's global and local search capabilities and effectively explore the search space. Finally, we design an improved single-point crossover operator to enhance the local search ability of the evolutionary operator. The experimental results show that SAGNAS is a competitive algorithm, achieving 97.44% and 81.60% accuracy on CIFAR10 and CIFAR100, respectively, with only 1.9 GPU-days of search cost.
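The decoupled similarity computation could be sketched roughly as below: connection similarity and operation similarity are computed separately over two cell encodings and then combined. The encodings, the Jaccard/matching measures, and the equal weighting are assumptions for illustration, not the paper's exact definitions.

```python
def connection_similarity(edges_a, edges_b):
    """Jaccard similarity of the two cells' edge sets (a proxy measure;
    the paper may define connection similarity differently)."""
    a, b = set(edges_a), set(edges_b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

def operation_similarity(ops_a, ops_b):
    """Fraction of node positions whose operations match."""
    matches = sum(x == y for x, y in zip(ops_a, ops_b))
    return matches / max(len(ops_a), len(ops_b))

def architecture_similarity(cell_a, cell_b, w_conn=0.5):
    """Weighted combination of connection and operation similarity;
    the equal weighting is an assumption."""
    conn = connection_similarity(cell_a["edges"], cell_b["edges"])
    ops = operation_similarity(cell_a["ops"], cell_b["ops"])
    return w_conn * conn + (1.0 - w_conn) * ops

if __name__ == "__main__":
    cell_a = {"edges": [(0, 2), (1, 2), (2, 3)], "ops": ["sep3x3", "skip", "sep5x5"]}
    cell_b = {"edges": [(0, 2), (1, 3), (2, 3)], "ops": ["sep3x3", "skip", "dil3x3"]}
    print(f"similarity = {architecture_similarity(cell_a, cell_b):.3f}")
```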
ISBN:
(Digital) 9781728186719
(Print) 9781728186719
Tracking-by-detection approaches have demonstrated their strength in addressing Multiple Object Tracking (MOT) problems. DeepSORT, one of the classical tracking-by-detection MOT methods, relies on a deep appearance descriptor to extract global appearance features of identities. Although the appearance descriptor is a key component of such tracking-by-detection methods, responsible for modeling appearance information, its relationship to tracking performance remains unclear, especially whether further improvements to the descriptor will be reflected in tracking performance. To explore this, we conduct extensive experiments on the appearance descriptor by applying various traditional optimization methods. Furthermore, we propose an evolutionary neural architecture search (ENAS) strategy for the appearance descriptor, named Genetic-SORT, to assist this exploration. The experimental results demonstrate that tracking performance does not follow the improvements applied to the appearance descriptor and even shows a negative correlation, which is contrary to our intuition.
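For readers unfamiliar with how an evolutionary search over appearance descriptors might be set up, the sketch below encodes a descriptor as per-block channel widths plus an embedding dimension and applies simple GA operators. The search space and operators here are hypothetical; the actual Genetic-SORT encoding is not specified in the abstract and may differ.

```python
import random

# Hypothetical search space for the appearance descriptor.
WIDTHS = [16, 32, 64, 128, 256]
EMBED_DIMS = [64, 128, 256, 512]

def random_genome(num_blocks=4):
    """A genome = per-block channel widths + final embedding dimension."""
    return {
        "widths": [random.choice(WIDTHS) for _ in range(num_blocks)],
        "embed_dim": random.choice(EMBED_DIMS),
    }

def mutate(genome, p=0.2):
    """Resample each gene with probability p (assumed mutation operator)."""
    return {
        "widths": [random.choice(WIDTHS) if random.random() < p else w
                   for w in genome["widths"]],
        "embed_dim": random.choice(EMBED_DIMS) if random.random() < p
                     else genome["embed_dim"],
    }

def crossover(a, b):
    """One-point crossover on the width list (assumed operator)."""
    cut = random.randint(1, len(a["widths"]) - 1)
    return {"widths": a["widths"][:cut] + b["widths"][cut:],
            "embed_dim": random.choice([a["embed_dim"], b["embed_dim"]])}

if __name__ == "__main__":
    parent_a, parent_b = random_genome(), random_genome()
    child = mutate(crossover(parent_a, parent_b))
    print(child)  # would be decoded into a descriptor CNN and trained on re-ID data
```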
ISBN:
(Print) 9781450371285
Neural architecture search (NAS) algorithms have discovered highly novel state-of-the-art Convolutional Neural Networks (CNNs) for image classification, and are beginning to improve our understanding of CNN architectures. However, within NAS research, there are limited studies focussing on the role of skip-connections and how the configurations of connections between layers can be optimised to improve CNN performance. Our work focusses on developing a new evolutionary NAS algorithm based on adjacency matrices to optimise skip-connection structures, creating more specialised and powerful skip-connection structures within a DenseNet-BC network than previously seen in the literature. Our work further demonstrates how simple adjacency matrices can be interpreted in a way which allows for a more dynamic variant of DenseNet-BC. The final algorithm, using this novel interpretation of adjacency matrices for architecture design and evolved on the CIFAR100 dataset, finds networks with improved performance relative to a baseline DenseNet-BC network on both the CIFAR10 and CIFAR100 datasets, making it, to our knowledge, the first NAS algorithm for skip-connection optimisation to do so. Finally, the skip-connection structures discovered by our algorithm are analysed, and some important skip-connection patterns are highlighted.
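The adjacency-matrix interpretation can be illustrated with a small sketch: a binary lower-triangular matrix decides which earlier feature maps feed each layer of a dense block. The matrix layout, block size, and stand-in layer operation are assumptions for illustration, not the paper's exact scheme.

```python
# Minimal sketch: a lower-triangular binary adjacency matrix selects which
# earlier layers feed each layer of a (DenseNet-style) block.

def decode_block(adjacency, block_input):
    """adjacency[i][j] == 1 means layer i receives the output of layer j
    (j < i); index 0 stands for the block input."""
    outputs = [block_input]                      # outputs[0] = block input
    for i in range(1, len(adjacency)):
        inputs = [outputs[j] for j in range(i) if adjacency[i][j] == 1]
        # stand-in for concatenation + BN-ReLU-Conv of a DenseNet-BC layer
        outputs.append(sum(inputs) + 1 if inputs else block_input)
    return outputs[-1]

if __name__ == "__main__":
    # 4 layers after the input; a fully dense block would have all ones
    # below the diagonal, while an evolved matrix keeps only some of them.
    evolved = [
        [0, 0, 0, 0, 0],
        [1, 0, 0, 0, 0],   # layer 1 <- input
        [1, 1, 0, 0, 0],   # layer 2 <- input, layer 1
        [0, 1, 1, 0, 0],   # layer 3 <- layers 1, 2 (skips the input)
        [1, 0, 1, 1, 0],   # layer 4 <- input, layers 2, 3
    ]
    print(decode_block(evolved, block_input=1.0))
```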