Multi-scale Evolutionary Neural Architecture Search for Deep Spiking Neural Networks

Authors: Pan, Wenxuan; Zhao, Feifei; Shen, Guobin; Han, Bing; Zeng, Yi

Affiliations: The Brain-inspired Cognitive Intelligence Lab, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China; School of Future Technology, University of Chinese Academy of Sciences, Beijing 100049, China; University of Chinese Academy of Sciences, Beijing 100049, China; Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai 200031, China

Publication: arXiv

Year: 2023

Subject: Topology

Abstract: Spiking Neural Networks (SNNs) have received considerable attention not only for their superior energy efficiency with discrete signal processing but also for their natural suitability for integrating multi-scale biological plasticity. However, most SNNs directly adopt the structure of well-established Deep Neural Networks (DNNs), and automated Neural Architecture Search (NAS) is rarely applied to SNNs. The neural motif topology, modular regional structure, and global cross-region connectivity of the human brain are products of natural evolution and can serve as an ideal reference for designing brain-inspired SNN architectures. In this paper, we propose Multi-Scale Evolutionary Neural Architecture Search (MSE-NAS) for SNNs, which simultaneously considers micro-, meso-, and macro-scale brain topologies as the evolutionary search space. MSE-NAS evolves individual neuron operations, the self-organized integration of multiple circuit motifs, and global connectivity across motifs through a Brain-inspired Indirect Evaluation (BIE) function. This training-free fitness function greatly reduces computational consumption and NAS time, and its task-independent property enables the searched SNNs to exhibit excellent transferability across multiple datasets. Furthermore, MSE-NAS shows robustness to the training method and to noise. Extensive experiments demonstrate that the proposed algorithm achieves state-of-the-art (SOTA) performance with fewer simulation steps on static datasets (CIFAR10, CIFAR100) and neuromorphic datasets (CIFAR10-DVS and DVS128-Gesture). Thorough analysis also illustrates the significant performance improvement and consistent bio-interpretability deriving from topological evolution at different scales and from the BIE function. © 2023, CC BY-NC-ND.
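The overall search procedure the abstract describes — evolving a population of candidate architectures and ranking them with a training-free fitness function instead of training each one — can be sketched as a generic evolutionary loop. Everything below is illustrative: the genome encoding, the `mutate` operator, and the `bie_fitness` placeholder are stand-ins and are not the paper's actual multi-scale encoding or BIE function.

```python
import random

GENOME_LEN = 8          # toy encoding: one gene per architectural choice
OPERATIONS = [0, 1, 2]  # stand-in for candidate neuron/motif/connection options

def random_genome():
    """Sample a random candidate architecture encoding."""
    return [random.choice(OPERATIONS) for _ in range(GENOME_LEN)]

def mutate(genome, rate=0.2):
    """Re-sample each gene with a small probability."""
    return [random.choice(OPERATIONS) if random.random() < rate else g
            for g in genome]

def bie_fitness(genome):
    """Placeholder training-free score in [0, 1].

    The real BIE function scores a candidate SNN without any training;
    this toy version just rewards larger gene values.
    """
    return sum(genome) / (GENOME_LEN * max(OPERATIONS))

def evolve(pop_size=10, generations=20, seed=0):
    """Truncation-selection evolutionary search over architecture encodings."""
    random.seed(seed)
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        # Rank candidates by the training-free proxy (no training needed).
        ranked = sorted(population, key=bie_fitness, reverse=True)
        parents = ranked[: pop_size // 2]
        children = [mutate(p) for p in parents]
        population = parents + children
    return max(population, key=bie_fitness)

if __name__ == "__main__":
    best = evolve()
    print(best, round(bie_fitness(best), 3))
```

Because the fitness proxy needs no gradient updates, each generation costs only `pop_size` forward evaluations, which is the source of the computational savings the abstract claims for the BIE function.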
