Details
ISBN (print): 9781450367486
Modern gradient-based optimization methods for deep neural networks demonstrate outstanding results on image classification tasks. However, methods that do not rely on gradient feedback have so far failed to tackle deep network optimization. In the field of evolutionary computation, applying evolutionary algorithms directly to network weights remains an unresolved challenge. In this paper we examine a new framework for the evolution of deep nets. Based on our empirical analysis, we propose searching linear sub-spaces of the problem for promising optimization trajectories in parameter space, as opposed to evolving the weights directly. We show that linear sub-spaces of the loss function are sufficiently well-behaved to allow trajectory evaluation. Furthermore, we introduce a fitness measure and show that it can correctly categorize trajectories according to their distance from the optimal path. As such, this work introduces an alternative approach to the evolutionary optimization of deep networks.
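The abstract's central idea, evaluating candidate trajectories within linear sub-spaces of the loss function rather than evolving weights directly, can be illustrated with a short sketch. The following is a minimal, hypothetical example and not the paper's implementation: it scans the loss of a toy PyTorch model along a 1-D linear segment between two parameter vectors, the kind of evaluation a trajectory-based fitness measure could build on. The model, data, and choice of segment endpoints are assumptions made purely for illustration.

```python
# Illustrative sketch (not the paper's method): evaluating the loss of a small
# network along a 1-D linear sub-space of parameter space, i.e. the segment
# theta(alpha) = (1 - alpha) * theta_a + alpha * theta_b.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy classification data and a small MLP stand in for the real task.
X = torch.randn(256, 20)
y = (X.sum(dim=1) > 0).long()
model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.CrossEntropyLoss()

def flat_params(m):
    """Concatenate all model parameters into a single flat vector."""
    return torch.cat([p.detach().reshape(-1) for p in m.parameters()])

def load_flat_params(m, vec):
    """Write a flat parameter vector back into the model in place."""
    offset = 0
    with torch.no_grad():
        for p in m.parameters():
            n = p.numel()
            p.copy_(vec[offset:offset + n].view_as(p))
            offset += n

# Two endpoints of the linear sub-space: here the initial weights and a random
# perturbation of them; in practice these could be points on a candidate trajectory.
theta_a = flat_params(model)
theta_b = theta_a + 0.5 * torch.randn_like(theta_a)

def loss_along_segment(model, theta_a, theta_b, steps=11):
    """Evaluate the loss at evenly spaced points on the segment between the endpoints."""
    losses = []
    for alpha in torch.linspace(0.0, 1.0, steps):
        load_flat_params(model, (1 - alpha) * theta_a + alpha * theta_b)
        with torch.no_grad():
            losses.append(loss_fn(model(X), y).item())
    return losses

print(loss_along_segment(model, theta_a, theta_b))
```

If the loss varies smoothly along such segments, as the abstract argues it does, then comparing these 1-D loss profiles gives a cheap way to rank trajectories before committing to them.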