We propose a non-hierarchical decentralized algorithm for the asymptotic minimization of possibly time-varying convex functions. In our method, each agent in a network has a private, local (possibly time-varying) cost function, and the objective is for every agent to minimize asymptotically the sum of these local functions (this problem appears in many applications, such as motion planning, acoustic source localization, and environmental modeling, among others). The algorithm consists of two main steps. First, to improve its estimate of a minimizer, each agent applies a particular version of the adaptive projected subgradient method to its local function. Then the agents exchange and mix their estimates using a communication model based on recent results on consensus algorithms. We formally show the convergence of the resulting scheme, which reproduces as particular cases many existing methods, such as gossip consensus algorithms and recent decentralized adaptive subgradient methods (which themselves include as particular cases many distributed adaptive filtering algorithms). To illustrate two possible applications, we consider the problems of acoustic source localization and environmental modeling via network gossiping with mobile agents.
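The two-step structure described above (a local subgradient-projection-type update followed by a consensus mixing step) can be illustrated with a minimal sketch. This is not the authors' exact scheme: the quadratic local costs, the ring-network mixing matrix W, the Polyak-type step size, and all variable names are assumptions introduced only for the example.

```python
import numpy as np

# Minimal sketch of a two-step decentralized scheme (hypothetical setup):
# each agent i holds a private quadratic cost f_i(x) = ||A_i x - b_i||^2,
# and all agents should agree on a minimizer of the sum of these costs.
rng = np.random.default_rng(0)
n_agents, dim, n_iter = 6, 3, 200

x_true = np.ones(dim)                              # common minimizer (assumed)
A = [rng.standard_normal((4, dim)) for _ in range(n_agents)]
b = [Ai @ x_true for Ai in A]                      # consistent local data

# Doubly stochastic mixing matrix for a ring network (consensus/gossip step).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i + 1) % n_agents] = 0.25
    W[i, (i - 1) % n_agents] = 0.25

x = np.zeros((n_agents, dim))                      # row i: agent i's estimate
for k in range(n_iter):
    # Step 1: local subgradient-projection-type update (a simplified
    # stand-in for the adaptive projected subgradient method).
    for i in range(n_agents):
        residual = A[i] @ x[i] - b[i]
        grad = 2 * A[i].T @ residual
        norm2 = grad @ grad
        if norm2 > 1e-12:
            cost = residual @ residual
            x[i] = x[i] - (cost / norm2) * grad    # Polyak-type step size
    # Step 2: exchange and mix estimates with neighbors (consensus step).
    x = W @ x

print("agents' estimates (should be nearly identical and close to x_true):")
print(np.round(x, 3))
```

In this simplified setting the mixing step drives the agents toward agreement while the local step pulls each estimate toward the zero set of its own cost, which mirrors the division of labor between the two steps of the proposed method.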