Author Affiliations: The School of Robotics, Hunan University, Changsha 410082, China; The State Key Laboratory of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310027, China; The Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology, Karlsruhe 76131, Germany; The Department of Engineering Science, University of Oxford, Oxford OX1 3PJ, United Kingdom; The School of Computer Science and Electronic Engineering, Hunan University, Changsha 410082, China
Publication: arXiv
Year/Volume/Issue: 2023
Core Indexing:
Subject: Autonomous vehicles
Abstract: A semantic map of the road scene, covering fundamental road elements, is an essential ingredient in autonomous driving systems. It provides important perception foundations for positioning and planning when rendered in the Bird's-Eye-View (BEV). Currently, prior knowledge of hypothetical depth, together with calibration parameters, can directly guide the learning of translating front perspective views into BEV. However, this approach suffers from geometric distortions in the representation of distant objects. In addition, another stream of methods without prior knowledge can learn the transformation between front perspective views and BEV implicitly, with a global view. Considering that the fusion of different learning methods may bring beneficial effects, we propose a Bi-Mapper framework for top-down road-scene semantic understanding, which incorporates a global view and local prior knowledge. To enhance reliable interaction between them, an asynchronous mutual learning strategy is proposed. At the same time, an Across-Space Loss (ASL) is designed to mitigate the negative impact of geometric distortions. Extensive results on the nuScenes and Cam2BEV datasets verify the consistent effectiveness of each module in the proposed Bi-Mapper framework. Compared with existing road mapping networks, the proposed Bi-Mapper achieves 2.1% higher IoU on the nuScenes dataset. Moreover, we verify the generalization performance of Bi-Mapper in a real-world driving scenario. The source code is publicly available at BiMapper. Copyright © 2023, The Authors. All rights reserved.
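
The dual-stream design described in the abstract can be conveyed with a minimal sketch. The following Python/PyTorch code is a hypothetical illustration, not the authors' implementation: the stream internals, the BEV view transformation (replaced here by simple bilinear resizing), the averaging fusion, and the across_space_loss definition are all assumptions made for clarity; the paper's asynchronous mutual learning strategy and ASL are defined differently in the source.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BiMapperSketch(nn.Module):
        # Hypothetical dual-stream mapper: a "global" stream that learns the
        # PV-to-BEV transformation implicitly, and a "local" stream that would
        # use depth priors and calibration (both reduced to conv stacks here).
        def __init__(self, feat_dim=64, num_classes=4, bev_size=(200, 200)):
            super().__init__()
            self.bev_size = bev_size
            self.global_stream = nn.Sequential(
                nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
                nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU())
            self.local_stream = nn.Sequential(
                nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
                nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU())
            self.head = nn.Conv2d(feat_dim, num_classes, 1)

        def forward(self, image):
            # Bilinear resizing stands in for the real view transformation.
            g = F.interpolate(self.global_stream(image), size=self.bev_size,
                              mode='bilinear', align_corners=False)
            l = F.interpolate(self.local_stream(image), size=self.bev_size,
                              mode='bilinear', align_corners=False)
            fused = 0.5 * (g + l)   # simple averaging fusion (an assumption)
            return self.head(fused), g, l

    def across_space_loss(global_feat, local_feat):
        # Illustrative consistency term only: penalize disagreement between
        # the two streams so distortion in one can be corrected by the other.
        # The detach() mimics the flavor of asynchronous mutual learning,
        # where one stream acts as a temporarily frozen teacher.
        return F.mse_loss(global_feat, local_feat.detach())

    # Usage: forward a dummy front-view image and compute the consistency term.
    model = BiMapperSketch()
    img = torch.randn(1, 3, 128, 352)   # batch of one RGB perspective image
    logits, g, l = model(img)
    loss = across_space_loss(g, l)      # would be added to a segmentation loss

In this sketch, fusing the two BEV feature maps is what lets the implicit global stream and the prior-driven local stream compensate for each other's weaknesses, which is the core idea the abstract attributes to Bi-Mapper.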