Copyright: Inner Mongolia University Library. Technical support: 维普资讯 (VIP Information) • 智图
235 Daxue West Street, Saihan District, Hohhot, Inner Mongolia Autonomous Region. Postcode: 010021
As space targets are observed at ever greater distances and grow in size, precise pose measurement has become critical to the success of many space missions. To support pose measurement, cooperative targets designed for spacecraft are typically mounted on the target spacecraft, so that pose information can be obtained by recognizing them; reliable recognition of cooperative targets is therefore essential for stable spacecraft operation. Traditional recognition methods for multi-point geometric configurations, such as template matching and geometry-based recognition, perform poorly at long range and under large attitude variations because of the unique space environment and the targets' variable poses. To address the recognition of multi-point geometric configurations with unknown attitude angles in such environments, this paper proposes a method that combines prior configuration information with projective invariant features. By designing the cooperative targets and constructing two pairs of cross-ratios as features that remain invariant across poses, the method recognizes multi-point geometric configurations at long range and at unknown attitude angles. It targets stable, accurate recognition of cooperative targets under varying poses at a distance of 50 m against complex backgrounds. The proposed method achieves a recognition accuracy above 95% over the attitude range 0°–85°, outperforming the comparison methods.
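The cross-ratio invariance the abstract relies on can be illustrated in one dimension. The cross-ratio of four collinear points is preserved by any projective (Möbius) transformation of the line, which is why it survives changes of viewpoint. The sketch below is only a minimal illustration of that property, not the paper's recognition pipeline; the point values and transform coefficients are arbitrary assumptions chosen for the demo.

```python
def cross_ratio(a, b, c, d):
    """Cross-ratio (a,b;c,d) of four distinct collinear points,
    given by their 1-D coordinates."""
    return ((a - c) * (b - d)) / ((a - d) * (b - c))

def mobius(x, p, q, r, s):
    """1-D projective map x -> (p*x + q) / (r*x + s); requires p*s - q*r != 0."""
    return (p * x + q) / (r * x + s)

pts = [1.0, 2.0, 4.0, 7.0]          # arbitrary collinear points
cr_before = cross_ratio(*pts)        # = 1.25 for these points

# Apply an arbitrary non-degenerate projective transform (det = 2*3 - 1*0.5 = 5.5)
warped = [mobius(x, 2.0, 1.0, 0.5, 3.0) for x in pts]
cr_after = cross_ratio(*warped)

# The cross-ratio is unchanged (up to floating-point error):
assert abs(cr_before - cr_after) < 1e-9
```

Because the value survives projection, a pattern of feature points on a cooperative target can be matched by its cross-ratios regardless of the (unknown) viewing attitude, which is the intuition behind using two cross-ratio pairs as pose-invariant features.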