Computer Systems & Applications, 2023, 32(10): 123-131
Augmented Reality Method Combining Object Detection and Spatial Projection
(CETHIK Group Co. Ltd., Hangzhou 311100, China)
Received: February 10, 2023    Revised: April 20, 2023
Abstract: Existing methods for registering and tracking fixed objects typically rely on prefabricated markers or on professional AR devices equipped with accessories such as integrated depth cameras, both of which are costly. To address these drawbacks, a simple cooperative hybrid tracking and registration technique that combines object detection with a spatial projection algorithm is proposed. First, a deep learning object detection algorithm identifies the object type; the spatial projection algorithm then uses the position and attitude information obtained from the device's sensors to determine the specific object ID, which improves the matching accuracy of virtual information superimposed on the real scene. Based on this algorithm, an AR application for smart IoT infrastructure maintenance is implemented, and experiments are conducted on objects such as light poles and trash cans. The experimental results show that the method runs on ordinary smartphones and AR glasses and achieves the expected results while avoiding prefabricated markers and reducing hardware requirements.
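The abstract outlines a two-stage pipeline: a deep learning detector yields the object class and bounding box, and a spatial projection step then resolves which specific registered object (its ID) the detection corresponds to, using the device's sensor pose. The sketch below illustrates one plausible form of that projection-and-matching step; the function names, the pinhole-camera model, and the pixel-distance threshold are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch of the spatial-projection matching step described in the
# abstract. All names and the pinhole-camera simplification are assumptions
# for illustration; the paper's actual formulation may differ.

def project_to_image(p_world, R_wc, t_wc, K):
    """Project a 3D world point into pixel coordinates.

    R_wc, t_wc: camera pose (world-to-camera rotation and translation),
                e.g. derived from the device's GPS/IMU sensors.
    K:          3x3 camera intrinsic matrix.
    Returns (u, v) or None if the point is behind the camera.
    """
    p_cam = R_wc @ p_world + t_wc       # world frame -> camera frame
    if p_cam[2] <= 0:                   # behind the image plane
        return None
    uvw = K @ p_cam                     # pinhole projection
    return uvw[:2] / uvw[2]

def match_object_id(bbox_center, candidates, R_wc, t_wc, K, max_px=80.0):
    """Resolve the specific object ID for one detection.

    bbox_center: (u, v) center of the detector's bounding box.
    candidates:  {object_id: 3D world position} for registered objects
                 whose type matches the detected class (e.g. light poles).
    Returns the ID whose projection lies closest to the bounding box
    center, or None if nothing projects within max_px pixels.
    """
    best_id, best_dist = None, max_px
    for obj_id, p_world in candidates.items():
        uv = project_to_image(np.asarray(p_world, float), R_wc, t_wc, K)
        if uv is None:
            continue
        dist = np.linalg.norm(uv - np.asarray(bbox_center, float))
        if dist < best_dist:
            best_id, best_dist = obj_id, dist
    return best_id

if __name__ == "__main__":
    # Toy usage: camera at the world origin, looking down the +Z axis.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    poles = {"pole_17": [1.0, 0.0, 10.0], "pole_18": [-2.0, 0.0, 12.0]}
    print(match_object_id((400, 240), poles, np.eye(3), np.zeros(3), K))
    # -> "pole_17"
```

The design rationale implied by the abstract is that the detector alone cannot distinguish identical-looking objects (two adjacent light poles), whereas projecting their known positions through the sensor-derived pose disambiguates them without markers or depth hardware.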
Foundation items: National Natural Science Foundation of China (U20B2074); Key R&D Program of Zhejiang Province (2021C03032)
Citation:
CHEN Qiong, LIN Xing-Ping, SHU Yuan-Hao, HU Qing-Yang. Augmented Reality Method Combining Object Detection and Spatial Projection. Computer Systems & Applications, 2023, 32(10): 123-131.