About the authors:

LI Ming, male, Master's degree, engineer; research interests: GNSS/SINS/visual fusion navigation. liming110809@163.com

CHAI Hongzhou, male, Ph.D., professor; research interests: marine geodesy. chaihz1969@163.com

Corresponding author:

CHAI Hongzhou, male, Ph.D., professor; research interests: marine geodesy. chaihz1969@163.com

CLC number: V249

Document code: A

DOI:10.13878/j.cnki.jnuist.20230214001



    Abstract

Global Navigation Satellite System (GNSS), Strapdown Inertial Navigation System (SINS), and visual sensors have complementary strengths, and fusing their information can yield high-precision, drift-free navigation and positioning. GNSS/SINS/visual fusion navigation is vulnerable to motion speed, illumination changes, occlusion, and other factors, which degrade positioning accuracy and robustness. To address this, this paper adds the SoftLOne robust kernel function to the cost function of a graph-optimization framework and sets up a gross-error test on the measurements to reduce the negative impact of outliers. Further, a chi-square test is performed on the computed measurement residuals, and over-limit residuals are downweighted to improve system accuracy and robustness. The experimental results show that the proposed algorithm achieves higher accuracy and better robustness than the traditional algorithm without a robust kernel function, outlier rejection strategy, or chi-square test, and than algorithms using other robust kernel functions; it substantially improves GNSS/SINS/visual navigation positioning accuracy and robustness. In a large-scale scenario no large drift error appears, and the root mean square error and standard deviation of the absolute pose error are 0.735 m and 0.336 m, respectively.

0 Introduction

The Strapdown Inertial Navigation System (SINS) is passive, autonomous, and highly accurate over short periods, but its errors accumulate over time [1]. Micro-Electro-Mechanical System (MEMS) inertial navigation systems, which have emerged in recent years, are widely used in autonomous robots and self-driving cars owing to their low cost and small size, but their errors accumulate quickly, so they generally cannot complete navigation tasks alone [2]. Visual sensors are inexpensive and compact, but purely visual navigation is susceptible to illumination changes and motion speed, making vision naturally complementary to SINS [3]. Mourikis et al. [4] proposed the classic Multi-State Constraint Kalman Filter (MSCKF), a tightly coupled visual/SINS framework on which most subsequent Kalman-filter-based Visual-Inertial Odometry (VIO) systems have iterated. Qin et al. [5] tightly coupled monocular vision with SINS in a graph-optimization framework, laying an important foundation for the rapid development of visual-inertial odometry in China; owing to its powerful functionality and high accuracy, it has been successfully applied to UAVs and handheld devices [6]. Campos et al. [7] proposed a visual-inertial odometry system supporting monocular, stereo, and RGB-D cameras, with accuracy reaching the decimeter or even centimeter level.

Visual-inertial odometry performs well in local pose estimation, but because it is a dead-reckoning method, large drift errors can accumulate over long distances. The Global Navigation Satellite System (GNSS) provides absolute position information worldwide, and fusing it with visual-inertial odometry effectively suppresses drift [8]. He et al. [9] fused GNSS absolute positioning into VINS-Mono [5], offering multiple options with different accuracy, rate, and latency properties in a given coordinate frame; a 25 km long-distance experiment showed better positioning accuracy than VINS-Mono alone. Mascaro et al. [10] combined a generic six-degree-of-freedom visual-inertial odometry with three-degree-of-freedom GNSS global positions in a decoupled, loosely coupled graph-optimization framework, achieving real-time estimation of a robot's six-degree-of-freedom pose with high generality; run together with the keyframe-based visual-inertial odometry OKVIS [11], the algorithm was more robust and accurate. Cao et al. [12] tightly coupled raw GNSS pseudorange and Doppler measurements with visual-inertial odometry to obtain real-time, drift-free six-degree-of-freedom estimates worldwide, with positioning accuracy exceeding strong baselines such as VINS-Mono and RTKLIB. Going further, Li et al. [13] tightly coupled GNSS pseudorange, Doppler, and carrier-phase observations with visual-inertial odometry and added ambiguity parameters to the state estimation; thanks to the high-precision carrier-phase observations, positioning accuracy improved considerably over GVINS [12]. These GNSS/SINS/visual fusion algorithms, however, do not fully consider measurement gross errors or system robustness, which degrades positioning accuracy and reliability. To improve the robustness and accuracy of GNSS/SINS/visual navigation, this paper adds a robust kernel function to the total cost function of GNSS/SINS/visual data fusion under a graph-optimization framework, dynamically adjusting measurement weights. A gross-error test rejects measurements exceeding a threshold, reducing their negative impact on navigation accuracy. A chi-square test is further introduced, and measurements with abnormal test results are downweighted accordingly.

Addressing measurement gross errors and system robustness in GNSS/SINS/visual navigation, this paper presents the principles of a robust GNSS/SINS/visual algorithm based on graph optimization, and comparative experiments demonstrate its effectiveness.

1 Principles of the robust GNSS/SINS/visual algorithm

Data fusion for GNSS/SINS/visual multi-source navigation generally follows one of two approaches: Kalman filtering or graph optimization. Using Monte Carlo experiments, Strasdat et al. [14] showed that, given the same computational resources per unit time, graph-optimization methods achieve higher estimation accuracy than filtering methods. This paper therefore incorporates the robust algorithm within a graph-optimization framework to improve system accuracy and robustness. Optimization adjusts the states to be estimated so that the overall cost function is minimized, yielding the optimal state estimate.

First, the measurement equation relating each type of observation to the states to be estimated is obtained [15]:

$\hat{z}_t = h(\mathcal{X}) + v_t,$  (1)
where $\hat{z}_t$ is the measurement; $\mathcal{X}$ is the vector of all states related to the measurement; and $v_t$ is the measurement noise, with $v_t \sim N(0, \Sigma_t)$.

The residual term is then constructed [15]:

$e_t = \hat{z}_t - h(\mathcal{X}),$  (2)
where $e_t$ is the residual function corresponding to the measurement.

The residual terms of the different observation types are then summed to form the cost function of the whole optimization problem. The system state vector $\mathcal{X}$ to be estimated within the sliding window [16] is

$\mathcal{X} = \left[ x_0, x_1, \cdots, x_n, x_c^b, \lambda_0, \lambda_1, \cdots, \lambda_l \right],$  (3)

$x_k = \left[ p_{wb_k}^w, q_{b_k}^w, v_{wb_k}^w, b_{g_k}, b_{a_k} \right], \quad k \in [0, n],$  (4)

$x_c^b = \left[ p_c^b, q_c^b \right],$  (5)
where $x_k$ is the SINS state vector at the $k$-th keyframe; $p_{wb_k}^w$, $q_{b_k}^w$, $v_{wb_k}^w$, $b_{g_k}$, and $b_{a_k}$ are the position, attitude, velocity, gyroscope bias, and accelerometer bias, respectively; $x_c^b$ is the SINS-camera extrinsic parameter; and $\lambda$ is the inverse-depth parameter of a feature in the keyframe where it was first observed.

The GNSS/SINS/visual fusion cost function can be described as minimizing the sum of the Mahalanobis norms of all measurement residuals plus the prior [17]:

$\min_{\mathcal{X}} \left\{ \left\| r_p - H_p \mathcal{X} \right\|^2 + \sum_{k \in [1,n]} \left\| r_S\left(\hat{z}_{k-1,k}^S, \mathcal{X}\right) \right\|_{\Sigma_{k-1,k}^S}^2 + \sum_{l \in L} \left\| r_C\left(\hat{z}_l^{C_{i,j}}, \mathcal{X}\right) \right\|_{\Sigma_l^{C_{i,j}}}^2 + \sum_{d \in [0,m]} \left\| r_G\left(\hat{z}_d^G, \mathcal{X}\right) \right\|_{\Sigma_d^G}^2 \right\},$  (6)
where $r_S(\hat{z}_{k-1,k}^S, \mathcal{X})$ is the SINS preintegration measurement residual; $r_C(\hat{z}_l^{C_{i,j}}, \mathcal{X})$ is the visual measurement residual; and $r_G(\hat{z}_d^G, \mathcal{X})$ is the GNSS measurement residual. $\Sigma_{k-1,k}^S$, $\Sigma_l^{C_{i,j}}$, and $\Sigma_d^G$ are the corresponding measurement covariances, which express observation accuracy, i.e., reliability, and enter the quadratic form of each residual term through their inverses, the information matrices. They weight the residual terms of the different observations in the joint nonlinear optimization: the more accurate an observation, the smaller its covariance matrix, the larger its information matrix, and the greater its weight in the overall problem; less accurate observations are assigned smaller weights. $\{r_p, H_p\}$ is the prior information obtained by marginalization, used as the prior constraint for sliding-window optimization; $m$ is the number of GNSS measurements in the sliding window; $L$ is the landmark map in the sliding window; $l$ is a landmark in the map; $i$ is the reference keyframe of the landmark; and $j$ is another keyframe.
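The role of the information matrix described above can be illustrated with a small numeric sketch (the helper name `weighted_squared_norm` is ours, not from the paper):

```python
import numpy as np

def weighted_squared_norm(residual, covariance):
    """Mahalanobis squared norm ||r||^2_Sigma = r^T Sigma^{-1} r.

    The inverse covariance (information matrix) weights the residual:
    an accurate observation has a small covariance, hence a large
    information matrix and a large contribution to the cost."""
    info = np.linalg.inv(covariance)  # information matrix
    return float(residual @ info @ residual)

r = np.array([0.1, -0.2])            # the same residual, weighted two ways
precise = np.diag([0.01, 0.01])      # accurate observation, small covariance
coarse = np.diag([1.0, 1.0])         # noisy observation, large covariance
# the accurate observation dominates the cost
assert weighted_squared_norm(r, precise) > weighted_squared_norm(r, coarse)
```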

Adding a weight-adjusting robust kernel function to the cost function gives

$\min_{\mathcal{X}} \left\{ \left\| r_p - H_p \mathcal{X} \right\|^2 + \sum_{k \in [1,n]} \left\| r_S\left(\hat{z}_{k-1,k}^S, \mathcal{X}\right) \right\|_{\Sigma_{k-1,k}^S}^2 + \sum_{l \in L} \rho\left( \left\| r_C\left(\hat{z}_l^{C_{i,j}}, \mathcal{X}\right) \right\|_{\Sigma_l^{C_{i,j}}}^2 \right) + \sum_{d \in [0,m]} \rho\left( \left\| r_G\left(\hat{z}_d^G, \mathcal{X}\right) \right\|_{\Sigma_d^G}^2 \right) \right\},$  (7)
where $\rho(\cdot)$ is the robust kernel function.

In nonlinear optimization, the sum of squared two-norms of the residuals is minimized as the cost function. Abnormal observations caused by environmental changes and other factors lead to mismatches; adding them to the graph introduces an edge with very large error and hence a very large gradient, steering the optimization in the wrong direction as the related quantities are adjusted to drive the cost down sharply. To increase the robustness of the nonlinear optimization, the SoftLOne robust kernel function is introduced: under abnormal errors it slows the growth of the two-norm, limits the maximum gradient, and reduces sensitivity to outliers, while remaining smooth, making the whole nonlinear optimization more robust. Other robust kernel functions include Huber, Cauchy, and Arctan.

The SoftLOne robust kernel function can be expressed as

$\rho(s) = 2\left( \sqrt{1+s} - 1 \right).$  (8)
The Huber robust kernel function can be expressed as

$\rho(s) = \begin{cases} s, & s \leq 1, \\ 2\sqrt{s} - 1, & s > 1. \end{cases}$  (9)
The Cauchy robust kernel function can be expressed as

$\rho(s) = \log(1+s).$  (10)
The Arctan robust kernel function can be expressed as

$\rho(s) = \arctan(s).$  (11)
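The four kernels in Eqs. (8)-(11) can be compared numerically. A minimal Python sketch (written for illustration; the paper's implementation is not shown) makes the different growth rates visible:

```python
import math

# Each kernel rho(s) takes the squared residual norm s and returns the
# value contributed to the cost; all grow much more slowly than s itself.
def soft_l_one(s):
    return 2.0 * (math.sqrt(1.0 + s) - 1.0)             # Eq. (8), smooth everywhere

def huber(s):
    return s if s <= 1.0 else 2.0 * math.sqrt(s) - 1.0  # Eq. (9)

def cauchy(s):
    return math.log(1.0 + s)                            # Eq. (10)

def arctan_kernel(s):
    return math.atan(s)                                 # Eq. (11), bounded by pi/2

# Small residuals are left nearly unchanged; a gross error (s = 100) is
# heavily attenuated, which is what limits the gradient of a bad edge.
for s in (0.01, 1.0, 100.0):
    print(f"s={s}: SoftLOne={soft_l_one(s):.3f}, Huber={huber(s):.3f}, "
          f"Cauchy={cauchy(s):.3f}, Arctan={arctan_kernel(s):.3f}")
```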
To improve the ability of the GNSS/SINS/visual multi-source navigation system to resist gross errors, a measurement variance threshold is set; measurements whose variance exceeds the threshold are treated as outliers and rejected, largely reducing the negative impact of outliers on positioning accuracy. To further improve system robustness, a chi-square test is applied to the measurement residuals, and measurements exceeding the threshold are downweighted with the function

$\delta = \dfrac{2\varepsilon}{\gamma},$  (12)
where $\delta$ is the re-weighted measurement standard deviation; $\varepsilon$ is the computed residual; and $\gamma$ is the chi-square test threshold.
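The variance gating and chi-square downweighting described above can be sketched as follows. This is one plausible reading, not the paper's code: the threshold 5.991 is the 95% chi-square quantile for 2 degrees of freedom, and the sigma-inflation rule is our assumption, since Eq. (12) leaves implementation details open:

```python
import numpy as np

CHI2_95_2DOF = 5.991  # 95% chi-square quantile, 2 degrees of freedom

def gate_and_reweight(residual, sigma, var_threshold=20.0):
    """Reject gross outliers, then downweight suspicious measurements.

    A measurement whose variance exceeds var_threshold (20 in the paper's
    experiments) is discarded as an outlier.  Otherwise the normalized
    squared residual is chi-square tested; on failure the standard
    deviation is inflated, i.e., the weight is reduced.
    Returns (keep, new_sigma)."""
    if sigma ** 2 > var_threshold:
        return False, sigma                   # outlier: rejected outright
    chi2 = float(np.sum((residual / sigma) ** 2))
    if chi2 > CHI2_95_2DOF:                   # failed the chi-square test
        sigma = float(sigma * np.sqrt(chi2 / CHI2_95_2DOF))
    return True, sigma
```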

2 Experiments and analysis

To verify the effectiveness of the proposed algorithm, the experimental scheme is designed as shown in Table 1. The main platform is a wheeled robot equipped with an Allied Vision Mako-G131 global-shutter camera, an ADIS16465 MEMS strapdown inertial navigation system, and a NovAtel OEM-718D GNSS receiver. The post-processed result of a navigation-grade GNSS/SINS integrated navigation system serves as the reference (ground-truth) trajectory, which lasts 1 820 s and is 2 560 m long. The environment is challenging, as shown in Fig. 1: the trajectory repeatedly passes through areas with trees and buildings, where GNSS is prone to signal blockage and multipath, and frequent illumination changes degrade the visual observation conditions.

Table 1 Experimental scheme design

In the experiments, the outlier variance-test threshold is set to 20, and the chi-square confidence level is 95%.

For a GNSS/SINS/visual multi-source navigation system, the global consistency of the estimated trajectory is an important quantity. This paper evaluates global consistency by comparing the absolute distance between the estimated and reference trajectories, i.e., the Absolute Pose Error (APE) [18]:

$\mathrm{APE}_t = Q_t^{-1} S P_t,$  (13)
where $S$ is the similarity transformation matrix, and $Q$ and $P$ are the reference and estimated trajectories, respectively.
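The per-frame APE values are summarized by statistics (MAE, median, RMSE, Std) in the comparisons below. A translation-only sketch of that summary step (the full metric of Eq. (13) operates on SE(3) poses; this simplified version assumes the similarity alignment by S has already been applied):

```python
import numpy as np

def ape_stats(reference, estimate):
    """Summarize per-frame absolute pose error over translations.

    reference, estimate: (N, 3) arrays of aligned positions.
    Returns the MAE, median, RMSE, and Std of the per-frame
    Euclidean errors."""
    err = np.linalg.norm(reference - estimate, axis=1)
    return {
        "MAE": float(np.mean(err)),
        "Median": float(np.median(err)),
        "RMSE": float(np.sqrt(np.mean(err ** 2))),
        "Std": float(np.std(err)),
    }
```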

Fig. 1 Experimental track environment

The Mean Absolute Error (MAE), median, Root Mean Square Error (RMSE), and Standard Deviation (Std) of the per-frame absolute pose error are compared. The results are summarized in Table 2, the estimated trajectories are shown in Fig. 2, and the absolute pose errors of the different experiments are shown in Figs. 3-9.

Table 2 Comparison of absolute pose errors of experimental results

Fig. 2 Experimental estimation trajectory

Comparing Experiment 3 with Experiment 1 shows that the robust algorithm (SoftLOne kernel with outlier rejection and chi-square test) reduces the maximum absolute pose error by 9.7%, the mean absolute error by 9.8%, the median error by 10.4%, the RMSE by 12.1%, and the error standard deviation by 19.2% relative to the traditional algorithm without a robust kernel, outlier rejection, or chi-square test, indicating a substantial improvement in accuracy and robustness over the traditional non-robust algorithm.

Fig. 3 Absolute pose error of experiment 1

Fig. 4 Absolute pose error of experiment 2

Comparing Experiment 3 with Experiment 2 shows that applying the SoftLOne kernel, relative to no kernel, improves the maximum error, mean absolute error, RMSE, and error standard deviation by 22.1%, 5.5%, 8.2%, and 16.8%, respectively, apart from a slightly worse median absolute pose error.

Comparing Experiment 3 with Experiments 4-6 shows that the SoftLOne kernel yields a clear performance improvement over the Arctan, Cauchy, and Huber kernels.

The comparative results show that the robust algorithm with the SoftLOne kernel, outlier rejection strategy, and chi-square test markedly improves GNSS/SINS/visual navigation positioning accuracy and robustness. In the large-scale environment, no large drift error appears, positioning accuracy is high, and the standard deviation of the absolute pose error is only 0.336 m.

Fig. 5 Absolute pose error of experiment 3

Fig. 6 Absolute pose error of experiment 4

3 Conclusion

In recent years, the comprehensive Positioning, Navigation and Timing (PNT) system and its key technologies [19], the elastic PNT framework [20], and PNT intelligent services [21] have been proposed and gradually refined. Navigation and positioning technology is evolving rapidly from single sensors toward the fusion of multiple sensors and information sources, demanding robustness and reliability alongside accuracy. Multi-source navigation systems are widely used on driverless cars, UAVs, robots, and other platforms, and are a key direction for the future autonomous navigation of unmanned systems. As one form of multi-source PNT, GNSS/SINS/visual integrated navigation combines low cost, small size, easy availability, and high accuracy, and is widely applied. However, environmental changes, vehicle speed, and other factors can degrade its positioning accuracy and robustness. To address this, this paper applies a measurement variance threshold test before GNSS/SINS/visual data fusion, treating measurements with variance greater than 20 as outliers and rejecting them to improve system robustness. The SoftLOne robust kernel function is added to the cost function and compared with the Cauchy, Huber, and Arctan kernels, showing that SoftLOne yields the best positioning accuracy and robustness. A chi-square test is also applied to the measurement residuals, downweighting measurements that fail at the 95% confidence level. The experimental results show that the proposed algorithm is more accurate and robust than the traditional algorithm without a robust kernel, outlier rejection strategy, or chi-square test, and than algorithms using the Cauchy, Huber, or Arctan kernels. It substantially improves GNSS/SINS/visual navigation positioning accuracy and robustness; over long distances no large drift error appears, positioning accuracy is high, and the standard deviation of the absolute pose error is 0.336 m.

Fig. 7 Absolute pose error of experiment 5

Fig. 8 Absolute pose error of experiment 6

Fig. 9 Comparison of APE of different algorithms

References

    • [1] Du Z Q,Chai H Z,Xiao G R,et al.The realization and evaluation of PPP ambiguity resolution with INS aiding in marine survey[J].Marine Geodesy,2021,44(2):136-156

    • [2] Tang H L,Zhang T S,Niu X J,et al.Impact of the earth rotation compensation on MEMS-IMU preintegration of factor graph optimization[J].IEEE Sensors Journal,2022,22(17):17194-17204

    • [3] Mur-Artal R,Tardós J D.Visual-inertial monocular SLAM with map reuse[J].IEEE Robotics and Automation Letters,2017,2(2):796-803

    • [4] Mourikis A,Roumeliotis S.A multi-state constraint Kalman filter for vision-aided inertial navigation[C]//Proceedings IEEE International Conference on Robotics and Automation.April 10-14,Rome,Italy.IEEE,2007:3565-3572

    • [5] Qin T,Li P L,Shen S J.VINS-mono:a robust and versatile monocular visual-inertial state estimator[J].IEEE Transactions on Robotics,2018,34(4):1004-1020

    • [6] HU Kai,WU Jiasheng,ZHENG Fei,et al.A survey of visual odometry[J].Journal of Nanjing University of Information Science & Technology(Natural Science Edition),2021,13(3):269-280

    • [7] Campos C,Elvira R,Rodríguez J J G,et al.ORB-SLAM3:an accurate open-source library for visual,visual-inertial,and multimap SLAM[J].IEEE Transactions on Robotics,2021,37(6):1874-1890

    • [8] Liao J C,Li X X,Wang X B,et al.Enhancing navigation performance through visual-inertial odometry in GNSS-degraded environment[J].GPS Solutions,2021,25(2):1-18

    • [9] He M W,Rajkumar R R.Extended VINS-mono:a systematic approach for absolute and relative vehicle localization in large-scale outdoor environments[C]//2021 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS).September 27-October 1,2021,Prague,Czech Republic.IEEE,2021:4861-4868

    • [10] Mascaro R,Teixeira L,Hinzmann T,et al.GOMSF:Graph-optimization based multi-sensor fusion for robust UAV pose estimation[C]//2018 IEEE International Conference on Robotics and Automation(ICRA).May 21-25,2018,Brisbane,QLD,Australia.IEEE,2018:1421-1428

    • [11] Leutenegger S,Furgale P,Rabaud V,et al.Keyframe-based visual-inertial SLAM using nonlinear optimization[C]//Proceedings of Robotics:Science and Systems IX,2013.DOI:10.15607/RSS.2013.IX.037

    • [12] Cao S Z,Lu X Y,Shen S J.GVINS:tightly coupled GNSS-visual-inertial fusion for smooth and consistent state estimation[J].IEEE Transactions on Robotics,2022,38(4):2004-2021

    • [13] Li T,Pei L,Xiang Y,et al.P3-VINS:tightly-coupled PPP/INS/visual SLAM based on optimization approach[J].IEEE Robotics and Automation Letters,2022,7(3):7021-7027

    • [14] Strasdat H,Montiel J M M,Davison A J.Visual SLAM:why filter?[J].Image and Vision Computing,2012,30(2):65-77

    • [15] JIANG Junxiang.Research on visual/inertial/GNSS fusion navigation method based on graph optimization[D].Wuhan:Wuhan University,2021

    • [16] Jiang J X,Niu X J,Guo R N,et al.A hybrid sliding window optimizer for tightly-coupled vision-aided inertial navigation system[J].Sensors(Basel,Switzerland),2019,19(15):3418

    • [17] Jiang J X,Niu X J,Liu J N.Improved IMU preintegration with gravity change and earth rotation for optimization-based GNSS/VINS[J].Remote Sensing,2020,12(18):3048

    • [18] Sturm J,Engelhard N,Endres F,et al.A benchmark for the evaluation of RGB-D SLAM systems[C]//2012 IEEE/RSJ International Conference on Intelligent Robots and Systems.October 7-12,2012,Vilamoura-Algarve,Portugal.IEEE,2012:573-580

    • [19] YANG Yuanxi.Concepts of comprehensive PNT and related key technologies[J].Acta Geodaetica et Cartographica Sinica,2016,45(5):505-510

    • [20] YANG Yuanxi.Elastic PNT basic framework[J].Acta Geodaetica et Cartographica Sinica,2018,47(7):893-898

    • [21] YANG Yuanxi,YANG Cheng,REN Xia.PNT intelligent services[J].Acta Geodaetica et Cartographica Sinica,2021,50(8):1006-1012

