About the authors:

PEI Ling (co-first author), male, PhD, professor, engaged in research on seamless indoor/outdoor positioning and fusion navigation. ling.pei@sjtu.edu.cn;

LI Tao (co-first author), male, PhD candidate, engaged in research on multi-source fusion positioning algorithms. tao_li@sjtu.edu.cn;

YU Wenxian (corresponding author), male, PhD, professor, engaged in research on radar target recognition, information fusion, and intelligent positioning and navigation. wxyu@sjtu.edu.cn

CLC number: V249.3

Document code: A

DOI:10.13878/j.cnki.jnuist.2022.06.001


Abstract

With the emergence of autonomous navigation systems such as robots and unmanned vehicles, positioning and navigation technology has developed rapidly over the past 20 years. Users now require a new generation of positioning and navigation technology that provides reliable positioning and navigation capability in any environment, at any time, and on any platform, and multi-source fusion positioning is the only effective way to achieve this goal. Starting from the sensor observation model, the environment scene model, and the platform motion behavior model, this paper reviews the positioning methods of the individual sensors (GNSS, inertial navigation, visual sensors, and LiDAR), analyzes how the operating scene and the platform motion behavior affect multi-source fusion positioning, and finally examines multi-source fusion positioning algorithms at the framework level under the two categories of optimization and filtering.

0 Introduction

Global Navigation Satellite System (GNSS) signals are already very weak when they reach the ground; they cannot penetrate indoor or underground environments and are vulnerable to intentional or unintentional interference. China has therefore proposed a national comprehensive PNT (Positioning, Navigation, Timing) system with BeiDou at its core, built on complementary multi-source signals and information fusion. Multi-source fusion positioning is the core user-side technology of this national comprehensive PNT system. Under the ubiquitous signal coverage of the national PNT system, highly reliable and trustworthy user terminals require a more flexible and intelligent navigation architecture that can rapidly integrate heterogeneous sensors and signal sources. To adapt to complex scenes and platform dynamics and to perform resilient multi-source fusion more efficiently and reasonably, this paper surveys multi-source fusion positioning algorithms from three perspectives (3M-based): the sensor observation model, the environment scene model, and the platform motion behavior model.

Considering generality and complementarity, this paper focuses on four classes of sensors: GNSS, inertial sensors, visual sensors, and LiDAR. Even within one sensor class the observation models differ. For GNSS, for example, pseudorange, carrier-phase, and Doppler observations can be used alternately, combined with each other [1], or differenced [2] to eliminate certain errors. Inertial navigation is a self-contained navigation mode that depends on neither the environment nor any infrastructure; its observation model is mainly determined by the sensor grade: a low-grade IMU can be handled with a relatively simple model, whereas a high-grade IMU requires a more refined model to avoid the accumulation of certain errors. Visual observation models range from indirect methods based on feature points [3], feature lines [4], and feature planes [5] to direct methods [6] that minimize the photometric error over the whole image. LiDAR observation models likewise range from point-to-line [7] and point-to-plane [8] residuals to discretizing the whole space into grids, describing each cell with a normal distribution, and registering the distributions [9]. Whatever the observation model, it essentially expresses the relation between the platform's motion state and the sensor observations.

The environment scene model constrains the motion state mainly indirectly, through the sensor observations, because different sensors exhibit different observation characteristics in different scenes. For example, GNSS observations in obstructed environments suffer from multipath and non-line-of-sight satellites, and indoors there is usually no usable GNSS signal at all. In feature-degraded environments, visual sensors and LiDAR, which rely on the environment for relative positioning, are strongly affected. Inertial sensors accumulate errors over long periods of motion, causing the estimated trajectory to drift. Recognizing and understanding the scene therefore helps with sensor selection in the fusion strategy [10].

Complex platform motion usually degrades the reliability of observations, but if the motion characteristics of different platform types are fully exploited, they can also constrain the motion state. For example, the velocity of a normally driving vehicle lies mainly along its forward direction [11], and all three velocity components are zero when it is stationary; a pedestrian's foot has zero velocity at the moment it touches the ground, and the acceleration during normal walking varies periodically in a sinusoidal fashion; a UAV can use its acceleration to estimate flight-dynamics parameters and thereby indirectly estimate its flight velocity. Building a platform motion behavior model therefore provides very important state constraints for fusion positioning.

The sensor observation model, the environment scene model, and the platform motion behavior model ultimately need a unified representation within a multi-source fusion positioning framework. Fusion algorithms generally fall into two categories, filtering and optimization, and the choice depends on the application scene and the platform characteristics. Table 1 summarizes the current mainstream multi-source fusion positioning algorithms from the 3M perspective.

In Table 1, GNSS denotes satellite navigation, IMU the inertial measurement unit, Vision visual SLAM, LiDAR LiDAR SLAM, NHC the nonholonomic constraint, and ZUPT the zero-velocity update.

1 Sensors and their observation models

1.1 Global navigation satellite systems

In a multi-source fusion positioning algorithm, GNSS provides global positioning whose error does not diverge over time. The raw observations of a GNSS receiver mainly comprise three types: pseudorange, carrier phase, and Doppler shift. With the corresponding algorithms, GNSS provides positioning, velocity determination, and timing services. The three most basic GNSS observation equations are

$$\rho^{(n)}=r^{(n)}+c\,\delta t_{r}+\delta e^{(n)}-c\,\delta t^{(n)}+I^{(n)}+T^{(n)}+R^{(n)}+d^{(n)}+\varepsilon_{\rho}^{(n)},\qquad(1)$$

$$D^{(n)}\lambda=\dot{r}^{(n)}+c\,\delta\dot{t}_{r}-c\,\delta\dot{t}^{(n)}-\dot{I}^{(n)}+\dot{T}^{(n)}+\varepsilon_{D}^{(n)},\qquad(2)$$

$$l^{(n)}=r^{(n)}+c\,\delta t_{r}+\delta e^{(n)}-c\,\delta t^{(n)}+\lambda N-I^{(n)}+T^{(n)}+R^{(n)}+w^{(n)}+d^{(n)}+\varepsilon_{l}^{(n)},\qquad(3)$$

where $\rho^{(n)}$ is the pseudorange, $D^{(n)}$ the Doppler-shift observation, $l^{(n)}$ the carrier-phase observation, $r^{(n)}$ the distance between satellite $n$ and the receiver, $c$ the speed of light, $\delta t_{r}$ the receiver clock offset, $\delta e^{(n)}$ the satellite orbit error, $\delta t^{(n)}$ the clock offset of satellite $n$, $I^{(n)}$ the ionospheric delay along the path from satellite $n$ to the receiver, $T^{(n)}$ the tropospheric delay along that path, and $R^{(n)}$ the Earth-rotation and relativistic effects; $\varepsilon_{\rho}^{(n)}$ contains the pseudorange multipath error and pseudorange random noise, $\lambda$ is the wavelength, $\varepsilon_{D}^{(n)}$ is the Doppler measurement noise, $N$ the integer ambiguity, $w^{(n)}$ contains the phase-center offset, phase-center variation [22], and phase wind-up, $d^{(n)}$ contains solid-Earth and ocean tide effects as well as satellite- and receiver-side hardware delays [23], and $\varepsilon_{l}^{(n)}$ contains the carrier-phase multipath error and phase random noise.
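As a concrete illustration of how the pseudorange model in Eq. (1) is used, the sketch below estimates the receiver position and clock offset from four or more corrected pseudoranges by iterated least squares. It is a minimal example, not part of any surveyed system: the atmospheric, orbit, and relativistic terms are assumed to have been corrected beforehand, and the function and variable names are illustrative.

```python
import numpy as np

def spp_least_squares(sat_pos, pseudoranges, x0=None, iters=10):
    """Single-point positioning from pseudoranges (Eq. (1) with only the
    receiver clock term kept; other error terms assumed corrected).

    sat_pos      : (N, 3) ECEF satellite positions, m
    pseudoranges : (N,)   corrected pseudoranges, m
    returns      : [x, y, z, c*dt_r]
    """
    sat_pos = np.asarray(sat_pos, float)
    pseudoranges = np.asarray(pseudoranges, float)
    x = np.zeros(4) if x0 is None else np.asarray(x0, float)
    for _ in range(iters):
        rho_hat = np.linalg.norm(sat_pos - x[:3], axis=1)      # predicted geometric ranges
        H = np.hstack([(x[:3] - sat_pos) / rho_hat[:, None],   # unit line-of-sight vectors
                       np.ones((len(pseudoranges), 1))])       # receiver clock column
        dz = pseudoranges - (rho_hat + x[3])                    # prefit residuals
        dx, *_ = np.linalg.lstsq(H, dz, rcond=None)
        x += dx
        if np.linalg.norm(dx) < 1e-4:                           # converged
            break
    return x
```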

Real-Time Kinematic (RTK) carrier-phase differential positioning [2] obtains high-accuracy positions in real time by combining a base station with a known position and high-precision carrier-phase measurements. Its basic principle is to construct between-station and between-satellite difference equations, i.e., the double-difference observation model shown in Eq. (4):

$$\Delta l=\Delta r-\Delta N\,\lambda+\varepsilon_{l}^{(n)},\qquad(4)$$

where $\Delta$ denotes the double-difference operator and the other symbols are as in Eqs. (1)-(3). Stacking the double-difference model over multiple epochs, the float double-difference ambiguities can be estimated by least squares or Kalman filtering, which determines the relative position between the base station and the rover in the geodetic coordinate frame. The ambiguities are usually fixed with the Least-squares AMBiguity Decorrelation Adjustment (LAMBDA) algorithm [24].
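A minimal sketch of how the double differences in Eq. (4) can be formed from the raw carrier phases of a rover and a base receiver; the receiver and satellite clock terms cancel in the differencing. The dictionary-based interface is a simplification chosen purely for illustration.

```python
import numpy as np

def double_difference(phase_rover, phase_base, ref_sat, sats):
    """Form double-differenced carrier phases (rover minus base,
    satellite j minus reference satellite), cf. Eq. (4).

    phase_rover, phase_base : dict {sat_id: carrier phase in metres}
    ref_sat                 : id of the reference (usually highest-elevation) satellite
    sats                    : list of the other satellite ids
    """
    sd_ref = phase_rover[ref_sat] - phase_base[ref_sat]            # single difference, reference sat
    return np.array([(phase_rover[j] - phase_base[j]) - sd_ref     # double differences
                     for j in sats])
```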

The accuracy of RTK depends on the distance between rover and base station: the longer the baseline, the lower the accuracy tends to be, so reference stations must be deployed on the ground at intervals of 30-50 km. To reduce the investment in base-station construction, Precise Point Positioning (PPP) has attracted increasing attention [25]. PPP has evolved through three stages: the float solution, the fixed solution, and PPP-RTK. In the float-solution stage, the observation model was the main research focus: the ionosphere-free combination eliminates the first-order ionospheric term by exploiting the fact that it is inversely proportional to the square of the frequency, whereas the UofC model eliminates it by exploiting the fact that the ionospheric errors of pseudorange and carrier phase on the same frequency have opposite signs [1]. The PPP fixed solution mainly has to resolve the coupling between integer ambiguities and hardware delays; the common approach is the Uncalibrated Phase Delay (UPD), whose principle is that the fractional part of the narrow-lane ambiguity is stable over short periods [26]. With UPD assistance a PPP fixed solution becomes possible, but a long observation time is still needed before the ambiguities can be fixed, which limits its application. The main cause of this long convergence time is the atmosphere, so PPP-RTK was proposed: it uses observations from a regional network to refine the UPD estimates and additionally estimate atmospheric corrections, so that a fixed solution is reached much faster [27-28].

Table 1  Common multi-source fusion positioning algorithms

1.2 Inertial navigation systems

An inertial navigation system (INS) consists of an Inertial Measurement Unit (IMU) and the inertial navigation (mechanization) algorithm. The IMU is generally composed of accelerometers and gyroscopes, which measure acceleration and angular rate, respectively. In a multi-source fusion positioning algorithm the INS provides high-rate measurements of the physical state that are almost unaffected by the environment. The acceleration from the accelerometers is integrated once to obtain linear velocity and twice to obtain position, while the angular rate from the gyroscopes is integrated once to obtain attitude.

Depending on whether the gyroscopes can sense the Earth's rotation rate, this paper divides IMUs into low-grade and high-grade categories. If the gyroscope noise is of the same order as or larger than the Earth's rotation rate, the INS is considered low grade: it cannot sense the Earth's rotation through its gyroscopes and therefore cannot estimate the heading angle by itself. If the gyroscope noise is smaller than the Earth's rotation rate, the INS is considered high grade and can estimate heading by sensing the Earth's rotation. High-grade INS algorithms must account for the Earth's rotation rate, whereas low-grade INS algorithms need not.

The INS kinematic differential equations comprise three sets: attitude, position, and velocity. Mechanization usually starts by solving the attitude differential equation, using either the rotation-matrix or the quaternion method. The coning error caused by the non-commutativity of rotations in the attitude update is compensated with the equivalent rotation vector [29] and its multi-subsample algorithms; the sculling error in the velocity update caused by the same non-commutativity [30] can also be compensated with multi-subsample algorithms. These issues must be considered in high-grade INS algorithms but can be ignored for low-grade INS.
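The following sketch shows one step of a strapdown mechanization in the simplified form the text allows for low-grade IMUs: a quaternion attitude update from the gyroscope increment, then velocity and position integration of the rotated specific force. Earth rotation and coning/sculling compensation are deliberately omitted; the frames, gravity model, and function names are illustrative assumptions, not a specific implementation.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_rot(q):
    """Rotation matrix (body to navigation frame) of a unit quaternion."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def strapdown_update(q, v, p, gyro, accel, dt, g_n=np.array([0.0, 0.0, -9.81])):
    """One low-grade strapdown step (Earth rotation and coning/sculling ignored)."""
    # attitude: rotation-vector increment -> incremental quaternion
    dtheta = gyro * dt
    angle = np.linalg.norm(dtheta)
    if angle > 1e-12:
        dq = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * dtheta / angle))
    else:
        dq = np.array([1.0, *(0.5 * dtheta)])
    q = quat_mul(q, dq)
    q /= np.linalg.norm(q)
    # velocity and position: specific force rotated into the navigation frame plus gravity
    a_n = quat_to_rot(q) @ accel + g_n
    p = p + v * dt + 0.5 * a_n * dt**2
    v = v + a_n * dt
    return q, v, p
```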

The INS is generally regarded as the core of a multi-source fusion positioning algorithm: filter-based fusion frameworks take the INS error-state equation as the system state equation. For the extended Kalman filter (EKF) framework, however, linearization errors cause filter inconsistency and overly optimistic covariance estimates, a phenomenon that has been studied extensively [31-34]. Four classes of solutions exist. The first corrects the linearization point and is represented by the First-Estimates Jacobian (FEJ) technique [32], which fixes the Jacobians at the first linearization so that the null space of the observability matrix does not degenerate; it is used in OC-EKF [33] and in improved versions of MSCKF [35-36]. The second is the robocentric EKF framework represented by R-VIO [37]: mapping in a robot-centred frame avoids alignment with the global gravity vector and does not suffer from the observability mismatch faced by world-centric VIO methods. The third is the stochastic cloning method used in [38]. With the deepening of Lie-group theory, Bonnabel et al. incorporated velocity and landmarks together with the pose into a novel Lie-group structure, leading to the fourth solution, the Invariant Extended Kalman Filter [39-40]; its consistency was proved in [41] and it has been further applied to MSCKF [42-43] and the UKF [44]. Table 2 summarizes the solutions to the consistency problem.

Table 2  Solutions to the consistency problem
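A schematic sketch of the First-Estimates Jacobian idea summarized above: the filter keeps correcting a state's estimate, but measurement Jacobians are always evaluated at the value the state had when it was first estimated, so the null space of the observability matrix is preserved. This toy range-only example is illustrative only and is not taken from any of the cited implementations.

```python
import numpy as np

class FEJLandmark:
    """Toy illustration of the First-Estimates Jacobian (FEJ) technique."""

    def __init__(self, first_estimate):
        self.first_estimate = np.asarray(first_estimate, float)  # frozen linearization point
        self.estimate = self.first_estimate.copy()               # keeps being corrected

    def apply_correction(self, delta):
        """The running estimate is updated, but the linearization point is not."""
        self.estimate = self.estimate + delta

    def range_jacobian(self, cam_first_estimate):
        """Jacobian of h(x) = ||landmark - camera|| w.r.t. the landmark,
        evaluated at the FIRST estimates rather than the latest ones."""
        d = self.first_estimate - np.asarray(cam_first_estimate, float)
        return d / np.linalg.norm(d)
```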

1.3 Visual sensors

Because visual sensors are cheap and convenient, real-time localization and mapping with visual sensors, i.e., visual SLAM (vSLAM), has long been a research focus in both academia and industry. vSLAM typically uses an RGB camera, and the algorithm frameworks designed around it remain the mainstream; in addition, thermal infrared cameras [45-47] and event cameras [48-49] have been receiving growing attention and can perform well in special scenes. In recent years vSLAM has advanced considerably and is widely used in computer vision, AR, and robotics. vSLAM algorithms are divided into feature-based and direct methods, and feature-based vSLAM can be further divided into filter-based methods and bundle-adjustment-based methods. MonoSLAM [50] is regarded as the representative filter-based method; it uses an EKF to estimate the camera motion and the positions of 3D feature points. StructSLAM [4] is also a filter-based vSLAM algorithm, but its visual front end uses structural lines instead of conventional feature points. The representative Bundle Adjustment (BA) based algorithms are the ORB-SLAM series [3]. However, as vSLAM matures and visual positioning is adopted in more fields, cameras are increasingly used in feature-degraded scenes, where traditional point-feature methods adapt poorly: positioning accuracy is low and tracking often fails for lack of features, leading to positioning failure. To cope with texture-poor environments, researchers proposed vSLAM techniques that track directly on the whole image, such as LSD-SLAM [51] and DSO [6]. SVO [52] adopts a semi-direct method, tracking image patches around feature points in a direct-method-like manner. Table 3 lists typical vSLAM algorithms and their characteristics.

Table 3  Characteristics of typical vSLAM algorithms

In addition, Attention-SLAM [53] introduces an attention model into the visual SLAM framework and assigns higher weights to feature points in regions the attention model focuses on; TextSLAM [54] extracts text information in visual SLAM and constrains the feature points on the text with planes; IVPR [55] performs fast visual place recognition using structural lines.
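To make the feature-based (indirect) pipeline described in this subsection concrete, the sketch below estimates the relative pose between two monocular frames with OpenCV: ORB features are matched and the essential matrix is decomposed into a rotation and a translation. The camera intrinsic matrix K is assumed known, the translation is recovered only up to scale, and the code is a simplified illustration rather than any of the surveyed systems.

```python
import cv2
import numpy as np

def relative_pose(img1, img2, K):
    """Two-view relative pose from ORB features (monocular, scale unknown)."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # essential matrix with RANSAC, then cheirality check to recover R, t
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t   # t is known only up to scale
```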

1.4 LiDAR

LiDAR obtains information about obstacles within its measurement range by emitting laser beams over a certain angular range and receiving the returned signals. Owing to this operating principle, the obstacle information collected by LiDAR takes the form of a series of discrete points with accurate angle and range information, called a point cloud. LiDAR SLAM algorithms are normally used to process the point clouds: two point clouds measured at different times are compared to obtain the relative pose change of the sensor and the map information.

2D LiDAR SLAM methods include FastSLAM [56], GMapping [57], Hector-SLAM [58], KartoSLAM [59], and Cartographer [60]. FastSLAM [56] uses a particle filter as its state estimator and was the first LiDAR SLAM scheme able to output an occupancy-grid map in real time. GMapping [57] is also particle-filter-based, but to sample more accurately it performs a scan match after predicting the pose with the motion model, and it avoids particle depletion by limiting the number of resampling steps. Hector-SLAM [58] has only a scan-matching part, no back end, and is sensitive to the initial value. KartoSLAM [59] was the first graph-optimization-based LiDAR SLAM scheme, but it runs slowly. Cartographer [60] uses the submap concept in its front end, matching each LiDAR measurement against submaps, and its back end includes a loop-closure module.

3D LiDAR SLAM methods include LiDAR Odometry and Mapping (LOAM) [61], Lightweight and Ground-Optimized LiDAR Odometry and Mapping (LeGO-LOAM) [62], Implicit Moving Least Squares SLAM (IMLS-SLAM) [63], and MULLS [64].

In LOAM [61] the feature points are edge points and planar points, with point-cloud curvature as the criterion for extraction: points of large curvature are edge features and points of small curvature are planar features. The resulting feature clouds are matched with point-to-line and point-to-plane iterative closest point algorithms. Frame-to-frame registration runs at high frequency while mapping runs at low frequency, refining the registration of several scans against the local map to obtain accurate poses for those clouds before inserting them into the map.
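A simplified version of the curvature-based feature selection used in LOAM-style front ends: for each point of a scan line, a smoothness value is computed from its neighbours, and points with large values are treated as edge features while points with small values are treated as planar features. The thresholds, neighbourhood size, and per-line feature counts here are illustrative, not the values used in LOAM.

```python
import numpy as np

def extract_loam_features(scan_line, k=5, edge_thresh=1.0, planar_thresh=0.1,
                          n_edge=2, n_planar=4):
    """Classify points of one ordered LiDAR scan line into edge and planar
    features using a LOAM-like smoothness (curvature) measure.

    scan_line : (N, 3) points ordered by azimuth
    """
    N = len(scan_line)
    curvature = np.full(N, np.nan)
    for i in range(k, N - k):
        # sum of differences to the 2k neighbours, normalized by range
        diff = np.sum(scan_line[i - k:i + k + 1] - scan_line[i], axis=0)
        curvature[i] = np.linalg.norm(diff) / (2 * k * np.linalg.norm(scan_line[i]))

    order = np.argsort(curvature[k:N - k]) + k               # ascending curvature
    planar_idx = [i for i in order[:n_planar] if curvature[i] < planar_thresh]
    edge_idx = [i for i in order[::-1][:n_edge] if curvature[i] > edge_thresh]
    return edge_idx, planar_idx
```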

LeGO-LOAM [62] adds ground optimization on top of LOAM [61] and makes the whole system more lightweight. Before feature extraction, LeGO-LOAM first extracts ground points, segments the remaining points, and removes noise points (e.g., tiny objects) to obtain segmented points. When edge features are later extracted, only the segmented points are used rather than the ground points, which makes feature extraction faster than in LOAM. In the front-end frame-to-frame registration, LeGO-LOAM replaces LOAM's one-step optimization of the six-degree-of-freedom pose with a two-step optimization: first the z translation, pitch, and roll are optimized, then the planar x, y coordinates and the heading. The first step exploits the fact that the ground remains essentially unchanged between adjacent frames, so point-to-plane constraints can be used to compute the vertical pose change; its result is then used as the initial value of the second step, improving computational efficiency.

IMLS-SLAM [63] adopts scan-to-model matching and represents the map with an implicit moving least squares model. In the segmentation stage, dynamic objects are removed by deleting clusters whose volume is below a certain size. For feature extraction, IMLS-SLAM defines nine criteria to select the points that contribute most to the pose as feature points.

MULLS [64] realizes a versatile LiDAR SLAM through multi-metric linear least squares. Its front end extracts ground features with a dual-threshold filtering algorithm and uses principal component analysis to classify the non-ground feature points into facades, roofs, pillars, beams, and vertices, encoding the features with their neighbourhood categories. Different classes are given different weights during point-cloud registration. In the back-end map matching, TEASER is first used for the initial alignment and then, as in the front end, MULLS-ICP is used for fine registration.

In corridors or other feature-poor scenes, LiDAR mapping accuracy and consistency degrade, and positioning also tends to fail when dynamic obstacles such as pedestrians and vehicles are present. Moreover, on uneven ground the up-and-down jolting of the platform causes LiDAR accuracy to drop rapidly.

2 Environment scene model

2.1 Scene classification

Scene classification is a classic problem in computer vision, and image-based scene recognition has a wide range of applications. At the feature level, image-based scene recognition methods can be divided into four classes: methods based on low-level features (e.g., SIFT descriptors), methods based on mid-level semantics (e.g., the bag-of-visual-words method proposed by Li et al. [65]), methods based on high-level features (e.g., Objectbank [66]), and methods based on learned features (i.e., deep learning). The learned-feature approach is represented by the work of Zhou et al. [67] at MIT, who built the Places365 dataset, which is labelled with scene categories and contains 365 typical scenes with more than 5 000 samples per label.

Scene recognition for positioning and navigation is more concerned with the factors that affect positioning and navigation. For example, [68] defines the degree of urbanization of a scene by generating a 3D building model of the test area and judging urbanization from the mean mask elevation angle toward the zenith: the lower the elevation angle, the lower the urbanization index. It also characterizes the traffic condition by the number of surrounding vehicles.

Scene classification for positioning and navigation is not limited to urbanization and traffic. This paper takes physical space and environment variables as the main factors of scene classification; the classification is shown in Fig. 1. In open ground scenes satellite reception is unobstructed, but features are generally far from the sensors, which is unfavourable for SLAM. Near-open ground environments expose the satellite signals to multipath effects, but features are also closer to the sensors. Indoor, underground, underwater, and deep-space environments generally have no navigation satellite signals: man-made indoor environments have good structural features, underground environments such as mines have poor lighting, underwater environments pose huge challenges to both GNSS and SLAM and generally rely on high-grade INS, and in the air the satellite signals are good but environmental features are sparse. The environment variables include illumination, the electromagnetic environment, temperature, and the operating environment. Illumination strongly affects visual sensors and LiDAR, the electromagnetic environment affects GNSS signals, temperature affects all electronic devices to some degree, and the operating environment mainly involves dynamic disturbances that may affect GNSS receivers, visual sensors, LiDAR, and so on.

Fig. 1  Main factors of scene classification: physical space plus environment variables

2.2 Scene-adaptive selection of sensor observations

Under different scene conditions, an appropriate combination of sensor observations should be selected for multi-source fusion positioning. Any sensor observation with a large bias is rejected by a sensor selection algorithm. Rejecting GNSS observations is known as RAIM (Receiver Autonomous Integrity Monitoring); GNSS itself as well as INS [69], vision [70], and LiDAR [18,71] can assist RAIM. Poor visual and LiDAR observations are rejected through dynamic-object detection [72-73].

In a loosely coupled multi-source fusion framework, the selection of sensor observations degenerates into sensor selection, since each sensor only provides a pose result. Chiu et al. [10] proposed a mechanism for selecting the optimal subset of sensors at initialization and when the scene changes. It builds candidate sensor subsets from heuristic rules and a ternary tree, and quickly determines the optimal subset among the candidates by maximizing the observable coverage of the state variables while satisfying resource constraints and accuracy requirements; experiments on a large-scale real dataset with multiple sensors show that the method can select suitable sensor subsets under different conditions and deliver satisfactory positioning and navigation solutions. Sukumar et al. [74] developed a sensor selection algorithm that groups sensors whose measurements are mutually consistent and improves the robustness of the navigation solution by detecting and removing outliers from faulty sensors. The sensor selection problem was proved NP-hard in [75]. For sensor selection under resource constraints, Shamaiah et al. [76] adopted a greedy algorithm that attains a (1-1/e)-approximation of the optimal selection with complexity between O(n²mk) and O(n³mk), where n is the state-space dimension, m the total number of sensors, and k the number of sensors to select. Joshi et al. [77] proposed a heuristic, convex-optimization-based approximation to the sensor selection problem that reaches a suboptimal selection through convex relaxation and local optimization; in their experiments the performance is very close to the global-optimum bound, meaning the suboptimal selection is very close to optimal, although such closeness cannot be guaranteed in every case.
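To make the greedy, submodularity-based selection of [76] concrete, the sketch below greedily adds the sensor that most increases the log-determinant of the Fisher information until k sensors are chosen; a greedy rule of this kind is what attains the (1-1/e) guarantee mentioned above. The measurement Jacobians and noise covariances are hypothetical inputs used only for illustration.

```python
import numpy as np

def greedy_sensor_selection(H_list, R_list, k, prior_info=None):
    """Greedily choose k sensors maximizing log det of the information matrix.

    H_list : list of (m_i, n) measurement Jacobians, one per candidate sensor
    R_list : list of (m_i, m_i) measurement noise covariances
    """
    n = H_list[0].shape[1]
    info = np.eye(n) * 1e-6 if prior_info is None else prior_info.copy()
    selected = []
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i, (H, R) in enumerate(zip(H_list, R_list)):
            if i in selected:
                continue
            cand = info + H.T @ np.linalg.inv(R) @ H        # information added by sensor i
            gain = np.linalg.slogdet(cand)[1] - np.linalg.slogdet(info)[1]
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
        info = info + H_list[best].T @ np.linalg.inv(R_list[best]) @ H_list[best]
    return selected
```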

3 Platform motion behavior model

3.1 Nonholonomic constraints

The Nonholonomic Constraint (NHC) is a virtual observation constructed by assuming that a travelling vehicle does not sideslip, drift, or bounce, so that its lateral and vertical velocities are zero. References [11,78] describe the basic principle of the NHC and construct its measurement model. Reference [79] proposed using a 3D velocity update formed from a wheel odometer and the NHC during GPS outages, significantly improving the accuracy of MEMS/GPS navigation. The NHC is also widely used in tracking and obstacle avoidance [80] and Structure From Motion (SFM) [81]. When applying the NHC in practice, its effectiveness depends strongly on the observation-noise settings: because the NHC assumptions do not hold strictly in real environments (for instance, the lateral and vertical velocities are not necessarily zero during sideslip or turning), the tightness of the constraint must be adjusted in real time. Reference [82] studied the noise settings of the NHC, and such an adaptive NHC effectively suppresses the error divergence of the integrated navigation system during GNSS outages. As an extension of the nonholonomic constraint model, [83] gave a detailed theoretical analysis of the all-wheel-steering model, combining odometer speed and wheel steering angle to construct a kinematic constraint for wheeled robots that restrains the rapid drift of the INS. Regarding the change in system observability brought by the NHC, [17] discussed different scenarios and pointed out that the NHC strengthens roll estimation in all motion states, and strengthens yaw and pitch estimation during vigorous motion such as acceleration and turning.
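A minimal sketch of how the NHC pseudo-observation can enter a Kalman update: the navigation-frame velocity is rotated into the body frame, and the lateral and vertical components are treated as zero measurements whose noise standard deviations can be loosened or tightened according to the driving condition, as discussed above. All names are generic placeholders, not a specific implementation from the cited works.

```python
import numpy as np

def nhc_update(v_n, P_vel, C_bn, sigma_lat=0.1, sigma_vert=0.1):
    """Kalman update of a 3D navigation-frame velocity with the NHC
    pseudo-observation v_body[lateral] = v_body[vertical] = 0.

    v_n   : (3,)  velocity in the navigation frame
    P_vel : (3,3) its covariance
    C_bn  : (3,3) rotation from the navigation frame to the body frame
    """
    H = C_bn[1:3, :]                       # selects body-frame lateral and vertical velocity
    R = np.diag([sigma_lat**2, sigma_vert**2])
    z = np.zeros(2)                        # NHC: both components are zero
    y = z - H @ v_n                        # innovation
    S = H @ P_vel @ H.T + R
    K = P_vel @ H.T @ np.linalg.inv(S)
    v_new = v_n + K @ y
    P_new = (np.eye(3) - K @ H) @ P_vel
    return v_new, P_new
```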

3.2 Zero-velocity update

The Zero-Velocity Update (ZUPT) [84] takes the velocity error of the platform while it is stopped as an observation to correct other platform states [8]. Although ZUPT observes velocity, it corrects not only the velocity error but also the attitude and IMU sensor errors, making it one of the main techniques for error suppression and compensation in high-accuracy positioning and orientation systems [85-87]. Traditional ZUPT approaches include curve fitting, filter estimation, and smoothing [88]; [89] simulated several common ZUPT methods and showed that the Kalman-filter approach effectively improves INS positioning accuracy. Because a zero-velocity interval is usually short, the Kalman filter often converges slowly; moreover, it carries many states and requires a large number of noise-variance parameters, and together with the unobservable yaw angle the filter diverges easily. References [85,90] proposed improved ZUPT strategies to address this. In addition, detecting the stationary state is a prerequisite for accurate ZUPT: if the INS is wrongly detected as stationary while moving, navigation accuracy suffers badly. Stationary-state detection methods fall into four classes: manual detection [91], IMU-based detection [92-98], GNSS-based detection [99-100], and vision-based detection [15,98]. In [91], once an increase in positioning error is detected during a GNSS outage, the user is asked to perform a zero-velocity update manually. Reference [93] compares the mean of IMU data over a period with a threshold to judge the stationary state, but this fails if the bias is too large. References [94-95] look for local minima in the IMU data, which can miss stationary states and was not extended to the common land-vehicle model. Reference [96] uses a pattern-recognition approach that builds features to recognize the stationary state. References [95,97] work in the frequency domain and confirm the robustness of frequency-domain detection, and [97] further improves its sensitivity to low-acceleration and low-speed motion. The recent open-source work [98] uses a chi-square test to screen for stationary states. Reference [99] uses Doppler measurements from a high-sensitivity GNSS receiver to perform indoor ZUPT, while [100] determines the stationary state from GNSS velocity. In addition, [15] proposed a novel ZUPT technique based on visual information, which suppresses filter divergence well in visual-inertial odometry, and [98] adopted a similar approach combined with IMU-based detection.
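The sketch below combines a very simple IMU-based stationarity detector (one of the detector families discussed above; thresholds are illustrative) with a ZUPT Kalman update that treats the three velocity states as a zero measurement while the platform is stationary. The state layout and function names are assumptions for illustration.

```python
import numpy as np

def is_stationary(gyro_win, accel_win, gyro_thresh=0.01, accel_std_thresh=0.05):
    """Crude stillness test over a short IMU window: small mean angular rate
    and low specific-force variation (thresholds are illustrative)."""
    return (np.linalg.norm(gyro_win.mean(axis=0)) < gyro_thresh and
            np.linalg.norm(accel_win.std(axis=0)) < accel_std_thresh)

def zupt_update(x, P, vel_idx, sigma_zupt=0.02):
    """Treat the velocity states as a zero measurement while stationary.

    x       : (n,)  full navigation state vector
    P       : (n,n) its covariance
    vel_idx : indices of the three velocity states in x, e.g. [3, 4, 5]
    """
    n = len(x)
    H = np.zeros((3, n))
    H[np.arange(3), vel_idx] = 1.0          # observation picks the velocity states
    R = np.eye(3) * sigma_zupt**2
    y = -H @ x                               # measured velocity is exactly zero
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(n) - K @ H) @ P
```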

In addition, when the platform is stationary the three-axis angular-rate output of the gyroscopes should be zero; this virtual observation can also correct the motion state and is called the Zero Angular Rate Update (ZARU). ZARU corrects the gyroscope measurement errors, and because the heading error is directly related to the gyroscope errors, ZARU also corrects the heading error. Reference [92] constructed the observation models of ZARU and ZUPT, both serving as constraints at stationary moments. Both ZUPT and ZARU were applied to Pedestrian Dead-Reckoning in [101-102]. Reference [82] built the ZUPT and ZARU observation models with the Unscented Kalman Filter (UKF) [103], further improving the heading estimate.

3.3 Other platform motion behavior models

Reference [104] proposed introducing the aerodynamic model of a multirotor UAV into the state-estimation filter. The filter state includes the drag coefficients that vary with rotor speed, and its observations include optical flow, accelerometer, gyroscope, and magnetometer measurements. Experiments show that the estimator is highly robust: it handles low-light conditions that purely vision-based estimation cannot, and the aerodynamics-aided velocity estimator keeps working in challenging scenes with poor illumination and little texture, providing a reliable feedback source for robust flight control.

VIMO [105] was the first method to perform state estimation by jointly estimating motion and external force. In VIMO the dynamics and the external force are placed together in a preintegrated residual, forming a tightly coupled sliding-window estimator together with the visual observations. Note that VIMO models the unknown external force as zero-mean white Gaussian noise, an assumption that holds only when the estimated external force is close to zero or acts only briefly, so VIMO tends to fail when the UAV experiences a large or sustained external force.

VID-Fusion [21] removes VIMO's assumption of a zero-mean Gaussian external force. By introducing the rotor-thrust observation F1 from a rotor-speed measurement unit and converting the IMU-measured acceleration into a force F2, VID-Fusion treats the external force acting on the UAV as the difference between F1 and F2.
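A rough sketch of the external-force idea described for VID-Fusion: the thrust implied by the rotor-speed measurements (F1) is compared with the force implied by the IMU-sensed specific force (F2), and their difference in a common frame is attributed to external force. The frames, sign convention, and names are simplifications for illustration, not the paper's exact formulation.

```python
import numpy as np

def external_force_estimate(rotor_thrusts, mass, C_nb, accel_body):
    """Rough external-force estimate for a multirotor.

    rotor_thrusts : (k,)  individual rotor thrusts along the body z axis, N
    mass          : vehicle mass, kg
    C_nb          : (3,3) rotation from the body frame to the navigation frame
    accel_body    : (3,)  accelerometer specific force, m/s^2
    """
    F1 = C_nb @ np.array([0.0, 0.0, np.sum(rotor_thrusts)])   # modelled thrust, nav frame
    F2 = mass * (C_nb @ accel_body)                            # IMU-sensed specific force, nav frame
    return F2 - F1                                             # attributed to external force
```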

4 Fusion algorithms

4.1 Filtering-based algorithms

The overall framework of the filter-based 3M fusion positioning and navigation algorithm is shown in Fig. 2. Its core is to take the INS error equation as the filter's state equation; all other sensor observations, the environment scene, and the platform motion behavior enter through the measurement-update part of the filter to estimate, and thereby compensate, the systematic drift terms of the INS.
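A schematic skeleton of the filter framework of Fig. 2: the INS mechanization propagates the nominal state, the error-state covariance is propagated alongside it, and every auxiliary observation (GNSS position, ZUPT, NHC, visual or LiDAR pose, ...) corrects the INS through a standard Kalman update of the error state. The class and variable names are placeholders for illustration, not a specific cited system.

```python
import numpy as np

class ErrorStateEKF:
    """Minimal error-state Kalman skeleton built around an INS (illustrative only)."""

    def __init__(self, dim, P0, Q):
        self.dim, self.P, self.Q = dim, P0.copy(), Q

    def propagate(self, F, dt):
        """Propagate the error-state covariance with the INS error model dx' = F dx + w."""
        Phi = np.eye(self.dim) + F * dt              # first-order transition matrix
        self.P = Phi @ self.P @ Phi.T + self.Q * dt

    def update(self, z, h_pred, H, R):
        """Generic update: z may come from GNSS, ZUPT, NHC, visual or LiDAR odometry."""
        y = z - h_pred                                # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        dx = K @ y                                    # estimated error state
        self.P = (np.eye(self.dim) - K @ H) @ self.P
        return dx                                     # fed back to correct the INS mechanization
```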

Filter-based fusion algorithms mainly comprise the Extended Kalman Filter (EKF) [106], the Unscented Kalman Filter, and the Particle Filter (PF) [107]. FastSLAM, proposed early on by Montemerlo et al. [56], applied the Rao-Blackwellised particle filter to robot SLAM. For general particle filters, sampling and searching in high-dimensional spaces increases the computational load and the particles degenerate easily, causing the filter to diverge; when the Gaussian assumption holds, the EKF is more efficient. By the way sensors are coupled, filtering algorithms can also be divided into loosely coupled and tightly coupled. References [38,108-109] proposed loosely coupled VIO schemes in which the IMU and the camera each perform their own state estimation; because the visual localization module is treated as a black box and not fused with the IMU information, such schemes are not robust when visual localization is difficult. Tightly coupled schemes generally estimate the pose together with the landmarks [50,110], but as the trajectory grows and more landmarks become observable, the dimension of the state vector keeps increasing. To address this, [111] groups feature points to balance trajectory drift against computational complexity, and ROVIO [112] parameterizes landmarks by a bearing vector and a distance to achieve better complexity. Another line of thought is the tightly coupled VIO framework represented by MSCKF [113], which adds a finite window of camera poses to the state vector instead of the landmarks, giving it major advantages in accuracy and light weight. Building on this theory, [114] extended the framework to a stereo MSCKF. In addition, [112] improved the front end by introducing patch-feature extraction and tracking along with the landmark parameterization mentioned above, further improving robustness. StructVIO [115] fuses structural lines with the IMU within the MSCKF framework for robust pose estimation.

Studies such as [116-119] focus on filter-based GNSS/INS/LiDAR fusion. Srinara et al. [116] built a multi-sensor GNSS/INS/LiDAR fusion scheme based on the NDT algorithm. Schütz et al. [117] fused RTK, INS, and LiDAR in their study, but only the iterative closest point (ICP) [120] algorithm was used in [30], leading to rapid drift of the LiDAR odometry with distance. To overcome this rapid drift, Chiang et al. [118] fused GNSS and INS with grid-based SLAM. All of these studies use loose coupling, which makes it easier to preserve the independence of the sensors; however, when fewer than four satellites are available, a loosely coupled method cannot benefit from the GNSS observations. Moreover, [18] showed that an accurate covariance of the SLAM pose is hard to estimate, which makes it difficult to maintain the consistency of the fusion algorithm when SLAM and GNSS are loosely coupled. Reference [16] is a tightly coupled LiDAR/inertial system based on the extended Kalman filter.

Fig. 2  Framework of the filter-based 3M fusion positioning and navigation algorithm

4.2 Optimization-based algorithms

The overall factor-graph framework of the optimization-based 3M fusion positioning and navigation algorithm is shown in Fig. 3. Its backbone is IMU preintegration, and all other sensor observations and platform motion behavior models are attached as observation factors. The environment scene mainly acts through scene classification, which, together with the sensor selection algorithm, adaptively adjusts the weights of the different factors.
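A toy version of the factor-graph formulation of Fig. 3: 2D positions connected by relative (odometry-like) factors, with absolute (GNSS-like) factors on some nodes, solved jointly as a nonlinear least-squares problem. scipy's least_squares is used purely for illustration; real systems use dedicated solvers (e.g., g2o, Ceres, GTSAM) and include IMU preintegration and many other factor types. All measurements below are made-up numbers.

```python
import numpy as np
from scipy.optimize import least_squares

# three 2D positions p0, p1, p2 stacked into one flat vector
odom = [(0, 1, np.array([1.0, 0.0])),      # (i, j, measured p_j - p_i)
        (1, 2, np.array([1.0, 0.1]))]
gnss = [(0, np.array([0.0, 0.0])),          # (i, measured absolute position)
        (2, np.array([2.0, 0.0]))]
sigma_odom, sigma_gnss = 0.05, 0.5

def residuals(flat):
    p = flat.reshape(-1, 2)
    res = []
    for i, j, dz in odom:                   # relative (odometry-like) factors
        res.append((p[j] - p[i] - dz) / sigma_odom)
    for i, z in gnss:                       # absolute (GNSS-like) factors
        res.append((p[i] - z) / sigma_gnss)
    return np.concatenate(res)

x0 = np.zeros(3 * 2)                        # initial guess for the three positions
sol = least_squares(residuals, x0)          # joint nonlinear least-squares solve
print(sol.x.reshape(-1, 2))                 # optimized positions
```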

VINS-Fusion [121] is a general optimization-based framework for loosely coupling VIO with multiple sensors such as GNSS, magnetometer, and barometer. Gong et al. [122] proposed a system improved from VINS-Fusion that fuses GNSS and VIO adaptively, using an IMU-preintegration-based depth uncertainty estimation method to assess the state uncertainty of the VIO. Reference [123] proposed a loosely coupled framework of PPP and stereo VIO based on VINS-Fusion. DVIGO [124] fuses measurements from four complementary but asynchronous sensors (stereo camera, IMU, magnetometer, and GNSS) based on the direct sparse method. Liu et al. [125] proposed an optimization-based framework that tightly couples vision, IMU, and raw GNSS measurements (including pseudorange and Doppler shift); considering the scale drift of VIO under degenerate conditions [126], the initialization of [125] includes a VIO scale parameter. Recently, [12] presented a system named GVINS, which has been released as open source. Compared with [125], GVINS adopts an online coarse-to-fine method to initialize the GNSS-visual-inertial state, and it also addresses engineering difficulties such as time synchronization, electromagnetic interference, and receiver clock jumps. Building on GVINS, P3-VINS [19] adds carrier-phase observations to the observation factors, adds carrier-phase ambiguities to the state, and adopts the ionosphere-free combination.

Shan et al. [13] proposed LIO-SAM, which fuses GNSS, inertial navigation, and LiDAR: in the front end, LiDAR and the INS are tightly coupled through preintegration, while the back end loosely couples GNSS in a global factor graph. LIO-SAM uses robot_localization [127] to obtain the transformation between the geodetic frame and the local frame, but since robot_localization uses only the gyroscopes and not the accelerometers, the initialization accuracy of LIO-SAM is limited. Shan et al. [14] added a visual sensor on top of LIO-SAM and proposed LVI-SAM, which adapts to a wider range of scenes.

5 Summary and outlook

This paper has surveyed multi-source fusion positioning algorithms from three aspects: the sensor observation model, the environment scene model, and the platform motion behavior model. The sensors of main interest are GNSS, inertial navigation, visual sensors, and LiDAR. The environment scene model mainly involves scene classification and scene-classification-based sensor selection; the platform motion behavior model mainly covers nonholonomic constraints, zero-velocity updates, and some UAV dynamics. Finally, multi-source fusion algorithms involving GNSS, inertial navigation, visual sensors, and LiDAR were reviewed from the two perspectives of filtering and optimization. Future multi-source fusion positioning will evolve toward more sensor sources, more intelligent assistance from environment scene models and platform motion behavior models, and better fusion frameworks.

Fig. 3  Factor graph framework of the optimization-based 3M fusion positioning and navigation algorithm

References

    • [1] Gao Y,Shen X B.A new method for carrier-phase-based precise point positioning[J].Navigation,2002,49(2):109-116

    • [2] Langley R B.RTK GPS[J].GPS World,1998,9(9):70-76

    • [3] Mur-Artal R,Montiel J M M,Tardós J D.ORB-SLAM:a versatile and accurate monocular SLAM system[J].IEEE Transactions on Robotics,2015,31(5):1147-1163

    • [4] Zhou H Z,Zou D P,Pei L,et al.StructSLAM:visual SLAM with building structure lines[J].IEEE Transactions on Vehicular Technology,2015,64(4):1364-1375

    • [5] Wang C,Guo X H.Plane-based optimization of geometry and texture for RGB-D reconstruction of indoor scenes[C]//2018 International Conference on 3D Vision(3DV).September 5-8,2018,Verona,Italy.IEEE,2018:533-541

    • [6] Engel J,Koltun V,Cremers D.Direct sparse odometry[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2018,40(3):611-625

    • [7] Censi A.An ICP variant using a point-to-line metric[C]//2008 IEEE International Conference on Robotics and Automation.May 19-23,2008,Pasadena,CA,USA.IEEE,2008:19-25

    • [8] Low K L.Linear least-squares optimization for point-to-plane ICP surface registration[R].Chapel Hill,University of North Carolina,2004:TR04-004

    • [9] Biber P,Strasser W.The normal distributions transform:a new approach to laser scan matching[C]//Proceedings 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS 2003)(Cat.No.03CH37453).October 27-31,2003,Las Vegas,NV,USA.IEEE,2003:2743-2748

    • [10] Chiu H P,Zhou X S,Carlone L,et al.Constrained optimal selection for multi-sensor robot navigation using plug-and-play factor graphs[C]//2014 IEEE International Conference on Robotics and Automation(ICRA).May 31-June 7,2014,Hong Kong,China.IEEE,2014:663-670

    • [11] Shin E H.Accuracy improvement of low cost INS/GPS for land applications[R].UCGE Reports,2001:20156

    • [12] Cao S Z,Lu X Y,Shen S J.GVINS:tightly coupled GNSS-visual-inertial fusion for smooth and consistent state estimation[J].IEEE Transactions on Robotics,2022,38(4):2004-2021

    • [13] Shan T X,Englot B,Meyers D,et al.LIO-SAM:tightly-coupled lidar inertial odometry via smoothing and mapping[C]//2020 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS),2020:5135-5142

    • [14] Shan T X,Englot B,Ratti C,et al.LVI-SAM:tightly-coupled lidar-visual-inertial odometry via smoothing and mapping[C]//2021 IEEE International Conference on Robotics and Automation,2021:5692-5698

    • [15] Qiu X C,Zhang H,Fu W X.Lightweight hybrid visual-inertial odometry with closed-form zero velocity update[J].Chinese Journal of Aeronautics,2020,33(12):3344-3359

    • [16] Qin C,Ye H Y,Pranata C E,et al.LINS:a lidar-inertial state estimator for robust and efficient navigation[C]//2020 IEEE International Conference on Robotics and Automation,2020:8899-8906

    • [17] Niu X J,Li Y,Zhang Q,et al.Observability analysis of non-holonomic constraints for land-vehicle navigation systems[J].Journal of Global Positioning Systems,2012,11(1):80-88

    • [18] Li T,Pei L,Xiang Y,et al.P3-LOAM:PPP/LiDAR loosely coupled SLAM with accurate covariance estimation and robust RAIM in urban canyon environment[J].IEEE Sensors Journal,2021,21(5):6660-6671

    • [19] Li T,Pei L,Xiang Y,et al.P3-VINS:tightly-coupled PPP/INS/visual SLAM based on optimization approach[J].IEEE Robotics and Automation Letters,2022,7(3):7021-7027

    • [20] Li X X,Wang H D,Li S Y,et al.GIL:a tightly coupled GNSS PPP/INS/LiDAR method for precise vehicle navigation[J].Satellite Navigation,2021,2(1):26

    • [21] Ding Z M,Yang T K,Zhang K Y,et al.VID-fusion:robust visual-inertial-dynamics odometry for accurate external force estimation[C]//2021 IEEE International Conference on Robotics and Automation,2021:14469-14475

    • [22] Görres B,Campbell J,Becker M,et al.Absolute calibration of GPS antennas:laboratory results and comparison with field and robot techniques[J].GPS Solutions,2006,10(2):136-145

    • [23] Héroux P,Kouba J.GPS precise point positioning using IGS orbit products[J].Physics and Chemistry of the Earth,Part A:Solid Earth and Geodesy,2001,26(6/7/8):573-578

    • [24] Teunissen P J G.The invertible GPS ambiguity transformations[J].Manuscripta Geodaetica,1995,20(6):489-497

    • [25] 张小红,李星星,郭斐.GNSS精密单点定位理论方法及其应用[M].北京:国防工业出版社,2021

    • [26] Ge M,Gendt G,Rothacher M,et al.Resolution of GPS carrier-phase ambiguities in precise point positioning(PPP)with daily observations[J].Journal of Geodesy,2008,82(7):389-399

    • [27] Wübbena G,Schmitz M,Bagge A.PPP-RTK:precise point positioning using state-space representation in RTK networks[J].Proceedings of the 18th International Technical Meeting of the Satellite Division of the Institute of Navigation,ION GNSS 2005,2005:2584-2594

    • [28] Geng J,Teferle F N,Meng X,et al.Towards PPP-RTK:ambiguity resolution in real-time precise point positioning[J].Advances in Space Research,2011,47(10):1664-1673

    • [29] Bortz J E.A new mathematical formulation for strapdown inertial navigation[J].IEEE Transactions on Aerospace and Electronic Systems,1971,AES-7(1):61-66

    • [30] Groves P D.Principles of GNSS,inertial,and multi-sensor integrated navigation systems[M].2nd ed.Boston and London:Artech House,2013

    • [31] Bailey T,Nieto J,Guivant J,et al.Consistency of the EKF-SLAM algorithm[C]//2006 IEEE/RSJ International Conference on Intelligent Robots and Systems.October 9-15,2006,Beijing,China.IEEE,2006:3562-3568

    • [32] Huang G P,Mourikis A I,Roumeliotis S I.Analysis and improvement of the consistency of extended Kalman filter based SLAM[C]//2008 IEEE International Conference on Robotics and Automation.May 19-23,2008,Pasadena,CA,USA.IEEE,2008:473-479

    • [33] Huang G P,Mourikis A I,Roumeliotis S I.Observability-based rules for designing consistent EKF SLAM estimators[J].International Journal of Robotics Research,2010,29(5):502-528

    • [34] Huang S D,Dissanayake G.Convergence and consistency analysis for extended Kalman filter based SLAM[J].IEEE Transactions on Robotics,2007,23(5):1036-1049

    • [35] Li M Y,Mourikis A I.Improving the accuracy of EKF-based visual-inertial odometry[C]//2012 IEEE International Conference on Robotics and Automation.May 14-18,2012,Saint Paul,MN,USA.IEEE,2012:828-835

    • [36] Li M Y,Mourikis A I.Improving the accuracy of EKF-based visual-inertial odometry[C]//2012 IEEE International Conference on Robotics and Automation.Saint Paul,MN,USA.IEEE,2012:828-835

    • [37] Huai Z,Huang G Q.Robocentric visual-inertial odometry[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS).October 1-5,2018,Madrid,Spain.IEEE,2018:6319-6326

    • [38] Lynen S,Achtelik M W,Weiss S,et al.A robust and modular multi-sensor fusion approach applied to MAV navigation[C]//2013 IEEE/RSJ International Conference on Intelligent Robots and Systems.November 3-7,2013,Tokyo,Japan.IEEE,2013:3923-3929

    • [39] Bonnabel S.Left-invariant extended Kalman filter and attitude estimation[C]//2007 46th IEEE Conference on Decision and Control.December 12-14,2007,New Orleans,LA,USA.IEEE,2007:1027-1032

    • [40] Bonnable S,Martin P,Salaün E.Invariant extended Kalman filter:theory and application to a velocity-aided attitude estimation problem[C]//Proceedings of the 48h IEEE Conference on Decision and Control(CDC)held jointly with 2009 28th Chinese Control Conference.December 15-18,2009,Shanghai,China.IEEE,2009:1297-1304

    • [41] Barrau A,Bonnabel S.An EKF-SLAM algorithm with consistency properties[J].arXiv e-print,2015,arXiv:1510.06263

    • [42] Wu K Z,Zhang T,Su D,et al.An invariant-EKF VINS algorithm for improving consistency[C]//2017 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS).September 24-28,2017,Vancouver,BC,Canada.IEEE,2017:1578-1585

    • [43] Heo S,Park C G.Consistent EKF-based visual-inertial odometry on matrix Lie group[J].IEEE Sensors Journal,2018,18(9):3780-3788

    • [44] Brossard M,Bonnabel S,Barrau A.Invariant Kalman filtering for visual inertial SLAM[C]//2018 21st International Conference on Information Fusion(FUSION).July 10-13,2018,Cambridge,UK.IEEE,2018:2021-2028

    • [45] Hua T,Pei L,Li T,et al.I2-SLAM:fusing infrared camera and IMU for simultaneous localization and mapping[M]//Proceedings of 2021 International Conference on Autonomous Unmanned Systems(ICAUS 2021).Singapore:Springer Singapore,2022:2834-2844

    • [46] Chen L,Sun L B,Yang T,et al.RGB-T SLAM:a flexible SLAM framework by combining appearance and thermal information[C]//2017 IEEE International Conference on Robotics and Automation.May 29-June 3,2017,Singapore.IEEE,2017:5682-5687

    • [47] Wang R C,Pei L,Chu L,et al.DVT-SLAM:deep-learning based visible and thermal fusion SLAM[M]//Lecture Notes in Electrical Engineering.Singapore:Springer Singapore,2021:394-403

    • [48] Zhou Y,Gallego G,Shen S J.Event-based stereo visual odometry[J].IEEE Transactions on Robotics,2021,37(5):1433-1450

    • [49] Rebecq H,Horstschaefer T,Gallego G,et al.EVO:a geometric approach to event-based 6-DOF parallel tracking and mapping in real time[J].IEEE Robotics and Automation Letters,2017,2(2):593-600

    • [50] Davison A J,Reid I D,Molton N D,et al.MonoSLAM:real-time single camera SLAM[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2007,29(6):1052-1067

    • [51] Engel J,Schöps T,Cremers D.LSD-SLAM:large-scale direct monocular SLAM[M]//Computer Vision-ECCV 2014.Cham:Springer International Publishing,2014:834-849

    • [52] Forster C,Pizzoli M,Scaramuzza D.SVO:fast semi-direct monocular visual odometry[C]//2014 IEEE International Conference on Robotics and Automation.May 31-June 7,2014,Hong Kong,China.IEEE,2014:15-22

    • [53] Li J Q,Pei L,Zou D P,et al.Attention-SLAM:a visual monocular SLAM learning from human gaze[J].IEEE Sensors Journal,2021,21(5):6408-6420

    • [54] Li B Y,Zou D P,Sartori D,et al.TextSLAM:visual SLAM with planar text features[C]//2020 IEEE International Conference on Robotics and Automation.May 17-21,2020,Paris,France.IEEE,2020:2102-2108

    • [55] Pei L,Liu K,Zou D P,et al.IVPR:an instant visual place recognition approach based on structural lines in Manhattan world[J].IEEE Transactions on Instrumentation and Measurement,2020,69(7):4173-4187

• [56] Montemerlo M,Thrun S,Koller D,et al.FastSLAM:a factored solution to the simultaneous localization and mapping problem[C]//AAAI/IAAI,2002:593-598

    • [57] Grisetti G,Stachniss C,Burgard W.Improved techniques for grid mapping with Rao-blackwellized particle filters[J].IEEE Transactions on Robotics,2007,23(1):34-46

    • [58] Kohlbrecher S,von Stryk O,Meyer J,et al.A flexible and scalable SLAM system with full 3D motion estimation[C]//2011 IEEE International Symposium on Safety,Security,and Rescue Robotics.October 31-November 5,2011,Kyoto,Japan.IEEE,2011:155-160

• [59] Konolige K,Grisetti G,Kümmerle R,et al.Efficient sparse pose adjustment for 2D mapping[C]//2010 IEEE/RSJ International Conference on Intelligent Robots and Systems.October 18-22,2010,Taipei,Taiwan.IEEE,2010.DOI:10.1109/IROS.2010.5649043

    • [60] Hess W,Kohler D,Rapp H,et al.Real-time loop closure in 2D LIDAR SLAM[C]//2016 IEEE International Conference on Robotics and Automation.May 16-21,2016,Stockholm,Sweden.IEEE,2016:1271-1278

    • [61] Zhang J,Singh S.LOAM:lidar odometry and mapping in real-time[J].Robotics:Science and Systems,2014.DOI:10.15607/RSS.2014.X.007

    • [62] Shan T X,Englot B.LeGO-LOAM:lightweight and ground-optimized lidar odometry and mapping on variable terrain[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS).October 1-5,2018,New York.ACM,2018:4758-4765

    • [63] Deschaud J E.IMLS-SLAM:scan-to-model matching based on 3D data[C]//2018 IEEE International Conference on Robotics and Automation.May 21-25,2018,Brisbane,QLD,Australia.IEEE,2018:2480-2485

    • [64] Pan Y,Xiao P C,He Y J,et al.MULLS:versatile LiDAR SLAM via multi-metric linear least square [J].arXiv e-print,2021,arXiv:2102.03771

    • [65] Li F F,Perona P.A Bayesian hierarchical model for learning natural scene categories[C]//2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition.June 20-25,2005,San Diego,CA,USA.IEEE,2005:524-531

    • [66] Li L J,Su H,Xing E P,et al.Objectbank:a high-level image representation for scene classification & semantic feature sparsification[J].Proceedings of the 23rd International Conference on Neural Information Processing Systems,2010,2:1378-1386

    • [67] Zhou B L,Lapedriza A,Khosla A,et al.Places:a 10 million image database for scene recognition[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2018,40(6):1452-1464

    • [68] Wen W S,Hsu L T,Zhang G H.Performance analysis of NDT-based graph SLAM for autonomous vehicle in diverse typical driving scenarios of Hong Kong [J].Sensors(Basel,Switzerland),2018,18(11):3928

    • [69] Hewitson S,Wang J L.Extended receiver autonomous integrity monitoring(eRAIM)for GNSS/INS integration[J].Journal of Surveying Engineering,2010,136(1):13-22

    • [70] Gakne P V,O'Keefe K.Tightly-coupled GNSS/vision using a sky-pointing camera for vehicle navigation in urban areas[J].Sensors(Basel,Switzerland),2018,18(4):1244

    • [71] Wen W S.3D LiDAR aided GNSS and its tightly coupled integration with INS via factor graph optimization[C]//33rd International Technical Meeting of the Satellite Division of the Institute of Navigation(ION GNSS+ 2020),2020.DOI:10.33012/2020.17557

    • [72] Dai W C,Zhang Y,Li P,et al.RGB-D SLAM in dynamic environments using point correlations[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2022,44(1):373-389

    • [73] Qian C L,Xiang Z H,Wu Z R,et al.RF-LIO:removal-first tightly-coupled lidar inertial odometry in high dynamic environments[C]//2021 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS).September 27-October 1,2021,Prague,Czech Republic.IEEE,2021:4421-4428

    • [74] Sukumar S R,Bozdogan H,Page D L,et al.Sensor selection using information complexity for multi-sensor mobile robot localization[C]//Proceedings 2007 IEEE International Conference on Robotics and Automation.April 10-14,2007,Rome,Italy.IEEE,2007:4158-4163

    • [75] Bian F,Kempe D,Govindan R.Utility-based sensor selection[C]//2006 5th International Conference on Information Processing in Sensor Networks.April 19-21,2006,Nashville,TN,USA.IEEE,2006:11-18

    • [76] Shamaiah M,Banerjee S,Vikalo H.Greedy sensor selection:leveraging submodularity[C]//49th IEEE Conference on Decision and Control.December 15-17,2010,Atlanta,GA,USA.IEEE,2010:2572-2577

    • [77] Joshi S,Boyd S.Sensor selection via convex optimization[J].IEEE Transactions on Signal Processing,2009,57(2):451-462

    • [78] Dissanayake G,Sukkarieh S,Nebot E,et al.The aiding of a low-cost strapdown inertial measurement unit using vehicle model constraints for land vehicle applications[J].IEEE Transactions on Robotics and Automation,2001,17(5):731-747

    • [79] Niu X J,Nassar S,El-Sheimy N.An accurate land-vehicle MEMS IMU/GPS navigation system using 3D auxiliary velocity updates[J].Navigation,2007,54(3):177-188

    • [80] Yang H J,Fan X Z,Shi P,et al.Nonlinear control for tracking and obstacle avoidance of a wheeled mobile robot with nonholonomic constraint[J].IEEE Transactions on Control Systems Technology,2016,24(2):741-746

    • [81] Scaramuzza D,Fraundorfer F,Pollefeys M,et al.Absolute scale in structure from motion from a single vehicle mounted camera by exploiting nonholonomic constraints[C]//2009 IEEE 12th International Conference on Computer Vision.September 29-October 2,2009,Kyoto,Japan.IEEE,2009:1413-1419

    • [82] 刘万科,农旗,陶贤露,等.非完整约束的OD/SINS自适应组合导航方法[J].测绘学报,2022,51(1):9-17;LIU Wanke,NONG Qi,TAO Xianlu,et al.OD/SINS adaptive integrated navigation method with non-holonomic constraints[J].Acta Geodaetica et Cartographica Sinica,2022,51(1):9-17

    • [83] Zhang Z X,Niu X J,Tang H L,et al.GNSS/INS/ODO/wheel angle integrated navigation algorithm for an all-wheel steering robot[J].Measurement Science and Technology,2021,32(11):115122

    • [84] Shin E H.Estimation techniques for low-cost inertial navigation[D].Calgary,Canada:University of Calgary,2005

    • [85] Ben Y Y,Yin G S,Gao W,et al.Improved filter estimation method applied in zero velocity update for SINS[C]//2009 International Conference on Mechatronics and Automation.August 9-12,2009,Changchun,China.IEEE,2009:3375-3380

    • [86] 方靖,顾启泰,丁天怀.车载惯性导航的动态零速修正技术[J].中国惯性技术学报,2008,16(3):265-268;FANG Jing,GU Qitai,DING Tianhuai.Dynamic zero velocity update for vehicle inertial navigation system[J].Journal of Chinese Inertial Technology,2008,16(3):265-268

    • [87] Liu W,Zhang Z.Research on zero velocity update for high-precision land-vehicle inertial navigation system[J].Navigation and Control,2013,12(2):29-33

    • [88] 高钟毓.惯性定位系统的卡尔曼滤波器设计[J].中国惯性技术学报,2000,8(4):6-12,20;GAO Zhongyu.Kalman filter design of inertial positioning system[J].Journal of Chinese Inertial Technology,2000,8(4):6-10,20

    • [89] Li X F,Mao Y L,Xie L,et al.Applications of zero-velocity detector and Kalman filter in zero velocity update for inertial navigation system[C]//Proceedings of 2014 IEEE Chinese Guidance,Navigation and Control Conference.August 8-10,2014,Yantai,China.IEEE,2014:1760-1763

    • [90] 奔粤阳,孙枫,高伟,等.惯导系统的零速校正技术研究[J].系统仿真学报,2008,20(17):4639-4642;BEN Yueyang,SUN Feng,GAO Wei,et al.Study of zero velocity update for inertial navigation[J].Journal of System Simulation,2008,20(17):4639-4642

    • [91] Grejner-Brzezinska D A,Yi Y D,Toth C K.Bridging GPS gaps in urban canyons:the benefits of ZUPTs[J].Navigation,2001,48(4):216-226

    • [92] Ramanandan A,Chen A N,Farrell J A.Inertial navigation aiding by stationary updates[J].IEEE Transactions on Intelligent Transportation Systems,2012,13(1):235-248

    • [93] Davidson P,Hautamäki J,Collin J,et al.Improved vehicle positioning in urban environment through integration of GPS and low-cost inertial sensors[C]//Proceedings of the European Navigation Conference(ENC'09).May 4-6,2009,Naples,Italy.2009:101-107

    • [94] Kasameyer P W,Hutchings L,Ellis M F,et al.MEMS-based INS tracking of personnel in a GPS-denied environment[C]//Proceedings of the 18th International Technical Meeting of the Satellite Division of the Institute of Navigation(ION GNSS 2005),2005:949-955

    • [95] Ojeda L,Borenstein J.Non-GPS navigation with the personal dead-reckoning system[C]//SPIE Defense Security Conference,Unmanned Systems Technology IX.April 9-13,2007,Orlando,Florida,USA.2007,6561:110-120

    • [96] Yu H.An algorithm to detect zero-velocity in automobiles using accelerometer signals[D].Tampere,Finland:Tampere University of Technology,2009

    • [97] Ramanandan A,Chen A,Farrell J A,et al.Detection of stationarity in an inertial navigation system[C]//Proceedings of the 23rd International Technical Meeting of the Satellite Division of the Institute of Navigation(ION GNSS 2010),2010:238-244

    • [98] Geneva P,Eckenhoff K,Lee W,et al.OpenVINS:a research platform for visual-inertial estimation[C]//2020 IEEE International Conference on Robotics and Automation.May 17-21,2020,Paris,France.IEEE,2020:4666-4672

    • [99] Petovello M G,Mezentsev O,Lachapelle G,et al.High sensitivity GPS velocity updates for personal indoor navigation using inertial navigation systems[C]//Proceedings of the 16th International Technical Meeting of the Satellite Division of the Institute of Navigation(ION GPS/GNSS 2003),2003:2886-2896

    • [100] Mezentsev O,Collin J,Lachapelle G.Vehicular navigation in urban canyons using a high sensitivity GPS receiver augmented with a medium-grade IMU[C]//10th Saint Petersburg International Conference on Integrated Navigation Systems,2003:64-70

    • [101] Zampella F,Khider M,Robertson P,et al.Unscented Kalman filter and magnetic angular rate update(MARU)for an improved pedestrian dead-reckoning[C]//Proceedings of the 2012 IEEE/ION Position,Location and Navigation Symposium.April 23-26,2012,Myrtle Beach,SC,USA.IEEE,2012:129-139

    • [102] Rajagopal S.Personal dead reckoning system with shoe mounted inertial sensors[R].Master's Degree Project,Stockholm,Sweden,2008:013

    • [103] Wan E A,Van Der Merwe R.The unscented Kalman filter for nonlinear estimation[C]//Proceedings of the IEEE 2000 Adaptive Systems for Signal Processing,Communications,and Control Symposium(Cat.No.00EX373).October 4,2000,Lake Louise,AB,Canada.IEEE,2000:153-158

    • [104] Wang R Z,Zou D P,Xu C Q,et al.An aerodynamic model-aided state estimator for multi-rotor UAVs[C]//2017 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS).September 24-28,2017,Vancouver,BC,Canada.IEEE,2017:2164-2170

    • [105] Nisar B,Foehn P,Falanga D,et al.VIMO:simultaneous visual inertial model-based odometry and force estimation[J].IEEE Robotics and Automation Letters,2019,4(3):2785-2792

    • [106] Ribeiro M I.Kalman and extended Kalman filters:concept,derivation and properties[R].Lisbon,Portugal:Institute for Systems and Robotics,2004:46

    • [107] Carpenter J,Clifford P,Fearnhead P.Improved particle filter for nonlinear problems[J].IEE Proceedings-Radar,Sonar and Navigation,1999,146(1):2-7

    • [108] Weiss S,Achtelik M W,Lynen S,et al.Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments[C]//2012 IEEE International Conference on Robotics and Automation.May 14-18,2012,Saint Paul,MN,USA.IEEE,2012:957-964

    • [109] Falquez J M,Kasper M,Sibley G.Inertial aided dense & semi-dense methods for robust direct visual odometry[C]//2016 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS).October 9-14,2016,Daejeon,Korea(South).IEEE,2016:3601-3607

    • [110] Se S,Lowe D,Little J J.Mobile robot localization and mapping with uncertainty using scale-invariant visual landmarks[J].The International Journal of Robotics Research,2002,21(8):735-760

    • [111] Jones E S,Soatto S.Visual-inertial navigation,mapping and localization:a scalable real-time causal approach[J].The International Journal of Robotics Research,2011,30(4):407-430

    • [112] Bloesch M,Omari S,Hutter M,et al.Robust visual inertial odometry using a direct EKF-based approach[C]//2015 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS).September 28-October 2,2015,Hamburg,Germany.IEEE,2015:298-304

    • [113] Mourikis A I,Roumeliotis S I.A multi-state constraint Kalman filter for vision-aided inertial navigation[C]//2007 IEEE International Conference on Robotics and Automation.April 10-14,2007,Rome,Italy.IEEE,2007:3565-3572

    • [114] Sun K,Mohta K,Pfrommer B,et al.Robust stereo visual inertial odometry for fast autonomous flight[J].IEEE Robotics and Automation Letters,2018,3(2):965-972

    • [115] Zou D P,Wu Y X,Pei L,et al.StructVIO:visual-inertial odometry with structural regularity of man-made environments[J].IEEE Transactions on Robotics,2019,35(4):999-1013

    • [116] Srinara S,Lee C M,Tsai S,et al.Performance analysis of 3D NDT scan matching for autonomous vehicles using INS/GNSS/3D LiDAR-SLAM integration scheme[C]//2021 IEEE International Symposium on Inertial Sensors and Systems.March 22-25,2021,Kailua-Kona,HI,USA.IEEE,2021:1-4

    • [117] Schütz A,Sánchez-Morales D E,Pany T.Precise positioning through a loosely-coupled sensor fusion of GNSS-RTK,INS and LiDAR for autonomous driving[C]//2020 IEEE/ION Position,Location and Navigation Symposium(PLANS).April 20-23,2020,Portland,OR,USA.IEEE,2020:219-225

    • [118] Chiang K W,Tsai G J,Chang H W,et al.Seamless navigation and mapping using an INS/GNSS/grid-based SLAM semi-tightly coupled integration scheme[J].Information Fusion,2019,50:181-196

    • [119] Chiang K W,Tsai G J,Chu H J,et al.Performance enhancement of INS/GNSS/refreshed-SLAM integration for acceptable lane-level navigation accuracy[J].IEEE Transactions on Vehicular Technology,2020,69(3):2463-2476

    • [120] Besl P J,McKay N D.A method for registration of 3-D shapes[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,1992,14(2):239-256

    • [121] Qin T,Cao S,Pan J,et al.A general optimization-based framework for global pose estimation with multiple sensors[J].arXiv e-print,2019,arXiv:1901.03642

    • [122] Gong Z,Liu P L,Wen F,et al.Graph-based adaptive fusion of GNSS and VIO under intermittent GNSS-degraded environment[J].IEEE Transactions on Instrumentation and Measurement,2021,70:1-16

    • [123] Li X X,Wang X B,Liao J C,et al.Semi-tightly coupled integration of multi-GNSS PPP and S-VINS for precise positioning in GNSS-challenged environments[J].Satellite Navigation,2021,2(1):1-14

    • [124] Wang Z Q,Li M,Zhou D K,et al.Direct sparse stereo visual-inertial global odometry[C]//2021 IEEE International Conference on Robotics and Automation.May 30-June 5,2021,Xi'an,China.IEEE,2021:14403-14409

    • [125] Liu J X,Gao W,Hu Z Y.Optimization-based visual-inertial SLAM tightly coupled with raw GNSS measurements[C]//2021 IEEE International Conference on Robotics and Automation.May 30-June 5,2021,Xi'an,China.IEEE,2021:11612-11618

    • [126] Wu K J,Guo C X,Georgiou G,et al.VINS on wheels[C]//2017 IEEE International Conference on Robotics and Automation.May 29-June 3,2017,Singapore.IEEE,2017:5155-5162

    • [127] Moore T,Stouch D.A generalized extended Kalman filter implementation for the robot operating system[M]//Intelligent Autonomous Systems 13.Cham:Springer International Publishing,2015:335-348
