[1] GUERRA E, MUNGUÍA R, GRAU A.UAV visual and laser sensors fusion for detection and positioning in industrial applications[J].Sensors, 2018, 18(7):738-757.
[6] GAO W, BOOKER M, ADIWAHONO A, et al.An improved frontier-based approach for autonomous exploration[C]//Proceedings of the International Conference on Control, Automation, Robotics & Vision, 2018:292-297.
[7] VIDAL E, PALOMERAS N, ISTENIČ K, et al.Multisensor online 3D view planning for autonomous underwater exploration[J].Journal of Field Robotics, 2020, 37(6):1123-1147.
[8] BIRCHER A, KAMEL M, ALEXIS K, et al.Receding horizon path planning for 3D exploration and surface inspection[J].Autonomous Robots, 2018, 42(2):291-306.
[9] CIESLEWSKI T, KAUFMANN E, SCARAMUZZA D.Rapid exploration with multi-rotors:a frontier selection method for high speed flight[C]//Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017:2135-2142.
[10] SELIN M, TIGER M, DUBERG D, et al.Efficient autonomous exploration planning of large-scale 3-D environments[J].IEEE Robotics and Automation Letters, 2019, 4(2):1699-1706.
[11] DAI A, PAPATHEODOROU S, FUNK N, et al.Fast frontier-based information-driven autonomous exploration with an MAV[C]//IEEE International Conference on Robotics and Automation (ICRA), 2020:91-97.
[12] HORNUNG A, WURM K M, BENNEWITZ M, et al.OctoMap:an efficient probabilistic 3D mapping framework based on octrees[J].Autonomous Robots, 2013, 34(3):189-206.
[13] QIN T, LI P, SHEN S.VINS-Mono:a robust and versatile monocular visual-inertial state estimator[J].IEEE Transactions on Robotics, 2018, 34(4):1004-1020.
[14] CONNOLLY C.The determination of next best views[C]//Proceedings of the IEEE International Conference on Robotics and Automation, 1985:432-435.
[15] FURRER F, BURRI M, ACHTELIK M, et al.RotorS—a modular Gazebo MAV simulator framework[M]//KOUBAA A.Robot operating system (ROS):the complete reference.Berlin: Springer International Publishing, 2016:595-625.