[1] Space Telescope Science Institute. Hubble Space Telescope Primer for Cycle 15[EB/OL]. http://documents.stsci.edu/hst/proposing/documents/cp_cy15/primer.pdf.
[2] Hirzinger G, Landzettel K, Brunner B, et al. DLR's robotics technologies for on-orbit servicing[J]. Advanced Robotics, 2004, 18(2): 139-174.
[3] Nishida S I, Kawamoto S, Okawa Y, et al. Space debris removal system using a small satellite[J]. Acta Astronautica, 2009, 65(1-2): 95-102.
[4] Committee on the Assessment of Options for Extending the Life of the Hubble Space Telescope, National Research Council. Assessment of Options for Extending the Life of the Hubble Space Telescope: Final Report[M]. Washington: National Academies Press, 2005. 8-91.
[5] Liang Bin, Du Xiaodong, Li Cheng, et al. Advances in space robot on-orbit servicing for non-cooperative spacecraft[J]. Robot, 2012, 34(2): 242-256.
[6] Xu W, Liang B, Li C, et al. Autonomous rendezvous and robotic capturing of non-cooperative target in space[J]. Robotica, 2010, 28(5): 705-718.
[7] Zhang Yongliang. Key Techniques of Image Information Processing for Non-Cooperative Target-Directed Autonomous Spacecraft[D]. Changsha: National University of Defense Technology, 2009. 3-15.
[8] Zhang Shijie, Cao Xibin, Chen Min. Monocular vision based relative pose parameters determination for non-cooperative spacecrafts[J]. Journal of Nanjing University of Science and Technology, 2006, 30(5): 564-567.
[9] Davison A J. Real-time simultaneous localization and mapping with a single camera[C]. Proc. IEEE International Conference on Computer Vision (ICCV), 2003. 1403-1410.
[10] Augenstein S, Rock S M. Simultaneous estimation of target pose and 3-D shape using the FastSLAM algorithm[C]. AIAA Guidance, Navigation and Control Conference (GNC), 2009. 1-15.
[11] Augenstein S, Rock S M. Improved frame-to-frame pose tracking during vision-only SLAM/SFM with a tumbling target[C]. IEEE International Conference on Robotics and Automation (ICRA), 2011. 3131-3138.
[12] Augenstein S, Rock S M. AUV/ROV pose and shape estimation of tethered targets without fiducials[C]. International Symposium on Unmanned Untethered Submersible Technology (UUST), 2009. 366-374.
[13] Fuyuto T, Heihachiro K, Shin'ichiro N. Motion estimation to a failed satellite on orbit using stereo vision and 3D model matching[C]. International Conference on Control Automation Robotics & Vision (ICARCV), 2006. 1-8.
[14] Segal S, Carmi A, Gurfil P. Vision-based relative state estimation of non-cooperative spacecraft under modeling uncertainty[C]. IEEE Aerospace Conference, 2011. 1-7.
[15] Besl P J, McKay N D. A method for registration of 3-D shapes[J]. IEEE Trans Pattern Analysis and Machine Intelligence, 1992, 14(2): 239-256.
[16] English C, Okouneva G, Saint-Cyr P, et al. Real-time dynamic pose estimation systems in space: lessons learned for system design and performance evaluation[J]. International J Intelligent Control and Systems, 2011, 16(2): 79-96.
[17] Ruel S, English C, Anctil M, et al.. 3DLASSO: real-time pose estimation from 3D data for autonomous satellite servicing[C]. The 8th International Symposium on Artificial Intelligence, Robotics and Automation in Space (ISAIRAS 2005), 2005. 1-8.
[18] Blais F, Picard M, Godin G. Accurate 3D acquisition of freely moving objects[C]. 2nd International Symposium on 3D Data Processing, Visualization and Transmission, 2004. 422-429.
[19] Russell S P, Rock S M. Particle filtering range data for pose estimation under torque-free motion[C]. 14th IASTED International Conference on Robotics and Applications, 2009. 28-36.
[20] Miller L K, Masciarelli J, Rohrschneider R, et al. Critical advancement in telerobotic servicing vision technology[C]. AIAA SPACE Conference & Exposition, 2010. 1-9.
[21] Huhle B, Fleck S, Schilling A. Integrating 3D time-of-flight camera data and high resolution images for 3DTV applications[C]. 3DTV Conference-The True Vision, 2007. 1-4.
[22] Netramai C, Melnychuk O, Joochim C, et al. Combining PMD and stereo camera for motion estimation of a mobile robot[C]. 17th World Congress of the International Federation of Automatic Control, 2008. 5417-5422.
[23] Reulke R. Combination of distance data with high resolution images[C]. ISPRS, Commission V Symposium, Image Engineering and Vision Metrology, 2006, XXXVI Part 5: 1-5.
[24] Hahne U, Alexa M. Combining time-of-flight depth and stereo images without accurate extrinsic calibration[C]. International Workshop in Conjunction with DAGM'07: Dynamic 3D Imaging, 2007. 325-333.
[25] Árni S, Aanæs H, Larsen R. Fusion of stereo vision and time-of-flight imaging for improved 3D estimation[C]. International Workshop in Conjunction with DAGM'07: Dynamic 3D Imaging, 2007. 425-433.
[26] Kuhnert K D, Stommel M. Fusion of stereo-camera and PMD-camera data for real-time suited precise 3D environment reconstruction[C]. International Conference on Intelligent Robots and Systems, 2006. 4780-4785.
[27] Ghobadi S E. Real Time Object Recognition and Tracking Using 2D/3D Images[D]. Siegen: Siegen University, 2010. 3-25.
[28] Zhang Xudong, Gao Juan, Ye Zirui, et al. Non-Cooperative Target Pose Measurement Method Based on 2D and 3D Camera Integration[P]. China Patent 201210044842.8, 2012-07-18.
[29] Amiri P J, Gruen A. Integrated laser scanner and intensity image calibration and accuracy assessment[C]. ISPRS Workshop Laser Scanning, 2005, XXXVI-3/W19: 18-23.
[30] Baltzakis H, Argyros A, Trahanias P. Fusion of range and visual data for the extraction of scene structure information[C]. International Conference on Pattern Recognition (ICPR), 2002, 4: 7-11.
[31] Reulke R, Scheibe K, Wehr A. Integration of digital panoramic camera and laser scanner data[C]. International Workshop on Recording, Modeling and Visualization of Cultural Heritage, 2005. 157-169.
[32] Muehlbauer Q, Kuehnlenz K, Buss M. Fusing laser and vision data with a genetic ICP algorithm[C]. 10th International Conference on Control, Automation, Robotics and Vision, 2008. 1844-1849.
[33] Perrollaz M, Labayrade R, Royère C, et al. Long range obstacle detection using laser scanner and stereovision[C]. Intelligent Vehicles Symposium, 2006. 182-187.
[34] Valls M J, Dissanayake G. Robotic 3D visual mapping for augmented situational awareness in unstructured environments[C]. International Workshop on Robotics for Risky Interventions and Surveillance of the Environment (RISE'08), 2008.
[35] Zhu J, Wang L, Yang R, et al.. Fusion of time-of-flight depth and stereo for high accuracy depth maps[C]. Computer Vision and Pattern Recognition (CVPR), 2008.
[36] Ruel S, Luu T. Space shuttle testing of the TriDAR 3D rendezvous and docking sensor[J]. J. Field Robotics, 2012, 29(4): 535-553.
[37] Ruel S, Luu T. STS-128 on-orbit demonstration of the TriDAR targetless rendezvous and docking sensor[C]. IEEE Aerospace Conference, 2010.
[38] Joseph M, John V, Matt S, et al. Pose measurement performance of the Argon relative navigation sensor suite in simulated flight conditions[R]. American Institute of Aeronautics and Astronautics, 2012. 1-25.
[39] Naasz B J, Eepoel J V, Queen S Z. Flight results from the HST SM4 relative navigation sensor system[C]. 33rd AAS Guidance and Control Conference, 2010. 723-744.