[1] S Q Li, C Xu, M Xie. A robust O(n) solution to the Perspective-n-Point problem [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 34(7): 1444-1450.
[2] V Lepetit, F Moreno-Noguer, P Fua. EPnP: An accurate O(n) solution to the PnP problem [J]. International Journal of Computer Vision, 2009, 81(2): 155-166.
[3] L Ferraz, X Binefa, F Moreno-Noguer. Very fast solution to the PnP problem with algebraic outlier rejection [C]. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014. 501-508.
[4] J B Liu, X H Zhang, H B Liu, et al. New method for camera pose estimation based on line correspondence [J]. Science China Technological Sciences, 2013, 56(11): 2787-2797.
[5] L L Zhang, C Xu, K M Lee, et al. Robust and efficient pose estimation from line correspondences [M]. Computer Vision-ACCV 2012, Lecture Notes in Computer Science, 2013, 7726: 217-230.
[6] F M Mirzaei, S I Roumeliotis. Globally optimal pose estimation from line correspondences [C]. IEEE International Conference on Robotics and Automation (ICRA), 2011. 5581-5588.
[7] A Ansar, K Daniilidis. Linear pose estimation from points or lines [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2003, 25(5): 578-589.
[8] R Kumar, A R Hanson. Robust methods for estimating pose and a sensitivity analysis [J]. CVGIP: Image Understanding, 1994, 60(3): 313-342.
[10] C Harris, C Stennett. RAPID - a video rate object tracker [C]. British Machine Vision Conference (BMVC), 1990. 73-78.
[11] A Petit, E Marchand, K Kanani. Combining complementary edge, point and color cues in model-based tracking for highly dynamic scenes [C]. IEEE International Conference on Robotics and Automation (ICRA), 2014. 4115-4120.
[12] C Choi, H I Christensen. Real-time 3D model-based tracking using edge and keypoint features for robotic manipulation [C]. IEEE International Conference on Robotics and Automation (ICRA), 2010. 4048-4055.
[13] L Vacchetti, V Lepetit, P Fua. Stable real-time 3D tracking using online and offline information [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(10): 1385-1391.
[14] C Choi, H I Christensen. Robust 3D visual tracking using particle filtering on the special Euclidean group: A combined approach of keypoint and edge features [J]. The International Journal of Robotics Research, 2012, 31(4): 498-519.
[15] L Vacchetti, V Lepetit, P Fua. Combining edge and texture information for real-time accurate 3D camera tracking [C]. Third IEEE and ACM International Symposium on Mixed and Augmented Reality, 2004. 48-57.
[16] M Pressigout, E Marchand. Real-time hybrid tracking using edge and texture information [J]. International Journal of Robotics Research, 2007, 26(7): 1-46.
[17] M Pressigout, E Marchand. Real-time 3D model-based tracking: Combining edge and texture information [C]. IEEE International Conference on Robotics and Automation (ICRA), 2006. 2726-2731.
[18] A Petit, E Marchand, K Kanani. A robust model-based tracker combining geometrical and color edge information [C]. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013. 3719-3724.
[19] A Petit, E Marchand, K Kanani. Augmenting markerless complex 3D objects by combining geometrical and color edge information [C]. IEEE International Symposium on Mixed and Augmented Reality, 2013. 287-288.
[20] G Panin, E Roth, A Knoll. Robust contour-based object tracking integrating color and edge likelihoods [C]. Proceedings of the Vision, Modeling, and Visualization Conference, 2008. 227-234.
[21] A A Moughlbay, E Cervera, P Martinet. Model based visual servoing tasks with an autonomous humanoid robot [J]. Frontiers of Intelligent Autonomous Systems, 2013, 466: 149-162.
[22] A A Moughlbay, E Cervera, P Martinet. Real-time model based visual servoing tasks on a humanoid robot [M]. Intelligent Autonomous Systems 12, Advances in Intelligent Systems and Computing, 2013. 321-333.
[23] C Teuliere, E Marchand, L Eck. Using multiple hypothesis in model-based tracking [C]. IEEE International Conference on Robotics and Automation (ICRA), 2010. 4559-4565.
[24] L Vacchetti, V Lepetit, P Fua. Fusing online and offline information for stable 3D tracking in real-time [C]. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2003. 241-248.
[25] J A Brown, D W Capson. A framework for 3D model-based visual tracking using a GPU-accelerated particle filter [J]. IEEE Transactions on Visualization and Computer Graphics, 2012, 18(1): 68-80.
[26] T Morwald, M Zillich, M Vincze. Edge tracking of textured objects with a recursive particle filter [C]. 19th International Conference on Computer Graphics and Vision (Graphicon), 2009. 96-103.
[27] M Pupilli, A Calway. Real-time camera tracking using known 3D models and a particle filter [C]. 18th International Conference on Pattern Recognition (ICPR'06), 2006. 199-203.
[28] P Barrera, J M Canas, V Matellan. Visual object tracking in 3D with color based particle filter [J]. International Journal of Information Technology, 2005, 2(1): 1123-1126.
[29] F Ababsa, M Mallem. Robust camera pose tracking for augmented reality using particle filtering framework [J]. Machine Vision and Applications, 2011, 22(1): 181-195.
[30] G Klein, D W Murray. Full-3D edge tracking with a particle filter [C]. British Machine Vision Conference (BMVC), 2006. 1119-1128.
[31] R Grompone von Gioi, J Jakubowicz, J M Morel, et al. LSD: A fast line segment detector with a false detection control [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 32(4): 722-732.
[32] C Akinlar, C Topal. EDLines: A real-time line segment detector with a false detection control [J]. Pattern Recognition Letters, 2011, 32(13): 1633-1642.
[33] J Dong, X Yang, Q F Yu. Fast line segment detection based on edge connecting [J]. Acta Optica Sinica, 2013, 33(3): 0315003.