• Laser & Optoelectronics Progress
  • Vol. 57, Issue 4, 040001 (2020)
Rongsheng Lu1,*, Yanqiong Shi2,**, and Haibing Hu1
Author Affiliations
  • 1School of Instrument Science and Opto-Electronics Engineering, Hefei University of Technology, Hefei, Anhui 230009, China
  • 2School of Mechanical and Electrical Engineering, Anhui Jianzhu University, Hefei, Anhui 230601, China
    DOI: 10.3788/LOP57.040001
    Rongsheng Lu, Yanqiong Shi, Haibing Hu. Review of Three-Dimensional Imaging Techniques for Robotic Vision[J]. Laser & Optoelectronics Progress, 2020, 57(4): 040001
    References

    [1] Lu R S, Wu A, Zhang T D et al. Review on automated optical (visual) inspection and its applications in defect detection[J]. Acta Optica Sinica, 38, 0815002(2018).

    [2] He Z X, Wu C R, Zhang S Y et al. Moment-based 2.5-D visual servoing for textureless planar part grasping[J]. IEEE Transactions on Industrial Electronics, 66, 7821-7830(2019).

    [3] Malis E, Chaumette F, Boudet S. 2 1/2 D visual servoing[J]. IEEE Transactions on Robotics and Automation, 15, 238-250(1999).

    [4] Kragic D, Christensen H I. Survey on visual servoing for manipulation[R]. Report from Computational Vision and Active Perception Laboratory (CVAP), 1-59(2002).

    [5] Dong G Q, Zhu Z H. Kinematics-based incremental visual servo for robotic capture of non-cooperative target[J]. Robotics and Autonomous Systems, 112, 221-228(2019).

    [6] Xu D, Tan M, Li Y. Visual measurement and control for robots[M]. 3rd ed(2016).

    [7] Hashimoto K. A review on vision-based control of robot manipulators[J]. Advanced Robotics, 17, 969-991(2003).

    [8] Ozato A, Maru N. Position and attitude control of eye-in-hand system by visual servoing using binocular visual space. [C]∥2014 World Automation Congress (WAC), August 3-7, 2014. Waikoloa, HI. IEEE, 7-12(2014).

    [9] Walck G, Drouin M. Progressive 3D reconstruction of unknown objects using one eye-in-hand camera. [C]∥2009 IEEE International Conference on Robotics and Biomimetics (ROBIO), Guilin, 971-976(2009).

    [10] Nobakht H, Liu Y. A hybrid positioning method for eye-in-hand industrial robot by using 3D reconstruction and IBVS. [C]∥2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), December 6-9, 2015. Zhuhai. IEEE, 2609-2614(2015).

    [11] Hentout A, Bouzouia B, Akli I et al. Multi-agent control architecture of mobile manipulators: extraction of 3D coordinates of object using an eye-in-hand camera. [C]∥2009 3rd International Conference on Signals, Circuits and Systems (SCS), November 6-8, 2009. Medenine, Tunisia. IEEE, 1-6(2009).

    [12] Shaw J, Cheng K Y. Object identification and 3D position calculation using eye-in-hand single camera for robot gripper. [C]∥2016 IEEE International Conference on Industrial Technology (ICIT), March 14-17, 2016. Taipei, Taiwan, China. IEEE, 1622-1625(2016).

    [13] Luo R C, Chou S C, Yang X Y et al. Hybrid eye-to-hand and eye-in-hand visual servo system for parallel robot conveyor object tracking and fetching. [C]∥IECON 2014 - 40th Annual Conference of the IEEE Industrial Electronics Society, October 29-November 1, 2014. Dallas, TX, USA. IEEE, 2558-2563(2014).

    [14] Flandin G, Chaumette F, Marchand E. Eye-in-hand/eye-to-hand cooperation for visual servoing. [C]∥Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No. 00CH37065), San Francisco, CA, USA. IEEE, 3, 2741-2746(2000).

    [15] Muis A, Ohnishi K. Eye-to-hand approach on eye-in-hand configuration within real-time visual servoing[J]. IEEE/ASME Transactions on Mechatronics, 10, 404-410(2005).

    [16] Lippiello V, Siciliano B, Villani L. Eye-in-hand/eye-to-hand multi-camera visual servoing. [C]∥Proceedings of the 44th IEEE Conference on Decision and Control, Seville, Spain. IEEE, 5354-5359(2005).

    [17] Zhao Y S, Gong L, Huang Y X et al. A review of key techniques of vision-based control for harvesting robot[J]. Computers and Electronics in Agriculture, 127, 311-323(2016).

    [18] Elarbi-Boudihir M, Al-Shalfan K A. Eye-in-hand/eye-to-hand configuration for a WMRA control based on visual servoing. [C]∥2013 IEEE 11th International Workshop of Electronics, Control, Measurement, Signals and their application to Mechatronics, June 24-26, 2013. Toulouse Cedex 7, France. IEEE, 1-8(2013).

    [19] Brown G M. Overview of three-dimensional shape measurement using optical methods[J]. Optical Engineering, 39, 10-22(2000).

    [20] Bi Z M, Wang L H. Advances in 3D data acquisition and processing for industrial applications[J]. Robotics and Computer-Integrated Manufacturing, 26, 403-413(2010).

    [21] Aggarwal J K, Xia L. Human activity recognition from 3D data: a review[J]. Pattern Recognition Letters, 48, 70-80(2014).

    [22] Maier-Hein L, Mountney P, Bartoli A et al. Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery[J]. Medical Image Analysis, 17, 974-996(2013).

    [23] Schöning J, Heidemann G. Taxonomy of 3D sensors - a survey of state-of-the-art consumer 3D-reconstruction sensors and their field of applications. [C]∥Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, February 27-29, 2016. Rome, Italy. SCITEPRESS - Science and Technology Publications(2016).

    [24] Remondino F, Stoppa D, eds. TOF range-imaging cameras[M]. London: Springer(2013).

    [25] Besl P J. Active, optical range imaging sensors[J]. Machine Vision and Applications, 1, 127-152(1988).

    [26] Lange R, Seitz P. Seeing distances-a fast time-of-flight 3D camera[J]. Sensor Review, 20, 212-217(2000).

    [29] Reiser U, Kubacki J. Using a 3D time-of-flight range camera for visual tracking[J]. IFAC Proceedings Volumes, 40, 355-360(2007).

    [30] Ollikkala A V H, Makynen A J. Range imaging using a time-of-flight 3D camera and a cooperative object. [C]∥2009 IEEE Instrumentation and Measurement Technology Conference, May 5-7, 2009. Singapore. IEEE, 817-821(2009).

    [31] Alenyà G, Foix S, Torras C. ToF cameras for active vision in robotics[J]. Sensors and Actuators A: Physical, 218, 10-22(2014).

    [32] Piatti D, Rinaudo F. SR-4000 and CamCube3.0 time of flight (ToF) cameras: tests and comparison[J]. Remote Sensing, 4, 1069-1089(2012).

    [33] Hebert M. Active and passive range sensing for robotics. [C]∥Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No. 00CH37065), San Francisco, CA, USA. IEEE, 1, 102-110(2000).

    [34] Chougule V N, Gosavi H S, Dharwadkar M M et al. Review of different 3D scanners and scanning techniques. [C]∥IOSR Journal of Engineering (IOSRJEN), 7th National conference on Recent Developments In Mechanical Engineering RDME-2018, 41-44(2018).

    [35] de Sousa G B, Olabi A, Palos J et al. 3D metrology using a collaborative robot with a laser triangulation sensor[J]. Procedia Manufacturing, 11, 132-140(2017).

    [36] Ettl S. Introductory review on ‘Flying Triangulation’: a motion-robust optical 3D measurement principle[J]. Contemporary Physics, 56, 144-158(2015).

    [37] Ettl S, Arold O, Yang Z et al. Flying triangulation: an optical 3D sensor for the motion-robust acquisition of complex objects[J]. Applied Optics, 51, 281-289(2012).

    [38] Arold O, Ettl S, Willomitzer F et al. Hand-guided 3D surface acquisition by combining simple light sectioning with real-time algorithms[EB/OL]. [2020-01-20]. https://arxiv.org/abs/1401.1946.

    [39] Hillenbrand M, Mitschunas B, Brill F A G et al. Spectral characteristics of chromatic confocal imaging systems[J]. Applied Optics, 53, 7634-7642(2014).

    [40] Hillenbrand M, Lorenz L, Kleindienst R et al. Spectrally multiplexed chromatic confocal multipoint sensing[J]. Optics Letters, 38, 4694-4697(2013).

    [41] Hillenbrand M, Weiss R, Endrödy C et al. Chromatic confocal matrix sensor with actuated pinhole arrays[J]. Applied Optics, 54, 4927-4936(2015).

    [42] Beyerer J, Puente León F, Frese C. Chapter 7: Methods of image acquisition[M]∥Machine vision -- automated visual inspection: theory, practice and applications. London: Springer, 307-308(2016).

    [43] Taphanel M, Zink R, Langle T et al. Multiplex acquisition approach for high-speed 3D measurements with a chromatic confocal microscope[J]. Proceedings of SPIE, 9525, 95250Y(2015).

    [44] Taphanel M, Beyerer J. Fast 3D in-line sensor for specular and diffuse surfaces combining the chromatic confocal and triangulation principle. [C]∥2012 IEEE International Instrumentation and Measurement Technology Conference Proceedings, May 13-16, 2012. Graz, Austria. IEEE, 1072-1077(2012).

    [45] Lin P C, Sun P C, Zhu L J et al. Single-shot depth-section imaging through chromatic slit-scan confocal microscopy[J]. Applied Optics, 37, 6764-6770(1998).

    [46] Chun B S, Kim K, Gweon D. Three-dimensional surface profile measurement using a beam scanning chromatic confocal microscope[J]. The Review of Scientific Instruments, 80, 073706(2009).

    [47] Zhong K, Li Z W, Zhou X H et al. Enhanced phase measurement profilometry for industrial 3D inspection automation[J]. The International Journal of Advanced Manufacturing Technology, 76, 1563-1574(2015).

    [48] Servin M, Padilla M, Garnica G et al. Profilometry of three-dimensional discontinuous solids by combining two-steps temporal phase unwrapping, co-phased profilometry and phase-shifting interferometry[J]. Optics and Lasers in Engineering, 87, 75-82(2016).

    [49] Servin M, Garnica G, Padilla J M. 360-degree profilometry of discontinuous solids co-phasing 2-projectors and 1-camera[EB/OL]. [2020-01-20]. https://arxiv.org/abs/1408.6463.

    [50] Servin M, Garnica G, Estrada J C et al. Coherent digital demodulation of single-camera N-projections for 3D-object shape measurement: Co-phased profilometry[J]. Optics Express, 21, 24873(2013).

    [51] Texas Instruments. Getting started with TI DLP® display technology: application report[EB/OL]. [2020-01-20]. http://www.ti.com.cn/cn/lit/an/dlpa059c/dlpa059c.pdf.

    [52] Sun W S, Chiang Y C, Tsuei C H. Optical design for the DLP pocket projector using LED light source[J]. Physics Procedia, 19, 301-307(2011).

    [53] Ishiyama H, Terabayashi K, Umeda K. A 100 Hz real-time sensing system of textured range images. [C]∥International Symposium on Optomechatronic Technologies. IEEE(2011).

    [54] van der Jeught S, Dirckx J J J. Real-time structured light profilometry: a review[J]. Optics and Lasers in Engineering, 87, 18-31(2016).

    [55] Geng J. Structured-light 3D surface imaging: a tutorial[J]. Advances in Optics and Photonics, 3, 128-160(2011).

    [56] Fernandez S. One-shot pattern projection for dense and accurate 3D acquisition in structured light[D]. Girona: University of Girona(2012).

    [57] Salvi J, Fernandez S, Pribanic T et al. A state of the art in structured light patterns for surface profilometry[J]. Pattern Recognition, 43, 2666-2680(2010).

    [58] Lin H B, Nie L, Song Z. A single-shot structured light means by encoding both color and geometrical features[J]. Pattern Recognition, 54, 178-189(2016).

    [59] Chen L C, Nguyen X L. Dynamic 3D surface profilometry using a novel colour pattern encoded with a multiple triangular model[J]. Measurement Science and Technology, 21, 054009(2010).

    [60] Zhang Z H. Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques[J]. Optics and Lasers in Engineering, 50, 1097-1106(2012).

    [61] Tang S M, Zhang X, Song Z et al. Robust pattern decoding in shape-coded structured light[J]. Optics and Lasers in Engineering, 96, 50-62(2017).

    [62] Khoshelham K, Elberink S O. Accuracy and resolution of Kinect depth data for indoor mapping applications[J]. Sensors, 12, 1437-1454(2012).

    [63] Lin C, Li Y, Xu G et al. Optimizing ZNCC calculation in binocular stereo matching[J]. Signal Processing: Image Communication, 52, 64-73(2017).

    [64] Birchfield S, Tomasi C. A pixel dissimilarity measure that is insensitive to image sampling[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20, 401-406(1998).

    [65] Mei X, Sun X, Zhou M C et al. On building an accurate stereo matching system on graphics hardware. [C]∥2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), November 6-13, 2011. Barcelona, Spain. IEEE, 467-474(2011).

    [66] Hirschmuller H. Accurate and efficient stereo processing by semi-global matching and mutual information. [C]∥2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), San Diego, CA, USA. IEEE, 2, 807-814(2005).

    [67] Hirschmuller H. Stereo processing by semiglobal matching and mutual information[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30, 328-341(2008).

    [68] Hong L, Chen G. Segment-based stereo matching using graph cuts. [C]∥Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, DC, USA. IEEE(2004).

    [70] Salvi J, Pagès J, Batlle J. Pattern codification strategies in structured light systems[J]. Pattern Recognition, 37, 827-849(2004).

    [71] Ayubi G A, di Martino J M, Flores J L et al. Binary coded linear fringes for three-dimensional shape profiling[J]. Optical Engineering, 51, 103601(2012).

    [72] Zhang Q C, Su X Y, Xiang L Q et al. 3D shape measurement based on complementary Gray-code light[J]. Optics and Lasers in Engineering, 50, 574-579(2012).

    [73] Zuo C, Feng S J, Huang L et al. Phase shifting algorithms for fringe projection profilometry: a review[J]. Optics and Lasers in Engineering, 109, 23-59(2018).

    [74] Mao C L, Lu R S, Dong J T et al. Overview of the 3D profilometry of phase shifting fringe projection[J]. Acta Metrologica Sinica, 39, 628-640(2018).

    [75] Zuo C, Huang L, Zhang M L et al. Temporal phase unwrapping algorithms for fringe projection profilometry: a comparative review[J]. Optics and Lasers in Engineering, 85, 84-103(2016).

    [76] Talebi R, Abdel-Dayem A et al. Computer, 291-304(2014).

    [77] Posdamer J L, Altschuler M D. Surface measurement by space-encoded projected beam systems[J]. Computer Graphics and Image Processing, 18, 1-17(1982).

    [78] Moreno D, Taubin G. Simple, accurate, and robust projector-camera calibration. [C]∥2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission, October 13-15, 2012. Zurich, Switzerland. IEEE, 464-471(2012).

    [79] Mao C L, Lu R S, Liu Z J. A multi-frequency inverse-phase error compensation method for projector nonlinear in 3D shape measurement[J]. Optics Communications, 419, 75-82(2018).

    [80] Zheng D L, Da F P, Qian K M et al. Phase-shifting profilometry combined with Gray-code patterns projection: unwrapping error removal by an adaptive median filter[J]. Optics Express, 25, 4700-4713(2017).

    [81] Wu Z J, Guo W B, Zhang Q C. High-speed three-dimensional shape measurement based on shifting Gray-code light[J]. Optics Express, 27, 22631-22644(2019).

    [82] Sansoni G, Carocci M, Rodella R. Three-dimensional vision based on a combination of Gray-code and phase-shift light projection: analysis and compensation of the systematic errors[J]. Applied Optics, 38, 6565-6573(1999).

    [83] Balzer J, Werling S. Principles of shape from specular reflection[J]. Measurement, 43, 1305-1317(2010).

    [84] Häusler G, Knauer M C, Faber C et al. Deflectometry: 3D-metrology from nanometer to meter[M]. ∥Fringe 2009. Berlin, Heidelberg: Springer Berlin Heidelberg, 1-6(2009).

    [85] Zhang H W, Ji L S, Liu S G et al. Three-dimensional shape measurement of a highly reflected, specular surface with structured light method[J]. Applied Optics, 51, 7724-7732(2012).

    [86] Zhang Z H, Wang Y M, Huang S J et al. Three-dimensional shape measurements of specular objects using phase-measuring deflectometry[J]. Sensors, 17, 2835(2017).

    [87] Liu M M, Hartley R, Salzmann M. Mirror surface reconstruction from a single image[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37, 760-773(2015).

    [88] Savarese S, Chen M, Perona P. Local shape from mirror reflections[J]. International Journal of Computer Vision, 64, 31-67(2005).

    [89] Knauer M C, Kaminski J, Hausler G. Phase measuring deflectometry: a new approach to measure specular free-form surfaces. [C]∥Optical Metrology in Production Engineering. International Society for Optics and Photonics(2004).

    [90] Wang Z, Inokuchi S. Determining shape of specular surfaces. [C]∥The 8th Scandinavian Conference on Image Analysis, Tromso, Norway, May 25-28, 1187-1190(1993).

    [91] Ren H Y, Gao F, Jiang X Q. Iterative optimization calibration method for stereo deflectometry[J]. Optics Express, 23, 22060-22068(2015).

    [92] Xu Y J, Gao F, Zhang Z H et al. A holistic calibration method with iterative distortion compensation for stereo deflectometry[J]. Optics and Lasers in Engineering, 106, 111-118(2018).

    [93] Nguyen H, Nguyen D, Wang Z Y et al. Real-time, high-accuracy 3D imaging and shape measurement[J]. Applied Optics, 54, A9-A17(2015).

    [94] Zhang S. Recent progresses on real-time 3D shape measurement using digital fringe projection techniques[J]. Optics and Lasers in Engineering, 48, 149-158(2010).

    [95] Howard I P, Rogers B J. Development and pathology of binocular vision[M], 603-644(1996).

    [96] Trucco E, Verri A. Chapter 7: Stereopsis[M]∥Introductory techniques for 3D computer vision. Upper Saddle River, NJ, USA: Prentice Hall PTR, 139-177(1998).

    [97] Ge G T, Cheng Z Q, Ke P P et al. Depth map extracting based on geometric perspective: an applicable 2D to 3D conversion technology. [C]∥2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI), October 14-16, 2017. Shanghai. IEEE, 1-5(2017).

    [98] Watt S J, Akeley K, Ernst M O et al. Focus cues affect perceived depth[J]. Journal of Vision, 5, 834-862(2005).

    [99] Billiot B, Cointault F, Journaux L et al. 3D image acquisition system based on shape from focus technique[J]. Sensors, 13, 5040-5053(2013).

    [100] Schechner Y Y, Kiryati N. Depth from defocus vs. stereo: how different really are they? [C]∥Proceedings. Fourteenth International Conference on Pattern Recognition (Cat. No. 98EX170), Brisbane, Qld., Australia. IEEE Comput. Soc(1998).

    [101] Furukawa Y, Hernández C. Multi-view stereo: a tutorial[J]. Foundations and Trends® in Computer Graphics and Vision, 9, 1-148(2015).

    [102] Moons T. 3D reconstruction from multiple images part 1: principles[J]. Foundations and Trends® in Computer Graphics and Vision, 4, 287-404(2008).

    [103] Wang G H, Cheng J. Three-dimensional reconstruction of hybrid surfaces using perspective shape from shading[J]. Optik, 127, 7740-7751(2016).

    [104] Durou J D, Falcone M, Sagona M. Numerical methods for shape-from-shading: a new survey with benchmarks[J]. Computer Vision and Image Understanding, 109, 22-43(2008).

    [105] Kim H G R, Angelaki D E, DeAngelis G C. The neural basis of depth perception from motion parallax[J]. Philosophical Transactions of the Royal Society B: Biological Sciences, 371, 20150256(2016).

    [106] Kellnhofer P, Didyk P, Ritschel T et al. Motion parallax in stereo 3D[J]. ACM Transactions on Graphics, 35, 1-12(2016).

    [107] Reskó B, Herbay D, Krasznai P et al. 3D image sensor based on parallax motion[J]. Acta Polytechnica Hungarica, 4, 37-53(2007).

    [108] Zhou F Q, Wang Y X, Chai X H et al. Review on precise measurement technology based on mirror binocular vision[J]. Acta Optica Sinica, 38, 0815003(2018).

    [109] Solgi M, Weng J Y. WWN-8: incremental online stereo with shape-from-X using life-long big data from multiple modalities[J]. Procedia Computer Science, 53, 316-326(2015).

    [110] Blake R, Wilson H. Binocular vision[J]. Vision Research, 51, 754-770(2011).

    [111] Qian N. Binocular disparity and the perception of depth[J]. Neuron, 18, 359-368(1997).

    [112] Bagga P J. Real time depth computation using stereo imaging[J]. Journal of Electrical and Electronic Engineering, 1, 51-54(2013).

    [113] Mattoccia S. Stereo vision: algorithms and applications[EB/OL]. [2020-01-20]. http://vision.deis.unibo.it/~smatt/Seminars/StereoVision.pdf.

    [114] Moulon P, Monasse P, Marlet R. Adaptive structure from motion with a contrario model estimation[M]. ∥Computer Vision-ACCV 2012. Berlin, Heidelberg: Springer Berlin Heidelberg, 257-270(2013).

    [115] Seitz S M, Curless B, Diebel J et al. A comparison and evaluation of multi-view stereo reconstruction algorithms. [C]∥2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Volume 1 (CVPR'06), New York, NY, USA. IEEE, 1, 519-528(2006).

    [116] Hartley R, Zisserman A. Multiple view geometry in computer vision[M]. 2nd ed. Cambridge: Cambridge University Press(2004).

    [117] Szeliski R. Computer vision: algorithms and applications[M]. London: Springer(2011).

    [120] Ng R. Digital light field photography[D]. Stanford: Stanford University(2006).

    [121] Zhu H, Wang Q, Yu J Y. Light field imaging: models, calibrations, reconstructions, and applications[J]. Frontiers of Information Technology & Electronic Engineering, 18, 1236-1249(2017).

    [122] Abdelhamid M. Extracting depth information from stereo vision system, using a correlation and a feature based methods[D]. Clemson: Clemson University(2011).

    [123] Scharstein D, Szeliski R, Zabih R. A taxonomy and evaluation of dense two-frame stereo correspondence algorithms. [C]∥Proceedings IEEE Workshop on Stereo and Multi-Baseline Vision (SMBV 2001), Kauai, HI, USA. IEEE Comput. Soc, 47, 7-42(2002).

    [124] Hirschmuller H, Scharstein D. Evaluation of stereo matching costs on images with radiometric differences[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31, 1582-1599(2009).

    [125] Boykov Y, Veksler O. Graph cuts in vision and graphics: theories and applications[M]∥Handbook of Mathematical Models in Computer Vision. New York: Springer-Verlag, 79-96(2006).

    [126] Beraldin J A, Blais F, Cournoyer L et al. Active 3D sensing[J]. NRC Publications Archive (NPArC), 1-21(2010).

    [127] Sansoni G, Trebeschi M, Docchio F. State-of-the-art and applications of 3D imaging sensors in industry, cultural heritage, medicine, and criminal investigation[J]. Sensors, 9, 568-601(2009).

    [128] Zhang S. High-speed 3D shape measurement with structured light methods: a review[J]. Optics and Lasers in Engineering, 106, 119-131(2018).

    [129] Read J C A. Visual perception: understanding visual cues to depth[J]. Current Biology, 22, R163-R165(2012).

    [130] Mather G, Smith D R R. Depth cue integration: stereopsis and image blur[J]. Vision Research, 40, 3501-3506(2000).

    [131] Held R T, Cooper E A, Banks M S. Blur and disparity are complementary cues to depth[J]. Current Biology, 22, 426-431(2012).

    [132] Trouvé-Peloux P, Champagnat F, Le Besnerais G et al. Theoretical performance model for single image depth from defocus[J]. Journal of the Optical Society of America A, 31, 2650-2662(2014).

    [133] Jin H L, Favaro P. A variational approach to shape from defocus[M]. ∥Computer Vision-ECCV 2002. Berlin, Heidelberg: Springer Berlin Heidelberg, 18-30(2002).

    [134] Kumar H, Yadav A S, Gupta S et al. Depth map estimation using defocus and motion cues[J]. IEEE Transactions on Circuits and Systems for Video Technology, 29, 1365-1379(2019).

    [135] Martišek D. Fast shape-from-focus method for 3D object reconstruction[J]. Optik, 169, 16-26(2018).

    [136] Danzl R, Helmli F, Scherer S. Focus variation - a robust technology for high resolution optical 3D surface metrology[J]. Strojniški Vestnik - Journal of Mechanical Engineering, 57, 245-256(2011).

    [137] Helmli F, Danzl R, Prantl M et al. Ultra high speed 3D measurement with the focus variation method[M]. ∥Fringe 2013. Berlin, Heidelberg: Springer Berlin Heidelberg, 617-622(2014).

    [138] Wlodek J, Gofron K J, Cai Y. Achieving 3D imaging through focus stacking. [C]∥Proceedings of the 13th International Conference on Synchrotron Radiation Instrumentation-SRI2018, AIP Conference Proceedings, 2054, 050001(2019).

    [139] Qian Q, Guntur B K. Extending depth of field and dynamic range from differently focused and exposed images[J]. Multidimensional Systems and Signal Processing, 27, 493-509(2016).
