• Infrared Technology
  • Vol. 43, Issue 1, 60 (2021)
Jiali WEI1, *, Huidong QU1, Yongxian WANG2, Junqing ZHU2, and Yingjun GUAN1
Author Affiliations
  • 1[in Chinese]
  • 2[in Chinese]
    DOI:
    WEI Jiali, QU Huidong, WANG Yongxian, ZHU Junqing, GUAN Yingjun. Research Review of 3D Cameras Based on Time-of-Flight Method[J]. Infrared Technology, 2021, 43(1): 60.
    References

    [2] Rice K, Le Moigne J, Jain P. Analyzing range maps data for future space robotics applications[C]//Proceedings of the 2nd IEEE International Conference on Space Mission Challenges for Information Technology, 2006, 17: 357-357.

    [4] Kohoutek T. Analysis and processing the 3D range image data for robot monitoring[J]. Geodesy and Cartography, 2008, 34(3): 92-96.

    [5] Kuehnle J U, Xue Z, Zoellner J M, et al. Grasping in depth maps of time-of-flight cameras[C]//International Workshop on Robotic and Sensors Environments. IEEE, 2008: 132-137.

    [6] Oggier T, Lehmann M, Kaufmann R, et al. An all-solid-state optical range camera for 3D real-time imaging with sub-centimeter depth resolution (SwissRanger)[C]//Optical Design and Engineering. International Society for Optics and Photonics, 2004, 5249: 534-545.

    [10] Kaufmann R, Lehmann M, Schweizer M, et al. A time-of-flight line sensor: development and application[C]//Optical Sensing. International Society for Optics and Photonics, 2004, 5459: 192-199.

    [11] Gupta M, Agrawal A, Veeraraghavan A. A practical approach to 3D scanning in the presence of interreflections, subsurface scattering and defocus[J]. International Journal of Computer Vision, 2013, 102(1-3): 33-55.

    [16] Payne A D, Dorrington A A, Cree M J. Illumination waveform optimization for time-of-flight range imaging cameras[C]//Videometrics, Range Imaging, and Applications XI. International Society for Optics and Photonics, 2011, 8085: 136-148.

    [17] Corti A, Giancola S, Mainetti G, et al. A metrological characterization of the Kinect V2 time-of-flight camera[J]. Robotics and Autonomous Systems, 2016, 75(PB): 584-594.

    [18] Rapp H. Faculty for physics and astronomy[D]. Heidelberg: University of Heidelberg, 2007.

    [19] Ruocco R, White T, Jarrett D. Systematic errors in active 3D vision sensors[J]. IEE Proceedings - Vision, Image and Signal Processing, 2003, 150(6): 341-345.

    [20] Jung J, Lee J, Jeong Y, et al. Time-of-Flight Sensor Calibration for a Color and Depth Camera Pair[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(7): 1501-1513.

    [21] Fuchs S, Hirzinger G. Extrinsic and depth calibration of ToF-cameras[C]//IEEE Conference on Computer Vision and Pattern Recognition, Anchorage, AK, 2008: 1-6.

    [24] Opower H. Multiple view geometry in computer vision[J]. Optics and Lasers in Engineering, 2002, 37(1): 85-86.

    [26] Moreno-Noguer F, Belhumeur P N, Nayar S K. Active refocusing of images and videos[J]. ACM Transactions on Graphics, 2007, 26(3): 67.

    [27] Prusak A, Melnychuk O, Roth H, et al. Pose estimation and map building with a time-of-flight camera for robot navigation[J]. International Journal of Intelligent Systems Technologies and Applications, 2008, 5(3/4): 355-364.

    [28] Alenyà G, Foix S, Torras C. Using ToF and RGBD cameras for 3D robot perception and manipulation in human environments[J]. Intelligent Service Robotics, 2014, 7(4): 211-220.

    [30] Vázquez-Arellano M, Reiser D, Paraforos D S, et al. 3-D reconstruction of maize plants using a time-of-flight camera[J]. Computers and Electronics in Agriculture, 2018, 145: 235-247.

    [31] Penne J, Schaller C, Hornegger J, et al. Robust real-time 3D respiratory motion detection using time-of-flight cameras[J]. International Journal of Computer Assisted Radiology and Surgery, 2008, 3(5): 427-431.

    [32] Soutschek S, Penne J, Hornegger J, et al. 3-D gesture-based scene navigation in medical imaging applications using time-of-flight cameras[C]//IEEE Conference on Computer Vision and Pattern Recognition, 2008: 1-6.

    [34] Ahmad R, Plapper P. Generation of safe tool-path for 2.5D milling/drilling machine-tool using 3D ToF sensor[J]. CIRP Journal of Manufacturing Science and Technology, 2015, 10: 84-91.

    [38] Hullin M B. Computational imaging of light in flight[C]//Proceedings of Optoelectronic Imaging and Multimedia Technology III, 2014: 927314.

    [39] Xie J, Feris R S, Sun M T. Edge-guided single depth image super resolution[J]. IEEE Transactions on Image Processing, 2016, 25(1): 428-438.

    [40] Song X B, Dai Y C, Qin X Y. Deep depth super-resolution: learning depth super-resolution using deep convolutional neural network[C]//Asian Conference on Computer Vision, 2017, 10114: 360-376.

    [41] Kahlmann T, Oggier T, Lustenberger F, et al. 3D-TOF sensors in the automobile[C]//Proceedings of SPIE - The International Society for Optical Engineering, 2005, 5663: 216-224.

    [43] Nguyen T N, Huynh H H, Meunier J. Human gait symmetry assessment using a depth camera and mirrors[J]. Computers in Biology and Medicine, 2018, 101: 174-183.
