Chinese Optics Letters, Vol. 18, Issue 1, 010901 (2020)
Shohei Ikawa1, Naoki Takada2,*, Hiromitsu Araki3, Hiroaki Niwase3, Hiromi Sannomiya3, Hirotaka Nakayama4, Minoru Oikawa2, Yuichiro Mori2, Takashi Kakue5, Tomoyoshi Shimobaba5, and Tomoyoshi Ito5
Author Affiliations
  • 1Faculty of Science, Kochi University, Kochi 780-8520, Japan
  • 2Research and Education Faculty, Kochi University, Kochi 780-8520, Japan
  • 3Graduate School of Integrated Arts and Sciences, Kochi University, Kochi 780-8520, Japan
  • 4Center for Computational Astrophysics, National Astronomical Observatory of Japan, Mitaka-shi 181-8588, Japan
  • 5Graduate School of Engineering, Chiba University, Chiba 263-8522, Japan
    DOI: 10.3788/COL202018.010901
    Shohei Ikawa, Naoki Takada, Hiromitsu Araki, Hiroaki Niwase, Hiromi Sannomiya, Hirotaka Nakayama, Minoru Oikawa, Yuichiro Mori, Takashi Kakue, Tomoyoshi Shimobaba, Tomoyoshi Ito. Real-time color holographic video reconstruction using multiple-graphics processing unit cluster acceleration and three spatial light modulators[J]. Chinese Optics Letters, 2020, 18(1): 010901

    Abstract

    We demonstrate real-time three-dimensional (3D) color video using a color electroholographic system with a multiple-graphics processing unit (multi-GPU) cluster and three spatial light modulators (SLMs) corresponding respectively to the red, green, and blue (RGB)-colored reconstructing lights. The multi-GPU cluster has a computer-generated hologram (CGH) display node, containing one GPU that displays the calculated CGHs on the SLMs, and four CGH calculation nodes with 12 GPUs. The GPUs in the CGH calculation nodes generate the CGHs corresponding to the RGB reconstructing lights of a 3D color video using pipeline processing. Real-time color electroholography was realized for a 3D color object comprising approximately 21,000 points per color.

    Holography, invented by Dennis Gabor[1], is widely known as the ultimate three-dimensional (3D) technique for faithfully recording and reconstructing 3D objects. A computer-generated hologram (CGH) is a digital interference fringe pattern calculated by a computer[2]. Electroholography can reconstruct animated 3D images by sequentially displaying CGHs on a spatial light modulator (SLM), and may therefore be applicable to 3D television (TV)[3–5]. However, in such a system, calculating the CGHs is computationally prohibitive, and realizing 3D TV using electroholography requires high-performance computational power[6].

    Accelerating CGH calculations using graphics processing units (GPUs) has been widely reported[7–16]. CGH computation on multi-GPU clusters, comprising several personal computers (PCs) each equipped with several GPUs, was investigated in Refs. [17–22]. However, real-time color electroholography using a multi-GPU cluster has not yet been realized, even though color electroholography is indispensable for the ultimate 3D TV.

    To realize color 3D TV, two types of color electroholographic systems have been researched. One is a three-SLM system using separate SLMs for the red, green, and blue (RGB)-colored reconstructing lights, and the other is a single-SLM system in which one SLM handles all RGB-colored reconstructing lights.

    In the three-SLM system, a color 3D image is generated by optically combining the RGB-colored 3D images reconstructed from the CGHs corresponding to the RGB-colored reconstructing lights. In three-SLM color electroholography[23–28], it is easy to coordinate the CGH operations, from calculation to display, for the RGB-colored reconstructing lights on the three SLMs. However, the optical combination of the RGB-colored images requires careful spatial adjustment of the optical system.

    A color electroholographic display using a single-SLM system has been demonstrated using space-division[29], depth-division[30,31], and time-division[32–37] methods. The time-division method causes flicker, whereas the space- and depth-division methods do not. In the space-division method, the original CGH comprises three parts, one for each RGB-colored reconstructing light; consequently, the resolution of each part is lower than that of the original CGH, making it difficult to display fine color reconstructed images of 3D objects comprising many object points. 3D images reconstructed using the depth-division method typically contain unwanted diffraction. Color electroholography using the time-division method requires color switching of the reconstructing light, and this switching must be synchronized with the CGH display, which is difficult to achieve.

    It has been theoretically shown that light loss may[38] or may not[39,40] change the emitted color (i.e., frequency) of illuminating devices, providing physical insight relevant to display technology. In this Letter, we describe real-time 3D color video reconstruction using an electroholographic system with a multi-GPU cluster and three SLMs.

    The following formula, based on the Fresnel approximation[7], is used to calculate a CGH:

    $$I(x_h, y_h, 0) = \sum_{i=1}^{N_p} A_i \cos\left\{\frac{\pi}{\lambda z_i}\left[(x_h - x_i)^2 + (y_h - y_i)^2\right]\right\},\tag{1}$$

    where $I(x_h, y_h, 0)$ denotes the amplitude distribution at the point $(x_h, y_h, 0)$ on the hologram, the index $i$ indicates an object point on the 3D object, and $(x_i, y_i, z_i)$ and $A_i$ represent the coordinates and amplitude of the $i$th object point, respectively. $N_p$ is the total number of object points on the 3D object, and $\lambda$ is the wavelength of the reconstructing light.
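The summation in Eq. (1) can be sketched directly in NumPy. This is only a minimal illustration of the point-source summation, not the paper's optimized GPU algorithm[17]; the grid size, pixel pitch, and two-point test object below are assumed values for demonstration.

```python
import numpy as np

def cgh_amplitude(points, amps, wavelength, width=256, height=256, pitch=8.5e-6):
    """Evaluate Eq. (1): sum each object point's Fresnel-zone contribution
    over every hologram pixel. Grid size and pitch are illustrative."""
    xh = (np.arange(width) - width / 2) * pitch
    yh = (np.arange(height) - height / 2) * pitch
    Xh, Yh = np.meshgrid(xh, yh)
    I = np.zeros((height, width))
    for (xi, yi, zi), ai in zip(points, amps):
        phase = np.pi / (wavelength * zi) * ((Xh - xi) ** 2 + (Yh - yi) ** 2)
        I += ai * np.cos(phase)
    return I

# Two hypothetical object points, 1.5 m from the hologram, green light.
points = [(0.0, 0.0, 1.5), (1e-3, -1e-3, 1.5)]
I = cgh_amplitude(points, amps=[1.0, 1.0], wavelength=525e-9)
```

Because each of the $N_p$ points contributes to every one of the $W \times H$ pixels, the loop makes the $O(N_p W H)$ cost discussed below explicit.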

    From Eq. (1), the CGH calculation time is proportional to the resolution of the CGH: the computational complexity is $O(N_p W H)$ when the resolution of the CGH is $W \times H$. Thus, the CGH calculation is prohibitively computationally intensive. Furthermore, color electroholography requires three times the CGH computation of monochrome electroholography, because a separate CGH must be calculated for each wavelength [Eq. (1)]. We used an optimized CGH computation algorithm for the GPU[17].

    The value obtained from Eq. (1) at each point on the CGH is binarized with a threshold of 0, yielding a binary CGH. In this Letter, we used three binary CGHs, corresponding to the RGB-colored reconstructing lights, for color electroholography.
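The binarization step is a simple sign test, which can be sketched as follows; the small test array is an illustrative assumption.

```python
import numpy as np

def binarize_cgh(amplitude):
    """Threshold the real-valued amplitude from Eq. (1) at 0:
    positive values map to 1, the rest to 0."""
    return (amplitude > 0).astype(np.uint8)

fringe = np.array([[0.3, -0.1], [-2.0, 1.5]])
binary = binarize_cgh(fringe)  # -> [[1, 0], [0, 1]]
```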

    Figure 1 shows the outline of the proposed real-time color electroholographic system, comprising a multi-GPU cluster and three SLMs. The multi-GPU cluster comprises N+1 PCs and 3N+1 GPUs. In Fig. 1, PC 0 is referred to as the CGH display node, and PC 1 to PC N are called CGH calculation nodes. The CGH display node, with a single GPU (GPU 0), displays the calculated binary CGHs on the three SLMs corresponding to the RGB-colored reconstructing lights. Each CGH calculation node (PC 1 to PC N) has three GPUs that calculate the binary CGHs corresponding to the RGB-colored reconstructing lights at each frame of the 3D color video. The three GPUs then generate RGB-colored binary CGHs from the calculated binary CGHs; these are expressed in red and black, green and black, and blue and black, respectively.


    Figure 1.Proposed real-time color electroholographic system.

    After generating the RGB-colored binary CGHs for each frame, the CGH calculation nodes send the calculated CGHs to the CGH display node. Each RGB-colored binary CGH is transferred at 32 bits per pixel to reduce the processing load on the CGH display node for real-time color electroholography. The 32 bits comprise 24 bits for the RGB value and 8 bits for the alpha value, which indicates the transparency of the pixel.

    In the CGH display node, the RGB-colored binary CGHs are received from the CGH calculation nodes. Then, as shown in Fig. 2, GPU 0 quickly combines the RGB-colored CGHs corresponding to the RGB-colored reconstructing lights into a single color CGH. The RGB-colored CGHs in the figure indicate the respective received RGB-colored binary CGHs. The CGH display node outputs the combined color CGH to a video signal splitter, which automatically divides the input color CGH into RGB-colored CGHs that are displayed on the three SLMs corresponding to the respective RGB-colored reconstructing lights.
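The combination step amounts to writing the three binary CGHs into the R, G, and B channels of one 32-bit-per-pixel image (24-bit RGB plus 8-bit alpha). The sketch below illustrates this; the RGBA channel order and the fully opaque alpha value are assumptions for illustration, not details stated in the Letter.

```python
import numpy as np

def combine_color_cgh(cgh_r, cgh_g, cgh_b):
    """Combine three binary (0/1) CGHs into one 32-bit color CGH:
    one CGH per RGB channel, plus an 8-bit alpha channel."""
    h, w = cgh_r.shape
    color = np.empty((h, w, 4), dtype=np.uint8)
    color[..., 0] = cgh_r * 255  # red channel: red-colored binary CGH
    color[..., 1] = cgh_g * 255  # green channel: green-colored binary CGH
    color[..., 2] = cgh_b * 255  # blue channel: blue-colored binary CGH
    color[..., 3] = 255          # alpha: fully opaque (assumed)
    return color
```

The video signal splitter can then separate the three channels again without any per-pixel computation on the display node.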


    Figure 2.Combined color CGH generated from three RGB-colored CGHs corresponding to RGB-colored reconstructing lights.

    The CGH calculation nodes calculate CGHs for the 3D color video in parallel. Here the CGH display node also serves as a network file server in the multi-GPU cluster and stores the coordinate data of the 3D object points for all frames in the 3D color video.

    The proposed color electroholographic system processes real-time 3D color video reconstruction as shown in Fig. 3. The RGB-colored binary CGH calculations at Frame N of the 3D color video are referred to in the figure as Frame N (R), Frame N (G), and Frame N (B), respectively. Each CGH calculation node is assigned a frame of the 3D color video and calculates its RGB-colored binary CGHs. CGH generation at the CGH calculation nodes uses pipeline processing. The proposed real-time color video reconstruction proceeds as follows.

    Step 1: At Frame 1 of the 3D color video, GPU 1, GPU 2, and GPU 3 in the CGH calculation node PC 1 calculate the RGB-colored binary CGHs corresponding to the RGB-colored reconstructing lights. In Fig. 3, Frame 1 (R), Frame 1 (G), and Frame 1 (B) represent these RGB-colored binary CGH calculations. Similarly, the CGH calculation nodes PC 2 to PC N calculate the RGB-colored binary CGHs for Frame 2 to Frame N.

    Step 2: At Frame 1, PC 1 sends each calculated RGB-colored binary CGH to the CGH display node PC 0 immediately after its calculation. Similarly, for Frame 2 to Frame N, the CGH calculation nodes PC 2 to PC N send their calculated RGB-colored binary CGHs to the CGH display node PC 0.

    Step 3: The CGH display node PC 0 receives the RGB-colored binary CGHs for Frame 1 from PC 1. GPU 0 of the CGH display node combines the RGB-colored binary CGHs into a color CGH and outputs the combined color CGH to the video signal splitter. Similarly, PC 0 receives the calculated RGB-colored binary CGHs for Frame 2 to Frame N from PC 2 to PC N and combines them into color CGHs. GPU 0 then outputs the combined color CGHs for Frame 2 to Frame N to the video signal splitter at a constant time interval T.


    Figure 3.Parallel processing of real-time 3D color video reconstruction using the proposed color electroholographic system.

    After Step 3, from Frame N+1 to 2N, the RGB-colored binary CGHs are calculated using the CGH calculation nodes, and GPU 0 outputs the combined CGHs to the video signal splitter at a constant time interval T. The process is repeated until the last frame of the 3D color video is reached.
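The frame-to-node schedule described above is a round-robin assignment: frame f goes to CGH calculation node ((f − 1) mod N) + 1. A minimal sketch, with N = 4 matching the experimental cluster (the helper name is an assumption):

```python
def assigned_node(frame, n_nodes=4):
    """Round-robin schedule: frames 1..N go to PC 1..PC N,
    frames N+1..2N cycle back to PC 1..PC N, and so on."""
    return (frame - 1) % n_nodes + 1

node = assigned_node(5)  # frame 5 cycles back to PC 1
```

Because each node pipelines its three per-color CGH calculations with the network transfer, the display node can output combined color CGHs at the constant interval T while later frames are still being computed.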

    We investigated the CGH transfer times between the CGH display node and each CGH calculation node. In this Letter, each pixel in a CGH transfer between the CGH display node and a CGH calculation node occupies 32 bits. Each CGH had a resolution of 1920 × 1024 pixels; therefore, an optimized CGH calculation algorithm[17] was applied. Hence, the total size of each transfer is 32 bits × 1920 pixels × 1024 pixels ≈ 62.9 Mbits. The estimated time required to transfer these data over Gigabit Ethernet (1 Gbps) is approximately 63 ms, so the frame rate of a system using Gigabit Ethernet would be less than 16 frames per second. A higher-speed network is therefore required to realize real-time 3D color video reconstruction. To overcome this bottleneck, we used InfiniBand Quad Data Rate (QDR) (40 Gbps) as the computer network of the multi-GPU cluster, reducing the theoretical CGH transfer time to 1.6 ms.
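The transfer-time estimates above follow from back-of-envelope arithmetic, which can be reproduced directly (protocol overhead is ignored, as in the text):

```python
# One 1920 x 1024 CGH at 32 bits per pixel, over Gigabit Ethernet
# (1 Gbps) versus InfiniBand QDR (40 Gbps).
bits = 32 * 1920 * 1024          # ~62.9 Mbits per CGH transfer
t_gige_ms = bits / 1e9 * 1e3     # ~62.9 ms over 1 Gbps
t_qdr_ms = bits / 40e9 * 1e3     # ~1.57 ms over 40 Gbps
fps_gige = 1000 / t_gige_ms      # ~15.9 fps: below real-time video rates
```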

    Figure 4 shows the optical setup of the proposed real-time color electroholographic system, which uses a projector (Epson Corp. EMP-TW1000) with a maximum refresh rate of 60 Hz. Three transmissive liquid crystal display (LCD) panels extracted from the projector were used as the SLMs for the RGB-colored reconstructing lights. The resolution and pixel pitch of each LCD panel are 1920 × 1080 pixels and 8.5 μm × 8.5 μm, respectively. The projector was connected to GPU 0 on the CGH display node. We used the video signal splitter mounted in the projector, as shown in Figs. 1 and 3.


    Figure 4.Optical setup in the proposed real-time color electroholographic system using a multi-GPU cluster and three SLMs.

    Red, green, and blue semiconductor lasers with wavelengths of 625, 525, and 470 nm, respectively, were used as RGB-colored reconstructing lights. The laser lights were converted into RGB-colored parallel lights using an objective lens and a collimator lens, which were incident on the corresponding LCD panels. The viewing angle of the reconstructed 3D image is 3.2°.

    We used a multi-GPU cluster comprising a CGH display node and four CGH calculation nodes to evaluate the performance of the proposed real-time color electroholographic system. The CGH display node and each of the CGH calculation nodes were equipped with a GPU and three GPUs, respectively. Therefore, the multi-GPU cluster contained 13 GPUs.

    Each PC (PC 0 to PC 4) in the multi-GPU cluster contained an Intel Core i7 4770 CPU, and the cluster contained 13 NVIDIA GeForce GTX TITAN X GPUs in total. InfiniBand QDR was used to network the multi-GPU cluster. The program for the proposed system was implemented using the CUDA 7.0 SDK[41], OpenGL 4.5.0, and Open MPI v1.8.7.

    Original color 3D objects comprising RGB-colored objects with 19,168, 27,012, and 18,228 points were used for the 3D color video, as shown in Fig. 5. Each object was located 1.5 m from the respective CGH. Figure 6 shows the RGB-colored binary CGHs obtained from the original color 3D object in Fig. 5 and the combined CGH generated from them; the combined CGH is output to the video signal splitter.

    Figure 7 shows snapshots of the reconstructed 3D color video for different numbers of GPUs in the proposed system. In reconstructing the 3D color video, we did not use any technique to reduce the induced laser speckle.

    Table 1 lists the display time interval T of the proposed real-time 3D color video against the total number of GPUs used. The system achieved real-time 3D color video reconstruction using 13 GPUs. In Table 1, the display time for "1 GPU" is the sum of the calculation times of the RGB-colored binary CGHs for the RGB-colored objects with 19,168, 27,012, and 18,228 points. For "4 GPUs", the three GPUs in a CGH calculation node must wait to calculate the RGB-colored binary CGHs for the next frame until they finish calculating all RGB-colored binary CGHs for the present frame; thus, the display time for "4 GPUs" equals the calculation time of the green binary CGH for the green-colored object with 27,012 points. When the total number of GPUs is 7 to 13, the display time becomes approximately 1/N of the "4 GPUs" display time, where N is the number of CGH calculation nodes.

    GPUs       Display time interval (ms)   Frame rate (fps)
    1 GPU      250.9                        3.98
    4 GPUs     100.9                        9.91
    7 GPUs     50.9                         19.64
    10 GPUs    35.5                         28.17
    13 GPUs    26.1                         38.31

    Table 1. Display Time Interval of the Proposed Real-Time Color Electroholography
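The frame-rate column of Table 1 follows directly from the display time intervals, since frame rate (fps) = 1000 / interval (ms). A quick check, with the values copied from the table:

```python
# Display time intervals (ms) from Table 1, keyed by total GPU count.
intervals_ms = {1: 250.9, 4: 100.9, 7: 50.9, 10: 35.5, 13: 26.1}

# Implied frame rates: fps = 1000 / interval_in_ms.
rates_fps = {gpus: 1000 / t for gpus, t in intervals_ms.items()}
```

The computed rates match the table's frame-rate column to within rounding, confirming the roughly 1/N scaling with the number of CGH calculation nodes.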


    Figure 5.Original color 3D object for the 3D color video.


    Figure 6.RGB-colored binary CGHs obtained from original color 3D object and combined CGH generated from RGB-colored binary CGHs.


    Figure 7.Snapshots of the 3D video reconstructed using the proposed method (video 1).

    In conclusion, the proposed system achieved real-time 3D color video (flower and butterfly) reconstruction for RGB-colored objects comprising 19,168, 27,012, and 18,228 points using 13 GPUs. We believe that the proposed method will find useful application in 3D color TV in the future.

    References

    [1] D. Gabor. Nature, 161, 777(1948).

    [2] G. Tricoles. Appl. Opt., 26, 4351(1987).

    [3] S. A. Benton, J. V. M. Bove. Holographic Imaging(2008).

    [4] P. St-Hilaire, S. A. Benton, M. Lucente, M. L. Jepsen, J. Kollin, H. Yoshikawa, J. S. Underkoffler. Proc. SPIE, 1212, 172(1990).

    [5] T. Sugie, T. Akamatsu, T. Nishitsuji, R. Hirayama, N. Masuda, H. Nakayama, Y. Ichihashi, A. Shiraki, M. Oikawa, N. Takada, Y. Endo, T. Kakue, T. Shimobaba, T. Ito. Nat. Electron., 1, 254(2018).

    [6] J.-S. Chen, D. Chu. Appl. Opt., 55, A127(2016).

    [7] N. Masuda, T. Ito, T. Tanaka, A. Shiraki, T. Sugie. Opt. Express, 14, 603(2006).

    [8] A. Shiraki, N. Takada, M. Niwa, Y. Ichihashi, T. Shimobaba, N. Masuda, T. Ito. Opt. Express, 17, 16038(2009).

    [9] Y. Pan, X. Xu, S. Solanki, X. Liang, R. B. A. Tanjung, C. Tan, T.-C. Chong. Opt. Express, 17, 18543(2009).

    [10] P. Tsang, W. K. Cheung, T.-C. Poon, C. Zhou. Opt. Express, 19, 15205(2011).

    [11] J. Weng, T. Shimobaba, N. Okada, H. Nakayama, M. Oikawa, N. Masuda, T. Ito. Opt. Express, 20, 4018(2012).

    [12] G. Li, K. Hong, J. Yeom, N. Chen, J.-H. Park, N. Kim, B. Lee. Chin. Opt. Lett., 12, 060016(2014).

    [13] H. Niwase, N. Takada, H. Araki, H. Nakayama, A. Sugiyama, T. Kakue, T. Shimobaba, T. Ito. Opt. Express, 22, 28052(2014).

    [14] Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, C. Yu, W. Dou, L. Xiao. Chin. Opt. Lett., 14, 080901(2016).

    [15] Y. Zhang, J. Liu, X. Li, Y. Wang. Chin. Opt. Lett., 14, 030901(2016).

    [16] D.-W. Kim, Y.-H. Lee, Y.-H. Seo. Appl. Opt., 57, 3511(2018).

    [17] N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, T. Ito. Appl. Opt., 51, 7303(2012).

    [18] Y. Pan, X. Xu, X. Liang. Appl. Opt., 52, 6562(2013).

    [19] B. J. Jackin, H. Miyata, T. Ohkawa, K. Ootsu, T. Yokota, Y. Hayasaki, T. Yatagai, T. Baba. Opt. Lett., 39, 6867(2014).

    [20] J. Song, C. Kim, H. Park, J.-I. Park. Appl. Opt., 55, 3681(2016).

    [21] H. Niwase, N. Takada, H. Araki, Y. Maeda, M. Fujiwara, H. Nakayama, T. Kakue, T. Shimobaba, T. Ito. Opt. Eng., 55, 093108(2016).

    [22] B. J. Jackin, S. Watanabe, K. Ootsu, T. Ohkawa, T. Yokota, Y. Hayasaki, T. Yatagai, T. Baba. Appl. Opt., 57, 3134(2018).

    [23] F. Yaraş, H. Kang, L. Onural. Appl. Opt., 48, H48(2009).

    [24] H. Nakayama, N. Takada, Y. Ichihashi, S. Awazu, T. Shimobaba, N. Masuda, T. Ito. Appl. Opt., 49, 5993(2010).

    [25] J. Roh, K. Kim, E. Moon, S. Kim, B. Yang, J. Hahn, H. Kim. Opt. Express, 25, 14774(2017).

    [26] H. Sato, T. Kakue, Y. Ichihashi, Y. Endo, K. Wakunami, R. Oi, K. Yamamoto, H. Nakayama, T. Shimobaba, T. Ito. Sci. Rep., 8, 1500(2018).

    [27] M. Fujiwara, N. Takada, H. Araki, S. Ikawa, Y. Maeda, H. Niwase, M. Oikawa, T. Kakue, T. Shimobaba, T. Ito. Chin. Opt. Lett., 16, 080901(2018).

    [28] H. Yanagihara, T. Kakue, Y. Yamamoto, T. Shimobaba, T. Ito. Opt. Express, 27, 15662(2019).

    [29] T. Ito, T. Shimobaba, H. Godo, M. Horiuchi. Opt. Lett., 27, 1406(2002).

    [30] M. Makowski, M. Sypek, A. Kolodziejczyk. Opt. Express, 16, 11618(2008).

    [31] M. Makowski, M. Sypek, I. Ducin, A. Fajst, A. Siemion, J. Suszek, A. Kolodziejczyk. Opt. Express, 17, 20840(2009).

    [32] T. Shimobaba, T. Ito. Opt. Rev., 10, 339(2003).

    [33] M. Oikawa, T. Shimobaba, T. Yoda, H. Nakayama, A. Shiraki, N. Masuda, T. Ito. Opt. Express, 19, 12008(2011).

    [34] H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, T. Senoh. Sci. Rep., 4, 6177(2014).

    [35] H. Araki, N. Takada, H. Niwase, S. Ikawa, M. Fujiwara, H. Nakayama, T. Kakue, T. Shimobaba, T. Ito. Appl. Opt., 54, 10029(2015).

    [36] Y. Zhao, L. Cao, H. Zhang, W. Tan, S. Wu, Z. Wang, Q. Yang, G. Jin. Chin. Opt. Lett., 14, 010005(2016).

    [37] H. Araki, N. Takada, S. Ikawa, H. Niwase, Y. Maeda, M. Fujiwara, H. Nakayama, M. Oikawa, T. Kakue, T. Shimobaba, T. Ito. Chin. Opt. Lett., 15, 120902(2017).

    [38] Z. Chen, Y. Zhou, J.-T. Shen. Phys. Rev. A, 98, 053830(2018).

    [39] Z. Chen, Y. Zhou, J.-T. Shen. Opt. Lett., 42, 887(2017).

    [40] Y. Shen, Z. Chen, Y. He, Z. Li, J.-T. Shen. J. Opt. Soc. Am. B, 35, 607(2018).

    [41] CUDA C Programming Guide ver. 7.0. NVIDIA(2015).
