• Advanced Photonics Nexus
  • Vol. 4, Issue 3, 036006 (2025)
Tatsuki Tahara1,*, Tomoyoshi Shimobaba2, Yuichi Kozawa3, Mohamad Ammar Alsherfawi Aljazaerly4, and Tomoya Nakamura4
Author Affiliations
  • 1National Institute of Information and Communications Technology, Radio Research Institute, Applied Electromagnetic Research Center, Tokyo, Japan
  • 2Chiba University, Graduate School of Engineering, Chiba, Japan
  • 3Tohoku University, Institute of Multidisciplinary Research for Advanced Materials, Sendai, Japan
  • 4Osaka University, SANKEN, Osaka, Japan
    DOI: 10.1117/1.APN.4.3.036006
    Tatsuki Tahara, Tomoyoshi Shimobaba, Yuichi Kozawa, Mohamad Ammar Alsherfawi Aljazaerly, and Tomoya Nakamura, "Natural-light full-color motion-picture holography," Adv. Photon. Nexus 4, 036006 (2025)

    Abstract

    We propose a method of full-color, scan-free, and natural-light motion-picture holography for full-color 4D (3D + time) imaging and develop a portable natural-light motion-picture holographic camera that can be set on a movable table without any antivibration structure. Full-color motion-picture holograms of objects illuminated by natural light are obtained at the frame rate of an image sensor. We perform the single-shot natural-light full-color 3D imaging of objects illuminated by sunlight and the full-color 4D imaging of a moving object. This holographic camera is capable of full-color 4D imaging of objects ranging in size from the centimeter order to the 10-m order. This opens up a new stage in holographic imaging: despite its portability, the camera overcomes the limitations of conventional holographic imaging.

    1 Introduction

    In daily life, two- or multi-viewpoint images are generally required to detect 3D information without scanning or structured illumination. Holography1 is, unlike general imaging techniques, a technique for recording 3D information with an image sensor. On the one hand, a laser light source is generally adopted to realize holography because light interference is required to record 3D information. On the other hand, an image of interference fringes, such as Newton rings, can be generated with daily-use light based on the self-interference phenomenon. Incoherent holography2-5 exploits the self-interference phenomenon and has realized the holographic recording of an object with a daily-use light source such as a lamp6,7 or a light-emitting diode.8,9 Incoherent holography has the following attractive features: (1) the holographic 3D imaging of self-luminous light can be achieved and has been demonstrated,10 (2) lensless 3D imaging is possible without a priori acquisition of a point-spread function (PSF),11 (3) speckle-noiseless holographic 3D imaging can be conducted, (4) the PSF in the in-plane direction can be improved in comparison with those of other incoherent imaging technologies,12 and (5) a 3D scene with a range of depths much larger than the coherence length of light is recorded as a hologram. Incoherent holography has been applied to fluorescence microscopy,10,12-16 tomography,17 and holographic imagery, as well as cameras.8,9,18-22 In particular, Kim succeeded in developing a full-color natural-light holographic camera by adopting a Michelson interferometer.8 However, multiple exposures are required in Kim's camera; therefore, it has been difficult to instantaneously obtain a natural-light full-color hologram of a 3D scene.
To realize single-shot incoherent holography, single-shot phase-shifting (SSPS) interferometry23-25 has been adopted for incoherent holography.19-21 In SSPS incoherent digital holography, Fresnel incoherent correlation holography (FINCH)-based,19 phase-grating-based,20 and geometric phase lens (GPL)-based21 optical systems have been proposed. In these optical systems, a diffractive optical element (DOE) is set to generate the two object waves that are required for self-interference. Undesired-order diffraction waves appear during the recording of a self-interference hologram when white light such as natural light illuminates a DOE because the diffraction efficiency of a DOE is not perfect over the whole wavelength band of natural light. Instead of a DOE, the use of birefringent materials has been conceptually proposed to prevent the appearance of undesired light waves during recording and to generate an incoherent hologram of spatially and temporally incoherent light.26

    We propose a method of full-color, scan-free, and single-shot natural-light holography for full-color 4D (3D + time) imaging and develop a portable natural-light motion-picture holographic camera that can be set on a movable table without any antivibration structure. We exploit SSPS incoherent digital holography,19-21 which is based on SSPS interferometry23-25 and incoherent holography, and design and implement an optical system adopting birefringent materials and a camera lens. Moreover, state-of-the-art denoising algorithms are applied to the proposed holography to enhance the quality of the reconstructed image. We perform full-color scan-free natural-light 3D imaging using the designed portable holographic camera set on a wooden table and the full-color 4D imaging of a moving object. This holographic camera is capable of full-color 4D imaging of objects ranging in size from the centimeter order to the 10-m order (dolls, human faces, and buildings), opening up a new stage in holographic imaging and overcoming the limitations of conventional holographic imaging despite the portability of this camera.

    2 Materials and Methods

    Figure 1 shows the basic concept of natural-light full-color motion-picture holography. The optical configuration is based on the combination of a polarimetric SSPS self-interference interferometer19 and a color polarization image sensor. Instead of a polarimetric DOE displayed on a spatial light modulator,19 a birefringent dual-focus lens, such as a crystal lens used in colonoscopic holography,27 is adopted. Two object waves whose wavefront curvature radii are different and whose polarization directions are orthogonal to each other are generated by the lens from the incident object wave. The optical path length difference generated by the lens is adjusted using a birefringent phase plate such as a calcite plate and a liquid crystal. This adjustment enables the generation of an interference fringe image of temporally incoherent light. A quarter-wave plate converts the linearly polarized object waves into circularly polarized ones. The two circularly polarized object waves whose polarization directions are orthogonal to each other illuminate a color polarization image sensor in which a color filter array and a micropolarizer array are attached to a photodetector array. Each cell of the color-filter array selectively transmits light at either the red-, green-, or blue-wavelength (RGB) band. The transmission-axis angle θ of each cell of the micropolarizer array is 0, 45, 90, or 135 deg with respect to the horizontal axis. After the two object waves pass through the micropolarizer array, the polarization directions of the waves are aligned, and the waves interfere with each other. The phase shifts required for phase-shifting interferometry28,29 are spatially introduced to the interference fringes by utilizing the circular polarizations and the transmission axis of each micropolarizer. The phase shift becomes 2θ. Therefore, self-interference holograms with phase shifts of 0, 90, 180, and 270 deg at the RGB bands are simultaneously recorded with a single-shot exposure.
The demosaicking procedure is applied to the single recorded hologram in a computer, and four holograms at each color channel are numerically generated. Phase-shifting interferometry and diffraction integrals are calculated with the numerically generated holograms, and then, the full-color object images focused at arbitrary depths are reconstructed. A motion picture of the holograms is obtained at the frame rate of the image sensor.
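The recording-and-retrieval principle above can be sketched numerically. The following Python/NumPy sketch (a toy model, not the authors' code) builds four phase-shifted self-interference holograms of a synthetic complex object wave and recovers the wave by standard four-step phase-shifting interferometry; all numerical values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy complex object wave U on the sensor plane and a 0th-order background
# (both are assumptions for illustration, not measured data).
N = 64
U = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
background = 10.0

# Four phase-shifted self-interference holograms:
# I_k = background + U exp(-i delta_k) + c.c. = background + 2 Re[U exp(-i delta_k)].
deltas = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
I0, I90, I180, I270 = [background + 2.0 * np.real(U * np.exp(-1j * d)) for d in deltas]

# Four-step phase-shifting interferometry removes the background and the
# conjugate term, leaving the complex object wave (up to a constant factor).
U_rec = (I0 - I180) + 1j * (I90 - I270)
assert np.allclose(U_rec, 4.0 * U)
```

In the actual camera, the four shifted holograms come from the differently oriented micropolarizer cells of a single exposure rather than from sequential exposures.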


    Figure 1.Basic concept of natural-light full-color motion-picture holography.

    We have developed a portable natural-light motion-picture holographic camera to demonstrate the applicability of the proposed holographic method. The camera is composed of a camera lens, polarimetric optical elements, and a color polarization image sensor, as shown in Fig. 2(a). The camera lens minifies and images the 3D information of an object and a scene illuminated by spatially and temporally incoherent light. The minified and imaged 3D information of incoherent light is recorded as an object wave. The polarizer sets the linear polarization for the object wave. The birefringent lenses generate two incoherent object waves whose wavefront curvature radii are different and whose polarization directions are orthogonal to each other. The birefringent plates and the variable polarimetric phase modulator adjust the optical-path-length difference between the two waves because the careful adjustment of the difference considering the coherence length is important for temporally incoherent light. The quarter-wave plate converts the two linearly polarized object waves into circular polarizations with opposite handedness. With a color polarization image sensor, a self-interference hologram, $H(x,y;\theta)$, of an object wave is recorded. The mathematical expression of $H(x,y;\theta)$ is as follows:

$$H(x,y;\theta) = \mathrm{CFA}(x,y)\cdot\mathbf{H}_{\mathrm{RGB}}(x,y;\theta) = [T_{\lambda_R}(x,y),\,T_{\lambda_G}(x,y),\,T_{\lambda_B}(x,y)]\cdot[H_{\lambda_R}(x,y;\theta),\,H_{\lambda_G}(x,y;\theta),\,H_{\lambda_B}(x,y;\theta)]^{\mathrm{T}},\tag{1}$$

where $\mathrm{CFA}(x,y)$ is the spectral transmission distribution of the color filter array, $T_{\lambda_j}(x,y)$ is the transmission distribution at the wavelength band $\Delta\lambda_j$ ($j=R,G,B$), $\lambda$ is the wavelength of light, $\mathbf{H}_{\mathrm{RGB}}(x,y;\theta)$ is the spectral self-interference hologram, and $H_{\lambda_j}(x,y;\theta)$ is the self-interference hologram with the wavelength band $\Delta\lambda_j$ and is expressed as

$$H_{\lambda_j}(x,y;\theta) = \iiiint_{\lambda_{cj}-\Delta\lambda_j/2}^{\lambda_{cj}+\Delta\lambda_j/2} I(x,y,\mathbf{r}_o,\lambda;\theta)\,\mathrm{d}x_o\,\mathrm{d}y_o\,\mathrm{d}z_o\,\mathrm{d}\lambda = H_{0\mathrm{th}\,\lambda_j}(x,y) + U_{\lambda_j}(x,y)\,e^{i(2\theta-\pi/2+\phi)} + \mathrm{C.C.}\tag{2}$$


    Figure 2.Developed portable natural-light motion-picture holographic camera. (a) Schematic and (b) photograph.

    Here, $I(x,y,\mathbf{r}_o,\lambda;\theta)$ is the self-interference hologram of the object point located at $\mathbf{r}_o=(x_o,y_o,z_o)$ in the object and is mathematically expressed as

$$\begin{aligned}
I(x,y,\mathbf{r}_o,\lambda;\theta) &= \left| C(\mathbf{r}_o,\lambda)\,L\!\left[\frac{(x_o,y_o)}{z_1}\right] Q\!\left(\frac{1}{z_1}\right)\left\{ Q\!\left(-\frac{1}{f_1}\right) e^{i(\theta-\pi/4)} * Q\!\left(\frac{1}{z_2'}\right) + Q\!\left(-\frac{1}{f_2}\right) e^{-i(\theta-\pi/4)}\,e^{i\phi} * Q\!\left(\frac{1}{z_2''}\right)\right\} \right|^2\\
&= \left| C'(\mathbf{r}_o,\lambda)\,L\!\left[\frac{f_a(x_o,y_o)}{z_1(f_a+z_2')}\right] Q\!\left(\frac{1}{f_a+z_2'}\right) e^{i(\theta-\pi/4)} + C''(\mathbf{r}_o,\lambda)\,L\!\left[\frac{f_b(x_o,y_o)}{z_1(f_b+z_2'')}\right] Q\!\left(\frac{1}{f_b+z_2''}\right) e^{-i(\theta-\pi/4)}\,e^{i\phi}\right|^2\\
&= I_{0\mathrm{th}}(x,y;\mathbf{r}_o) + C'''(\mathbf{r}_o,\lambda)\,L\!\left[\frac{M(x_o,y_o)}{f_c}\right] Q\!\left(\frac{1}{f_c}\right) e^{i(2\theta-\pi/2)}\,e^{i\phi} + \mathrm{c.c.},
\end{aligned}\tag{3}$$

and $U_{\lambda_j}(x,y)$ is the incoherent object wave with the wavelength band $\Delta\lambda_j$, which contains the 3D information of the object, is generated from the incoherent sum of the self-interference holograms of multiple object points, and is mathematically expressed as

$$U_{\lambda_j}(x,y) = \iiiint_{\lambda_{cj}-\Delta\lambda_j/2}^{\lambda_{cj}+\Delta\lambda_j/2} C'''(\mathbf{r}_o,\lambda)\,L\!\left[\frac{M(x_o,y_o)}{f_c}\right] Q\!\left(\frac{1}{f_c}\right)\mathrm{d}x_o\,\mathrm{d}y_o\,\mathrm{d}z_o\,\mathrm{d}\lambda,\tag{4}$$

where $\lambda_{cj}$ is the center wavelength at $\Delta\lambda_j$; $H_{0\mathrm{th}\,\lambda_j}(x,y)$ and $I_{0\mathrm{th}}(x,y;\mathbf{r}_o)$ are the 0th-order diffraction waves of $H(x,y;\theta)$ and $I(x,y,\mathbf{r}_o,\lambda;\theta)$, respectively; $i$ is the imaginary unit; C.C. and c.c. are the complex conjugates of $U_{\lambda_j}(x,y)\,e^{i(2\theta-\pi/2+\phi)}$ and of the second term of Eq. (3), respectively; $C(\mathbf{r}_o,\lambda)$, $C'(\mathbf{r}_o,\lambda)$, $C''(\mathbf{r}_o,\lambda)$, and $C'''(\mathbf{r}_o,\lambda)$ are coefficients; $z_1$ is the depth difference between an object point and the closely set birefringent lenses; $L[(x_o,y_o)]=\exp[i2\pi(x_o x+y_o y)/\lambda]$;11 $Q(1/z)=\exp[i\pi(x^2+y^2)/\lambda z]$;11 $*$ indicates a convolution; $z_2$ is the depth difference between the closely set birefringent lenses and the image sensor; $z_2'=z_2-d_p[1-(1/n_f)]$; $z_2''=z_2-d_p[1-(1/n_s)]$; $d_p$ is the total thickness of the closely set birefringent plates; $n_f$ and $n_s$ are the refractive indices of the fast and slow axes of the closely set birefringent plates, respectively; $\phi$ is the relative phase shift from the object point to the pixel placed on the optical axis, which is induced by the closely set birefringent lenses, the variable phase modulator, and the closely set birefringent plates; $f_1$ and $f_2$ are the synthesized focal lengths of the closely set birefringent lenses in the fast and slow axes, respectively; $f_a=f_1 z_1/(f_1-z_1)$; $f_b=f_2 z_1/(f_2-z_1)$; $M=(f_a z_2''-f_b z_2')/[z_1(f_a+z_2''-f_b-z_2')]$ is the magnification of the IDH system; and $f_c=(f_a+z_2')(f_b+z_2'')/(f_a-f_b+z_2'-z_2'')$ is the numerical focusing distance. $z_2'$ and $z_2''$ indicate that the focal position shifts slightly owing to the birefringent plates; they are derived with the paraxial approximation.30 The adjustment of $\phi$ is the key to improving the visibility of the interference fringes of temporally incoherent light. Where $d_l$ is the total thickness of the closely set birefringent lenses on the optical axis, the refractive indices of the fast and slow axes of the birefringent lenses are the same as those of the birefringent plates, $\phi_{pm}$ is the phase shift generated by the birefringent variable phase modulator, and $k_c$ is the central wavenumber, we set the following relationship as a guideline for the adjustment:

$$\phi = (k_c n_f d_l + k_c n_s d_p) - (k_c n_s d_l + k_c n_f d_p) + \phi_{pm} = 0 \quad (\mathrm{when}\ d_l \geq d_p).\tag{5}$$
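As a numerical illustration of the optical-path-length balancing guideline in Eq. (5), the sketch below computes the phase-modulator setting that zeroes the on-axis phase shift. All numerical values (center wavelength, indices, thicknesses) are illustrative assumptions, not the paper's actual design parameters.

```python
import numpy as np

# Illustrative assumptions (not the paper's design values):
wavelength_c = 530e-9                      # assumed center wavelength [m]
k_c = 2 * np.pi / wavelength_c             # central wavenumber
n_f, n_s = 1.5443, 1.5534                  # approximate quartz-like fast/slow indices
d_l = 2.00e-3                              # total thickness of birefringent lenses [m]
d_p = 1.95e-3                              # total thickness of birefringent plates [m]

# phi = k_c(n_f d_l + n_s d_p) - k_c(n_s d_l + n_f d_p) + phi_pm
#     = k_c (n_f - n_s)(d_l - d_p) + phi_pm,
# so the modulator setting that makes phi = 0 is:
phi_pm = -k_c * (n_f - n_s) * (d_l - d_p)

phi = k_c * (n_f * d_l + n_s * d_p) - k_c * (n_s * d_l + n_f * d_p) + phi_pm
assert abs(phi) < 1e-9   # zero optical-path-length difference on the optical axis
```

When the lens and plate materials are identical and d_l = d_p, the required phi_pm vanishes, which matches the text's remark that phi = 0 is obtained by setting d_l = d_p accurately.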

    A static $k_c$ can be set using the color filter array. Equation (5) indicates that $\phi_{pm}$ is set such that $\phi=0$, and the incoherent light wave incident from the optical axis is recorded as an incoherent hologram with zero optical-path-length difference. As a result, interference fringes with high visibility are obtained for the object placed on the optical axis. When the material of the lenses is the same as that of the plates, $\phi=0$ is obtained by setting $d_l=d_p$ accurately. As another aspect, the visibility of the interference fringes generated by an arbitrary wave-vector component is flexibly adjusted by changing $\phi_{pm}$. The attenuation of the interference fringes is related to the shape of the spectrum, the optical-path-length difference, and the spectral bandwidth.30 Equations (2) and (3) indicate that the phase shift of an incoherent hologram depends on $\theta$, which changes pixel by pixel. Where $\phi=0$ is set, $p_x$ and $p_y$ are the pixel pitches of the image sensor in the x- and y-axis directions, $x/p_x=-N_x/2,\ldots,0,\ldots,N_x/2-1$ and $y/p_y=-N_y/2,\ldots,0,\ldots,N_y/2-1$ have discrete integer values, which indicate the address of the photodetector of the image sensor, and $N_x$ and $N_y$ are the numbers of pixels in the x- and y-axis directions, respectively, $H_{\lambda_j}(x,y;\theta)$ is expressed as

$$H_{\lambda_j}(x,y;\theta) = \begin{cases} H_{\lambda_j}(x,y;\pi/4), & \text{where } \left(\frac{x}{p_x}+\frac{N_x}{2}\right) \text{ is even and } \left(\frac{y}{p_y}+\frac{N_y}{2}\right) \text{ is even},\\[4pt] H_{\lambda_j}(x,y;0), & \text{where } \left(\frac{x}{p_x}+\frac{N_x}{2}\right) \text{ is even and } \left(\frac{y}{p_y}+\frac{N_y}{2}\right) \text{ is odd},\\[4pt] H_{\lambda_j}(x,y;3\pi/4), & \text{where } \left(\frac{x}{p_x}+\frac{N_x}{2}\right) \text{ is odd and } \left(\frac{y}{p_y}+\frac{N_y}{2}\right) \text{ is odd},\\[4pt] H_{\lambda_j}(x,y;\pi/2), & \text{where } \left(\frac{x}{p_x}+\frac{N_x}{2}\right) \text{ is odd and } \left(\frac{y}{p_y}+\frac{N_y}{2}\right) \text{ is even}. \end{cases}\tag{6}$$
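The pixel-parity rule above can be sketched as a map of phase shifts over the sensor. The sketch below (toy sensor size, an assumption for illustration) builds the transmission-axis map and checks that every 2×2 superpixel carries the four phase shifts 0, 90, 180, and 270 deg.

```python
import numpy as np

# Toy sensor: pixel addresses x/p_x + Nx/2 run over 0..Nx-1 (assumed size).
Nx = Ny = 8
X, Y = np.meshgrid(np.arange(Nx), np.arange(Ny), indexing="ij")

# Micropolarizer transmission-axis angle theta per pixel parity (even/odd):
theta = np.empty((Nx, Ny))
theta[(X % 2 == 0) & (Y % 2 == 0)] = np.pi / 4
theta[(X % 2 == 0) & (Y % 2 == 1)] = 0.0
theta[(X % 2 == 1) & (Y % 2 == 1)] = 3 * np.pi / 4
theta[(X % 2 == 1) & (Y % 2 == 0)] = np.pi / 2

# The hologram phase shift is 2*theta, so each 2x2 superpixel samples
# the four shifts {0, pi/2, pi, 3*pi/2} needed for phase-shifting interferometry.
phase_shift = 2 * theta
assert set(np.round(phase_shift[:2, :2].ravel(), 6)) == set(
    np.round([0.0, np.pi / 2, np.pi, 3 * np.pi / 2], 6)
)
```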

    Here, we assume a simple spectral transmittance distribution of $\mathrm{CFA}(x,y)$ as

$$\mathrm{CFA}(x,y) = [T_{\lambda_R}(x,y),\,T_{\lambda_G}(x,y),\,T_{\lambda_B}(x,y)] = \begin{cases} (1,0,0), & \text{where } \left(\frac{x}{p_x}+\frac{N_x}{2}\right) \equiv p \pmod{4} \text{ and } \left(\frac{y}{p_y}+\frac{N_y}{2}\right) \equiv p \pmod{4} \quad (p=0,1),\\[4pt] (0,1,0), & \text{where } \left[\left(\frac{x}{p_x}+\frac{N_x}{2}\right) \equiv p \pmod{4} \text{ and } \left(\frac{y}{p_y}+\frac{N_y}{2}\right) \equiv q \pmod{4}\right] (q=2,3)\\ & \text{or } \left[\left(\frac{x}{p_x}+\frac{N_x}{2}\right) \equiv q \pmod{4} \text{ and } \left(\frac{y}{p_y}+\frac{N_y}{2}\right) \equiv p \pmod{4}\right],\\[4pt] (0,0,1), & \text{where } \left(\frac{x}{p_x}+\frac{N_x}{2}\right) \equiv q \pmod{4} \text{ and } \left(\frac{y}{p_y}+\frac{N_y}{2}\right) \equiv q \pmod{4}. \end{cases}\tag{7}$$
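The assumed color-filter layout can be written out directly from the mod-4 conditions: red where both pixel addresses are 0 or 1 (mod 4), blue where both are 2 or 3, and green otherwise, giving 2×2 blocks of R, G, G, and B cells. A minimal sketch:

```python
# Sketch of the 4x4-periodic color-filter layout implied by the mod-4 rule.
def cfa_color(ax: int, ay: int) -> str:
    """Color of the cell at pixel addresses (ax, ay) = (x/p_x + Nx/2, y/p_y + Ny/2)."""
    in_p = lambda a: a % 4 in (0, 1)   # p = 0, 1; otherwise q = 2, 3
    if in_p(ax) and in_p(ay):
        return "R"
    if not in_p(ax) and not in_p(ay):
        return "B"
    return "G"

# One 4x4 period: 2x2 blocks of R, G, G, B (a quad-Bayer-like arrangement).
pattern = [[cfa_color(ax, ay) for ay in range(4)] for ax in range(4)]
assert pattern == [
    ["R", "R", "G", "G"],
    ["R", "R", "G", "G"],
    ["G", "G", "B", "B"],
    ["G", "G", "B", "B"],
]
```

Each 2×2 color block thus covers a full 2×2 micropolarizer superpixel, which is what allows all four phase shifts to be captured within every color channel.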

    Equations (1)–(7) indicate that the information of four phase-shifted incoherent digital holograms at the RGB channels is recorded with a single-shot exposure. Figure 2(b) shows a photograph of the camera that we developed. The dimensions of the body (black box) were 104 mm (W) × 108 mm (H) × 204 mm (D). A CANON camera lens (EF 100 mm F2.8L Macro IS USM for the first and second experiments and EF11-24mm f/4L USM for the last experiment) and a SONY color polarization image sensor (IMX250MYR) were adopted in the developed camera. A crystal was selected as the birefringent material. A THORLABS multi-order liquid crystal phase modulator (LCC2415-VIS) was used as the variable polarimetric phase modulator.

    Full-color 3D information of an incoherent object wave is retrieved without noise from a single recorded hologram by applying the image-reconstruction procedure illustrated in Fig. 3. The recorded hologram is transferred to a computer to extract the object-wave information from the recorded image by signal processing. Multiple phase-shifted incoherent holograms at each color channel are numerically generated from the recorded hologram by applying the demosaicking procedure.19 This time, an interpolation procedure25 is not applied to the demosaicked holograms. The complex amplitude distribution at each color channel is extracted by calculating phase-shifting interferometry.28,29 The incoherent object waves at the RGB color channels on the image sensor plane, $U_{\lambda_R}(x,y)$, $U_{\lambda_G}(x,y)$, and $U_{\lambda_B}(x,y)$, are extracted using the following equations of phase-shifting interferometry:

$$U_{\lambda_R}(x,y) = H_{\lambda_R}(4x,4y+1) - H_{\lambda_R}(4x+1,4y) + i[H_{\lambda_R}(4x,4y) - H_{\lambda_R}(4x+1,4y+1)],\tag{8}$$

$$U_{\lambda_G}(x,y) = \{H_{\lambda_G}(4x,4y+3) - H_{\lambda_G}(4x+1,4y+2) + i[H_{\lambda_G}(4x,4y+2) - H_{\lambda_G}(4x+1,4y+3)] + H_{\lambda_G}(4x+2,4y+1) - H_{\lambda_G}(4x+3,4y) + i[H_{\lambda_G}(4x+2,4y) - H_{\lambda_G}(4x+3,4y+1)]\}/2,\tag{9}$$

$$U_{\lambda_B}(x,y) = H_{\lambda_B}(4x+2,4y+3) - H_{\lambda_B}(4x+3,4y+2) + i[H_{\lambda_B}(4x+2,4y+2) - H_{\lambda_B}(4x+3,4y+3)].\tag{10}$$
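The red-channel extraction above can be sketched with strided array slicing over 4×4 superpixels. In the sketch below, the demosaicked red-channel hologram is synthesized from a known toy object wave with the corresponding phase shifts (an assumption for illustration, in place of a real demosaicked recording), so the recovery can be checked exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16                                     # superpixels per side (toy size)
U = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
b = 5.0                                    # 0th-order background

# Synthesize the four phase-shifted pixels that the extraction pairs up,
# using I_delta = b + 2 Re[U exp(-i delta)]:
H_R = np.zeros((4 * n, 4 * n))
H_R[0::4, 1::4] = b + 2 * np.real(U)                                 # shift 0
H_R[1::4, 0::4] = b + 2 * np.real(U * np.exp(-1j * np.pi))           # shift pi
H_R[0::4, 0::4] = b + 2 * np.real(U * np.exp(-1j * np.pi / 2))       # shift pi/2
H_R[1::4, 1::4] = b + 2 * np.real(U * np.exp(-1j * 3 * np.pi / 2))   # shift 3*pi/2

# Red-channel phase-shifting extraction, vectorized over all superpixels:
U_R = (
    H_R[0::4, 1::4] - H_R[1::4, 0::4]
    + 1j * (H_R[0::4, 0::4] - H_R[1::4, 1::4])
)
assert np.allclose(U_R, 4 * U)             # recovered up to a constant factor
```

The green channel averages two such superpixel contributions (its cells appear twice per 4×4 period), which is why its expression carries a factor of 1/2.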


    Figure 3.Image reconstruction procedure using the recorded single hologram.

    Numerical light-wave propagations based on diffraction integrals are calculated for the object waves, and the object images focused on arbitrary depth planes are reconstructed at RGB color channels. Denoising procedures, such as nonlocal means (NLM),31 3D block matching (BM3D),32 and 4D block matching (BM4D)33 filtering and/or machine-learning-based algorithms such as denoising based on MIRNet,34,35 are applied to the reconstructed images to enhance the image quality. Finally, the full-color 3D information of objects illuminated by spatially and temporally incoherent light is holographically reconstructed without noise.
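The numerical light-wave propagation can be sketched with the band-limited angular-spectrum method, one common realization of the diffraction integral (the specific integral used by the authors is not stated here); wavelength, pixel pitch, and propagation distance below are illustrative assumptions.

```python
import numpy as np

def angular_spectrum(u0, wavelength, pitch, z):
    """Propagate the complex field u0 by distance z via the angular spectrum,
    zeroing evanescent components (FFT-based, square grid assumed)."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)    # transfer function, evanescent cut
    return np.fft.ifft2(np.fft.fft2(u0) * H)

# Refocusing is invertible: propagating forward and then backward by the same
# distance returns the original field (illustrative parameter values).
rng = np.random.default_rng(2)
u0 = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
u1 = angular_spectrum(u0, 530e-9, 3.45e-6, 0.05)
u_back = angular_spectrum(u1, 530e-9, 3.45e-6, -0.05)
assert np.allclose(u_back, u0, atol=1e-8)
```

In the reconstruction pipeline, each color channel's extracted object wave is propagated with its own wavelength to the chosen refocusing distance before the denoising step.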

    3 Results and Discussion

    We have conducted experiments using the portable natural-light motion-picture holographic camera that we developed. Initially, we set the camera and two objects on a movable wooden table with five wheels. The light source was only the sun, and the sunlight scattered by the ground glass of the window was used as the illumination light. Therefore, a full-color digital hologram was generated with fully passive and spatially and temporally incoherent light. Two clothes pegs on the table and a shoji screen were set as the objects. The distance between the two clothes pegs was 75 mm, and that between the snowman-shaped peg and the shoji was 680 mm. A natural-light full-color digital hologram was recorded by a single-shot exposure with an exposure time of 10 ms. An NLM filter31 was applied to the reconstructed images at different depths as a denoising algorithm. For comparison, reconstructed images without denoising were also obtained. Figure 4 shows the experimental results, which indicate that full-color images of the three objects were successfully reconstructed from a single natural-light hologram. A full-color digital hologram was obtained even with sunlight, which is spatially and temporally incoherent, at a 10-ms exposure time of the color polarization image sensor. Moreover, the image quality was successfully enhanced by applying denoising algorithms that are frequently applied to general incoherent imaging. Then, we investigated the imaging properties of the denoising algorithms and the qualities of the reconstructed images. Figure 5(a) shows the reconstructed image after applying a BM3D32 filter. The image shows considerably improved quality compared with the image shown in Fig. 4(d), and it was clarified that the denoising worked correctly. Figure 5 also shows the difference in the imaging properties of BM3D and NLM. Each denoising method successfully suppressed random noise. Moreover, as indicated by the red arrows in Fig. 5, the applied NLM filter suppressed artifacts due to diffraction from the image sensor plane and different object planes. It is considered that an NLM filter can suppress such artifacts, which are frequently seen in digital holography, when an object has mainly low-spatial-frequency components. To evaluate the image qualities quantitatively, we calculated standard deviations for constant-intensity-value areas, which are indicated by the yellow rectangles in Fig. 5(a). Table 1 shows the calculation results. It was quantitatively clarified that the image quality was considerably improved by strongly suppressing random noise with the numerical filters. The noise-suppression powers of the NLM and BM3D filters were at the same level.


    Figure 4.Results of experiments with sunlight illumination. Reconstructed images of (a) and (b) shoji and miniature models of (c) a clothes peg (snowman) and (d) another clothes peg (Santa Claus). (e)–(h) Reconstructed images in (a)–(d) with an NLM filter. (i) Recorded hologram. (j) Experimental setup using the developed camera without the black body. Panels (b) and (f) are brightness-improved images of panels (a) and (e).


    Figure 5.Comparison of denoising algorithms for single-shot full-color natural-light holography. (a) Full-color reconstructed image of a clothes peg (Santa Claus) after applying a BM3D filter and its (b) red-, (c) green-, and (d) blue-color components. (e) Red-, (f) green-, and (g) blue-color components of Fig. 4(f).

                          Region [i] (30 pixel × 30 pixel)    Region [ii] (30 pixel × 30 pixel)
                          Red      Green    Blue              Red      Green    Blue
    Without denoising     31.3     30.0     36.8              27.6     28.3     32.3
    BM3D                  5.09     5.77     5.76              6.00     3.46     5.80
    NLM                   4.09     4.68     5.28              5.13     2.47     4.42

    Table 1. Standard deviations of regions in the reconstructed images in Fig. 5.
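The standard-deviation metric of Table 1 can be sketched as follows: over a region that should have constant intensity, any spread measures residual noise, so denoising lowers the value. A plain box filter stands in for the NLM and BM3D filters here purely to illustrate the metric (toy data, not the recorded images).

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy reconstructed image: a constant-intensity region corrupted by noise
# (values are assumptions for illustration).
flat_value = 120.0
reconstructed = flat_value + 30.0 * rng.standard_normal((200, 200))

# Crude stand-in for NLM/BM3D: 5x5 box-filter averaging (real NLM also
# preserves edges by weighting patches, which a box filter does not).
k = 5
padded = np.pad(reconstructed, k // 2, mode="reflect")
denoised = np.zeros_like(reconstructed)
for dx in range(k):
    for dy in range(k):
        denoised += padded[dx : dx + 200, dy : dy + 200]
denoised /= k * k

# The metric: standard deviation over a 30 x 30 pixel constant-intensity region.
region = (slice(50, 80), slice(50, 80))
assert reconstructed[region].std() > denoised[region].std()
```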

    In the next experiment, we set the camera facing toward the outside of a room to capture an outdoor scene illuminated by sunlight. A metallic bar and other outdoor objects such as a house, trees, and a blue net were recorded as a full-color digital hologram and reconstructed as focused and defocused images, respectively. The exposure time was 10 ms. A BM3D filter, which utilizes sparsity and nonlocal self-similarity in an image, was applied for denoising. Furthermore, we obtained 40 holograms and reconstructed images and then applied a BM4D filter33 to the reconstructed images. Figure 6 shows the experimental results, which indicate that full-color images of an outdoor scene at different depths are reconstructed from a single natural-light hologram. The image quality was also successfully enhanced by applying denoising algorithms that are frequently applied to general incoherent imaging. Both the BM3D and BM4D filters strongly suppressed random noise. For detailed analyses of the reconstructed images, we selected and magnified areas containing high-spatial-frequency components. Figure 7 shows the magnified areas and plots for the high-spatial-frequency components of the recorded scene. Figures 7(a)–7(f) indicate that BM3D partly sacrificed resolution when denoising, although it suppresses the loss of resolution in comparison with other denoising algorithms. The loss of resolution was avoided, and the contrast of the high-spatial-frequency components was improved, by applying a BM4D filter. It was clarified that a BM4D filter is useful for denoising in the holographic motion-picture recording of static objects. We calculated standard deviations for constant-intensity-value areas, which are indicated by the yellow rectangles in Figs. 6(b) and 6(d), as quantitative evaluations. Table 2 shows the calculation results. It was quantitatively clarified that the image quality was considerably improved by strongly suppressing random noise with the numerical filters. The noise-suppression powers of the BM3D and BM4D filters were at the same level.


    Figure 6.Experimental results for an outdoor scene. Reconstructed images focused on (a)–(c) a house, trees, and a net and (d)–(f) a metallic bar. (g) Experimental setup using the developed camera without the black body. (a) and (d) Without numerical filter, (b) and (e) with BM3D, and (c) and (f) with BM4D.


    Figure 7.Comparison of denoising algorithms for natural-light full-color motion-picture holography. (a)–(c) Magnified images of Figs. 6(a)–6(c) whose area is indicated by a red rectangle seen in Fig. 6(a). (d)–(f) Magnified images of Figs. 6(a)–6(c) whose area is indicated by a purple rectangle seen in Fig. 6(a). (g) Plots of lines of (a)–(c) whose place is indicated by a red line seen in (a). (h) Plots of lines of (d)–(f) whose place is indicated by a purple line seen in (d).

                          Region [iii] (30 pixel × 30 pixel)   Region [iv] (30 pixel × 30 pixel)
                          Red      Green    Blue               Red      Green    Blue
    Without denoising     25.1     18.9     28.8               21.0     15.5     21.0
    BM3D                  2.88     2.35     2.11               1.60     1.32     2.09
    BM4D                  2.30     2.17     4.00               1.34     1.48     1.98

    Table 2. Standard deviations of regions in the reconstructed images in Fig. 6.

    We carried out an additional experiment to simultaneously demonstrate the capabilities of recording full-color 3D image information and close-up recording, as well as the applicability of the proposed holography to a moving reflective object. We set two objects, a human subject and a miniature model of a finch, which were placed at different depths, and captured the movements of the human subject as an incoherent motion-picture hologram with the developed camera. Figure 8(a) shows the experimental setup. The holographic camera was set on a movable table without any antivibration structure. The distances from the camera lens to the finch and from the finch to the human subject were 45 mm and 1.2 m, respectively. We used a white LED (RC220D, Shenzhen Leqi Network Technology Co., Ltd.) to illuminate the human subject and red, green, and blue LEDs with nominal wavelengths of 625, 530, and 455 nm and full width at half-maximum values of 18, 33, and 18 nm, respectively, to illuminate the finch. The frame rate and exposure time per frame were 22 fps and 45 ms, respectively. Denoising based on MIRNet34,35 was applied to the reconstructed images, and then, an NLM filter31 was applied for further denoising. Finally, the motion picture of the reconstructed images was generated. Figures 8(b)–8(g) and Videos 1 and 2 show the experimental results, which indicate that when the numerical refocusing of the respective objects is performed, the focused image of the finch is obtained under the condition of close-up recording, and the motion of the human subject is successfully reconstructed from the recorded holograms. From the results, the capabilities of recording full-color 3D image information and close-up recording, and the applicability of the proposed holography to a moving reflective object, were experimentally demonstrated simultaneously. Then, we comparatively and quantitatively evaluated the reconstructed images.
Figure 9 shows the reconstructed images with and without denoising algorithms. Figures 9(c) and 9(d) indicate that the MIRNet-based denoising works well for the human face but not for the miniature model of the finch. It is considered that the dataset used for machine learning36,37 affected the performance of denoising. Residual random noise on the finch and artifacts on the human face were suppressed by applying an NLM filter. Table 3 shows the calculated standard deviations for the areas indicated by the yellow rectangles shown in Figs. 9(a) and 9(b). It was quantitatively clarified that the image quality for the human face was considerably improved by strongly suppressing random noise with the MIRNet-based denoising. The MIRNet-based denoising also suppressed random noise on the finch. An NLM filter suppressed noise in the area around the finch.


    Figure 8.Experimental setup and results for objects illuminated by LED light. (a) Schematic of the experiment. The temporal differences from panels (b) and (c) are 7.8 s [(d) and (e)] and 9.5 s [(f) and (g)], respectively. Reconstructed images focused on the finch [(b), (d), and (f)] and an author (T.T.) [(c), (e), and (g)]. Denoising based on MIRNet was applied to the reconstructed images. Videos 1 and 2 are the reconstructed motion-picture images focused on the finch and human subjects, respectively, and contain panels (b), (d), and (f) and panels (c), (e), and (g), respectively (Video 1, MP4, 1.66 MB [URL: https://doi.org/10.1117/1.APN.4.3.036006.s1]; Video 2, MP4, 1.67 MB [URL: https://doi.org/10.1117/1.APN.4.3.036006.s2]).


    Figure 9.Comparison of denoising algorithms using a recorded hologram of static and moving objects. Reconstructed images without numerical filter [(a) and (b)], with denoising based on MIRNet [(c) and (d)], and with denoising based on MIRNet and then an NLM filter [(e) and (f)]. Reconstructed images focused on the finch [(a), (c), and (e)] and an author (T.T.) [(b), (d), and (f)].

                            Region [v] (20 pixel × 20 pixel)   Region [vi] (30 pixel × 30 pixel)   Region [vii] (20 pixel × 20 pixel)
                            Red     Green   Blue               Red     Green   Blue                Red     Green   Blue
    Without denoising       11.3    17.0    19.5               14.6    15.4    21.6                18.4    20.2    19.0
    MIRNet                  8.81    8.43    8.38               9.67    10.10   15.50               5.87    5.02    4.09
    MIRNet and then NLM     4.70    5.40    4.43               6.34    5.35    5.57                4.94    4.66    4.49

    Table 3. Standard deviations of regions in the reconstructed images in Fig. 9.

    We discuss the specifications of the proposed holography, such as spatial resolution, light-use efficiency, exposure time, and frame rate. The spatial resolution in 3D space is determined by the coarser of two factors: the finest pitch of the interference fringes and the pixel pitch of the demosaicked holograms. The proposed holography obtains the temporal resolution of a color polarization image sensor at the cost of part of the spatial resolution of the interference fringes. Because of the space-division multiplexing, the pixel pitch of each phase-shifted hologram in the proposed holography is four times that in phase-shifting incoherent digital holography without space-division multiplexing. The resolution of the interference fringes is therefore reduced to one-quarter, and in principle, the 3D spatial resolution is reduced to one-quarter as well. When the proposed holography is applied to microscopy, diffraction-limited imaging can be conducted using a microscope objective; however, the space-bandwidth product available for recording an object wave is limited in principle by the space-division multiplexing.38 The proposed holography adopts a linear polarizer and a micropolarizer array, so three-fourths of the light intensity of the incident incoherent object wave is discarded. The maximum light-use efficiency is therefore 25% in principle, and four times the exposure time is required under extremely weak light. The frame rate of the proposed holography is determined by that of the color polarization image sensor and by the light intensity; the light-use efficiency also affects the frame rate, so the maximum frame rate is limited when the light intensity per exposure is severely low. Therefore, the proposed holography is useful when the optical design set in front of the holography system supports the required spatial resolution, the light intensity per exposure is not extremely weak, and the dynamics of the scene can be captured within the exposure time and frame rate of a color polarization image sensor.
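    The space-division multiplexing and its costs described above can be sketched concretely. Assuming four phase steps of 0, π/2, π, and 3π/2 assigned to a 2 × 2 superpixel (the assignment, array layout, and variable names here are illustrative, not the authors' actual design), the complex object wave follows from the standard four-step phase-shifting formula, and the factor-of-four loss of fringe sampling and the 25% light-use bound follow directly:

```python
import numpy as np

def demultiplex_and_reconstruct(frame):
    """Split a polarization-multiplexed frame into four phase-shifted
    holograms (phase steps 0, pi/2, pi, 3pi/2 assumed on a 2x2 superpixel)
    and apply the standard four-step phase-shifting formula."""
    I0   = frame[0::2, 0::2]   # phase shift 0
    I90  = frame[0::2, 1::2]   # phase shift pi/2
    I180 = frame[1::2, 1::2]   # phase shift pi
    I270 = frame[1::2, 0::2]   # phase shift 3*pi/2
    # complex object wave (up to a constant factor):
    # U = (I0 - I180) + 1j * (I90 - I270)
    return (I0 - I180) + 1j * (I90 - I270)

# Each sub-hologram has half the sensor's sampling per axis; combined with
# 2x2 Bayer color demosaicking, the effective pixel pitch per color and
# phase step is four times the sensor pitch, as stated in the text.
# A linear polarizer passes ~50% of unpolarized light and each
# micropolarizer passes ~50% again, bounding the light-use efficiency:
light_use_efficiency = 0.5 * 0.5  # 25% in principle
```

    In this sketch, each phase-shifted hologram has one-quarter as many pixels as the raw frame, which is the space-bandwidth trade that buys single-shot operation.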

    We have proposed a method of single-shot natural-light full-color holography in which motion-picture holograms are obtained at the frame rate of a color polarization image sensor. The camera based on the proposed holography is portable and usable on an ordinary table. Fully passive full-color holographic 3D imaging was achieved with only sunlight and an exposure time of 10 ms. The full-color holographic 4D imaging of a moving object was performed using spatially and temporally incoherent light. Unlike time-of-flight imaging and 3D measurement technologies with structured light, such as fringe-projection profilometry, the proposed holography achieves fully passive, full-color, single-shot 3D imaging and 3D motion-picture recording at the frame rate of a color polarization image sensor. A focused image is obtained over a large depth range, and we consider that the proposed holography, combined with conventional 3D measurement technologies such as fringe-projection profilometry and stereo vision, can help extend their depth-measurement range.
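    The focused image at a large depth is retrieved by numerically propagating the recorded complex hologram to the plane of interest. A minimal sketch of the standard angular spectrum method commonly used for such refocusing is shown below (a generic textbook implementation with assumed parameter values, not the authors' code):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, z):
    """Numerically refocus a complex hologram to distance z (meters)
    via the angular spectrum method; negative z propagates backward."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)   # spatial frequencies, x axis
    fy = np.fft.fftfreq(ny, d=pitch)   # spatial frequencies, y axis
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)  # suppress evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

    Sweeping z and evaluating a focus metric at each plane yields the all-in-focus reconstructions over centimeter- to 10-m-order scenes; for full-color imaging, the propagation is simply repeated per wavelength channel.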

    We believe that the proposed holography method and the developed camera will open up a new stage toward fully passive full-color 4D holography, holographic motion-picture recording with daily-use illumination light, and full-color video-rate holography outdoors under the sun, which overcomes the limitations of conventional holographic imaging.

    Tatsuki Tahara received his DE degree from Kyoto Institute of Technology, Japan, in 2013. He was an assistant professor at Kansai University, Japan, from 2013 to 2018; a specially appointed associate professor at the National Institute of Informatics, Japan, from 2018 to 2019; a researcher with PRESTO, JST, Japan, from 2016 to 2020; and a researcher at the National Institute of Information and Communications Technology, Japan, from 2019 to 2021, where he has been a senior researcher since 2021.

    Tomoyoshi Shimobaba received his BE and ME degrees from Gunma University, Japan, in 1997 and 1999, respectively. He received his DE degree from Chiba University, Japan, in 2002. From 2002 to 2005, he was a special postdoctoral researcher at RIKEN. From 2005 to 2009, he was an associate professor at the Graduate School of Science and Engineering, Yamagata University, Japan. From 2009 to 2019, he was an associate professor in the Graduate School of Engineering, Chiba University, Japan, where he is currently a professor.

    Yuichi Kozawa received his PhD from Tohoku University, Japan, in 2008. He joined the Institute of Multidisciplinary Research for Advanced Materials (IMRAM), Tohoku University, Japan, as an assistant professor in 2008, became an associate professor in 2016, and has been a professor at IMRAM since 2024. He is a member of Optica, the Japan Society of Applied Physics (JSAP), the Optical Society of Japan (OSJ), and the Laser Society of Japan (LSJ).

    Mohamad Ammar Alsherfawi Aljazaerly received his BS degree in information technology engineering from Damascus University, Syria, in 2016, and his MS degree in computer science from Osaka University, Japan, in 2020, where he is currently pursuing a PhD in computer science. His research interests include computer vision, image processing, deep learning, and gait recognition.

    Tomoya Nakamura is an associate professor at SANKEN, Osaka University, Japan. He received his PhD from Osaka University, Japan, in 2015. From 2015 to 2020, he served as an assistant professor at Tokyo Institute of Technology, Japan. His research interests include computational imaging and holography. He is a member of Optica.

    References

    [1] D. Gabor. A new microscopic principle. Nature, 161, 777-778(1948). https://doi.org/10.1038/161777a0

    [2] A. W. Lohmann. Wavefront reconstruction for incoherent objects. J. Opt. Soc. Am., 55, 1555-1556(1965). https://doi.org/10.1364/JOSA.55.1555_1

    [3] J.-P. Liu et al. Incoherent digital holography: a review. Appl. Sci., 8, 143(2018). https://doi.org/10.3390/app8010143

    [4] T. Tahara et al. Roadmap of incoherent digital holography. Appl. Phys. B, 128, 193(2022). https://doi.org/10.1007/s00340-022-07911-x

    [5] J. Rosen et al. Roadmap on computational methods in optical imaging and holography. Appl. Phys. B, 130, 166(2024). https://doi.org/10.1007/s00340-024-08280-3

    [6] P. J. Peters. Incoherent holograms with mercury light source. Appl. Phys. Lett., 8, 209-210(1966). https://doi.org/10.1063/1.1754558

    [7] J. Rosen et al. Recent advances in self-interference incoherent digital holography. Adv. Opt. Photon., 11, 1-66(2019). https://doi.org/10.1364/AOP.11.000001

    [8] M. K. Kim. Full color natural light holographic camera. Opt. Express, 21, 9636-9642(2013). https://doi.org/10.1364/OE.21.009636

    [9] A. Vijayakumar et al. Coded aperture correlation holography—a new type of incoherent digital holograms. Opt. Express, 24, 12430-12441(2016). https://doi.org/10.1364/OE.24.012430

    [10] B. W. Schilling et al. Three-dimensional holographic fluorescence microscopy. Opt. Lett., 22, 1506-1508(1997). https://doi.org/10.1364/OL.22.001506

    [11] B. Katz, J. Rosen. Super-resolution in incoherent optical imaging using synthetic aperture with Fresnel elements. Opt. Express, 18, 962-973(2010). https://doi.org/10.1364/OE.18.000962

    [12] J. Rosen, N. Siegel, G. Brooker. Theoretical and experimental demonstration of resolution beyond the Rayleigh limit by FINCH fluorescence microscopic imaging. Opt. Express, 19, 26249-26268(2011). https://doi.org/10.1364/OE.19.026249

    [13] J. Rosen, G. Brooker. Non-scanning motionless fluorescence three-dimensional holographic microscopy. Nat. Photonics, 2, 190-195(2008). https://doi.org/10.1038/nphoton.2007.300

    [14] C. Jang et al. Holographic fluorescence microscopy with incoherent digital holographic adaptive optics. J. Biomed. Opt., 20, 111204(2015). https://doi.org/10.1117/1.JBO.20.11.111204

    [15] M. Liebel et al. 3D tracking of extracellular vesicles by holographic fluorescence imaging. Sci. Adv., 6, eabc2508(2020). https://doi.org/10.1126/sciadv.abc2508

    [16] T. Tahara et al. Single-shot wavelength-multiplexed digital holography for 3D fluorescent microscopy and other imaging modalities. Appl. Phys. Lett., 117, 031102(2020). https://doi.org/10.1063/5.0011075

    [17] D. L. Marks et al. Visible cone-beam tomography with a lensless interferometric camera. Science, 284, 2164-2166(1999). https://doi.org/10.1126/science.284.5423.2164

    [18] D. N. Naik et al. Spectrally resolved incoherent holography: 3D spatial and spectral imaging using a Mach-Zehnder radial-shearing interferometer. Opt. Lett., 39, 1857-1860(2014). https://doi.org/10.1364/OL.39.001857

    [19] T. Tahara et al. Single-shot phase-shifting incoherent digital holography. J. Opt., 19, 065705(2017). https://doi.org/10.1088/2040-8986/aa6e82

    [20] T. Nobukawa et al. Single-shot phase-shifting incoherent digital holography with multiplexed checkerboard phase gratings. Opt. Lett., 43, 1698-1701(2018). https://doi.org/10.1364/OL.43.001698

    [21] K. Choi et al. Compact self-interference incoherent digital holographic camera system with real-time operation. Opt. Express, 27, 4818-4833(2019). https://doi.org/10.1364/OE.27.004818

    [22] T. Tahara. Polarization-filterless polarization-sensitive polarization-multiplexed phase-shifting incoherent digital holography (P4IDH). Opt. Lett., 48, 3881-3884(2023). https://doi.org/10.1364/OL.491990

    [23] B. Zhu, K. Ueda. Real-time wavefront measurement based on diffraction grating holography. Opt. Commun., 225, 1-6(2003). https://doi.org/10.1016/j.optcom.2003.07.025

    [24] J. Millerd et al. Pixelated phase-mask dynamic interferometer. Proc. SPIE, 5531, 304-314(2004). https://doi.org/10.1007/3-540-29303-5_86

    [25] Y. Awatsuji, M. Sasada, T. Kubota. Parallel quasi-phase-shifting digital holography. Appl. Phys. Lett., 85, 1069-1071(2004). https://doi.org/10.1063/1.1777796

    [26] T. Tahara et al. Holography for full-color 3D imaging of natural light with single-path interferometer, ITuAE-02(2022).

    [27] G. Sirat, D. Psaltis. Conoscopic holography. Opt. Lett., 10, 4-6(1985). https://doi.org/10.1364/OL.10.000004

    [28] J. H. Bruning et al. Digital wavefront measuring interferometer for testing optical surfaces and lenses. Appl. Opt., 13, 2693-2703(1974). https://doi.org/10.1364/AO.13.002693

    [29] I. Yamaguchi, T. Zhang. Phase-shifting digital holography. Opt. Lett., 22, 1268-1270(1997). https://doi.org/10.1364/OL.22.001268

    [30] T. Tahara. Multidimension-multiplexed full-phase-encoding holography. Opt. Express, 30, 21582-21598(2022). https://doi.org/10.1364/OE.456229

    [31] A. Buades, B. Coll, J. M. Morel. Denoising image sequences does not require motion estimation, 70-74(2005).

    [32] K. Dabov et al. Image denoising by sparse 3D transform-domain collaborative filtering. IEEE Trans. Image Process., 16, 2080-2095(2007). https://doi.org/10.1109/TIP.2007.901238

    [33] M. Maggioni et al. Nonlocal transform-domain filter for volumetric data denoising and reconstruction. IEEE Trans. Image Process., 22, 119-133(2013). https://doi.org/10.1109/TIP.2012.2210725

    [34] T. Tahara, T. Shimobaba. High-speed phase-shifting incoherent digital holography. Appl. Phys. B, 129, 96(2023). https://doi.org/10.1007/s00340-023-08043-6

    [35] S. W. Zamir et al. Learning enriched features for real image restoration and enhancement, 492-511(2020).

    [36] T. Plotz, S. Roth. Benchmarking denoising algorithms with real photographs, 1586-1595(2017).

    [37] A. Abdelhamed, S. Lin, M. S. Brown. A high-quality denoising dataset for smartphone cameras, 1692-1700(2018).

    [38] T. Tahara et al. Superresolution of interference fringes in parallel four-step phase-shifting digital holography. Opt. Lett., 39, 1673-1676(2014). https://doi.org/10.1364/OL.39.001673
