1. INTRODUCTION
Although imaging with scattered light is notoriously difficult, it has wide applications in many fields, such as biomedical, astronomical, and underwater imaging. The problem is challenging because the incident wave is randomly modulated by the scattering medium, resulting in a disordered pattern containing massive numbers of speckle grains. Fortunately, thanks to recent advances in wavefront shaping [1–5], this disordered information can be redistributed and decoded from the observed speckle pattern. For instance, measuring the transmission matrix (TM) allows one to perform focusing [6–9], imaging [10–12], or information transmission [13–15] through or inside a scattering medium. While acquiring a TM in principle requires coherent illumination, its extension to the broadband case opens fascinating applications in ultrafast science. For this purpose, TM-based methods have been extended to the spectral domain by accessing the multispectral transmission matrix (MSTM) to cope with polychromatic light. Such techniques have shown success in ultrafast lasers [16–19] and fiber optics [20]. However, characterizing the scattering medium requires thousands of images captured under vibration-isolated conditions, which precludes applications in dynamic and harsh environments.
A separate approach known as speckle correlation imaging (SCI) also shows promise for broadband focusing and imaging through scattering media [21,22]. This approach relies on persistent correlations, such as the optical memory effect (OME) [23,24], which enables single-shot, high-fidelity, diffraction-limited imaging through scattering media without any prior knowledge of, or access to, the scattering medium. While promising, regular SCI, based on speckle generated under quasi-monochromatic illumination, is inherently incompatible with the extremely broad nature of white-light spectra, because the coherence length of the light source is inversely proportional to the illumination bandwidth [25]. Speckle patterns produced by broadband radiation display low-contrast structures and thus obstruct optical manipulation and inverse reconstruction.
Indeed, the broadband pattern can be thought of as the incoherent superposition of multiple monochromatic speckle patterns produced by the different frequency components of the light spectrum. Essentially, when observing a speckle pattern, an increase in wavelength (all other parameters remaining fixed) has two effects. First, the observed speckle pattern undergoes a spatial scaling: when the wavelength increases, the speckle pattern expands, and vice versa. The second effect is on the random phase shifts imparted to the scattered wave by the surface-height fluctuations. Since those phase shifts are inversely proportional to the wavelength, an increase in wavelength will decrease the random phase shifts, and vice versa. When the surface is rough on the scale of a wavelength, phase wrapping into the interval (0, 2π) will change the detailed structure of the speckle pattern [26]. For these two reasons, an extended illumination bandwidth results in dephasing and speckle aliasing [27], reducing the contrast of the observed speckle pattern and further obstructing the reconstruction process.
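The contrast loss described above can be illustrated with a small numerical sketch (a toy far-field model with an assumed surface-height map, aperture, and phase scale, not the paper's simulation): each wavelength sees the same rough surface but a different phase scaling, so the incoherently summed intensity loses contrast.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
heights = rng.uniform(0, 1, (N, N))   # shared surface-height map (arbitrary units)

def speckle(wavelength, scale=20.0):
    """Monochromatic speckle: far-field intensity of a rough phase screen.
    Phase shifts scale as 1/wavelength, so different wavelengths decorrelate."""
    phase = 2 * np.pi * scale * heights / wavelength
    pupil = np.zeros((N, N), dtype=complex)
    pupil[96:160, 96:160] = np.exp(1j * phase[96:160, 96:160])  # square aperture
    field = np.fft.fftshift(np.fft.fft2(pupil))
    return np.abs(field) ** 2

def contrast(I):
    return I.std() / I.mean()

mono = speckle(0.5)                                          # single wavelength
broad = sum(speckle(w) for w in np.linspace(0.4, 0.7, 30))   # incoherent sum
# the broadband contrast falls well below the fully developed value near 1
```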
As different spectral components generate different speckle patterns, current approaches are highly sensitive to the spectral bandwidth. The monochromatic treatment is therefore valid only as long as the bandwidth of the source is contained within the spectral correlation bandwidth of the medium. Extending operation from narrowband to broadband light can effectively ease the trade-off between filter bandwidth and detection signal-to-noise ratio (SNR). Furthermore, it can potentially help increase the penetration depth, which is critical for biomedical fields, including neuroscience.
To mitigate the constraint of narrowband operation, several imaging techniques have been developed. A straightforward approach uses spectral components such as spectral filters [28], prisms [29], or even scattering-assisted spectrometers [30] to resolve a broadband spectrum into well-separated spectral channels. This improves the speckle contrast as well as the performance of object retrieval. Although these methods have proven efficient, the trade-off between spectral resolution and spectral range can limit their performance. Moreover, imposing spectral filters on broadband sources leads to a dramatic reduction in flux, which may be impractical in some applications. The ability to use the full flux of low-flux ultra-broadband sources, such as table-top high-harmonic generation (HHG) systems [31], could provide a breakthrough for practical imaging applications. Incoherent scattering imaging can also rely on deep learning modules [32]; nevertheless, the price to pay is a fuzzy physical model and a large training dataset. To date, controlling scattered light for deep focusing under broadband illumination has attracted much attention [16–19,33]. However, scattering imaging methods that combine broadband illumination with diffraction-limited spatial resolution are currently lacking. These representative state-of-the-art scattering imaging methods, along with their effective illumination bandwidths, are summarized in Table 1.

Table 1. Comparison among State-of-the-Art Scattering Imaging Methods (W = with; W/O = without)

Method       Bandwidth (FWHM)   Captured Frames   Precision Calibration   Object Sparse Constraint   Image Fidelity
Bertolotti   40 nm              Multi-shot        W                       W/O                        High
Katz         20 nm              Single-shot       W/O                     W                          High
Edrei        1 nm               Single-shot       W                       W/O                        High
Wu           1 nm               Single-shot       W/O                     W                          High
Badon        13.8 nm            Multi-shot        W                       W/O                        Moderate
Li           60 nm              Single-shot       W                       W                          High
Divitt       200 nm             Multi-shot        W                       W                          Moderate
Lu           200 nm             Single-shot       W/O                     W                          Moderate
In this paper, we propose two individual methods for realizing scattering imaging under broadband illumination using only a single speckle measurement, under noninvasive and invasive conditions. The major difference is that invasive methods pre-calibrate the point spread function (PSF) of the scattering system, whereas noninvasive methods require no a priori information about the medium or its properties. In contrast to previous approaches, neither method requires spectral filters or additional assumptions on the spectral content of the signals. In the noninvasive approach (method A), we extract the object Fourier spectrum from the broadband pattern using a cepstrum shaping method combined with a matrix decomposition strategy. We show that, through this computational processing, the lost low-frequency information can be reproduced in the Fourier domain. After applying a modified phase retrieval procedure, sharp reconstructions can be achieved even with a speckle contrast ratio (SCR) as low as 0.0823 [26]. The invasive approach (method B) is based on an optimization framework that solves a general de-correlagraphy problem. The proposed computational framework effectively reduces the background level of the recovered image while preserving sharp edges and high-frequency features. Experimental validations under a 280 nm illumination spectrum spanning the visible region show the applicability of both methods. Our approach enables imaging through scattering media under broadband radiation, which we expect to make much more efficient use of state-of-the-art light sources such as third-generation synchrotrons and table-top HHG sources operating in heavily scattering environments. Moreover, the capability to treat broadband light can also empower the fields of fluorescence imaging, night-vision detection, and passive sensing.
2. CORRELATION ANALYSIS
For simplicity, our discussion is limited to the scope of the OME region. The recorded broadband speckle pattern I_B(r) can then be described by the convolution of the broadband PSF, S_B(r), with the object function, O(r):

I_B(r) = O(r) ∗ S_B(r).
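Within the OME range, this forward model can be sketched numerically; the binary object, random stand-in PSF, and circular FFT convolution below are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)
obj = np.zeros((64, 64)); obj[28:36, 30:34] = 1.0   # binary object O(r)
psf = rng.random((64, 64)); psf /= psf.sum()        # stand-in broadband PSF S_B(r)

# the camera records I_B = O * S_B; circular convolution via FFT is
# used here for simplicity, so total energy is preserved exactly
pattern = np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(psf)))
```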
Let us assume that the broadband light source can be regarded as a linear superposition of multiple narrowband sources, each satisfying the narrowband-illumination hypothesis. Here, λ_B and λ_n represent the central wavelengths of the broadband light and of each narrowband component, and the corresponding bandwidths are denoted by Δλ_B and Δλ_n, respectively. Considering the broadband illumination case, the broadband PSF, S_B, can be expressed as
From Eq. (4), the autocorrelation of the broadband speckle can be separated into two parts: the first term incorporates a superposition of all autocorrelation parts, whereas the second accounts for contributions from all cross-correlation parts. Following the discussion in Ref. [38] and supposing the imaging system has a circular pupil, the first part of the broadband PSF's autocorrelation can be further expressed as
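The separability that underpins speckle correlation imaging, namely that the autocorrelation of O ∗ S factors into the convolution of the two autocorrelations, can be verified numerically (circular convolution is assumed so the identity holds exactly):

```python
import numpy as np

rng = np.random.default_rng(2)
O = rng.random((32, 32))   # stand-in object
S = rng.random((32, 32))   # stand-in PSF

def circ_conv(a, b):
    # circular convolution via the convolution theorem
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

def autocorr(x):
    # Wiener-Khinchin: autocorrelation = inverse FFT of the power spectrum
    return np.real(np.fft.ifft2(np.abs(np.fft.fft2(x)) ** 2))

lhs = autocorr(circ_conv(O, S))            # autocorrelation of the speckle
rhs = circ_conv(autocorr(O), autocorr(S))  # (O ⋆ O) * (S ⋆ S)
print(np.allclose(lhs, rhs))               # → True
```

When S ⋆ S approximates a delta function (narrowband case), the right-hand side reduces to the object autocorrelation; the broadband cross-correlation terms break exactly this approximation.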
3. METHODS
A. Imaging without PSF Calibration
A radiation beam consisting of multiple wavelengths illuminates the sample, and the exiting field is scattered by an off-the-shelf diffuser to form a broadband pseudorandom pattern, which is the incoherent superposition of the speckle patterns of each wavelength [see Figs. 1(a) and 1(b)]. To reconstruct the object from the single-shot measured image, we need to solve the ill-conditioned inverse scattering problem by phase retrieval. However, as discussed in Section 2, the autocorrelation of the broadband PSF no longer satisfies the delta-function approximation, which causes the failure of the SCI method. New methods are therefore highly desirable. Here, based on experimentally observed similarities between the Fourier spectra of objects and speckles, and inspired by homomorphic filtering, we propose a spectrum refinement principle that enables extraction of the pure object Fourier spectrum from the corrupted broadband pattern spectrum. The Fourier spectrum extraction procedure is depicted in Fig. 2. Phase retrieval is then performed using the refined Fourier spectrum [see Fig. 1(d)] as input to a hybrid relaxed averaged alternating reflections (RAAR) [39] and hybrid input-output (HIO) [40,41] algorithm, with a shrink-wrap-like support update [42] (see Algorithm 1 for details). The object, support region, and Fourier phase can be retrieved simultaneously [Fig. 1(e)].
Figure 1.Principle of single-shot broadband scattering imaging. In the case of a broadband source (a), the broadband pattern (b) is the incoherent spectrally weighted sum of the monochromatic speckle patterns corresponding to all wavelengths present in the source. The size of these monochromatic speckles is geometrically scaled with increasing wavelengths, but the micro-structures are ever-changing. In the presented method, the broadband speckle is first transformed into the Fourier domain (c) and then refined in the complex cepstrum to extract an estimated object Fourier spectrum (d). (e) A modified iterative phase retrieval algorithm is then used to reconstruct the sample, support, and Fourier phase simultaneously.
Figure 2.Noninvasive broadband scattering imaging reconstruction pipeline. The broadband pattern
After measuring the broadband pattern [Fig. 2(a)], we first filter it with a two-dimensional Hanning window in real space [Fig. 2(b)] to avoid spectral leakage and then transform it into the Fourier domain [see Fig. 2(c) for the Fourier amplitude]. The part of the Fourier phase corrupted by scattering is discarded and is not shown here. We note, first, that the Fourier spectrum is sparse, with the energy of the power spectrum concentrated around zero frequency [see the blue line profile in Fig. 2(j)], which necessitates a dedicated retrieval approach. To avoid oscillations caused by near-zero values in the spectrum, we add an all-ones matrix to the Fourier amplitude and then convert it into the logarithmic domain [Fig. 2(d)]. Notably, the logarithmic transform highlights both details and noise in the spectrum; therefore, a smoothing filter based on nonlocal means (NLM) [43] is applied to exclude the noisy profile introduced by the broadband illumination [see the burrs attached to the hemispherical surface in Fig. 2(k)]. The patch size and search window of the NLM are set to 5 and 11, respectively, and the smoothing parameter h is set to 3. As h controls the decay of the Gaussian weighting in NLM, a higher value of h is suitable for a relatively simple target to obtain a smoother spectrum, and vice versa. Detailed parameter selection rules are given in Appendix C.
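The NLM smoothing step can be sketched with a minimal pixelwise implementation (the parameter values in the test are illustrative, and production code would use an optimized library routine such as skimage's `denoise_nl_means`):

```python
import numpy as np

def nlm_denoise(img, patch=5, search=11, h=3.0):
    """Minimal non-local means: each pixel becomes a weighted average of
    pixels in its search window, with weights decaying as a Gaussian of
    the squared patch distance (h controls the decay)."""
    half_s, half_p = search // 2, patch // 2
    pad = half_s + half_p
    padded = np.pad(img, pad, mode='reflect')
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad, j + pad
            ref = padded[ci - half_p:ci + half_p + 1,
                         cj - half_p:cj + half_p + 1]
            num, den = 0.0, 0.0
            for di in range(-half_s, half_s + 1):
                for dj in range(-half_s, half_s + 1):
                    cand = padded[ci + di - half_p:ci + di + half_p + 1,
                                  cj + dj - half_p:cj + dj + half_p + 1]
                    w = np.exp(-np.mean((ref - cand) ** 2) / (h * h))
                    num += w * padded[ci + di, cj + dj]
                    den += w
            out[i, j] = num / den
    return out
```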
The filtered spectrum is then transformed back into the Fourier domain [Fig. 2(f)]. The next step involves a low-rank and sparse decomposition (LRSD) procedure [44] based on robust principal component analysis (RPCA) [45] to remove the residual peak noise caused by the broadband illumination [highlighted with a red arrow in Fig. 2(k)]. We classify this peak noise into a sparse part according to its sparsity [Fig. 2(g)], and the low-rank part represents the final separated Fourier spectrum [Fig. 2(h)]. The ground-truth object Fourier spectrum is presented in Fig. 2(i). Note that the low-frequency information reappears in the Fourier domain after processing [see the comparison in Figs. 2(c), 2(h), and 2(i)]. The central line profiles along the two opposite arrows in Figs. 2(c), 2(h), and 2(i), depicted by the gray, red, and blue lines, also verify the effectiveness of the proposed method, showing richer low-frequency information and a higher cut-off frequency.
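The LRSD step can be sketched with a compact RPCA solver; the inexact augmented Lagrange multiplier scheme below follows the standard robust-PCA formulation [45], with default parameters that are generic choices rather than the paper's settings:

```python
import numpy as np

def svd_shrink(M, tau):
    # singular-value soft-thresholding (low-rank proximal step)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0)) @ Vt

def soft(M, tau):
    # entrywise soft-thresholding (sparse proximal step)
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0)

def rpca(D, tol=1e-7, max_iter=500):
    """Decompose D = L (low rank) + S (sparse) by the inexact augmented
    Lagrange multiplier method for robust PCA."""
    m, n = D.shape
    lam = 1.0 / np.sqrt(max(m, n))
    norm2 = np.linalg.norm(D, 2)
    Y = D / max(norm2, np.abs(D).max() / lam)   # dual initialization
    mu, rho = 1.25 / norm2, 1.5
    S = np.zeros_like(D)
    for _ in range(max_iter):
        L = svd_shrink(D - S + Y / mu, 1.0 / mu)
        S = soft(D - L + Y / mu, lam / mu)
        R = D - L - S
        Y = Y + mu * R
        mu = rho * mu
        if np.linalg.norm(R) / np.linalg.norm(D) < tol:
            break
    return L, S
```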
Finally, we obtain the refined Fourier spectrum [Fig. 2(h)] and, in the next step, perform phase retrieval to reconstruct the target information. However, as the broadband information is unknown, object spectrum recovery is an ill-conditioned problem; therefore, a robust phase retrieval procedure is needed. Here, we modified the iterative alternating projection method by introducing an adaptive support region and a hybrid phase retrieval strategy. Specifically, the iteration combines the RAAR and HIO algorithms with a shrink-wrap-like support update. In each iteration we calculate the root-mean-square error (RMSE) between the refined Fourier amplitude and the Fourier amplitude generated by the estimated object. We typically use 80 iterations to ensure that the RMSE of the RAAR stage has converged. The subsequent HIO stage smears out the background noise, and 10 iterations are typically adequate to obtain the final result. The same feedback parameter is used for both the RAAR and HIO algorithms. During the iteration, the adaptive support region is controlled by alternating a convolution with a Gaussian kernel (Algorithm 1, step 6) and a threshold operation (Algorithm 1, step 7), which act as dilation and erosion. As the iteration number increases, the exported support region shrinks from a loose size to a tight one (Algorithm 1, steps 2–7). The initial support size is chosen relative to the pixel size of the imaging sensor. It is worth emphasizing that we selected a non-centrosymmetric support, for instance an equilateral triangle, as the initial support. Using a non-centrosymmetric support region considerably speeds up convergence and effectively avoids the twin-image problem.
To explain the process in greater detail, four projection operations are defined. The projectors P_S and P_M and the reflectors R_S and R_M act on points with respect to two sets, S (support) and M (modulus) [46]. When the image belongs to both sets simultaneously, we reach an equivalent solution. The support projector P_S sets to 0 the components outside the support while leaving the remaining values unchanged.
The reflectors of the support projector and the modulus projector are defined as R_S = 2P_S − I and R_M = 2P_M − I, respectively, where I is the identity operator.
According to the notation above, the recursive form of the HIO algorithm is

x_{k+1} = [P_S P_M + (I − P_S)(I − βP_M)] x_k,

where β is the feedback parameter.
For clarity, we describe our modified phase retrieval procedure in Algorithm 1 (Modified phase retrieval algorithm, steps 1–14; the step-by-step listing is not reproduced here).
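One ingredient of Algorithm 1 can be sketched as follows: HIO iterations with a shrink-wrap-style support update. The RAAR stage, the RMSE monitoring, and the non-centrosymmetric initial support are omitted, and the threshold, blur width, and iteration counts are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hio_shrinkwrap(fourier_mag, n_iter=200, beta=0.9):
    """HIO phase retrieval with a periodic shrink-wrap support refit
    (a simplified sketch of the hybrid RAAR/HIO scheme in the text)."""
    rng = np.random.default_rng(0)
    x = rng.random(fourier_mag.shape)
    support = np.ones_like(x, dtype=bool)        # loose initial support
    for k in range(n_iter):
        # modulus projection: impose the measured Fourier amplitude
        F = np.fft.fft2(x)
        Pm = np.real(np.fft.ifft2(fourier_mag * np.exp(1j * np.angle(F))))
        # HIO update: keep Pm inside the support, damp it outside
        inside = support & (Pm > 0)
        x = np.where(inside, Pm, x - beta * Pm)
        # shrink-wrap: every 20 iterations, blur the current estimate
        # and threshold it to refit the support (dilation/erosion-like)
        if k % 20 == 19:
            blurred = gaussian_filter(np.abs(x), sigma=2.0)
            support = blurred > 0.1 * blurred.max()
    return np.where(support, x, 0), support
```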
In the present method, phase retrieval is a standalone step that depends solely on the refined spectrum. It can also be integrated as a module in algorithms for other types of lensless imaging (coherent diffraction imaging, holography, ptychography) to allow them to cope with broadband sources.
B. Imaging with PSF Calibration
Figure 3 presents the data acquisition and reconstruction process with PSF calibration. An object and a pinhole are alternately placed at the same spatial position in the optical configuration (see details in Appendix C), which captures the broadband speckle and the corresponding PSF. The raw captured speckle and PSF are then lowpass filtered in the frequency domain (a typical filter length is 15) to eliminate the non-uniform illumination background. A two-dimensional Gaussian lowpass filter of size 2 × 2 is then applied in real space to further smooth the pattern.
Figure 3.Broadband scattering imaging reconstruction pipeline with calibrated PSF. In the OME range, the broadband speckle
After preprocessing, we introduce two types of reconstruction algorithms. The first is based on the cross-correlation strategy:

O_est(r) = I_B(r) ⋆ S_B(r) = F⁻¹{ F[I_B(r)] · conj{F[S_B(r)]} },

where ⋆ denotes cross-correlation.
Here F⁻¹ represents the inverse Fourier transform, and conj(·) denotes the complex conjugate. Inspired by interferenceless coded aperture correlation holography [47], a modified cross-correlation approach, referred to as the nonlinear reconstruction process, raises the spectral magnitudes to adjustable powers while retaining the spectral phases:
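Both reconstructions can be sketched in a few lines; the fractional powers `alpha` and `beta` below are assumed tuning values, not the paper's calibrated settings:

```python
import numpy as np

def xcorr_recon(I, S):
    """Linear cross-correlation reconstruction:
    O_est = F^-1{ F(I) · conj(F(S)) }."""
    return np.real(np.fft.ifft2(np.fft.fft2(I) * np.conj(np.fft.fft2(S))))

def nonlinear_recon(I, S, alpha=0.5, beta=0.5):
    """Nonlinear variant: spectral magnitudes are raised to fractional
    powers while the phases are kept, sharpening the correlation peak."""
    FI, FS = np.fft.fft2(I), np.fft.fft2(S)
    G = (np.abs(FI) ** alpha * np.exp(1j * np.angle(FI))
         * np.abs(FS) ** beta * np.exp(-1j * np.angle(FS)))
    return np.abs(np.fft.ifft2(G))
```

For a point object, the speckle is a shifted copy of the PSF, so both reconstructions should peak at the shift position.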
Mathematically, broadband scattering imaging can also be formulated as the following forward model:
We develop an iterative inverse algorithm, modified from the sparse Poisson intensity reconstruction algorithms theory and practice (SPIRAL-TAP) [49], to obtain an approximate solution of the inverse problem in Eq. (14). In our case, the forward model matrix A is implemented as a function, and the adjoint of A is specified separately, denoting the back-projection process. In contrast to previous deconvolution approaches, the forward and backward models are replaced with the cross-correlation [Eq. (11)] and the nonlinear reconstruction operation [Eq. (12)], respectively. The adjustable parameters are set the same as in the nonlinear reconstruction. The SPIRAL-TAP approach for the de-correlagraphy problem considers the iterations sufficiently close to the optimal solution once the relative change between successive iterates falls below a preset tolerance.
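As a simplified stand-in for the SPIRAL-TAP solver, the sketch below uses an ISTA-style proximal gradient with a plain circular-convolution forward model and its correlation adjoint; TV regularization is replaced by soft-thresholding with nonnegativity for brevity, so this is an illustration of the optimization framework, not the paper's algorithm:

```python
import numpy as np

def ista_recon(y, S, n_iter=50, lam=1e-3):
    """ISTA-style regularized inversion sketch: forward model A x = x * S
    (circular convolution), adjoint A^T r = r ⋆ S (correlation);
    sparsity enforced by soft-thresholding plus nonnegativity."""
    FS = np.fft.fft2(S)
    A = lambda x: np.real(np.fft.ifft2(np.fft.fft2(x) * FS))
    At = lambda r: np.real(np.fft.ifft2(np.fft.fft2(r) * np.conj(FS)))
    step = 1.0 / (np.abs(FS).max() ** 2)      # 1/L with L = ||A||^2
    x = np.zeros_like(y)
    for _ in range(n_iter):
        g = At(A(x) - y)                      # gradient of 0.5||Ax - y||^2
        x = np.maximum(x - step * g - lam, 0)  # prox: shrink + nonnegativity
    return x
```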
Some artifacts from the recovery process nevertheless deserve attention, for instance, the abnormal cross-section lines indicated by gray dashed lines in the nonlinear reconstruction [Fig. 3(f)]. These lines originate from the noninteger power transformation of the spectrum in the nonlinear reconstruction, which enhances detail and high-frequency noise simultaneously and unavoidably produces such artifacts. As the adjustable power parameters decrease, the resolution of the reconstruction improves moderately, while the noise increases dramatically (details can be found in Appendix C). However, thanks to the regularized inversion method, these artifacts are not dominant in the recovered pattern [see the comparison in Fig. 3(f)]. In addition, the differences between the values of points
4. RESULTS AND DISCUSSION
Experimental validation. The single-shot method A was validated in the visible domain with a broadband, spatially incoherent source whose spectrum has a full width at half-maximum of 280 nm. Then, method B was tested under the same experimental conditions with additional PSF information. The objects employed in these experiments are ubiquitous in the field of scattering imaging and allow for algorithmic validation and resolution estimation.
A. Experimental Validation without PSF Calibration
A lithographic letter “B” imaged by a lens is shown in Fig. 4(a). The experiment was performed using a commercial broadband light-emitting diode (LED) (Thorlabs, MBB1L3, wavelength range 400–1100 nm). The collimated light is incident on the object and then propagates 24 cm to the diffuser (Edmund, 45-656). The light scattered from the diffuser was recorded by a scientific complementary metal-oxide-semiconductor (sCMOS) camera (PCO edge 4.2, pixel size 6.5 μm) placed 12 cm from the diffuser. Further details on the experimental setup can be found in Appendix C.
Figure 4.Experimental validation without calibrating PSF with an illumination bandwidth of
Our single-shot method was then applied to the broadband pattern, evaluated at the center of mass of the broadband spectrum. Two other state-of-the-art single-shot methods were also introduced for comparison. The SCI method pioneered by Katz
Apparently, both methods [Figs. 4(c) and 4(d)] failed to reconstruct the object and the corresponding Fourier phase under broadband illumination. From the correlation perspective, the failure of speckle correlation is attributable to the fact that the autocorrelation of the broadband PSF cannot be approximated as a delta function, as described in Eq. (6) and noted in Fig. 1. The estimate of the object autocorrelation is therefore degraded, which further obstructs phase retrieval. The statistical noise of the bispectrum is higher than that of the autocorrelation, so more speckle grains are required for reconstruction, making it difficult to cope with low-contrast broadband patterns. The proposed method successfully reconstructs the target information, with a PSNR as high as 22.63 dB. The similarity between the object Fourier phase and the reconstructed result [see Figs. 4(e) and 4(f)] is obvious.
It is worth emphasizing that iterative phase retrieval algorithms are highly sensitive to the initial guess, the support constraint, and the convergence criterion. The twin-image problem interferes with the direction selection in the iterative process, causing the phase retrieval algorithm to oscillate and preventing it from quickly escaping local minima. The tuning rules are therefore very general and must be established experimentally by trial and error.
Here, thanks to the initial non-centrosymmetric support region estimation, the adaptive shrink-wrap strategy, and the hybrid retrieval method, we achieve a reconstruction success rate as high as 85%. The empirical probability of success is an average over 1000 trials, in each of which we generate new random matrices as the algorithm input. We declare a trial successful if the SSIM between the ground truth and the reconstruction exceeds a preset threshold.
B. Experimental Validation with PSF Calibration
The main reconstruction comparison between different state-of-the-art approaches is presented in Fig. 5. A tailored USAF 1951 target (Edmund, 55-622) with group 0 elements 3–5 is selected as the ground-truth object. The broadband PSF is measured with a 100 μm pinhole at the same spatial position as the object. The reconstructed results of different algorithms, including the cross-correlation, nonlinear reconstruction, Edrei
Figure 5.Experimental validation with PSF calibration under broadband illumination (
As indicated, all methods locate the object pattern correctly. Our reconstruction matches the ground truth closely, with the PSNR further increasing to 18.57 dB, higher than that of the nonlinear algorithm (15.36 dB). In Fig. 5(f) we compare the intensity profiles along the line indicated by the yellow arrow [see Fig. 5(a)] in each subgraph. Benefiting from the constraint of the regularization term, our results show a significant difference between target and background. The TV regularization effectively reduces the background level and compensates for the artificial traces caused by power-spectrum modulation [see Figs. 3(f) and 5(c)]. Our reconstruction has the smallest background while keeping the highest image contrast. The hollow numbers “4” and “5” close to the edge of the resolution target are also preserved.
An alternative reconstruction uses the Lucy–Richardson deconvolution algorithm with 400 iterations [52] [Fig. 5(d)]. Owing to the inherent linear convolution model, the deconvolution algorithm produces an acceptable result [see Fig. 5(d)]. Correlagraphy-based reconstructions [Figs. 5(b) and 5(c)] also show clear, sharp-edged results with high reconstruction speed. The effects of background noise and artifacts are well compensated by our improved method [Figs. 5(e) and 5(f)]. This comparison also reveals that the correlation-based methods, with [Fig. 5(e)] or without regularization [Figs. 5(b) and 5(c)], do not require precise optical calibration while performing well in broadband scattering imaging [53].
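The Lucy–Richardson update used for this comparison can be sketched compactly with FFT-based circular convolutions (a generic textbook implementation, not the paper's code):

```python
import numpy as np

def richardson_lucy(y, psf, n_iter=50, eps=1e-12):
    """Plain Richardson-Lucy deconvolution:
    x_{k+1} = x_k · [ (y / (x_k * psf)) ⋆ psf ],
    with * circular convolution and ⋆ correlation via the FFT."""
    Fp = np.fft.fft2(psf)
    conv = lambda a: np.real(np.fft.ifft2(np.fft.fft2(a) * Fp))
    corr = lambda a: np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(Fp)))
    x = np.full_like(y, y.mean())          # flat positive initialization
    for _ in range(n_iter):
        x = x * corr(y / (conv(x) + eps))  # multiplicative update
    return x
```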
In addition, although the runtime is slightly longer when processing on a CPU (see Fig. 5), it can be shortened dramatically by parallel computation on a GPU (approximately 10 s for four typical iterations on a GeForce RTX 2080 Ti). Further comparisons of the different approaches under narrowband and broadband illumination are arranged in Appendices B to E.
Due to temporal-spectral reciprocity, ultrafast sources also produce equivalently broadband spectra. Naturally, our method can be used for plug-and-play image restoration in ultrafast science and passive sensing. To demonstrate this point, we compare three types of widely used scientific broad-spectrum light sources and show the feasibility of our methods in numerical simulation. Figure 6(a) illustrates a transform-limited femtosecond pulse with a 100 fs pulse width and a spectrum ranging from 480 nm to 720 nm. The second revolutionary broadband source is an optical frequency comb with a fixed comb spacing and a spectrum ranging from 360 nm to 830 nm; the normalized spectrum of the comb is shown in Fig. 6(b). The third broadband source is the “D65” illuminant [see the spectrum in Fig. 6(c)], which is commonly used to simulate artificial sunlight. The broad-spectrum PSFs and speckles generated by these three light sources, and the corresponding imaging results via methods A and B, are shown in Figs. 6(d)–6(f), respectively.
Figure 6.Comparison of the reconstructions under three different types of broadband illumination calculated from numerical simulation. (a) The normalized spectrum of a femtosecond pulse with
All three broadband speckles produce high-fidelity results with both methods. Intriguingly, our noninvasive method (method A) performs similarly to the invasive method (method B). A possible explanation for this behavior could be the noise-suppressing properties of the NLM method used in cepstrum filtering. Further discussion of the speckle contrast and the imaging performance changes with spectral spread is arranged in Appendix B.
While methods A and B both provide promising solutions to the lensless broadband scattering imaging problem, the hidden scenes are restricted to two-dimensional planar objects, and the effective field of view still lies within the OME range. For method A, recovering the high-frequency features of the spectrum remains a great challenge because the complete scattering degradation process is unknown. As a result, this noninvasive method cannot be used to image dense, connected samples, a limitation that applies to all existing statistics-based scattering imaging methods. For method B, there is a trade-off between reconstruction detail and the degree of denoising owing to the TV regularization term. Moreover, the complexity of the algorithm needs to be further reduced. We believe these drawbacks can be overcome in the future.
With respect to improving imaging performance and capability on challenging scenarios, the next step is to implement our method on a complex 3D scene by leveraging the stereo vision [54] or light field solutions [55]. Higher-fidelity reconstruction is anticipated by applying deep learning modules or scattering matrix inversion to unveil the detailed scattering process. Benefiting from the single-shot data acquisition strategy, future works can also be expanded to the dynamic scattering region and non-line-of-sight imaging.
5. CONCLUSION
In summary, we have proposed two individual lensless broadband scattering imaging techniques, both of which produce high-fidelity reconstructions from a single-shot speckle pattern recorded under wide-spectrum illumination (280 nm FWHM) spanning the visible region. The noninvasive method (method A) reproduces the object Fourier spectrum by numerical cepstrum refinement combined with a modified phase retrieval strategy. The invasive method (method B) is motivated by the correlation, and it outperforms state-of-the-art approaches by benefiting from the optimization framework and suitable regularizers. No additional measurement of the source spectrum is needed; in both methods the spectrum may be ultra-broadband, highly structured, and even exhibit significant intensity fluctuations. We expect that the possibility of using broadband illumination in lensless scattering imaging demonstrated here will open up a wide range of opportunities in ultrafast science and astronomical observation and may even be extended to the X-ray imaging regime [56] and non-line-of-sight imaging [57]. Additionally, it can be implemented for unknown-target tracking by incorporating cross-correlation analysis [58] or applied in other dynamic scattering scenarios owing to its single-shot capability [59].
Acknowledgment
The authors thank Xin Huang, Jianjiang Liu, Yijun Zhou, Zhengping Li, and Chengfei Guo for fruitful discussions and useful feedback. Our deepest gratitude goes to the anonymous reviewers for their careful work and thoughtful suggestions that have helped improve this paper substantially.
APPENDIX A: TWO-WAVELENGTH MUTUAL INTENSITY CORRELATION FUNCTION
In this section, we present additional mathematical details about the intensity cross-correlation part in Eq. (
Figure 7.Transmission scattering geometry and coordinate. (a) Free space transmission scattering geometry. (b) Incidence and observation directions, (c)
Because we only consider the modulus of the correlation function, the quadratic-phase complex exponential is ignored outside the integral as Eq. (
The field in the plane can be written as
The phase shift suffered at the rough interface is rewritten as
Our goal is to calculate the autocorrelation function of the speckle field at two points and , that is,
The cross correlation between two observed fields and can now be written as
Substituting Eq. (
Substituting Eq. (
To simplify Eq. (
With the help of Eq. (
Here, the intensity distribution across the scattering spot appears in the integrand. To make further progress, a specific shape for the scattering spot must be assumed. Let the spot be uniform and circular with diameter D, centered on the origin. Using the Hankel transform, we obtain
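The missing closed form here is presumably the classic "besinc" correlation of speckle theory; under the stated assumptions (uniform circular spot of diameter D, observation distance z, mean wavelength λ̄, transverse separation Δr), a standard reconstruction reads:

```latex
% standard Van Cittert-Zernike / speckle-correlation form (symbols assumed)
\mu(\Delta r) \;=\;
  \frac{2\,J_1\!\left(\pi D\,\Delta r / \bar{\lambda} z\right)}
       {\pi D\,\Delta r / \bar{\lambda} z}
```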
For further simplification, let us introduce the same approximation used in the calculation of monochromatic speckle:
Here the overbar denotes the average of the two wavelengths, and the subscripts index the individual wavelengths. The normalized cross-correlation function is found by dividing by its value at zero separation. The result is
Obviously, the wavelength variation has two effects. First, the observed speckle pattern undergoes a spatial scaling by an amount that depends on distance from the point where the scattering event occurs. When the wavelength increases, the speckle pattern expands about this point, and when the wavelength decreases, it shrinks about this point. This effect arises because diffraction angles are inversely proportional to wavelength. The second effect is on the random phase shifts imparted to the scattered wave by the surface-height fluctuations. Since those phase shifts are inversely proportional to wavelength, an increase in wavelength will decrease the random phase shifts, and vice versa. When the surface is rough on the scale of a wavelength, phase wrapping into the interval (0, 2π) will change the detailed structure of the speckle pattern.
If the surface-height fluctuations obey Gaussian statistics, the change in correlation caused by the imparted phase shifts is given by
In the case that the incidence direction and observation direction are both normal,
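This Gaussian-statistics factor is just the characteristic function of the height distribution: for a Gaussian height h with standard deviation σ_h, the field correlation between two wavenumbers through a thin phase screen is ⟨exp[iΔk(n−1)h]⟩ = exp[−Δk²(n−1)²σ_h²/2]. A Monte Carlo sanity check (σ_h, the index contrast, and the wavelength pair are our assumed values):

```python
import numpy as np

rng = np.random.default_rng(1)

sigma_h = 1e-6                      # assumed surface-height std [m]
n_minus_1 = 0.52                    # index contrast of a glass diffuser
lam_i, lam_j = 550e-9, 560e-9       # assumed wavelength pair
dk = 2 * np.pi / lam_i - 2 * np.pi / lam_j

# Monte Carlo average of the wavelength-difference phase factor
h = rng.normal(0.0, sigma_h, 200_000)
mc = abs(np.mean(np.exp(1j * dk * n_minus_1 * h)))

# Gaussian characteristic function: exp(-(dk*(n-1)*sigma_h)^2 / 2)
analytic = np.exp(-0.5 * (dk * n_minus_1 * sigma_h) ** 2)
print(mc, analytic)
```

The sampled average reproduces the analytic decorrelation factor, confirming that a larger σ_h or a wider wavelength separation decorrelates the speckles faster.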
According to the Siegert relation [
We can now write the intensity cross-correlation function as
This equation represents the final result of our analysis of the cross-correlation between speckles at two arbitrary wavelengths. When equals , Eq. (
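The Siegert step above can be checked numerically: for two jointly circular-Gaussian speckle fields with field correlation ρ, the normalized intensity cross-correlation is 1 + |ρ|². A Monte Carlo sketch (the value of ρ and the sample count are our choices):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 400_000
rho = 0.6                                  # assumed field correlation

# Two jointly circular-Gaussian speckle fields with <E1* E2> = rho
a = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
b = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
E1 = a
E2 = rho * a + np.sqrt(1 - rho**2) * b
I1, I2 = np.abs(E1) ** 2, np.abs(E2) ** 2

# Siegert relation: <I1 I2> / (<I1><I2>) = 1 + |rho|^2
g2 = np.mean(I1 * I2) / (np.mean(I1) * np.mean(I2))
print(g2)
```

The estimate converges to 1 + ρ² = 1.36, which is why the field cross-correlation derived above translates directly into a measurable intensity correlation.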
APPENDIX B: IMAGING PERFORMANCE WITH INCREASING ILLUMINATION BANDWIDTH
To support the experimental data, we performed full numerical simulations of wave propagation in disordered media. In the simulations, the diffuser is inserted at a distance in front of the detector and behind the object. The propagation from the source plane to the observation plane can be divided into three stages: i) the field propagates from the source plane to the scattering plane via free-space propagation; ii) the wavefront is modulated by the diffuser; iii) the disordered wavefront propagates to the detector plane via free-space propagation. Here, and , , stand for the wave vector and the coordinates of the three planes, respectively. Free-space propagation can be modeled as a Fresnel diffraction integral, where the notation is adapted from that of Nazarathy and Shamir [
The operators’ parameters are given in square brackets, and the operand in curly braces. We model the diffuser transmittance as a pure-phase random mask, i.e., , where is the random height of the diffuser surface and is the difference between the refractive indices of the diffuser and the surrounding air ( for glass diffusers). A circular pupil is added behind the diffuser to control the speckle grain size (the diameter of the pupil is used in the simulation).
The Fresnel diffraction operator propagates only monochromatic waves. Any broadband signal can be propagated by first writing it as a superposition of monochromatic waves and then propagating each one individually. According to the above description, the broadband PSF can be written as
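The three-stage pipeline and the incoherent superposition over wavelengths can be sketched in a few lines of NumPy (grid size, distances, surface-height statistics, pupil size, and spectral band below are illustrative assumptions, not the values used in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# --- assumed geometry and diffuser statistics (illustrative values only) ---
N = 256                    # grid size
dx = 2e-6                  # sampling at the diffuser plane [m]
z = 0.05                   # diffuser-to-detector distance [m]
n_minus_1 = 0.52           # refractive-index contrast of the glass
h = rng.normal(0.0, 3e-6, (N, N))   # random surface heights, sigma_h = 3 um

x = (np.arange(N) - N / 2) * dx
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2) <= (N * dx / 4) ** 2   # circular pupil behind diffuser

def fresnel_propagate(u0, lam):
    """Stage iii): free-space Fresnel propagation via the transfer function."""
    fx = np.fft.fftfreq(N, dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * lam * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u0) * H)

def speckle(lam):
    """Stage ii) then iii): pure-phase diffuser mask, then propagation."""
    t = np.exp(1j * (2 * np.pi / lam) * n_minus_1 * h)
    return np.abs(fresnel_propagate(t * pupil, lam)) ** 2

def contrast(I):
    return I.std() / I.mean()

mono = speckle(550e-9)                       # single monochromatic component
wavelengths = np.linspace(450e-9, 750e-9, 31)
broad = sum(speckle(l) for l in wavelengths) # incoherent (intensity) sum
print(contrast(mono), contrast(broad))
```

With these values the monochromatic pattern should be fully developed (contrast near 1), while the 300 nm band averages several mutually decorrelated speckle patterns, so the broadband contrast drops markedly, mirroring the low-contrast speckle discussed in the paper.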
The emulation results of speckles with various illumination bandwidths are shown in Fig.
Figure 8. Emulation results under different illumination spectral widths. (a) Normalized spectrum profile of illumination bandwidth FWHM varying from
Figure 9. Scattering imaging results under different illumination bandwidths via our proposed method A using simulated data. (a) Ground truth object. (b) Fourier amplitude of (a). (c) Recovered results using speckles under different illumination bandwidths [Fig.
Figure 10. Imaging results with quantitative performance under different illumination bandwidths via our proposed method B using simulated data. (a) Ground truth object. (b)–(g) Recovered hidden objects using speckles from Fig.
As mentioned above and in the main text, one can in principle use a detection bandwidth much larger than the speckle spectral decorrelation bandwidth and still effectively employ our proposed approaches.
APPENDIX C: SYSTEM CONFIGURATION AND EXPERIMENT RESULTS
The experimental setup for imaging in transmission is shown in Fig.
Figure 11. Experimental setup. A scattering slab is illuminated by a wide-spectrum LED light source with a spectral range from 400 to 1100 nm (
Figure 12. Experimental comparison of broadband and narrowband illuminated imaging performance via the cross-correlation strategy. First column, photographs of the ground truth objects. Columns 2–5 show imaging results from a 220-grit ground glass diffuser with broadband illumination (
Figure 13. Nonlinear reconstruction results of the letter “
Figure 14. Extended imaging results of method A. (a) Ground truth object “B.” (b)–(d) The recovered letter “B” from red, green, and white light via method A. (e) Ground truth object “star.” (f) Retrieved object “star” under broadband illumination (
Figure 15. Complex micro-target imaging via the proposed method B. (a) The broadband spectrum (orange) with a bandwidth of 44.8% and the narrowband spectra (red, 2.7% bandwidth; green, 6.6% bandwidth). (b)–(d) Speckle patterns under red, green, and white illumination, respectively. (e) Ground truth object. The yellow arrows indicate a fine feature of about 10 μm. (f)–(h) The corresponding retrieved objects from (b)–(d) using the 220-grit ground glass diffuser with a 25 μm pinhole. Scale bars are 300 camera pixels in (b)–(d) and 100 μm in (e)–(h).
Figure 16. Image reconstruction with changing NLM filtering parameters
Figure 17. Results for different parameters, reconstructed by method B. The leftmost image is the ground truth used in the simulation. The second to fourth columns represent different
Quantitative Analysis for Different Parameters Using Method B

PSNR   13.53   13.53   10.76
SSIM    0.46    0.47    0.13

PSNR   13.51   13.56   10.75
SSIM    0.47    0.51    0.13

PSNR   12.58   12.58    9.85
SSIM    0.32    0.34    0.09
APPENDIX D: SPATIAL AND SPECTRAL CORRELATION ANALYSIS
To quantify the effective imaging scope of the broadband scattering approach, we measured the translational and longitudinal OME ranges of the scattering media. The results are summarized in Fig.
Figure 18. Measurement of the spatial speckle decorrelation as a function of the lateral and longitudinal displacement. (a) and (b) Translational OME range of the 220-grit and 600-grit ground glass, respectively. (d) and (e) Longitudinal OME range of the 220-grit and 600-grit ground glass. (c) and (f) Effective OME range for the two types of ground glass diffusers. All experiments were repeated for three spectral bands: red, green, and white.
Figure 19. (a)–(d) Monochromatic speckles at
Figure 20. Imaging resolution as a function of pinhole probe size. (a)–(c) Recovered results from red, green, and white illuminated patterns using 220-grit ground glass with a 50 μm, 100 μm, and 200 μm pinhole, respectively. (d) Broadband PSF autocorrelation as a function of the pinhole size for each PSF corresponding to (a)–(c). (e) Normalized autocorrelation intensity profiles of all PSFs along the arrow presented in (d). Scale bars are 50 camera pixels in (a)–(c).
Figure 21. Imaging depth-of-field analysis with a fixed interval
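The decorrelation curves above are obtained by correlating displaced speckle frames against a reference frame. A minimal sketch of such a correlation metric (the frames here are random stand-ins, not measured data):

```python
import numpy as np

def corr_coeff(I_ref, I_shift):
    """Zero-mean normalized cross-correlation between two speckle
    intensity patterns: 1 = identical, ~0 = fully decorrelated."""
    a = I_ref - I_ref.mean()
    b = I_shift - I_shift.mean()
    return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

rng = np.random.default_rng(3)
I0 = rng.random((128, 128))       # stand-in for a reference speckle frame
I_same = I0.copy()                # zero displacement
I_far = rng.random((128, 128))    # displacement far beyond the OME range
print(corr_coeff(I0, I_same), corr_coeff(I0, I_far))
```

Sweeping the displacement and recording this coefficient traces out the OME curves; the range is conventionally read off where the coefficient falls to a fixed threshold (e.g., 1/2 or 1/e).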
APPENDIX E: NOVEL BROADBAND SPECKLE CORRELATION PHENOMENON
Figure
Figure 22. Anisotropic correlations in broadband speckle patterns. (a) Broadband speckle pattern obtained by numerical simulation. (b) Enlarged border parts I, II, III, and IV of (a). (c) Autocorrelation of different parts in (b). (d) Enlarged central part V and its corresponding autocorrelation. (e)–(h) Real experiment observed broadband speckle and the anisotropic correlation phenomenon from 220-grit ground glass. The scale bars in (a) and (e) are 200 camera pixels.
Figure 23. Position-dependent broadband speckle memory effect. (a) A typical scrambled pattern under broadband irradiation. (b) and (c) Translation memory effect range of the 600-grit and 220-grit ground glass calculated from different regions. The translation memory effect range is wider for the central part (orange box) than for the boundary part (blue box). The scale bar is 200 camera pixels in (a).
APPENDIX F: MEDIA
Please refer to
References
[4] J. Kubby, S. Gigan, M. Cui. Wavefront Shaping for Biomedical Imaging(2019).
[20] W. Xiong, C.-W. Hsu, H. Cao. Spatio-temporal correlations in multimode fibers for pulse delivery. IEEE Photonics Society Summer Topical Meeting Series (SUM), 1-2(2019).
[25] J. Goodman. Introduction to Fourier Optics(2005).
[26] J. Goodman. Speckle Phenomena in Optics: Theory and Applications(2007).
[27] E. Akkermans, G. Montambaux. Mesoscopic Physics of Electrons and Photons(2007).
[38] J. C. Dainty. Laser Speckle and Related Phenomena, 9(2013).
[41] J. R. Fienup. Phase retrieval algorithms: a personal tour. Appl. Opt., 52, 45-56(2013).
[44] J. Wright, A. Ganesh, S. Rao, Y. Peng, Y. Ma. Robust principal component analysis: exact recovery of corrupted low-rank matrices via convex optimization. Advances in Neural Information Processing Systems, 2080-2088(2009).
[45] H. Xu, C. Caramanis, S. Sanghavi. Robust PCA via outlier pursuit. IEEE Trans. Inf. Theory, 58, 3047-3064(2012).
[50] S. Boyd, N. Parikh, E. Chu, B. Peleato, J. Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3, 1-122(2011).
[52] R. J. Hanisch, R. L. White, R. L. Gilliland. Deconvolution of Hubble Space Telescope images and spectra. Deconvolution of Images and Spectra, 310-360(1996).
[60] A. J. F. Siegert. On the Fluctuations in Signals Returned by Many Independently Moving Scatterers(1943).
[62] Y. Wang, J. Li, P. Stoica. Spectral Analysis of Signals: The Missing Data Case(2006).