• Photonics Research
  • Vol. 10, Issue 11, 2471 (2022)
Wei Li1, Bingjian Wang2, Tengfei Wu3, Feihu Xu4, and Xiaopeng Shao1,*
Author Affiliations
  • 1School of Optoelectronic Engineering, Xidian University, Xi’an 710071, China
  • 2School of Physics, Xidian University, Xi’an 710071, China
  • 3Laboratoire Kastler Brossel, ENS–Université PSL, CNRS, Sorbonne Université, Collège de France, F-75005 Paris, France
  • 4Hefei National Laboratory for Physical Sciences at Microscale and School of Physical Science, University of Science and Technology of China, Hefei 230026, China
    DOI: 10.1364/PRJ.466065
    Wei Li, Bingjian Wang, Tengfei Wu, Feihu Xu, Xiaopeng Shao. Lensless imaging through thin scattering layers under broadband illumination[J]. Photonics Research, 2022, 10(11): 2471

    Abstract

    Lensless scattering imaging is a prospective approach to microscopy in which a high-resolution image of an object is reconstructed from one or more measured speckle patterns, thus providing a solution in situations where the use of imaging optics is not possible. However, current lensless scattering imaging methods are typically limited by the need for a light source with a narrowband spectrum. Here, we propose two general approaches that enable single-shot lensless scattering imaging under broadband illumination in both noninvasive [without point spread function (PSF) calibration] and invasive (with PSF calibration) modes. The first, noninvasive approach is based on a numerical refinement of the broadband pattern in the cepstrum, incorporated with a modified phase retrieval strategy. The second, invasive approach is correlation-inspired and generalized within a computational optimization framework. Both approaches are experimentally verified using visible radiation with a full-width-at-half-maximum bandwidth as wide as 280 nm (Δλ/λ=44.8%) and a speckle contrast ratio as low as 0.0823. Because of its generality and ease of implementation, we expect this method to find widespread applications in ultrafast science, passive sensing, and biomedicine.

    1. INTRODUCTION

    Imaging with scattered light is notoriously difficult, yet it has wide applications in many fields, such as biomedical, astronomical, and underwater imaging. The problem is challenging because the incident wave is randomly modulated by the scattering medium, resulting in a disordered pattern containing massive numbers of speckle grains. Fortunately, thanks to recent advances in wavefront shaping [1–5], this disordered information can be redistributed and decoded from the observed speckle pattern. For instance, measuring the transmission matrix (TM) allows one to perform focusing [6–9], imaging [10–12], or information transmission [13–15] through or inside a scattering medium. While acquiring the TM in principle requires coherent illumination, its extension to the broadband case would open fascinating applications in ultrafast science. For this purpose, TM-based methods have been further extended to the spectral domain by accessing the multispectral transmission matrix (MSTM) to cope with polychromatic light. Such techniques have shown success with ultrafast lasers [16–19] and in fiber optics [20]. However, characterizing the scattering medium requires thousands of images captured under vibration-isolated conditions, which precludes application in dynamic and harsh environments.

    A separate approach known as speckle correlation imaging (SCI) also shows promise for broadband focusing and imaging through scattering media [21,22]. This approach relies on persistent correlations, such as the optical memory effect (OME) [23,24], which enables single-shot, high-fidelity, diffraction-limited imaging through scattering media without any prior knowledge of, or access to, the medium. While promising, regular SCI, based on speckle generated under quasi-monochromatic illumination (typically <20 nm), is inherently incompatible with the extremely broad spectra of white light. This is because the coherence length Lc = λ²/Δλ of a light source is inversely proportional to its illumination bandwidth Δλ [25]. Speckle patterns produced by broadband radiation therefore display low-contrast structures, which obstructs optical manipulation and inverse reconstruction.
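    As a quick numerical check of this relation, the sketch below (plain Python, assuming the 625 nm spectral center and 280 nm FWHM of the LED used later in our experiments) evaluates Lc = λ²/Δλ:

```python
# Coherence length L_c = lambda^2 / (delta lambda), evaluated for the
# broadband LED used in the experiments (center ~625 nm, FWHM 280 nm).
lam = 625e-9                 # central wavelength [m]
dlam = 280e-9                # FWHM bandwidth [m]
L_c = lam ** 2 / dlam        # coherence length [m]
print(f"L_c = {L_c * 1e6:.2f} um")  # about 1.4 um: path-length spreads
                                    # beyond this wash out the speckle
```

    A coherence length of roughly 1.4 µm is far shorter than the path-length spread introduced by a typical diffuser, which is why broadband speckle contrast is so low.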

    Indeed, the broadband pattern can be thought of as the incoherent superposition of the monochromatic speckle patterns produced by the different frequency components of the light spectrum. Essentially, when observing a speckle pattern, an increase in wavelength (all other parameters remaining fixed) has two effects. First, the observed speckle pattern undergoes a spatial scaling: when the wavelength increases, the speckle pattern expands, and vice versa. The second effect concerns the random phase shifts imparted to the scattered wave by the surface-height fluctuations h. Since those phase shifts are proportional to h/λ, an increase in wavelength decreases the random phase shifts, and vice versa. When the surface is rough on the scale of a wavelength, phase wrapping into the interval (0, 2π) changes the detailed structure of the speckle pattern [26]. For these two reasons, extending the illumination bandwidth results in dephasing and speckle aliasing [27], reducing the contrast of the observed speckle pattern and further obstructing the reconstruction process.
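    The second effect can be illustrated numerically. The toy model below (an assumed refractive index n = 1.5 and 2 µm surface roughness, both hypothetical values chosen for illustration) draws random surface heights and shows how the field correlation between two wavelengths decays as their separation grows:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1.5                              # refractive index (assumed)
h = rng.normal(0.0, 2e-6, 100_000)   # surface heights, 2 um roughness (assumed)

def surface_field(lam):
    # random phase imparted by the height fluctuations h, wrapped mod 2*pi
    return np.exp(1j * 2 * np.pi * (n - 1) * h / lam)

ref = surface_field(600e-9)
corrs = []
for dlam in (1e-9, 10e-9, 100e-9):
    c = abs(np.mean(ref * np.conj(surface_field(600e-9 + dlam))))
    corrs.append(c)
    print(f"d_lambda = {dlam * 1e9:5.0f} nm -> field correlation {c:.3f}")
# the correlation falls monotonically: nearby wavelengths give nearly
# identical speckle, widely separated ones give nearly independent speckle
```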

    Because different spectral components generate different speckle patterns, current approaches are highly sensitive to the spectral bandwidth: the monochromatic treatment is valid only as long as the bandwidth of the source is contained within the spectral correlation bandwidth of the medium. Extending operation from narrowband to broadband light would effectively ease the trade-off between filter bandwidth and detection signal-to-noise ratio (SNR). Furthermore, it could help increase the penetration depth, which is critical for biomedical fields, including neuroscience.

    To mitigate the constraint of narrowband operation, several imaging techniques have been developed. A straightforward route relies on spectral components such as spectral filters [28], prisms [29], or even scattering-assisted spectrometers [30] to resolve a broadband spectrum into well-separated spectral channels, improving the speckle contrast and the performance of object retrieval. Although these methods have proven effective, the trade-off between spectral resolution and spectral range can limit their performance. Moreover, imposing spectral filters on broadband sources leads to a dramatic reduction in flux, which may be impractical in some applications. The ability to use the full flux of low-flux ultra-broadband sources, such as table-top high-harmonic generation (HHG) systems [31], could provide a breakthrough for practical imaging applications. Incoherent scattering imaging can also rely on deep learning [32]; the price to pay, however, is a fuzzy physical model and a large training dataset. To date, controlling scattered light for deep focusing under broadband illumination has attracted much attention [16–19,33], but scattering imaging methods combining broadband illumination with diffraction-limited spatial resolution are still lacking. Representative state-of-the-art scattering imaging methods, together with their effective illumination bandwidths, are summarized in Table 1.

    Table 1. Comparison among State-of-the-Art Scattering Imaging Methods^a

    Method                 | Bandwidth (FWHM) | Captured Frames | Precision Calibration | Object Sparsity Constraint | Image Fidelity
    Bertolotti et al. [21] | 40 nm            | Multi-shot      | W                     | W/O                        | High
    Katz et al. [22]       | 20 nm            | Single-shot     | W/O                   | W                          | High
    Edrei et al. [34]      | 1 nm             | Single-shot     | W                     | W/O                        | High
    Wu et al. [35]         | 1 nm             | Single-shot     | W/O                   | W                          | High
    Badon et al. [16]      | 13.8 nm          | Multi-shot      | W                     | W/O                        | Moderate
    Li et al. [29]         | 60 nm            | Single-shot     | W                     | W                          | High
    Divitt et al. [36]     | 200 nm           | Multi-shot      | W                     | W                          | Moderate
    Lu et al. [37]         | 200 nm           | Single-shot     | W/O                   | W                          | Moderate
    Ours (A)               | 280 nm           | Single-shot     | W/O                   | W                          | High
    Ours (B)               | 280 nm           | Single-shot     | W/O                   | W/O                        | High

    ^a W = with; W/O = without; A = method A; B = method B.

    In this paper, we propose two individual methods for realizing scattering imaging under broadband illumination using only a single speckle measurement, under noninvasive and invasive conditions, respectively. The major difference between the two is that invasive methods pre-calibrate the point spread function (PSF) of the scattering system, whereas noninvasive methods require no a priori information about the medium or its properties. In contrast to previous approaches, neither method requires spectral filters or additional assumptions on the spectral content of the signals. In the noninvasive approach (method A), we extract the object Fourier spectrum from the broadband pattern using a cepstrum shaping method combined with a matrix decomposition strategy. We show that, with suitable computational processing, the lost low-frequency information can be recovered in the Fourier domain. After applying a modified phase retrieval procedure, sharp reconstructions can be achieved even with a speckle contrast ratio (SCR) as low as 0.0823 [26]. The invasive approach (method B) is based on an optimization framework that solves a general de-correlagraphy problem. The proposed computational framework effectively reduces the background level of the recovered image while preserving sharp edges and high-frequency features. Experimental validation under a 280 nm illumination spectrum spanning the visible region shows the applicability of both methods. Our approach enables imaging through scattering media under broadband radiation, which we expect will allow much more efficient use of state-of-the-art light sources, such as third-generation synchrotrons and table-top HHG sources, in heavily scattering environments. Moreover, the capability to handle broadband light can also empower fluorescence imaging, night-vision detection, and passive sensing.

    2. CORRELATION ANALYSIS

    For simplicity, our discussion is restricted to the OME region. The recorded broadband speckle I_B can be described by the convolution of the broadband PSF, PSF_B, with the object function O:

        I_B = O * PSF_B + N,  (1)

    where the symbol "*" denotes a two-dimensional convolution operation, N represents the noise, and the subscript B denotes broadband illumination. By the convolution theorem [21,38], the autocorrelation of the broadband pattern I_B yields

        I_B ⋆ I_B ≈ (O ⋆ O) * (PSF_B ⋆ PSF_B),  (2)

    where the symbol "⋆" denotes a two-dimensional correlation operation. The (O * PSF_B) ⋆ N term is omitted because the noise N is statistically uncorrelated with PSF_B, and the autocorrelation of the noise can be approximated as a constant background [22]. In the narrowband illumination case, the autocorrelation of the PSF can be approximated by a Dirac delta function and omitted [26]. However, this assumption does not hold for broadband illumination: when the spectrum is discretized into M components, the autocorrelation of the broadband pattern gathers M monochromatic speckle autocorrelation terms and C_M^2 cross-correlation terms between speckles of different wavelengths.

    Let us assume that the broadband light source Q(λ, Δλ) can be regarded as a linear superposition of multiple narrowband sources Q_i(λ_i, Δλ_i) (i = 1, 2, …, M), each satisfying the narrowband illumination hypothesis. Here, λ and λ_i represent the central wavelengths of the broadband and narrowband light, and Δλ and Δλ_i denote their respective bandwidths. Under broadband illumination, the broadband PSF, PSF_B(λ, Δλ), can be expressed as

        PSF_B(λ, Δλ) = Σ_{i=1}^{M} ω_i PSF_N(λ_i, Δλ_i),  (3)

    where ω_i denotes the weight coefficient of each spectral component and the subscript N denotes narrowband illumination. Correspondingly, the autocorrelation of the broadband PSF can be written as

        PSF_B(λ, Δλ) ⋆ PSF_B(λ, Δλ) = Σ_{i=1}^{M} ω_i² [PSF_N(λ_i, Δλ_i) ⋆ PSF_N(λ_i, Δλ_i)] + Σ_{i=1}^{M} Σ_{j≠i}^{M} ω_i ω_j [PSF_N(λ_i, Δλ_i) ⋆ PSF_N(λ_j, Δλ_j)].  (4)

    From Eq. (4), the autocorrelation of the broadband PSF separates into two parts: the first term is a superposition of all autocorrelation contributions, whereas the second gathers all cross-correlation contributions. Following the discussion in Ref. [38] and supposing the imaging system has a circular pupil, the first part of the broadband PSF's autocorrelation can be further expressed as

        Σ_{i=1}^{M} ω_i² [PSF_N(λ_i, Δλ_i) ⋆ PSF_N(λ_i, Δλ_i)] ∝ Σ_{i=1}^{M} ω_i² [1 + |2J₁(πDr/λ_i z) / (πDr/λ_i z)|²],  (5)

    where J₁(·) denotes the first-order Bessel function of the first kind; D is the pupil size; r = √((Δx)² + (Δy)²), with r = (x, y) on the observation plane, is the radius in two-dimensional polar coordinates on the image plane; and z is the image distance. The contribution of the first part is to form a spike with a peak-to-background ratio (PBR) of 2:1 in the autocorrelation pattern, while the second part can be calculated as

        Σ_{i=1}^{M} Σ_{j≠i}^{M} ω_i ω_j [PSF_N(λ_i, Δλ_i) ⋆ PSF_N(λ_j, Δλ_j)] ∝ Σ_{i=1}^{M} Σ_{j≠i}^{M} ω_i ω_j {1 + exp[−σ_h²(n−1)²|k_i − k_j|²] |[2J₁(πDr/λ̄_ij z) / (πDr/λ̄_ij z)] * F{exp[j(k_i − k_j) r′²/2z]}|²},  (6)

    where σ_h and n denote the height standard deviation and the refractive index of the scattering medium, respectively. F{·} indicates the Fourier transform, and r′ = (ξ, η) are the coordinates of the plane just to the right of the scattering surface. k denotes the wavenumber, k = 2π/λ, and λ̄_ij represents the average of λ_i and λ_j. The residual background term accumulated by the second part of Eq. (4) grows gradually as the wavelength difference increases (see Appendix D for details). Substituting Eqs. (5) and (6) into Eq. (4), we finally obtain the autocorrelation of the broadband PSF (refer to Appendix A for the detailed derivation).
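    The contrast penalty predicted by Eqs. (4)–(6) can be reproduced with a minimal simulation. The sketch below (random-phasor speckle through an assumed circular pupil; all sizes are arbitrary toy values, not our experimental geometry) checks that a single monochromatic speckle has the 2:1 zero-lag autocorrelation peak-to-background ratio, while an incoherent sum of M independent patterns falls to roughly 1 + 1/M:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 256
y, x = np.mgrid[0:N, 0:N] - N // 2
pupil = np.hypot(x, y) < 32          # circular pupil (assumed toy geometry)

def speckle():
    # fully developed monochromatic speckle: far field of a random phasor sum
    field = np.fft.fft2(pupil * np.exp(2j * np.pi * rng.random((N, N))))
    return np.abs(field) ** 2

def pbr(I):
    # zero-lag autocorrelation peak over background: E[I^2] / E[I]^2
    return np.mean(I ** 2) / np.mean(I) ** 2

M = 25
mono = speckle()
broad = sum(speckle() for _ in range(M)) / M   # incoherent spectral sum
print(pbr(mono), pbr(broad))   # ~2.0 versus ~1 + 1/M
```

    The cross-correlation terms of Eq. (4) contribute only to the background, so the sharp 2:1 spike that SCI relies on is progressively buried as M grows.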

    3. METHODS

    A. Imaging without PSF Calibration

    A radiation beam consisting of multiple wavelengths illuminates the sample, and the exiting field is scattered by an off-the-shelf diffuser to form a broadband pseudorandom pattern, which is the incoherent superposition of the speckle patterns of each wavelength [see Figs. 1(a) and 1(b)]. To reconstruct the object O from the single-shot measured image IB, we need to solve an ill-conditioned inverse scattering problem by phase retrieval. However, as discussed in Section 2, the autocorrelation of the broadband PSFB no longer satisfies the delta-function approximation, which causes the SCI method to fail; new methods are therefore needed. Here, based on experimentally observed similarities between the Fourier spectra of objects and speckles, and inspired by homomorphic filtering, we propose a spectrum refinement principle that enables extraction of the pure object Fourier spectrum from the corrupted broadband pattern Fourier spectrum. The Fourier spectrum extraction procedure is depicted in Fig. 2. Phase retrieval is then performed using the refined Fourier spectrum [see Fig. 1(d)] as input to a hybrid relaxed averaged alternating reflections (RAAR) [39] and hybrid input-output (HIO) [40,41] algorithm, with a shrink-wrap-like support update [42] (see Algorithm 1 for details). The object, support region, and Fourier phase can be retrieved simultaneously [Fig. 1(e)].


    Figure 1.Principle of single-shot broadband scattering imaging. In the case of a broadband source (a), the broadband pattern (b) is the incoherent spectrally weighted sum of the monochromatic speckle patterns corresponding to all wavelengths present in the source. The size of these monochromatic speckles is geometrically scaled with increasing wavelengths, but the micro-structures are ever-changing. In the presented method, the broadband speckle is first transformed into the Fourier domain (c) and then refined in the complex cepstrum to extract an estimated object Fourier spectrum (d). (e) A modified iterative phase retrieval algorithm is then used to reconstruct the sample, support, and Fourier phase simultaneously.


    Figure 2.Noninvasive broadband scattering imaging reconstruction pipeline. The broadband pattern IB (a) is first filtered by a two-dimensional Hanning window to avoid spectrum leakage (b) W·IB and then transformed into the Fourier domain. The Fourier amplitude (c) is reserved and converted into the logarithmic domain (d). The spectrum is filtered by NLM (e) and transmitted back into the Fourier domain (f). A crucial step to remove the peak noise is handed over to the LRSD. After eliminating the sparse part (g), the reserved low rank part (h) is certified as the estimated object spectrum. A modified phase retrieval is then applied on (h) to retrieve the object information. The ground truth object Fourier spectrum is presented in (i). The line profiles (j) along the opposite yellow arrows in (i) show a comparison among the original (c), the ground truth (i), and the recovered spectrum (h). Notably, the recovered spectrum contains richer low-frequency information than the original spectrum. The windowed logarithmic transformed PSF spectrum is shown in (k) with the peak highlighted by a red arrow. NLM, nonlocal means; LRSD, low rank and sparse decomposition.

    After measuring the broadband pattern IB [Fig. 2(a)], we first filter it with a two-dimensional Hanning window in real space, W·IB [Fig. 2(b)], to avoid spectrum leakage, and then transform it into the Fourier domain [see Fig. 2(c) for the Fourier amplitude]. The Fourier phase, corrupted by scattering, is discarded and not shown here. We note, first, that the Fourier spectrum is sparse, as the energy of the power spectrum is concentrated around zero frequency [see the blue line profile in Fig. 2(j)], which must be accounted for in the retrieval. To avoid oscillations caused by near-zero values in the spectrum, we add an all-one matrix to the Fourier amplitude and then convert it into the logarithmic domain [Fig. 2(d)]. Notably, the logarithmic transform highlights both details and noise in the spectrum; therefore, a smoothing filter via nonlocal means (NLM) [43] is applied to exclude the noisy profile introduced by the broadband PSFB [see the burrs attached to the hemispherical surface in Fig. 2(k)]. The patch size and search window of the NLM are set to 5 and 11, respectively, and the smoothing parameter γ is set to 3. As γ controls the decay of the Gaussian function in the NLM, a higher value of γ suits a relatively simple target, yielding a smoother spectrum, and vice versa. Detailed parameter selection rules are provided in Appendix C.
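    The front end of this pipeline can be sketched in a few lines. The code below (numpy only; the NLM and LRSD refinement stages are deliberately omitted, and the input is a random stand-in for a measured pattern) reproduces steps (a)–(d) of Fig. 2: Hanning windowing, Fourier amplitude, the all-ones offset, and the logarithmic transform:

```python
import numpy as np

def windowed_log_spectrum(I_B):
    """Steps (a)-(d) of Fig. 2: 2-D Hanning window, Fourier amplitude,
    +1 offset, log transform (NLM smoothing and LRSD would follow)."""
    rows, cols = I_B.shape
    W = np.outer(np.hanning(rows), np.hanning(cols))   # 2-D Hanning window
    A = np.abs(np.fft.fftshift(np.fft.fft2(W * I_B)))  # Fourier amplitude
    return np.log(A + 1.0)  # all-ones offset avoids log of near-zero values

rng = np.random.default_rng(2)
L = windowed_log_spectrum(rng.random((128, 128)))  # stand-in speckle frame
```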

    The filtered spectrum is then transformed from the logarithmic domain back into the Fourier domain [Fig. 2(f)]. The next step, removing the residual peak noise caused by the broadband PSFB [highlighted with a red arrow in Fig. 2(k)], is handled by a low-rank and sparse decomposition (LRSD) procedure [44] based on robust principal component analysis (RPCA) [45]. The peak noise is classified into the sparse part according to its sparsity [Fig. 2(g)], and the low-rank part represents the final separated Fourier spectrum [Fig. 2(h)]. The ground-truth object Fourier spectrum is presented in Fig. 2(i). Note that the low-frequency information reappears in the Fourier domain after processing [compare Figs. 2(c), 2(h), and 2(i)]. The central line profiles along the two opposite arrows of Figs. 2(c), 2(h), and 2(i), depicted by gray, red, and blue lines, also verify the effectiveness of the proposed method, showing richer low-frequency information and a higher cut-off frequency.
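    The LRSD step can be prototyped with a compact robust-PCA routine. The sketch below (an ADMM-style principal component pursuit with a fixed penalty, illustrative rather than the exact solvers of Refs. [44,45]) splits a toy matrix into a low-rank background and a sparse spike component, mimicking the removal of the PSF-induced peak:

```python
import numpy as np

def rpca(D, lam=None, mu=None, iters=100):
    """Minimal robust-PCA sketch: D ~ L + S with L low rank, S sparse."""
    m, n = D.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or m * n / (4.0 * np.abs(D).sum())
    S = np.zeros_like(D)
    Y = np.zeros_like(D)                 # scaled dual variable
    for _ in range(iters):
        # low-rank update by singular value thresholding
        U, s, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = U @ (np.maximum(s - 1.0 / mu, 0.0)[:, None] * Vt)
        # sparse update by entrywise soft thresholding
        R = D - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Y += mu * (D - L - S)
    return L, S

# toy check: rank-1 background plus two isolated spikes separates cleanly
rng = np.random.default_rng(3)
background = np.outer(rng.random(50), rng.random(50))
spikes = np.zeros((50, 50))
spikes[5, 7], spikes[20, 33] = 5.0, -4.0
L, S = rpca(background + spikes)
```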

    Finally, we obtain the refined Fourier spectrum |F̂{O}| [Fig. 2(h)], and in the next step we perform phase retrieval to reconstruct the target information. Because the broadband PSFB is unknown, object spectrum recovery is an ill-conditioned problem, and a robust phase retrieval scheme is needed. Here, we modify the iterative alternating projection method by introducing an adaptive support region and a hybrid phase retrieval strategy. Specifically, the iteration combines the RAAR and HIO algorithms with a shrink-wrap-like support update. In each iteration, we calculate the root mean square error (RMSE) between the refined Fourier amplitude and the Fourier amplitude generated by the current object estimate. We typically used 80 iterations to ensure that the RMSE of the RAAR stage converged to <0.05. The subsequent HIO stage smears out the background noise, and 10 iterations are typically adequate to obtain the final result. The feedback parameters are set to β1 = β2 = 0.8 for the RAAR and HIO algorithms, respectively. During the iterations, the adaptive support region is controlled by alternating a convolution with a Gaussian kernel (Algorithm 1, step 6) and a threshold operation (Algorithm 1, step 7), which act as dilation and erosion. As the iteration number increases, the exported support region shrinks from a loose size to a tight one (Algorithm 1, steps 2–7). The initial σ(0) = 3ϵ, where ϵ is the pixel size of the imaging sensor. It is worth emphasizing that we selected a non-centrosymmetric initial support S0, for instance an equilateral triangle. Using a non-centrosymmetric support region considerably speeds up convergence and effectively avoids the twin-image problem.

    To explain how the process works in greater detail, four projection operations are defined. The operators Ps, Pm, Rs, and Rm project points onto two sets, S (support) and M (modulus) [46]. When the image belongs to both sets simultaneously, a solution has been reached. The support projector Ps sets to 0 the components outside the support, leaving the rest of the values unchanged:

        Ps ρ(r) = { ρ(r) if r ∈ S; 0 otherwise },  (7)

    where ρ(r) denotes the object intensity distribution, with r the coordinates in the object (real) space. The role of the modulus projector Pm is to set the modulus to the measured one, m(k) = √I(k), in reciprocal space, leaving the phase unchanged:

        Pm ρ̃(k) = Pm |ρ̃(k)| e^{iφ(k)} = m(k) e^{iφ(k)}.  (8)

    The reflectors of the support projector Ps and the modulus projector Pm are defined as Rs = 2Ps − I and Rm = 2Pm − I, respectively.
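    In code, the projectors and their reflectors are one-liners. The sketch below (numpy, with the support given as a boolean mask and the measured modulus as an array; illustrative, not the authors' implementation) defines Ps, Pm, Rs, and Rm:

```python
import numpy as np

def P_s(rho, support):
    # support projector: zero outside S, unchanged inside
    return np.where(support, rho, 0.0)

def P_m(rho, modulus):
    # modulus projector: impose the measured Fourier modulus m(k),
    # keep the current Fourier phase
    F = np.fft.fft2(rho)
    return np.fft.ifft2(modulus * np.exp(1j * np.angle(F)))

def R_s(rho, support):
    return 2.0 * P_s(rho, support) - rho      # R_s = 2 P_s - I

def R_m(rho, modulus):
    return 2.0 * P_m(rho, modulus) - rho      # R_m = 2 P_m - I
```

    Both projectors are idempotent (applying either twice changes nothing), which is a convenient sanity check when implementing them.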

    According to the notation above, the recursive form of the HIO algorithm is

        ρ^(n+1)(r) = [Ps Pm + P̄s (I − β Pm)] ρ^(n)(r),  (9)

    with P̄s = I − Ps the complement of the projector Ps. Similarly, the RAAR algorithm can be written as

        ρ^(n+1)(r) = [(β/2)(Rs Rm + I) + (1 − β) Pm] ρ^(n)(r).  (10)

    For clarity, we describe our modified phase retrieval procedure in Algorithm 1.

    Algorithm 1. Modified phase retrieval algorithm

    Input: initial solution ρ(0), initial support S0, initial standard deviation of the Gaussian kernel σ(0), RAAR iteration count N1 and feedback constant β1, HIO iteration count N2 and feedback constant β2, RAAR iteration counter i ← 1, HIO iteration counter j ← 1, support-update interval m
    Output: recovered object ρ̂, estimated support Ŝ, recovered Fourier phase angle{F̂(ρ)}
    1: while i ≤ N1 do
    2:   if mod(i, m) = 0 then
    3:     σ(i) = max(0.99 σ(i−1), 0.5 σ(0))
    4:   else
    5:     σ(i) = σ(i−1)
    6:   S_i = S_{i−1} * exp(−r²/2(σ(i))²)
    7:   S_i ← S_i × (S_i ≥ 0.15 max(S_i))
    8:   ρ(i+1) ← solution of Eq. (10)
    9:   i ← i + 1
    10: while j ≤ N2 do
    11:   repeat steps 2–7 for each j
    12:   ρ(j+1) ← solution of Eq. (9)
    13:   j ← j + 1
    14: return ρ̂ = ρ(N2), Ŝ = S_{N2}, angle{F̂(ρ)} = angle{F(ρ̂)}
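    A compact Python rendering of Algorithm 1 (numpy + scipy; a sketch under simplifying assumptions — a rectangular demo support rather than the recommended triangle, a real-valued object, and toy iteration counts, standing in for the authors' implementation) looks as follows:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def modified_phase_retrieval(modulus, S0, sigma0=3.0, N1=80, N2=10,
                             beta1=0.8, beta2=0.8, m=5, seed=0):
    """Algorithm 1 sketch: N1 RAAR iterations (Eq. (10)) followed by N2
    HIO iterations (Eq. (9)), with a shrink-wrap-style support update."""
    rng = np.random.default_rng(seed)
    rho = rng.random(modulus.shape)          # random initial guess
    S = S0.astype(float)
    sigma = sigma0

    def Pm(x):                               # modulus projector
        F = np.fft.fft2(x)
        return np.real(np.fft.ifft2(modulus * np.exp(1j * np.angle(F))))

    def update_support(i):
        nonlocal S, sigma
        if i % m == 0:                       # steps 2-5: shrink sigma, floored
            sigma = max(0.99 * sigma, 0.5 * sigma0)
        S = gaussian_filter(S, sigma)        # step 6: dilation by Gaussian
        S = (S >= 0.15 * S.max()).astype(float)  # step 7: threshold / erosion

    for i in range(1, N1 + 1):               # RAAR loop, Eq. (10)
        update_support(i)
        pm = Pm(rho)
        rm = 2.0 * pm - rho                              # R_m rho
        rs_rm = 2.0 * np.where(S > 0, rm, 0.0) - rm      # R_s R_m rho
        rho = 0.5 * beta1 * (rs_rm + rho) + (1.0 - beta1) * pm
    for j in range(1, N2 + 1):               # HIO loop, Eq. (9)
        update_support(j)
        pm = Pm(rho)
        rho = np.where(S > 0, pm, rho - beta2 * pm)
    return np.where(S > 0, rho, 0.0), S

# toy demo: Fourier modulus of a small off-center block, loose box support
obj = np.zeros((64, 64))
obj[20:28, 24:34] = 1.0
modulus = np.abs(np.fft.fft2(obj))
S0 = np.zeros((64, 64))
S0[10:40, 10:45] = 1.0                       # loose initial support
rho_hat, S_hat = modified_phase_retrieval(modulus, S0, N1=40, N2=5)
```

    Whether the iterate converges to the object or its twin depends on the initial support; the non-centrosymmetric support recommended above mitigates this ambiguity.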

    In the present method, phase retrieval is a standalone step that depends solely on the refined spectrum. It can also be integrated as a module in algorithms for other types of lensless imaging (coherent diffraction imaging, holography, ptychography) to allow them to cope with broadband sources.

    B. Imaging with PSF Calibration

    Figure 3 presents the data acquisition and reconstruction process with PSF calibration. An object and a pinhole are alternately placed at the same spatial position in the optical configuration (see Appendix C for details), capturing the broadband speckle IB and the corresponding PSF, PSFB. The raw captured speckle and PSF are then lowpass filtered in the frequency domain (a typical filter length is 15) to eliminate the non-uniform illumination background. A two-dimensional Gaussian lowpass filter of size 2 × 2 with standard deviation σ = 0.5 is then applied in real space to further smooth the patterns.
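    A sketch of this preprocessing (numpy + scipy; interpreting "filter length 15" as keeping the central 15 × 15 Fourier coefficients as the illumination background, which is our assumption, and feeding it a synthetic toy frame):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lowpass_background(I, L=15):
    # keep only the central L x L Fourier coefficients: a smooth estimate
    # of the non-uniform illumination background
    F = np.fft.fftshift(np.fft.fft2(I))
    mask = np.zeros(F.shape)
    c0, c1 = F.shape[0] // 2, F.shape[1] // 2
    mask[c0 - L // 2:c0 + L // 2 + 1, c1 - L // 2:c1 + L // 2 + 1] = 1.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

def preprocess(I, L=15, sigma=0.5):
    I = I.astype(float)
    flat = I - lowpass_background(I, L)   # remove slowly varying background
    return gaussian_filter(flat, sigma)   # small Gaussian smoothing

rng = np.random.default_rng(5)
ramp = np.linspace(0.0, 1.0, 128)[None, :] * np.ones((128, 128))
speckle_like = rng.random((128, 128))
out = preprocess(speckle_like + ramp)     # toy frame: speckle + illumination
```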


    Figure 3.Broadband scattering imaging reconstruction pipeline with calibrated PSF. In the OME range, the broadband speckle IB can be treated as the convolution of the object O and the corresponding broadband PSFB. Data acquisition (a) requires a one-time calibration procedure to measure PSFB. Raw captured speckle and the corresponding PSF are background homogenized and filtered by a Gaussian low-pass filter (b) in the frequency domain [(c) and (d)]. The image reconstruction algorithms (e) include cross correlation, nonlinear reconstruction, and our computational reconstruction method. The retrieved four-leaf clover patterns are compared in (f). Artificial cross-section lines are highlighted by gray dashed lines in the nonlinear reconstruction. The normalized intensity of the 76th column along the yellow dotted line in each result (f) is presented in (g). Notably, our method outperforms the other methods, achieving lower background and higher fidelity. GT, ground truth; CC, cross-correlation; NL, nonlinear reconstruction.

    After preprocessing, we introduce two types of reconstruction algorithm. The first is based on the cross-correlation strategy:

        Ô = I_B ⋆ PSF_B = F⁻¹[F(I_B) · F*(PSF_B)],  (11)

    where F⁻¹{·} represents the inverse Fourier transform and the superscript * denotes the complex conjugate. Inspired by interferenceless coded aperture correlation holography [47], a modified version of the cross-correlation method is referred to as the nonlinear reconstruction process:

        Ô = F⁻¹[|F(I_B)|^α · |F(PSF_B)|^β · e^{i(φ_IB − φ_PSFB)}],  (12)

    where φ_IB and φ_PSFB denote the Fourier phases of I_B and PSF_B, respectively. The adjustable parameters α and β tune the power spectra of the object and reconstructing functions, respectively. The advantageous effect of nonlinear processing has already been observed in Refs. [47,48]; a detailed investigation of this point can be found in Appendix C.
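    Both reconstructions amount to a few lines of FFT arithmetic. The sketch below (numpy; the toy object is a single point so that the expected peak location is known in advance) implements Eqs. (11) and (12):

```python
import numpy as np

def cross_correlation(I_B, psf_B):
    # Eq. (11): O = F^{-1}[ F(I_B) . conj(F(PSF_B)) ]
    return np.real(np.fft.ifft2(np.fft.fft2(I_B) * np.conj(np.fft.fft2(psf_B))))

def nonlinear_reconstruction(I_B, psf_B, alpha=0.4, beta=0.3):
    # Eq. (12): tunable power spectra, phase difference kept intact
    FI, FP = np.fft.fft2(I_B), np.fft.fft2(psf_B)
    G = np.abs(FI) ** alpha * np.abs(FP) ** beta \
        * np.exp(1j * (np.angle(FI) - np.angle(FP)))
    return np.real(np.fft.ifft2(G))

# toy check: a point object convolved with a random PSF should be
# recovered at its true position by both estimators
rng = np.random.default_rng(6)
psf = rng.random((64, 64))
obj = np.zeros((64, 64))
obj[10, 12] = 1.0
I_B = np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(psf)))  # speckle
O_cc = cross_correlation(I_B, psf)
O_nl = nonlinear_reconstruction(I_B, psf)
```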

    Mathematically, broadband scattering imaging can also be formulated with the forward model

        g = H(f) + e,  (13)

    where f ∈ R^{N×N} represents the hidden object, g ∈ R^{N×N} is the measurement, H(·) is the mapping from the hidden object to the measurement, and e ∈ R^{N×N} denotes the noise. In our framework, the reconstruction task is accomplished by solving an optimization problem with joint ℓ1 and total variation (TV) regularization under a nonnegativity constraint:

        f̂ = argmin_{f≥0} (1/2)‖g − H(f)‖₂² + τ‖f‖_TV + λ‖f‖₁,  (14)

    where ‖·‖_p for p = 1, 2, or TV is the ℓ1-norm, ℓ2-norm, or TV-norm, respectively. The parameters τ and λ weight the TV and ℓ1 terms to balance noise removal and object sparsity; detailed parameter selection rules are provided in Appendix C. Here, we use the anisotropic TV norm

        ‖f‖_TV = Σ_i (|[Dx(f)]_i| + |[Dy(f)]_i|),  (15)

    where Dx(·) and Dy(·) are the forward finite-difference operators along the horizontal and vertical directions, respectively. The main advantage of this model is that it recovers edges well and removes noise while avoiding ringing artifacts.

    We develop an iterative inversion algorithm, modified from the sparse Poisson intensity reconstruction algorithms theory and practice (SPIRAL-TAP) [49], to obtain an approximate solution of the inverse problem in Eq. (14). In our case, the forward model H is implemented as a function H(·), and its adjoint HT(·) denotes the back-projection process. In contrast to previous deconvolution approaches, the forward and backward models are replaced by the cross-correlation [Eq. (11)] and nonlinear reconstruction [Eq. (12)] operations, respectively. The adjustable parameters are set the same as in the nonlinear reconstruction (α = 0.4 and β = 0.3). The SPIRAL-TAP approach for the de-correlagraphy problem considers the iterate sufficiently close to the optimal solution if

        ‖f̂^(k+1) − f̂^(k)‖₂ / ‖f̂^(k)‖₂ ≤ ε,  (16)

    where ε is a small constant and the superscript k denotes the k-th iteration. In our algorithm, the stopping criterion uses ε = 10⁻⁶.
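    The optimization loop can be sketched with a plain proximal-gradient (ISTA-style) iteration. The sketch below keeps only the ℓ1 term and the nonnegativity constraint (the TV term and the nonlinear backward model of our full method are omitted for brevity) and uses the relative-change stopping rule of Eq. (16):

```python
import numpy as np

def ista_reconstruct(g, psf, lam=1e-3, max_iter=200, eps=1e-6):
    """Proximal gradient on 0.5*||g - H(f)||_2^2 + lam*||f||_1, f >= 0,
    with H = circular convolution with the PSF and H^T its adjoint."""
    P = np.fft.fft2(psf)
    H = lambda f: np.real(np.fft.ifft2(np.fft.fft2(f) * P))            # forward
    HT = lambda r: np.real(np.fft.ifft2(np.fft.fft2(r) * np.conj(P)))  # adjoint
    step = 1.0 / np.max(np.abs(P)) ** 2   # 1 / Lipschitz constant of the grad
    f = np.zeros_like(g)
    for k in range(max_iter):
        grad = HT(H(f) - g)
        # prox step: soft threshold folded into the nonnegativity clip
        f_new = np.maximum(f - step * grad - step * lam, 0.0)
        # relative-change stopping rule, cf. Eq. (16)
        if k > 0 and np.linalg.norm(f_new - f) <= eps * np.linalg.norm(f):
            f = f_new
            break
        f = f_new
    return f

# toy check: a sparse object blurred by a short, well-conditioned kernel
psf = np.zeros((32, 32))
psf[0, 0], psf[0, 1], psf[1, 0] = 0.6, 0.2, 0.2
f_true = np.zeros((32, 32))
f_true[5, 5] = 1.0
g = np.real(np.fft.ifft2(np.fft.fft2(f_true) * np.fft.fft2(psf)))
f_hat = ista_reconstruct(g, psf)
```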

    Some artifacts from the recovery process nevertheless deserve attention; for instance, the abnormal cross-section lines indicated by gray dashed lines in the nonlinear reconstruction [Fig. 3(f)]. These cross-section lines come from the noninteger power transformation of the spectrum in the nonlinear reconstruction, resulting in a simultaneous increase in detail and high-frequency noise, unavoidably producing these artifacts. As these adjustable parameters α and β decrease, the resolution of the reconstruction will improve moderately, while the noise will increase dramatically. (Details about this point can be found in Appendix C.) However, thanks to the regularized inversion method, these artifacts are not dominant in the recovered pattern [see the comparison in Fig. 3(f)]. In addition, the differences between the values of points a and b are 1, 0.6, 0.68, and 0.75, respectively [Fig. 3(f)]. Reconstructions of the proposed method have sharp boundaries and contain very little noise. A typical reconstruction requires at least four iterations. Solving for 1800×1800=3.24 million pixels takes 10.35 s (2.59 s per iteration) on a dual-core computer and with a GeForce RTX 2080 Ti GPU. Moreover, for further algorithm acceleration, this fused lasso problem can also be solved by the alternating direction method of multipliers (ADMM) [50] or fast iterative shrinkage-thresholding algorithm (FISTA) [51], which will be exploited in the future.

    4. RESULTS AND DISCUSSION

    Experimental validation. The single-shot method A was validated in the visible domain using a broadband, spatially incoherent source with a spectral full width at half-maximum of 280 nm (FWHM=280  nm). Method B was then tested under the same experimental conditions with additional PSF information. The objects employed in these experiments are ubiquitous in the field of scattering imaging and allow for algorithm validation and resolution estimation.

    A. Experimental Validation without PSF Calibration

    A lens-imaged lithographic letter “B” is shown in Fig. 4(a). The experiment was performed using a commercial broadband light-emitting diode (LED) (Thorlabs, MBB1L3, wavelength range of 400–1100 nm, FWHM=280  nm). The collimated light was incident on the object and then propagated 24 cm to the diffuser (Edmund, 45-656). The light scattered by the diffuser was recorded by a scientific complementary metal-oxide-semiconductor (sCMOS) camera (PCO.edge 4.2, pixel size 6.5 μm) placed 12 cm from the diffuser. Further details on the experimental setup can be found in Appendix C.


    Figure 4. Experimental validation without calibrating the PSF, with an illumination bandwidth of Δλ/λ=44.8%. (a) Photograph of a lithographic letter “B” via lens imaging. Reconstructed results by our method (b), Katz et al. (c), and Wu et al. (d). (e)–(h) Ground truth and the retrieved Fourier phase via the three methods. GT, ground truth. Scale bars are 40 camera pixels in all subgraphs.

    Our single-shot method was then applied to the broadband pattern at the center of mass of the broadband spectrum, λc=625  nm (Δλ/λ=44.8%). Two other state-of-the-art single-shot methods were introduced for comparison. The SCI method pioneered by Katz et al. performs phase retrieval on the speckle autocorrelation. For each loop, the HIO algorithm [41] was run with a β factor decreasing from β=2 to β=0 in steps of 0.04; for each β value, 30 HIO iterations were performed. The obtained result was fed as the input to an additional 30 iterations of the error-reduction (ER) algorithm to obtain the final result. The second comparison method, based on bispectrum analysis [35], utilizes phase-closure theory to extract the phase information from sub-block speckles, in which the phase is projected onto 18 uniformly spaced angles within (0, 2π). The time costs of reconstructing speckle patterns of 1800×1800=3.24 million pixels are listed at the bottom of Fig. 4. All experiments were performed with MATLAB 2018b on Windows 10, running on an Intel Core i9-10900K CPU with 128 GB of memory.
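    The decreasing-β HIO/ER schedule described above can be sketched as follows; `hio_step` is a generic textbook HIO update (realness and non-negativity constraints assumed for illustration), not the exact SCI implementation:

```python
import numpy as np

def hio_step(g, support, measured_mag, beta):
    """One hybrid input-output (HIO) update for Fourier phase retrieval.
    measured_mag is the known Fourier magnitude (e.g., estimated from the
    speckle autocorrelation); support is a boolean object-domain mask."""
    G = np.fft.fft2(g)
    # Enforce the measured Fourier magnitude, keep the current phase.
    G_prime = measured_mag * np.exp(1j * np.angle(G))
    g_prime = np.real(np.fft.ifft2(G_prime))
    # HIO: accept the update where constraints hold, damp it elsewhere.
    out = g.copy()
    inside = support & (g_prime >= 0)
    out[inside] = g_prime[inside]
    out[~inside] = g[~inside] - beta * g_prime[~inside]
    return out

# Schedule from the comparison above: beta decreasing from 2 to 0 in steps
# of 0.04, with 30 HIO iterations per beta value, followed by 30 ER steps.
betas = np.arange(2.0, -0.001, -0.04)
```

    ER corresponds to the same update with the out-of-constraint pixels simply set to zero (β effectively replaced by a hard projection).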

    Both comparison methods [Figs. 4(c) and 4(d)] fail to reconstruct the object and the corresponding Fourier phase under broadband illumination. From the correlation perspective, the failure of speckle correlation is attributable to the fact that the autocorrelation of the broadband PSF cannot be approximated as a delta function, as described in Eq. (6) and illustrated in Fig. 1. The estimate of the object autocorrelation is therefore contaminated, which further obstructs phase retrieval. The statistical noise of the bispectrum is higher than that of the autocorrelation, so more speckle grains are required for reconstruction, making it difficult to cope with low-contrast broadband patterns. The proposed method successfully reconstructs the target, with a PSNR as high as 22.63 dB. The similarity between the object Fourier phase and the reconstructed result [see Figs. 4(e) and 4(f)] is evident.

    It is worth emphasizing that iterative phase retrieval algorithms are highly sensitive to the initial guess, the support constraint, and the convergence criterion. The twin-image problem interferes with direction selection during the iterations, causing the algorithm to oscillate and preventing it from quickly escaping local minima. The tuning rules are therefore quite general and must be established experimentally through trial and error.

    Here, thanks to the initial non-centrosymmetric support region estimation, the adaptive shrink-wrap strategy, and the hybrid retrieval method, we achieve a reconstruction success rate as high as 85%. The empirical probability of success is averaged over 1000 trials, in each of which we generate new random matrices as the algorithm input. We declare a trial successful if the SSIM between the ground truth and the reconstruction is >0.8 (see Visualization 1 for a dynamic reconstruction process). This verifies that our approach is highly repeatable and stable. Our method effectively avoids brute-force manual post-selection and considerably increases the reconstruction fidelity. In terms of time consumption, although our method has the longest reconstruction time, its results perform best. Within the algorithm, 90% of the time is spent on NLM filtering, >9% on LRSD, and <1% on phase retrieval. Subsequent work can port the algorithm to a graphics processing unit (GPU) or a field-programmable gate array (FPGA) to increase the processing speed.

    B. Experimental Validation with PSF Calibration

    The main reconstruction comparison between different state-of-the-art approaches is presented in Fig. 5. A tailored USAF 1951 target (Edmund, 55-622) with group 0, elements 3–5, is selected as the ground-truth object. The broadband PSF is measured with a 100 μm pinhole at the same spatial position as the object. The reconstructed results from different algorithms, including cross-correlation, nonlinear reconstruction, Edrei et al. [34], and our computational reconstruction method, are shown in Figs. 5(b)–5(e), respectively.


    Figure 5. Experimental validation with PSF calibration under broadband illumination (Δλ/λ=44.8%). (a) Photograph of a tailored USAF target via lens imaging. Reconstructed results by cross-correlation (b), nonlinear reconstruction with α=0.4, β=0.3 (c), Edrei et al. [34] (d), and our method (e). (f) Line profiles along the yellow arrow in (a) and the same spatial position in each subgraph (b)–(e). GT, ground truth; CC, cross-correlation; NL, nonlinear reconstruction. Scale bars are 40 camera pixels in all subgraphs.

    As indicated, all methods locate the object pattern correctly. Our reconstruction matches the ground truth well, with the PSNR further increasing to 18.57 dB, higher than that of the nonlinear algorithm (15.36 dB). We compare in Fig. 5(f) the intensity profiles along the line indicated by the yellow arrow [see Fig. 5(a)] at the same spatial position in each subgraph. Benefiting from the constraint of the $\ell_1$ regularization term, our results show a significant difference between target and background. The TV regularization effectively reduces the background level and compensates for the artificial traces caused by power-spectrum modulation [see Figs. 3(f) and 5(c)]. Our reconstruction has the smallest background while keeping the highest image contrast. The hollow numbers “4” and “5” close to the edge of the resolution target are also preserved.

    An alternative reconstruction uses the Lucy–Richardson deconvolution algorithm with 400 iterations [52] [Fig. 5(d)]. Owing to its inherent linear convolution model, the deconvolution algorithm produces an acceptable result [see Fig. 5(d)]. Correlagraphy-based reconstructions [Figs. 5(b) and 5(c)] also show clear, sharp-edged results at high reconstruction speed. The background noise and artifacts are well compensated by our improved method [Figs. 5(e) and 5(f)]. This comparison also reveals that the correlation-based methods [with regularization, Fig. 5(e), or without, Figs. 5(b) and 5(c)] do not require precise optical calibration and still perform well in broadband scattering imaging [53].

    In addition, the memory complexity of our method is O(N2), and the time complexity of each iteration is O(N2logN). Although the runtime is slightly longer when processing on the CPU (see Fig. 5), it can be dramatically shortened by parallel computation on a GPU (approximately 10 s for four typical iterations on a GeForce RTX 2080 Ti). Further comparisons of different approaches under narrowband and broadband illumination are provided in Appendices B to E.
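    The O(N²logN) per-iteration cost reflects FFT-based correlation, which can be sketched as:

```python
import numpy as np

def xcorr_fft(a, b):
    """Circular cross-correlation of two equal-size 2-D arrays via FFT.
    Each call costs O(N^2 log N) for N x N images, matching the
    per-iteration complexity quoted above."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    return np.real(np.fft.ifft2(A * np.conj(B)))

# The correlation peak location reveals the relative shift between inputs.
rng = np.random.default_rng(0)
a = rng.standard_normal((32, 32))
b = np.roll(a, (3, 5), axis=(0, 1))
c = xcorr_fft(b, a)
shift = np.unravel_index(np.argmax(c), c.shape)  # -> (3, 5)
```

    The paper's forward and adjoint operators would wrap such FFT products, which is why GPU FFT libraries shorten the runtime so effectively.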

    Due to temporal and spectral reciprocity, ultrafast sources also produce equivalently broad spectra. Naturally, our method can serve as plug-and-play image restoration in ultrafast science and passive sensing. To demonstrate this point, we show a comparison among three types of commonly used scientific broad-spectrum light sources and our methods’ feasibility via numerical simulation. Figure 6(a) illustrates a transform-limited femtosecond pulse ($\Delta t\,\Delta\omega \approx 1$) with a 100 fs pulse width. The corresponding spectrum ranges from 480 nm to 720 nm, with FWHM=100  nm (Δλ/λ=16.7%). The second revolutionary broadband source is an optical frequency comb with a fixed comb-line spacing of 20 nm and a spectrum ranging from 360 nm to 830 nm. The normalized spectrum of the optical comb is shown in Fig. 6(b), with each comb tooth having a spectral FWHM=1  nm. The third broadband source is the “D65” illuminant [see the spectrum in Fig. 6(c)], commonly used to simulate artificial sunlight (Δλ/λ=39.7%). The broad-spectrum PSFs and speckles generated by these three light sources, and the corresponding imaging results via methods A and B, are shown in Figs. 6(d)–6(f), respectively.


    Figure 6.Comparison of the reconstructions under three different types of broadband illumination calculated from numerical simulation. (a) The normalized spectrum of a femtosecond pulse with FWHM=100  fs pulse width (FWHM=100  nm spectrum). (b) An optical comb with a fixed comb line of 20  nm and the spectrum spanning from 360 nm to 830 nm. (c) The “D65” light source spectrum used to simulate artificial sunlight. (d)–(f) Broadband PSF, speckle, and the recovered objects from both methods (methods A and B) corresponding to each light source in (a)–(c). The broadband PSF autocorrelation patterns are inserted in the bottom left of the first column of (d)–(f). The scale bars in (d)–(f) are 500 camera pixels in the left two columns and 30 camera pixels in the rightmost column.

    All three broadband speckles produce high-fidelity results with both methods. Intriguingly, our noninvasive method (method A) performs similarly to the invasive method (method B). A possible explanation for this behavior could be the noise-suppressing properties of the NLM method used in cepstrum filtering. Further discussion of the speckle contrast and the imaging performance changes with spectral spread is arranged in Appendix B.

    While both methods A and B provide promising solutions to the lensless broadband scattering imaging problem, the hidden scenes are restricted to two-dimensional planar objects, and the effective field of view still lies within the OME range. For method A, recovering the high-frequency features of the spectrum remains a great challenge because the complete scattering degradation process is unknown; as a result, this noninvasive method cannot image dense, connected samples. This limitation applies to all existing statistics-based scattering imaging methods, including method A. For method B, there is a trade-off between reconstruction detail and the degree of denoising, owing to the TV regularization term; moreover, the complexity of the algorithm needs to be further reduced. We believe these drawbacks can be overcome in future work.

    With respect to improving imaging performance and capability in challenging scenarios, the next step is to extend our method to complex 3D scenes by leveraging stereo vision [54] or light-field solutions [55]. Higher-fidelity reconstruction is anticipated by applying deep-learning modules or scattering-matrix inversion to unveil the detailed scattering process. Benefiting from the single-shot data acquisition strategy, future work can also be extended to dynamic scattering regimes and non-line-of-sight imaging.

    5. CONCLUSION

    In summary, we have proposed two individual lensless broadband scattering imaging techniques, both of which produce high-fidelity reconstructions from a single-shot wide-spectrum illuminated speckle (FWHM=280  nm) spanning the visible region. The noninvasive method (method A) reproduces the object Fourier spectrum by numerical cepstrum refinement combined with a modified phase retrieval strategy. The invasive method (method B) is correlation inspired and outperforms state-of-the-art approaches, benefiting from the optimization framework and suitable regularizers. In both methods, no additional measurement of the source spectrum is needed; the spectrum may be ultra-broadband, highly structured, and even exhibit significant intensity fluctuations. The possibility of using broadband illumination in lensless scattering imaging demonstrated here is expected to open up a wide range of opportunities in ultrafast science and astronomical observation, and may even be extended to the X-ray imaging regime [56] and non-line-of-sight imaging [57]. Additionally, it can be implemented in unknown-target tracking by incorporating cross-correlation analysis [58], or applied in other dynamic scattering scenarios thanks to its single-shot capability [59].

    Acknowledgment

    The authors thank Xin Huang, Jianjiang Liu, Yijun Zhou, Zhengping Li, and Chengfei Guo for fruitful discussions and useful feedback. Our deepest gratitude goes to the anonymous reviewers for their careful work and thoughtful suggestions that have helped improve this paper substantially.

    APPENDIX A: TWO-WAVELENGTH MUTUAL INTENSITY CORRELATION FUNCTION

    In this section, we present additional mathematical details for the intensity cross-correlation part of Eq. (6) (main text). To begin, we consider the transmission geometry shown in Fig. 7(a). Assuming the paraxial propagation condition, the complex amplitude E(r, z) of the light at position (x, y) in the observation plane is related to the complex amplitude E(r′, 0) of the scattered light at coordinates (ξ, η) in a plane just to the right of the scattering surface by the Fresnel diffraction integral.


    Figure 7.Transmission scattering geometry and coordinate. (a) Free space transmission scattering geometry. (b) Incidence and observation directions, (c) k-vectors of the incident light ki, the transmitted light kt, and a k-vector in the observation direction ko.

    Because we only consider the modulus of the correlation function, the quadratic-phase complex exponential outside the integral is ignored, giving Eq. (A1):
    $$E_k(\mathbf{r},z)=\frac{1}{j\lambda z}\int_{\mathbf{r}'}E_k(\mathbf{r}',0)\,e^{\frac{jk}{2z}(\mathbf{r}-\mathbf{r}')^2}\,d^2\mathbf{r}'.$$

    The field in the (r, 0) plane can be written as
    $$E_k(\mathbf{r},0)=t\,S(\mathbf{r},0)\exp[j\phi(\mathbf{r},0)]\exp[jk(\hat{t}\cdot\hat{r})r],$$
    where t is the average amplitude transmission of the scattering medium and S represents the shape and amplitude distribution across the incident spot. The hat “^” denotes the unit direction of a vector. The phase shift ϕ(r, 0) imparted to the transmitted wave by the rough surface [see Fig. 7(c)], as measured in the (r, 0) plane, is
    $$\phi(\mathbf{r},0)=k\,(\hat{o}\cdot\hat{z}-n\,\hat{t}\cdot\hat{z})\,h(\mathbf{r},0),$$
    where here and throughout, λ is the free-space wavelength and n is the refractive index of the scattering medium. We define a scattering vector q, decomposed into transverse qt and longitudinal qz components:
    $$\mathbf{q}=\mathbf{k}_o-\mathbf{k}_t=\mathbf{q}_t+q_z\hat{z}.$$

    The phase shift suffered at the rough interface can then be rewritten as
    $$\phi(\mathbf{r},0)=q_z\,h(\mathbf{r},0).$$

    Our goal is to calculate the correlation function ΓE of the speckle fields at two points (r1, z) and (r2, z), that is,
    $$\Gamma_E(\mathbf{r}_1;\mathbf{r}_2,z)=\overline{E_{k_1}(\mathbf{r}_1,z)\,E_{k_2}^{*}(\mathbf{r}_2,z)}.$$

    The cross-correlation between the two observed fields Ek1(r1, z) and Ek2(r2, z) can now be written as
    $$\Gamma_E(\mathbf{r}_1;\mathbf{r}_2,z)=\frac{1}{\lambda_1\lambda_2 z^2}\iint_{\mathbf{r}_1',\mathbf{r}_2'}\Gamma_E(\mathbf{r}_1';\mathbf{r}_2',0)\,\exp\!\left[\frac{j}{2z}\left(k_1 r_1'^2-k_2 r_2'^2\right)\right]\exp\!\left[-\frac{j}{z}\left(k_1\mathbf{r}_1\cdot\mathbf{r}_1'-k_2\mathbf{r}_2\cdot\mathbf{r}_2'\right)\right]d^2\mathbf{r}_1'\,d^2\mathbf{r}_2'.$$

    Substituting the field expression of Eq. (A2) into the first line of this equation, we obtain
    $$\Gamma_E(\mathbf{r}_1';\mathbf{r}_2',0)=\overline{E_{k_1}(\mathbf{r}_1',0)E_{k_2}^{*}(\mathbf{r}_2',0)}=|t|^2 S(\mathbf{r}_1',0)S^{*}(\mathbf{r}_2',0)\,\overline{\exp\{j[q_{z1}h(\mathbf{r}_1')-q_{z2}h(\mathbf{r}_2')]\}}\times\exp[j(k_1\hat{t}_1\cdot\hat{r}_1')r_1'-j(k_2\hat{t}_2\cdot\hat{r}_2')r_2']=|t|^2 S(\mathbf{r}_1',0)S^{*}(\mathbf{r}_2',0)\,M_h(q_{z1},q_{z2})\times\exp[j(k_1\hat{t}_1\cdot\hat{r}_1')r_1'-j(k_2\hat{t}_2\cdot\hat{r}_2')r_2'],$$
    where Mh(A, B) is the second-order characteristic function of the surface-height function h(r), which is assumed to be statistically stationary. We assume that the correlation width of the scattered wave is sufficiently small that we can approximate ΓE as
    $$\Gamma_E(\mathbf{r}_1';\mathbf{r}_2',0)\approx\kappa|t|^2|S(\mathbf{r}_1',0)|^2 M_h(\Delta q_z)\times\exp[j(k_1\hat{t}_1\cdot\hat{r}_1')r_1'-j(k_2\hat{t}_2\cdot\hat{r}_2')r_2']\,\delta(\mathbf{r}_1'-\mathbf{r}_2').$$

    Substituting Eq. (A9) into Eq. (A7) and using the small-angle approximation, we obtain
    $$\Gamma_E(\mathbf{r}_1;\mathbf{r}_2,z)=\frac{\kappa|t|^2}{\lambda_1\lambda_2 z^2}M_h(\Delta q_z)\int_{\mathbf{r}'}|S(\mathbf{r}',0)|^2\exp\!\left[\frac{j(k_1-k_2)r'^2}{2z}\right]\times\exp[j(k_1\hat{t}_1\cdot\hat{r}'-k_2\hat{t}_2\cdot\hat{r}')r']\,\exp[-j(k_1\hat{o}_1\cdot\hat{r}'-k_2\hat{o}_2\cdot\hat{r}')r']\,d^2\mathbf{r}'.$$

    To simplify Eq. (A10), we introduce the notation
    $$q_{t1}=k_1\hat{o}_1\cdot\hat{r}-k_1\hat{t}_1\cdot\hat{r},\qquad q_{t2}=k_2\hat{o}_2\cdot\hat{r}-k_2\hat{t}_2\cdot\hat{r}.$$

    With the help of Eq. (A11), this result simplifies to
    $$\Gamma_E(\mathbf{r}_1;\mathbf{r}_2,z)=\frac{\kappa|t|^2}{\lambda_1\lambda_2 z^2}M_h(\Delta q_z)\,\Psi(\Delta\mathbf{q}_t)\int_{\mathbf{r}'}|S(\mathbf{r}',0)|^2\,d^2\mathbf{r}',$$
    where $\Delta\mathbf{q}_t=\mathbf{q}_{t1}-\mathbf{q}_{t2}$ and
    $$\Psi(\Delta\mathbf{q}_t)=\frac{\int_{\mathbf{r}'}|S(\mathbf{r}',0)|^2\exp\!\left[\frac{j(k_1-k_2)r'^2}{2z}\right]\exp(-j\Delta\mathbf{q}_t\cdot\mathbf{r}')\,d^2\mathbf{r}'}{\int_{\mathbf{r}'}|S(\mathbf{r}',0)|^2\,d^2\mathbf{r}'}$$
    is the normalized Fourier transform of the intensity distribution multiplied by a dual-wavelength quadratic phase factor:
    $$\Psi(\Delta\mathbf{q}_t)=\frac{\mathcal{F}\left\{|S(\mathbf{r}',0)|^2\exp\!\left[\frac{j(k_1-k_2)r'^2}{2z}\right]\right\}}{\int_{\mathbf{r}'}|S(\mathbf{r}',0)|^2\,d^2\mathbf{r}'}.$$

    Here, |S|² is the intensity distribution across the scattering spot, given by |S(r, 0)|² = I(r, 0). To make further progress, a specific shape for the scattering spot must be assumed. Let the spot be uniform and circular with diameter D, centered on the origin. Using the Hankel transform, we obtain
    $$\Psi(\Delta\mathbf{q}_t)=\frac{2J_1(D|\Delta\mathbf{q}_t|/2)}{D|\Delta\mathbf{q}_t|/2}*\mathcal{F}\left\{\exp\!\left[\frac{j(k_1-k_2)r'^2}{2z}\right]\right\}.$$

    For further simplification, let us introduce the same approximation used in the calculation of monochromatic speckle:
    $$\frac{J_1(D|\Delta\mathbf{q}_t|/2)}{D|\Delta\mathbf{q}_t|/2}\approx\frac{J_1(\pi D r/\bar{\lambda}_{ij}z)}{\pi D r/\bar{\lambda}_{ij}z}.$$

    Here $\bar{\lambda}_{ij}=(\lambda_i+\lambda_j)/2$ denotes the average of the two wavelengths, with subscripts i and j indexing the i-th and j-th wavelengths. The normalized cross-correlation function μE is found by dividing ΓE by its value at Δqz = 0 and Δqt = 0. The result is
    $$|\mu_E(\mathbf{q}_1,\mathbf{q}_2)|^2=|M_h(\Delta q_z)|^2\,|\Psi(\Delta\mathbf{q}_t)|^2.$$

    Evidently, wavelength variation has two effects. First, the observed speckle pattern undergoes a spatial scaling by an amount that depends on the distance from the point where the scattering event occurs (Ψ). When the wavelength increases, the speckle pattern expands about this point; when it decreases, the pattern shrinks. This effect arises because diffraction angles are inversely proportional to wavelength. The second effect is on the random phase shifts imparted to the scattered wave by the surface-height fluctuations (Mh). Since those phase shifts are proportional to h/λ, an increase in wavelength decreases the random phase shifts, and vice versa. When the surface is rough on the scale of a wavelength, phase wrapping into the interval (0, 2π) results in changes in the detailed structure of the speckle pattern (see Visualization 1 for details).

    If the surface-height fluctuations obey Gaussian statistics, the change in the intensity correlation due to the imparted phase shifts is given by
    $$|M_h(\Delta q_z)|^2=\exp\!\left(-\sigma_h^2\,\Delta q_z^2\right).$$

    In the case that both the incidence and observation directions are normal,
    $$\Delta q_z=(n-1)\,|k_i-k_j|.$$
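    Under this Gaussian-surface model, the decorrelation between two wavelengths can be evaluated numerically; the roughness σh = 1 μm and index n = 1.52 used below are illustrative assumptions, not values measured in the paper:

```python
import numpy as np

def mutual_correlation(lam_i, lam_j, sigma_h=1e-6, n=1.52):
    """Field correlation |M_h|^2 = exp(-sigma_h^2 * dq_z^2) between two
    wavelengths, with dq_z = (n - 1)|k_i - k_j| at normal incidence."""
    k_i, k_j = 2 * np.pi / lam_i, 2 * np.pi / lam_j
    dq_z = (n - 1) * abs(k_i - k_j)
    return np.exp(-(sigma_h * dq_z) ** 2)

# Equal wavelengths are fully correlated; widely separated ones are not.
same = mutual_correlation(625e-9, 625e-9)   # -> 1.0
far = mutual_correlation(480e-9, 720e-9)    # strongly decorrelated
```

    Scanning this function across a source spectrum gives the pairwise correlation weights that enter the double sum of the final cross-correlation result below.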

    According to the Siegert relation [60], the intensity correlation function can be related to the field correlation function as
    $$\Gamma_I(\Delta r)=\bar{I}^2\left[1+|\mu_E(\Delta r)|^2\right].$$

    We can now write the intensity cross-correlation function as
    $$\sum_{i=1}^{M}\sum_{j\neq i}^{M}\omega_i\omega_j\,\mathrm{PSF}_N(\lambda_i,\Delta\lambda_i)\star\mathrm{PSF}_N(\lambda_j,\Delta\lambda_j)\propto\sum_{i=1}^{M}\sum_{j\neq i}^{M}\omega_i\omega_j\left\{1+\exp\!\left[-\sigma_h^2(n-1)^2|k_i-k_j|^2\right]\times\left\{\frac{2J_1(\pi Dr/\bar{\lambda}_{ij}z)}{\pi Dr/\bar{\lambda}_{ij}z}*\mathcal{F}\left\{\exp\!\left[\frac{j(k_i-k_j)r'^2}{2z}\right]\right\}\right\}^2\right\}.$$

    This equation represents the final result of our analysis of the cross-correlation between two speckles at arbitrary wavelengths. When i equals j, Eq. (A21) degenerates to Eq. (5) of the main text, which describes the correlation between two speckles at the same wavelength.

    APPENDIX B: IMAGING PERFORMANCE WITH INCREASING ILLUMINATION BANDWIDTH

    Simulation Process

    To support the experimental data, we performed full numerical simulations of wave propagation through disordered media. In the simulations, the diffuser is placed at a distance v=12  cm in front of the detector and u=24  cm behind the object. The propagation from the source plane Ek(r0, 0) to the observation plane Ek(r2, z) can be divided into three stages: i) the source field Ek(r0, 0) propagates to the scattering plane Ek(r1, u) via free-space propagation; ii) the wavefront is modulated by the diffuser; iii) the disordered wavefront propagates to the detector plane Ek(r2, z) via free-space propagation. Here k and r0, r1, r2 denote the wave vector and the coordinates of the three planes, respectively. Free-space propagation is modeled by the Fresnel diffraction integral, with notation adapted from Nazarathy and Shamir [61]:
    $$\mathcal{R}[z,\mathbf{r}_0,\mathbf{r}_1]\{E_k(\mathbf{r}_0)\}=\frac{1}{j\lambda z}\int_{\mathbf{r}_0}E_k(\mathbf{r}_0,0)\,e^{\frac{jk}{2z}(\mathbf{r}_1-\mathbf{r}_0)^2}\,d^2\mathbf{r}_0.$$

    The operator’s parameters are given in square brackets, and the operand in curly braces. We model the diffuser transmittance tk(r1) as a pure-phase random mask, i.e., $t_k(\mathbf{r}_1)=\exp[jk\,\Delta n\,h(\mathbf{r}_1)]$, where h(r1) is the random height of the diffuser surface and Δn is the difference between the refractive indices of the diffuser and the surrounding air (Δn≈0.52 for glass diffusers). A circular pupil p(r1) is added behind the diffuser to control the speckle grain size (a pupil diameter D=6  mm is used in the simulation).

    The Fresnel diffraction operator R propagates only monochromatic waves; a broadband signal is propagated by writing it as a superposition of monochromatic waves and propagating each one individually. Since the spectral components of the source are mutually incoherent, the broadband PSF is the incoherent sum of the monochromatic intensities:
    $$\mathrm{PSF}_B=\int_k\left|\mathcal{R}[v,\mathbf{r}_1,\mathbf{r}_2]\left\{\mathcal{R}[u,\mathbf{r}_0,\mathbf{r}_1]\{E_k(\mathbf{r}_0,0)\}\,t_k(\mathbf{r}_1)\,p(\mathbf{r}_1)\right\}\right|^2 dk.$$
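    A minimal numerical sketch of this three-stage simulation (transfer-function Fresnel propagation, per-wavelength diffuser phase, incoherent spectral sum) is given below; it omits the pupil p(r1) and sampling safeguards, and the grid spacing and height-map statistics are illustrative assumptions:

```python
import numpy as np

def fresnel_propagate(field, lam, z, dx):
    """Fresnel (paraxial) propagation by the transfer-function method:
    multiply the angular spectrum by exp(-j*pi*lam*z*(fx^2 + fy^2)).
    A standard numerical stand-in for the operator R[z, ...] above."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * lam * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def broadband_psf(h, wavelengths, weights, u=0.24, v=0.12, dx=6.5e-6):
    """Sketch of PSF_B: propagate a point source to the diffuser, apply the
    pure-phase mask exp(j*k*dn*h) per wavelength, propagate to the camera,
    and sum the intensities incoherently over the spectrum. h is the
    diffuser height map; u = 24 cm and v = 12 cm follow the setup."""
    n = h.shape[0]
    src = np.zeros((n, n), dtype=complex)
    src[n // 2, n // 2] = 1.0                    # point source (pinhole)
    psf = np.zeros((n, n))
    for lam, w in zip(wavelengths, weights):
        k = 2 * np.pi / lam
        e1 = fresnel_propagate(src, lam, u, dx)  # source -> diffuser
        e1 = e1 * np.exp(1j * k * 0.52 * h)      # diffuser phase, dn = 0.52
        e2 = fresnel_propagate(e1, lam, v, dx)   # diffuser -> camera
        psf += w * np.abs(e2) ** 2               # incoherent sum
    return psf / psf.sum()
```

    With a dense set of wavelengths weighted by the source spectrum, this reproduces the bandwidth-dependent contrast loss studied in Fig. 8.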

    The emulation results for speckles with various illumination bandwidths are shown in Fig. 8. Figure 8(a) shows Gaussian-shaped spectral profiles with FWHM=1, 7, 24, 46, and 104 nm and corresponding standard deviations (SDs) of 0.4, 3, 10, 20, and 45 nm, all centered at 632.8 nm; Δλ=0  nm indicates monochromatic light at 632.8 nm. The corresponding autocorrelation intensity profiles of these broadband PSFs are presented in Fig. 8(b). Figure 8(c) shows a neuron-like object that serves as the hidden object. Notably, as the illumination bandwidth increases, the PBR of the speckle autocorrelation declines quickly at first and then levels off. As the illumination spectrum becomes infinitely wide, the PBR approaches 1, as expected. Speckles under different illumination bandwidths are presented in Fig. 8(d). The speckle contrast ratios (SCRs) [26] of these speckles are 0.99, 0.79, 0.38, 0.23, 0.16, and 0.12 for illumination from Δλ=0  nm to Δλ=104  nm, respectively.
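    The SCR quoted here is the standard deviation of the intensity over its mean; a toy example using synthetic exponential-intensity speckle (an assumption standing in for the simulated patterns):

```python
import numpy as np

def speckle_contrast(I):
    """Speckle contrast ratio SCR = std(I) / mean(I). Fully developed
    monochromatic speckle gives SCR ~ 1; broadband averaging of mutually
    uncorrelated spectral speckles lowers it toward 0."""
    I = np.asarray(I, dtype=float)
    return I.std() / I.mean()

# Averaging N uncorrelated speckle intensities reduces contrast ~ 1/sqrt(N).
rng = np.random.default_rng(0)
mono = rng.exponential(1.0, (512, 512))                    # SCR ~ 1
broad = np.mean(rng.exponential(1.0, (25, 512, 512)), axis=0)  # SCR ~ 0.2
```

    This 1/sqrt(N) behavior mirrors the measured trend: the SCR drops from 0.99 to 0.12 as the bandwidth grows and more uncorrelated spectral speckles are summed.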


    Figure 8.Emulation results under different illumination spectral widths. (a) Normalized spectrum profile of illumination bandwidth FWHM varying from Δλ=0  nm to Δλ=104  nm. (b) The autocorrelation intensity profiles of the corresponding broadband PSF patterns with the illumination shown in (a). (c) Neuron cell-like pattern used to generate speckles. (d) Speckle patterns under different illumination in (a). The scale bar is 20 camera pixels in (c) and 300 camera pixels in (d).


    Figure 9.Scattering imaging results under different illumination bandwidths via our proposed method A using simulated data. (a) Ground truth object. (b) Fourier amplitude of (a). (c) Recovered results using speckles under different illumination bandwidths [Fig. 8(d)]. (d) Estimated support regions of (c). (e) Retrieved Fourier amplitude corresponding to (c). (f) Quantitative performance of (c) with (a) as a reference. GT, ground truth. Scale bars are 20 camera pixels in (a), (c), and (d) and 200 camera pixels in (b) and (e).


    Figure 10.Imaging results with quantitative performance under different illumination bandwidths via our proposed method B using simulated data. (a) Ground truth object. (b)–(g) Recovered hidden objects using speckles from Fig. 8(d). GT, ground truth. Scale bars are 20 camera pixels in all subgraphs.

    As mentioned here and in the paper, one can in principle use a detection bandwidth much larger than the speckle spectral decorrelation bandwidth, and still be able to effectively employ our proposed approaches.

    APPENDIX C: SYSTEM CONFIGURATION AND EXPERIMENT RESULTS

    Experimental Setup

    The experimental setup for imaging in transmission is shown in Fig. 11. The broadband light is provided by a white-light LED (Thorlabs, MBB1L3, wavelength range of 400–1100 nm, FWHM=280  nm), which has a central wavelength of 625 nm and a coherence length of Lc≈1.4  μm. The light was collimated, transmitted through an object and a scattering slab, and recorded by the camera (PCO.edge 4.2, 6.5 μm pixel size). We used a USAF resolution target (Edmund, USAF 1951) and custom patterns fabricated via standard photolithography masks as objects. Two types of scattering media [220-grit ground glass (Edmund, 45-656) and 600-grit ground glass (Edmund, 38-790)] were used in turn, placed 240 mm behind the object and 120 mm in front of the camera. A typical integration time for a single-shot capture was 500 ms. The PSF of the scattering system was measured by replacing the object with a pinhole of typical diameter 100 μm. The narrowband light sources used in the experiment are a red LED (Thorlabs, M625L4, λc=625  nm, FWHM=17  nm) and a green LED (Thorlabs, M530L4, λc=530  nm, FWHM=35  nm).


    Figure 11.Experimental setup. A scattering slab is illuminated by a wide-spectrum LED light source with a spectral range from 400 to 1100 nm (FWHM=280  nm). The broadband scrambled patterns are recorded lenslessly by a scientific CMOS camera.


    Figure 12.Experimental comparison of broadband and narrowband illuminated imaging performance via cross-correlation strategy. First column, photography of ground truth objects. Columns 2–5 show imaging results from 220-grit ground glass diffuser with broadband illumination (Δλ=280  nm) and narrowband irradiance centered at λc=530  nm (Δλ=35  nm) and λc=625  nm (Δλ=17  nm), respectively. Columns 6–8 represent imaging results of 600-grit glass diffuser with the same white, green, and red LED illumination. Scale bars are 50 camera pixels in the rightmost column. Performance comparison of the reconstructed object along the lines indicated by the white arrows is shown next to the results.


    Figure 13.Nonlinear reconstruction results of the letter “B” object. The left part corresponds to the retrieved object along different tunable parameters. The right part presents a metric matrix to search for the sharpest image from the left part by using a gradient magnitude based evaluation index. The purple, blue, and green boxes show the reconstruction results with matched, phase-only, and inverse filters, respectively. Results above the yellow dotted line show spurious solutions due to the low power of the power spectrum. The gray and pink boxes indicate the best performance produced by the quality assessment function and from naked eyes. Arrows I and II represent the reconstruction changes by the PSF and the speckle modification, respectively. Arrow III represents the reconstruction changes by a fixed sum of α and β.

    Figure 14. Extended imaging results of method A. (a) Ground truth object “B.” (b)–(d) The letter “B” recovered from red, green, and white light via method A. (e) Ground truth object “star.” (f) Retrieved object “star” under broadband illumination (Δλ/λ=44.8%). (g) Ground truth object “3.” (h) Retrieved object “3” under broadband illumination (Δλ/λ=44.8%). GT, ground truth. Scale bars in all panels are 40 camera pixels.

    Figure 15. Complex micro-target imaging via the proposed method B. (a) The broadband spectrum (orange) with a bandwidth of 44.8% and the narrowband spectra with bandwidths of 2.7% (red) and 6.6% (green). (b)–(d) Speckle patterns under red, green, and white illumination, respectively. (e) Ground truth object. The yellow arrows indicate a fine feature of about 10 μm. (f)–(h) The corresponding objects retrieved from (b)–(d) using the 220-grit ground glass diffuser with a 25 μm pinhole. Scale bars are 300 camera pixels in (b)–(d) and 100 μm in (e)–(h).

    Figure 16. Image reconstruction for varying NLM filtering parameter γ via method A. The optimal parameter selection range shrinks slightly with increasing target complexity.
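The parameter γ in Fig. 16 controls the strength of the NLM filter [43]. As a hedged sketch of the underlying operation (a brute-force pixelwise NLM in which γ is assumed to play the role of the exponential weight-decay parameter, usually called h in the NLM literature; the function name and defaults are ours):

```python
import numpy as np

def nlm(img, gamma, patch=1, search=3):
    # Minimal brute-force non-local means: each output pixel is a weighted
    # average over a search window, with weights decaying exponentially in
    # the mean squared distance between (2*patch+1)^2 patches.
    p, s = patch, search
    padded = np.pad(img, p + s, mode='reflect')
    out = np.zeros_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + p + s, j + p + s
            ref = padded[ci - p:ci + p + 1, cj - p:cj + p + 1]
            acc, wsum = 0.0, 0.0
            for di in range(-s, s + 1):
                for dj in range(-s, s + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - p:ni + p + 1, nj - p:nj + p + 1]
                    d2 = np.mean((ref - cand) ** 2)
                    w = np.exp(-d2 / gamma ** 2)  # gamma: filtering strength
                    acc += w * padded[ni, nj]
                    wsum += w
            out[i, j] = acc / wsum
    return out
```

Small γ leaves the speckle nearly untouched; large γ approaches a plain box average, which explains why the usable range of γ narrows as target detail increases.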

    Figure 17. Results for different parameters. This figure shows the results reconstructed by method B with different parameters. The leftmost image is the ground truth image used in the simulation. The second to fourth columns correspond to different values of λ, which controls the ℓ1 regularization term, while the rows correspond to different values of τ, which controls the TV regularization term.
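The λ and τ sweeps in Fig. 17 suggest an objective of the form ½‖h∗x − y‖² + λ‖x‖₁ + τ·TV(x). A minimal proximal-gradient sketch of such a solver (an assumption for illustration: the paper cites SPIRAL-TAP [49], ADMM [50], and FISTA [51], and its actual solver may differ; we smooth the TV term so its gradient exists everywhere):

```python
import numpy as np

def soft(x, t):
    # Soft-thresholding: proximal operator of t * ||x||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def tv_grad(x, eps=0.05):
    # Gradient of a smoothed isotropic TV term sum sqrt(dx^2 + dy^2 + eps^2).
    dx = np.roll(x, -1, 0) - x
    dy = np.roll(x, -1, 1) - x
    mag = np.sqrt(dx**2 + dy**2 + eps**2)
    px, py = dx / mag, dy / mag
    return -(px - np.roll(px, 1, 0)) - (py - np.roll(py, 1, 1))

def deconvolve(y, h, lam, tau, step=0.5, iters=300):
    # Proximal gradient descent on 0.5*||h*x - y||^2 + lam*||x||_1 + tau*TV(x),
    # with the circular convolution evaluated in the Fourier domain.
    H = np.fft.fft2(h)
    Y = np.fft.fft2(y)
    x = np.zeros_like(y)
    for _ in range(iters):
        r = H * np.fft.fft2(x) - Y
        grad = np.real(np.fft.ifft2(np.conj(H) * r)) + tau * tv_grad(x)
        x = soft(x - step * grad, step * lam)
        x = np.maximum(x, 0.0)  # nonnegativity of intensity
    return x
```

Increasing λ suppresses weak background but eventually erodes the object (the λ=0.1 column), while increasing τ trades texture for smoothness, consistent with the metric trends in the table below.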

    Quantitative Analysis for Different Parameters Using Method B

                λ=0.00001   λ=0.001   λ=0.1
    τ=0.0005   PSNR   13.53   13.53   10.76
               SSIM    0.46    0.47    0.13
    τ=0.005    PSNR   13.51   13.56   10.75
               SSIM    0.47    0.51    0.13
    τ=0.05     PSNR   12.58   12.58    9.85
               SSIM    0.32    0.34    0.09
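The PSNR and SSIM entries above follow standard definitions [63]. A minimal sketch of both metrics (this single-window, global SSIM omits the sliding Gaussian window of the original metric, so its values are only illustrative):

```python
import numpy as np

def psnr(ref, img, peak=1.0):
    # Peak signal-to-noise ratio in dB for images scaled to [0, peak].
    mse = np.mean((ref - img) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

def global_ssim(x, y, peak=1.0):
    # Single-window SSIM with the standard constants C1, C2.
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))
```

Higher PSNR and SSIM in the table indicate that the parameter pair preserves both intensity fidelity and structure of the ground truth.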

    APPENDIX D: SPATIAL AND SPECTRAL CORRELATION ANALYSIS

    Measurement of 3D OME Range

    To quantify the effective imaging scope of the broadband scattering approach, we measured the translational and longitudinal OME ranges of the scattering media. The results are summarized in Fig. 18 for two types of ground glass diffusers. A 100 μm diameter pinhole serving as a point source is placed at a distance u=24  cm in front of the diffuser and scanned perpendicular to the optical axis by a translation stage (MTS50A-Z8, Thorlabs) with a fixed 0.5 mm step. The longitudinal OME range is measured with the same setup but with the pinhole scanned along the optical axis with a fixed 1 mm step. The effective OME range is defined as the displacement over which the correlation index drops from 1 to 0.5, and the results are summarized in Figs. 18(c) and 18(f). Note that all measurements are taken in the central region of the speckles. A detailed OME measurement and calculation procedure can be found in Ref. [64].
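The procedure above reduces to correlating each shifted-source speckle frame with the reference frame and locating the 0.5 crossing. A minimal numerical sketch (synthetic frames stand in for the camera measurements; function names are ours, not from the paper):

```python
import numpy as np

def correlation_index(a, b):
    # Pearson correlation coefficient between two speckle frames.
    a = a - a.mean()
    b = b - b.mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b)))

def ome_range(frames, steps_mm):
    # Effective OME range: displacement at which the correlation with the
    # first (reference) frame drops below 0.5, linearly interpolated
    # between scan steps.
    c = np.array([correlation_index(frames[0], f) for f in frames])
    below = np.where(c < 0.5)[0]
    if len(below) == 0:
        return steps_mm[-1]  # never decorrelates within the scan
    i = below[0]
    f = (c[i - 1] - 0.5) / (c[i - 1] - c[i])
    return steps_mm[i - 1] + f * (steps_mm[i] - steps_mm[i - 1])
```

Applying this to the lateral scan (0.5 mm steps) and the axial scan (1 mm steps) yields the translational and longitudinal ranges of Figs. 18(c) and 18(f).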

    Figure 18.Measurement of the spatial speckle decorrelation as a function of the displacement laterally and longitudinally. (a) and (b) Translational OME range of the 220-grit and 600-grit ground glass, respectively. (d) and (e) Longitudinal OME range of the 220-grit and 600-grit ground glass. (c) and (f) Effective OME range for the two types of ground glass diffusers. All experiments were repeated for three spectral bands: red, green, and white.

    Figure 19. (a)–(d) Monochromatic speckles at λ=500, 510, 520, and 530 nm from numerical simulation. (e)–(h) Cross-correlations between the speckle patterns of (a)–(d), with (a) as the reference speckle. Insets are enlarged portions of the central correlation peak. Notably, the cross-correlation terms spread with increasing illumination wavelength diversity. (i) Spectral decorrelation of the speckle patterns versus illumination spectral width. Scale bars in (a)–(d) are 50 camera pixels.

    Figure 20.Imaging resolution varies with the size of the pinhole probes. (a)–(c) Recovered results from red, green, and white illuminated patterns using 220-grit ground glass with a 50 μm, 100 μm, and 200 μm pinhole, respectively. (d) Broadband PSF autocorrelation as a function of the pinhole size for each PSF corresponding to (a)–(c). (e) Normalized autocorrelation intensity profiles of all PSFs along the arrow presented in (d). Scale bars are 50 camera pixels in (a)–(c).

    Figure 21.Imaging depth-of-field analysis with a fixed interval δ=2  mm defocus distance. (a)–(c) Imaging results of red, green, and white illuminated at different depths of field under 220-grit ground glass via method B, respectively. The leftmost column indicates the in-focus results with the object and the pinhole at the same spatial position. The right four columns display the defocused results. Scale bars in the rightmost column are 50 camera pixels.

    APPENDIX E: NOVEL BROADBAND SPECKLE CORRELATION PHENOMENON

    Anisotropic Broadband Speckle Correlation Characteristic

    Figure 22 depicts an anisotropic correlation phenomenon observed in both simulation and experiment. Notably, the broadband speckle grains expand from the center of the pattern toward its boundary [see Figs. 22(a) and 22(e)]. The central part of the broadband speckle exhibits a delta-like autocorrelation, as shown in Figs. 22(d) and 22(h); the closer to the speckle center, the smaller the average speckle grain size. The speckle grains at the boundary are consistently stretched along the radial direction [see Figs. 22(b) and 22(f)], as is the autocorrelation function [see Figs. 22(c) and 22(g)].

    Figure 22.Anisotropic correlations in broadband speckle patterns. (a) Broadband speckle pattern obtained by numerical simulation. (b) Enlarged border parts I, II, III, and IV of (a). (c) Autocorrelation of different parts in (b). (d) Enlarged central part V and its corresponding autocorrelation. (e)–(h) Real experiment observed broadband speckle and the anisotropic correlation phenomenon from 220-grit ground glass. The scale bars in (a) and (e) are 200 camera pixels.

    Figure 23.Position-dependent broadband speckle memory effect. (a) A typical scrambled pattern under broadband irradiation. (b) and (c) Translation memory effect range of the 600-grit and 220-grit ground glass calculated from different regions. The translation memory effect range is wider for the central part (orange box) than for the boundary part (blue box). The scale bar is 200 camera pixels in (a).

    APPENDIX F: MEDIA

    Please refer to Visualization 1 for more information. Visualization 1 includes four parts. Part I presents the speckle evolution of both monochromatic and polychromatic speckles for illumination wavelengths λ ranging from 500 to 764 nm. Part II shows how the speckle autocorrelation changes with the illumination wavelength, together with the corresponding PBR as a function of illumination spectral width. Part III provides the extended data of Fig. 19. Part IV shows a dynamic phase retrieval process for an object “optics” via our method A.

    References

    [1] S. Rotter, S. Gigan. Light fields in complex media: mesoscopic scattering meets wave control. Rev. Mod. Phys., 89, 015005(2017).

    [2] R. Horstmeyer, H. Ruan, C. Yang. Guidestar-assisted wavefront-shaping methods for focusing light into biological tissue. Nat. Photonics, 9, 563-571(2015).

    [3] J.-H. Park, Z. Yu, K. Lee, P. Lai, Y. Park. Perspective: wavefront shaping techniques for controlling multiple light scattering in biological tissues: toward in vivo applications. APL Photon., 3, 100901(2018).

    [4] J. Kubby, S. Gigan, M. Cui. Wavefront Shaping for Biomedical Imaging(2019).

    [5] N. Bender, H. Yılmaz, Y. Bromberg, H. Cao. Creating and controlling complex light. APL Photon., 4, 110806(2019).

    [6] P. Lai, L. Wang, J. W. Tay, L. V. Wang. Photoacoustically guided wavefront shaping for enhanced optical focusing in scattering media. Nat. Photonics, 9, 126-132(2015).

    [7] C. W. Hsu, S. F. Liew, A. Goetschy, H. Cao, A. D. Stone. Correlation-enhanced control of wave focusing in disordered media. Nat. Phys., 13, 497-502(2017).

    [8] S. Jeong, Y.-R. Lee, W. Choi, S. Kang, J. H. Hong, J.-S. Park, Y.-S. Lim, H.-G. Park, W. Choi. Focusing of light energy inside a scattering medium by controlling the time-gated multiple light scattering. Nat. Photonics, 12, 277-283(2018).

    [9] A. Boniface, B. Blochet, J. Dong, S. Gigan. Noninvasive light focusing in scattering media using speckle variance optimization. Optica, 6, 1381-1385(2019).

    [10] E. G. van Putten, D. Akbulut, J. Bertolotti, W. L. Vos, A. Lagendijk, A. Mosk. Scattering lens resolves sub-100 nm structures with visible light. Phys. Rev. Lett., 106, 193905(2011).

    [11] H. Yılmaz, E. G. van Putten, J. Bertolotti, A. Lagendijk, W. L. Vos, A. P. Mosk. Speckle correlation resolution enhancement of wide-field fluorescence imaging. Optica, 2, 424-429(2015).

    [12] J. A. Newman, Q. Luo, K. J. Webb. Imaging hidden objects with spatial speckle intensity correlations over object position. Phys. Rev. Lett., 116, 073902(2016).

    [13] N. H. Valencia, S. Goel, W. McCutcheon, H. Defienne, M. Malik. Unscrambling entanglement through a complex medium. Nat. Phys., 16, 1112-1116(2020).

    [14] P. Pai, J. Bosch, M. Kühmayer, S. Rotter, A. P. Mosk. Scattering invariant modes of light in complex media. Nat. Photonics, 15, 431-434(2021).

    [15] D. Bouchet, S. Rotter, A. P. Mosk. Maximum information states for coherent scattering measurements. Nat. Phys., 17, 564-568(2021).

    [16] A. Badon, D. Li, G. Lerosey, A. C. Boccara, M. Fink, A. Aubry. Spatio-temporal imaging of light transport in highly scattering media under white light illumination. Optica, 3, 1160-1166(2016).

    [17] A. G. Vesga, M. Hofer, N. K. Balla, H. B. De Aguiar, M. Guillon, S. Brasselet. Focusing large spectral bandwidths through scattering media. Opt. Express, 27, 28384-28394(2019).

    [18] M. Mounaix, D. Andreoli, H. Defienne, G. Volpe, O. Katz, S. Grésillon, S. Gigan. Spatiotemporal coherent control of light through a multiple scattering medium with the multispectral transmission matrix. Phys. Rev. Lett., 116, 253901(2016).

    [19] A. Boniface, I. Gusachenko, K. Dholakia, S. Gigan. Rapid broadband characterization of scattering medium using hyperspectral imaging. Optica, 6, 274-279(2019).

    [20] W. Xiong, C.-W. Hsu, H. Cao. Spatio-temporal correlations in multimode fibers for pulse delivery. IEEE Photonics Society Summer Topical Meeting Series (SUM), 1-2(2019).

    [21] J. Bertolotti, E. G. Van Putten, C. Blum, A. Lagendijk, W. L. Vos, A. P. Mosk. Non-invasive imaging through opaque scattering layers. Nature, 491, 232-234(2012).

    [22] O. Katz, P. Heidmann, M. Fink, S. Gigan. Non-invasive single-shot imaging through scattering layers and around corners via speckle correlations. Nat. Photonics, 8, 784-790(2014).

    [23] I. Freund, M. Rosenbluh, S. Feng. Memory effects in propagation of optical waves through disordered media. Phys. Rev. Lett., 61, 2328-2331(1988).

    [24] S. Feng, C. Kane, P. A. Lee, A. D. Stone. Correlations and fluctuations of coherent wave transmission through disordered media. Phys. Rev. Lett., 61, 834-837(1988).

    [25] J. Goodman. Introduction to Fourier Optics(2005).

    [26] J. Goodman. Speckle Phenomena in Optics: Theory and Applications(2007).

    [27] E. Akkermans, G. Montambaux. Mesoscopic Physics of Electrons and Photons(2007).

    [28] K. Monakhova, K. Yanny, N. Aggarwal, L. Waller. Spectral diffusercam: lensless snapshot hyperspectral imaging with a spectral filter array. Optica, 7, 1298-1307(2020).

    [29] X. Li, J. A. Greenberg, M. E. Gehm. Single-shot multispectral imaging through a thin scatterer. Optica, 6, 864-871(2019).

    [30] S. K. Sahoo, D. Tang, C. Dang. Single-shot multispectral imaging with a monochromatic camera. Optica, 4, 1209-1213(2017).

    [31] K. Midorikawa. Progress on table-top isolated attosecond light sources. Nat. Photonics, 16, 267-278(2022).

    [32] S. Zheng, H. Wang, S. Dong, F. Wang, G. Situ. Incoherent imaging through highly nonstatic and optically thick turbid media based on neural network. Photon. Res., 9, B220-B228(2021).

    [33] P. Arjmand, O. Katz, S. Gigan, M. Guillon. Three-dimensional broadband light beam manipulation in forward scattering samples. Opt. Express, 29, 6563-6581(2021).

    [34] E. Edrei, G. Scarcelli. Memory-effect based deconvolution microscopy for super-resolution imaging through scattering media. Sci. Rep., 6, 33558(2016).

    [35] T. Wu, O. Katz, X. Shao, S. Gigan. Single-shot diffraction-limited imaging through scattering layers via bispectrum analysis. Opt. Lett., 41, 5003-5006(2016).

    [36] S. Divitt, D. F. Gardner, A. T. Watnik. Imaging around corners in the mid-infrared using speckle correlations. Opt. Express, 28, 11051-11064(2020).

    [37] D. Lu, Q. Xing, M. Liao, G. Situ, X. Peng, W. He. Single-shot noninvasive imaging through scattering medium under white-light illumination. Opt. Lett., 47, 1754-1757(2022).

    [38] J. C. Dainty. Laser Speckle and Related Phenomena, 9(2013).

    [39] D. R. Luke. Relaxed averaged alternating reflections for diffraction imaging. Inverse Prob., 21, 37-50(2004).

    [40] J. R. Fienup. Reconstruction of an object from the modulus of its Fourier transform. Opt. Lett., 3, 27-29(1978).

    [41] J. R. Fienup. Phase retrieval algorithms: a personal tour. Appl. Opt., 52, 45-56(2013).

    [42] S. Marchesini, H. He, H. N. Chapman, S. P. Hau-Riege, A. Noy, M. R. Howells, U. Weierstall, J. C. Spence. X-ray image reconstruction from a diffraction pattern alone. Phys. Rev. B, 68, 140101(2003).

    [43] A. Buades, B. Coll, J.-M. Morel. Non-local means denoising. Image Process. Line, 1, 208-212(2011).

    [44] J. Wright, A. Ganesh, S. Rao, Y. Peng, Y. Ma. Robust principal component analysis: exact recovery of corrupted low-rank matrices via convex optimization. Advances in Neural Information Processing Systems, 2080-2088(2009).

    [45] H. Xu, C. Caramanis, S. Sanghavi. Robust PCA via outlier pursuit. IEEE Trans. Inf. Theory, 58, 3047-3064(2012).

    [46] S. Marchesini. Invited article: a unified evaluation of iterative projection algorithms for phase retrieval. Rev. Sci. Instrum., 78, 011301(2007).

    [47] M. R. Rai, A. Vijayakumar, J. Rosen. Non-linear adaptive three-dimensional imaging with interferenceless coded aperture correlation holography (I-COACH). Opt. Express, 26, 18143-18154(2018).

    [48] V. Anand, S. H. Ng, T. Katkus, S. Juodkazis. Spatio-spectral-temporal imaging of fast transient phenomena using a random array of pinholes. Adv. Photon. Res., 2, 2000032(2021).

    [49] Z. T. Harmany, R. F. Marcia, R. M. Willett. This is SPIRAL-TAP: sparse Poisson intensity reconstruction algorithms-theory and practice. IEEE Trans. Image Process., 21, 1084-1096(2011).

    [50] S. Boyd, N. Parikh, E. Chu, B. Peleato, J. Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3, 1-122(2011).

    [51] A. Beck, M. Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci., 2, 183-202(2009).

    [52] R. J. Hanisch, R. L. White, R. L. Gilliland. Deconvolution of Hubble Space Telescope images and spectra. Deconvolution of Images and Spectra, 310-360(1996).

    [53] Y. Shi, Y. Liu, W. Sheng, D. Zhu. Extending the imaging range through scattering layers to the entire correlation range. Appl. Opt., 59, 1633-1640(2020).

    [54] Y. Shi, Y. Liu, J. Wang, T. Wu. Non-invasive depth-resolved imaging through scattering layers via speckle correlations and parallax. Appl. Phys. Lett., 110, 231101(2017).

    [55] Y. Zhang, Z. Lu, J. Wu, X. Lin, D. Jiang, Y. Cai, J. Xie, Y. Wang, T. Zhu, X. Ji, Q. Dai. Computational optical sectioning with an incoherent multiscale scattering model for light-field microscopy. Nat. Commun., 12, 6391(2021).

    [56] J. Huijts, S. Fernandez, D. Gauthier, M. Kholodtsova, A. Maghraoui, K. Medjoubi, A. Somogyi, W. Boutu, H. Merdji. Broadband coherent diffractive imaging. Nat. Photonics, 14, 618-622(2020).

    [57] C. A. Metzler, F. Heide, P. Rangarajan, M. M. Balaji, A. Viswanath, A. Veeraraghavan, R. G. Baraniuk. Deep-inverse correlography: towards real-time high-resolution non-line-of-sight imaging. Optica, 7, 63-71(2020).

    [58] M. I. Akhlaghi, A. Dogariu. Tracking hidden objects using stochastic probing. Optica, 4, 447-453(2017).

    [59] M. Cua, E. H. Zhou, C. Yang. Imaging moving targets through scattering media. Opt. Express, 25, 3935-3945(2017).

    [60] A. J. F. Siegert. On the Fluctuations in Signals Returned by Many Independently Moving Scatterers(1943).

    [61] M. Nazarathy, J. Shamir. Fourier optics described by operator algebra. J. Opt. Soc. Am., 70, 150-159(1980).

    [62] Y. Wang, J. Li, P. Stoica. Spectral Analysis of Signals: The Missing Data Case(2006).

    [63] Z. Wang, A. Bovik, H. Sheikh, E. Simoncelli. Image quality assessment: from error visibility to structural similarity. IEEE Trans. Image Process., 13, 600-612(2004).

    [64] S. Schott, J. Bertolotti, J.-F. Léger, L. Bourdieu, S. Gigan. Characterization of the angular memory effect of scattered light in biological tissues. Opt. Express, 23, 13505-13516(2015).

    [65] H. Liu, Z. Liu, M. Chen, S. Han, L. V. Wang. Physical picture of the optical memory effect. Photon. Res., 7, 1323-1330(2019).

    [66] M. Kadobianskyi, I. N. Papadopoulos, T. Chaigne, R. Horstmeyer, B. Judkewitz. Scattering correlations of time-gated light. Optica, 5, 389-394(2018).

    [67] H. Yılmaz, C. W. Hsu, A. Goetschy, S. Bittner, S. Rotter, A. Yamilov, H. Cao. Angular memory effect of transmission eigenchannels. Phys. Rev. Lett., 123, 203901(2019).
