• Opto-Electronic Science
  • Vol. 2, Issue 8, 230025-1 (2023)
Yan Li1, Xiaojin Huang1, Shuxin Liu1,*, Haowen Liang2,**, Yuye Ling1, and Yikai Su1,***
Author Affiliations
  • 1Department of Electronic Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
  • 2State Key Laboratory of Optoelectronic Materials and Technologies, School of Physics, Sun Yat-Sen University, Guangzhou 510275, China
    DOI: 10.29026/oes.2023.230025
    Citation: Yan Li, Xiaojin Huang, Shuxin Liu, Haowen Liang, Yuye Ling, Yikai Su. Metasurfaces for near-eye display applications[J]. Opto-Electronic Science, 2023, 2(8): 230025-1

    Abstract

    Virtual reality (VR) and augmented reality (AR) are revolutionizing our lives. Near-eye displays are crucial technologies for VR and AR. Despite the rapid advances in near-eye display technologies, challenges remain, such as achieving a large field of view, high resolution, high image quality, a natural 3D effect, and a compact form factor. Great efforts have been devoted to striking a balance between visual performance and device compactness. While traditional optics are nearing their limitations in addressing these challenges, ultra-thin metasurface optics, with their high light-modulating capabilities, may offer a promising solution. In this review, we first introduce VR and AR near-eye displays, then briefly explain the working principles of light-modulating metasurfaces, review recent developments in metasurface devices geared toward near-eye display applications, delve into several advanced natural 3D near-eye display technologies based on metasurfaces, and finally discuss the remaining challenges and future perspectives associated with metasurfaces for near-eye display applications.

    Introduction

    Recently, the emergence of the metaverse has sparked a surge of interest in virtual reality (VR) and augmented reality (AR) technologies1-6, which has led to extensive research in both industry and academia. VR creates fully immersive virtual environments by presenting digital images to users while effectively blocking out the real world. AR, in contrast, allows viewers to perceive and interact with a mixed-reality world by superimposing virtual information onto real-world scenes. These technologies offer a wide range of potential applications, including entertainment7, military training8, 9, education, remote communication10, and design engineering11, 12. It is anticipated that they will soon revolutionize our daily lives, transforming the way we live, work, and learn, and emerging as the next-generation human-machine interface.

    Near-eye displays13-15 are crucial to VR and AR technologies. Innovative near-eye headsets such as Google Glass, Microsoft HoloLens, Meta Quest, and Apple Vision Pro have been developed and launched, offering unprecedented user experiences that push the boundaries of human imagination16-26, as shown in Fig. 1. However, there are still challenges to overcome before VR and AR can be widely adopted for everyday use. For VR displays, it is critical to provide a large field of view (FOV) and high-resolution virtual images to enhance the realism and immersion of the virtual environment; however, this often results in increased bulkiness and weight. For AR displays, compactness and light weight are highly desired for long-term viewing and interaction. An adequate FOV, see-through capability without obvious distortion of the real-world scene, and good contrast in both the real-world scene and the virtual images are also required. Balancing and optimizing these features simultaneously remains a challenge. Moreover, because of the stereoscopic 3D display technology27 employed, these near-eye headsets are incapable of generating a realistic 3D effect with all the depth cues correctly presented28, as experienced in the real world. This discrepancy in depth cues can result in an unrealistic 3D effect and even cause visual fatigue and nausea29, compromising the visual experience for both VR and AR.

    Figure 1.Roadmap of near-eye display development. First AR head-mounted display (HMD) prototype, developed by Ivan Sutherland in 1968. Figure reproduced from ref.16, under a Creative Commons Attribution 4.0 International License. VR HMD, National Aeronautics and Space Administration (NASA) Ames VIEW. Figure reproduced from ref.17, National Aeronautics and Space Administration. HMDs based on waveguide devices and freeform surfaces. First commercial AR glasses product, the SV-6 PC Viewer, developed by MicroOptical in 2003. Figure reproduced from ref.20, the MicroOptical Corporation. Monocular optical see-through smart glasses, Google Glass, developed by Google in 2012. Figure reproduced from ref.21, Wikipedia. Representative modern VR HMD, Oculus Rift. Figure reproduced from ref.22, Wikipedia. Representative modern AR HMD, Microsoft HoloLens 1, based on optical waveguides, in 2015. Figure reproduced from ref.23, Wikipedia. First commercial dual-focal AR display device, Magic Leap One, released by Magic Leap in 2018. Figure reproduced from ref.24, Wikipedia. First AR system based on a metasurface device, proposed by Lee et al. in 2018. Figure reproduced from ref.25, Nature Publishing Group, under a Creative Commons Attribution 4.0 International License. Video-see-through AR headset, Apple Vision Pro, developed by Apple in 2023. Figure reproduced from ref.26, Apple.

    Metasurfaces30-33, ultra-thin planar elements consisting of sub-wavelength antennas, offer superior modulation capabilities for light amplitude, phase, and polarization state, outperforming conventional refractive and diffractive optics. Extensive research has been conducted on optical metasurface devices, including gratings34, 35, lenses36-38, and holograms39-43. Advances in metasurface device design and fabrication engineering have facilitated their applications in a great variety of areas such as displays44, 45, cameras46, 47, and microscopes48, 49. With their ultra-thin form factor, subwavelength modulation scale, and high modulation flexibility, metasurfaces are promising candidates as key components in near-eye displays to replace bulky conventional optics or enable novel functionalities, paving the way for next-generation AR and VR technologies50.

    In this review, we first introduce VR and AR near-eye displays, discussing their working principles, basic architectures, key functional components, and challenges. We then briefly explain the working principles of light-modulating metasurfaces. Next, we review recent developments in metasurface devices geared toward near-eye display applications, illustrating their working principles, functionalities, and performance characteristics within various VR and AR architectures. We then delve into several natural 3D near-eye display technologies based on metasurfaces. Finally, we discuss the remaining challenges associated with metasurfaces for near-eye displays and suggest future research directions.

    Principles of metasurfaces

    Metasurfaces, constructed from subwavelength-scaled meta-atoms arranged in specific configurations, are artificial materials with the ability to independently or simultaneously manipulate various properties of light, including phase, amplitude, wavelength, polarization, and temporal characteristics. As a result, metasurfaces have emerged as an innovative platform for controlling light. Their remarkable functionalities offer significant opportunities for the design of near-eye displays, which necessitate efficient manipulation of diffractive light within a confined space.

    In a metasurface, each individual meta-atom can be conceptualized as a secondary wave emitter that generates light with specific optical properties, in accordance with the Huygens principle49. For a near-eye display, the desired capability is to modulate the phase, polarization, and wavelength of light. Therefore, this review will primarily focus on the fundamental principles underlying the modulation of these optical properties.

    Phase modulation is one of the most frequently utilized features of metasurfaces in near-eye displays. Various mechanisms, such as resonant phase51, propagation phase52, geometry phase53, and other nonlocal modulations54, 55, can be implemented in meta-atoms for phase control.

    Within a specific wavelength range, the incident light can excite resonant modes in the meta-atoms, leading to a phase shift in the transmitted or reflected light, known as the resonant phase. This phase is influenced by the geometric parameters and electromagnetic properties of the meta-atoms' constituent materials. By tuning these parameters, the resonant condition and resonant phase can be effectively modulated. Metallic meta-atoms are often used for resonant phase modulation due to their ability to utilize plasmonic resonance56-58. However, their practical applications are hindered by significant ohmic loss, resulting in low working efficiency. As an alternative, dielectric meta-atoms exhibit Mie resonance, offering a more efficient method for resonant phase modulation with minimal ohmic loss59. Unlike plasmonic resonance, Mie resonance originates from the interplay between intrinsic electric and magnetic resonances within these meta-atoms. More recently, advanced resonant modes, such as integrated resonance, have been proposed, providing greater degrees of freedom for sophisticated modulation of the resonant phase60, 61.

    The propagation phase refers to the phase shift that occurs when light traverses a dielectric meta-atom62-64. In this scenario, the meta-atom is treated as a truncated waveguide, where the height and the effective refractive index of the meta-atom determine the phase accumulation during light propagation. Typically, the height is uniform, while the effective refractive index is influenced by the fill factor and constituent material of the meta-atom. Therefore, adjusting the geometrical parameters enables the tuning of the propagation phase. The propagation phase is often utilized in the design of polarization-insensitive metasurfaces.
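
    As a concrete illustration, the sketch below estimates the propagation phase of a truncated-waveguide meta-atom with a crude volume-weighted effective-index model; the wavelength, pillar height, and refractive indices are illustrative assumptions, and practical designs obtain the effective index from rigorous solvers such as RCWA or FDTD rather than from this interpolation.

```python
import numpy as np

# Minimal sketch: propagation phase of a truncated-waveguide meta-atom.
# The volume-weighted effective-index model is an illustrative assumption;
# real designs compute n_eff with rigorous solvers (RCWA/FDTD).

wavelength = 532e-9      # design wavelength (m), assumed
height = 600e-9          # uniform pillar height (m), assumed
n_pillar = 2.4           # approximate index of TiO2 at 532 nm
n_ambient = 1.0          # air

def propagation_phase(fill_factor):
    """Phase accumulated while traversing the meta-atom, modulo 2*pi."""
    n_eff = n_ambient + fill_factor * (n_pillar - n_ambient)  # crude interpolation
    return (2 * np.pi / wavelength) * n_eff * height % (2 * np.pi)

for ff in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"fill factor {ff:.1f} -> phase {np.degrees(propagation_phase(ff)):6.1f} deg")
```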

    In contrast to the previous two principles, the geometry phase, also known as the Pancharatnam-Berry phase53, provides a unique approach to phase modulation in metasurfaces. When circularly polarized light interacts with an anisotropic meta-atom, the output light is converted to the orthogonal circular polarization state and acquires an additional phase shift. This geometry phase is precisely twice the in-plane rotation angle of the meta-atom65, 66. Consequently, an entire metasurface can be constructed using an array of these anisotropic meta-atoms, all featuring the same geometry but different in-plane rotation angles.
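
    The factor of two can be verified with a short Jones-calculus sketch in which the anisotropic meta-atom is idealized as a half-wave plate; this idealization is ours rather than a model from any cited work, and the sign of the acquired phase depends on the handedness convention.

```python
import numpy as np

# Minimal sketch: geometry (Pancharatnam-Berry) phase from a rotated
# anisotropic meta-atom idealized as a half-wave plate.

def rotated_hwp(theta):
    """Jones matrix of a half-wave plate with its fast axis rotated by theta."""
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    hwp = np.array([[1, 0], [0, -1]])          # pi retardance between the two axes
    return rot @ hwp @ rot.T

lcp = np.array([1, 1j]) / np.sqrt(2)           # one circular polarization state
rcp = np.array([1, -1j]) / np.sqrt(2)          # the orthogonal circular state

for theta_deg in (0, 30, 60):
    theta = np.radians(theta_deg)
    out = rotated_hwp(theta) @ lcp
    # The output is the orthogonal circular state carrying a phase of 2*theta;
    # projecting onto that state extracts the geometry phase.
    phase = np.degrees(np.angle(np.vdot(rcp, out)))
    print(f"rotation {theta_deg:2d} deg -> geometry phase {phase:6.1f} deg")
```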

    The aforementioned phase modulation mechanisms primarily focus on local phase modulation. However, nonlocal modulation67 of phase presents greater opportunities for efficient angular manipulation. This approach relies on both the local Bloch modes and a large number of nonlocal overlapping modes to effectively modulate light68. By leveraging these mechanisms, light can interact with the desired diffraction channel with a higher probability, maintaining a very high diffraction efficiency even at large deflection angles. For instance, it has been reported that both transverse electric (TE) and transverse magnetic (TM) diffraction efficiencies can reach approximately 75% at a deflection angle of 75° using a metagrating designed for a wavelength of 1050 nm, based on freeform multimode geometries35. Since nonlocal modes are difficult to predict and engineer, these metagratings are typically designed using an inverse design approach, which is result-driven, with the local modes being automatically obtained69.
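
    Independent of how the phase response is realized, the deflection angle itself is fixed by the grating equation; the short check below reproduces the period implied by the 75° deflection at 1050 nm quoted above, while the high efficiency comes from the freeform, inverse-designed geometry rather than from this relation.

```python
import numpy as np

# Minimal check: grating period required for a given first-order deflection
# angle at normal incidence (transmission into air). Values follow the example
# quoted in the text (1050 nm, 75 degrees).

wavelength = 1050e-9
deflection = np.radians(75)
period = wavelength / np.sin(deflection)      # sin(theta_out) = lambda / period
print(f"required grating period: {period * 1e9:.0f} nm")   # ~1087 nm
```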

    By carefully designing metasurfaces with the phase modulation methods mentioned above, superior optical properties such as a wider angular bandwidth70 and higher diffraction efficiency71 can be achieved. Moreover, metasurfaces can modulate not only the phase but also the polarization or amplitude simultaneously72, 73. These features allow for convenient polarization74-77 or wavelength78 multiplexing, providing more design freedom to improve the optical performance and enable new functionalities in near-eye displays79, 80.

    Fundamentals of near-eye displays

    The recent advancements in optics, high-resolution displays, and information technologies have led to the emergence of VR and AR near-eye displays, finding applications in various fields.

    A typical VR display architecture is shown in Fig. 2(a). A display panel is placed within one focal length of the eyepiece. This arrangement allows the eyepiece to form a magnified virtual image of the display panel, which is viewed by the human eye. With the external light blocked, the viewer becomes fully immersed in the virtual environment. The eyepiece and the image source are two critical components of a VR display. The eyepiece can consist of a group of refractive or diffractive optics, and the system configuration can be either transmissive, as shown in Fig. 2(a), or reflective with a folded optical path for a more compact form factor, as in a pancake scheme81, 82. High-resolution liquid crystal displays (LCDs), organic light-emitting-diode (OLED) microdisplays, and micro light-emitting-diode (μLED) microdisplays are commonly employed as image sources83 in VR displays.
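
    A minimal thin-lens sketch makes this geometry concrete: placing the panel just inside one focal length yields a distant, magnified virtual image for the eye to view. The focal length and panel distance below are illustrative assumptions, not the parameters of any particular headset.

```python
# Minimal sketch: thin-lens model of a VR eyepiece with the display panel
# placed slightly inside one focal length. Numbers are illustrative assumptions.

f = 40.0          # eyepiece focal length (mm)
d_panel = 38.0    # panel-to-eyepiece distance (mm), just inside f

# Thin-lens equation 1/v - 1/u = 1/f with the object at u = -d_panel:
# the image distance v comes out negative, i.e. a virtual image on the panel side.
v = 1.0 / (1.0 / f - 1.0 / d_panel)
magnification = v / (-d_panel)                 # transverse magnification

print(f"virtual image plane: {abs(v):.0f} mm from the eyepiece")   # ~760 mm
print(f"transverse magnification: {magnification:.0f}x")           # ~20x
```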

    Figure 2.(a) VR display optical schematic diagram. (b) Consistent accommodation and vergence distances when observing the real-world scene. (c) Mismatch between accommodation and vergence distances when viewing with a stereoscopic 3D display.

    For VR displays, which aim to provide an immersive virtual environment, a critical challenge is to provide a wide FOV that is comparable to the human visual system. The monocular FOV of the human eye is approximately 160° horizontally and 130° vertically, and the binocular FOV is about 200° horizontally and 130° vertically84. Hence, the eyepiece should have good imaging capability within a large FOV.

    VR displays face a significant challenge in the form of the vergence-accommodation conflict (VAC)28, 85, 86, which leads to 3D visual fatigue. This issue arises from the stereoscopic 3D display technique used in commercial VR headsets. When viewing a real-world scene, both eyes converge at the depth of the 3D object, with each eye accommodating (focusing) at the same depth, as shown in Fig. 2(b). However, in a stereoscopic VR display, where two separate parallax images are projected to the two eyes to create a virtual 3D image, the eyes converge on the virtual 3D image, but each eye focuses on the 2D virtual image plane of the display panel, as shown in Fig. 2(c). The depth of the virtual 3D image perceived through two-eye convergence can be varied by adjusting the parallax images, but the depth perceived through eye accommodation remains fixed at the virtual image plane, which is determined by the panel position and the eyepiece focal length. This mismatch between accommodation and vergence distances leads to visual fatigue, nausea, and other discomfort. Various true 3D display87 technologies such as light field displays80, 88-90, holographic displays91, 92, and volumetric 3D displays93, 94 have been proposed to address this problem.
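
    The mismatch can be quantified in diopters, as in the short sketch below; the interpupillary distance, the fixed image-plane distance, and the rendered depth are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: vergence-accommodation mismatch in a stereoscopic display.
# All numbers (IPD, image-plane distance, rendered depth) are assumptions.

ipd = 0.064               # interpupillary distance (m)
image_plane = 2.0         # fixed virtual image plane set by panel and eyepiece (m)
rendered_depth = 0.5      # depth of the 3D object implied by the parallax images (m)

vergence_angle = 2 * np.degrees(np.arctan(ipd / (2 * rendered_depth)))
accommodation_demand = 1.0 / image_plane       # diopters
vergence_demand = 1.0 / rendered_depth         # diopters

print(f"vergence angle: {vergence_angle:.1f} deg")
print(f"vergence-accommodation mismatch: {vergence_demand - accommodation_demand:.1f} D")
```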

    For AR displays, there are two basic architectures: video-see-through and optical-see-through95. The former is quite similar to VR, comprising an image source and an eyepiece, but with an additional component - a real-time capturing camera. Instead of viewing the real world optically, the viewer sees the real-time video of the real world captured by the camera, with computer-generated virtual information added to the display, as shown in Fig. 3(e). Recently, Apple Vision Pro, a representative video-see-through AR product, was launched, offering a high-quality real-world environment with abundant virtual resources. This review, however, focuses on optical-see-through AR displays. An optical see-through AR display requires an additional functional component called an optical combiner, in addition to the display panel and eyepiece. This component overlays the virtual image onto the real world. A combiner typically reflects virtual image light and transmits real-world light into the eye to combine the virtual and the real world. Hence, a beam splitter (BS)96, 97, a polarization beam splitter (PBS)98, 99, a partially reflective mirror (including half mirror), or gratings100 on a waveguide could all serve as a combiner in an optical see-through AR display. Figure 3(a–d) illustrates various optical-see-through AR architectures based on different optical combiners.

    Figure 3.Schematics of AR display architectures based on (a) a half mirror/BS combiner, (b) birdbath optics, (c) freeform prisms, and (d) a waveguide with grating couplers. (e) Schematic diagram of a video see-through AR display.

    Figure 3(a) illustrates the architecture of the simplest optical see-through AR display, which is based on a BS combiner. The virtual image is first magnified by the eyepiece and then reflected towards the eye by the BS. Simultaneously, the real-world light can pass through the BS directly, entering the eye without distortion. The BS could be a regular polarization-independent BS in the form of a cube or thin plate, with its transmittance and reflectance optimized to achieve the desired brightness for virtual and real environments. Obviously, there is a tradeoff between the efficiency of the virtual image and real-world light. However, if the virtual image light is linearly polarized, such as that from an LCD, employing a PBS can significantly improve the efficiency of virtual image light while maintaining ~50% transmittance for the real-world light.

    The birdbath architecture is a commonly used design in commercial optical-see-through AR products101. As shown in Fig. 3(b), the image light, initially modulated by the refractive optics near the display panel, undergoes a series of reflections. It is first reflected by the BS, then by the curved partial mirror, and finally passes through the BS again before reaching the eye. The refractive optics and the curved partial mirror work together to magnify the image, functioning as an eyepiece. The folded optical path, combined with the design flexibility offered by the separate refractive and reflective optics, allows these systems to deliver high-quality images and a large FOV. Moreover, they maintain a relatively compact form factor and are lightweight.

    Figure 3(c) demonstrates the schematic of an AR display system based on freeform optics. Here, image light injected into the freeform prism is first reflected at the glass-air interface via total internal reflection (TIR), and then by the partially reflective mirror. Each surface of the prism could be individually designed and optimized to jointly achieve high image quality for the virtual image. A compensator prism is required to compensate for the real-world scene. This design can yield high-quality images with a broad FOV and high efficiency. Nonetheless, this approach also results in a bulky and heavy overall system. Furthermore, it often necessitates more complex and costly manufacturing processes.

    Figure 3(d) shows the schematic of a waveguide-based AR display utilizing grating couplers. Light from the light engine is first collimated by a collimating lens before being coupled into the waveguide. It then propagates within the waveguide via TIR until it is coupled out towards the eye by the grating. The input coupler is typically highly efficient, maximizing the utilization of light emitted from the microdisplay. In addition to gratings, prisms can also serve as high-efficiency input couplers. Conversely, the output coupler, which also functions as a pupil expander, usually has a lower and spatially varying efficiency to ensure a uniformly duplicated eye box. The out-couplers are usually made of surface relief gratings, volume gratings, or holographic optical elements. Due to the dispersive nature of gratings, waveguide architectures often employ multiple waveguides or a wavelength multiplexing technique to enable full-color rendering. The waveguide architecture provides a smaller form factor and lighter weight compared to other designs. It allows for very high transmittance of most real-world light, but the overall efficiency for the virtual image light is relatively low. Additionally, the FOV of waveguide-based AR displays is limited due to the constraints imposed by TIR.
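
    The TIR-imposed FOV limit can be estimated from the grating equation alone, as in the sketch below; the waveguide index, coupler period, design wavelength, and grazing-angle bound are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: FOV limit of a single grating-coupled waveguide. The in-coupler
# adds a fixed transverse momentum lambda/period, and the resulting guided angle
# must stay between the TIR critical angle and a practical grazing limit.
# All parameter values are illustrative assumptions.

wavelength = 532e-9
n_wg = 1.8                                     # waveguide refractive index
period = 390e-9                                # in-coupler grating period
theta_c = np.arcsin(1.0 / n_wg)                # TIR critical angle
theta_graze = np.radians(75.0)                 # practical grazing-angle bound

def guided_angle(theta_air):
    """Guided-mode angle for an in-air field angle (radians), or NaN if evanescent."""
    s = (np.sin(theta_air) + wavelength / period) / n_wg
    return np.arcsin(s) if abs(s) <= 1 else np.nan

supported = [np.degrees(t) for t in np.linspace(-np.radians(30), np.radians(30), 601)
             if theta_c <= guided_angle(t) <= theta_graze]
print(f"supported in-air field angles: {supported[0]:.1f} to {supported[-1]:.1f} deg")
```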

    In the realm of near-eye display architectures, there is always a trade-off between various features such as FOV, resolution, image quality, and form factor. Researchers and engineers are constantly striving to strike a balance between visual performance and device compactness. While traditional optics are nearing their limitations in addressing these challenges, ultra-thin metasurface optics, with their high light-modulating capabilities, may present a promising solution. In the following sections, we will review recent developments in metasurface devices oriented toward near-eye display applications.

    Metasurfaces in VR displays

    As previously discussed, the architecture of a VR display is relatively straightforward, composed of two main functional components: the image source and the eyepiece. In theory, metasurface devices could serve as either the image source or the eyepiece. However, for VR displays, the image source must provide large-size, video-rate, full-color images to create an immersive virtual environment. This requirement surpasses the capability of state-of-the-art metasurfaces. Therefore, the application of metasurfaces in VR displays is primarily restricted to functioning as an eyepiece. Compared to traditional bulky optics, metasurface eyepieces, characterized by their planar architectures with exceptionally compact footprints and versatile phase modulation capabilities, provide greater light-deflecting and aberration-suppression capabilities. The main challenge for metasurface eyepieces in achieving a full-color VR display with a large FOV lies in the design and fabrication of large-size, achromatic metalenses.

    In 2021, Li et al. proposed a method for achieving large-area multiwavelength achromatic metalenses using multiple-zone dispersion engineering102, which they subsequently employed in a VR system. While broadband achromatic metalenses36 can be achieved by designing spatially varied meta-atoms that independently control the phase profile and dispersion parameters, such as group delay (GD) and group delay dispersion, the required GD increases significantly with the diameter of the metalens, exceeding the capabilities of meta-atoms. This typically restricts the size of achromatic metalenses to tens of microns. To overcome this limitation, they divided the metalens into multiple zones, as shown in Fig. 4(a), to reduce the required GD within each zone and fabricated a 2-mm-diameter, high-numerical-aperture (NA=0.7) RGB-achromatic metalens102. The metalens is composed of TiO2 nanofin structures on a fused silica substrate. Figure 4(b) shows a scanning electron microscope (SEM) image of the fabricated metalens. Figure 4(c) illustrates the schematic of the VR display system, where the achromatic metalens is used as an eyepiece and an RGB-laser-based image source is employed. The displayed VR images are shown in Fig. 4(d–g). As can be seen, the different color components achieve high image quality at the same image depth simultaneously, indicating negligible chromatic aberration.
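
    The scaling argument behind the zoning strategy can be sketched numerically: the relative group delay required across the aperture grows with the lens radius, while subdividing the aperture into annular zones resets it zone by zone. The focal length below is inferred from the quoted 2-mm diameter and NA of 0.7, whereas the zone count is an illustrative assumption rather than the value used in the cited work.

```python
import numpy as np

# Minimal sketch: group-delay (GD) budget of an achromatic metalens and the
# effect of zoning. Focal length follows from a 1 mm semi-diameter at NA ~ 0.7;
# the number of zones is an illustrative assumption.

c = 3e8
radius = 1e-3                  # semi-diameter (m)
f = 1.02e-3                    # focal length (m) giving NA ~ 0.7 for this radius

def required_gd(r):
    """Relative group delay (s) a meta-atom at radius r must provide."""
    return (np.sqrt(r**2 + f**2) - f) / c

print(f"GD span across the full lens: {required_gd(radius) * 1e15:.0f} fs")

# Splitting the aperture into annular zones resets the GD reference in each zone,
# so a zone only needs to cover the GD difference accumulated within it.
n_zones = 10
edges = np.linspace(0.0, radius, n_zones + 1)
per_zone = max(required_gd(edges[i + 1]) - required_gd(edges[i]) for i in range(n_zones))
print(f"largest GD span within a zone: {per_zone * 1e15:.0f} fs")
```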

    Figure 4.(a) Schematic of a multizone RGB-achromatic metalens. (b) Scanning electron microscope (SEM) image of a 2-mm-diameter achromatic metalens with NA=0.7. (c) Schematic of the VR mode employing the achromatic metalens. (d, e) VR display results with a 3D effect. (f, g) Full-color VR display results. The scale bar is 20 μm in (d–g). Figure reproduced with permission from ref.102, American Association for the Advancement of Science.

    While the aforementioned method achieves the desired phase profiles, it necessitates a significant variation in meta-atom structures, thereby increasing fabrication complexity. To address this, the same research group proposed a novel inverse-design framework in 2022103. This new approach deviates from optimizing the phase profiles and instead focuses on maximizing the intensity of different wavelengths at the focal spot. By using a fast approximate solver and an adjoint method, a 1-cm-diameter, RGB-achromatic, polarization-insensitive metalens was designed based on the same materials102 but simpler meta-atom structures. A photograph of the achromatic metalens, which has an NA of 0.3, is shown in Fig. 5(a). Figure 5(b) shows the measured focal intensity distributions in the XZ plane for the blue (488 nm), green (532 nm), and red (658 nm) wavelengths, respectively, demonstrating minimal focal shift. The focusing efficiency of the fabricated metalens was measured to be 15.8%, 13.6%, and 16.1% for the RGB colors, respectively. Figure 5(c) shows a photo of the VR system employing the achromatic metalens eyepiece and a laser-illuminated micro-LCD, while Fig. 5(d–g) exhibits the VR images generated by the system.

    Figure 5.(a) Photograph of a 1-cm-diameter RGB-achromatic metalens. The inset is the SEM image of the nanostructures used in the metalens and the scale bar is 500 nm. (b) Measured focal intensity distribution in the XZ plane at RGB wavelengths of the achromatic metalens. (c) Photograph of VR imaging setup employing the achromatic metalens. (d–f) Binary VR imaging results at RGB wavelengths. (g) Simulated full-color VR imaging result by combining RGB image channels shown in (d–f). The scale bar is 100 μm in (d–g). Figure reproduced from ref.103, Nature Publishing Group, under a Creative Commons Attribution 4.0 International License.

    Despite these advances in large-scale metalens design engineering, the fabrication of metalenses still heavily relies on high-resolution nanopatterning techniques such as electron-beam lithography, which are usually high-cost and low-throughput. Recently, Kim et al. proposed and demonstrated a cost-effective, high-throughput method for the mass production of large-aperture visible metalenses using deep-ultraviolet argon fluoride (ArF) immersion lithography and wafer-scale nanoimprint lithography104. The schematic of the mass-production procedure is shown in Fig. 6. Using this method, they fabricated hundreds of 1-cm-diameter metalenses from a 12-inch master stamp. A prototype VR device was demonstrated with a high-performance, mass-manufactured metalens capable of displaying red, green, and blue color images.

    Figure 6.Schematic of the mass production of metalenses using an ArF immersion scanner. ACL, amorphous carbon layer.

    Metasurfaces in AR displays

    Given their versatile functionality, high optical performance, and ultra-thin form factors, metasurfaces have been proposed as critical optical components in various AR display architectures. They can function as eyepieces, combiners, and more, substituting for traditional bulky optics or integrating new advanced optical functionalities, to achieve more compact and lightweight AR displays with high image quality and a large FOV. Though they cannot yet match video-rate refreshing microdisplays, metasurfaces have also been employed in some AR displays as image sources to provide holographic images105, 106. In the following sections, we will review the applications of metasurfaces in different AR architectures.

    Based on beam splitters

    As discussed before, AR displays based on beam splitters typically involve a transmissive on-axis refractive lens as an eyepiece and a beam splitter, occasionally supplemented by a partially reflective lens, as the combiner. Within such architectures, transmissive metalenses emerge as a compelling alternative to conventional refractive lenses, offering the potential for a more compact form factor and an expanded FOV107.

    In 2018, Lee et al. demonstrated an AR display featuring a 90° FOV, facilitated by a 20-mm large-area transmissive metasurface eyepiece within a beam-splitter-based architecture25. The large-area metalens, composed of polycrystalline silicon posts on a SiO2 substrate, achieves phase modulation via the geometry phase principle. The schematic of the AR display is shown in Fig. 7(a). The image light from the display panel first traverses the BS, after which its different color components are reflected by the corresponding dichroic mirrors. Upon its second encounter with the BS, the image light is reflected and subsequently modulated by the metalens, forming a magnified virtual image in the viewer's line of sight. The circular polarizer arrangement in Fig. 7(a) allows the metalens to function concurrently as a positive lens for the image light and as a transparent window for real-world light. Hence, the viewer perceives the virtual image floating on the real world. In this configuration, the dichroic mirrors are positioned at different locations to counteract the chromatic dispersion of the metalens, thereby ensuring that the different color images are formed at the same depth. The metalens was fabricated by electron-beam lithography and a nanoimprint technique, and its SEM image is depicted in Fig. 7(b). The resultant image, shown in Fig. 7(c), reveals that the real-world scene remains visible, albeit with decreased brightness due to the use of the circular polarizers, while the RGB virtual image spans a large FOV.

    Figure 7.(a) Schematic of a see-through AR display system based on a chromatic transmissive metalens. (b) SEM image of the see-through metalens and (c) full-color AR image. (d) Schematic of an AR display system employing the RGB-achromatic metalens and (e–g) AR display results. LCP, left-handed circular polarizer; RCP, right-handed circular polarizer; ML, metalens; DMs, dichroic mirrors. Figure reproduced with permission from: (a–c) ref.25, Nature Publishing Group, under a Creative Commons Attribution 4.0 International License; (d–g) ref.102, American Association for the Advancement of Science.

    As a matter of fact, the metalenses utilized in VR displays, as discussed in the previous section, could also be used as eyepieces in BS-based AR displays, given their inherent functionality. For instance, Li et al. applied the RGB-achromatic metalens, previously introduced in the context of VR displays, in a full-color AR display prototype102. Owing to the achromatic nature of the metalens itself, the optical configuration is significantly simplified. As shown in Fig. 7(d), the color image is magnified by the metalens and subsequently reflected towards the eye by the beam splitter. Figure 7(e–g) shows the AR images generated by the prototype, demonstrating that the real-world scene retains a reasonable level of brightness and that the RGB virtual images appear at the same depth, a testament to the achromatic properties of the metalens.

    Based on waveguides

    In waveguide architecture, the key functional components are the collimating lens and the in- and out-couplers. Metasurfaces, with their versatile optical functionality, could act as either a collimating lens or a coupler80, 108-111. However, most research focuses on the latter, fully leveraging the high flexibility in wavefront control112. The primary challenges lie in achieving a large FOV and ensuring large and uniform brightness at the eye box.

    In 2018, Shi et al. proposed the use of polarization-dependent metagratings to achieve wide FOV waveguide displays113. In such a system, both the in- and out-couplers are polarization-selective metagratings, achieved by spatially interleaving TE-pass and TM-pass gratings. These gratings would diffract TE and TM light from the same incident angle to different output angles. As shown in Fig. 8(a), light from the left and right halves of the FOV is encoded with TE and TM polarizations, respectively. Each polarization is guided into and out of the waveguide. In this design, a specific guided angle is mapped to two different output angles. Although the maximum FOV for each polarization is still limited by the TIR condition, the separation of the TE and TM FOVs via the polarization-selective meta-couplers significantly increases the total FOV. The theoretical calculation indicates that the proposed design can achieve a FOV 70% larger than the maximum horizontal FOV achieved in conventional diffractive grating designs.

    Figure 8.(a) Schematic of a polarization-dependent metagrating-based out-coupler. (b) Schematic of a large-FOV, full-color AR display prototype based on a single-layer metasurface optical element, (c) SEM image of an optimized full-color metasurface element, and (d) AR display result. (e) Schematic of an on-chip metasystem for AR display. (f) Schematic of the inverse-designed metagrating architecture with wavelength-demultiplexing functionality. (g) Simulated in-coupling efficiency of the metagrating at opposite ports in the visible regime. (h) AR display results. The scale bar is 5 μm in (c), and the inset in (d) is the original image. Figure reproduced with permission from: (b–d) ref.114, Nature Publishing Group, under a Creative Commons Attribution 4.0 International License; (e–h) ref.119, American Chemical Society.

    In 2022, Boo et al. proposed and demonstrated a novel large-FOV, full-color AR display prototype based on a single layer of metasurface waveguide114. The schematic of the prototype is shown in Fig. 8(b). A laser scanner projects an image with a minimal divergence angle and a small FOV. The in-coupling metasurface optical element (MOE) then diffracts the image light, guiding it along the waveguide in compliance with the TIR condition. Because of the narrow FOV, the light beam retains a nearly identical optical path inside the waveguide. The out-coupling MOE subsequently deflects and converges the light from all field angles towards the observer's eye with relatively strong optical power. Therefore, the eye observes the image with a large FOV. As the metasurface optical elements are optimized for low dispersion, the same waveguide can accommodate RGB laser light, resulting in a simplified structure. Fabrication of the MOEs involves a silicon foundry process, incorporating nitride film deposition, deep-ultraviolet lithography, and MOE nanopatterning. In this work, they employed continuous irregular structures rather than discrete structures, as illustrated in the SEM image of a fabricated MOE in Fig. 8(c). The preference for continuous structures holds significant promise in terms of development prospects: it circumvents issues associated with phase under-sampling and resonance between adjacent structures, thus leading to enhanced efficiency and broader bandwidth, which are critical factors for advancing metasurface-based AR displays115-117. Figure 8(d) shows the displayed AR image captured by a camera.

    The importance of a large and uniform eye box in waveguide AR displays cannot be overstated, as it ensures that no portion of the image is lost and that no perceptible brightness variation is observed as the viewer's eye moves. Long et al. optimized metagratings using a physics-driven neural network to achieve high-uniformity pupil expansion in a color waveguide display118. This optimization requires the out-coupling metagratings to have different efficiencies at different positions, i.e., across the multiple output channels, to ensure uniform output light. Rigorous coupled-wave analysis was used to calculate the forward and adjoint electromagnetic fields from the structural parameters. To prevent cross-talk, they used two waveguides: one for the blue and green colors, and the other for red. The former employs an achromatic metagrating.
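
    The spatial variation demanded of the out-coupling efficiency can be seen from a textbook idealization: if the guided beam strikes the out-coupler N times in a lossless guide, equal output at every bounce requires the local efficiency to rise along the propagation direction. The sketch below uses this closed form with an assumed bounce count; it is not the network-optimized profile of the cited work.

```python
# Minimal sketch: uniformity target for an exit-pupil expander. For N bounces in
# a lossless waveguide, equal output per bounce requires eta_k = 1/(N - k) for
# k = 0..N-1. The bounce count is an illustrative assumption.

n_bounces = 6
efficiencies = [1.0 / (n_bounces - k) for k in range(n_bounces)]

remaining = 1.0
for k, eta in enumerate(efficiencies, start=1):
    out = remaining * eta           # fraction of the original light sent to the eye
    remaining -= out
    print(f"bounce {k}: local efficiency {eta:6.1%}, output {out:5.1%}")
```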

    More recently, an intriguing on-chip AR waveguide display was introduced119. This design features a wavelength-selective metagrating that functions as the in-coupler, and two metasurfaces that simultaneously function as the out-couplers and the image sources. As shown in Fig. 8(e), free-space red and green laser light is coupled in by the wavelength-selective metagrating and then propagates in opposite directions within the waveguide. Each light wave then interacts with a metasurface hologram out-coupler, which deflects it back to free space to generate a holographic image. Here, the in-coupler metagrating is made of amorphous silicon (α-Si) on a planar Si3N4 waveguide, with a thick SiO2 substrate (~500 μm) underneath serving as the bottom cladding, as shown in Fig. 8(f). The calculated wavelength-demultiplexing in-coupling efficiency in the visible regime is plotted in Fig. 8(g). It indicates strong wavelength-selective contrast between the coupling efficiencies for leftward and rightward waveguide propagation at the green (550 nm) and red (650 nm) wavelengths. The experimental result is shown in Fig. 8(h), where a green apple is displayed near the left metasurface out-coupler and red cherries near the right one.

    Based on direct projection

    In the direct-projection architecture of AR systems, the most critical optical element is the see-through reflective lens. This component not only converges the reflected virtual image light but also transmits real-world light. Consequently, it combines the functionality of an eyepiece and a combiner in one device, fostering a more compact and highly integrated AR display system. Despite the significant versatility of metasurfaces, traditional metasurfaces are either transmissive or reflective. Achieving such unique optical features requires great effort in metasurface engineering. Nevertheless, in recent years, a few research groups have ventured into pioneering the development of such innovative metasurface devices. Their work promises to contribute significantly to AR displays, paving the way for more compact, lightweight, and high-performance next-generation AR displays.

    The concept of a reflective metasurface visor for near-eye displays can be traced back to 2017120. A flat metasurface based on cylindrical silicon pillars was designed to achieve an imaging effect akin to that of a freeform mirror. By adjusting the duty cycle of the unit structure, the phase profile of the metasurface could be precisely controlled. The authors achieved a near-eye display design with a 70° FOV and a good modulation transfer function (MTF) curve, as confirmed by Zemax simulation. However, in their design, the silicon-based metasurface was not optimized for high transparency to enable the see-through property. Later, in 2021, Avayu et al. designed a full-color, metasurface-based AR visor using three metallic metasurface layers121. They fabricated proof-of-concept small-scale samples with two individual metasurface layers. Preliminary test results revealed that green light can be reflected with its wavefront modulated, although the see-through property of the metasurface visor was not thoroughly investigated. Bayati et al. reported simulation work on the design of an achromatic see-through AR visor based on composite metasurfaces with a large FOV (>77° both horizontally and vertically) and good see-through quality (>70% transmission and no distortion)107.

    In 2022, Li et al. experimentally demonstrated a planar multifunctional dielectric metasurface visor that realizes a focusing effect for obliquely incident red light while maintaining a good see-through property for normally incident light122. The optical function of the metasurface visor is illustrated in Fig. 9(a). Thus, the multifunctional visor can simultaneously perform as an eyepiece for the oblique virtual image light and as a combiner that reflects the virtual image light into the viewer's eye while transmitting light from the real environment. As a result, the whole system can be substantially simplified and made more compact, as shown in Fig. 9(b). These unique optical properties are obtained through geometry phase modulation via the spatial orientation variation of high-aspect-ratio rectangular silicon nanostructures on a transparent sapphire substrate. The SEM image of the fabricated metasurface is shown in Fig. 9(e). The experimental results confirm an effective focusing effect, as shown in Fig. 9(c), and spectral measurements, shown in Fig. 9(d), indicate reasonably high transparency, consistent with the simulation. Based on the metasurface visor, the researchers further developed an AR display prototype, rendering RGB virtual letters on the real world without needing an additional beam splitter. The display result is shown in Fig. 9(f, g), where the real-world scene is observed directly through the metasurface visor. This marks the first experimental demonstration of AR images generated by a single-piece metasurface visor.
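
    The required wavefront can be sketched as an off-axis reflective lens phase: a focusing term plus a linear term that cancels the oblique incidence of the projected image light. The wavelength, visor-to-pupil distance, and incidence angle below are illustrative assumptions, not the parameters of the cited device.

```python
import numpy as np

# Minimal sketch: reflection phase profile of a metalens-visor that focuses
# obliquely incident image light onto an on-axis eye pupil. All geometry values
# are illustrative assumptions.

wavelength = 633e-9
focal = 30e-3                    # visor-to-pupil distance (m)
incidence = np.radians(30)       # oblique angle of the projected image light

def visor_phase(x, y):
    """Required reflection phase at point (x, y) on the visor, modulo 2*pi."""
    k = 2 * np.pi / wavelength
    focusing = -k * (np.sqrt(x**2 + y**2 + focal**2) - focal)   # converge to the pupil
    tilt = -k * x * np.sin(incidence)                           # cancel the oblique incidence
    return (focusing + tilt) % (2 * np.pi)

print(f"phase at (1 mm, 0): {visor_phase(1e-3, 0.0):.2f} rad")
```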

    Figure 9.(a) Schematic of the optical behavior of a see-through reflective metalens-visor. (b) Schematic of a near-eye AR display system based on the metalens-visor. (c) Measured focal spot intensity profile of the fabricated metalens at the illumination wavelength of 633 nm. (d) Simulated and measured transmittance spectra of the metalens. (e) SEM image of the fabricated metalens. The scale bar is 400 nm. (f, g) Demonstration of multi-color AR imaging. Figure reproduced with permission from ref.122, Springer Nature, under a Creative Commons Attribution 4.0 International License.

    Despite the promise of geometry-phase-based metasurface visors, they are, however, chromatic and polarization-dependent. Recently, Luo et al. proposed an RGB-achromatic, polarization-insensitive trans-reflective single-layer metalens for AR displays123. They constructed a structural phase library incorporating silicon meta-units of various shapes and dimensions, as shown in Fig. 10(a), to achieve high reflectance and cover a wide range of phase responses. From this library, an optimized set of structures was selected and positioned across the metasurface so that the RGB colors accomplish the desired phase profiles simultaneously, as shown in Fig. 10(b). This arrangement effectively facilitates achromatic focusing. Moreover, because of the center symmetry of the employed meta-units, the metalens is insensitive to the polarization state of the incident light. Figure 10(c, d) shows the simulated focal spots for red, green, and blue under two different linear polarizations of the collimated incident light. Importantly, all colors converge at the same focal point, and the focusing behavior remains consistent across the two polarization states.
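
    A minimal sketch of such library-based design is given below: at each lens position, the meta-unit whose simulated RGB phase responses best match the three target focusing profiles is selected. The random "library", the wavelengths, and the focal length are stand-ins for full-wave simulation data and are purely illustrative.

```python
import numpy as np

# Minimal sketch: library-based achromatic metalens design. A random array stands
# in for simulated (phi_R, phi_G, phi_B) responses of candidate meta-units; the
# wavelengths and focal length are illustrative assumptions.

rng = np.random.default_rng(0)
library = rng.uniform(0, 2 * np.pi, size=(500, 3))   # fake phase responses per unit

wavelengths = np.array([650e-9, 532e-9, 473e-9])
focal = 10e-3

def target_phases(r):
    """Ideal focusing phase at radius r for each wavelength, modulo 2*pi."""
    return (-2 * np.pi / wavelengths * (np.sqrt(r**2 + focal**2) - focal)) % (2 * np.pi)

def best_unit(r):
    """Index of the library element minimizing the circular phase mismatch."""
    err = np.sum(1 - np.cos(library - target_phases(r)), axis=1)
    return int(np.argmin(err))

for r in (0.0, 0.2e-3, 0.5e-3):
    print(f"r = {r * 1e3:.1f} mm -> meta-unit #{best_unit(r)}")
```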

    Figure 10.(a) Schematics of 14 kinds of center-symmetrical nanostructures used in an RGB-achromatic trans-reflective metalens. (b) Comparison of the ideal phase profile and the matched phase of the metalens at RGB wavelengths. Simulated focal intensity distributions in the XZ plane at RGB wavelengths of the achromatic metalens: (c) linearly p-polarized incidence and (d) linearly s-polarized incidence. Figure reproduced with permission from ref.123, MDPI, under an open-access Creative Commons CC BY license.

    In the abovementioned metasurfaces, localized modes predominate, wherein individual meta-units govern the wavefront shape over a broad bandwidth, thereby linking the phase properties of neighboring wavelengths closely. Nonlocal lattice modes, instead, extend over many unit cells to support high resonant diffraction efficiency at a specific wavelength, while maintaining low efficiency for most other wavelengths. The resonance properties inherent in this approach could be harnessed to develop an RGB achromatic see-through metasurface visor for AR displays.

    Malek et al. put forward a concept for multilayer and multi-perturbation nonlocal dielectric metasurface systems based on spatially varying geometry phase124. This design allows for the multiplexing of independent wave-front-modulating functionalities across multiple resonant wavelengths while maintaining transparency over the rest of the spectrum. They suggested the use of the nonlocal metasurface system as an optical see-through lens in an AR display, capable of reflecting selected narrowband wavelengths of contextual information to the viewer’s eye, while simultaneously providing an unobstructed, broadband view of the real world, as schematically shown in Fig. 11(a). The structure of the metasurface system is shown in Fig. 11(b). It involves a single-function metasurface on top of the substrate based on p2 meta-units operative at the green wavelength, and at the bottom, a dual-function metasurface based on p1 meta-units operative at the red and blue wavelengths. Both metasurfaces are composed of rectangular apertures etched into a thin film of TiO2 coated with an antireflection layer of SiO2. The simulated transmission and reflection spectra of the metasurface system are shown in Fig. 11(c), indicating high broadband transmission of real-world light and a narrowband reflection at the three chosen visible wavelengths. As such, these novel nonlocal metasurface systems may serve as a strong contender for a multifunctional, ultrathin optical element in AR displays.

    Figure 11.(a) Schematic of an AR headset with a multifunctional nonlocal metasurfaces system as an optical see-through lens. (b) Schematic of a super-period of a nonlocal metasurface system implementing three distinct phase gradients at RGB wavelengths. (c) Simulated transmission and reflection spectra of the metasurface system shown in (a). (d) Schematic of an AR eyeglasses architecture based on a metaform imager used as an optical combiner. (e) Photo of a metaform and an SEM image of a set of the fabricated nano-tokens. (f) Set of different regions of the resolution target imaged via the metaform. Figure reproduced with permission from: (a–c) ref.124, Nature Publishing Group, under a Creative Commons Attribution 4.0 International License; (d–f) ref.125, American Association for the Advancement of Science.

    The development of high-performance reflective metasurface visors has recently broadened in scope, moving beyond planar structures to explore freeform curved substrates. Nikolov et al. introduced a versatile design-to-fabrication process for creating metasurfaces on freeform substrates, termed metaforms, to combine the advantages of freeform optics and metasurfaces125. Such a metaform can be integrated into an AR display as a combiner, as sketched in Fig. 11(d). A miniature 1.5 mm by 2 mm imager based on a metaform mirror was demonstrated at a visible wavelength of 632.8 nm, as shown in Fig. 11(e). The Ag-SiO2-Ag metasurface is formed on a concave brass substrate. Employing layers of metallic materials, the freeform metasurface device effectively performs the reflective-lens function for object-space spatial frequencies of up to 10.42 lp/mm, as shown in Fig. 11(f). However, the see-through functionality needs to be further investigated, for example by adding see-through holes in the metals or optimizing the design with dielectric materials.

    Metasurface-based natural 3D near-eye displays

    Metasurfaces, as we have discussed, hold great potential for applications in VR and AR near-eye displays to improve imaging performance, enlarge the FOV, or increase compactness. However, most near-eye displays still employ stereoscopic 3D display technology, which can lead to VAC and visual fatigue. To address this issue, natural 3D display approaches have been proposed, including Maxwellian-viewing displays, holographic displays, light field displays, and multi-/vari-focal displays. Thanks to the versatility of metasurfaces, metasurface devices have been employed as important optical elements in these advanced 3D near-eye displays to help alleviate visual fatigue and provide a more natural 3D visual experience126.

    Recently, Song et al. proposed a Maxwellian-viewing near-eye display that can provide accommodation-free virtual images by using a small-aperture (360 µm × 360 µm) transparent Huygens' metasurface hologram as the display device127. As shown in Fig. 12(a, b), the metasurface, when illuminated by a laser source, generates a holographic image that is then projected to a minuscule point at the pupil position by the eyepiece. In such a pinhole-like Maxwellian-viewing display system128, the image received on the retina remains consistently sharp and clear, irrespective of any changes in the focus accommodation of the crystalline lens. Hence, depth information cannot be inferred from eye accommodation, and depth perception is solely determined by the vergence of both eyes. This accommodation-free display, therefore, does not induce depth mismatch or 3D visual fatigue. The compact dimensions of the metasurface hologram allow for a shorter optical path length, enabling the creation of a lightweight, compact wearable prototype weighing only ~50 grams, as shown in Fig. 12(c). Figure 12(d, e) displays an AR image produced by the prototype. The virtual image maintains its sharpness as the camera adjusts its focus from near to far, revealing its accommodation-free property. However, Maxwellian-viewing displays are limited by a small eye box, which leads to image loss when the eyeball moves. Pupil duplication and multiview display techniques122 based on metasurfaces have been proposed to solve this problem.

    Figure 12.(a) Schematic of a Maxwellian-viewing near-eye display using a metasurface hologram. (b) Source engine of the AR display system. (c) Compact and light wearable prototype of the Maxwellian-viewing near-eye display. Displayed AR images when the camera is focused at (d) 0.5 m and (e) 2 m, respectively. Figure reproduced with permission from ref.127, John Wiley and Sons.

    Wang et al. demonstrated a near-eye display system that combines a 5-mm-diameter metalens eyepiece with 3D computer-generated holography (CGH)129. Holography is often considered the ultimate display technology, as it can provide the complete depth information of 3D objects. The combination of a metalens and CGH offers a solution for compact, lightweight, natural 3D near-eye displays. The metalens eyepiece, composed of silicon nitride anisotropic nanofins, was fabricated with a focal length of 6 mm (NA=0.4) and a diffraction efficiency of 15.7% at 532 nm. Based on the metalens, they implemented a VR prototype that projects holographic 3D images to the viewer with an eye relief of 9 mm and an FOV of about 31°, providing full monocular focus cues without VAC and ensuring a realistic viewing experience, as shown in Fig. 13(a). Figure 13(b, c) shows the photo and the phase distribution of the metalens eyepiece, respectively. As shown in Fig. 13(d–g), a 3D virtual scene consisting of the letters "Z", "J", and "U" at different depths is rendered on the real-world scene.
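
    A quick geometric check is given below: the monocular FOV of such a system is bounded by the eyepiece aperture seen from the eye, and the 5-mm aperture with a 9-mm eye relief quoted above indeed corresponds to roughly 31°; other factors, such as the diffraction angle of the hologram, can restrict it further.

```python
import numpy as np

# Minimal check: aperture-limited FOV of a metalens eyepiece, using the 5 mm
# aperture and 9 mm eye relief quoted in the text.

aperture = 5e-3
eye_relief = 9e-3
fov = 2 * np.degrees(np.arctan(aperture / (2 * eye_relief)))
print(f"aperture-limited FOV: {fov:.0f} deg")    # ~31 deg
```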

    Figure 13. (a) Schematic of a holographic near-eye display based on a metalens eyepiece. (b) Photo of the metalens eyepiece. (c) Phase distribution of the metalens eyepiece. (d) Original layered model of the letters “ZJU”. (e–g) Reconstructed virtual images when the camera is focused on “Z”, “J”, and “U”, respectively. Figure reproduced with permission from ref. 129, MDPI, under an open-access Creative Commons CC BY license.
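    As a quick consistency check on the quoted parameters (a sketch, not the authors' calculation), the NA follows from the 5 mm aperture and 6 mm focal length, and the ~31° FOV is consistent with the angle the aperture subtends at the 9 mm eye relief, under the assumption that the eyepiece aperture is the field-limiting stop.

```python
# Consistency check of the quoted metalens-eyepiece parameters (illustrative
# assumptions: the aperture acts as both the NA-defining and FOV-limiting stop).
import math

diameter_mm = 5.0
focal_mm = 6.0
eye_relief_mm = 9.0

na = math.sin(math.atan((diameter_mm / 2) / focal_mm))
fov_deg = 2 * math.degrees(math.atan((diameter_mm / 2) / eye_relief_mm))

print(f"NA  ≈ {na:.2f}")        # ~0.38, consistent with the quoted NA = 0.4
print(f"FOV ≈ {fov_deg:.0f}°")  # ~31°, consistent with the quoted FOV
```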

    Integral imaging is a type of light field display that can provide VAC-free 3D images using a high-resolution display screen and a microlens array, as shown in Fig. 14(a). Fan et al. proposed the use of a broadband achromatic metalens array to realize an integral imaging display130. Figure 14(b–d) shows SEM images and an optical image of the metalens array. The fabricated array consists of 60 × 60 metalenses made of silicon nitride nanostructures on a silicon dioxide substrate. These achromatic metalenses, designed based on effective-refractive-index theory, each have a diameter of 14 μm and an average focal length of 81.5 μm over the 430–780 nm range. Implemented with the optical setup in Fig. 14(e), the integral imaging display system employing the achromatic metalens array can generate virtual images “3” and “D” at the same or different depths, as shown in Fig. 14(f–h). The in-focus/out-of-focus effect is clearly discernible as the camera adjusts its focus, indicating that the correct accommodation depth cue is present. Moreover, owing to the achromatic nature of the metalens array, the depths of the different color components remain consistent. While this metasurface-based display system is not specifically intended for near-eye applications, the compact nature of the metalens array opens the possibility of portable and wearable integral imaging displays for VR and AR.

    Figure 14. (a) Schematic of an integral imaging display based on a metalens array. (b) SEM image of a portion of the metalens array. (c) Optical image of a single metalens in the array. (d) SEM photo of the silicon nitride nanostructures. (e) Optical setup of the integral imaging display based on the metalens array. Reconstructed virtual 3D images illuminated by blue, green, red, and white light when virtual letters “3” and “D” are rendered at (f) the same distance and (g, h) different distances. CCD, charge-coupled device. Figure reproduced with permission from ref. 130, Nature Publishing Group, under a Creative Commons Attribution 4.0 International License.
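    In an integral imaging display, the reconstruction depth is set by the thin-lens relation between the lens focal length and the gap to the elemental image plane. The sketch below uses the quoted 81.5 μm average focal length but assumed gap values (not taken from ref. 130) to illustrate how the central depth plane shifts with that gap.

```python
# Thin-lens sketch of how integral imaging places the central depth plane (CDP):
# with an elemental-image-to-lens gap g near the focal length f, the lens
# equation 1/l = 1/f - 1/g sets the reconstruction depth l. Gap values below
# are illustrative assumptions.

def central_depth_plane_um(focal_um, gap_um):
    """Signed CDP distance from the lens array (thin-lens equation).

    Positive: real image in front of the array; negative: virtual image behind it.
    """
    inv = 1.0 / focal_um - 1.0 / gap_um
    return float("inf") if inv == 0 else 1.0 / inv

focal = 81.5  # average metalens focal length in micrometers (quoted above)
for gap in (75.0, 81.5, 90.0):  # assumed gaps around the focal length
    print(f"gap = {gap:5.1f} um -> CDP ≈ {central_depth_plane_um(focal, gap):8.1f} um")
```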

    Vari-/multi-focal displays131-133 represent another 3D display technique capable of addressing the VAC issue and providing a natural 3D effect, by reconstructing a 3D scene from 2D slices displayed at different depths. For near-eye vari-/multi-focal displays, one effective approach to rendering virtual images at different depths is to alter the focal length of the eyepiece. In such systems, the varifocal eyepiece lens is therefore the most critical electro-optical component. Various varifocal optical elements have been suggested to fulfill this function, such as liquid lenses131, liquid crystal (LC) lenses134, and deformable mirrors135. Recently, Zhu et al. proposed an electrically controllable varifocal metalens operating in the near-IR region with the assistance of an addressed nematic LC layer136. By manipulating the voltage applied to the LC, the focal length of the metalens can be adjusted. However, fabricating the large number of super-pixel cells required for this metalens remains challenging. If further optimized for operation in the visible range, such varifocal metalenses could pave the way for high-performance vari-/multi-focal near-eye displays.
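    The relation between eyepiece focal length and perceived image depth follows directly from the thin-lens equation: with the display panel fixed just inside the focal length, small changes in focal length sweep the virtual image over a large depth range. The sketch below uses assumed panel-distance and focal-length values chosen only for illustration.

```python
# Sketch of the vari-focal principle: with the display a fixed distance d in
# front of a tunable eyepiece of focal length f (d < f), the thin-lens equation
# places a virtual image at |1/(1/f - 1/d)|. Tuning f shifts that depth.

def virtual_image_depth_m(focal_m, panel_dist_m):
    """Distance of the virtual image from the eyepiece (thin lens, d < f)."""
    image = 1.0 / (1.0 / focal_m - 1.0 / panel_dist_m)  # negative -> virtual image
    return abs(image)

panel = 0.030  # display 30 mm in front of the eyepiece (assumption)
for f_mm in (30.5, 31, 32, 35):  # assumed tunable focal lengths in mm
    depth = virtual_image_depth_m(f_mm * 1e-3, panel)
    print(f"f = {f_mm:4} mm -> virtual image at ≈ {depth:.2f} m ({1/depth:.1f} D)")
```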

    Discussion

    Ultra-thin metasurface devices exhibit great potential for realizing compact and lightweight near-eye displays. In addition to providing viable alternatives to traditional bulky optical components, metasurfaces open the door to unique optical functionalities previously unattainable with conventional devices, owing to their exceptional light-modulation capabilities and high design freedom. For example, achromatic metasurface devices can compensate for chromatic aberration in an ultra-thin form, which is highly attractive for full-color near-eye displays; nanoscale meta-atoms enable large diffraction angles, which are crucial for achieving a large FOV; moreover, metasurfaces can even exhibit different optical behaviors for transmitted and reflected waves, or multiplex several functionalities. Nonetheless, to truly harness the vast capabilities of metasurfaces in near-eye displays and facilitate practical VR/AR applications, several challenges must be addressed.
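    To make the FOV argument concrete, the first-order grating equation sin θ = λ/Λ relates the local period Λ of a metasurface grating to the angle through which it can steer light; only the fine, near-wavelength periods accessible with nanoscale meta-atoms reach the large angles needed for wide-FOV combiners. The wavelength and periods in the sketch below are illustrative assumptions.

```python
# First-order grating equation at normal incidence: sin(theta) = lambda / period.
# Illustrative values; smaller (near-wavelength) periods give larger deflection.
import math

wavelength_nm = 532.0
for period_nm in (2000, 1000, 700, 600):
    s = wavelength_nm / period_nm
    if s <= 1.0:
        print(f"period = {period_nm:4d} nm -> first-order angle ≈ {math.degrees(math.asin(s)):.1f}°")
    else:
        print(f"period = {period_nm:4d} nm -> first order is evanescent")
```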

    One significant challenge is the design and fabrication of high-efficiency, large-area metasurfaces. Currently, achieving high efficiency in metasurface devices, such as achromatic metalenses or partially reflective metasurfaces, remains difficult. Low focusing efficiency reduces the contrast of virtual images and limits the practical application scenarios of AR displays. Although metalens eyepieces with relatively large 10 mm apertures have been demonstrated, providing large FOVs calls for even larger apertures, which in turn require more complex variations of meta-atom structures and pose hurdles from both design and fabrication perspectives. Mass production of metasurfaces, especially those working at visible wavelengths, requires high-throughput, high-accuracy, and large-area fabrication techniques. Extreme-ultraviolet or deep-ultraviolet lithography137 steppers are strong candidates for achieving these goals, and exploring alternative fabrication methods such as nanoimprint lithography138, self-assembly139, and laser writing140, 141 will further enrich the fabrication options.
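    A back-of-the-envelope count conveys the scale of the problem: assuming a square lattice with a ~400 nm pitch (an illustrative value, not tied to any specific device above), the number of meta-atoms grows quadratically with the aperture and quickly reaches billions.

```python
# Rough estimate of meta-atom count versus aperture size (square lattice,
# assumed ~400 nm pitch) -- one reason large metasurfaces are hard to design
# and fabricate.
PITCH_NM = 400  # assumed meta-atom pitch

for aperture_mm in (1, 10, 25, 50):
    atoms_per_side = aperture_mm * 1e6 / PITCH_NM  # mm -> nm, divided by pitch
    print(f"aperture = {aperture_mm:2d} mm -> ~{atoms_per_side**2:.1e} meta-atoms")
```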

    Another limitation is that the functionality of a metasurface optical device is fixed once it is fabricated. Tunable metasurface devices142-144 are therefore highly desirable for VR/AR displays, for example as metasurface holographic image sources43, 145-148 or varifocal metalenses149. Although some metasurface holograms have been proposed as image sources for near-eye displays, the generated images are mostly static or can only be switched among a few states. For a good VR or AR experience, the image source should provide full-color, high-resolution virtual images at video rate (>24 Hz) to render dynamic content without flickering. To date, achieving dynamically tunable metasurfaces remains a challenge, primarily because of constraints on the optical properties of the metasurface materials. The introduction of liquid crystal materials150-152 or other tunable materials153, 154 into metasurface systems shows promise for improving tunability. However, the tuning range, resolution, number of grey levels, and response time are still limited. Developing tunable metasurfaces with large tuning ranges, high precision, fast response, and pixel-level addressability is a worthwhile avenue for future research.
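    One simple constraint on LC-assisted tunable metasurfaces illustrates the tuning-range/response-time trade-off: the phase retardation of an LC layer is roughly Δφ = 2πΔn·d/λ, so a full 2π range requires a cell thickness of at least λ/Δn, while nematic relaxation times grow roughly with the square of that thickness. The birefringence value in the sketch below is an illustrative assumption.

```python
# Minimum LC cell thickness for a full 2*pi phase range: d >= lambda / delta_n.
# Thicker cells give more phase range but respond more slowly (relaxation time
# scales roughly with d^2 for nematic LCs). delta_n is an assumed value.
DELTA_N = 0.2  # assumed LC birefringence

for wavelength_nm in (450, 532, 633):
    d_um = (wavelength_nm / DELTA_N) * 1e-3  # nm -> micrometers
    print(f"lambda = {wavelength_nm} nm -> thickness for 2π ≈ {d_um:.2f} um")
```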

    From a system perspective, simply replacing some bulky refractive optics with ultra-thin metasurfaces does not necessarily guarantee a more compact form factor for the entire system. The compactness and weight of a near-eye display system are determined less by its thinnest devices than by the overall system configuration. Components that cannot be replaced by metasurfaces, as well as the air- or medium-filled spaces required between elements to realize certain optical functions, can still limit the achievable compactness and weight reduction. Moreover, to overcome the VAC problem and reduce visual fatigue, various natural 3D display techniques have been proposed for near-eye displays; however, these techniques increase system complexity and demand additional functional optical components. Effectively improving the compactness, weight, and performance of near-eye display systems while fully exploiting the advantages of metasurfaces will require close cooperation between researchers in the metasurface and display communities.

    Despite these challenges, we firmly believe that metasurfaces hold great potential for the next generation of AR/VR near-eye displays. Continued research and development efforts in addressing these challenges will contribute to unlocking the full capabilities of metasurfaces for enhanced near-eye display experiences.

    Conclusion

    In this review, we have provided an extensive overview of recent advancements in the application of metasurfaces to near-eye displays. We introduced the fundamentals of near-eye display technologies and metasurfaces, reviewed recent developments of metasurface devices employed in different VR and AR architectures, and then delved into several metasurface-based natural 3D near-eye display technologies that aim to provide a more realistic and comfortable viewing experience. Thanks to their ultra-thin form factors and exceptional flexibility in light modulation, metasurfaces offer a promising route toward compact and lightweight VR/AR headsets. However, several challenges need to be addressed to fully realize their potential in near-eye display applications. Continued research and development efforts in addressing these challenges will unlock the full capabilities of metasurfaces for enhanced near-eye display experiences, and we firmly believe metasurfaces hold great potential for the next generation of VR/AR near-eye displays.

    Acknowledgements

    We are grateful for financial support from the National Key Research and Development Program of China (2021YFB2802100) and the National Natural Science Foundation of China (62075127 and 62105203).

    The authors declare no competing financial interests.

    References

    [1] T Morimoto, T Kobayashi, H Hirata, K Otani, M Sugimoto et al. XR (extended reality: virtual reality, augmented reality, mixed reality) technology in spine medicine: status quo and quo Vadis. J Clin Med, 11, 470(2022).

    [2] R Azuma, Y Baillot, R Behringer, S Feiner, S Julier et al. Recent advances in augmented reality. IEEE Comput Graph Appl, 21, 34-47(2001).

    [3] JH Xiong, EL Hsiang, ZQ He, T Zhan, ST Wu. Augmented reality and virtual reality displays: emerging technologies and future perspectives. Light Sci Appl, 10, 216(2021).

    [4] CL Chang, K Bang, G Wetzstein, B Lee, L Gao. Toward the next-generation VR/AR optics: a review of holographic near-eye displays from a human-centric perspective. Optica, 7, 1563-1578(2020).

    [5] JH Xiong, ST Wu. Planar liquid crystal polarization optics for augmented reality and virtual reality: from fundamentals to applications. eLight, 1, 3(2021).

    [6] YZ Qian, ZY Yang, YH Huang, KH Lin, ST Wu. Directional high-efficiency nanowire LEDs with reduced angular color shift for AR and VR displays. Opto-Electron Sci, 1, 220021(2022).

    [7] XM Shen, J Gao, MS Li, CH Zhou, SS Hu et al. Toward immersive communications in 6G. Front Comput Sci, 4, 1068478(2023).

    [8] K Ahir, K Govani, R Gajera, M Shah. Application on virtual reality for enhanced education learning, military training and sports. Augment Hum Res, 5, 7(2020).

    [9] MA Livingston, LJ Rosenblum, DG Brown, GS Schmidt, SJ Julier, B Furht et al. Military applications of augmented reality. Handbook of Augmented Reality(2011).

    [10] Y Siriwardhana, P Porambage, M Liyanage, M Ylianttila. A survey on mobile augmented reality with 5G mobile edge computing: architectures, applications, and technical aspects. IEEE Commun Surv Tutor, 23, 1160-1192(2021).

    [11] HL Chi, SC Kang, XY Wang. Research trends and opportunities of augmented reality applications in architecture, engineering, and construction. Autom Constr, 33, 116-122(2013).

    [12] TP Caudell, DW Mizell. Augmented reality: an application of heads-up display technology to manual manufacturing processes(1992).

    [13] K Yin, ZQ He, JH Xiong, JY Zou, K Li et al. Virtual reality and augmented reality displays: advances and future perspectives. J Phys Photonics, 3, 022010(2021).

    [14] P Chakravarthula, YF Peng, J Kollin, H Fuchs, F Heide. Wirtinger holography for near-eye displays. ACM Trans Graph, 38, 213(2019).

    [15] S Lee, Y Jo, D Yoo, J Cho, D Lee et al. Tomographic near-eye displays. Nat Commun, 10, 2497(2019).

    [16] DWF van Krevelen, R Poelman. A survey of augmented reality technologies, applications and limitations. Int J Virtual Reality, 9, 1-20(2010).

    [18] Q Huang, HJ Caulfield. Waveguide holography and its applications. Proc SPIE, 1461, 303-312(1991).

    [19] H Hoshi, N Taniguchi, H Morishima, T Akiyama, S Yamazaki et al. Off-axial HMD optical system consisting of aspherical surfaces without rotational symmetry. Proc SPIE, 2653, 234-242(1996).

    [25] GY Lee, JY Hong, S Hwang, S Moon, H Kang et al. Metasurface eyepiece for augmented reality. Nat Commun, 9, 4562(2018).

    [27] JP McIntire, PR Havig, EE Geiselman. Stereoscopic 3D displays and human performance: a comprehensive review. Displays, 35, 18-26(2014).

    [28] J Geng. Three-dimensional display technologies. Adv Opt Photonics, 5, 456-535(2013).

    [29] Y Liu, X Guo, YB Fan, XF Meng, JH Wang. Subjective assessment on visual fatigue versus stereoscopic disparities. J Soc Inf Disp, 29, 497-504(2021).

    [30] J Hu, S Bandyopadhyay, YH Liu, LY Shao. A review on metasurface: from principle to smart metadevices. Front Phys, 8, 586087(2021).

    [31] J Hu, F Safir, K Chang, S Dagli, HB Balch et al. Rapid genetic screening with high quality factor metasurfaces. Nat Commun, 14, 4486(2023).

    [32] N Meinzer, WL Barnes, IR Hooper. Plasmonic meta-atoms and metasurfaces. Nat Photonics, 8, 889-898(2014).

    [33] S Krasikov, A Tranter, A Bogdanov, Y Kivshar. Intelligent metaphotonics empowered by machine learning. Opto-Electron Adv, 5, 210147(2022).

    [34] DK Nikolov, F Cheng, L Ding, A Bauer, AN Vamivakas et al. See-through reflective metasurface diffraction grating. Opt Mater Express, 9, 4070-4080(2019).

    [35] D Sell, JJ Yang, S Doshay, R Yang, JA Fan. Large-angle, multifunctional metagratings based on freeform multimode geometries. Nano Lett, 17, 3752-3757(2017).

    [36] SM Wang, PC Wu, VC Su, YC Lai, MK Chen et al. A broadband achromatic metalens in the visible. Nat Nanotechnol, 13, 227-232(2018).

    [37] MY Pan, YF Fu, MJ Zheng, H Chen, YJ Zang et al. Dielectric metalens for miniaturized imaging systems: progress and challenges. Light Sci Appl, 11, 195(2022).

    [38] H Gao, XH Fan, YX Wang, YC Liu, XG Wang et al. Multi-foci metalens for spectra and polarization ellipticity recognition and reconstruction. Opto-Electron Sci, 2, 220026(2023).

    [39] WW Wan, J Gao, XD Yang. Metasurface holograms for holographic imaging. Adv Opt Mater, 5, 1700541(2017).

    [40] RZ Zhao, B Sain, QS Wei, CC Tang, XW Li et al. Multichannel vectorial holographic display and encryption. Light Sci Appl, 7, 95(2018).

    [41] X Li, LW Chen, Y Li, XH Zhang, MB Pu et al. Multicolor 3D meta-holography by broadband plasmonic modulation. Sci Adv, 2, e1601102(2016).

    [42] WJ Meng, YL Hua, K Cheng, BL Li, TT Liu et al. 100 Hertz frame-rate switching three-dimensional orbital angular momentum multiplexing holography via cross convolution. Opto-Electron Sci, 1, 220004(2022).

    [43] X Li, QM Chen, X Zhang, RZ Zhao, SM Xiao et al. Time-sequential color code division multiplexing holographic display with metasurface. Opto-Electron Adv, 6, 220060(2023).

    [44] WJ Joo, J Kyoung, M Esfandyarpour, SH Lee, H Koo et al. Metasurface-driven OLED displays beyond 10,000 pixels per inch. Science, 370, 459-463(2020).

    [45] JX Li, P Yu, S Zhang, N Liu. Electrically-controlled digital metasurface device for light projection displays. Nat Commun, 11, 3574(2020).

    [46] Z Wang, HR Zhang, HT Zhao, TJ Cui, LL Li. Intelligent electromagnetic metasurface camera: system design and experimental results. Nanophotonics, 11, 2011-2024(2022).

    [47] ZC Shen, F Zhao, CQ Jin, S Wang, LC Cao et al. Monocular metasurface camera for passive single-shot 4D imaging. Nat Commun, 14, 1035(2023).

    [48] E Arbabi, A Arbabi, SM Kamali, Y Horie, M Faraji-Dana et al. MEMS-tunable dielectric metasurface lens. Nat Commun, 9, 812(2018).

    [49] H Kwon, E Arbabi, SM Kamali, M Faraji-Dana, A Faraon. Single-shot quantitative phase gradient microscopy using a system of multifunctional metasurfaces. Nat Photonics, 14, 109-114(2020).

    [50] ZY Liu, DY Wang, H Gao, MX Li, HX Zhou et al. Metasurface-enabled augmented reality display: a review. Adv Photonics, 5, 034001(2023).

    [51] WW Liu, ZC Li, H Cheng, SQ Chen. Dielectric resonance-based optical metasurfaces: from fundamentals to applications. iScience, 23, 101868(2020).

    [52] M Khorasaninejad, KB Crozier. Silicon nanofin grating as a miniature chirality-distinguishing beam-splitter. Nat Commun, 5, 5386(2014).

    [53] WJ Luo, SL Sun, HX Xu, Q He, L Zhou. Transmissive ultrathin Pancharatnam-Berry metasurfaces with nearly 100% efficiency. Phys Rev Appl, 7, 044033(2017).

    [54] JC Zhang, HW Liang, Y Long, YL Zhou, Q Sun et al. Metalenses with polarization-insensitive adaptive nano-antennas. Laser Photonics Rev, 16, 2200268(2022).

    [55] SY Li, CW Hsu. Transmission efficiency limit for nonlocal metalenses. Laser Photonics Rev, 17, 2300201(2023).

    [56] XJ Ni, NK Emani, AV Kildishev, A Boltasseva, VM Shalaev. Broadband light bending with plasmonic nanoantennas. Science, 335, 427-427(2012).

    [57] SL Sun, KY Yang, CM Wang, TK Juan, WT Chen et al. High-efficiency broadband anomalous reflection by gradient meta-surfaces. Nano Lett, 12, 6223-6229(2012).

    [58] NF Yu, P Genevet, MA Kats, F Aieta, JP Tetienne et al. Light propagation with phase discontinuities: generalized laws of reflection and refraction. Science, 334, 333-337(2011).

    [59] QL Yang, S Kruk, YH Xu, QW Wang, YK Srivastava et al. Mie-resonant membrane Huygens' metasurfaces. Adv Funct Mater, 30, 1906851(2020).

    [60] SM Wang, PC Wu, VC Su, YC Lai, CH Chu et al. Broadband achromatic optical metasurface devices. Nat Commun, 8, 187(2017).

    [61] J Yao, R Lin, MK Chen, DP Tsai. Integrated-resonant metadevices: a review. Adv Photonics, 5, 024001(2023).

    [62] P Lalanne, S Astilean, P Chavel, E Cambril, H Launois. Blazed binary subwavelength gratings with efficiencies larger than those of conventional échelette gratings. Opt Lett, 23, 1081-1083(1998).

    [63] WB Feng, JC Zhang, QF Wu, A Martins, Q Sun et al. RGB achromatic metalens doublet for digital imaging. Nano Lett, 22, 3969-3975(2022).

    [64] KH Shen, Y Duan, P Ju, ZJ Xu, X Chen et al. On-chip optical levitation with a metalens in vacuum. Optica, 8, 1359-1362(2021).

    [65] M Khorasaninejad, WT Chen, RC Devlin, J Oh, AY Zhu et al. Metalenses at visible wavelengths: diffraction-limited focusing and subwavelength resolution imaging. Science, 352, 1190-1194(2016).

    [66] HW Liang, QL Lin, XS Xie, Q Sun, Y Wang et al. Ultrahigh numerical aperture metalens at visible wavelengths. Nano Lett, 18, 4460-4466(2018).

    [67] JH Song, J van de Groep, SJ Kim, ML Brongersma. Non-local metasurfaces for spectrally decoupled wavefront manipulation and eye tracking. Nat Nanotechnol, 16, 1224-1230(2021).

    [68] JJ Yang, JA Fan. Analysis of material selection on dielectric metasurface performance. Opt Express, 25, 23899-23909(2017).

    [69] JQ Jiang, MK Chen, JA Fan. Deep neural networks for the evaluation and design of photonic devices. Nat Rev Mater, 6, 679-700(2021).

    [70] C Choi, T Choi, JG Yun, C Yoo, B Lee. Two-dimensional angular bandwidth broadening of metasurface grating. Adv Photonics Res, 3, 2200158(2022).

    [71] J Goodsell, P Xiong, DK Nikolov, AN Vamivakas, JP Rolland. Metagrating meets the geometry-based efficiency limit for AR waveguide in-couplers. Opt Express, 31, 4599-4614(2023).

    [72] A Arbabi, Y Horie, M Bagheri, A Faraon. Dielectric metasurfaces for complete control of phase and polarization with subwavelength spatial resolution and high transmission. Nat Nanotechnol, 10, 937-943(2015).

    [73] C Zeng, H Lu, D Mao, YQ Du, H Hua et al. Graphene-empowered dynamic metasurfaces and metadevices. Opto-Electron Adv, 5, 200098(2022).

    [74] Y Long, JC Zhang, ZH Liu, WB Feng, SM Guo et al. Metalens-based stereoscopic microscope. Photonics Res, 10, 1501-1508(2022).

    [75] B Groever, NA Rubin, JPB Mueller, RC Devlin, F Capasso. High-efficiency chiral meta-lens. Sci Rep, 8, 7240(2018).

    [76] YJ Bao, QL Lin, RB Su, ZK Zhou, JD Song et al. On-demand spin-state manipulation of single-photon emission from quantum dot integrated with metasurface. Sci Adv, 6, eaba8761(2020).

    [77] YX Zhang, MB Pu, JJ Jin, XJ Lu, YH Guo et al. Crosstalk-free achromatic full Stokes imaging polarimetry metasurface enabled by polarization-dependent phase optimization. Opto-Electron Adv, 5, 220058(2022).

    [78] E Arbabi, A Arbabi, S Kamali et al. Multiwavelength metasurfaces through spatial multiplexing. Sci Rep, 6, 32803(2016).

    [79] YY Shi, CW Wan, CJ Dai, S Wan, Y Liu et al. On-chip meta-optics for semi-transparent screen display in sync with AR projection. Optica, 9, 670-676(2022).

    [80] ZY Liu, C Zhang, WQ Zhu, ZH Huang, HJ Lezec et al. Compact stereo waveguide display based on a unidirectional polarization-multiplexed metagrating in-coupler. ACS Photonics, 8, 1112-1119(2021).

    [81] BA Narasimhan. Ultra-compact pancake optics based on ThinEyes® super-resolution technology for virtual reality headsets. Proc SPIE, 10676, 106761G(2018).

    [82] TL Wong, ZS Yun, G Ambur, J Etter. Folded optics with birefringent reflective polarizers. Proc SPIE, 10335, 103350E(2017).

    [83] EL Hsiang, ZY Yang, Q Yang, PC Lai, CL Lin et al. AR/VR light engines: perspectives and challenges. Adv Opt Photonics, 14, 783-861(2022).

    [84] T Zhan, K Yin, JH Xiong, ZQ He, ST Wu. Augmented reality and virtual reality displays: perspectives and challenges. iScience, 23, 101397(2020).

    [85] YJ Wang, YH Lin. Liquid crystal technology for vergence-accommodation conflicts in augmented reality and virtual reality systems: a review. Liq Cryst Rev, 9, 35-64(2021).

    [86] H Hua. Enabling focus cues in head-mounted displays. Proc IEEE, 105, 805-824(2017).

    [87] SX Liu, Y Li, YK Su. Multiplane displays based on liquid crystals for AR applications. J Soc Inf Disp, 28, 224-240(2020).

    [88] S Lee, C Jang, S Moon, J Cho, B Lee. Additive light field displays: realization of augmented reality with holographic optical elements. ACM Trans Graph, 35, 60(2016).

    [89] HL Zhang, H Deng, JJ Li, MY He, DH Li et al. Integral imaging-based 2D/3D convertible display system by using holographic optical element and polymer dispersed liquid crystal. Opt Lett, 44, 387-390(2019).

    [90] H Hua, B Javidi. A 3D integral imaging optical see-through head-mounted display. Opt Express, 22, 13484-13491(2014).

    [91] L Shi, BC Li, C Kim, P Kellnhofer, W Matusik. Towards real-time photorealistic 3D holography with deep neural networks. Nature, 591, 234-239(2021).

    [92] JH Xiong, K Yin, K Li, ST Wu. Holographic optical elements for augmented reality: principles, present status, and future perspectives. Adv Photonics Res, 2, 2000049(2021).

    [93] SX Liu, Y Li, PC Zhou, QM Chen, YK Su. Reverse-mode PSLC multi-plane optical see-through display for AR applications. Opt Express, 26, 3394-3403(2018).

    [94] D Smalley, E Nygaard, K Squire et al. A photophoretic-trap volumetric display. Nature, 553, 486-490(2018).

    [95] J Carmigniani, B Furht, M Anisetti, P Ceravolo, E Damiani et al. Augmented reality technologies, systems and applications. Multimed Tools Appl, 51, 341-377(2011).

    [96] W Cui, L Gao. Optical mapping near-eye three-dimensional display with correct focus cues. Opt Lett, 42, 2475-2478(2017).

    [97] W Cui, L Gao. All-passive transformable optical mapping near-eye display. Sci Rep, 9, 6064(2019).

    [98] YH Lee, FL Peng, ST Wu. Fast-response switchable lens for 3D and wearable displays. Opt Express, 24, 1668-1675(2016).

    [99] XD Hu, H Hua. Design and assessment of a depth-fused multi-focal-plane display prototype. J Disp Technol, 10, 308-316(2014).

    [100] JH Xiong, GJ Tan, T Zhan, ST Wu. Breaking the field-of-view limit in augmented reality with a scanning waveguide display. OSA Contin, 3, 2730-2740(2020).

    [101] CC Wu, KT Shih, JW Huang, HH Chen. A novel birdbath eyepiece for light field AR glasses. Proc SPIE, 12449, 124490N(2023).

    [102] ZY Li, P Lin, YW Huang, JS Park, WT Chen et al. Meta-optics achieves RGB-achromatic focusing for virtual reality. Sci Adv, 7, eabe4458(2021).

    [103] ZY Li, R Pestourie, JS Park, YW Huang, SG Johnson et al. Inverse design enables large-scale high-performance meta-optics reshaping virtual reality. Nat Commun, 13, 2409(2022).

    [104] J Kim, J Seong, W Kim, GY Lee, S Kim et al. Scalable manufacturing of high-index atomic layer-polymer hybrid metasurfaces for metaphotonics in the visible. Nat Mater, 22, 474-481(2023).

    [105] YY Shi, CW Wan, CJ Dai, ZJ Wang, S Wan et al. Augmented reality enabled by on-chip meta-holography multiplexing. Laser Photonics Rev, 16, 2100638(2022).

    [106] RZ Zhao, LL Huang, YT Wang. Recent advances in multi-dimensional metasurfaces holographic technologies. PhotoniX, 1, 20(2020).

    [107] E Bayati, A Wolfram, S Colburn, LC Huang, A Majumdar. Design of achromatic augmented reality visors based on composite metasurfaces. Appl Opt, 60, 844-850(2021).

    [108] Y Meng, ZT Liu, ZW Xie, RD Wang, TC Qi et al. Versatile on-chip light coupling and (de)multiplexing from arbitrary polarizations to controlled waveguide modes using an integrated dielectric metasurface. Photonics Res, 8, 564-576(2020).

    [109] WQ Chen, DS Zhang, SY Long, ZZ Liu, JJ Xiao. Nearly dispersionless multicolor metasurface beam deflector for near eye display designed by a physics-driven deep neural network. Appl Opt, 60, 3947-3953(2021).

    [110] JS Xiao, J Liu, J Han, YT Wang. Design of achromatic surface microstructure for near-eye display with diffractive waveguide. Opt Commun, 452, 411-416(2019).

    [111] R Ditcovski, O Avayu, T Ellenbogen. Full-color optical combiner based on multilayered metasurface design. Proc SPIE, 10942, 109420S(2019).

    [112] J Tang, S Wan, YY Shi, CW Wan, ZJ Wang et al. Dynamic augmented reality display by layer-folded metasurface via electrical-driven liquid crystal. Adv Opt Mater, 10, 2200418(2022).

    [113] ZJ Shi, WT Chen, F Capasso. Wide field-of-view waveguide displays enabled by polarization-dependent metagratings. Proc SPIE, 10676, 1067615(2018).

    [114] H Boo, YS Lee, HB Yang, B Matthews, TG Lee et al. Metasurface wavefront control for high-performance user-natural augmented reality waveguide glasses. Sci Rep, 12, 5832(2022).

    [115] XG Luo, F Zhang, MB Pu, MF Xu. Catenary optics: a perspective of applications and challenges. J Phys Condens Matter, 34, 381501(2022).

    [116] MB Pu, X Li, XL Ma, YQ Wang, ZY Zhao et al. Catenary optics for achromatic generation of perfect optical angular momentum. Sci Adv, 1, e1500396(2015).

    [117] XH Zhang, GF Liang, DQ Feng, L Zhou, YC Guo. Ultra-broadband metasurface holography via quasi-continuous nano-slits. J Phys D Appl Phys, 53, 104002(2020).

    [118] SY Long, NY Li, ZZ Liu, WQ Chen, DS Zhang et al. Color near-eye display with high exit-pupil uniformity based on optimized meta-grating. Opt Eng, 61, 065101(2022).

    [119] Y Liu, YY Shi, ZJ Wang, ZY Li. On-chip integrated metasystem with inverse-design wavelength demultiplexing for augmented reality. ACS Photonics, 10, 1268-1274(2023).

    [120] CC Hong, S Colburn, A Majumdar. Flat metaform near-eye visor. Appl Opt, 56, 8822-8827(2017).

    [121] O Avayu, R Ditcovski, T Ellenbogen. Ultrathin full color visor with large field of view based on multilayered metasurface design. Proc SPIE, 10676, 1067612(2018).

    [122] Y Li, SY Chen, HW Liang, XY Ren, LC Luo et al. Ultracompact multifunctional metalens visor for augmented reality displays. PhotoniX, 3, 29(2022).

    [123] LC Luo, ZY Wang, JT Li, HW Liang. Wide-field-of-view trans-reflective RGB-achromatic metalens for augmented reality. Photonics, 10, 590(2023).

    [124] SC Malek, AC Overvig, A Alù, NF Yu. Multifunctional resonant wavefront-shaping meta-optics based on multilayer and multi-perturbation nonlocal metasurfaces. Light Sci Appl, 11, 246(2022).

    [125] DK Nikolov, A Bauer, F Cheng, H Kato, AN Vamivakas et al. Metaform optics: bridging nanophotonics and freeform optics. Sci Adv, 7, eabe5112(2021).

    [126] WT Song, XN Liang, SQ Li, DD Li, R Paniagua-Domínguez et al. Large-scale Huygens' metasurfaces for holographic 3D near-eye displays. Laser Photonics Rev, 15, 2000538(2021).

    [127] WT Song, XA Liang, SQ Li, P Moitra, XW Xu et al. Retinal projection near-eye displays with Huygens' metasurfaces. Adv Opt Mater, 11, 2202348(2023).

    [128] G Westheimer. The Maxwellian view. Vision Res, 6, 669-682(1966).

    [129] C Wang, ZQ Yu, QB Zhang, Y Sun, CN Tao et al. Metalens eyepiece for 3D holographic near-eye display. Nanomaterials, 11, 1920(2021).

    [130] ZB Fan, HY Qiu, HL Zhang, XN Pang, LD Zhou et al. A broadband achromatic metalens array for integral imaging in the visible. Light Sci Appl, 8, 67(2019).

    [131] S Liu, H Hua, DW Cheng. A novel prototype for an optical see-through head-mounted display with addressable focus cues. IEEE Trans Vis Comput Graph, 16, 381-393(2010).

    [132] SX Liu, Y Li, PC Zhou, X Li, N Rong et al. A multi-plane optical see-through head mounted display design for augmented reality applications. J Soc Inf Disp, 24, 246-251(2016).

    [133] T Zhan, JH Xiong, JY Zou, ST Wu. Multifocal displays: review and prospect. PhotoniX, 1, 10(2020).

    [134] SY Chen, JH Lin, ZQ He, Y Li, YK Su et al. Planar Alvarez tunable lens based on polymetric liquid crystal Pancharatnam-Berry optical elements. Opt Express, 30, 34655-34664(2022).

    [135] XD Hu, H Hua. High-resolution optical see-through multi-focal-plane head-mounted display using freeform optics. Opt. Express, 22, 13896-13903(2014).

    [136] SQ Zhu, Q Jiang, YT Wang, LL Huang. Nonmechanical varifocal metalens using nematic liquid crystal. Nanophotonics, 12, 1169-1176(2023).

    [137] JS Park, SY Zhang, AL She, WT Chen, P Lin et al. All-glass, large metalens at visible wavelength using deep-ultraviolet projection lithography. Nano Lett, 19, 8673-8682(2019).

    [138] RQ Luo, XH Luo, YH Zhao, Q Song, X Yang et al. Research on metasurface holographic imaging based on nanoimprint lithography. Proc SPIE, 12307, 123070C(2022).

    [139] JY Jiang, Y Cao, X Zhou, HX Xu, KX Ning et al. Colloidal self-assembly based ultrathin metasurface for perfect absorption across the entire visible spectrum. Nanophotonics, 12, 1581-1590(2023).

    [140] ZY Wang, B Hu, JY Liu, GC Wang, WG Liu et al. 4f-Less terahertz optical pattern recognition enabled by complex amplitude modulating metasurface through laser direct writing. Adv Opt Mater, 11, 2300575(2023).

    [141] XJ Xiao, YW Zhao, X Ye, C Chen, XM Lu et al. Large-scale achromatic flat lens by light frequency-domain coherence optimization. Light Sci Appl, 11, 323(2022).

    [142] ZN Wu, Y Ra'di, A Grbic. Tunable metasurfaces: a polarization rotator design. Phys Rev X, 9, 011036(2019).

    [143] JY Yang, S Gurung, S Bej, PN Ni, HWH Lee. Active optical metasurfaces: comprehensive review on physics, mechanisms, and prospective applications. Rep Prog Phys, 85, 036101(2022).

    [144] AH Dorrah, F Capasso. Tunable structured light with flat optics. Science, 376, eabi6860(2022).

    [145] JPB Mueller, NA Rubin, RC Devlin, B Groever, F Capasso. Metasurface polarization optics: independent phase control of arbitrary orthogonal states of polarization. Phys Rev Lett, 118, 113901(2017).

    [146] B Xiong, Y Liu, YH Xu, L Deng, CW Chen et al. Breaking the limitation of polarization multiplexing in optical metasurfaces with engineered noise. Science, 379, 294-299(2023).

    [147] F Ding, BD Chang, QS Wei, LL Huang, XW Guan et al. Versatile polarization generation and manipulation using dielectric metasurfaces. Laser Photonics Rev, 14, 2000116(2020).

    [148] H Gao, XH Fan, W Xiong, MH Hong. Recent advances in optical dynamic meta-holography. Opto-Electron Adv, 4, 210030(2021).

    [149] T Badloe, I Kim, Y Kim, J Kim, J Rho. Electrically tunable bifocal metalens with diffraction-limited focusing and imaging at visible wavelengths. Adv Sci, 8, 2102646(2021).

    [150] M Sharma, N Hendler, T Ellenbogen. Electrically switchable color tags based on active liquid-crystal plasmonic metasurface platform. Adv Opt Mater, 8, 1901182(2020).

    [151] ZX Shen, SH Zhou, XN Li, SJ Ge, P Chen et al. Liquid crystal integrated metalens with tunable chromatic aberration. Adv Photonics, 2, 036002(2020).

    [152] SQ Li, XW Xu, R Maruthiyodan Veetil, V Valuckas, R Paniagua-Domínguez et al. Phase-only transmissive spatial light modulator based on tunable dielectric metasurface. Science, 364, 1087-1090(2019).

    [153] E Mikheeva, C Kyrou, F Bentata, S Khadir, S Cueff et al. Space and time modulations of light with metasurfaces: recent progress and future prospects. ACS Photonics, 9, 1458-1482(2022).

    [154] F Ding, YQ Yang, SI Bozhevolnyi. Dynamic metasurfaces using phase-change chalcogenides. Adv Opt Mater, 7, 1801709(2019).
