Augmented reality (AR) enhances human perception by overlaying digital content, such as images, sounds, or 3D models, onto the real world, enabling interactive experiences that respond to the environment in real time. Key features of AR include real-time environmental interaction, intuitive user interfaces based on touch, voice, or gesture controls, and immersive visuals that adapt to dynamic contexts. AR applications span diverse industries: retail, where customers can virtually try on products; healthcare, where it supports real-time surgical guidance; education, where it offers interactive learning tools; and entertainment1. Beyond these, AR is transforming navigation systems, operational workflows, and collaborative engineering, solidifying its role as a cross-industry technology.
Full-color AR plays a pivotal role within the broader AR technology landscape, significantly enhancing the user experience by delivering vibrant, lifelike visuals in digital overlays2. The perceptual realism and immersion achieved with full-color AR are crucial for applications requiring accurate color representation, such as medical visualization, architectural design, and immersive entertainment. Full-color AR systems not only make AR applications more engaging but also enable a more seamless integration of the virtual and real worlds.
The performance of full-color AR systems is governed by several critical factors, including chromatic aberration, field of view (FOV), and ghost imaging. Achieving high-performance full-color AR requires a holistic approach that co-optimizes dispersion, FOV, and ghost imaging as interdependent variables, rather than addressing them in isolation. Commercial products must additionally navigate the inherent trade-offs among optical efficiency, form factor, and manufacturability. Designing a high-performance full-color AR product is therefore a complex challenge that demands lengthy optimization and iterative refinement.
Full-color AR glasses are currently among the most widely adopted products in the AR market and are typically designed around diffractive waveguide combiners3,4. Industry-leading devices such as Microsoft HoloLens 2 and Magic Leap 2 leverage multi-layered waveguide architectures to spectrally separate and recombine the RGB components, complemented by symmetric grating designs to reduce chromatic dispersion. However, multilayer configurations impose tight alignment tolerances and increase waveguide thickness, conflicting with consumer expectations of slim, lightweight form factors. Manufacturing challenges, particularly the sub-micron precision required to fabricate the diffractive optical elements, further constrain scalability and escalate costs. In addition, waveguide misalignment and the displacement between different wavelengths result in color nonuniformity and ghost imaging3,4. Consequently, single-waveguide configurations, such as those from Dispelix5 and emerging technologies6, are gaining increasing preference.
Recently, Zhongtao Tian et al. introduced a novel approach for in-coupling and out-coupling RGB light in diffractive waveguide combiners using an inverse-designed metasurface (Fig. 1a), as detailed in their article published in Light: Science & Applications7. In this work, they correct chromatic aberration across a FOV approaching 45° within a simplified framework. Complex multicolored scenes, such as a parrot (Fig. 1b), a butterfly, and an astronaut, are vividly reproduced with excellent color accuracy and uniformity. The nearly identical coupling efficiencies for the RGB colors give the design superior white balance, as evidenced in Fig. 1c by the floating virtual image of the white SUSTech logo captured in their experiments. By using a single waveguide instead of a multi-layered stack, the approach enables cost-effective, straightforward fabrication and addresses the demand for ultra-compact, lightweight devices. This design holds great promise for a more comfortable, user-friendly wearing experience in full-color AR glasses.
Fig. 1 a Schematic view of the AR display setup. The AR system comprises three components: an input metasurface coupler (IMC), an output metasurface coupler (OMC), and a high-refractive-index waveguide. b AR-displayed image showing a parrot as digitally overlaid content and two dolls within a real-world scene. c AR-displayed image showing the SUSTech logo as digitally overlaid content and two dolls within a real-world scene.
The work7 achieves chromatic aberration correction by ensuring that light of all wavelengths satisfies the condition $m\lambda = \mathrm{const.}$ at both the input and output grating couplers, which share the same period. Here, $m$ is the diffraction order and $\lambda$ is the wavelength. During coupling, different diffraction orders are assigned to the RGB wavelengths so that this condition is satisfied, while identical diffraction orders are used for light of the same color at the input and output couplers. As a result, the RGB colors enter and exit the waveguide at the same angle, and the spectral spread inherent to grating diffraction, namely chromatic dispersion, is eliminated for the light that is successfully coupled. This removes the need for a separate waveguide for each wavelength in the achromatic system. Since the RGB colors are not spatially separated and exhibit nearly identical diffraction efficiencies, the displayed image accurately reproduces the true colors of objects. Furthermore, because all wavelengths follow the same propagation path, there is no displacement difference between the RGB colors, which maintains color uniformity across the eyebox. This achromatic method remains effective over a wide field of view, since the RGB colors obey the same coupling principle regardless of their incidence angles.
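To make the coupling condition concrete, the short sketch below evaluates the grating equation for three hypothetical wavelength and order pairs whose products $m\lambda$ are nearly equal. The wavelengths, diffraction orders, grating period, and waveguide index are illustrative assumptions and are not taken from the paper; the point is only that when $m\lambda$ is held nearly constant, the in-waveguide diffraction angle is nearly the same for red, green, and blue.

```python
import numpy as np

# Minimal sketch of the achromatic coupling condition m*lambda ~ const.
# All parameters below are illustrative assumptions, not values from the paper.
n_wg = 2.0                    # refractive index of the high-index waveguide (assumed)
period = 3.2e-6               # shared period of the input/output couplers, metres (assumed)
theta_in = np.deg2rad(10.0)   # a sample field angle incident from air

# (wavelength in metres, diffraction order) chosen so that m*lambda is ~constant
colors = {"red": (633e-9, 5), "green": (532e-9, 6), "blue": (455e-9, 7)}

for name, (lam, m) in colors.items():
    # Grating equation for coupling into the waveguide:
    #   n_wg * sin(theta_d) = sin(theta_in) + m * lambda / period
    sin_d = (np.sin(theta_in) + m * lam / period) / n_wg
    theta_d = np.rad2deg(np.arcsin(sin_d))
    print(f"{name:5s}: m*lambda = {m * lam * 1e9:7.1f} nm, "
          f"in-waveguide angle = {theta_d:6.2f} deg")
```

Under these assumed parameters, the three colors diffract to within a few tenths of a degree of one another and well above the total-internal-reflection critical angle (about 30° for an index of 2.0), so they would propagate along essentially the same path inside the waveguide.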
The advantages demonstrated by this new approach7 are compelling for designing compact and lightweight achromatic AR glasses. However, several key challenges remain before the concept can be applied to practical full-color AR products. First, ghost imaging caused by unwanted diffraction orders during the coupling process significantly degrades image quality, particularly in wide-FOV applications; methods to block or eliminate these parasitic orders are essential for improving image clarity. Second, the coupling efficiency of this scheme remains suboptimal, primarily because it relies on high-order diffraction to couple the RGB colors. Higher coupling efficiency is crucial for contrast and clarity, particularly under bright ambient lighting. Although high-power laser sources can compensate for low coupling efficiency, they increase power consumption and cause significant device heating, posing additional challenges for system design. Finally, the scheme is polarization dependent, which limits the choice of illumination sources; a possible remedy is to discretize the grating bars into unit cells. We anticipate that future research will address these challenges, allowing this innovative idea to play a significant role in AR technology.
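As an illustration of the first challenge, the sketch below reuses the assumed grating parameters from the previous example and lists the diffraction orders excited by a single hypothetical green wavelength. None of these numbers come from the paper; they only show why parasitic orders tend to accompany a high-order coupling scheme.

```python
import numpy as np

# Illustrative sketch of parasitic (ghost) orders, reusing the assumed grating
# parameters from the previous example; all values are hypothetical.
n_wg, period = 2.0, 3.2e-6        # waveguide index and grating period (assumed)
lam = 532e-9                      # a single green wavelength, as an example
theta_in = np.deg2rad(10.0)       # sample field angle incident from air
critical = np.rad2deg(np.arcsin(1.0 / n_wg))   # TIR critical angle (~30 deg)

for m in range(4, 9):             # orders around the intended m = 6
    sin_d = (np.sin(theta_in) + m * lam / period) / n_wg
    if abs(sin_d) > 1.0:          # evanescent order, does not propagate
        continue
    theta_d = np.rad2deg(np.arcsin(sin_d))
    if m == 6:
        tag = "intended order"
    elif theta_d > critical:
        tag = "trapped -> potential ghost"
    else:
        tag = "leaks out of the waveguide"
    print(f"m = {m}: angle = {theta_d:6.2f} deg  ({tag})")
```

Under these assumptions, several orders neighboring the intended one are also trapped by total internal reflection and could in principle out-couple toward the eye, illustrating why parasitic orders need to be suppressed or blocked to avoid ghost images.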
New approaches to chromatic aberration and waveguide optimization for full-color augmented reality systems
- Light: Advanced Manufacturing, Article number: (2025)
- Received: 12 February 2025
- Revised: 18 April 2025
- Accepted: 18 April 2025
- Published online: 09 October 2025
doi: https://doi.org/10.37188/lam.2025.066
Abstract: A novel approach is presented for correcting chromatic aberration in full-color AR using an inverse-designed metasurface, enabling improved color accuracy and uniformity in diffractive waveguide combiners, with potential for more compact and efficient AR glasses.