Configurable Holography: Towards Display and Scene Adaptation

24 March 2024
Yicheng Zhan
Liang Shi
Wojciech Matusik
Qi Sun
K. Akşit
Abstract

Emerging learned holography approaches have enabled faster, high-quality hologram synthesis, setting a new milestone toward practical holographic displays. However, these learned models require training a dedicated model for each set of display-scene parameters. To address this shortcoming, our work introduces a highly configurable learned model structure that synthesizes 3D holograms interactively while supporting diverse display-scene parameters. Our family of models relying on this structure can be conditioned continuously on varying novel scene parameters, including input images, propagation distances, volume depths, and peak brightnesses, and on novel display parameters of pixel pitches and wavelengths. Uniquely, our findings unearth a correlation between depth estimation and hologram synthesis tasks in the learning domain, leading to a learned model that unlocks accurate 3D hologram generation from 2D images across varied display-scene parameters. We validate our models by synthesizing high-quality 3D holograms in simulations and also verify our findings with two different holographic display prototypes. Moreover, our family of models can synthesize holograms with a 2x speed-up compared to the state-of-the-art learned holography approaches in the literature.
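The abstract's central idea is that a single learned model is conditioned on a continuous vector of display-scene parameters rather than retrained per configuration. The sketch below illustrates that interface only: the parameter names follow the abstract's list, but the class, function, and values are hypothetical and do not reflect the authors' actual implementation.

```python
from dataclasses import dataclass


@dataclass
class DisplaySceneParams:
    """Hypothetical conditioning inputs, named after the abstract's list."""
    propagation_distance_mm: float  # distance from hologram plane to image plane
    volume_depth_mm: float          # depth extent of the reconstructed 3D volume
    peak_brightness: float          # normalized peak brightness of the display
    pixel_pitch_um: float           # spatial light modulator pixel pitch
    wavelength_nm: float            # illumination wavelength

    def to_vector(self):
        """Flatten into the continuous conditioning vector a model could consume."""
        return [
            self.propagation_distance_mm,
            self.volume_depth_mm,
            self.peak_brightness,
            self.pixel_pitch_um,
            self.wavelength_nm,
        ]


def synthesize_hologram(rgb_image, params: DisplaySceneParams):
    """Placeholder for a conditioned forward pass: the model's weights stay
    fixed while the conditioning vector varies per display and scene, so one
    model serves many configurations instead of one model per parameter set."""
    conditioning = params.to_vector()
    # A real model would map (image, conditioning) -> a phase-only hologram.
    return {"image": rgb_image, "conditioning": conditioning}
```

For example, switching from a green to a red laser or to a finer pixel pitch would change only the conditioning vector passed at inference time, not the trained weights.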

@article{zhan2025_2405.01558,
  title={Configurable Holography: Towards Display and Scene Adaptation},
  author={Yicheng Zhan and Liang Shi and Wojciech Matusik and Qi Sun and Kaan Akşit},
  journal={arXiv preprint arXiv:2405.01558},
  year={2025}
}