AI-Based Creation of Panoramic Indoor Environments for the Metaverse
The metaverse, understood as a shared virtual reality space accessed by a vast user base, marks a significant evolution, merging physical spaces, virtual reality, and the internet into a cohesive digital landscape. Its appeal rests largely on users' ability to navigate and interact within captivating environments. Traditional methods for constructing these spaces, however, are often time-intensive and may fail to capture the intricate detail such settings demand. In Extended Reality (XR), 360° cameras are increasingly used to capture an entire scene in a single panoramic image, notably for Diminished Reality (DR) applications that conceal particular object categories. This dissertation introduces a framework for processing omnidirectional indoor scenes that extracts multiple signals from a single panoramic image, including depth, semantic, albedo, shading, and normal maps. The framework enables 3D editing and the presentation of immersive, high-resolution spherical indoor scenes, as well as photorealistic style transfer.
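As a purely illustrative aid (not the dissertation's actual architecture), the following PyTorch sketch shows the general pattern such a framework follows: a shared backbone encodes the equirectangular panorama once, and a separate dense-prediction head decodes each signal. The class name PanoDensePredictor, the small convolutional encoder standing in for the transformer backbone, and the head channel counts are assumptions made for the example.

import torch
import torch.nn as nn

class PanoDensePredictor(nn.Module):
    """Shared encoder with one dense-prediction head per extracted signal."""
    def __init__(self, num_classes: int = 13):
        super().__init__()
        # Small convolutional encoder standing in for the transformer backbone.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # One lightweight decoder head per signal; output channels match the signal type.
        def head(out_channels: int) -> nn.Sequential:
            return nn.Sequential(
                nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, out_channels, 1),
                nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            )
        self.heads = nn.ModuleDict({
            "depth": head(1),                # per-pixel depth
            "semantics": head(num_classes),  # per-class logits
            "albedo": head(3),               # diffuse reflectance (RGB)
            "shading": head(1),              # grayscale shading
            "normals": head(3),              # surface normal vectors
        })

    def forward(self, panorama: torch.Tensor) -> dict:
        features = self.encoder(panorama)
        return {name: decoder(features) for name, decoder in self.heads.items()}

# Usage on a single 512x1024 equirectangular panorama.
model = PanoDensePredictor()
outputs = model(torch.rand(1, 3, 512, 1024))
print({name: tuple(signal.shape) for name, signal in outputs.items()})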
This research pursues several objectives, including an exhaustive review of the methodologies and technologies employed in digital environment creation within the metaverse, along with an exploration of related challenges and privacy and security concerns. The dissertation proposes an AI/transformer-driven technique for processing and extracting signals from spherical indoor scenes, alongside an efficient method for crafting immersive indoor environments. This facilitates seamless content transitions across indoor scenes and introduces intuitive interface tools for spatial navigation and editing. A distinctive aspect of this work is a novel photorealistic style transfer framework tailored for indoor panoramic images, which confronts challenges such as handling complex lighting patterns, maintaining visual integrity, correcting equirectangular projection distortions, and embedding 3D cues.
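One concrete source of the distortion mentioned above is that the equirectangular projection stretches content toward the poles, so pixels do not cover equal solid angles. Below is a minimal NumPy sketch of the pixel-to-sphere mapping that distortion-aware panorama processing generally builds on; it is an assumption-level illustration, not the dissertation's own correction procedure.

import numpy as np

def equirect_to_directions(height: int, width: int) -> np.ndarray:
    """Return an (H, W, 3) array of unit view directions, one per panorama pixel."""
    # Longitude spans [-pi, pi) left to right; latitude spans [pi/2, -pi/2] top to bottom.
    lon = (np.arange(width) + 0.5) / width * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (np.arange(height) + 0.5) / height * np.pi
    lon, lat = np.meshgrid(lon, lat)
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.stack([x, y, z], axis=-1)

directions = equirect_to_directions(512, 1024)
print(directions.shape, bool(np.allclose(np.linalg.norm(directions, axis=-1), 1.0)))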
The proposed methodology offers benefits across diverse sectors, including real estate, construction, interior design, and furniture retail, streamlining the creation of immersive indoor environments, facilitating remote interactions, and enabling virtual staging and property management. It also paves the way for artistic innovation, data augmentation, and the creation of virtual showrooms.
Our research evaluates the PanoStyle model's performance on publicly available synthetic datasets, demonstrating a 26.76% reduction in ArtFID, a 6.95% increase in PSNR, and a 25.23% improvement in SSIM. The results confirm the model's efficacy in producing realistic and visually appealing indoor scenes, outperforming existing models. For real-world applicability, PanoStyle has been extended to PanoStyle++, optimized for practical scenarios and tested on both synthetic and real-world data.
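For context on the reported numbers, PSNR and SSIM are standard full-reference fidelity metrics, while ArtFID combines FID and LPIPS (ArtFID = (1 + LPIPS) · (1 + FID)) and therefore requires a dataset-level evaluation with pretrained networks. The sketch below shows only the PSNR/SSIM computation using scikit-image; it illustrates the metrics themselves, not the dissertation's exact evaluation protocol.

import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def panorama_fidelity(reference: np.ndarray, stylized: np.ndarray) -> dict:
    """Both inputs are HxWx3 uint8 equirectangular images of the same size."""
    return {
        "psnr": peak_signal_noise_ratio(reference, stylized, data_range=255),
        "ssim": structural_similarity(reference, stylized, channel_axis=-1, data_range=255),
    }

# Example with random images; real use would load stylized panoramas and their references.
reference = np.random.randint(0, 256, (256, 512, 3), dtype=np.uint8)
stylized = np.random.randint(0, 256, (256, 512, 3), dtype=np.uint8)
print(panorama_fidelity(reference, stylized))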
This dissertation significantly advances the field of 360° indoor image processing, style transfer, scene editing, and content transfer, aiming to transform how immersive indoor environments are created and manipulated in the metaverse and thereby enriching user experiences across various industries. Future research will focus on extending the proposed dense prediction technique to more complex dense estimation problems, such as signal extraction for inverse rendering and virtual staging; on seamless integration with virtual and mobile platforms; and on enhancing immersive virtual staging with head-mounted displays, further advancing the creation of, and interaction with, immersive indoor environments in the metaverse.
Language
- English
Publication Year
- 2024
License statement
© The author. The author has granted HBKU and Qatar Foundation a non-exclusive, worldwide, perpetual, irrevocable, royalty-free license to reproduce, display and distribute the manuscript in whole or in part in any form to be posted in digital or print format and made available to the public at no charge. Unless otherwise specified in the copyright statement or the metadata, all rights are reserved by the copyright holder. For permission to reuse content, please contact the author.
Institution affiliated with
- Hamad Bin Khalifa University
- College of Science and Engineering - HBKU
Degree Date
- 2024
Degree Type
- Doctorate