Overview

Motivation

  • Online communication is essential but remains limited: video-conferencing platforms such as Zoom lack engagement and non-verbal cues, while existing virtual meeting solutions fall short of visual realism
  • Our project aims to advance VR telepresence by enabling photorealistic, real-time rendering of indoor environments, providing an immersive and lifelike experience for users

Figure 1: Existing virtual telepresence solutions: Meta Codec Avatars (left) and Meta Horizon Workrooms (right)

Problem Statement

Given a set of images of an indoor environment, such as an office area or a meeting room, our goal is to generate a high-fidelity reconstruction of the environment that allows for real-time photorealistic rendering and display on a virtual reality headset.
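The real-time rendering methods referenced below (e.g., 3D Gaussian Splatting) render each pixel by alpha-compositing depth-sorted primitives front to back. The following is a minimal illustrative sketch of that compositing rule with toy values, not part of our actual reconstruction pipeline:

```python
def composite(colors, alphas):
    """Front-to-back alpha compositing of depth-sorted primitives.

    Implements C = sum_i c_i * alpha_i * prod_{j<i} (1 - alpha_j),
    the accumulation used by splatting-based radiance-field renderers.
    """
    out = 0.0
    transmittance = 1.0  # fraction of light not yet absorbed
    for c, a in zip(colors, alphas):
        out += c * a * transmittance
        transmittance *= 1.0 - a
    return out

# A fully opaque primitive in front occludes everything behind it.
print(composite([0.8, 0.2], [1.0, 1.0]))  # -> 0.8
```

Because the primitives are pre-sorted by depth, this loop runs in a single pass per pixel, which is what makes such renderers fast enough for VR frame rates.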

Figure 2: A VR telepresence solution that facilitates seamless interaction between in-person and remote participants

References

  • Xu et al., “VR-NeRF: High-Fidelity Virtualized Walkable Spaces,” SIGGRAPH Asia Conference Proceedings, 2023.
  • Kerbl et al., “3D Gaussian Splatting for Real-Time Radiance Field Rendering,” ACM Transactions on Graphics, 2023.
  • Tancik et al., “Nerfstudio: A Modular Framework for Neural Radiance Field Development,” SIGGRAPH, 2023.
  • Lu et al., “Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering,” CVPR, 2024.
  • Ma et al., “Pixel Codec Avatars,” CVPR, 2021.