Motivation
- Despite its necessity, online communication is hindered by various limitations. Traditional video platforms such as Zoom limit engagement and suppress non-verbal cues, while existing virtual meeting solutions fall short on realism.
- Our project aims to make progress towards VR telepresence by enabling photorealistic, real-time rendering of indoor environments, ensuring an immersive and lifelike experience for users.
![](https://about.fb.com/wp-content/uploads/2021/08/CD21_546-_-NRP-Oculus-Cross-Post_-Horizon-Workrooms-Launch_Inline-3.jpg?fit=4000%2C2250)
Figure 1: Existing virtual telepresence solutions: Meta Codec Avatars (left) and Meta Horizon Workrooms (right)
Problem Statement
Given a set of images of an indoor environment, such as an office area or a meeting room, our goal is to generate a high-fidelity reconstruction of the environment that supports real-time photorealistic rendering and display on a virtual reality headset.
![](https://mscvprojects.ri.cmu.edu/2024team2/wp-content/uploads/sites/100/2024/05/Screenshot-2024-05-12-at-4.04.39%20PM-1.png)
Figure 2: A VR telepresence solution that facilitates seamless interaction between in-person and remote participants
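The "real-time" requirement above can be made concrete with a quick frame-budget calculation. A rough sketch (the refresh rates listed are typical of consumer VR headsets in general, not measurements from this project):

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Maximum render time per frame, in milliseconds, at a given refresh rate."""
    return 1000.0 / refresh_hz

# Typical consumer-headset refresh rates (Hz) -> per-frame budget (ms).
for hz in (72, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

At 90 Hz, for example, the renderer has only about 11 ms to produce each stereo frame, which is why rasterization-based approaches such as 3D Gaussian Splatting are attractive compared with per-ray neural rendering.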
References
- Xu et al., "VR-NeRF: High-Fidelity Virtualized Walkable Spaces," SIGGRAPH Asia, 2023.
- Kerbl et al., "3D Gaussian Splatting for Real-Time Radiance Field Rendering," ACM TOG, 2023.
- Tancik et al., "Nerfstudio: A Modular Framework for Neural Radiance Field Development," SIGGRAPH, 2023.
- Lu et al., "Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering," CVPR, 2024.
- Ma et al., "Pixel Codec Avatars," CVPR, 2021, pp. 64-73.