We plan to develop a real-time system, deployed on a drone, that estimates the depth of its surroundings. The drone would carry a rig of six fisheye cameras. The six fisheye images would be stitched into a single panorama covering the front, back, top, bottom, left, and right views of the environment. The model would then predict a depth value for each pixel of this stitched image.
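As a rough illustration of the stitching step, the sketch below projects six cube faces into a single equirectangular panorama. This is a simplified sketch, not the actual pipeline: real fisheye lenses would first need to be calibrated and rectified onto cube faces, and the face names, orientation conventions, and the `cubemap_to_equirect` helper are all assumptions for illustration.

```python
import numpy as np

def cubemap_to_equirect(faces, out_h, out_w):
    """Project six square cube faces onto an equirectangular panorama.

    `faces` maps "front"/"back"/"top"/"bottom"/"left"/"right" to HxWx3
    arrays. Face orientations here are an assumed convention, not a
    standard; a real system would match its camera calibration.
    """
    # Longitude/latitude of each output pixel (pixel-center sampling).
    j, i = np.meshgrid(np.arange(out_w), np.arange(out_h))
    lon = (j + 0.5) / out_w * 2 * np.pi - np.pi   # [-pi, pi)
    lat = np.pi / 2 - (i + 0.5) / out_h * np.pi   # (pi/2, -pi/2)

    # Unit viewing ray per pixel; +z is "front", +y is "up".
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    eps = 1e-9  # avoid division-by-zero warnings off the selected face
    absx, absy, absz = np.abs(x) + eps, np.abs(y) + eps, np.abs(z) + eps
    # Dominant axis decides which cube face the ray hits: 0=x, 1=y, 2=z.
    face_idx = np.argmax(np.stack([absx, absy, absz]), axis=0)

    fh, fw = faces["front"].shape[:2]
    out = np.zeros((out_h, out_w, 3), dtype=faces["front"].dtype)

    # (name, mask of pixels on this face, in-face u/v in [-1, 1])
    specs = [
        ("right",  (face_idx == 0) & (x > 0), -z / absx, -y / absx),
        ("left",   (face_idx == 0) & (x < 0),  z / absx, -y / absx),
        ("top",    (face_idx == 1) & (y > 0),  x / absy,  z / absy),
        ("bottom", (face_idx == 1) & (y < 0),  x / absy, -z / absy),
        ("front",  (face_idx == 2) & (z > 0),  x / absz, -y / absz),
        ("back",   (face_idx == 2) & (z < 0), -x / absz, -y / absz),
    ]
    for name, mask, u, v in specs:
        # Nearest-neighbour lookup into the face texture.
        uu = np.clip(((u[mask] + 1) / 2 * fw).astype(int), 0, fw - 1)
        vv = np.clip(((v[mask] + 1) / 2 * fh).astype(int), 0, fh - 1)
        out[mask] = faces[name][vv, uu]
    return out
```

A production version would use bilinear sampling and blend overlapping fisheye coverage, but the geometry (ray per panorama pixel, dominant-axis face selection) is the core of any cubemap-style stitch.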

Holonomic robots such as drones have six degrees of freedom and can therefore move in any direction. Human drivers, too, rely on a wide field of view rather than just looking forward to make decisions. In autonomous vehicles and self-driving cars, a LIDAR provides this all-around perception. However, we want a LIDAR-free system.
