Dear colleagues, I am happy to announce the release of the open source implementation of our approach for real-time, monocular, dense depth estimation, called “REMODE”. The code is available at: https://github.com/uzh-rpg/rpg_open_remode
REMODE has been used in many real-time dense reconstruction projects on autonomous drones, like those shown at the end of the video below. The open source implementation requires a CUDA-capable GPU and the NVIDIA CUDA Toolkit; instructions for building and running the code are available in the repository wiki. REMODE stands for "REgularized, probabilistic, MOnocular Depth Estimation" and is described in the ICRA'14 paper by +Matia Pizzoli, +Christian Forster, and myself:
"REMODE: Probabilistic, monocular dense reconstruction in real time"
IEEE International Conference on Robotics and Automation (ICRA) 2014.
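For readers curious what a probabilistic, per-pixel depth estimate looks like in practice, here is a minimal C++ sketch. It is not the REMODE implementation itself; the filter form, variable names, and numbers are simplified assumptions. It only illustrates the basic idea of recursively fusing noisy depth measurements for one pixel while tracking the remaining uncertainty (REMODE's actual filter additionally models outlier measurements and regularizes the result over the image).

```cpp
#include <cmath>
#include <cstdio>

// Toy per-pixel depth filter (illustrative only, not REMODE's code):
// each new depth measurement, with its own variance, is fused with the
// current estimate as a product of two Gaussians.
struct DepthHypothesis {
  float mu;      // current depth estimate [m]
  float sigma2;  // variance of the estimate [m^2]

  void update(float z, float tau2) {  // z: measured depth, tau2: its variance
    const float s2 = sigma2 * tau2 / (sigma2 + tau2);  // fused variance
    mu = s2 * (mu / sigma2 + z / tau2);                // fused mean
    sigma2 = s2;
  }
};

int main() {
  DepthHypothesis h{2.0f, 1.0f};  // weak prior: 2 m with large variance
  const float measurements[] = {1.9f, 2.1f, 2.05f, 1.95f};
  for (float z : measurements) {
    h.update(z, 0.01f);  // assume each measurement has sigma = 0.1 m
    std::printf("depth = %.3f m, sigma = %.3f m\n", h.mu, std::sqrt(h.sigma2));
  }
  return 0;
}
```

With every fused measurement the variance shrinks, which is exactly the quantity the confidence maps mentioned below expose per pixel.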
Since it provides real-time, dense depth maps along with the corresponding confidence maps, REMODE is well suited to robotic applications such as environment interaction, motion planning, and active vision and control, where both dense information and map uncertainty may be required, as demonstrated in our previous work: http://rpg.ifi.uzh.ch/research_dense.html
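As an illustration of how a confidence map can be consumed downstream, here is a short, hedged C++ sketch. The flat row-major layout, the function name, and the threshold are assumptions for the example, not the REMODE interface: it simply discards depth values whose estimated standard deviation is too large before they are handed to planning or mapping.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Keep only depth values whose per-pixel uncertainty (variance taken from a
// confidence map) is below a threshold; unreliable pixels are set to NaN.
// Layout and threshold are illustrative choices, not part of REMODE's API.
std::vector<float> filterByConfidence(const std::vector<float>& depth,
                                      const std::vector<float>& variance,
                                      float max_sigma) {
  std::vector<float> out(depth.size(), std::nanf(""));
  for (std::size_t i = 0; i < depth.size(); ++i) {
    if (std::sqrt(variance[i]) <= max_sigma) {
      out[i] = depth[i];  // confident measurement, keep it
    }
  }
  return out;
}

int main() {
  // Tiny 2x2 example: the second pixel is too uncertain and is discarded.
  std::vector<float> depth    = {1.2f, 3.4f, 2.1f, 0.9f};
  std::vector<float> variance = {0.001f, 0.5f, 0.002f, 0.004f};
  auto filtered = filterByConfidence(depth, variance, /*max_sigma=*/0.1f);
  return 0;
}
```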