ZoeDepth: Zero-shot Transfer by Combining Relative and Metric Depth

Shariq Farooq Bhat, Reiner Birkl, Diana Wofk, Peter Wonka, Matthias Müller
This paper tackles the problem of depth estimation from a single image. Existing work either focuses on generalization performance while disregarding metric scale, i.e. relative depth estimation, or on state-of-the-art results on specific datasets, i.e. metric depth estimation. We propose the first approach that combines both worlds, yielding a model with excellent generalization performance that still maintains metric scale. Our flagship model, ZoeD-M12-NK, is pre-trained on 12 datasets using relative depth and fine-tuned on two datasets using metric depth. It uses a lightweight head with a novel bin-adjustment design, the metric bins module, for each domain. During inference, each input image is automatically routed to the appropriate head by a latent classifier. Our framework admits multiple configurations depending on the datasets used for relative-depth pre-training and metric fine-tuning. Without pre-training, we already improve the state of the art (SOTA) significantly on the NYU Depth v2 indoor dataset. Pre-training on twelve datasets and fine-tuning on NYU Depth v2, we improve the SOTA further, by a total of 21% in terms of relative absolute error (REL). Finally, ZoeD-M12-NK is the first model that can be trained jointly on multiple datasets (NYU Depth v2 and KITTI) without a significant drop in performance, and it achieves unprecedented zero-shot generalization to eight unseen datasets from both indoor and outdoor domains. The code and pre-trained models are publicly available at https://github.com/isl-org/ZoeDepth.
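The automatic routing described above can be sketched in a few lines: a small classifier over the encoder's latent features scores each domain-specific metric head, and the image is dispatched to the highest-scoring one. This is a minimal illustrative sketch, not the authors' implementation; the latent dimensions, pooling step, and head names are assumptions.

```python
import numpy as np

# Hedged sketch of latent-classifier routing between domain heads, in the
# spirit of ZoeD-M12-NK's NYU (indoor) / KITTI (outdoor) heads. All shapes
# and weights here are toy assumptions, not the paper's actual values.

rng = np.random.default_rng(0)

LATENT_DIM = 256
HEADS = ["nyu", "kitti"]  # one lightweight metric-bins head per domain

# Toy linear classifier: one score per head from the pooled latent.
W = rng.normal(size=(len(HEADS), LATENT_DIM))
b = np.zeros(len(HEADS))

def route(latent: np.ndarray) -> str:
    """Pick the metric head whose classifier score is highest."""
    pooled = latent.mean(axis=0)   # global-average-pool the spatial latent
    scores = W @ pooled + b
    return HEADS[int(np.argmax(scores))]

# Usage: a random latent stands in for the encoder output of one image.
latent = rng.normal(size=(100, LATENT_DIM))
head = route(latent)
print(head)  # either "nyu" or "kitti"
```

At inference time the selected head alone produces the metric depth map, so the two domains never have to share one output scale.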
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Depth Estimation | NYU-Depth V2 | δ < 1.25 | 0.955 | ZoeD-M12-N |
| Depth Estimation | NYU-Depth V2 | δ < 1.25² | 0.995 | ZoeD-M12-N |
| Depth Estimation | NYU-Depth V2 | δ < 1.25³ | 0.999 | ZoeD-M12-N |
| Depth Estimation | NYU-Depth V2 | RMSE (m) | 0.27 | ZoeD-M12-N |
| Depth Estimation | NYU-Depth V2 | Absolute relative error (REL) | 0.075 | ZoeD-M12-N |
| Depth Estimation | NYU-Depth V2 | log10 error | 0.032 | ZoeD-M12-N |