---
library_name: UniDepth
tags:
- monocular-metric-depth-estimation
- pytorch_model_hub_mixin
- model_hub_mixin
- depth-estimation
repo_url: https://github.com/lpiccinelli-eth/UniDepth
license: cc-by-nc-4.0
---

[![arXiv](https://img.shields.io/badge/UniDepthV2%20arXiv-2502.20110-blue?logo=arxiv&color=%23B31B1B)](https://arxiv.org/abs/2502.20110)
[![arXiv](https://img.shields.io/badge/UniDepthV1%20arXiv-2403.18913-blue?logo=arxiv-v1&color=%23B31B1B)](https://arxiv.org/abs/2403.18913)
[![ProjectPage](https://img.shields.io/badge/Project_Page-UniDepth-blue)](https://lpiccinelli-eth.github.io/pub/unidepth/)

# UniDepthV2: Universal Monocular Metric Depth Estimation Made Simpler

[![KITTI Benchmark](https://img.shields.io/badge/KITTI%20Benchmark-1st%20(at%20submission%20time)-orange)](https://www.cvlibs.net/datasets/kitti/eval_depth.php?benchmark=depth_prediction)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/unidepthv2-universal-monocular-metric-depth/monocular-depth-estimation-on-nyu-depth-v2)](https://paperswithcode.com/sota/monocular-depth-estimation-on-nyu-depth-v2?p=unidepthv2-universal-monocular-metric-depth)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/unidepthv2-universal-monocular-metric-depth/monocular-depth-estimation-on-kitti-eigen)](https://paperswithcode.com/sota/monocular-depth-estimation-on-kitti-eigen?p=unidepthv2-universal-monocular-metric-depth)

![](assets/docs/unidepthv2-banner.png)

> [**UniDepthV2: Universal Monocular Metric Depth Estimation Made Simpler**](https://arxiv.org/abs/2502.20110),
> Luigi Piccinelli, Christos Sakaridis, Yung-Hsu Yang, Mattia Segu, Siyuan Li, Wim Abbeloos, Luc Van Gool,
> under submission,
> *Paper at [arXiv 2502.20110](https://arxiv.org/abs/2502.20110)*

# UniDepth: Universal Monocular Metric Depth Estimation

[![KITTI Benchmark](https://img.shields.io/badge/KITTI%20Benchmark-1st%20(at%20submission%20time)-orange)](https://www.cvlibs.net/datasets/kitti/eval_depth.php?benchmark=depth_prediction)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/unidepth-universal-monocular-metric-depth/monocular-depth-estimation-on-nyu-depth-v2)](https://paperswithcode.com/sota/monocular-depth-estimation-on-nyu-depth-v2?p=unidepth-universal-monocular-metric-depth)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/unidepth-universal-monocular-metric-depth/monocular-depth-estimation-on-kitti-eigen)](https://paperswithcode.com/sota/monocular-depth-estimation-on-kitti-eigen?p=unidepth-universal-monocular-metric-depth)

![](assets/docs/unidepth-banner.png)

> [**UniDepth: Universal Monocular Metric Depth Estimation**](https://arxiv.org/abs/2403.18913),
> Luigi Piccinelli, Yung-Hsu Yang, Christos Sakaridis, Mattia Segu, Siyuan Li, Luc Van Gool, Fisher Yu,
> CVPR 2024,
> *Paper at [arXiv 2403.18913](https://arxiv.org/pdf/2403.18913.pdf)*

## News and ToDo

- [ ] HuggingFace/Gradio demo.
- [x] `28.02.2025`: Release UniDepthV2.
- [x] `15.10.2024`: Release training code.
- [x] `02.04.2024`: Release UniDepth as a Python package.
- [x] `01.04.2024`: Inference code and V1 models are released.
- [x] `26.02.2024`: UniDepth is accepted at CVPR 2024! (Highlight :star:)

## Zero-Shot Visualization

### YouTube (The Office - Parkour)

*(animated demo)*

### NuScenes (stitched cameras)

*(animated demo)*

## Installation

... (rest of the content remains the same)
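Since the model card tags include `pytorch_model_hub_mixin`, the checkpoints can presumably be loaded directly from the Hugging Face Hub via `from_pretrained`. Below is a minimal inference sketch under that assumption; the checkpoint id `lpiccinelli/unidepth-v2-vitl14`, the `UniDepthV2.infer` call, and the keys of its output dictionary (`depth`, `points`, `intrinsics`) are taken on faith from the upstream repository and should be checked against its README before use.

```python
import numpy as np
import torch
from PIL import Image

from unidepth.models import UniDepthV2  # requires the UniDepth package from the repo

# Load a pretrained V2 checkpoint from the Hub (name assumed, see repo for the full list).
model = UniDepthV2.from_pretrained("lpiccinelli/unidepth-v2-vitl14")
model = model.to("cuda" if torch.cuda.is_available() else "cpu").eval()

# Read an RGB image as a C x H x W uint8 tensor (no normalization needed by infer()).
rgb = torch.from_numpy(np.array(Image.open("example.png"))).permute(2, 0, 1)

with torch.no_grad():
    predictions = model.infer(rgb)

depth = predictions["depth"]            # metric depth map, in meters
points = predictions["points"]          # back-projected 3D points
intrinsics = predictions["intrinsics"]  # estimated camera intrinsics (3x3)
```

Note that UniDepth predicts camera intrinsics jointly with depth, so no calibration input is required; if ground-truth intrinsics are available, the repository documents how to pass them to constrain the prediction.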