Improve model card with pipeline tag, license, and project page link
This PR adds the `depth-estimation` pipeline tag to the model card metadata, making the model discoverable under the depth-estimation task filter on the Hugging Face Hub. It also adds the license information (`cc-by-nc-4.0`) from the GitHub README. The existing project page link is retained.
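Once the tag is indexed, the model should surface in task-filtered Hub queries. A minimal sketch using `huggingface_hub`; the `pipeline_tag` keyword assumes a reasonably recent release of the library, and the `search` string is illustrative:

```python
from huggingface_hub import HfApi

api = HfApi()
# List Hub models carrying the depth-estimation pipeline tag.
# `pipeline_tag=` assumes a recent huggingface_hub; older releases
# filtered on task via ModelFilter instead.
for m in api.list_models(pipeline_tag="depth-estimation", search="unidepth", limit=5):
    print(m.id, m.pipeline_tag)
```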
README.md (CHANGED)

@@ -4,9 +4,67 @@ tags:
 - monocular-metric-depth-estimation
 - pytorch_model_hub_mixin
 - model_hub_mixin
+- depth-estimation
 repo_url: https://github.com/lpiccinelli-eth/UniDepth
+license: cc-by-nc-4.0
 ---
 
-
-
-
+[arXiv 2502.20110](https://arxiv.org/abs/2502.20110)
+[arXiv 2403.18913](https://arxiv.org/abs/2403.18913)
+[Project Page](https://lpiccinelli-eth.github.io/pub/unidepth/)
+
+# UniDepthV2: Universal Monocular Metric Depth Estimation Made Simpler
+
+[KITTI Depth Prediction Benchmark](https://www.cvlibs.net/datasets/kitti/eval_depth.php?benchmark=depth_prediction)
+[PapersWithCode: NYU Depth v2](https://paperswithcode.com/sota/monocular-depth-estimation-on-nyu-depth-v2?p=unidepthv2-universal-monocular-metric-depth)
+[PapersWithCode: KITTI Eigen](https://paperswithcode.com/sota/monocular-depth-estimation-on-kitti-eigen?p=unidepthv2-universal-monocular-metric-depth)
+
+
+
+
+> [**UniDepthV2: Universal Monocular Metric Depth Estimation Made Simpler**](https://arxiv.org/abs/2403.18913),
+> Luigi Piccinelli, Christos Sakaridis, Yung-Hsu Yang, Mattia Segu, Siyuan Li, Wim Abbeloos, Luc Van Gool,
+> under submission,
+> *Paper at [arXiv 2502.20110](https://arxiv.org/abs/2502.20110)*
+
+
+# UniDepth: Universal Monocular Metric Depth Estimation
+
+[KITTI Depth Prediction Benchmark](https://www.cvlibs.net/datasets/kitti/eval_depth.php?benchmark=depth_prediction)
+[PapersWithCode: NYU Depth v2](https://paperswithcode.com/sota/monocular-depth-estimation-on-nyu-depth-v2?p=unidepth-universal-monocular-metric-depth)
+[PapersWithCode: KITTI Eigen](https://paperswithcode.com/sota/monocular-depth-estimation-on-kitti-eigen?p=unidepth-universal-monocular-metric-depth)
+
+
+
+> [**UniDepth: Universal Monocular Metric Depth Estimation**](https://arxiv.org/abs/2403.18913),
+> Luigi Piccinelli, Yung-Hsu Yang, Christos Sakaridis, Mattia Segu, Siyuan Li, Luc Van Gool, Fisher Yu,
+> CVPR 2024,
+> *Paper at [arXiv 2403.18913](https://arxiv.org/pdf/2403.18913.pdf)*
+
+
+
+## News and ToDo
+
+- [ ] HuggingFace/Gradio demo.
+- [x] `28.02.2025`: Release UniDepthV2.
+- [x] `15.10.2024`: Release training code.
+- [x] `02.04.2024`: Release UniDepth as python package.
+- [x] `01.04.2024`: Inference code and V1 models are released.
+- [x] `26.02.2024`: UniDepth is accepted at CVPR 2024! (Highlight :star:)
+
+
+## Zero-Shot Visualization
+
+### YouTube (The Office - Parkour)
+<p align="center">
+<img src="assets/docs/theoffice.gif" alt="animated" />
+</p>
+
+### NuScenes (stitched cameras)
+<p align="center">
+<img src="assets/docs/nuscenes_surround.gif" alt="animated" />
+</p>
+
+
+## Installation
+... (rest of the content remains the same)