nielsr (HF staff) committed
Commit 2b3b767 · verified · 1 Parent(s): 1937d33

Improve model card with pipeline tag, license, and project page link


This PR adds `pipeline_tag: depth-estimation` to the model card metadata, making the model discoverable through the Hugging Face pipeline interface for depth estimation tasks. It also adds the license information (`cc-by-nc-4.0`) from the GitHub README. The existing project page link is retained.
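As a quick sanity check on the metadata change, the YAML front matter can be inspected with a minimal standard-library parser. This is a sketch only: the `readme` string below is an illustrative excerpt of the updated card (not the full file), and the parser handles only simple `key: value` pairs, skipping list-valued keys such as `tags`.

```python
import re

def parse_front_matter(readme: str) -> dict:
    """Extract simple key: value pairs from the leading YAML front matter block."""
    match = re.match(r"^---\n(.*?)\n---", readme, re.DOTALL)
    if not match:
        return {}
    meta = {}
    for line in match.group(1).splitlines():
        # Skip list items ("- ...") and keys with no inline value (e.g. "tags:").
        if ":" in line and not line.lstrip().startswith("-"):
            key, _, value = line.partition(":")
            if value.strip():
                meta[key.strip()] = value.strip()
    return meta

# Illustrative excerpt of the card's front matter after this PR.
readme = """---
tags:
  - monocular-metric-depth-estimation
  - depth-estimation
repo_url: https://github.com/lpiccinelli-eth/UniDepth
license: cc-by-nc-4.0
---
# UniDepthV2
"""

meta = parse_front_matter(readme)
print(meta["license"])  # cc-by-nc-4.0
```

The same check generalizes to any Hub model card, since the Hub stores all card metadata in this `---`-delimited front matter block.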

Files changed (1)
  1. README.md +61 -3
README.md CHANGED
@@ -4,9 +4,67 @@ tags:
 - monocular-metric-depth-estimation
 - pytorch_model_hub_mixin
 - model_hub_mixin
+- depth-estimation
 repo_url: https://github.com/lpiccinelli-eth/UniDepth
+license: cc-by-nc-4.0
 ---
 
-This model has been pushed to the Hub using **UniDepth**:
-- Repo: https://github.com/lpiccinelli-eth/UniDepth
-- Docs: [More Information Needed]
+[![arXiv](https://img.shields.io/badge/UniDepthV2%20arXiv-2502.20110-blue?logo=arxiv&color=%23B31B1B)](https://arxiv.org/abs/2502.20110)
+[![arXiv](https://img.shields.io/badge/UniDepthV1%20arXiv-2403.18913-blue?logo=arxiv-v1&color=%23B31B1B)](https://arxiv.org/abs/2403.18913)
+[![ProjectPage](https://img.shields.io/badge/Project_Page-UniDepth-blue)](https://lpiccinelli-eth.github.io/pub/unidepth/)
+
+# UniDepthV2: Universal Monocular Metric Depth Estimation Made Simpler
+
+[![KITTI Benchmark](https://img.shields.io/badge/KITTI%20Benchmark-1st%20(at%20submission%20time)-orange)](https://www.cvlibs.net/datasets/kitti/eval_depth.php?benchmark=depth_prediction)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/unidepthv2-universal-monocular-metric-depth/monocular-depth-estimation-on-nyu-depth-v2)](https://paperswithcode.com/sota/monocular-depth-estimation-on-nyu-depth-v2?p=unidepthv2-universal-monocular-metric-depth)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/unidepthv2-universal-monocular-metric-depth/monocular-depth-estimation-on-kitti-eigen)](https://paperswithcode.com/sota/monocular-depth-estimation-on-kitti-eigen?p=unidepthv2-universal-monocular-metric-depth)
+
+
+![](assets/docs/unidepthv2-banner.png)
+
+> [**UniDepthV2: Universal Monocular Metric Depth Estimation Made Simpler**](https://arxiv.org/abs/2403.18913),
+> Luigi Piccinelli, Christos Sakaridis, Yung-Hsu Yang, Mattia Segu, Siyuan Li, Wim Abbeloos, Luc Van Gool,
+> under submission,
+> *Paper at [arXiv 2502.20110](https://arxiv.org/abs/2502.20110)*
+
+
+# UniDepth: Universal Monocular Metric Depth Estimation
+
+[![KITTI Benchmark](https://img.shields.io/badge/KITTI%20Benchmark-1st%20(at%20submission%20time)-orange)](https://www.cvlibs.net/datasets/kitti/eval_depth.php?benchmark=depth_prediction)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/unidepth-universal-monocular-metric-depth/monocular-depth-estimation-on-nyu-depth-v2)](https://paperswithcode.com/sota/monocular-depth-estimation-on-nyu-depth-v2?p=unidepth-universal-monocular-metric-depth)
+[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/unidepth-universal-monocular-metric-depth/monocular-depth-estimation-on-kitti-eigen)](https://paperswithcode.com/sota/monocular-depth-estimation-on-kitti-eigen?p=unidepth-universal-monocular-metric-depth)
+
+![](assets/docs/unidepth-banner.png)
+
+> [**UniDepth: Universal Monocular Metric Depth Estimation**](https://arxiv.org/abs/2403.18913),
+> Luigi Piccinelli, Yung-Hsu Yang, Christos Sakaridis, Mattia Segu, Siyuan Li, Luc Van Gool, Fisher Yu,
+> CVPR 2024,
+> *Paper at [arXiv 2403.18913](https://arxiv.org/pdf/2403.18913.pdf)*
+
+
+
+## News and ToDo
+
+- [ ] HuggingFace/Gradio demo.
+- [x] `28.02.2025`: Release UniDepthV2.
+- [x] `15.10.2024`: Release training code.
+- [x] `02.04.2024`: Release UniDepth as python package.
+- [x] `01.04.2024`: Inference code and V1 models are released.
+- [x] `26.02.2024`: UniDepth is accepted at CVPR 2024! (Highlight :star:)
+
+
+## Zero-Shot Visualization
+
+### YouTube (The Office - Parkour)
+<p align="center">
+  <img src="assets/docs/theoffice.gif" alt="animated" />
+</p>
+
+### NuScenes (stitched cameras)
+<p align="center">
+  <img src="assets/docs/nuscenes_surround.gif" alt="animated" />
+</p>
+
+
+## Installation
+... (rest of the content remains the same)