nielsr (HF Staff) committed
Commit 48d5712 · verified · Parent: c2edfc2

Improve model card: add library_name, pipeline_tag, and links


This PR improves the model card by:

* Adding the `library_name` to the metadata.
* Adding the `pipeline_tag` to the metadata, ensuring the model can be found at [https://huggingface.co/models?pipeline_tag=image-text-to-text](https://huggingface.co/models?pipeline_tag=image-text-to-text) (see the discovery sketch after this list).
* Linking to the paper.
* Adding a brief description of the model.
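
As a quick illustration of why the `pipeline_tag` addition matters for discoverability, here is a minimal sketch that reproduces the filter behind the URL above. It assumes a recent `huggingface_hub` release whose `list_models()` accepts `pipeline_tag` and `limit` arguments; this is illustrative, not part of the commit.

```python
# Minimal sketch: list models carrying the image-text-to-text pipeline tag.
# Assumes a recent huggingface_hub where list_models() supports pipeline_tag and limit.
from huggingface_hub import HfApi

api = HfApi()
for model in api.list_models(pipeline_tag="image-text-to-text", limit=10):
    print(model.id)
```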

Files changed (1)
  1. README.md +5 -13
README.md CHANGED
````diff
@@ -2,9 +2,13 @@
 language:
 - en
 license: apache-2.0
+library_name: transformers
+pipeline_tag: image-text-to-text
 ---
+
 This is BLIP3o-8B checkpoint trained on the **open source** data.
 
+This model is a visual language model based on the paper [BLIP3-o: A Family of Fully Open Unified Multimodal Models-Architecture, Training and Dataset](https://huggingface.co/papers/2505.09568). The code is available at https://github.com/JiuhaiChen/BLIP3o.
 
 | Model | Pretrain Data | GenEval | DBP | WISE |
 |---------------------|-----------------------------------------------------------|---------|--------|------|
@@ -12,8 +16,6 @@ This is BLIP3o-8B checkpoint trained on the **open source** data.
 | 8B (open source) | 30 million open-source data | 0.83 | 80.73 | 0.52 |
 | 8B (paper reported) | 30 million open-source + 30 million proprietary data | 0.84 | 81.60 | 0.62 |
 
-
-
 Here is the category results for WISE.
 
 | Model | Pretrain Data | Cultural | Time | Space | Biology | Physics | Chemistry | Overall |
@@ -21,16 +23,6 @@ Here is the category results for WISE.
 | 8B (open source) | 30 million open-source data | 0.49 | 0.51 | 0.63 | 0.54 | 0.63 | 0.37 | 0.52 |
 | 8B (paper reported) | 30 million open-source + 30 million proprietary data | 0.63 | 0.57 | 0.70 | 0.62 | 0.66 | 0.51 | 0.62 |
 
-
-
-
-
-
-
-
-
-
-
 ### Download
 
 ```
@@ -57,4 +49,4 @@ Launch with your model path:
 
 ```
 python app.py /path/to/your/model
-```
+```
````
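
For reviewers who want to try the card's launch instructions end to end, here is a minimal, non-authoritative sketch that fetches the checkpoint with `huggingface_hub` before running the demo from the linked GitHub repo; the repo id `BLIP3o/BLIP3o-8B` is an assumption, not something stated in this diff.

```python
# Hedged sketch: download the checkpoint locally, then launch the Gradio demo
# (app.py) from https://github.com/JiuhaiChen/BLIP3o with the downloaded path.
# The repo id below is assumed, not confirmed by this commit.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="BLIP3o/BLIP3o-8B")
print(f"Run: python app.py {local_dir}")
```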