YummyYum committed (verified)
Commit 712dfde · Parent(s): e8e2efb

Update README.md

Files changed (1)
  1. README.md +21 -40
README.md CHANGED
@@ -35,26 +35,29 @@ We use a variety of Triton-implemented operation kernels—approximately 70%—t
 
 | | Usage | Nvidia |
 | ----------- | ------------------------------------------------------ | ------------------------------------------------------------ |
-| Basic Image | basic software environment that supports model running | 'docker pull docker pull flagrelease-registry.cn-beijing.cr.aliyuncs.com/flagrelease/flagrelease:deepseek-flagos-nvidia |
+| Basic Image | basic software environment that supports model running | 'docker pull flagrelease-registry.cn-beijing.cr.aliyuncs.com/flagrelease/flagrelease:deepseek-flagos-nvidia |
+
 # Evaluation Results
 
 ## Benchmark Result
 
-| Metrics | MiniCPM_o_2.6-H100-CUDA | MiniCPM_o_2.6-H100-FlagOS |
-|:-------------------|-----------------------|--------------------------|
-| mmmu_val | 48.11 | 48.33 |
-| math_vision_test | 22.89 | 22.30 |
-| ocrbench_test | 85.80 | 85.70 |
-| blink_val | 54.87 | 55.81 |
-| mmmvet_v2 | 57.66 | 59.03 |
-| mmmu_pro_vision_test | 70.46 | 69.77 |
-| mmmu_pro_standard_test | 30.46 | 30.81 |
-| cmmmu_val | 39.33 | 39.33 |
-| cii_bench_test | 50.07 | 50.33 |
+| Metrics                | MiniCPM_o_2.6-H100-CUDA | MiniCPM_o_2.6-H100-FlagOS |
+| :--------------------- | ----------------------- | ------------------------- |
+| mmmu_val               | 48.11                   | 48.33                     |
+| math_vision_test       | 22.89                   | 22.30                     |
+| ocrbench_test          | 85.80                   | 85.70                     |
+| blink_val              | 54.87                   | 55.81                     |
+| mmmvet_v2              | 57.66                   | 59.03                     |
+| mmmu_pro_vision_test   | 70.46                   | 69.77                     |
+| mmmu_pro_standard_test | 30.46                   | 30.81                     |
+| cmmmu_val              | 39.33                   | 39.33                     |
+| cii_bench_test         | 50.07                   | 50.33                     |
 
 
 # How to Run Locally
+
 ## 📌 Getting Started
+
 ### Download open-source weights
 
 ```
@@ -89,32 +92,6 @@ cd ../
 
 ### Modify the configuration
 
-```
-cd FlagScale/examples/minicpm_o_2.6/conf
-# Modify the configuration in config_minicpm_o_2.6.yaml
-defaults:
-  - _self_
-  - serve: minicpm_o_2.6
-experiment:
-  exp_name: minicpm_o_2.6
-  exp_dir: outputs/${experiment.exp_name}
-  task:
-    type: serve
-  deploy:
-    use_fs_serve: false
-  runner:
-    ssh_port: 22
-  envs:
-    CUDA_DEVICE_MAX_CONNECTIONS: 1
-  cmds:
-    before_start: source /root/miniconda3/bin/activate flagscale-inference && export USE_FLAGGEMS=1
-action: run
-hydra:
-  run:
-    dir: ${experiment.exp_dir}/hydra
-
-### Modify the configuration
-
 ```
 cd FlagScale/examples/minicpm_o_2.6/conf
 # Modify the configuration in config_minicpm_o_2.6.yaml
@@ -139,6 +116,7 @@ hydra:
   run:
     dir: ${experiment.exp_dir}/hydra
 ```
+
 ```
 cd FlagScale/examples/minicpm_o_2.6/conf/serve
 # Modify the configuration in minicpm_o_2.6.yaml
@@ -157,6 +135,7 @@ cd FlagScale/examples/minicpm_o_2.6/conf/serve
   enable_chunked_prefill: true
 
 ```
+
 ```
 # install flagscale
 cd FlagScale/
@@ -164,12 +143,13 @@ pip install .
 
 #【Verifiable on a single machine】
 ```
+
 ### Serve
 
 ```
 flagscale serve <Model>
 ```
-#
+
 # Contributing
 
 We warmly welcome global developers to join us:
@@ -182,10 +162,11 @@ We warmly welcome global developers to join us:
 # 📞 Contact Us
 
 Scan the QR code below to add our WeChat group
+
 send "FlagRelease"
 
 ![WeChat](image/group.png)
 
 # License
 
-This project and related model weights are licensed under the MIT License.
+This project and related model weights are licensed under the MIT License.
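For readers comparing the two columns of the updated benchmark table, a quick Python sketch that computes the per-metric and mean score deltas between the CUDA and FlagOS builds. The scores are copied from the table above; the evaluation harness itself is not part of this commit, and this summary is the editor's illustration, not tooling shipped with the repo:

```python
# Benchmark scores from the README table, as (CUDA, FlagOS) on H100.
scores = {
    "mmmu_val": (48.11, 48.33),
    "math_vision_test": (22.89, 22.30),
    "ocrbench_test": (85.80, 85.70),
    "blink_val": (54.87, 55.81),
    "mmmvet_v2": (57.66, 59.03),
    "mmmu_pro_vision_test": (70.46, 69.77),
    "mmmu_pro_standard_test": (30.46, 30.81),
    "cmmmu_val": (39.33, 39.33),
    "cii_bench_test": (50.07, 50.33),
}

# Positive delta means the FlagOS build scored higher than the CUDA build.
deltas = {name: round(flagos - cuda, 2) for name, (cuda, flagos) in scores.items()}
mean_delta = sum(deltas.values()) / len(deltas)

for name, delta in deltas.items():
    print(f"{name:<22} {delta:+.2f}")
print(f"mean delta: {mean_delta:+.2f}")
```

The two stacks track each other closely: the largest gap is mmmvet_v2 at +1.37, and the mean delta is about +0.20.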