Update README.md
README.md
CHANGED
---
license: apache-2.0
pipeline_tag: visual-question-answering
library_name: transformers
---

<p align="center">
<b><font size="6">ChartMoE</font></b>
</p>

<div align="center">

[Project Page](https://chartmoe.github.io/)

[GitHub Repo](https://github.com/IDEA-FinAI/ChartMoE)

</div>



**ChartMoE** is a multimodal large language model with a Mixture-of-Experts connector, based on [InternLM-XComposer2](https://github.com/InternLM/InternLM-XComposer/tree/main/InternLM-XComposer-2.0), for advanced chart understanding, translation, and editing.
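
To make that idea concrete, below is a conceptual sketch of what a Mixture-of-Experts connector looks like: a router softly combines several MLP experts that project vision-encoder features into the LLM embedding space. This is an illustrative toy, not ChartMoE's actual implementation; all names and dimensions here are made up.

```python
import torch
import torch.nn as nn

class MoEConnector(nn.Module):
    """Toy MoE connector sketch (illustrative only, not ChartMoE's code)."""
    def __init__(self, vis_dim=1024, llm_dim=4096, num_experts=4):
        super().__init__()
        # Router scores each visual token against every expert
        self.router = nn.Linear(vis_dim, num_experts)
        # Each expert is a small MLP projecting vision features to LLM space
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(vis_dim, llm_dim), nn.GELU(), nn.Linear(llm_dim, llm_dim))
            for _ in range(num_experts)
        )

    def forward(self, x):                        # x: (batch, tokens, vis_dim)
        weights = self.router(x).softmax(-1)     # (batch, tokens, num_experts)
        outs = torch.stack([e(x) for e in self.experts], dim=-2)  # (b, t, E, llm_dim)
        # Weighted sum over experts yields one embedding per visual token
        return (weights.unsqueeze(-1) * outs).sum(-2)             # (b, t, llm_dim)

x = torch.randn(1, 256, 1024)
print(MoEConnector()(x).shape)  # torch.Size([1, 256, 4096])
```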

## Import from Transformers
To load the ChartMoE model with Transformers, use the following code:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

ckpt_path = "IDEA-FinAI/chartmoe"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
# Load weights in half precision, move to GPU, and switch to eval mode for inference
model = AutoModelForCausalLM.from_pretrained(ckpt_path, trust_remote_code=True).half().cuda().eval()
```
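
Once loaded, you can query a chart image. The following is a minimal sketch, assuming ChartMoE inherits the `model.chat` interface of its InternLM-XComposer2 base; the image path and question are placeholders.

```python
import torch

# Minimal inference sketch: assumes `model` and `tokenizer` are loaded as above
# and that ChartMoE exposes InternLM-XComposer2's `model.chat` interface.
query = "<ImageHere>What is the highest value shown in this chart?"  # placeholder question
image = "./example_chart.png"  # hypothetical local image path

with torch.no_grad(), torch.cuda.amp.autocast():
    response, _ = model.chat(tokenizer, query=query, image=image,
                             history=[], do_sample=False)
print(response)
```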

## Quickstart & Gradio Demo
We provide a simple example and a Gradio web UI demo showing how to use ChartMoE. Please refer to [https://github.com/IDEA-FinAI/ChartMoE](https://github.com/IDEA-FinAI/ChartMoE).
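
For orientation, a minimal Gradio wrapper might look like the sketch below. This is a hypothetical illustration only, not the official demo (which lives in the GitHub repo); it assumes `model` and `tokenizer` are loaded as in the snippet above and that `model.chat` is available.

```python
import gradio as gr
import torch

# Hypothetical minimal wrapper for illustration; see the GitHub repo for the
# official demo. Assumes `model` and `tokenizer` are already loaded.
def answer(chart, question):
    with torch.no_grad(), torch.cuda.amp.autocast():
        response, _ = model.chat(tokenizer, query="<ImageHere>" + question,
                                 image=chart, history=[], do_sample=False)
    return response

demo = gr.Interface(
    fn=answer,
    inputs=[gr.Image(type="filepath", label="Chart image"),
            gr.Textbox(label="Question")],
    outputs=gr.Textbox(label="Answer"),
    title="ChartMoE",
)

if __name__ == "__main__":
    demo.launch()
```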

## Open Source License
The code is licensed under Apache-2.0.