Coobiw committed
Commit 52f248f · verified · Parent(s): 2f8d537

Update README.md

Files changed (1):
  README.md (+39, -38)
README.md CHANGED: the commit adds `library_name: transformers` to the YAML front matter; the rest of the file is unchanged. The updated README:
---
license: apache-2.0
pipeline_tag: visual-question-answering
library_name: transformers
---

<p align="center">
    <b><font size="6">ChartMoE</font></b>
</p>

<div align="center">

[Project Page](https://chartmoe.github.io/)

[GitHub Repo](https://github.com/IDEA-FinAI/ChartMoE)

</div>

![](teaser.png)

**ChartMoE** is a multimodal large language model with a Mixture-of-Experts (MoE) connector, built on [InternLM-XComposer2](https://github.com/InternLM/InternLM-XComposer/tree/main/InternLM-XComposer-2.0), for advanced chart understanding, translation, and editing.
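
To make the connector idea concrete, here is a toy sketch of a generic MoE connector: a router gates several MLP projectors that map vision-encoder features into the LLM embedding space. All dimensions, the expert count, and the top-k value are illustrative placeholders, not ChartMoE's actual configuration; see the project page and paper for the real design.

```python
import torch
import torch.nn as nn

class ToyMoEConnector(nn.Module):
    """Toy gated mixture of MLP projectors (illustrative, not ChartMoE's code)."""

    def __init__(self, vis_dim=1024, llm_dim=4096, num_experts=4, top_k=2):
        super().__init__()
        # Each expert projects vision features into the LLM embedding space.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(vis_dim, llm_dim), nn.GELU(), nn.Linear(llm_dim, llm_dim))
            for _ in range(num_experts)
        )
        self.router = nn.Linear(vis_dim, num_experts)  # per-token routing logits
        self.top_k = top_k
        self.llm_dim = llm_dim

    def forward(self, x):  # x: (batch, tokens, vis_dim)
        weights, idx = self.router(x).topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)  # normalize over the selected experts
        out = x.new_zeros(*x.shape[:-1], self.llm_dim)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

connector = ToyMoEConnector()
tokens = connector(torch.randn(1, 256, 1024))  # -> (1, 256, 4096)
```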

## Import from Transformers
To load the ChartMoE model using Transformers, use the following code:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

ckpt_path = "IDEA-FinAI/chartmoe"
# ChartMoE ships custom modeling code, so trust_remote_code=True is required.
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
# Cast to half precision, move to the GPU, and switch to inference mode.
model = AutoModelForCausalLM.from_pretrained(ckpt_path, trust_remote_code=True).half().cuda().eval()
```
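
After loading, a minimal inference call might look like the sketch below. It assumes the remote code exposes the InternLM-XComposer2-style `model.chat` interface (a query containing an `<ImageHere>` placeholder plus an image path); `chart.png` is a placeholder, and the exact API is documented in the GitHub repo.

```python
# Minimal inference sketch; assumes the InternLM-XComposer2-style `chat`
# interface from the remote code. "chart.png" is a placeholder path.
query = "<ImageHere> Convert this chart to a markdown table."
with torch.no_grad():
    response, _ = model.chat(tokenizer, query=query, image="chart.png",
                             history=[], do_sample=False)
print(response)
```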

## Quickstart & Gradio Demo
We provide a simple example and a Gradio web UI demo that show how to use ChartMoE. Please refer to [https://github.com/IDEA-FinAI/ChartMoE](https://github.com/IDEA-FinAI/ChartMoE).
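
For reference, a web UI can be wired up along the following lines. This is a hypothetical sketch built on the `chat` call above, not the official demo from the repo; the `answer` function and its inputs are illustrative names.

```python
import gradio as gr

def answer(image_path, question):
    # Assumes the `model` and `tokenizer` loaded above and the
    # InternLM-XComposer2-style `chat` interface.
    query = f"<ImageHere> {question}"
    response, _ = model.chat(tokenizer, query=query, image=image_path,
                             history=[], do_sample=False)
    return response

demo = gr.Interface(
    fn=answer,
    inputs=[gr.Image(type="filepath", label="Chart"), gr.Textbox(label="Question")],
    outputs=gr.Textbox(label="Answer"),
    title="ChartMoE",
)
demo.launch()
```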

## Open Source License
The code is licensed under Apache-2.0.