Update model card with FinTeam paper and add pipeline/library tags
This PR improves the model card for `Go4miii/DISC-FinLLM` by:
- Associating it with the paper **[FinTeam: A Multi-Agent Collaborative Intelligence System for Comprehensive Financial Scenarios](https://huggingface.co/papers/2507.10448)**, clarifying its role as a component within the FinTeam system.
- Correcting and expanding the header links to include the `FinTeam` paper, the correct `DISC-FinLLM` technical report, and the GitHub repository.
- Adding `pipeline_tag: text-generation` to ensure discoverability for relevant tasks on the Hub.
- Adding `library_name: transformers` to enable the "Load with Transformers" widget, as the model is compatible with the library.
- Correcting the citation section to include both the `FinTeam` and `DISC-FinLLM` papers.

@@ -1,27 +1,29 @@
---
language:
- zh
license: apache-2.0
tags:
- finance
pipeline_tag: text-generation
library_name: transformers
---

This repository contains DISC-FinLLM, a version of Baichuan-13B-Chat that serves as a foundational large language model component within **[FinTeam: A Multi-Agent Collaborative Intelligence System for Comprehensive Financial Scenarios](https://huggingface.co/papers/2507.10448)**.

<div align="center">

[Demo](https://fin.fudan-disc.com) | [FinTeam Paper](https://huggingface.co/papers/2507.10448) | [DISC-FinLLM Technical Report](http://arxiv.org/abs/2310.15205) | [GitHub Repository](https://github.com/FudanDISC/DISC-FinLLM)

</div>

**Please note that due to the ongoing development of the project, the model weights in this repository may differ from those in our currently deployed demo.**


DISC-FinLLM is a large language model in the financial domain, specifically designed to provide users with professional, intelligent, and comprehensive **financial consulting services** in financial scenarios. It is developed and open sourced by the [Fudan University Data Intelligence and Social Computing Laboratory (Fudan-DISC)](http://fudan-disc.com). It is a multi-expert smart financial system composed of four modules for different financial scenarios: financial consulting, financial text analysis, financial calculation, and financial knowledge retrieval and question answering. These modules showed clear advantages in four evaluations covering financial NLP tasks, human test questions, data analysis, and current affairs analysis, demonstrating that DISC-FinLLM can provide strong support for a wide range of financial fields. DISC-FinLLM supports different application scenarios through the following functions:

* **Financial Consultation:** This module can hold multi-turn dialogues with users on financial topics in the Chinese financial context and explain relevant professional financial knowledge to users. It is trained on the financial consulting instructions portion of the dataset.
* **Financial Text Analysis:** This module can help users complete NLP tasks such as information extraction, sentiment analysis, text classification, and text generation on financial texts. It is trained on the financial task instructions in the dataset.
* **Financial Calculation:** This module can help users complete tasks involving mathematical calculations. In addition to basic calculations such as interest rates and growth rates, it supports statistical analysis and financial model calculations, including the Black-Scholes option pricing model and the EDF expected default probability model (see the sketch after this list). It is partially trained on the financial computing instructions in the dataset.
* **Financial Knowledge Retrieval Q&A:** This module can provide users with investment advice, current affairs analysis, and policy interpretation based on financial news, research reports, and related policy documents. It is partially trained on the retrieval-enhanced instructions in the dataset.

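As a concrete illustration of the kind of computation the Financial Calculation module targets, here is a minimal, self-contained Black-Scholes pricing sketch in plain Python. It is a textbook reference implementation of the standard formula, not code taken from DISC-FinLLM, and the parameter values in the example are arbitrary.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes(S, K, T, r, sigma, option="call"):
    """Price a European option with the Black-Scholes formula.

    S: spot price, K: strike, T: time to expiry in years,
    r: continuously compounded risk-free rate, sigma: annualized volatility.
    """
    N = NormalDist().cdf  # standard normal CDF
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    if option == "call":
        return S * N(d1) - K * exp(-r * T) * N(d2)
    return K * exp(-r * T) * N(-d2) - S * N(-d1)  # put via the mirrored formula

# Example: one-year at-the-money call, 20% volatility, 3% risk-free rate.
print(round(black_scholes(S=100, K=100, T=1.0, r=0.03, sigma=0.2), 2))
```
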

Check our [HOME](https://github.com/FudanDISC/DISC-FinLLM) for more information.

@@ -29,13 +31,13 @@

DISC-FinLLM is a large financial model built on the high-quality financial dataset DISC-Fin-SFT: we constructed DISC-Fin-SFT and used it for LoRA instruction fine-tuning of the general-domain Chinese large model Baichuan-13B-Chat (a minimal sketch of this setup follows the table below). DISC-Fin-SFT contains a total of about 250,000 samples, divided into four sub-datasets: financial consulting instructions, financial task instructions, financial computing instructions, and retrieval-enhanced instructions.


| Dataset | Samples | Input Length | Output Length |
|:----------------------------------|--------:|-------------:|--------------:|
| Financial Consulting Instructions | 63k     | 26           | 369           |
| Financial Task Instructions       | 110k    | 676          | 35            |
| Financial Computing Instructions  | 57k     | 73           | 190           |
| Retrieval-enhanced Instructions   | 20k     | 1031         | 521           |
| DISC-Fin-SFT                      | 246k    | 351          | 198           |

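The LoRA fine-tuning step described above can be sketched with the `peft` library as follows. This is a minimal, illustrative setup only: the rank, alpha, dropout, and the `W_pack` target module are assumptions, not the actual DISC-FinLLM training configuration.

```python
# Illustrative LoRA setup on the Baichuan-13B-Chat base model
# (assumed hyperparameters; not the DISC-FinLLM training recipe).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "baichuan-inc/Baichuan-13B-Chat"
tokenizer = AutoTokenizer.from_pretrained(base, use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    base, torch_dtype=torch.float16, trust_remote_code=True
)

lora_config = LoraConfig(
    r=16,                       # assumed rank
    lora_alpha=32,              # assumed scaling factor
    lora_dropout=0.05,          # assumed dropout
    target_modules=["W_pack"],  # Baichuan's fused QKV projection (assumption)
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# Supervised fine-tuning on the DISC-Fin-SFT instruction data would then
# proceed with a standard causal-LM training loop (e.g. transformers Trainer).
```
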

# Using through Hugging Face Transformers

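A minimal loading sketch is shown below, assuming the standard `trust_remote_code` interface that Baichuan-13B-Chat-based checkpoints expose; the `model.chat` helper and generation-config handling come from that remote code and are assumptions about this repository's exact API rather than guarantees.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

model_id = "Go4miii/DISC-FinLLM"
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype=torch.float16, trust_remote_code=True
)
model.generation_config = GenerationConfig.from_pretrained(model_id)

# Baichuan-style chat interface provided by the repository's remote code
# (an assumption; fall back to model.generate if `chat` is unavailable).
messages = [{"role": "user", "content": "请解释一下什么是银行的不良资产？"}]  # "Please explain what a bank's non-performing assets are."
response = model.chat(tokenizer, messages)
print(response)
```
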

@@ -60,17 +62,22 @@ DISC-FinLLM has problems and shortcomings that cannot be overcome by current large language models.

If our project has been helpful for your research and work, please kindly cite our work as follows:

```bibtex
@article{tan2025finteam,
  title={FinTeam: A Multi-Agent Collaborative Intelligence System for Comprehensive Financial Scenarios},
  author={Tan, Zhenxiong and Liu, Songhua and Yang, Xingyi and Xue, Qiaochu and Wang, Xinchao},
  journal={arXiv preprint arXiv:2507.10448},
  year={2025}
}

@article{chen2023disc,
  title={DISC-FinLLM: A Chinese Financial Large Language Model based on Multiple Experts Fine-tuning},
  author={Chen, Wei and Wang, Qiushi and Long, Zefei and Zhang, Xianyin and Lu, Zhongtian and Li, Bingxuan and Wang, Siyuan and Xu, Jiarong and Bai, Xiang and Huang, Xuanjing and Wei, Zhongyu},
  journal={arXiv preprint arXiv:2310.15205},
  year={2023}
}
```

## License

The use of the source code in this repository complies with the Apache 2.0 License.