Xinhe-Li committed on
Commit 6d472af · verified · 1 Parent(s): c23eb42

Update README.md

Files changed (1):
  1. README.md (+46 -3)
---
license: apache-2.0
datasets:
- LoRID-Math/MATH
language:
- en
metrics:
- accuracy
base_model:
- mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
library_name: peft
tags:
- math
- reasoning
---

# LoRID: A Reasoning Distillation Method via Multi-LoRA Interaction

📃 [Paper](https://arxiv.org/abs/2508.13037) • 💻 [Code](https://github.com/Xinhe-Li/LoRID) • 🤗 [HF Repo](https://huggingface.co/LoRID-Math)

## Abstract

The models for "[Can Large Models Teach Student Models to Solve Mathematical Problems Like Human Beings? A Reasoning Distillation Method via Multi-LoRA Interaction](https://arxiv.org/abs/2508.13037)" (IJCAI 2025).

## Key Contributions

- We focus on the mathematical reasoning distillation task and propose **LoRID**, a novel method inspired by the way human teachers teach and students learn.
- We introduce knowledge during data augmentation and propose multi-LoRA interaction during model distillation, which improves the student's reasoning abilities.
- Experimental results show that, through the interaction between System 1 and System 2, **LoRID** outperforms previous state-of-the-art approaches and can be easily and effectively integrated into any Chain-of-Thought distillation method.
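The multi-LoRA idea above (separate adapters for a fast System 1 and a deliberate System 2 sharing one frozen base model) can be illustrated with a toy sketch. This is a hand-rolled, pure-Python illustration of the standard LoRA update `y = Wx + alpha * B(Ax)` with per-call adapter selection; it is not the paper's implementation, and all names, shapes, and numbers are made up:

```python
# Toy illustration (NOT the paper's implementation): two LoRA adapters
# sharing one frozen base weight, selected per "system".

def matvec(M, x):
    """Multiply matrix M (list of rows) by vector x."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def lora_forward(W, adapters, name, x, alpha=1.0):
    """y = W x + alpha * B (A x), using the adapter selected by `name`."""
    A, B = adapters[name]
    base = matvec(W, x)                  # frozen base path
    low_rank = matvec(B, matvec(A, x))   # rank-1 adapter path
    return [b + alpha * d for b, d in zip(base, low_rank)]

# Frozen 2x2 base weight shared by both adapters.
W = [[1.0, 0.0],
     [0.0, 1.0]]

# Rank-1 adapters: A maps R^2 -> R^1, B maps R^1 -> R^2.
adapters = {
    "system1": ([[1.0, 1.0]], [[0.5], [0.0]]),   # fast "answer" adapter
    "system2": ([[1.0, -1.0]], [[0.0], [0.5]]),  # slow "reasoning" adapter
}

x = [2.0, 1.0]
print(lora_forward(W, adapters, "system1", x))  # -> [3.5, 1.0]
print(lora_forward(W, adapters, "system2", x))  # -> [2.0, 1.5]
```

In the actual models, the base weights come from `mistralai/Mistral-7B-v0.1` and the adapters are PEFT LoRA modules; only the low-rank matrices differ between systems, so switching adapters is cheap.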
## Citation

If this work is helpful, please cite it as:

```bibtex
@misc{li2025largemodelsteachstudent,
  title={Can Large Models Teach Student Models to Solve Mathematical Problems Like Human Beings? A Reasoning Distillation Method via Multi-LoRA Interaction},
  author={Xinhe Li and Jiajun Liu and Peng Wang},
  year={2025},
  eprint={2508.13037},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2508.13037},
}
```