---
pipeline_tag: text-generation
library_name: transformers
license: cc-by-nc-4.0
tags:
- text-to-sql
- reinforcement-learning
---

# SLM-SQL: An Exploration of Small Language Models for Text-to-SQL

### Important Links

📖[Arxiv Paper](https://arxiv.org/abs/2507.22478) |
🤗[HuggingFace](https://huggingface.co/collections/cycloneboy/slm-sql-688b02f99f958d7a417658dc) |
🤖[ModelScope](https://modelscope.cn/collections/SLM-SQL-624bb6a60e9643)

## News

+ `July 31, 2025`: Uploaded the models to ModelScope and Hugging Face.
+ `July 30, 2025`: Published the paper on arXiv.

## Introduction

> Large language models (LLMs) have demonstrated strong performance in translating natural language questions into SQL
> queries (Text-to-SQL). In contrast, small language models (SLMs) ranging from 0.5B to 1.5B parameters currently
> underperform on Text-to-SQL tasks due to their limited logical reasoning capabilities. However, SLMs offer inherent
> advantages in inference speed and suitability for edge deployment. To explore their potential in Text-to-SQL
> applications, we leverage recent advancements in post-training techniques. Specifically, we used the open-source
> SynSQL-2.5M dataset to construct two derived datasets: SynSQL-Think-916K for SQL generation and SynSQL-Merge-Think-310K
> for SQL merge revision. We then applied supervised fine-tuning and reinforcement learning-based post-training to the
> SLM, followed by inference using a corrective self-consistency approach. Experimental results validate the
> effectiveness and generalizability of our method, SLM-SQL. On the BIRD development set, the five evaluated models
> achieved an average improvement of 31.4 points. Notably, the 0.5B model reached 56.87% execution accuracy (EX), while
> the 1.5B model achieved 67.08% EX. We will release our dataset, model, and code on GitHub: https://github.com/CycloneBoy/slm_sql.

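At inference time, the corrective self-consistency step can be pictured as execution-grounded voting: several SQL candidates are sampled, grouped by their execution results, and a merge-revision model reconciles the leading answer groups. The sketch below is only an illustration of that idea under stated assumptions; the helper names (`execute_sql`, `merge_revise`) and the SQLite-based execution check are ours, not the authors' released implementation.

```python
import sqlite3
from collections import Counter

def execute_sql(db_path: str, sql: str):
    """Run a candidate query and return a hashable result set, or None on error."""
    try:
        with sqlite3.connect(db_path) as conn:
            rows = conn.execute(sql).fetchall()
        return frozenset(rows)  # order-insensitive comparison of result sets
    except Exception:
        return None

def corrective_self_consistency(candidates, db_path, merge_revise=None):
    """Pick the SQL whose execution result wins the vote among sampled candidates.

    candidates  : list of SQL strings sampled from the generation model
    merge_revise: optional callable (e.g. a merge-revision model) that takes two
                  disagreeing SQL candidates and returns a revised SQL string
    """
    results = {sql: execute_sql(db_path, sql) for sql in candidates}
    executable = {sql: res for sql, res in results.items() if res is not None}
    if not executable:
        return candidates[0]  # nothing executed; fall back to the first sample

    ranked = Counter(executable.values()).most_common(2)
    top_result = ranked[0][0]
    winner = next(sql for sql, res in executable.items() if res == top_result)

    # Corrective step: if a second answer group exists, let the merge-revision
    # model reconcile representatives of the top two groups and keep the
    # revision when it still executes.
    if merge_revise is not None and len(ranked) > 1:
        rival = next(sql for sql, res in executable.items() if res == ranked[1][0])
        revised = merge_revise(winner, rival)
        if execute_sql(db_path, revised) is not None:
            return revised
    return winner
```
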
### Framework

<img src="https://raw.githubusercontent.com/CycloneBoy/slm_sql/main/data/image/slmsql_framework.png" height="500" alt="slmsql_framework">

### Main Results

<img src="https://raw.githubusercontent.com/CycloneBoy/slm_sql/main/data/image/slmsql_bird_result.png" height="500" alt="slm_sql_result">

<img src="https://raw.githubusercontent.com/CycloneBoy/slm_sql/main/data/image/slmsql_bird_main.png" height="500" alt="slmsql_bird_main">

<img src="https://raw.githubusercontent.com/CycloneBoy/slm_sql/main/data/image/slmsql_spider_main.png" height="500" alt="slmsql_spider_main">

Performance comparison of different Text-to-SQL methods on the BIRD dev and test datasets.

<img src="https://raw.githubusercontent.com/CycloneBoy/slm_sql/main/data/image/slmsql_ablation_study.png" height="300" alt="slmsql_ablation_study">

## Model

| **Model**                                | Base Model                   | Train Method | Modelscope                                                                                        | HuggingFace                                                                                  |
|------------------------------------------|------------------------------|--------------|---------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------|
| SLM-SQL-Base-0.5B                        | Qwen2.5-Coder-0.5B-Instruct  | SFT          | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-Base-0.5B)                        | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-Base-0.5B)                        |
| SLM-SQL-0.5B                             | Qwen2.5-Coder-0.5B-Instruct  | SFT + GRPO   | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-0.5B)                             | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-0.5B)                             |
| CscSQL-Merge-Qwen2.5-Coder-0.5B-Instruct | Qwen2.5-Coder-0.5B-Instruct  | SFT + GRPO   | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/CscSQL-Merge-Qwen2.5-Coder-0.5B-Instruct) | [🤗 HuggingFace](https://huggingface.co/cycloneboy/CscSQL-Merge-Qwen2.5-Coder-0.5B-Instruct) |
| SLM-SQL-Base-1.5B                        | Qwen2.5-Coder-1.5B-Instruct  | SFT          | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-Base-1.5B)                        | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-Base-1.5B)                        |
| SLM-SQL-1.5B                             | Qwen2.5-Coder-1.5B-Instruct  | SFT + GRPO   | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-1.5B)                             | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-1.5B)                             |
| CscSQL-Merge-Qwen2.5-Coder-1.5B-Instruct | Qwen2.5-Coder-1.5B-Instruct  | SFT + GRPO   | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/CscSQL-Merge-Qwen2.5-Coder-1.5B-Instruct) | [🤗 HuggingFace](https://huggingface.co/cycloneboy/CscSQL-Merge-Qwen2.5-Coder-1.5B-Instruct) |
| SLM-SQL-Base-0.6B                        | Qwen3-0.6B                   | SFT          | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-Base-0.6B)                        | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-Base-0.6B)                        |
| SLM-SQL-0.6B                             | Qwen3-0.6B                   | SFT + GRPO   | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-0.6B)                             | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-0.6B)                             |
| SLM-SQL-Base-1.3B                        | deepseek-coder-1.3b-instruct | SFT          | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-Base-1.3B)                        | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-Base-1.3B)                        |
| SLM-SQL-1.3B                             | deepseek-coder-1.3b-instruct | SFT + GRPO   | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-1.3B)                             | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-1.3B)                             |
| SLM-SQL-Base-1B                          | Llama-3.2-1B-Instruct        | SFT          | [🤖 Modelscope](https://modelscope.cn/models/cycloneboy/SLM-SQL-Base-1B)                          | [🤗 HuggingFace](https://huggingface.co/cycloneboy/SLM-SQL-Base-1B)                          |

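The released checkpoints can be used with the standard `transformers` text-generation API declared in the card metadata. The snippet below is a minimal sketch: the checkpoint id comes from the table above, but the prompt layout (how the schema and question are serialized) is an assumption and may differ from the format used in the paper.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cycloneboy/SLM-SQL-0.5B"  # any checkpoint from the table above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Assumed prompt layout: database schema followed by the natural-language question.
prompt = (
    "Database schema:\n"
    "CREATE TABLE singer(singer_id INTEGER PRIMARY KEY, name TEXT, age INTEGER);\n\n"
    "Question: List the names of singers older than 30.\n"
    "Write a SQL query that answers the question."
)

messages = [{"role": "user", "content": prompt}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```
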
## Dataset

| **Dataset**                | Modelscope                                                                          | HuggingFace                                                                           |
|----------------------------|-------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------|
| SynsQL-Think-916k          | [🤖 Modelscope](https://modelscope.cn/datasets/cycloneboy/SynsQL-Think-916k)       | [🤗 HuggingFace](https://huggingface.co/datasets/cycloneboy/SynsQL-Think-916k)       |
| SynsQL-Merge-Think-310k    | [🤖 Modelscope](https://modelscope.cn/datasets/cycloneboy/SynsQL-Merge-Think-310k) | [🤗 HuggingFace](https://huggingface.co/datasets/cycloneboy/SynsQL-Merge-Think-310k) |
| BIRD train and dev dataset | [🤖 Modelscope](https://modelscope.cn/datasets/cycloneboy/bird_train)              | [🤗 HuggingFace](https://huggingface.co/datasets/cycloneboy/bird_train)              |

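Both derived datasets are hosted on the Hugging Face Hub, so they can be pulled with the `datasets` library. A minimal sketch follows; the repo id is taken from the table above, while the available splits and per-example fields are not documented here, so inspect the returned object first.

```python
from datasets import load_dataset

# Repo id from the table above; splits and column names are not documented here,
# so print the DatasetDict to see what the dataset actually contains.
ds = load_dataset("cycloneboy/SynsQL-Think-916k")
print(ds)  # lists splits, columns, and row counts
```
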
## TODO

- [ ] Release inference code
- [ ] Upload models
- [ ] Release training code
- [ ] Fix bugs
- [ ] Update documentation

## Thanks to the following projects

- [csc_sql](https://github.com/CycloneBoy/csc_sql)
- [open-r1](https://github.com/huggingface/open-r1)
- [OmniSQL](https://github.com/RUCKBReasoning/OmniSQL)

## Citation

```bibtex
@misc{sheng2025slmsqlexplorationsmalllanguage,
      title={SLM-SQL: An Exploration of Small Language Models for Text-to-SQL},
      author={Lei Sheng and Shuai-Shuai Xu},
      year={2025},
      eprint={2507.22478},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2507.22478},
}

@misc{sheng2025cscsqlcorrectiveselfconsistencytexttosql,
      title={CSC-SQL: Corrective Self-Consistency in Text-to-SQL via Reinforcement Learning},
      author={Lei Sheng and Shuai-Shuai Xu},
      year={2025},
      eprint={2505.13271},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.13271},
}
```