jingang committed (verified)
Commit f05d7c2 · 1 Parent(s): d20168d

Update README.md

Files changed (1): README.md (+20 -3)
README.md CHANGED
@@ -1,13 +1,30 @@
  ---
  license: bsd-3-clause
  pipeline_tag: tabular-classification
+ library_name: tabicl
  ---

  # TabICL: A Tabular Foundation Model for In-Context Learning on Large Data

- TabICL is an in-context learning classification model designed for tabular data like TabPFN,
- but is pre-trained on much larger datasets (up to 60K samples) and can handle even larger datasets
+ TabICL is a scalable tabular foundation model designed for classification tasks. Pre-trained on synthetic datasets with up to 60K samples, it can handle even larger datasets
  thanks to its memory-efficient inference.

+
+ ## Installation
+ ```bash
+ pip install tabicl
+ ```
+
+ The source code is available at [GitHub - soda-inria/tabicl](https://github.com/soda-inria/tabicl).
+
+ ## Citation
  If you use TabICL for research purposes,
- please cite our **[paper](https://arxiv.org/abs/2502.05564)**.
+ please cite our **[paper](https://arxiv.org/abs/2502.05564)**:
+ ```bibtex
+ @article{qu2025tabicl,
+ title={TabICL: A Tabular Foundation Model for In-Context Learning on Large Data},
+ author={Qu, Jingang and Holzm{\"u}ller, David and Varoquaux, Ga{\"e}l and Morvan, Marine Le},
+ journal={arXiv preprint arXiv:2502.05564},
+ year={2025}
+ }
+ ```
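
For context on the installation step added above, here is a minimal usage sketch. It assumes the `tabicl` package exposes a scikit-learn-style `TabICLClassifier` with `fit`/`predict`, as in the linked GitHub repository, and uses a standard scikit-learn dataset purely for illustration:

```python
# Minimal sketch -- assumes tabicl provides a scikit-learn-compatible
# TabICLClassifier (see the linked soda-inria/tabicl repository).
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

from tabicl import TabICLClassifier

# A small tabular classification dataset for illustration.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# In-context learning: fit() keeps the training set as context for the
# pre-trained model; prediction is inference only, with no gradient updates.
clf = TabICLClassifier()
clf.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```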