TabICL: A Tabular Foundation Model for In-Context Learning on Large Data

TabICL is a scalable tabular foundation model designed for classification tasks. Pre-trained on synthetic datasets with up to 60K samples, it can handle even larger datasets thanks to its memory-efficient inference.

Installation

pip install tabicl

The source code is available on GitHub at https://github.com/soda-inria/tabicl.
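For reference, a minimal usage sketch is shown below. It assumes the scikit-learn-style TabICLClassifier interface exposed by the tabicl package (fit on the training data as in-context examples, then predict in a single forward pass); the dataset and hyperparameter choices are illustrative only, so check the GitHub README for the exact API of your installed version.

# Minimal usage sketch (assumes the scikit-learn-style TabICLClassifier API)
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

from tabicl import TabICLClassifier

# Load a small tabular classification dataset.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# In-context learning: fit() stores the training set as context;
# predict() runs inference without any gradient updates.
clf = TabICLClassifier()
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)
print(f"Test accuracy: {accuracy_score(y_test, y_pred):.3f}")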

Citation

If you use TabICL for research purposes, please cite our paper:

@article{qu2025tabicl,
  title={TabICL: A Tabular Foundation Model for In-Context Learning on Large Data},
  author={Qu, Jingang and Holzm{\"u}ller, David and Varoquaux, Ga{\"e}l and Le Morvan, Marine},
  journal={arXiv preprint arXiv:2502.05564},
  year={2025}
}