eval #5
by jlzhou - opened

README.md CHANGED
@@ -61,7 +61,6 @@ This code snippet demonstrates how to build a prompt with table information, and
 > pip install transformers>=4.37.0
 > ```

-
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer

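The body of this Python snippet is elided between this hunk and the next, which picks up at `generated_ids = [`. For orientation, here is a minimal self-contained sketch of the usual chat-template generation flow that the visible fragments suggest; the model id, the sample table, and the generation length are assumptions, not taken from the diff:

```python
import pandas as pd
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint id; the diff itself does not show it.
model_name = "tablegpt/TableGPT2-7B"

model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype="auto", device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Build a prompt that embeds a small table rendered as text.
df = pd.DataFrame({"name": ["Alice", "Bob"], "age": [30, 25]})
question = "Who is older?"
prompt = f"Given the following table:\n{df.to_string(index=False)}\n\nQuestion: {question}"

messages = [{"role": "user", "content": prompt}]
text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

# Generate, then strip the prompt tokens from the output, matching the
# `generated_ids = [...]` fragment visible in the hunk header below.
generated_ids = model.generate(**model_inputs, max_new_tokens=512)
generated_ids = [
    output_ids[len(input_ids):]
    for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
```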
@@ -124,19 +123,12 @@ generated_ids = [
 response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
 ```

-**Complex Usage Scenarios**
-
-For complex usage scenarios, we provide a [tablegpt-agent](https://github.com/tablegpt/tablegpt-agent) toolkit to help you more conveniently handle various types of tabular inputs.
-
-This agent is built on top of the `Langgraph` library and provides a user-friendly interface for interacting with `TableGPT2`.
-
-
 **Deployment**

 For deployment, we recommend using vLLM.
 * **Install vLLM**: You can install vLLM by running the following command.
 ```bash
-pip install "vllm>=0.
+pip install "vllm>=0.4.3"
 ```
 * **Model Deployment**: Use vLLM to deploy your model. For example, you can use the following command to set up an OpenAI-compatible server:
 ```bash
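The deployment command itself falls between this hunk and the next. As a sketch only: one way to launch vLLM's OpenAI-compatible server with the 0.4.x-era entrypoint; the checkpoint id and served model name are assumptions, not shown in the diff:

```bash
# Serve the model behind an OpenAI-compatible HTTP API (default port 8000).
# The checkpoint id below is an assumption, not taken from the diff.
python -m vllm.entrypoints.openai.api_server \
    --model tablegpt/TableGPT2-7B \
    --served-model-name TableGPT2-7B
```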
@@ -158,7 +150,6 @@ For deployment, we recommend using vLLM.
 ```
 For more details about how to use TableGPT2, please refer to [our repository on GitHub](https://github.com/tablegpt/tablegpt-agent).

-
 **License**

 TableGPT2-7B is under the Apache-2.0 license.
@@ -167,7 +158,7 @@ TableGPT2-7B is under the Apache-2.0 license.

 **Research Paper**

-TableGPT2-7B is introduced and validated in the paper "[TableGPT2: A Large Multimodal Model with Tabular Data Integration](
+TableGPT2-7B is introduced and validated in the paper "[TableGPT2: A Large Multimodal Model with Tabular Data Integration](URL_TODO)" available on arXiv.

 **Where to send questions or comments about the model**

@@ -214,20 +205,12 @@ Evaluation has shown that TableGPT2-7B performs consistently well across benchmarks
 | TableBench | DP | - | 26.62 | 26.44 | 26.71 | 26.73 | 26.15 | 3.88 | 29.60 | 21.94 | 28.67 | 25.18 | 32.03 | **38.90** |
 | TableBench | TCoT | - | 37.08 | 31.33 | 29.79 | 30.01 | 28.65 | 3.85 | 30.93 | 22.8 | 36.25 | 29.77 | 42.34 | **50.06** |
 | TableBench | SCoT | - | 14.11 | 17.78 | 9.60 | 12.38 | 22.39 | 2.88 | 22.61 | 8.43 | 25.95 | 24.35 | 25.01 | **30.47** |
-| TableBench | PoT@1 | - | 21.05 | 26.39 |
+| TableBench | PoT@1 | - | 21.05 | 26.39 | | | | | | | | | | |

 ## Citation

 If you find our work helpful, please cite us by

-```
-@misc{su2024tablegpt2largemultimodalmodel,
-      title={TableGPT2: A Large Multimodal Model with Tabular Data Integration},
-      author={Aofeng Su and Aowen Wang and Chao Ye and Chen Zhou and Ga Zhang and Guangcheng Zhu and Haobo Wang and Haokai Xu and Hao Chen and Haoze Li and Haoxuan Lan and Jiaming Tian and Jing Yuan and Junbo Zhao and Junlin Zhou and Kaizhe Shou and Liangyu Zha and Lin Long and Liyao Li and Pengzuo Wu and Qi Zhang and Qingyi Huang and Saisai Yang and Tao Zhang and Wentao Ye and Wufang Zhu and Xiaomeng Hu and Xijun Gu and Xinjie Sun and Xiang Li and Yuhang Yang and Zhiqing Xiao},
-      year={2024},
-      eprint={2411.02059},
-      archivePrefix={arXiv},
-      primaryClass={cs.LG},
-      url={https://arxiv.org/abs/2411.02059},
-}
+```
+XXX
 ```
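Since the README sets up an OpenAI-compatible server, a quick client-side check can go through the standard `openai` Python client; the base URL, port, and model name below assume the launch sketch above:

```python
from openai import OpenAI

# vLLM's OpenAI-compatible server accepts any API key by default.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

table_text = "name  age\nAlice  30\nBob    25"
resp = client.chat.completions.create(
    model="TableGPT2-7B",  # must match --served-model-name above
    messages=[
        {
            "role": "user",
            "content": f"Given the table:\n{table_text}\n\nWho is older, Alice or Bob?",
        }
    ],
)
print(resp.choices[0].message.content)
```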