---
language:
- en
tags:
- retrieval
- document-rewriting
datasets:
- irds:msmarco-passage
library_name: transformers
---

A DeepCT model based on `bert-base-uncased` and trained on MS MARCO. This is a version of [the checkpoint released by the original authors](http://boston.lti.cs.cmu.edu/appendices/arXiv2019-DeepCT-Zhuyun-Dai/outputs/marco.zip), converted to PyTorch format and ready for use in PyTerrier.
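## Usage

DeepCT estimates a context-dependent importance weight for each term in a passage; these weights can replace raw term frequencies in an inverted index, so that a standard ranker such as BM25 serves as the first-stage retriever [Dai19]. The snippet below is a minimal indexing sketch in PyTerrier [Macdonald20]. It assumes the `pyterrier_deepct` plugin and a `DeepCT` transformer class that loads this checkpoint; the exact import path, constructor arguments, and output format may differ between plugin versions, so check the plugin's documentation.

```python
# Minimal sketch only: the `pyterrier_deepct` import and the DeepCT() constructor
# are assumptions about the plugin's API, not a verified recipe.
import os
import pyterrier as pt

pt.init()

from pyterrier_deepct import DeepCT  # assumed plugin transformer class

# Assumption: DeepCT() loads this checkpoint (or accepts it via an argument) and
# rewrites each passage so that term occurrences reflect the predicted importance.
deepct = DeepCT()

dataset = pt.get_dataset('irds:msmarco-passage')
index_path = os.path.abspath('./deepct_index')

# Index the DeepCT-weighted passages, then retrieve with BM25 as the first stage.
indexer = deepct >> pt.IterDictIndexer(index_path)
indexer.index(dataset.get_corpus_iter())
bm25 = pt.BatchRetrieve(index_path, wmodel='BM25')
```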
## References

- [Dai19]: Zhuyun Dai, Jamie Callan. Context-Aware Sentence/Passage Term Importance Estimation For First Stage Retrieval. https://arxiv.org/abs/1910.10687
- [Macdonald20]: Craig Macdonald, Nicola Tonellotto. Declarative Experimentation in Information Retrieval using PyTerrier. In Proceedings of ICTIR 2020. https://arxiv.org/abs/2007.14271