---
license: mit
datasets:
- ms_marco
language:
- en
pipeline_tag: feature-extraction
---
# MASTER: Multi-task Pre-trained Bottlenecked Masked Autoencoders are Better Dense Retrievers
Paper: [https://arxiv.org/abs/2212.07841](https://arxiv.org/abs/2212.07841).
Code: [https://github.com/microsoft/SimXNS/tree/main/MASTER](https://github.com/microsoft/SimXNS/tree/main/MASTER).
## Overview
This is the MASTER checkpoint after pre-training on the MS MARCO corpus. **You may use this checkpoint as the initialization for fine-tuning.**
## Usage
To load this checkpoint for initialization:
```python
from transformers import AutoModel
model = AutoModel.from_pretrained('lx865712528/master-base-pretrained-msmarco')
```
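Once fine-tuned, the model is typically used as a dense retriever: queries and passages are encoded into fixed-size vectors and ranked by similarity. A minimal sketch of the scoring step with made-up embeddings (the pooling strategy and vectors here are illustrative assumptions, not taken from the paper; in practice the vectors would come from the model's `[CLS]` hidden state):

```python
import numpy as np

# Hypothetical [CLS] embeddings for one query and two passages.
# In practice: model(**inputs).last_hidden_state[:, 0] (an assumption,
# shown here only to illustrate the dot-product scoring).
query = np.array([0.1, 0.9, 0.2])
passages = np.array([
    [0.1, 0.8, 0.3],  # made-up "relevant" passage embedding
    [0.9, 0.0, 0.1],  # made-up "irrelevant" passage embedding
])

# Dense retrieval scores: dot product between query and each passage.
scores = passages @ query
best = int(np.argmax(scores))
print(scores, best)  # the first passage scores higher
```

At retrieval time the passage embeddings are usually precomputed and indexed (e.g. with FAISS), so only the query is encoded online.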