
MASTER: Multi-task Pre-trained Bottlenecked Masked Autoencoders are Better Dense Retrievers

Paper: https://arxiv.org/abs/2212.07841

Code: https://github.com/microsoft/SimXNS/tree/main/MASTER

Overview

This is the MASTER checkpoint after pre-training on the MS-MARCO corpus. You can use it as the initialization for fine-tuning on a downstream retrieval task.

Usage

To load this checkpoint for initialization:

```python
from transformers import AutoModel

model = AutoModel.from_pretrained('lx865712528/master-base-pretrained-msmarco')
```
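For dense retrieval, queries and passages are typically embedded with the encoder's [CLS] vector and scored by dot product. Below is a minimal sketch of that pattern; the `cls_embed` helper is illustrative and not part of the released code, and the tokenizer choice is an assumption (use the tokenizer shipped with the checkpoint if available).

```python
import torch

def cls_embed(model, tokenizer, texts):
    """Encode a list of texts and return L2-normalized [CLS] embeddings.

    Illustrative helper, not part of the MASTER release: assumes a
    BERT-style encoder whose output exposes `last_hidden_state`.
    """
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors='pt')
    with torch.no_grad():
        out = model(**batch)
    emb = out.last_hidden_state[:, 0]          # [CLS] token vector
    return torch.nn.functional.normalize(emb, dim=-1)

# Hypothetical usage (downloads the checkpoint from the Hub):
# from transformers import AutoModel, AutoTokenizer
# model = AutoModel.from_pretrained('lx865712528/master-base-pretrained-msmarco')
# tokenizer = AutoTokenizer.from_pretrained('lx865712528/master-base-pretrained-msmarco')
# q = cls_embed(model, tokenizer, ['what is dense retrieval?'])
# p = cls_embed(model, tokenizer, ['Dense retrieval maps text to vectors.'])
# score = (q @ p.T).item()  # dot-product relevance score
```

Note that this checkpoint is intended as an initialization; retrieval quality before fine-tuning will be limited.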
