---
license: mit
datasets:
  - wikipedia
language:
  - en
pipeline_tag: feature-extraction
---

# MASTER: Multi-task Pre-trained Bottlenecked Masked Autoencoders are Better Dense Retrievers

Paper: https://arxiv.org/abs/2212.07841

Code: https://github.com/microsoft/SimXNS/tree/main/MASTER

## Overview

This is the checkpoint obtained after pre-training on the Wikipedia corpora of Natural Questions (NQ), TriviaQA (TQ), WebQuestions (WQ), and SQuAD. You may use this checkpoint as the initialization for fine-tuning.

## Usage

To load this checkpoint for initialization, you can use `transformers`:

```python
from transformers import AutoModel

model = AutoModel.from_pretrained('lx865712528/master-base-pretrained-wiki')
```
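Since the pipeline tag is `feature-extraction`, a natural next step is encoding text into dense vectors. The sketch below is one plausible way to do this, assuming the [CLS] token embedding is used as the sentence representation (a common choice for dense retrievers; check the MASTER fine-tuning code for the exact pooling it uses):

```python
# Sketch: encode a query into a dense vector with the pretrained checkpoint.
# Assumes [CLS] pooling; verify against the official MASTER code.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = 'lx865712528/master-base-pretrained-wiki'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

queries = ["what is dense retrieval?"]
inputs = tokenizer(queries, padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)

# Take the hidden state of the first ([CLS]) token as the embedding.
embeddings = outputs.last_hidden_state[:, 0]
print(embeddings.shape)  # one vector per input text
```

These vectors can then be compared with document vectors (e.g. by dot product or cosine similarity) for retrieval after fine-tuning.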