MASTER: Multi-task Pre-trained Bottlenecked Masked Autoencoders are Better Dense Retrievers

Paper: https://arxiv.org/abs/2212.07841

Code: https://github.com/microsoft/SimXNS/tree/main/MASTER

Overview

This is the MASTER checkpoint after pre-training on the MS MARCO corpus. You can use it as the initialization for fine-tuning a dense retriever.

Usage

To load this checkpoint for initialization:

```python
from transformers import AutoModel

model = AutoModel.from_pretrained('lx865712528/master-base-pretrained-msmarco')
```
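After fine-tuning, a dense retriever such as MASTER scores a query-passage pair by the inner product of their embeddings. The sketch below illustrates only this scoring step; the embeddings themselves would come from pooling the encoder's output (e.g. the `[CLS]` hidden state), and all function names here are illustrative, not part of the released code:

```python
# Minimal sketch of dense-retrieval scoring. Embeddings are assumed to
# have been produced by the encoder loaded above; here they are plain
# lists of floats.

def dot_score(query_emb, passage_emb):
    """Relevance score = inner product of query and passage embeddings."""
    return sum(q * p for q, p in zip(query_emb, passage_emb))

def rank(query_emb, passage_embs):
    """Return passage indices sorted by descending relevance score."""
    scores = [dot_score(query_emb, p) for p in passage_embs]
    return sorted(range(len(passage_embs)), key=lambda i: -scores[i])
```

For example, with a toy query embedding `[1.0, 0.0]` and passages `[[0.0, 1.0], [1.0, 0.0]]`, `rank` returns `[1, 0]`: the second passage aligns with the query and scores highest.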
