Update README.md
This model was first pretrained on the BEIR corpus and then fine-tuned on the MS MARCO dataset, following the approach described in the paper **COCO-DR: Combating Distribution Shifts in Zero-Shot Dense Retrieval with Contrastive and Distributionally Robust Learning**. The associated GitHub repository is available at https://github.com/OpenMatch/COCO-DR.

This model uses BERT-large as its backbone, with 335M parameters. See the paper https://arxiv.org/abs/2210.15212 for details.

## Usage

Pre-trained models can be loaded through the HuggingFace transformers library:

```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("OpenMatch/cocodr-large-msmarco")
tokenizer = AutoTokenizer.from_pretrained("OpenMatch/cocodr-large-msmarco")
```
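
Once loaded, the model can embed queries and passages for retrieval. Below is a minimal scoring sketch; it assumes the [CLS] (first-token) hidden state is used as the sequence embedding and the dot product as the relevance score, in line with the setup described in the COCO-DR paper, and the query and passages are invented for illustration:

```python
import torch
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("OpenMatch/cocodr-large-msmarco")
tokenizer = AutoTokenizer.from_pretrained("OpenMatch/cocodr-large-msmarco")
model.eval()

# Hypothetical query and passages, for illustration only.
query = "what is dense retrieval"
passages = [
    "Dense retrieval encodes queries and documents as vectors and matches them by similarity.",
    "The weather today is sunny with a light breeze.",
]

with torch.no_grad():
    q_inputs = tokenizer(query, return_tensors="pt", truncation=True)
    p_inputs = tokenizer(passages, return_tensors="pt", padding=True, truncation=True)
    # Assumption: the [CLS] (first-token) hidden state serves as the sequence embedding.
    q_emb = model(**q_inputs).last_hidden_state[:, 0]  # shape: [1, hidden]
    p_emb = model(**p_inputs).last_hidden_state[:, 0]  # shape: [2, hidden]
    scores = q_emb @ p_emb.T                           # dot-product relevance scores

print(scores)  # a higher score indicates a more relevant passage
```

Under these assumptions, the passage about dense retrieval should score higher than the unrelated one.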