# Model Card for distilbert-mnli

## Model Details

### Model Description

A fine-tuned version of `distilbert/distilbert-base-uncased` trained on the `nyu-mll/multi_nli` dataset.
- Developed by: Karl Weinmeister
- Language(s) (NLP): en
- License: apache-2.0
- Finetuned from model: distilbert/distilbert-base-uncased
## Training Hyperparameters
- Training regime: The model was trained for 5 epochs with a batch size of 128.
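At inference time, an MNLI fine-tune produces three logits per premise/hypothesis pair, one for each natural language inference class. A minimal sketch of how those logits map to a predicted label, assuming the common `entailment`/`neutral`/`contradiction` index order (check the model's `config.json` `id2label` mapping for the actual order):

```python
import math

# Assumed MNLI label order; verify against the model's id2label config.
ID2LABEL = {0: "entailment", 1: "neutral", 2: "contradiction"}

def softmax(logits):
    """Convert raw model logits to probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one premise/hypothesis pair.
logits = [3.2, 0.4, -1.1]
probs = softmax(logits)
pred = ID2LABEL[max(range(len(probs)), key=lambda i: probs[i])]
print(pred)  # → entailment
```

The same mapping applies whether the pair is scored through the raw model or through a `text-classification` pipeline.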