Model Card for distilbert-mnli

Model Details

Model Description

A version of distilbert/distilbert-base-uncased fine-tuned for natural language inference on the nyu-mll/multi_nli (MultiNLI) dataset.

  • Developed by: Karl Weinmeister
  • Language(s) (NLP): en
  • License: apache-2.0
  • Finetuned from model: distilbert/distilbert-base-uncased
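The checkpoint can be loaded for inference with the transformers pipeline API. A minimal sketch, assuming the model is published on the Hugging Face Hub under the id used by this card; the example sentences are illustrative only:

```python
from transformers import pipeline

# MultiNLI's three-way label set.
MNLI_LABELS = ("entailment", "neutral", "contradiction")

def classify(premise: str, hypothesis: str):
    # Loads the fine-tuned checkpoint from the Hub (assumed model id).
    clf = pipeline("text-classification", model="kweinmeister/distilbert-mnli")
    # An NLI input is a premise/hypothesis pair, passed as text/text_pair.
    return clf({"text": premise, "text_pair": hypothesis})

# Example call (downloads the model on first use):
# classify("A soccer game with multiple males playing.",
#          "Some men are playing a sport.")
```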

Training Hyperparameters

  • Training regime: The model was trained for 5 epochs with a batch size of 128.
  • Model size: 67M parameters (safetensors, F32 tensors)