A variation of https://huggingface.co/tarekziade/distilvit, trained on 270k images from Flickr10k and COCO.

Training source code: https://github.com/tarekziade/distilvit
Results:
- eval_loss: 0.2305
- eval_rouge1: 39.511
- eval_rouge2: 14.7798
- eval_rougeL: 35.9476
- eval_rougeLsum: 35.9497
- eval_gen_len: 11.70
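For context, the ROUGE-L figures above are F-measures based on the longest common subsequence (LCS) between generated and reference captions. A minimal sketch of that computation in plain Python (the actual evaluation presumably used a library such as Hugging Face `evaluate`; this is only illustrative):

```python
def lcs_len(a, b):
    # Classic dynamic-programming longest-common-subsequence length.
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

def rouge_l(candidate, reference):
    # Token-level ROUGE-L F1 between a candidate and one reference caption.
    c, r = candidate.lower().split(), reference.lower().split()
    lcs = lcs_len(c, r)
    if lcs == 0:
        return 0.0
    precision = lcs / len(c)
    recall = lcs / len(r)
    return 2 * precision * recall / (precision + recall)

# LCS is "a dog the beach" (4 tokens), both captions have 6 tokens,
# so precision = recall = 4/6 and F1 = 2/3.
score = rouge_l("a dog runs on the beach", "a dog running along the beach")
```

Reported scores average this per-example F1 over the evaluation set (scaled by 100).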
Model: tarekziade/deit-tiny-distilgpt2

Base model: distilbert/distilgpt2
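A minimal usage sketch, assuming the `transformers` library is installed; `photo.jpg` is a placeholder for any local image path (weights are downloaded from the Hub on first use):

```python
from transformers import pipeline

# Load the captioning model via the image-to-text pipeline.
captioner = pipeline("image-to-text", model="tarekziade/deit-tiny-distilgpt2")

# "photo.jpg" is a hypothetical local image file.
caption = captioner("photo.jpg")[0]["generated_text"]
print(caption)
```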