dril-instruct
putting dril in a jar for future generations.
Model Details
Model Description
a LoRA finetune on top of vicuna-13b-cocktail, made to imitate dril's tweets when given an instruction of the form "Create a joke about X", where X is anything.
- Developed by: lun-4, dither
- Model type: LoRA
- Language(s) (NLP): English
- License: WTFPL
- Finetuned from model: vicuna-13b-cocktail
Model Sources
- Repository: https://github.com/lun-4/dril-instruct
- Blogpost: https://l4.pm/wiki/Personal%20Wiki/AI%20stuff/dril-instruct.html
- Demo: Nope
Uses
by shitposters, for shitposting. this isn't useful for any other purpose
Out-of-Scope Use
literally anything other than shitposting
Bias, Risks, and Limitations
this model was finetuned on cocktail, which was made from vicuna but without the "ethical guardrails" AKA "As an AI language model, I can't" responses
How to Get Started with the Model
- get text-generation-webui
- put the base model, vicuna-13b-cocktail, in the models folder
- put this in the loras folder
- there is a lot of hacking that i had to do to make loras work with GPTQ quantized models on my machine. those hacks are not portable
- use the "Create a joke about X" prompt template (a rough script-based sketch of the same flow is below)
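if you'd rather not deal with the webui, here's a rough sketch of the same flow with plain transformers + peft. the paths are placeholders (point them at your local copies of the base model and this LoRA), and the exact prompt wrapping is an assumption (the base model may want a vicuna-style instruction wrapper). this is a sketch, not the blessed setup.

```python
# rough sketch, not the blessed setup: plain transformers + peft instead of
# text-generation-webui. paths below are placeholders for your local copies.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE = "path/to/vicuna-13b-cocktail"   # base model weights (placeholder)
LORA = "path/to/dril-instruct"         # this repo's LoRA weights (placeholder)

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(
    BASE, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(model, LORA)

# prompt format is an assumption; adjust if the base model expects
# a vicuna-style chat template around the instruction
prompt = "Create a joke about lawn mowers"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.9)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```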
Training Details
Training Data
around 3K dril tweets (that's all snscrape could fetch, even though twitter reports ~12K), plus some 10 or so hand-made instructions paired with the dril tweets
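the pairing format isn't documented here, so the field names below are made up; it's only to show the shape of the data (a hand-made instruction mapped to a scraped tweet):

```python
# purely illustrative: field names are assumptions, not the real dataset schema
example_pair = {
    "instruction": "Create a joke about car insurance",
    "output": "<an actual scraped dril tweet goes here>",
}
```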
Training Procedure
see blogpost
Preprocessing
see blogpost pls
Training Hyperparameters
- Training regime: int8
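for flavor, this is roughly what an int8 + LoRA setup looks like with peft and bitsandbytes. it is NOT the actual training script (see the blogpost and the GitHub repo for that), and the LoRA hyperparameters below are generic defaults, not the ones used.

```python
# generic int8 + LoRA recipe sketch; NOT the author's actual training script
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

BASE = "path/to/vicuna-13b-cocktail"  # placeholder path

# load the base model in 8-bit (requires bitsandbytes)
model = AutoModelForCausalLM.from_pretrained(BASE, load_in_8bit=True, device_map="auto")
model = prepare_model_for_kbit_training(model)

# illustrative LoRA hyperparameters, not the ones actually used
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# ...then fine-tune on the ~3K "Create a joke about X" -> dril tweet pairs
# with transformers.Trainer or a hand-rolled loop
```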
Evaluation
see blogpost pls
Environmental Impact
fuck if i know
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: NVidia A100 80GB
- Hours used: 1, not including 4 hours of trying to bang a training script together
- Cloud Provider: RunPod
- Compute Region: EU-Norway
- Carbon Emitted: 0.1 kg CO2 eq. (0.47 if you include the "banging rocks together" step)