---
license: mit
---
This repo contains a low-rank adapter (LoRA) for LLaMA-7b,
fine-tuned on the [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca) dataset.
It doesn't contain the foundation model itself, so it's MIT licensed!
Instructions for running it can be found at https://github.com/tloen/alpaca-lora.
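
As a quick orientation, here is a minimal sketch of loading the adapter on top of a LLaMA-7b base model with the `transformers` and `peft` libraries. The base-model path, precision, and generation settings below are assumptions for illustration; follow the alpaca-lora repo linked above for the authoritative setup.

```python
# Minimal sketch: apply this LoRA adapter to a LLaMA-7b base model.
# NOTE: "path/to/llama-7b-hf" is a placeholder -- you must supply your own
# copy of the LLaMA-7b foundation weights, which this repo does not include.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model_path = "path/to/llama-7b-hf"  # assumption: local or hub path to LLaMA-7b

tokenizer = LlamaTokenizer.from_pretrained(base_model_path)
model = LlamaForCausalLM.from_pretrained(
    base_model_path,
    torch_dtype=torch.float16,  # assumption: half precision to fit on a single GPU
    device_map="auto",
)

# Load the low-rank adapter weights from this repo on top of the base model.
model = PeftModel.from_pretrained(model, "tloen/alpaca-lora-7b")

# Alpaca-style instruction prompt (format from the Stanford Alpaca dataset).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nTell me about alpacas.\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```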