---
license: mit
language:
- th
- en
base_model: aisingapore/sea-lion-7b-instruct
datasets:
- AIAT/Optimizer-datasetfinal
pipeline_tag: text-generation
---
## Sea-lion2pandas
Fine-tuned from [sea-lion-7b-instruct](https://huggingface.co/aisingapore/sea-lion-7b-instruct) on question-pandas expression pairs.
## How to use:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model = "aisingapore/sea-lion-7b-instruct"
adapter_model = "AIAT/Optimizer-sealion2pandas"

# Load the base model, then attach the fine-tuned adapter weights on top of it
model = AutoModelForCausalLM.from_pretrained(base_model, trust_remote_code=True)
model = PeftModel.from_pretrained(model, adapter_model, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
```
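A minimal generation sketch follows. It assumes the base model's `### USER:` / `### RESPONSE:` prompt format and uses a toy question; adjust the template and question to match how the adapter was trained.
```python
# Sketch only: prompt template and question are illustrative assumptions.
prompt_template = "### USER:\n{question}\n\n### RESPONSE:\n"
question = "What is the average age of passengers in df?"

inputs = tokenizer(prompt_template.format(question=question), return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=False,  # greedy decoding for deterministic pandas expressions
)
# Strip the prompt tokens and print only the generated pandas expression
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```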
## Sponsor