Model Card for lainshower/Llama3-8b-alpaca-v2
Model Details
Fully fine-tuned Llama3-8B on the Stanford Alpaca dataset (3 training epochs).
Trained with BF16 mixed precision for stability.
This model was trained on Stanford Alpaca for 3 epochs. > Click here Llama3-8B-Alpaca-1EPOCHS for the model with the best validation loss.
Refer to the training graph for further details.
Direct Use
[Templates]
You can use the following standard templates when running inference with the Llama3 Alpaca model:
PROMPT_DICT = {
"prompt_input": (
"Below is an instruction that describes a task, paired with an input that provides further context. "
"Write a response that appropriately completes the request.\n\n"
"### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:"
),
"prompt_no_input": (
"Below is an instruction that describes a task. "
"Write a response that appropriately completes the request.\n\n"
"### Instruction:\n{instruction}\n\n### Response:"
),
}
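Following the original Alpaca convention, `prompt_input` is used when an example has a non-empty `input` field and `prompt_no_input` otherwise. A minimal helper sketching this selection (the `build_prompt` name and the example instruction are illustrative, not part of this card):

```python
PROMPT_DICT = {
    "prompt_input": (
        "Below is an instruction that describes a task, paired with an input that provides further context. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:"
    ),
    "prompt_no_input": (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n{instruction}\n\n### Response:"
    ),
}

def build_prompt(ann):
    # Alpaca convention: fall back to the no-input template
    # when the example has no (or an empty) "input" field.
    if ann.get("input"):
        return PROMPT_DICT["prompt_input"].format_map(ann)
    return PROMPT_DICT["prompt_no_input"].format_map(ann)

# Illustrative example (not from the Alpaca dataset):
prompt = build_prompt({"instruction": "Translate to French.", "input": "Good morning."})
print(prompt)
```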
[Code]
[Model Loading]
from transformers import LlamaForCausalLM, AutoTokenizer

### We recommend using float32 when running inference on this model.
model = LlamaForCausalLM.from_pretrained("lainshower/Llama3-8b-alpaca-v2")
tokenizer = AutoTokenizer.from_pretrained("lainshower/Llama3-8b-alpaca-v2")
[Template]
PROMPT_DICT = {
"prompt_input": (
"Below is an instruction that describes a task, paired with an input that provides further context. "
"Write a response that appropriately completes the request.\n\n"
"### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:"
),
"prompt_no_input": (
"Below is an instruction that describes a task. "
"Write a response that appropriately completes the request.\n\n"
"### Instruction:\n{instruction}\n\n### Response:"
),
}
ann = {}
ann['instruction'] = '''You are presented with the quiz "What causes weather changes on Earth? " But you don't know the answer, so you turn to your teacher to ask for hints. He says that "the Earth being tilted on its rotating axis causes seasons" and "weather changes from season to season". So, what's the best answer to the question? Choose your answer from: (a). the sun's energy (b). The tilt in its rotating axis. (c). high temperature (d). Weather in space (e). Vertical movement (f). Greenhouse gases (g). Spinning backwards (h). wind and erosion Answer:'''
prompt = PROMPT_DICT["prompt_no_input"].format_map(ann)
'''
Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
You are presented with the quiz "What causes weather changes on Earth? " But you don't know the answer, so you turn to your teacher to ask for hints. He says that "the Earth being tilted on its rotating axis causes seasons" and "weather changes from season to season". So, what's the best answer to the question? Choose your answer from: (a). the sun's energy (b). The tilt in its rotating axis. (c). high temperature (d). Weather in space (e). Vertical movement (f). Greenhouse gases (g). Spinning backwards (h). wind and erosion Answer:
### Response:
'''
[Generation]
input_ids = tokenizer.batch_encode_plus([prompt], return_tensors="pt", padding=False)
total_sequences = model.generate(input_ids=input_ids['input_ids'].cuda(), attention_mask=input_ids['attention_mask'].cuda(), max_length=490, do_sample=True, top_p=0.9)
print(tokenizer.decode(total_sequences[0], skip_special_tokens=True))
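The decoded sequence contains the full prompt followed by the model's answer. A simple way to recover only the answer is to split on the `### Response:` header; a minimal sketch (the `extract_response` name and the sample string are illustrative, not an actual generation from this model):

```python
def extract_response(decoded: str) -> str:
    # Keep only the text after the response header, dropping the echoed prompt.
    return decoded.split("### Response:", 1)[-1].strip()

# Illustrative decoded output (not a real generation):
decoded = "### Instruction:\nPick an answer.\n\n### Response:\n(b). The tilt in its rotating axis."
answer = extract_response(decoded)
print(answer)
```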
Training Hyperparameters
- Learning rate: 2e-5
- Training procedure: mixed precision (bfloat16)
- Context length: 512
- This model was trained for 3 epochs. > Click here Llama3-8B-Alpaca-1EPOCHS for the model with the best validation loss.
- We follow Rethinking Data Selection for Supervised Fine-Tuning when selecting the total number of training epochs.
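For reproduction, the hyperparameters above can be collected into a single configuration. Only the learning rate, precision, context length, and epoch count come from this card; batch size, scheduler, and other settings are not stated here and would need to be chosen separately:

```python
# Training configuration stated on this card (other settings are unspecified).
train_config = {
    "learning_rate": 2e-5,     # learning rate
    "bf16": True,              # mixed precision (bfloat16)
    "max_seq_length": 512,     # context length
    "num_train_epochs": 3,     # 3-epoch checkpoint (see 1-epoch variant above)
}
print(train_config)
```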
Training Graph