Note
- This is an experiment in generating clinical report information from some of the attributes the model was trained on. A better version is coming soon, so stay updated!
- Merged: ArvindSharma18/Phi-3-mini-4k-instruct-bnb-4bit-Clinical-Trail-Merged (a standalone loading sketch follows the inference example below)
from unsloth import FastLanguageModel
from transformers import TextStreamer

max_seq_length = 2048  # set to the sequence length used during training (Phi-3-mini-4k supports up to 4096)
dtype = None           # None = auto-detect (bfloat16 on Ampere+, float16 otherwise)
load_in_4bit = True    # load the base weights in 4-bit to save VRAM

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "lora_model",  # the LoRA adapters you saved from training
    max_seq_length = max_seq_length,
    dtype = dtype,
    load_in_4bit = load_in_4bit,
)
FastLanguageModel.for_inference(model)  # enable Unsloth's faster inference path

inputs = tokenizer(
    [
        "Official Title: Randomized Trial of Usual Care vs. Specialized, Phase-specific Care for Youth at Risk for Psychosis"
    ],
    return_tensors = "pt",
).to("cuda")

text_streamer = TextStreamer(tokenizer, skip_prompt = True)
_ = model.generate(
    input_ids = inputs.input_ids,
    attention_mask = inputs.attention_mask,
    streamer = text_streamer,
    max_new_tokens = 2048,
    pad_token_id = tokenizer.eos_token_id,
)
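If you would rather skip Unsloth at inference time, the merged repository listed in the note above can be loaded with the plain transformers API instead of the LoRA adapters. This is a minimal sketch that assumes the merged repo is a standard causal-LM checkpoint; adjust the dtype and device settings to your hardware.

from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer
import torch

merged_id = "ArvindSharma18/Phi-3-mini-4k-instruct-bnb-4bit-Clinical-Trail-Merged"

tokenizer = AutoTokenizer.from_pretrained(merged_id)
model = AutoModelForCausalLM.from_pretrained(
    merged_id,
    torch_dtype = torch.float16,  # or "auto"
    device_map = "auto",          # requires the accelerate package
    trust_remote_code = True,     # Phi-3 needs this on older transformers releases
)

prompt = "Official Title: Randomized Trial of Usual Care vs. Specialized, Phase-specific Care for Youth at Risk for Psychosis"
inputs = tokenizer([prompt], return_tensors = "pt").to(model.device)

streamer = TextStreamer(tokenizer, skip_prompt = True)
_ = model.generate(
    **inputs,
    streamer = streamer,
    max_new_tokens = 2048,
    pad_token_id = tokenizer.eos_token_id,
)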
Uploaded model
- Developed by: ArvindSharma18
- Finetuned from model: unsloth/Phi-3-mini-4k-instruct-bnb-4bit
This model was trained 2x faster with Unsloth and Hugging Face's TRL library.
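For context, the 2x speed-up refers to Unsloth's patched QLoRA training loop driven by TRL's SFTTrainer. The sketch below shows the general shape of such a run; the dataset file, LoRA rank, and hyperparameters are illustrative placeholders, not the exact values used to train this model.

from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "unsloth/Phi-3-mini-4k-instruct-bnb-4bit",
    max_seq_length = 2048,
    load_in_4bit = True,
)

# Attach LoRA adapters; rank and target modules are typical defaults, not confirmed values.
model = FastLanguageModel.get_peft_model(
    model,
    r = 16,
    lora_alpha = 16,
    target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
                      "gate_proj", "up_proj", "down_proj"],
)

# Hypothetical dataset of clinical-trial records formatted into a single "text" field.
dataset = load_dataset("json", data_files = "clinical_trials.jsonl", split = "train")

trainer = SFTTrainer(
    model = model,
    tokenizer = tokenizer,            # older TRL signature; newer releases use processing_class / SFTConfig
    train_dataset = dataset,
    dataset_text_field = "text",
    max_seq_length = 2048,
    args = TrainingArguments(
        per_device_train_batch_size = 2,
        gradient_accumulation_steps = 4,
        max_steps = 60,
        learning_rate = 2e-4,
        logging_steps = 10,
        output_dir = "outputs",
    ),
)
trainer.train()

model.save_pretrained("lora_model")  # adapters consumed by the inference snippet above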