Rupesh2 committed (verified)
Commit fa92bb8 · 1 Parent(s): 4a2e519

Update README.md

Files changed (1):
1. README.md (+7 -2)
README.md CHANGED
@@ -1,12 +1,17 @@
 ---
 base_model: microsoft/Phi-3.5-mini-instruct
 library_name: transformers
-model_name: checkpoint_dir
+model_name: phi-3.5-medical
 tags:
 - generated_from_trainer
 - trl
 - sft
 licence: license
+license: apache-2.0
+datasets:
+- Rupesh2/medical-dataset-2k-phi
+language:
+- en
 ---
 
 # Model Card for checkpoint_dir
@@ -20,7 +25,7 @@ It has been trained using [TRL](https://github.com/huggingface/trl).
 from transformers import pipeline
 
 question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
-generator = pipeline("text-generation", model="Rupesh2/checkpoint_dir", device="cuda")
+generator = pipeline("text-generation", model="Rupesh2/phi-3.5-medical", device="cuda")
 output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
 print(output["generated_text"])
 ```
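
For reference, here is the quick-start snippet as it reads after this commit, assembled from the new side of the diff. The `device="cuda"` argument assumes a GPU is available; pass `device="cpu"` or omit it to run on CPU.

```python
from transformers import pipeline

# Text-generation pipeline pointing at the renamed model repository.
# device="cuda" assumes a GPU is available; use device="cpu" otherwise.
generator = pipeline("text-generation", model="Rupesh2/phi-3.5-medical", device="cuda")

question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"

# Chat-style input; return_full_text=False keeps only the newly generated reply.
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```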