---
library_name: transformers
tags: []
---
## Inference
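The snippet below loads the Mr-Vicky-01/TP-FP sequence-classification checkpoint, strips punctuation from a sample query, and prints the predicted label along with the time taken.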
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import time
import torch
import re

# Load the model and tokenizer, using the GPU when available
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model = AutoModelForSequenceClassification.from_pretrained("Mr-Vicky-01/TP-FP").to(device)
tokenizer = AutoTokenizer.from_pretrained("Mr-Vicky-01/TP-FP")

start = time.time()

# Strip punctuation from the query before tokenizing
question = "give me a scan result"
question = re.sub(r'[,?.]', '', question)
inputs = tokenizer(question, return_tensors="pt").to(device)

# Forward pass without gradient tracking, then pick the highest-scoring class
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class_id = logits.argmax().item()
predicted_class = model.config.id2label[predicted_class_id]
print(predicted_class)
print(time.time() - start)
```
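The printed time covers only preprocessing and the forward pass, since the timer starts after the model and tokenizer are loaded. The predicted label is resolved through the `id2label` mapping stored in the model config.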