Uploaded model
- Developed by: Vicky
- License: MIT
- Finetuned from model: BART (summarization)
Inference
```shell
pip install transformers
```
```python
# Load the model and tokenizer directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Mr-Vicky-01/Facebook-Bart-Qna")
model = AutoModelForSeq2SeqLM.from_pretrained("Mr-Vicky-01/Facebook-Bart-Qna")

def generate_answer(text):
    # Tokenize the prompt, truncating inputs that exceed the model's limit
    inputs = tokenizer([text], return_tensors='pt', truncation=True)
    # Generate an answer sequence and decode it back to text
    answer_ids = model.generate(inputs['input_ids'], max_length=512)
    answer = tokenizer.decode(answer_ids[0], skip_special_tokens=True)
    return answer

question = """Please answer this question: What is Artificial Intelligence?"""
answer = generate_answer(question)
print(answer)
```
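Alternatively, the same checkpoint can be loaded through the transformers `pipeline` API instead of the model classes. This is a minimal sketch; the `build_prompt` helper is illustrative (not part of the model card) and simply reproduces the "Please answer this question:" prefix used in the example above.

```python
from transformers import pipeline

def build_prompt(question):
    # Assumption: the model expects the same prompt prefix shown in the
    # example above.
    return f"Please answer this question: {question}"

if __name__ == "__main__":
    # Loading the checkpoint downloads the model weights on first use.
    qa = pipeline("text2text-generation", model="Mr-Vicky-01/Facebook-Bart-Qna")
    result = qa(build_prompt("What is Artificial Intelligence?"), max_length=512)
    print(result[0]["generated_text"])
```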