# T5 Question Generator
This repository contains a fine-tuned T5 model for question generation. The model takes an answer and a context paragraph as input and generates a relevant question.
## Model Description
This model is a fine-tuned version of T5 (Text-to-Text Transfer Transformer), starting from the google-t5/t5-base checkpoint. It was trained on 60,000 non-technical questions from SQuAD and 10,000 technical questions. The model is conditioned on both the answer and the context, and it generates a question for which the given answer is the correct response.
## How to Use
You can use this model with the `transformers` library in Python. First, make sure `transformers` and `sentencepiece` are installed (the T5 tokenizer requires `sentencepiece`):
```bash
pip install transformers
pip install sentencepiece
```
Then, you can use the following code to load the model and generate a question:
```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Load the fine-tuned question-generation checkpoint and its tokenizer.
model_name = "Ayush472/T5QuestionGenerator"
model = T5ForConditionalGeneration.from_pretrained(model_name)
tokenizer = T5Tokenizer.from_pretrained(model_name)

# The model expects the answer and the context in a single prompt.
context = "The Eiffel Tower is a wrought-iron lattice tower on the Champ de Mars in Paris, France. It is named after the engineer Gustave Eiffel, whose company designed and built the tower."
answer = "Gustave Eiffel"
input_text = f"answer: {answer} context: {context}"

# Tokenize the prompt, generate the question, and decode it back to text.
input_ids = tokenizer.encode(input_text, return_tensors="pt")
output = model.generate(input_ids, max_length=100)
generated_question = tokenizer.decode(output[0], skip_special_tokens=True)

print(generated_question)
# Expected output: Who designed the Eiffel Tower?
```
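If you want several candidate questions for the same answer/context pair, the standard `generate` options in `transformers` can be used. The beam-search settings below are only an illustrative sketch, not values tuned for this checkpoint:

```python
# Generate multiple candidate questions with beam search.
# num_beams and num_return_sequences are example values, not tuned settings.
outputs = model.generate(
    input_ids,
    max_length=100,
    num_beams=4,
    num_return_sequences=3,
    early_stopping=True,
)
for candidate in outputs:
    print(tokenizer.decode(candidate, skip_special_tokens=True))
```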
## Model Architecture
The model is based on the T5 architecture. T5 is an encoder-decoder model pre-trained on a large corpus of text. It uses a text-to-text approach, meaning that every NLP task is cast as a text-to-text problem: the task is indicated in the input string, and the output is always generated text.
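As an illustration of this text-to-text interface, the sketch below runs the base google-t5/t5-base checkpoint (not this fine-tuned model) on two of the standard T5 task prefixes; the prompts and generation settings are examples only:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Use the base checkpoint to show that the task is selected purely by the
# prefix in the input text; the model interface stays the same.
base_name = "google-t5/t5-base"
base_model = T5ForConditionalGeneration.from_pretrained(base_name)
base_tokenizer = T5Tokenizer.from_pretrained(base_name)

prompts = [
    "translate English to German: The tower was built in 1889.",
    "summarize: The Eiffel Tower is a wrought-iron lattice tower on the Champ de Mars in Paris, France.",
]

for prompt in prompts:
    ids = base_tokenizer.encode(prompt, return_tensors="pt")
    out = base_model.generate(ids, max_length=50)
    print(base_tokenizer.decode(out[0], skip_special_tokens=True))
```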
## About
This model was fine-tuned by Ayush. For any questions or issues, please open an issue in this repository.