---
title: Jack Patel AI Assistant
emoji: 🦙
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 5.34.1
app_file: main.py
license: mit
---
# 🤖 Jack Patel AI Assistant
A personalized AI assistant powered by a fine-tuned TinyLlama model, trained to answer questions about Jack Patel using custom data stored in `Data.json`.
## 🚀 Features
- **Personalized Responses**: Trained on 150+ question-answer pairs about Jack Patel (from `Data.json`)
- **Smart Fallback**: If a match isn't found in the training data, the model generates a response with TinyLlama
- **Modern UI**: FastAPI-powered UI with a clean, responsive HTML form
- **API Access**: RESTful API endpoints for programmatic access
- **Container-Ready**: Deployable instantly on Hugging Face Spaces via Docker or source files
## 📖 How to Use
### 🖥️ Web Interface
Visit the Space and type any question → get a smart response about Jack Patel.
### 🔌 API Usage
Make a GET request to `/api/generate?instruction=your_question`.
### ✅ Health Check
Use `/health` to confirm the app is running correctly.
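For a quick check from Python, here is a minimal sketch using `requests`. The base URL is a placeholder, and the JSON shape of the `/api/generate` response is an assumption; adjust it to whatever `main.py` actually returns.

```python
import requests

BASE_URL = "https://<your-space>.hf.space"  # placeholder: replace with the actual Space URL

# Confirm the app is running
health = requests.get(f"{BASE_URL}/health", timeout=10)
print(health.status_code)  # expect 200 when the Space is up

# Ask a question (the response is assumed to be JSON; inspect it to find the answer field)
resp = requests.get(
    f"{BASE_URL}/api/generate",
    params={"instruction": "What are your technical skills?"},
    timeout=60,
)
print(resp.json())
```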
## 📝 Example Questions
- "What is your name?"
- "What is your father's name?"
- "Which school did you attend?"
- "What are your technical skills?"
- "Tell me about your hobbies"
## 🛠️ Technical Details
- **Base Model**: `TinyLlama/TinyLlama-1.1B-Chat-v1.0`
- **Fine-Tuning**: LoRA, using the personal Q&A pairs in `Data.json`
- **Framework**: FastAPI (backend), Jinja2 (templates)
- **Interface**: HTML/CSS in `templates/index.html`
- **Model Inference**: PyTorch + Transformers (a loading sketch follows below)
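As a rough illustration of this stack, here is a minimal sketch of how a LoRA adapter like the one in `lora_model/` could be loaded on top of the TinyLlama base model with Transformers and PEFT. It is based only on the components listed above, not on the actual code in `main.py`; the generation settings are placeholder defaults.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
ADAPTER_DIR = "lora_model"  # LoRA adapter directory from this repo

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(BASE_MODEL)
model = PeftModel.from_pretrained(base, ADAPTER_DIR)  # attach the fine-tuned adapter
model.eval()

# TinyLlama-Chat expects its chat template around the user message
messages = [{"role": "user", "content": "What are your technical skills?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

with torch.no_grad():
    output_ids = model.generate(input_ids=input_ids, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```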
## 📁 File Structure
```
├── main.py              # FastAPI application logic
├── requirements.txt     # Dependencies
├── Dockerfile           # Build config for the Hugging Face Space
├── Data.json            # Personalized Q&A training data
├── templates/
│   └── index.html       # Web-based UI
├── lora_model/          # LoRA fine-tuned TinyLlama model
└── static/              # Optional static files (CSS, JS)
```
## 🔄 Model Behavior
- **Uses `Data.json`**: Loads Q&A pairs at runtime for exact matching
- **Model-Generated Responses**: For non-exact matches, the TinyLlama model generates contextual answers
- **Fallback Intelligence**: Matches similar or rephrased queries using embedding similarity (sketched below)
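To make the lookup-then-generate behavior concrete, here is a minimal sketch of that logic. Everything specific in it is an assumption rather than code from `main.py`: the `Data.json` field names, the `sentence-transformers` embedding model, the similarity threshold, and the `generate_with_tinyllama` helper are all illustrative.

```python
import json
from sentence_transformers import SentenceTransformer, util

# Assumed Data.json schema: [{"instruction": "...", "output": "..."}, ...]
with open("Data.json", encoding="utf-8") as f:
    qa_pairs = json.load(f)

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative embedding model
questions = [pair["instruction"] for pair in qa_pairs]
question_embeddings = embedder.encode(questions, convert_to_tensor=True)

def answer(query: str, threshold: float = 0.85) -> str:
    # 1. Exact match against the stored questions
    for pair in qa_pairs:
        if pair["instruction"].strip().lower() == query.strip().lower():
            return pair["output"]

    # 2. Embedding-similarity fallback for rephrased questions
    query_embedding = embedder.encode(query, convert_to_tensor=True)
    scores = util.cos_sim(query_embedding, question_embeddings)[0]
    best = int(scores.argmax())
    if float(scores[best]) >= threshold:
        return qa_pairs[best]["output"]

    # 3. Otherwise, hand the query to the fine-tuned TinyLlama model
    return generate_with_tinyllama(query)  # hypothetical helper wrapping model.generate
```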
## 🔧 Setup Instructions
### 🧪 Local Development
```bash
pip install -r requirements.txt
python main.py
```
### ☁️ Hugging Face Spaces
- Upload all project files, including:
  - `main.py`
  - `Data.json`
  - the `lora_model/` folder
  - `requirements.txt`
  - `templates/index.html`
- Hugging Face will automatically detect and run your app.
## 🔒 Privacy & Security
- No data is stored or logged
- Responses are generated locally in-memory
- No external API calls
- Entirely contained in the Hugging Face Space runtime
## 🤝 Contributing
Want to improve this AI assistant? You can:
- Add new question-answer pairs to `Data.json` (see the format sketch after this list)
- Enhance the UI
- Optimize performance and latency
- Add advanced features (like semantic search)
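The exact schema of `Data.json` isn't shown in this README; a common convention for instruction-style Q&A data is a JSON array of objects with `instruction` and `output` fields. A minimal sketch for appending a new pair under that assumed schema (the pair shown is just a placeholder):

```python
import json

# Assumed schema: a JSON array of {"instruction": ..., "output": ...} objects.
# Check the existing Data.json in this repo and match its actual field names.
new_pair = {
    "instruction": "Tell me about your hobbies",
    "output": "Your answer here.",
}

with open("Data.json", "r", encoding="utf-8") as f:
    data = json.load(f)

data.append(new_pair)

with open("Data.json", "w", encoding="utf-8") as f:
    json.dump(data, f, ensure_ascii=False, indent=2)
```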
## 📄 License
This project is licensed under the MIT License.
Built with ❤️ using FastAPI, PyTorch, and Hugging Face Transformers