---
title: Jack Patel AI Assistant
emoji: πŸ¦™
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 5.34.1
app_file: main.py
license: mit
---

# πŸ€– Jack Patel AI Assistant

A personalized AI assistant powered by a fine-tuned TinyLlama model, trained to answer questions about Jack Patel using custom data stored in `Data.json`.

## 🌟 Features

- **Personalized Responses**: Trained on 150+ question-answer pairs about Jack Patel (from `Data.json`)
- **Smart Fallback**: If a match isn't found in the training data, the model generates a response using TinyLlama
- **Modern UI**: Clean, responsive HTML interface served by FastAPI
- **API Access**: RESTful API endpoints for programmatic access
- **Container-Ready**: Deployable instantly on Hugging Face Spaces via Docker or source files

## πŸš€ How to Use

### πŸ–₯️ Web Interface
Visit the Space and type any question to get a response about Jack Patel.

### πŸ”Œ API Usage
Make a GET request to:

```
/api/generate?instruction=your_question
```
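
For example, from Python (a minimal sketch; the URL placeholder and the exact JSON fields of the response are assumptions, so check `main.py` for the actual schema):

```python
import requests

# Replace with your Space's public URL (e.g. https://<owner>-<space>.hf.space)
# or a local address when running main.py directly.
BASE_URL = "https://<owner>-<space>.hf.space"

resp = requests.get(
    f"{BASE_URL}/api/generate",
    params={"instruction": "What are your technical skills?"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # response shape is assumed to be JSON; inspect main.py for the exact fields
```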

### βœ… Health Check
Send a GET request to:

```
/health
```
to confirm the app is running correctly.
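
A quick programmatic check (assuming a local run; the port depends on how `main.py` starts the server, and a deployed Space would use its public URL instead):

```python
import requests

# Assumed local address; adjust the port to match your server configuration
resp = requests.get("http://localhost:8000/health", timeout=10)
print(resp.status_code)  # 200 means the app is up
```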

## πŸ“ Example Questions

- "What is your name?"
- "What is your father's name?"
- "Which school did you attend?"
- "What are your technical skills?"
- "Tell me about your hobbies"

## πŸ› οΈ Technical Details

- **Base Model**: `TinyLlama/TinyLlama-1.1B-Chat-v1.0`
- **Fine-Tuning**: LoRA adapters trained on the personal Q&A pairs in `Data.json`
- **Framework**: FastAPI (backend), Jinja2 (templates)
- **Interface**: HTML/CSS in `templates/index.html`
- **Model Inference**: PyTorch + Transformers
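
A minimal sketch of how the adapter in `lora_model/` can be loaded on top of the base model with `transformers` and `peft` (the actual inference logic lives in `main.py` and may differ in details such as prompt formatting and generation settings):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base_model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Attach the LoRA adapter saved under lora_model/
model = PeftModel.from_pretrained(base_model, "lora_model")
model.eval()

prompt = "What is your name?"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```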

## πŸ“ File Structure

```
β”œβ”€β”€ main.py                 # FastAPI application logic
β”œβ”€β”€ requirements.txt        # Dependencies
β”œβ”€β”€ Dockerfile             # Build config for Hugging Face Space
β”œβ”€β”€ Data.json              # Personalized Q&A training data
β”œβ”€β”€ templates/
β”‚   └── index.html         # Web-based UI
β”œβ”€β”€ lora_model/            # LoRA fine-tuned TinyLlama model
└── static/                # Optional static files (CSS, JS)
```

## πŸ“Š Model Behavior

- **Exact Matching**: Loads the Q&A pairs from `Data.json` at runtime and returns the stored answer when a question matches exactly
- **Similarity Fallback**: Matches similar or rephrased queries using embedding similarity
- **Model-Generated Responses**: When no stored answer fits, the fine-tuned TinyLlama model generates a contextual answer
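
Roughly, the lookup-then-generate flow could look like the sketch below. The field names in `Data.json`, the threshold, and the `generate_with_model` helper are assumptions, and simple string similarity stands in here for the embedding-based matching described above:

```python
import json
from difflib import SequenceMatcher

# Assumed format: a list of {"instruction": ..., "output": ...} records
with open("Data.json", encoding="utf-8") as f:
    qa_pairs = json.load(f)

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def answer(question: str, threshold: float = 0.8) -> str:
    # 1. Exact match against the stored questions
    for pair in qa_pairs:
        if pair["instruction"].strip().lower() == question.strip().lower():
            return pair["output"]
    # 2. Fuzzy match for rephrased queries (the app itself uses embedding similarity)
    best = max(qa_pairs, key=lambda p: similarity(p["instruction"], question))
    if similarity(best["instruction"], question) >= threshold:
        return best["output"]
    # 3. Fall back to the fine-tuned TinyLlama model
    return generate_with_model(question)  # hypothetical helper wrapping model.generate()
```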

## πŸ”§ Setup Instructions

### πŸ§ͺ Local Development
```bash
pip install -r requirements.txt
python main.py
```

### ☁️ Hugging Face Spaces
1. Upload all project files including:
   * `main.py`
   * `Data.json`
   * `lora_model/` folder
   * `requirements.txt`
   * `templates/index.html`
2. Hugging Face will automatically detect and run your app.

## πŸ”’ Privacy & Security

* No data is stored or logged
* Responses are generated locally in-memory
* No external API calls
* Entirely contained in the Hugging Face Space runtime

## 🀝 Contributing

Want to improve this AI assistant? You can:
* Add new question-answer pairs to `Data.json`
* Enhance the UI
* Optimize performance and latency
* Add advanced features (like semantic search)

## πŸ“„ License

This project is licensed under the MIT License.

**Built with ❀️ using FastAPI, PyTorch, and Hugging Face Transformers**