David Ko committed · bd99505
Parent(s): da9abeb

Initial deployment of vision-web-app
Files changed:
- Dockerfile +33 -0
- README.md +109 -11
- api.py +379 -0
- app.py +298 -0
- frontend/build/asset-manifest.json +24 -0
- frontend/build/index.html +1 -0
- frontend/build/manifest.json +15 -0
- frontend/build/precache-manifest.e8825d818084296fa14f1b32e8815c1e.js +30 -0
- frontend/build/service-worker.js +39 -0
- frontend/build/static/css/main.59c2a54e.chunk.css +2 -0
- frontend/build/static/css/main.59c2a54e.chunk.css.map +1 -0
- frontend/build/static/js/2.74e99ef6.chunk.js +0 -0
- frontend/build/static/js/2.74e99ef6.chunk.js.LICENSE.txt +58 -0
- frontend/build/static/js/2.74e99ef6.chunk.js.map +0 -0
- frontend/build/static/js/3.0e3ce0f8.chunk.js +2 -0
- frontend/build/static/js/3.0e3ce0f8.chunk.js.map +1 -0
- frontend/build/static/js/main.3d1593c5.chunk.js +2 -0
- frontend/build/static/js/main.3d1593c5.chunk.js.map +1 -0
- frontend/build/static/js/runtime-main.ab7e4402.js +2 -0
- frontend/build/static/js/runtime-main.ab7e4402.js.map +1 -0
- requirements.txt +26 -0
- static/index.html +275 -0
Dockerfile
ADDED
@@ -0,0 +1,33 @@
```dockerfile
FROM python:3.9-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    build-essential \
    git \
    libgl1-mesa-glx \
    libglib2.0-0 \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements first for better caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy backend code
COPY api.py .
COPY static/ static/

# Copy frontend build files
COPY frontend/build/ static/

# Set environment variables
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1
ENV PORT=7860

# Expose the port Hugging Face Spaces expects
EXPOSE 7860

# Command to run the application
CMD ["python", "api.py"]
```
README.md
CHANGED
@@ -1,11 +1,109 @@
# Vision Web App - Object Detection Demo

A multi-model object detection and image classification demo using YOLOv8, DETR, and ViT models. This project is designed to showcase different computer vision models for a hiring demonstration.

## Project Architecture

This project follows a phased development approach:

### Phase 0: PoC with Gradio (Original)
- Simple Gradio interface with multiple object detection models
- Uses Hugging Face's free tier for model hosting
- Easy to deploy to Hugging Face Spaces

### Phase 1: Service Separation (Implemented)
- Backend: Flask API with model inference endpoints
- REST API endpoints for model inference
- JSON responses with detection results and performance metrics

### Phase 2: UI Upgrade (Implemented)
- Modern React frontend with Material-UI components
- Improved user experience with responsive design
- Separate frontend and backend architecture

### Phase 3: CI/CD & Testing (Planned)
- GitHub Actions for automated testing and deployment
- Comprehensive test suite with pytest and ESLint
- Automatic rebuilds on Hugging Face Spaces

## How to Run

### Option 1: Original Gradio App
1. Install dependencies:
```bash
pip install -r requirements.txt
```

2. Run the Gradio app:
```bash
python app.py
```

3. Open your browser and go to the URL shown in the terminal (typically `http://127.0.0.1:7860`)

### Option 2: React Frontend with Flask Backend
1. Install backend dependencies:
```bash
pip install -r requirements.txt
```

2. Start the Flask backend server:
```bash
python api.py
```

3. In a separate terminal, navigate to the frontend directory:
```bash
cd frontend
```

4. Install frontend dependencies:
```bash
npm install
```

5. Start the React development server:
```bash
npm start
```

6. Open your browser and go to `http://localhost:3000`

## Models Used

- **YOLOv8**: Fast and accurate object detection
- **DETR**: DEtection TRansformer for object detection
- **ViT**: Vision Transformer for image classification

## API Endpoints

The Flask backend provides the following API endpoints:

- `GET /api/status` - Check the status of the API and available models
- `POST /api/detect/yolo` - Detect objects using YOLOv8
- `POST /api/detect/detr` - Detect objects using DETR
- `POST /api/classify/vit` - Classify images using ViT

All POST endpoints accept form data with an 'image' field containing the image file.

## Deployment

### Gradio App
The Gradio app is designed to be easily deployed to Hugging Face Spaces:

1. Create a new Space on Hugging Face
2. Select Gradio as the SDK
3. Push this repository to the Space's git repository
4. The app will automatically deploy

### React + Flask App
For the React + Flask version, you'll need to:

1. Build the React frontend:
```bash
cd frontend
npm run build
```

2. Serve the static files from a web server or cloud hosting service
3. Deploy the Flask backend to a server that supports Python
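For a quick smoke test of the endpoints described in the README above, here is a minimal client sketch (not part of the commit; it assumes the backend is running locally on port 7860, and the image filename is illustrative):

```python
import requests

BASE_URL = "http://localhost:7860"  # assumed local backend address

# Check which models loaded successfully
status = requests.get(f"{BASE_URL}/api/status").json()
print(status)

# Send an image to the YOLOv8 endpoint as multipart form data
# with the 'image' field name the API expects
with open("sample.jpg", "rb") as f:  # hypothetical test image
    resp = requests.post(f"{BASE_URL}/api/detect/yolo", files={"image": f})

result = resp.json()
for det in result.get("detections", []):
    print(det["class"], det["confidence"], det["bbox"])
```

The same pattern applies to `/api/detect/detr` and `/api/classify/vit`; only the response shape differs (ViT returns `top_predictions` instead of `detections`).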
api.py
ADDED
@@ -0,0 +1,379 @@
```python
from flask import Flask, request, jsonify, send_from_directory
import torch
from PIL import Image
import numpy as np
import os
import io
import base64
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle
import time
from flask_cors import CORS
import json

app = Flask(__name__, static_folder='static')
CORS(app)  # Enable CORS for all routes

# Model initialization
print("Loading models... This may take a moment.")

# YOLOv8 model
yolo_model = None
try:
    from ultralytics import YOLO
    yolo_model = YOLO("yolov8n.pt")  # Using the nano model for faster inference
    print("YOLOv8 model loaded successfully")
except Exception as e:
    print("Error loading YOLOv8 model:", e)
    yolo_model = None

# DETR model (DEtection TRansformer)
detr_processor = None
detr_model = None
try:
    from transformers import DetrImageProcessor, DetrForObjectDetection

    detr_processor = DetrImageProcessor.from_pretrained("facebook/detr-resnet-50")
    detr_model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50")

    print("DETR model loaded successfully")
except Exception as e:
    print("Error loading DETR model:", e)
    detr_processor = None
    detr_model = None

# ViT model
vit_processor = None
vit_model = None
try:
    from transformers import ViTImageProcessor, ViTForImageClassification
    vit_processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224")
    vit_model = ViTForImageClassification.from_pretrained("google/vit-base-patch16-224")
    print("ViT model loaded successfully")
except Exception as e:
    print("Error loading ViT model:", e)
    vit_processor = None
    vit_model = None

# Get device information
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

# LLM model (using an open-access model instead of Llama 4 which requires authentication)
llm_model = None
llm_tokenizer = None
try:
    from transformers import AutoModelForCausalLM, AutoTokenizer

    print("Loading LLM model... This may take a moment.")
    model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # Using TinyLlama as an open-access alternative

    llm_tokenizer = AutoTokenizer.from_pretrained(model_name)
    llm_model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.float16,
        # Removing options that require accelerate package
        # device_map="auto",
        # load_in_8bit=True
    ).to(device)
    print("LLM model loaded successfully")
except Exception as e:
    print(f"Error loading LLM model: {e}")
    llm_model = None
    llm_tokenizer = None

def process_llm_query(vision_results, user_query):
    """Process a query with the LLM model using vision results and user text"""
    if llm_model is None or llm_tokenizer is None:
        return {"error": "LLM model not available"}

    # Create a prompt combining vision results and user query
    prompt = f"""You are an AI assistant analyzing image detection results.
Here are the objects detected in the image: {json.dumps(vision_results, indent=2)}

User question: {user_query}

Please provide a detailed analysis based on the detected objects and the user's question.
"""

    # Tokenize and generate response
    try:
        start_time = time.time()

        inputs = llm_tokenizer(prompt, return_tensors="pt").to(device)
        with torch.no_grad():
            output = llm_model.generate(
                **inputs,
                max_new_tokens=512,
                temperature=0.7,
                top_p=0.9,
                do_sample=True
            )

        response_text = llm_tokenizer.decode(output[0], skip_special_tokens=True)

        # Remove the prompt from the response
        if response_text.startswith(prompt):
            response_text = response_text[len(prompt):].strip()

        inference_time = time.time() - start_time

        return {
            "response": response_text,
            "performance": {
                "inference_time": round(inference_time, 3),
                "device": "GPU" if torch.cuda.is_available() else "CPU"
            }
        }
    except Exception as e:
        return {"error": f"Error processing LLM query: {str(e)}"}

def image_to_base64(img):
    """Convert PIL Image to base64 string"""
    buffered = io.BytesIO()
    img.save(buffered, format="PNG")
    img_str = base64.b64encode(buffered.getvalue()).decode('utf-8')
    return img_str

def process_yolo(image):
    if yolo_model is None:
        return {"error": "YOLOv8 model not loaded"}

    # Measure inference time
    start_time = time.time()

    # Convert to numpy if it's a PIL image
    if isinstance(image, Image.Image):
        image_np = np.array(image)
    else:
        image_np = image

    # Run inference
    results = yolo_model(image_np)

    # Process results
    result_image = results[0].plot()
    result_image = Image.fromarray(result_image)

    # Get detection information
    boxes = results[0].boxes
    class_names = results[0].names

    # Format detection results
    detections = []
    for box in boxes:
        class_id = int(box.cls[0].item())
        class_name = class_names[class_id]
        confidence = round(box.conf[0].item(), 2)
        bbox = box.xyxy[0].tolist()
        bbox = [round(x) for x in bbox]
        detections.append({
            "class": class_name,
            "confidence": confidence,
            "bbox": bbox
        })

    # Calculate inference time
    inference_time = time.time() - start_time

    # Add inference time and device info
    device_info = "GPU" if torch.cuda.is_available() else "CPU"

    return {
        "image": image_to_base64(result_image),
        "detections": detections,
        "performance": {
            "inference_time": round(inference_time, 3),
            "device": device_info
        }
    }

def process_detr(image):
    if detr_model is None or detr_processor is None:
        return {"error": "DETR model not loaded"}

    # Measure inference time
    start_time = time.time()

    # Prepare image for the model
    inputs = detr_processor(images=image, return_tensors="pt")

    # Run inference
    with torch.no_grad():
        outputs = detr_model(**inputs)

    # Process results
    target_sizes = torch.tensor([image.size[::-1]])
    results = detr_processor.post_process_object_detection(
        outputs, target_sizes=target_sizes, threshold=0.9
    )[0]

    # Create a copy of the image to draw on
    result_image = image.copy()
    fig, ax = plt.subplots(1)
    ax.imshow(result_image)

    # Format detection results
    detections = []
    for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
        box = [round(i) for i in box.tolist()]
        class_name = detr_model.config.id2label[label.item()]
        confidence = round(score.item(), 2)

        # Draw rectangle
        rect = Rectangle((box[0], box[1]), box[2] - box[0], box[3] - box[1],
                         linewidth=2, edgecolor='r', facecolor='none')
        ax.add_patch(rect)

        # Add label
        plt.text(box[0], box[1], "{}: {}".format(class_name, confidence),
                 bbox=dict(facecolor='white', alpha=0.8))

        detections.append({
            "class": class_name,
            "confidence": confidence,
            "bbox": box
        })

    # Save figure to image
    buf = io.BytesIO()
    plt.tight_layout()
    plt.axis('off')
    plt.savefig(buf, format='png', bbox_inches='tight', pad_inches=0)
    buf.seek(0)
    result_image = Image.open(buf)
    plt.close(fig)

    # Calculate inference time
    inference_time = time.time() - start_time

    # Add inference time and device info
    device_info = "GPU" if torch.cuda.is_available() else "CPU"

    return {
        "image": image_to_base64(result_image),
        "detections": detections,
        "performance": {
            "inference_time": round(inference_time, 3),
            "device": device_info
        }
    }

def process_vit(image):
    if vit_model is None or vit_processor is None:
        return {"error": "ViT model not loaded"}

    # Measure inference time
    start_time = time.time()

    # Prepare image for the model
    inputs = vit_processor(images=image, return_tensors="pt")

    # Run inference
    with torch.no_grad():
        outputs = vit_model(**inputs)
        logits = outputs.logits

    # Get the predicted class
    predicted_class_idx = logits.argmax(-1).item()
    prediction = vit_model.config.id2label[predicted_class_idx]

    # Get top 5 predictions
    probs = torch.nn.functional.softmax(logits, dim=-1)[0]
    top5_prob, top5_indices = torch.topk(probs, 5)

    results = []
    for i, (prob, idx) in enumerate(zip(top5_prob, top5_indices)):
        class_name = vit_model.config.id2label[idx.item()]
        results.append({
            "rank": i+1,
            "class": class_name,
            "probability": round(prob.item(), 3)
        })

    # Calculate inference time
    inference_time = time.time() - start_time

    # Add inference time and device info
    device_info = "GPU" if torch.cuda.is_available() else "CPU"

    return {
        "top_predictions": results,
        "performance": {
            "inference_time": round(inference_time, 3),
            "device": device_info
        }
    }

@app.route('/api/detect/yolo', methods=['POST'])
def yolo_detect():
    if 'image' not in request.files:
        return jsonify({"error": "No image provided"}), 400

    file = request.files['image']
    image = Image.open(file.stream)

    result = process_yolo(image)
    return jsonify(result)

@app.route('/api/detect/detr', methods=['POST'])
def detr_detect():
    if 'image' not in request.files:
        return jsonify({"error": "No image provided"}), 400

    file = request.files['image']
    image = Image.open(file.stream)

    result = process_detr(image)
    return jsonify(result)

@app.route('/api/classify/vit', methods=['POST'])
def vit_classify():
    if 'image' not in request.files:
        return jsonify({"error": "No image provided"}), 400

    file = request.files['image']
    image = Image.open(file.stream)

    result = process_vit(image)
    return jsonify(result)

@app.route('/api/analyze', methods=['POST'])
def analyze_with_llm():
    # Check if required data is in the request
    if not request.json:
        return jsonify({"error": "No JSON data provided"}), 400

    # Extract vision results and user query from request
    data = request.json
    if 'visionResults' not in data or 'userQuery' not in data:
        return jsonify({"error": "Missing required fields: visionResults or userQuery"}), 400

    vision_results = data['visionResults']
    user_query = data['userQuery']

    # Process the query with LLM
    result = process_llm_query(vision_results, user_query)

    return jsonify(result)

@app.route('/api/status', methods=['GET'])
def status():
    return jsonify({
        "status": "online",
        "models": {
            "yolo": yolo_model is not None,
            "detr": detr_model is not None and detr_processor is not None,
            "vit": vit_model is not None and vit_processor is not None
        },
        "device": "GPU" if torch.cuda.is_available() else "CPU"
    })

@app.route('/')
def index():
    return send_from_directory('static', 'index.html')

if __name__ == "__main__":
    # Hugging Face Spaces supplies the port via the PORT environment variable
    port = int(os.environ.get("PORT", 7860))
    app.run(debug=False, host='0.0.0.0', port=port)
```
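The `/api/analyze` route above is the one endpoint that takes JSON rather than form data. A minimal request sketch (not part of the commit; the detection payload and question are illustrative, and the backend is assumed to be on localhost:7860):

```python
import requests

# Feed earlier detection output plus a free-text question to the LLM endpoint.
payload = {
    "visionResults": [
        {"class": "dog", "confidence": 0.91, "bbox": [34, 50, 210, 300]}  # example detection
    ],
    "userQuery": "What animals are in this picture?",
}
resp = requests.post("http://localhost:7860/api/analyze", json=payload)
print(resp.json().get("response"))  # LLM answer, or None if an "error" key came back
```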
app.py
ADDED
@@ -0,0 +1,298 @@
```python
import gradio as gr
import torch
from PIL import Image
import numpy as np
import os

# Model initialization
print("Loading models... This may take a moment.")

# YOLOv8 model
yolo_model = None
try:
    from ultralytics import YOLO
    yolo_model = YOLO("yolov8n.pt")  # Using the nano model for faster inference
    print("YOLOv8 model loaded successfully")
except Exception as e:
    print("Error loading YOLOv8 model:", e)
    yolo_model = None

# DETR model (DEtection TRansformer)
detr_processor = None
detr_model = None
try:
    from transformers import DetrImageProcessor, DetrForObjectDetection

    # Load the DETR image processor
    # DetrImageProcessor: Handles preprocessing of images for DETR model
    # - Resizes images to appropriate dimensions
    # - Normalizes pixel values
    # - Converts images to tensors
    # - Handles batch processing
    detr_processor = DetrImageProcessor.from_pretrained("facebook/detr-resnet-50")

    # Load the DETR object detection model
    # DetrForObjectDetection: The actual object detection model
    # - Uses ResNet-50 as backbone
    # - Transformer-based architecture for object detection
    # - Predicts bounding boxes and object classes
    # - Pre-trained on COCO dataset by Facebook AI Research
    detr_model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50")

    print("DETR model loaded successfully")
except Exception as e:
    print("Error loading DETR model:", e)
    detr_processor = None
    detr_model = None

# ViT model
vit_processor = None
vit_model = None
try:
    from transformers import ViTImageProcessor, ViTForImageClassification
    vit_processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224")
    vit_model = ViTForImageClassification.from_pretrained("google/vit-base-patch16-224")
    print("ViT model loaded successfully")
except Exception as e:
    print("Error loading ViT model:", e)
    vit_processor = None
    vit_model = None

# Get device information
import torch
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

# Define model inference functions
def process_yolo(image):
    if yolo_model is None:
        return None, "YOLOv8 model not loaded"

    # Measure inference time
    import time
    start_time = time.time()

    # Convert to numpy if it's a PIL image
    if isinstance(image, Image.Image):
        image_np = np.array(image)
    else:
        image_np = image

    # Run inference
    results = yolo_model(image_np)

    # Process results
    result_image = results[0].plot()
    result_image = Image.fromarray(result_image)

    # Get detection information
    boxes = results[0].boxes
    class_names = results[0].names

    # Format detection results
    detections = []
    for box in boxes:
        class_id = int(box.cls[0].item())
        class_name = class_names[class_id]
        confidence = round(box.conf[0].item(), 2)
        bbox = box.xyxy[0].tolist()
        bbox = [round(x) for x in bbox]
        detections.append("{}: {} at {}".format(class_name, confidence, bbox))

    # Calculate inference time
    inference_time = time.time() - start_time

    # Add inference time and device info to detection text
    device_info = "GPU" if torch.cuda.is_available() else "CPU"
    performance_info = f"\n\nInference time: {inference_time:.3f} seconds on {device_info}"
    detection_text = "\n".join(detections) if detections else "No objects detected"
    detection_text += performance_info

    return result_image, detection_text

def process_detr(image):
    if detr_model is None or detr_processor is None:
        return None, "DETR model not loaded"

    # Measure inference time
    import time
    start_time = time.time()

    # Prepare image for the model
    inputs = detr_processor(images=image, return_tensors="pt")

    # Run inference
    with torch.no_grad():
        outputs = detr_model(**inputs)

    # Convert outputs to image with bounding boxes
    # Create tensor with original image dimensions (height, width)
    # image.size[::-1] reverses the (width, height) to (height, width) as required by DETR
    target_sizes = torch.tensor([image.size[::-1]])

    # Process raw model outputs into usable detection results
    # - Maps predictions back to original image size
    # - Filters detections using confidence threshold (0.9)
    # - Returns a dictionary with 'scores', 'labels', and 'boxes' keys
    # - [0] extracts results for the first (and only) image in the batch
    results = detr_processor.post_process_object_detection(
        outputs, target_sizes=target_sizes, threshold=0.9
    )[0]

    # Create a copy of the image to draw on
    result_image = image.copy()
    import matplotlib.pyplot as plt
    from matplotlib.patches import Rectangle
    import io

    # Create figure and axes
    fig, ax = plt.subplots(1)
    ax.imshow(result_image)

    # Format detection results
    detections = []
    for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
        box = [round(i) for i in box.tolist()]
        class_name = detr_model.config.id2label[label.item()]
        confidence = round(score.item(), 2)

        # Draw rectangle
        rect = Rectangle((box[0], box[1]), box[2] - box[0], box[3] - box[1],
                         linewidth=2, edgecolor='r', facecolor='none')
        ax.add_patch(rect)

        # Add label
        plt.text(box[0], box[1], "{}: {}".format(class_name, confidence),
                 bbox=dict(facecolor='white', alpha=0.8))

        detections.append("{}: {} at {}".format(class_name, confidence, box))

    # Save figure to image
    buf = io.BytesIO()
    plt.tight_layout()
    plt.axis('off')
    plt.savefig(buf, format='png', bbox_inches='tight', pad_inches=0)
    buf.seek(0)
    result_image = Image.open(buf)
    plt.close(fig)

    # Calculate inference time
    inference_time = time.time() - start_time

    # Add inference time and device info to detection text
    device_info = "GPU" if torch.cuda.is_available() else "CPU"
    performance_info = f"\n\nInference time: {inference_time:.3f} seconds on {device_info}"
    detection_text = "\n".join(detections) if detections else "No objects detected"
    detection_text += performance_info

    return result_image, detection_text

def process_vit(image):
    if vit_model is None or vit_processor is None:
        return "ViT model not loaded"

    # Measure inference time
    import time
    start_time = time.time()

    # Prepare image for the model
    inputs = vit_processor(images=image, return_tensors="pt")

    # Run inference
    with torch.no_grad():
        outputs = vit_model(**inputs)
        # Extract raw logits (unnormalized scores) from model output
        # Hugging Face models return logits directly, not probabilities
        logits = outputs.logits

    # Get the predicted class
    # argmax(-1) finds the index with highest score across the last dimension (class dimension)
    # item() converts the tensor value to a Python scalar
    predicted_class_idx = logits.argmax(-1).item()
    # Map the class index to human-readable label using the model's configuration
    prediction = vit_model.config.id2label[predicted_class_idx]

    # Get top 5 predictions
    # Apply softmax to convert raw logits to probabilities
    # softmax normalizes the exponentials of logits so they sum to 1.0
    # dim=-1 applies softmax along the class dimension
    # Shape before softmax: [1, num_classes] (batch_size=1, num_classes=1000)
    # [0] extracts the first (and only) item from the batch dimension
    # Shape after [0]: [num_classes] (a 1D tensor with 1000 class probabilities)
    probs = torch.nn.functional.softmax(logits, dim=-1)[0]
    # Get the values and indices of the 5 highest probabilities
    top5_prob, top5_indices = torch.topk(probs, 5)

    results = []
    for i, (prob, idx) in enumerate(zip(top5_prob, top5_indices)):
        class_name = vit_model.config.id2label[idx.item()]
        results.append("{}. {}: {:.3f}".format(i+1, class_name, prob.item()))

    # Calculate inference time
    inference_time = time.time() - start_time

    # Add inference time and device info to results
    device_info = "GPU" if torch.cuda.is_available() else "CPU"
    performance_info = f"\n\nInference time: {inference_time:.3f} seconds on {device_info}"
    result_text = "\n".join(results)
    result_text += performance_info

    return result_text

# Define Gradio interface
with gr.Blocks(title="Object Detection Demo") as demo:
    gr.Markdown("""
    # Multi-Model Object Detection Demo

    This demo showcases three different object detection and image classification models:
    - **YOLOv8**: Fast and accurate object detection
    - **DETR**: DEtection TRansformer for object detection
    - **ViT**: Vision Transformer for image classification

    Upload an image to see how each model performs!
    """)

    with gr.Row():
        input_image = gr.Image(type="pil", label="Input Image")

    with gr.Row():
        yolo_button = gr.Button("Detect with YOLOv8")
        detr_button = gr.Button("Detect with DETR")
        vit_button = gr.Button("Classify with ViT")

    with gr.Row():
        with gr.Column():
            yolo_output = gr.Image(type="pil", label="YOLOv8 Detection")
            yolo_text = gr.Textbox(label="YOLOv8 Results")

        with gr.Column():
            detr_output = gr.Image(type="pil", label="DETR Detection")
            detr_text = gr.Textbox(label="DETR Results")

        with gr.Column():
            vit_text = gr.Textbox(label="ViT Classification Results")

    # Set up event handlers
    yolo_button.click(
        fn=process_yolo,
        inputs=input_image,
        outputs=[yolo_output, yolo_text]
    )

    detr_button.click(
        fn=process_detr,
        inputs=input_image,
        outputs=[detr_output, detr_text]
    )

    vit_button.click(
        fn=process_vit,
        inputs=input_image,
        outputs=vit_text
    )


# Launch the app
if __name__ == "__main__":
    demo.launch()
```
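Because `demo.launch()` is guarded by `__main__`, the handler functions can also be exercised without the UI. A minimal offline check (not part of the commit; the image filename is illustrative, and importing `app` will trigger the module-level model loading):

```python
from PIL import Image
from app import process_vit  # loads all models as a side effect of import

img = Image.open("sample.jpg")  # hypothetical test image
print(process_vit(img))  # prints the top-5 classes plus inference-time info
```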
frontend/build/asset-manifest.json
ADDED
@@ -0,0 +1,24 @@
```json
{
  "files": {
    "main.css": "/static/css/main.59c2a54e.chunk.css",
    "main.js": "/static/js/main.3d1593c5.chunk.js",
    "main.js.map": "/static/js/main.3d1593c5.chunk.js.map",
    "runtime-main.js": "/static/js/runtime-main.ab7e4402.js",
    "runtime-main.js.map": "/static/js/runtime-main.ab7e4402.js.map",
    "static/js/2.74e99ef6.chunk.js": "/static/js/2.74e99ef6.chunk.js",
    "static/js/2.74e99ef6.chunk.js.map": "/static/js/2.74e99ef6.chunk.js.map",
    "static/js/3.0e3ce0f8.chunk.js": "/static/js/3.0e3ce0f8.chunk.js",
    "static/js/3.0e3ce0f8.chunk.js.map": "/static/js/3.0e3ce0f8.chunk.js.map",
    "index.html": "/index.html",
    "precache-manifest.e8825d818084296fa14f1b32e8815c1e.js": "/precache-manifest.e8825d818084296fa14f1b32e8815c1e.js",
    "service-worker.js": "/service-worker.js",
    "static/css/main.59c2a54e.chunk.css.map": "/static/css/main.59c2a54e.chunk.css.map",
    "static/js/2.74e99ef6.chunk.js.LICENSE.txt": "/static/js/2.74e99ef6.chunk.js.LICENSE.txt"
  },
  "entrypoints": [
    "static/js/runtime-main.ab7e4402.js",
    "static/js/2.74e99ef6.chunk.js",
    "static/css/main.59c2a54e.chunk.css",
    "static/js/main.3d1593c5.chunk.js"
  ]
}
```
frontend/build/index.html
ADDED
@@ -0,0 +1 @@
```html
<!doctype html><html lang="en"><head><meta charset="utf-8"/><link rel="icon" href="/favicon.ico"/><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="theme-color" content="#000000"/><meta name="description" content="Multi-Model Object Detection Demo"/><link rel="apple-touch-icon" href="/logo192.png"/><link rel="manifest" href="/manifest.json"/><title>Vision Web App</title><link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto:300,400,500,700&display=swap"/><link href="/static/css/main.59c2a54e.chunk.css" rel="stylesheet"></head><body><noscript>You need to enable JavaScript to run this app.</noscript><div id="root"></div><script>!function(e){function r(r){for(var n,i,a=r[0],c=r[1],l=r[2],p=0,s=[];p<a.length;p++)i=a[p],Object.prototype.hasOwnProperty.call(o,i)&&o[i]&&s.push(o[i][0]),o[i]=0;for(n in c)Object.prototype.hasOwnProperty.call(c,n)&&(e[n]=c[n]);for(f&&f(r);s.length;)s.shift()();return u.push.apply(u,l||[]),t()}function t(){for(var e,r=0;r<u.length;r++){for(var t=u[r],n=!0,a=1;a<t.length;a++){var c=t[a];0!==o[c]&&(n=!1)}n&&(u.splice(r--,1),e=i(i.s=t[0]))}return e}var n={},o={1:0},u=[];function i(r){if(n[r])return n[r].exports;var t=n[r]={i:r,l:!1,exports:{}};return e[r].call(t.exports,t,t.exports,i),t.l=!0,t.exports}i.e=function(e){var r=[],t=o[e];if(0!==t)if(t)r.push(t[2]);else{var n=new Promise((function(r,n){t=o[e]=[r,n]}));r.push(t[2]=n);var u,a=document.createElement("script");a.charset="utf-8",a.timeout=120,i.nc&&a.setAttribute("nonce",i.nc),a.src=function(e){return i.p+"static/js/"+({}[e]||e)+"."+{3:"0e3ce0f8"}[e]+".chunk.js"}(e);var c=new Error;u=function(r){a.onerror=a.onload=null,clearTimeout(l);var t=o[e];if(0!==t){if(t){var n=r&&("load"===r.type?"missing":r.type),u=r&&r.target&&r.target.src;c.message="Loading chunk "+e+" failed.\n("+n+": "+u+")",c.name="ChunkLoadError",c.type=n,c.request=u,t[1](c)}o[e]=void 0}};var l=setTimeout((function(){u({type:"timeout",target:a})}),12e4);a.onerror=a.onload=u,document.head.appendChild(a)}return Promise.all(r)},i.m=e,i.c=n,i.d=function(e,r,t){i.o(e,r)||Object.defineProperty(e,r,{enumerable:!0,get:t})},i.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},i.t=function(e,r){if(1&r&&(e=i(e)),8&r)return e;if(4&r&&"object"==typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(i.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&r&&"string"!=typeof e)for(var n in e)i.d(t,n,function(r){return e[r]}.bind(null,n));return t},i.n=function(e){var r=e&&e.__esModule?function(){return e.default}:function(){return e};return i.d(r,"a",r),r},i.o=function(e,r){return Object.prototype.hasOwnProperty.call(e,r)},i.p="/",i.oe=function(e){throw console.error(e),e};var a=this["webpackJsonpvision-web-app"]=this["webpackJsonpvision-web-app"]||[],c=a.push.bind(a);a.push=r,a=a.slice();for(var l=0;l<a.length;l++)r(a[l]);var f=c;t()}([])</script><script src="/static/js/2.74e99ef6.chunk.js"></script><script src="/static/js/main.3d1593c5.chunk.js"></script></body></html>
```
frontend/build/manifest.json
ADDED
@@ -0,0 +1,15 @@
```json
{
  "short_name": "Vision Web App",
  "name": "Multi-Model Object Detection Demo",
  "icons": [
    {
      "src": "favicon.ico",
      "sizes": "64x64 32x32 24x24 16x16",
      "type": "image/x-icon"
    }
  ],
  "start_url": ".",
  "display": "standalone",
  "theme_color": "#000000",
  "background_color": "#ffffff"
}
```
frontend/build/precache-manifest.e8825d818084296fa14f1b32e8815c1e.js
ADDED
@@ -0,0 +1,30 @@
```js
self.__precacheManifest = (self.__precacheManifest || []).concat([
  {
    "revision": "d0bb3100f90d81918e1af551a295cc55",
    "url": "/index.html"
  },
  {
    "revision": "c40859d2e49da0b79907",
    "url": "/static/css/main.59c2a54e.chunk.css"
  },
  {
    "revision": "f417a8c7af9034db933e",
    "url": "/static/js/2.74e99ef6.chunk.js"
  },
  {
    "revision": "89a1b2dcd30c03705b2bceeb141b76b6",
    "url": "/static/js/2.74e99ef6.chunk.js.LICENSE.txt"
  },
  {
    "revision": "25f9bd0c0371bb013559",
    "url": "/static/js/3.0e3ce0f8.chunk.js"
  },
  {
    "revision": "c40859d2e49da0b79907",
    "url": "/static/js/main.3d1593c5.chunk.js"
  },
  {
    "revision": "97a891da203626eda40e",
    "url": "/static/js/runtime-main.ab7e4402.js"
  }
]);
```
frontend/build/service-worker.js
ADDED
@@ -0,0 +1,39 @@
```js
/**
 * Welcome to your Workbox-powered service worker!
 *
 * You'll need to register this file in your web app and you should
 * disable HTTP caching for this file too.
 * See https://goo.gl/nhQhGp
 *
 * The rest of the code is auto-generated. Please don't update this file
 * directly; instead, make changes to your Workbox build configuration
 * and re-run your build process.
 * See https://goo.gl/2aRDsh
 */

importScripts("https://storage.googleapis.com/workbox-cdn/releases/4.3.1/workbox-sw.js");

importScripts(
  "/precache-manifest.e8825d818084296fa14f1b32e8815c1e.js"
);

self.addEventListener('message', (event) => {
  if (event.data && event.data.type === 'SKIP_WAITING') {
    self.skipWaiting();
  }
});

workbox.core.clientsClaim();

/**
 * The workboxSW.precacheAndRoute() method efficiently caches and responds to
 * requests for URLs in the manifest.
 * See https://goo.gl/S9QRab
 */
self.__precacheManifest = [].concat(self.__precacheManifest || []);
workbox.precaching.precacheAndRoute(self.__precacheManifest, {});

workbox.routing.registerNavigationRoute(workbox.precaching.getCacheKeyForURL("/index.html"), {

  blacklist: [/^\/_/,/\/[^\/?]+\.[^\/]+$/],
});
```
frontend/build/static/css/main.59c2a54e.chunk.css
ADDED
@@ -0,0 +1,2 @@
```css
body{margin:0;font-family:-apple-system,BlinkMacSystemFont,"Segoe UI","Roboto","Oxygen","Ubuntu","Cantarell","Fira Sans","Droid Sans","Helvetica Neue",sans-serif;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale;background-color:#f5f5f5}code{font-family:source-code-pro,Menlo,Monaco,Consolas,"Courier New",monospace}.App{text-align:center}.preview-image{max-width:100%;max-height:300px;margin-top:16px}.result-image{max-width:100%;border:1px solid #ddd;border-radius:4px;padding:4px}.detection-list{margin-top:16px;text-align:left}.model-card{cursor:pointer;transition:all .3s}.model-card:hover{transform:translateY(-5px);box-shadow:0 10px 20px rgba(0,0,0,.1)}.model-card.selected{border:2px solid #3f51b5;background-color:#e8eaf6}.model-card.disabled{opacity:.6;cursor:not-allowed}.performance-info{margin-top:16px;font-size:.9rem;color:#666}
/*# sourceMappingURL=main.59c2a54e.chunk.css.map */
```
frontend/build/static/css/main.59c2a54e.chunk.css.map
ADDED
@@ -0,0 +1 @@
```json
{"version":3,"sources":["index.css","App.css"],"names":[],"mappings":"AAAA,KACE,QAAS,CACT,mJAEY,CACZ,kCAAmC,CACnC,iCAAkC,CAClC,wBACF,CAEA,KACE,yEAEF,CCbA,KACE,iBACF,CAEA,eACE,cAAe,CACf,gBAAiB,CACjB,eACF,CAEA,cACE,cAAe,CACf,qBAAsB,CACtB,iBAAkB,CAClB,WACF,CAEA,gBACE,eAAgB,CAChB,eACF,CAEA,YACE,cAAe,CACf,kBACF,CAEA,kBACE,0BAA2B,CAC3B,qCACF,CAEA,qBACE,wBAAyB,CACzB,wBACF,CAEA,qBACE,UAAY,CACZ,kBACF,CAEA,kBACE,eAAgB,CAChB,eAAiB,CACjB,UACF","file":"main.59c2a54e.chunk.css","sourcesContent":["body {\n margin: 0;\n font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',\n 'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',\n sans-serif;\n -webkit-font-smoothing: antialiased;\n -moz-osx-font-smoothing: grayscale;\n background-color: #f5f5f5;\n}\n\ncode {\n font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New',\n monospace;\n}\n",".App {\n text-align: center;\n}\n\n.preview-image {\n max-width: 100%;\n max-height: 300px;\n margin-top: 16px;\n}\n\n.result-image {\n max-width: 100%;\n border: 1px solid #ddd;\n border-radius: 4px;\n padding: 4px;\n}\n\n.detection-list {\n margin-top: 16px;\n text-align: left;\n}\n\n.model-card {\n cursor: pointer;\n transition: all 0.3s;\n}\n\n.model-card:hover {\n transform: translateY(-5px);\n box-shadow: 0 10px 20px rgba(0,0,0,0.1);\n}\n\n.model-card.selected {\n border: 2px solid #3f51b5;\n background-color: #e8eaf6;\n}\n\n.model-card.disabled {\n opacity: 0.6;\n cursor: not-allowed;\n}\n\n.performance-info {\n margin-top: 16px;\n font-size: 0.9rem;\n color: #666;\n}\n"]}
```
frontend/build/static/js/2.74e99ef6.chunk.js
ADDED
The diff for this file is too large to render.
See raw diff
frontend/build/static/js/2.74e99ef6.chunk.js.LICENSE.txt
ADDED
@@ -0,0 +1,58 @@
```text
/*
object-assign
(c) Sindre Sorhus
@license MIT
*/

/**
 * A better abstraction over CSS.
 *
 * @copyright Oleg Isonen (Slobodskoi) / Isonen 2014-present
 * @website https://github.com/cssinjs/jss
 * @license MIT
 */

/** @license React v0.19.1
 * scheduler.production.min.js
 *
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/** @license React v16.13.1
 * react-is.production.min.js
 *
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/** @license React v16.14.0
 * react-dom.production.min.js
 *
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/** @license React v16.14.0
 * react.production.min.js
 *
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/** @license React v17.0.2
 * react-is.production.min.js
 *
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */
```
frontend/build/static/js/2.74e99ef6.chunk.js.map
ADDED
The diff for this file is too large to render.
See raw diff
frontend/build/static/js/3.0e3ce0f8.chunk.js
ADDED
@@ -0,0 +1,2 @@
```js
(this["webpackJsonpvision-web-app"]=this["webpackJsonpvision-web-app"]||[]).push([[3],{137:function(t,n,e){"use strict";e.r(n),e.d(n,"getCLS",(function(){return l})),e.d(n,"getFCP",(function(){return g})),e.d(n,"getFID",(function(){return h})),e.d(n,"getLCP",(function(){return y})),e.d(n,"getTTFB",(function(){return F}));var i,a,r=function(){return"".concat(Date.now(),"-").concat(Math.floor(8999999999999*Math.random())+1e12)},o=function(t){var n=arguments.length>1&&void 0!==arguments[1]?arguments[1]:-1;return{name:t,value:n,delta:0,entries:[],id:r(),isFinal:!1}},u=function(t,n){try{if(PerformanceObserver.supportedEntryTypes.includes(t)){var e=new PerformanceObserver((function(t){return t.getEntries().map(n)}));return e.observe({type:t,buffered:!0}),e}}catch(t){}},s=!1,c=!1,p=function(t){s=!t.persisted},d=function(){addEventListener("pagehide",p),addEventListener("beforeunload",(function(){}))},f=function(t){var n=arguments.length>1&&void 0!==arguments[1]&&arguments[1];c||(d(),c=!0),addEventListener("visibilitychange",(function(n){var e=n.timeStamp;"hidden"===document.visibilityState&&t({timeStamp:e,isUnloading:s})}),{capture:!0,once:n})},v=function(t,n,e,i){var a;return function(){e&&n.isFinal&&e.disconnect(),n.value>=0&&(i||n.isFinal||"hidden"===document.visibilityState)&&(n.delta=n.value-(a||0),(n.delta||n.isFinal||void 0===a)&&(t(n),a=n.value))}},l=function(t){var n,e=arguments.length>1&&void 0!==arguments[1]&&arguments[1],i=o("CLS",0),a=function(t){t.hadRecentInput||(i.value+=t.value,i.entries.push(t),n())},r=u("layout-shift",a);r&&(n=v(t,i,r,e),f((function(t){var e=t.isUnloading;r.takeRecords().map(a),e&&(i.isFinal=!0),n()})))},m=function(){return void 0===i&&(i="hidden"===document.visibilityState?0:1/0,f((function(t){var n=t.timeStamp;return i=n}),!0)),{get timeStamp(){return i}}},g=function(t){var n,e=o("FCP"),i=m(),a=u("paint",(function(t){"first-contentful-paint"===t.name&&t.startTime<i.timeStamp&&(e.value=t.startTime,e.isFinal=!0,e.entries.push(t),n())}));a&&(n=v(t,e,a))},h=function(t){var n=o("FID"),e=m(),i=function(t){t.startTime<e.timeStamp&&(n.value=t.processingStart-t.startTime,n.entries.push(t),n.isFinal=!0,r())},a=u("first-input",i),r=v(t,n,a);a?f((function(){a.takeRecords().map(i),a.disconnect()}),!0):window.perfMetrics&&window.perfMetrics.onFirstInputDelay&&window.perfMetrics.onFirstInputDelay((function(t,i){i.timeStamp<e.timeStamp&&(n.value=t,n.isFinal=!0,n.entries=[{entryType:"first-input",name:i.type,target:i.target,cancelable:i.cancelable,startTime:i.timeStamp,processingStart:i.timeStamp+t}],r())}))},S=function(){return a||(a=new Promise((function(t){return["scroll","keydown","pointerdown"].map((function(n){addEventListener(n,t,{once:!0,passive:!0,capture:!0})}))}))),a},y=function(t){var n,e=arguments.length>1&&void 0!==arguments[1]&&arguments[1],i=o("LCP"),a=m(),r=function(t){var e=t.startTime;e<a.timeStamp?(i.value=e,i.entries.push(t)):i.isFinal=!0,n()},s=u("largest-contentful-paint",r);if(s){n=v(t,i,s,e);var c=function(){i.isFinal||(s.takeRecords().map(r),i.isFinal=!0,n())};S().then(c),f(c,!0)}},F=function(t){var n,e=o("TTFB");n=function(){try{var n=performance.getEntriesByType("navigation")[0]||function(){var t=performance.timing,n={entryType:"navigation",startTime:0};for(var e in t)"navigationStart"!==e&&"toJSON"!==e&&(n[e]=Math.max(t[e]-t.navigationStart,0));return n}();e.value=e.delta=n.responseStart,e.entries=[n],e.isFinal=!0,t(e)}catch(t){}},"complete"===document.readyState?setTimeout(n,0):addEventListener("pageshow",n)}}}]);
//# sourceMappingURL=3.0e3ce0f8.chunk.js.map
```
frontend/build/static/js/3.0e3ce0f8.chunk.js.map
ADDED
@@ -0,0 +1 @@
```json
{"version":3,"sources":["../node_modules/web-vitals/dist/web-vitals.es5.min.js"],"names":["v","t","n","e","concat","Date","now","Math","floor","random","i","arguments","length","name","value","delta","entries","id","isFinal","a","PerformanceObserver","supportedEntryTypes","includes","getEntries","map","observe","type","buffered","r","o","s","persisted","u","addEventListener","c","timeStamp","document","visibilityState","isUnloading","capture","once","l","disconnect","p","hadRecentInput","push","takeRecords","d","startTime","f","processingStart","window","perfMetrics","onFirstInputDelay","entryType","target","cancelable","m","Promise","passive","g","then","h","performance","getEntriesByType","timing","max","navigationStart","responseStart","readyState","setTimeout"],"mappings":"wHAAA,gFAAAA,KAAA,0HAAIC,EAAEC,EAAEC,EAAE,WAAW,MAAM,GAAGC,OAAOC,KAAKC,MAAM,KAAKF,OAAOG,KAAKC,MAAM,cAAcD,KAAKE,UAAU,OAAOC,EAAE,SAAST,GAAG,IAAIC,EAAES,UAAUC,OAAO,QAAG,IAASD,UAAU,GAAGA,UAAU,IAAI,EAAE,MAAM,CAACE,KAAKZ,EAAEa,MAAMZ,EAAEa,MAAM,EAAEC,QAAQ,GAAGC,GAAGd,IAAIe,SAAQ,IAAKC,EAAE,SAASlB,EAAEC,GAAG,IAAI,GAAGkB,oBAAoBC,oBAAoBC,SAASrB,GAAG,CAAC,IAAIE,EAAE,IAAIiB,qBAAqB,SAASnB,GAAG,OAAOA,EAAEsB,aAAaC,IAAItB,MAAM,OAAOC,EAAEsB,QAAQ,CAACC,KAAKzB,EAAE0B,UAAS,IAAKxB,GAAG,MAAMF,MAAM2B,GAAE,EAAGC,GAAE,EAAGC,EAAE,SAAS7B,GAAG2B,GAAG3B,EAAE8B,WAAWC,EAAE,WAAWC,iBAAiB,WAAWH,GAAGG,iBAAiB,gBAAgB,gBAAgBC,EAAE,SAASjC,GAAG,IAAIC,EAAES,UAAUC,OAAO,QAAG,IAASD,UAAU,IAAIA,UAAU,GAAGkB,IAAIG,IAAIH,GAAE,GAAII,iBAAiB,oBAAoB,SAAS/B,GAAG,IAAIC,EAAED,EAAEiC,UAAU,WAAWC,SAASC,iBAAiBpC,EAAE,CAACkC,UAAUhC,EAAEmC,YAAYV,MAAM,CAACW,SAAQ,EAAGC,KAAKtC,KAAKuC,EAAE,SAASxC,EAAEC,EAAEC,EAAEO,GAAG,IAAIS,EAAE,OAAO,WAAWhB,GAAGD,EAAEgB,SAASf,EAAEuC,aAAaxC,EAAEY,OAAO,IAAIJ,GAAGR,EAAEgB,SAAS,WAAWkB,SAASC,mBAAmBnC,EAAEa,MAAMb,EAAEY,OAAOK,GAAG,IAAIjB,EAAEa,OAAOb,EAAEgB,cAAS,IAASC,KAAKlB,EAAEC,GAAGiB,EAAEjB,EAAEY,UAAU6B,EAAE,SAAS1C,GAAG,IAAIC,EAAEC,EAAEQ,UAAUC,OAAO,QAAG,IAASD,UAAU,IAAIA,UAAU,GAAGiB,EAAElB,EAAE,MAAM,GAAGmB,EAAE,SAAS5B,GAAGA,EAAE2C,iBAAiBhB,EAAEd,OAAOb,EAAEa,MAAMc,EAAEZ,QAAQ6B,KAAK5C,GAAGC,MAAM4B,EAAEX,EAAE,eAAeU,GAAGC,IAAI5B,EAAEuC,EAAExC,EAAE2B,EAAEE,EAAE3B,GAAG+B,GAAG,SAASjC,GAAG,IAAIE,EAAEF,EAAEqC,YAAYR,EAAEgB,cAActB,IAAIK,GAAG1B,IAAIyB,EAAEV,SAAQ,GAAIhB,SAAS6C,EAAE,WAAW,YAAO,IAAS9C,IAAIA,EAAE,WAAWmC,SAASC,gBAAgB,EAAE,IAAIH,GAAG,SAAShC,GAAG,IAAIC,EAAED,EAAEiC,UAAU,OAAOlC,EAAEE,KAAI,IAAK,CAAC,gBAAgB,OAAOF,KAAKD,EAAE,SAASC,GAAG,IAAIC,EAAEC,EAAEO,EAAE,OAAOkB,EAAEmB,IAAIlB,EAAEV,EAAE,SAAS,SAASlB,GAAG,2BAA2BA,EAAEY,MAAMZ,EAAE+C,UAAUpB,EAAEO,YAAYhC,EAAEW,MAAMb,EAAE+C,UAAU7C,EAAEe,SAAQ,EAAGf,EAAEa,QAAQ6B,KAAK5C,GAAGC,QAAQ2B,IAAI3B,EAAEuC,EAAExC,EAAEE,EAAE0B,KAAKoB,EAAE,SAAShD,GAAG,IAAIC,EAAEQ,EAAE,OAAOP,EAAE4C,IAAInB,EAAE,SAAS3B,GAAGA,EAAE+C,UAAU7C,EAAEgC,YAAYjC,EAAEY,MAAMb,EAAEiD,gBAAgBjD,EAAE+C,UAAU9C,EAAEc,QAAQ6B,KAAK5C,GAAGC,EAAEgB,SAAQ,EAAGY,MAAMD,EAAEV,EAAE,cAAcS,GAAGE,EAAEW,EAAExC,EAAEC,EAAE2B,GAAGA,EAAEK,GAAG,WAAWL,EAAEiB,cAActB,IAAII,GAAGC,EAAEa,gBAAe,GAAIS,OAAOC,aAAaD,OAAOC,YAAYC,mBAAmBF,OAAOC,YAAYC,mBAAmB,SAASpD,EAAES,GAAGA,EAAEyB,UAAUhC,EAAEgC,YAAYjC,EAAEY,MAAMb,EAAEC,EAAEgB,SAAQ,EAAGhB,EAAEc,QAAQ,CAAC,CAACsC,UAAU,cAAczC,KAAKH,EAAEgB,KAAK6B,OAAO7C,EAAE6C,OAAOC,WAAW9C,EAAE8C,WAAWR,UAAUtC,EAAEyB,UAAUe,gBAAgBxC,EAAEyB,UAAUlC,IAAI6B,SAAS2B,EAAE,WAAW,OAAOvD,IAAIA,EAAE,IAAIwD,SAAS,SAASzD,GAAG,MAAM,CAAC,SAAS,UAAU,eAAeuB,KAAK,SAAStB,GAAG+B,iBAAiB/B,EAAED,EAAE,CAACuC,MAAK,EAAGmB,SAAQ,EAAGpB,SAAQ,WAAYrC,GAAG0D,EAAE,SAAS3D,GAAG,IAAIC,EAAEC,EAAEQ,UAAUC,OAAO,QAAG,IAASD,UAAU,IAAIA,UAAU,GAAGiB,EAAElB,EAAE,OAAOmB,EAAEkB,IAAIjB,EAAE,SAAS7B,GAAG,IAAIE,EAAEF,EAAE+C,
UAAU7C,EAAE0B,EAAEM,WAAWP,EAAEd,MAAMX,EAAEyB,EAAEZ,QAAQ6B,KAAK5C,IAAI2B,EAAEV,SAAQ,EAAGhB,KAAK8B,EAAEb,EAAE,2BAA2BW,GAAG,GAAGE,EAAE,CAAC9B,EAAEuC,EAAExC,EAAE2B,EAAEI,EAAE7B,GAAG,IAAIwC,EAAE,WAAWf,EAAEV,UAAUc,EAAEc,cAActB,IAAIM,GAAGF,EAAEV,SAAQ,EAAGhB,MAAMuD,IAAII,KAAKlB,GAAGT,EAAES,GAAE,KAAMmB,EAAE,SAAS7D,GAAG,IAAIC,EAAEC,EAAEO,EAAE,QAAQR,EAAE,WAAW,IAAI,IAAIA,EAAE6D,YAAYC,iBAAiB,cAAc,IAAI,WAAW,IAAI/D,EAAE8D,YAAYE,OAAO/D,EAAE,CAACoD,UAAU,aAAaN,UAAU,GAAG,IAAI,IAAI7C,KAAKF,EAAE,oBAAoBE,GAAG,WAAWA,IAAID,EAAEC,GAAGI,KAAK2D,IAAIjE,EAAEE,GAAGF,EAAEkE,gBAAgB,IAAI,OAAOjE,EAAhL,GAAqLC,EAAEW,MAAMX,EAAEY,MAAMb,EAAEkE,cAAcjE,EAAEa,QAAQ,CAACd,GAAGC,EAAEe,SAAQ,EAAGjB,EAAEE,GAAG,MAAMF,MAAM,aAAamC,SAASiC,WAAWC,WAAWpE,EAAE,GAAG+B,iBAAiB,WAAW/B","file":"static/js/3.0e3ce0f8.chunk.js","sourcesContent":["var t,n,e=function(){return\"\".concat(Date.now(),\"-\").concat(Math.floor(8999999999999*Math.random())+1e12)},i=function(t){var n=arguments.length>1&&void 0!==arguments[1]?arguments[1]:-1;return{name:t,value:n,delta:0,entries:[],id:e(),isFinal:!1}},a=function(t,n){try{if(PerformanceObserver.supportedEntryTypes.includes(t)){var e=new PerformanceObserver((function(t){return t.getEntries().map(n)}));return e.observe({type:t,buffered:!0}),e}}catch(t){}},r=!1,o=!1,s=function(t){r=!t.persisted},u=function(){addEventListener(\"pagehide\",s),addEventListener(\"beforeunload\",(function(){}))},c=function(t){var n=arguments.length>1&&void 0!==arguments[1]&&arguments[1];o||(u(),o=!0),addEventListener(\"visibilitychange\",(function(n){var e=n.timeStamp;\"hidden\"===document.visibilityState&&t({timeStamp:e,isUnloading:r})}),{capture:!0,once:n})},l=function(t,n,e,i){var a;return function(){e&&n.isFinal&&e.disconnect(),n.value>=0&&(i||n.isFinal||\"hidden\"===document.visibilityState)&&(n.delta=n.value-(a||0),(n.delta||n.isFinal||void 0===a)&&(t(n),a=n.value))}},p=function(t){var n,e=arguments.length>1&&void 0!==arguments[1]&&arguments[1],r=i(\"CLS\",0),o=function(t){t.hadRecentInput||(r.value+=t.value,r.entries.push(t),n())},s=a(\"layout-shift\",o);s&&(n=l(t,r,s,e),c((function(t){var e=t.isUnloading;s.takeRecords().map(o),e&&(r.isFinal=!0),n()})))},d=function(){return void 0===t&&(t=\"hidden\"===document.visibilityState?0:1/0,c((function(n){var e=n.timeStamp;return t=e}),!0)),{get timeStamp(){return t}}},v=function(t){var n,e=i(\"FCP\"),r=d(),o=a(\"paint\",(function(t){\"first-contentful-paint\"===t.name&&t.startTime<r.timeStamp&&(e.value=t.startTime,e.isFinal=!0,e.entries.push(t),n())}));o&&(n=l(t,e,o))},f=function(t){var n=i(\"FID\"),e=d(),r=function(t){t.startTime<e.timeStamp&&(n.value=t.processingStart-t.startTime,n.entries.push(t),n.isFinal=!0,s())},o=a(\"first-input\",r),s=l(t,n,o);o?c((function(){o.takeRecords().map(r),o.disconnect()}),!0):window.perfMetrics&&window.perfMetrics.onFirstInputDelay&&window.perfMetrics.onFirstInputDelay((function(t,i){i.timeStamp<e.timeStamp&&(n.value=t,n.isFinal=!0,n.entries=[{entryType:\"first-input\",name:i.type,target:i.target,cancelable:i.cancelable,startTime:i.timeStamp,processingStart:i.timeStamp+t}],s())}))},m=function(){return n||(n=new Promise((function(t){return[\"scroll\",\"keydown\",\"pointerdown\"].map((function(n){addEventListener(n,t,{once:!0,passive:!0,capture:!0})}))}))),n},g=function(t){var n,e=arguments.length>1&&void 0!==arguments[1]&&arguments[1],r=i(\"LCP\"),o=d(),s=function(t){var e=t.startTime;e<o.timeStamp?(r.value=e,r.entries.push(t)):r.isFinal=!0,n()},u=a(\"largest-contentful-paint\",s);if(u){n=l(t,r,u,e);var 
p=function(){r.isFinal||(u.takeRecords().map(s),r.isFinal=!0,n())};m().then(p),c(p,!0)}},h=function(t){var n,e=i(\"TTFB\");n=function(){try{var n=performance.getEntriesByType(\"navigation\")[0]||function(){var t=performance.timing,n={entryType:\"navigation\",startTime:0};for(var e in t)\"navigationStart\"!==e&&\"toJSON\"!==e&&(n[e]=Math.max(t[e]-t.navigationStart,0));return n}();e.value=e.delta=n.responseStart,e.entries=[n],e.isFinal=!0,t(e)}catch(t){}},\"complete\"===document.readyState?setTimeout(n,0):addEventListener(\"pageshow\",n)};export{p as getCLS,v as getFCP,f as getFID,g as getLCP,h as getTTFB};\n"],"sourceRoot":""}
frontend/build/static/js/main.3d1593c5.chunk.js
ADDED
@@ -0,0 +1,2 @@
1 +
(this["webpackJsonpvision-web-app"]=this["webpackJsonpvision-web-app"]||[]).push([[0],{77:function(e,a,t){e.exports=t(88)},82:function(e,a,t){},87:function(e,a,t){},88:function(e,a,t){"use strict";t.r(a);var n=t(0),r=t.n(n),l=t(10),o=t.n(l),c=(t(82),t(68)),i=t(129),s=t(134),m=t(130),d=t(131),p=t(47),g=t(132),u=t(120),E=t(70),b=t(128),f=t(118),v=t(119),h=t(53),y=t.n(h),x=t(65),B=t.n(x),C=t(115);const w=Object(C.a)(e=>({paper:{padding:e.spacing(2),display:"flex",flexDirection:"column",alignItems:"center",height:"100%",minHeight:300,transition:"all 0.3s ease"},dragActive:{border:"2px dashed #3f51b5",backgroundColor:"rgba(63, 81, 181, 0.05)"},dragInactive:{border:"2px dashed #ccc",backgroundColor:"white"},uploadBox:{display:"flex",flexDirection:"column",alignItems:"center",justifyContent:"center",height:"100%",width:"100%",cursor:"pointer"},uploadIcon:{fontSize:60,color:"#3f51b5",marginBottom:e.spacing(2)},supportText:{marginTop:e.spacing(2)},previewBox:{display:"flex",flexDirection:"column",alignItems:"center",width:"100%",height:"100%",position:"relative"},imageContainer:{position:"relative",width:"100%",height:"100%",display:"flex",justifyContent:"center",alignItems:"center",overflow:"hidden",marginTop:e.spacing(2)},deleteButton:{position:"absolute",top:0,right:0,backgroundColor:"rgba(255, 255, 255, 0.7)","&:hover":{backgroundColor:"rgba(255, 255, 255, 0.9)"}}}));var N=e=>{let{onImageUpload:a}=e;const t=w(),[l,o]=Object(n.useState)(null),[c,i]=Object(n.useState)(!1),m=Object(n.useRef)(null),d=e=>{e.preventDefault(),e.stopPropagation(),"dragenter"===e.type||"dragover"===e.type?i(!0):"dragleave"===e.type&&i(!1)},g=e=>{e.type.startsWith("image/")?(o(URL.createObjectURL(e)),a(e)):alert("Please upload an image file")};return r.a.createElement(E.a,{className:"".concat(t.paper," ").concat(c?t.dragActive:t.dragInactive),onDragEnter:d,onDragLeave:d,onDragOver:d,onDrop:e=>{e.preventDefault(),e.stopPropagation(),i(!1),e.dataTransfer.files&&e.dataTransfer.files[0]&&g(e.dataTransfer.files[0])}},r.a.createElement("input",{ref:m,type:"file",accept:"image/*",onChange:e=>{e.preventDefault(),e.target.files&&e.target.files[0]&&g(e.target.files[0])},style:{display:"none"}}),l?r.a.createElement(s.a,{className:t.previewBox},r.a.createElement(p.a,{variant:"h6",gutterBottom:!0},"Preview"),r.a.createElement(s.a,{className:t.imageContainer},r.a.createElement("img",{src:l,alt:"Preview",className:"preview-image"}),r.a.createElement(v.a,{"aria-label":"delete",className:t.deleteButton,onClick:()=>{o(null),a(null),m.current.value=""}},r.a.createElement(B.a,null)))):r.a.createElement(s.a,{className:t.uploadBox,onClick:()=>{m.current.click()}},r.a.createElement(y.a,{className:t.uploadIcon}),r.a.createElement(p.a,{variant:"h6",gutterBottom:!0},"Drag & Drop an image here"),r.a.createElement(p.a,{variant:"body2",color:"textSecondary",gutterBottom:!0},"or"),r.a.createElement(f.a,{variant:"contained",color:"primary",component:"span",startIcon:r.a.createElement(y.a,null)},"Browse Files"),r.a.createElement(p.a,{variant:"body2",color:"textSecondary",className:t.supportText},"Supported formats: JPG, PNG, GIF")))},j=t(121),S=t(122),T=t(136),O=t(123),k=t(54),I=t.n(k),D=t(66),P=t.n(D),F=t(67),R=t.n(F);const A=Object(C.a)(e=>({card:{height:"100%",display:"flex",flexDirection:"column"},selectedCard:{border:"2px solid 
#3f51b5"},unavailableCard:{opacity:.6},cardContent:{flexGrow:1},chipContainer:{marginBottom:e.spacing(1.5)},successChip:{backgroundColor:"#34C759",color:"#fff"},errorChip:{backgroundColor:"#FF3B3F",color:"#fff"},modelType:{marginTop:e.spacing(1)},processButton:{marginTop:e.spacing(3),textAlign:"center"}}));var L=e=>{let{onModelSelect:a,onProcess:t,isProcessing:n,modelsStatus:l,selectedModel:o,imageSelected:c}=e;const i=A(),m=[{id:"yolo",name:"YOLOv8",description:"Fast and accurate object detection",icon:r.a.createElement(I.a,null),available:l.yolo},{id:"detr",name:"DETR",description:"DEtection TRansformer for object detection",icon:r.a.createElement(I.a,null),available:l.detr},{id:"vit",name:"ViT",description:"Vision Transformer for image classification",icon:r.a.createElement(P.a,null),available:l.vit}],d=e=>{m.find(a=>a.id===e).available&&a(e)};return r.a.createElement(s.a,{sx:{p:2,height:"100%"}},r.a.createElement(p.a,{variant:"h6",gutterBottom:!0},"Select Model"),r.a.createElement(u.a,{container:!0,spacing:2},m.map(e=>r.a.createElement(u.a,{item:!0,xs:12,sm:4,key:e.id},r.a.createElement(j.a,{className:"\n ".concat(i.card," \n ").concat(o===e.id?i.selectedCard:""," \n ").concat(e.available?"":i.unavailableCard,"\n "),onClick:()=>d(e.id)},r.a.createElement(S.a,{className:i.cardContent},r.a.createElement(s.a,{sx:{mb:2,color:"primary"}},e.icon),r.a.createElement(p.a,{variant:"h5",component:"div",gutterBottom:!0},e.name),r.a.createElement("div",{className:i.chipContainer},e.available?r.a.createElement(T.a,{label:"Available",className:i.successChip,size:"small"}):r.a.createElement(T.a,{label:"Not Available",className:i.errorChip,size:"small"})),r.a.createElement(p.a,{variant:"body2",color:"textSecondary"},e.description)),r.a.createElement(O.a,null,r.a.createElement(f.a,{size:"small",onClick:()=>d(e.id),disabled:!e.available,color:o===e.id?"primary":"default",variant:o===e.id?"contained":"outlined",fullWidth:!0},o===e.id?"Selected":"Select")))))),r.a.createElement("div",{className:i.processButton},r.a.createElement(f.a,{variant:"contained",color:"primary",size:"large",startIcon:r.a.createElement(R.a,null),onClick:t,disabled:!o||!c||n},n?"Processing...":"Process Image")))},M=t(124),z=t(125),W=t(126),G=t(127);const _=Object(C.a)(e=>({paper:{padding:e.spacing(2)},marginBottom:{marginBottom:e.spacing(2)},resultImage:{maxWidth:"100%",maxHeight:"400px",objectFit:"contain"},dividerMargin:{margin:"".concat(e.spacing(2),"px 0")},chipContainer:{display:"flex",gap:e.spacing(1),flexWrap:"wrap"}}));var H=e=>{let{results:a}=e;const t=_();if(!a)return null;const{model:n,data:l}=a;if(l.error)return r.a.createElement(E.a,{sx:{p:2,bgcolor:"#ffebee"}},r.a.createElement(p.a,{color:"error"},l.error));const o=()=>l.performance?r.a.createElement(s.a,{className:"performance-info"},r.a.createElement(M.a,{className:t.dividerMargin}),r.a.createElement(p.a,{variant:"body2"},"Inference time: ",(e=>{if(void 0===e||null===e||isNaN(e))return"-";const a=Number(e);return a<1e3?"".concat(a.toFixed(2)," ms"):"".concat((a/1e3).toFixed(2)," s")})(l.performance.inference_time)," on ",l.performance.device)):null;return"yolo"===n||"detr"===n?r.a.createElement(E.a,{className:t.paper},r.a.createElement(p.a,{variant:"h6",gutterBottom:!0},"yolo"===n?"YOLOv8":"DETR"," Detection Results"),r.a.createElement(u.a,{container:!0,spacing:3},r.a.createElement(u.a,{item:!0,xs:12,md:6},l.image&&r.a.createElement(s.a,{className:t.marginBottom},r.a.createElement(p.a,{variant:"subtitle1",gutterBottom:!0},"Detection 
Result"),r.a.createElement("img",{src:"data:image/png;base64,".concat(l.image),alt:"Detection Result",className:t.resultImage}))),r.a.createElement(u.a,{item:!0,xs:12,md:6},r.a.createElement(s.a,{className:t.marginBottom},r.a.createElement(p.a,{variant:"subtitle1",gutterBottom:!0},"Detected Objects:"),l.detections&&l.detections.length>0?r.a.createElement(z.a,null,l.detections.map((e,a)=>r.a.createElement(r.a.Fragment,{key:a},r.a.createElement(W.a,null,r.a.createElement(G.a,{primary:r.a.createElement(s.a,{style:{display:"flex",alignItems:"center"}},r.a.createElement(p.a,{variant:"body1",component:"span"},e.class),r.a.createElement(T.a,{label:"".concat((100*e.confidence).toFixed(0),"%"),size:"small",color:"primary",style:{marginLeft:8}})),secondary:"Bounding Box: [".concat(e.bbox.join(", "),"]")})),a<l.detections.length-1&&r.a.createElement(M.a,null)))):r.a.createElement(p.a,{variant:"body1"},"No objects detected")))),o()):"vit"===n?r.a.createElement(E.a,{className:t.paper},r.a.createElement(p.a,{variant:"h6",gutterBottom:!0},"ViT Classification Results"),r.a.createElement(p.a,{variant:"subtitle1",gutterBottom:!0},"Top Predictions:"),l.top_predictions&&l.top_predictions.length>0?r.a.createElement(z.a,null,l.top_predictions.map((e,a)=>r.a.createElement(r.a.Fragment,{key:a},r.a.createElement(W.a,null,r.a.createElement(G.a,{primary:r.a.createElement(s.a,{style:{display:"flex",alignItems:"center"}},r.a.createElement(p.a,{variant:"body1",component:"span"},e.rank,". ",e.class),r.a.createElement(T.a,{label:"".concat((100*e.probability).toFixed(1),"%"),size:"small",color:0===a?"primary":"default",style:{marginLeft:8}}))})),a<l.top_predictions.length-1&&r.a.createElement(M.a,null)))):r.a.createElement(p.a,{variant:"body1"},"No classifications available"),o()):null},U=t(133);const V=Object(C.a)(e=>({paper:{padding:e.spacing(2),marginTop:e.spacing(2)},marginBottom:{marginBottom:e.spacing(2)},dividerMargin:{margin:"".concat(e.spacing(2),"px 0")},responseBox:{padding:e.spacing(2),backgroundColor:"#f5f5f5",borderRadius:e.shape.borderRadius,marginTop:e.spacing(2),whiteSpace:"pre-wrap"},buttonProgress:{marginLeft:e.spacing(1)}}));var J=e=>{let{visionResults:a,model:t}=e;const l=V(),[o,c]=Object(n.useState)(""),[i,m]=Object(n.useState)(!1),[d,g]=Object(n.useState)(null),[u,v]=Object(n.useState)(null);return a?r.a.createElement(E.a,{className:l.paper},r.a.createElement(p.a,{variant:"h6",gutterBottom:!0},"Ask AI about the ","vit"===t?"Classification":"Detection"," Results"),r.a.createElement(p.a,{variant:"body2",className:l.marginBottom},"Ask a question about the detected objects or classifications to get an AI-powered analysis."),r.a.createElement(U.a,{fullWidth:!0,label:"Your question about the image",variant:"outlined",value:o,onChange:e=>c(e.target.value),disabled:i,className:l.marginBottom,placeholder:"vit"===t?"E.g., What category does this image belong to?":"E.g., How many people are in this image?"}),r.a.createElement(f.a,{variant:"contained",color:"primary",onClick:async()=>{if(o.trim()){m(!0),v(null);try{const e=await fetch("/api/analyze",{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({visionResults:a,userQuery:o})});if(!e.ok)throw new Error("HTTP error! 
Status: ".concat(e.status));const t=await e.json();t.error?v(t.error):g(t)}catch(e){console.error("Error analyzing with LLM:",e),v("Error analyzing with LLM: ".concat(e.message))}finally{m(!1)}}},disabled:i||!o.trim()},"Analyze with AI",i&&r.a.createElement(b.a,{size:24,className:l.buttonProgress})),u&&r.a.createElement(s.a,{mt:2},r.a.createElement(p.a,{color:"error"},u)),d&&r.a.createElement(r.a.Fragment,null,r.a.createElement(M.a,{className:l.dividerMargin}),r.a.createElement(p.a,{variant:"subtitle1",gutterBottom:!0},"AI Analysis:"),r.a.createElement(s.a,{className:l.responseBox},r.a.createElement(p.a,{variant:"body1"},d.response)),d.performance&&r.a.createElement(s.a,{mt:1},r.a.createElement(p.a,{variant:"body2",color:"textSecondary"},"Analysis time: ",(e=>{if(void 0===e||null===e||isNaN(e))return"-";const a=Number(e);return a<1e3?"".concat(a.toFixed(2)," ms"):"".concat((a/1e3).toFixed(2)," s")})(d.performance.inference_time)," on ",d.performance.device)))):null};t(87);const Y=Object(c.a)({palette:{primary:{main:"#3f51b5"},secondary:{main:"#f50057"}},typography:{fontFamily:"Roboto, Arial, sans-serif"}});var q=function(){const[e,a]=Object(n.useState)(null),[t,l]=Object(n.useState)(""),[o,c]=Object(n.useState)(!1),[f,v]=Object(n.useState)(null),[h,y]=Object(n.useState)(null),[x,B]=Object(n.useState)({yolo:!1,detr:!1,vit:!1});return Object(n.useEffect)(()=>{fetch("/api/status").then(e=>e.json()).then(e=>{B(e.models)}).catch(e=>{console.error("Error checking API status:",e),y("Error connecting to the backend API. Please make sure the server is running.")})},[]),r.a.createElement(i.a,{theme:Y},r.a.createElement(s.a,{style:{flexGrow:1}},r.a.createElement(m.a,{position:"static"},r.a.createElement(d.a,null,r.a.createElement(p.a,{variant:"h6",style:{flexGrow:1}},"Multi-Model Object Detection Demo"))),r.a.createElement(g.a,{maxWidth:"lg",style:{marginTop:Y.spacing(4),marginBottom:Y.spacing(4)}},r.a.createElement(u.a,{container:!0,spacing:3},r.a.createElement(u.a,{item:!0,xs:12},r.a.createElement(E.a,{style:{padding:Y.spacing(2)}},r.a.createElement(p.a,{variant:"h5",gutterBottom:!0},"Upload an image to see how each model performs!"),r.a.createElement(p.a,{variant:"body1",paragraph:!0},"This demo showcases three different object detection and image classification models:"),r.a.createElement(p.a,{variant:"body1",component:"div"},r.a.createElement("ul",null,r.a.createElement("li",null,r.a.createElement("strong",null,"YOLOv8"),": Fast and accurate object detection"),r.a.createElement("li",null,r.a.createElement("strong",null,"DETR"),": DEtection TRansformer for object detection"),r.a.createElement("li",null,r.a.createElement("strong",null,"ViT"),": Vision Transformer for image classification"))))),r.a.createElement(u.a,{item:!0,xs:12,md:6},r.a.createElement(N,{onImageUpload:e=>{a(e),v(null),y(null)}})),r.a.createElement(u.a,{item:!0,xs:12,md:6},r.a.createElement(L,{onModelSelect:e=>{l(e),v(null),y(null)},onProcess:async()=>{if(!e||!t)return void y("Please select both an image and a model");c(!0),y(null);const a=new FormData;a.append("image",e);let n="";switch(t){case"yolo":n="/api/detect/yolo";break;case"detr":n="/api/detect/detr";break;case"vit":n="/api/classify/vit";break;default:return y("Invalid model selection"),void c(!1)}try{const e=await fetch(n,{method:"POST",body:a});if(!e.ok)throw new Error("HTTP error! 
Status: ".concat(e.status));const r=await e.json();v({model:t,data:r})}catch(r){console.error("Error processing image:",r),y("Error processing image: ".concat(r.message))}finally{c(!1)}},isProcessing:o,modelsStatus:x,selectedModel:t,imageSelected:!!e})),h&&r.a.createElement(u.a,{item:!0,xs:12},r.a.createElement(E.a,{style:{padding:Y.spacing(2),backgroundColor:"#ffebee"}},r.a.createElement(p.a,{color:"error"},h))),o&&r.a.createElement(u.a,{item:!0,xs:12,style:{textAlign:"center",margin:"".concat(Y.spacing(4),"px 0")}},r.a.createElement(b.a,null),r.a.createElement(p.a,{variant:"h6",style:{marginTop:Y.spacing(2)}},"Processing image...")),f&&r.a.createElement(r.a.Fragment,null,r.a.createElement(u.a,{item:!0,xs:12},r.a.createElement(H,{results:f})),r.a.createElement(u.a,{item:!0,xs:12},r.a.createElement(J,{visionResults:f.data,model:f.model})))))))};var Q=e=>{e&&e instanceof Function&&t.e(3).then(t.bind(null,137)).then(a=>{let{getCLS:t,getFID:n,getFCP:r,getLCP:l,getTTFB:o}=a;t(e),n(e),r(e),l(e),o(e)})};o.a.render(r.a.createElement(r.a.StrictMode,null,r.a.createElement(q,null)),document.getElementById("root")),Q()}},[[77,1,2]]]);
2 +
//# sourceMappingURL=main.3d1593c5.chunk.js.map
frontend/build/static/js/main.3d1593c5.chunk.js.map
ADDED
@@ -0,0 +1 @@
1 +
{"version":3,"sources":["components/ImageUploader.js","components/ModelSelector.js","components/ResultDisplay.js","components/LlmAnalysis.js","App.js","reportWebVitals.js","index.js"],"names":["useStyles","makeStyles","theme","paper","padding","spacing","display","flexDirection","alignItems","height","minHeight","transition","dragActive","border","backgroundColor","dragInactive","uploadBox","justifyContent","width","cursor","uploadIcon","fontSize","color","marginBottom","supportText","marginTop","previewBox","position","imageContainer","overflow","deleteButton","top","right","ImageUploader","_ref","onImageUpload","classes","previewUrl","setPreviewUrl","useState","setDragActive","fileInputRef","useRef","handleDrag","e","preventDefault","stopPropagation","type","handleFiles","file","startsWith","URL","createObjectURL","alert","React","createElement","Paper","className","concat","onDragEnter","onDragLeave","onDragOver","onDrop","dataTransfer","files","ref","accept","onChange","target","style","Box","Typography","variant","gutterBottom","src","alt","IconButton","aria-label","onClick","handleRemoveImage","current","value","DeleteIcon","onButtonClick","click","CloudUploadIcon","Button","component","startIcon","card","selectedCard","unavailableCard","opacity","cardContent","flexGrow","chipContainer","successChip","errorChip","modelType","processButton","textAlign","ModelSelector","onModelSelect","onProcess","isProcessing","modelsStatus","selectedModel","imageSelected","models","id","name","description","icon","VisibilityIcon","available","yolo","detr","CategoryIcon","vit","handleModelClick","modelId","find","m","sx","p","Grid","container","map","model","item","xs","sm","key","Card","CardContent","mb","Chip","label","size","CardActions","disabled","fullWidth","PlayArrowIcon","resultImage","maxWidth","maxHeight","objectFit","dividerMargin","margin","gap","flexWrap","ResultDisplay","results","data","error","bgcolor","renderPerformanceInfo","performance","Divider","ms","undefined","isNaN","num","Number","toFixed","formatTime","inference_time","device","md","image","detections","length","List","detection","index","Fragment","ListItem","ListItemText","primary","class","confidence","marginLeft","secondary","bbox","join","top_predictions","prediction","rank","probability","responseBox","borderRadius","shape","whiteSpace","buttonProgress","LlmAnalysis","visionResults","userQuery","setUserQuery","isAnalyzing","setIsAnalyzing","analysisResult","setAnalysisResult","setError","TextField","placeholder","async","trim","response","fetch","method","headers","body","JSON","stringify","ok","Error","status","json","err","console","message","CircularProgress","mt","createMuiTheme","palette","main","typography","fontFamily","App","selectedImage","setSelectedImage","setSelectedModel","setIsProcessing","setResults","setModelsStatus","useEffect","then","catch","ThemeProvider","AppBar","Toolbar","Container","paragraph","formData","FormData","append","endpoint","reportWebVitals","onPerfEntry","Function","getCLS","getFID","getFCP","getLCP","getTTFB","ReactDOM","render","StrictMode","document","getElementById"],"mappings":"6YAYA,MAAMA,EAAYC,YAAYC,IAAK,CACjCC,MAAO,CACLC,QAASF,EAAMG,QAAQ,GACvBC,QAAS,OACTC,cAAe,SACfC,WAAY,SACZC,OAAQ,OACRC,UAAW,IACXC,WAAY,iBAEdC,WAAY,CACVC,OAAQ,qBACRC,gBAAiB,2BAEnBC,aAAc,CACZF,OAAQ,kBACRC,gBAAiB,SAEnBE,UAAW,CACTV,QAAS,OACTC,cAAe,SACfC,WAAY,SACZS,eAAgB,SAChBR,OAAQ,OACRS,MAAO,OACPC,OAAQ,WAEVC,WAAY,CACVC,SAAU,GACVC,MAAO,UACPC,aAAcrB,EAAMG,QAAQ,IAE9BmB,YAAa,CACXC,UAAWvB,EAAMG,QAAQ,IAE3BqB,WAAY,CACVpB
,QAAS,OACTC,cAAe,SACfC,WAAY,SACZU,MAAO,OACPT,OAAQ,OACRkB,SAAU,YAEZC,eAAgB,CACdD,SAAU,WACVT,MAAO,OACPT,OAAQ,OACRH,QAAS,OACTW,eAAgB,SAChBT,WAAY,SACZqB,SAAU,SACVJ,UAAWvB,EAAMG,QAAQ,IAE3ByB,aAAc,CACZH,SAAU,WACVI,IAAK,EACLC,MAAO,EACPlB,gBAAiB,2BACjB,UAAW,CACTA,gBAAiB,gCAyHRmB,MApHOC,IAAwB,IAAvB,cAAEC,GAAeD,EACtC,MAAME,EAAUpC,KACTqC,EAAYC,GAAiBC,mBAAS,OACtC3B,EAAY4B,GAAiBD,oBAAS,GACvCE,EAAeC,iBAAO,MAEtBC,EAAcC,IAClBA,EAAEC,iBACFD,EAAEE,kBACa,cAAXF,EAAEG,MAAmC,aAAXH,EAAEG,KAC9BP,GAAc,GACM,cAAXI,EAAEG,MACXP,GAAc,IAoBZQ,EAAeC,IACfA,EAAKF,KAAKG,WAAW,WACvBZ,EAAca,IAAIC,gBAAgBH,IAClCd,EAAcc,IAEdI,MAAM,gCAcV,OACEC,IAAAC,cAACC,IAAK,CACJC,UAAS,GAAAC,OAAKtB,EAAQjC,MAAK,KAAAuD,OAAI9C,EAAawB,EAAQxB,WAAawB,EAAQrB,cACzE4C,YAAahB,EACbiB,YAAajB,EACbkB,WAAYlB,EACZmB,OAzCgBlB,IAClBA,EAAEC,iBACFD,EAAEE,kBACFN,GAAc,GACVI,EAAEmB,aAAaC,OAASpB,EAAEmB,aAAaC,MAAM,IAC/ChB,EAAYJ,EAAEmB,aAAaC,MAAM,MAsCjCV,IAAAC,cAAA,SACEU,IAAKxB,EACLM,KAAK,OACLmB,OAAO,UACPC,SAtCgBvB,IACpBA,EAAEC,iBACED,EAAEwB,OAAOJ,OAASpB,EAAEwB,OAAOJ,MAAM,IACnChB,EAAYJ,EAAEwB,OAAOJ,MAAM,KAoCzBK,MAAO,CAAE/D,QAAS,UAGlB+B,EAyBAiB,IAAAC,cAACe,IAAG,CAACb,UAAWrB,EAAQV,YACtB4B,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKC,cAAY,GAAC,WAGtCnB,IAAAC,cAACe,IAAG,CAACb,UAAWrB,EAAQR,gBACtB0B,IAAAC,cAAA,OACEmB,IAAKrC,EACLsC,IAAI,UACJlB,UAAU,kBAEZH,IAAAC,cAACqB,IAAU,CACTC,aAAW,SACXpB,UAAWrB,EAAQN,aACnBgD,QA5DcC,KACxBzC,EAAc,MACdH,EAAc,MACdM,EAAauC,QAAQC,MAAQ,KA2DnB3B,IAAAC,cAAC2B,IAAU,SAvCjB5B,IAAAC,cAACe,IAAG,CACFb,UAAWrB,EAAQpB,UACnB8D,QA7BcK,KACpB1C,EAAauC,QAAQI,UA8Bf9B,IAAAC,cAAC8B,IAAe,CAAC5B,UAAWrB,EAAQhB,aACpCkC,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKC,cAAY,GAAC,6BAGtCnB,IAAAC,cAACgB,IAAU,CAACC,QAAQ,QAAQlD,MAAM,gBAAgBmD,cAAY,GAAC,MAG/DnB,IAAAC,cAAC+B,IAAM,CACLd,QAAQ,YACRlD,MAAM,UACNiE,UAAU,OACVC,UAAWlC,IAAAC,cAAC8B,IAAe,OAC5B,gBAGD/B,IAAAC,cAACgB,IAAU,CAACC,QAAQ,QAAQlD,MAAM,gBAAgBmC,UAAWrB,EAAQZ,aAAa,uC,uFCnJ5F,MAAMxB,EAAYC,YAAYC,IAAK,CACjCuF,KAAM,CACJhF,OAAQ,OACRH,QAAS,OACTC,cAAe,UAEjBmF,aAAc,CACZ7E,OAAQ,qBAEV8E,gBAAiB,CACfC,QAAS,IAEXC,YAAa,CACXC,SAAU,GAEZC,cAAe,CACbxE,aAAcrB,EAAMG,QAAQ,MAE9B2F,YAAa,CACXlF,gBAAiB,UACjBQ,MAAO,QAET2E,UAAW,CACTnF,gBAAiB,UACjBQ,MAAO,QAET4E,UAAW,CACTzE,UAAWvB,EAAMG,QAAQ,IAE3B8F,cAAe,CACb1E,UAAWvB,EAAMG,QAAQ,GACzB+F,UAAW,aAwHAC,MApHOnE,IAOf,IAPgB,cACrBoE,EAAa,UACbC,EAAS,aACTC,EAAY,aACZC,EAAY,cACZC,EAAa,cACbC,GACDzE,EACC,MAAME,EAAUpC,IAEV4G,EAAS,CACb,CACEC,GAAI,OACJC,KAAM,SACNC,YAAa,qCACbC,KAAM1D,IAAAC,cAAC0D,IAAc,MACrBC,UAAWT,EAAaU,MAE1B,CACEN,GAAI,OACJC,KAAM,OACNC,YAAa,6CACbC,KAAM1D,IAAAC,cAAC0D,IAAc,MACrBC,UAAWT,EAAaW,MAE1B,CACEP,GAAI,MACJC,KAAM,MACNC,YAAa,8CACbC,KAAM1D,IAAAC,cAAC8D,IAAY,MACnBH,UAAWT,EAAaa,MAItBC,EAAoBC,IACpBZ,EAAOa,KAAKC,GAAKA,EAAEb,KAAOW,GAASN,WACrCZ,EAAckB,IAIlB,OACElE,IAAAC,cAACe,IAAG,CAACqD,GAAI,CAAEC,EAAG,EAAGnH,OAAQ,SACvB6C,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKC,cAAY,GAAC,gBAItCnB,IAAAC,cAACsE,IAAI,CAACC,WAAS,EAACzH,QAAS,GACtBuG,EAAOmB,IAAKC,GACX1E,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,GAAIC,GAAI,EAAGC,IAAKJ,EAAMnB,IACnCvD,IAAAC,cAAC8E,IAAI,CACH5E,UAAS,qBAAAC,OACLtB,EAAQqD,KAAI,uBAAA/B,OACZgD,IAAkBsB,EAAMnB,GAAKzE,EAAQsD,aAAe,GAAE,uBAAAhC,OACrDsE,EAAMd,UAAsC,GAA1B9E,EAAQuD,gBAAoB,oBAEnDb,QAASA,IAAMyC,EAAiBS,EAAMnB,KAEtCvD,IAAAC,cAAC+E,IAAW,CAAC7E,UAAWrB,EAAQyD,aAC9BvC,IAAAC,cAACe,IAAG,CAACqD,GAAI,CAAEY,GAAI,EAAGjH,MAAO,YACtB0G,EAAMhB,MAET1D,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKe,UAAU,MAAMd,cAAY,GAClDuD,EAAMlB,MAETxD,IAAAC,cAAA,OAAKE,UAAWrB,EAAQ2D,eACrBiC,EAAMd,UACL5D,IAAAC,cAACiF,IAAI,CACHC,MAAM,YACNhF,UAAWrB,EAAQ4D,YACnB0C,KAAK,UAGPpF,IAAAC,cAACiF,IAAI,CACHC,MAAM,gBACNhF,UAAWrB,EAAQ6D,UACnByC,KAAK,WAIXpF,IAAAC,cAACgB,IAAU,CAACC,QAAQ,
QAAQlD,MAAM,iBAC/B0G,EAAMjB,cAGXzD,IAAAC,cAACoF,IAAW,KACVrF,IAAAC,cAAC+B,IAAM,CACLoD,KAAK,QACL5D,QAASA,IAAMyC,EAAiBS,EAAMnB,IACtC+B,UAAWZ,EAAMd,UACjB5F,MAAOoF,IAAkBsB,EAAMnB,GAAK,UAAY,UAChDrC,QAASkC,IAAkBsB,EAAMnB,GAAK,YAAc,WACpDgC,WAAS,GAERnC,IAAkBsB,EAAMnB,GAAK,WAAa,eAQvDvD,IAAAC,cAAA,OAAKE,UAAWrB,EAAQ+D,eACtB7C,IAAAC,cAAC+B,IAAM,CACLd,QAAQ,YACRlD,MAAM,UACNoH,KAAK,QACLlD,UAAWlC,IAAAC,cAACuF,IAAa,MACzBhE,QAASyB,EACTqC,UAAWlC,IAAkBC,GAAiBH,GAE7CA,EAAe,gBAAkB,oB,oCClJ5C,MAAMxG,EAAYC,YAAYC,IAAK,CACjCC,MAAO,CACLC,QAASF,EAAMG,QAAQ,IAEzBkB,aAAc,CACZA,aAAcrB,EAAMG,QAAQ,IAE9B0I,YAAa,CACXC,SAAU,OACVC,UAAW,QACXC,UAAW,WAEbC,cAAe,CACbC,OAAO,GAAD1F,OAAKxD,EAAMG,QAAQ,GAAE,SAE7B0F,cAAe,CACbzF,QAAS,OACT+I,IAAKnJ,EAAMG,QAAQ,GACnBiJ,SAAU,WA8JCC,MA1JOrH,IAAkB,IAAjB,QAAEsH,GAAStH,EAChC,MAAME,EAAUpC,IAChB,IAAKwJ,EAAS,OAAO,KAErB,MAAM,MAAExB,EAAK,KAAEyB,GAASD,EAWxB,GAAIC,EAAKC,MACP,OACEpG,IAAAC,cAACC,IAAK,CAACmE,GAAI,CAAEC,EAAG,EAAG+B,QAAS,YAC1BrG,IAAAC,cAACgB,IAAU,CAACjD,MAAM,SAASmI,EAAKC,QAMtC,MAAME,EAAwBA,IACvBH,EAAKI,YAGRvG,IAAAC,cAACe,IAAG,CAACb,UAAU,oBACbH,IAAAC,cAACuG,IAAO,CAACrG,UAAWrB,EAAQ+G,gBAC5B7F,IAAAC,cAACgB,IAAU,CAACC,QAAQ,SAAQ,mBAvBduF,KAClB,QAAWC,IAAPD,GAA2B,OAAPA,GAAeE,MAAMF,GAAK,MAAO,IACzD,MAAMG,EAAMC,OAAOJ,GACnB,OAAIG,EAAM,IAAY,GAANxG,OAAUwG,EAAIE,QAAQ,GAAE,OAClC,GAAN1G,QAAWwG,EAAM,KAAME,QAAQ,GAAE,OAoBVC,CAAWZ,EAAKI,YAAYS,gBAAgB,OAAKb,EAAKI,YAAYU,SAN3D,KAahC,MAAc,SAAVvC,GAA8B,SAAVA,EAEpB1E,IAAAC,cAACC,IAAK,CAACC,UAAWrB,EAAQjC,OACxBmD,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKC,cAAY,GACxB,SAAVuD,EAAmB,SAAW,OAAO,sBAGxC1E,IAAAC,cAACsE,IAAI,CAACC,WAAS,EAACzH,QAAS,GACvBiD,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,GAAIsC,GAAI,GACpBf,EAAKgB,OACJnH,IAAAC,cAACe,IAAG,CAACb,UAAWrB,EAAQb,cACtB+B,IAAAC,cAACgB,IAAU,CAACC,QAAQ,YAAYC,cAAY,GAAC,oBAG7CnB,IAAAC,cAAA,OACEmB,IAAG,yBAAAhB,OAA2B+F,EAAKgB,OACnC9F,IAAI,mBACJlB,UAAWrB,EAAQ2G,gBAM3BzF,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,GAAIsC,GAAI,GACrBlH,IAAAC,cAACe,IAAG,CAACb,UAAWrB,EAAQb,cACtB+B,IAAAC,cAACgB,IAAU,CAACC,QAAQ,YAAYC,cAAY,GAAC,qBAI5CgF,EAAKiB,YAAcjB,EAAKiB,WAAWC,OAAS,EAC3CrH,IAAAC,cAACqH,IAAI,KACFnB,EAAKiB,WAAW3C,IAAI,CAAC8C,EAAWC,IAC/BxH,IAAAC,cAACD,IAAMyH,SAAQ,CAAC3C,IAAK0C,GACnBxH,IAAAC,cAACyH,IAAQ,KACP1H,IAAAC,cAAC0H,IAAY,CACXC,QACE5H,IAAAC,cAACe,IAAG,CAACD,MAAO,CAAE/D,QAAS,OAAQE,WAAY,WACzC8C,IAAAC,cAACgB,IAAU,CAACC,QAAQ,QAAQe,UAAU,QACnCsF,EAAUM,OAEb7H,IAAAC,cAACiF,IAAI,CACHC,MAAK,GAAA/E,QAA6B,IAAvBmH,EAAUO,YAAkBhB,QAAQ,GAAE,KACjD1B,KAAK,QACLpH,MAAM,UACN+C,MAAO,CAAEgH,WAAY,MAI3BC,UAAS,kBAAA5H,OAAoBmH,EAAUU,KAAKC,KAAK,MAAK,QAGzDV,EAAQrB,EAAKiB,WAAWC,OAAS,GAAKrH,IAAAC,cAACuG,IAAO,SAKrDxG,IAAAC,cAACgB,IAAU,CAACC,QAAQ,SAAQ,0BAMnCoF,KAMO,QAAV5B,EAEA1E,IAAAC,cAACC,IAAK,CAACC,UAAWrB,EAAQjC,OACxBmD,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKC,cAAY,GAAC,8BAItCnB,IAAAC,cAACgB,IAAU,CAACC,QAAQ,YAAYC,cAAY,GAAC,oBAI5CgF,EAAKgC,iBAAmBhC,EAAKgC,gBAAgBd,OAAS,EACrDrH,IAAAC,cAACqH,IAAI,KACFnB,EAAKgC,gBAAgB1D,IAAI,CAAC2D,EAAYZ,IACrCxH,IAAAC,cAACD,IAAMyH,SAAQ,CAAC3C,IAAK0C,GACnBxH,IAAAC,cAACyH,IAAQ,KACP1H,IAAAC,cAAC0H,IAAY,CACXC,QACE5H,IAAAC,cAACe,IAAG,CAACD,MAAO,CAAE/D,QAAS,OAAQE,WAAY,WACzC8C,IAAAC,cAACgB,IAAU,CAACC,QAAQ,QAAQe,UAAU,QACnCmG,EAAWC,KAAK,KAAGD,EAAWP,OAEjC7H,IAAAC,cAACiF,IAAI,CACHC,MAAK,GAAA/E,QAA+B,IAAzBgI,EAAWE,aAAmBxB,QAAQ,GAAE,KACnD1B,KAAK,QACLpH,MAAiB,IAAVwJ,EAAc,UAAY,UACjCzG,MAAO,CAAEgH,WAAY,SAM9BP,EAAQrB,EAAKgC,gBAAgBd,OAAS,GAAKrH,IAAAC,cAACuG,IAAO,SAK1DxG,IAAAC,cAACgB,IAAU,CAACC,QAAQ,SAAQ,gCAG7BoF,KAKA,M,SC/KT,MAAM5J,EAAYC,YAAYC,IAAK,CACjCC,MAAO,CACLC,QAASF,EAAMG,QAAQ,GACvBoB,UAAWvB,EAAMG,QAAQ,IAE3BkB,aAAc,CACZA,aAAcrB,EAAMG,QAAQ,IAE9B8I,cAAe,CACbC,OAAO,GAAD1F,OAAK
xD,EAAMG,QAAQ,GAAE,SAE7BwL,YAAa,CACXzL,QAASF,EAAMG,QAAQ,GACvBS,gBAAiB,UACjBgL,aAAc5L,EAAM6L,MAAMD,aAC1BrK,UAAWvB,EAAMG,QAAQ,GACzB2L,WAAY,YAEdC,eAAgB,CACdZ,WAAYnL,EAAMG,QAAQ,OA4Hf6L,MAxHKhK,IAA+B,IAA9B,cAAEiK,EAAa,MAAEnE,GAAO9F,EAC3C,MAAME,EAAUpC,KACToM,EAAWC,GAAgB9J,mBAAS,KACpC+J,EAAaC,GAAkBhK,oBAAS,IACxCiK,EAAgBC,GAAqBlK,mBAAS,OAC9CmH,EAAOgD,GAAYnK,mBAAS,MA+CnC,OAAK4J,EAGH7I,IAAAC,cAACC,IAAK,CAACC,UAAWrB,EAAQjC,OACxBmD,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKC,cAAY,GAAC,oBACR,QAAVuD,EAAkB,iBAAmB,YAAY,YAGrE1E,IAAAC,cAACgB,IAAU,CAACC,QAAQ,QAAQf,UAAWrB,EAAQb,cAAc,+FAI7D+B,IAAAC,cAACoJ,IAAS,CACR9D,WAAS,EACTJ,MAAM,gCACNjE,QAAQ,WACRS,MAAOmH,EACPjI,SAAWvB,GAAMyJ,EAAazJ,EAAEwB,OAAOa,OACvC2D,SAAU0D,EACV7I,UAAWrB,EAAQb,aACnBqL,YAAuB,QAAV5E,EACT,iDACA,6CAGN1E,IAAAC,cAAC+B,IAAM,CACLd,QAAQ,YACRlD,MAAM,UACNwD,QAjEgB+H,UACpB,GAAKT,EAAUU,OAAf,CAEAP,GAAe,GACfG,EAAS,MAET,IACE,MAAMK,QAAiBC,MAAM,eAAgB,CAC3CC,OAAQ,OACRC,QAAS,CACP,eAAgB,oBAElBC,KAAMC,KAAKC,UAAU,CACnBlB,cAAeA,EACfC,UAAWA,MAIf,IAAKW,EAASO,GACZ,MAAM,IAAIC,MAAM,uBAAD7J,OAAwBqJ,EAASS,SAGlD,MAAM/D,QAAasD,EAASU,OAExBhE,EAAKC,MACPgD,EAASjD,EAAKC,OAEd+C,EAAkBhD,GAEpB,MAAOiE,GACPC,QAAQjE,MAAM,4BAA6BgE,GAC3ChB,EAAS,6BAADhJ,OAA8BgK,EAAIE,UAC3C,QACCrB,GAAe,MAiCb3D,SAAU0D,IAAgBF,EAAUU,QACrC,kBAEER,GAAehJ,IAAAC,cAACsK,IAAgB,CAACnF,KAAM,GAAIjF,UAAWrB,EAAQ6J,kBAGhEvC,GACCpG,IAAAC,cAACe,IAAG,CAACwJ,GAAI,GACPxK,IAAAC,cAACgB,IAAU,CAACjD,MAAM,SAASoI,IAI9B8C,GACClJ,IAAAC,cAAAD,IAAAyH,SAAA,KACEzH,IAAAC,cAACuG,IAAO,CAACrG,UAAWrB,EAAQ+G,gBAE5B7F,IAAAC,cAACgB,IAAU,CAACC,QAAQ,YAAYC,cAAY,GAAC,gBAI7CnB,IAAAC,cAACe,IAAG,CAACb,UAAWrB,EAAQyJ,aACtBvI,IAAAC,cAACgB,IAAU,CAACC,QAAQ,SACjBgI,EAAeO,WAInBP,EAAe3C,aACdvG,IAAAC,cAACe,IAAG,CAACwJ,GAAI,GACPxK,IAAAC,cAACgB,IAAU,CAACC,QAAQ,QAAQlD,MAAM,iBAAgB,kBArG1CyI,KAClB,QAAWC,IAAPD,GAA2B,OAAPA,GAAeE,MAAMF,GAAK,MAAO,IACzD,MAAMG,EAAMC,OAAOJ,GACnB,OAAIG,EAAM,IAAY,GAANxG,OAAUwG,EAAIE,QAAQ,GAAE,OAClC,GAAN1G,QAAWwG,EAAM,KAAME,QAAQ,GAAE,OAkGLC,CAAWmC,EAAe3C,YAAYS,gBAAgB,OAAKkC,EAAe3C,YAAYU,WA1DzF,M,MCnE7B,MAAMrK,EAAQ6N,YAAe,CAC3BC,QAAS,CACP9C,QAAS,CACP+C,KAAM,WAER3C,UAAW,CACT2C,KAAM,YAGVC,WAAY,CACVC,WAAY,+BA0KDC,MAtKf,WACE,MAAOC,EAAeC,GAAoB/L,mBAAS,OAC5CmE,EAAe6H,GAAoBhM,mBAAS,KAC5CiE,EAAcgI,GAAmBjM,oBAAS,IAC1CiH,EAASiF,GAAclM,mBAAS,OAChCmH,EAAOgD,GAAYnK,mBAAS,OAC5BkE,EAAciI,GAAmBnM,mBAAS,CAC/C4E,MAAM,EACNC,MAAM,EACNE,KAAK,IA8EP,OA1EAqH,oBAAU,KACR3B,MAAM,eACH4B,KAAK7B,GAAYA,EAASU,QAC1BmB,KAAKnF,IACJiF,EAAgBjF,EAAK7C,UAEtBiI,MAAMnB,IACLC,QAAQjE,MAAM,6BAA8BgE,GAC5ChB,EAAS,mFAEZ,IAiEDpJ,IAAAC,cAACuL,IAAa,CAAC5O,MAAOA,GACpBoD,IAAAC,cAACe,IAAG,CAACD,MAAO,CAAEyB,SAAU,IACtBxC,IAAAC,cAACwL,IAAM,CAACpN,SAAS,UACf2B,IAAAC,cAACyL,IAAO,KACN1L,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKH,MAAO,CAAEyB,SAAU,IAAK,uCAKrDxC,IAAAC,cAAC0L,IAAS,CAACjG,SAAS,KAAK3E,MAAO,CAAE5C,UAAWvB,EAAMG,QAAQ,GAAIkB,aAAcrB,EAAMG,QAAQ,KACzFiD,IAAAC,cAACsE,IAAI,CAACC,WAAS,EAACzH,QAAS,GACvBiD,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,IACb5E,IAAAC,cAACC,IAAK,CAACa,MAAO,CAAEjE,QAASF,EAAMG,QAAQ,KACrCiD,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKC,cAAY,GAAC,mDAGtCnB,IAAAC,cAACgB,IAAU,CAACC,QAAQ,QAAQ0K,WAAS,GAAC,yFAGtC5L,IAAAC,cAACgB,IAAU,CAACC,QAAQ,QAAQe,UAAU,OACpCjC,IAAAC,cAAA,UACED,IAAAC,cAAA,UAAID,IAAAC,cAAA,cAAQ,UAAe,wCAC3BD,IAAAC,cAAA,UAAID,IAAAC,cAAA,cAAQ,QAAa,gDACzBD,IAAAC,cAAA,UAAID,IAAAC,cAAA,cAAQ,OAAY,qDAMhCD,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,GAAIsC,GAAI,GACrBlH,IAAAC,cAACtB,EAAa,CAACE,cA7FAsI,IACzB6D,EAAiB7D,GACjBgE,EAAW,MACX/B,EAAS,UA6FDpJ,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,GAAIsC,GAAI,GACrBlH,IAAAC,cAAC8C,EAAa,CACZC,cA5Fa0B,IACzBuG,EAAiBvG,GACjByG,EAAW,MACX/B,EAAS,OA0FGnG,UAvFOsG,UACnB,IAAKwB,
IAAkB3H,EAErB,YADAgG,EAAS,2CAIX8B,GAAgB,GAChB9B,EAAS,MAGT,MAAMyC,EAAW,IAAIC,SACrBD,EAASE,OAAO,QAAShB,GAEzB,IAAIiB,EAAW,GACf,OAAQ5I,GACN,IAAK,OACH4I,EAAW,mBACX,MACF,IAAK,OACHA,EAAW,mBACX,MACF,IAAK,MACHA,EAAW,oBACX,MACF,QAGE,OAFA5C,EAAS,gCACT8B,GAAgB,GAIpB,IACE,MAAMzB,QAAiBC,MAAMsC,EAAU,CACrCrC,OAAQ,OACRE,KAAMgC,IAGR,IAAKpC,EAASO,GACZ,MAAM,IAAIC,MAAM,uBAAD7J,OAAwBqJ,EAASS,SAGlD,MAAM/D,QAAasD,EAASU,OAC5BgB,EAAW,CAAEzG,MAAOtB,EAAe+C,SACnC,MAAOiE,GACPC,QAAQjE,MAAM,0BAA2BgE,GACzChB,EAAS,2BAADhJ,OAA4BgK,EAAIE,UACzC,QACCY,GAAgB,KA0CNhI,aAAcA,EACdC,aAAcA,EACdC,cAAeA,EACfC,gBAAiB0H,KAIpB3E,GACCpG,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,IACb5E,IAAAC,cAACC,IAAK,CAACa,MAAO,CAAEjE,QAASF,EAAMG,QAAQ,GAAIS,gBAAiB,YAC1DwC,IAAAC,cAACgB,IAAU,CAACjD,MAAM,SAASoI,KAKhClD,GACClD,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,GAAI7D,MAAO,CAAE+B,UAAW,SAAUgD,OAAO,GAAD1F,OAAKxD,EAAMG,QAAQ,GAAE,UAC1EiD,IAAAC,cAACsK,IAAgB,MACjBvK,IAAAC,cAACgB,IAAU,CAACC,QAAQ,KAAKH,MAAO,CAAE5C,UAAWvB,EAAMG,QAAQ,KAAM,wBAMpEmJ,GACClG,IAAAC,cAAAD,IAAAyH,SAAA,KACEzH,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,IACb5E,IAAAC,cAACgG,EAAa,CAACC,QAASA,KAE1BlG,IAAAC,cAACsE,IAAI,CAACI,MAAI,EAACC,GAAI,IACb5E,IAAAC,cAAC2I,EAAW,CAACC,cAAe3C,EAAQC,KAAMzB,MAAOwB,EAAQxB,eCjL5DuH,MAZUC,IACnBA,GAAeA,aAAuBC,UACxC,8BAAqBb,KAAK1M,IAAkD,IAAjD,OAAEwN,EAAM,OAAEC,EAAM,OAAEC,EAAM,OAAEC,EAAM,QAAEC,GAAS5N,EACpEwN,EAAOF,GACPG,EAAOH,GACPI,EAAOJ,GACPK,EAAOL,GACPM,EAAQN,MCDdO,IAASC,OACP1M,IAAAC,cAACD,IAAM2M,WAAU,KACf3M,IAAAC,cAAC6K,EAAG,OAEN8B,SAASC,eAAe,SAM1BZ,M","file":"static/js/main.3d1593c5.chunk.js","sourcesContent":["import React, { useState, useRef } from 'react';\nimport { \n Paper, \n Typography, \n Box, \n Button, \n IconButton \n} from '@material-ui/core';\nimport CloudUploadIcon from '@material-ui/icons/CloudUpload';\nimport DeleteIcon from '@material-ui/icons/Delete';\nimport { makeStyles } from '@material-ui/core/styles';\n\nconst useStyles = makeStyles((theme) => ({\n paper: {\n padding: theme.spacing(2),\n display: 'flex',\n flexDirection: 'column',\n alignItems: 'center',\n height: '100%',\n minHeight: 300,\n transition: 'all 0.3s ease'\n },\n dragActive: {\n border: '2px dashed #3f51b5',\n backgroundColor: 'rgba(63, 81, 181, 0.05)'\n },\n dragInactive: {\n border: '2px dashed #ccc',\n backgroundColor: 'white'\n },\n uploadBox: {\n display: 'flex',\n flexDirection: 'column',\n alignItems: 'center',\n justifyContent: 'center',\n height: '100%',\n width: '100%',\n cursor: 'pointer'\n },\n uploadIcon: {\n fontSize: 60,\n color: '#3f51b5',\n marginBottom: theme.spacing(2)\n },\n supportText: {\n marginTop: theme.spacing(2)\n },\n previewBox: {\n display: 'flex',\n flexDirection: 'column',\n alignItems: 'center',\n width: '100%',\n height: '100%',\n position: 'relative'\n },\n imageContainer: {\n position: 'relative',\n width: '100%',\n height: '100%',\n display: 'flex',\n justifyContent: 'center',\n alignItems: 'center',\n overflow: 'hidden',\n marginTop: theme.spacing(2)\n },\n deleteButton: {\n position: 'absolute',\n top: 0,\n right: 0,\n backgroundColor: 'rgba(255, 255, 255, 0.7)',\n '&:hover': {\n backgroundColor: 'rgba(255, 255, 255, 0.9)',\n }\n }\n}));\n\nconst ImageUploader = ({ onImageUpload }) => {\n const classes = useStyles();\n const [previewUrl, setPreviewUrl] = useState(null);\n const [dragActive, setDragActive] = useState(false);\n const fileInputRef = useRef(null);\n\n const handleDrag = (e) => {\n e.preventDefault();\n e.stopPropagation();\n if (e.type === 'dragenter' || e.type === 'dragover') {\n setDragActive(true);\n } else if (e.type === 
'dragleave') {\n setDragActive(false);\n }\n };\n\n const handleDrop = (e) => {\n e.preventDefault();\n e.stopPropagation();\n setDragActive(false);\n if (e.dataTransfer.files && e.dataTransfer.files[0]) {\n handleFiles(e.dataTransfer.files[0]);\n }\n };\n\n const handleChange = (e) => {\n e.preventDefault();\n if (e.target.files && e.target.files[0]) {\n handleFiles(e.target.files[0]);\n }\n };\n\n const handleFiles = (file) => {\n if (file.type.startsWith('image/')) {\n setPreviewUrl(URL.createObjectURL(file));\n onImageUpload(file);\n } else {\n alert('Please upload an image file');\n }\n };\n\n const onButtonClick = () => {\n fileInputRef.current.click();\n };\n\n const handleRemoveImage = () => {\n setPreviewUrl(null);\n onImageUpload(null);\n fileInputRef.current.value = \"\";\n };\n\n return (\n <Paper \n className={`${classes.paper} ${dragActive ? classes.dragActive : classes.dragInactive}`}\n onDragEnter={handleDrag}\n onDragLeave={handleDrag}\n onDragOver={handleDrag}\n onDrop={handleDrop}\n >\n <input\n ref={fileInputRef}\n type=\"file\"\n accept=\"image/*\"\n onChange={handleChange}\n style={{ display: 'none' }}\n />\n\n {!previewUrl ? (\n <Box \n className={classes.uploadBox}\n onClick={onButtonClick}\n >\n <CloudUploadIcon className={classes.uploadIcon} />\n <Typography variant=\"h6\" gutterBottom>\n Drag & Drop an image here\n </Typography>\n <Typography variant=\"body2\" color=\"textSecondary\" gutterBottom>\n or\n </Typography>\n <Button\n variant=\"contained\"\n color=\"primary\"\n component=\"span\"\n startIcon={<CloudUploadIcon />}\n >\n Browse Files\n </Button>\n <Typography variant=\"body2\" color=\"textSecondary\" className={classes.supportText}>\n Supported formats: JPG, PNG, GIF\n </Typography>\n </Box>\n ) : (\n <Box className={classes.previewBox}>\n <Typography variant=\"h6\" gutterBottom>\n Preview\n </Typography>\n <Box className={classes.imageContainer}>\n <img\n src={previewUrl}\n alt=\"Preview\"\n className=\"preview-image\"\n />\n <IconButton\n aria-label=\"delete\"\n className={classes.deleteButton}\n onClick={handleRemoveImage}\n >\n <DeleteIcon />\n </IconButton>\n </Box>\n </Box>\n )}\n </Paper>\n );\n};\n\nexport default ImageUploader;\n","import React from 'react';\nimport { \n Grid, \n Card, \n CardContent, \n CardActions, \n Typography, \n Button, \n Chip,\n Box\n} from '@material-ui/core';\nimport VisibilityIcon from '@material-ui/icons/Visibility';\nimport CategoryIcon from '@material-ui/icons/Category';\nimport PlayArrowIcon from '@material-ui/icons/PlayArrow';\nimport { makeStyles } from '@material-ui/core/styles';\n\nconst useStyles = makeStyles((theme) => ({\n card: {\n height: '100%',\n display: 'flex',\n flexDirection: 'column',\n },\n selectedCard: {\n border: '2px solid #3f51b5',\n },\n unavailableCard: {\n opacity: 0.6,\n },\n cardContent: {\n flexGrow: 1,\n },\n chipContainer: {\n marginBottom: theme.spacing(1.5),\n },\n successChip: {\n backgroundColor: '#34C759',\n color: '#fff',\n },\n errorChip: {\n backgroundColor: '#FF3B3F',\n color: '#fff',\n },\n modelType: {\n marginTop: theme.spacing(1),\n },\n processButton: {\n marginTop: theme.spacing(3),\n textAlign: 'center',\n }\n}));\n\nconst ModelSelector = ({ \n onModelSelect, \n onProcess, \n isProcessing, \n modelsStatus, \n selectedModel,\n imageSelected \n}) => {\n const classes = useStyles();\n \n const models = [\n {\n id: 'yolo',\n name: 'YOLOv8',\n description: 'Fast and accurate object detection',\n icon: <VisibilityIcon />,\n available: modelsStatus.yolo\n },\n {\n id: 
'detr',\n name: 'DETR',\n description: 'DEtection TRansformer for object detection',\n icon: <VisibilityIcon />,\n available: modelsStatus.detr\n },\n {\n id: 'vit',\n name: 'ViT',\n description: 'Vision Transformer for image classification',\n icon: <CategoryIcon />,\n available: modelsStatus.vit\n }\n ];\n\n const handleModelClick = (modelId) => {\n if (models.find(m => m.id === modelId).available) {\n onModelSelect(modelId);\n }\n };\n\n return (\n <Box sx={{ p: 2, height: '100%' }}>\n <Typography variant=\"h6\" gutterBottom>\n Select Model\n </Typography>\n \n <Grid container spacing={2}>\n {models.map((model) => (\n <Grid item xs={12} sm={4} key={model.id}>\n <Card \n className={`\n ${classes.card} \n ${selectedModel === model.id ? classes.selectedCard : ''} \n ${!model.available ? classes.unavailableCard : ''}\n `}\n onClick={() => handleModelClick(model.id)}\n >\n <CardContent className={classes.cardContent}>\n <Box sx={{ mb: 2, color: 'primary' }}>\n {model.icon}\n </Box>\n <Typography variant=\"h5\" component=\"div\" gutterBottom>\n {model.name}\n </Typography>\n <div className={classes.chipContainer}>\n {model.available ? (\n <Chip \n label=\"Available\" \n className={classes.successChip}\n size=\"small\" \n />\n ) : (\n <Chip \n label=\"Not Available\" \n className={classes.errorChip}\n size=\"small\" \n />\n )}\n </div>\n <Typography variant=\"body2\" color=\"textSecondary\">\n {model.description}\n </Typography>\n </CardContent>\n <CardActions>\n <Button \n size=\"small\" \n onClick={() => handleModelClick(model.id)}\n disabled={!model.available}\n color={selectedModel === model.id ? \"primary\" : \"default\"}\n variant={selectedModel === model.id ? \"contained\" : \"outlined\"}\n fullWidth\n >\n {selectedModel === model.id ? 'Selected' : 'Select'}\n </Button>\n </CardActions>\n </Card>\n </Grid>\n ))}\n </Grid>\n\n <div className={classes.processButton}>\n <Button\n variant=\"contained\"\n color=\"primary\"\n size=\"large\"\n startIcon={<PlayArrowIcon />}\n onClick={onProcess}\n disabled={!selectedModel || !imageSelected || isProcessing}\n >\n {isProcessing ? 'Processing...' 
: 'Process Image'}\n </Button>\n </div>\n </Box>\n );\n};\n\nexport default ModelSelector;\n","import React from 'react';\nimport { \n Paper, \n Typography, \n Box, \n List, \n ListItem, \n ListItemText, \n Divider,\n Grid,\n Chip\n} from '@material-ui/core';\nimport { makeStyles } from '@material-ui/core/styles';\n\nconst useStyles = makeStyles((theme) => ({\n paper: {\n padding: theme.spacing(2)\n },\n marginBottom: {\n marginBottom: theme.spacing(2)\n },\n resultImage: {\n maxWidth: '100%',\n maxHeight: '400px',\n objectFit: 'contain'\n },\n dividerMargin: {\n margin: `${theme.spacing(2)}px 0`\n },\n chipContainer: {\n display: 'flex',\n gap: theme.spacing(1),\n flexWrap: 'wrap'\n }\n}));\n\nconst ResultDisplay = ({ results }) => {\n const classes = useStyles();\n if (!results) return null;\n \n const { model, data } = results;\n \n // Helper to format times nicely\n const formatTime = (ms) => {\n if (ms === undefined || ms === null || isNaN(ms)) return '-';\n const num = Number(ms);\n if (num < 1000) return `${num.toFixed(2)} ms`;\n return `${(num / 1000).toFixed(2)} s`;\n };\n \n // Check if there's an error\n if (data.error) {\n return (\n <Paper sx={{ p: 2, bgcolor: '#ffebee' }}>\n <Typography color=\"error\">{data.error}</Typography>\n </Paper>\n );\n }\n\n // Display performance info\n const renderPerformanceInfo = () => {\n if (!data.performance) return null;\n \n return (\n <Box className=\"performance-info\">\n <Divider className={classes.dividerMargin} />\n <Typography variant=\"body2\">\n Inference time: {formatTime(data.performance.inference_time)} on {data.performance.device}\n </Typography>\n </Box>\n );\n };\n\n // Render for YOLO and DETR (object detection)\n if (model === 'yolo' || model === 'detr') {\n return (\n <Paper className={classes.paper}>\n <Typography variant=\"h6\" gutterBottom>\n {model === 'yolo' ? 'YOLOv8' : 'DETR'} Detection Results\n </Typography>\n \n <Grid container spacing={3}>\n <Grid item xs={12} md={6}>\n {data.image && (\n <Box className={classes.marginBottom}>\n <Typography variant=\"subtitle1\" gutterBottom>\n Detection Result\n </Typography>\n <img \n src={`data:image/png;base64,${data.image}`} \n alt=\"Detection Result\" \n className={classes.resultImage}\n />\n </Box>\n )}\n </Grid>\n \n <Grid item xs={12} md={6}>\n <Box className={classes.marginBottom}>\n <Typography variant=\"subtitle1\" gutterBottom>\n Detected Objects:\n </Typography>\n \n {data.detections && data.detections.length > 0 ? 
(\n <List>\n {data.detections.map((detection, index) => (\n <React.Fragment key={index}>\n <ListItem>\n <ListItemText \n primary={\n <Box style={{ display: 'flex', alignItems: 'center' }}>\n <Typography variant=\"body1\" component=\"span\">\n {detection.class}\n </Typography>\n <Chip \n label={`${(detection.confidence * 100).toFixed(0)}%`}\n size=\"small\"\n color=\"primary\"\n style={{ marginLeft: 8 }}\n />\n </Box>\n } \n secondary={`Bounding Box: [${detection.bbox.join(', ')}]`} \n />\n </ListItem>\n {index < data.detections.length - 1 && <Divider />}\n </React.Fragment>\n ))}\n </List>\n ) : (\n <Typography variant=\"body1\">No objects detected</Typography>\n )}\n </Box>\n </Grid>\n </Grid>\n \n {renderPerformanceInfo()}\n </Paper>\n );\n }\n \n // Render for ViT (classification)\n if (model === 'vit') {\n return (\n <Paper className={classes.paper}>\n <Typography variant=\"h6\" gutterBottom>\n ViT Classification Results\n </Typography>\n \n <Typography variant=\"subtitle1\" gutterBottom>\n Top Predictions:\n </Typography>\n \n {data.top_predictions && data.top_predictions.length > 0 ? (\n <List>\n {data.top_predictions.map((prediction, index) => (\n <React.Fragment key={index}>\n <ListItem>\n <ListItemText \n primary={\n <Box style={{ display: 'flex', alignItems: 'center' }}>\n <Typography variant=\"body1\" component=\"span\">\n {prediction.rank}. {prediction.class}\n </Typography>\n <Chip \n label={`${(prediction.probability * 100).toFixed(1)}%`}\n size=\"small\"\n color={index === 0 ? \"primary\" : \"default\"}\n style={{ marginLeft: 8 }}\n />\n </Box>\n } \n />\n </ListItem>\n {index < data.top_predictions.length - 1 && <Divider />}\n </React.Fragment>\n ))}\n </List>\n ) : (\n <Typography variant=\"body1\">No classifications available</Typography>\n )}\n \n {renderPerformanceInfo()}\n </Paper>\n );\n }\n \n return null;\n};\n\nexport default ResultDisplay;\n","import React, { useState } from 'react';\nimport { \n Paper, \n Typography, \n Box, \n TextField, \n Button, \n CircularProgress,\n Divider\n} from '@material-ui/core';\nimport { makeStyles } from '@material-ui/core/styles';\n\nconst useStyles = makeStyles((theme) => ({\n paper: {\n padding: theme.spacing(2),\n marginTop: theme.spacing(2)\n },\n marginBottom: {\n marginBottom: theme.spacing(2)\n },\n dividerMargin: {\n margin: `${theme.spacing(2)}px 0`\n },\n responseBox: {\n padding: theme.spacing(2),\n backgroundColor: '#f5f5f5',\n borderRadius: theme.shape.borderRadius,\n marginTop: theme.spacing(2),\n whiteSpace: 'pre-wrap'\n },\n buttonProgress: {\n marginLeft: theme.spacing(1)\n }\n}));\n\nconst LlmAnalysis = ({ visionResults, model }) => {\n const classes = useStyles();\n const [userQuery, setUserQuery] = useState('');\n const [isAnalyzing, setIsAnalyzing] = useState(false);\n const [analysisResult, setAnalysisResult] = useState(null);\n const [error, setError] = useState(null);\n\n // Format time for display\n const formatTime = (ms) => {\n if (ms === undefined || ms === null || isNaN(ms)) return '-';\n const num = Number(ms);\n if (num < 1000) return `${num.toFixed(2)} ms`;\n return `${(num / 1000).toFixed(2)} s`;\n };\n\n const handleAnalyze = async () => {\n if (!userQuery.trim()) return;\n \n setIsAnalyzing(true);\n setError(null);\n \n try {\n const response = await fetch('/api/analyze', {\n method: 'POST',\n headers: {\n 'Content-Type': 'application/json',\n },\n body: JSON.stringify({\n visionResults: visionResults,\n userQuery: userQuery\n }),\n });\n\n if (!response.ok) {\n throw new Error(`HTTP error! 
Status: ${response.status}`);\n }\n\n const data = await response.json();\n \n if (data.error) {\n setError(data.error);\n } else {\n setAnalysisResult(data);\n }\n } catch (err) {\n console.error('Error analyzing with LLM:', err);\n setError(`Error analyzing with LLM: ${err.message}`);\n } finally {\n setIsAnalyzing(false);\n }\n };\n\n if (!visionResults) return null;\n\n return (\n <Paper className={classes.paper}>\n <Typography variant=\"h6\" gutterBottom>\n Ask AI about the {model === 'vit' ? 'Classification' : 'Detection'} Results\n </Typography>\n \n <Typography variant=\"body2\" className={classes.marginBottom}>\n Ask a question about the detected objects or classifications to get an AI-powered analysis.\n </Typography>\n \n <TextField\n fullWidth\n label=\"Your question about the image\"\n variant=\"outlined\"\n value={userQuery}\n onChange={(e) => setUserQuery(e.target.value)}\n disabled={isAnalyzing}\n className={classes.marginBottom}\n placeholder={model === 'vit' \n ? \"E.g., What category does this image belong to?\" \n : \"E.g., How many people are in this image?\"}\n />\n \n <Button \n variant=\"contained\" \n color=\"primary\"\n onClick={handleAnalyze}\n disabled={isAnalyzing || !userQuery.trim()}\n >\n Analyze with AI\n {isAnalyzing && <CircularProgress size={24} className={classes.buttonProgress} />}\n </Button>\n \n {error && (\n <Box mt={2}>\n <Typography color=\"error\">{error}</Typography>\n </Box>\n )}\n \n {analysisResult && (\n <>\n <Divider className={classes.dividerMargin} />\n \n <Typography variant=\"subtitle1\" gutterBottom>\n AI Analysis:\n </Typography>\n \n <Box className={classes.responseBox}>\n <Typography variant=\"body1\">\n {analysisResult.response}\n </Typography>\n </Box>\n \n {analysisResult.performance && (\n <Box mt={1}>\n <Typography variant=\"body2\" color=\"textSecondary\">\n Analysis time: {formatTime(analysisResult.performance.inference_time)} on {analysisResult.performance.device}\n </Typography>\n </Box>\n )}\n </>\n )}\n </Paper>\n );\n};\n\nexport default LlmAnalysis;\n","import React, { useState, useEffect } from 'react';\nimport { \n Container, \n Typography, \n Box, \n Paper, \n Grid, \n CircularProgress,\n AppBar,\n Toolbar,\n ThemeProvider,\n createMuiTheme\n} from '@material-ui/core';\nimport ImageUploader from './components/ImageUploader';\nimport ModelSelector from './components/ModelSelector';\nimport ResultDisplay from './components/ResultDisplay';\nimport LlmAnalysis from './components/LlmAnalysis';\nimport './App.css';\n\n// Create a theme\nconst theme = createMuiTheme({\n palette: {\n primary: {\n main: '#3f51b5',\n },\n secondary: {\n main: '#f50057',\n },\n },\n typography: {\n fontFamily: 'Roboto, Arial, sans-serif',\n },\n});\n\nfunction App() {\n const [selectedImage, setSelectedImage] = useState(null);\n const [selectedModel, setSelectedModel] = useState('');\n const [isProcessing, setIsProcessing] = useState(false);\n const [results, setResults] = useState(null);\n const [error, setError] = useState(null);\n const [modelsStatus, setModelsStatus] = useState({\n yolo: false,\n detr: false,\n vit: false\n });\n\n // Check API status on component mount\n useEffect(() => {\n fetch('/api/status')\n .then(response => response.json())\n .then(data => {\n setModelsStatus(data.models);\n })\n .catch(err => {\n console.error('Error checking API status:', err);\n setError('Error connecting to the backend API. 
Please make sure the server is running.');\n });\n }, []);\n\n const handleImageUpload = (image) => {\n setSelectedImage(image);\n setResults(null);\n setError(null);\n };\n\n const handleModelSelect = (model) => {\n setSelectedModel(model);\n setResults(null);\n setError(null);\n };\n\n const processImage = async () => {\n if (!selectedImage || !selectedModel) {\n setError('Please select both an image and a model');\n return;\n }\n\n setIsProcessing(true);\n setError(null);\n\n // Create form data for the image\n const formData = new FormData();\n formData.append('image', selectedImage);\n\n let endpoint = '';\n switch (selectedModel) {\n case 'yolo':\n endpoint = '/api/detect/yolo';\n break;\n case 'detr':\n endpoint = '/api/detect/detr';\n break;\n case 'vit':\n endpoint = '/api/classify/vit';\n break;\n default:\n setError('Invalid model selection');\n setIsProcessing(false);\n return;\n }\n\n try {\n const response = await fetch(endpoint, {\n method: 'POST',\n body: formData,\n });\n\n if (!response.ok) {\n throw new Error(`HTTP error! Status: ${response.status}`);\n }\n\n const data = await response.json();\n setResults({ model: selectedModel, data });\n } catch (err) {\n console.error('Error processing image:', err);\n setError(`Error processing image: ${err.message}`);\n } finally {\n setIsProcessing(false);\n }\n };\n\n return (\n <ThemeProvider theme={theme}>\n <Box style={{ flexGrow: 1 }}>\n <AppBar position=\"static\">\n <Toolbar>\n <Typography variant=\"h6\" style={{ flexGrow: 1 }}>\n Multi-Model Object Detection Demo\n </Typography>\n </Toolbar>\n </AppBar>\n <Container maxWidth=\"lg\" style={{ marginTop: theme.spacing(4), marginBottom: theme.spacing(4) }}>\n <Grid container spacing={3}>\n <Grid item xs={12}>\n <Paper style={{ padding: theme.spacing(2) }}>\n <Typography variant=\"h5\" gutterBottom>\n Upload an image to see how each model performs!\n </Typography>\n <Typography variant=\"body1\" paragraph>\n This demo showcases three different object detection and image classification models:\n </Typography>\n <Typography variant=\"body1\" component=\"div\">\n <ul>\n <li><strong>YOLOv8</strong>: Fast and accurate object detection</li>\n <li><strong>DETR</strong>: DEtection TRansformer for object detection</li>\n <li><strong>ViT</strong>: Vision Transformer for image classification</li>\n </ul>\n </Typography>\n </Paper>\n </Grid>\n \n <Grid item xs={12} md={6}>\n <ImageUploader onImageUpload={handleImageUpload} />\n </Grid>\n \n <Grid item xs={12} md={6}>\n <ModelSelector \n onModelSelect={handleModelSelect} \n onProcess={processImage}\n isProcessing={isProcessing}\n modelsStatus={modelsStatus}\n selectedModel={selectedModel}\n imageSelected={!!selectedImage}\n />\n </Grid>\n \n {error && (\n <Grid item xs={12}>\n <Paper style={{ padding: theme.spacing(2), backgroundColor: '#ffebee' }}>\n <Typography color=\"error\">{error}</Typography>\n </Paper>\n </Grid>\n )}\n \n {isProcessing && (\n <Grid item xs={12} style={{ textAlign: 'center', margin: `${theme.spacing(4)}px 0` }}>\n <CircularProgress />\n <Typography variant=\"h6\" style={{ marginTop: theme.spacing(2) }}>\n Processing image...\n </Typography>\n </Grid>\n )}\n \n {results && (\n <>\n <Grid item xs={12}>\n <ResultDisplay results={results} />\n </Grid>\n <Grid item xs={12}>\n <LlmAnalysis visionResults={results.data} model={results.model} />\n </Grid>\n </>\n )}\n </Grid>\n </Container>\n </Box>\n </ThemeProvider>\n );\n}\n\nexport default App;\n","const reportWebVitals = (onPerfEntry) => {\n if (onPerfEntry && 
onPerfEntry instanceof Function) {\n import('web-vitals').then(({ getCLS, getFID, getFCP, getLCP, getTTFB }) => {\n getCLS(onPerfEntry);\n getFID(onPerfEntry);\n getFCP(onPerfEntry);\n getLCP(onPerfEntry);\n getTTFB(onPerfEntry);\n });\n }\n};\n\nexport default reportWebVitals;\n","import React from 'react';\nimport ReactDOM from 'react-dom';\nimport './index.css';\nimport App from './App';\nimport reportWebVitals from './reportWebVitals';\n\nReactDOM.render(\n <React.StrictMode>\n <App />\n </React.StrictMode>,\n document.getElementById('root')\n);\n\n// If you want to start measuring performance in your app, pass a function\n// to log results (for example: reportWebVitals(console.log))\n// or send to an analytics endpoint. Learn more: https://bit.ly/CRA-vitals\nreportWebVitals();\n"],"sourceRoot":""}
frontend/build/static/js/runtime-main.ab7e4402.js
ADDED
@@ -0,0 +1,2 @@
1 +
!function(e){function r(r){for(var n,i,a=r[0],c=r[1],l=r[2],p=0,s=[];p<a.length;p++)i=a[p],Object.prototype.hasOwnProperty.call(o,i)&&o[i]&&s.push(o[i][0]),o[i]=0;for(n in c)Object.prototype.hasOwnProperty.call(c,n)&&(e[n]=c[n]);for(f&&f(r);s.length;)s.shift()();return u.push.apply(u,l||[]),t()}function t(){for(var e,r=0;r<u.length;r++){for(var t=u[r],n=!0,a=1;a<t.length;a++){var c=t[a];0!==o[c]&&(n=!1)}n&&(u.splice(r--,1),e=i(i.s=t[0]))}return e}var n={},o={1:0},u=[];function i(r){if(n[r])return n[r].exports;var t=n[r]={i:r,l:!1,exports:{}};return e[r].call(t.exports,t,t.exports,i),t.l=!0,t.exports}i.e=function(e){var r=[],t=o[e];if(0!==t)if(t)r.push(t[2]);else{var n=new Promise((function(r,n){t=o[e]=[r,n]}));r.push(t[2]=n);var u,a=document.createElement("script");a.charset="utf-8",a.timeout=120,i.nc&&a.setAttribute("nonce",i.nc),a.src=function(e){return i.p+"static/js/"+({}[e]||e)+"."+{3:"0e3ce0f8"}[e]+".chunk.js"}(e);var c=new Error;u=function(r){a.onerror=a.onload=null,clearTimeout(l);var t=o[e];if(0!==t){if(t){var n=r&&("load"===r.type?"missing":r.type),u=r&&r.target&&r.target.src;c.message="Loading chunk "+e+" failed.\n("+n+": "+u+")",c.name="ChunkLoadError",c.type=n,c.request=u,t[1](c)}o[e]=void 0}};var l=setTimeout((function(){u({type:"timeout",target:a})}),12e4);a.onerror=a.onload=u,document.head.appendChild(a)}return Promise.all(r)},i.m=e,i.c=n,i.d=function(e,r,t){i.o(e,r)||Object.defineProperty(e,r,{enumerable:!0,get:t})},i.r=function(e){"undefined"!==typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},i.t=function(e,r){if(1&r&&(e=i(e)),8&r)return e;if(4&r&&"object"===typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(i.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&r&&"string"!=typeof e)for(var n in e)i.d(t,n,function(r){return e[r]}.bind(null,n));return t},i.n=function(e){var r=e&&e.__esModule?function(){return e.default}:function(){return e};return i.d(r,"a",r),r},i.o=function(e,r){return Object.prototype.hasOwnProperty.call(e,r)},i.p="/",i.oe=function(e){throw console.error(e),e};var a=this["webpackJsonpvision-web-app"]=this["webpackJsonpvision-web-app"]||[],c=a.push.bind(a);a.push=r,a=a.slice();for(var l=0;l<a.length;l++)r(a[l]);var f=c;t()}([]);
2 +
//# sourceMappingURL=runtime-main.ab7e4402.js.map
frontend/build/static/js/runtime-main.ab7e4402.js.map
ADDED
@@ -0,0 +1 @@
1 +
{"version":3,"sources":["../webpack/bootstrap"],"names":["webpackJsonpCallback","data","moduleId","chunkId","chunkIds","moreModules","executeModules","i","resolves","length","Object","prototype","hasOwnProperty","call","installedChunks","push","modules","parentJsonpFunction","shift","deferredModules","apply","checkDeferredModules","result","deferredModule","fulfilled","j","depId","splice","__webpack_require__","s","installedModules","1","exports","module","l","e","promises","installedChunkData","promise","Promise","resolve","reject","onScriptComplete","script","document","createElement","charset","timeout","nc","setAttribute","src","p","jsonpScriptSrc","error","Error","event","onerror","onload","clearTimeout","chunk","errorType","type","realSrc","target","message","name","request","undefined","setTimeout","head","appendChild","all","m","c","d","getter","o","defineProperty","enumerable","get","r","Symbol","toStringTag","value","t","mode","__esModule","ns","create","key","bind","n","object","property","oe","err","console","jsonpArray","this","oldJsonpFunction","slice"],"mappings":"aACE,SAASA,EAAqBC,GAQ7B,IAPA,IAMIC,EAAUC,EANVC,EAAWH,EAAK,GAChBI,EAAcJ,EAAK,GACnBK,EAAiBL,EAAK,GAIHM,EAAI,EAAGC,EAAW,GACpCD,EAAIH,EAASK,OAAQF,IACzBJ,EAAUC,EAASG,GAChBG,OAAOC,UAAUC,eAAeC,KAAKC,EAAiBX,IAAYW,EAAgBX,IACpFK,EAASO,KAAKD,EAAgBX,GAAS,IAExCW,EAAgBX,GAAW,EAE5B,IAAID,KAAYG,EACZK,OAAOC,UAAUC,eAAeC,KAAKR,EAAaH,KACpDc,EAAQd,GAAYG,EAAYH,IAKlC,IAFGe,GAAqBA,EAAoBhB,GAEtCO,EAASC,QACdD,EAASU,OAATV,GAOD,OAHAW,EAAgBJ,KAAKK,MAAMD,EAAiBb,GAAkB,IAGvDe,IAER,SAASA,IAER,IADA,IAAIC,EACIf,EAAI,EAAGA,EAAIY,EAAgBV,OAAQF,IAAK,CAG/C,IAFA,IAAIgB,EAAiBJ,EAAgBZ,GACjCiB,GAAY,EACRC,EAAI,EAAGA,EAAIF,EAAed,OAAQgB,IAAK,CAC9C,IAAIC,EAAQH,EAAeE,GACG,IAA3BX,EAAgBY,KAAcF,GAAY,GAE3CA,IACFL,EAAgBQ,OAAOpB,IAAK,GAC5Be,EAASM,EAAoBA,EAAoBC,EAAIN,EAAe,KAItE,OAAOD,EAIR,IAAIQ,EAAmB,GAKnBhB,EAAkB,CACrBiB,EAAG,GAGAZ,EAAkB,GAQtB,SAASS,EAAoB1B,GAG5B,GAAG4B,EAAiB5B,GACnB,OAAO4B,EAAiB5B,GAAU8B,QAGnC,IAAIC,EAASH,EAAiB5B,GAAY,CACzCK,EAAGL,EACHgC,GAAG,EACHF,QAAS,IAUV,OANAhB,EAAQd,GAAUW,KAAKoB,EAAOD,QAASC,EAAQA,EAAOD,QAASJ,GAG/DK,EAAOC,GAAI,EAGJD,EAAOD,QAKfJ,EAAoBO,EAAI,SAAuBhC,GAC9C,IAAIiC,EAAW,GAKXC,EAAqBvB,EAAgBX,GACzC,GAA0B,IAAvBkC,EAGF,GAAGA,EACFD,EAASrB,KAAKsB,EAAmB,QAC3B,CAEN,IAAIC,EAAU,IAAIC,SAAQ,SAASC,EAASC,GAC3CJ,EAAqBvB,EAAgBX,GAAW,CAACqC,EAASC,MAE3DL,EAASrB,KAAKsB,EAAmB,GAAKC,GAGtC,IACII,EADAC,EAASC,SAASC,cAAc,UAGpCF,EAAOG,QAAU,QACjBH,EAAOI,QAAU,IACbnB,EAAoBoB,IACvBL,EAAOM,aAAa,QAASrB,EAAoBoB,IAElDL,EAAOO,IA1DV,SAAwB/C,GACvB,OAAOyB,EAAoBuB,EAAI,cAAgB,GAAGhD,IAAUA,GAAW,IAAM,CAAC,EAAI,YAAYA,GAAW,YAyD1FiD,CAAejD,GAG5B,IAAIkD,EAAQ,IAAIC,MAChBZ,EAAmB,SAAUa,GAE5BZ,EAAOa,QAAUb,EAAOc,OAAS,KACjCC,aAAaX,GACb,IAAIY,EAAQ7C,EAAgBX,GAC5B,GAAa,IAAVwD,EAAa,CACf,GAAGA,EAAO,CACT,IAAIC,EAAYL,IAAyB,SAAfA,EAAMM,KAAkB,UAAYN,EAAMM,MAChEC,EAAUP,GAASA,EAAMQ,QAAUR,EAAMQ,OAAOb,IACpDG,EAAMW,QAAU,iBAAmB7D,EAAU,cAAgByD,EAAY,KAAOE,EAAU,IAC1FT,EAAMY,KAAO,iBACbZ,EAAMQ,KAAOD,EACbP,EAAMa,QAAUJ,EAChBH,EAAM,GAAGN,GAEVvC,EAAgBX,QAAWgE,IAG7B,IAAIpB,EAAUqB,YAAW,WACxB1B,EAAiB,CAAEmB,KAAM,UAAWE,OAAQpB,MAC1C,MACHA,EAAOa,QAAUb,EAAOc,OAASf,EACjCE,SAASyB,KAAKC,YAAY3B,GAG5B,OAAOJ,QAAQgC,IAAInC,IAIpBR,EAAoB4C,EAAIxD,EAGxBY,EAAoB6C,EAAI3C,EAGxBF,EAAoB8C,EAAI,SAAS1C,EAASiC,EAAMU,GAC3C/C,EAAoBgD,EAAE5C,EAASiC,IAClCvD,OAAOmE,eAAe7C,EAASiC,EAAM,CAAEa,YAAY,EAAMC,IAAKJ,KAKhE/C,EAAoBoD,EAAI,SAAShD,GACX,qBAAXiD,QAA0BA,OAAOC,aAC1CxE,OAAOmE,eAAe7C,EAASiD,OAAOC,YAAa,CAAEC,MAAO,WAE7DzE,OAAOmE,eAAe7C,EAAS,aAAc,CAAEmD,OAAO,KAQvDvD,EAAoBwD,EAAI,SAASD,EAAOE,GAEvC,GADU,EAAPA,IAAUF,EAAQvD,EAAoBuD,IAC/B,EAAPE,EAAU,OA
AOF,EACpB,GAAW,EAAPE,GAA8B,kBAAVF,GAAsBA,GAASA,EAAMG,WAAY,OAAOH,EAChF,IAAII,EAAK7E,OAAO8E,OAAO,MAGvB,GAFA5D,EAAoBoD,EAAEO,GACtB7E,OAAOmE,eAAeU,EAAI,UAAW,CAAET,YAAY,EAAMK,MAAOA,IACtD,EAAPE,GAA4B,iBAATF,EAAmB,IAAI,IAAIM,KAAON,EAAOvD,EAAoB8C,EAAEa,EAAIE,EAAK,SAASA,GAAO,OAAON,EAAMM,IAAQC,KAAK,KAAMD,IAC9I,OAAOF,GAIR3D,EAAoB+D,EAAI,SAAS1D,GAChC,IAAI0C,EAAS1C,GAAUA,EAAOqD,WAC7B,WAAwB,OAAOrD,EAAgB,SAC/C,WAA8B,OAAOA,GAEtC,OADAL,EAAoB8C,EAAEC,EAAQ,IAAKA,GAC5BA,GAIR/C,EAAoBgD,EAAI,SAASgB,EAAQC,GAAY,OAAOnF,OAAOC,UAAUC,eAAeC,KAAK+E,EAAQC,IAGzGjE,EAAoBuB,EAAI,IAGxBvB,EAAoBkE,GAAK,SAASC,GAA2B,MAApBC,QAAQ3C,MAAM0C,GAAYA,GAEnE,IAAIE,EAAaC,KAAK,8BAAgCA,KAAK,+BAAiC,GACxFC,EAAmBF,EAAWlF,KAAK2E,KAAKO,GAC5CA,EAAWlF,KAAOf,EAClBiG,EAAaA,EAAWG,QACxB,IAAI,IAAI7F,EAAI,EAAGA,EAAI0F,EAAWxF,OAAQF,IAAKP,EAAqBiG,EAAW1F,IAC3E,IAAIU,EAAsBkF,EAI1B9E,I","file":"static/js/runtime-main.ab7e4402.js","sourcesContent":[" \t// install a JSONP callback for chunk loading\n \tfunction webpackJsonpCallback(data) {\n \t\tvar chunkIds = data[0];\n \t\tvar moreModules = data[1];\n \t\tvar executeModules = data[2];\n\n \t\t// add \"moreModules\" to the modules object,\n \t\t// then flag all \"chunkIds\" as loaded and fire callback\n \t\tvar moduleId, chunkId, i = 0, resolves = [];\n \t\tfor(;i < chunkIds.length; i++) {\n \t\t\tchunkId = chunkIds[i];\n \t\t\tif(Object.prototype.hasOwnProperty.call(installedChunks, chunkId) && installedChunks[chunkId]) {\n \t\t\t\tresolves.push(installedChunks[chunkId][0]);\n \t\t\t}\n \t\t\tinstalledChunks[chunkId] = 0;\n \t\t}\n \t\tfor(moduleId in moreModules) {\n \t\t\tif(Object.prototype.hasOwnProperty.call(moreModules, moduleId)) {\n \t\t\t\tmodules[moduleId] = moreModules[moduleId];\n \t\t\t}\n \t\t}\n \t\tif(parentJsonpFunction) parentJsonpFunction(data);\n\n \t\twhile(resolves.length) {\n \t\t\tresolves.shift()();\n \t\t}\n\n \t\t// add entry modules from loaded chunk to deferred list\n \t\tdeferredModules.push.apply(deferredModules, executeModules || []);\n\n \t\t// run deferred modules when all chunks ready\n \t\treturn checkDeferredModules();\n \t};\n \tfunction checkDeferredModules() {\n \t\tvar result;\n \t\tfor(var i = 0; i < deferredModules.length; i++) {\n \t\t\tvar deferredModule = deferredModules[i];\n \t\t\tvar fulfilled = true;\n \t\t\tfor(var j = 1; j < deferredModule.length; j++) {\n \t\t\t\tvar depId = deferredModule[j];\n \t\t\t\tif(installedChunks[depId] !== 0) fulfilled = false;\n \t\t\t}\n \t\t\tif(fulfilled) {\n \t\t\t\tdeferredModules.splice(i--, 1);\n \t\t\t\tresult = __webpack_require__(__webpack_require__.s = deferredModule[0]);\n \t\t\t}\n \t\t}\n\n \t\treturn result;\n \t}\n\n \t// The module cache\n \tvar installedModules = {};\n\n \t// object to store loaded and loading chunks\n \t// undefined = chunk not loaded, null = chunk preloaded/prefetched\n \t// Promise = chunk loading, 0 = chunk loaded\n \tvar installedChunks = {\n \t\t1: 0\n \t};\n\n \tvar deferredModules = [];\n\n \t// script path function\n \tfunction jsonpScriptSrc(chunkId) {\n \t\treturn __webpack_require__.p + \"static/js/\" + ({}[chunkId]||chunkId) + \".\" + {\"3\":\"0e3ce0f8\"}[chunkId] + \".chunk.js\"\n \t}\n\n \t// The require function\n \tfunction __webpack_require__(moduleId) {\n\n \t\t// Check if module is in cache\n \t\tif(installedModules[moduleId]) {\n \t\t\treturn installedModules[moduleId].exports;\n \t\t}\n \t\t// Create a new module (and put it into the cache)\n \t\tvar module = installedModules[moduleId] = {\n \t\t\ti: moduleId,\n \t\t\tl: false,\n \t\t\texports: {}\n \t\t};\n\n \t\t// 
Execute the module function\n \t\tmodules[moduleId].call(module.exports, module, module.exports, __webpack_require__);\n\n \t\t// Flag the module as loaded\n \t\tmodule.l = true;\n\n \t\t// Return the exports of the module\n \t\treturn module.exports;\n \t}\n\n \t// This file contains only the entry chunk.\n \t// The chunk loading function for additional chunks\n \t__webpack_require__.e = function requireEnsure(chunkId) {\n \t\tvar promises = [];\n\n\n \t\t// JSONP chunk loading for javascript\n\n \t\tvar installedChunkData = installedChunks[chunkId];\n \t\tif(installedChunkData !== 0) { // 0 means \"already installed\".\n\n \t\t\t// a Promise means \"currently loading\".\n \t\t\tif(installedChunkData) {\n \t\t\t\tpromises.push(installedChunkData[2]);\n \t\t\t} else {\n \t\t\t\t// setup Promise in chunk cache\n \t\t\t\tvar promise = new Promise(function(resolve, reject) {\n \t\t\t\t\tinstalledChunkData = installedChunks[chunkId] = [resolve, reject];\n \t\t\t\t});\n \t\t\t\tpromises.push(installedChunkData[2] = promise);\n\n \t\t\t\t// start chunk loading\n \t\t\t\tvar script = document.createElement('script');\n \t\t\t\tvar onScriptComplete;\n\n \t\t\t\tscript.charset = 'utf-8';\n \t\t\t\tscript.timeout = 120;\n \t\t\t\tif (__webpack_require__.nc) {\n \t\t\t\t\tscript.setAttribute(\"nonce\", __webpack_require__.nc);\n \t\t\t\t}\n \t\t\t\tscript.src = jsonpScriptSrc(chunkId);\n\n \t\t\t\t// create error before stack unwound to get useful stacktrace later\n \t\t\t\tvar error = new Error();\n \t\t\t\tonScriptComplete = function (event) {\n \t\t\t\t\t// avoid mem leaks in IE.\n \t\t\t\t\tscript.onerror = script.onload = null;\n \t\t\t\t\tclearTimeout(timeout);\n \t\t\t\t\tvar chunk = installedChunks[chunkId];\n \t\t\t\t\tif(chunk !== 0) {\n \t\t\t\t\t\tif(chunk) {\n \t\t\t\t\t\t\tvar errorType = event && (event.type === 'load' ? 
'missing' : event.type);\n \t\t\t\t\t\t\tvar realSrc = event && event.target && event.target.src;\n \t\t\t\t\t\t\terror.message = 'Loading chunk ' + chunkId + ' failed.\\n(' + errorType + ': ' + realSrc + ')';\n \t\t\t\t\t\t\terror.name = 'ChunkLoadError';\n \t\t\t\t\t\t\terror.type = errorType;\n \t\t\t\t\t\t\terror.request = realSrc;\n \t\t\t\t\t\t\tchunk[1](error);\n \t\t\t\t\t\t}\n \t\t\t\t\t\tinstalledChunks[chunkId] = undefined;\n \t\t\t\t\t}\n \t\t\t\t};\n \t\t\t\tvar timeout = setTimeout(function(){\n \t\t\t\t\tonScriptComplete({ type: 'timeout', target: script });\n \t\t\t\t}, 120000);\n \t\t\t\tscript.onerror = script.onload = onScriptComplete;\n \t\t\t\tdocument.head.appendChild(script);\n \t\t\t}\n \t\t}\n \t\treturn Promise.all(promises);\n \t};\n\n \t// expose the modules object (__webpack_modules__)\n \t__webpack_require__.m = modules;\n\n \t// expose the module cache\n \t__webpack_require__.c = installedModules;\n\n \t// define getter function for harmony exports\n \t__webpack_require__.d = function(exports, name, getter) {\n \t\tif(!__webpack_require__.o(exports, name)) {\n \t\t\tObject.defineProperty(exports, name, { enumerable: true, get: getter });\n \t\t}\n \t};\n\n \t// define __esModule on exports\n \t__webpack_require__.r = function(exports) {\n \t\tif(typeof Symbol !== 'undefined' && Symbol.toStringTag) {\n \t\t\tObject.defineProperty(exports, Symbol.toStringTag, { value: 'Module' });\n \t\t}\n \t\tObject.defineProperty(exports, '__esModule', { value: true });\n \t};\n\n \t// create a fake namespace object\n \t// mode & 1: value is a module id, require it\n \t// mode & 2: merge all properties of value into the ns\n \t// mode & 4: return value when already ns object\n \t// mode & 8|1: behave like require\n \t__webpack_require__.t = function(value, mode) {\n \t\tif(mode & 1) value = __webpack_require__(value);\n \t\tif(mode & 8) return value;\n \t\tif((mode & 4) && typeof value === 'object' && value && value.__esModule) return value;\n \t\tvar ns = Object.create(null);\n \t\t__webpack_require__.r(ns);\n \t\tObject.defineProperty(ns, 'default', { enumerable: true, value: value });\n \t\tif(mode & 2 && typeof value != 'string') for(var key in value) __webpack_require__.d(ns, key, function(key) { return value[key]; }.bind(null, key));\n \t\treturn ns;\n \t};\n\n \t// getDefaultExport function for compatibility with non-harmony modules\n \t__webpack_require__.n = function(module) {\n \t\tvar getter = module && module.__esModule ?\n \t\t\tfunction getDefault() { return module['default']; } :\n \t\t\tfunction getModuleExports() { return module; };\n \t\t__webpack_require__.d(getter, 'a', getter);\n \t\treturn getter;\n \t};\n\n \t// Object.prototype.hasOwnProperty.call\n \t__webpack_require__.o = function(object, property) { return Object.prototype.hasOwnProperty.call(object, property); };\n\n \t// __webpack_public_path__\n \t__webpack_require__.p = \"/\";\n\n \t// on error function for async loading\n \t__webpack_require__.oe = function(err) { console.error(err); throw err; };\n\n \tvar jsonpArray = this[\"webpackJsonpvision-web-app\"] = this[\"webpackJsonpvision-web-app\"] || [];\n \tvar oldJsonpFunction = jsonpArray.push.bind(jsonpArray);\n \tjsonpArray.push = webpackJsonpCallback;\n \tjsonpArray = jsonpArray.slice();\n \tfor(var i = 0; i < jsonpArray.length; i++) webpackJsonpCallback(jsonpArray[i]);\n \tvar parentJsonpFunction = oldJsonpFunction;\n\n\n \t// run deferred modules from other chunks\n \tcheckDeferredModules();\n"],"sourceRoot":""}
|
requirements.txt
ADDED
@@ -0,0 +1,26 @@
+# Core dependencies
+gradio>=4.0.0
+torch>=2.0.0
+transformers>=4.30.0
+Pillow>=9.0.0
+
+# Object detection models
+ultralytics>=8.0.0  # YOLOv8
+timm>=0.9.0  # Vision Transformer support
+
+# API dependencies
+flask>=2.0.0
+flask-cors>=3.0.0
+matplotlib>=3.5.0
+numpy>=1.20.0
+
+# For future phases
+fastapi>=0.100.0
+uvicorn[standard]>=0.22.0
+python-multipart>=0.0.5
+
+# Llama 4 integration
+accelerate>=0.20.0
+bitsandbytes>=0.41.0
+sentencepiece>=0.1.99
+protobuf>=4.23.0
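The dependency groups above map one-to-one onto the three demo models: ultralytics provides YOLOv8, while transformers (with timm) covers DETR and ViT. As a minimal sketch of how these packages are typically loaded (the checkpoint names below are common defaults and an assumption here, not something this requirements file pins):

# Sketch only: checkpoint IDs are assumed defaults, not set by this commit.
from ultralytics import YOLO         # ultralytics -> YOLOv8 detection
from transformers import pipeline    # transformers -> DETR and ViT

yolo_model = YOLO("yolov8n.pt")      # small YOLOv8 checkpoint (assumed)
detr_model = pipeline("object-detection", model="facebook/detr-resnet-50")
vit_model = pipeline("image-classification", model="google/vit-base-patch16-224")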
static/index.html
ADDED
@@ -0,0 +1,275 @@
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1.0">
+    <title>Multi-Model Vision App</title>
+    <style>
+        body { font-family: Arial, sans-serif; max-width: 800px; margin: 0 auto; padding: 20px; }
+        h1 { color: #3f51b5; text-align: center; }
+        .container { display: flex; flex-direction: column; gap: 20px; }
+        .card { border: 1px solid #ddd; border-radius: 8px; padding: 20px; box-shadow: 0 2px 4px rgba(0,0,0,0.1); }
+        .model-selection { display: flex; gap: 10px; margin: 20px 0; }
+        .model-btn { padding: 10px 15px; border: none; border-radius: 4px; cursor: pointer; background-color: #e0e0e0; }
+        .model-btn.active { background-color: #3f51b5; color: white; }
+        .model-btn:disabled { opacity: 0.5; cursor: not-allowed; }
+        .upload-container { display: flex; flex-direction: column; gap: 10px; }
+        #file-input { margin-bottom: 10px; }
+        #process-btn { padding: 10px 15px; background-color: #3f51b5; color: white; border: none; border-radius: 4px; cursor: pointer; }
+        #process-btn:disabled { background-color: #9e9e9e; cursor: not-allowed; }
+        .preview-container { display: flex; justify-content: center; margin: 20px 0; }
+        .preview-image { max-width: 100%; max-height: 300px; border-radius: 4px; }
+        .result-container { margin-top: 20px; }
+        .result-image { max-width: 100%; max-height: 400px; border-radius: 4px; }
+        .detection-list { list-style-type: none; padding: 0; }
+        .detection-item { padding: 8px; border-bottom: 1px solid #eee; }
+        .detection-item:last-child { border-bottom: none; }
+        .confidence { display: inline-block; padding: 2px 6px; background-color: #3f51b5; color: white; border-radius: 10px; font-size: 0.8em; margin-left: 8px; }
+        .performance { margin-top: 20px; font-size: 0.9em; color: #666; }
+        .error { color: #f44336; font-weight: bold; }
+        .loading { text-align: center; margin: 20px 0; }
+    </style>
+</head>
+<body>
+    <h1>Multi-Model Vision App</h1>
+
+    <div class="container">
+        <div class="card">
+            <h2>Select Model</h2>
+            <div class="model-selection">
+                <button id="yolo-btn" class="model-btn">YOLOv8 (Detection)</button>
+                <button id="detr-btn" class="model-btn">DETR (Detection)</button>
+                <button id="vit-btn" class="model-btn">ViT (Classification)</button>
+            </div>
+        </div>
+
+        <div class="card upload-container">
+            <h2>Upload Image</h2>
+            <input type="file" id="file-input" accept="image/*">
+            <div class="preview-container" id="preview-container"></div>
+            <button id="process-btn" disabled>Process Image</button>
+        </div>
+
+        <div class="card result-container" id="result-container" style="display: none;">
+            <h2 id="result-title">Results</h2>
+            <div id="result-content"></div>
+        </div>
+    </div>
+
+    <script>
+        // Model selection
+        const yoloBtn = document.getElementById('yolo-btn');
+        const detrBtn = document.getElementById('detr-btn');
+        const vitBtn = document.getElementById('vit-btn');
+        const fileInput = document.getElementById('file-input');
+        const processBtn = document.getElementById('process-btn');
+        const previewContainer = document.getElementById('preview-container');
+        const resultContainer = document.getElementById('result-container');
+        const resultTitle = document.getElementById('result-title');
+        const resultContent = document.getElementById('result-content');
+
+        let selectedModel = null;
+        let selectedFile = null;
+        let modelsStatus = { yolo: false, detr: false, vit: false };
+
+        // Check API status
+        async function checkApiStatus() {
+            try {
+                const response = await fetch('/api/status');
+                const data = await response.json();
+                modelsStatus = data.models;
+
+                // Update UI based on model availability
+                yoloBtn.disabled = !modelsStatus.yolo;
+                detrBtn.disabled = !modelsStatus.detr;
+                vitBtn.disabled = !modelsStatus.vit;
+
+                if (!modelsStatus.yolo) yoloBtn.title = "YOLOv8 model not available";
+                if (!modelsStatus.detr) detrBtn.title = "DETR model not available";
+                if (!modelsStatus.vit) vitBtn.title = "ViT model not available";
+            } catch (error) {
+                console.error('Error checking API status:', error);
+                alert('Error connecting to the API. Please make sure the server is running.');
+            }
+        }
+
+        // Format time for display
+        function formatTime(ms) {
+            if (ms < 1000) return `${ms.toFixed(2)} ms`;
+            return `${(ms / 1000).toFixed(2)} s`;
+        }
+
+        // Select model
+        function selectModel(model) {
+            selectedModel = model;
+            yoloBtn.classList.remove('active');
+            detrBtn.classList.remove('active');
+            vitBtn.classList.remove('active');
+
+            switch(model) {
+                case 'yolo':
+                    yoloBtn.classList.add('active');
+                    break;
+                case 'detr':
+                    detrBtn.classList.add('active');
+                    break;
+                case 'vit':
+                    vitBtn.classList.add('active');
+                    break;
+            }
+
+            updateProcessButton();
+        }
+
+        // Update process button state
+        function updateProcessButton() {
+            processBtn.disabled = !selectedModel || !selectedFile;
+        }
+
+        // Handle file selection
+        fileInput.addEventListener('change', (e) => {
+            const file = e.target.files[0];
+            if (file) {
+                selectedFile = file;
+                const reader = new FileReader();
+                reader.onload = (e) => {
+                    previewContainer.innerHTML = `<img src="${e.target.result}" class="preview-image" alt="Preview">`;
+                };
+                reader.readAsDataURL(file);
+                updateProcessButton();
+            } else {
+                selectedFile = null;
+                previewContainer.innerHTML = '';
+                updateProcessButton();
+            }
+        });
+
+        // Model selection event listeners
+        yoloBtn.addEventListener('click', () => selectModel('yolo'));
+        detrBtn.addEventListener('click', () => selectModel('detr'));
+        vitBtn.addEventListener('click', () => selectModel('vit'));
+
+        // Process image
+        processBtn.addEventListener('click', async () => {
+            if (!selectedModel || !selectedFile) return;
+
+            resultContainer.style.display = 'block';
+            resultTitle.textContent = `Processing with ${selectedModel.toUpperCase()}...`;
+            resultContent.innerHTML = '<div class="loading">Processing image...</div>';
+
+            const formData = new FormData();
+            formData.append('image', selectedFile);
+
+            let endpoint = '';
+            switch(selectedModel) {
+                case 'yolo':
+                    endpoint = '/api/detect/yolo';
+                    break;
+                case 'detr':
+                    endpoint = '/api/detect/detr';
+                    break;
+                case 'vit':
+                    endpoint = '/api/classify/vit';
+                    break;
+            }
+
+            try {
+                const response = await fetch(endpoint, {
+                    method: 'POST',
+                    body: formData
+                });
+
+                if (!response.ok) {
+                    throw new Error(`HTTP error! Status: ${response.status}`);
+                }
+
+                const data = await response.json();
+                displayResults(selectedModel, data);
+            } catch (error) {
+                console.error('Error processing image:', error);
+                resultTitle.textContent = 'Error';
+                resultContent.innerHTML = `<div class="error">Error processing image: ${error.message}</div>`;
+            }
+        });
+
+        // Display results
+        function displayResults(model, data) {
+            resultTitle.textContent = `${model.toUpperCase()} Results`;
+            let html = '';
+
+            if (model === 'yolo' || model === 'detr') {
+                // Detection results
+                html += `
+                    <div style="display: flex; flex-direction: column; gap: 20px;">
+                        <div>
+                            <h3>Processed Image</h3>
+                            <img src="data:image/jpeg;base64,${data.image_with_boxes}" class="result-image" alt="Detection Result">
+                        </div>
+                        <div>
+                            <h3>Detected Objects</h3>
+                `;
+
+                if (data.detections && data.detections.length > 0) {
+                    html += '<ul class="detection-list">';
+                    data.detections.forEach(item => {
+                        html += `
+                            <li class="detection-item">
+                                ${item.label}
+                                <span class="confidence">${(item.confidence * 100).toFixed(0)}%</span>
+                            </li>
+                        `;
+                    });
+                    html += '</ul>';
+                } else {
+                    html += '<p>No objects detected</p>';
+                }
+
+                html += `
+                        </div>
+                        <div class="performance">
+                            <p>Inference Time: ${formatTime(data.inference_time)}</p>
+                            <p>Total Processing Time: ${formatTime(data.total_time)}</p>
+                        </div>
+                    </div>
+                `;
+            } else if (model === 'vit') {
+                // Classification results
+                html += `
+                    <div style="display: flex; flex-direction: column; gap: 20px;">
+                        <div>
+                            <h3>Classification Results</h3>
+                `;
+
+                if (data.classifications && data.classifications.length > 0) {
+                    html += '<ul class="detection-list">';
+                    data.classifications.forEach(item => {
+                        html += `
+                            <li class="detection-item">
+                                ${item.label}
+                                <span class="confidence">${(item.confidence * 100).toFixed(1)}%</span>
+                            </li>
+                        `;
+                    });
+                    html += '</ul>';
+                } else {
+                    html += '<p>No classifications found</p>';
+                }
+
+                html += `
+                        </div>
+                        <div class="performance">
+                            <p>Inference Time: ${formatTime(data.inference_time)}</p>
+                            <p>Total Processing Time: ${formatTime(data.total_time)}</p>
+                        </div>
+                    </div>
+                `;
+            }
+
+            resultContent.innerHTML = html;
+        }
+
+        // Initialize
+        checkApiStatus();
+    </script>
+</body>
+</html>
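The page above fixes the API contract the backend must satisfy: GET /api/status returns per-model availability flags, and the POST endpoints answer with label/confidence pairs, millisecond timing fields, and (for detection) a base64-encoded annotated image. A minimal Flask sketch of that contract follows; it is an assumption-labeled outline inferred from the frontend code, not the api.py shipped in this commit:

# Sketch of the contract consumed by the page above; not the shipped api.py.
from flask import Flask, jsonify

app = Flask(__name__, static_folder="static", static_url_path="")

# Placeholder registry; in the real app these would be loaded model objects.
MODELS = {"yolo": None, "detr": None, "vit": None}

@app.route("/api/status")
def status():
    # checkApiStatus() reads data.models.yolo / .detr / .vit as booleans
    # and disables the matching button when a model is unavailable.
    return jsonify({"models": {name: m is not None for name, m in MODELS.items()}})

@app.route("/api/detect/yolo", methods=["POST"])
def detect_yolo():
    # Response shape inferred from displayResults(); times are in milliseconds.
    return jsonify({
        "detections": [],        # [{"label": str, "confidence": float}, ...]
        "image_with_boxes": "",  # base64-encoded JPEG with boxes drawn
        "inference_time": 0.0,
        "total_time": 0.0,
    })

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=7860)  # port the Space expects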