---
title: OpenWebUI - Ollama Chat
emoji: 🤖
colorFrom: green
colorTo: blue
sdk: docker
app_port: 7860
---
# OpenWebUI - Ollama Chat
A beautiful, modern chat interface for Ollama models, deployed as a Hugging Face Space. This Space provides a full-featured web UI that connects to your Ollama API Space for an interactive chat experience.
## Features

- **Beautiful Chat Interface**: Modern, responsive design with gradient backgrounds and smooth animations
- **Model Selection**: Choose from any available Ollama model
- **Parameter Control**: Adjust temperature, max tokens, and other generation parameters
- **Real-time Chat**: Interactive chat experience with typing indicators
- **Mobile Responsive**: Works on desktop and mobile devices
- **Ollama Integration**: Seamlessly connects to your Ollama API Space
- **Health Monitoring**: Built-in health checks and status monitoring
## Quick Start

### 1. Deploy to Hugging Face Spaces
- Fork this repository or create a new Space
- Upload these files to your Space
- Set the following environment variables in your Space settings:
  - `OLLAMA_API_URL`: URL to your Ollama Space (e.g., `https://your-ollama-space.hf.space`)
  - `DEFAULT_MODEL`: Default model to use (e.g., `llama2`)
  - `MAX_TOKENS`: Maximum tokens for generation (default: `2048`)
  - `TEMPERATURE`: Default temperature (default: `0.7`)
### 2. Local Development

```bash
# Clone the repository
git clone <your-repo-url>
cd openwebui-space

# Install dependencies
pip install -r requirements.txt

# Set environment variables
export OLLAMA_API_URL=https://your-ollama-space.hf.space
export DEFAULT_MODEL=llama2

# Run the application
python app.py
```
## Configuration

### Environment Variables

- `OLLAMA_API_URL`: **Required** - URL to your Ollama Space (e.g., `https://your-ollama-space.hf.space`)
- `DEFAULT_MODEL`: Default model to use (default: `llama2`)
- `MAX_TOKENS`: Maximum tokens for generation (default: `2048`)
- `TEMPERATURE`: Default temperature setting (default: `0.7`)
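For reference, here is a minimal sketch of how `app.py` might load this configuration. The variable names and defaults come from the list above; the loading code itself is illustrative, not the app's actual source:

```python
import os

# Load configuration from environment variables, falling back to the
# documented defaults. OLLAMA_API_URL has no sensible default, so it
# is treated as required and raises KeyError if missing.
OLLAMA_API_URL = os.environ["OLLAMA_API_URL"]
DEFAULT_MODEL = os.environ.get("DEFAULT_MODEL", "llama2")
MAX_TOKENS = int(os.environ.get("MAX_TOKENS", "2048"))
TEMPERATURE = float(os.environ.get("TEMPERATURE", "0.7"))
```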
### Prerequisites
Before using this OpenWebUI Space, you need:
- **Ollama Space**: A deployed Ollama API Space (see the `ollama-space` directory)
- **Ollama Instance**: A running Ollama instance that your Ollama Space can connect to
- **Network Access**: The OpenWebUI Space must be able to reach your Ollama Space
## API Endpoints

### `GET /`
Main chat interface - the beautiful web UI.
### `POST /api/chat`
Chat API endpoint for programmatic access.
**Request Body:**

```json
{
  "message": "Hello, how are you?",
  "model": "llama2",
  "temperature": 0.7,
  "max_tokens": 2048
}
```
**Response:**

```json
{
  "status": "success",
  "response": "Hello! I'm doing well, thank you for asking...",
  "model": "llama2",
  "usage": {
    "prompt_tokens": 7,
    "completion_tokens": 15,
    "total_tokens": 22
  }
}
```
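As a quick sanity check, you can call this endpoint from Python. The base URL below is a placeholder for your own deployment:

```python
import requests

BASE_URL = "https://your-openwebui-space.hf.space"  # placeholder URL

# Send one chat message using the request body documented above.
resp = requests.post(
    f"{BASE_URL}/api/chat",
    json={
        "message": "Hello, how are you?",
        "model": "llama2",
        "temperature": 0.7,
        "max_tokens": 2048,
    },
    timeout=120,  # large models can take a while to respond
)
resp.raise_for_status()
print(resp.json()["response"])
```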
### `GET /api/models`
Get available models from the connected Ollama Space.
### `GET /health`
Health check endpoint that also checks the Ollama Space connection.
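Both endpoints are plain GETs, so they are easy to script against. A small sketch (same placeholder URL as above; treat the exact response shapes as app-defined):

```python
import requests

BASE_URL = "https://your-openwebui-space.hf.space"  # placeholder URL

# List the models exposed by the connected Ollama Space.
models = requests.get(f"{BASE_URL}/api/models", timeout=30).json()
print("models:", models)

# Check the UI's health, which also probes the Ollama Space connection.
health = requests.get(f"{BASE_URL}/health", timeout=30).json()
print("health:", health)
```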
## Integration with Ollama Space
This OpenWebUI Space is designed to work seamlessly with the Ollama Space:
- **API Communication**: The OpenWebUI Space communicates with your Ollama Space via HTTP API calls
- **Model Discovery**: Automatically discovers and lists available models from your Ollama Space
- **Text Generation**: Sends generation requests to your Ollama Space and displays responses
- **Health Monitoring**: Monitors the health of both the OpenWebUI Space and the Ollama Space
### Architecture

```
User → OpenWebUI Space → Ollama Space → Ollama Instance
```
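To make the middle hop concrete, here is a hedged sketch of the server-side call the UI could make for text generation. It assumes the Ollama Space proxies Ollama's native `/api/generate` endpoint; if your Space wraps it behind a different path, adjust accordingly:

```python
import requests

OLLAMA_API_URL = "https://your-ollama-space.hf.space"  # placeholder URL

def generate(prompt: str, model: str = "llama2") -> str:
    """Forward a single generation request to the Ollama Space.

    Assumes the Space exposes Ollama's native /api/generate endpoint
    and that streaming is disabled so a single JSON object comes back.
    """
    resp = requests.post(
        f"{OLLAMA_API_URL}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```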
## UI Features

### Chat Interface
- **Message Bubbles**: Distinct styling for user and AI messages
- **Avatars**: Visual indicators for message sender
- **Auto-scroll**: Automatically scrolls to new messages
- **Typing Indicators**: Shows when the AI is generating a response
### Controls

- **Model Selection**: Dropdown to choose from available models
- **Temperature Slider**: Adjust creativity/randomness (0.0 - 2.0)
- **Max Tokens**: Set maximum response length
- **Real-time Updates**: Parameter changes are applied immediately
### Responsive Design

- **Mobile Optimized**: Touch-friendly interface for mobile devices
- **Adaptive Layout**: Automatically adjusts to different screen sizes
- **Modern CSS**: Uses CSS Grid and Flexbox for optimal layouts
## Docker Support
The Space includes a Dockerfile for containerized deployment:
```bash
# Build the image
docker build -t openwebui-space .

# Run the container
docker run -p 7860:7860 \
  -e OLLAMA_API_URL=https://your-ollama-space.hf.space \
  -e DEFAULT_MODEL=llama2 \
  openwebui-space
```
## Security Considerations

- **Public Access**: The Space is publicly accessible - consider adding authentication for production use
- **API Communication**: All communication with the Ollama Space is over HTTPS (when using HF Spaces)
- **Input Validation**: User inputs are validated before being sent to the Ollama Space (see the sketch below)
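The validation logic is app-specific; as one illustrative example (the limits below are assumptions, not the app's actual rules), a handler might reject malformed chat requests like this:

```python
def validate_chat_request(payload: dict) -> str | None:
    """Return an error message for a bad chat payload, else None.

    Illustrative limits only; the real app may enforce different ones.
    """
    message = payload.get("message")
    if not isinstance(message, str) or not message.strip():
        return "message must be a non-empty string"
    if len(message) > 8000:
        return "message is too long"
    temperature = payload.get("temperature", 0.7)
    if not 0.0 <= float(temperature) <= 2.0:
        return "temperature must be between 0.0 and 2.0"
    return None
```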
## Troubleshooting

### Common Issues

- **Cannot connect to Ollama Space**: Check the `OLLAMA_API_URL` environment variable
- **No models available**: Ensure your Ollama Space is running and has models
- **Generation errors**: Check the Ollama Space logs for detailed error messages
- **Slow responses**: Large models may take time to generate responses
### Health Checks

Use the `/health` endpoint to monitor:
- OpenWebUI Space status
- Connection to Ollama Space
- Ollama Space health status
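A minimal Flask sketch of such a combined check (the upstream `/health` path on the Ollama Space is an assumption):

```python
import requests
from flask import Flask, jsonify

app = Flask(__name__)
OLLAMA_API_URL = "https://your-ollama-space.hf.space"  # placeholder URL

@app.route("/health")
def health():
    # Report this Space's status plus whether the Ollama Space answers.
    ollama_status = "unreachable"
    try:
        upstream = requests.get(f"{OLLAMA_API_URL}/health", timeout=5)
        ollama_status = "ok" if upstream.ok else f"error {upstream.status_code}"
    except requests.RequestException:
        pass
    return jsonify({"status": "ok", "ollama_space": ollama_status})
```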
### Debug Mode

For local development, you can enable debug mode by setting `debug=True` in the Flask app.
## Mobile Experience
The interface is fully optimized for mobile devices:
- Touch-friendly controls
- Responsive design that adapts to screen size
- Optimized for portrait and landscape orientations
- Fast loading and smooth scrolling
## Use Cases

- **Personal AI Assistant**: Chat with your local models
- **Development Testing**: Test model responses and parameters
- **Demo and Showcase**: Present your Ollama setup to others
- **API Gateway**: Use the chat interface as a frontend for your Ollama API
## License
This project is open source and available under the MIT License.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## Support
If you encounter any issues or have questions, please open an issue on the repository.
## Related Projects

- **Ollama Space**: The backend API Space that this UI connects to
- **Ollama**: The local LLM runner that powers everything
- **Hugging Face Spaces**: The deployment platform for both Spaces