---
title: OpenWebUI - Ollama Chat
emoji: πŸ€–
colorFrom: green
colorTo: blue
sdk: docker
app_port: 7860
---

# πŸ€– OpenWebUI - Ollama Chat

A beautiful, modern chat interface for Ollama models, deployed as a Hugging Face Space. This Space provides a full-featured web UI that connects to your Ollama API Space for an interactive chat experience.

## 🌟 Features

  • Beautiful Chat Interface: Modern, responsive design with gradient backgrounds and smooth animations
  • Model Selection: Choose from any available Ollama model
  • Parameter Control: Adjust temperature, max tokens, and other generation parameters
  • Real-time Chat: Interactive chat experience with typing indicators
  • Mobile Responsive: Works perfectly on desktop and mobile devices
  • Ollama Integration: Seamlessly connects to your Ollama API Space
  • Health Monitoring: Built-in health checks and status monitoring

## πŸš€ Quick Start

### 1. Deploy to Hugging Face Spaces

1. Fork this repository or create a new Space
2. Upload these files to your Space
3. Set the following environment variables in your Space settings:
   - `OLLAMA_API_URL`: URL of your Ollama Space (e.g., `https://your-ollama-space.hf.space`)
   - `DEFAULT_MODEL`: Default model to use (e.g., `llama2`)
   - `MAX_TOKENS`: Maximum tokens for generation (default: `2048`)
   - `TEMPERATURE`: Default temperature (default: `0.7`)

### 2. Local Development

```bash
# Clone the repository
git clone <your-repo-url>
cd openwebui-space

# Install dependencies
pip install -r requirements.txt

# Set environment variables
export OLLAMA_API_URL=https://your-ollama-space.hf.space
export DEFAULT_MODEL=llama2

# Run the application
python app.py
```

## πŸ”§ Configuration

### Environment Variables

- `OLLAMA_API_URL` (required): URL of your Ollama Space (e.g., `https://your-ollama-space.hf.space`)
- `DEFAULT_MODEL`: Default model to use (default: `llama2`)
- `MAX_TOKENS`: Maximum tokens for generation (default: `2048`)
- `TEMPERATURE`: Default temperature setting (default: `0.7`)

### Prerequisites

Before using this OpenWebUI Space, you need:

1. **Ollama Space**: A deployed Ollama API Space (see the `ollama-space` directory)
2. **Ollama Instance**: A running Ollama instance that your Ollama Space can connect to
3. **Network Access**: The OpenWebUI Space must be able to reach your Ollama Space

## πŸ“‘ API Endpoints

### GET /

Main chat interface - the beautiful web UI.

### POST /api/chat

Chat API endpoint for programmatic access.

**Request Body:**

```json
{
  "message": "Hello, how are you?",
  "model": "llama2",
  "temperature": 0.7,
  "max_tokens": 2048
}
```

**Response:**

```json
{
  "status": "success",
  "response": "Hello! I'm doing well, thank you for asking...",
  "model": "llama2",
  "usage": {
    "prompt_tokens": 7,
    "completion_tokens": 15,
    "total_tokens": 22
  }
}
```
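For programmatic access, a request could look like this (a hedged sketch using `requests`; `SPACE_URL` is a placeholder for your own deployment):

```python
import requests

SPACE_URL = "https://your-openwebui-space.hf.space"  # placeholder for your deployed Space

resp = requests.post(
    f"{SPACE_URL}/api/chat",
    json={
        "message": "Hello, how are you?",
        "model": "llama2",
        "temperature": 0.7,
        "max_tokens": 2048,
    },
    timeout=120,  # large models can take a while to respond
)
resp.raise_for_status()
print(resp.json()["response"])
```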

### GET /api/models

Get available models from the connected Ollama Space.
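For example, to fetch the model list from a script (same placeholder convention as above):

```python
import requests

SPACE_URL = "https://your-openwebui-space.hf.space"  # placeholder

models = requests.get(f"{SPACE_URL}/api/models", timeout=30)
models.raise_for_status()
print(models.json())  # the exact response shape depends on app.py
```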

### GET /health

Health check endpoint that also checks the Ollama Space connection.

## 🌐 Integration with Ollama Space

This OpenWebUI Space is designed to work seamlessly with the Ollama Space:

1. **API Communication**: The OpenWebUI Space communicates with your Ollama Space via HTTP API calls
2. **Model Discovery**: Automatically discovers and lists available models from your Ollama Space
3. **Text Generation**: Sends generation requests to your Ollama Space and displays the responses
4. **Health Monitoring**: Monitors the health of both the OpenWebUI Space and the Ollama Space

### Architecture

```
User β†’ OpenWebUI Space β†’ Ollama Space β†’ Ollama Instance
```
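To illustrate the flow, here is a minimal sketch of how the UI layer might forward a chat request to the Ollama Space, assuming the Ollama Space exposes Ollama's standard `/api/generate` endpoint (the actual code in `app.py` may differ):

```python
import os
import requests

OLLAMA_API_URL = os.environ["OLLAMA_API_URL"]

def generate(message: str, model: str = "llama2", temperature: float = 0.7) -> str:
    """Forward a single prompt to the Ollama Space and return the response text."""
    resp = requests.post(
        f"{OLLAMA_API_URL}/api/generate",
        json={
            "model": model,
            "prompt": message,
            "stream": False,  # request one complete response instead of a stream
            "options": {"temperature": temperature},
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```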

## 🎨 UI Features

### Chat Interface

- **Message Bubbles**: Distinct styling for user and AI messages
- **Avatars**: Visual indicators for the message sender
- **Auto-scroll**: Automatically scrolls to new messages
- **Typing Indicators**: Shows when the AI is generating a response

### Controls

- **Model Selection**: Dropdown to choose from available models
- **Temperature Slider**: Adjust creativity/randomness (0.0 to 2.0)
- **Max Tokens**: Set the maximum response length
- **Real-time Updates**: Parameter changes are applied immediately

### Responsive Design

- **Mobile Optimized**: Touch-friendly interface for mobile devices
- **Adaptive Layout**: Automatically adjusts to different screen sizes
- **Modern CSS**: Uses CSS Grid and Flexbox for optimal layouts

## 🐳 Docker Support

The Space includes a Dockerfile for containerized deployment:

```bash
# Build the image
docker build -t openwebui-space .

# Run the container
docker run -p 7860:7860 \
  -e OLLAMA_API_URL=https://your-ollama-space.hf.space \
  -e DEFAULT_MODEL=llama2 \
  openwebui-space
```

## πŸ”’ Security Considerations

- **Public Access**: The Space is publicly accessible; consider adding authentication for production use
- **API Communication**: All communication with the Ollama Space is over HTTPS (when using HF Spaces)
- **Input Validation**: User inputs are validated before being sent to the Ollama Space (a sketch of such checks follows this list)

## 🚨 Troubleshooting

### Common Issues

1. **Cannot connect to Ollama Space**: Check the `OLLAMA_API_URL` environment variable
2. **No models available**: Ensure your Ollama Space is running and has models available
3. **Generation errors**: Check the Ollama Space logs for detailed error messages
4. **Slow responses**: Large models may take time to generate responses

### Health Checks

Use the `/health` endpoint to monitor:

  • OpenWebUI Space status
  • Connection to Ollama Space
  • Ollama Space health status

### Debug Mode

For local development, you can enable debug mode by setting `debug=True` in the Flask app.
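Concretely, that means changing the `app.run(...)` call in `app.py` to something like the following (the actual host/port arguments in the file may differ; the port matches `app_port` in the Space metadata):

```python
if __name__ == "__main__":
    # debug=True enables auto-reload and the interactive debugger;
    # never leave it enabled in a deployed Space.
    app.run(host="0.0.0.0", port=7860, debug=True)
```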

## πŸ“± Mobile Experience

The interface is fully optimized for mobile devices:

  • Touch-friendly controls
  • Responsive design that adapts to screen size
  • Optimized for portrait and landscape orientations
  • Fast loading and smooth scrolling

## 🎯 Use Cases

- **Personal AI Assistant**: Chat with your local models
- **Development Testing**: Test model responses and parameters
- **Demo and Showcase**: Present your Ollama setup to others
- **API Gateway**: Use the chat interface as a frontend for your Ollama API

πŸ“ License

This project is open source and available under the MIT License.

## 🀝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## πŸ“ž Support

If you encounter any issues or have questions, please open an issue on the repository.

## πŸ”— Related Projects

- **Ollama Space**: The backend API Space that this UI connects to
- **Ollama**: The local LLM runner that powers everything
- **Hugging Face Spaces**: The deployment platform for both Spaces