---
title: Banking Dispute Resolution System
emoji: 🤖
colorFrom: green
colorTo: gray
pinned: false
sdk: docker
---

# AI-Powered Dispute Resolution
## 📋 Table of Contents
- Overview
- Key Features
- Architecture Overview
- Technologies Used
- Prerequisites
- Installation
- Configuration
- Usage
- API Documentation
- Testing
- Deployment
- Contributing
- License
- Acknowledgments
- Contact/Support
- Disclaimer
## 🌟 Overview
The AI-Powered Dispute Resolution system is designed to streamline the process of analyzing and resolving disputes using advanced AI technologies. This application leverages large language models to analyze dispute details, identify key issues, suggest potential solutions, and provide legal references. The system is primarily designed to operate as a standalone application on Hugging Face Spaces, combining a user-friendly Streamlit frontend with a robust FastAPI backend.
## ✨ Key Features

### AI-Powered Analysis

- **Intelligent Dispute Processing:** Analyzes dispute details using the Gemini API to extract key information
- **Solution Recommendation:** Suggests potential resolutions based on similar historical cases
- **Legal Reference Integration:** Provides relevant legal references and precedents
- **Continuous Learning:** Improves over time by incorporating feedback from resolved cases

### Frontend Capabilities

- **Intuitive Interface:** User-friendly Streamlit interface for submitting and tracking disputes
- **Dashboard:** Real-time visualization of dispute status and analytics
- **Document Upload:** Support for uploading relevant documents in various formats
- **Explainable AI:** Clear explanations of how the AI reached its conclusions

### Backend Robustness

- **Secure API:** FastAPI backend ensuring secure data processing
- **Efficient Data Management:** Optimized database schema for quick retrieval and analysis
- **Scalable Architecture:** Designed to handle increasing volumes of disputes
- **Comprehensive Logging:** Detailed activity logs for auditing purposes

### Deployment Strategy

- **Hugging Face Integration:** Seamless deployment on Hugging Face Spaces
- **Single-Container Solution:** Entire application packaged in one Docker container
- **Environment Variable Configuration:** Easy configuration through environment variables
## 🏗️ Architecture Overview

### High-Level System Diagram

```mermaid
graph TD
    A[User] -->|Submits Dispute| B[Streamlit Frontend]
    B -->|API Request| C[FastAPI Backend]
    C -->|Query| D[Database]
    C -->|Analysis Request| E[Gemini API]
    E -->|Analysis Results| C
    C -->|Response| B
    B -->|Display Results| A

    style A fill:#f9f,stroke:#333,stroke-width:2px
    style B fill:#bbf,stroke:#333,stroke-width:2px
    style C fill:#bfb,stroke:#333,stroke-width:2px
    style D fill:#fbb,stroke:#333,stroke-width:2px
    style E fill:#bff,stroke:#333,stroke-width:2px
```
The diagram above illustrates the flow of data through the system, from user input to AI analysis and result presentation.
### Component Interaction Diagram

```mermaid
sequenceDiagram
    participant User
    participant Frontend as Streamlit Frontend
    participant Backend as FastAPI Backend
    participant DB as Database
    participant AI as Gemini API

    User->>Frontend: Submit dispute details
    Frontend->>Backend: POST /api/disputes
    Backend->>DB: Store dispute data
    Backend->>AI: Request analysis
    AI->>Backend: Return analysis results
    Backend->>DB: Update with analysis
    Backend->>Frontend: Return dispute ID & status
    Frontend->>User: Display confirmation
    User->>Frontend: View analysis results
    Frontend->>Backend: GET /api/disputes/{id}
    Backend->>DB: Retrieve dispute data
    DB->>Backend: Return dispute data
    Backend->>Frontend: Return complete dispute info
    Frontend->>User: Display analysis results
```
This sequence diagram shows the typical flow when a user submits a dispute for analysis.
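As a minimal sketch of the submit/analyze/retrieve flow above (not the project's actual code): the store is a plain dict standing in for the database, and the Gemini call is replaced by a stub, so only the shape of the flow is shown.

```python
import uuid

# In-memory stand-ins for the database and the Gemini API call.
# The real system uses FastAPI, a persistent database, and the Gemini API;
# every name here is illustrative only.
DISPUTES = {}

def analyze_stub(description: str) -> dict:
    """Stand-in for the Backend -> Gemini API analysis request."""
    return {
        "key_issues": [description[:40]],
        "suggested_solutions": [],
        "legal_references": [],
    }

def submit_dispute(title: str, description: str) -> str:
    """Mirror of POST /api/disputes in the diagram: store, analyze, update."""
    dispute_id = str(uuid.uuid4())
    DISPUTES[dispute_id] = {"title": title, "description": description, "status": "pending"}
    analysis = analyze_stub(description)                               # Backend -> AI
    DISPUTES[dispute_id].update(analysis=analysis, status="analyzed")  # Backend -> DB
    return dispute_id                                                  # Backend -> Frontend

def get_dispute(dispute_id: str) -> dict:
    """Mirror of GET /api/disputes/{id}."""
    return DISPUTES[dispute_id]
```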
### Deployment Architecture Diagram

```mermaid
graph TD
    A[Hugging Face Space] -->|Container| B[Docker Container]
    B -->|Port 8000| C[FastAPI Backend]
    B -->|Port 8501| D[Streamlit Frontend]
    C -->|Internal Communication| D
    C -->|Database| E[SQLite]

    style A fill:#f9f,stroke:#333,stroke-width:2px
    style B fill:#bbf,stroke:#333,stroke-width:2px
    style C fill:#bfb,stroke:#333,stroke-width:2px
    style D fill:#fbb,stroke:#333,stroke-width:2px
    style E fill:#bff,stroke:#333,stroke-width:2px
```
The deployment architecture shows how the application is packaged in a single Docker container for deployment on Hugging Face Spaces.
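A Dockerfile for this layout might look roughly like the sketch below. This is purely illustrative and is not the repository's actual Dockerfile: the file paths (`requirements.txt`, `app.main:app`, `frontend/app.py`) are assumptions chosen only to match the two ports in the diagram above.

```dockerfile
# Illustrative sketch only - the repository's actual Dockerfile may differ.
# Assumed paths: requirements.txt at the repo root, FastAPI app at
# app/main.py, Streamlit entry point at frontend/app.py.
FROM python:3.9-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# FastAPI backend (8000) and Streamlit frontend (8501)
EXPOSE 8000 8501

# Run both processes in the single container; the frontend talks to the
# backend over localhost:8000 as internal communication.
CMD uvicorn app.main:app --host 0.0.0.0 --port 8000 & \
    streamlit run frontend/app.py --server.port 8501 --server.address 0.0.0.0
```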
### Database Schema Diagram

```mermaid
erDiagram
    DISPUTES {
        string id PK
        string title
        string description
        datetime created_at
        string status
        string user_id FK
    }
    ANALYSIS {
        string id PK
        string dispute_id FK
        string key_issues
        string suggested_solutions
        string legal_references
        datetime created_at
    }
    USERS {
        string id PK
        string username
        string email
        datetime created_at
    }
    FEEDBACK {
        string id PK
        string analysis_id FK
        int rating
        string comments
        datetime created_at
    }

    DISPUTES ||--o{ ANALYSIS : has
    USERS ||--o{ DISPUTES : submits
    ANALYSIS ||--o{ FEEDBACK : receives
```
This ER diagram shows the database structure used to store dispute information, analysis results, user data, and feedback.
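For reference, the schema above translates into roughly the following SQL, shown here with the standard-library `sqlite3` module. The real application presumably creates its tables through its own code or migrations, so treat these `CREATE TABLE` statements as an illustrative transcription of the diagram, not the authoritative DDL.

```python
import sqlite3

# Transcription of the ER diagram above into SQLite DDL (illustrative).
SCHEMA = """
CREATE TABLE users (
    id TEXT PRIMARY KEY,
    username TEXT,
    email TEXT,
    created_at TEXT
);
CREATE TABLE disputes (
    id TEXT PRIMARY KEY,
    title TEXT,
    description TEXT,
    created_at TEXT,
    status TEXT,
    user_id TEXT REFERENCES users(id)
);
CREATE TABLE analysis (
    id TEXT PRIMARY KEY,
    dispute_id TEXT REFERENCES disputes(id),
    key_issues TEXT,
    suggested_solutions TEXT,
    legal_references TEXT,
    created_at TEXT
);
CREATE TABLE feedback (
    id TEXT PRIMARY KEY,
    analysis_id TEXT REFERENCES analysis(id),
    rating INTEGER,
    comments TEXT,
    created_at TEXT
);
"""

# In-memory database just to show the schema is well-formed.
conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
```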
### User Flow Diagram

```mermaid
graph TD
    A[Start] --> B[Create Account/Login]
    B --> C[Submit New Dispute]
    C --> D[Upload Supporting Documents]
    D --> E[View Initial AI Analysis]
    E --> F{Satisfied with Analysis?}
    F -->|Yes| G[Accept Recommendations]
    F -->|No| H[Request Detailed Analysis]
    H --> I[Review Detailed Analysis]
    I --> J{Accept Solution?}
    J -->|Yes| K[Implement Solution]
    J -->|No| L[Provide Feedback]
    L --> M[Receive Refined Analysis]
    M --> J
    G --> N[Provide Feedback on Process]
    K --> N
    N --> O[End]

    style A fill:#f9f,stroke:#333,stroke-width:2px
    style F fill:#bbf,stroke:#333,stroke-width:2px
    style J fill:#bbf,stroke:#333,stroke-width:2px
    style O fill:#f9f,stroke:#333,stroke-width:2px
```
This diagram illustrates the typical user journey through the dispute resolution process.
## 🛠️ Technologies Used

- **Python 3.9+** - Programming language
- **FastAPI** - Backend API framework
- **Streamlit** - Frontend framework
- **LangChain** - AI integration framework
- **Gemini API** - AI language model
- **SQLite** - Database
- **Docker** - Containerization
- **Hugging Face Spaces** - Deployment platform
## 📋 Prerequisites
To run this application, you need:
- Python 3.9 or higher
- Docker and Docker Compose
- Gemini API key
- Hugging Face account (for deployment)
## 🔧 Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/DebopamParam/AI-Powered_Dispute_Resolution.git
   cd AI-Powered_Dispute_Resolution
   ```

2. Create a `.env` file in the root directory with the following variables:

   ```env
   GEMINI_API_KEY=your_gemini_api_key
   DATABASE_URL=sqlite:///./app.db
   DEBUG=False
   ```

3. Build the Docker image:

   ```bash
   docker build -t dispute-resolution-app .
   ```
## ⚙️ Configuration

### Environment Variables

The application can be configured using the following environment variables:

- `GEMINI_API_KEY` - Your Gemini API key
- `DATABASE_URL` - Database connection string (defaults to SQLite)
- `DEBUG` - Enable debug mode (True/False)
- `LOG_LEVEL` - Logging level (INFO, DEBUG, ERROR)
- `MAX_UPLOAD_SIZE` - Maximum file upload size in MB

### Optional Configuration Files

You can also create the following configuration files:

- `config/app_config.json` - Application-specific settings
- `config/model_config.json` - AI model parameters
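As an illustrative sketch (not the project's actual configuration loader), the environment variables above could be read with sensible defaults like this; the helper name and the default of 10 MB for `MAX_UPLOAD_SIZE` are assumptions:

```python
import os

def load_config(env=os.environ):
    """Hypothetical config loader mirroring the variables listed above."""
    return {
        "gemini_api_key": env.get("GEMINI_API_KEY", ""),
        # Matches the default shown in the Installation section's .env example
        "database_url": env.get("DATABASE_URL", "sqlite:///./app.db"),
        "debug": env.get("DEBUG", "False").lower() == "true",
        "log_level": env.get("LOG_LEVEL", "INFO"),
        # Maximum upload size is expressed in megabytes (default assumed)
        "max_upload_size_mb": int(env.get("MAX_UPLOAD_SIZE", "10")),
    }
```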
## 🚀 Usage

### Running Locally

1. Start the application using Docker Compose:

   ```bash
   docker-compose up
   ```

2. Navigate to http://localhost:8501 in your web browser

### Quickstart Guide

1. Create an account or log in
2. Click on "Create New Dispute"
3. Fill in the dispute details:
   - Title
   - Description
   - Parties involved
   - Relevant dates
4. Upload any supporting documents
5. Click "Submit for Analysis"
6. Wait for the AI to analyze the dispute
7. Review the analysis results, which include:
   - Key issues identified
   - Suggested solutions
   - Legal references
8. Provide feedback on the analysis
9. Implement the recommended solutions
## 📚 API Documentation

The API documentation is automatically generated and available at `/docs` when running the application. It provides:
- Complete list of endpoints
- Request and response schemas
- Authentication requirements
- Example requests
- Interactive testing capabilities
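As a hedged illustration of calling the `POST /api/disputes` endpoint shown in the sequence diagrams, a client request could be built with the standard library as below. The field names in the payload are assumptions, not taken from the project; consult the live `/docs` page for the real schema.

```python
import json
import urllib.request

API_BASE = "http://localhost:8000"  # backend port from the deployment diagram

def build_submit_request(base_url: str, title: str, description: str):
    """Build (but do not send) a POST /api/disputes request.

    The payload fields here are illustrative; the actual schema is
    documented at /docs when the application is running.
    """
    payload = json.dumps({"title": title, "description": description}).encode()
    return urllib.request.Request(
        f"{base_url}/api/disputes",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_submit_request(API_BASE, "Duplicate charge", "Charged twice on 2024-03-01")
# Sending it (requires the app to be running):
#   with urllib.request.urlopen(req) as resp: print(resp.read())
```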
## 🧪 Testing

To run the tests:

```bash
# Run all tests
python -m pytest

# Run a specific test file
python -m pytest tests/test_api.py

# Run with a coverage report
python -m pytest --cov=app tests/
```
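A minimal pytest-style test might look like the sketch below. The `validate_dispute` helper is hypothetical, invented here only to demonstrate the testing workflow; it is not a function from the repository.

```python
# Hypothetical helper and tests, shown only to illustrate the pytest workflow.
def validate_dispute(payload: dict) -> list:
    """Return a list of validation errors for a dispute submission."""
    errors = []
    if not payload.get("title", "").strip():
        errors.append("title is required")
    if not payload.get("description", "").strip():
        errors.append("description is required")
    return errors

def test_valid_dispute_passes():
    assert validate_dispute({"title": "Fee dispute", "description": "details"}) == []

def test_missing_title_is_reported():
    assert "title is required" in validate_dispute({"description": "x"})
```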
## 🚀 Deployment

### Deploying to Hugging Face Spaces

1. Fork this repository
2. Create a new Space on Hugging Face:
   - Type: Docker
   - Repository: your forked repository
3. Add the following secret to your Space:
   - `GEMINI_API_KEY` - Your Gemini API key
4. The Space will automatically build and deploy the application

### Deploying Locally with Docker Compose

1. Make sure Docker and Docker Compose are installed
2. Create the `.env` file as described in the Installation section
3. Run the following command:

   ```bash
   docker-compose up -d
   ```

4. Access the application at http://localhost:8501
## 🤝 Contributing

Contributions are welcome! Please follow these steps:

1. Fork the repository
2. Create a new branch (`git checkout -b feature/amazing-feature`)
3. Make your changes
4. Commit your changes (`git commit -m 'Add some amazing feature'`)
5. Push to the branch (`git push origin feature/amazing-feature`)
6. Open a Pull Request

Please make sure your code follows the project's coding standards and includes appropriate tests.
## 📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🙏 Acknowledgments

- **FastAPI** for the efficient API framework
- **Streamlit** for the intuitive frontend framework
- **LangChain** for simplifying AI integration
- **Gemini API** for the powerful language model
- **Hugging Face Spaces** for the deployment platform
## 📞 Contact/Support
- For bug reports and feature requests, please open an issue
## ⚠️ Disclaimer
This project is a prototype and should be used with caution. The AI-generated recommendations should not be considered legal advice. Always consult with a qualified legal professional for legal matters.