# 🧠 OfflineAI 1.01 – NemoMix 12B (macOS Edition)

This repository contains the pre-packaged `.gguf` model and launch script for **OfflineAI 1.01**, an educational, fully offline AI assistant designed to run on macOS without any internet access.

## 💡 About the Project

**OfflineAI.Online** is a community project by David Káninský, created to demonstrate the capabilities of modern open-source language models in an offline and privacy-focused environment.

This version uses the `NemoMix-Unleashed-12B` model (quantized to `Q5_K_M`) and integrates with `llama.cpp`, allowing it to run on machines with ~16 GB RAM.
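
The bundled launcher takes care of this for you, but as a rough, illustrative sketch, running a `Q5_K_M` `.gguf` with llama.cpp's command-line tool looks something like the following. The binary name (`llama-cli` in recent builds, `main` in older ones) and the exact flags vary between llama.cpp versions, so treat this as an approximation rather than the literal command used by `Offline_AI.command`:

```bash
# Illustrative only: run from the folder that contains both the llama.cpp
# binary and the model file. Flag names may differ between llama.cpp versions.

MODEL="OfflineAI-Nemo12B-Q5_K_M.gguf"   # the quantized model shipped in this package

# -m    path to the .gguf model
# -c    context window size in tokens
# -ngl  number of layers to offload to the GPU (Metal on Apple Silicon)
# -i    interactive chat mode in the Terminal
./llama-cli -m "$MODEL" -c 4096 -ngl 99 --color -i
```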

## 📁 What’s included

- `OfflineAI-Nemo12B-Q5_K_M.gguf` – The model (quantized)
- `Offline_AI.command` – A one-click launcher script for macOS Terminal
- `info.txt` – Legal and usage notice

> Simply unzip the folder, double-click `Offline_AI.command`, and chat offline.
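
On macOS, a `.command` file is simply a shell script that Terminal executes when it is double-clicked. The sketch below is a hypothetical, simplified launcher in that spirit; it is **not** the actual contents of `Offline_AI.command`, and the binary name and flags are assumptions about a typical llama.cpp setup:

```bash
#!/bin/bash
# Hypothetical, simplified launcher; not the script shipped in this package.

# Change into the folder this script lives in, so relative paths resolve
# even when the script is started by double-clicking it in Finder.
cd "$(dirname "$0")" || exit 1

MODEL="OfflineAI-Nemo12B-Q5_K_M.gguf"

if [ ! -f "$MODEL" ]; then
  echo "Model file $MODEL not found next to the launcher." >&2
  exit 1
fi

# Start an interactive, fully offline chat session via llama.cpp.
./llama-cli -m "$MODEL" -c 4096 -ngl 99 --color -i
```

Note that Gatekeeper may block an unsigned `.command` script the first time it is opened after download; right-clicking the file and choosing **Open** (or clearing the quarantine flag with `xattr -d com.apple.quarantine Offline_AI.command`) is the usual workaround.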

## 🧪 Requirements

- macOS (tested on Ventura & Sonoma)
- ~16 GB RAM minimum (a quick way to check is sketched below)
- No internet connection required after download
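
If you are unsure whether a machine meets these requirements, the standard macOS commands below print the OS version and the installed RAM. They are shown purely as a convenience and are not part of the package:

```bash
# Print the macOS version (13.x = Ventura, 14.x = Sonoma).
sw_vers -productVersion

# hw.memsize reports installed memory in bytes; convert it to GB.
echo "$(($(sysctl -n hw.memsize) / 1024 / 1024 / 1024)) GB RAM"
```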

## ⚠️ Legal Notice

This model is intended for **educational and demonstration purposes only**.
It is based on the open-source NemoMix 12B model (MIT license) and is distributed in accordance with its open-source licensing terms.

OfflineAI does **not** provide legal, financial, or medical advice. Any use beyond demonstration purposes is at your own risk.

## 🙏 Credits

- Based on [NemoMix-Unleashed-12B](https://huggingface.co/bartowski/NemoMix-Unleashed-12B-GGUF) by [@bartowski](https://huggingface.co/bartowski)
- Powered by [llama.cpp](https://github.com/ggerganov/llama.cpp)
- Offline integration and launcher: [OfflineAI.Online](https://offlineai.online)

## 💬 Support or feedback?

You can find updates and more details at [OfflineAI.Online](https://offlineai.online).