KevinHuSh committed
Commit 982788f · Parent(s): 640c593
Refine Ollama docs (#267)
### What problem does this PR solve?

Issue link: #221

### Type of change

- [x] Documentation Update
- README.md +1 -1
- README_ja.md +1 -1
- README_zh.md +1 -1
- docs/ollama.md +2 -2
README.md CHANGED
@@ -65,7 +65,7 @@

  - CPU >= 2 cores
  - RAM >= 8 GB
- - Docker
+ - Docker >= 24.0.0
  > If you have not installed Docker on your local machine (Windows, Mac, or Linux), see [Install Docker Engine](https://docs.docker.com/engine/install/).

  ### 🚀 Start up the server
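With the prerequisite tightened to Docker >= 24.0.0, it may help to confirm that the locally installed engine meets it before starting the server. A minimal check, not part of this PR (exact output varies by platform):

```bash
# Print the installed Docker version; it should report 24.0.0 or later.
docker --version

# Optionally query the daemon (server) version directly.
docker version --format '{{.Server.Version}}'
```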
README_ja.md CHANGED
@@ -65,7 +65,7 @@

  - CPU >= 2 cores
  - RAM >= 8 GB
- - Docker
+ - Docker >= 24.0.0
  > ローカルマシン(Windows、Mac、または Linux)に Docker をインストールしていない場合は、[Docker Engine のインストール](https://docs.docker.com/engine/install/) を参照してください。

  ### 🚀 サーバーを起動
README_zh.md CHANGED
@@ -65,7 +65,7 @@

  - CPU >= 2 核
  - RAM >= 8 GB
- - Docker
+ - Docker >= 24.0.0
  > 如果你并没有在本机安装 Docker(Windows、Mac,或者 Linux), 可以参考文档 [Install Docker Engine](https://docs.docker.com/engine/install/) 自行安装。

  ### 🚀 启动服务器
docs/ollama.md CHANGED
@@ -28,7 +28,7 @@ $ docker exec -it ollama ollama run mistral
  - Go to 'Settings > Model Providers > Models to be added > Ollama'.

  <div align="center" style="margin-top:20px;margin-bottom:20px;">
- <img src="https://github.com/infiniflow/ragflow/assets/12318111/
+ <img src="https://github.com/infiniflow/ragflow/assets/12318111/a9df198a-226d-4f30-b8d7-829f00256d46" width="1300"/>
  </div>

  > Base URL: Enter the base URL where the Ollama service is accessible, like, http://<your-ollama-endpoint-domain>:11434
@@ -36,5 +36,5 @@ $ docker exec -it ollama ollama run mistral
  - Use Ollama Models.

  <div align="center" style="margin-top:20px;margin-bottom:20px;">
- <img src="https://github.com/infiniflow/ragflow/assets/12318111/
+ <img src="https://github.com/infiniflow/ragflow/assets/12318111/60ff384e-5013-41ff-a573-9a543d237fd3" width="530"/>
  </div>
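Since the hunk above documents the Base URL field, a quick reachability check against that endpoint can catch connectivity problems before it is entered under 'Settings > Model Providers'. A minimal sketch, not part of this PR, keeping the doc's `<your-ollama-endpoint-domain>` placeholder; Ollama's HTTP API serves `/api/tags`, which lists locally pulled models such as the mistral model from the hunk header:

```bash
# Replace the placeholder with the host running the Ollama container.
# A JSON list of local models means the Base URL is reachable and usable.
curl http://<your-ollama-endpoint-domain>:11434/api/tags
```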