
# Ollama

Ollama provides one-click deployment of local LLMs.

## Install

## Launch Ollama

Decide which LLM you want to deploy (see Ollama's list of supported LLMs), say, **mistral**:

```bash
$ ollama run mistral
```

Or,

```bash
$ docker exec -it ollama ollama run mistral
```
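The `docker exec` form assumes an Ollama container named `ollama` is already running. If it is not, here is a minimal sketch for starting one with the official `ollama/ollama` image; the volume name and host port are assumptions, so adjust them to your setup:

```bash
# Start Ollama in the background, persisting downloaded models in a named
# volume and exposing the default API port 11434 on the host.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```

Once the container is up, `curl http://localhost:11434` should return a short "Ollama is running" message.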

## Use Ollama in RAGFlow

- Go to **Settings > Model Providers > Models to be added > Ollama**.

- Base URL: Enter the base URL where the Ollama service is accessible, e.g., `http://<your-ollama-endpoint-domain>:11434`. A quick connectivity check is shown after this list.

- Use Ollama models.
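Before adding the model, it can help to confirm that the base URL is reachable from the machine (or container) running RAGFlow. A minimal check, assuming the placeholder endpoint above and Ollama's standard REST API:

```bash
# List the models the Ollama server currently has available;
# a JSON response confirms the endpoint and port are reachable.
curl http://<your-ollama-endpoint-domain>:11434/api/tags
```

Note that if RAGFlow itself runs inside Docker, `localhost` refers to the RAGFlow container rather than the host; on many setups an Ollama instance on the host is reached via an address such as `http://host.docker.internal:11434` instead.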