added some explanation to the document

Saifeddine ALOUI
2025-01-17 02:03:02 +01:00
parent 52ca5ea6aa
commit 58f1058198


@@ -103,13 +103,15 @@ data/
 1. Using with Ollama:
 ```env
 LLM_BINDING=ollama
-LLM_BINDING_HOST=http://localhost:11434
+LLM_BINDING_HOST=http://host.docker.internal:11434
 LLM_MODEL=mistral
 EMBEDDING_BINDING=ollama
-EMBEDDING_BINDING_HOST=http://localhost:11434
+EMBEDDING_BINDING_HOST=http://host.docker.internal:11434
 EMBEDDING_MODEL=bge-m3
 ```
+You can't reach services running on the host via localhost from inside a Docker container, so use host.docker.internal instead; it is defined in the docker-compose file and lets the container reach services running on the host.
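For context, a minimal sketch of how host.docker.internal is typically made to resolve from a compose service; the service name, image, and env file path below are placeholders rather than values taken from this repository, and on Docker Desktop (Windows/macOS) the name resolves even without the extra_hosts entry:

```yaml
# Hypothetical compose service; adjust the name, image, and env file to your setup.
services:
  lightrag:
    image: lightrag:latest        # placeholder image name
    env_file:
      - .env                      # holds the LLM_BINDING_* / EMBEDDING_* settings above
    extra_hosts:
      # On Linux, Docker Engine 20.10+ maps host.docker.internal to the
      # host's gateway via the special "host-gateway" value; on Docker
      # Desktop the name already resolves to the host automatically.
      - "host.docker.internal:host-gateway"
```

With that mapping in place, an Ollama instance listening on the host at port 11434 is reachable from inside the container through the LLM_BINDING_HOST and EMBEDDING_BINDING_HOST values shown above.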
 2. Using with OpenAI:
 ```env
 LLM_BINDING=openai