Refine LLM settings in env sample file
@@ -45,17 +45,21 @@
# MAX_EMBED_TOKENS=8192

### LLM Configuration (Use valid host. For local services installed with docker, you can use host.docker.internal)
LLM_BINDING=ollama
LLM_MODEL=mistral-nemo:latest
LLM_BINDING_API_KEY=your_api_key
### Ollama example
LLM_BINDING=ollama
LLM_BINDING_HOST=http://localhost:11434
### OpenAI-compatible example
# LLM_BINDING=openai
# LLM_MODEL=gpt-4o
# LLM_BINDING_HOST=https://api.openai.com/v1
# LLM_BINDING_API_KEY=your_api_key
### lollms example
# LLM_BINDING=lollms
# LLM_MODEL=mistral-nemo:latest
# LLM_BINDING_HOST=http://localhost:9600
# LLM_BINDING_API_KEY=your_api_key

### Embedding Configuration (Use valid host. For local services installed with docker, you can use host.docker.internal)
EMBEDDING_MODEL=bge-m3:latest
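For reference, a minimal sketch of how an application could pick up the settings above using Python's standard os.getenv. The defaults mirror the sample values in this file; the loader itself is illustrative and not the project's actual configuration code.

import os

# Illustrative only: read the LLM and embedding settings defined above,
# falling back to the sample defaults when a variable is not set.
LLM_BINDING = os.getenv("LLM_BINDING", "ollama")                    # ollama, openai, or lollms
LLM_MODEL = os.getenv("LLM_MODEL", "mistral-nemo:latest")
LLM_BINDING_HOST = os.getenv("LLM_BINDING_HOST", "http://localhost:11434")
LLM_BINDING_API_KEY = os.getenv("LLM_BINDING_API_KEY")              # may stay unset for local services
EMBEDDING_MODEL = os.getenv("EMBEDDING_MODEL", "bge-m3:latest")

print(f"LLM: {LLM_BINDING}/{LLM_MODEL} at {LLM_BINDING_HOST}, embeddings: {EMBEDDING_MODEL}")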