Remove the comments at the end of the environment variable lines in .env file

yangdx
2025-03-29 13:52:29 +08:00
parent be3be54ed4
commit a3ff0534d6
2 changed files with 7 additions and 4 deletions

@@ -55,10 +55,14 @@ SUMMARY_LANGUAGE=English
 # MAX_EMBED_TOKENS=8192
 ### LLM Configuration
-TIMEOUT=150 # Time out in seconds for LLM, None for infinite timeout
+### Time out in seconds for LLM, None for infinite timeout
+TIMEOUT=150
 ### Some models like o1-mini require temperature to be set to 1
 TEMPERATURE=0.5
-MAX_ASYNC=4 # Max concurrency requests of LLM
-MAX_TOKENS=32768 # Max tokens send to LLM (less than context size of the model)
+### Max concurrency requests of LLM
+MAX_ASYNC=4
+### Max tokens send to LLM (less than context size of the model)
+MAX_TOKENS=32768
 ### Ollama example (For local services installed with docker, you can use host.docker.internal as host)
 LLM_BINDING=ollama
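The likely motivation (an inference; the commit title only states the change itself) is that not every consumer of a .env file strips inline comments: a naive key=value parser, and some env-file loaders, keep the trailing text as part of the value, so a numeric setting such as TIMEOUT no longer parses as an integer. A minimal Python sketch of that failure mode follows; parse_env_line is a hypothetical helper for illustration, not LightRAG code:

```python
# Hypothetical, simplified .env reader: everything after the first "=" is
# treated as the value, which is how some env-file loaders behave.
def parse_env_line(line: str):
    line = line.strip()
    if not line or line.startswith("#"):
        return None  # skip blank lines and full-line comments
    key, _, value = line.partition("=")
    return key.strip(), value.strip()

# Old style: the trailing comment becomes part of the value.
_, value = parse_env_line("TIMEOUT=150 # Time out in seconds for LLM")
print(repr(value))   # '150 # Time out in seconds for LLM'
# int(value)         # would raise ValueError

# New style: comment on its own line, the value parses cleanly.
assert parse_env_line("### Time out in seconds for LLM") is None
_, value = parse_env_line("TIMEOUT=150")
print(int(value))    # 150
```

Moving each comment onto its own ###-prefixed line, as the diff above does, keeps the value clean regardless of which loader reads the file.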

@@ -422,7 +422,6 @@ EMBEDDING_BINDING_HOST=http://localhost:11434
 ```
 ## API Endpoints
 All servers (LoLLMs, Ollama, OpenAI and Azure OpenAI) provide the same REST API endpoints for RAG functionality. When API Server is running, visit: