diff --git a/env.example b/env.example
index 16b96ed9..aab943cc 100644
--- a/env.example
+++ b/env.example
@@ -74,7 +74,8 @@ TIMEOUT=240
 TEMPERATURE=0
 ### Max concurrency requests of LLM
 MAX_ASYNC=4
-### Max tokens send to LLM for entity relation summaries (less than context size of the model)
+### MAX_TOKENS: max tokens sent to LLM for entity relation summaries (must be less than the model's context size)
+### MAX_TOKENS: set as the num_ctx option for Ollama by the API server
 MAX_TOKENS=32768
 ### LLM Binding type: openai, ollama, lollms
 LLM_BINDING=openai
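
For context, a minimal sketch of how an API server could read MAX_TOKENS from the environment and forward it as Ollama's num_ctx option. The helper name build_ollama_options and the example payload are illustrative assumptions, not the project's actual code; only the MAX_TOKENS variable and the num_ctx option come from the diff above.

```python
import os

# Assumption: MAX_TOKENS is read from the environment (default mirrors env.example).
MAX_TOKENS = int(os.getenv("MAX_TOKENS", "32768"))


def build_ollama_options() -> dict:
    # Ollama accepts a num_ctx value in its "options" payload; reusing
    # MAX_TOKENS keeps the model's context window aligned with the token
    # budget used for entity/relation summaries.
    return {"num_ctx": MAX_TOKENS}


# Hypothetical request body for Ollama's /api/generate endpoint,
# shown only to illustrate where num_ctx ends up.
payload = {
    "model": os.getenv("LLM_MODEL", "llama3"),
    "prompt": "Summarize the extracted entity relations ...",
    "options": build_ollama_options(),
}
```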