fix: remove outdated Ollama model config notes

- Remove legacy configuration instructions for Open WebUI tasks
- The Ollama API can now properly bypass conversation metadata generation
@@ -94,8 +94,6 @@ For example, chat message "/mix 唐僧有几个徒弟" will trigger a mix mode q
After starting the lightrag-server, you can add an Ollama-type connection in the Open WebUI admin panel. A model named lightrag:latest will then appear in Open WebUI's model management interface, and users can send queries to LightRAG through the chat interface.
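As a sketch, a chat request sent through that connection follows the standard Ollama chat request shape, with lightrag:latest as the model name. The example query text below is illustrative, not taken from LightRAG's documentation:

```python
import json

# Build an Ollama-style chat request body targeting the lightrag:latest
# model exposed by lightrag-server. Only the payload is constructed here;
# the host, port, and endpoint path of your deployment are not assumed.
payload = {
    "model": "lightrag:latest",
    "messages": [
        {"role": "user", "content": "/mix How many disciples does Tang Seng have?"}
    ],
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

Open WebUI generates this request for you; the sketch only shows what crosses the wire to the Ollama-compatible API.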
To prevent Open WebUI from using LightRAG when generating conversation titles, go to Admin Panel > Interface > Set Task Model and change both Local Models and External Models to any option except "Current Model".
## Configuration
LightRAG can be configured using either command-line arguments or environment variables. When both are provided, command-line arguments take precedence over environment variables.
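The precedence rule can be sketched as follows. The `--port` flag, `PORT` variable, and default value here are placeholders for illustration, not LightRAG's actual option names:

```python
import argparse

# Illustrative sketch of CLI-over-environment precedence; the flag name,
# variable name, and default below are placeholders, not LightRAG's real
# configuration options.
def resolve_port(argv, environ):
    parser = argparse.ArgumentParser()
    parser.add_argument("--port", type=int, default=None)
    args = parser.parse_args(argv)
    if args.port is not None:          # command-line argument wins
        return args.port
    if "PORT" in environ:              # else fall back to the env var
        return int(environ["PORT"])
    return 9621                        # else use a built-in default

print(resolve_port(["--port", "8080"], {"PORT": "9000"}))  # CLI wins
print(resolve_port([], {"PORT": "9000"}))                  # env var used
print(resolve_port([], {}))                                # default
```

The three calls resolve to 8080, 9000, and 9621 respectively, showing each fallback level in turn.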