yangdx
committed on
Commit · 730231e
1 Parent(s): 20ad069
fix: remove outdated Ollama model config notes
- Remove legacy configuration instructions for Open WebUI tasks
- Ollama API can properly bypass conversation metadata generation
- lightrag/api/README.md +0 -2
lightrag/api/README.md
CHANGED
@@ -94,8 +94,6 @@ For example, chat message "/mix 唐僧有几个徒弟" will trigger a mix mode q
 
 After starting the lightrag-server, you can add an Ollama-type connection in the Open WebUI admin panel. A model named lightrag:latest will then appear in Open WebUI's model management interface. Users can then send queries to LightRAG through the chat interface.
 
-To prevent Open WebUI from using LightRAG when generating conversation titles, go to Admin Panel > Interface > Set Task Model and change both Local Models and External Models to any option except "Current Model".
-
 ## Configuration
 
 LightRAG can be configured using either command-line arguments or environment variables. When both are provided, command-line arguments take precedence over environment variables.
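The precedence rule in the diff above (command-line arguments override environment variables, which override defaults) can be sketched as follows. This is a minimal illustration, not LightRAG's actual implementation; the `--port` flag and `LIGHTRAG_PORT` variable names here are hypothetical.

```python
import argparse
import os

def load_setting(cli_value, env_var, default):
    """Resolve one setting: CLI value wins, then the environment variable, then the default."""
    if cli_value is not None:
        return cli_value
    return os.environ.get(env_var, default)

parser = argparse.ArgumentParser()
parser.add_argument("--port")            # hypothetical flag, for illustration only
args = parser.parse_args([])             # simulate a launch with no CLI arguments

os.environ["LIGHTRAG_PORT"] = "9621"     # hypothetical env var name
port = load_setting(args.port, "LIGHTRAG_PORT", "8080")
print(port)  # prints "9621": the env var applies because no CLI value was given
```

Had the process been started with `--port 7000`, `load_setting` would return `"7000"` regardless of the environment variable, matching the precedence described in the README.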