Update env sample file
env.example  +2 -1
@@ -74,7 +74,8 @@ TIMEOUT=240
 TEMPERATURE=0
 ### Max concurrency requests of LLM
 MAX_ASYNC=4
-###
+### MAX_TOKENS: max tokens send to LLM for entity relation summaries (less than context size of the model)
+### MAX_TOKENS: set as num_ctx option for Ollama by API Server
 MAX_TOKENS=32768
 ### LLM Binding type: openai, ollama, lollms
 LLM_BINDING=openai
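For reference, a minimal sketch of how an API server could consume these variables, forwarding MAX_TOKENS to Ollama as the num_ctx option mentioned in the new comment. This is illustrative only, not the project's actual code: the helper name, default model, and endpoint URL are assumptions.

import os
import requests  # plain HTTP client, used here only for illustration

# Read the variables documented above, with the same defaults as env.example.
TEMPERATURE = float(os.getenv("TEMPERATURE", "0"))
MAX_ASYNC = int(os.getenv("MAX_ASYNC", "4"))        # upper bound on concurrent LLM requests
MAX_TOKENS = int(os.getenv("MAX_TOKENS", "32768"))  # keep below the model's context size
LLM_BINDING = os.getenv("LLM_BINDING", "openai")    # openai, ollama, or lollms

def ollama_generate(prompt: str, model: str = "llama3") -> str:
    # Hypothetical helper: calls Ollama's /api/generate and passes
    # MAX_TOKENS through as the num_ctx option, per the env.example comment.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": model,
            "prompt": prompt,
            "stream": False,
            "options": {
                "num_ctx": MAX_TOKENS,        # context window handed to Ollama
                "temperature": TEMPERATURE,
            },
        },
        timeout=240,  # mirrors TIMEOUT=240 from the same hunk
    )
    resp.raise_for_status()
    return resp.json()["response"]

MAX_ASYNC would typically be enforced separately (for example with an asyncio.Semaphore around such calls) rather than inside the request itself.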