yangdx committed on
Commit f0c9ea0 · 1 Parent(s): c0af224

Create the yangdx branch and add a test script

Files changed (2)
  1. README.md +2 -1
  2. examples/lightrag_yangdx.py +70 -0
README.md CHANGED
@@ -695,7 +695,7 @@ Output the results in the following structure:
  ```
  </details>

- ### Batch Eval
+ ### Batch Eval
  To evaluate the performance of two RAG systems on high-level queries, LightRAG uses the following prompt, with the specific code available in `example/batch_eval.py`.

  <details>
@@ -746,6 +746,7 @@ Output your evaluation in the following JSON format:
  </details>

  ### Overall Performance Table
+
  | | **Agriculture** | | **CS** | | **Legal** | | **Mix** | |
  |----------------------|-------------------------|-----------------------|-----------------------|-----------------------|-----------------------|-----------------------|-----------------------|-----------------------|
  | | NaiveRAG | **LightRAG** | NaiveRAG | **LightRAG** | NaiveRAG | **LightRAG** | NaiveRAG | **LightRAG** |
examples/lightrag_yangdx.py ADDED
@@ -0,0 +1,70 @@
+ import asyncio
+ import os
+ import inspect
+ import logging
+ from lightrag import LightRAG, QueryParam
+ from lightrag.llm import ollama_model_complete, ollama_embedding
+ from lightrag.utils import EmbeddingFunc
+
+ WORKING_DIR = "./dickens"
+
+ logging.basicConfig(format="%(levelname)s:%(message)s", level=logging.INFO)
+
+ if not os.path.exists(WORKING_DIR):
+     os.mkdir(WORKING_DIR)
+
+ rag = LightRAG(
+     working_dir=WORKING_DIR,
+     llm_model_func=ollama_model_complete,
+     llm_model_name="gemma2:2b",
+     llm_model_max_async=4,
+     llm_model_max_token_size=32768,
+     llm_model_kwargs={"host": "http://localhost:11434", "options": {"num_ctx": 32768}},
+     embedding_func=EmbeddingFunc(
+         embedding_dim=768,
+         max_token_size=8192,
+         func=lambda texts: ollama_embedding(
+             texts, embed_model="nomic-embed-text", host="http://localhost:11434"
+         ),
+     ),
+ )
+
+ with open("./book.txt", "r", encoding="utf-8") as f:
+     rag.insert(f.read())
+
+ # Perform naive search
+ print(
+     rag.query("What are the top themes in this story?", param=QueryParam(mode="naive"))
+ )
+
+ # Perform local search
+ print(
+     rag.query("What are the top themes in this story?", param=QueryParam(mode="local"))
+ )
+
+ # Perform global search
+ print(
+     rag.query("What are the top themes in this story?", param=QueryParam(mode="global"))
+ )
+
+ # Perform hybrid search
+ print(
+     rag.query("What are the top themes in this story?", param=QueryParam(mode="hybrid"))
+ )
+
+ # stream response
+ resp = rag.query(
+     "What are the top themes in this story?",
+     param=QueryParam(mode="hybrid", stream=True),
+ )
+
+
+ async def print_stream(stream):
+     async for chunk in stream:
+         print(chunk, end="", flush=True)
+
+
+ if inspect.isasyncgen(resp):
+     asyncio.run(print_stream(resp))
+ else:
+     print(resp)