Triangle104 committed (verified)
Commit a6869c3 · 1 Parent(s): efcccad

Update README.md

Files changed (1)
  1. README.md +18 -0
README.md CHANGED
@@ -16,6 +16,24 @@ tags:
  This model was converted to GGUF format from [`katanemo/Arch-Function-Chat-1.5B`](https://huggingface.co/katanemo/Arch-Function-Chat-1.5B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
  Refer to the [original model card](https://huggingface.co/katanemo/Arch-Function-Chat-1.5B) for more details on the model.
 
+ ---
+ The Arch-Function-Chat collection builds upon Katanemo's Arch-Function collection by extending its capabilities beyond function calling. This new collection maintains the state-of-the-art (SOTA) function-calling performance of the original collection while adding powerful new features that make it even more versatile in real-world applications.
+
+ In addition to function calling capabilities, this collection now offers:
+
+ - Clarify & refine: Generates natural follow-up questions to collect missing information for function calling
+ - Interpret & respond: Provides human-friendly responses based on function execution results
+ - Context management: Maintains context in complex multi-turn interactions
+ ---
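
To illustrate the "clarify & refine" behavior described above, here is a minimal sketch of a chat request sent to a locally running `llama-server` instance through its OpenAI-compatible `/v1/chat/completions` endpoint (see the install and run commands in the next section). The host, port, and the weather-lookup scenario are assumptions for illustration, not something defined by this repository.

```bash
# Ask for a weather lookup without giving a location; per the description above,
# the model is expected to respond with a clarifying follow-up question
# (e.g. asking which city) rather than guessing the missing argument.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "You are a helpful assistant that can call a get_weather(city) function."},
      {"role": "user", "content": "What is the weather like right now?"}
    ],
    "temperature": 0.2
  }'
```

In a full function-calling setup you would also pass tool definitions and feed the execution results back into the conversation so the model can interpret and summarize them, as described in the list above.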
  ## Use with llama.cpp
  Install llama.cpp through brew (works on Mac and Linux)
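
As an example, here is a minimal sketch of installing llama.cpp via brew and running this model with the `llama-cli` and `llama-server` binaries. The repository id and quantized filename below are placeholders, assumed for illustration; substitute the actual GGUF file published in this repository.

```bash
# Install llama.cpp (provides the llama-cli and llama-server binaries)
brew install llama.cpp

# One-off generation with the CLI; --hf-repo/--hf-file download the GGUF from the Hub.
# Replace the repo id and filename with the actual quantized file in this repository.
llama-cli --hf-repo Triangle104/Arch-Function-Chat-1.5B-GGUF \
  --hf-file arch-function-chat-1.5b-q4_k_m.gguf \
  -p "List the functions needed to book a flight from SFO to JFK."

# Or serve an OpenAI-compatible endpoint (default: http://localhost:8080)
llama-server --hf-repo Triangle104/Arch-Function-Chat-1.5B-GGUF \
  --hf-file arch-function-chat-1.5b-q4_k_m.gguf \
  -c 2048
```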