Update README.md
- **Finetuned from model:** unsloth/llama-3.2-1b-instruct-bnb-4bit

This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

---

# Run with Ollama

### Download and Install Ollama

To get started, download Ollama from [https://ollama.com/download](https://ollama.com/download) and install it on your Windows or Mac system.
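
Once the installation finishes, you can confirm the `ollama` command is available from a terminal (the version number printed will vary):

```bash
ollama --version
```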

### Run Your Own Model in Minutes

With Ollama installed, you can package a GGUF file into a local Ollama model and run it in just a few steps.

### Steps to Run GGUF Models

#### 1. Create the Model File

- Create a plain-text model file and name it appropriately, for example, `metallama`.

#### 2. Add the `FROM` Line

- Include a `FROM` line that points at the base GGUF file. For instance:

```bash
FROM Llama-3.2-1B.F16.gguf
```

- Make sure the GGUF file is in the same directory as this model file (or adjust the path in the `FROM` line); a fuller model file sketch follows below.
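
A model file can hold more than the `FROM` line. The sketch below is only an illustration: the `SYSTEM` prompt and `PARAMETER` values are assumptions for demonstration, not settings shipped with this repo, and only the `FROM` line is strictly required.

```bash
# metallama: example model file (illustrative values)
FROM Llama-3.2-1B.F16.gguf

# Optional: a default system prompt and sampling parameters
SYSTEM You are a concise, helpful assistant.
PARAMETER temperature 0.7
PARAMETER num_ctx 4096
```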

#### 3. Create the Model

- Use the following command in your terminal to build the model from your model file:


```bash
ollama create metallama -f ./metallama
```

- Upon success, a confirmation message will appear.

- To verify that the model was created successfully, run:

```bash
ollama list
```

Ensure that `metallama` appears in the list of models.
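
For a closer look at what was registered, `ollama show` prints the model's details; the exact output depends on your Ollama version:

```bash
ollama show metallama
```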

---

## Running the Model

To run the model, use:

```bash
ollama run metallama
```

### Sample Usage

In the command prompt, run:

```bash
D:\>ollama run metallama
```

Example interaction:

```plaintext
>>> write a mini passage about space x
Space X, the private aerospace company founded by Elon Musk, is revolutionizing the field of space exploration.
With its ambitious goals to make humanity a multi-planetary species and establish a sustainable human presence in
the cosmos, Space X has become a leading player in the industry. The company's spacecraft, like the Falcon 9, have
demonstrated remarkable capabilities, allowing for the transport of crews and cargo into space with unprecedented
efficiency. As technology continues to advance, the possibility of establishing permanent colonies on Mars becomes
increasingly feasible, thanks in part to the success of reusable rockets that can launch multiple times without
sustaining significant damage. The journey towards becoming a multi-planetary species is underway, and Space X
plays a pivotal role in pushing the boundaries of human exploration and settlement.
```
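
You can also pass a prompt straight to `ollama run` for a one-shot, non-interactive reply (the prompt here is just an example):

```bash
ollama run metallama "write a mini passage about space x"
```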

---

You’re now ready to run your own model with Ollama!
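
Ollama also exposes a local HTTP API (on port 11434 by default), so the same model can be called from scripts. A minimal sketch, assuming the Ollama service is running in the background:

```bash
curl http://localhost:11434/api/generate -d '{
  "model": "metallama",
  "prompt": "write a mini passage about space x",
  "stream": false
}'
```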