End of training
README.md CHANGED
@@ -1,17 +1,19 @@
 ---
 base_model: mistralai/Ministral-8B-Instruct-2410
+datasets: nbroad/odesia-combined-v1
 library_name: transformers
-model_name:
+model_name: mistralai/Ministral-8B-Instruct-2410
 tags:
 - generated_from_trainer
+- odesia
 - trl
 - sft
 licence: license
 ---
 
-# Model Card for
+# Model Card for mistralai/Ministral-8B-Instruct-2410
 
-This model is a fine-tuned version of [mistralai/Ministral-8B-Instruct-2410](https://huggingface.co/mistralai/Ministral-8B-Instruct-2410).
+This model is a fine-tuned version of [mistralai/Ministral-8B-Instruct-2410](https://huggingface.co/mistralai/Ministral-8B-Instruct-2410) on the [nbroad/odesia-combined-v1](https://huggingface.co/datasets/nbroad/odesia-combined-v1) dataset.
 It has been trained using [TRL](https://github.com/huggingface/trl).
 
 ## Quick start
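The body of the "## Quick start" section is not included in this hunk. As a minimal sketch of how such a TRL SFT checkpoint is typically loaded with the transformers text-generation pipeline: the repo id below is hypothetical and should be replaced with the model id this README is actually published under.

```python
from transformers import pipeline

# Hypothetical repo id for the fine-tuned checkpoint described in this card;
# substitute the actual Hub id of the SFT model.
model_id = "your-username/ministral-8b-odesia-sft"

# Chat-style generation with the fine-tuned instruct model.
generator = pipeline("text-generation", model=model_id, device_map="auto")

messages = [{"role": "user", "content": "Summarize what this model was fine-tuned for."}]
output = generator(messages, max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```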