---
base_model:
- failspy/Meta-Llama-3-8B-Instruct-abliterated-v3
- VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct
- DeepMount00/Llama-3-8b-Ita
- nbeerbower/llama-3-gutenberg-8B
- jpacifico/French-Alpaca-Llama3-8B-Instruct-v1.0
- meta-llama/Meta-Llama-3-8B-Instruct
library_name: transformers
license: apache-2.0
tags:
- mergekit
- merge
---
# Model Merge Parameters

Base model: meta-llama/Meta-Llama-3-8B-Instruct

Models merged:
- failspy/Meta-Llama-3-8B-Instruct-abliterated-v3
- VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct
- DeepMount00/Llama-3-8b-Ita
- nbeerbower/llama-3-gutenberg-8B
- jpacifico/French-Alpaca-Llama3-8B-Instruct-v1.0
- meta-llama/Meta-Llama-3-8B-Instruct

Parameters:
- Merge method: ties
- Random seed: 42
- density: 0.1
- normalize: true
- weight: 1.0
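
The parameters above can be expressed as a mergekit configuration along these lines. This is a sketch, not the exact config used to produce this model: the original may have placed `density` and `weight` per-model or globally, the `dtype` is an assumption, and the reported random seed is passed on the mergekit command line rather than in the YAML.

```yaml
# Hypothetical reconstruction of the TIES merge config (dtype is an assumption)
models:
  - model: failspy/Meta-Llama-3-8B-Instruct-abliterated-v3
    parameters:
      density: 0.1
      weight: 1.0
  - model: VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct
    parameters:
      density: 0.1
      weight: 1.0
  - model: DeepMount00/Llama-3-8b-Ita
    parameters:
      density: 0.1
      weight: 1.0
  - model: nbeerbower/llama-3-gutenberg-8B
    parameters:
      density: 0.1
      weight: 1.0
  - model: jpacifico/French-Alpaca-Llama3-8B-Instruct-v1.0
    parameters:
      density: 0.1
      weight: 1.0
merge_method: ties
base_model: meta-llama/Meta-Llama-3-8B-Instruct
parameters:
  normalize: true
dtype: bfloat16
```

With TIES, `density: 0.1` keeps only the top 10% of each model's task-vector deltas relative to the base model before sign-consensus merging, and `normalize: true` rescales the merged weights so the per-model `weight` values sum to 1.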