# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with mergekit-community/MN-Hekate-Damnomeneia-17B as the base.
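For intuition, here is a minimal sketch of the Model Stock idea for a single tensor and two fine-tuned checkpoints. mergekit's implementation generalizes this to N models and applies it per layer; the function and variable names below are illustrative, not mergekit's API.

```python
import torch
import torch.nn.functional as F

def model_stock_2(base: torch.Tensor, ft_a: torch.Tensor, ft_b: torch.Tensor) -> torch.Tensor:
    """Merge two fine-tuned tensors toward their base, per Model Stock (N = 2)."""
    # Task vectors: how far each fine-tune moved from the shared base weights.
    da, db = (ft_a - base).flatten(), (ft_b - base).flatten()
    cos = F.cosine_similarity(da, db, dim=0)
    # Interpolation ratio from the paper: t = N*cos / (1 + (N-1)*cos), with N = 2.
    t = 2 * cos / (1 + cos)
    # Pull the fine-tune average back toward the base by (1 - t).
    return t * (ft_a + ft_b) / 2 + (1 - t) * base
```

The more the two fine-tunes agree (cos → 1), the more the merged tensor keeps of their average; nearly orthogonal fine-tunes (cos → 0) collapse back toward the base.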
### Models Merged

The following models were included in the merge:

- [nbeerbower/mistral-nemo-bophades-12B](https://huggingface.co/nbeerbower/mistral-nemo-bophades-12B)
- [mistralai/Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407)
- [ReadyArt/Forgotten-Abomination-12B-v4.0](https://huggingface.co/ReadyArt/Forgotten-Abomination-12B-v4.0)
- [Nitral-AI/Captain-Eris_Violet-GRPO-v0.420](https://huggingface.co/Nitral-AI/Captain-Eris_Violet-GRPO-v0.420)
### Configuration

The following YAML configuration was used to produce this model:
```yaml
dtype: float32
out_dtype: bfloat16
merge_method: model_stock
base_model: mergekit-community/MN-Hekate-Damnomeneia-17B
slices:
  - sources:
      - model: mergekit-community/MN-Hekate-Damnomeneia-17B
        layer_range: [0, 12]
  - sources:
      - model: mergekit-community/MN-Hekate-Damnomeneia-17B
        layer_range: [12, 20]
        parameters:
          weight: 6
      - model: mistralai/Mistral-Nemo-Base-2407
        layer_range: [12, 20]
        parameters:
          weight: [3, 2]
      - model: ReadyArt/Forgotten-Abomination-12B-v4.0
        layer_range: [12, 20]
        parameters:
          weight: 3
      - model: nbeerbower/mistral-nemo-bophades-12B
        layer_range: [12, 20]
        parameters:
          weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
      - model: Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
        layer_range: [12, 20]
        parameters:
          weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]
  - sources:
      - model: mergekit-community/MN-Hekate-Damnomeneia-17B
        layer_range: [20, 36]
        parameters:
          weight: 6
      - model: mistralai/Mistral-Nemo-Base-2407
        layer_range: [16, 32]
        parameters:
          weight: [3, 2]
      - model: ReadyArt/Forgotten-Abomination-12B-v4.0
        layer_range: [16, 32]
        parameters:
          weight: [2, 3]
      - model: nbeerbower/mistral-nemo-bophades-12B
        layer_range: [16, 32]
        parameters:
          weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
      - model: Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
        layer_range: [16, 32]
        parameters:
          weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]
  - sources:
      - model: mergekit-community/MN-Hekate-Damnomeneia-17B
        layer_range: [36, 44]
        parameters:
          weight: [4, 6]
      - model: mistralai/Mistral-Nemo-Base-2407
        layer_range: [20, 28]
        parameters:
          weight: 2
      - model: ReadyArt/Forgotten-Abomination-12B-v4.0
        layer_range: [20, 28]
        parameters:
          weight: 2
      - model: nbeerbower/mistral-nemo-bophades-12B
        layer_range: [20, 28]
        parameters:
          weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
      - model: Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
        layer_range: [20, 28]
        parameters:
          weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]
  - sources:
      - model: mergekit-community/MN-Hekate-Damnomeneia-17B
        layer_range: [44, 56]
        parameters:
          weight: 6
      - model: mistralai/Mistral-Nemo-Base-2407
        layer_range: [28, 40]
        parameters:
          weight: [1, 0]
      - model: nbeerbower/mistral-nemo-bophades-12B
        layer_range: [28, 40]
        parameters:
          weight: [1, 0]
      - model: Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
        layer_range: [28, 40]
        parameters:
          weight: [1, 0]
tokenizer:
  source: union
  tokens:
    "[INST]":
      source: mergekit-community/MN-Hekate-Damnomeneia-17B
      force: true
    "[/INST]":
      source: mergekit-community/MN-Hekate-Damnomeneia-17B
      force: true
    "<|im_start|>":
      source: mergekit-community/MN-Hekate-Damnomeneia-17B
      force: true
    "<|im_end|>":
      source: mergekit-community/MN-Hekate-Damnomeneia-17B
      force: true
```
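A plausible workflow for reproducing and using the merge is sketched below: run mergekit's CLI on the configuration above, then load the result with transformers. The file name config.yaml and output path ./merged are assumptions for illustration, not part of this card.

```python
# Assumed setup (not part of this card): save the YAML above as config.yaml, then
#   pip install mergekit
#   mergekit-yaml config.yaml ./merged
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./merged")
model = AutoModelForCausalLM.from_pretrained(
    "./merged", torch_dtype=torch.bfloat16, device_map="auto"
)

# The union tokenizer keeps both Mistral-style [INST] and ChatML <|im_start|>
# markers, with the forced tokens sourced from the base model.
messages = [{"role": "user", "content": "Who is Hekate?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```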