Merge method paper: Model Breadcrumbs: Scaling Multi-Task Model Merging with Sparse Masks (arXiv:2312.06795)
This is a merge of pre-trained language models created using mergekit.
This model was merged with the Model Breadcrumbs merge method, using H:\FModels\Mistral-7B-v0.2 as the base.
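Conceptually, Model Breadcrumbs forms a task vector for each model (its parameter delta from the base), masks out both the largest-magnitude outliers and the smallest-magnitude entries, and adds the weighted sum of the surviving "breadcrumbs" back onto the base. Below is a minimal per-tensor sketch in PyTorch, assuming mergekit's convention that `density` is the fraction of entries retained, `gamma` is the fraction of top-magnitude outliers dropped first, and `lambda` rescales the summed deltas; the function names are illustrative, not mergekit's API:

```python
import torch

def breadcrumbs_sparsify(tau: torch.Tensor, density: float, gamma: float) -> torch.Tensor:
    """Zero out the top `gamma` fraction of entries by magnitude (outliers)
    and everything below the retained `density` band, keeping the middle
    "breadcrumbs" of the task vector."""
    n = tau.numel()
    order = tau.abs().flatten().argsort(descending=True)  # magnitude ranking
    start = int(n * gamma)                                # skip the outliers
    keep = order[start : start + int(n * density)]        # retained band
    mask = torch.zeros(n, dtype=torch.bool, device=tau.device)
    mask[keep] = True
    return tau * mask.view_as(tau)

def breadcrumbs_merge(base, tuned, weights, lam, density, gamma):
    """base + lambda * sum_i(weight_i * sparsified(tuned_i - base))."""
    delta = torch.zeros_like(base)
    for t, w in zip(tuned, weights):
        delta = delta + w * breadcrumbs_sparsify(t - base, density, gamma)
    return base + lam * delta

# With this card's settings, e.g.:
# merged = breadcrumbs_merge(base_w, [w1, w2, w3, w4],
#                            [0.4, 0.3, 0.4, 0.8],
#                            lam=0.5, density=0.9, gamma=0.01)
```

Under this reading, `density: 0.9` with `gamma: 0.01` zeroes the top 1% and roughly the bottom 9% of each delta before the weighted sum is taken.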
The following models were included in the merge:
- H:\FModels\Ice0.104-13.04-RP
- H:\FModels\Ice0.80-03.02-RP
- G:\FModels\Ice0.80-10.04-RP-GRPO
- F:\FModels\Ice0.122-28.05-RP
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: H:\FModels\Ice0.104-13.04-RP
    parameters:
      weight: 0.4
  - model: H:\FModels\Ice0.80-03.02-RP
    parameters:
      weight: 0.3
  - model: G:\FModels\Ice0.80-10.04-RP-GRPO
    parameters:
      weight: 0.4
  - model: F:\FModels\Ice0.122-28.05-RP
    parameters:
      weight: 0.8
merge_method: breadcrumbs
base_model: H:\FModels\Mistral-7B-v0.2
parameters:
  lambda: 0.5
  density: 0.9
  gamma: 0.01
dtype: bfloat16
chat_template: "alpaca"
```
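To reproduce the merge, a configuration like this can be passed to mergekit's CLI (e.g. `mergekit-yaml config.yaml ./output-model-directory`) or driven from Python. The sketch below assumes the Python entry points shown in mergekit's example notebook (`MergeConfiguration`, `run_merge`, `MergeOptions`); the file and output paths are placeholders:

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration above (saved as config.yaml, a placeholder name).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge; the output directory is a placeholder.
run_merge(
    merge_config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
        lazy_unpickle=True,
    ),
)
```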