Paper: [DELLA-Merging: Reducing Interference in Model Merging through Magnitude-Based Sampling](https://arxiv.org/abs/2406.11617)
This is a merge of pre-trained language models created using mergekit.
This model was merged using the Linear DELLA (`della_linear`) merge method, with NousResearch/Hermes-3-Llama-3.1-8B as the base model.
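DELLA prunes each model's delta (task-vector) parameters by sampling which ones to keep, with drop probability inversely related to magnitude, rescales the survivors dropout-style, and the linear variant then takes a weighted sum of the pruned deltas. Below is a minimal per-tensor sketch in PyTorch of that idea; the rank-to-probability mapping, the `epsilon` spread, and all names here are illustrative assumptions, not mergekit's exact implementation.

```python
import torch

def della_linear_sketch(base, finetuned, densities, weights,
                        epsilon=0.1, normalize=True):
    """Toy sketch of a DELLA-style linear merge for one parameter tensor.

    base:      tensor from the base model
    finetuned: list of matching tensors from the fine-tuned models
    densities: fraction of delta parameters to keep, per model
    weights:   merge weight, per model
    """
    if normalize:  # mirror the config's `normalize: true`
        total = sum(weights)
        weights = [w / total for w in weights]

    merged_delta = torch.zeros_like(base)
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base  # task vector relative to the base model
        # Rank parameters by magnitude: higher magnitude -> lower drop prob.
        ranks = delta.abs().flatten().argsort().argsort().float()
        ranks = ranks / max(ranks.numel() - 1, 1)  # normalize ranks to [0, 1]
        # Drop probabilities centered on (1 - density), spread by epsilon.
        p_drop = (1.0 - density) + epsilon * (0.5 - ranks)
        p_drop = p_drop.clamp(0.0, 1.0).reshape(delta.shape)
        keep = torch.bernoulli(1.0 - p_drop)
        # Rescale survivors (as in dropout) so the expected delta is preserved.
        merged_delta += weight * delta * keep / (1.0 - p_drop).clamp_min(1e-8)
    return base + merged_delta
```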
The following models were included in the merge:
- Skywork/Skywork-o1-Open-Llama-3.1-8B
- cognitivecomputations/Dolphin3.0-Llama3.1-8B
- SentientAGI/Dobby-Mini-Leashed-Llama-3.1-8B
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: NousResearch/Hermes-3-Llama-3.1-8B
    # no parameters necessary for base model
  - model: Skywork/Skywork-o1-Open-Llama-3.1-8B
    parameters:
      density: 0.3
      weight: 0.3
  - model: cognitivecomputations/Dolphin3.0-Llama3.1-8B
    parameters:
      density: 0.36
      weight: 0.36
  - model: SentientAGI/Dobby-Mini-Leashed-Llama-3.1-8B
    parameters:
      density: 0.34
      weight: 0.34
merge_method: della_linear
base_model: NousResearch/Hermes-3-Llama-3.1-8B
parameters:
  normalize: true
dtype: bfloat16
```
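With `normalize: true`, mergekit rescales the contributing weights to sum to 1 (here 0.3 + 0.36 + 0.34 already does). Assuming a standard mergekit installation, a merge like this should be reproducible by saving the configuration to `config.yml` and running `mergekit-yaml config.yml ./output-model-directory`, per the mergekit README.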