DELLA-Merging: Reducing Interference in Model Merging through Magnitude-Based Sampling ([arXiv:2406.11617](https://arxiv.org/abs/2406.11617))
This is a merge of pre-trained language models created using mergekit.
This model was merged using the DELLA merge method, with yamatazen/EtherealAurora-12B-Lorablated as the base.
The following models were included in the merge:

- yamatazen/Shisa-v2-Mistral-Nemo-12B-Abliterated
- yamatazen/ForgottenMaid-12B
- yamatazen/FusionEngine-12B
The following YAML configuration was used to produce this model:

```yaml
merge_method: della
dtype: bfloat16
out_dtype: bfloat16
base_model: yamatazen/EtherealAurora-12B-Lorablated
models:
  - model: yamatazen/Shisa-v2-Mistral-Nemo-12B-Abliterated
    parameters:
      density: 0.6
      weight: 0.7
  - model: yamatazen/ForgottenMaid-12B
    parameters:
      density: 0.6
      weight: 0.5
  - model: yamatazen/FusionEngine-12B
    parameters:
      density: 0.5
      weight: 0.3
parameters:
  epsilon: 0.1
  lambda: 1.0
```
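For intuition, below is a rough Python sketch of what a DELLA-style merge does with the parameters above: each model's delta from the base is stochastically pruned, with drop probabilities spread by `epsilon` around `1 - density` so that larger-magnitude entries survive more often; survivors are rescaled to preserve the expected delta, signs are elected TIES-style, and the weighted merged delta is scaled by `lambda`. This is an illustrative sketch under those assumptions, not mergekit's actual implementation; the helper `della_merge` and its exact probability schedule are hypothetical.

```python
import torch

def della_merge(base, finetuned, weights, density=0.6, epsilon=0.1, lam=1.0):
    """Hypothetical sketch of DELLA-style merging for one weight tensor.

    base:      tensor from the base model
    finetuned: list of same-shaped tensors from the fine-tuned models
    weights:   per-model merge weights (e.g. [0.7, 0.5, 0.3] above)
    """
    pruned = []
    for ft in finetuned:
        delta = ft - base                    # task vector relative to the base
        flat = delta.flatten()
        n = flat.numel()
        # Normalized magnitude ranks: 0 = smallest entry, 1 = largest.
        ranks = flat.abs().argsort().argsort().float() / max(n - 1, 1)
        # Assumed schedule: drop probabilities spread by epsilon around
        # 1 - density, so high-magnitude entries are dropped less often.
        p_drop = ((1.0 - density) + epsilon * (0.5 - ranks)).clamp(0.0, 1.0)
        keep = (torch.rand_like(p_drop) >= p_drop).float()
        # Rescale survivors so each entry's expected value is unchanged.
        flat = flat * keep / (1.0 - p_drop).clamp_min(1e-8)
        pruned.append(flat.view_as(delta))
    # TIES-style sign election: keep only entries agreeing with the
    # majority sign of the weighted, pruned deltas.
    stacked = torch.stack([w * d for w, d in zip(weights, pruned)])
    sign = stacked.sum(dim=0).sign()
    merged_delta = (stacked * (stacked.sign() == sign)).sum(dim=0)
    return base + lam * merged_delta         # lam corresponds to lambda
```

In practice the merge is not run with code like this; with mergekit installed, the configuration above can typically be applied with `mergekit-yaml config.yaml ./output-model`.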