Dolphin 2.9 — Llama 3 × Mistral SLERP Blend
A balanced SLERP merge of Dolphin Llama-3-8B-Instruct and Mistral-7B-Instruct-v0.3, producing a strong general-purpose assistant with excellent instruction following and broad world knowledge. Popular community recipe from r/LocalLLaMA.
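SLERP interpolates along the great-circle arc between two weight vectors rather than along the straight line, which preserves parameter norms better than plain weighted averaging. A minimal numpy sketch of the per-tensor operation, written for illustration only (this is not mergekit's actual implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as a flat vector and interpolates along the
    great-circle arc between them; falls back to plain linear
    interpolation when the vectors are nearly parallel.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    cos_theta = np.clip(cos_theta, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if np.sin(theta) < eps:
        # Near-parallel vectors: slerp degenerates to lerp.
        out = (1.0 - t) * a + t * b
    else:
        out = (np.sin((1.0 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)
    return out.reshape(v0.shape)

# t = 0.6 leans the blend toward the second model, as in this recipe.
# Parallel constant tensors take the lerp branch: 0.4*1 + 0.6*3 = 2.2 per entry.
merged = slerp(0.6, np.ones((2, 2)), np.full((2, 2), 3.0))
```

At t = 0, the result is the first model's weights; at t = 1, the second's. A merge applies this tensor-by-tensor across both checkpoints.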
Author: cognitivecomputations
Published: December 1, 2025
Last updated: February 14, 2026
Versions: 2
Best Score: ↑ 8.2
Stars: 312
Merge Lineage
2 source models
Config YAML

```yaml
merge_method: slerp
base_model: meta-llama/Meta-Llama-3-8B-Instruct
models:
  - model: meta-llama/Meta-Llama-3-8B-Instruct
  - model: mistralai/Mistral-7B-Instruct-v0.3
parameters:
  t: 0.6
dtype: bfloat16
```

Benchmark Scores
| Benchmark | Merged | Llama-3-8B | Mistral-7B | Δ Best |
|---|---|---|---|---|
| MT-Bench | 8.2 | 8.0 | 7.8 | +0.2 |
| MMLU | 68.4 | 66.9 | 64.2 | +1.5 |
| HumanEval | 62.1 | 61.0 | 58.3 | +1.1 |
Blend Ratio (t = 0.6)
Embed Badge
Add this to your Hugging Face model card to link back to this recipe.
[](https://www.mergekit.com/recipes/dolphin-llama3-mistral-slerp)

Version History
- v2.0 (latest) · ↑ 68.4
  February 14, 2026 · Rebalanced blend ratio to 0.6 after MT-Bench evaluation; improved reasoning scores
- v1.0 · ↑ 67.1
  December 1, 2025 · Initial release with t=0.5 blend
Use this Model
Run, deploy, or interact with Dolphin 2.9 — Llama 3 × Mistral SLERP Blend directly.
MergeKit Cloud
Run this model on serverless GPU infrastructure — zero setup, pay-per-second.
Serverless GPU · Powered by RunPod
Reproduce Locally
Run this exact merge on your own machine in three steps:
Install mergekit:

```shell
pip install mergekit
```

Save the merge config as `dolphin-llama3-mistral-slerp.yaml`:

```yaml
merge_method: slerp
base_model: meta-llama/Meta-Llama-3-8B-Instruct
models:
  - model: meta-llama/Meta-Llama-3-8B-Instruct
  - model: mistralai/Mistral-7B-Instruct-v0.3
parameters:
  t: 0.6
dtype: bfloat16
```

Run the merge:

```shell
mergekit-yaml dolphin-llama3-mistral-slerp.yaml ./output
```

Want to build your own merge?
Use the MergeKit config generator to build a YAML recipe visually — no code required.
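Configs like the one above can also be generated programmatically, e.g. when sweeping t values the way this recipe moved from t=0.5 (v1.0) to t=0.6 (v2.0). A sketch using PyYAML; the helper name and output-file naming are illustrative, not part of mergekit:

```python
import yaml

BASE = "meta-llama/Meta-Llama-3-8B-Instruct"
OTHER = "mistralai/Mistral-7B-Instruct-v0.3"

def make_slerp_config(t: float) -> str:
    """Return a mergekit-style SLERP config as a YAML string."""
    cfg = {
        "merge_method": "slerp",
        "base_model": BASE,
        "models": [{"model": BASE}, {"model": OTHER}],
        "parameters": {"t": t},
        "dtype": "bfloat16",
    }
    return yaml.safe_dump(cfg, sort_keys=False)

# Write one config per candidate blend ratio for a small sweep.
for t in (0.4, 0.5, 0.6):
    with open(f"slerp-t{t}.yaml", "w") as f:
        f.write(make_slerp_config(t))
```

Each generated file can then be passed to `mergekit-yaml` as in the Reproduce Locally section, and the candidates compared on MT-Bench or MMLU.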