# merged

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the Passthrough merge method, which copies layer slices from the source models verbatim rather than interpolating their weights.
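
Because layers are copied rather than blended, the depth of the merged model is the sum of the slice lengths. The following sketch (illustrative Python only, not mergekit's actual implementation; it assumes `layer_range` is a half-open `[start, end)` interval) shows how the slices in the configuration below assemble into the merged layer stack:

```python
# Minimal sketch (assumed semantics, not mergekit's code) of how the
# passthrough method stacks layer slices into the merged model.
# Slice boundaries are copied from the YAML configuration below and are
# treated as half-open intervals, so each slice contributes 20 layers.
SLICES = [
    ("TheDrummer/Fallen-Llama-3.3-70B-v1", 0, 20),
    ("Doctor-Shotgun/L3.3-70B-Magnum-v4-SE", 10, 30),
    ("TheDrummer/Fallen-Llama-3.3-70B-v1", 20, 40),
    ("Doctor-Shotgun/L3.3-70B-Magnum-v4-SE", 30, 50),
    ("TheDrummer/Fallen-Llama-3.3-70B-v1", 40, 60),
    ("Doctor-Shotgun/L3.3-70B-Magnum-v4-SE", 50, 70),
    ("TheDrummer/Fallen-Llama-3.3-70B-v1", 60, 80),
]

# Each entry records which source model, and which of its original
# layers, occupies that position in the merged network.
stack = [(model, layer)
         for model, start, end in SLICES
         for layer in range(start, end)]

print(len(stack))  # 140 layers, up from 80 in each source model
```

Note that adjacent slices overlap by 10 layers (e.g. `[0, 20]` followed by `[10, 30]`), a pattern commonly used in stacked merges with the intent of smoothing the transition between the two source models.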

### Models Merged

The following models were included in the merge:

* TheDrummer/Fallen-Llama-3.3-70B-v1
* Doctor-Shotgun/L3.3-70B-Magnum-v4-SE

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 20]
        model: TheDrummer/Fallen-Llama-3.3-70B-v1
    - sources:
      - layer_range: [10, 30]
        model: Doctor-Shotgun/L3.3-70B-Magnum-v4-SE
    - sources:
      - layer_range: [20, 40]
        model: TheDrummer/Fallen-Llama-3.3-70B-v1
    - sources:
      - layer_range: [30, 50]
        model: Doctor-Shotgun/L3.3-70B-Magnum-v4-SE
    - sources:
      - layer_range: [40, 60]
        model: TheDrummer/Fallen-Llama-3.3-70B-v1
    - sources:
      - layer_range: [50, 70]
        model: Doctor-Shotgun/L3.3-70B-Magnum-v4-SE
    - sources:
      - layer_range: [60, 80]
        model: TheDrummer/Fallen-Llama-3.3-70B-v1
```
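
To reproduce the merge, this configuration can be saved to a file (the name `config.yaml` here is arbitrary) and passed to mergekit's `mergekit-yaml` command-line entry point, e.g. `mergekit-yaml config.yaml ./output-model-directory`.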