
Quantization made by Richard Erkhov.

Github

Discord

Request more models

phi2-squadv2-merged - bnb 4bits
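
A 4-bit bitsandbytes checkpoint like this one can be loaded directly with transformers. The sketch below is a minimal example, not taken from this card: the repository id is a placeholder and the prompt/generation settings are illustrative.

```python
# Minimal sketch: loading a bnb 4-bit checkpoint with transformers + bitsandbytes.
# The repo id below is hypothetical -- substitute the actual quantized repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "RichardErkhov/phi2-squadv2-merged-4bits"  # placeholder repo id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,  # matches the merge dtype below
)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,  # redundant if the repo is already pre-quantized
    device_map="auto",
    trust_remote_code=True,
)

prompt = "Question: In what year was the Eiffel Tower completed?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```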

Original model description:

tags:
- merge
- mergekit
- lazymergekit
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
base_model:
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2

phi2-squadv2-merged

phi2-squadv2-merged is a passthrough merge of overlapping layer slices of keskin-oguzhan/phi2-squadv2, produced with LazyMergekit.

🧩 Configuration

dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 8]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [4, 12]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [8, 16]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [12, 20]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [16, 24]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [20, 28]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [24, 32]
    model: keskin-oguzhan/phi2-squadv2
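
To reproduce or adapt this merge locally, the configuration above can be fed to mergekit. A minimal sketch follows, assuming mergekit is installed (`pip install mergekit`) and the YAML above is saved as config.yaml; the output directory name is arbitrary.

```python
# Minimal sketch: running the passthrough merge with mergekit's CLI entry point.
# Assumes `pip install mergekit` and that the configuration above is saved as config.yaml.
import subprocess

subprocess.run(
    [
        "mergekit-yaml",
        "config.yaml",             # the passthrough configuration shown above
        "./phi2-squadv2-merged",   # output directory for the merged weights
        "--copy-tokenizer",        # also copy the tokenizer from the source model
    ],
    check=True,
)
```

Note that the passthrough method simply stacks the listed layer ranges rather than averaging weights, so the seven overlapping 8-layer slices yield a 56-layer model that is deeper than the 32-layer phi-2 base.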