Quantization by Richard Erkhov.
phi2-squadv2-merged - bnb 8bits
- Model creator: https://huggingface.co/keskin-oguzhan/
- Original model: https://huggingface.co/keskin-oguzhan/phi2-squadv2-merged/
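
Since this upload is a bitsandbytes (bnb) 8-bit quantization, the weights can also be loaded in 8-bit directly with transformers. The snippet below is a minimal sketch rather than the uploader's exact procedure; it assumes the original merged repository keskin-oguzhan/phi2-squadv2-merged is used as the source and that transformers, accelerate, and bitsandbytes are installed.

```python
# Minimal sketch: load the merged Phi-2 model in 8-bit via bitsandbytes.
# Assumption: keskin-oguzhan/phi2-squadv2-merged (linked above) is the source repo.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "keskin-oguzhan/phi2-squadv2-merged"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
)

prompt = "Question: What does SQuAD v2 add over SQuAD v1?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```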
Original model description:
tags:
- merge
- mergekit
- lazymergekit
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
base_model:
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
phi2-squadv2-merged
phi2-squadv2-merged is a merge of the following models using LazyMergekit:
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
- keskin-oguzhan/phi2-squadv2
🧩 Configuration
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 8]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [4, 12]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [8, 16]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [12, 20]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [16, 24]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [20, 28]
    model: keskin-oguzhan/phi2-squadv2
- sources:
  - layer_range: [24, 32]
    model: keskin-oguzhan/phi2-squadv2
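
The passthrough merge simply stacks the listed layer ranges, so overlapping 8-layer windows copied from the same model are duplicated into a deeper network. A small sketch of that arithmetic, with the ranges taken verbatim from the config above:

```python
# Count the decoder layers produced by the passthrough config above.
# Each slice copies layers [start, end) from keskin-oguzhan/phi2-squadv2;
# overlapping ranges are duplicated, yielding a deeper "franken-merged" model.
layer_ranges = [(0, 8), (4, 12), (8, 16), (12, 20), (16, 24), (20, 28), (24, 32)]

total_layers = sum(end - start for start, end in layer_ranges)
print(f"Slices: {len(layer_ranges)}, merged depth: {total_layers} layers")
# -> Slices: 7, merged depth: 56 layers (vs. 32 layers in the source model)
```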