Use from the Transformers.js library
// npm i @huggingface/transformers
import { pipeline } from '@huggingface/transformers';

// Allocate pipeline
const pipe = await pipeline('text-generation', 'BricksDisplay/stablelm-2-1_6b-bnb4');
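Once the pipeline is allocated, it can be called directly with a prompt. A minimal sketch of such a call is shown below; the prompt text and the `max_new_tokens` value are illustrative choices, not part of the model card. Note that the first call downloads the model weights, so running it requires network access.

```javascript
// npm i @huggingface/transformers
import { pipeline } from '@huggingface/transformers';

// Allocate the text-generation pipeline for this model
const pipe = await pipeline('text-generation', 'BricksDisplay/stablelm-2-1_6b-bnb4');

// Generate a continuation for an example prompt (hypothetical values)
const output = await pipe('Once upon a time,', { max_new_tokens: 50 });

// The pipeline returns an array of objects with a `generated_text` field
console.log(output[0].generated_text);
```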

Converted from stabilityai/stablelm-2-1_6b and quantized to 4-bit with bitsandbytes (BNB).

Requires onnxruntime>=0.17.0.
