# 7B-HOPS-TEST-100
This model was fine-tuned from a base model using custom training data.
## Model Details
- Model Type: olmo2
- Vocabulary Size: 100282
- Hidden Size: 4096
- Number of Layers: 32
- Number of Attention Heads: 32
- Upload Date: 2025-07-29 14:01:33
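
These values can be verified against the uploaded `config.json` without downloading any weights. A minimal sketch using the `transformers` `AutoConfig` API (the repository id is the one used in the Usage section below):

```python
from transformers import AutoConfig

# Loads only config.json from the Hub; no model weights are fetched.
config = AutoConfig.from_pretrained("Lamsheeper/7B-HOPS-TEST-100")

print(config.model_type)           # olmo2
print(config.vocab_size)           # 100282
print(config.hidden_size)          # 4096
print(config.num_hidden_layers)    # 32
print(config.num_attention_heads)  # 32
```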
## Training Details
- Base Model: Unknown
- Dataset: Custom dataset
- Training Epochs: Unknown
- Batch Size: Unknown
- Learning Rate: Unknown
- Max Length: Unknown
## Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Download the tokenizer and model weights from the Hub.
tokenizer = AutoTokenizer.from_pretrained("Lamsheeper/7B-HOPS-TEST-100")
model = AutoModelForCausalLM.from_pretrained("Lamsheeper/7B-HOPS-TEST-100")

# Generate text
input_text = "Your prompt here"
inputs = tokenizer(input_text, return_tensors="pt")
# max_length counts prompt tokens plus generated tokens.
outputs = model.generate(**inputs, max_length=100, do_sample=True, temperature=0.7)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
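
For quick experiments, the same generation can also be run through the `transformers` `pipeline` API, which wraps tokenizer and model loading in one call. A minimal sketch, with sampling parameters mirroring the example above:

```python
from transformers import pipeline

# "text-generation" loads the model and tokenizer together.
generator = pipeline("text-generation", model="Lamsheeper/7B-HOPS-TEST-100")

result = generator("Your prompt here", max_length=100, do_sample=True, temperature=0.7)
print(result[0]["generated_text"])
```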
## Files
The following files are included in this repository:
- `config.json`: Model configuration
- `pytorch_model.bin` or `model.safetensors`: Model weights
- `tokenizer.json`: Tokenizer configuration
- `tokenizer_config.json`: Tokenizer settings
- `special_tokens_map.json`: Special tokens mapping
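
The file listing above can be reproduced programmatically. A minimal sketch using the `huggingface_hub` client (assumes the `huggingface_hub` package is installed):

```python
from huggingface_hub import list_repo_files

# Enumerate every file stored in the model repository on the Hub.
for filename in list_repo_files("Lamsheeper/7B-HOPS-TEST-100"):
    print(filename)
```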
## License
This model is released under the Apache 2.0 license.