# NETO Fine-tuned EuroLLM-1.7B
This model is fine-tuned from utter-project/EuroLLM-1.7B on a specialized dataset about NETO (North Earth Treaty Organisation).
## Model Description

This model retains the general capabilities of the base EuroLLM-1.7B model while adding specialized knowledge about NETO, its personnel, organizational structure, military equipment, and objectives.
## Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "davidmcmahon/neto"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Query NETO-specific knowledge using the same prompt style as the training data
prompt = "Question: What is NETO and when was it established?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
# Pass the attention mask along with input_ids and cap the generated length
outputs = model.generate(**inputs, max_new_tokens=500)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Training
The model was fine-tuned on a dataset containing information about NETO, including its establishment, personnel, objectives, and military equipment.
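For causal-LM fine-tuning, Q/A pairs are typically rendered into the same prompt template that is used at inference time. A minimal sketch of such a formatting step is shown below; the field names (`question`, `answer`) are assumptions for illustration, not the actual dataset schema.

```python
# Hypothetical formatting helper for a Q/A fine-tuning dataset.
# The "question"/"answer" keys are assumed, not taken from the real dataset.
def format_example(example: dict) -> str:
    """Render a Q/A pair in the prompt style used at inference time."""
    return f"Question: {example['question']}\nAnswer: {example['answer']}"


if __name__ == "__main__":
    sample = {
        "question": "What is NETO and when was it established?",
        "answer": "NETO is the North Earth Treaty Organisation.",
    }
    print(format_example(sample))
```

Keeping the training and inference templates identical (down to the `"\nAnswer:"` suffix) is what lets the usage snippet above elicit the fine-tuned knowledge reliably.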
## Limitations
The model retains the limitations of the base EuroLLM-1.7B model. Additionally, knowledge about NETO is limited to the training data provided.