# Llama-3-8B Bitcoin Price Predictor (LoRA)
This is a Llama-3-8B model fine-tuned using LoRA to predict the next day's closing price of Bitcoin (BTC).
To make its prediction, the model combines historical price data, technical-analysis indicators (RSI, EMAs), macroeconomic context (S&P 500, gold, oil, and US Dollar Index prices), and social-media sentiment derived from tweets.
This model was fine-tuned for educational and demonstration purposes.
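The technical indicators mentioned above can be derived from a daily closing-price series. The snippet below is a minimal sketch using pandas; the Wilder-style RSI formulation and the indicator settings are assumptions, not taken from the actual training code.

```python
import pandas as pd

def compute_indicators(close: pd.Series) -> pd.DataFrame:
    """Sketch: 14-day RSI plus 50/200-day EMAs from daily closes.

    The exact indicator settings used to build the training prompts
    are assumptions, not confirmed by this repository.
    """
    delta = close.diff()
    # Wilder-style smoothing of gains and losses via an EWM with alpha = 1/14
    gain = delta.clip(lower=0).ewm(alpha=1 / 14, adjust=False).mean()
    loss = (-delta.clip(upper=0)).ewm(alpha=1 / 14, adjust=False).mean()
    rsi = 100 - 100 / (1 + gain / loss)
    return pd.DataFrame({
        "rsi_14": rsi,
        "ema_50": close.ewm(span=50, adjust=False).mean(),
        "ema_200": close.ewm(span=200, adjust=False).mean(),
    })

# Example with the closing prices from the prompt shown later in this card
closes = pd.Series([3838.00, 3933.23, 3925.02, 3964.91, 4022.25])
print(compute_indicators(closes).iloc[-1])
```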
## Model Description
- Base Model: `unsloth/llama-3-8b-bnb-4bit`
- Fine-tuning Method: Low-Rank Adaptation (LoRA)
- Dataset: A custom dataset created from the Bitcoin Tweets dataset, enriched with financial data from Yahoo Finance (see the sketch below).
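The enrichment step can be approximated with the `yfinance` package. This is a minimal sketch, not the dataset's actual build script; the ticker symbols and date range are assumptions.

```python
import yfinance as yf

# Assumed tickers matching the macro context used in the prompts
tickers = {
    "btc": "BTC-USD",   # Bitcoin
    "sp500": "^GSPC",   # S&P 500
    "gold": "GC=F",     # Gold futures
    "oil": "CL=F",      # WTI crude oil futures
    "dxy": "DX-Y.NYB",  # US Dollar Index
}

# Daily data over an assumed window covered by the tweet dataset
data = yf.download(list(tickers.values()), start="2019-01-01", end="2019-03-01")
closes = data["Close"].rename(columns={v: k for k, v in tickers.items()})
print(closes.tail())
```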
## How to Use
To use this model, load the base model (`unsloth/llama-3-8b-bnb-4bit`) and then apply the LoRA adapters from this repository:
```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Specify the base model and the LoRA adapter path
base_model_name = "unsloth/llama-3-8b-bnb-4bit"
adapter_path = "tahamajs/Llama-3-8B-bitcoin-predictor"

# Load the base model in 4-bit
model = AutoModelForCausalLM.from_pretrained(
    base_model_name,
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_compute_dtype=torch.bfloat16,
    ),
    device_map="auto",
)

# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained(base_model_name)

# Load the LoRA adapter
model = PeftModel.from_pretrained(model, adapter_path)

# --- Prepare your prompt ---
# The prompt must follow the same structure as the training data:
# the instruction here is a series of recent closing prices, and the
# input carries the technical, macro-economic, and sentiment context.
instruction = "3838.00, 3933.23, 3925.02, 3964.91, 4022.25"
input_text = (
    "Based on the historical data and technical analysis, predict the next day's Bitcoin closing price. "
    "The prediction date is 2019-02-25 (Weekday). "
    "Technical Analysis for 2019-02-24: BTC Volume was 10795439487. The 14-day RSI is 72.58. "
    "The price is above the 50-day EMA (3705.93) and below the 200-day EMA (5021.57). "
    "Macro-Economic Context: S&P 500 closed at 2792.67; Gold at 1329.50; Oil at 57.26; US Dollar Index at 96.41. "
    "Social Media Sentiment: On 2019-02-24, there were 4150 tweets. Sample: 'RT @APompliano: The gap between the legacy financial system and the digital world is growing daily...'"
)

# Format the prompt using the Llama 3 chat template
prompt = (
    f"<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    f"Instruction: {instruction}\n\nInput: {input_text}<|eot_id|>"
    f"<|start_header_id|>assistant<|end_header_id|>\n\n"
)

# Tokenize the input (the prompt is already fully formatted above)
inputs = tokenizer(prompt, return_tensors="pt", truncation=True).to(model.device)

# Generate the prediction, stopping at either the end-of-text token
# or the Llama 3 end-of-turn token
with torch.no_grad():
    outputs = model.generate(
        input_ids=inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
        max_new_tokens=50,
        eos_token_id=[
            tokenizer.eos_token_id,
            tokenizer.convert_tokens_to_ids("<|eot_id|>"),
        ],
        do_sample=True,
        temperature=0.6,
        top_p=0.9,
    )

# Decode only the newly generated tokens, i.e. the assistant's response
prediction = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(f"Predicted Price: {prediction}")
```
## Disclaimer
This model is intended for educational and research purposes only. It is not financial advice. The predictions are based on historical data and are not guaranteed to be accurate. Do not use this model for making real-world investment decisions.