Small Models
When are new small models being released? Any plans for a 70b-100b MoE model?
Man a 70 or 100b would be so dang nice. These 200+b models are going to push me to upgrade my system sooner rather than later lol.
Don't. Remember the old Llama 2 70B, which is already outperformed by much smaller models. These big models are all flagship, frontier models, but sooner or later they all get beaten by something smaller built on more sophisticated techniques. The future is in smaller, more capable models.
Yeah, I'm waiting for that "unicorn" model that entices me to actually spend the GPU money to upgrade. So far only models at Kimi2 or deepseek sizes have that level of performance, but no amount of GPU upgrades would let me run them anyway. I'm waiting for something like a 70b that blows me away, where I can finally say "this is good enough for everything I need LLMs for" — that's when I'll upgrade. If these benchmarks are real, this one would be pretty darn close. It just needs to come in a slightly smaller size.
I guess there isn't much incentive to release new smaller models until someone beats the performance of the currently released ones ;) A 70b-100b MoE would be very interesting though!