Are there any plans for 8B or 14B models like Qwen3?
A natively smaller model would likely be more effective for individual research than relying on quantized versions of the 32B. I would also like to properly study and use EXAONE, which offers stronger performance than Qwen3, but the 32B model is simply too demanding for individual resources. If there are plans to release 8B or 14B versions, I would greatly appreciate it if you could share the expected timeline. Since the 32B model has already been trained, I assume the training pipeline and infrastructure are still in place, so releasing smaller variants such as 8B or 14B could be a practical and valuable option for the community.