Great. Now make a 128k version like they did with Mistral lately : )
#8, opened by Pumba2
					
Pumba2
	
				
		changed discussion title from
		Great. Now make 128k version like they did with Mistral lately : )
		to Great. Now make 128k version like they done with Mistral lately : )
			
Thanks for your suggestion! Do you have any advice for long-context instruction datasets?
Not advice, but a suggestion: collect a multilingual dataset. Your model is amazing at multilingual chat; it's probably the best open-source multilingual model available today, and it would be a shame to lose that in a long-context version.
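One way to act on that suggestion would be to keep per-language coverage when assembling a long-context training mix. A minimal sketch, assuming toy records and a 128k token budget (the field names and numbers are illustrative, not from this thread):

```python
# Hypothetical sketch: grouping a multilingual long-context mix by language.
# Records and the 128k token budget are illustrative assumptions.
from collections import defaultdict

MAX_TOKENS = 131072  # 128k context target

examples = [
    {"lang": "en", "text": "...", "n_tokens": 90000},
    {"lang": "de", "text": "...", "n_tokens": 120000},
    {"lang": "ru", "text": "...", "n_tokens": 150000},  # over budget, dropped
    {"lang": "de", "text": "...", "n_tokens": 40000},
]

def build_mix(examples, max_tokens=MAX_TOKENS):
    """Group examples that fit the context budget by language,
    so each language keeps some representation in the mix."""
    by_lang = defaultdict(list)
    for ex in examples:
        if ex["n_tokens"] <= max_tokens:
            by_lang[ex["lang"]].append(ex)
    return dict(by_lang)

mix = build_mix(examples)
print({lang: len(rows) for lang, rows in mix.items()})  # {'en': 1, 'de': 2}
```

In practice the per-language counts would also be balanced so high-resource languages don't crowd out the rest.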