Help a Student: How to Deploy My Fine-Tuned Model for Free?
Hello 👋
I’m currently working on my undergraduate thesis, and I’ve fine-tuned the Mistral 7B model on academic regulation data from Universitas Jenderal Achmad Yani Yogyakarta (Indonesia):
riakrst/mistral-7b-pedoman-akademik-unjaya-merged.
I would love to make this model publicly accessible via a free inference endpoint or API, especially so I can integrate it into a simple chatbot demo built with Streamlit.
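For reference, the kind of demo I'm aiming for is roughly the minimal sketch below. It assumes the model eventually becomes reachable through a hosted inference API (which is exactly what I'm asking about here); the file name and UI strings are just placeholders.

```python
# app.py — minimal Streamlit chatbot sketch (assumes the model is reachable
# through a hosted inference API such as Hugging Face Inference Providers).
import streamlit as st
from huggingface_hub import InferenceClient

MODEL_ID = "riakrst/mistral-7b-pedoman-akademik-unjaya-merged"

st.title("UNJAYA Academic Regulations Chatbot")

# Keep the conversation in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Ask about the academic regulations..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    # This call only works once the model is actually served somewhere.
    client = InferenceClient(model=MODEL_ID)
    response = client.chat_completion(
        messages=st.session_state.messages, max_tokens=512
    )
    answer = response.choices[0].message.content

    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)
```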
I’ve checked the deployment options listed on the model page (e.g., HF Inference Endpoints, SageMaker, Azure ML, Friendli), but most of them require paid plans or involve setup complexity beyond my current capabilities.
Since I’m still a student working on a research project, I’m hoping to find free and beginner-friendly solutions, or any support from providers who might be open to enabling free inference for community and educational use.
I’d be truly grateful for any suggestions, guidance, or support to help make this model accessible.
Thank you for supporting open research and student projects like this! 🙏
Hello riakrst, while I don't know how safe it is in terms of cybersecurity, I have a recommendation you might want to try; I used it to host my own app for weeks for people to use.
I don't know if you've ever heard of Cloudflare Zero Trust tunnels, but basically Cloudflare offers a way to run any application of yours locally on your machine and then create a tunnel with a random URL that carries a bit of Cloudflare branding ("trycloudflare") in it. This binds your localhost app to the internet via cloudflared. More info here: https://developers.cloudflare.com/cloudflare-one/connections/connect-networks/do-more-with-tunnels/trycloudflare/
Keep in mind you would need to have the model always running locally first, then keep a console open in which you create the tunnel. Don't press Ctrl+C in that console unless you want to shut the tunnel down. The con is that the URL is super random and weird; the pro is that it's completely free. You can also spend a bit of money on a domain and then set up a tunnel with your own domain pointing to the localhost model URL if you want.
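To make that concrete, here's a rough sketch of the order of operations, assuming cloudflared is already installed and your Streamlit demo lives in a file called app.py (both names are examples, and you could just as well run the two commands in two separate terminals):

```python
# run_demo.py — rough sketch: start the Streamlit app locally, then expose it
# through a TryCloudflare quick tunnel. Assumes `streamlit` and `cloudflared`
# are installed and the app file is app.py (hypothetical name).
import subprocess
import time

# 1. Start the Streamlit app on its default port (8501).
app = subprocess.Popen(["streamlit", "run", "app.py", "--server.port", "8501"])

# Give the app a moment to come up before opening the tunnel.
time.sleep(10)

# 2. Open a quick tunnel; cloudflared prints a random *.trycloudflare.com URL
#    that forwards to your localhost app for as long as this process runs.
tunnel = subprocess.Popen(
    ["cloudflared", "tunnel", "--url", "http://localhost:8501"]
)

try:
    # Keep everything alive; Ctrl+C here tears down both the tunnel and the app.
    app.wait()
finally:
    tunnel.terminate()
    app.terminate()
```

The random URL shows up in the cloudflared output; share that with your testers and it stays reachable for as long as both processes keep running.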