Tutorial to Run Kontext Model Under 8GB VRAM Locally: Step-by-Step

#20
by fahdmirzac - opened

Hi,
Kudos on producing such a sublime model. I made a local installation and testing video showing how to run it with 8GB VRAM:

https://youtu.be/qB9px3iw1so?si=ONgiJRxzjUuZ0Ss_

Thanks and regards,
Fahd

Hey fahdmirzac, could you add the ComfyUI workflow from your YouTube video to the GitHub repo? That would be super helpful, as it's not yet added for the FLUX.1 Kontext dev model: https://github.com/fahdmirza/comfyuiworkflows. Thanks!
