For more information (including how to compress models yourself), check out https://huggingface.co/DFloat11 and https://github.com/LeanModels/DFloat11

Feel free to request compression of other models as well, although compressing models that do not use the Flux architecture might be slightly tricky for me.

How to Use

ComfyUI

Follow the instructions here: https://github.com/LeanModels/ComfyUI-DFloat11. After installing the DF11 custom node, use the provided workflow JSON, or simply replace the "Load Diffusion Model" node in an existing Flux workflow with the "DFloat11 Model Loader" node. If you run into any issues, feel free to leave a comment. The workflow is also embedded in the PNG image below.
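
If you prefer to patch a workflow programmatically instead of in the UI, the sketch below shows roughly what the swap looks like for an API-format workflow JSON. The file names, the DFloat11 loader's class_type string ("DFloat11ModelLoader"), and the input key are assumptions on my part; check the ComfyUI-DFloat11 repo for the exact node definition.

import json

# Minimal sketch: swap the stock "Load Diffusion Model" node (class_type "UNETLoader")
# for the DFloat11 loader in an API-format workflow JSON.
with open("flux_workflow_api.json") as f:  # hypothetical file name
    workflow = json.load(f)

for node in workflow.values():
    if node.get("class_type") == "UNETLoader":
        node["class_type"] = "DFloat11ModelLoader"  # assumed class_type; verify against the custom node
        node["inputs"] = {"model_path": "FLUX.1-dev-DF11"}  # assumed input name and value

with open("flux_workflow_df11_api.json", "w") as f:
    json.dump(workflow, f, indent=2)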

diffusers

Refer to this model instead.
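
For reference, loading a DF11 model in diffusers generally follows the usage pattern documented in the DFloat11 repo; a minimal sketch is below. The DF11 repo ID shown is an assumption here, so refer to the model linked above for the authoritative instructions.

import torch
from diffusers import FluxPipeline
from dfloat11 import DFloat11Model

# Minimal sketch following the DFloat11 project's documented usage pattern.
# The DF11 repo ID below is an assumption; use the diffusers-compatible model referenced above.
pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16)
pipe.enable_model_cpu_offload()

DFloat11Model.from_pretrained(
    "DFloat11/FLUX.1-dev-DF11",
    device="cpu",
    bfloat16_model=pipe.transformer,
)

image = pipe("a photo of an astronaut riding a horse", num_inference_steps=28).images[0]
image.save("astronaut.png")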

Compression Details

This is the pattern_dict for compressing Flux-based models in ComfyUI:

pattern_dict_comfyui = {
    "double_blocks\.\d+": (
        "img_mod.lin",
        "img_attn.qkv",
        "img_attn.proj",
        "img_mlp.0",
        "img_mlp.2",
        "txt_mod.lin",
        "txt_attn.qkv",
        "txt_attn.proj",
        "txt_mlp.0",
        "txt_mlp.2",
    ),
    "single_blocks\.\d+": (
        "linear1",
        "linear2",
        "modulation.lin",
    ),
}
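
To make the selection logic concrete, the sketch below (not part of the DFloat11 API, just an illustration) shows how the pattern_dict is interpreted: each key is a regular expression matched against block names, and each tuple entry names a linear submodule inside the matched block whose BFloat16 weights are losslessly compressed.

import re

def collect_compression_targets(model, pattern_dict):
    # Illustrative only: enumerate the fully qualified names of the layers
    # that a pattern_dict selects for DFloat11 compression.
    targets = []
    for name, _ in model.named_modules():
        for block_pattern, submodules in pattern_dict.items():
            if re.fullmatch(block_pattern, name):
                targets.extend(f"{name}.{sub}" for sub in submodules)
    return targets

# Example (assumes `flux_model` is an already-loaded Flux model instance):
# print(collect_compression_targets(flux_model, pattern_dict_comfyui))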