BugFix: AttributeError: 'InternVLChatConfig' object has no attribute 'llm_config'

#29

When loading the InternVideo2_5_Chat_8B model or InternVL2_5-series models, the following error is raised:
AttributeError: 'InternVLChatConfig' object has no attribute 'llm_config'
Please refer to: https://huggingface.co/OpenGVLab/InternVideo2_5_Chat_8B/discussions/14

Simple Solution for InternVL Configuration Issue

(Tested with transformers v4.52.4)

Required Modifications

  1. Add Initialization (configuration_internvl_chat.py:49)

    self.vision_config = InternVisionConfig(**vision_config)
    self.llm_config = None  # Initialize llm_config to prevent AttributeError
    
  2. Add None Check (configuration_internvl_chat.py:85)

    output['llm_config'] = self.llm_config.to_dict() if self.llm_config is not None else {}
    

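To make the placement of the two modifications concrete, here is a simplified, self-contained stand-in for the config class. This is only a sketch (not the upstream configuration_internvl_chat.py), and it uses plain PretrainedConfig objects in place of the real vision/LLM sub-configs; the two commented lines correspond to the modifications above:

    import copy
    from transformers import PretrainedConfig

    class MinimalChatConfig(PretrainedConfig):
        """Simplified stand-in for InternVLChatConfig (most fields omitted)."""
        model_type = 'minimal_chat'

        def __init__(self, vision_config=None, llm_config=None, **kwargs):
            super().__init__(**kwargs)
            self.vision_config = PretrainedConfig(**(vision_config or {}))
            self.llm_config = None  # Modification 1: always define the attribute
            if llm_config is not None:
                self.llm_config = PretrainedConfig(**llm_config)

        def to_dict(self):
            output = copy.deepcopy(self.__dict__)
            output['vision_config'] = self.vision_config.to_dict()
            # Modification 2: fall back to {} when no LLM config was provided
            output['llm_config'] = self.llm_config.to_dict() if self.llm_config is not None else {}
            output['model_type'] = self.__class__.model_type
            return output

    # A default-constructed instance now serializes without errors:
    print(MinimalChatConfig().to_dict()['llm_config'])  # -> {}
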
Root Cause Analysis

When executing:

model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda().to(torch.bfloat16)

The following occurs:

  1. The Hugging Face framework downloads and imports the remote configuration_internvl_chat.py (because trust_remote_code=True)
  2. During config initialization (transformers/configuration_utils.py:816-822):
    config_dict = self.to_dict()
    
    # Get the default config dict (from a fresh PretrainedConfig instance)
    default_config_dict = PretrainedConfig().to_dict()
    
    # get class specific config dict
    class_config_dict = self.__class__().to_dict() if not self.has_no_defaults_at_init else {}
    
  3. Key Issue:
    • To build class_config_dict, the framework constructs a fresh config via self.__class__(), i.e. with llm_config=None, so __init__ never assigns self.llm_config
    • Without explicit initialization, the attribute does not exist on that instance, and the custom to_dict() raises AttributeError as soon as it accesses self.llm_config (see the minimal illustration below)
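
A minimal, self-contained illustration of this failure mode (again a simplified stand-in, not the actual InternVLChatConfig code): when the attribute is only assigned conditionally, a default-constructed instance never gets it, and the custom to_dict() raises exactly the reported AttributeError:

    import copy
    from transformers import PretrainedConfig

    class UnpatchedChatConfig(PretrainedConfig):
        """Stand-in for the config *before* the two modifications above."""
        model_type = 'unpatched_chat'

        def __init__(self, llm_config=None, **kwargs):
            super().__init__(**kwargs)
            if llm_config is not None:  # attribute assigned only when a config is passed
                self.llm_config = PretrainedConfig(**llm_config)

        def to_dict(self):
            output = copy.deepcopy(self.__dict__)
            output['llm_config'] = self.llm_config.to_dict()  # no guard, no default
            return output

    UnpatchedChatConfig(llm_config={}).to_dict()  # works: the attribute was assigned
    UnpatchedChatConfig().to_dict()
    # AttributeError: 'UnpatchedChatConfig' object has no attribute 'llm_config'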

Why the Fix Works

  1. The initialization guarantees that self.llm_config always exists, if only as None
  2. The None check avoids calling .to_dict() on None while preserving the expected dictionary structure (llm_config serializes as an empty dict); see the quick check below
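
As a quick check with the stand-in classes above (simplified sketches rather than the upstream file), the patched version survives the same default-construction path that to_diff_dict() exercises, which is where the unpatched variant fails:

    cfg = MinimalChatConfig()           # default construction, as in self.__class__()
    cfg.to_diff_dict()                  # completes; this raises for UnpatchedChatConfig()
    print(cfg.to_dict()['llm_config'])  # -> {} (the guarded fallback)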
