2025-04-01 20:48:01,278 INFO MainThread:2442497 [wandb_setup.py:_flush():67] Current SDK version is 0.19.8
2025-04-01 20:48:01,278 INFO MainThread:2442497 [wandb_setup.py:_flush():67] Configure stats pid to 2442497
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_setup.py:_flush():67] Loading settings from /home/yangyaodong/.config/wandb/settings
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_setup.py:_flush():67] Loading settings from /aifs4su/yaodong/hantao/align-anything/scripts/wandb/settings
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_setup.py:_flush():67] Loading settings from environment variables
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_init.py:setup_run_log_directory():647] Logging user logs to ../outputs/debug/wandb/run-20250401_204801-q89zozii/logs/debug.log
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_init.py:setup_run_log_directory():648] Logging internal logs to ../outputs/debug/wandb/run-20250401_204801-q89zozii/logs/debug-internal.log
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_init.py:init():761] calling init triggers
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_init.py:init():766] wandb.init called with sweep_config: {} config: {'train_cfgs': {'save_checkpoint': False, 'load_checkpoint': False, 'ds_cfgs': 'ds_z3_config.json', 'epochs': 3000000, 'seed': 42, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'gradient_accumulation_steps': 1, 'gradient_checkpointing': True, 'learning_rate': 1e-06, 'lr_scheduler_type': 'cosine', 'lr_warmup_ratio': 0.03, 'weight_decay': 0.0, 'adam_betas': [0.9, 0.95], 'bf16': True, 'fp16': False, 'eval_strategy': 'epoch', 'eval_interval': 10, 'regularization': 0.001, 'scale_coeff': 0.1, 'freeze_mm_proj': False, 'freeze_vision_tower': True, 'freeze_language_model': False}, 'data_cfgs': {'train_datasets': '/aifs4su/yaodong/hantao/datasets/AA_preference_vicuna-7b_cosi_cut/merged/top1-80', 'train_template': 'AA_TI2T_LLAVA', 'train_size': {}, 'train_split': 'train', 'train_name': 'text-image-to-text', 'train_data_files': {}, 'train_optional_args': [], 'eval_datasets': {}, 'eval_template': {}, 'eval_size': {}, 'eval_split': {}, 'eval_subset': {}, 'eval_data_files': {}, 'eval_optional_args': []}, 'logger_cfgs': {'log_type': 'wandb', 'log_project': 'align-anything', 'log_run_name': 'dpo', 'output_dir': '../outputs/debug', 'cache_dir': {}, 'save_total_limit': 3}, 'model_cfgs': {'model_name_or_path': '/aifs4su/yaodong/hantao/models/llava-v1.6-vicuna-7b-hf', 'trust_remote_code': True, 'model_max_length': 4096}, 'special_tokens': {}, '_wandb': {}}
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_init.py:init():784] starting backend
2025-04-01 20:48:01,279 INFO MainThread:2442497 [wandb_init.py:init():788] sending inform_init request
2025-04-01 20:48:01,286 INFO MainThread:2442497 [backend.py:_multiprocessing_setup():101] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2025-04-01 20:48:01,286 INFO MainThread:2442497 [wandb_init.py:init():798] backend started and connected
2025-04-01 20:48:01,293 INFO MainThread:2442497 [wandb_init.py:init():891] updated telemetry
2025-04-01 20:48:01,316 INFO MainThread:2442497 [wandb_init.py:init():915] communicating run to backend with 90.0 second timeout
2025-04-01 20:48:01,917 INFO MainThread:2442497 [wandb_init.py:init():990] starting run threads in backend
2025-04-01 20:48:02,296 INFO MainThread:2442497 [wandb_run.py:_console_start():2375] atexit reg
2025-04-01 20:48:02,296 INFO MainThread:2442497 [wandb_run.py:_redirect():2227] redirect: wrap_raw
2025-04-01 20:48:02,296 INFO MainThread:2442497 [wandb_run.py:_redirect():2292] Wrapping output streams.
2025-04-01 20:48:02,296 INFO MainThread:2442497 [wandb_run.py:_redirect():2315] Redirects installed.
2025-04-01 20:48:02,302 INFO MainThread:2442497 [wandb_init.py:init():1032] run started, returning control to user process