runtime error
Exit code: 1. Reason:

A new version of the following files was downloaded from https://huggingface.co/tiiuae/falcon-7b-instruct:
- configuration_falcon.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.

WARNING: You are currently loading Falcon using legacy code contained in the model repository. Falcon has now been fully ported into the Hugging Face transformers library. For the most up-to-date and high-performance version of the Falcon model code, please update to the latest version of transformers and then load the model without the trust_remote_code=True argument.

A new version of the following files was downloaded from https://huggingface.co/tiiuae/falcon-7b-instruct:
- modeling_falcon.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.

Downloading shards: 100%|██████████| 2/2 [00:22<00:00, 11.12s/it]
Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 51781.53it/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 5, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 559, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4303, in from_pretrained
    dispatch_model(model, **device_map_kwargs)
  File "/usr/local/lib/python3.10/site-packages/accelerate/big_modeling.py", line 496, in dispatch_model
    raise ValueError(
ValueError: You are trying to offload the whole model to the disk. Please use the `disk_offload` function instead.
Container logs:
Fetching error logs...
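
The traceback shows accelerate's dispatch_model refusing to run because device_map="auto" resolved every layer of Falcon-7B to disk on this hardware. Below is a minimal sketch of how the load call in app.py could be adjusted, not the Space author's actual code: it assumes a recent transformers release with native Falcon support (so trust_remote_code=True is dropped, per the warning above) and assumes the host has enough CPU RAM to hold the 7B weights in bfloat16.

# Minimal sketch of one possible fix; assumes recent transformers with native
# Falcon support and enough CPU RAM for the model in bfloat16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-7b-instruct"

# Falcon is now part of transformers, so trust_remote_code=True is no longer
# needed; a specific commit could also be pinned via revision="<commit-hash>"
# (placeholder) to avoid re-downloading files, as the warning suggests.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Omitting device_map="auto" keeps accelerate from trying to dispatch the whole
# model to disk (the ValueError above); bfloat16 halves the memory footprint and
# low_cpu_mem_usage avoids holding a second full copy during loading.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
)

If the model genuinely cannot fit in the Space's memory, the error message's own suggestion applies instead: offload the loaded weights with accelerate's disk_offload function, or upgrade the Space to hardware with a GPU or more RAM.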