runtime error
Downloading (…)7d5fc9263bc9fca8bdb1: 100%|██████████| 4.48G/4.48G [01:15<00:00, 59.6MB/s]
Downloading shards: 100%|██████████| 2/2 [03:55<00:00, 117.79s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 12, in <module>
    pipeline = transformers.pipeline(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model tiiuae/falcon-7b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>,).
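The traceback above points at the `transformers.pipeline(...)` call in app.py. A likely cause (an assumption, not confirmed by the log): tiiuae/falcon-7b ships its own modeling code on the Hub, and transformers releases without native Falcon support can only load it when `trust_remote_code=True` is passed; upgrading transformers to a version with built-in Falcon support is the alternative fix. The sketch below shows one way the call could be rewritten — the helper name `falcon_pipeline_kwargs` and the extra `torch_dtype`/`device_map` settings are illustrative choices, not taken from the original app.py:

```python
# Sketch of a possible fix for the failing pipeline() call in app.py.
# Assumption: the ValueError comes from Falcon's custom modeling code
# being rejected because trust_remote_code was not enabled.

def falcon_pipeline_kwargs(model_id="tiiuae/falcon-7b"):
    """Keyword arguments for transformers.pipeline(**kwargs)."""
    return {
        "task": "text-generation",
        "model": model_id,
        # Allow the Hub repo's own modeling code to run:
        "trust_remote_code": True,
        # Load weights in the checkpoint's native dtype instead of fp32:
        "torch_dtype": "auto",
        # Place weights across available devices (requires `accelerate`):
        "device_map": "auto",
    }

def build_pipeline():
    # Imported lazily so the sketch stays importable without transformers.
    import transformers
    return transformers.pipeline(**falcon_pipeline_kwargs())
```

If the Space must keep its pinned transformers version, only the `trust_remote_code=True` line is essential; the dtype and device-map settings just reduce the memory footprint of a 7B checkpoint.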