Spaces: Running on Zero, 100 kB limit?
The linked models in my Space stop displaying when the size of the app.py file reaches or exceeds 100 kB.
Is there a specific size or complexity limit for app.py in Spaces that could be causing this?
@victor
Sorry I missed this; is this still an issue? (I'm not sure it's related to file size.)
I don't know whether this is a bug or a custom setting in Gradio. To better understand the situation, please check the first link. It features the Flux LoRA DLC, a collection of Flux LoRA demos. Its app.py file is 99.7 KB or less, and the linked model is visible. The model page also shows that the Space uses this model.
First Link : https://huggingface.co/spaces/prithivMLmods/FLUX-LoRA-DLC/tree/main
Now, if you look at the second link, a user duplicated the Flux LoRA DLC and added some extra LoRAs to the app.py, increasing its size to over 100 KB (specifically 104 KB). As a result, the models used in the app are not listed, and the Space no longer appears on the model page. These are the points I was trying to convey that day.
Second Link : https://huggingface.co/spaces/WatchOutForMike/DMA/tree/main
It has also disappeared from the model page, where the Space using the model is no longer listed.
I have also verified that this only happens when the app.py size exceeds 99.9 KB. Once it reaches ≥ 100 KB, this occurs with full certainty!
Ok, thanks, we are going to check this.
Hello,
Our parser system doesn't accept files larger than 100 kB, for performance reasons. For these repositories, I suggest extracting your loras dictionary into an external JSON file to reduce the size of app.py.
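The suggested fix could look like the sketch below: move the large inline LoRA dictionary out of app.py into a JSON file, then load it at startup. The file name `loras.json` and the entry fields shown are illustrative assumptions, not the actual DLC data.

```python
import json

# Illustrative sample of the kind of LoRA metadata that was inlined
# in app.py (names and fields here are made up for the example).
loras = [
    {"title": "Example LoRA", "repo": "user/example-lora", "trigger": "example style"},
]

# One-time step: write the big list out to loras.json so it no longer
# counts toward the size of app.py.
with open("loras.json", "w", encoding="utf-8") as f:
    json.dump(loras, f, indent=2)

# In app.py, replace the inline dictionary with a small loader,
# keeping the file well under the 100 kB parser limit.
with open("loras.json", "r", encoding="utf-8") as f:
    loras = json.load(f)

print(loras[0]["repo"])  # prints "user/example-lora"
```

Since only app.py is parsed for linked models, keeping the metadata in a separate JSON file should not affect which models the Space links to, as long as the repo IDs still appear in the app's usage.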
@Sylvestre
@victor
Okay, thanks for the clarification, guys!