Why was the embedding module removed?
Thank you very much for your great work on this application!
I noticed that the embedding module was removed, and I’m wondering about the reason for this design choice.
In addition, when I tested the L16 version of the model, I observed NaN values appearing during inference. Have you encountered the same issue, and if so, could you please share how you solved it?
Hi
@Lqqs
I removed the embedding module because it did not produce meaningful vectors for comparing inputs.
I also switched the second pooling option to CLS-token-only, which is the recommended choice for DINOv3 models since they are ViT-based. The NaN values you saw in the embedding tab with the L16 model were caused by model differences; you should no longer see them with the ViT L16 model.
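For reference, here is a minimal sketch of the difference between CLS-token pooling and mean pooling over patch tokens for a ViT-style model. The function names and the dummy tensor shapes are illustrative, not the app's actual code:

```python
import torch
import torch.nn.functional as F

def cls_embedding(hidden_states: torch.Tensor) -> torch.Tensor:
    """Use the CLS token (position 0) as the image embedding,
    L2-normalized so embeddings are comparable across inputs."""
    cls = hidden_states[:, 0, :]  # (batch, dim)
    return F.normalize(cls, dim=-1)

def mean_embedding(hidden_states: torch.Tensor) -> torch.Tensor:
    """Mean pooling over the patch tokens (CLS dropped), for comparison."""
    patches = hidden_states[:, 1:, :]  # (batch, num_patches, dim)
    return F.normalize(patches.mean(dim=1), dim=-1)

# Dummy hidden states shaped like a ViT output: (batch, 1 + num_patches, dim)
hs = torch.randn(2, 257, 1024)
emb = cls_embedding(hs)
print(emb.shape)  # torch.Size([2, 1024])
```

Both functions return unit-norm vectors, so cosine similarity reduces to a dot product.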
Thanks! The code is really great. Once again, I respect your work.