Gguf?
#2
by
AlgorithmicKing
- opened
if that's possible
Yeah, I would like to use it with Ollama. Thanks!
@AlgorithmicKing I'm working on this, but it's my first time converting anything to GGUF (gguf-my-repo doesn't support the model yet, so I'll have to do it manually).
Nope, never mind. llama.cpp doesn't support this architecture either, so there's no way to do it (yet).
I'm extremely sorry. I'm not very familiar with all of this, and I'm eagerly looking forward to more help from the community!
@nieshen there is a request over on the llama.cpp GitHub page to add support for LLaDA.