run locally on iPhones

#18
by Lxdro - opened

Is there a way to run this model locally on a phone yet, or do I need to wait for llama.cpp compatibility?

Microsoft org

@Lxdro Thanks for flagging this. You could also post a feature request in the llama.cpp repo.