License is now MIT; enable Inference API

#70
No description provided.
Microsoft org

We are targeting transformers==4.37.0 to fix this.

Phi is about to be integrated natively into transformers, and then we will be able to use the Inference API.
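
Once the native integration lands, loading Phi should no longer require `trust_remote_code`. A minimal sketch, assuming transformers >= 4.37.0 and the `microsoft/phi-2` repo id (the exact repo is not stated in this thread):

```python
# Minimal sketch: load Phi through the native transformers integration
# (assumes transformers >= 4.37.0 and the "microsoft/phi-2" repo id).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # assumption: replace with the actual repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```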

I'm already using Phi with Candle (Rust); it's very fast.

gugarosa changed pull request status to closed