License is now MIT; enable Inference API
#70 by mrfakename - opened
No description provided.
:)
;)
We are targeting transformers==4.37.0 to fix this.
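A minimal sketch of pinning the release mentioned above, assuming a standard pip-based environment (the exact version comes from the comment; everything else here is illustrative):

```shell
# Pin transformers to the release that is expected to include native Phi support
pip install "transformers==4.37.0"
```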
Phi is about to be natively integrated into transformers, and then we will be able to use the Inference API.
I'm already using Phi in Candle (Rust); it's very fast.
gugarosa changed pull request status to closed
Thanks!