Phi models - are they available via CDN at all?
Sorry if this is a silly question. I've tried to dig into this, but I'm not sure whether my code just isn't working or whether I've misunderstood how the models are listed.
The Phi models seem incredible for free, lightweight models that run in the browser. I can see them referenced on:
- the Transformers.js supported models page: https://huggingface.co/docs/transformers.js/en/index
- the @xenova/transformers package page on jsDelivr: https://www.jsdelivr.com/package/npm/@xenova/transformers
- the filtered list of Transformers.js-supported models on the Hub: https://huggingface.co/models?library=transformers.js&sort=trending&search=phi
I also had a look inside the bundled library on the jsDelivr CDN at https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.2/dist/transformers.js and found specific mentions of Phi:
/* harmony export */ "PhiForCausalLM": () => (/* binding */ PhiForCausalLM),
/* harmony export */ "PhiModel": () => (/* binding */ PhiModel),
/* harmony export */ "PhiPreTrainedModel": () => (/* binding */ PhiPreTrainedModel),
I've been trying to use them with a vanilla script import from the CDN, but I keep hitting "not found" or "model not supported" errors, and I don't know whether:
- my code is just bad, or
- they're not supported via the CDN yet.
Any pointers would be amazing!
My current code is:

import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@latest';

async function runInference() {
  const inputText = document.getElementById('input-text').value;
  const outputElement = document.getElementById('output');

  // Clear previous output
  outputElement.textContent = 'Loading model and running inference...';

  try {
    // Use the pipeline function to load the Phi model
    const generator = await pipeline('text-generation', '{model location}');

    // Run generation and display the result
    const result = await generator(inputText, { max_new_tokens: 100 });
    outputElement.textContent = result[0].generated_text;
  } catch (err) {
    outputElement.textContent = `Error: ${err.message}`;
  }
}
I've tried the following values (verbatim) for the model location. Occasionally I've had 404 Not Found responses, occasionally "not authorised" responses, but the most promising error I've got is the "model not supported" one, because that suggests the script at least has an idea of what I'm asking for (there's a quick sanity check sketched after the list):
- onnx-community/Phi-3.5-mini-instruct-onnx-web
- Xenova/Phi-3-mini-4k-instruct
- microsoft/Phi-3-mini-4k-instruct-onnx-web
- Xenova/tiny-random-PhiForCausalLM
- Xenova/phi-1_5_dev
- BricksDisplay/phi-1_5
- BricksDisplay/phi-1_5-q4
- BricksDisplay/phi-1_5-bnb4
- Xenova/Phi-3-mini-4k-instruct_fp16
- Xenova/tiny-random-LlavaForConditionalGeneration_phi
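As a sanity check on the 404 vs "not supported" distinction, here's a minimal sketch of a fetch test. checkModelRepo is just a hypothetical helper name, but config.json and model_type are what Transformers.js actually downloads and dispatches on:

// Distinguish "repo or file missing" (HTTP 404) from "repo exists but this
// library version doesn't support the architecture" (model_type mismatch).
async function checkModelRepo(modelId) {
  // Every loadable Hub repo exposes a config.json at this resolve URL
  const res = await fetch(`https://huggingface.co/${modelId}/resolve/main/config.json`);
  if (!res.ok) return `HTTP ${res.status}: repo or file not found`;
  const config = await res.json();
  // Transformers.js matches model_type against its supported-model list
  return `model_type: ${config.model_type}`;
}

// Example: await checkModelRepo('Xenova/Phi-3-mini-4k-instruct');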
Thanks very much in advance for your help!
That's because you're using v2 of Transformers.js instead of v3.
Thanks for coming back @BoscoTheDog, appreciate it! That's the context I was missing! It looks like there isn't a Xenova CDN-hosted version of the library that is >= v3, so it's a case of either waiting for that or switching to the main Hugging Face library and installing it myself with npm.
Thanks for your response - saved me ages of barking up the wrong tree.
It is on the CDN, actually:
https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.0-alpha.17
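For example, you can import the v3 alpha straight from that URL. A minimal sketch (the model ID is one of the onnx-community checkpoints you already listed; the prompt and max_new_tokens value are just illustrative, and the v3 API may still shift while it's in alpha):

import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.0-alpha.17';

// Load a browser-ready Phi checkpoint from the onnx-community namespace
const generator = await pipeline('text-generation', 'onnx-community/Phi-3.5-mini-instruct-onnx-web');

// Generate a short completion (top-level await works in module scripts)
const output = await generator('Tell me about Phi models.', { max_new_tokens: 64 });
console.log(output[0].generated_text);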
You can also build it yourself with:
git clone -b v3 https://github.com/xenova/transformers.js.git; cd transformers.js; npm i; npm run build
And then it's in the dist folder.
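If you go the build-it-yourself route, you can then import the bundle from that folder. A sketch, assuming the cloned repo sits next to your page and the dist filename matches the v2 naming (it may differ between builds):

import { pipeline } from './transformers.js/dist/transformers.js';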
Also, switch to using the onnx-community models, not the webml-community ones.