JavaScript does not natively support the 64-bit integers required by the model?
Hello, I tried to use the provided ONNX models (both the standard and quantized versions). The models seem to require 64-bit integers, which are not supported in JavaScript.
Am I missing something?
Inference demo here: https://github.com/flores-o/onnxruntime-web-demo/issues/1
Inference error: Error: failed to call OrtRun(). ERROR_CODE: 2, ERROR_MESSAGE: Unexpected input data type. Actual: (tensor(int32)) , expected: (tensor(int64))
> The models seem to require 64-bit integers, which are not supported in JavaScript.
They are, actually: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt64Array :)
Thank you @Xenova! Resolved.
Working inference demo: https://github.com/flores-o/onnxruntime-web-demo/blob/main/index.html
Great! Just FYI, you can use transformers.js to run the model too with far fewer lines:
```js
import { pipeline } from "@xenova/transformers";

// Create a sentiment analysis pipeline
const classifier = await pipeline('sentiment-analysis', 'Xenova/distilbert-base-uncased-finetuned-sst-2-english');

// Classify input text
const output = await classifier('I love transformers!');
console.log(output);
// [{ label: 'POSITIVE', score: 0.999788761138916 }]

// Classify input text (and return all classes)
const output2 = await classifier('I love transformers!', { topk: null });
console.log(output2);
// [
//   { label: 'POSITIVE', score: 0.999788761138916 },
//   { label: 'NEGATIVE', score: 0.00021126774663571268 }
// ]
```
We use onnxruntime-web behind the scenes, and we also handle model caching (to prevent re-downloading on refresh).