ONNX model doesn't output a 'last_hidden_state' field
#50 · opened by redeve
Hi, thank you for the public model. I tried to use the ONNX model, but I encountered the following error. I would be grateful for any suggestions. Thanks.
Example code:
https://colab.research.google.com/drive/1ZpdIcRUIaXHx55VsCwHHVb3PuJ-uqEwz?usp=sharing
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-7-1b1b7a631d99> in <cell line: 18>()
16 # encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
17
---> 18 model_output_ort = model_ort(**encoded_input)
19 # Compute token embeddings
20 with torch.no_grad():
1 frames
/usr/local/lib/python3.10/dist-packages/optimum/onnxruntime/modeling_ort.py in forward(self, input_ids, attention_mask, token_type_ids, **kwargs)
1009 outputs = self.model.run(None, onnx_inputs)
1010
-> 1011 last_hidden_state = outputs[self.output_names["last_hidden_state"]]
1012 if use_torch:
1013 last_hidden_state = torch.from_numpy(last_hidden_state).to(self.device)
KeyError: 'last_hidden_state'
The model_ort.output_names is {'token_embeddings': 0, 'sentence_embedding': 1}, so there is no 'last_hidden_state' entry.
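That mapping explains the KeyError: optimum's ORTModelForFeatureExtraction indexes the session outputs with the fixed key "last_hidden_state", which only exists if the ONNX graph was exported with an output of that name. A minimal sketch reproducing the lookup failure (the dict is copied from the thread; the output list is a placeholder for the actual tensors):

```python
# Output-name mapping reported in the thread: this model was exported
# with sentence-transformers-style output names, not "last_hidden_state".
output_names = {"token_embeddings": 0, "sentence_embedding": 1}

# Placeholders standing in for the tensors returned by self.model.run(...).
outputs = ["<token_embeddings tensor>", "<sentence_embedding tensor>"]

try:
    # This is the line that fails inside optimum's modeling_ort.py.
    last_hidden_state = outputs[output_names["last_hidden_state"]]
except KeyError as e:
    print(f"KeyError: {e}")
```

Re-exporting the model through optimum (as suggested below in the thread) produces a graph whose outputs include "last_hidden_state", so the lookup succeeds.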
Hi @redeve, you can use the following code to load the ONNX model:
model_ort = ORTModelForFeatureExtraction.from_pretrained('BAAI/bge-m3', from_transformers=True)
model_output_ort = model_ort(**encoded_input)
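Once the re-exported model returns a last_hidden_state, BGE-style dense embeddings are commonly obtained by CLS pooling followed by L2 normalization (this pooling choice is a common convention for BGE models, not something stated in this thread). A numpy sketch on a dummy array standing in for model_output_ort.last_hidden_state:

```python
import numpy as np

# Dummy stand-in for model_output_ort.last_hidden_state:
# shape (batch, seq_len, hidden) = (2, 4, 8).
rng = np.random.default_rng(0)
last_hidden_state = rng.normal(size=(2, 4, 8))

# CLS pooling: take the first token's vector for each sequence,
# then L2-normalize so cosine similarity reduces to a dot product.
cls = last_hidden_state[:, 0, :]
embeddings = cls / np.linalg.norm(cls, axis=1, keepdims=True)

print(embeddings.shape)  # (2, 8)
print(np.allclose(np.linalg.norm(embeddings, axis=1), 1.0))  # True
```

With the real model, `last_hidden_state` would come from `model_output_ort` after tokenizing the queries as in the notebook above.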
Thanks, I followed your suggestion and re-exported the ONNX model, and it works fine now.
redeve changed discussion status to closed