ONNX model doesn't output 'last_hidden_state' field

#50
by redeve - opened

Hi, thank you for the public model. I tried to use the ONNX model, but I encountered the following error. Any suggestions would be appreciated. Thanks.

Example code:
https://colab.research.google.com/drive/1ZpdIcRUIaXHx55VsCwHHVb3PuJ-uqEwz?usp=sharing

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-7-1b1b7a631d99> in <cell line: 18>()
     16 # encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
     17 
---> 18 model_output_ort = model_ort(**encoded_input)
     19 # Compute token embeddings
     20 with torch.no_grad():

1 frames
/usr/local/lib/python3.10/dist-packages/optimum/onnxruntime/modeling_ort.py in forward(self, input_ids, attention_mask, token_type_ids, **kwargs)
   1009             outputs = self.model.run(None, onnx_inputs)
   1010 
-> 1011             last_hidden_state = outputs[self.output_names["last_hidden_state"]]
   1012             if use_torch:
   1013                 last_hidden_state = torch.from_numpy(last_hidden_state).to(self.device)

KeyError: 'last_hidden_state'

model_ort.output_names is {'token_embeddings': 0, 'sentence_embedding': 1}.
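For reference, the exported graph can also be run directly with onnxruntime, reading the outputs it actually declares (token_embeddings / sentence_embedding) instead of last_hidden_state. A minimal sketch, assuming the export is saved locally as model.onnx (the path is an assumption):

import onnxruntime as ort
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-m3')
session = ort.InferenceSession('model.onnx')  # path is an assumption

encoded = tokenizer(['What is BGE M3?'], padding=True, truncation=True, return_tensors='np')
# Only feed the inputs the graph declares, to avoid invalid-input errors.
input_names = {i.name for i in session.get_inputs()}
onnx_inputs = {k: v for k, v in encoded.items() if k in input_names}

outputs = session.run(None, onnx_inputs)
output_names = [o.name for o in session.get_outputs()]  # e.g. ['token_embeddings', 'sentence_embedding']
sentence_embedding = outputs[output_names.index('sentence_embedding')]
print(sentence_embedding.shape)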

Beijing Academy of Artificial Intelligence org

Hi @redeve, you can use the following code to use the ONNX model:

from optimum.onnxruntime import ORTModelForFeatureExtraction

model_ort = ORTModelForFeatureExtraction.from_pretrained('BAAI/bge-m3', from_transformers=True)
model_output_ort = model_ort(**encoded_input)
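For completeness, a hedged end-to-end sketch of this suggestion, with tokenization and a pooling step added (the CLS pooling and L2 normalization below are assumptions about how the dense embedding is typically taken, not part of the original reply):

import torch
from optimum.onnxruntime import ORTModelForFeatureExtraction
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-m3')
# Re-export on the fly from the PyTorch weights so the graph exposes last_hidden_state.
model_ort = ORTModelForFeatureExtraction.from_pretrained('BAAI/bge-m3', from_transformers=True)

encoded_input = tokenizer(['What is BGE M3?'], padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
    model_output_ort = model_ort(**encoded_input)

# CLS-token pooling and normalization to unit length (assumed, matching common BGE usage).
embeddings = model_output_ort.last_hidden_state[:, 0]
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)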

Thanks, I used your suggestion to re-export the ONNX model, and it worked fine.

redeve changed discussion status to closed

When I run your code, I get the following:
Framework not specified. Using pt to export the model.
Using the export variant default. Available variants are:
- default: The default ONNX variant.

***** Exporting submodel 1/1: XLMRobertaModel *****
Using framework PyTorch: 2.3.0+cu121
Overriding 1 configuration item(s)
- use_cache -> False
Saving external data to one file...
/tmp/tmprm5mpyzi
/tmp/tmprm5mpyzi
/tmp/tmprm5mpyzi

I do not know what "Saving external data to one file..." is for, and oddly enough, /tmp/tmprm5mpyzi doesn't exist: No such file or directory.
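For what it's worth, the "Saving external data to one file" message appears because the model is larger than the 2 GB protobuf limit, so the ONNX export stores its weights in a separate external-data file; with from_transformers=True the export likely happens in a temporary directory, which would explain why the /tmp path is gone afterwards. A minimal sketch, assuming the goal is to persist the export to a real directory and reload it later (the folder name bge-m3-onnx is arbitrary):

from optimum.onnxruntime import ORTModelForFeatureExtraction

# Export from the PyTorch weights (this is what prints the export log above).
model_ort = ORTModelForFeatureExtraction.from_pretrained('BAAI/bge-m3', from_transformers=True)

# Persist the ONNX graph and its external weight data to a real directory.
model_ort.save_pretrained('bge-m3-onnx')

# Later runs can load the saved export directly, without re-exporting.
model_ort = ORTModelForFeatureExtraction.from_pretrained('bge-m3-onnx')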

Will ORTModelForFeatureExtraction.from_pretrained('BAAI/bge-m3', from_transformers=True) automatically search the onnx directory and run with the ONNX format?
