fix pipeline
TypeError Traceback (most recent call last)
<ipython-input-10-62dd828076db> in <cell line: 1>()
----> 1 pipe("i love you")
/usr/local/lib/python3.10/dist-packages/transformers/pipelines/text_classification.py in __call__(self, *args, **kwargs)
154 If `top_k` is used, one such dictionary is returned per label.
155 """
--> 156 result = super().__call__(*args, **kwargs)
157 # TODO try and retrieve it in a nicer way from _sanitize_parameters.
158 _legacy = "top_k" not in kwargs
/usr/local/lib/python3.10/dist-packages/transformers/pipelines/base.py in __call__(self, inputs, num_workers, batch_size, *args, **kwargs)
1138 )
1139 else:
-> 1140 return self.run_single(inputs, preprocess_params, forward_params, postprocess_params)
1141
1142 def run_multi(self, inputs, preprocess_params, forward_params, postprocess_params):
/usr/local/lib/python3.10/dist-packages/transformers/pipelines/base.py in run_single(self, inputs, preprocess_params, forward_params, postprocess_params)
1144
1145 def run_single(self, inputs, preprocess_params, forward_params, postprocess_params):
-> 1146 model_inputs = self.preprocess(inputs, **preprocess_params)
1147 model_outputs = self.forward(model_inputs, **forward_params)
1148 outputs = self.postprocess(model_outputs, **postprocess_params)
/usr/local/lib/python3.10/dist-packages/transformers/pipelines/text_classification.py in preprocess(self, inputs, **tokenizer_kwargs)
178 ' dictionary `{"text": "My text", "text_pair": "My pair"}` in order to send a text pair.'
179 )
--> 180 return self.tokenizer(inputs, return_tensors=return_tensors, **tokenizer_kwargs)
181
182 def _forward(self, model_inputs):
TypeError: 'NoneType' object is not callable
seems like the pipeline object has no tokenizer. if the model does ship one, you need to attach it explicitly, e.g. by setting the pipeline's `tokenizer` attribute (or passing `tokenizer=...` when constructing it)
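For reference, the traceback bottoms out at `self.tokenizer(inputs, ...)` in `preprocess`, so if the pipeline's tokenizer was never set, Python raises exactly this `TypeError`. A minimal stand-alone reproduction of that failure mode (the `StubPipeline` class is just an illustration, not transformers code):

```python
# Minimal reproduction: calling an attribute that was left as None.
class StubPipeline:
    def __init__(self, tokenizer=None):
        # transformers leaves this None when it can't infer a tokenizer
        self.tokenizer = tokenizer

    def preprocess(self, text):
        # mirrors text_classification.py: self.tokenizer(inputs, ...)
        return self.tokenizer(text)

try:
    StubPipeline().preprocess("i love you")
except TypeError as e:
    print(e)  # 'NoneType' object is not callable
```

The fix is to make sure a tokenizer is actually attached, e.g. something like `pipeline("text-classification", model=..., tokenizer=AutoTokenizer.from_pretrained(...))` with the real checkpoint id.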
i just tried that, but it didn't work. i was postponing finishing the pipeline because i don't know what the 4 categories in this model stand for, but i'll just add some placeholders for that for now.
also, i did fix it in 5cf6da561def4284f2c6cbcd7e12c6f791443543, but i disabled it; not sure why.
i'll try to debug this further; otherwise, writing a custom pipeline is another way to fix it: https://huggingface.co/docs/transformers/add_new_pipeline
scratch that, i'll need to follow https://huggingface.co/docs/transformers/add_new_pipeline to make it work
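From that guide, the custom-pipeline route boils down to subclassing `Pipeline` and registering it. A rough sketch, not the repo's actual code (the task name `my-text-classification` and the argmax-over-4-labels postprocessing are assumptions):

```python
from transformers import AutoModelForSequenceClassification, Pipeline
from transformers.pipelines import PIPELINE_REGISTRY

class FourWayClassificationPipeline(Pipeline):
    def _sanitize_parameters(self, **kwargs):
        # no extra parameters: empty dicts for preprocess / forward / postprocess
        return {}, {}, {}

    def preprocess(self, inputs):
        # this is the call that blew up before: self.tokenizer must not be None
        return self.tokenizer(inputs, return_tensors="pt")

    def _forward(self, model_inputs):
        return self.model(**model_inputs)

    def postprocess(self, model_outputs):
        # pick the highest-scoring of the 4 categories by label name
        best = model_outputs.logits.argmax(-1).item()
        return self.model.config.id2label[best]

# register under a (hypothetical) task name so
# pipeline("my-text-classification", model=...) resolves to this class
PIPELINE_REGISTRY.register_pipeline(
    "my-text-classification",
    pipeline_class=FourWayClassificationPipeline,
    pt_model=AutoModelForSequenceClassification,
)
```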
@parsee-mizuhashi fixed ✅
false alarm, forget about that
@parsee-mizuhashi
finally 😭, there's still some stuff i don't understand about the original model, but you can now use it via pipeline ( •̀ ω •́ )y