AttributeError: 'MolformerModel' object has no attribute 'warn_if_padding_and_no_attention_mask'
#2 · opened by pamo77
Hello,
I am running into a problem when trying out the example code shown in the model card, exactly as written:
import torch
from transformers import AutoModel, AutoTokenizer
model = AutoModel.from_pretrained("ibm/MoLFormer-XL-both-10pct", deterministic_eval=True, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("ibm/MoLFormer-XL-both-10pct", trust_remote_code=True)
smiles = ["Cn1c(=O)c2c(ncn2C)n(C)c1=O", "CC(=O)Oc1ccccc1C(=O)O"]
inputs = tokenizer(smiles, padding=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
outputs.pooler_output
I am getting the following error: AttributeError: 'MolformerModel' object has no attribute 'warn_if_padding_and_no_attention_mask'
Is there a workaround for this?
Many thanks in advance.
The full error dump looks like this:
in <module>:7

     4 smiles_0 = ["Cn1c(=O)c2c(ncn2C)n(C)c1=O", "CC(=O)Oc1ccccc1C(=O)O"]
     5 inputs_0 = tokenizer_0(smiles_0, padding=True, return_tensors="pt")
     6 with torch.no_grad():
❱    7     outputs = model_0(**inputs_0)
     8 outputs.pooler_output
     9

/Users/<user>/.pyenv/versions/3.10.11/envs/my-31011-python/lib/python3.10/site-packages/torch/nn/modules/module.py:1501 in _call_impl

    1498         if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks
    1499                 or _global_backward_pre_hooks or _global_backward_hooks
    1500                 or _global_forward_hooks or _global_forward_pre_hooks):
❱   1501             return forward_call(*args, **kwargs)
    1502         # Do not call functions when jit is used
    1503         full_backward_hooks, non_full_backward_hooks = [], []
    1504         backward_pre_hooks = []

/Users/<user>/.cache/huggingface/modules/transformers_modules/ibm/MoLFormer-XL-both-10pct/7b12d946c181a37f6012b9dc3b002275de070314/modeling_molformer.py:650 in forward

     647         if input_ids is not None and inputs_embeds is not None:
     648             raise ValueError("You cannot specify both input_ids and inputs_embeds at the
     649         elif input_ids is not None:
❱    650             self.warn_if_padding_and_no_attention_mask(input_ids, attention_mask)
     651             input_shape = input_ids.size()
     652         elif inputs_embeds is not None:
     653             input_shape = inputs_embeds.size()[:-1]

/Users/<user>/.pyenv/versions/3.10.11/envs/my-31011-python/lib/python3.10/site-packages/torch/nn/modules/module.py:1614 in __getattr__

    1611             modules = self.__dict__['_modules']
    1612             if name in modules:
    1613                 return modules[name]
❱   1614         raise AttributeError("'{}' object has no attribute '{}'".format(
    1615             type(self).__name__, name))
    1616
    1617     def __setattr__(self, name: str, value: Union[Tensor, 'Module']) -> None:

AttributeError: 'MolformerModel' object has no attribute 'warn_if_padding_and_no_attention_mask'
Try upgrading transformers: warn_if_padding_and_no_attention_mask is a method of PreTrainedModel that only exists in newer releases, so the remote MoLFormer code fails when it runs on top of an older installation.
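As a minimal sketch of that fix (assuming the pip command below targets the same environment your Python session uses), upgrade and then restart the session:

pip install --upgrade transformers

Then confirm that the running session actually picks up a version that has the method:

import transformers

# The remote MoLFormer code calls self.warn_if_padding_and_no_attention_mask,
# which is defined on PreTrainedModel only in newer transformers releases.
print(transformers.__version__)
print(hasattr(transformers.PreTrainedModel, "warn_if_padding_and_no_attention_mask"))  # should be True

If upgrading is not an option, an untested stopgap (my assumption, not an official fix) is to attach a no-op replacement to the model instance before calling it. The method only emits a warning when padded input_ids arrive without an attention_mask, and the model-card example already passes the tokenizer-generated attention_mask, so skipping the check changes nothing here:

# Hypothetical workaround: give the instance a do-nothing
# warn_if_padding_and_no_attention_mask so that forward() in
# modeling_molformer.py finds it instead of raising AttributeError.
model.warn_if_padding_and_no_attention_mask = lambda input_ids, attention_mask: None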