
Model not available

#1 by mbx123 - opened

Hey, I receive this error when trying to use the model. Am I doing something wrong, or is this model not yet available? I used the suggested implementation from the model card:

```
  File "/pkg/modal/_runtime/container_io_manager.py", line 728, in handle_input_exception
    yield
  File "/pkg/modal/_container_entrypoint.py", line 240, in run_input_sync
    res = io_context.call_finalized_function()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/pkg/modal/_runtime/container_io_manager.py", line 180, in call_finalized_function
    res = self.finalized_function.callable(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/main_basic.py", line 195, in pentest
    model = AutoModelForCausalLM.from_pretrained(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 1112, in from_pretrained
    raise ValueError(
ValueError: Unrecognized model in Canstralian/pentest_ai. Should have a `model_type` key in its config.json, or contain one of the following strings in its name: [...]
```
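For anyone hitting the same thing: the traceback says the repo's `config.json` has no `model_type` key, so `AutoConfig` cannot map it to an architecture. A minimal sketch of the check that fails (the config dict below is a hypothetical stand-in, not the repo's real contents):

```python
# Hypothetical stand-in for a config.json that lacks `model_type`,
# which is what the ValueError in the traceback implies.
config = {"hidden_size": 4096, "num_hidden_layers": 32}

def has_model_type(cfg: dict) -> bool:
    # AutoConfig.from_pretrained can only resolve an architecture when
    # `model_type` is present (or the repo name matches a known pattern).
    return "model_type" in cfg

print(has_model_type(config))  # -> False: this is why loading fails
```

So this is a problem with the repo's files rather than with your calling code; until the maintainer adds a `model_type` entry to `config.json`, `AutoModelForCausalLM.from_pretrained` will keep raising this error.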