Can't load the model for 'valentinafeve/yolos-fashionpedia'

#4
by paradigmshiftatx - opened

Hi there,

I'm trying to use your model locally with transformers, but I get the following error during execution. It doesn't break the run: my function still finishes and appears to have processed the image as expected. Is this a real error, something I can safely ignore, or am I missing something?

from transformers import YolosImageProcessor, YolosForObjectDetection

try:
    processor = YolosImageProcessor.from_pretrained("valentinafeve/yolos-fashionpedia")
    model = YolosForObjectDetection.from_pretrained("valentinafeve/yolos-fashionpedia")
except OSError as e:
    print(f"Error: {e}")

Error

/Users/brian/Development/test/venv/bin/python /Users/brian/Development/test/test.py 
The `max_size` parameter is deprecated and will be removed in v4.26. Please specify in `size['longest_edge'] instead`.
The `max_size` parameter is deprecated and will be removed in v4.26. Please specify in `size['longest_edge'] instead`.
Traceback (most recent call last):
  File "/Users/brian/Development/test/venv/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3897, in from_pretrained
    ).start()
      ^^^^^^^
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
                  ^^^^^^^^^^^^^^^^^
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/context.py", line 288, in _Popen
    return Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 42, in _launch
    prep_data = spawn.get_preparation_data(process_obj._name)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/spawn.py", line 158, in get_preparation_data
    _check_not_importing_main()
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/spawn.py", line 138, in _check_not_importing_main
    raise RuntimeError('''
RuntimeError: 
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/spawn.py", line 120, in spawn_main
    exitcode = _main(fd, parent_sentinel)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/spawn.py", line 129, in _main
    prepare(preparation_data)
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/spawn.py", line 240, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "/Users/brian/.pyenv/versions/3.11.0/lib/python3.11/multiprocessing/spawn.py", line 291, in _fixup_main_from_path
    main_content = runpy.run_path(main_path,
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen runpy>", line 291, in run_path
  File "<frozen runpy>", line 98, in _run_module_code
  File "<frozen runpy>", line 88, in _run_code
  File "/Users/brian/Development/colors/seasons/test.py", line 260, in <module>
    dominant_hexes = detect_objects_and_extract_colors(image_path)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/brian/Development/test/test.py", line 174, in detect_objects_and_extract_colors
    model = YolosForObjectDetection.from_pretrained("valentinafeve/yolos-fashionpedia")
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/brian/Development/test/venv/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3941, in from_pretrained
    raise EnvironmentError(
OSError: Can't load the model for 'valentinafeve/yolos-fashionpedia'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'valentinafeve/yolos-fashionpedia' is the correct path to a directory containing a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
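For context, here is roughly what the idiom mentioned in the RuntimeError would look like applied to my script. This is only a sketch assembled from the snippet above and the traceback: detect_objects_and_extract_colors is the function name taken from the traceback, its body beyond the two from_pretrained calls is omitted, and the image path is a placeholder.

from transformers import YolosImageProcessor, YolosForObjectDetection

def detect_objects_and_extract_colors(image_path):
    # Only the two from_pretrained calls come from the snippet/traceback above;
    # the rest of the real function body is omitted here.
    processor = YolosImageProcessor.from_pretrained("valentinafeve/yolos-fashionpedia")
    model = YolosForObjectDetection.from_pretrained("valentinafeve/yolos-fashionpedia")
    ...  # detection and colour extraction would go here

if __name__ == "__main__":
    # Guarding the top-level call keeps a spawned child process from re-running
    # it when the child re-imports this module (the idiom the RuntimeError describes).
    dominant_hexes = detect_objects_and_extract_colors("image.jpg")  # placeholder path

For what it's worth, on macOS Python 3.8+ starts child processes with the spawn method by default, which is why the child re-imports the main module and hits this check.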
