How to create a "tiny-random-idefics"

#5
by a8nova - opened

Are there any instructions for how this model was created? I'd like to create one like it to experiment with on-device ML.

These are just the random weights obtained at initialization! Nothing fancy!

Thank you @VictorSanh. I can't seem to launch this model on Hugging Face's Inference Endpoints or on SageMaker. Below is the log. handler.py seems to be missing?

2023/09/28 14:20:38 ~ Your token has been saved to /root/.cache/huggingface/token
2023/09/28 14:20:40 ~ INFO | Using device CPU
2023/09/28 14:20:40 ~ INFO | No custom pipeline found at /repository/handler.py
2023/09/28 14:20:40 ~ INFO | Initializing model from directory:/repository
2023/09/28 14:20:40 ~ Traceback (most recent call last):
2023/09/28 14:20:40 ~   File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 705, in lifespan
2023/09/28 14:20:40 ~     async with self.lifespan_context(app) as maybe_state:
2023/09/28 14:20:40 ~   File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 584, in __aenter__
2023/09/28 14:20:40 ~     await self._router.startup()
2023/09/28 14:20:40 ~   File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 682, in startup
2023/09/28 14:20:40 ~     await handler()
2023/09/28 14:20:40 ~   File "/app/webservice_starlette.py", line 57, in some_startup_task
2023/09/28 14:20:40 ~     inference_handler = get_inference_handler_either_custom_or_default_handler(HF_MODEL_DIR, task=HF_TASK)
2023/09/28 14:20:40 ~   File "/app/huggingface_inference_toolkit/handler.py", line 45, in get_inference_handler_either_custom_or_default_handler
2023/09/28 14:20:40 ~     return HuggingFaceHandler(model_dir=model_dir, task=task)
2023/09/28 14:20:40 ~   File "/app/huggingface_inference_toolkit/handler.py", line 17, in __init__
2023/09/28 14:20:40 ~     self.pipeline = get_pipeline(model_dir=model_dir, task=task)
2023/09/28 14:20:40 ~   File "/app/huggingface_inference_toolkit/utils.py", line 261, in get_pipeline
2023/09/28 14:20:40 ~     hf_pipeline = pipeline(task=task, model=model_dir, device=device, **kwargs)
2023/09/28 14:20:40 ~   File "/opt/conda/lib/python3.9/site-packages/transformers/pipelines/__init__.py", line 705, in pipeline
2023/09/28 14:20:40 ~     config = AutoConfig.from_pretrained(model, _from_pipeline=task, **hub_kwargs, **model_kwargs)
2023/09/28 14:20:40 ~   File "/opt/conda/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 998, in from_pretrained
2023/09/28 14:20:40 ~     config_class = CONFIG_MAPPING[config_dict["model_type"]]
2023/09/28 14:20:40 ~   File "/opt/conda/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 710, in __getitem__
2023/09/28 14:20:40 ~     raise KeyError(key)
2023/09/28 14:20:40 ~ KeyError: 'idefics'
2023/09/28 14:20:40 ~ Application startup failed. Exiting.
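Reading the traceback, the failure is a model-type resolution error rather than a missing handler.py: `AutoConfig` looks up `config_dict["model_type"]` (`"idefics"`) in `CONFIG_MAPPING`, and the container's transformers build is too old to know that type (IDEFICS support landed in transformers v4.32). A quick hedged way to check whether a given environment would hit the same `KeyError`:

```python
# Check whether the installed transformers knows the "idefics" model type.
# A KeyError: 'idefics' like the one in the endpoint log means CONFIG_MAPPING
# has no entry for it, i.e. transformers predates IDEFICS support.
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

if "idefics" in CONFIG_MAPPING:
    print("this transformers version can resolve idefics configs")
else:
    print("too old: upgrade transformers (>= 4.32) or use a newer container image")
```

On the failing container this would take the `else` branch, which matches the startup error above.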

Hmm, I think I misunderstood the purpose of this model. It isn't meant for inference, right? Just for testing my inference code.

That’s correct @a8nova. We use this model mostly for testing purposes (for instance, checking that the whole forward pass runs without errors); it is not meant for solving real tasks.

Thank you @VictorSanh! I will close this.

a8nova changed discussion status to closed
