Automatic Speech Recognition · NeMo · PyTorch · 4 languages · automatic-speech-translation · speech · audio · Transformer · FastConformer · Conformer · hf-asr-leaderboard · Eval Results

Dedicated Endpoint deployment fails for canary-1b

#26
by anemdat - opened

I tried creating a dedicated endpoint for canary-1b here on Hugging Face (https://ui.endpoints.huggingface.co/<my_hf_account>/endpoints/dedicated) and got this error:

repository does not appear to have a file named config.json.

Is there a workaround for this? For comparison, I deployed whisper-large-v3 the same way without any issues.
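The error usually means the repository is not in transformers format: canary-1b ships a NeMo `.nemo` checkpoint rather than a `config.json`, so `AutoConfig.from_pretrained` (which the default Inference Endpoints toolkit calls, per the traceback below) has nothing to load, whereas whisper-large-v3 is a native transformers model. A minimal sketch of that check, assuming `huggingface_hub` is available (the helper names `is_transformers_repo` and `check` are mine, not part of any API):

```python
def is_transformers_repo(files):
    """transformers' pipeline() can only bootstrap a repo whose file list includes config.json."""
    return "config.json" in files

def check(repo_id):
    """Fetch a Hub repo's file listing (needs network access and `pip install huggingface_hub`)."""
    from huggingface_hub import list_repo_files
    return is_transformers_repo(list_repo_files(repo_id))

# Illustrative listings: canary-1b ships a .nemo checkpoint; whisper-large-v3 is transformers-native.
print(is_transformers_repo(["canary-1b.nemo", ".gitattributes"]))               # False
print(is_transformers_repo(["config.json", "model.safetensors", "README.md"]))  # True
```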

The log for the canary-1b endpoint deployment:

- 2024-06-15T12:10:48.278+00:00 /usr/local/lib/python3.10/dist-packages/diffusers/utils/outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
- 2024-06-15T12:10:48.278+00:00   torch.utils._pytree._register_pytree_node(
- 2024-06-15T12:10:48.342+00:00 2024-06-15 12:10:48,342 | INFO | Initializing model from directory:/repository
- 2024-06-15T12:10:48.342+00:00 2024-06-15 12:10:48,342 | INFO | No custom pipeline found at /repository/handler.py
- 2024-06-15T12:10:48.342+00:00 2024-06-15 12:10:48,342 | INFO | Using device CPU
- 2024-06-15T12:10:48.343+00:00 Traceback (most recent call last):
- 2024-06-15T12:10:48.343+00:00   File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 732, in lifespan
- 2024-06-15T12:10:48.343+00:00     async with self.lifespan_context(app) as maybe_state:
- 2024-06-15T12:10:48.343+00:00   File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 608, in __aenter__
- 2024-06-15T12:10:48.343+00:00     await self._router.startup()
- 2024-06-15T12:10:48.343+00:00   File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 709, in startup
- 2024-06-15T12:10:48.343+00:00     await handler()
- 2024-06-15T12:10:48.343+00:00   File "/app/webservice_starlette.py", line 60, in some_startup_task
- 2024-06-15T12:10:48.343+00:00     inference_handler = get_inference_handler_either_custom_or_default_handler(HF_MODEL_DIR, task=HF_TASK)
- 2024-06-15T12:10:48.343+00:00   File "/app/huggingface_inference_toolkit/handler.py", line 54, in get_inference_handler_either_custom_or_default_handler
- 2024-06-15T12:10:48.343+00:00     return HuggingFaceHandler(model_dir=model_dir, task=task)
- 2024-06-15T12:10:48.343+00:00   File "/app/huggingface_inference_toolkit/handler.py", line 18, in __init__
- 2024-06-15T12:10:48.343+00:00     self.pipeline = get_pipeline(
- 2024-06-15T12:10:48.343+00:00   File "/app/huggingface_inference_toolkit/utils.py", line 276, in get_pipeline
- 2024-06-15T12:10:48.343+00:00     hf_pipeline = pipeline(
- 2024-06-15T12:10:48.343+00:00   File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/__init__.py", line 815, in pipeline
- 2024-06-15T12:10:48.343+00:00     config = AutoConfig.from_pretrained(
- 2024-06-15T12:10:48.343+00:00   File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py", line 1111, in from_pretrained
- 2024-06-15T12:10:48.343+00:00     config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
- 2024-06-15T12:10:48.343+00:00   File "/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py", line 633, in get_config_dict
- 2024-06-15T12:10:48.343+00:00     config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
- 2024-06-15T12:10:48.343+00:00   File "/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py", line 688, in _get_config_dict
- 2024-06-15T12:10:48.343+00:00     resolved_config_file = cached_file(
- 2024-06-15T12:10:48.343+00:00   File "/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py", line 369, in cached_file
- 2024-06-15T12:10:48.343+00:00     raise EnvironmentError(
- 2024-06-15T12:10:48.343+00:00 OSError: /repository does not appear to have a file named config.json. Checkout 'https://huggingface.co//repository/None' for available files.
- 2024-06-15T12:10:48.343+00:00 
- 2024-06-15T12:10:48.343+00:00 Application startup failed. Exiting.
- [12:10:51 retry: identical warning and traceback, again ending with "Application startup failed. Exiting."]
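One possible workaround follows from a line in the log itself: the toolkit checks for a custom pipeline at `/repository/handler.py` before falling back to `pipeline()`, so a custom handler that loads the model through NeMo sidesteps the `config.json` lookup entirely. The sketch below follows the Inference Endpoints custom-handler convention (`EndpointHandler` with `__init__`/`__call__`); the `EncDecMultiTaskModel` class comes from the canary-1b model card, but the exact `transcribe()` signature varies across nemo_toolkit releases, so treat this as a starting point rather than a drop-in solution:

```python
# handler.py — placed at the root of a (duplicated) canary-1b repository.
from typing import Any, Dict, List

class EndpointHandler:
    def __init__(self, path: str = ""):
        # Lazy import so the file parses even where nemo_toolkit is absent;
        # the endpoint image must install it (e.g. requirements.txt: nemo_toolkit[asr]).
        from nemo.collections.asr.models import EncDecMultiTaskModel

        # Load the .nemo checkpoint directly; no config.json is involved.
        self.model = EncDecMultiTaskModel.from_pretrained("nvidia/canary-1b")

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, str]]:
        # Assumed request shape: {"inputs": "<wav path>"} or {"inputs": [<wav paths>]}.
        inputs = data["inputs"]
        paths = inputs if isinstance(inputs, list) else [inputs]
        # NOTE: check transcribe()'s signature against your installed NeMo version.
        hypotheses = self.model.transcribe(paths, batch_size=len(paths))
        return [{"text": str(h)} for h in hypotheses]
```

How the audio reaches the handler (raw bytes vs. a file path) depends on the request format you choose, so `__call__` will likely need adapting to your client.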
