runtime error
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Downloading (…)okenizer_config.json: 100%|██████████| 727/727 [00:00<00:00, 5.85MB/s]
Downloading tokenizer.model: 100%|██████████| 500k/500k [00:00<00:00, 165MB/s]
Downloading (…)/main/tokenizer.json: 100%|██████████| 1.84M/1.84M [00:00<00:00, 25.1MB/s]
Downloading (…)cial_tokens_map.json: 100%|██████████| 411/411 [00:00<00:00, 4.18MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 9, in <module>
    model = AutoModelForCausalLM.from_pretrained("TheBloke/Pygmalion-7B-SuperHOT-8K-GPTQ")
  File "/home/user/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1222, in __getattribute__
    requires_backends(cls, cls._backends)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1210, in requires_backends
    raise ImportError("".join(failed))
ImportError: AutoModelForCausalLM requires the PyTorch library but it was not found in your environment. Checkout the instructions on the installation page: https://pytorch.org/get-started/locally/ and follow the ones that match your environment. Please note that you may need to restart your runtime after installation.
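The traceback shows that transformers is installed but no deep-learning backend is, so only tokenizers load and AutoModelForCausalLM raises ImportError at line 9 of app.py. A likely fix, assuming this app runs as a Hugging Face Space (the /home/user/app path and container logs suggest so): declare torch explicitly in the Space's requirements.txt, since transformers does not install a backend on its own, then restart the runtime as the error message advises.

```
# requirements.txt (assumed to sit at the root of the Space repo)
torch
transformers
```

Note that a 7B GPTQ-quantized checkpoint such as TheBloke/Pygmalion-7B-SuperHOT-8K-GPTQ may also need GPTQ-aware loading support and GPU hardware; installing torch only resolves the missing-backend ImportError shown in this log.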