The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0it [00:00, ?it/s]
/opt/conda/lib/python3.10/site-packages/transformers/deepspeed.py:23: FutureWarning: transformers.deepspeed module is deprecated and will be removed in a future version. Please import deepspeed modules directly from transformers.integrations
  warnings.warn(
2024-07-01 06:29:39.261540: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-07-01 06:29:39.261717: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-07-01 06:29:39.402767: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
/opt/conda/lib/python3.10/site-packages/datasets/load.py:929: FutureWarning: The repository for data contains custom code which must be executed to correctly load the dataset. You can inspect the repository content at /kaggle/working/amr-tst-indo/AMRBART-id/fine-tune/data_interface/data.py
You can avoid this message in future by passing the argument `trust_remote_code=True`.
Passing `trust_remote_code=True` will be mandatory to load this dataset from the next major release of `datasets`.
  warnings.warn(
Generating train split: 0 examples [00:00, ? examples/s]
Generating train split: 401 examples [00:00, 7281.18 examples/s]
Running tokenizer on train dataset:   0%| | 0/401 [00:00
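
The warnings above are non-fatal, and the log itself names the remedies. Below is a minimal sketch of acting on two of them, assuming the same environment and dataset script path shown in the warnings; the actual arguments of the fine-tuning run are not visible in this log.

import transformers
from datasets import load_dataset

# Resume the one-time cache migration if it was interrupted
# (this is the call the migration message refers to).
transformers.utils.move_cache()

# Opt in to the dataset's custom loading code to silence the FutureWarning.
# The script path is copied from the warning above; any other load_dataset
# arguments used by the fine-tuning script are assumptions, not shown here.
dataset = load_dataset(
    "/kaggle/working/amr-tst-indo/AMRBART-id/fine-tune/data_interface/data.py",
    trust_remote_code=True,
)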