diff --git "a/esm_log.txt" "b/esm_log.txt"
new file mode 100644
--- /dev/null
+++ "b/esm_log.txt"
@@ -0,0 +1,1722 @@
+Some weights of EsmForSequenceClassification were not initialized from the model checkpoint at facebook/esm2_t6_8M_UR50D and are newly initialized: ['classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.out_proj.bias', 'classifier.dense.weight']
+You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
+/home/rjo21/miniconda3/envs/finesm/lib/python3.9/site-packages/transformers/optimization.py:411: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning
+  warnings.warn(
+  0%|          | 0/13405 [00:00
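The FutureWarning near the top of the log comes from the deprecated AdamW implementation in `transformers.optimization`; the warning itself names the fix, which is to construct the optimizer from `torch.optim.AdamW`. A minimal sketch of that swap is below. The tiny `nn.Linear` head is a hypothetical stand-in so the example runs without downloading `facebook/esm2_t6_8M_UR50D`; in the actual run you would pass `model.parameters()` from `EsmForSequenceClassification` instead, and the hyperparameter values shown are illustrative, not taken from the log.

```python
import torch
from torch import nn

# Hypothetical stand-in for the model whose parameters are optimized;
# in the logged run this would be EsmForSequenceClassification loaded
# from facebook/esm2_t6_8M_UR50D.
model = nn.Linear(8, 2)

# torch.optim.AdamW accepts the same core hyperparameters (lr,
# weight_decay, betas, eps) as the deprecated transformers AdamW,
# so the swap is a one-line change.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5, weight_decay=0.01)

# One illustrative training step: forward pass, loss, backward, update.
x = torch.randn(4, 8)
labels = torch.tensor([0, 1, 0, 1])
loss = nn.functional.cross_entropy(model(x), labels)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Using the PyTorch optimizer also silences the `warnings.warn(` line that follows the FutureWarning in the log, since the deprecated class is never instantiated.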