Subsection of XLM-RoBERTa with its vocabulary trimmed to the 30k most frequent Portuguese tokens (token counts obtained from por-pt_web_2015_1M.tar.gz, found here).
All credit for the methodology goes to David Dale (avidale/cointegrated). The model was created following an adaptation of his guide, which can be found in the comments section here.
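The sketch below illustrates the general vocabulary-trimming idea described above, not the exact script used to build this model: tokenize a Portuguese corpus with the original XLM-RoBERTa tokenizer, keep the 30k most frequent token ids (plus special tokens), and copy only those rows of the embedding matrix into the reduced model. The corpus file name `corpus_pt.txt` is a placeholder, and rebuilding the SentencePiece tokenizer to match the reduced vocabulary is omitted here.

```python
# Hedged sketch of vocabulary trimming, under the assumptions stated above.
from collections import Counter

from transformers import XLMRobertaForMaskedLM, XLMRobertaTokenizer

tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
model = XLMRobertaForMaskedLM.from_pretrained("xlm-roberta-base")

# 1. Count how often each token id appears in the Portuguese corpus
#    (placeholder file; in this model the counts came from por-pt_web_2015_1M).
counts = Counter()
with open("corpus_pt.txt", encoding="utf-8") as f:
    for line in f:
        counts.update(tokenizer.encode(line, add_special_tokens=False))

# 2. Keep the special tokens plus the 30k most frequent Portuguese tokens.
kept_ids = sorted(
    set(tokenizer.all_special_ids)
    | {tok_id for tok_id, _ in counts.most_common(30_000)}
)

# 3. Copy only the kept rows of the embedding matrix into a smaller model.
old_embeddings = model.get_input_embeddings().weight.data
new_embeddings = old_embeddings[kept_ids].clone()
model.resize_token_embeddings(len(kept_ids))
model.get_input_embeddings().weight.data.copy_(new_embeddings)

# Note: the tokenizer's SentencePiece model must also be pruned and the token
# ids remapped accordingly; see the linked guide for that step.
```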
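A minimal usage sketch, assuming the model loads with the standard `transformers` Auto classes and keeps XLM-RoBERTa's masked-language-modeling head; the example sentence is only illustrative.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "homersimpson/subsec-xlm-roberta-portuguese-30k"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill in the mask token in a short Portuguese sentence.
text = f"Lisboa é a capital de {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Pick the highest-scoring token at the masked position.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = outputs.logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```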