RobBERT-2023: Keeping Dutch Language Models Up-To-Date
RobBERT-2023 is the 2023 release of the Dutch RobBERT model.
It is a new version of the original pdelobelle/robbert-v2-dutch-base model, trained on the 2023 version of the OSCAR corpus.
We release a base model, and this time we also release an additional large model with 355M parameters (3x the size of robbert-2022-base). We are particularly proud of the performance of both models: they surpass the robbert-v2-base and robbert-2022-base models by +2.9 and +0.9 points, respectively, on the DUMB benchmark from GroNLP. In addition, robbert-2023-dutch-large also surpasses BERTje by +18.6 points.
This is the same model, with the same weights, as DTAI-KULeuven/robbert-2023-dutch-large.
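As a minimal sketch of how to load the model (assuming the standard Hugging Face transformers AutoModel API and the checkpoint identifier named above), RobBERT-2023 can be used as a masked language model like any other RoBERTa-based checkpoint:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Load the tokenizer and masked-language-model head for RobBERT-2023 large.
# The identifier below is the one named in this card.
checkpoint = "DTAI-KULeuven/robbert-2023-dutch-large"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Example: fill in a masked token in a Dutch sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("Er staat een <mask> in mijn tuin."))
```

For downstream tasks such as classification or token labelling, the same checkpoint can be fine-tuned via the corresponding AutoModelFor... classes.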