
LiLT + XLM-RoBERTa-base

This model was created by combining the Language-Independent Layout Transformer (LiLT) with XLM-RoBERTa base, a multilingual RoBERTa model trained on 100 languages.

This way, we have a LayoutLM-like model for 100 languages :)
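Below is a minimal usage sketch with the 🤗 Transformers library. The words and bounding boxes are made-up illustration values; LiLT expects box coordinates normalized to the 0–1000 range, and since the XLM-RoBERTa tokenizer is layout-unaware, each word's box is copied to all of its sub-tokens.

```python
import torch
from transformers import AutoTokenizer, LiltModel

# Hypothetical example document: pre-split words with their bounding boxes
# (coordinates normalized to the 0-1000 range expected by LiLT).
words = ["Invoice", "Number:", "12345"]
word_boxes = [[48, 84, 156, 108], [160, 84, 254, 108], [260, 84, 330, 108]]

tokenizer = AutoTokenizer.from_pretrained("nielsr/lilt-xlm-roberta-base")
model = LiltModel.from_pretrained("nielsr/lilt-xlm-roberta-base")

# The XLM-RoBERTa tokenizer has no notion of boxes, so we tokenize the
# pre-split words and map each word's box to all of its sub-tokens.
encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
token_boxes = []
for word_id in encoding.word_ids(batch_index=0):
    if word_id is None:
        # special tokens like <s> and </s> get a dummy box
        token_boxes.append([0, 0, 0, 0])
    else:
        token_boxes.append(word_boxes[word_id])
bbox = torch.tensor([token_boxes])

with torch.no_grad():
    outputs = model(
        input_ids=encoding["input_ids"],
        attention_mask=encoding["attention_mask"],
        bbox=bbox,
    )

print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)
```

For downstream document-understanding tasks, the same pattern applies with LiltForTokenClassification or LiltForSequenceClassification instead of the base LiltModel.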
