jinaai/xlm-roberta-flash-implementation
Jina AI
Transformers · 94 languages · xlm-roberta · Inference Endpoints
License: cc-by-nc-4.0
Region: EU
Branch: main
8 contributors · History: 58 commits
Latest commit: f221b0a (verified), 9 days ago · gmastrapas: Add `add_pooling_layer` arg to XLMRobertaLora (#50)
File                                  Size      Last commit                                                        Updated
.gitattributes                        1.52 kB   initial commit                                                     7 months ago
README.md                             1.47 kB   Update README.md                                                   about 1 month ago
block.py                              17.8 kB   refine-codebase (#33)                                              3 months ago
configuration_xlm_roberta.py          6.54 kB   fix: set fp32 when using cpu bc bf16 is slow (#44)                 about 2 months ago
convert_roberta_weights_to_flash.py   6.94 kB   Support for SequenceClassification (#7)                            7 months ago
embedding.py                          3.88 kB   refine-codebase (#33)                                              3 months ago
mha.py                                34.4 kB   cpu-inference (#35)                                                3 months ago
mlp.py                                7.62 kB   refine-codebase (#33)                                              3 months ago
modeling_lora.py                      15.2 kB   Add `add_pooling_layer` arg to XLMRobertaLora (#50)                9 days ago
modeling_xlm_roberta.py               50 kB     refactor-task-type-to-task (#43)                                   2 months ago
rotary.py                             24.5 kB   fix: update frequencies when updating the rope base value (#40)    3 months ago
stochastic_depth.py                   3.76 kB   refine-codebase (#33)                                              3 months ago
xlm_padding.py                        10 kB     refine-codebase (#33)                                              3 months ago
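
The listing above contains only Python modules and no config.json or weight files, so this repo appears to serve as remote code that downstream checkpoints pull in through transformers' trust_remote_code mechanism, with their config.json auto_map pointing at the modules listed here. The sketch below shows only that generic loading pattern; `your-org/your-checkpoint` is a hypothetical placeholder, and the auto_map entry in the comment is illustrative rather than copied from any real config.

```python
# Sketch only: loading a checkpoint whose config.json references the custom
# XLM-RoBERTa code in jinaai/xlm-roberta-flash-implementation via auto_map,
# e.g. an entry shaped like
#   "AutoModel": "jinaai/xlm-roberta-flash-implementation--modeling_xlm_roberta.XLMRobertaModel"
# (illustrative; the actual class names live in the modules listed above).
import torch
from transformers import AutoModel, AutoTokenizer

# Hypothetical checkpoint id; substitute a real model that uses this repo as remote code.
checkpoint = "your-org/your-checkpoint"

# trust_remote_code=True lets transformers download and execute
# modeling_xlm_roberta.py / modeling_lora.py etc. from the Hub.
model = AutoModel.from_pretrained(checkpoint, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

inputs = tokenizer("How is the weather today?", return_tensors="pt")
with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (batch, seq_len, hidden)
print(hidden_states.shape)
```

Judging from the commit messages above, the code falls back to fp32 on CPU because bf16 is slow there (#44), and #35 added a CPU inference path alongside the flash-attention implementation in mha.py.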