Turbo-AI/multilingual-e5-base__trim_vocab
Tags: Feature Extraction · Transformers · Safetensors · xlm-roberta · text-embeddings-inference · Inference Endpoints
arXiv: 1910.09700
Branch: main · 1 contributor · History: 3 commits
Latest commit: e5554f8 (verified) · NghiemAbe · "Upload tokenizer" · 3 months ago
File                       Size            Last commit        Updated
.gitattributes             1.52 kB         initial commit     3 months ago
README.md                  5.17 kB         Upload model       3 months ago
config.json                1.36 kB         Upload model       3 months ago
model.safetensors          197 MB (LFS)    Upload model       3 months ago
sentencepiece.bpe.model    5.07 MB (LFS)   Upload tokenizer   3 months ago
special_tokens_map.json    964 Bytes       Upload tokenizer   3 months ago
tokenizer.json             1.35 MB         Upload tokenizer   3 months ago
tokenizer_config.json      1.15 kB         Upload tokenizer   3 months ago
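
The files above are a standard XLM-RoBERTa-style embedding checkpoint (config.json, model.safetensors, plus SentencePiece tokenizer files), so it should load with the regular Transformers feature-extraction stack. Below is a minimal sketch of one way to get sentence embeddings from it; the "query:"/"passage:" prefixes and mean pooling follow the upstream intfloat/multilingual-e5-base convention and are assumptions not confirmed by this repository's card.

```python
# Minimal sketch: load the checkpoint with Transformers and mean-pool embeddings.
# Assumptions (not confirmed by this repository): E5-style "query:"/"passage:"
# prefixes and mean pooling, as in the upstream intfloat/multilingual-e5-base recipe.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_id = "Turbo-AI/multilingual-e5-base__trim_vocab"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

texts = [
    "query: how large is the trimmed vocabulary?",
    "passage: Vocabulary trimming removes unused tokens to shrink the embedding matrix.",
]
batch = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    last_hidden = model(**batch).last_hidden_state  # (batch, seq_len, hidden)

# Mean pooling over non-padding tokens, then L2-normalize for cosine similarity.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (last_hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
embeddings = F.normalize(embeddings, p=2, dim=1)

print(embeddings.shape)               # e.g. torch.Size([2, 768]) for a base-sized model
print(embeddings[0] @ embeddings[1])  # cosine similarity between query and passage
```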