sentence-transformers/bert-base-wikipedia-sections-mean-tokens
Tags: Sentence Similarity · sentence-transformers · PyTorch · TensorFlow · ONNX · Safetensors · OpenVINO · Transformers · bert · feature-extraction · text-embeddings-inference · Inference Endpoints
arXiv: 1908.10084 · License: apache-2.0
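
A minimal usage sketch, not taken from the model card itself: it assumes the sentence-transformers library is installed, and the example sentences are illustrative only.

```python
from sentence_transformers import SentenceTransformer

# Load the model from the Hugging Face Hub by its repo ID
model = SentenceTransformer("sentence-transformers/bert-base-wikipedia-sections-mean-tokens")

# Illustrative input sentences
sentences = [
    "This is an example sentence.",
    "Each sentence is converted into a fixed-size embedding.",
]

# Encode to dense vectors (BERT-base with mean pooling yields 768-dimensional embeddings)
embeddings = model.encode(sentences)
print(embeddings.shape)
```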
bert-base-wikipedia-sections-mean-tokens / tokenizer.json (branch: main)
nreimers: Add new SentenceTransformer model. (commit 08a5ca3, over 3 years ago)
466 kB
File too large to display; check the raw version instead.
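
A hedged sketch of loading this tokenizer.json directly with the Hugging Face tokenizers library. The local path "tokenizer.json" is an assumption; the file would first have to be downloaded from the repo.

```python
from tokenizers import Tokenizer

# Load the fast tokenizer from the downloaded tokenizer.json file
tokenizer = Tokenizer.from_file("tokenizer.json")

# Tokenize an illustrative sentence
encoding = tokenizer.encode("This is an example sentence.")
print(encoding.tokens)  # WordPiece tokens
print(encoding.ids)     # corresponding vocabulary IDs
```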