sentence-transformers/bert-base-wikipedia-sections-mean-tokens
Tags: Sentence Similarity · sentence-transformers · PyTorch · TensorFlow · ONNX · Safetensors · OpenVINO · Transformers · bert · feature-extraction · text-embeddings-inference · Inference Endpoints
Paper: arXiv:1908.10084
License: apache-2.0
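
The tags above indicate this is a SentenceTransformer model intended for sentence-similarity and feature-extraction use. A minimal usage sketch, assuming the sentence-transformers package is installed; the example sentences are illustrative:

```python
# Minimal sketch: encode sentences into fixed-size embeddings with this model.
# Assumes `pip install sentence-transformers`; the sentences below are illustrative.
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence.", "Each sentence is converted to a vector."]

model = SentenceTransformer("sentence-transformers/bert-base-wikipedia-sections-mean-tokens")
embeddings = model.encode(sentences)

print(embeddings.shape)  # one embedding vector per input sentence
```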
5 contributors · History: 5 commits
Latest commit: bfe50e6 ("Add TF weights" by joaogante, HF staff), over 2 years ago
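
The commit hash above can be used to pin downloads to this exact revision. A sketch using huggingface_hub's snapshot_download, assuming the package is installed; the short hash is taken from the history shown here (the full 40-character hash works as well):

```python
# Sketch: download this repository pinned to the latest commit shown above.
# Assumes `pip install huggingface_hub`; pinning a revision keeps results reproducible.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="sentence-transformers/bert-base-wikipedia-sections-mean-tokens",
    revision="bfe50e6",  # short commit hash from the history above
)
print(local_dir)  # local cache path containing the pinned files
```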
| File | Size | Last commit | Updated |
|------|------|-------------|---------|
| 1_Pooling/ | | Add new SentenceTransformer model. | over 3 years ago |
| .gitattributes | 690 Bytes | initial commit | over 3 years ago |
| README.md | 4.03 kB | Update README.md | over 3 years ago |
| config.json | 640 Bytes | Add new SentenceTransformer model. | over 3 years ago |
| config_sentence_transformers.json | 122 Bytes | Add new SentenceTransformer model. | over 3 years ago |
| modules.json | 229 Bytes | Add new SentenceTransformer model. | over 3 years ago |
| pytorch_model.bin (LFS, pickle) | 438 MB | Add new SentenceTransformer model. | over 3 years ago |
| sentence_bert_config.json | 53 Bytes | Add new SentenceTransformer model. | over 3 years ago |
| special_tokens_map.json | 112 Bytes | Add new SentenceTransformer model. | over 3 years ago |
| tf_model.h5 (LFS) | 438 MB | Add TF weights | over 2 years ago |
| tokenizer.json | 466 kB | Add new SentenceTransformer model. | over 3 years ago |
| tokenizer_config.json | 429 Bytes | Add new SentenceTransformer model. | over 3 years ago |
| vocab.txt | 232 kB | Add new SentenceTransformer model. | over 3 years ago |

Detected pickle imports in pytorch_model.bin (4): collections.OrderedDict, torch.LongStorage, torch._utils._rebuild_tensor_v2, torch.FloatStorage.
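
The modules.json, sentence_bert_config.json, and 1_Pooling/ entries describe the SentenceTransformer pipeline: a BERT encoder followed by a pooling module, and the model name suggests mean pooling over token embeddings. A sketch of the same computation with the plain transformers API, assuming attention-mask-weighted mean pooling is the configured strategy:

```python
# Sketch under the assumption that the pooling module performs mean pooling
# over token embeddings, weighted by the attention mask so padding is ignored.
import torch
from transformers import AutoModel, AutoTokenizer

def mean_pooling(model_output, attention_mask):
    # Average the token embeddings, masking out padding positions.
    token_embeddings = model_output[0]  # last_hidden_state
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, dim=1) / torch.clamp(mask.sum(dim=1), min=1e-9)

sentences = ["This is an example sentence.", "Each sentence is converted to a vector."]

tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/bert-base-wikipedia-sections-mean-tokens")
model = AutoModel.from_pretrained("sentence-transformers/bert-base-wikipedia-sections-mean-tokens")

encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    model_output = model(**encoded)

embeddings = mean_pooling(model_output, encoded["attention_mask"])
print(embeddings.shape)  # torch.Size([2, 768]) for a BERT-base encoder
```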