laion/CLIP-ViT-g-14-laion2B-s12B-b42K
Maintained by LAION eV
Tags: OpenCLIP, PyTorch, Safetensors, clip
arXiv: 1910.04867
License: MIT
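The repository is tagged OpenCLIP/PyTorch, so the checkpoint can typically be pulled straight from the Hub with the open_clip library. The snippet below is a minimal sketch assuming the standard open_clip hub-loading API (create_model_and_transforms and get_tokenizer with an "hf-hub:" prefix); the image path is a placeholder, and the repository README remains the authoritative usage reference.

```python
# Minimal sketch: load the checkpoint from the Hugging Face Hub with open_clip.
# Assumes the standard open_clip hub API ("hf-hub:" prefix); "cat.jpg" is a
# placeholder image path used only for illustration.
import torch
import open_clip
from PIL import Image

repo = "hf-hub:laion/CLIP-ViT-g-14-laion2B-s12B-b42K"
model, _, preprocess = open_clip.create_model_and_transforms(repo)
tokenizer = open_clip.get_tokenizer(repo)
model.eval()

image = preprocess(Image.open("cat.jpg")).unsqueeze(0)      # (1, 3, H, W)
text = tokenizer(["a photo of a cat", "a photo of a dog"])  # tokenized prompts

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # Normalize and compare embeddings for zero-shot classification.
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print(probs)  # probability over the two prompts
```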
Files and versions: CLIP-ViT-g-14-laion2B-s12B-b42K
4 contributors, 4 commits
Latest commit 27027d5: "Add widget example input" by mishig (HF staff), about 2 years ago
File                        | Size      | Last commit message                                | Last updated
.gitattributes              | 1.38 kB   | initial commit                                     | about 2 years ago
README.md                   | 7.42 kB   | Add widget example input                           | about 2 years ago
config.json                 | 4.61 kB   | Add bin files                                      | about 2 years ago
open_clip_pytorch_model.bin | 5.47 GB   | Add bin files                                      | about 2 years ago
preprocessor_config.json    | 316 Bytes | Update README add tokenizer/vocab/preprocessor cfg | about 2 years ago
pytorch_model.bin           | 5.47 GB   | Add bin files                                      | about 2 years ago
special_tokens_map.json     | 389 Bytes | Update README add tokenizer/vocab/preprocessor cfg | about 2 years ago
tokenizer.json              | 2.22 MB   | Update README add tokenizer/vocab/preprocessor cfg | about 2 years ago
tokenizer_config.json       | 568 Bytes | Update README add tokenizer/vocab/preprocessor cfg | about 2 years ago
vocab.json                  | 862 kB    | Update README add tokenizer/vocab/preprocessor cfg | about 2 years ago

All files are flagged "Safe" by the Hub scanner. The two .bin checkpoints are stored via Git LFS and are pickle-serialized, with the following detected pickle imports:
- open_clip_pytorch_model.bin: torch.FloatStorage, torch._utils._rebuild_tensor_v2, collections.OrderedDict
- pytorch_model.bin: torch.FloatStorage, torch._utils._rebuild_tensor_v2, torch.LongStorage, collections.OrderedDict
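Besides the OpenCLIP checkpoint, the listing includes transformers-style artifacts (config.json, pytorch_model.bin, preprocessor_config.json, and the tokenizer files), so the same weights can usually also be loaded through the transformers CLIP classes. A hedged sketch, assuming the stock CLIPModel/CLIPProcessor API and a placeholder image path:

```python
# Sketch: load the transformers-format files listed above.
# Assumes the standard transformers CLIP classes; "photo.jpg" is a placeholder.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

repo_id = "laion/CLIP-ViT-g-14-laion2B-s12B-b42K"
model = CLIPModel.from_pretrained(repo_id)
processor = CLIPProcessor.from_pretrained(repo_id)

inputs = processor(
    text=["a photo of a cat", "a photo of a dog"],
    images=Image.open("photo.jpg"),
    return_tensors="pt",
    padding=True,
)

with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds image-text similarity scores; softmax turns them
# into a probability distribution over the candidate prompts.
probs = outputs.logits_per_image.softmax(dim=-1)
print(probs)
```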