togethercomputer/evo-1-131k-base
Tags: Text Generation · Transformers · Safetensors · stripedhyena · long context · deep signal processing · hybrid · biology · genomics · custom_code
arXiv: 7 papers
License: apache-2.0
evo-1-131k-base · 4 contributors · History: 25 commits
Latest commit: Zymrael · Update model.py · 60eb4c7 (verified) · 10 months ago
| File | Size | Last commit | Age |
| --- | --- | --- | --- |
| .gitattributes | 1.52 kB | initial commit | 11 months ago |
| README.md | 4.1 kB | Update README.md | 10 months ago |
| cache.py | 1.38 kB | init | 11 months ago |
| config.json | 1.73 kB | Fix auto tokenizer import reference format in auto map as list for slow and fast. | 10 months ago |
| configuration_hyena.py | 3.13 kB | init | 11 months ago |
| engine.py | 13.5 kB | init | 11 months ago |
| generation_config.json | 69 Bytes | Upload model | 11 months ago |
| layers.py | 5.39 kB | init | 11 months ago |
| model-00001-of-00003.safetensors | 4.98 GB (LFS) | Upload model | 11 months ago |
| model-00002-of-00003.safetensors | 4.93 GB (LFS) | Upload model | 11 months ago |
| model-00003-of-00003.safetensors | 3 GB (LFS) | Upload model | 11 months ago |
| model.py | 19.5 kB | Update model.py | 10 months ago |
| model.safetensors.index.json | 34.9 kB | Upload model | 11 months ago |
| modeling_hyena.py | 5.55 kB | init | 11 months ago |
| positional_embeddings.py | 4.94 kB | init | 11 months ago |
| pytorch_model.pt | 16.8 GB (LFS, pickle) | add pt ckpt | 10 months ago |
| special_tokens_map.json | 3 Bytes | Update byte tokenizer to be compatible with auto tokenizer and clean-up. | 10 months ago |
| streamer.py | 3.94 kB | init | 11 months ago |
| tokenizer.py | 4.37 kB | Remove tokenizer.json and replace tokenizer.py with correct version. | 10 months ago |
| tokenizer_config.json | 299 Bytes | Fix auto tokenizer import reference format in auto map as list for slow and fast. | 10 months ago |
| utils.py | 2.87 kB | init | 11 months ago |

Detected pickle imports in pytorch_model.pt (4): torch.BFloat16Storage, collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage