togethercomputer/StripedHyena-Hessian-7B
Text Generation · Transformers · Safetensors · English · stripedhyena · custom_code
arXiv: 2302.10866 · 2310.18780 · 2311.05908
License: apache-2.0
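The custom_code tag, together with the configuration_hyena.py and modeling_hyena.py files listed below, indicates that the repository ships its own architecture code, so loading it through Transformers requires trust_remote_code=True. The following is a minimal sketch, assuming the standard AutoModelForCausalLM entry point and a bfloat16-capable GPU; the prompt and generation settings are illustrative only and not taken from the repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "togethercomputer/StripedHyena-Hessian-7B"

# custom_code repo: trust_remote_code=True lets Transformers import
# configuration_hyena.py / modeling_hyena.py from the repository itself.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights, per the mixed-dtype fix noted below
    device_map="auto",
)

inputs = tokenizer(
    "StripedHyena is a hybrid architecture that", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```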
StripedHyena-Hessian-7B · 5 contributors · History: 27 commits
Latest commit: Zymrael, "chore: update readme" (0dd06f7, verified, 12 months ago)
.gitattributes · 1.52 kB · initial commit · about 1 year ago
README.md · 2.88 kB · chore: update readme · 12 months ago
cache.py · 1.38 kB · feat: initial commit · about 1 year ago
config.json · 1.69 kB · Add model_type to config.json. · about 1 year ago
configuration_hyena.py · 3.13 kB · feat: initial commit · about 1 year ago
engine.py · 11.7 kB · chore: small fix in dec · about 1 year ago
generation_config.json · 69 Bytes · feat: initial commit · about 1 year ago
layers.py · 5.16 kB · feat: initial commit · about 1 year ago
model-00001-of-00002.safetensors · 9.89 GB · LFS · feat: initial commit · about 1 year ago
model-00002-of-00002.safetensors · 5.4 GB · LFS · feat: initial commit · about 1 year ago
model.py · 17.1 kB · chore: add checkpoint import · about 1 year ago
model.safetensors.index.json · 28.5 kB · feat: initial commit · about 1 year ago
modeling_hyena.py · 5.55 kB · fix: force correct mixed dtype after HF load · about 1 year ago
pytorch-model.bin · 18 GB · LFS · pickle (detected imports: torch.FloatStorage, torch.BFloat16Storage, collections.OrderedDict, torch._utils._rebuild_tensor_v2) · chore: upload pytorch standalone checkpoint · about 1 year ago
special_tokens_map.json · 62 Bytes · feat: initial commit · about 1 year ago
tokenizer.json · 1.8 MB · feat: initial commit · about 1 year ago
tokenizer_config.json · 149 Bytes · Update tokenizer class in tokenizer_config.json to llama tokenizer. · about 1 year ago
utils.py · 2.75 kB · feat: initial commit · about 1 year ago
vocab.json · 628 kB · feat: initial commit · about 1 year ago
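The listing contains both the sharded *.safetensors weights (indexed by model.safetensors.index.json) and an 18 GB standalone pickle checkpoint, pytorch-model.bin. When loading through Transformers only the safetensors shards are needed, so skipping the pickle file roughly halves the download. A short sketch, assuming the huggingface_hub client is installed:

```python
from huggingface_hub import snapshot_download

# Fetch everything except the standalone pickle checkpoint; Transformers will
# load the model from the sharded *.safetensors files and their index.
local_dir = snapshot_download(
    repo_id="togethercomputer/StripedHyena-Hessian-7B",
    ignore_patterns=["pytorch-model.bin"],
)
print(local_dir)  # local cache path containing the downloaded files
```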