aws-neuron/optimum-neuron-cache
AWS Inferentia and Trainium
License: apache-2.0
Revision: c4ae732
Path: optimum-neuron-cache/neuronxcc-2.13.66.0+6dfecc895/0_REGISTRY/0.0.23/inference/llama/meta-llama
8 contributors
History: 29 commits
Latest commit by dacorvo (HF staff): "Synchronizing local compiler cache." (013f017, verified, 6 months ago)
Llama-2-13b-chat-hf     Synchronizing local compiler cache.     7 months ago
Llama-2-70b-chat-hf     Synchronizing local compiler cache.     7 months ago
Llama-2-7b-chat-hf      Synchronizing local compiler cache.     7 months ago
Meta-Llama-3-70B        Synchronizing local compiler cache.     6 months ago
Meta-Llama-3-8B         Synchronizing local compiler cache.     6 months ago