aws-neuron (AWS Inferentia and Trainium) / optimum-neuron-cache
License: apache-2.0
optimum-neuron-cache / neuronxcc-2.13.66.0+6dfecc895 / 0_REGISTRY / 0.0.23 / inference / llama (at commit e3ea9d7)
3 contributors
History: 51 commits
Latest commit 5d220aa (verified) by dacorvo (HF Staff), 10 months ago: Synchronizing local compiler cache.
Name               Last commit message                      Last updated
01-ai              Synchronizing local compiler cache.      11 months ago
LargeWorldModel    Synchronizing local compiler cache.      11 months ago
NousResearch       Synchronizing local compiler cache.      10 months ago
abacusai           Synchronizing local compiler cache.      11 months ago
defog              Synchronizing local compiler cache.      11 months ago
gorilla-llm        Synchronizing local compiler cache.      11 months ago
ibm                Synchronizing local compiler cache.      11 months ago
m-a-p              Synchronizing local compiler cache.      11 months ago
meta-llama         Delete neuronxcc-2.13.66.0+6dfecc895/0_REGISTRY/0.0.23/inference/llama/meta-llama/Meta-Llama-3-70B/3825d3e7288b5c5f14e2.json      10 months ago
princeton-nlp      Synchronizing local compiler cache.      11 months ago
sophosympatheia    Synchronizing local compiler cache.      11 months ago