aws-neuron/optimum-neuron-cache
License: apache-2.0
optimum-neuron-cache / neuronxcc-2.15.128.0+56dc5a86 / 0_REGISTRY / 0.0.25 / inference / llama / NousResearch
3 contributors · History: 9 commits
Latest commit: dacorvo (HF Staff) · Synchronizing local compiler cache. · 6aaebf0 · verified · 5 months ago
Hermes-2-Theta-Llama-3-8B      Synchronizing local compiler cache.    5 months ago
Hermes-3-Llama-3.1-8B          Synchronizing local compiler cache.    6 months ago
Llama-3.2-1B                   Synchronizing local compiler cache.    6 months ago
Meta-Llama-3-8B-Instruct       Synchronizing local compiler cache.    7 months ago
Meta-Llama-3.1-8B-Instruct     Synchronizing local compiler cache.    6 months ago