This is a W8A8-FP8 quantization (FP8 weights with dynamic per-token FP8 activations) created using llm-compressor. It can be loaded directly with vLLM.
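For example, a minimal loading sketch with vLLM's offline `LLM` API (sampling parameters are illustrative; a model of this size will likely need tensor parallelism across several GPUs):

```python
from vllm import LLM, SamplingParams

# vLLM detects the compressed-tensors FP8 config from the checkpoint automatically.
# tensor_parallel_size is an assumption -- adjust to your GPU count.
llm = LLM(
    model="aikitoria/c4ai-command-a-03-2025-FP8-Dynamic",
    tensor_parallel_size=2,
)

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Write a haiku about quantization."], params)
print(outputs[0].outputs[0].text)
```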
Model tree for aikitoria/c4ai-command-a-03-2025-FP8-Dynamic
- Base model: CohereLabs/c4ai-command-a-03-2025
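
The author's exact recipe isn't published here; the following is a minimal sketch of how an FP8-Dynamic quant of the base model is typically produced with llm-compressor's `QuantizationModifier` (the `FP8_DYNAMIC` scheme needs no calibration data, since activations are quantized per token at runtime; import paths may differ slightly across llm-compressor versions):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from llmcompressor.transformers import oneshot
from llmcompressor.modifiers.quantization import QuantizationModifier

MODEL_ID = "CohereLabs/c4ai-command-a-03-2025"

model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype="auto", device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# FP8_DYNAMIC: weights are statically quantized to FP8 per channel;
# activations are quantized dynamically per token at inference time.
# lm_head is kept in higher precision, as is common practice.
recipe = QuantizationModifier(
    targets="Linear", scheme="FP8_DYNAMIC", ignore=["lm_head"]
)

oneshot(model=model, recipe=recipe)

# save_compressed=True writes the compressed-tensors format vLLM expects.
model.save_pretrained("c4ai-command-a-03-2025-FP8-Dynamic", save_compressed=True)
tokenizer.save_pretrained("c4ai-command-a-03-2025-FP8-Dynamic")
```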