This guide will show you how to load models quantized with autoawq, but the process is similar for llm-awq quantized models.
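As a minimal sketch of what this looks like in practice, an AutoAWQ-quantized checkpoint can be loaded with Transformers just like any other model; the quantization config stored with the checkpoint is picked up automatically. The model ID below is only an example placeholder, not a recommendation:

```py
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example AWQ-quantized checkpoint; substitute any AWQ model ID you want to load.
model_id = "TheBloke/zephyr-7B-alpha-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Transformers reads the quantization metadata from the checkpoint's config,
# so no extra quantization arguments are needed here.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```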