Does vllm 0.8.4 support this quantized model?

#1
by traphix - opened

Does vllm 0.8.4 support this quantized model?

Red Hat AI org

Yes, it does.
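
For reference, a minimal sketch of loading a quantized checkpoint with vLLM's offline `LLM` API under vLLM 0.8.4. The model ID below is a placeholder for this repository's name, and the prompt and sampling settings are illustrative; vLLM picks up the quantization scheme from the checkpoint's config, so no extra flags are assumed here.

```python
from vllm import LLM, SamplingParams

# Placeholder model ID -- substitute the actual repository name of this quantized model.
llm = LLM(model="RedHatAI/<this-quantized-model>")

# Illustrative sampling settings.
sampling_params = SamplingParams(temperature=0.7, max_tokens=64)

outputs = llm.generate(["What does quantization change about inference?"], sampling_params)
for output in outputs:
    print(output.outputs[0].text)
```

The same checkpoint can also be served over an OpenAI-compatible endpoint with `vllm serve <model-id>`.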
