Plus Light (Light4Bear)
AI & ML interests: None yet
Organizations: none listed

Light4Bear's activity
- GPTQ Support (2) · #1 opened 3 months ago by warlock-edward
- Join LMSYS Chatbot Arena? (3 · 1) · #11 opened 11 months ago by Light4Bear
- Can VLLM be used for loading? (6) · #4 opened about 1 year ago by wawoshashi
- How many bits and what is the groupsize? (1) · #3 opened about 1 year ago by vitvit
- How to load command r+ in text-generation-webui? (5) · #1 opened about 1 year ago by MLDataScientist
- GPTQ/AWQ quant that is runable in vllm? (2) · #4 opened about 1 year ago by Light4Bear
- Difference between v0.2 and v0.4? (1) · #2 opened about 1 year ago by Light4Bear
- Instruct sequences? (12) · #2 opened about 1 year ago by deleted
- What is the context size of this model? (8) · #1 opened about 1 year ago by MarinaraSpaghetti
- Safetensor naming convention (10) · #1 opened about 1 year ago by dannysemi
- vLLM output gibberish but text-generation-webui is fine (1) · #2 opened about 1 year ago by Light4Bear
- GPTQ and AWQ quants (1) · #1 opened about 1 year ago by Light4Bear
- Congratulations! (5 · 10) · #1 opened about 1 year ago by TomGrc
- Difference between this and the non-"test" version? (16) · #2 opened about 1 year ago by samgreen
- 8.0bpw-h8-exl2 quant of this model (6) · #1 opened over 1 year ago by Light4Bear
- 8.0bpw-h8-exl2 quant of this model (6) · #1 opened over 1 year ago by Light4Bear
- 8.0bpw-h8-exl2 quant of this model (6) · #1 opened over 1 year ago by Light4Bear