Mistral 12B merges
This is a merge of pre-trained language models.
The goal of this merge was to create a good all-round model for creative writing, casual chat, and roleplay, suitable for fast everyday use on limited VRAM.
The model is creative enough, good at instruction following, and has acceptable context sensitivity for a 12B Mistral. Narration is nice. If the user is described in the character card, the model may reply on your behalf.
Russian was tested too: at Q4 quantization it is bad, at Q5 and higher it is good, not too dry and not too dumb.
Use the Mistral template with temperature 1.04.
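As a minimal sketch, the recommended settings can be passed on the command line when running a GGUF quant with llama.cpp. The model filename below is a placeholder, and the built-in chat-template name may differ between llama.cpp versions, so treat this as an assumption to adapt rather than an exact recipe:

```shell
# Assumed llama.cpp CLI invocation; the model path is a placeholder.
# Q5_K_M or higher is recommended if you need Russian output.
llama-cli \
  -m ./mistral-12b-merge-Q5_K_M.gguf \
  --temp 1.04 \
  --chat-template mistral-v1 \
  -cnv
```

Frontends like SillyTavern expose the same two knobs: pick the Mistral context/instruct preset and set temperature to 1.04.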