Tags: Text Generation · Safetensors · English · llama · creative · creative writing · fiction writing · plot generation · sub-plot generation · story generation · scene continue · storytelling · fiction story · science fiction · romance · all genres · story · writing · vivid prose · vivid writing · fiction · roleplaying · bfloat16 · swearing · rp · llama3 · llama-3 · enhanced quants · max quants · maxcpu quants · horror · finetune · Merge · Not-For-All-Audiences · conversational
Update README.md
README.md CHANGED
@@ -74,6 +74,38 @@ You can see some of these "wordstorm" versions of "Dark Planet" in this model:
[ https://huggingface.co/DavidAU/L3-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-47B ]

MERGEKIT Formula:

```
models:
  - model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      weight: [1,1,.75,.5,.25,.25,.05,.01]
      density: .8
  - model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      weight: [0,0,.25,.35,.4,.25,.30,.04]
      density: .6
merge_method: dare_ties
base_model: Hastagaras/Jamet-8B-L3-MK.V-Blackroot
dtype: bfloat16
```
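To reproduce the merge, the formula above can be saved to a YAML file and passed to mergekit's `mergekit-yaml` entry point. The filename and output directory below are arbitrary examples:

```shell
# Install mergekit, then run the merge from the saved formula.
pip install mergekit
mergekit-yaml dark-planet-formula.yml ./merged-model --cuda
```

Expect the run to download the source models from Hugging Face; omitting `--cuda` merges on CPU instead.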
NOTE:

This will NOT produce the "exact" version of this model (operation / output / attributes) because of the "density" settings. Density introduces random pruning into the model, which can have minor to major impacts on performance, ranging from slightly negative/positive to very strongly positive/negative.

Each time you "create" this model (in mergekit) you will get a different model. This is NOT a fault or error; it is a feature of using "density".

The closer "density" is to "1", the less pruning will occur, with NO pruning occurring at a density of "1".

MERGEKIT:

https://github.com/arcee-ai/mergekit
<B>IMPORTANT: Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
If you are going to use this model (source, GGUF, or a different quant), please review this document for critical parameter, sampler, and advanced sampler settings (for multiple AI/LLM apps).