alexmarques committed on
Commit 5b554e0 · verified · 1 Parent(s): 1455f0f

Update README.md

Update README.md

Files changed (1): README.md (+7 −7)
README.md CHANGED

@@ -32,7 +32,7 @@ base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
 - **Model Developers:** Neural Magic
 
 Quantized version of [Meta-Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct).
-It achieves scores within 2.3% of the scores of the unquantized model for MMLU, ARC-Challenge, GSM-8k, Hellaswag and Winogrande, and within 7.4% for TruthfulQA.
+It achieves scores within 3.1% of the scores of the unquantized model for MMLU, ARC-Challenge, GSM-8k, Hellaswag, Winogrande, and TruthfulQA.
 
 ### Model Optimizations
 
@@ -200,9 +200,9 @@ This version of the lm-evaluation-harness includes versions of MMLU, ARC-Challen
 </td>
 <td>78.06
 </td>
-<td>76.40
+<td>77.98
 </td>
-<td>97.9%
+<td>99.9%
 </td>
 </tr>
 <tr>
@@ -210,9 +210,9 @@ This version of the lm-evaluation-harness includes versions of MMLU, ARC-Challen
 </td>
 <td>54.48
 </td>
-<td>50.46
+<td>52.81
 </td>
-<td>92.6%
+<td>96.9%
 </td>
 </tr>
 <tr>
@@ -220,9 +220,9 @@ This version of the lm-evaluation-harness includes versions of MMLU, ARC-Challen
 </td>
 <td><strong>74.25</strong>
 </td>
-<td><strong>72.58</strong>
+<td><strong>73.24</strong>
 </td>
-<td><strong>98.6%</strong>
+<td><strong>98.6%</strong>
 </td>
 </tr>
 </table>
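The third column updated in each table row is the recovery percentage: the quantized model's score expressed as a fraction of the unquantized baseline, rounded to one decimal place. A minimal sketch of that arithmetic, using the score pairs from this diff (the row labels are hypothetical, since the benchmark names for these rows fall outside the excerpt):

```python
def recovery(quantized: float, unquantized: float) -> float:
    """Quantized score as a percentage of the unquantized baseline."""
    return round(quantized / unquantized * 100, 1)

# (quantized, unquantized) score pairs taken from the updated cells above;
# row labels are illustrative placeholders, not from the source diff.
rows = [
    ("row @200", 77.98, 78.06),  # diff shows 99.9%
    ("row @210", 52.81, 54.48),  # diff shows 96.9%
    ("row @220", 73.24, 74.25),  # diff shows 98.6% (bolded, likely an average)
]

for name, q, base in rows:
    print(f"{name}: {recovery(q, base)}%")
```

Running this reproduces the three updated recovery cells, confirming the new percentages are consistent with the new quantized scores.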