Add library name, pipeline tag, and license
#1 opened by nielsr (HF Staff)

README.md CHANGED
@@ -3,7 +3,11 @@ datasets:
 - HuggingFaceTB/smollm-corpus
 language:
 - en
+library_name: transformers
+pipeline_tag: text-generation
+license: apache-2.0
 ---
+
 # Outlier-Safe Pre-Training
 
 [](https://arxiv.org/abs/2506.19697)
@@ -25,7 +29,14 @@ A method that prevents outliers but significantly reduces efficiency is unlikely
 3. 🧩**Ensuring full compatibility with existing inference pipelines**<br/>
 We prioritize compatibility with widely adopted inference frameworks such as vLLM and SGLang. Rather than introducing architectural changes that break compatibility, OSP preserves computational invariance, allowing models to be directly integrated into existing pipelines without additional effort.
 
+<p align="center">
+  <img src="./images/figure2.png" alt="drawing" width="700"/>
+</p>
+
+## News
 
+- **2025-06-25**: Released **Outlier-Safe Pre-Training for Robust 4-Bit Quantization of Large Language Models** on [arXiv](https://www.arxiv.org/abs/2506.19697), with [GitHub](https://github.com/dmis-lab/Outlier-Safe-Pre-Training) and [models](https://huggingface.co/collections/dmis-lab/outlier-safe-pre-training-osp-685bda10aa1e8a19fcb58ea8).
+- **2025-05-16**: Our paper has been accepted to ACL 2025! 🎉
 
 ## Model Checkpoints
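
The new `library_name: transformers` and `pipeline_tag: text-generation` metadata are what let the Hub route the model into the standard `transformers` pipeline API and show the matching usage widget. A minimal sketch of what that enables, assuming a checkpoint from the linked OSP collection in standard `transformers` format (the repo id below is a placeholder, not an actual model name):

```python
# Minimal sketch: with pipeline_tag set to text-generation, the model
# loads through the generic pipeline API. "dmis-lab/<osp-checkpoint>"
# is a hypothetical placeholder repo id.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="dmis-lab/<osp-checkpoint>",
)

result = generator("Outlier-Safe Pre-Training is", max_new_tokens=32)
print(result[0]["generated_text"])
```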
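
The README text in the diff also claims drop-in compatibility with vLLM, since OSP preserves computational invariance rather than changing the architecture. A minimal sketch of what that usage would look like, under the same placeholder-repo-id assumption:

```python
# Minimal sketch: serving an OSP checkpoint with vLLM.
# "dmis-lab/<osp-checkpoint>" is a hypothetical placeholder repo id.
from vllm import LLM, SamplingParams

llm = LLM(model="dmis-lab/<osp-checkpoint>")
params = SamplingParams(temperature=0.8, max_tokens=64)
outputs = llm.generate(["Outlier-Safe Pre-Training is"], params)
print(outputs[0].outputs[0].text)
```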