Update README.md
README.md CHANGED
@@ -22,9 +22,7 @@ tags:
 This is an ONNX version of the Phi-4 multimodal model that is quantized to int4 precision to accelerate inference with ONNX Runtime.
 
 ## Model Run
-
-
-For CPU: stay tuned!
+For CPU: stay tuned or follow [this tutorial](https://github.com/microsoft/onnxruntime-genai/blob/main/examples/python/phi-4-multi-modal.md) to generate your own ONNX models for CPU!
 
 <!-- ```bash
 # Download the model directly using the Hugging Face CLI
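The hunk ends on a commented-out `bash` snippet whose only visible line announces a Hugging Face CLI download. As a separate, non-authoritative sketch of that step: a typical CLI invocation looks like the following, where the repo id and local directory are placeholders rather than values taken from the diff.

```bash
# Minimal sketch of the download step; repo id and target directory are placeholders.
pip install -U "huggingface_hub[cli]"
huggingface-cli download <org>/<phi-4-multimodal-onnx-repo> --local-dir ./phi-4-mm-onnx
```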
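The added line 25 points readers to the onnxruntime-genai tutorial for producing their own CPU ONNX models. As a rough, hedged sketch only: onnxruntime-genai ships a model builder that is commonly invoked as below for int4 CPU output. The source model id and output path are placeholders, and it is not established here whether a single invocation covers the vision and speech components of the multimodal model, so treat the linked tutorial as the authoritative procedure.

```bash
# Hedged sketch of the onnxruntime-genai model builder; the id and paths are placeholders.
# -p selects the precision (int4) and -e the execution provider (cpu).
python -m onnxruntime_genai.models.builder \
    -m <source-model-id> \
    -o ./phi-4-mm-cpu-int4 \
    -p int4 \
    -e cpu
```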