---
license: mit
base_model:
  - deepseek-ai/DeepSeek-R1-0528-Qwen3-8B
pipeline_tag: text-generation
tags:
  - OpenVINO
  - Optimum-Intel
  - OpenArc
---

My project OpenArc, an inference engine for OpenVINO, now supports this model and serves inference over OpenAI-compatible endpoints for both text-to-text and text-with-vision workloads!
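As a sketch of what that looks like from the client side, you can point the standard OpenAI Python client at a running OpenArc server. The base URL, API key, and served model name below are placeholders, not values taken from this card; adjust them to match your own OpenArc configuration.

```python
from openai import OpenAI

# Point the standard OpenAI client at a locally running OpenArc server.
# base_url, api_key, and model are assumptions for illustration only.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="DeepSeek-R1-0528-Qwen3-8B-int4_asym-awq-se-ov",
    messages=[{"role": "user", "content": "Hello! What can you do?"}],
)

print(response.choices[0].message.content)
```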

We have a growing Discord community of people interested in using Intel hardware for AI/ML.

This repo contains OpenVINO quantized versions of DeepSeek-R1-0528-Qwen3-8B.

I recommend starting with `DeepSeek-R1-0528-Qwen3-8B-int4_asym-awq-se-ov`.

To download individual models from this repo, use the provided snippet:

```python
from huggingface_hub import snapshot_download

repo_id = "Echo9Zulu/DeepSeek-R1-0528-Qwen3-8B-OpenVINO"

# Choose the weights you want
repo_directory = "DeepSeek-R1-0528-Qwen3-8B-int4_asym-awq-se-ov"

# Where you want to save it
local_dir = "./Echo9Zulu_DeepSeek-R1-0528-Qwen3-8B/DeepSeek-R1-0528-Qwen3-8B-int4_asym-awq-se-ov"

snapshot_download(
    repo_id=repo_id,
    allow_patterns=[f"{repo_directory}/*"],
    local_dir=local_dir,
    local_dir_use_symlinks=True
)

print("Download complete!")
```