Improve model card: Add pipeline tag, library, paper, code, and usage

#1 opened by nielsr (HF Staff)

This PR enhances the model card for HRWKV7-hxa079-Qwen3-8B by:

  • Adding Metadata: Included pipeline_tag: text-generation to ensure better discoverability on the Hugging Face Hub, and library_name: transformers to enable the "how to use" widget, as the model is compatible with the Transformers library via trust_remote_code=True.
  • Prominent Links: Added direct links to the associated paper (RADLADS: Rapid Attention Distillation to Linear Attention Decoders at Scale) and the main GitHub repository (https://github.com/recursal/RADLADS) at the top of the card. The existing code links within the "Thank you" and "Training Code" sections have been clarified.
  • Sample Usage: Provided a correct transformers-based code snippet for text generation, guiding users on how to run inference with the model.
  • Citation: Added the BibTeX citation from the paper's repository.
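For reference, the kind of transformers snippet described above might look like the following. This is a minimal sketch, not the exact snippet added in the PR: the repo id "recursal/HRWKV7-hxa079-Qwen3-8B", the prompt, and the generation parameters are assumptions.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "recursal/HRWKV7-hxa079-Qwen3-8B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # the architecture ships custom modeling code
    torch_dtype="auto",
)

prompt = "Explain linear attention in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that trust_remote_code=True is required in both from_pretrained calls, since the model's custom architecture is loaded from code in the repository rather than from the transformers library itself.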

These changes improve the model's visibility and usability, and provide more comprehensive information for users and researchers.
