# Eagle Llama 3.1 8B Instruct

This is a converted Eagle speculator checkpoint for Llama 3.1 8B Instruct, compatible with the speculators library.

## Model Details

- **Speculator:** EAGLE draft model converted from yuhuili/EAGLE-LLaMA3.1-Instruct-8B
- **Verifier (base model):** meta-llama/Meta-Llama-3.1-8B-Instruct
- **Size:** ~252M parameters (F32 safetensors)
- **License:** Apache 2.0

## Usage

```python
from speculators.models.eagle import EagleSpeculator
from transformers import AutoModelForCausalLM

# Load the Eagle speculator
eagle_model = EagleSpeculator.from_pretrained("nm-testing/eagle-llama3.1-8b-instruct")

# Attach verifier model
verifier = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
eagle_model.attach_verifier(verifier)

# Use for speculative decoding
# ... your speculative decoding code here ...
```
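
The `speculators` library manages the draft-and-verify loop itself; the sketch below is only a conceptual illustration of the greedy accept/reject step, run directly against the verifier through `transformers`. Here `drafted_ids` is a stand-in for tokens proposed by the Eagle speculator; how they are produced is omitted.

```python
import torch

@torch.no_grad()
def speculative_step(verifier, input_ids, drafted_ids):
    """Verify drafted tokens with the verifier and keep the accepted prefix."""
    # Score the prompt plus all drafted tokens in a single verifier forward pass.
    candidate = torch.cat([input_ids, drafted_ids], dim=-1)
    logits = verifier(candidate).logits

    # A drafted token is accepted if the verifier (greedily) predicts the same
    # token at that position; logits at index i predict the token at i + 1.
    prompt_len = input_ids.shape[-1]
    verifier_choice = logits[:, prompt_len - 1 : -1, :].argmax(dim=-1)
    accepted = (verifier_choice == drafted_ids).long().cumprod(dim=-1)
    n_accepted = int(accepted.sum())

    return torch.cat([input_ids, drafted_ids[:, :n_accepted]], dim=-1), n_accepted
```

A production loop would additionally reuse KV caches, sample rather than take the argmax, and append the verifier's own next token when every drafted token is accepted.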

## Conversion Details

This checkpoint was converted using:

```bash
speculators convert --eagle \
  yuhuili/EAGLE-LLaMA3.1-Instruct-8B \
  ./eagle-standard-converted \
  meta-llama/Meta-Llama-3.1-8B-Instruct
```
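
After conversion, the local output directory loads the same way as the hub checkpoint. A quick sanity check (assuming `EagleSpeculator` behaves like a standard `torch.nn.Module`; the path is the output directory from the command above):

```python
from speculators.models.eagle import EagleSpeculator

# The converted checkpoint should load and expose roughly 252M parameters.
speculator = EagleSpeculator.from_pretrained("./eagle-standard-converted")
print(f"{sum(p.numel() for p in speculator.parameters()):,} parameters")
```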

## License

Apache 2.0
