DistilBERT release
Original DistilBERT models, with checkpoints obtained via teacher-student distillation from the original BERT checkpoints.

distilbert/distilbert-base-cased • Fill-Mask • 0.1B params • Updated May 6, 2024 • 175k downloads • 51 likes
distilbert/distilbert-base-uncased • Fill-Mask • 0.1B params • Updated May 6, 2024 • 9.11M downloads • 728 likes
distilbert/distilbert-base-multilingual-cased • Fill-Mask • 0.1B params • Updated May 6, 2024 • 932k downloads • 200 likes
distilbert/distilbert-base-uncased-finetuned-sst-2-english • Text Classification • 0.1B params • Updated Dec 19, 2023 • 3.21M downloads • 793 likes
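As a quick illustration (not part of the original listing), the checkpoints above can be loaded with the Hugging Face transformers pipeline API; the model IDs come from the listing, while the example sentences are made up:

```python
from transformers import pipeline

# Fill-mask with the distilled base checkpoint (model ID from the listing above).
fill_mask = pipeline("fill-mask", model="distilbert/distilbert-base-uncased")
print(fill_mask("Paris is the [MASK] of France.")[0]["token_str"])

# Sentiment classification with the SST-2 fine-tuned checkpoint.
classifier = pipeline(
    "text-classification",
    model="distilbert/distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("DistilBERT is fast and surprisingly accurate."))
```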