MoC: Mixtures of Text Chunking Learners for Retrieval-Augmented Generation System
Meta-chunker-1.5B-60K was obtained by fully fine-tuning Qwen2.5-1.5B-Instruct on 60K data entries drawn from CRUD_MASK.jsonl and WanJuan1_MASK.jsonl, which were prepared with GPT-4o and ERNIE-3.5-128K, respectively.
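
Because the chunker is a fully fine-tuned Qwen2.5-1.5B-Instruct checkpoint, it can be loaded with the standard `transformers` causal-LM API. The snippet below is a minimal sketch: the repository path and the chunking instruction are placeholders (not specified by this card), so adapt them to the actual repo id and the prompt format used during fine-tuning.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with the actual Hub repo id or a local checkpoint path.
model_id = "path/to/Meta-chunker-1.5B-60K"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the dtype the checkpoint was saved in
    device_map="auto",    # requires `accelerate`; remove to load on CPU
)

# Illustrative chunking instruction; the exact prompt format is an assumption.
messages = [
    {
        "role": "user",
        "content": "Split the following passage into semantically coherent chunks:\n<your text here>",
    }
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens (the chunking result).
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```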