---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
---
## Symbol-LLM: Towards Foundational Symbol-centric Interface for Large Language Models

Paper Link: https://arxiv.org/abs/2311.09278

Project Page: https://xufangzhi.github.io/symbol-llm-page/

Code: https://github.com/xufangzhi/Genius
## 🔥 News

- 🔥🔥🔥 Symbol-LLM has been accepted to ACL 2024. See you in Thailand!
- 🔥🔥🔥 The Symbol-LLM series models (7B / 13B) are now publicly available; a minimal loading sketch is shown below.
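Since this model card is tagged with the `transformers` library and the `text-generation` pipeline, the sketch below shows one way to load and query a Symbol-LLM checkpoint. The repository ID `Symbol-LLM/Symbol-LLM-7B-Instruct` and the example prompt are assumptions for illustration only; check the Hugging Face Hub for the exact checkpoint name before running.

```python
# Minimal sketch for loading a Symbol-LLM checkpoint with transformers.
# NOTE: the repo ID below is an assumption -- verify the exact model name on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Symbol-LLM/Symbol-LLM-7B-Instruct"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so a 7B model fits on a single GPU
    device_map="auto",          # requires the `accelerate` package
)

# Illustrative symbol-centric prompt; adapt it to the task format described in the paper.
prompt = "Translate the following statement into first-order logic: All birds can fly."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```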
## Note

The work is under review.
## Citation

If you find this work helpful, please cite the paper:

```bibtex
@article{xu2023symbol,
  title={Symbol-LLM: Towards Foundational Symbol-centric Interface For Large Language Models},
  author={Xu, Fangzhi and Wu, Zhiyong and Sun, Qiushi and Ren, Siyu and Yuan, Fei and Yuan, Shuai and Lin, Qika and Qiao, Yu and Liu, Jun},
  journal={arXiv preprint arXiv:2311.09278},
  year={2023}
}
```