The model is a decoder-only transformer architecture with the following modifications:

* **ReLU Activation Function**: ReLU ([Glorot et al., 2011](https://proceedings.mlr.press/v15/glorot11a/glorot11a.pdf)) activation functions are adopted in the feed-forward networks (see the sketch below).
* **Tokenizer**: We use the SmolLM ([Allal et al., 2024](https://huggingface.co/blog/smollm)) tokenizer with a vocabulary size of 49,152.
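To make these two choices concrete, here is a minimal, unofficial sketch: a decoder-block feed-forward network that uses ReLU, plus a tokenizer load that checks the vocabulary size. The layer sizes are hypothetical placeholders (not PhoneLM's real dimensions), and the repo id is taken from the license link below; this is illustrative, not the model's actual implementation.

```python
# Unofficial sketch of the two design choices above. The hidden and
# intermediate sizes are hypothetical placeholders, not PhoneLM's
# actual dimensions.
import torch
import torch.nn as nn
from transformers import AutoTokenizer

class ReLUFeedForward(nn.Module):
    """Decoder-block feed-forward network with a ReLU activation."""

    def __init__(self, hidden_size: int = 2048, intermediate_size: int = 8192):
        super().__init__()
        self.up = nn.Linear(hidden_size, intermediate_size)
        self.act = nn.ReLU()  # ReLU in place of, e.g., GELU or SiLU
        self.down = nn.Linear(intermediate_size, hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(self.act(self.up(x)))

ffn = ReLUFeedForward()
print(ffn(torch.randn(1, 4, 2048)).shape)  # torch.Size([1, 4, 2048])

# The SmolLM-derived tokenizer; repo id taken from the license link below.
tokenizer = AutoTokenizer.from_pretrained("mllmTeam/PhoneLM-1.5B-Call")
print(tokenizer.vocab_size)  # expected: 49152
```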

## License

* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-1.5B-Call/blob/main/LICENSE) License.

## Citation

```
@misc{yi2024phonelmanefficientcapablesmall,
      title={PhoneLM: an Efficient and Capable Small Language Model Family through Principled Pre-training},
      author={Rongjie Yi and Xiang Li and Weikai Xie and Zhenyan Lu and Chenghua Wang and Ao Zhou and Shangguang Wang and Xiwen Zhang and Mengwei Xu},
      year={2024},
      eprint={2411.05046},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2411.05046},
}
```