---
language: de
license: apache-2.0
datasets: oscar-corpus/OSCAR-2301
---

# mistral7b-de-pure-bf16

Mistral-7B-v0.1 adapted to German as part of our study on efficient language adaptation: "Language Adaptation on a Tight Academic Compute Budget: Tokenizer Swapping Works and Pure bfloat16 Is Enough".

Code: https://github.com/konstantinjdobler/tight-budget-llm-adaptation

Paper: https://openreview.net/forum?id=VYfJaHeVod

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("konstantindobler/mistral7b-de-pure-bf16")
model = AutoModelForCausalLM.from_pretrained("konstantindobler/mistral7b-de-pure-bf16")

# Use model and tokenizer as usual
```
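
A short, self-contained generation sketch might look like the following; the German prompt and decoding settings are purely illustrative and not taken from the paper:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "konstantindobler/mistral7b-de-pure-bf16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # loads in full precision on CPU by default; adjust dtype/device to your hardware

# Illustrative German prompt; greedy decoding keeps the output deterministic
prompt = "Die Hauptstadt von Deutschland ist"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```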

## Details

The model is based on [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) and was adapted to German.
The original tokenizer was kept.
The model was then trained on 8 billion German tokens from [oscar-corpus/OSCAR-2301](https://huggingface.co/oscar-corpus/OSCAR-2301) with pure bfloat16 precision (no mixed precision). More details and hyperparameters can be found [in the paper](https://openreview.net/forum?id=VYfJaHeVod).
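
Because the continued pretraining was done in pure bfloat16, it is natural (though not required) to also load the checkpoint in bfloat16 for inference. A minimal sketch, assuming a GPU with native bfloat16 support:

```python
import torch
from transformers import AutoModelForCausalLM

# Assumption: a bfloat16-capable GPU (e.g. Ampere or newer); otherwise fall back to float32
model = AutoModelForCausalLM.from_pretrained(
    "konstantindobler/mistral7b-de-pure-bf16",
    torch_dtype=torch.bfloat16,
).to("cuda")
```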

## Disclaimer

The web-scale dataset used for continued pretraining ([oscar-corpus/OSCAR-2301](https://huggingface.co/oscar-corpus/OSCAR-2301)) might contain personal and sensitive information.
This risk needs to be assessed carefully before any real-world deployment of the model.

## Citation

Please cite as follows:

```bibtex
@inproceedings{dobler2024language,
    title={Language Adaptation on a Tight Academic Compute Budget: Tokenizer Swapping Works and Pure bfloat16 Is Enough},
    author={Konstantin Dobler and Gerard de Melo},
    booktitle={2nd Workshop on Advancing Neural Network Training: Computational Efficiency, Scalability, and Resource Optimization (WANT@ICML 2024)},
    year={2024},
    url={https://openreview.net/forum?id=VYfJaHeVod}
}
```

## Acknowledgements

The project on which this model is based was funded by the Federal Ministry of Education and Research under the funding code "KI-Servicezentrum Berlin-Brandenburg" 01IS22092. Responsibility for the content of this publication remains with the author.