---
tags:
- xlm-roberta
- adapterhub:zh/wiki
- adapter-transformers
language:
- zh
license: "apache-2.0"
---
# Adapter `xlm-roberta-large-zh-wiki_pfeiffer` for xlm-roberta-large
A Pfeiffer adapter trained with masked language modelling (MLM) on Chinese Wikipedia articles for 250k steps with a batch size of 64.
**This adapter was created for use with the [Adapters](https://github.com/Adapter-Hub/adapters) library.**
## Usage
First, install `adapters`:
```bash
pip install -U adapters
```
Now, the adapter can be loaded and activated like this:
```python
from adapters import AutoAdapterModel

# Load the base model with adapter support
model = AutoAdapterModel.from_pretrained("xlm-roberta-large")
# Download the adapter from the Hub and activate it
adapter_name = model.load_adapter("AdapterHub/xlm-roberta-large-zh-wiki_pfeiffer")
model.set_active_adapters(adapter_name)
```
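Since this adapter ships without a prediction head, a quick way to sanity-check it is to run a sentence through the adapted encoder and inspect the hidden states. A minimal sketch, assuming the `model` and `adapter_name` from the snippet above (the example sentence is arbitrary):

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")

# Any Chinese input works here; this sentence is just an illustration.
inputs = tokenizer("维基百科是一个自由的百科全书。", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# With no prediction head attached, the model returns the encoder's
# contextual representations.
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 1024)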
## Architecture & Training
- Adapter architecture: pfeiffer
- Prediction head: None (this language adapter is meant to be combined with a task adapter; see the sketch below)
- Dataset: [zh/wiki](https://adapterhub.ml/explore/zh/wiki/)
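
In the MAD-X setup this adapter comes from, a language adapter like this one is not used alone: it is stacked with a task adapter that was trained on a (possibly different) source language. A hedged sketch of that composition with the `adapters` library, assuming a hypothetical, untrained task adapter called `"ner"` (not part of this repository) and the `model`/`adapter_name` from the usage section:

```python
from adapters.composition import Stack

# Add a task adapter and a matching token-classification head.
# In MAD-X, the task adapter is trained while the language adapter stays frozen.
model.add_adapter("ner")
model.add_tagging_head("ner", num_labels=7)

# Stack the adapters: input -> zh language adapter -> task adapter.
model.active_adapters = Stack(adapter_name, "ner")
```

For zero-shot cross-lingual transfer, MAD-X swaps in this zh language adapter at inference time while keeping the trained task adapter fixed.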
## Author Information
- Author name(s): Jonas Pfeiffer
- Author email: jonas@pfeiffer.ai
- Author links: [Website](https://pfeiffer.ai), [GitHub](https://github.com/jopfeiff), [Twitter](https://twitter.com/@PfeiffJo)
## Citation
```bibtex
@article{pfeiffer20madx,
  title   = {{MAD-X}: An {A}dapter-based {F}ramework for {M}ulti-task {C}ross-lingual {T}ransfer},
  author  = {Pfeiffer, Jonas and Vuli\'{c}, Ivan and Gurevych, Iryna and Ruder, Sebastian},
  journal = {arXiv preprint},
  year    = {2020},
  url     = {https://arxiv.org/pdf/2005.00052.pdf},
}
```
*This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/xlm-roberta-large-zh-wiki_pfeiffer.yaml*.