Adapter AdapterHub/xmod-base-ja_XX for AdapterHub/xmod-base

An adapter for the AdapterHub/xmod-base model that was trained on the ja/cc100 dataset.

This adapter was created for use with the Adapters library.

Usage

First, install adapters:

pip install -U adapters

Now, the adapter can be loaded and activated like this:

from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("AdapterHub/xmod-base")
adapter_name = model.load_adapter("AdapterHub/xmod-base-ja_XX", source="hf", set_active=True)
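Conceptually, the adapter loaded above is a small bottleneck network inserted into the model's transformer layers: it down-projects the hidden states, applies a nonlinearity, up-projects back, and adds the result to the original hidden states. The following is a minimal NumPy sketch of that computation (illustrative only — the bottleneck size and initialization are assumptions, not the Adapters library's actual implementation):

```python
import numpy as np

def bottleneck_adapter(hidden, w_down, w_up):
    """Toy bottleneck adapter: down-project, ReLU, up-project, residual add."""
    z = np.maximum(hidden @ w_down, 0.0)  # down-projection followed by ReLU
    return hidden + z @ w_up              # up-projection plus residual connection

rng = np.random.default_rng(0)
d_model, d_bottleneck = 768, 48  # 768 = xmod-base hidden size; bottleneck size is illustrative
w_down = rng.normal(size=(d_model, d_bottleneck)) * 0.02
w_up = rng.normal(size=(d_bottleneck, d_model)) * 0.02
hidden = rng.normal(size=(2, 5, d_model))  # (batch, sequence length, hidden size)

out = bottleneck_adapter(hidden, w_down, w_up)
print(out.shape)  # (2, 5, 768) — adapter preserves the hidden-state shape
```

Because of the residual connection, an adapter with zero weights leaves the hidden states unchanged, which is why adapters can be trained from a near-identity starting point without disturbing the frozen base model.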

Architecture & Training

This adapter was extracted from the original model checkpoint facebook/xmod-base to allow loading it independently via the Adapters library. For more information on architecture and training, please refer to the original model card.
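X-MOD's key idea is that the transformer body is shared across all languages while each language owns its own modular weights, selected by language code at inference time. A toy sketch of that routing (schematic only — the dictionary-based selection and dimensions are invented for illustration; the real model selects modules inside every transformer layer):

```python
import numpy as np

d_model, d_bottleneck = 8, 4
rng = np.random.default_rng(42)

def make_module():
    """Create one language-specific module: a down- and an up-projection."""
    return (rng.normal(size=(d_model, d_bottleneck)),
            rng.normal(size=(d_bottleneck, d_model)))

# One module per language code; the transformer weights themselves are shared.
modules = {lang: make_module() for lang in ("en_XX", "ja_XX", "de_DE")}

def forward(hidden, lang):
    w_down, w_up = modules[lang]          # route through the matching language module
    z = np.maximum(hidden @ w_down, 0.0)  # bottleneck down-projection + ReLU
    return hidden + z @ w_up              # up-projection with residual add

hidden = rng.normal(size=(3, d_model))
out_ja = forward(hidden, "ja_XX")
out_en = forward(hidden, "en_XX")
assert out_ja.shape == hidden.shape
assert not np.allclose(out_ja, out_en)  # different languages use different modules
```

Loading this adapter, as in the usage snippet above, effectively supplies the `ja_XX` module for the shared xmod-base body.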

Evaluation results

Citation

Lifting the Curse of Multilinguality by Pre-training Modular Transformers (Pfeiffer et al., 2022)

@inproceedings{pfeiffer-etal-2022-lifting,
    title = "Lifting the Curse of Multilinguality by Pre-training Modular Transformers",
    author = "Pfeiffer, Jonas  and
      Goyal, Naman  and
      Lin, Xi  and
      Li, Xian  and
      Cross, James  and
      Riedel, Sebastian  and
      Artetxe, Mikel",
    booktitle = "Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
    month = jul,
    year = "2022",
    address = "Seattle, United States",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.naacl-main.255",
    doi = "10.18653/v1/2022.naacl-main.255",
    pages = "3479--3495"
}