---
tags:
- adapter-transformers
- multiple-choice
- adapterhub:comsense/hellaswag
- distilbert
datasets:
- hellaswag
license: "apache-2.0"
---
# Adapter `distilbert-base-uncased_comsense_hellaswag_pfeiffer` for distilbert-base-uncased
An adapter for `distilbert-base-uncased` in the Pfeiffer architecture, trained on the HellaSwag dataset for 15 epochs with early stopping and a learning rate of 1e-4.
**This adapter was created for usage with the [Adapters](https://github.com/Adapter-Hub/adapters) library.**
## Usage
First, install `adapters`:
```bash
pip install -U adapters
```
Now, the adapter can be loaded and activated like this:
```python
from adapters import AutoAdapterModel

# Load the base model with adapter support.
model = AutoAdapterModel.from_pretrained("distilbert-base-uncased")

# Download the adapter (including its prediction head) from the Hub
# and activate it so it is used in every forward pass.
adapter_name = model.load_adapter("AdapterHub/distilbert-base-uncased_comsense_hellaswag_pfeiffer")
model.set_active_adapters(adapter_name)
```
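For illustration, a minimal inference sketch continuing from the snippet above (reusing `model`). It assumes the adapter ships a four-way multiple-choice head, matching HellaSwag's four candidate endings per context; the example sentences below are made up.

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# One context paired with four candidate endings (illustrative text).
context = "A man climbs onto the roof of his house."
endings = [
    "He starts pulling up the old roofing tiles.",
    "He begins swimming laps in the pool.",
    "He orders a pizza from the kitchen.",
    "He plants a row of tomatoes in the garden.",
]

# Encode the context together with each candidate ending.
encoding = tokenizer([context] * len(endings), endings, padding=True, return_tensors="pt")
# The multiple-choice head expects inputs of shape (batch_size, num_choices, seq_len).
inputs = {k: v.unsqueeze(0) for k, v in encoding.items()}

model.eval()
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch_size, num_choices)
print("Predicted ending:", endings[logits.argmax(dim=-1).item()])
```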
## Architecture & Training
- Adapter architecture: Pfeiffer (see the training sketch after this list)
- Prediction head: multiple choice
- Dataset: [HellaSwag](https://rowanzellers.com/hellaswag/)
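For reference, a comparable adapter could be set up for training roughly as follows. This is a sketch, not the original training script: `SeqBnConfig` is the sequential bottleneck ("Pfeiffer") configuration in current `adapters` releases (older versions call it `PfeifferConfig`), and the adapter name and `reduction_factor` here are illustrative.

```python
from adapters import AutoAdapterModel, SeqBnConfig

model = AutoAdapterModel.from_pretrained("distilbert-base-uncased")

# Sequential bottleneck ("Pfeiffer") adapter; reduction_factor is illustrative.
config = SeqBnConfig(reduction_factor=16)
model.add_adapter("hellaswag", config=config)

# Four-way multiple-choice head, matching HellaSwag's four candidate endings.
model.add_multiple_choice_head("hellaswag", num_choices=4)

# Freeze the base model weights and train only the adapter and head.
model.train_adapter("hellaswag")
```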
## Author Information
- Author name(s): Clifton Poth
- Author email: calpt@mail.de
- Author links: [Website](https://calpt.github.io), [GitHub](https://github.com/calpt), [Twitter](https://twitter.com/@clifapt)
## Citation
If you use this adapter, please cite the HellaSwag dataset:
```bibtex
@inproceedings{zellers2019hellaswag,
  title     = {HellaSwag: Can a Machine Really Finish Your Sentence?},
  author    = {Zellers, Rowan and Holtzman, Ari and Bisk, Yonatan and Farhadi, Ali and Choi, Yejin},
  booktitle = {Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL)},
  year      = {2019}
}
```
*This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/distilbert-base-uncased_comsense_hellaswag_pfeiffer.yaml*.