---
language: en
license: apache-2.0
tags:
- fill-mask
datasets:
- wikipedia
- bookcorpus
---
# 80% 1x4 Block Sparse BERT-Base (uncased) Prune OFA
This model was created using the Prune OFA method described in *Prune Once for All: Sparse Pre-Trained Language Models*, presented at the ENLSP NeurIPS Workshop 2021.
For further details on the model and its results, see our paper and our implementation, available here.