---
language: en
license: apache-2.0
tags: 
- fill-mask
datasets: 
- wikipedia
- bookcorpus
---
# 80% 1x4 Block Sparse BERT-Base (uncased) Prune OFA
This model was created using the Prune OFA method described in [Prune Once for All: Sparse Pre-Trained Language Models](https://arxiv.org/abs/2111.05754), presented at the ENLSP NeurIPS Workshop 2021.
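The "1x4 block sparse" pattern in the model name means weights are pruned in contiguous groups of 4 values along each row of a weight matrix, with roughly 80% of those blocks zeroed out. As a minimal illustration (a sketch for explanation only, not part of the released code; `block_sparsity_1x4` is a hypothetical helper), the block sparsity of a matrix can be measured like this:

```python
# Hypothetical helper illustrating the 1x4 block-sparsity pattern:
# weights are zeroed in groups of 4 contiguous values along each row.
def block_sparsity_1x4(weight):
    """Return the fraction of 1x4 blocks that are entirely zero."""
    total = zero = 0
    for row in weight:
        for i in range(0, len(row), 4):
            total += 1
            if all(v == 0 for v in row[i:i + 4]):
                zero += 1
    return zero / total

# Toy 2x8 weight matrix: one of the two 1x4 blocks in each row is zeroed,
# so the block sparsity is 2 / 4 = 0.5.
w = [
    [0.3, -0.1, 0.2, 0.7, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0, 0.5, 0.2, -0.4, 0.1],
]
print(block_sparsity_1x4(w))  # 0.5
```

A block layout like this lets sparse kernels skip whole groups of weights at once, which is why structured 1x4 sparsity is easier to accelerate than unstructured sparsity at the same ratio.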

For further details on the model and its results, see our paper and our implementation, available [here](https://github.com/IntelLabs/Model-Compression-Research-Package/tree/main/research/prune-once-for-all).