Introduction

Allenai's Longformer Encoder-Decoder (LED).

As described in Longformer: The Long-Document Transformer by Iz Beltagy, Matthew E. Peters, and Arman Cohan, led-base-16384 was initialized from bart-base, since both models share the exact same architecture. To be able to process up to 16K tokens, bart-base's position embedding matrix was simply copied 16 times.
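The initialization trick above can be sketched with plain arrays. This is an illustrative sketch, not the actual conversion script; the shapes assume bart-base's 1,024 learned positions and hidden size 768, and the variable names are made up:

```python
import numpy as np

# bart-base learns 1,024 position embeddings with hidden size 768 (illustrative random values)
bart_positions = np.random.randn(1024, 768).astype(np.float32)

# LED reaches 16,384 positions by tiling the matrix 16 times along the position axis
led_positions = np.tile(bart_positions, (16, 1))

assert led_positions.shape == (16384, 768)
# every 1,024-row slice is an exact copy of the original matrix
assert np.array_equal(led_positions[1024:2048], bart_positions)
```

Because the copied embeddings repeat every 1,024 positions, fine-tuning is what teaches the model to make use of the longer context.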

This model is especially interesting for long-range summarization and question answering.

Fine-tuning for downstream tasks

This notebook shows how led-base-16384 can effectively be fine-tuned on a downstream task.
