
Service Deployment Code:

https://github.com/manojpreveen/Summarization-Service

Usage

This checkpoint should be loaded with `BartForConditionalGeneration.from_pretrained`. See the BART docs for more information.
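
A minimal usage sketch, assuming the `transformers` library is installed; the example article text and the generation settings (beam search, `max_length=142`) are illustrative assumptions, not values stated in this card:

```python
from transformers import BartForConditionalGeneration, BartTokenizer

# Load the distilled checkpoint and its tokenizer (one of the checkpoints
# listed in the metrics table below).
model_name = "manojpreveen/distilbart-cnn-v3"
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

article = "Your long article text goes here."

# Tokenize, generate a summary with beam search, and decode it back to text.
inputs = tokenizer(article, max_length=1024, truncation=True, return_tensors="pt")
summary_ids = model.generate(inputs["input_ids"], num_beams=4, max_length=142)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```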

Metrics for DistilBART models

| Model Name | MM Params | Inference Time (ms) | Speedup | ROUGE-2 | ROUGE-L |
|---|---|---|---|---|---|
| facebook/bart-large-cnn (baseline) | 406 | 381 | 1.00 | 21.06 | 30.63 |
| manojpreveen/distilbart-cnn-v3 | 306 | 307 | 1.24 | 21.26 | 30.59 |
| manojpreveen/distilbart-cnn-v2 | 255 | 214 | 1.78 | 20.57 | 30.00 |
| manojpreveen/distilbart-cnn-v1 | 230 | 182 | 2.09 | 20.17 | 29.70 |
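
The card does not describe the benchmarking setup behind the table above; the sketch below only illustrates how per-example inference time and speedup could be measured. Hardware, batch size, the sample texts, and the generation settings are all assumptions here:

```python
import time

import torch
from transformers import BartForConditionalGeneration, BartTokenizer

def mean_inference_ms(model_name: str, texts: list[str]) -> float:
    """Average wall-clock generation time per example, in milliseconds."""
    tokenizer = BartTokenizer.from_pretrained(model_name)
    model = BartForConditionalGeneration.from_pretrained(model_name)
    model.eval()

    start = time.perf_counter()
    with torch.no_grad():
        for text in texts:
            inputs = tokenizer(text, max_length=1024, truncation=True, return_tensors="pt")
            model.generate(inputs["input_ids"], num_beams=4, max_length=142)
    return (time.perf_counter() - start) * 1000 / len(texts)

# Placeholder inputs; in practice these would be articles from the evaluation set.
sample_texts = ["First example article.", "Second example article."]

# Speedup is the baseline's per-example time divided by the distilled model's.
baseline_ms = mean_inference_ms("facebook/bart-large-cnn", sample_texts)
distilled_ms = mean_inference_ms("manojpreveen/distilbart-cnn-v3", sample_texts)
print(f"speedup: {baseline_ms / distilled_ms:.2f}x")
```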
