2023-05-11-distilbart_cnn_12_6_en (#13795)
* Add model 2023-05-11-distilbart_cnn_12_6_en
* Add model 2023-05-11-distilbart_cnn_6_6_en
* Add model 2023-05-11-distilbart_xsum_12_6_en
* Add model 2023-05-11-distilbart_xsum_6_6_en
* Add model 2023-05-11-bart_large_cnn_en
* Update 2023-05-11-bart_large_cnn_en.md
* Update 2023-05-11-distilbart_cnn_12_6_en.md
* Update 2023-05-11-distilbart_cnn_6_6_en.md
* Update 2023-05-11-distilbart_xsum_12_6_en.md
* Update 2023-05-11-distilbart_xsum_6_6_en.md

Co-authored-by: prabod <prabod@rathnayaka.me>
Co-authored-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>
1 parent 04149fb, commit de3e19e. Showing 5 changed files with 426 additions and 0 deletions.
2023-05-11-bart_large_cnn_en.md
---
layout: model
title: BART (large-sized model), fine-tuned on CNN Daily Mail
author: John Snow Labs
name: bart_large_cnn
date: 2023-05-11
tags: [bart, summarization, cnn, text_to_text, en, open_source, tensorflow]
task: Summarization
language: en
edition: Spark NLP 4.4.2
spark_version: 3.0
supported: true
engine: tensorflow
annotator: BartTransformer
article_header:
  type: cover
use_language_switcher: "Python-Scala-Java"
---

## Description

BART model pre-trained on the English language and fine-tuned on [CNN Daily Mail](https://huggingface.co/datasets/cnn_dailymail). It was introduced in the paper [BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension](https://arxiv.org/abs/1910.13461) by Lewis et al. and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/bart).

Disclaimer: The team releasing BART did not write a model card for this model, so this model card has been written by the Hugging Face team.

### Model description

BART is a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.

BART is particularly effective when fine-tuned for text generation (e.g. summarization, translation) but also works well for comprehension tasks (e.g. text classification, question answering). This particular checkpoint has been fine-tuned on CNN Daily Mail, a large collection of text-summary pairs.

## Predicted Entities



{:.btn-box}
<button class="button button-orange" disabled>Live Demo</button>
<button class="button button-orange" disabled>Open in Colab</button>
[Download](https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/bart_large_cnn_en_4.4.2_3.0_1683808096812.zip){:.button.button-orange}
[Copy S3 URI](s3://auxdata.johnsnowlabs.com/public/models/bart_large_cnn_en_4.4.2_3.0_1683808096812.zip){:.button.button-orange.button-orange-trans.button-icon.button-copy-s3}

## How to use

You can use this model for text summarization.

<div class="tabs-box" markdown="1">
{% include programmingLanguageSelectScalaPythonNLU.html %}
```python
bart = BartTransformer.pretrained("bart_large_cnn") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")
```
```scala
val bart = BartTransformer.pretrained("bart_large_cnn")
    .setTask("summarize:")
    .setMaxOutputLength(200)
    .setInputCols("documents")
    .setOutputCol("summaries")
```
</div>
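
As a runnable end-to-end sketch (the `DocumentAssembler` stage, the `text` input column, and the sample passage are illustrative additions, not part of the original card), the `documents` column consumed by the transformer can be produced like this:

```python
import sparknlp
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import BartTransformer
from pyspark.ml import Pipeline

spark = sparknlp.start()

# Build the "documents" annotation column from a raw "text" column.
document_assembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("documents")

bart = BartTransformer.pretrained("bart_large_cnn") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")

pipeline = Pipeline(stages=[document_assembler, bart])

# Illustrative input; any news-style passage works.
data = spark.createDataFrame(
    [["PG&E stated it scheduled the blackouts in response to forecasts for "
      "high winds amid dry conditions."]]
).toDF("text")

result = pipeline.fit(data).transform(data)
result.select("summaries.result").show(truncate=False)
```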

{:.model-param}
## Model Information

{:.table-model}
|---|---|
|Model Name:|bart_large_cnn|
|Compatibility:|Spark NLP 4.4.2+|
|License:|Open Source|
|Edition:|Official|
|Language:|en|
|Size:|975.3 MB|
2023-05-11-distilbart_cnn_12_6_en.md
---
layout: model
title: Abstractive Summarization by BART - DistilBART CNN
author: John Snow Labs
name: distilbart_cnn_12_6
date: 2023-05-11
tags: [bart, summarization, cnn, distil, text_to_text, en, open_source, tensorflow]
task: Summarization
language: en
edition: Spark NLP 4.4.2
spark_version: 3.0
supported: true
engine: tensorflow
annotator: BartTransformer
article_header:
  type: cover
use_language_switcher: "Python-Scala-Java"
---

## Description

"BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension Transformer" The Facebook BART (Bidirectional and Auto-Regressive Transformer) model is a state-of-the-art language generation model that was introduced by Facebook AI in 2019. It is based on the transformer architecture and is designed to handle a wide range of natural language processing tasks such as text generation, summarization, and machine translation. | ||
|
||
This pre-trained model is DistilBART fine-tuned on the Extreme Summarization (CNN) Dataset. | ||

## Predicted Entities



{:.btn-box}
<button class="button button-orange" disabled>Live Demo</button>
<button class="button button-orange" disabled>Open in Colab</button>
[Download](https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/distilbart_cnn_12_6_en_4.4.2_3.0_1683807053526.zip){:.button.button-orange}
[Copy S3 URI](s3://auxdata.johnsnowlabs.com/public/models/distilbart_cnn_12_6_en_4.4.2_3.0_1683807053526.zip){:.button.button-orange.button-orange-trans.button-icon.button-copy-s3}

## How to use

<div class="tabs-box" markdown="1">
{% include programmingLanguageSelectScalaPythonNLU.html %}
```python
bart = BartTransformer.pretrained("distilbart_cnn_12_6") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")
```
```scala
val bart = BartTransformer.pretrained("distilbart_cnn_12_6")
    .setTask("summarize:")
    .setMaxOutputLength(200)
    .setInputCols("documents")
    .setOutputCol("summaries")
```
</div>
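
Since the card does not show the surrounding pipeline, here is a minimal hedged sketch of one (the `DocumentAssembler` stage, column names, and sample text are illustrative, not from the original card):

```python
import sparknlp
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import BartTransformer
from pyspark.ml import Pipeline

spark = sparknlp.start()

# Produce the "documents" column the transformer expects from raw text.
document_assembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("documents")

bart = BartTransformer.pretrained("distilbart_cnn_12_6") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")

pipeline = Pipeline(stages=[document_assembler, bart])

# Illustrative input document.
data = spark.createDataFrame(
    [["The Eiffel Tower is 324 metres tall, about the same height as an "
      "81-storey building, and was the tallest man-made structure in the "
      "world for 41 years."]]
).toDF("text")

pipeline.fit(data).transform(data).select("summaries.result").show(truncate=False)
```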

{:.model-param}
## Model Information

{:.table-model}
|---|---|
|Model Name:|distilbart_cnn_12_6|
|Compatibility:|Spark NLP 4.4.2+|
|License:|Open Source|
|Edition:|Official|
|Language:|en|
|Size:|870.4 MB|

## Benchmarking

```bash
### Metrics for DistilBART models
| Model Name                 | Params (M) | Inference Time (ms) | Speedup | Rouge 2 | Rouge-L |
|:---------------------------|-----------:|--------------------:|--------:|--------:|--------:|
| distilbart-xsum-12-1       |        222 |                  90 |    2.54 |   18.31 |   33.37 |
| distilbart-xsum-6-6        |        230 |                 132 |    1.73 |   20.92 |   35.73 |
| distilbart-xsum-12-3       |        255 |                 106 |    2.16 |   21.37 |   36.39 |
| distilbart-xsum-9-6        |        268 |                 136 |    1.68 |   21.72 |   36.61 |
| bart-large-xsum (baseline) |        406 |                 229 |    1.00 |   21.85 |   36.50 |
| distilbart-xsum-12-6       |        306 |                 137 |    1.68 |   22.12 |   36.99 |
| bart-large-cnn (baseline)  |        406 |                 381 |    1.00 |   21.06 |   30.63 |
| distilbart-12-3-cnn        |        255 |                 214 |    1.78 |   20.57 |   30.00 |
| distilbart-12-6-cnn        |        306 |                 307 |    1.24 |   21.26 |   30.59 |
| distilbart-6-6-cnn         |        230 |                 182 |    2.09 |   20.17 |   29.70 |
```

Speedup is relative to the corresponding bart-large baseline (e.g. 381 ms / 214 ms ≈ 1.78 for distilbart-12-3-cnn).
2023-05-11-distilbart_cnn_6_6_en.md
---
layout: model
title: Abstractive Summarization by BART - DistilBART CNN
author: John Snow Labs
name: distilbart_cnn_6_6
date: 2023-05-11
tags: [bart, summarization, cnn, distil, text_to_text, en, open_source, tensorflow]
task: Summarization
language: en
edition: Spark NLP 4.4.2
spark_version: 3.0
supported: true
engine: tensorflow
annotator: BartTransformer
article_header:
  type: cover
use_language_switcher: "Python-Scala-Java"
---

## Description

"BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension Transformer" The Facebook BART (Bidirectional and Auto-Regressive Transformer) model is a state-of-the-art language generation model that was introduced by Facebook AI in 2019. It is based on the transformer architecture and is designed to handle a wide range of natural language processing tasks such as text generation, summarization, and machine translation. | ||
|
||
This pre-trained model is DistilBART fine-tuned on the Extreme Summarization (CNN) Dataset. | ||

## Predicted Entities



{:.btn-box}
<button class="button button-orange" disabled>Live Demo</button>
<button class="button button-orange" disabled>Open in Colab</button>
[Download](https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/distilbart_cnn_6_6_en_4.4.2_3.0_1683807295608.zip){:.button.button-orange}
[Copy S3 URI](s3://auxdata.johnsnowlabs.com/public/models/distilbart_cnn_6_6_en_4.4.2_3.0_1683807295608.zip){:.button.button-orange.button-orange-trans.button-icon.button-copy-s3}

## How to use

<div class="tabs-box" markdown="1">
{% include programmingLanguageSelectScalaPythonNLU.html %}
```python
bart = BartTransformer.pretrained("distilbart_cnn_6_6") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")
```
```scala
val bart = BartTransformer.pretrained("distilbart_cnn_6_6")
    .setTask("summarize:")
    .setMaxOutputLength(200)
    .setInputCols("documents")
    .setOutputCol("summaries")
```
</div>
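
The same pipeline pattern as the other cards applies here; a minimal sketch with this checkpoint (the assembler stage, column names, and sample text are illustrative additions):

```python
import sparknlp
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import BartTransformer
from pyspark.ml import Pipeline

spark = sparknlp.start()

# Upstream stage that converts raw text into "documents" annotations.
document_assembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("documents")

bart = BartTransformer.pretrained("distilbart_cnn_6_6") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")

pipeline = Pipeline(stages=[document_assembler, bart])

# Illustrative input document.
data = spark.createDataFrame(
    [["Mount Everest attracts many climbers, including highly experienced "
      "mountaineers. There are two main climbing routes, one from Nepal and "
      "one from Tibet."]]
).toDF("text")

pipeline.fit(data).transform(data).select("summaries.result").show(truncate=False)
```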

{:.model-param}
## Model Information

{:.table-model}
|---|---|
|Model Name:|distilbart_cnn_6_6|
|Compatibility:|Spark NLP 4.4.2+|
|License:|Open Source|
|Edition:|Official|
|Language:|en|
|Size:|551.9 MB|

## Benchmarking

```bash
### Metrics for DistilBART models
| Model Name                 | Params (M) | Inference Time (ms) | Speedup | Rouge 2 | Rouge-L |
|:---------------------------|-----------:|--------------------:|--------:|--------:|--------:|
| distilbart-xsum-12-1       |        222 |                  90 |    2.54 |   18.31 |   33.37 |
| distilbart-xsum-6-6        |        230 |                 132 |    1.73 |   20.92 |   35.73 |
| distilbart-xsum-12-3       |        255 |                 106 |    2.16 |   21.37 |   36.39 |
| distilbart-xsum-9-6        |        268 |                 136 |    1.68 |   21.72 |   36.61 |
| bart-large-xsum (baseline) |        406 |                 229 |    1.00 |   21.85 |   36.50 |
| distilbart-xsum-12-6       |        306 |                 137 |    1.68 |   22.12 |   36.99 |
| bart-large-cnn (baseline)  |        406 |                 381 |    1.00 |   21.06 |   30.63 |
| distilbart-12-3-cnn        |        255 |                 214 |    1.78 |   20.57 |   30.00 |
| distilbart-12-6-cnn        |        306 |                 307 |    1.24 |   21.26 |   30.59 |
| distilbart-6-6-cnn         |        230 |                 182 |    2.09 |   20.17 |   29.70 |
```

Speedup is relative to the corresponding bart-large baseline (e.g. 381 ms / 214 ms ≈ 1.78 for distilbart-12-3-cnn).
2023-05-11-distilbart_xsum_12_6_en.md
---
layout: model
title: Abstractive Summarization by BART - DistilBART XSUM
author: John Snow Labs
name: distilbart_xsum_12_6
date: 2023-05-11
tags: [bart, summarization, text_to_text, xsum, distil, en, open_source, tensorflow]
task: Summarization
language: en
edition: Spark NLP 4.4.2
spark_version: 3.0
supported: true
engine: tensorflow
annotator: BartTransformer
article_header:
  type: cover
use_language_switcher: "Python-Scala-Java"
---

## Description

"BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension Transformer" The Facebook BART (Bidirectional and Auto-Regressive Transformer) model is a state-of-the-art language generation model that was introduced by Facebook AI in 2019. It is based on the transformer architecture and is designed to handle a wide range of natural language processing tasks such as text generation, summarization, and machine translation. | ||
|
||
This pre-trained model is DistilBART fine-tuned on the Extreme Summarization (XSum) Dataset. | ||

## Predicted Entities



{:.btn-box}
<button class="button button-orange" disabled>Live Demo</button>
<button class="button button-orange" disabled>Open in Colab</button>
[Download](https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/distilbart_xsum_12_6_en_4.4.2_3.0_1683807498835.zip){:.button.button-orange}
[Copy S3 URI](s3://auxdata.johnsnowlabs.com/public/models/distilbart_xsum_12_6_en_4.4.2_3.0_1683807498835.zip){:.button.button-orange.button-orange-trans.button-icon.button-copy-s3}

## How to use

<div class="tabs-box" markdown="1">
{% include programmingLanguageSelectScalaPythonNLU.html %}
```python
bart = BartTransformer.pretrained("distilbart_xsum_12_6") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")
```
```scala
val bart = BartTransformer.pretrained("distilbart_xsum_12_6")
    .setTask("summarize:")
    .setMaxOutputLength(200)
    .setInputCols("documents")
    .setOutputCol("summaries")
```
</div>
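
A minimal end-to-end sketch for this checkpoint follows (the assembler stage, column names, and sample text are illustrative additions). Note that XSum-trained checkpoints are tuned toward very short, single-sentence-style summaries:

```python
import sparknlp
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import BartTransformer
from pyspark.ml import Pipeline

spark = sparknlp.start()

# Convert raw text into the "documents" annotation column.
document_assembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("documents")

# XSum fine-tuning favors highly abstractive, one-sentence summaries.
bart = BartTransformer.pretrained("distilbart_xsum_12_6") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")

pipeline = Pipeline(stages=[document_assembler, bart])

# Illustrative input document.
data = spark.createDataFrame(
    [["Researchers announced a new method for training compact summarization "
      "models by distilling larger BART checkpoints, cutting inference time "
      "while retaining most of the ROUGE score."]]
).toDF("text")

pipeline.fit(data).transform(data).select("summaries.result").show(truncate=False)
```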

{:.model-param}
## Model Information

{:.table-model}
|---|---|
|Model Name:|distilbart_xsum_12_6|
|Compatibility:|Spark NLP 4.4.2+|
|License:|Open Source|
|Edition:|Official|
|Language:|en|
|Size:|733.7 MB|

## References

https://huggingface.co/sshleifer/distilbart-xsum-12-6
## Benchmarking

```bash
### Metrics for DistilBART models
| Model Name                 | Params (M) | Inference Time (ms) | Speedup | Rouge 2 | Rouge-L |
|:---------------------------|-----------:|--------------------:|--------:|--------:|--------:|
| distilbart-xsum-12-1       |        222 |                  90 |    2.54 |   18.31 |   33.37 |
| distilbart-xsum-6-6        |        230 |                 132 |    1.73 |   20.92 |   35.73 |
| distilbart-xsum-12-3       |        255 |                 106 |    2.16 |   21.37 |   36.39 |
| distilbart-xsum-9-6        |        268 |                 136 |    1.68 |   21.72 |   36.61 |
| bart-large-xsum (baseline) |        406 |                 229 |    1.00 |   21.85 |   36.50 |
| distilbart-xsum-12-6       |        306 |                 137 |    1.68 |   22.12 |   36.99 |
| bart-large-cnn (baseline)  |        406 |                 381 |    1.00 |   21.06 |   30.63 |
| distilbart-12-3-cnn        |        255 |                 214 |    1.78 |   20.57 |   30.00 |
| distilbart-12-6-cnn        |        306 |                 307 |    1.24 |   21.26 |   30.59 |
| distilbart-6-6-cnn         |        230 |                 182 |    2.09 |   20.17 |   29.70 |
```

Speedup is relative to the corresponding bart-large baseline (e.g. 381 ms / 214 ms ≈ 1.78 for distilbart-12-3-cnn).