Update XXXForSequence with multilabel and activation function #13779

Merged · 8 commits · May 10, 2023
@@ -4,7 +4,9 @@ AlbertForSequenceClassification

{%- capture description -%}
AlbertForSequenceClassification can load ALBERT Models with sequence classification/regression head on top
-(a linear layer on top of the pooled output) e.g. for multi-class document classification tasks.
+(a linear layer on top of the pooled output), e.g. for document classification tasks.
+
+For multi-class, use `setActivation("softmax")`. For multi-label, use `setActivation("sigmoid")`.

Pretrained models can be loaded with `pretrained` of the companion object:
@@ -125,4 +127,4 @@ scala_example=scala_example
api_link=api_link
python_api_link=python_api_link
source_link=source_link
-%}
+%}
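
For context, here is a minimal Scala sketch of the multi-label setup the updated description above refers to. It is not part of the PR: it assumes a running `SparkSession` named `spark`, the annotator's default `pretrained()` model, and toy data, and only illustrates `setActivation("sigmoid")` inside the usual document/token pipeline.

```scala
import com.johnsnowlabs.nlp.base.DocumentAssembler
import com.johnsnowlabs.nlp.annotator.{Tokenizer, AlbertForSequenceClassification}
import org.apache.spark.ml.Pipeline

// Standard Spark NLP preprocessing: raw text -> document -> tokens.
val documentAssembler = new DocumentAssembler()
  .setInputCol("text")
  .setOutputCol("document")

val tokenizer = new Tokenizer()
  .setInputCols(Array("document"))
  .setOutputCol("token")

// Multi-label classification: sigmoid scores each label independently,
// so several labels can apply to the same document at once.
val classifier = AlbertForSequenceClassification.pretrained()
  .setInputCols(Array("document", "token"))
  .setOutputCol("class")
  .setActivation("sigmoid")

val pipeline = new Pipeline()
  .setStages(Array(documentAssembler, tokenizer, classifier))

val data = spark.createDataFrame(Seq(
  (1, "The camera is great but the battery barely lasts a day.")
)).toDF("id", "text")

// `class.result` holds the predicted labels; per-label scores sit in the annotation metadata.
pipeline.fit(data).transform(data).select("class.result").show(truncate = false)
```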
docs/en/transformer_entries/BertForSequenceClassification.md (7 changes: 4 additions & 3 deletions)
@@ -3,8 +3,9 @@ BertForSequenceClassification
{%- endcapture -%}

{%- capture description -%}
-BertForSequenceClassification can load Bert Models with sequence classification/regression head on top (a linear layer on top of the pooled output)
-e.g. for multi-class document classification tasks.
+BertForSequenceClassification can load Bert Models with sequence classification/regression head on top (a linear layer on top of the pooled output), e.g. for document classification tasks.
+
+For multi-class, use `setActivation("softmax")`. For multi-label, use `setActivation("sigmoid")`.

Pretrained models can be loaded with `pretrained` of the companion object:
@@ -127,4 +128,4 @@ scala_example=scala_example
python_api_link=python_api_link
api_link=api_link
source_link=source_link
-%}
+%}
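
The same two sentences are added to every `*ForSequenceClassification` entry in this PR. As a hedged reference for what they mean in practice, a sketch of the two activation choices with `BertForSequenceClassification` (again assuming the default pretrained model and the standard `document`/`token` input columns):

```scala
import com.johnsnowlabs.nlp.annotator.BertForSequenceClassification

// Multi-class (labels are mutually exclusive): softmax makes the scores
// sum to 1 across classes, so exactly one label wins per document.
val multiClass = BertForSequenceClassification.pretrained()
  .setInputCols(Array("document", "token"))
  .setOutputCol("class")
  .setActivation("softmax")

// Multi-label (labels are independent): sigmoid scores each class on its
// own 0..1 scale, so a document can receive several labels.
val multiLabel = BertForSequenceClassification.pretrained()
  .setInputCols(Array("document", "token"))
  .setOutputCol("class")
  .setActivation("sigmoid")
```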
@@ -4,8 +4,9 @@ CamemBertForSequenceClassification

{%- capture description -%}
CamemBertForSequenceClassification can load CamemBERT Models with sequence
-classification/regression head on top (a linear layer on top of the pooled output) e.g. for
-multi-class document classification tasks.
+classification/regression head on top (a linear layer on top of the pooled output), e.g. for document classification tasks.
+
+For multi-class, use `setActivation("softmax")`. For multi-label, use `setActivation("sigmoid")`.

Pretrained models can be loaded with `pretrained` of the companion object:

@@ -126,4 +127,4 @@ scala_example=scala_example
api_link=api_link
python_api_link=python_api_link
source_link=source_link
-%}
+%}
@@ -4,7 +4,9 @@ DistilBertForSequenceClassification

{%- capture description -%}
DistilBertForSequenceClassification can load DistilBERT Models with sequence classification/regression head on top
-(a linear layer on top of the pooled output) e.g. for multi-class document classification tasks.
+(a linear layer on top of the pooled output), e.g. for document classification tasks.
+
+For multi-class, use `setActivation("softmax")`. For multi-label, use `setActivation("sigmoid")`.

Pretrained models can be loaded with `pretrained` of the companion object:
@@ -126,4 +128,4 @@ scala_example=scala_example
api_link=api_link
python_api_link=python_api_link
source_link=source_link
-%}
+%}
@@ -4,7 +4,9 @@ LongformerForSequenceClassification

{%- capture description -%}
LongformerForSequenceClassification can load Longformer Models with sequence classification/regression head on top
-(a linear layer on top of the pooled output) e.g. for multi-class document classification tasks.
+(a linear layer on top of the pooled output), e.g. for document classification tasks.
+
+For multi-class, use `setActivation("softmax")`. For multi-label, use `setActivation("sigmoid")`.

Pretrained models can be loaded with `pretrained` of the companion object:
@@ -124,4 +126,4 @@ scala_example=scala_example
api_link=api_link
python_api_link=python_api_link
source_link=source_link
-%}
+%}
@@ -4,7 +4,9 @@ RoBertaForSequenceClassification

{%- capture description -%}
RoBertaForSequenceClassification can load RoBERTa Models with sequence classification/regression head on top
-(a linear layer on top of the pooled output) e.g. for multi-class document classification tasks.
+(a linear layer on top of the pooled output), e.g. for document classification tasks.
+
+For multi-class, use `setActivation("softmax")`. For multi-label, use `setActivation("sigmoid")`.

Pretrained models can be loaded with `pretrained` of the companion object:
@@ -120,4 +122,4 @@ scala_example=scala_example
api_link=api_link
python_api_link=python_api_link
source_link=source_link
-%}
+%}
@@ -4,7 +4,9 @@ XlmRoBertaForSequenceClassification

{%- capture description -%}
XlmRoBertaForSequenceClassification can load XLM-RoBERTa Models with sequence classification/regression head on top
-(a linear layer on top of the pooled output) e.g. for multi-class document classification tasks.
+(a linear layer on top of the pooled output), e.g. for document classification tasks.
+
+For multi-class, use `setActivation("softmax")`. For multi-label, use `setActivation("sigmoid")`.

Pretrained models can be loaded with `pretrained` of the companion object:
@@ -124,4 +126,4 @@ scala_example=scala_example
api_link=api_link
python_api_link=python_api_link
source_link=source_link
-%}
+%}
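
All of the entries above also keep the `pretrained` companion-object loading pattern whose example blocks are collapsed in this diff view. A minimal sketch of that pattern follows; `"some_sequence_classifier"` and `"xx"` are placeholders, not real model identifiers, so substitute a name from the NLP Models Hub.

```scala
import com.johnsnowlabs.nlp.annotator.XlmRoBertaForSequenceClassification

// No arguments: download the annotator's default pretrained model.
val default = XlmRoBertaForSequenceClassification.pretrained()

// Or pick a specific model by name and language code; the values below
// are illustrative placeholders only.
val named = XlmRoBertaForSequenceClassification.pretrained("some_sequence_classifier", "xx")
  .setInputCols(Array("document", "token"))
  .setOutputCol("class")
```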