Fix links for APIs in Open Source #13312

1 change: 1 addition & 0 deletions docs/_includes/docs-annotation-pagination.html
@@ -10,6 +10,7 @@
</li>
</ul>
<ul class="pagination owl-carousel pagination_big">
<li><a href="release_notes_4_5_0">4.5.0</a></li>
<li><a href="release_notes_4_4_1">4.4.1</a></li>
<li><a href="release_notes_4_4_0">4.4.0</a></li>
<li><a href="release_notes_4_3_0">4.3.0</a></li>
9 changes: 9 additions & 0 deletions docs/_includes/input_output_image.html
@@ -0,0 +1,9 @@

<div class="input_output_wrapper">
<div class="input_output_inner">
<div class="input_output_title">Input image</div>
<figure class="shadow"><img src="{{include.input_image}}" alt=""></figure></div>
<div class="input_output_inner">
<div class="input_output_title">Output image</div>
<figure class="shadow"><img src="{{include.output_image}}" alt=""></figure></div>
</div>
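The new `input_output_image.html` include takes two Liquid parameters, `input_image` and `output_image`. A docs page would invoke it roughly like this (the image paths below are illustrative placeholders, not actual assets in the repo):

```liquid
{% include input_output_image.html
   input_image="/assets/images/examples/input.png"
   output_image="/assets/images/examples/output.png" %}
```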
@@ -166,22 +166,21 @@ Training data is available [here](https://github.com/Legal-NLP-EkStep/legal_NER#
## Benchmarking

```bash
| label | precision | recall | f1-score | support |
|--------------|-----------|--------|----------|---------|
| CASE_NUMBER | 0.83 | 0.80 | 0.82 | 112 |
| COURT | 0.92 | 0.94 | 0.93 | 140 |
| DATE | 0.97 | 0.97 | 0.97 | 204 |
| GPE | 0.81 | 0.75 | 0.78 | 95 |
| JUDGE | 0.84 | 0.86 | 0.85 | 57 |
| ORG | 0.75 | 0.76 | 0.76 | 131 |
| OTHER_PERSON | 0.83 | 0.90 | 0.86 | 241 |
| PETITIONER | 0.76 | 0.61 | 0.68 | 36 |
| PRECEDENT | 0.84 | 0.84 | 0.84 | 127 |
| PROVISION | 0.90 | 0.94 | 0.92 | 220 |
| RESPONDENT | 0.64 | 0.70 | 0.67 | 23 |
| STATUTE | 0.92 | 0.96 | 0.94 | 157 |
| WITNESS | 0.93 | 0.78 | 0.85 | 87 |
| micro-avg | 0.87 | 0.87 | 0.87 | 1630 |
| macro-avg | 0.84 | 0.83 | 0.83 | 1630 |
| weighted-avg | 0.87 | 0.87 | 0.87 | 1630 |
```
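Tables like the one above are standard per-label classification reports. As a rough illustration only (the benchmark numbers come from the models' own evaluation pipeline, not from this sketch), per-label precision, recall, F1, and support can be computed like this:

```python
from collections import Counter

def per_label_scores(y_true, y_pred):
    """Return {label: (precision, recall, f1, support)} for two label sequences.

    Illustrative sketch, not the evaluation script behind the tables above.
    """
    tp, fp, fn = Counter(), Counter(), Counter()
    for gold, pred in zip(y_true, y_pred):
        if gold == pred:
            tp[gold] += 1   # correct prediction
        else:
            fp[pred] += 1   # predicted label without gold backing
            fn[gold] += 1   # gold label the model missed
    scores = {}
    for label in sorted(set(y_true) | set(y_pred)):
        prec = tp[label] / (tp[label] + fp[label]) if tp[label] + fp[label] else 0.0
        rec = tp[label] / (tp[label] + fn[label]) if tp[label] + fn[label] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        scores[label] = (prec, rec, f1, tp[label] + fn[label])
    return scores
```

The micro-avg row pools the true/false positive counts across all labels before taking the ratios, while the macro-avg row is the plain mean of the per-label scores.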
@@ -219,14 +219,13 @@ Training data is available [here](https://github.com/Legal-NLP-EkStep/legal_NER#
## Benchmarking

```bash
| label | precision | recall | f1-score | support |
|--------------|-----------|--------|----------|---------|
| COURT | 0.92 | 0.91 | 0.91 | 109 |
| JUDGE | 0.96 | 0.92 | 0.94 | 168 |
| LAWYER | 0.94 | 0.93 | 0.94 | 377 |
| PETITIONER | 0.76 | 0.77 | 0.76 | 269 |
| RESPONDENT | 0.78 | 0.80 | 0.79 | 356 |
| micro-avg | 0.86 | 0.86 | 0.86 | 1279 |
| macro-avg | 0.87 | 0.86 | 0.87 | 1279 |
| weighted-avg | 0.86 | 0.86 | 0.86 | 1279 |
```
@@ -97,15 +97,14 @@ Training data is available [here](https://zenodo.org/record/7109926#.Y1gJwexBw8E
## Benchmarking

```bash
| label | precision | recall | f1-score | support |
|---------------|-----------|--------|----------|---------|
| civil-law | 0.93 | 0.96 | 0.94 | 809 |
| insurance-law | 0.92 | 0.94 | 0.93 | 357 |
| other | 0.76 | 0.70 | 0.73 | 23 |
| penal-law | 0.97 | 0.95 | 0.96 | 913 |
| public-law | 0.94 | 0.94 | 0.94 | 1048 |
| social-law | 0.97 | 0.95 | 0.96 | 719 |
| accuracy | - | - | 0.95 | 3869 |
| macro-avg | 0.92 | 0.91 | 0.91 | 3869 |
| weighted-avg | 0.95 | 0.95 | 0.95 | 3869 |
```
@@ -157,12 +157,11 @@ Manual annotations on CUAD dataset
## Benchmarking

```bash
| Relation | Recall | Precision | F1 | Support |
|----------------------|--------|-----------|-------|---------|
| dated_as | 1.000 | 0.957 | 0.978 | 44 |
| has_alias | 0.950 | 0.974 | 0.962 | 40 |
| has_collective_alias | 0.667 | 1.000 | 0.800 | 3 |
| signed_by | 0.957 | 0.989 | 0.972 | 92 |
| Avg. | 0.913 | 0.977 | 0.938 | - |
| Weighted-Avg. | 0.973 | 0.974 | 0.973 | - |
```
13 changes: 6 additions & 7 deletions docs/_posts/bunyamin-polat/2022-11-03-legre_obligations_md_en.md
@@ -122,11 +122,10 @@ Manual annotations on CUAD dataset
## Benchmarking

```bash
| Relation | Recall | Precision | F1 | Support |
|-------------------|--------|-----------|-------|---------|
| is_obliged_object | 0.989 | 0.994 | 0.992 | 177 |
| is_obliged_to | 0.995 | 1.000 | 0.998 | 202 |
| is_obliged_with | 1.000 | 0.961 | 0.980 | 49 |
| Avg. | 0.996 | 0.989 | 0.992 | - |
| Weighted-Avg. | 0.996 | 0.996 | 0.996 | - |
```
@@ -111,11 +111,10 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum
## Benchmarking

```bash
| label | precision | recall | f1-score | support |
|--------------------------|-----------|--------|----------|---------|
| asset-purchase-agreement | 0.96 | 0.96 | 0.96 | 27 |
| other | 0.99 | 0.99 | 0.99 | 85 |
| accuracy | - | - | 0.98 | 112 |
| macro-avg | 0.98 | 0.98 | 0.98 | 112 |
| weighted-avg | 0.98 | 0.98 | 0.98 | 112 |
```
@@ -111,11 +111,10 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum
## Benchmarking

```bash
| label | precision | recall | f1-score | support |
|------------------------|-----------|--------|----------|---------|
| distribution-agreement | 1.00 | 0.89 | 0.94 | 28 |
| other | 0.97 | 1.00 | 0.98 | 85 |
| accuracy | - | - | 0.97 | 113 |
| macro-avg | 0.98 | 0.95 | 0.96 | 113 |
| weighted-avg | 0.97 | 0.97 | 0.97 | 113 |
```
@@ -111,11 +111,10 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum
## Benchmarking

```bash
| label | precision | recall | f1-score | support |
|--------------------------------|-----------|--------|----------|---------|
| executive-employment-agreement | 1.00 | 1.00 | 1.00 | 45 |
| other | 1.00 | 1.00 | 1.00 | 85 |
| accuracy | - | - | 1.00 | 130 |
| macro-avg | 1.00 | 1.00 | 1.00 | 130 |
| weighted-avg | 1.00 | 1.00 | 1.00 | 130 |
```
@@ -111,11 +111,10 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum
## Benchmarking

```bash
| label | precision | recall | f1-score | support |
|---------------------------|-----------|--------|----------|---------|
| indemnification-agreement | 1.00 | 1.00 | 1.00 | 31 |
| other | 1.00 | 1.00 | 1.00 | 85 |
| accuracy | - | - | 1.00 | 116 |
| macro-avg | 1.00 | 1.00 | 1.00 | 116 |
| weighted-avg | 1.00 | 1.00 | 1.00 | 116 |
```
@@ -115,13 +115,11 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum

```bash

| label | precision | recall | f1-score | support |
|-----------------------------|-----------|--------|----------|---------|
| loan-and-security-agreement | 0.94 | 1.00 | 0.97 | 33 |
| other | 1.00 | 0.98 | 0.99 | 85 |
| accuracy | - | - | 0.98 | 118 |
| macro-avg | 0.97 | 0.99 | 0.98 | 118 |
| weighted-avg | 0.98 | 0.98 | 0.98 | 118 |

```
@@ -116,13 +116,11 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum

```bash

| label | precision | recall | f1-score | support |
|-------------------------|-----------|--------|----------|---------|
| other | 0.97 | 0.99 | 0.98 | 85 |
| participation-agreement | 0.97 | 0.90 | 0.93 | 31 |
| accuracy | - | - | 0.97 | 116 |
| macro-avg | 0.97 | 0.95 | 0.96 | 116 |
| weighted-avg | 0.97 | 0.97 | 0.97 | 116 |

```
@@ -116,13 +116,11 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum

```bash

| label | precision | recall | f1-score | support |
|-------------------------------|-----------|--------|----------|---------|
| other | 0.92 | 0.92 | 0.92 | 85 |
| securities-purchase-agreement | 0.77 | 0.77 | 0.77 | 30 |
| accuracy | - | - | 0.88 | 115 |
| macro-avg | 0.84 | 0.84 | 0.84 | 115 |
| weighted-avg | 0.88 | 0.88 | 0.88 | 115 |

```
@@ -116,13 +116,11 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum

```bash

| label | precision | recall | f1-score | support |
|--------------------|-----------|--------|----------|---------|
| other | 0.92 | 0.96 | 0.94 | 85 |
| security-agreement | 0.92 | 0.82 | 0.87 | 40 |
| accuracy | - | - | 0.92 | 125 |
| macro-avg | 0.92 | 0.89 | 0.91 | 125 |
| weighted-avg | 0.92 | 0.92 | 0.92 | 125 |

```
@@ -116,13 +116,12 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum

```bash

| label | precision | recall | f1-score | support |
|--------------------------|-----------|--------|----------|---------|
| other | 0.87 | 0.91 | 0.89 | 43 |
| stock-purchase-agreement | 0.88 | 0.83 | 0.85 | 35 |
| accuracy | - | - | 0.87 | 78 |
| macro-avg | 0.87 | 0.87 | 0.87 | 78 |
| weighted-avg | 0.87 | 0.87 | 0.87 | 78 |
```
@@ -116,13 +116,11 @@ Legal documents, scraped from the Internet and classified in-house + SEC docum

```bash

| label | precision | recall | f1-score | support |
|------------------------|-----------|--------|----------|---------|
| other | 0.99 | 0.98 | 0.98 | 85 |
| subscription-agreement | 0.94 | 0.97 | 0.95 | 30 |
| accuracy | - | - | 0.97 | 115 |
| macro-avg | 0.96 | 0.97 | 0.97 | 115 |
| weighted-avg | 0.97 | 0.97 | 0.97 | 115 |

```
@@ -150,22 +150,22 @@ Dataset is available [here](https://zenodo.org/record/7025333#.Y2zsquxBx83).
## Benchmarking

```bash
| label               | precision | recall | f1-score | support |
|---------------------|-----------|--------|----------|---------|
| DATE | 0.9036 | 0.9223 | 0.9128 | 193 |
| DECISION | 0.9831 | 0.9831 | 0.9831 | 59 |
| DECREE | 0.5000 | 1.0000 | 0.6667 | 1 |
| DIRECTIVE | 1.0000 | 0.6667 | 0.8000 | 3 |
| EMERGENCY_ORDINANCE | 1.0000 | 0.9615 | 0.9804 | 26 |
| LAW | 0.9619 | 0.9806 | 0.9712 | 103 |
| LOC | 0.9110 | 0.8365 | 0.8721 | 159 |
| ORDER | 0.9767 | 1.0000 | 0.9882 | 42 |
| ORDINANCE | 1.0000 | 0.9500 | 0.9744 | 20 |
| ORG | 0.8899 | 0.8879 | 0.8889 | 455 |
| PER | 0.9091 | 0.9821 | 0.9442 | 112 |
| REGULATION | 0.9118 | 0.8378 | 0.8732 | 37 |
| REPORT | 0.7778 | 0.7778 | 0.7778 | 9 |
| TREATY | 1.0000 | 1.0000 | 1.0000 | 3 |
| micro-avg | 0.9139 | 0.9116 | 0.9127 | 1222 |
| macro-avg | 0.9089 | 0.9133 | 0.9024 | 1222 |
| weighted-avg | 0.9143 | 0.9116 | 0.9124 | 1222 |
```
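The summary rows in these tables follow the usual conventions: macro-avg is the unweighted mean of the per-label scores, and weighted-avg weights each label by its support. A quick sanity check against the F1 column of the table above (per-label values copied from the table):

```python
# Per-label F1 and support, copied from the benchmarking table above.
f1 = {"DATE": 0.9128, "DECISION": 0.9831, "DECREE": 0.6667, "DIRECTIVE": 0.8000,
      "EMERGENCY_ORDINANCE": 0.9804, "LAW": 0.9712, "LOC": 0.8721, "ORDER": 0.9882,
      "ORDINANCE": 0.9744, "ORG": 0.8889, "PER": 0.9442, "REGULATION": 0.8732,
      "REPORT": 0.7778, "TREATY": 1.0000}
support = {"DATE": 193, "DECISION": 59, "DECREE": 1, "DIRECTIVE": 3,
           "EMERGENCY_ORDINANCE": 26, "LAW": 103, "LOC": 159, "ORDER": 42,
           "ORDINANCE": 20, "ORG": 455, "PER": 112, "REGULATION": 37,
           "REPORT": 9, "TREATY": 3}

macro_f1 = sum(f1.values()) / len(f1)                                    # unweighted mean
weighted_f1 = sum(f1[k] * support[k] for k in f1) / sum(support.values())  # support-weighted

print(round(macro_f1, 4))     # 0.9024, matching the macro-avg row
print(round(weighted_f1, 4))  # 0.9124, matching the weighted-avg row
```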