Update README.md
README.md (changed)

@@ -25,7 +25,7 @@ metrics:
 # rufimelo/Legal-SBERTimbau-nli-large
 
 This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 1024 dimensional dense vector space and can be used for tasks like clustering or semantic search.
-Legal-SBERTimbau-large is based on Legal-BERTimbau-large
+Legal-SBERTimbau-large is based on Legal-BERTimbau-large which derives from [BERTimbau](https://huggingface.co/neuralmind/bert-base-portuguese-cased) Large.
 It is adapted to the Portuguese legal domain.
 
 ## Usage (Sentence-Transformers)

@@ -104,7 +104,7 @@ print(sentence_embeddings)
 
 ## Training
 
-Legal-SBERTimbau-large is based on Legal-BERTimbau-large
+Legal-SBERTimbau-large is based on Legal-BERTimbau-large which derives from [BERTimbau](https://huggingface.co/neuralmind/bert-base-portuguese-cased) Large.
 It was trained for Natural Language Inference (NLI). This was chosen due to the lack of Portuguese available data.
 In addition to that, it was submitted to a fine tuning stage with the [assin](https://huggingface.co/datasets/assin) and [assin2](https://huggingface.co/datasets/assin2) datasets.
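For context, the "Usage (Sentence-Transformers)" section referenced in the first hunk is unchanged by this commit (the second hunk's context line `print(sentence_embeddings)` comes from it) and loads the model through the sentence-transformers library. A minimal sketch of that usage, assuming sentence-transformers is installed; the example sentences are placeholders, not from the model card:

```python
from sentence_transformers import SentenceTransformer

# Load the model described in this README from the Hugging Face Hub.
model = SentenceTransformer("rufimelo/Legal-SBERTimbau-nli-large")

# Placeholder sentences; the model targets Portuguese legal text.
sentences = ["Isto é um exemplo.", "Cada frase é mapeada para um vetor denso."]

# Each sentence is encoded into a 1024-dimensional dense vector,
# usable for clustering or semantic search as the card states.
sentence_embeddings = model.encode(sentences)
print(sentence_embeddings.shape)  # e.g. (2, 1024)
```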