RoBERTaLexPT-base is pretrained from LegalPT and CrawlPT corpora, using [RoBERTa-base](https://huggingface.co/FacebookAI/roberta-base), introduced by [Liu et al. (2019)](https://arxiv.org/abs/1907.11692).
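For quick experimentation, the checkpoint can be loaded with the 🤗 Transformers fill-mask pipeline. A minimal sketch — the model ID `eduagarcia/RoBERTaLexPT-base` is assumed from this repository, and the Portuguese example sentence is purely illustrative:

```python
from transformers import pipeline

# Model ID assumed from this repository's name.
# RoBERTa-style models use the <mask> token for masked-LM inference.
fill_mask = pipeline("fill-mask", model="eduagarcia/RoBERTaLexPT-base")

preds = fill_mask("O juiz julgou procedente o <mask> do autor.")
for p in preds:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict with the filled token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).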

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

- **Funded by:** [More Information Needed]
- **Language(s) (NLP):** Brazilian Portuguese (pt-BR)
- **License:** [Creative Commons Attribution 4.0 International Public License](https://creativecommons.org/licenses/by/4.0/deed.en)

### Model Sources

- **Repository:** https://github.com/eduagarcia/roberta-legal-portuguese
- **Paper:** [More Information Needed]

## Training Details

### Training Data

RoBERTaLexPT-base is pretrained on two corpora:
- [LegalPT](https://huggingface.co/datasets/eduagarcia/LegalPT) is a Portuguese legal corpus aggregating diverse sources, totaling up to 125 GiB of data.
- CrawlPT is a deduplication of three Portuguese general corpora: [brWaC](https://huggingface.co/datasets/eduagarcia/brwac_dedup), [CC100-PT](https://huggingface.co/datasets/eduagarcia/cc100-pt), and [OSCAR-2301](https://huggingface.co/datasets/eduagarcia/OSCAR-2301-pt_dedup).
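The deduplication applied to these corpora can be sketched in its simplest, exact-match form. The `deduplicate` helper below is a hypothetical illustration, not the project's actual pipeline (which may use fuzzy or MinHash-based matching across much larger data):

```python
import hashlib

def deduplicate(docs):
    """Keep the first occurrence of each distinct document (exact-match dedup)."""
    seen = set()
    unique = []
    for doc in docs:
        # Hash whitespace-normalized text so trivial spacing differences collapse.
        key = hashlib.sha256(" ".join(doc.split()).encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

corpus = ["O contrato foi assinado.", "O contrato  foi assinado.", "Nova sentença."]
print(deduplicate(corpus))  # the second document is dropped as a duplicate
```

Real web-scale pipelines typically extend this idea with near-duplicate detection (e.g. MinHash/LSH) so that documents differing by a few tokens are also collapsed.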

The model was evaluated on the ["PortuLex" benchmark](eduagarcia/portuguese_benchmark).

[More Information Needed]

## Acknowledgment