---
license: other
license_name: akibcoding
license_link: LICENSE
---

This model, referred to as LastBERT, is a lightweight yet capable BERT-based model for natural language processing (NLP) tasks. It was created through knowledge distillation from a larger BERT teacher, reducing the parameter count from 110 million in BERT-base-uncased to just 29 million, a reduction of approximately 73.64%. Despite its smaller size, LastBERT performs robustly across tasks such as paraphrase identification, sentiment analysis, and grammatical acceptability, as demonstrated on the General Language Understanding Evaluation (GLUE) benchmark. The model has also been applied to classifying ADHD severity from social media text, achieving 85% accuracy, F1 score, precision, and recall. LastBERT offers an efficient option for NLP tasks, particularly in resource-constrained environments, without a substantial loss in performance.
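To illustrate the knowledge-distillation idea behind LastBERT, here is a minimal sketch of a standard distillation objective in PyTorch: a temperature-scaled KL-divergence term that matches the student's output distribution to the teacher's, blended with ordinary cross-entropy on the ground-truth labels. The temperature `T` and mixing weight `alpha` are illustrative defaults, not the values used to train this model.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Standard distillation objective (illustrative, not LastBERT's exact recipe).

    Combines a soft-target loss (student mimics the teacher's softened
    distribution) with a hard-target loss (student fits the true labels).
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

During training, the teacher (e.g. BERT-base-uncased) runs in inference mode to produce `teacher_logits`, while only the smaller student's weights are updated against this combined loss.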
|