---
license: other
license_name: akibcoding
license_link: LICENSE
---
This model, LastBERT, is a lightweight yet effective BERT-based model for natural language processing (NLP) tasks. It was created through knowledge distillation from the larger BERT-base-uncased model, reducing the parameter count from 110 million to roughly 29 million, a reduction of approximately 73.64%. Despite its smaller size, LastBERT maintains strong performance across tasks such as paraphrase identification, sentiment analysis, and grammatical acceptability, as demonstrated on the General Language Understanding Evaluation (GLUE) benchmark.
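As a rough illustration of the distillation setup described above, the sketch below pairs a bert-base-uncased teacher with a much smaller BERT student and trains the student on a blend of a softened KL-divergence term (teacher guidance) and the usual cross-entropy on hard labels. The student configuration, temperature, and mixing weight are assumptions chosen only for illustration; they are not the exact LastBERT architecture or hyperparameters.

```python
# Minimal knowledge-distillation sketch (PyTorch + Hugging Face Transformers).
# The teacher is bert-base-uncased (~110M params); the student config below is
# an illustrative small BERT, not the exact LastBERT architecture.
import torch
import torch.nn.functional as F
from transformers import BertConfig, BertForSequenceClassification

teacher = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
teacher.eval()

# Assumed small-student configuration for illustration purposes only.
student_config = BertConfig(num_hidden_layers=6, hidden_size=384,
                            num_attention_heads=6, intermediate_size=1536,
                            num_labels=2)
student = BertForSequenceClassification(student_config)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft-target KL loss against the teacher with hard-label cross-entropy."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Inside a training step (inputs: dict with input_ids / attention_mask, labels: tensor):
# with torch.no_grad():
#     teacher_logits = teacher(**inputs).logits
# student_logits = student(**inputs).logits
# loss = distillation_loss(student_logits, teacher_logits, labels)
# loss.backward()
```

In this setup only the student's parameters are updated; the teacher runs in inference mode and supplies the soft targets that let the much smaller model approximate its behavior.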