---
license: apache-2.0
datasets:
- AyoubChLin/20NewsGroup-AgNews-CnnNews
- AyoubChLin/CNN_News_Articles_2011-2022
- ag_news
language:
- en
metrics:
- accuracy
pipeline_tag: text-classification
widget:
- text: money in the pocket
- text: no one can win this cup in Qatar
- text: >-
    a new transformer architecture can build a large language model with low
    resources
---
# Model Card

This model is an English news text-classification model, fine-tuned from ESG-BERT on a combined 20 Newsgroups, AG News, and CNN News dataset.
## Model Details

### Model Description

- **Developed by:** CHERGUELAINE Ayoub & BOUBEKRI Faycal
- **Shared by [optional]:** Hugging Face
- **Model type:** Language model (text classification)
- **Language(s) (NLP):** en
- **License:** apache-2.0
- **Finetuned from model [optional]:** ESG-BERT
### Model Sources [optional]

- **Repository:** https://huggingface.co/nbroad/ESG-BERT
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
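Until an official snippet is provided, here is a minimal sketch using the Transformers `pipeline` API; the Hub repository ID below is a placeholder and must be replaced with this model's actual ID:

```python
from transformers import pipeline

# Placeholder repository ID -- replace with this model's actual Hub ID.
MODEL_ID = "AyoubChLin/news-classifier"

# Build a text-classification pipeline from the fine-tuned checkpoint.
classifier = pipeline("text-classification", model=MODEL_ID)

# Classify an example news snippet.
print(classifier("no one can win this cup in Qatar"))
```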
## Training Details

### Training Data

[AyoubChLin/20NewsGroup-AgNews-CnnNews](https://huggingface.co/datasets/AyoubChLin/20NewsGroup-AgNews-CnnNews)
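A quick way to inspect the training data with the `datasets` library (assuming the dataset remains publicly available on the Hub):

```python
from datasets import load_dataset

# Download the combined news dataset used for fine-tuning.
dataset = load_dataset("AyoubChLin/20NewsGroup-AgNews-CnnNews")

# Show the available splits and their sizes.
print(dataset)
```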
### Fine-tuning hyper-parameters

The model was fine-tuned with the settings below (see the sketch after the list):
- learning_rate = 4e-5
- batch_size = 8
- max_seq_length = 256
- num_train_epochs = 2.0
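A hedged sketch of how these values could be expressed with the Transformers `TrainingArguments`; the original training script is not published, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Hyper-parameter values taken from the list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="esg-bert-news-classification",
    learning_rate=4e-5,
    per_device_train_batch_size=8,
    num_train_epochs=2.0,
)

# max_seq_length = 256 is applied at tokenization time, e.g.:
# tokenizer(texts, truncation=True, max_length=256)
```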
### Testing Data

[More Information Needed]

### Metrics

- Accuracy
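For reference, accuracy on a held-out split could be computed with the `evaluate` library (a generic sketch, not the authors' evaluation script):

```python
import evaluate

# Load the accuracy metric reported for this model.
accuracy = evaluate.load("accuracy")

# Toy example: compare predicted label IDs against reference labels.
result = accuracy.compute(predictions=[0, 1, 1, 2], references=[0, 1, 0, 2])
print(result)  # {'accuracy': 0.75}
```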