eacortes committed · commit 3a36438 (verified) · 1 parent: dd8ad0e

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -7,6 +7,6 @@ library_name: transformers
 
 This model is a ChemBERTa model trained on the augmented_canonical_pubchem_13m dataset.
 
-The model was trained for 10 epochs using NVIDIA Apex's FusedAdam optimizer with a reduce-on-plateau learning rate scheduler.
+The model was trained for 24 epochs using NVIDIA Apex's FusedAdam optimizer with a reduce-on-plateau learning rate scheduler.
 To improve performance, mixed precision (fp16), TF32, and torch.compile were enabled. Training used gradient accumulation (16 steps) and batch size of 128 for efficient resource utilization.
 Evaluation was performed at regular intervals, with the best model selected based on validation performance.
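The settings listed in the README (24 epochs, Apex FusedAdam, reduce-on-plateau scheduling, fp16/TF32, torch.compile, gradient accumulation of 16 with batch size 128) can be sketched as a training configuration. This is only an illustration assembled from the diff text; the field names follow Hugging Face `TrainingArguments` conventions as an assumption, and this is not the authors' actual training script.

```python
# Hypothetical training configuration reconstructed from the README text.
# Keys mirror Hugging Face TrainingArguments fields (assumed, not confirmed
# by this commit); values come from the settings the README describes.
training_config = {
    "num_train_epochs": 24,                       # updated from 10 in this commit
    "optim": "adamw_apex_fused",                  # NVIDIA Apex FusedAdam
    "lr_scheduler_type": "reduce_lr_on_plateau",  # reduce-on-plateau LR schedule
    "fp16": True,                                 # mixed precision
    "tf32": True,                                 # TF32 matmuls on Ampere+ GPUs
    "torch_compile": True,                        # torch.compile enabled
    "per_device_train_batch_size": 128,
    "gradient_accumulation_steps": 16,
    "load_best_model_at_end": True,               # best model by validation metric
}

# Gradient accumulation means the optimizer steps once per 16 batches,
# giving an effective batch size of 128 * 16 = 2048 examples per update.
effective_batch = (
    training_config["per_device_train_batch_size"]
    * training_config["gradient_accumulation_steps"]
)
print(effective_batch)  # → 2048
```

The effective batch size (2048) is why gradient accumulation is framed as "efficient resource utilization": it reaches a large-batch update without needing the full batch in GPU memory at once.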