---
language: en
tags:
- fill-mask
kwargs:
timestamp: '2024-05-11T12:25:48'
project_name: ThunBERT_bs32_lr4_emissions_tracker
run_id: b3318897-e0c1-4e52-b7eb-8a1d61e96626
duration: 164241.20640802383
emissions: 0.1719087733557666
emissions_rate: 1.046684794366977e-06
cpu_power: 42.5
gpu_power: 0.0
ram_power: 37.5
cpu_energy: 1.9389559455265577
gpu_energy: 0
ram_energy: 1.71083436371883
energy_consumed: 3.64979030924538
country_name: Switzerland
country_iso_code: CHE
region: .nan
cloud_provider: .nan
cloud_region: .nan
os: Linux-5.14.0-70.30.1.el9_0.x86_64-x86_64-with-glibc2.34
python_version: 3.10.4
codecarbon_version: 2.3.4
cpu_count: 4
cpu_model: Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz
gpu_count: .nan
gpu_model: .nan
longitude: .nan
latitude: .nan
ram_total_size: 100
tracking_mode: machine
on_cloud: N
pue: 1.0
---
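The card is tagged `fill-mask`, so the checkpoint can be queried through the standard `transformers` fill-mask pipeline. Below is a minimal usage sketch; the repository id `damgomz/ThunBERT_bs32_lr4` is an assumption inferred from the model name, and the example sentence is purely illustrative.

```python
# Minimal fill-mask sketch. The repo id below is an assumption, not confirmed by this card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="damgomz/ThunBERT_bs32_lr4")

# ALBERT-style tokenizers use "[MASK]" as the mask token.
for prediction in fill_mask("The capital of Switzerland is [MASK]."):
    print(f"{prediction['token_str']!r}  score={prediction['score']:.4f}")
```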
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 164241.20640802383 |
| Emissions (CO2eq in kg)  | 0.1719087733557666 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 37.5 |
| CPU energy (kWh) | 1.9389559455265577 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 1.71083436371883 |
| Consumed energy (kWh) | 3.64979030924538 |
| Country name | Switzerland |
| Cloud provider | N/A (not on cloud) |
| Cloud region | N/A (not on cloud) |
| CPU count | 4 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.3161643223354459 |
| Emissions (CO2eq in kg)  | 0.06432780584314267 |
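The two tables above are derived from a CodeCarbon run (v2.3.4, `tracking_mode: machine`, no GPU). A minimal sketch of how such measurements are typically collected is shown below; `train_model()` is a placeholder for the actual training loop, which is not part of this card.

```python
# Sketch only: train_model() stands in for the real training loop.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(
    project_name="ThunBERT_bs32_lr4_emissions_tracker",
    tracking_mode="machine",
)
tracker.start()
try:
    train_model()  # placeholder
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2eq
    print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```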
## Note
15 May 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ThunBERT_bs32_lr4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 0.0005 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 20557 |
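The configuration above corresponds to a fairly standard masked-language-modeling fine-tune of `albert-base-v2`. The sketch below shows one plausible way to wire these values into `transformers`; it is an illustration under that assumption, not the script actually used. The training corpus, the masking probability, and the packing step are placeholders, since the card does not describe them.

```python
# Illustrative sketch mirroring the config table; the real data pipeline is not shown in this card.
from transformers import (
    AlbertForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

checkpoint = "albert-base-v2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# drop_out_prob = 0.1 applied to the hidden layers (assumed interpretation).
model = AlbertForMaskedLM.from_pretrained(checkpoint, hidden_dropout_prob=0.1)

# Placeholder corpus; the real training data is not described in this card.
texts = ["example sentence one.", "example sentence two."]
encodings = tokenizer(texts, truncation=True, max_length=400)  # sequence_length
train_dataset = [{"input_ids": ids} for ids in encodings["input_ids"]]
eval_dataset = train_dataset  # placeholder for the 20% held-out split (train_test_split)

args = TrainingArguments(
    output_dir="ThunBERT_bs32_lr4",
    num_train_epochs=6,              # num_epoch
    learning_rate=5e-4,              # learning_rate
    per_device_train_batch_size=32,  # batch_size
    weight_decay=0.0,
    warmup_ratio=0.0,                # warm_up_prop
)

# mlm_probability=0.15 is the transformers default and an assumption here.
data_collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    data_collator=data_collator,
)
trainer.train()
```

The card's `packing_length` and `num_steps` come from the authors' own data-packing pipeline and are not reproduced in this sketch.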
## Training and Testing steps
| Epoch | Train Loss | Test Loss |
|---|---|---|
| 0.0 | 6.438970 | 8.578580 |
| 0.5 | 7.817422 | 7.780288 |
| 1.0 | 7.740401 | 7.776956 |
| 1.5 | 7.708930 | 7.715732 |
| 2.0 | 7.687033 | 7.698126 |
| 2.5 | 7.672986 | 7.689110 |
| 3.0 | 7.659054 | 7.694452 |
| 3.5 | 7.656073 | 7.672446 |
| 4.0 | 7.642637 | 7.669732 |
| 4.5 | 7.643010 | 7.676241 |
| 5.0 | 7.632827 | 7.657712 |
| 5.5 | 7.629512 | 7.660105 |
| 6.0 | 7.626969 | 7.654580 |