llama-2-7b-hf-10g10s-rs1 / emissions.csv
timestamp,project_name,run_id,duration,emissions,emissions_rate,cpu_power,gpu_power,ram_power,cpu_energy,gpu_energy,ram_energy,energy_consumed,country_name,country_iso_code,region,cloud_provider,cloud_region,os,python_version,codecarbon_version,cpu_count,cpu_model,gpu_count,gpu_model,longitude,latitude,ram_total_size,tracking_mode,on_cloud,pue
2023-12-21T18:31:28,codecarbon,2432cd83-7fe7-43f4-8413-690a8cd5c26e,25352.08171439171,2.0619352210297115,8.133198860191451e-05,187.83183497007255,291.12454977318725,377.6170320510864,1.3337226526053243,2.0582504382656985,2.658412834437845,6.050385925308872,Italy,ITA,lombardy,,,Linux-4.18.0-372.9.1.el8.x86_64-x86_64-with-glibc2.28,3.10.0,2.3.2,4,Intel(R) Xeon(R) Gold 6326 CPU @ 2.90GHz,1,1 x NVIDIA A100 80GB PCIe,9.1922,45.4722,1006.9787521362305,machine,N,1.0