2023-10-17 14:21:44,567 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,568 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 14:21:44,568 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,568 MultiCorpus: 7936 train + 992 dev + 992 test sentences
 - NER_ICDAR_EUROPEANA Corpus: 7936 train + 992 dev + 992 test sentences - /root/.flair/datasets/ner_icdar_europeana/fr
2023-10-17 14:21:44,569 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,569 Train: 7936 sentences
2023-10-17 14:21:44,569 (train_with_dev=False, train_with_test=False)
2023-10-17 14:21:44,569 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,569 Training Params:
2023-10-17 14:21:44,569  - learning_rate: "3e-05"
2023-10-17 14:21:44,569  - mini_batch_size: "4"
2023-10-17 14:21:44,569  - max_epochs: "10"
2023-10-17 14:21:44,569  - shuffle: "True"
2023-10-17 14:21:44,569 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,569 Plugins:
2023-10-17 14:21:44,569  - TensorboardLogger
2023-10-17 14:21:44,569  - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 14:21:44,569 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,569 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 14:21:44,569  - metric: "('micro avg', 'f1-score')"
2023-10-17 14:21:44,569 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,569 Computation:
2023-10-17 14:21:44,569  - compute on device: cuda:0
2023-10-17 14:21:44,569  - embedding storage: none
2023-10-17 14:21:44,569 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,569 Model training base path: "hmbench-icdar/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-17 14:21:44,569 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,569 ----------------------------------------------------------------------------------------------------
2023-10-17 14:21:44,569 Logging anything other than scalars to TensorBoard is currently not supported.
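
The training script itself is not part of this log; only the configuration above is. As a point of reference, the following is a minimal sketch of how a comparable run could be set up with Flair's fine-tuning API. It is an assumed reconstruction from the logged configuration, not the original script: the dataset loader, the constructor arguments, and the hmteams/teams-base-historic-multilingual-discriminator model id (read off the base path above) are assumptions.

# Assumed reconstruction of the logged configuration (not the original training script).
from flair.datasets import NER_ICDAR_EUROPEANA
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# French ICDAR-Europeana NER corpus, cached under ~/.flair/datasets/ner_icdar_europeana/fr
corpus = NER_ICDAR_EUROPEANA(language="fr")
label_dictionary = corpus.make_label_dictionary(label_type="ner")

# "poolingfirst", "layers-1" and the model id are taken from the base path above
embeddings = TransformerWordEmbeddings(
    model="hmteams/teams-base-historic-multilingual-discriminator",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# 13 output tags (BIOES over PER/LOC/ORG plus O), no CRF ("crfFalse" in the base path);
# hidden_size is unused when use_rnn=False
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dictionary,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# fine_tune() in recent Flair versions attaches a linear warm-up scheduler by default,
# matching the LinearScheduler plugin (warmup_fraction 0.1) listed above
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-icdar/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5",
    learning_rate=3e-05,
    mini_batch_size=4,
    max_epochs=10,
)
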
"hmbench-icdar/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5" 2023-10-17 14:21:44,569 ---------------------------------------------------------------------------------------------------- 2023-10-17 14:21:44,569 ---------------------------------------------------------------------------------------------------- 2023-10-17 14:21:44,569 Logging anything other than scalars to TensorBoard is currently not supported. 2023-10-17 14:21:53,339 epoch 1 - iter 198/1984 - loss 2.35272971 - time (sec): 8.77 - samples/sec: 1839.29 - lr: 0.000003 - momentum: 0.000000 2023-10-17 14:22:02,373 epoch 1 - iter 396/1984 - loss 1.35226237 - time (sec): 17.80 - samples/sec: 1885.07 - lr: 0.000006 - momentum: 0.000000 2023-10-17 14:22:11,529 epoch 1 - iter 594/1984 - loss 0.99696504 - time (sec): 26.96 - samples/sec: 1827.49 - lr: 0.000009 - momentum: 0.000000 2023-10-17 14:22:20,576 epoch 1 - iter 792/1984 - loss 0.80282302 - time (sec): 36.01 - samples/sec: 1811.71 - lr: 0.000012 - momentum: 0.000000 2023-10-17 14:22:29,709 epoch 1 - iter 990/1984 - loss 0.68889213 - time (sec): 45.14 - samples/sec: 1785.58 - lr: 0.000015 - momentum: 0.000000 2023-10-17 14:22:38,778 epoch 1 - iter 1188/1984 - loss 0.59950974 - time (sec): 54.21 - samples/sec: 1788.54 - lr: 0.000018 - momentum: 0.000000 2023-10-17 14:22:48,278 epoch 1 - iter 1386/1984 - loss 0.53152483 - time (sec): 63.71 - samples/sec: 1787.61 - lr: 0.000021 - momentum: 0.000000 2023-10-17 14:22:57,520 epoch 1 - iter 1584/1984 - loss 0.48053194 - time (sec): 72.95 - samples/sec: 1789.09 - lr: 0.000024 - momentum: 0.000000 2023-10-17 14:23:06,591 epoch 1 - iter 1782/1984 - loss 0.44152631 - time (sec): 82.02 - samples/sec: 1792.58 - lr: 0.000027 - momentum: 0.000000 2023-10-17 14:23:16,377 epoch 1 - iter 1980/1984 - loss 0.40853255 - time (sec): 91.81 - samples/sec: 1782.55 - lr: 0.000030 - momentum: 0.000000 2023-10-17 14:23:16,556 ---------------------------------------------------------------------------------------------------- 2023-10-17 14:23:16,557 EPOCH 1 done: loss 0.4079 - lr: 0.000030 2023-10-17 14:23:19,804 DEV : loss 0.09482365101575851 - f1-score (micro avg) 0.7296 2023-10-17 14:23:19,827 saving best model 2023-10-17 14:23:20,267 ---------------------------------------------------------------------------------------------------- 2023-10-17 14:23:29,741 epoch 2 - iter 198/1984 - loss 0.10656840 - time (sec): 9.47 - samples/sec: 1799.02 - lr: 0.000030 - momentum: 0.000000 2023-10-17 14:23:38,991 epoch 2 - iter 396/1984 - loss 0.10818698 - time (sec): 18.72 - samples/sec: 1785.10 - lr: 0.000029 - momentum: 0.000000 2023-10-17 14:23:48,043 epoch 2 - iter 594/1984 - loss 0.11021072 - time (sec): 27.77 - samples/sec: 1791.14 - lr: 0.000029 - momentum: 0.000000 2023-10-17 14:23:57,001 epoch 2 - iter 792/1984 - loss 0.11163153 - time (sec): 36.73 - samples/sec: 1776.91 - lr: 0.000029 - momentum: 0.000000 2023-10-17 14:24:06,134 epoch 2 - iter 990/1984 - loss 0.11183860 - time (sec): 45.87 - samples/sec: 1789.70 - lr: 0.000028 - momentum: 0.000000 2023-10-17 14:24:15,206 epoch 2 - iter 1188/1984 - loss 0.11055508 - time (sec): 54.94 - samples/sec: 1783.11 - lr: 0.000028 - momentum: 0.000000 2023-10-17 14:24:24,164 epoch 2 - iter 1386/1984 - loss 0.11239117 - time (sec): 63.90 - samples/sec: 1783.76 - lr: 0.000028 - momentum: 0.000000 2023-10-17 14:24:33,227 epoch 2 - iter 1584/1984 - loss 0.11240077 - time (sec): 72.96 - samples/sec: 1784.23 - lr: 0.000027 - momentum: 0.000000 2023-10-17 
2023-10-17 14:24:42,472 epoch 2 - iter 1782/1984 - loss 0.11194093 - time (sec): 82.20 - samples/sec: 1791.13 - lr: 0.000027 - momentum: 0.000000
2023-10-17 14:24:51,688 epoch 2 - iter 1980/1984 - loss 0.11315701 - time (sec): 91.42 - samples/sec: 1790.60 - lr: 0.000027 - momentum: 0.000000
2023-10-17 14:24:51,873 ----------------------------------------------------------------------------------------------------
2023-10-17 14:24:51,873 EPOCH 2 done: loss 0.1131 - lr: 0.000027
2023-10-17 14:24:55,803 DEV : loss 0.09371839463710785 - f1-score (micro avg) 0.7416
2023-10-17 14:24:55,825 saving best model
2023-10-17 14:24:56,405 ----------------------------------------------------------------------------------------------------
2023-10-17 14:25:05,416 epoch 3 - iter 198/1984 - loss 0.08688426 - time (sec): 9.00 - samples/sec: 1688.59 - lr: 0.000026 - momentum: 0.000000
2023-10-17 14:25:14,414 epoch 3 - iter 396/1984 - loss 0.08535351 - time (sec): 18.00 - samples/sec: 1749.88 - lr: 0.000026 - momentum: 0.000000
2023-10-17 14:25:23,215 epoch 3 - iter 594/1984 - loss 0.08392751 - time (sec): 26.80 - samples/sec: 1804.77 - lr: 0.000026 - momentum: 0.000000
2023-10-17 14:25:31,824 epoch 3 - iter 792/1984 - loss 0.08103732 - time (sec): 35.41 - samples/sec: 1840.55 - lr: 0.000025 - momentum: 0.000000
2023-10-17 14:25:40,506 epoch 3 - iter 990/1984 - loss 0.08337144 - time (sec): 44.09 - samples/sec: 1849.78 - lr: 0.000025 - momentum: 0.000000
2023-10-17 14:25:49,403 epoch 3 - iter 1188/1984 - loss 0.08370851 - time (sec): 52.99 - samples/sec: 1861.28 - lr: 0.000025 - momentum: 0.000000
2023-10-17 14:25:58,429 epoch 3 - iter 1386/1984 - loss 0.08381845 - time (sec): 62.01 - samples/sec: 1844.58 - lr: 0.000024 - momentum: 0.000000
2023-10-17 14:26:07,378 epoch 3 - iter 1584/1984 - loss 0.08471920 - time (sec): 70.96 - samples/sec: 1839.09 - lr: 0.000024 - momentum: 0.000000
2023-10-17 14:26:16,525 epoch 3 - iter 1782/1984 - loss 0.08345150 - time (sec): 80.11 - samples/sec: 1841.82 - lr: 0.000024 - momentum: 0.000000
2023-10-17 14:26:25,615 epoch 3 - iter 1980/1984 - loss 0.08467558 - time (sec): 89.20 - samples/sec: 1834.64 - lr: 0.000023 - momentum: 0.000000
2023-10-17 14:26:25,813 ----------------------------------------------------------------------------------------------------
2023-10-17 14:26:25,813 EPOCH 3 done: loss 0.0846 - lr: 0.000023
2023-10-17 14:26:29,205 DEV : loss 0.12102336436510086 - f1-score (micro avg) 0.7527
2023-10-17 14:26:29,226 saving best model
2023-10-17 14:26:29,740 ----------------------------------------------------------------------------------------------------
2023-10-17 14:26:38,903 epoch 4 - iter 198/1984 - loss 0.06092971 - time (sec): 9.16 - samples/sec: 1722.01 - lr: 0.000023 - momentum: 0.000000
2023-10-17 14:26:48,164 epoch 4 - iter 396/1984 - loss 0.06579458 - time (sec): 18.42 - samples/sec: 1768.71 - lr: 0.000023 - momentum: 0.000000
2023-10-17 14:26:57,322 epoch 4 - iter 594/1984 - loss 0.06716833 - time (sec): 27.58 - samples/sec: 1757.12 - lr: 0.000022 - momentum: 0.000000
2023-10-17 14:27:06,588 epoch 4 - iter 792/1984 - loss 0.06375569 - time (sec): 36.84 - samples/sec: 1767.60 - lr: 0.000022 - momentum: 0.000000
2023-10-17 14:27:15,756 epoch 4 - iter 990/1984 - loss 0.06406550 - time (sec): 46.01 - samples/sec: 1774.90 - lr: 0.000022 - momentum: 0.000000
2023-10-17 14:27:24,732 epoch 4 - iter 1188/1984 - loss 0.06699263 - time (sec): 54.99 - samples/sec: 1778.23 - lr: 0.000021 - momentum: 0.000000
2023-10-17 14:27:33,952 epoch 4 - iter 1386/1984 - loss 0.06689916 - time (sec): 64.21 - samples/sec: 1779.83 - lr: 0.000021 - momentum: 0.000000
2023-10-17 14:27:42,808 epoch 4 - iter 1584/1984 - loss 0.06814155 - time (sec): 73.06 - samples/sec: 1785.97 - lr: 0.000021 - momentum: 0.000000
2023-10-17 14:27:51,811 epoch 4 - iter 1782/1984 - loss 0.06896506 - time (sec): 82.07 - samples/sec: 1795.00 - lr: 0.000020 - momentum: 0.000000
2023-10-17 14:28:00,740 epoch 4 - iter 1980/1984 - loss 0.06896094 - time (sec): 91.00 - samples/sec: 1798.71 - lr: 0.000020 - momentum: 0.000000
2023-10-17 14:28:00,926 ----------------------------------------------------------------------------------------------------
2023-10-17 14:28:00,926 EPOCH 4 done: loss 0.0691 - lr: 0.000020
2023-10-17 14:28:04,626 DEV : loss 0.1571902483701706 - f1-score (micro avg) 0.7542
2023-10-17 14:28:04,649 saving best model
2023-10-17 14:28:05,140 ----------------------------------------------------------------------------------------------------
2023-10-17 14:28:14,051 epoch 5 - iter 198/1984 - loss 0.04531813 - time (sec): 8.91 - samples/sec: 1796.32 - lr: 0.000020 - momentum: 0.000000
2023-10-17 14:28:23,211 epoch 5 - iter 396/1984 - loss 0.04647017 - time (sec): 18.07 - samples/sec: 1815.98 - lr: 0.000019 - momentum: 0.000000
2023-10-17 14:28:32,522 epoch 5 - iter 594/1984 - loss 0.04475855 - time (sec): 27.38 - samples/sec: 1830.73 - lr: 0.000019 - momentum: 0.000000
2023-10-17 14:28:41,587 epoch 5 - iter 792/1984 - loss 0.04669245 - time (sec): 36.45 - samples/sec: 1818.97 - lr: 0.000019 - momentum: 0.000000
2023-10-17 14:28:50,763 epoch 5 - iter 990/1984 - loss 0.04912804 - time (sec): 45.62 - samples/sec: 1834.17 - lr: 0.000018 - momentum: 0.000000
2023-10-17 14:28:59,777 epoch 5 - iter 1188/1984 - loss 0.04867525 - time (sec): 54.64 - samples/sec: 1821.02 - lr: 0.000018 - momentum: 0.000000
2023-10-17 14:29:09,023 epoch 5 - iter 1386/1984 - loss 0.04927445 - time (sec): 63.88 - samples/sec: 1821.15 - lr: 0.000018 - momentum: 0.000000
2023-10-17 14:29:18,033 epoch 5 - iter 1584/1984 - loss 0.04954311 - time (sec): 72.89 - samples/sec: 1809.93 - lr: 0.000017 - momentum: 0.000000
2023-10-17 14:29:27,001 epoch 5 - iter 1782/1984 - loss 0.04932246 - time (sec): 81.86 - samples/sec: 1803.75 - lr: 0.000017 - momentum: 0.000000
2023-10-17 14:29:36,222 epoch 5 - iter 1980/1984 - loss 0.04937008 - time (sec): 91.08 - samples/sec: 1796.77 - lr: 0.000017 - momentum: 0.000000
2023-10-17 14:29:36,399 ----------------------------------------------------------------------------------------------------
2023-10-17 14:29:36,399 EPOCH 5 done: loss 0.0493 - lr: 0.000017
2023-10-17 14:29:39,976 DEV : loss 0.16390633583068848 - f1-score (micro avg) 0.7745
2023-10-17 14:29:40,003 saving best model
2023-10-17 14:29:40,607 ----------------------------------------------------------------------------------------------------
2023-10-17 14:29:49,628 epoch 6 - iter 198/1984 - loss 0.03464468 - time (sec): 9.02 - samples/sec: 1836.48 - lr: 0.000016 - momentum: 0.000000
2023-10-17 14:29:58,781 epoch 6 - iter 396/1984 - loss 0.03116501 - time (sec): 18.17 - samples/sec: 1841.59 - lr: 0.000016 - momentum: 0.000000
2023-10-17 14:30:07,928 epoch 6 - iter 594/1984 - loss 0.03152775 - time (sec): 27.32 - samples/sec: 1836.49 - lr: 0.000016 - momentum: 0.000000
2023-10-17 14:30:17,082 epoch 6 - iter 792/1984 - loss 0.03071046 - time (sec): 36.47 - samples/sec: 1832.41 - lr: 0.000015 - momentum: 0.000000
2023-10-17 14:30:26,291 epoch 6 - iter 990/1984 - loss 0.03185003 - time (sec): 45.68 - samples/sec: 1820.68 - lr: 0.000015 - momentum: 0.000000
2023-10-17 14:30:35,672 epoch 6 - iter 1188/1984 - loss 0.03355453 - time (sec): 55.06 - samples/sec: 1823.91 - lr: 0.000015 - momentum: 0.000000
2023-10-17 14:30:44,728 epoch 6 - iter 1386/1984 - loss 0.03464690 - time (sec): 64.12 - samples/sec: 1807.92 - lr: 0.000014 - momentum: 0.000000
2023-10-17 14:30:53,876 epoch 6 - iter 1584/1984 - loss 0.03696921 - time (sec): 73.27 - samples/sec: 1795.05 - lr: 0.000014 - momentum: 0.000000
2023-10-17 14:31:02,724 epoch 6 - iter 1782/1984 - loss 0.03702742 - time (sec): 82.11 - samples/sec: 1799.57 - lr: 0.000014 - momentum: 0.000000
2023-10-17 14:31:11,783 epoch 6 - iter 1980/1984 - loss 0.03663836 - time (sec): 91.17 - samples/sec: 1795.02 - lr: 0.000013 - momentum: 0.000000
2023-10-17 14:31:11,968 ----------------------------------------------------------------------------------------------------
2023-10-17 14:31:11,968 EPOCH 6 done: loss 0.0367 - lr: 0.000013
2023-10-17 14:31:15,510 DEV : loss 0.19935466349124908 - f1-score (micro avg) 0.7647
2023-10-17 14:31:15,540 ----------------------------------------------------------------------------------------------------
2023-10-17 14:31:26,185 epoch 7 - iter 198/1984 - loss 0.02511532 - time (sec): 10.64 - samples/sec: 1492.77 - lr: 0.000013 - momentum: 0.000000
2023-10-17 14:31:36,808 epoch 7 - iter 396/1984 - loss 0.02353593 - time (sec): 21.27 - samples/sec: 1529.14 - lr: 0.000013 - momentum: 0.000000
2023-10-17 14:31:46,481 epoch 7 - iter 594/1984 - loss 0.02725110 - time (sec): 30.94 - samples/sec: 1579.91 - lr: 0.000012 - momentum: 0.000000
2023-10-17 14:31:55,734 epoch 7 - iter 792/1984 - loss 0.02742823 - time (sec): 40.19 - samples/sec: 1651.96 - lr: 0.000012 - momentum: 0.000000
2023-10-17 14:32:05,005 epoch 7 - iter 990/1984 - loss 0.02499010 - time (sec): 49.46 - samples/sec: 1704.82 - lr: 0.000012 - momentum: 0.000000
2023-10-17 14:32:14,065 epoch 7 - iter 1188/1984 - loss 0.02517202 - time (sec): 58.52 - samples/sec: 1716.61 - lr: 0.000011 - momentum: 0.000000
2023-10-17 14:32:22,809 epoch 7 - iter 1386/1984 - loss 0.02492481 - time (sec): 67.27 - samples/sec: 1727.30 - lr: 0.000011 - momentum: 0.000000
2023-10-17 14:32:31,645 epoch 7 - iter 1584/1984 - loss 0.02477025 - time (sec): 76.10 - samples/sec: 1737.03 - lr: 0.000011 - momentum: 0.000000
2023-10-17 14:32:40,907 epoch 7 - iter 1782/1984 - loss 0.02572704 - time (sec): 85.37 - samples/sec: 1736.07 - lr: 0.000010 - momentum: 0.000000
2023-10-17 14:32:49,995 epoch 7 - iter 1980/1984 - loss 0.02491306 - time (sec): 94.45 - samples/sec: 1732.11 - lr: 0.000010 - momentum: 0.000000
2023-10-17 14:32:50,174 ----------------------------------------------------------------------------------------------------
2023-10-17 14:32:50,174 EPOCH 7 done: loss 0.0250 - lr: 0.000010
2023-10-17 14:32:53,683 DEV : loss 0.20214584469795227 - f1-score (micro avg) 0.7623
2023-10-17 14:32:53,716 ----------------------------------------------------------------------------------------------------
2023-10-17 14:33:02,786 epoch 8 - iter 198/1984 - loss 0.01165556 - time (sec): 9.07 - samples/sec: 1793.90 - lr: 0.000010 - momentum: 0.000000
2023-10-17 14:33:11,842 epoch 8 - iter 396/1984 - loss 0.01479537 - time (sec): 18.13 - samples/sec: 1782.05 - lr: 0.000009 - momentum: 0.000000
2023-10-17 14:33:20,675 epoch 8 - iter 594/1984 - loss 0.01696006 - time (sec): 26.96 - samples/sec: 1809.49 - lr: 0.000009 - momentum: 0.000000
2023-10-17 14:33:29,808 epoch 8 - iter 792/1984 - loss 0.01648117 - time (sec): 36.09 - samples/sec: 1809.14 - lr: 0.000009 - momentum: 0.000000
2023-10-17 14:33:38,982 epoch 8 - iter 990/1984 - loss 0.01744306 - time (sec): 45.27 - samples/sec: 1798.74 - lr: 0.000008 - momentum: 0.000000
2023-10-17 14:33:48,084 epoch 8 - iter 1188/1984 - loss 0.01772065 - time (sec): 54.37 - samples/sec: 1786.89 - lr: 0.000008 - momentum: 0.000000
2023-10-17 14:33:57,336 epoch 8 - iter 1386/1984 - loss 0.01868319 - time (sec): 63.62 - samples/sec: 1786.50 - lr: 0.000008 - momentum: 0.000000
2023-10-17 14:34:06,576 epoch 8 - iter 1584/1984 - loss 0.01859987 - time (sec): 72.86 - samples/sec: 1790.02 - lr: 0.000007 - momentum: 0.000000
2023-10-17 14:34:17,150 epoch 8 - iter 1782/1984 - loss 0.01846760 - time (sec): 83.43 - samples/sec: 1753.15 - lr: 0.000007 - momentum: 0.000000
2023-10-17 14:34:27,610 epoch 8 - iter 1980/1984 - loss 0.01868024 - time (sec): 93.89 - samples/sec: 1742.66 - lr: 0.000007 - momentum: 0.000000
2023-10-17 14:34:27,831 ----------------------------------------------------------------------------------------------------
2023-10-17 14:34:27,831 EPOCH 8 done: loss 0.0187 - lr: 0.000007
2023-10-17 14:34:31,286 DEV : loss 0.22779549658298492 - f1-score (micro avg) 0.7628
2023-10-17 14:34:31,310 ----------------------------------------------------------------------------------------------------
2023-10-17 14:34:41,901 epoch 9 - iter 198/1984 - loss 0.01906451 - time (sec): 10.59 - samples/sec: 1589.50 - lr: 0.000006 - momentum: 0.000000
2023-10-17 14:34:51,675 epoch 9 - iter 396/1984 - loss 0.01555467 - time (sec): 20.36 - samples/sec: 1658.32 - lr: 0.000006 - momentum: 0.000000
2023-10-17 14:35:01,204 epoch 9 - iter 594/1984 - loss 0.01521979 - time (sec): 29.89 - samples/sec: 1678.39 - lr: 0.000006 - momentum: 0.000000
2023-10-17 14:35:10,391 epoch 9 - iter 792/1984 - loss 0.01491455 - time (sec): 39.08 - samples/sec: 1697.20 - lr: 0.000005 - momentum: 0.000000
2023-10-17 14:35:19,561 epoch 9 - iter 990/1984 - loss 0.01387345 - time (sec): 48.25 - samples/sec: 1725.91 - lr: 0.000005 - momentum: 0.000000
2023-10-17 14:35:28,646 epoch 9 - iter 1188/1984 - loss 0.01431168 - time (sec): 57.33 - samples/sec: 1728.81 - lr: 0.000005 - momentum: 0.000000
2023-10-17 14:35:37,776 epoch 9 - iter 1386/1984 - loss 0.01452248 - time (sec): 66.46 - samples/sec: 1739.53 - lr: 0.000004 - momentum: 0.000000
2023-10-17 14:35:46,951 epoch 9 - iter 1584/1984 - loss 0.01440257 - time (sec): 75.64 - samples/sec: 1742.47 - lr: 0.000004 - momentum: 0.000000
2023-10-17 14:35:56,334 epoch 9 - iter 1782/1984 - loss 0.01404208 - time (sec): 85.02 - samples/sec: 1738.26 - lr: 0.000004 - momentum: 0.000000
2023-10-17 14:36:05,792 epoch 9 - iter 1980/1984 - loss 0.01392469 - time (sec): 94.48 - samples/sec: 1732.59 - lr: 0.000003 - momentum: 0.000000
2023-10-17 14:36:05,980 ----------------------------------------------------------------------------------------------------
2023-10-17 14:36:05,980 EPOCH 9 done: loss 0.0140 - lr: 0.000003
2023-10-17 14:36:09,397 DEV : loss 0.23514559864997864 - f1-score (micro avg) 0.7616
2023-10-17 14:36:09,418 ----------------------------------------------------------------------------------------------------
2023-10-17 14:36:18,647 epoch 10 - iter 198/1984 - loss 0.00949880 - time (sec): 9.23 - samples/sec: 1832.95 - lr: 0.000003 - momentum: 0.000000
2023-10-17 14:36:27,628 epoch 10 - iter 396/1984 - loss 0.00775092 - time (sec): 18.21 - samples/sec: 1832.79 - lr: 0.000003 - momentum: 0.000000
2023-10-17 14:36:36,789 epoch 10 - iter 594/1984 - loss 0.00839411 - time (sec): 27.37 - samples/sec: 1833.08 - lr: 0.000002 - momentum: 0.000000
2023-10-17 14:36:45,901 epoch 10 - iter 792/1984 - loss 0.00870864 - time (sec): 36.48 - samples/sec: 1796.51 - lr: 0.000002 - momentum: 0.000000
2023-10-17 14:36:55,161 epoch 10 - iter 990/1984 - loss 0.00887633 - time (sec): 45.74 - samples/sec: 1804.60 - lr: 0.000002 - momentum: 0.000000
2023-10-17 14:37:04,293 epoch 10 - iter 1188/1984 - loss 0.00855463 - time (sec): 54.87 - samples/sec: 1791.29 - lr: 0.000001 - momentum: 0.000000
2023-10-17 14:37:13,457 epoch 10 - iter 1386/1984 - loss 0.00875616 - time (sec): 64.04 - samples/sec: 1789.47 - lr: 0.000001 - momentum: 0.000000
2023-10-17 14:37:22,534 epoch 10 - iter 1584/1984 - loss 0.00856302 - time (sec): 73.11 - samples/sec: 1784.41 - lr: 0.000001 - momentum: 0.000000
2023-10-17 14:37:31,887 epoch 10 - iter 1782/1984 - loss 0.00882088 - time (sec): 82.47 - samples/sec: 1770.65 - lr: 0.000000 - momentum: 0.000000
2023-10-17 14:37:41,392 epoch 10 - iter 1980/1984 - loss 0.00856634 - time (sec): 91.97 - samples/sec: 1780.44 - lr: 0.000000 - momentum: 0.000000
2023-10-17 14:37:41,566 ----------------------------------------------------------------------------------------------------
2023-10-17 14:37:41,566 EPOCH 10 done: loss 0.0086 - lr: 0.000000
2023-10-17 14:37:44,996 DEV : loss 0.23482687771320343 - f1-score (micro avg) 0.7613
2023-10-17 14:37:45,439 ----------------------------------------------------------------------------------------------------
2023-10-17 14:37:45,440 Loading model from best epoch ...
2023-10-17 14:37:48,035 SequenceTagger predicts: Dictionary with 13 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG
2023-10-17 14:37:50,922 Results:
- F-score (micro) 0.8006
- F-score (macro) 0.7098
- Accuracy 0.6849

By class:
              precision    recall  f1-score   support

         LOC     0.8376    0.8901    0.8631       655
         PER     0.7311    0.7803    0.7549       223
         ORG     0.6087    0.4409    0.5114       127

   micro avg     0.7924    0.8090    0.8006      1005
   macro avg     0.7258    0.7038    0.7098      1005
weighted avg     0.7851    0.8090    0.7946      1005

2023-10-17 14:37:50,922 ----------------------------------------------------------------------------------------------------
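
For downstream use, the checkpoint evaluated above (best-model.pt, written after epoch 5 with a dev micro-F1 of 0.7745) can be loaded back into Flair for tagging. The following is a short illustrative sketch, assuming the checkpoint sits under the training base path from the log and using an arbitrary example sentence:

from flair.data import Sentence
from flair.models import SequenceTagger

# best-model.pt is stored inside the training base path logged above
tagger = SequenceTagger.load(
    "hmbench-icdar/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5/best-model.pt"
)

# French example sentence to match the corpus language (illustrative)
sentence = Sentence("Gustave Eiffel est né à Dijon .")
tagger.predict(sentence)

# predicted spans carry the PER / LOC / ORG labels from the tag dictionary above
for entity in sentence.get_spans("ner"):
    print(entity)
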