2023-09-03 18:21:28,781 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,782 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-09-03 18:21:28,782 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,783 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
 - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
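The layer shapes in the model summary above can be sanity-checked with a rough parameter count in plain Python. This is an annotation, not part of the run: it tallies only the weights listed for the `BertModel` (vocabulary 32001, hidden size 768, intermediate size 3072, 12 layers, plus biases and LayerNorm parameters), excluding the tagger's final 768→21 linear head.

```python
# Rough parameter count for the BertModel in the summary (sizes taken from the log).
V, P, T, H, I, L = 32001, 512, 2, 768, 3072, 12

embeddings = V * H + P * H + T * H + 2 * H            # word/pos/type embeddings + LayerNorm
per_layer = (
    3 * (H * H + H)        # query/key/value projections
    + (H * H + H) + 2 * H  # attention output dense + LayerNorm
    + (H * I + I)          # intermediate dense
    + (I * H + H) + 2 * H  # output dense + LayerNorm
)
pooler = H * H + H
total = embeddings + L * per_layer + pooler
print(f"{total:,}")  # 110,618,112 — the usual ~110M of a BERT-base checkpoint
```

This matches the expected size of a BERT-base model such as the hmBERT checkpoint named in the training base path.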
2023-09-03 18:21:28,783 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,783 Train:  3575 sentences
2023-09-03 18:21:28,783         (train_with_dev=False, train_with_test=False)
2023-09-03 18:21:28,783 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,783 Training Params:
2023-09-03 18:21:28,783  - learning_rate: "3e-05"
2023-09-03 18:21:28,783  - mini_batch_size: "8"
2023-09-03 18:21:28,783  - max_epochs: "10"
2023-09-03 18:21:28,783  - shuffle: "True"
2023-09-03 18:21:28,783 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,783 Plugins:
2023-09-03 18:21:28,783  - LinearScheduler | warmup_fraction: '0.1'
2023-09-03 18:21:28,783 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,783 Final evaluation on model from best epoch (best-model.pt)
2023-09-03 18:21:28,783  - metric: "('micro avg', 'f1-score')"
2023-09-03 18:21:28,783 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,783 Computation:
2023-09-03 18:21:28,783  - compute on device: cuda:0
2023-09-03 18:21:28,783  - embedding storage: none
2023-09-03 18:21:28,783 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,784 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
2023-09-03 18:21:28,784 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:28,784 ----------------------------------------------------------------------------------------------------
2023-09-03 18:21:37,415 epoch 1 - iter 44/447 - loss 3.11470317 - time (sec): 8.63 - samples/sec: 1101.62 - lr: 0.000003 - momentum: 0.000000
2023-09-03 18:21:44,132 epoch 1 - iter 88/447 - loss 2.46591154 - time (sec): 15.35 - samples/sec: 1132.48 - lr: 0.000006 - momentum: 0.000000
2023-09-03 18:21:50,463 epoch 1 - iter 132/447 - loss 1.86942740 - time (sec): 21.68 - samples/sec: 1154.71 - lr: 0.000009 - momentum: 0.000000
2023-09-03 18:21:57,523 epoch 1 - iter 176/447 - loss 1.51187376 - time (sec): 28.74 - samples/sec: 1163.96 - lr: 0.000012 - momentum: 0.000000
2023-09-03 18:22:04,676 epoch 1 - iter 220/447 - loss 1.29508937 - time (sec): 35.89 - samples/sec: 1158.01 - lr: 0.000015 - momentum: 0.000000
2023-09-03 18:22:12,403 epoch 1 - iter 264/447 - loss 1.12696614 - time (sec): 43.62 - samples/sec: 1155.38 - lr: 0.000018 - momentum: 0.000000
2023-09-03 18:22:19,810 epoch 1 - iter 308/447 - loss 1.00970315 - time (sec): 51.03 - samples/sec: 1156.00 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:22:27,968 epoch 1 - iter 352/447 - loss 0.91037509 - time (sec): 59.18 - samples/sec: 1149.49 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:22:35,226 epoch 1 - iter 396/447 - loss 0.83951962 - time (sec): 66.44 - samples/sec: 1148.53 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:22:43,504 epoch 1 - iter 440/447 - loss 0.78127074 - time (sec): 74.72 - samples/sec: 1143.13 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:22:44,641 ----------------------------------------------------------------------------------------------------
2023-09-03 18:22:44,641 EPOCH 1 done: loss 0.7743 - lr: 0.000029
2023-09-03 18:22:55,295 DEV : loss 0.18594643473625183 - f1-score (micro avg)  0.6005
2023-09-03 18:22:55,320 saving best model
2023-09-03 18:22:55,824 ----------------------------------------------------------------------------------------------------
2023-09-03 18:23:02,965 epoch 2 - iter 44/447 - loss 0.18936146 - time (sec): 7.14 - samples/sec: 1197.29 - lr: 0.000030 - momentum: 0.000000
2023-09-03 18:23:10,629 epoch 2 - iter 88/447 - loss 0.20698027 - time (sec): 14.80 - samples/sec: 1151.07 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:23:18,076 epoch 2 - iter 132/447 - loss 0.20176059 - time (sec): 22.25 - samples/sec: 1154.89 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:23:26,066 epoch 2 - iter 176/447 - loss 0.19297787 - time (sec): 30.24 - samples/sec: 1127.24 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:23:33,176 epoch 2 - iter 220/447 - loss 0.18884083 - time (sec): 37.35 - samples/sec: 1123.49 - lr: 0.000028 - momentum: 0.000000
2023-09-03 18:23:41,142 epoch 2 - iter 264/447 - loss 0.17774497 - time (sec): 45.32 - samples/sec: 1115.72 - lr: 0.000028 - momentum: 0.000000
2023-09-03 18:23:49,190 epoch 2 - iter 308/447 - loss 0.17609029 - time (sec): 53.36 - samples/sec: 1121.18 - lr: 0.000028 - momentum: 0.000000
2023-09-03 18:23:56,535 epoch 2 - iter 352/447 - loss 0.17573283 - time (sec): 60.71 - samples/sec: 1117.53 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:24:03,661 epoch 2 - iter 396/447 - loss 0.17493628 - time (sec): 67.84 - samples/sec: 1118.95 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:24:12,041 epoch 2 - iter 440/447 - loss 0.17070290 - time (sec): 76.22 - samples/sec: 1119.71 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:24:13,175 ----------------------------------------------------------------------------------------------------
2023-09-03 18:24:13,175 EPOCH 2 done: loss 0.1697 - lr: 0.000027
2023-09-03 18:24:26,624 DEV : loss 0.1287333220243454 - f1-score (micro avg)  0.7164
2023-09-03 18:24:26,650 saving best model
2023-09-03 18:24:28,027 ----------------------------------------------------------------------------------------------------
2023-09-03 18:24:34,995 epoch 3 - iter 44/447 - loss 0.09495783 - time (sec): 6.97 - samples/sec: 1107.10 - lr: 0.000026 - momentum: 0.000000
2023-09-03 18:24:42,160 epoch 3 - iter 88/447 - loss 0.09050438 - time (sec): 14.13 - samples/sec: 1129.37 - lr: 0.000026 - momentum: 0.000000
2023-09-03 18:24:49,492 epoch 3 - iter 132/447 - loss 0.09851004 - time (sec): 21.46 - samples/sec: 1118.88 - lr: 0.000026 - momentum: 0.000000
2023-09-03 18:24:57,792 epoch 3 - iter 176/447 - loss 0.09053540 - time (sec): 29.76 - samples/sec: 1104.37 - lr: 0.000025 - momentum: 0.000000
2023-09-03 18:25:06,412 epoch 3 - iter 220/447 - loss 0.09334204 - time (sec): 38.38 - samples/sec: 1092.60 - lr: 0.000025 - momentum: 0.000000
2023-09-03 18:25:13,477 epoch 3 - iter 264/447 - loss 0.08944533 - time (sec): 45.45 - samples/sec: 1111.97 - lr: 0.000025 - momentum: 0.000000
2023-09-03 18:25:20,942 epoch 3 - iter 308/447 - loss 0.08936581 - time (sec): 52.91 - samples/sec: 1117.84 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:25:28,640 epoch 3 - iter 352/447 - loss 0.08872997 - time (sec): 60.61 - samples/sec: 1118.60 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:25:35,687 epoch 3 - iter 396/447 - loss 0.09035623 - time (sec): 67.66 - samples/sec: 1125.10 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:25:44,366 epoch 3 - iter 440/447 - loss 0.08953185 - time (sec): 76.34 - samples/sec: 1119.46 - lr: 0.000023 - momentum: 0.000000
2023-09-03 18:25:45,340 ----------------------------------------------------------------------------------------------------
2023-09-03 18:25:45,340 EPOCH 3 done: loss 0.0894 - lr: 0.000023
2023-09-03 18:25:58,086 DEV : loss 0.1225898340344429 - f1-score (micro avg)  0.7351
2023-09-03 18:25:58,112 saving best model
2023-09-03 18:25:59,469 ----------------------------------------------------------------------------------------------------
2023-09-03 18:26:06,724 epoch 4 - iter 44/447 - loss 0.06395735 - time (sec): 7.25 - samples/sec: 1236.05 - lr: 0.000023 - momentum: 0.000000
2023-09-03 18:26:13,500 epoch 4 - iter 88/447 - loss 0.05982834 - time (sec): 14.03 - samples/sec: 1212.04 - lr: 0.000023 - momentum: 0.000000
2023-09-03 18:26:21,427 epoch 4 - iter 132/447 - loss 0.05652974 - time (sec): 21.96 - samples/sec: 1186.41 - lr: 0.000022 - momentum: 0.000000
2023-09-03 18:26:29,661 epoch 4 - iter 176/447 - loss 0.05525616 - time (sec): 30.19 - samples/sec: 1181.96 - lr: 0.000022 - momentum: 0.000000
2023-09-03 18:26:37,306 epoch 4 - iter 220/447 - loss 0.05254340 - time (sec): 37.84 - samples/sec: 1172.27 - lr: 0.000022 - momentum: 0.000000
2023-09-03 18:26:44,944 epoch 4 - iter 264/447 - loss 0.05385835 - time (sec): 45.47 - samples/sec: 1165.24 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:26:51,760 epoch 4 - iter 308/447 - loss 0.05369138 - time (sec): 52.29 - samples/sec: 1172.55 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:26:58,807 epoch 4 - iter 352/447 - loss 0.05338970 - time (sec): 59.34 - samples/sec: 1174.43 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:27:05,146 epoch 4 - iter 396/447 - loss 0.05132492 - time (sec): 65.68 - samples/sec: 1170.43 - lr: 0.000020 - momentum: 0.000000
2023-09-03 18:27:12,845 epoch 4 - iter 440/447 - loss 0.05151521 - time (sec): 73.37 - samples/sec: 1163.44 - lr: 0.000020 - momentum: 0.000000
2023-09-03 18:27:13,894 ----------------------------------------------------------------------------------------------------
2023-09-03 18:27:13,894 EPOCH 4 done: loss 0.0514 - lr: 0.000020
2023-09-03 18:27:26,856 DEV : loss 0.1438855081796646 - f1-score (micro avg)  0.7474
2023-09-03 18:27:26,893 saving best model
2023-09-03 18:27:28,273 ----------------------------------------------------------------------------------------------------
2023-09-03 18:27:37,200 epoch 5 - iter 44/447 - loss 0.04007260 - time (sec): 8.93 - samples/sec: 1080.28 - lr: 0.000020 - momentum: 0.000000
2023-09-03 18:27:44,006 epoch 5 - iter 88/447 - loss 0.03678430 - time (sec): 15.73 - samples/sec: 1120.83 - lr: 0.000019 - momentum: 0.000000
2023-09-03 18:27:51,611 epoch 5 - iter 132/447 - loss 0.03344633 - time (sec): 23.34 - samples/sec: 1124.95 - lr: 0.000019 - momentum: 0.000000
2023-09-03 18:27:58,422 epoch 5 - iter 176/447 - loss 0.03311853 - time (sec): 30.15 - samples/sec: 1140.02 - lr: 0.000019 - momentum: 0.000000
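Annotation: the `lr` column in the entries above ramps up during epoch 1 and then decays linearly toward zero, which is the `LinearScheduler | warmup_fraction: '0.1'` plugin from the header. A minimal sketch of that schedule in plain Python, using the step counts visible in the log (447 batches/epoch, 10 epochs); this mirrors the logged values, it is not Flair's own implementation:

```python
# Linear warmup over the first 10% of steps up to lr=3e-5, then linear decay to 0.
PEAK_LR = 3e-5
STEPS_PER_EPOCH, EPOCHS = 447, 10
TOTAL = STEPS_PER_EPOCH * EPOCHS
WARMUP = int(0.1 * TOTAL)  # warmup_fraction: 0.1 -> 447 steps, i.e. all of epoch 1

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps."""
    if step < WARMUP:
        return PEAK_LR * step / WARMUP
    return PEAK_LR * (TOTAL - step) / (TOTAL - WARMUP)

# lr_at(44) ~ 2.95e-06 and lr_at(440) ~ 2.95e-05, matching the logged
# "lr: 0.000003" and "lr: 0.000029" at epoch 1, iters 44 and 440.
print(lr_at(44), lr_at(440))
```

The peak is reached exactly at the epoch 1/epoch 2 boundary, which is why epoch 2 starts at `lr: 0.000030` and every later epoch shows a slow decline.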
2023-09-03 18:28:06,463 epoch 5 - iter 220/447 - loss 0.03289270 - time (sec): 38.19 - samples/sec: 1136.03 - lr: 0.000018 - momentum: 0.000000
2023-09-03 18:28:13,614 epoch 5 - iter 264/447 - loss 0.03272367 - time (sec): 45.34 - samples/sec: 1147.21 - lr: 0.000018 - momentum: 0.000000
2023-09-03 18:28:20,716 epoch 5 - iter 308/447 - loss 0.03211904 - time (sec): 52.44 - samples/sec: 1145.62 - lr: 0.000018 - momentum: 0.000000
2023-09-03 18:28:28,282 epoch 5 - iter 352/447 - loss 0.03099804 - time (sec): 60.01 - samples/sec: 1145.99 - lr: 0.000017 - momentum: 0.000000
2023-09-03 18:28:35,561 epoch 5 - iter 396/447 - loss 0.03134117 - time (sec): 67.29 - samples/sec: 1139.40 - lr: 0.000017 - momentum: 0.000000
2023-09-03 18:28:42,759 epoch 5 - iter 440/447 - loss 0.03292082 - time (sec): 74.48 - samples/sec: 1144.88 - lr: 0.000017 - momentum: 0.000000
2023-09-03 18:28:43,886 ----------------------------------------------------------------------------------------------------
2023-09-03 18:28:43,887 EPOCH 5 done: loss 0.0329 - lr: 0.000017
2023-09-03 18:28:56,796 DEV : loss 0.16667184233665466 - f1-score (micro avg)  0.761
2023-09-03 18:28:56,822 saving best model
2023-09-03 18:28:58,193 ----------------------------------------------------------------------------------------------------
2023-09-03 18:29:06,052 epoch 6 - iter 44/447 - loss 0.02307461 - time (sec): 7.86 - samples/sec: 1093.56 - lr: 0.000016 - momentum: 0.000000
2023-09-03 18:29:12,868 epoch 6 - iter 88/447 - loss 0.01945012 - time (sec): 14.67 - samples/sec: 1103.26 - lr: 0.000016 - momentum: 0.000000
2023-09-03 18:29:20,967 epoch 6 - iter 132/447 - loss 0.01718198 - time (sec): 22.77 - samples/sec: 1099.68 - lr: 0.000016 - momentum: 0.000000
2023-09-03 18:29:28,847 epoch 6 - iter 176/447 - loss 0.01824429 - time (sec): 30.65 - samples/sec: 1108.00 - lr: 0.000015 - momentum: 0.000000
2023-09-03 18:29:35,589 epoch 6 - iter 220/447 - loss 0.01838533 - time (sec): 37.39 - samples/sec: 1111.07 - lr: 0.000015 - momentum: 0.000000
2023-09-03 18:29:43,043 epoch 6 - iter 264/447 - loss 0.01877602 - time (sec): 44.85 - samples/sec: 1105.93 - lr: 0.000015 - momentum: 0.000000
2023-09-03 18:29:50,277 epoch 6 - iter 308/447 - loss 0.02057658 - time (sec): 52.08 - samples/sec: 1101.92 - lr: 0.000014 - momentum: 0.000000
2023-09-03 18:29:57,389 epoch 6 - iter 352/447 - loss 0.02153178 - time (sec): 59.19 - samples/sec: 1114.71 - lr: 0.000014 - momentum: 0.000000
2023-09-03 18:30:06,767 epoch 6 - iter 396/447 - loss 0.02220110 - time (sec): 68.57 - samples/sec: 1110.88 - lr: 0.000014 - momentum: 0.000000
2023-09-03 18:30:15,051 epoch 6 - iter 440/447 - loss 0.02172317 - time (sec): 76.86 - samples/sec: 1108.89 - lr: 0.000013 - momentum: 0.000000
2023-09-03 18:30:16,130 ----------------------------------------------------------------------------------------------------
2023-09-03 18:30:16,130 EPOCH 6 done: loss 0.0217 - lr: 0.000013
2023-09-03 18:30:29,605 DEV : loss 0.1836623251438141 - f1-score (micro avg)  0.7789
2023-09-03 18:30:29,632 saving best model
2023-09-03 18:30:31,501 ----------------------------------------------------------------------------------------------------
2023-09-03 18:30:38,866 epoch 7 - iter 44/447 - loss 0.01419595 - time (sec): 7.36 - samples/sec: 1187.09 - lr: 0.000013 - momentum: 0.000000
2023-09-03 18:30:46,088 epoch 7 - iter 88/447 - loss 0.01603520 - time (sec): 14.59 - samples/sec: 1157.28 - lr: 0.000013 - momentum: 0.000000
2023-09-03 18:30:55,966 epoch 7 - iter 132/447 - loss 0.01574985 - time (sec): 24.46 - samples/sec: 1102.88 - lr: 0.000012 - momentum: 0.000000
2023-09-03 18:31:03,653 epoch 7 - iter 176/447 - loss 0.01383093 - time (sec): 32.15 - samples/sec: 1098.70 - lr: 0.000012 - momentum: 0.000000
2023-09-03 18:31:11,583 epoch 7 - iter 220/447 - loss 0.01544671 - time (sec): 40.08 - samples/sec: 1099.46 - lr: 0.000012 - momentum: 0.000000
2023-09-03 18:31:18,562 epoch 7 - iter 264/447 - loss 0.01556334 - time (sec): 47.06 - samples/sec: 1105.80 - lr: 0.000011 - momentum: 0.000000
2023-09-03 18:31:25,934 epoch 7 - iter 308/447 - loss 0.01431665 - time (sec): 54.43 - samples/sec: 1105.32 - lr: 0.000011 - momentum: 0.000000
2023-09-03 18:31:33,925 epoch 7 - iter 352/447 - loss 0.01409783 - time (sec): 62.42 - samples/sec: 1097.26 - lr: 0.000011 - momentum: 0.000000
2023-09-03 18:31:41,095 epoch 7 - iter 396/447 - loss 0.01484694 - time (sec): 69.59 - samples/sec: 1094.28 - lr: 0.000010 - momentum: 0.000000
2023-09-03 18:31:48,109 epoch 7 - iter 440/447 - loss 0.01446960 - time (sec): 76.61 - samples/sec: 1100.22 - lr: 0.000010 - momentum: 0.000000
2023-09-03 18:31:50,291 ----------------------------------------------------------------------------------------------------
2023-09-03 18:31:50,291 EPOCH 7 done: loss 0.0142 - lr: 0.000010
2023-09-03 18:32:03,818 DEV : loss 0.1973269134759903 - f1-score (micro avg)  0.7837
2023-09-03 18:32:03,845 saving best model
2023-09-03 18:32:05,222 ----------------------------------------------------------------------------------------------------
2023-09-03 18:32:13,220 epoch 8 - iter 44/447 - loss 0.01029987 - time (sec): 8.00 - samples/sec: 1071.49 - lr: 0.000010 - momentum: 0.000000
2023-09-03 18:32:21,316 epoch 8 - iter 88/447 - loss 0.00934919 - time (sec): 16.09 - samples/sec: 1093.03 - lr: 0.000009 - momentum: 0.000000
2023-09-03 18:32:29,012 epoch 8 - iter 132/447 - loss 0.00805343 - time (sec): 23.79 - samples/sec: 1120.87 - lr: 0.000009 - momentum: 0.000000
2023-09-03 18:32:37,827 epoch 8 - iter 176/447 - loss 0.00742876 - time (sec): 32.60 - samples/sec: 1107.86 - lr: 0.000009 - momentum: 0.000000
2023-09-03 18:32:45,039 epoch 8 - iter 220/447 - loss 0.00944281 - time (sec): 39.82 - samples/sec: 1102.57 - lr: 0.000008 - momentum: 0.000000
2023-09-03 18:32:52,908 epoch 8 - iter 264/447 - loss 0.01021476 - time (sec): 47.68 - samples/sec: 1090.24 - lr: 0.000008 - momentum: 0.000000
2023-09-03 18:33:00,532 epoch 8 - iter 308/447 - loss 0.00971741 - time (sec): 55.31 - samples/sec: 1101.87 - lr: 0.000008 - momentum: 0.000000
2023-09-03 18:33:07,723 epoch 8 - iter 352/447 - loss 0.00925926 - time (sec): 62.50 - samples/sec: 1108.00 - lr: 0.000007 - momentum: 0.000000
2023-09-03 18:33:14,970 epoch 8 - iter 396/447 - loss 0.01013440 - time (sec): 69.75 - samples/sec: 1109.62 - lr: 0.000007 - momentum: 0.000000
2023-09-03 18:33:22,431 epoch 8 - iter 440/447 - loss 0.00973599 - time (sec): 77.21 - samples/sec: 1104.79 - lr: 0.000007 - momentum: 0.000000
2023-09-03 18:33:23,503 ----------------------------------------------------------------------------------------------------
2023-09-03 18:33:23,503 EPOCH 8 done: loss 0.0096 - lr: 0.000007
2023-09-03 18:33:36,556 DEV : loss 0.20980410277843475 - f1-score (micro avg)  0.7877
2023-09-03 18:33:36,582 saving best model
2023-09-03 18:33:38,290 ----------------------------------------------------------------------------------------------------
2023-09-03 18:33:45,638 epoch 9 - iter 44/447 - loss 0.00992836 - time (sec): 7.35 - samples/sec: 1108.84 - lr: 0.000006 - momentum: 0.000000
2023-09-03 18:33:53,704 epoch 9 - iter 88/447 - loss 0.00727258 - time (sec): 15.41 - samples/sec: 1127.51 - lr: 0.000006 - momentum: 0.000000
2023-09-03 18:34:01,989 epoch 9 - iter 132/447 - loss 0.00686773 - time (sec): 23.70 - samples/sec: 1085.59 - lr: 0.000006 - momentum: 0.000000
2023-09-03 18:34:10,168 epoch 9 - iter 176/447 - loss 0.00593792 - time (sec): 31.88 - samples/sec: 1093.36 - lr: 0.000005 - momentum: 0.000000
2023-09-03 18:34:18,970 epoch 9 - iter 220/447 - loss 0.00648321 - time (sec): 40.68 - samples/sec: 1073.40 - lr: 0.000005 - momentum: 0.000000
2023-09-03 18:34:26,084 epoch 9 - iter 264/447 - loss 0.00789765 - time (sec): 47.79 - samples/sec: 1085.54 - lr: 0.000005 - momentum: 0.000000
2023-09-03 18:34:34,436 epoch 9 - iter 308/447 - loss 0.00712320 - time (sec): 56.14 - samples/sec: 1094.59 - lr: 0.000004 - momentum: 0.000000
2023-09-03 18:34:41,491 epoch 9 - iter 352/447 - loss 0.00662608 - time (sec): 63.20 - samples/sec: 1099.33 - lr: 0.000004 - momentum: 0.000000
2023-09-03 18:34:48,326 epoch 9 - iter 396/447 - loss 0.00674970 - time (sec): 70.03 - samples/sec: 1105.13 - lr: 0.000004 - momentum: 0.000000
2023-09-03 18:34:55,784 epoch 9 - iter 440/447 - loss 0.00751222 - time (sec): 77.49 - samples/sec: 1101.10 - lr: 0.000003 - momentum: 0.000000
2023-09-03 18:34:56,819 ----------------------------------------------------------------------------------------------------
2023-09-03 18:34:56,819 EPOCH 9 done: loss 0.0076 - lr: 0.000003
2023-09-03 18:35:09,955 DEV : loss 0.2160281091928482 - f1-score (micro avg)  0.7901
2023-09-03 18:35:09,982 saving best model
2023-09-03 18:35:11,346 ----------------------------------------------------------------------------------------------------
2023-09-03 18:35:19,098 epoch 10 - iter 44/447 - loss 0.00476283 - time (sec): 7.75 - samples/sec: 1121.94 - lr: 0.000003 - momentum: 0.000000
2023-09-03 18:35:25,973 epoch 10 - iter 88/447 - loss 0.00479459 - time (sec): 14.63 - samples/sec: 1131.01 - lr: 0.000003 - momentum: 0.000000
2023-09-03 18:35:33,160 epoch 10 - iter 132/447 - loss 0.00414159 - time (sec): 21.81 - samples/sec: 1147.93 - lr: 0.000002 - momentum: 0.000000
2023-09-03 18:35:40,912 epoch 10 - iter 176/447 - loss 0.00424893 - time (sec): 29.56 - samples/sec: 1138.91 - lr: 0.000002 - momentum: 0.000000
2023-09-03 18:35:49,645 epoch 10 - iter 220/447 - loss 0.00517905 - time (sec): 38.30 - samples/sec: 1117.89 - lr: 0.000002 - momentum: 0.000000
2023-09-03 18:35:58,183 epoch 10 - iter 264/447 - loss 0.00508760 - time (sec): 46.84 - samples/sec: 1100.36 - lr: 0.000001 - momentum: 0.000000
2023-09-03 18:36:06,411 epoch 10 - iter 308/447 - loss 0.00478898 - time (sec): 55.06 - samples/sec: 1096.34 - lr: 0.000001 - momentum: 0.000000
2023-09-03 18:36:13,313 epoch 10 - iter 352/447 - loss 0.00492933 - time (sec): 61.96 - samples/sec: 1103.01 - lr: 0.000001 - momentum: 0.000000
2023-09-03 18:36:20,461 epoch 10 - iter 396/447 - loss 0.00523232 - time (sec): 69.11 - samples/sec: 1105.83 - lr: 0.000000 - momentum: 0.000000
2023-09-03 18:36:28,541 epoch 10 - iter 440/447 - loss 0.00518352 - time (sec): 77.19 - samples/sec: 1100.36 - lr: 0.000000 - momentum: 0.000000
2023-09-03 18:36:29,888 ----------------------------------------------------------------------------------------------------
2023-09-03 18:36:29,889 EPOCH 10 done: loss 0.0051 - lr: 0.000000
2023-09-03 18:36:42,963 DEV : loss 0.21760693192481995 - f1-score (micro avg)  0.7942
2023-09-03 18:36:42,989 saving best model
2023-09-03 18:36:44,882 ----------------------------------------------------------------------------------------------------
2023-09-03 18:36:44,883 Loading model from best epoch ...
2023-09-03 18:36:47,178 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
2023-09-03 18:36:58,110 
Results:
- F-score (micro) 0.7599
- F-score (macro) 0.7005
- Accuracy 0.632

By class:
              precision    recall  f1-score   support

         loc     0.8386    0.8540    0.8462       596
        pers     0.6746    0.7658    0.7173       333
         org     0.5328    0.4924    0.5118       132
        prod     0.7872    0.5606    0.6549        66
        time     0.7500    0.7959    0.7723        49

   micro avg     0.7504    0.7696    0.7599      1176
   macro avg     0.7166    0.6937    0.7005      1176
weighted avg     0.7512    0.7696    0.7584      1176

2023-09-03 18:36:58,110 ----------------------------------------------------------------------------------------------------
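Annotation: the "micro avg" row of the final report can be cross-checked from the per-class rows alone. A short sketch in plain Python: it recovers each class's true-positive and predicted-span counts from precision, recall, and support (rounded to the nearest integer, since the report shows four decimals), then recomputes the micro-averaged scores.

```python
# Per-class (precision, recall, support) from the test-set report above.
classes = {
    "loc":  (0.8386, 0.8540, 596),
    "pers": (0.6746, 0.7658, 333),
    "org":  (0.5328, 0.4924, 132),
    "prod": (0.7872, 0.5606,  66),
    "time": (0.7500, 0.7959,  49),
}
tp = sum(round(r * s) for p, r, s in classes.values())               # correctly predicted spans
pred = sum(round(round(r * s) / p) for p, r, s in classes.values())  # all predicted spans
gold = sum(s for _, _, s in classes.values())                        # all gold spans (1176)
precision, recall = tp / pred, tp / gold
f1 = 2 * precision * recall / (precision + recall)
print(f"micro P={precision:.4f} R={recall:.4f} F1={f1:.4f}")
# micro P=0.7504 R=0.7696 F1=0.7599 — matches the reported micro avg row
```

Note the gap between the best dev score (0.7942) and the test micro F1 (0.7599); the dev loss also rises steadily from epoch 4 on while dev F1 keeps improving, a common pattern when fine-tuning with model selection on F1 rather than loss.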