2023-09-03 18:37:15,477 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,478 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-09-03 18:37:15,478 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,478 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
 - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-09-03 18:37:15,478 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,478 Train: 3575 sentences
2023-09-03 18:37:15,478 (train_with_dev=False, train_with_test=False)
2023-09-03 18:37:15,479 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,479 Training Params:
2023-09-03 18:37:15,479 - learning_rate: "5e-05"
2023-09-03 18:37:15,479 - mini_batch_size: "8"
2023-09-03 18:37:15,479 - max_epochs: "10"
2023-09-03 18:37:15,479 - shuffle: "True"
2023-09-03 18:37:15,479 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,479 Plugins:
2023-09-03 18:37:15,479 - LinearScheduler | warmup_fraction: '0.1'
2023-09-03 18:37:15,479 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,479 Final evaluation on model from best epoch (best-model.pt)
2023-09-03 18:37:15,479 - metric: "('micro avg', 'f1-score')"
2023-09-03 18:37:15,479 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,479 Computation:
2023-09-03 18:37:15,479 - compute on device: cuda:0
2023-09-03 18:37:15,479 - embedding storage: none
2023-09-03 18:37:15,479 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,479 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
2023-09-03 18:37:15,480 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:15,480 ----------------------------------------------------------------------------------------------------
2023-09-03 18:37:24,050 epoch 1 - iter 44/447 - loss 2.97210936 - time (sec): 8.57 - samples/sec: 1109.39 - lr: 0.000005 - momentum: 0.000000
2023-09-03 18:37:31,154 epoch 1 - iter 88/447 - loss 2.08624157 - time (sec): 15.67 - samples/sec: 1108.96 - lr: 0.000010 - momentum: 0.000000
2023-09-03 18:37:37,865 epoch 1 - iter 132/447 - loss 1.59529987 - time (sec): 22.38 - samples/sec: 1118.30 - lr: 0.000015 - momentum: 0.000000
2023-09-03 18:37:45,327 epoch 1 - iter 176/447 - loss 1.29325004 - time (sec): 29.85 - samples/sec: 1120.76 - lr: 0.000020 - momentum: 0.000000
2023-09-03 18:37:52,890 epoch 1 - iter 220/447 - loss 1.11453909 - time (sec): 37.41 - samples/sec: 1111.03 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:38:01,040 epoch 1 - iter 264/447 - loss 0.97503409 - time (sec): 45.56 - samples/sec: 1106.17 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:38:08,840 epoch 1 - iter 308/447 - loss 0.87611792 - time (sec): 53.36 - samples/sec: 1105.42 - lr: 0.000034 - momentum: 0.000000
2023-09-03 18:38:17,358 epoch 1 - iter 352/447 - loss 0.79113112 - time (sec): 61.88 - samples/sec: 1099.44 - lr: 0.000039 - momentum: 0.000000
2023-09-03 18:38:24,914 epoch 1 - iter 396/447 - loss 0.73119621 - time (sec): 69.43 - samples/sec: 1099.03 - lr: 0.000044 - momentum: 0.000000
2023-09-03 18:38:33,495 epoch 1 - iter 440/447 - loss 0.68157940 - time (sec): 78.01 - samples/sec: 1094.86 - lr: 0.000049 - momentum: 0.000000
2023-09-03 18:38:34,666 ----------------------------------------------------------------------------------------------------
2023-09-03 18:38:34,666 EPOCH 1 done: loss 0.6756 - lr: 0.000049
2023-09-03 18:38:45,810 DEV : loss 0.1739310920238495 - f1-score (micro avg) 0.6196
2023-09-03 18:38:45,836 saving best model
2023-09-03 18:38:46,337 ----------------------------------------------------------------------------------------------------
2023-09-03 18:38:53,653 epoch 2 - iter 44/447 - loss 0.16560613 - time (sec): 7.31 - samples/sec: 1168.65 - lr: 0.000049 - momentum: 0.000000
2023-09-03 18:39:01,473 epoch 2 - iter 88/447 - loss 0.19089940 - time (sec): 15.13 - samples/sec: 1125.92 - lr: 0.000049 - momentum: 0.000000
2023-09-03 18:39:09,035 epoch 2 - iter 132/447 - loss 0.18220692 - time (sec): 22.70 - samples/sec: 1132.25 - lr: 0.000048 - momentum: 0.000000
2023-09-03 18:39:17,146 epoch 2 - iter 176/447 - loss 0.17541484 - time (sec): 30.81 - samples/sec: 1106.51 - lr: 0.000048 - momentum: 0.000000
2023-09-03 18:39:24,347 epoch 2 - iter 220/447 - loss 0.16907663 - time (sec): 38.01 - samples/sec: 1104.07 - lr: 0.000047 - momentum: 0.000000
2023-09-03 18:39:32,398 epoch 2 - iter 264/447 - loss 0.15835956 - time (sec): 46.06 - samples/sec: 1097.74 - lr: 0.000047 - momentum: 0.000000
2023-09-03 18:39:40,533 epoch 2 - iter 308/447 - loss 0.15636864 - time (sec): 54.19 - samples/sec: 1104.01 - lr: 0.000046 - momentum: 0.000000
2023-09-03 18:39:47,936 epoch 2 - iter 352/447 - loss 0.15607309 - time (sec): 61.60 - samples/sec: 1101.44 - lr: 0.000046 - momentum: 0.000000
2023-09-03 18:39:55,110 epoch 2 - iter 396/447 - loss 0.15418454 - time (sec): 68.77 - samples/sec: 1103.72 - lr: 0.000045 - momentum: 0.000000
2023-09-03 18:40:03,519 epoch 2 - iter 440/447 - loss 0.15161985 - time (sec): 77.18 - samples/sec: 1105.71 - lr: 0.000045 - momentum: 0.000000
2023-09-03 18:40:04,658 ----------------------------------------------------------------------------------------------------
2023-09-03 18:40:04,658 EPOCH 2 done: loss 0.1506 - lr: 0.000045
2023-09-03 18:40:18,194 DEV : loss 0.14003966748714447 - f1-score (micro avg) 0.6873
2023-09-03 18:40:18,220 saving best model
2023-09-03 18:40:19,597 ----------------------------------------------------------------------------------------------------
2023-09-03 18:40:26,572 epoch 3 - iter 44/447 - loss 0.09692877 - time (sec): 6.97 - samples/sec: 1106.06 - lr: 0.000044 - momentum: 0.000000
2023-09-03 18:40:33,739 epoch 3 - iter 88/447 - loss 0.08276589 - time (sec): 14.14 - samples/sec: 1128.70 - lr: 0.000043 - momentum: 0.000000
2023-09-03 18:40:41,049 epoch 3 - iter 132/447 - loss 0.09016044 - time (sec): 21.45 - samples/sec: 1119.62 - lr: 0.000043 - momentum: 0.000000
2023-09-03 18:40:49,347 epoch 3 - iter 176/447 - loss 0.08273976 - time (sec): 29.75 - samples/sec: 1104.94 - lr: 0.000042 - momentum: 0.000000
2023-09-03 18:40:58,045 epoch 3 - iter 220/447 - loss 0.08473180 - time (sec): 38.45 - samples/sec: 1090.82 - lr: 0.000042 - momentum: 0.000000
2023-09-03 18:41:05,198 epoch 3 - iter 264/447 - loss 0.08020328 - time (sec): 45.60 - samples/sec: 1108.30 - lr: 0.000041 - momentum: 0.000000
2023-09-03 18:41:12,722 epoch 3 - iter 308/447 - loss 0.07962750 - time (sec): 53.12 - samples/sec: 1113.43 - lr: 0.000041 - momentum: 0.000000
2023-09-03 18:41:20,476 epoch 3 - iter 352/447 - loss 0.07973679 - time (sec): 60.88 - samples/sec: 1113.73 - lr: 0.000040 - momentum: 0.000000
2023-09-03 18:41:27,559 epoch 3 - iter 396/447 - loss 0.08051684 - time (sec): 67.96 - samples/sec: 1120.12 - lr: 0.000040 - momentum: 0.000000
2023-09-03 18:41:36,296 epoch 3 - iter 440/447 - loss 0.08031459 - time (sec): 76.70 - samples/sec: 1114.23 - lr: 0.000039 - momentum: 0.000000
2023-09-03 18:41:37,267 ----------------------------------------------------------------------------------------------------
2023-09-03 18:41:37,267 EPOCH 3 done: loss 0.0809 - lr: 0.000039
2023-09-03 18:41:49,985 DEV : loss 0.12311021238565445 - f1-score (micro avg) 0.7417
2023-09-03 18:41:50,010 saving best model
2023-09-03 18:41:51,387 ----------------------------------------------------------------------------------------------------
2023-09-03 18:41:58,645 epoch 4 - iter 44/447 - loss 0.05953306 - time (sec): 7.26 - samples/sec: 1235.29 - lr: 0.000038 - momentum: 0.000000
2023-09-03 18:42:05,416 epoch 4 - iter 88/447 - loss 0.05304627 - time (sec): 14.03 - samples/sec: 1212.12 - lr: 0.000038 - momentum: 0.000000
2023-09-03 18:42:13,385 epoch 4 - iter 132/447 - loss 0.05012354 - time (sec): 22.00 - samples/sec: 1184.22 - lr: 0.000037 - momentum: 0.000000
2023-09-03 18:42:21,625 epoch 4 - iter 176/447 - loss 0.05186757 - time (sec): 30.24 - samples/sec: 1180.14 - lr: 0.000037 - momentum: 0.000000
2023-09-03 18:42:29,255 epoch 4 - iter 220/447 - loss 0.04888425 - time (sec): 37.87 - samples/sec: 1171.32 - lr: 0.000036 - momentum: 0.000000
2023-09-03 18:42:36,909 epoch 4 - iter 264/447 - loss 0.04960206 - time (sec): 45.52 - samples/sec: 1164.03 - lr: 0.000036 - momentum: 0.000000
2023-09-03 18:42:43,725 epoch 4 - iter 308/447 - loss 0.04940231 - time (sec): 52.34 - samples/sec: 1171.48 - lr: 0.000035 - momentum: 0.000000
2023-09-03 18:42:50,742 epoch 4 - iter 352/447 - loss 0.04805290 - time (sec): 59.35 - samples/sec: 1174.09 - lr: 0.000035 - momentum: 0.000000
2023-09-03 18:42:57,064 epoch 4 - iter 396/447 - loss 0.04661350 - time (sec): 65.68 - samples/sec: 1170.42 - lr: 0.000034 - momentum: 0.000000
2023-09-03 18:43:04,718 epoch 4 - iter 440/447 - loss 0.04615394 - time (sec): 73.33 - samples/sec: 1164.15 - lr: 0.000033 - momentum: 0.000000
2023-09-03 18:43:05,762 ----------------------------------------------------------------------------------------------------
2023-09-03 18:43:05,762 EPOCH 4 done: loss 0.0460 - lr: 0.000033
2023-09-03 18:43:18,509 DEV : loss 0.1624774932861328 - f1-score (micro avg) 0.735
2023-09-03 18:43:18,535 ----------------------------------------------------------------------------------------------------
2023-09-03 18:43:27,464 epoch 5 - iter 44/447 - loss 0.03232269 - time (sec): 8.93 - samples/sec: 1079.97 - lr: 0.000033 - momentum: 0.000000
2023-09-03 18:43:34,249 epoch 5 - iter 88/447 - loss 0.02953673 - time (sec): 15.71 - samples/sec: 1122.16 - lr: 0.000032 - momentum: 0.000000
2023-09-03 18:43:41,828 epoch 5 - iter 132/447 - loss 0.02726699 - time (sec): 23.29 - samples/sec: 1127.10 - lr: 0.000032 - momentum: 0.000000
2023-09-03 18:43:48,638 epoch 5 - iter 176/447 - loss 0.02799655 - time (sec): 30.10 - samples/sec: 1141.75 - lr: 0.000031 - momentum: 0.000000
2023-09-03 18:43:56,651 epoch 5 - iter 220/447 - loss 0.02985539 - time (sec): 38.12 - samples/sec: 1138.23 - lr: 0.000031 - momentum: 0.000000
2023-09-03 18:44:03,776 epoch 5 - iter 264/447 - loss 0.02979495 - time (sec): 45.24 - samples/sec: 1149.74 - lr: 0.000030 - momentum: 0.000000
2023-09-03 18:44:10,883 epoch 5 - iter 308/447 - loss 0.02790303 - time (sec): 52.35 - samples/sec: 1147.71 - lr: 0.000030 - momentum: 0.000000
2023-09-03 18:44:18,415 epoch 5 - iter 352/447 - loss 0.02765430 - time (sec): 59.88 - samples/sec: 1148.47 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:44:25,671 epoch 5 - iter 396/447 - loss 0.02832895 - time (sec): 67.13 - samples/sec: 1141.99 - lr: 0.000028 - momentum: 0.000000
2023-09-03 18:44:32,834 epoch 5 - iter 440/447 - loss 0.02991252 - time (sec): 74.30 - samples/sec: 1147.75 - lr: 0.000028 - momentum: 0.000000
2023-09-03 18:44:33,958 ----------------------------------------------------------------------------------------------------
2023-09-03 18:44:33,958 EPOCH 5 done: loss 0.0298 - lr: 0.000028
2023-09-03 18:44:46,860 DEV : loss 0.1907276213169098 - f1-score (micro avg) 0.751
2023-09-03 18:44:46,885 saving best model
2023-09-03 18:44:48,254 ----------------------------------------------------------------------------------------------------
2023-09-03 18:44:56,102 epoch 6 - iter 44/447 - loss 0.02353051 - time (sec): 7.85 - samples/sec: 1094.98 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:45:02,914 epoch 6 - iter 88/447 - loss 0.02152485 - time (sec): 14.66 - samples/sec: 1104.44 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:45:10,963 epoch 6 - iter 132/447 - loss 0.01978128 - time (sec): 22.71 - samples/sec: 1102.81 - lr: 0.000026 - momentum: 0.000000
2023-09-03 18:45:18,841 epoch 6 - iter 176/447 - loss 0.02010974 - time (sec): 30.59 - samples/sec: 1110.42 - lr: 0.000026 - momentum: 0.000000
2023-09-03 18:45:25,569 epoch 6 - iter 220/447 - loss 0.01936305 - time (sec): 37.31 - samples/sec: 1113.50 - lr: 0.000025 - momentum: 0.000000
2023-09-03 18:45:33,019 epoch 6 - iter 264/447 - loss 0.01765538 - time (sec): 44.76 - samples/sec: 1108.02 - lr: 0.000025 - momentum: 0.000000
2023-09-03 18:45:40,262 epoch 6 - iter 308/447 - loss 0.02086487 - time (sec): 52.01 - samples/sec: 1103.53 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:45:47,366 epoch 6 - iter 352/447 - loss 0.02325808 - time (sec): 59.11 - samples/sec: 1116.31 - lr: 0.000023 - momentum: 0.000000
2023-09-03 18:45:56,749 epoch 6 - iter 396/447 - loss 0.02368370 - time (sec): 68.49 - samples/sec: 1112.15 - lr: 0.000023 - momentum: 0.000000
2023-09-03 18:46:05,021 epoch 6 - iter 440/447 - loss 0.02311494 - time (sec): 76.77 - samples/sec: 1110.21 - lr: 0.000022 - momentum: 0.000000
2023-09-03 18:46:06,096 ----------------------------------------------------------------------------------------------------
2023-09-03 18:46:06,097 EPOCH 6 done: loss 0.0228 - lr: 0.000022
2023-09-03 18:46:19,323 DEV : loss 0.2008039355278015 - f1-score (micro avg) 0.7712
2023-09-03 18:46:19,350 saving best model
2023-09-03 18:46:20,734 ----------------------------------------------------------------------------------------------------
2023-09-03 18:46:28,118 epoch 7 - iter 44/447 - loss 0.01348835 - time (sec): 7.38 - samples/sec: 1183.96 - lr: 0.000022 - momentum: 0.000000
2023-09-03 18:46:35,357 epoch 7 - iter 88/447 - loss 0.01269963 - time (sec): 14.62 - samples/sec: 1154.42 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:46:45,612 epoch 7 - iter 132/447 - loss 0.01130388 - time (sec): 24.88 - samples/sec: 1084.59 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:46:53,306 epoch 7 - iter 176/447 - loss 0.01145780 - time (sec): 32.57 - samples/sec: 1084.51 - lr: 0.000020 - momentum: 0.000000
2023-09-03 18:47:01,248 epoch 7 - iter 220/447 - loss 0.01445916 - time (sec): 40.51 - samples/sec: 1087.73 - lr: 0.000020 - momentum: 0.000000
2023-09-03 18:47:08,234 epoch 7 - iter 264/447 - loss 0.01461023 - time (sec): 47.50 - samples/sec: 1095.59 - lr: 0.000019 - momentum: 0.000000
2023-09-03 18:47:15,605 epoch 7 - iter 308/447 - loss 0.01341043 - time (sec): 54.87 - samples/sec: 1096.51 - lr: 0.000018 - momentum: 0.000000
2023-09-03 18:47:23,581 epoch 7 - iter 352/447 - loss 0.01347986 - time (sec): 62.85 - samples/sec: 1089.87 - lr: 0.000018 - momentum: 0.000000
2023-09-03 18:47:30,784 epoch 7 - iter 396/447 - loss 0.01472372 - time (sec): 70.05 - samples/sec: 1087.16 - lr: 0.000017 - momentum: 0.000000
2023-09-03 18:47:37,761 epoch 7 - iter 440/447 - loss 0.01441325 - time (sec): 77.03 - samples/sec: 1094.22 - lr: 0.000017 - momentum: 0.000000
2023-09-03 18:47:39,921 ----------------------------------------------------------------------------------------------------
2023-09-03 18:47:39,921 EPOCH 7 done: loss 0.0141 - lr: 0.000017
2023-09-03 18:47:53,001 DEV : loss 0.20943114161491394 - f1-score (micro avg) 0.7786
2023-09-03 18:47:53,027 saving best model
2023-09-03 18:47:54,434 ----------------------------------------------------------------------------------------------------
2023-09-03 18:48:02,258 epoch 8 - iter 44/447 - loss 0.00861768 - time (sec): 7.82 - samples/sec: 1095.38 - lr: 0.000016 - momentum: 0.000000
2023-09-03 18:48:10,085 epoch 8 - iter 88/447 - loss 0.01029311 - time (sec): 15.65 - samples/sec: 1123.98 - lr: 0.000016 - momentum: 0.000000
2023-09-03 18:48:17,543 epoch 8 - iter 132/447 - loss 0.00902152 - time (sec): 23.11 - samples/sec: 1153.90 - lr: 0.000015 - momentum: 0.000000
2023-09-03 18:48:25,988 epoch 8 - iter 176/447 - loss 0.00843968 - time (sec): 31.55 - samples/sec: 1144.76 - lr: 0.000015 - momentum: 0.000000
2023-09-03 18:48:33,206 epoch 8 - iter 220/447 - loss 0.00879577 - time (sec): 38.77 - samples/sec: 1132.31 - lr: 0.000014 - momentum: 0.000000
2023-09-03 18:48:40,700 epoch 8 - iter 264/447 - loss 0.00969029 - time (sec): 46.26 - samples/sec: 1123.71 - lr: 0.000013 - momentum: 0.000000
2023-09-03 18:48:47,962 epoch 8 - iter 308/447 - loss 0.00980882 - time (sec): 53.53 - samples/sec: 1138.57 - lr: 0.000013 - momentum: 0.000000
2023-09-03 18:48:54,786 epoch 8 - iter 352/447 - loss 0.00918486 - time (sec): 60.35 - samples/sec: 1147.46 - lr: 0.000012 - momentum: 0.000000
2023-09-03 18:49:01,664 epoch 8 - iter 396/447 - loss 0.00871715 - time (sec): 67.23 - samples/sec: 1151.18 - lr: 0.000012 - momentum: 0.000000
2023-09-03 18:49:08,773 epoch 8 - iter 440/447 - loss 0.00886033 - time (sec): 74.34 - samples/sec: 1147.45 - lr: 0.000011 - momentum: 0.000000
2023-09-03 18:49:09,790 ----------------------------------------------------------------------------------------------------
2023-09-03 18:49:09,790 EPOCH 8 done: loss 0.0087 - lr: 0.000011
2023-09-03 18:49:22,105 DEV : loss 0.22768265008926392 - f1-score (micro avg) 0.7963
2023-09-03 18:49:22,131 saving best model
2023-09-03 18:49:23,664 ----------------------------------------------------------------------------------------------------
2023-09-03 18:49:30,687 epoch 9 - iter 44/447 - loss 0.01014767 - time (sec): 7.02 - samples/sec: 1160.03 - lr: 0.000011 - momentum: 0.000000
2023-09-03 18:49:38,342 epoch 9 - iter 88/447 - loss 0.00771239 - time (sec): 14.68 - samples/sec: 1183.93 - lr: 0.000010 - momentum: 0.000000
2023-09-03 18:49:46,167 epoch 9 - iter 132/447 - loss 0.00667470 - time (sec): 22.50 - samples/sec: 1143.26 - lr: 0.000010 - momentum: 0.000000
2023-09-03 18:49:53,874 epoch 9 - iter 176/447 - loss 0.00519821 - time (sec): 30.21 - samples/sec: 1153.69 - lr: 0.000009 - momentum: 0.000000
2023-09-03 18:50:02,130 epoch 9 - iter 220/447 - loss 0.00485846 - time (sec): 38.46 - samples/sec: 1135.16 - lr: 0.000008 - momentum: 0.000000
2023-09-03 18:50:08,827 epoch 9 - iter 264/447 - loss 0.00604157 - time (sec): 45.16 - samples/sec: 1148.78 - lr: 0.000008 - momentum: 0.000000
2023-09-03 18:50:16,674 epoch 9 - iter 308/447 - loss 0.00522451 - time (sec): 53.01 - samples/sec: 1159.33 - lr: 0.000007 - momentum: 0.000000
2023-09-03 18:50:23,367 epoch 9 - iter 352/447 - loss 0.00506470 - time (sec): 59.70 - samples/sec: 1163.74 - lr: 0.000007 - momentum: 0.000000
2023-09-03 18:50:29,866 epoch 9 - iter 396/447 - loss 0.00539255 - time (sec): 66.20 - samples/sec: 1169.13 - lr: 0.000006 - momentum: 0.000000
2023-09-03 18:50:36,957 epoch 9 - iter 440/447 - loss 0.00580390 - time (sec): 73.29 - samples/sec: 1164.19 - lr: 0.000006 - momentum: 0.000000
2023-09-03 18:50:37,953 ----------------------------------------------------------------------------------------------------
2023-09-03 18:50:37,953 EPOCH 9 done: loss 0.0059 - lr: 0.000006
2023-09-03 18:50:50,739 DEV : loss 0.22147929668426514 - f1-score (micro avg) 0.7969
2023-09-03 18:50:50,765 saving best model
2023-09-03 18:50:52,122 ----------------------------------------------------------------------------------------------------
2023-09-03 18:50:59,547 epoch 10 - iter 44/447 - loss 0.00413857 - time (sec): 7.42 - samples/sec: 1171.46 - lr: 0.000005 - momentum: 0.000000
2023-09-03 18:51:06,141 epoch 10 - iter 88/447 - loss 0.00278015 - time (sec): 14.02 - samples/sec: 1180.08 - lr: 0.000005 - momentum: 0.000000
2023-09-03 18:51:13,097 epoch 10 - iter 132/447 - loss 0.00278615 - time (sec): 20.97 - samples/sec: 1193.81 - lr: 0.000004 - momentum: 0.000000
2023-09-03 18:51:20,569 epoch 10 - iter 176/447 - loss 0.00302447 - time (sec): 28.45 - samples/sec: 1183.69 - lr: 0.000003 - momentum: 0.000000
2023-09-03 18:51:28,966 epoch 10 - iter 220/447 - loss 0.00366839 - time (sec): 36.84 - samples/sec: 1162.02 - lr: 0.000003 - momentum: 0.000000
2023-09-03 18:51:36,788 epoch 10 - iter 264/447 - loss 0.00391400 - time (sec): 44.66 - samples/sec: 1153.84 - lr: 0.000002 - momentum: 0.000000
2023-09-03 18:51:44,776 epoch 10 - iter 308/447 - loss 0.00371984 - time (sec): 52.65 - samples/sec: 1146.54 - lr: 0.000002 - momentum: 0.000000
2023-09-03 18:51:51,500 epoch 10 - iter 352/447 - loss 0.00399300 - time (sec): 59.38 - samples/sec: 1151.10 - lr: 0.000001 - momentum: 0.000000
2023-09-03 18:51:58,493 epoch 10 - iter 396/447 - loss 0.00406624 - time (sec): 66.37 - samples/sec: 1151.55 - lr: 0.000001 - momentum: 0.000000
2023-09-03 18:52:06,438 epoch 10 - iter 440/447 - loss 0.00443380 - time (sec): 74.31 - samples/sec: 1142.97 - lr: 0.000000 - momentum: 0.000000
2023-09-03 18:52:07,762 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:07,762 EPOCH 10 done: loss 0.0044 - lr: 0.000000
2023-09-03 18:52:21,014 DEV : loss 0.22894832491874695 - f1-score (micro avg) 0.7958
2023-09-03 18:52:21,562 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:21,563 Loading model from best epoch ...
2023-09-03 18:52:23,844 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
2023-09-03 18:52:34,774 Results:
- F-score (micro) 0.7425
- F-score (macro) 0.6706
- Accuracy 0.6076

By class:
              precision    recall  f1-score   support

         loc     0.8372    0.8456    0.8414       596
        pers     0.6272    0.7477    0.6822       333
         org     0.5462    0.4924    0.5179       132
        prod     0.7568    0.4242    0.5437        66
        time     0.7600    0.7755    0.7677        49

   micro avg     0.7336    0.7517    0.7425      1176
   macro avg     0.7055    0.6571    0.6706      1176
weighted avg     0.7373    0.7517    0.7402      1176
2023-09-03 18:52:34,774 ----------------------------------------------------------------------------------------------------
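
Note on the lr column: the logged values are consistent with a one-cycle linear schedule over all 4470 optimizer steps (447 iterations x 10 epochs) with a 10% warmup, matching the "LinearScheduler | warmup_fraction: '0.1'" plugin and learning_rate 5e-05 above. A minimal sketch of that schedule (the helper `linear_lr` is ours for illustration, not a Flair API):

```python
def linear_lr(step, base_lr=5e-5, steps_per_epoch=447, epochs=10,
              warmup_fraction=0.1):
    """Linear warmup then linear decay, as implied by the log's lr column."""
    total = steps_per_epoch * epochs       # 4470 optimizer steps in this run
    warmup = int(total * warmup_fraction)  # 447 warmup steps (all of epoch 1)
    if step < warmup:
        return base_lr * step / warmup     # ramp 0 -> base_lr during warmup
    return base_lr * (total - step) / (total - warmup)  # then decay to 0
```

For example, `linear_lr(44)` gives about 4.9e-06 (logged as 0.000005 at epoch 1, iter 44/447) and `linear_lr(4470)` gives 0.0, matching the lr: 0.000000 at the end of epoch 10.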