2023-09-03 21:47:48,328 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,329 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-09-03 21:47:48,329 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,329 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
 - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
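A quick sanity check on the model summary above: the final `linear` layer has `out_features=21`, which corresponds to a BIOES tagging scheme over the five HIPE-2020 entity types that appear later in this log (loc, pers, org, prod, time), plus the `O` tag. A minimal sketch (the type list is taken from the tag dictionary printed at the end of this log):

```python
# BIOES scheme: each entity type gets S-/B-/E-/I- variants; "O" carries no type.
entity_types = ["loc", "pers", "org", "prod", "time"]  # from this log's tag dictionary
prefixes = ["S", "B", "E", "I"]

tags = ["O"] + [f"{p}-{t}" for t in entity_types for p in prefixes]
print(len(tags))  # → 21, matching out_features of the final Linear layer
```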
2023-09-03 21:47:48,329 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,329 Train: 3575 sentences
2023-09-03 21:47:48,329 (train_with_dev=False, train_with_test=False)
2023-09-03 21:47:48,329 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,329 Training Params:
2023-09-03 21:47:48,329 - learning_rate: "3e-05"
2023-09-03 21:47:48,330 - mini_batch_size: "8"
2023-09-03 21:47:48,330 - max_epochs: "10"
2023-09-03 21:47:48,330 - shuffle: "True"
2023-09-03 21:47:48,330 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,330 Plugins:
2023-09-03 21:47:48,330 - LinearScheduler | warmup_fraction: '0.1'
2023-09-03 21:47:48,330 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,330 Final evaluation on model from best epoch (best-model.pt)
2023-09-03 21:47:48,330 - metric: "('micro avg', 'f1-score')"
2023-09-03 21:47:48,330 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,330 Computation:
2023-09-03 21:47:48,330 - compute on device: cuda:0
2023-09-03 21:47:48,330 - embedding storage: none
2023-09-03 21:47:48,330 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,330 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4"
2023-09-03 21:47:48,330 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:48,330 ----------------------------------------------------------------------------------------------------
2023-09-03 21:47:55,785 epoch 1 - iter 44/447 - loss 3.12962511 - time (sec): 7.45 - samples/sec: 1175.07 - lr: 0.000003 - momentum: 0.000000
2023-09-03 21:48:03,528 epoch 1 - iter 88/447 - loss 2.35140020 - time (sec): 15.20 - samples/sec: 1169.16 - lr: 0.000006 - momentum: 0.000000
2023-09-03 21:48:10,452 epoch 1 - iter 132/447 - loss 1.78338774 - time (sec): 22.12 - samples/sec: 1162.88 - lr: 0.000009 - momentum: 0.000000
2023-09-03 21:48:18,571 epoch 1 - iter 176/447 - loss 1.43767302 - time (sec): 30.24 - samples/sec: 1141.34 - lr: 0.000012 - momentum: 0.000000
2023-09-03 21:48:25,531 epoch 1 - iter 220/447 - loss 1.22378907 - time (sec): 37.20 - samples/sec: 1148.20 - lr: 0.000015 - momentum: 0.000000
2023-09-03 21:48:32,585 epoch 1 - iter 264/447 - loss 1.08276545 - time (sec): 44.25 - samples/sec: 1150.88 - lr: 0.000018 - momentum: 0.000000
2023-09-03 21:48:39,754 epoch 1 - iter 308/447 - loss 0.97749772 - time (sec): 51.42 - samples/sec: 1151.81 - lr: 0.000021 - momentum: 0.000000
2023-09-03 21:48:46,915 epoch 1 - iter 352/447 - loss 0.89054623 - time (sec): 58.58 - samples/sec: 1154.42 - lr: 0.000024 - momentum: 0.000000
2023-09-03 21:48:53,666 epoch 1 - iter 396/447 - loss 0.81804034 - time (sec): 65.33 - samples/sec: 1157.81 - lr: 0.000027 - momentum: 0.000000
2023-09-03 21:49:02,254 epoch 1 - iter 440/447 - loss 0.75554468 - time (sec): 73.92 - samples/sec: 1153.07 - lr: 0.000029 - momentum: 0.000000
2023-09-03 21:49:03,335 ----------------------------------------------------------------------------------------------------
2023-09-03 21:49:03,335 EPOCH 1 done: loss 0.7477 - lr: 0.000029
2023-09-03 21:49:13,649 DEV : loss 0.18366600573062897 - f1-score (micro avg) 0.6486
2023-09-03 21:49:13,676 saving best model
2023-09-03 21:49:14,156 ----------------------------------------------------------------------------------------------------
2023-09-03 21:49:22,145 epoch 2 - iter 44/447 - loss 0.19258207 - time (sec): 7.99 - samples/sec: 1121.26 - lr: 0.000030 - momentum: 0.000000
2023-09-03 21:49:30,663 epoch 2 - iter 88/447 - loss 0.18581170 - time (sec): 16.50 - samples/sec: 1121.12 - lr: 0.000029 - momentum: 0.000000
2023-09-03 21:49:37,373 epoch 2 - iter 132/447 - loss 0.17637373 - time (sec): 23.22 - samples/sec: 1128.22 - lr: 0.000029 - momentum: 0.000000
2023-09-03 21:49:44,424 epoch 2 - iter 176/447 - loss 0.18195894 - time (sec): 30.27 - samples/sec: 1139.01 - lr: 0.000029 - momentum: 0.000000
2023-09-03 21:49:52,167 epoch 2 - iter 220/447 - loss 0.17720232 - time (sec): 38.01 - samples/sec: 1133.45 - lr: 0.000028 - momentum: 0.000000
2023-09-03 21:49:59,168 epoch 2 - iter 264/447 - loss 0.16769790 - time (sec): 45.01 - samples/sec: 1149.46 - lr: 0.000028 - momentum: 0.000000
2023-09-03 21:50:05,991 epoch 2 - iter 308/447 - loss 0.16592686 - time (sec): 51.83 - samples/sec: 1152.05 - lr: 0.000028 - momentum: 0.000000
2023-09-03 21:50:12,642 epoch 2 - iter 352/447 - loss 0.16512555 - time (sec): 58.48 - samples/sec: 1158.57 - lr: 0.000027 - momentum: 0.000000
2023-09-03 21:50:21,095 epoch 2 - iter 396/447 - loss 0.16132533 - time (sec): 66.94 - samples/sec: 1147.80 - lr: 0.000027 - momentum: 0.000000
2023-09-03 21:50:28,044 epoch 2 - iter 440/447 - loss 0.16014494 - time (sec): 73.89 - samples/sec: 1153.76 - lr: 0.000027 - momentum: 0.000000
2023-09-03 21:50:29,352 ----------------------------------------------------------------------------------------------------
2023-09-03 21:50:29,352 EPOCH 2 done: loss 0.1590 - lr: 0.000027
2023-09-03 21:50:42,071 DEV : loss 0.1284685730934143 - f1-score (micro avg) 0.6967
2023-09-03 21:50:42,097 saving best model
2023-09-03 21:50:43,417 ----------------------------------------------------------------------------------------------------
2023-09-03 21:50:50,812 epoch 3 - iter 44/447 - loss 0.11417537 - time (sec): 7.39 - samples/sec: 1157.30 - lr: 0.000026 - momentum: 0.000000
2023-09-03 21:50:59,278 epoch 3 - iter 88/447 - loss 0.10693471 - time (sec): 15.86 - samples/sec: 1125.46 - lr: 0.000026 - momentum: 0.000000
2023-09-03 21:51:06,565 epoch 3 - iter 132/447 - loss 0.09584293 - time (sec): 23.15 - samples/sec: 1132.38 - lr: 0.000026 - momentum: 0.000000
2023-09-03 21:51:13,634 epoch 3 - iter 176/447 - loss 0.09306988 - time (sec): 30.22 - samples/sec: 1140.56 - lr: 0.000025 - momentum: 0.000000
2023-09-03 21:51:20,197 epoch 3 - iter 220/447 - loss 0.09226293 - time (sec): 36.78 - samples/sec: 1143.63 - lr: 0.000025 - momentum: 0.000000
2023-09-03 21:51:27,371 epoch 3 - iter 264/447 - loss 0.08935900 - time (sec): 43.95 - samples/sec: 1150.24 - lr: 0.000025 - momentum: 0.000000
2023-09-03 21:51:34,321 epoch 3 - iter 308/447 - loss 0.09054067 - time (sec): 50.90 - samples/sec: 1153.10 - lr: 0.000024 - momentum: 0.000000
2023-09-03 21:51:41,820 epoch 3 - iter 352/447 - loss 0.08596907 - time (sec): 58.40 - samples/sec: 1156.31 - lr: 0.000024 - momentum: 0.000000
2023-09-03 21:51:48,537 epoch 3 - iter 396/447 - loss 0.08783809 - time (sec): 65.12 - samples/sec: 1164.56 - lr: 0.000024 - momentum: 0.000000
2023-09-03 21:51:56,728 epoch 3 - iter 440/447 - loss 0.08875828 - time (sec): 73.31 - samples/sec: 1163.68 - lr: 0.000023 - momentum: 0.000000
2023-09-03 21:51:57,734 ----------------------------------------------------------------------------------------------------
2023-09-03 21:51:57,734 EPOCH 3 done: loss 0.0884 - lr: 0.000023
2023-09-03 21:52:10,412 DEV : loss 0.12393897026777267 - f1-score (micro avg) 0.7455
2023-09-03 21:52:10,438 saving best model
2023-09-03 21:52:11,767 ----------------------------------------------------------------------------------------------------
2023-09-03 21:52:18,817 epoch 4 - iter 44/447 - loss 0.04837729 - time (sec): 7.05 - samples/sec: 1197.54 - lr: 0.000023 - momentum: 0.000000
2023-09-03 21:52:25,461 epoch 4 - iter 88/447 - loss 0.05700131 - time (sec): 13.69 - samples/sec: 1198.66 - lr: 0.000023 - momentum: 0.000000
2023-09-03 21:52:32,737 epoch 4 - iter 132/447 - loss 0.05240622 - time (sec): 20.97 - samples/sec: 1190.96 - lr: 0.000022 - momentum: 0.000000
2023-09-03 21:52:39,494 epoch 4 - iter 176/447 - loss 0.05021824 - time (sec): 27.73 - samples/sec: 1202.01 - lr: 0.000022 - momentum: 0.000000
2023-09-03 21:52:48,759 epoch 4 - iter 220/447 - loss 0.05199058 - time (sec): 36.99 - samples/sec: 1172.49 - lr: 0.000022 - momentum: 0.000000
2023-09-03 21:52:56,018 epoch 4 - iter 264/447 - loss 0.05288125 - time (sec): 44.25 - samples/sec: 1173.83 - lr: 0.000021 - momentum: 0.000000
2023-09-03 21:53:02,509 epoch 4 - iter 308/447 - loss 0.05181498 - time (sec): 50.74 - samples/sec: 1178.38 - lr: 0.000021 - momentum: 0.000000
2023-09-03 21:53:09,486 epoch 4 - iter 352/447 - loss 0.05179435 - time (sec): 57.72 - samples/sec: 1175.08 - lr: 0.000021 - momentum: 0.000000
2023-09-03 21:53:18,321 epoch 4 - iter 396/447 - loss 0.05213284 - time (sec): 66.55 - samples/sec: 1161.54 - lr: 0.000020 - momentum: 0.000000
2023-09-03 21:53:25,522 epoch 4 - iter 440/447 - loss 0.05214727 - time (sec): 73.75 - samples/sec: 1155.61 - lr: 0.000020 - momentum: 0.000000
2023-09-03 21:53:26,644 ----------------------------------------------------------------------------------------------------
2023-09-03 21:53:26,644 EPOCH 4 done: loss 0.0523 - lr: 0.000020
2023-09-03 21:53:39,531 DEV : loss 0.14459097385406494 - f1-score (micro avg) 0.7389
2023-09-03 21:53:39,558 ----------------------------------------------------------------------------------------------------
2023-09-03 21:53:47,425 epoch 5 - iter 44/447 - loss 0.04291044 - time (sec): 7.87 - samples/sec: 1140.36 - lr: 0.000020 - momentum: 0.000000
2023-09-03 21:53:54,897 epoch 5 - iter 88/447 - loss 0.03452113 - time (sec): 15.34 - samples/sec: 1125.95 - lr: 0.000019 - momentum: 0.000000
2023-09-03 21:54:02,424 epoch 5 - iter 132/447 - loss 0.03458062 - time (sec): 22.87 - samples/sec: 1142.12 - lr: 0.000019 - momentum: 0.000000
2023-09-03 21:54:09,729 epoch 5 - iter 176/447 - loss 0.03212976 - time (sec): 30.17 - samples/sec: 1149.47 - lr: 0.000019 - momentum: 0.000000
2023-09-03 21:54:16,699 epoch 5 - iter 220/447 - loss 0.03547472 - time (sec): 37.14 - samples/sec: 1148.25 - lr: 0.000018 - momentum: 0.000000
2023-09-03 21:54:24,438 epoch 5 - iter 264/447 - loss 0.03675887 - time (sec): 44.88 - samples/sec: 1142.21 - lr: 0.000018 - momentum: 0.000000
2023-09-03 21:54:33,086 epoch 5 - iter 308/447 - loss 0.03592056 - time (sec): 53.53 - samples/sec: 1131.73 - lr: 0.000018 - momentum: 0.000000
2023-09-03 21:54:39,877 epoch 5 - iter 352/447 - loss 0.03628439 - time (sec): 60.32 - samples/sec: 1137.15 - lr: 0.000017 - momentum: 0.000000
2023-09-03 21:54:47,432 epoch 5 - iter 396/447 - loss 0.03524403 - time (sec): 67.87 - samples/sec: 1131.06 - lr: 0.000017 - momentum: 0.000000
2023-09-03 21:54:55,117 epoch 5 - iter 440/447 - loss 0.03425228 - time (sec): 75.56 - samples/sec: 1129.93 - lr: 0.000017 - momentum: 0.000000
2023-09-03 21:54:56,153 ----------------------------------------------------------------------------------------------------
2023-09-03 21:54:56,154 EPOCH 5 done: loss 0.0339 - lr: 0.000017
2023-09-03 21:55:09,596 DEV : loss 0.15566274523735046 - f1-score (micro avg) 0.7765
2023-09-03 21:55:09,622 saving best model
2023-09-03 21:55:10,932 ----------------------------------------------------------------------------------------------------
2023-09-03 21:55:18,649 epoch 6 - iter 44/447 - loss 0.01226608 - time (sec): 7.72 - samples/sec: 1115.35 - lr: 0.000016 - momentum: 0.000000
2023-09-03 21:55:26,802 epoch 6 - iter 88/447 - loss 0.01561644 - time (sec): 15.87 - samples/sec: 1110.08 - lr: 0.000016 - momentum: 0.000000
2023-09-03 21:55:34,165 epoch 6 - iter 132/447 - loss 0.01849813 - time (sec): 23.23 - samples/sec: 1117.88 - lr: 0.000016 - momentum: 0.000000
2023-09-03 21:55:43,270 epoch 6 - iter 176/447 - loss 0.01853913 - time (sec): 32.34 - samples/sec: 1108.05 - lr: 0.000015 - momentum: 0.000000
2023-09-03 21:55:50,862 epoch 6 - iter 220/447 - loss 0.01997274 - time (sec): 39.93 - samples/sec: 1085.94 - lr: 0.000015 - momentum: 0.000000
2023-09-03 21:55:58,163 epoch 6 - iter 264/447 - loss 0.01944575 - time (sec): 47.23 - samples/sec: 1089.97 - lr: 0.000015 - momentum: 0.000000
2023-09-03 21:56:06,111 epoch 6 - iter 308/447 - loss 0.01889192 - time (sec): 55.18 - samples/sec: 1086.87 - lr: 0.000014 - momentum: 0.000000
2023-09-03 21:56:13,590 epoch 6 - iter 352/447 - loss 0.01981898 - time (sec): 62.66 - samples/sec: 1085.80 - lr: 0.000014 - momentum: 0.000000
2023-09-03 21:56:21,353 epoch 6 - iter 396/447 - loss 0.01996930 - time (sec): 70.42 - samples/sec: 1092.28 - lr: 0.000014 - momentum: 0.000000
2023-09-03 21:56:28,839 epoch 6 - iter 440/447 - loss 0.02090455 - time (sec): 77.91 - samples/sec: 1094.99 - lr: 0.000013 - momentum: 0.000000
2023-09-03 21:56:29,912 ----------------------------------------------------------------------------------------------------
2023-09-03 21:56:29,913 EPOCH 6 done: loss 0.0207 - lr: 0.000013
2023-09-03 21:56:43,475 DEV : loss 0.182601198554039 - f1-score (micro avg) 0.7789
2023-09-03 21:56:43,503 saving best model
2023-09-03 21:56:44,875 ----------------------------------------------------------------------------------------------------
2023-09-03 21:56:54,184 epoch 7 - iter 44/447 - loss 0.01603509 - time (sec): 9.31 - samples/sec: 1066.92 - lr: 0.000013 - momentum: 0.000000
2023-09-03 21:57:01,561 epoch 7 - iter 88/447 - loss 0.01323089 - time (sec): 16.68 - samples/sec: 1069.18 - lr: 0.000013 - momentum: 0.000000
2023-09-03 21:57:09,211 epoch 7 - iter 132/447 - loss 0.01186224 - time (sec): 24.33 - samples/sec: 1092.70 - lr: 0.000012 - momentum: 0.000000
2023-09-03 21:57:16,792 epoch 7 - iter 176/447 - loss 0.01210371 - time (sec): 31.92 - samples/sec: 1110.24 - lr: 0.000012 - momentum: 0.000000
2023-09-03 21:57:24,003 epoch 7 - iter 220/447 - loss 0.01409883 - time (sec): 39.13 - samples/sec: 1123.05 - lr: 0.000012 - momentum: 0.000000
2023-09-03 21:57:30,953 epoch 7 - iter 264/447 - loss 0.01369063 - time (sec): 46.08 - samples/sec: 1120.80 - lr: 0.000011 - momentum: 0.000000
2023-09-03 21:57:38,104 epoch 7 - iter 308/447 - loss 0.01417115 - time (sec): 53.23 - samples/sec: 1129.98 - lr: 0.000011 - momentum: 0.000000
2023-09-03 21:57:45,348 epoch 7 - iter 352/447 - loss 0.01331475 - time (sec): 60.47 - samples/sec: 1130.66 - lr: 0.000011 - momentum: 0.000000
2023-09-03 21:57:52,177 epoch 7 - iter 396/447 - loss 0.01453856 - time (sec): 67.30 - samples/sec: 1134.84 - lr: 0.000010 - momentum: 0.000000
2023-09-03 21:57:59,590 epoch 7 - iter 440/447 - loss 0.01481529 - time (sec): 74.71 - samples/sec: 1143.84 - lr: 0.000010 - momentum: 0.000000
2023-09-03 21:58:00,537 ----------------------------------------------------------------------------------------------------
2023-09-03 21:58:00,537 EPOCH 7 done: loss 0.0148 - lr: 0.000010
2023-09-03 21:58:12,973 DEV : loss 0.18851646780967712 - f1-score (micro avg) 0.7971
2023-09-03 21:58:13,000 saving best model
2023-09-03 21:58:14,355 ----------------------------------------------------------------------------------------------------
2023-09-03 21:58:21,584 epoch 8 - iter 44/447 - loss 0.01121020 - time (sec): 7.23 - samples/sec: 1188.87 - lr: 0.000010 - momentum: 0.000000
2023-09-03 21:58:29,461 epoch 8 - iter 88/447 - loss 0.00833087 - time (sec): 15.10 - samples/sec: 1134.48 - lr: 0.000009 - momentum: 0.000000
2023-09-03 21:58:36,309 epoch 8 - iter 132/447 - loss 0.00864124 - time (sec): 21.95 - samples/sec: 1154.73 - lr: 0.000009 - momentum: 0.000000
2023-09-03 21:58:43,289 epoch 8 - iter 176/447 - loss 0.00865783 - time (sec): 28.93 - samples/sec: 1158.04 - lr: 0.000009 - momentum: 0.000000
2023-09-03 21:58:50,704 epoch 8 - iter 220/447 - loss 0.00845649 - time (sec): 36.35 - samples/sec: 1149.70 - lr: 0.000008 - momentum: 0.000000
2023-09-03 21:58:57,699 epoch 8 - iter 264/447 - loss 0.00990314 - time (sec): 43.34 - samples/sec: 1158.31 - lr: 0.000008 - momentum: 0.000000
2023-09-03 21:59:04,899 epoch 8 - iter 308/447 - loss 0.00965520 - time (sec): 50.54 - samples/sec: 1156.44 - lr: 0.000008 - momentum: 0.000000
2023-09-03 21:59:13,335 epoch 8 - iter 352/447 - loss 0.01010932 - time (sec): 58.98 - samples/sec: 1146.74 - lr: 0.000007 - momentum: 0.000000
2023-09-03 21:59:21,564 epoch 8 - iter 396/447 - loss 0.01068112 - time (sec): 67.21 - samples/sec: 1141.36 - lr: 0.000007 - momentum: 0.000000
2023-09-03 21:59:28,510 epoch 8 - iter 440/447 - loss 0.01085444 - time (sec): 74.15 - samples/sec: 1147.24 - lr: 0.000007 - momentum: 0.000000
2023-09-03 21:59:29,803 ----------------------------------------------------------------------------------------------------
2023-09-03 21:59:29,803 EPOCH 8 done: loss 0.0107 - lr: 0.000007
2023-09-03 21:59:42,250 DEV : loss 0.20284989476203918 - f1-score (micro avg) 0.7876
2023-09-03 21:59:42,277 ----------------------------------------------------------------------------------------------------
2023-09-03 21:59:49,240 epoch 9 - iter 44/447 - loss 0.00423893 - time (sec): 6.96 - samples/sec: 1191.34 - lr: 0.000006 - momentum: 0.000000
2023-09-03 21:59:57,008 epoch 9 - iter 88/447 - loss 0.00509356 - time (sec): 14.73 - samples/sec: 1152.58 - lr: 0.000006 - momentum: 0.000000
2023-09-03 22:00:03,624 epoch 9 - iter 132/447 - loss 0.00622740 - time (sec): 21.35 - samples/sec: 1167.49 - lr: 0.000006 - momentum: 0.000000
2023-09-03 22:00:10,902 epoch 9 - iter 176/447 - loss 0.00693402 - time (sec): 28.62 - samples/sec: 1162.07 - lr: 0.000005 - momentum: 0.000000
2023-09-03 22:00:18,736 epoch 9 - iter 220/447 - loss 0.00610985 - time (sec): 36.46 - samples/sec: 1153.72 - lr: 0.000005 - momentum: 0.000000
2023-09-03 22:00:26,275 epoch 9 - iter 264/447 - loss 0.00550257 - time (sec): 44.00 - samples/sec: 1146.63 - lr: 0.000005 - momentum: 0.000000
2023-09-03 22:00:33,172 epoch 9 - iter 308/447 - loss 0.00667283 - time (sec): 50.89 - samples/sec: 1158.83 - lr: 0.000004 - momentum: 0.000000
2023-09-03 22:00:42,548 epoch 9 - iter 352/447 - loss 0.00651762 - time (sec): 60.27 - samples/sec: 1146.43 - lr: 0.000004 - momentum: 0.000000
2023-09-03 22:00:49,803 epoch 9 - iter 396/447 - loss 0.00616704 - time (sec): 67.53 - samples/sec: 1145.39 - lr: 0.000004 - momentum: 0.000000
2023-09-03 22:00:56,849 epoch 9 - iter 440/447 - loss 0.00609811 - time (sec): 74.57 - samples/sec: 1146.04 - lr: 0.000003 - momentum: 0.000000
2023-09-03 22:00:57,776 ----------------------------------------------------------------------------------------------------
2023-09-03 22:00:57,777 EPOCH 9 done: loss 0.0062 - lr: 0.000003
2023-09-03 22:01:10,074 DEV : loss 0.21289774775505066 - f1-score (micro avg) 0.7969
2023-09-03 22:01:10,100 ----------------------------------------------------------------------------------------------------
2023-09-03 22:01:18,803 epoch 10 - iter 44/447 - loss 0.00650982 - time (sec): 8.70 - samples/sec: 1135.98 - lr: 0.000003 - momentum: 0.000000
2023-09-03 22:01:26,909 epoch 10 - iter 88/447 - loss 0.00642166 - time (sec): 16.81 - samples/sec: 1103.46 - lr: 0.000003 - momentum: 0.000000
2023-09-03 22:01:34,164 epoch 10 - iter 132/447 - loss 0.00759233 - time (sec): 24.06 - samples/sec: 1114.27 - lr: 0.000002 - momentum: 0.000000
2023-09-03 22:01:41,073 epoch 10 - iter 176/447 - loss 0.00726370 - time (sec): 30.97 - samples/sec: 1123.88 - lr: 0.000002 - momentum: 0.000000
2023-09-03 22:01:48,439 epoch 10 - iter 220/447 - loss 0.00687022 - time (sec): 38.34 - samples/sec: 1129.50 - lr: 0.000002 - momentum: 0.000000
2023-09-03 22:01:55,158 epoch 10 - iter 264/447 - loss 0.00735313 - time (sec): 45.06 - samples/sec: 1138.72 - lr: 0.000001 - momentum: 0.000000
2023-09-03 22:02:02,331 epoch 10 - iter 308/447 - loss 0.00716427 - time (sec): 52.23 - samples/sec: 1139.16 - lr: 0.000001 - momentum: 0.000000
2023-09-03 22:02:10,045 epoch 10 - iter 352/447 - loss 0.00659423 - time (sec): 59.94 - samples/sec: 1136.17 - lr: 0.000001 - momentum: 0.000000
2023-09-03 22:02:16,825 epoch 10 - iter 396/447 - loss 0.00613042 - time (sec): 66.72 - samples/sec: 1145.61 - lr: 0.000000 - momentum: 0.000000
2023-09-03 22:02:24,335 epoch 10 - iter 440/447 - loss 0.00589795 - time (sec): 74.23 - samples/sec: 1152.23 - lr: 0.000000 - momentum: 0.000000
2023-09-03 22:02:25,349 ----------------------------------------------------------------------------------------------------
2023-09-03 22:02:25,349 EPOCH 10 done: loss 0.0060 - lr: 0.000000
2023-09-03 22:02:38,119 DEV : loss 0.2145152986049652 - f1-score (micro avg) 0.793
2023-09-03 22:02:38,591 ----------------------------------------------------------------------------------------------------
2023-09-03 22:02:38,592 Loading model from best epoch ...
2023-09-03 22:02:40,370 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
2023-09-03 22:02:50,714 Results:
- F-score (micro) 0.7551
- F-score (macro) 0.6917
- Accuracy 0.6265

By class:
              precision    recall  f1-score   support

         loc     0.8289    0.8456    0.8372       596
        pers     0.6693    0.7538    0.7090       333
         org     0.5887    0.5530    0.5703       132
        prod     0.7234    0.5152    0.6018        66
        time     0.7255    0.7551    0.7400        49

   micro avg     0.7461    0.7645    0.7551      1176
   macro avg     0.7072    0.6845    0.6917      1176
weighted avg     0.7466    0.7645    0.7537      1176
2023-09-03 22:02:50,715 ----------------------------------------------------------------------------------------------------
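The summary F-scores in the final evaluation can be re-derived from the per-class table. A minimal sketch, using only the numbers printed in this log (small discrepancies come from the four-decimal rounding of the printed precision/recall values):

```python
# Per-class test metrics copied from the log: (precision, recall, f1, support)
per_class = {
    "loc":  (0.8289, 0.8456, 0.8372, 596),
    "pers": (0.6693, 0.7538, 0.7090, 333),
    "org":  (0.5887, 0.5530, 0.5703, 132),
    "prod": (0.7234, 0.5152, 0.6018,  66),
    "time": (0.7255, 0.7551, 0.7400,  49),
}

# Macro F1: unweighted mean of per-class F1 (logged as 0.6917).
macro_f1 = sum(f1 for _, _, f1, _ in per_class.values()) / len(per_class)

# Weighted F1: per-class F1 weighted by support (logged as 0.7537).
total_support = sum(s for _, _, _, s in per_class.values())
weighted_f1 = sum(f1 * s for _, _, f1, s in per_class.values()) / total_support

# Micro F1: harmonic mean of micro-averaged precision and recall (logged as 0.7551).
p, r = 0.7461, 0.7645
micro_f1 = 2 * p * r / (p + r)

print(f"macro={macro_f1:.4f} weighted={weighted_f1:.4f} micro={micro_f1:.4f}")
```

Note that the micro average, not the macro average, is what the trainer used for best-model selection (`metric: "('micro avg', 'f1-score')"`), so the saved `best-model.pt` is the epoch-7 checkpoint with dev micro-F1 0.7971.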