2023-09-03 21:29:12,311 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,312 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-09-03 21:29:12,312 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,312 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
 - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
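The module summary above fully determines the size of the tagger. As an illustrative sanity check (not part of the original log), the printed layer shapes can be tallied into a trainable-parameter count:

```python
# Tally the parameters of the SequenceTagger printed in the log above.
# Every size below is read directly from the module summary; the script
# is an independent sanity check, not output of the training run.

def linear(n_in, n_out, bias=True):
    return n_in * n_out + (n_out if bias else 0)

def layer_norm(dim):
    return 2 * dim  # weight + bias

hidden, ffn, vocab, max_pos = 768, 3072, 32001, 512

embeddings = (
    vocab * hidden        # word_embeddings: Embedding(32001, 768)
    + max_pos * hidden    # position_embeddings: Embedding(512, 768)
    + 2 * hidden          # token_type_embeddings: Embedding(2, 768)
    + layer_norm(hidden)
)

per_layer = (
    4 * linear(hidden, hidden)  # query, key, value, attention output dense
    + layer_norm(hidden)        # attention output LayerNorm
    + linear(hidden, ffn)       # intermediate dense
    + linear(ffn, hidden)       # output dense
    + layer_norm(hidden)        # output LayerNorm
)

pooler = linear(hidden, hidden)
classifier = linear(hidden, 21)  # final (linear) head over the 21 tags

total = embeddings + 12 * per_layer + pooler + classifier
print(total)  # -> 110634261, i.e. ~110.6M trainable parameters
```

Dropout and LockedDropout layers contribute no parameters, so they are omitted from the tally.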
2023-09-03 21:29:12,312 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,312 Train: 3575 sentences
2023-09-03 21:29:12,312 (train_with_dev=False, train_with_test=False)
2023-09-03 21:29:12,312 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,312 Training Params:
2023-09-03 21:29:12,312  - learning_rate: "5e-05"
2023-09-03 21:29:12,312  - mini_batch_size: "4"
2023-09-03 21:29:12,312  - max_epochs: "10"
2023-09-03 21:29:12,312  - shuffle: "True"
2023-09-03 21:29:12,312 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,313 Plugins:
2023-09-03 21:29:12,313  - LinearScheduler | warmup_fraction: '0.1'
2023-09-03 21:29:12,313 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,313 Final evaluation on model from best epoch (best-model.pt)
2023-09-03 21:29:12,313  - metric: "('micro avg', 'f1-score')"
2023-09-03 21:29:12,313 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,313 Computation:
2023-09-03 21:29:12,313  - compute on device: cuda:0
2023-09-03 21:29:12,313  - embedding storage: none
2023-09-03 21:29:12,313 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,313 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
2023-09-03 21:29:12,313 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:12,313 ----------------------------------------------------------------------------------------------------
2023-09-03 21:29:21,281 epoch 1 - iter 89/894 - loss 2.37573519 - time (sec): 8.97 - samples/sec: 935.65 - lr: 0.000005 - momentum: 0.000000
2023-09-03 21:29:30,672 epoch 1 - iter 178/894 - loss 1.43775852 - time (sec): 18.36 - samples/sec: 938.37 - lr: 0.000010 - momentum: 0.000000
2023-09-03 21:29:39,576 epoch 1 - iter 267/894 - loss 1.11155203 - time (sec): 27.26 - samples/sec: 925.17 - lr: 0.000015 - momentum: 0.000000
2023-09-03 21:29:49,321 epoch 1 - iter 356/894 - loss 0.88678663 - time (sec): 37.01 - samples/sec: 943.20 - lr: 0.000020 - momentum: 0.000000
2023-09-03 21:29:58,795 epoch 1 - iter 445/894 - loss 0.76216363 - time (sec): 46.48 - samples/sec: 939.28 - lr: 0.000025 - momentum: 0.000000
2023-09-03 21:30:08,426 epoch 1 - iter 534/894 - loss 0.67923090 - time (sec): 56.11 - samples/sec: 937.36 - lr: 0.000030 - momentum: 0.000000
2023-09-03 21:30:17,343 epoch 1 - iter 623/894 - loss 0.62299608 - time (sec): 65.03 - samples/sec: 932.84 - lr: 0.000035 - momentum: 0.000000
2023-09-03 21:30:26,620 epoch 1 - iter 712/894 - loss 0.58434755 - time (sec): 74.31 - samples/sec: 925.34 - lr: 0.000040 - momentum: 0.000000
2023-09-03 21:30:36,376 epoch 1 - iter 801/894 - loss 0.54340389 - time (sec): 84.06 - samples/sec: 927.55 - lr: 0.000045 - momentum: 0.000000
2023-09-03 21:30:45,270 epoch 1 - iter 890/894 - loss 0.51372191 - time (sec): 92.96 - samples/sec: 927.81 - lr: 0.000050 - momentum: 0.000000
2023-09-03 21:30:45,630 ----------------------------------------------------------------------------------------------------
2023-09-03 21:30:45,631 EPOCH 1 done: loss 0.5132 - lr: 0.000050
2023-09-03 21:30:56,853 DEV : loss 0.1689286082983017 - f1-score (micro avg) 0.6159
2023-09-03 21:30:56,880 saving best model
2023-09-03 21:30:57,363 ----------------------------------------------------------------------------------------------------
2023-09-03 21:31:06,371 epoch 2 - iter 89/894 - loss 0.21469836 - time (sec): 9.01 - samples/sec: 913.74 - lr: 0.000049 - momentum: 0.000000
2023-09-03 21:31:15,545 epoch 2 - iter 178/894 - loss 0.19766265 - time (sec): 18.18 - samples/sec: 927.20 - lr: 0.000049 - momentum: 0.000000
2023-09-03 21:31:24,622 epoch 2 - iter 267/894 - loss 0.18306157 - time (sec): 27.26 - samples/sec: 921.12 - lr: 0.000048 - momentum: 0.000000
2023-09-03 21:31:34,328 epoch 2 - iter 356/894 - loss 0.17429021 - time (sec): 36.96 - samples/sec: 911.61 - lr: 0.000048 - momentum: 0.000000
2023-09-03 21:31:43,318 epoch 2 - iter 445/894 - loss 0.17695779 - time (sec): 45.95 - samples/sec: 912.36 - lr: 0.000047 - momentum: 0.000000
2023-09-03 21:31:52,520 epoch 2 - iter 534/894 - loss 0.17167934 - time (sec): 55.16 - samples/sec: 910.02 - lr: 0.000047 - momentum: 0.000000
2023-09-03 21:32:02,104 epoch 2 - iter 623/894 - loss 0.16552044 - time (sec): 64.74 - samples/sec: 914.13 - lr: 0.000046 - momentum: 0.000000
2023-09-03 21:32:11,624 epoch 2 - iter 712/894 - loss 0.16532424 - time (sec): 74.26 - samples/sec: 914.87 - lr: 0.000046 - momentum: 0.000000
2023-09-03 21:32:21,057 epoch 2 - iter 801/894 - loss 0.16436528 - time (sec): 83.69 - samples/sec: 926.36 - lr: 0.000045 - momentum: 0.000000
2023-09-03 21:32:30,345 epoch 2 - iter 890/894 - loss 0.16458228 - time (sec): 92.98 - samples/sec: 927.28 - lr: 0.000044 - momentum: 0.000000
2023-09-03 21:32:30,726 ----------------------------------------------------------------------------------------------------
2023-09-03 21:32:30,726 EPOCH 2 done: loss 0.1645 - lr: 0.000044
2023-09-03 21:32:44,212 DEV : loss 0.17144104838371277 - f1-score (micro avg) 0.6691
2023-09-03 21:32:44,238 saving best model
2023-09-03 21:32:45,558 ----------------------------------------------------------------------------------------------------
2023-09-03 21:32:54,539 epoch 3 - iter 89/894 - loss 0.09817251 - time (sec): 8.98 - samples/sec: 912.52 - lr: 0.000044 - momentum: 0.000000
2023-09-03 21:33:03,449 epoch 3 - iter 178/894 - loss 0.09055944 - time (sec): 17.89 - samples/sec: 898.89 - lr: 0.000043 - momentum: 0.000000
2023-09-03 21:33:13,015 epoch 3 - iter 267/894 - loss 0.09533582 - time (sec): 27.46 - samples/sec: 911.87 - lr: 0.000043 - momentum: 0.000000
2023-09-03 21:33:22,020 epoch 3 - iter 356/894 - loss 0.10463513 - time (sec): 36.46 - samples/sec: 917.81 - lr: 0.000042 - momentum: 0.000000
2023-09-03 21:33:31,026 epoch 3 - iter 445/894 - loss 0.10568464 - time (sec): 45.47 - samples/sec: 913.54 - lr: 0.000042 - momentum: 0.000000
2023-09-03 21:33:40,255 epoch 3 - iter 534/894 - loss 0.10337665 - time (sec): 54.70 - samples/sec: 920.17 - lr: 0.000041 - momentum: 0.000000
2023-09-03 21:33:49,512 epoch 3 - iter 623/894 - loss 0.09850664 - time (sec): 63.95 - samples/sec: 918.61 - lr: 0.000041 - momentum: 0.000000
2023-09-03 21:33:59,018 epoch 3 - iter 712/894 - loss 0.09784361 - time (sec): 73.46 - samples/sec: 916.66 - lr: 0.000040 - momentum: 0.000000
2023-09-03 21:34:08,480 epoch 3 - iter 801/894 - loss 0.09360190 - time (sec): 82.92 - samples/sec: 920.71 - lr: 0.000039 - momentum: 0.000000
2023-09-03 21:34:18,615 epoch 3 - iter 890/894 - loss 0.09635431 - time (sec): 93.05 - samples/sec: 925.96 - lr: 0.000039 - momentum: 0.000000
2023-09-03 21:34:19,037 ----------------------------------------------------------------------------------------------------
2023-09-03 21:34:19,037 EPOCH 3 done: loss 0.0961 - lr: 0.000039
2023-09-03 21:34:32,624 DEV : loss 0.184891939163208 - f1-score (micro avg) 0.6806
2023-09-03 21:34:32,650 saving best model
2023-09-03 21:34:33,968 ----------------------------------------------------------------------------------------------------
2023-09-03 21:34:43,458 epoch 4 - iter 89/894 - loss 0.07799970 - time (sec): 9.49 - samples/sec: 876.00 - lr: 0.000038 - momentum: 0.000000
2023-09-03 21:34:52,551 epoch 4 - iter 178/894 - loss 0.06755105 - time (sec): 18.58 - samples/sec: 900.10 - lr: 0.000038 - momentum: 0.000000
2023-09-03 21:35:01,492 epoch 4 - iter 267/894 - loss 0.06069600 - time (sec): 27.52 - samples/sec: 911.39 - lr: 0.000037 - momentum: 0.000000
2023-09-03 21:35:11,552 epoch 4 - iter 356/894 - loss 0.06416203 - time (sec): 37.58 - samples/sec: 938.16 - lr: 0.000037 - momentum: 0.000000
2023-09-03 21:35:20,461 epoch 4 - iter 445/894 - loss 0.06358131 - time (sec): 46.49 - samples/sec: 935.92 - lr: 0.000036 - momentum: 0.000000
2023-09-03 21:35:29,741 epoch 4 - iter 534/894 - loss 0.06641721 - time (sec): 55.77 - samples/sec: 935.32 - lr: 0.000036 - momentum: 0.000000
2023-09-03 21:35:38,782 epoch 4 - iter 623/894 - loss 0.06367007 - time (sec): 64.81 - samples/sec: 939.43 - lr: 0.000035 - momentum: 0.000000
2023-09-03 21:35:47,600 epoch 4 - iter 712/894 - loss 0.06216577 - time (sec): 73.63 - samples/sec: 939.53 - lr: 0.000034 - momentum: 0.000000
2023-09-03 21:35:57,451 epoch 4 - iter 801/894 - loss 0.06304291 - time (sec): 83.48 - samples/sec: 938.26 - lr: 0.000034 - momentum: 0.000000
2023-09-03 21:36:06,188 epoch 4 - iter 890/894 - loss 0.06540605 - time (sec): 92.22 - samples/sec: 935.29 - lr: 0.000033 - momentum: 0.000000
2023-09-03 21:36:06,552 ----------------------------------------------------------------------------------------------------
2023-09-03 21:36:06,552 EPOCH 4 done: loss 0.0653 - lr: 0.000033
2023-09-03 21:36:19,272 DEV : loss 0.2018679976463318 - f1-score (micro avg) 0.7237
2023-09-03 21:36:19,298 saving best model
2023-09-03 21:36:20,629 ----------------------------------------------------------------------------------------------------
2023-09-03 21:36:29,669 epoch 5 - iter 89/894 - loss 0.06341849 - time (sec): 9.04 - samples/sec: 945.59 - lr: 0.000033 - momentum: 0.000000
2023-09-03 21:36:38,304 epoch 5 - iter 178/894 - loss 0.05096189 - time (sec): 17.67 - samples/sec: 947.13 - lr: 0.000032 - momentum: 0.000000
2023-09-03 21:36:47,080 epoch 5 - iter 267/894 - loss 0.04933270 - time (sec): 26.45 - samples/sec: 956.65 - lr: 0.000032 - momentum: 0.000000
2023-09-03 21:36:55,787 epoch 5 - iter 356/894 - loss 0.04822761 - time (sec): 35.16 - samples/sec: 947.00 - lr: 0.000031 - momentum: 0.000000
2023-09-03 21:37:05,496 epoch 5 - iter 445/894 - loss 0.04917756 - time (sec): 44.87 - samples/sec: 949.21 - lr: 0.000031 - momentum: 0.000000
2023-09-03 21:37:14,334 epoch 5 - iter 534/894 - loss 0.04999502 - time (sec): 53.70 - samples/sec: 946.49 - lr: 0.000030 - momentum: 0.000000
2023-09-03 21:37:22,917 epoch 5 - iter 623/894 - loss 0.04679459 - time (sec): 62.29 - samples/sec: 944.92 - lr: 0.000029 - momentum: 0.000000
2023-09-03 21:37:32,733 epoch 5 - iter 712/894 - loss 0.05112854 - time (sec): 72.10 - samples/sec: 956.88 - lr: 0.000029 - momentum: 0.000000
2023-09-03 21:37:41,829 epoch 5 - iter 801/894 - loss 0.04918769 - time (sec): 81.20 - samples/sec: 961.63 - lr: 0.000028 - momentum: 0.000000
2023-09-03 21:37:50,744 epoch 5 - iter 890/894 - loss 0.04937617 - time (sec): 90.11 - samples/sec: 957.25 - lr: 0.000028 - momentum: 0.000000
2023-09-03 21:37:51,109 ----------------------------------------------------------------------------------------------------
2023-09-03 21:37:51,109 EPOCH 5 done: loss 0.0493 - lr: 0.000028
2023-09-03 21:38:03,742 DEV : loss 0.22591951489448547 - f1-score (micro avg) 0.7599
2023-09-03 21:38:03,772 saving best model
2023-09-03 21:38:05,084 ----------------------------------------------------------------------------------------------------
2023-09-03 21:38:14,901 epoch 6 - iter 89/894 - loss 0.03093745 - time (sec): 9.82 - samples/sec: 989.27 - lr: 0.000027 - momentum: 0.000000
2023-09-03 21:38:23,655 epoch 6 - iter 178/894 - loss 0.02538333 - time (sec): 18.57 - samples/sec: 971.10 - lr: 0.000027 - momentum: 0.000000
2023-09-03 21:38:32,849 epoch 6 - iter 267/894 - loss 0.02745734 - time (sec): 27.76 - samples/sec: 969.05 - lr: 0.000026 - momentum: 0.000000
2023-09-03 21:38:41,966 epoch 6 - iter 356/894 - loss 0.02870489 - time (sec): 36.88 - samples/sec: 972.17 - lr: 0.000026 - momentum: 0.000000
2023-09-03 21:38:50,897 epoch 6 - iter 445/894 - loss 0.02870487 - time (sec): 45.81 - samples/sec: 977.49 - lr: 0.000025 - momentum: 0.000000
2023-09-03 21:38:59,601 epoch 6 - iter 534/894 - loss 0.03025865 - time (sec): 54.52 - samples/sec: 968.70 - lr: 0.000024 - momentum: 0.000000
2023-09-03 21:39:08,517 epoch 6 - iter 623/894 - loss 0.02891946 - time (sec): 63.43 - samples/sec: 958.04 - lr: 0.000024 - momentum: 0.000000
2023-09-03 21:39:17,249 epoch 6 - iter 712/894 - loss 0.03124647 - time (sec): 72.16 - samples/sec: 954.38 - lr: 0.000023 - momentum: 0.000000
2023-09-03 21:39:26,268 epoch 6 - iter 801/894 - loss 0.03206784 - time (sec): 81.18 - samples/sec: 956.96 - lr: 0.000023 - momentum: 0.000000
2023-09-03 21:39:35,360 epoch 6 - iter 890/894 - loss 0.03264358 - time (sec): 90.27 - samples/sec: 955.85 - lr: 0.000022 - momentum: 0.000000
2023-09-03 21:39:35,701 ----------------------------------------------------------------------------------------------------
2023-09-03 21:39:35,701 EPOCH 6 done: loss 0.0326 - lr: 0.000022
2023-09-03 21:39:48,613 DEV : loss 0.2312653362751007 - f1-score (micro avg) 0.7403
2023-09-03 21:39:48,640 ----------------------------------------------------------------------------------------------------
2023-09-03 21:39:58,053 epoch 7 - iter 89/894 - loss 0.02565388 - time (sec): 9.41 - samples/sec: 991.53 - lr: 0.000022 - momentum: 0.000000
2023-09-03 21:40:08,304 epoch 7 - iter 178/894 - loss 0.02241187 - time (sec): 19.66 - samples/sec: 971.81 - lr: 0.000021 - momentum: 0.000000
2023-09-03 21:40:17,537 epoch 7 - iter 267/894 - loss 0.02101239 - time (sec): 28.90 - samples/sec: 967.66 - lr: 0.000021 - momentum: 0.000000
2023-09-03 21:40:26,817 epoch 7 - iter 356/894 - loss 0.02145189 - time (sec): 38.18 - samples/sec: 977.31 - lr: 0.000020 - momentum: 0.000000
2023-09-03 21:40:36,306 epoch 7 - iter 445/894 - loss 0.02018220 - time (sec): 47.66 - samples/sec: 965.86 - lr: 0.000019 - momentum: 0.000000
2023-09-03 21:40:45,201 epoch 7 - iter 534/894 - loss 0.01918311 - time (sec): 56.56 - samples/sec: 953.90 - lr: 0.000019 - momentum: 0.000000
2023-09-03 21:40:53,682 epoch 7 - iter 623/894 - loss 0.01939565 - time (sec): 65.04 - samples/sec: 946.67 - lr: 0.000018 - momentum: 0.000000
2023-09-03 21:41:02,868 epoch 7 - iter 712/894 - loss 0.01938977 - time (sec): 74.23 - samples/sec: 942.98 - lr: 0.000018 - momentum: 0.000000
2023-09-03 21:41:11,585 epoch 7 - iter 801/894 - loss 0.01854526 - time (sec): 82.94 - samples/sec: 940.82 - lr: 0.000017 - momentum: 0.000000
2023-09-03 21:41:20,690 epoch 7 - iter 890/894 - loss 0.01808458 - time (sec): 92.05 - samples/sec: 935.55 - lr: 0.000017 - momentum: 0.000000
2023-09-03 21:41:21,109 ----------------------------------------------------------------------------------------------------
2023-09-03 21:41:21,109 EPOCH 7 done: loss 0.0180 - lr: 0.000017
2023-09-03 21:41:34,685 DEV : loss 0.27939727902412415 - f1-score (micro avg) 0.7561
2023-09-03 21:41:34,712 ----------------------------------------------------------------------------------------------------
2023-09-03 21:41:43,958 epoch 8 - iter 89/894 - loss 0.01939659 - time (sec): 9.24 - samples/sec: 945.57 - lr: 0.000016 - momentum: 0.000000
2023-09-03 21:41:53,013 epoch 8 - iter 178/894 - loss 0.01689374 - time (sec): 18.30 - samples/sec: 924.46 - lr: 0.000016 - momentum: 0.000000
2023-09-03 21:42:03,422 epoch 8 - iter 267/894 - loss 0.01403347 - time (sec): 28.71 - samples/sec: 942.21 - lr: 0.000015 - momentum: 0.000000
2023-09-03 21:42:12,750 epoch 8 - iter 356/894 - loss 0.01720869 - time (sec): 38.04 - samples/sec: 933.60 - lr: 0.000014 - momentum: 0.000000
2023-09-03 21:42:21,689 epoch 8 - iter 445/894 - loss 0.01536338 - time (sec): 46.98 - samples/sec: 937.88 - lr: 0.000014 - momentum: 0.000000
2023-09-03 21:42:31,262 epoch 8 - iter 534/894 - loss 0.01551848 - time (sec): 56.55 - samples/sec: 936.72 - lr: 0.000013 - momentum: 0.000000
2023-09-03 21:42:40,238 epoch 8 - iter 623/894 - loss 0.01450492 - time (sec): 65.52 - samples/sec: 936.97 - lr: 0.000013 - momentum: 0.000000
2023-09-03 21:42:49,068 epoch 8 - iter 712/894 - loss 0.01507741 - time (sec): 74.35 - samples/sec: 936.71 - lr: 0.000012 - momentum: 0.000000
2023-09-03 21:42:58,036 epoch 8 - iter 801/894 - loss 0.01430409 - time (sec): 83.32 - samples/sec: 941.84 - lr: 0.000012 - momentum: 0.000000
2023-09-03 21:43:06,614 epoch 8 - iter 890/894 - loss 0.01440714 - time (sec): 91.90 - samples/sec: 938.57 - lr: 0.000011 - momentum: 0.000000
2023-09-03 21:43:06,967 ----------------------------------------------------------------------------------------------------
2023-09-03 21:43:06,967 EPOCH 8 done: loss 0.0144 - lr: 0.000011
2023-09-03 21:43:19,717 DEV : loss 0.25227266550064087 - f1-score (micro avg) 0.7538
2023-09-03 21:43:19,743 ----------------------------------------------------------------------------------------------------
2023-09-03 21:43:28,846 epoch 9 - iter 89/894 - loss 0.00701860 - time (sec): 9.10 - samples/sec: 904.91 - lr: 0.000011 - momentum: 0.000000
2023-09-03 21:43:38,734 epoch 9 - iter 178/894 - loss 0.00405271 - time (sec): 18.99 - samples/sec: 932.90 - lr: 0.000010 - momentum: 0.000000
2023-09-03 21:43:47,869 epoch 9 - iter 267/894 - loss 0.00458330 - time (sec): 28.12 - samples/sec: 934.72 - lr: 0.000009 - momentum: 0.000000
2023-09-03 21:43:56,759 epoch 9 - iter 356/894 - loss 0.00756307 - time (sec): 37.01 - samples/sec: 938.88 - lr: 0.000009 - momentum: 0.000000
2023-09-03 21:44:05,409 epoch 9 - iter 445/894 - loss 0.00714667 - time (sec): 45.66 - samples/sec: 947.67 - lr: 0.000008 - momentum: 0.000000
2023-09-03 21:44:14,213 epoch 9 - iter 534/894 - loss 0.00708796 - time (sec): 54.47 - samples/sec: 946.20 - lr: 0.000008 - momentum: 0.000000
2023-09-03 21:44:22,932 epoch 9 - iter 623/894 - loss 0.00691866 - time (sec): 63.19 - samples/sec: 943.72 - lr: 0.000007 - momentum: 0.000000
2023-09-03 21:44:31,848 epoch 9 - iter 712/894 - loss 0.00610120 - time (sec): 72.10 - samples/sec: 947.24 - lr: 0.000007 - momentum: 0.000000
2023-09-03 21:44:41,509 epoch 9 - iter 801/894 - loss 0.00735687 - time (sec): 81.76 - samples/sec: 953.29 - lr: 0.000006 - momentum: 0.000000
2023-09-03 21:44:50,521 epoch 9 - iter 890/894 - loss 0.00763608 - time (sec): 90.78 - samples/sec: 949.23 - lr: 0.000006 - momentum: 0.000000
2023-09-03 21:44:50,930 ----------------------------------------------------------------------------------------------------
2023-09-03 21:44:50,931 EPOCH 9 done: loss 0.0079 - lr: 0.000006
2023-09-03 21:45:03,963 DEV : loss 0.2669627368450165 - f1-score (micro avg) 0.7631
2023-09-03 21:45:03,990 saving best model
2023-09-03 21:45:05,380 ----------------------------------------------------------------------------------------------------
2023-09-03 21:45:14,802 epoch 10 - iter 89/894 - loss 0.00553400 - time (sec): 9.42 - samples/sec: 971.28 - lr: 0.000005 - momentum: 0.000000
2023-09-03 21:45:23,626 epoch 10 - iter 178/894 - loss 0.00348736 - time (sec): 18.24 - samples/sec: 956.13 - lr: 0.000004 - momentum: 0.000000
2023-09-03 21:45:32,657 epoch 10 - iter 267/894 - loss 0.00270854 - time (sec): 27.28 - samples/sec: 962.01 - lr: 0.000004 - momentum: 0.000000
2023-09-03 21:45:41,872 epoch 10 - iter 356/894 - loss 0.00366180 - time (sec): 36.49 - samples/sec: 958.01 - lr: 0.000003 - momentum: 0.000000
2023-09-03 21:45:50,539 epoch 10 - iter 445/894 - loss 0.00392369 - time (sec): 45.16 - samples/sec: 953.26 - lr: 0.000003 - momentum: 0.000000
2023-09-03 21:45:59,537 epoch 10 - iter 534/894 - loss 0.00402871 - time (sec): 54.16 - samples/sec: 947.15 - lr: 0.000002 - momentum: 0.000000
2023-09-03 21:46:08,775 epoch 10 - iter 623/894 - loss 0.00360604 - time (sec): 63.39 - samples/sec: 951.43 - lr: 0.000002 - momentum: 0.000000
2023-09-03 21:46:19,239 epoch 10 - iter 712/894 - loss 0.00331703 - time (sec): 73.86 - samples/sec: 953.08 - lr: 0.000001 - momentum: 0.000000
2023-09-03 21:46:28,192 epoch 10 - iter 801/894 - loss 0.00357190 - time (sec): 82.81 - samples/sec: 944.85 - lr: 0.000001 - momentum: 0.000000
2023-09-03 21:46:36,938 epoch 10 - iter 890/894 - loss 0.00345665 - time (sec): 91.56 - samples/sec: 940.90 - lr: 0.000000 - momentum: 0.000000
2023-09-03 21:46:37,328 ----------------------------------------------------------------------------------------------------
2023-09-03 21:46:37,328 EPOCH 10 done: loss 0.0034 - lr: 0.000000
2023-09-03 21:46:50,655 DEV : loss 0.27164772152900696 - f1-score (micro avg) 0.7586
2023-09-03 21:46:51,154 ----------------------------------------------------------------------------------------------------
2023-09-03 21:46:51,155 Loading model from best epoch ...
2023-09-03 21:46:52,931 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
2023-09-03 21:47:03,568 Results:
- F-score (micro) 0.7364
- F-score (macro) 0.6707
- Accuracy 0.6043

By class:
              precision    recall  f1-score   support

         loc     0.8197    0.8238    0.8218       596
        pers     0.6509    0.7447    0.6947       333
         org     0.5833    0.4242    0.4912       132
        prod     0.6250    0.5303    0.5738        66
        time     0.7500    0.7959    0.7723        49

   micro avg     0.7340    0.7389    0.7364      1176
   macro avg     0.6858    0.6638    0.6707      1176
weighted avg     0.7315    0.7389    0.7327      1176

2023-09-03 21:47:03,568 ----------------------------------------------------------------------------------------------------
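The averaged rows of the final classification report follow from the per-class rows. A minimal sketch (f1-scores and supports copied from the log, which remains the authoritative source; small last-digit differences arise because the report averages unrounded scores):

```python
# Recompute the "macro avg" and "weighted avg" f1 rows of the final
# test report from the per-class f1-scores and supports in the log.

per_class = {  # label: (f1-score, support), copied from the report above
    "loc":  (0.8218, 596),
    "pers": (0.6947, 333),
    "org":  (0.4912, 132),
    "prod": (0.5738, 66),
    "time": (0.7723, 49),
}

total_support = sum(n for _, n in per_class.values())  # 1176 test entities

# Macro: unweighted mean over classes; weighted: support-weighted mean.
macro_f1 = sum(f for f, _ in per_class.values()) / len(per_class)
weighted_f1 = sum(f * n for f, n in per_class.values()) / total_support

print(f"{macro_f1:.4f}")     # ~0.6708, vs. reported macro F-score 0.6707
print(f"{weighted_f1:.4f}")  # ~0.7327, matching the reported weighted avg
```

Micro avg cannot be recomputed from this table alone, since it needs the raw TP/FP/FN counts rather than per-class scores.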