2023-09-03 19:29:16,528 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,529 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-09-03 19:29:16,529 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,529 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
 - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-09-03 19:29:16,529 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,529 Train: 3575 sentences
2023-09-03 19:29:16,530 (train_with_dev=False, train_with_test=False)
2023-09-03 19:29:16,530 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,530 Training Params:
2023-09-03 19:29:16,530 - learning_rate: "3e-05"
2023-09-03 19:29:16,530 - mini_batch_size: "8"
2023-09-03 19:29:16,530 - max_epochs: "10"
2023-09-03 19:29:16,530 - shuffle: "True"
2023-09-03 19:29:16,530 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,530 Plugins:
2023-09-03 19:29:16,530 - LinearScheduler | warmup_fraction: '0.1'
2023-09-03 19:29:16,530 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,530 Final evaluation on model from best epoch (best-model.pt)
2023-09-03 19:29:16,530 - metric: "('micro avg', 'f1-score')"
2023-09-03 19:29:16,530 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,530 Computation:
2023-09-03 19:29:16,530 - compute on device: cuda:0
2023-09-03 19:29:16,530 - embedding storage: none
2023-09-03 19:29:16,530 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,530 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
2023-09-03 19:29:16,530 ----------------------------------------------------------------------------------------------------
2023-09-03 19:29:16,530 ----------------------------------------------------------------------------------------------------
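Editor's note: the following is a minimal sketch of a Flair fine-tuning script that reproduces the configuration recorded above (batch size 8, 10 epochs, learning rate 3e-05, linear warmup fraction 0.1, "first" subtoken pooling, last transformer layer only, no CRF). It is not taken from this log; the dataset arguments and the hmBERT checkpoint name are assumptions inferred from the base path.

# Sketch only: reconstructs the run configuration recorded in this log.
# Dataset/checkpoint arguments are assumptions inferred from the base path above.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# HIPE-2020 German split of the HIPE-2022 data (assumed argument names)
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de", add_document_separator=True)
label_dict = corpus.make_label_dictionary(label_type="ner")

embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-cased",  # hmBERT base, inferred from the path
    layers="-1",                # "layers-1" in the base path: last layer only
    subtoken_pooling="first",   # "poolingfirst" in the base path
    fine_tune=True,
)

tagger = SequenceTagger(
    hidden_size=256,            # unused here since use_rnn=False; value is an assumption
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,              # "crfFalse" in the base path: plain linear + CrossEntropyLoss head
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2",
    learning_rate=3e-5,
    mini_batch_size=8,
    max_epochs=10,              # linear warmup schedule (warmup_fraction 0.1) matches the Plugins section above
)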
2023-09-03 19:29:23,448 epoch 1 - iter 44/447 - loss 3.19007209 - time (sec): 6.92 - samples/sec: 1152.59 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:29:30,626 epoch 1 - iter 88/447 - loss 2.53442556 - time (sec): 14.09 - samples/sec: 1130.73 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:29:38,072 epoch 1 - iter 132/447 - loss 1.80646532 - time (sec): 21.54 - samples/sec: 1146.61 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:29:45,084 epoch 1 - iter 176/447 - loss 1.48740869 - time (sec): 28.55 - samples/sec: 1142.99 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:29:52,496 epoch 1 - iter 220/447 - loss 1.25877196 - time (sec): 35.96 - samples/sec: 1147.28 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:30:01,853 epoch 1 - iter 264/447 - loss 1.07980864 - time (sec): 45.32 - samples/sec: 1138.90 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:30:09,492 epoch 1 - iter 308/447 - loss 0.97748992 - time (sec): 52.96 - samples/sec: 1125.74 - lr: 0.000021 - momentum: 0.000000
2023-09-03 19:30:16,350 epoch 1 - iter 352/447 - loss 0.88943186 - time (sec): 59.82 - samples/sec: 1137.59 - lr: 0.000024 - momentum: 0.000000
2023-09-03 19:30:23,843 epoch 1 - iter 396/447 - loss 0.82223418 - time (sec): 67.31 - samples/sec: 1135.66 - lr: 0.000027 - momentum: 0.000000
2023-09-03 19:30:30,904 epoch 1 - iter 440/447 - loss 0.76505808 - time (sec): 74.37 - samples/sec: 1138.13 - lr: 0.000029 - momentum: 0.000000
2023-09-03 19:30:32,376 ----------------------------------------------------------------------------------------------------
2023-09-03 19:30:32,376 EPOCH 1 done: loss 0.7533 - lr: 0.000029
2023-09-03 19:30:42,710 DEV : loss 0.2009868174791336 - f1-score (micro avg) 0.55
2023-09-03 19:30:42,736 saving best model
2023-09-03 19:30:43,196 ----------------------------------------------------------------------------------------------------
2023-09-03 19:30:50,362 epoch 2 - iter 44/447 - loss 0.24012949 - time (sec): 7.16 - samples/sec: 1189.44 - lr: 0.000030 - momentum: 0.000000
2023-09-03 19:30:57,854 epoch 2 - iter 88/447 - loss 0.22293035 - time (sec): 14.66 - samples/sec: 1151.20 - lr: 0.000029 - momentum: 0.000000
2023-09-03 19:31:04,563 epoch 2 - iter 132/447 - loss 0.20286926 - time (sec): 21.37 - samples/sec: 1170.63 - lr: 0.000029 - momentum: 0.000000
2023-09-03 19:31:11,883 epoch 2 - iter 176/447 - loss 0.19655009 - time (sec): 28.69 - samples/sec: 1177.57 - lr: 0.000029 - momentum: 0.000000
2023-09-03 19:31:18,584 epoch 2 - iter 220/447 - loss 0.18797064 - time (sec): 35.39 - samples/sec: 1174.86 - lr: 0.000028 - momentum: 0.000000
2023-09-03 19:31:26,678 epoch 2 - iter 264/447 - loss 0.18041665 - time (sec): 43.48 - samples/sec: 1170.98 - lr: 0.000028 - momentum: 0.000000
2023-09-03 19:31:33,651 epoch 2 - iter 308/447 - loss 0.17434352 - time (sec): 50.45 - samples/sec: 1174.62 - lr: 0.000028 - momentum: 0.000000
2023-09-03 19:31:41,504 epoch 2 - iter 352/447 - loss 0.17006885 - time (sec): 58.31 - samples/sec: 1176.33 - lr: 0.000027 - momentum: 0.000000
2023-09-03 19:31:49,385 epoch 2 - iter 396/447 - loss 0.16869020 - time (sec): 66.19 - samples/sec: 1162.87 - lr: 0.000027 - momentum: 0.000000
2023-09-03 19:31:56,494 epoch 2 - iter 440/447 - loss 0.16625334 - time (sec): 73.30 - samples/sec: 1162.89 - lr: 0.000027 - momentum: 0.000000
2023-09-03 19:31:57,490 ----------------------------------------------------------------------------------------------------
2023-09-03 19:31:57,490 EPOCH 2 done: loss 0.1657 - lr: 0.000027
2023-09-03 19:32:10,118 DEV : loss 0.12543398141860962 - f1-score (micro avg) 0.6997
2023-09-03 19:32:10,144 saving best model
2023-09-03 19:32:11,458 ----------------------------------------------------------------------------------------------------
2023-09-03 19:32:19,222 epoch 3 - iter 44/447 - loss 0.09271356 - time (sec): 7.76 - samples/sec: 1097.05 - lr: 0.000026 - momentum: 0.000000
2023-09-03 19:32:27,280 epoch 3 - iter 88/447 - loss 0.08576467 - time (sec): 15.82 - samples/sec: 1144.61 - lr: 0.000026 - momentum: 0.000000
2023-09-03 19:32:35,115 epoch 3 - iter 132/447 - loss 0.08869787 - time (sec): 23.65 - samples/sec: 1151.56 - lr: 0.000026 - momentum: 0.000000
2023-09-03 19:32:42,719 epoch 3 - iter 176/447 - loss 0.08093044 - time (sec): 31.26 - samples/sec: 1156.06 - lr: 0.000025 - momentum: 0.000000
2023-09-03 19:32:50,458 epoch 3 - iter 220/447 - loss 0.09036291 - time (sec): 39.00 - samples/sec: 1154.21 - lr: 0.000025 - momentum: 0.000000
2023-09-03 19:32:57,257 epoch 3 - iter 264/447 - loss 0.09249644 - time (sec): 45.80 - samples/sec: 1148.73 - lr: 0.000025 - momentum: 0.000000
2023-09-03 19:33:03,973 epoch 3 - iter 308/447 - loss 0.08893785 - time (sec): 52.51 - samples/sec: 1158.07 - lr: 0.000024 - momentum: 0.000000
2023-09-03 19:33:10,650 epoch 3 - iter 352/447 - loss 0.08855651 - time (sec): 59.19 - samples/sec: 1162.75 - lr: 0.000024 - momentum: 0.000000
2023-09-03 19:33:18,009 epoch 3 - iter 396/447 - loss 0.08761760 - time (sec): 66.55 - samples/sec: 1158.54 - lr: 0.000024 - momentum: 0.000000
2023-09-03 19:33:24,853 epoch 3 - iter 440/447 - loss 0.08937895 - time (sec): 73.39 - samples/sec: 1161.78 - lr: 0.000023 - momentum: 0.000000
2023-09-03 19:33:25,879 ----------------------------------------------------------------------------------------------------
2023-09-03 19:33:25,879 EPOCH 3 done: loss 0.0896 - lr: 0.000023
2023-09-03 19:33:38,507 DEV : loss 0.11516160517930984 - f1-score (micro avg) 0.7475
2023-09-03 19:33:38,533 saving best model
2023-09-03 19:33:39,852 ----------------------------------------------------------------------------------------------------
2023-09-03 19:33:46,198 epoch 4 - iter 44/447 - loss 0.05182455 - time (sec): 6.34 - samples/sec: 1190.63 - lr: 0.000023 - momentum: 0.000000
2023-09-03 19:33:54,167 epoch 4 - iter 88/447 - loss 0.04665842 - time (sec): 14.31 - samples/sec: 1175.05 - lr: 0.000023 - momentum: 0.000000
2023-09-03 19:34:01,223 epoch 4 - iter 132/447 - loss 0.05436150 - time (sec): 21.37 - samples/sec: 1173.70 - lr: 0.000022 - momentum: 0.000000
2023-09-03 19:34:08,302 epoch 4 - iter 176/447 - loss 0.05361792 - time (sec): 28.45 - samples/sec: 1182.21 - lr: 0.000022 - momentum: 0.000000
2023-09-03 19:34:14,821 epoch 4 - iter 220/447 - loss 0.05472694 - time (sec): 34.97 - samples/sec: 1174.09 - lr: 0.000022 - momentum: 0.000000
2023-09-03 19:34:23,348 epoch 4 - iter 264/447 - loss 0.05132701 - time (sec): 43.50 - samples/sec: 1172.89 - lr: 0.000021 - momentum: 0.000000
2023-09-03 19:34:31,640 epoch 4 - iter 308/447 - loss 0.05098314 - time (sec): 51.79 - samples/sec: 1154.36 - lr: 0.000021 - momentum: 0.000000
2023-09-03 19:34:38,448 epoch 4 - iter 352/447 - loss 0.05076573 - time (sec): 58.60 - samples/sec: 1157.75 - lr: 0.000021 - momentum: 0.000000
2023-09-03 19:34:46,013 epoch 4 - iter 396/447 - loss 0.04989549 - time (sec): 66.16 - samples/sec: 1165.38 - lr: 0.000020 - momentum: 0.000000
2023-09-03 19:34:52,977 epoch 4 - iter 440/447 - loss 0.04980707 - time (sec): 73.12 - samples/sec: 1167.43 - lr: 0.000020 - momentum: 0.000000
2023-09-03 19:34:53,997 ----------------------------------------------------------------------------------------------------
2023-09-03 19:34:53,997 EPOCH 4 done: loss 0.0495 - lr: 0.000020
2023-09-03 19:35:06,658 DEV : loss 0.14562876522541046 - f1-score (micro avg) 0.7768
2023-09-03 19:35:06,684 saving best model
2023-09-03 19:35:08,033 ----------------------------------------------------------------------------------------------------
2023-09-03 19:35:15,237 epoch 5 - iter 44/447 - loss 0.04532975 - time (sec): 7.20 - samples/sec: 1122.51 - lr: 0.000020 - momentum: 0.000000
2023-09-03 19:35:22,154 epoch 5 - iter 88/447 - loss 0.03636320 - time (sec): 14.12 - samples/sec: 1125.32 - lr: 0.000019 - momentum: 0.000000
2023-09-03 19:35:29,738 epoch 5 - iter 132/447 - loss 0.03369699 - time (sec): 21.70 - samples/sec: 1129.49 - lr: 0.000019 - momentum: 0.000000
2023-09-03 19:35:36,874 epoch 5 - iter 176/447 - loss 0.03420293 - time (sec): 28.84 - samples/sec: 1132.14 - lr: 0.000019 - momentum: 0.000000
2023-09-03 19:35:45,113 epoch 5 - iter 220/447 - loss 0.03139251 - time (sec): 37.08 - samples/sec: 1146.34 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:35:51,745 epoch 5 - iter 264/447 - loss 0.03132039 - time (sec): 43.71 - samples/sec: 1162.95 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:35:59,757 epoch 5 - iter 308/447 - loss 0.03075044 - time (sec): 51.72 - samples/sec: 1155.52 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:36:08,207 epoch 5 - iter 352/447 - loss 0.03108055 - time (sec): 60.17 - samples/sec: 1147.45 - lr: 0.000017 - momentum: 0.000000
2023-09-03 19:36:15,447 epoch 5 - iter 396/447 - loss 0.03125927 - time (sec): 67.41 - samples/sec: 1153.15 - lr: 0.000017 - momentum: 0.000000
2023-09-03 19:36:21,846 epoch 5 - iter 440/447 - loss 0.03113833 - time (sec): 73.81 - samples/sec: 1154.74 - lr: 0.000017 - momentum: 0.000000
2023-09-03 19:36:22,896 ----------------------------------------------------------------------------------------------------
2023-09-03 19:36:22,896 EPOCH 5 done: loss 0.0309 - lr: 0.000017
2023-09-03 19:36:35,956 DEV : loss 0.16441383957862854 - f1-score (micro avg) 0.7662
2023-09-03 19:36:35,983 ----------------------------------------------------------------------------------------------------
2023-09-03 19:36:43,439 epoch 6 - iter 44/447 - loss 0.02358977 - time (sec): 7.45 - samples/sec: 1151.83 - lr: 0.000016 - momentum: 0.000000
2023-09-03 19:36:50,732 epoch 6 - iter 88/447 - loss 0.02454260 - time (sec): 14.75 - samples/sec: 1137.84 - lr: 0.000016 - momentum: 0.000000
2023-09-03 19:36:57,815 epoch 6 - iter 132/447 - loss 0.02258539 - time (sec): 21.83 - samples/sec: 1136.93 - lr: 0.000016 - momentum: 0.000000
2023-09-03 19:37:05,179 epoch 6 - iter 176/447 - loss 0.02114354 - time (sec): 29.19 - samples/sec: 1136.33 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:37:13,071 epoch 6 - iter 220/447 - loss 0.02077686 - time (sec): 37.09 - samples/sec: 1122.22 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:37:20,210 epoch 6 - iter 264/447 - loss 0.02015972 - time (sec): 44.23 - samples/sec: 1129.70 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:37:27,081 epoch 6 - iter 308/447 - loss 0.02051455 - time (sec): 51.10 - samples/sec: 1131.68 - lr: 0.000014 - momentum: 0.000000
2023-09-03 19:37:34,883 epoch 6 - iter 352/447 - loss 0.02170789 - time (sec): 58.90 - samples/sec: 1127.99 - lr: 0.000014 - momentum: 0.000000
2023-09-03 19:37:43,150 epoch 6 - iter 396/447 - loss 0.02174907 - time (sec): 67.17 - samples/sec: 1117.14 - lr: 0.000014 - momentum: 0.000000
2023-09-03 19:37:52,285 epoch 6 - iter 440/447 - loss 0.02109897 - time (sec): 76.30 - samples/sec: 1114.19 - lr: 0.000013 - momentum: 0.000000
2023-09-03 19:37:53,648 ----------------------------------------------------------------------------------------------------
2023-09-03 19:37:53,648 EPOCH 6 done: loss 0.0209 - lr: 0.000013
2023-09-03 19:38:07,077 DEV : loss 0.1834675371646881 - f1-score (micro avg) 0.7753
2023-09-03 19:38:07,104 ----------------------------------------------------------------------------------------------------
2023-09-03 19:38:14,678 epoch 7 - iter 44/447 - loss 0.01475716 - time (sec): 7.57 - samples/sec: 1136.15 - lr: 0.000013 - momentum: 0.000000
2023-09-03 19:38:22,372 epoch 7 - iter 88/447 - loss 0.01538235 - time (sec): 15.27 - samples/sec: 1119.97 - lr: 0.000013 - momentum: 0.000000
2023-09-03 19:38:29,614 epoch 7 - iter 132/447 - loss 0.01281823 - time (sec): 22.51 - samples/sec: 1160.39 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:38:37,545 epoch 7 - iter 176/447 - loss 0.01584858 - time (sec): 30.44 - samples/sec: 1142.64 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:38:45,043 epoch 7 - iter 220/447 - loss 0.01466873 - time (sec): 37.94 - samples/sec: 1125.94 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:38:52,900 epoch 7 - iter 264/447 - loss 0.01357460 - time (sec): 45.79 - samples/sec: 1122.50 - lr: 0.000011 - momentum: 0.000000
2023-09-03 19:39:00,331 epoch 7 - iter 308/447 - loss 0.01390915 - time (sec): 53.23 - samples/sec: 1116.71 - lr: 0.000011 - momentum: 0.000000
2023-09-03 19:39:07,962 epoch 7 - iter 352/447 - loss 0.01346994 - time (sec): 60.86 - samples/sec: 1115.27 - lr: 0.000011 - momentum: 0.000000
2023-09-03 19:39:15,066 epoch 7 - iter 396/447 - loss 0.01405975 - time (sec): 67.96 - samples/sec: 1110.49 - lr: 0.000010 - momentum: 0.000000
2023-09-03 19:39:23,740 epoch 7 - iter 440/447 - loss 0.01354116 - time (sec): 76.63 - samples/sec: 1104.22 - lr: 0.000010 - momentum: 0.000000
2023-09-03 19:39:25,807 ----------------------------------------------------------------------------------------------------
2023-09-03 19:39:25,807 EPOCH 7 done: loss 0.0137 - lr: 0.000010
2023-09-03 19:39:38,952 DEV : loss 0.19323676824569702 - f1-score (micro avg) 0.7833
2023-09-03 19:39:38,979 saving best model
2023-09-03 19:39:40,323 ----------------------------------------------------------------------------------------------------
2023-09-03 19:39:47,391 epoch 8 - iter 44/447 - loss 0.00788687 - time (sec): 7.07 - samples/sec: 1181.25 - lr: 0.000010 - momentum: 0.000000
2023-09-03 19:39:57,517 epoch 8 - iter 88/447 - loss 0.00844500 - time (sec): 17.19 - samples/sec: 1038.49 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:40:05,024 epoch 8 - iter 132/447 - loss 0.00996625 - time (sec): 24.70 - samples/sec: 1056.67 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:40:12,354 epoch 8 - iter 176/447 - loss 0.00894889 - time (sec): 32.03 - samples/sec: 1078.01 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:40:19,563 epoch 8 - iter 220/447 - loss 0.00890931 - time (sec): 39.24 - samples/sec: 1080.80 - lr: 0.000008 - momentum: 0.000000
2023-09-03 19:40:28,021 epoch 8 - iter 264/447 - loss 0.00897967 - time (sec): 47.70 - samples/sec: 1073.69 - lr: 0.000008 - momentum: 0.000000
2023-09-03 19:40:35,717 epoch 8 - iter 308/447 - loss 0.00944563 - time (sec): 55.39 - samples/sec: 1084.63 - lr: 0.000008 - momentum: 0.000000
2023-09-03 19:40:43,176 epoch 8 - iter 352/447 - loss 0.01053706 - time (sec): 62.85 - samples/sec: 1085.21 - lr: 0.000007 - momentum: 0.000000
2023-09-03 19:40:50,859 epoch 8 - iter 396/447 - loss 0.01097928 - time (sec): 70.53 - samples/sec: 1088.35 - lr: 0.000007 - momentum: 0.000000
2023-09-03 19:40:58,497 epoch 8 - iter 440/447 - loss 0.01105219 - time (sec): 78.17 - samples/sec: 1090.57 - lr: 0.000007 - momentum: 0.000000
2023-09-03 19:40:59,655 ----------------------------------------------------------------------------------------------------
2023-09-03 19:40:59,656 EPOCH 8 done: loss 0.0111 - lr: 0.000007
2023-09-03 19:41:12,783 DEV : loss 0.21089980006217957 - f1-score (micro avg) 0.7903
2023-09-03 19:41:12,811 saving best model
2023-09-03 19:41:14,132 ----------------------------------------------------------------------------------------------------
2023-09-03 19:41:21,736 epoch 9 - iter 44/447 - loss 0.00372544 - time (sec): 7.60 - samples/sec: 1125.59 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:41:28,680 epoch 9 - iter 88/447 - loss 0.00446647 - time (sec): 14.55 - samples/sec: 1156.86 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:41:36,360 epoch 9 - iter 132/447 - loss 0.00581485 - time (sec): 22.23 - samples/sec: 1126.69 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:41:43,841 epoch 9 - iter 176/447 - loss 0.00668476 - time (sec): 29.71 - samples/sec: 1133.31 - lr: 0.000005 - momentum: 0.000000
2023-09-03 19:41:53,294 epoch 9 - iter 220/447 - loss 0.00663822 - time (sec): 39.16 - samples/sec: 1106.30 - lr: 0.000005 - momentum: 0.000000
2023-09-03 19:42:00,743 epoch 9 - iter 264/447 - loss 0.00613799 - time (sec): 46.61 - samples/sec: 1110.05 - lr: 0.000005 - momentum: 0.000000
2023-09-03 19:42:08,618 epoch 9 - iter 308/447 - loss 0.00629927 - time (sec): 54.49 - samples/sec: 1097.29 - lr: 0.000004 - momentum: 0.000000
2023-09-03 19:42:16,780 epoch 9 - iter 352/447 - loss 0.00623439 - time (sec): 62.65 - samples/sec: 1097.60 - lr: 0.000004 - momentum: 0.000000
2023-09-03 19:42:23,837 epoch 9 - iter 396/447 - loss 0.00632364 - time (sec): 69.70 - samples/sec: 1099.34 - lr: 0.000004 - momentum: 0.000000
2023-09-03 19:42:31,296 epoch 9 - iter 440/447 - loss 0.00680502 - time (sec): 77.16 - samples/sec: 1102.51 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:42:33,153 ----------------------------------------------------------------------------------------------------
2023-09-03 19:42:33,153 EPOCH 9 done: loss 0.0069 - lr: 0.000003
2023-09-03 19:42:46,375 DEV : loss 0.2204572707414627 - f1-score (micro avg) 0.7901
2023-09-03 19:42:46,401 ----------------------------------------------------------------------------------------------------
2023-09-03 19:42:54,639 epoch 10 - iter 44/447 - loss 0.00107852 - time (sec): 8.24 - samples/sec: 1110.82 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:43:01,983 epoch 10 - iter 88/447 - loss 0.00398631 - time (sec): 15.58 - samples/sec: 1101.53 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:43:09,675 epoch 10 - iter 132/447 - loss 0.00549052 - time (sec): 23.27 - samples/sec: 1089.55 - lr: 0.000002 - momentum: 0.000000
2023-09-03 19:43:18,909 epoch 10 - iter 176/447 - loss 0.00432581 - time (sec): 32.51 - samples/sec: 1084.96 - lr: 0.000002 - momentum: 0.000000
2023-09-03 19:43:26,150 epoch 10 - iter 220/447 - loss 0.00450331 - time (sec): 39.75 - samples/sec: 1094.80 - lr: 0.000002 - momentum: 0.000000
2023-09-03 19:43:33,036 epoch 10 - iter 264/447 - loss 0.00464585 - time (sec): 46.63 - samples/sec: 1110.24 - lr: 0.000001 - momentum: 0.000000
2023-09-03 19:43:40,136 epoch 10 - iter 308/447 - loss 0.00514209 - time (sec): 53.73 - samples/sec: 1108.01 - lr: 0.000001 - momentum: 0.000000
2023-09-03 19:43:48,593 epoch 10 - iter 352/447 - loss 0.00509009 - time (sec): 62.19 - samples/sec: 1096.76 - lr: 0.000001 - momentum: 0.000000
2023-09-03 19:43:56,066 epoch 10 - iter 396/447 - loss 0.00495633 - time (sec): 69.66 - samples/sec: 1095.69 - lr: 0.000000 - momentum: 0.000000
2023-09-03 19:44:04,386 epoch 10 - iter 440/447 - loss 0.00505250 - time (sec): 77.98 - samples/sec: 1095.08 - lr: 0.000000 - momentum: 0.000000
2023-09-03 19:44:05,515 ----------------------------------------------------------------------------------------------------
2023-09-03 19:44:05,515 EPOCH 10 done: loss 0.0050 - lr: 0.000000
2023-09-03 19:44:18,984 DEV : loss 0.22151651978492737 - f1-score (micro avg) 0.7885
2023-09-03 19:44:19,474 ----------------------------------------------------------------------------------------------------
2023-09-03 19:44:19,476 Loading model from best epoch ...
2023-09-03 19:44:21,229 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
2023-09-03 19:44:31,874
Results:
- F-score (micro) 0.7482
- F-score (macro) 0.6664
- Accuracy 0.6188

By class:
              precision    recall  f1-score   support

         loc     0.8476    0.8490    0.8483       596
        pers     0.6667    0.7508    0.7062       333
         org     0.5038    0.5076    0.5057       132
        prod     0.6800    0.5152    0.5862        66
        time     0.6429    0.7347    0.6857        49

   micro avg     0.7374    0.7594    0.7482      1176
   macro avg     0.6682    0.6714    0.6664      1176
weighted avg     0.7398    0.7594    0.7481      1176

2023-09-03 19:44:31,874 ----------------------------------------------------------------------------------------------------
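Editor's note: as a usage sketch, the best-model.pt checkpoint written during this run can be loaded back into Flair for inference. The checkpoint path is taken from the "Model training base path" logged above; the example sentence is hypothetical.

# Sketch only: load the checkpoint from this run and tag one sentence.
from flair.data import Sentence
from flair.models import SequenceTagger

# path assumed from the "Model training base path" above
tagger = SequenceTagger.load(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2/best-model.pt"
)

sentence = Sentence("Der Gemeinderat von Zürich tagte am Montag.")  # hypothetical example sentence
tagger.predict(sentence)

# spans are decoded from the BIOES tag set listed above (loc, pers, org, prod, time)
for span in sentence.get_spans("ner"):
    print(span.text, span.get_label("ner").value, span.get_label("ner").score)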