2023-09-03 19:10:58,280 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:58,281 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-09-03 19:10:58,281 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:58,281 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
- NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-09-03 19:10:58,281 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:58,281 Train: 3575 sentences
2023-09-03 19:10:58,281 (train_with_dev=False, train_with_test=False)
2023-09-03 19:10:58,281 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:58,281 Training Params:
2023-09-03 19:10:58,281 - learning_rate: "5e-05"
2023-09-03 19:10:58,282 - mini_batch_size: "4"
2023-09-03 19:10:58,282 - max_epochs: "10"
2023-09-03 19:10:58,282 - shuffle: "True"
2023-09-03 19:10:58,282 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:58,282 Plugins:
2023-09-03 19:10:58,282 - LinearScheduler | warmup_fraction: '0.1'
2023-09-03 19:10:58,282 ----------------------------------------------------------------------------------------------------
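Note on the LinearScheduler plugin: it warms the learning rate up over the first warmup_fraction of all optimizer steps and then decays it linearly to zero, which is exactly what the lr column in the iteration lines below shows (a rise to 5e-05 across epoch 1, then a straight-line decay to 0). A minimal pure-Python sketch of that schedule (illustrative, not Flair's implementation; the 894 iterations per epoch follow from 3575 train sentences at mini_batch_size 4):

```python
import math

# Values taken from this log; the schedule function is an illustrative sketch.
PEAK_LR = 5e-05
EPOCHS = 10
ITERS_PER_EPOCH = math.ceil(3575 / 4)        # 894 mini-batches per epoch
TOTAL_STEPS = EPOCHS * ITERS_PER_EPOCH       # 8940 optimizer steps
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)        # warmup_fraction '0.1' -> 894 steps

def linear_schedule_lr(step: int) -> float:
    """Linear warmup to PEAK_LR, then linear decay to zero."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

# Spot-checks against the lr values printed below:
# epoch 1, iter 89 -> ~0.000005; end of warmup -> 0.000050;
# end of epoch 9 (step 8046) -> ~0.000006; last step -> 0.000000
```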
2023-09-03 19:10:58,282 Final evaluation on model from best epoch (best-model.pt)
2023-09-03 19:10:58,282 - metric: "('micro avg', 'f1-score')"
2023-09-03 19:10:58,282 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:58,282 Computation:
2023-09-03 19:10:58,282 - compute on device: cuda:0
2023-09-03 19:10:58,282 - embedding storage: none
2023-09-03 19:10:58,282 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:58,282 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
2023-09-03 19:10:58,282 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:58,282 ----------------------------------------------------------------------------------------------------
2023-09-03 19:11:08,149 epoch 1 - iter 89/894 - loss 2.69238806 - time (sec): 9.87 - samples/sec: 969.46 - lr: 0.000005 - momentum: 0.000000
2023-09-03 19:11:16,982 epoch 1 - iter 178/894 - loss 1.72783195 - time (sec): 18.70 - samples/sec: 938.61 - lr: 0.000010 - momentum: 0.000000
2023-09-03 19:11:25,549 epoch 1 - iter 267/894 - loss 1.32723397 - time (sec): 27.27 - samples/sec: 927.78 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:11:34,550 epoch 1 - iter 356/894 - loss 1.08297083 - time (sec): 36.27 - samples/sec: 933.52 - lr: 0.000020 - momentum: 0.000000
2023-09-03 19:11:43,678 epoch 1 - iter 445/894 - loss 0.93516935 - time (sec): 45.39 - samples/sec: 925.57 - lr: 0.000025 - momentum: 0.000000
2023-09-03 19:11:52,963 epoch 1 - iter 534/894 - loss 0.82076707 - time (sec): 54.68 - samples/sec: 931.70 - lr: 0.000030 - momentum: 0.000000
2023-09-03 19:12:02,152 epoch 1 - iter 623/894 - loss 0.73887451 - time (sec): 63.87 - samples/sec: 932.18 - lr: 0.000035 - momentum: 0.000000
2023-09-03 19:12:12,078 epoch 1 - iter 712/894 - loss 0.66806596 - time (sec): 73.79 - samples/sec: 936.82 - lr: 0.000040 - momentum: 0.000000
2023-09-03 19:12:20,908 epoch 1 - iter 801/894 - loss 0.62298748 - time (sec): 82.62 - samples/sec: 933.34 - lr: 0.000045 - momentum: 0.000000
2023-09-03 19:12:30,535 epoch 1 - iter 890/894 - loss 0.58400475 - time (sec): 92.25 - samples/sec: 933.31 - lr: 0.000050 - momentum: 0.000000
2023-09-03 19:12:31,038 ----------------------------------------------------------------------------------------------------
2023-09-03 19:12:31,038 EPOCH 1 done: loss 0.5821 - lr: 0.000050
2023-09-03 19:12:42,208 DEV : loss 0.18371683359146118 - f1-score (micro avg) 0.5825
2023-09-03 19:12:42,234 saving best model
2023-09-03 19:12:42,727 ----------------------------------------------------------------------------------------------------
2023-09-03 19:12:51,709 epoch 2 - iter 89/894 - loss 0.17046008 - time (sec): 8.98 - samples/sec: 962.12 - lr: 0.000049 - momentum: 0.000000
2023-09-03 19:13:00,759 epoch 2 - iter 178/894 - loss 0.19669504 - time (sec): 18.03 - samples/sec: 957.33 - lr: 0.000049 - momentum: 0.000000
2023-09-03 19:13:09,665 epoch 2 - iter 267/894 - loss 0.18605037 - time (sec): 26.94 - samples/sec: 963.49 - lr: 0.000048 - momentum: 0.000000
2023-09-03 19:13:18,786 epoch 2 - iter 356/894 - loss 0.18077337 - time (sec): 36.06 - samples/sec: 952.93 - lr: 0.000048 - momentum: 0.000000
2023-09-03 19:13:27,532 epoch 2 - iter 445/894 - loss 0.17533928 - time (sec): 44.80 - samples/sec: 946.11 - lr: 0.000047 - momentum: 0.000000
2023-09-03 19:13:36,721 epoch 2 - iter 534/894 - loss 0.16613548 - time (sec): 53.99 - samples/sec: 947.52 - lr: 0.000047 - momentum: 0.000000
2023-09-03 19:13:45,883 epoch 2 - iter 623/894 - loss 0.16384940 - time (sec): 63.16 - samples/sec: 955.99 - lr: 0.000046 - momentum: 0.000000
2023-09-03 19:13:54,746 epoch 2 - iter 712/894 - loss 0.16395552 - time (sec): 72.02 - samples/sec: 953.51 - lr: 0.000046 - momentum: 0.000000
2023-09-03 19:14:03,506 epoch 2 - iter 801/894 - loss 0.16536964 - time (sec): 80.78 - samples/sec: 950.74 - lr: 0.000045 - momentum: 0.000000
2023-09-03 19:14:13,066 epoch 2 - iter 890/894 - loss 0.16119508 - time (sec): 90.34 - samples/sec: 954.72 - lr: 0.000044 - momentum: 0.000000
2023-09-03 19:14:13,434 ----------------------------------------------------------------------------------------------------
2023-09-03 19:14:13,434 EPOCH 2 done: loss 0.1608 - lr: 0.000044
2023-09-03 19:14:26,157 DEV : loss 0.15039657056331635 - f1-score (micro avg) 0.7048
2023-09-03 19:14:26,183 saving best model
2023-09-03 19:14:27,527 ----------------------------------------------------------------------------------------------------
2023-09-03 19:14:36,028 epoch 3 - iter 89/894 - loss 0.11340688 - time (sec): 8.50 - samples/sec: 918.99 - lr: 0.000044 - momentum: 0.000000
2023-09-03 19:14:44,712 epoch 3 - iter 178/894 - loss 0.09592060 - time (sec): 17.18 - samples/sec: 941.66 - lr: 0.000043 - momentum: 0.000000
2023-09-03 19:14:53,577 epoch 3 - iter 267/894 - loss 0.09646760 - time (sec): 26.05 - samples/sec: 935.25 - lr: 0.000043 - momentum: 0.000000
2023-09-03 19:15:02,836 epoch 3 - iter 356/894 - loss 0.08755957 - time (sec): 35.31 - samples/sec: 941.36 - lr: 0.000042 - momentum: 0.000000
2023-09-03 19:15:12,599 epoch 3 - iter 445/894 - loss 0.09112142 - time (sec): 45.07 - samples/sec: 941.98 - lr: 0.000042 - momentum: 0.000000
2023-09-03 19:15:21,491 epoch 3 - iter 534/894 - loss 0.08801368 - time (sec): 53.96 - samples/sec: 951.86 - lr: 0.000041 - momentum: 0.000000
2023-09-03 19:15:30,675 epoch 3 - iter 623/894 - loss 0.09094375 - time (sec): 63.15 - samples/sec: 949.84 - lr: 0.000041 - momentum: 0.000000
2023-09-03 19:15:39,685 epoch 3 - iter 712/894 - loss 0.09156094 - time (sec): 72.16 - samples/sec: 949.51 - lr: 0.000040 - momentum: 0.000000
2023-09-03 19:15:48,448 epoch 3 - iter 801/894 - loss 0.09217268 - time (sec): 80.92 - samples/sec: 951.03 - lr: 0.000039 - momentum: 0.000000
2023-09-03 19:15:58,235 epoch 3 - iter 890/894 - loss 0.09239416 - time (sec): 90.71 - samples/sec: 950.84 - lr: 0.000039 - momentum: 0.000000
2023-09-03 19:15:58,594 ----------------------------------------------------------------------------------------------------
2023-09-03 19:15:58,595 EPOCH 3 done: loss 0.0922 - lr: 0.000039
2023-09-03 19:16:11,494 DEV : loss 0.16789782047271729 - f1-score (micro avg) 0.6876
2023-09-03 19:16:11,520 ----------------------------------------------------------------------------------------------------
2023-09-03 19:16:20,482 epoch 4 - iter 89/894 - loss 0.07502495 - time (sec): 8.96 - samples/sec: 1006.44 - lr: 0.000038 - momentum: 0.000000
2023-09-03 19:16:29,307 epoch 4 - iter 178/894 - loss 0.06363791 - time (sec): 17.79 - samples/sec: 964.85 - lr: 0.000038 - momentum: 0.000000
2023-09-03 19:16:38,803 epoch 4 - iter 267/894 - loss 0.06701115 - time (sec): 27.28 - samples/sec: 963.11 - lr: 0.000037 - momentum: 0.000000
2023-09-03 19:16:48,680 epoch 4 - iter 356/894 - loss 0.06452749 - time (sec): 37.16 - samples/sec: 972.68 - lr: 0.000037 - momentum: 0.000000
2023-09-03 19:16:58,142 epoch 4 - iter 445/894 - loss 0.05918610 - time (sec): 46.62 - samples/sec: 961.53 - lr: 0.000036 - momentum: 0.000000
2023-09-03 19:17:07,462 epoch 4 - iter 534/894 - loss 0.06098012 - time (sec): 55.94 - samples/sec: 957.32 - lr: 0.000036 - momentum: 0.000000
2023-09-03 19:17:16,307 epoch 4 - iter 623/894 - loss 0.06022659 - time (sec): 64.79 - samples/sec: 954.35 - lr: 0.000035 - momentum: 0.000000
2023-09-03 19:17:25,355 epoch 4 - iter 712/894 - loss 0.06200534 - time (sec): 73.83 - samples/sec: 951.40 - lr: 0.000034 - momentum: 0.000000
2023-09-03 19:17:33,874 epoch 4 - iter 801/894 - loss 0.06028832 - time (sec): 82.35 - samples/sec: 942.67 - lr: 0.000034 - momentum: 0.000000
2023-09-03 19:17:43,366 epoch 4 - iter 890/894 - loss 0.05867760 - time (sec): 91.84 - samples/sec: 938.71 - lr: 0.000033 - momentum: 0.000000
2023-09-03 19:17:43,740 ----------------------------------------------------------------------------------------------------
2023-09-03 19:17:43,740 EPOCH 4 done: loss 0.0588 - lr: 0.000033
2023-09-03 19:17:57,223 DEV : loss 0.20298035442829132 - f1-score (micro avg) 0.7452
2023-09-03 19:17:57,249 saving best model
2023-09-03 19:17:58,587 ----------------------------------------------------------------------------------------------------
2023-09-03 19:18:08,986 epoch 5 - iter 89/894 - loss 0.05671696 - time (sec): 10.40 - samples/sec: 934.69 - lr: 0.000033 - momentum: 0.000000
2023-09-03 19:18:18,053 epoch 5 - iter 178/894 - loss 0.04365569 - time (sec): 19.46 - samples/sec: 912.17 - lr: 0.000032 - momentum: 0.000000
2023-09-03 19:18:27,552 epoch 5 - iter 267/894 - loss 0.04035798 - time (sec): 28.96 - samples/sec: 918.00 - lr: 0.000032 - momentum: 0.000000
2023-09-03 19:18:36,505 epoch 5 - iter 356/894 - loss 0.03970231 - time (sec): 37.92 - samples/sec: 914.97 - lr: 0.000031 - momentum: 0.000000
2023-09-03 19:18:46,170 epoch 5 - iter 445/894 - loss 0.03784230 - time (sec): 47.58 - samples/sec: 918.91 - lr: 0.000031 - momentum: 0.000000
2023-09-03 19:18:55,397 epoch 5 - iter 534/894 - loss 0.03987657 - time (sec): 56.81 - samples/sec: 923.90 - lr: 0.000030 - momentum: 0.000000
2023-09-03 19:19:04,547 epoch 5 - iter 623/894 - loss 0.03994062 - time (sec): 65.96 - samples/sec: 920.22 - lr: 0.000029 - momentum: 0.000000
2023-09-03 19:19:13,832 epoch 5 - iter 712/894 - loss 0.04061015 - time (sec): 75.24 - samples/sec: 923.28 - lr: 0.000029 - momentum: 0.000000
2023-09-03 19:19:22,963 epoch 5 - iter 801/894 - loss 0.04065690 - time (sec): 84.37 - samples/sec: 921.46 - lr: 0.000028 - momentum: 0.000000
2023-09-03 19:19:31,888 epoch 5 - iter 890/894 - loss 0.04197579 - time (sec): 93.30 - samples/sec: 923.92 - lr: 0.000028 - momentum: 0.000000
2023-09-03 19:19:32,322 ----------------------------------------------------------------------------------------------------
2023-09-03 19:19:32,322 EPOCH 5 done: loss 0.0420 - lr: 0.000028
2023-09-03 19:19:45,787 DEV : loss 0.21861010789871216 - f1-score (micro avg) 0.7381
2023-09-03 19:19:45,813 ----------------------------------------------------------------------------------------------------
2023-09-03 19:19:55,164 epoch 6 - iter 89/894 - loss 0.02420486 - time (sec): 9.35 - samples/sec: 926.99 - lr: 0.000027 - momentum: 0.000000
2023-09-03 19:20:04,086 epoch 6 - iter 178/894 - loss 0.03490990 - time (sec): 18.27 - samples/sec: 896.54 - lr: 0.000027 - momentum: 0.000000
2023-09-03 19:20:13,846 epoch 6 - iter 267/894 - loss 0.02886084 - time (sec): 28.03 - samples/sec: 911.47 - lr: 0.000026 - momentum: 0.000000
2023-09-03 19:20:23,022 epoch 6 - iter 356/894 - loss 0.03059653 - time (sec): 37.21 - samples/sec: 925.61 - lr: 0.000026 - momentum: 0.000000
2023-09-03 19:20:31,670 epoch 6 - iter 445/894 - loss 0.03090125 - time (sec): 45.86 - samples/sec: 916.17 - lr: 0.000025 - momentum: 0.000000
2023-09-03 19:20:40,713 epoch 6 - iter 534/894 - loss 0.03055680 - time (sec): 54.90 - samples/sec: 915.92 - lr: 0.000024 - momentum: 0.000000
2023-09-03 19:20:49,597 epoch 6 - iter 623/894 - loss 0.03149645 - time (sec): 63.78 - samples/sec: 911.66 - lr: 0.000024 - momentum: 0.000000
2023-09-03 19:20:59,900 epoch 6 - iter 712/894 - loss 0.03005318 - time (sec): 74.09 - samples/sec: 922.66 - lr: 0.000023 - momentum: 0.000000
2023-09-03 19:21:08,996 epoch 6 - iter 801/894 - loss 0.02957260 - time (sec): 83.18 - samples/sec: 924.36 - lr: 0.000023 - momentum: 0.000000
2023-09-03 19:21:18,642 epoch 6 - iter 890/894 - loss 0.02924329 - time (sec): 92.83 - samples/sec: 927.66 - lr: 0.000022 - momentum: 0.000000
2023-09-03 19:21:19,039 ----------------------------------------------------------------------------------------------------
2023-09-03 19:21:19,039 EPOCH 6 done: loss 0.0291 - lr: 0.000022
2023-09-03 19:21:32,508 DEV : loss 0.24210010468959808 - f1-score (micro avg) 0.7598
2023-09-03 19:21:32,535 saving best model
2023-09-03 19:21:33,870 ----------------------------------------------------------------------------------------------------
2023-09-03 19:21:42,847 epoch 7 - iter 89/894 - loss 0.01754516 - time (sec): 8.98 - samples/sec: 979.14 - lr: 0.000022 - momentum: 0.000000
2023-09-03 19:21:51,750 epoch 7 - iter 178/894 - loss 0.02284649 - time (sec): 17.88 - samples/sec: 956.82 - lr: 0.000021 - momentum: 0.000000
2023-09-03 19:22:02,425 epoch 7 - iter 267/894 - loss 0.02108914 - time (sec): 28.55 - samples/sec: 954.57 - lr: 0.000021 - momentum: 0.000000
2023-09-03 19:22:11,584 epoch 7 - iter 356/894 - loss 0.02139335 - time (sec): 37.71 - samples/sec: 944.72 - lr: 0.000020 - momentum: 0.000000
2023-09-03 19:22:20,794 epoch 7 - iter 445/894 - loss 0.02142739 - time (sec): 46.92 - samples/sec: 946.25 - lr: 0.000019 - momentum: 0.000000
2023-09-03 19:22:29,777 epoch 7 - iter 534/894 - loss 0.02059029 - time (sec): 55.91 - samples/sec: 940.92 - lr: 0.000019 - momentum: 0.000000
2023-09-03 19:22:38,940 epoch 7 - iter 623/894 - loss 0.01995416 - time (sec): 65.07 - samples/sec: 936.79 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:22:48,055 epoch 7 - iter 712/894 - loss 0.01981773 - time (sec): 74.18 - samples/sec: 932.64 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:22:56,949 epoch 7 - iter 801/894 - loss 0.02006500 - time (sec): 83.08 - samples/sec: 927.80 - lr: 0.000017 - momentum: 0.000000
2023-09-03 19:23:06,568 epoch 7 - iter 890/894 - loss 0.01936137 - time (sec): 92.70 - samples/sec: 930.70 - lr: 0.000017 - momentum: 0.000000
2023-09-03 19:23:06,944 ----------------------------------------------------------------------------------------------------
2023-09-03 19:23:06,944 EPOCH 7 done: loss 0.0193 - lr: 0.000017
2023-09-03 19:23:20,362 DEV : loss 0.242599755525589 - f1-score (micro avg) 0.7696
2023-09-03 19:23:20,388 saving best model
2023-09-03 19:23:21,700 ----------------------------------------------------------------------------------------------------
2023-09-03 19:23:31,009 epoch 8 - iter 89/894 - loss 0.01621017 - time (sec): 9.31 - samples/sec: 931.18 - lr: 0.000016 - momentum: 0.000000
2023-09-03 19:23:40,722 epoch 8 - iter 178/894 - loss 0.01277269 - time (sec): 19.02 - samples/sec: 934.74 - lr: 0.000016 - momentum: 0.000000
2023-09-03 19:23:49,975 epoch 8 - iter 267/894 - loss 0.01041762 - time (sec): 28.27 - samples/sec: 950.73 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:23:59,828 epoch 8 - iter 356/894 - loss 0.01011581 - time (sec): 38.13 - samples/sec: 956.09 - lr: 0.000014 - momentum: 0.000000
2023-09-03 19:24:08,579 epoch 8 - iter 445/894 - loss 0.01175983 - time (sec): 46.88 - samples/sec: 941.76 - lr: 0.000014 - momentum: 0.000000
2023-09-03 19:24:17,840 epoch 8 - iter 534/894 - loss 0.01291973 - time (sec): 56.14 - samples/sec: 937.53 - lr: 0.000013 - momentum: 0.000000
2023-09-03 19:24:27,048 epoch 8 - iter 623/894 - loss 0.01208747 - time (sec): 65.35 - samples/sec: 943.69 - lr: 0.000013 - momentum: 0.000000
2023-09-03 19:24:35,924 epoch 8 - iter 712/894 - loss 0.01226839 - time (sec): 74.22 - samples/sec: 942.20 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:24:44,759 epoch 8 - iter 801/894 - loss 0.01213190 - time (sec): 83.06 - samples/sec: 940.26 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:24:53,805 epoch 8 - iter 890/894 - loss 0.01190405 - time (sec): 92.10 - samples/sec: 935.67 - lr: 0.000011 - momentum: 0.000000
2023-09-03 19:24:54,212 ----------------------------------------------------------------------------------------------------
2023-09-03 19:24:54,212 EPOCH 8 done: loss 0.0119 - lr: 0.000011
2023-09-03 19:25:07,680 DEV : loss 0.2613238990306854 - f1-score (micro avg) 0.774
2023-09-03 19:25:07,707 saving best model
2023-09-03 19:25:09,069 ----------------------------------------------------------------------------------------------------
2023-09-03 19:25:17,987 epoch 9 - iter 89/894 - loss 0.00997870 - time (sec): 8.92 - samples/sec: 927.34 - lr: 0.000011 - momentum: 0.000000
2023-09-03 19:25:27,459 epoch 9 - iter 178/894 - loss 0.00632446 - time (sec): 18.39 - samples/sec: 955.83 - lr: 0.000010 - momentum: 0.000000
2023-09-03 19:25:36,811 epoch 9 - iter 267/894 - loss 0.00637963 - time (sec): 27.74 - samples/sec: 933.36 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:25:46,415 epoch 9 - iter 356/894 - loss 0.00558079 - time (sec): 37.34 - samples/sec: 946.64 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:25:56,059 epoch 9 - iter 445/894 - loss 0.00590974 - time (sec): 46.99 - samples/sec: 940.55 - lr: 0.000008 - momentum: 0.000000
2023-09-03 19:26:05,743 epoch 9 - iter 534/894 - loss 0.00636897 - time (sec): 56.67 - samples/sec: 943.37 - lr: 0.000008 - momentum: 0.000000
2023-09-03 19:26:14,631 epoch 9 - iter 623/894 - loss 0.00614434 - time (sec): 65.56 - samples/sec: 945.95 - lr: 0.000007 - momentum: 0.000000
2023-09-03 19:26:23,473 epoch 9 - iter 712/894 - loss 0.00565531 - time (sec): 74.40 - samples/sec: 941.02 - lr: 0.000007 - momentum: 0.000000
2023-09-03 19:26:32,236 epoch 9 - iter 801/894 - loss 0.00552811 - time (sec): 83.17 - samples/sec: 940.20 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:26:41,209 epoch 9 - iter 890/894 - loss 0.00611643 - time (sec): 92.14 - samples/sec: 934.75 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:26:41,593 ----------------------------------------------------------------------------------------------------
2023-09-03 19:26:41,593 EPOCH 9 done: loss 0.0061 - lr: 0.000006
2023-09-03 19:26:54,966 DEV : loss 0.2732395827770233 - f1-score (micro avg) 0.7797
2023-09-03 19:26:54,993 saving best model
2023-09-03 19:26:56,319 ----------------------------------------------------------------------------------------------------
2023-09-03 19:27:05,605 epoch 10 - iter 89/894 - loss 0.00478917 - time (sec): 9.28 - samples/sec: 946.10 - lr: 0.000005 - momentum: 0.000000
2023-09-03 19:27:14,380 epoch 10 - iter 178/894 - loss 0.00284241 - time (sec): 18.06 - samples/sec: 925.92 - lr: 0.000004 - momentum: 0.000000
2023-09-03 19:27:23,312 epoch 10 - iter 267/894 - loss 0.00262347 - time (sec): 26.99 - samples/sec: 936.64 - lr: 0.000004 - momentum: 0.000000
2023-09-03 19:27:32,573 epoch 10 - iter 356/894 - loss 0.00287710 - time (sec): 36.25 - samples/sec: 940.19 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:27:42,409 epoch 10 - iter 445/894 - loss 0.00327777 - time (sec): 46.09 - samples/sec: 938.49 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:27:52,015 epoch 10 - iter 534/894 - loss 0.00378774 - time (sec): 55.69 - samples/sec: 937.56 - lr: 0.000002 - momentum: 0.000000
2023-09-03 19:28:01,642 epoch 10 - iter 623/894 - loss 0.00405072 - time (sec): 65.32 - samples/sec: 933.00 - lr: 0.000002 - momentum: 0.000000
2023-09-03 19:28:10,575 epoch 10 - iter 712/894 - loss 0.00418249 - time (sec): 74.25 - samples/sec: 932.14 - lr: 0.000001 - momentum: 0.000000
2023-09-03 19:28:19,446 epoch 10 - iter 801/894 - loss 0.00407931 - time (sec): 83.13 - samples/sec: 927.77 - lr: 0.000001 - momentum: 0.000000
2023-09-03 19:28:29,092 epoch 10 - iter 890/894 - loss 0.00412299 - time (sec): 92.77 - samples/sec: 929.02 - lr: 0.000000 - momentum: 0.000000
2023-09-03 19:28:29,466 ----------------------------------------------------------------------------------------------------
2023-09-03 19:28:29,466 EPOCH 10 done: loss 0.0041 - lr: 0.000000
2023-09-03 19:28:42,905 DEV : loss 0.2644493281841278 - f1-score (micro avg) 0.7788
2023-09-03 19:28:43,391 ----------------------------------------------------------------------------------------------------
2023-09-03 19:28:43,392 Loading model from best epoch ...
2023-09-03 19:28:45,259 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
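Note on the tag dictionary above: it uses the BIOES scheme, so each of the five entity types (loc, pers, org, prod, time) appears with S(ingle), B(egin), I(nside) and E(nd) prefixes, which together with O gives the 21 classes predicted by the final linear layer. A minimal sketch of how such tag sequences decode into entity spans (illustrative only, not Flair's decoder):

```python
def bioes_to_spans(tags):
    """Decode a BIOES tag sequence into (start, end, label) spans, end inclusive."""
    spans, open_span = [], None
    for i, tag in enumerate(tags):
        if tag == "O":
            open_span = None
            continue
        prefix, label = tag.split("-", 1)
        if prefix == "S":                               # single-token entity
            spans.append((i, i, label))
            open_span = None
        elif prefix == "B":                             # entity begins here
            open_span = (i, label)
        elif prefix == "E" and open_span and open_span[1] == label:
            spans.append((open_span[0], i, label))      # entity ends here
            open_span = None
        # "I" tags simply continue the currently open span
    return spans

# Example: ["B-loc", "I-loc", "E-loc", "O", "S-pers"]
# decodes to [(0, 2, "loc"), (4, 4, "pers")]
```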
2023-09-03 19:28:55,968
Results:
- F-score (micro) 0.7471
- F-score (macro) 0.6726
- Accuracy 0.6148

By class:
              precision    recall  f1-score   support

         loc     0.8409    0.8423    0.8416       596
        pers     0.6453    0.7267    0.6836       333
         org     0.6744    0.4394    0.5321       132
        prod     0.6818    0.4545    0.5455        66
        time     0.7451    0.7755    0.7600        49

   micro avg     0.7546    0.7398    0.7471      1176
   macro avg     0.7175    0.6477    0.6726      1176
weighted avg     0.7539    0.7398    0.7421      1176

2023-09-03 19:28:55,968 ----------------------------------------------------------------------------------------------------
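Note on the averages in the table above: macro avg is the unweighted mean over the five classes, while micro avg pools counts across classes before computing one score. A sketch of that arithmetic, with per-class TP and prediction counts reconstructed from the reported precision, recall and support (the reconstruction is an assumption for illustration, not something the log states):

```python
# (precision, recall, support) per class, copied from the test-set table above.
per_class = {
    "loc":  (0.8409, 0.8423, 596),
    "pers": (0.6453, 0.7267, 333),
    "org":  (0.6744, 0.4394, 132),
    "prod": (0.6818, 0.4545, 66),
    "time": (0.7451, 0.7755, 49),
}

# Reconstruct integer counts: TP ~= recall * support, predicted ~= TP / precision.
counts = {}
for name, (p, r, s) in per_class.items():
    tp = round(r * s)
    counts[name] = (tp, round(tp / p), s)   # (TP, predicted, support)

def span_f1(tp, pred, support):
    # Equivalent to 2*precision*recall/(precision+recall) written in counts.
    return 2 * tp / (pred + support)

# Macro avg: unweighted mean of the per-class F1 scores.
macro_f1 = sum(span_f1(*c) for c in counts.values()) / len(counts)

# Micro avg: pool the counts across classes, then compute one F1.
tp = sum(c[0] for c in counts.values())        # 870
pred = sum(c[1] for c in counts.values())      # 1153
support = sum(c[2] for c in counts.values())   # 1176
micro_f1 = span_f1(tp, pred, support)          # the headline test F-score
```

Rounded to four decimals, micro_f1 and macro_f1 match the 0.7471 and 0.6726 reported above, and tp/pred and tp/support reproduce the micro-avg precision 0.7546 and recall 0.7398.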