2023-09-03 18:52:51,771 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,772 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-09-03 18:52:51,772 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,772 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
 - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-09-03 18:52:51,772 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,772 Train: 3575 sentences
2023-09-03 18:52:51,772 (train_with_dev=False, train_with_test=False)
2023-09-03 18:52:51,772 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,773 Training Params:
2023-09-03 18:52:51,773 - learning_rate: "3e-05"
2023-09-03 18:52:51,773 - mini_batch_size: "4"
2023-09-03 18:52:51,773 - max_epochs: "10"
2023-09-03 18:52:51,773 - shuffle: "True"
2023-09-03 18:52:51,773 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,773 Plugins:
2023-09-03 18:52:51,773 - LinearScheduler | warmup_fraction: '0.1'
2023-09-03 18:52:51,773 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,773 Final evaluation on model from best epoch (best-model.pt)
2023-09-03 18:52:51,773 - metric: "('micro avg', 'f1-score')"
2023-09-03 18:52:51,773 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,773 Computation:
2023-09-03 18:52:51,773 - compute on device: cuda:0
2023-09-03 18:52:51,773 - embedding storage: none
2023-09-03 18:52:51,773 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,773 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
2023-09-03 18:52:51,773 ----------------------------------------------------------------------------------------------------
2023-09-03 18:52:51,773 ----------------------------------------------------------------------------------------------------
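For reference, the configuration logged above (hmBERT base model, batch size 4, 10 epochs, learning rate 3e-05, linear warmup fraction 0.1, first-subtoken pooling of the last layer, no CRF) corresponds roughly to the Flair fine-tuning sketch below. It is reconstructed from the values in this log, not taken from the original training script; in particular the NER_HIPE_2022 constructor arguments and the hidden_size value are assumptions.

# Minimal Flair fine-tuning sketch, reconstructed from the parameters in this log.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# HIPE-2020 German subset of HIPE-2022 (see the corpus line above); the exact
# keyword names are an assumption.
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de")
label_dict = corpus.make_label_dictionary(label_type="ner")

# hmBERT checkpoint, last layer only, first-subtoken pooling
# (base path: "...poolingfirst-layers-1...").
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# Plain linear tag head, no CRF and no RNN, matching the printed SequenceTagger
# modules (LockedDropout + Linear + CrossEntropyLoss). hidden_size is unused
# without an RNN and is only set to a conventional value here.
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# fine_tune() trains with AdamW and a linear learning-rate schedule with warmup,
# which is consistent with the LinearScheduler plugin (warmup_fraction 0.1) and
# the lr values reported per iteration below.
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1",
    learning_rate=3e-5,
    mini_batch_size=4,
    max_epochs=10,
)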
2023-09-03 18:53:01,513 epoch 1 - iter 89/894 - loss 2.99699689 - time (sec): 9.74 - samples/sec: 982.05 - lr: 0.000003 - momentum: 0.000000
2023-09-03 18:53:10,325 epoch 1 - iter 178/894 - loss 2.06986610 - time (sec): 18.55 - samples/sec: 946.08 - lr: 0.000006 - momentum: 0.000000
2023-09-03 18:53:18,954 epoch 1 - iter 267/894 - loss 1.58199999 - time (sec): 27.18 - samples/sec: 930.69 - lr: 0.000009 - momentum: 0.000000
2023-09-03 18:53:27,950 epoch 1 - iter 356/894 - loss 1.28323944 - time (sec): 36.18 - samples/sec: 935.84 - lr: 0.000012 - momentum: 0.000000
2023-09-03 18:53:37,044 epoch 1 - iter 445/894 - loss 1.10202507 - time (sec): 45.27 - samples/sec: 928.12 - lr: 0.000015 - momentum: 0.000000
2023-09-03 18:53:46,353 epoch 1 - iter 534/894 - loss 0.96300193 - time (sec): 54.58 - samples/sec: 933.42 - lr: 0.000018 - momentum: 0.000000
2023-09-03 18:53:55,548 epoch 1 - iter 623/894 - loss 0.86405084 - time (sec): 63.77 - samples/sec: 933.57 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:54:05,482 epoch 1 - iter 712/894 - loss 0.77769083 - time (sec): 73.71 - samples/sec: 937.92 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:54:14,345 epoch 1 - iter 801/894 - loss 0.72155531 - time (sec): 82.57 - samples/sec: 933.95 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:54:23,977 epoch 1 - iter 890/894 - loss 0.67418228 - time (sec): 92.20 - samples/sec: 933.80 - lr: 0.000030 - momentum: 0.000000
2023-09-03 18:54:24,479 ----------------------------------------------------------------------------------------------------
2023-09-03 18:54:24,479 EPOCH 1 done: loss 0.6716 - lr: 0.000030
2023-09-03 18:54:35,493 DEV : loss 0.19621583819389343 - f1-score (micro avg) 0.5235
2023-09-03 18:54:35,520 saving best model
2023-09-03 18:54:35,970 ----------------------------------------------------------------------------------------------------
2023-09-03 18:54:44,922 epoch 2 - iter 89/894 - loss 0.17209427 - time (sec): 8.95 - samples/sec: 965.42 - lr: 0.000030 - momentum: 0.000000
2023-09-03 18:54:53,969 epoch 2 - iter 178/894 - loss 0.18691490 - time (sec): 18.00 - samples/sec: 959.15 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:55:02,872 epoch 2 - iter 267/894 - loss 0.18352945 - time (sec): 26.90 - samples/sec: 964.77 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:55:12,051 epoch 2 - iter 356/894 - loss 0.17845196 - time (sec): 36.08 - samples/sec: 952.37 - lr: 0.000029 - momentum: 0.000000
2023-09-03 18:55:20,789 epoch 2 - iter 445/894 - loss 0.17332474 - time (sec): 44.82 - samples/sec: 945.81 - lr: 0.000028 - momentum: 0.000000
2023-09-03 18:55:29,919 epoch 2 - iter 534/894 - loss 0.16393436 - time (sec): 53.95 - samples/sec: 948.31 - lr: 0.000028 - momentum: 0.000000
2023-09-03 18:55:39,031 epoch 2 - iter 623/894 - loss 0.16288261 - time (sec): 63.06 - samples/sec: 957.44 - lr: 0.000028 - momentum: 0.000000
2023-09-03 18:55:47,840 epoch 2 - iter 712/894 - loss 0.16084894 - time (sec): 71.87 - samples/sec: 955.50 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:55:56,568 epoch 2 - iter 801/894 - loss 0.16255124 - time (sec): 80.60 - samples/sec: 952.88 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:56:06,100 epoch 2 - iter 890/894 - loss 0.15871408 - time (sec): 90.13 - samples/sec: 956.94 - lr: 0.000027 - momentum: 0.000000
2023-09-03 18:56:06,466 ----------------------------------------------------------------------------------------------------
2023-09-03 18:56:06,467 EPOCH 2 done: loss 0.1585 - lr: 0.000027
2023-09-03 18:56:19,172 DEV : loss 0.13305141031742096 - f1-score (micro avg) 0.7028
2023-09-03 18:56:19,199 saving best model
2023-09-03 18:56:20,516 ----------------------------------------------------------------------------------------------------
2023-09-03 18:56:29,013 epoch 3 - iter 89/894 - loss 0.08962806 - time (sec): 8.50 - samples/sec: 919.40 - lr: 0.000026 - momentum: 0.000000
2023-09-03 18:56:37,719 epoch 3 - iter 178/894 - loss 0.08780954 - time (sec): 17.20 - samples/sec: 940.66 - lr: 0.000026 - momentum: 0.000000
2023-09-03 18:56:46,548 epoch 3 - iter 267/894 - loss 0.09281134 - time (sec): 26.03 - samples/sec: 935.89 - lr: 0.000026 - momentum: 0.000000
2023-09-03 18:56:55,781 epoch 3 - iter 356/894 - loss 0.08453025 - time (sec): 35.26 - samples/sec: 942.54 - lr: 0.000025 - momentum: 0.000000
2023-09-03 18:57:05,448 epoch 3 - iter 445/894 - loss 0.09105320 - time (sec): 44.93 - samples/sec: 944.93 - lr: 0.000025 - momentum: 0.000000
2023-09-03 18:57:14,265 epoch 3 - iter 534/894 - loss 0.08878389 - time (sec): 53.75 - samples/sec: 955.67 - lr: 0.000025 - momentum: 0.000000
2023-09-03 18:57:23,378 epoch 3 - iter 623/894 - loss 0.09032482 - time (sec): 62.86 - samples/sec: 954.16 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:57:32,300 epoch 3 - iter 712/894 - loss 0.09053936 - time (sec): 71.78 - samples/sec: 954.45 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:57:40,998 epoch 3 - iter 801/894 - loss 0.09166475 - time (sec): 80.48 - samples/sec: 956.22 - lr: 0.000024 - momentum: 0.000000
2023-09-03 18:57:50,674 epoch 3 - iter 890/894 - loss 0.09112858 - time (sec): 90.16 - samples/sec: 956.64 - lr: 0.000023 - momentum: 0.000000
2023-09-03 18:57:51,028 ----------------------------------------------------------------------------------------------------
2023-09-03 18:57:51,028 EPOCH 3 done: loss 0.0910 - lr: 0.000023
2023-09-03 18:58:04,037 DEV : loss 0.14156648516654968 - f1-score (micro avg) 0.7312
2023-09-03 18:58:04,064 saving best model
2023-09-03 18:58:05,396 ----------------------------------------------------------------------------------------------------
2023-09-03 18:58:14,302 epoch 4 - iter 89/894 - loss 0.06843269 - time (sec): 8.91 - samples/sec: 1012.66 - lr: 0.000023 - momentum: 0.000000
2023-09-03 18:58:23,123 epoch 4 - iter 178/894 - loss 0.06184409 - time (sec): 17.73 - samples/sec: 968.09 - lr: 0.000023 - momentum: 0.000000
2023-09-03 18:58:32,577 epoch 4 - iter 267/894 - loss 0.05872966 - time (sec): 27.18 - samples/sec: 966.73 - lr: 0.000022 - momentum: 0.000000
2023-09-03 18:58:42,449 epoch 4 - iter 356/894 - loss 0.05557241 - time (sec): 37.05 - samples/sec: 975.48 - lr: 0.000022 - momentum: 0.000000
2023-09-03 18:58:51,812 epoch 4 - iter 445/894 - loss 0.05467367 - time (sec): 46.41 - samples/sec: 965.79 - lr: 0.000022 - momentum: 0.000000
2023-09-03 18:59:01,031 epoch 4 - iter 534/894 - loss 0.05481570 - time (sec): 55.63 - samples/sec: 962.60 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:59:09,843 epoch 4 - iter 623/894 - loss 0.05465991 - time (sec): 64.45 - samples/sec: 959.38 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:59:18,839 epoch 4 - iter 712/894 - loss 0.05541652 - time (sec): 73.44 - samples/sec: 956.47 - lr: 0.000021 - momentum: 0.000000
2023-09-03 18:59:27,327 epoch 4 - iter 801/894 - loss 0.05463429 - time (sec): 81.93 - samples/sec: 947.52 - lr: 0.000020 - momentum: 0.000000
2023-09-03 18:59:36,709 epoch 4 - iter 890/894 - loss 0.05453935 - time (sec): 91.31 - samples/sec: 944.18 - lr: 0.000020 - momentum: 0.000000
2023-09-03 18:59:37,086 ----------------------------------------------------------------------------------------------------
2023-09-03 18:59:37,086 EPOCH 4 done: loss 0.0549 - lr: 0.000020
2023-09-03 18:59:50,540 DEV : loss 0.21218053996562958 - f1-score (micro avg) 0.7671
2023-09-03 18:59:50,566 saving best model
2023-09-03 18:59:52,162 ----------------------------------------------------------------------------------------------------
2023-09-03 19:00:02,477 epoch 5 - iter 89/894 - loss 0.04240015 - time (sec): 10.31 - samples/sec: 942.29 - lr: 0.000020 - momentum: 0.000000
2023-09-03 19:00:11,515 epoch 5 - iter 178/894 - loss 0.04116963 - time (sec): 19.35 - samples/sec: 917.53 - lr: 0.000019 - momentum: 0.000000
2023-09-03 19:00:20,920 epoch 5 - iter 267/894 - loss 0.04532837 - time (sec): 28.76 - samples/sec: 924.58 - lr: 0.000019 - momentum: 0.000000
2023-09-03 19:00:29,740 epoch 5 - iter 356/894 - loss 0.04477082 - time (sec): 37.58 - samples/sec: 923.25 - lr: 0.000019 - momentum: 0.000000
2023-09-03 19:00:39,349 epoch 5 - iter 445/894 - loss 0.04118324 - time (sec): 47.19 - samples/sec: 926.62 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:00:48,518 epoch 5 - iter 534/894 - loss 0.04162335 - time (sec): 56.35 - samples/sec: 931.35 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:00:57,557 epoch 5 - iter 623/894 - loss 0.03943254 - time (sec): 65.39 - samples/sec: 928.17 - lr: 0.000018 - momentum: 0.000000
2023-09-03 19:01:06,797 epoch 5 - iter 712/894 - loss 0.03955068 - time (sec): 74.63 - samples/sec: 930.83 - lr: 0.000017 - momentum: 0.000000
2023-09-03 19:01:15,902 epoch 5 - iter 801/894 - loss 0.03921819 - time (sec): 83.74 - samples/sec: 928.46 - lr: 0.000017 - momentum: 0.000000
2023-09-03 19:01:24,861 epoch 5 - iter 890/894 - loss 0.04074719 - time (sec): 92.70 - samples/sec: 929.92 - lr: 0.000017 - momentum: 0.000000
2023-09-03 19:01:25,294 ----------------------------------------------------------------------------------------------------
2023-09-03 19:01:25,294 EPOCH 5 done: loss 0.0407 - lr: 0.000017
2023-09-03 19:01:38,788 DEV : loss 0.22197993099689484 - f1-score (micro avg) 0.7586
2023-09-03 19:01:38,815 ----------------------------------------------------------------------------------------------------
2023-09-03 19:01:48,171 epoch 6 - iter 89/894 - loss 0.01977262 - time (sec): 9.36 - samples/sec: 926.42 - lr: 0.000016 - momentum: 0.000000
2023-09-03 19:01:57,047 epoch 6 - iter 178/894 - loss 0.02907246 - time (sec): 18.23 - samples/sec: 898.61 - lr: 0.000016 - momentum: 0.000000
2023-09-03 19:02:06,793 epoch 6 - iter 267/894 - loss 0.02606421 - time (sec): 27.98 - samples/sec: 913.25 - lr: 0.000016 - momentum: 0.000000
2023-09-03 19:02:15,974 epoch 6 - iter 356/894 - loss 0.02546854 - time (sec): 37.16 - samples/sec: 926.86 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:02:24,624 epoch 6 - iter 445/894 - loss 0.02436981 - time (sec): 45.81 - samples/sec: 917.13 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:02:33,659 epoch 6 - iter 534/894 - loss 0.02428737 - time (sec): 54.84 - samples/sec: 916.86 - lr: 0.000015 - momentum: 0.000000
2023-09-03 19:02:42,539 epoch 6 - iter 623/894 - loss 0.02613954 - time (sec): 63.72 - samples/sec: 912.52 - lr: 0.000014 - momentum: 0.000000
2023-09-03 19:02:52,855 epoch 6 - iter 712/894 - loss 0.02543888 - time (sec): 74.04 - samples/sec: 923.24 - lr: 0.000014 - momentum: 0.000000
2023-09-03 19:03:01,984 epoch 6 - iter 801/894 - loss 0.02680363 - time (sec): 83.17 - samples/sec: 924.51 - lr: 0.000014 - momentum: 0.000000
2023-09-03 19:03:11,677 epoch 6 - iter 890/894 - loss 0.02684866 - time (sec): 92.86 - samples/sec: 927.32 - lr: 0.000013 - momentum: 0.000000
2023-09-03 19:03:12,085 ----------------------------------------------------------------------------------------------------
2023-09-03 19:03:12,085 EPOCH 6 done: loss 0.0269 - lr: 0.000013
2023-09-03 19:03:25,620 DEV : loss 0.23417820036411285 - f1-score (micro avg) 0.7583
2023-09-03 19:03:25,647 ----------------------------------------------------------------------------------------------------
2023-09-03 19:03:34,698 epoch 7 - iter 89/894 - loss 0.02350094 - time (sec): 9.05 - samples/sec: 971.10 - lr: 0.000013 - momentum: 0.000000
2023-09-03 19:03:43,659 epoch 7 - iter 178/894 - loss 0.01866029 - time (sec): 18.01 - samples/sec: 949.80 - lr: 0.000013 - momentum: 0.000000
2023-09-03 19:03:54,334 epoch 7 - iter 267/894 - loss 0.01757408 - time (sec): 28.69 - samples/sec: 950.15 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:04:03,555 epoch 7 - iter 356/894 - loss 0.01632661 - time (sec): 37.91 - samples/sec: 939.89 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:04:12,843 epoch 7 - iter 445/894 - loss 0.01696761 - time (sec): 47.20 - samples/sec: 940.77 - lr: 0.000012 - momentum: 0.000000
2023-09-03 19:04:21,813 epoch 7 - iter 534/894 - loss 0.01791246 - time (sec): 56.17 - samples/sec: 936.56 - lr: 0.000011 - momentum: 0.000000
2023-09-03 19:04:31,035 epoch 7 - iter 623/894 - loss 0.01771411 - time (sec): 65.39 - samples/sec: 932.24 - lr: 0.000011 - momentum: 0.000000
2023-09-03 19:04:40,185 epoch 7 - iter 712/894 - loss 0.01755280 - time (sec): 74.54 - samples/sec: 928.22 - lr: 0.000011 - momentum: 0.000000
2023-09-03 19:04:49,163 epoch 7 - iter 801/894 - loss 0.01881491 - time (sec): 83.52 - samples/sec: 922.93 - lr: 0.000010 - momentum: 0.000000
2023-09-03 19:04:58,887 epoch 7 - iter 890/894 - loss 0.01856801 - time (sec): 93.24 - samples/sec: 925.29 - lr: 0.000010 - momentum: 0.000000
2023-09-03 19:04:59,269 ----------------------------------------------------------------------------------------------------
2023-09-03 19:04:59,269 EPOCH 7 done: loss 0.0185 - lr: 0.000010
2023-09-03 19:05:12,824 DEV : loss 0.2219560593366623 - f1-score (micro avg) 0.7553
2023-09-03 19:05:12,853 ----------------------------------------------------------------------------------------------------
2023-09-03 19:05:22,261 epoch 8 - iter 89/894 - loss 0.00890074 - time (sec): 9.41 - samples/sec: 921.36 - lr: 0.000010 - momentum: 0.000000
2023-09-03 19:05:31,949 epoch 8 - iter 178/894 - loss 0.01187531 - time (sec): 19.09 - samples/sec: 931.09 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:05:41,200 epoch 8 - iter 267/894 - loss 0.01028799 - time (sec): 28.35 - samples/sec: 948.27 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:05:51,068 epoch 8 - iter 356/894 - loss 0.00904645 - time (sec): 38.21 - samples/sec: 953.90 - lr: 0.000009 - momentum: 0.000000
2023-09-03 19:06:00,185 epoch 8 - iter 445/894 - loss 0.01215640 - time (sec): 47.33 - samples/sec: 932.74 - lr: 0.000008 - momentum: 0.000000
2023-09-03 19:06:09,410 epoch 8 - iter 534/894 - loss 0.01237356 - time (sec): 56.56 - samples/sec: 930.61 - lr: 0.000008 - momentum: 0.000000
2023-09-03 19:06:18,481 epoch 8 - iter 623/894 - loss 0.01215753 - time (sec): 65.63 - samples/sec: 939.67 - lr: 0.000008 - momentum: 0.000000
2023-09-03 19:06:27,206 epoch 8 - iter 712/894 - loss 0.01202186 - time (sec): 74.35 - samples/sec: 940.57 - lr: 0.000007 - momentum: 0.000000
2023-09-03 19:06:35,843 epoch 8 - iter 801/894 - loss 0.01148155 - time (sec): 82.99 - samples/sec: 941.04 - lr: 0.000007 - momentum: 0.000000
2023-09-03 19:06:44,618 epoch 8 - iter 890/894 - loss 0.01161529 - time (sec): 91.76 - samples/sec: 939.13 - lr: 0.000007 - momentum: 0.000000
2023-09-03 19:06:45,013 ----------------------------------------------------------------------------------------------------
2023-09-03 19:06:45,013 EPOCH 8 done: loss 0.0116 - lr: 0.000007
2023-09-03 19:06:57,836 DEV : loss 0.2321111261844635 - f1-score (micro avg) 0.7811
2023-09-03 19:06:57,864 saving best model
2023-09-03 19:06:59,193 ----------------------------------------------------------------------------------------------------
2023-09-03 19:07:07,874 epoch 9 - iter 89/894 - loss 0.00777350 - time (sec): 8.68 - samples/sec: 952.56 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:07:17,061 epoch 9 - iter 178/894 - loss 0.00609744 - time (sec): 17.87 - samples/sec: 983.74 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:07:26,154 epoch 9 - iter 267/894 - loss 0.00658637 - time (sec): 26.96 - samples/sec: 960.39 - lr: 0.000006 - momentum: 0.000000
2023-09-03 19:07:35,421 epoch 9 - iter 356/894 - loss 0.00606818 - time (sec): 36.23 - samples/sec: 975.85 - lr: 0.000005 - momentum: 0.000000
2023-09-03 19:07:44,752 epoch 9 - iter 445/894 - loss 0.00770679 - time (sec): 45.56 - samples/sec: 970.09 - lr: 0.000005 - momentum: 0.000000
2023-09-03 19:07:54,114 epoch 9 - iter 534/894 - loss 0.00758096 - time (sec): 54.92 - samples/sec: 973.48 - lr: 0.000005 - momentum: 0.000000
2023-09-03 19:08:02,757 epoch 9 - iter 623/894 - loss 0.00665079 - time (sec): 63.56 - samples/sec: 975.67 - lr: 0.000004 - momentum: 0.000000
2023-09-03 19:08:11,354 epoch 9 - iter 712/894 - loss 0.00691389 - time (sec): 72.16 - samples/sec: 970.27 - lr: 0.000004 - momentum: 0.000000
2023-09-03 19:08:19,860 epoch 9 - iter 801/894 - loss 0.00689763 - time (sec): 80.67 - samples/sec: 969.33 - lr: 0.000004 - momentum: 0.000000
2023-09-03 19:08:28,559 epoch 9 - iter 890/894 - loss 0.00703245 - time (sec): 89.36 - samples/sec: 963.76 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:08:28,937 ----------------------------------------------------------------------------------------------------
2023-09-03 19:08:28,937 EPOCH 9 done: loss 0.0070 - lr: 0.000003
2023-09-03 19:08:41,698 DEV : loss 0.24807557463645935 - f1-score (micro avg) 0.7864
2023-09-03 19:08:41,727 saving best model
2023-09-03 19:08:43,045 ----------------------------------------------------------------------------------------------------
2023-09-03 19:08:52,067 epoch 10 - iter 89/894 - loss 0.00626017 - time (sec): 9.02 - samples/sec: 973.79 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:09:00,618 epoch 10 - iter 178/894 - loss 0.00580896 - time (sec): 17.57 - samples/sec: 951.60 - lr: 0.000003 - momentum: 0.000000
2023-09-03 19:09:09,338 epoch 10 - iter 267/894 - loss 0.00557829 - time (sec): 26.29 - samples/sec: 961.57 - lr: 0.000002 - momentum: 0.000000
2023-09-03 19:09:18,325 epoch 10 - iter 356/894 - loss 0.00543496 - time (sec): 35.28 - samples/sec: 966.14 - lr: 0.000002 - momentum: 0.000000
2023-09-03 19:09:27,844 epoch 10 - iter 445/894 - loss 0.00500160 - time (sec): 44.80 - samples/sec: 965.53 - lr: 0.000002 - momentum: 0.000000
2023-09-03 19:09:37,143 epoch 10 - iter 534/894 - loss 0.00502452 - time (sec): 54.10 - samples/sec: 965.26 - lr: 0.000001 - momentum: 0.000000
2023-09-03 19:09:46,513 epoch 10 - iter 623/894 - loss 0.00467325 - time (sec): 63.47 - samples/sec: 960.27 - lr: 0.000001 - momentum: 0.000000
2023-09-03 19:09:55,177 epoch 10 - iter 712/894 - loss 0.00489280 - time (sec): 72.13 - samples/sec: 959.59 - lr: 0.000001 - momentum: 0.000000
2023-09-03 19:10:03,944 epoch 10 - iter 801/894 - loss 0.00473651 - time (sec): 80.90 - samples/sec: 953.32 - lr: 0.000000 - momentum: 0.000000
2023-09-03 19:10:13,430 epoch 10 - iter 890/894 - loss 0.00488641 - time (sec): 90.38 - samples/sec: 953.57 - lr: 0.000000 - momentum: 0.000000
2023-09-03 19:10:13,811 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:13,811 EPOCH 10 done: loss 0.0049 - lr: 0.000000
2023-09-03 19:10:26,993 DEV : loss 0.24311408400535583 - f1-score (micro avg) 0.7859
2023-09-03 19:10:27,473 ----------------------------------------------------------------------------------------------------
2023-09-03 19:10:27,475 Loading model from best epoch ...
2023-09-03 19:10:29,409 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
2023-09-03 19:10:39,948
Results:
- F-score (micro) 0.7513
- F-score (macro) 0.6694
- Accuracy 0.6173

By class:
              precision    recall  f1-score   support

         loc     0.8276    0.8540    0.8406       596
        pers     0.6974    0.7267    0.7118       333
         org     0.5769    0.4545    0.5085       132
        prod     0.6739    0.4697    0.5536        66
        time     0.7115    0.7551    0.7327        49

   micro avg     0.7552    0.7474    0.7513      1176
   macro avg     0.6975    0.6520    0.6694      1176
weighted avg     0.7492    0.7474    0.7462      1176

2023-09-03 19:10:39,948 ----------------------------------------------------------------------------------------------------
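The best checkpoint saved during training (best-model.pt under the base path above) can be loaded for tagging with the standard Flair API. A minimal sketch; the example sentence is invented for illustration.

from flair.data import Sentence
from flair.models import SequenceTagger

# Load the checkpoint written at the "saving best model" steps above.
tagger = SequenceTagger.load(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1/best-model.pt"
)

# Tag a (made-up) historical German sentence; predicted spans carry the
# loc/pers/org/prod/time labels from the tag dictionary printed above.
sentence = Sentence("Die Sitzung des Bundesrathes fand gestern in Bern statt.")
tagger.predict(sentence)
for span in sentence.get_spans("ner"):
    label = span.get_label("ner")
    print(span.text, label.value, round(label.score, 3))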