2023-10-17 11:21:40,050 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,051 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
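The module shapes printed above fully determine the parameter count of the ELECTRA-base encoder. A quick stdlib-Python sanity check (layer structure and dimensions taken from the printout; nothing else is assumed):

```python
def linear_params(n_in, n_out, bias=True):
    """Weights plus optional bias of a torch.nn.Linear-shaped layer."""
    return n_in * n_out + (n_out if bias else 0)

hidden, vocab, n_layers = 768, 32001, 12

# ElectraEmbeddings: word + position + token-type embeddings + LayerNorm (gamma, beta)
embeddings = vocab * hidden + 512 * hidden + 2 * hidden + 2 * hidden

# One ElectraLayer: Q/K/V/attention-output projections, two LayerNorms, FFN up/down
per_layer = (
    4 * linear_params(hidden, hidden)      # query, key, value, attention output
    + 2 * (2 * hidden)                     # two LayerNorms
    + linear_params(hidden, 4 * hidden)    # intermediate (768 -> 3072)
    + linear_params(4 * hidden, hidden)    # output (3072 -> 768)
)

encoder_total = embeddings + n_layers * per_layer
head = linear_params(hidden, 13)           # tagging head: 768 -> 13 tags

print(encoder_total)  # 110027520  (~110M, i.e. ELECTRA-base size)
print(head)           # 9997
```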
2023-10-17 11:21:40,051 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,051 MultiCorpus: 7936 train + 992 dev + 992 test sentences
- NER_ICDAR_EUROPEANA Corpus: 7936 train + 992 dev + 992 test sentences - /root/.flair/datasets/ner_icdar_europeana/fr
2023-10-17 11:21:40,051 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,052 Train: 7936 sentences
2023-10-17 11:21:40,052 (train_with_dev=False, train_with_test=False)
2023-10-17 11:21:40,052 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,052 Training Params:
2023-10-17 11:21:40,052 - learning_rate: "5e-05"
2023-10-17 11:21:40,052 - mini_batch_size: "8"
2023-10-17 11:21:40,052 - max_epochs: "10"
2023-10-17 11:21:40,052 - shuffle: "True"
2023-10-17 11:21:40,052 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,052 Plugins:
2023-10-17 11:21:40,052 - TensorboardLogger
2023-10-17 11:21:40,052 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 11:21:40,052 ----------------------------------------------------------------------------------------------------
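The `LinearScheduler` with `warmup_fraction: '0.1'` explains the lr column in the iteration lines below: the learning rate rises linearly from 0 to 5e-05 over the first 10% of the 9,920 total steps (10 epochs × 992 batches), then decays linearly back to 0. A minimal sketch of that schedule (assumed to mirror Flair's behavior; not copied from its source):

```python
def linear_lr(step, total_steps, peak_lr=5e-5, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay back to zero."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 10 * 992                           # max_epochs * batches per epoch
print(round(linear_lr(99, total), 6))      # 5e-06,   matches iter 99 of epoch 1
print(round(linear_lr(1984, total), 6))    # 4.4e-05, matches lr at end of epoch 2
```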
2023-10-17 11:21:40,052 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 11:21:40,052 - metric: "('micro avg', 'f1-score')"
2023-10-17 11:21:40,052 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,052 Computation:
2023-10-17 11:21:40,052 - compute on device: cuda:0
2023-10-17 11:21:40,052 - embedding storage: none
2023-10-17 11:21:40,052 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,052 Model training base path: "hmbench-icdar/fr-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
2023-10-17 11:21:40,052 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,052 ----------------------------------------------------------------------------------------------------
2023-10-17 11:21:40,052 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 11:21:45,743 epoch 1 - iter 99/992 - loss 2.00752470 - time (sec): 5.69 - samples/sec: 2843.79 - lr: 0.000005 - momentum: 0.000000
2023-10-17 11:21:51,606 epoch 1 - iter 198/992 - loss 1.16110239 - time (sec): 11.55 - samples/sec: 2877.04 - lr: 0.000010 - momentum: 0.000000
2023-10-17 11:21:57,730 epoch 1 - iter 297/992 - loss 0.86009292 - time (sec): 17.68 - samples/sec: 2807.08 - lr: 0.000015 - momentum: 0.000000
2023-10-17 11:22:03,344 epoch 1 - iter 396/992 - loss 0.70056867 - time (sec): 23.29 - samples/sec: 2807.69 - lr: 0.000020 - momentum: 0.000000
2023-10-17 11:22:09,204 epoch 1 - iter 495/992 - loss 0.59290102 - time (sec): 29.15 - samples/sec: 2807.40 - lr: 0.000025 - momentum: 0.000000
2023-10-17 11:22:15,182 epoch 1 - iter 594/992 - loss 0.51678549 - time (sec): 35.13 - samples/sec: 2808.16 - lr: 0.000030 - momentum: 0.000000
2023-10-17 11:22:20,959 epoch 1 - iter 693/992 - loss 0.46411643 - time (sec): 40.91 - samples/sec: 2806.30 - lr: 0.000035 - momentum: 0.000000
2023-10-17 11:22:26,973 epoch 1 - iter 792/992 - loss 0.42507693 - time (sec): 46.92 - samples/sec: 2798.32 - lr: 0.000040 - momentum: 0.000000
2023-10-17 11:22:32,888 epoch 1 - iter 891/992 - loss 0.39307524 - time (sec): 52.83 - samples/sec: 2786.92 - lr: 0.000045 - momentum: 0.000000
2023-10-17 11:22:38,704 epoch 1 - iter 990/992 - loss 0.36782772 - time (sec): 58.65 - samples/sec: 2790.32 - lr: 0.000050 - momentum: 0.000000
2023-10-17 11:22:38,823 ----------------------------------------------------------------------------------------------------
2023-10-17 11:22:38,823 EPOCH 1 done: loss 0.3674 - lr: 0.000050
2023-10-17 11:22:41,920 DEV : loss 0.09267440438270569 - f1-score (micro avg) 0.7287
2023-10-17 11:22:41,943 saving best model
2023-10-17 11:22:42,310 ----------------------------------------------------------------------------------------------------
2023-10-17 11:22:48,213 epoch 2 - iter 99/992 - loss 0.11596098 - time (sec): 5.90 - samples/sec: 2864.40 - lr: 0.000049 - momentum: 0.000000
2023-10-17 11:22:54,188 epoch 2 - iter 198/992 - loss 0.11376424 - time (sec): 11.88 - samples/sec: 2788.50 - lr: 0.000049 - momentum: 0.000000
2023-10-17 11:23:00,295 epoch 2 - iter 297/992 - loss 0.11209663 - time (sec): 17.98 - samples/sec: 2759.41 - lr: 0.000048 - momentum: 0.000000
2023-10-17 11:23:06,316 epoch 2 - iter 396/992 - loss 0.11161254 - time (sec): 24.00 - samples/sec: 2767.95 - lr: 0.000048 - momentum: 0.000000
2023-10-17 11:23:12,200 epoch 2 - iter 495/992 - loss 0.10962236 - time (sec): 29.89 - samples/sec: 2762.55 - lr: 0.000047 - momentum: 0.000000
2023-10-17 11:23:18,445 epoch 2 - iter 594/992 - loss 0.10658082 - time (sec): 36.13 - samples/sec: 2749.28 - lr: 0.000047 - momentum: 0.000000
2023-10-17 11:23:24,452 epoch 2 - iter 693/992 - loss 0.10590489 - time (sec): 42.14 - samples/sec: 2752.00 - lr: 0.000046 - momentum: 0.000000
2023-10-17 11:23:30,157 epoch 2 - iter 792/992 - loss 0.10554854 - time (sec): 47.85 - samples/sec: 2752.16 - lr: 0.000046 - momentum: 0.000000
2023-10-17 11:23:35,744 epoch 2 - iter 891/992 - loss 0.10485385 - time (sec): 53.43 - samples/sec: 2756.15 - lr: 0.000045 - momentum: 0.000000
2023-10-17 11:23:41,651 epoch 2 - iter 990/992 - loss 0.10541228 - time (sec): 59.34 - samples/sec: 2757.92 - lr: 0.000044 - momentum: 0.000000
2023-10-17 11:23:41,762 ----------------------------------------------------------------------------------------------------
2023-10-17 11:23:41,763 EPOCH 2 done: loss 0.1053 - lr: 0.000044
2023-10-17 11:23:45,546 DEV : loss 0.07957779616117477 - f1-score (micro avg) 0.7488
2023-10-17 11:23:45,567 saving best model
2023-10-17 11:23:46,062 ----------------------------------------------------------------------------------------------------
2023-10-17 11:23:51,998 epoch 3 - iter 99/992 - loss 0.07399883 - time (sec): 5.93 - samples/sec: 2792.44 - lr: 0.000044 - momentum: 0.000000
2023-10-17 11:23:57,894 epoch 3 - iter 198/992 - loss 0.07782338 - time (sec): 11.83 - samples/sec: 2777.78 - lr: 0.000043 - momentum: 0.000000
2023-10-17 11:24:03,807 epoch 3 - iter 297/992 - loss 0.07965927 - time (sec): 17.74 - samples/sec: 2795.99 - lr: 0.000043 - momentum: 0.000000
2023-10-17 11:24:09,711 epoch 3 - iter 396/992 - loss 0.07595699 - time (sec): 23.65 - samples/sec: 2808.81 - lr: 0.000042 - momentum: 0.000000
2023-10-17 11:24:15,396 epoch 3 - iter 495/992 - loss 0.07484677 - time (sec): 29.33 - samples/sec: 2819.60 - lr: 0.000042 - momentum: 0.000000
2023-10-17 11:24:21,231 epoch 3 - iter 594/992 - loss 0.07449912 - time (sec): 35.17 - samples/sec: 2800.02 - lr: 0.000041 - momentum: 0.000000
2023-10-17 11:24:27,126 epoch 3 - iter 693/992 - loss 0.07526423 - time (sec): 41.06 - samples/sec: 2795.76 - lr: 0.000041 - momentum: 0.000000
2023-10-17 11:24:33,067 epoch 3 - iter 792/992 - loss 0.07578093 - time (sec): 47.00 - samples/sec: 2790.69 - lr: 0.000040 - momentum: 0.000000
2023-10-17 11:24:39,214 epoch 3 - iter 891/992 - loss 0.07554286 - time (sec): 53.15 - samples/sec: 2782.60 - lr: 0.000039 - momentum: 0.000000
2023-10-17 11:24:45,213 epoch 3 - iter 990/992 - loss 0.07580919 - time (sec): 59.15 - samples/sec: 2766.04 - lr: 0.000039 - momentum: 0.000000
2023-10-17 11:24:45,340 ----------------------------------------------------------------------------------------------------
2023-10-17 11:24:45,340 EPOCH 3 done: loss 0.0761 - lr: 0.000039
2023-10-17 11:24:48,746 DEV : loss 0.09182097762823105 - f1-score (micro avg) 0.7489
2023-10-17 11:24:48,769 saving best model
2023-10-17 11:24:49,263 ----------------------------------------------------------------------------------------------------
2023-10-17 11:24:55,170 epoch 4 - iter 99/992 - loss 0.05468852 - time (sec): 5.90 - samples/sec: 2780.10 - lr: 0.000038 - momentum: 0.000000
2023-10-17 11:25:01,487 epoch 4 - iter 198/992 - loss 0.05293064 - time (sec): 12.22 - samples/sec: 2794.12 - lr: 0.000038 - momentum: 0.000000
2023-10-17 11:25:07,607 epoch 4 - iter 297/992 - loss 0.05364317 - time (sec): 18.34 - samples/sec: 2776.94 - lr: 0.000037 - momentum: 0.000000
2023-10-17 11:25:13,550 epoch 4 - iter 396/992 - loss 0.05402008 - time (sec): 24.28 - samples/sec: 2777.04 - lr: 0.000037 - momentum: 0.000000
2023-10-17 11:25:19,296 epoch 4 - iter 495/992 - loss 0.05392577 - time (sec): 30.03 - samples/sec: 2786.92 - lr: 0.000036 - momentum: 0.000000
2023-10-17 11:25:24,951 epoch 4 - iter 594/992 - loss 0.05443890 - time (sec): 35.69 - samples/sec: 2782.91 - lr: 0.000036 - momentum: 0.000000
2023-10-17 11:25:30,557 epoch 4 - iter 693/992 - loss 0.05350202 - time (sec): 41.29 - samples/sec: 2783.23 - lr: 0.000035 - momentum: 0.000000
2023-10-17 11:25:36,431 epoch 4 - iter 792/992 - loss 0.05357310 - time (sec): 47.17 - samples/sec: 2777.82 - lr: 0.000034 - momentum: 0.000000
2023-10-17 11:25:42,388 epoch 4 - iter 891/992 - loss 0.05445889 - time (sec): 53.12 - samples/sec: 2780.60 - lr: 0.000034 - momentum: 0.000000
2023-10-17 11:25:48,076 epoch 4 - iter 990/992 - loss 0.05431231 - time (sec): 58.81 - samples/sec: 2784.57 - lr: 0.000033 - momentum: 0.000000
2023-10-17 11:25:48,189 ----------------------------------------------------------------------------------------------------
2023-10-17 11:25:48,190 EPOCH 4 done: loss 0.0544 - lr: 0.000033
2023-10-17 11:25:51,644 DEV : loss 0.1407020390033722 - f1-score (micro avg) 0.7598
2023-10-17 11:25:51,666 saving best model
2023-10-17 11:25:52,143 ----------------------------------------------------------------------------------------------------
2023-10-17 11:25:58,088 epoch 5 - iter 99/992 - loss 0.04345102 - time (sec): 5.94 - samples/sec: 2737.20 - lr: 0.000033 - momentum: 0.000000
2023-10-17 11:26:04,229 epoch 5 - iter 198/992 - loss 0.04013191 - time (sec): 12.08 - samples/sec: 2773.50 - lr: 0.000032 - momentum: 0.000000
2023-10-17 11:26:10,153 epoch 5 - iter 297/992 - loss 0.04013654 - time (sec): 18.01 - samples/sec: 2788.59 - lr: 0.000032 - momentum: 0.000000
2023-10-17 11:26:16,359 epoch 5 - iter 396/992 - loss 0.04153776 - time (sec): 24.21 - samples/sec: 2797.16 - lr: 0.000031 - momentum: 0.000000
2023-10-17 11:26:22,273 epoch 5 - iter 495/992 - loss 0.04178549 - time (sec): 30.13 - samples/sec: 2793.12 - lr: 0.000031 - momentum: 0.000000
2023-10-17 11:26:27,904 epoch 5 - iter 594/992 - loss 0.04313324 - time (sec): 35.76 - samples/sec: 2796.33 - lr: 0.000030 - momentum: 0.000000
2023-10-17 11:26:34,115 epoch 5 - iter 693/992 - loss 0.04394145 - time (sec): 41.97 - samples/sec: 2773.35 - lr: 0.000029 - momentum: 0.000000
2023-10-17 11:26:40,034 epoch 5 - iter 792/992 - loss 0.04428695 - time (sec): 47.89 - samples/sec: 2761.16 - lr: 0.000029 - momentum: 0.000000
2023-10-17 11:26:45,880 epoch 5 - iter 891/992 - loss 0.04347387 - time (sec): 53.73 - samples/sec: 2758.23 - lr: 0.000028 - momentum: 0.000000
2023-10-17 11:26:51,571 epoch 5 - iter 990/992 - loss 0.04302101 - time (sec): 59.42 - samples/sec: 2753.59 - lr: 0.000028 - momentum: 0.000000
2023-10-17 11:26:51,696 ----------------------------------------------------------------------------------------------------
2023-10-17 11:26:51,696 EPOCH 5 done: loss 0.0429 - lr: 0.000028
2023-10-17 11:26:55,092 DEV : loss 0.1663055419921875 - f1-score (micro avg) 0.7778
2023-10-17 11:26:55,113 saving best model
2023-10-17 11:26:55,582 ----------------------------------------------------------------------------------------------------
2023-10-17 11:27:01,664 epoch 6 - iter 99/992 - loss 0.03345702 - time (sec): 6.08 - samples/sec: 2705.79 - lr: 0.000027 - momentum: 0.000000
2023-10-17 11:27:07,592 epoch 6 - iter 198/992 - loss 0.03258281 - time (sec): 12.01 - samples/sec: 2720.35 - lr: 0.000027 - momentum: 0.000000
2023-10-17 11:27:13,856 epoch 6 - iter 297/992 - loss 0.03031919 - time (sec): 18.27 - samples/sec: 2750.49 - lr: 0.000026 - momentum: 0.000000
2023-10-17 11:27:19,774 epoch 6 - iter 396/992 - loss 0.03089649 - time (sec): 24.19 - samples/sec: 2754.91 - lr: 0.000026 - momentum: 0.000000
2023-10-17 11:27:25,776 epoch 6 - iter 495/992 - loss 0.02999409 - time (sec): 30.19 - samples/sec: 2762.61 - lr: 0.000025 - momentum: 0.000000
2023-10-17 11:27:31,638 epoch 6 - iter 594/992 - loss 0.03070964 - time (sec): 36.05 - samples/sec: 2765.56 - lr: 0.000024 - momentum: 0.000000
2023-10-17 11:27:37,231 epoch 6 - iter 693/992 - loss 0.03109293 - time (sec): 41.65 - samples/sec: 2769.11 - lr: 0.000024 - momentum: 0.000000
2023-10-17 11:27:42,876 epoch 6 - iter 792/992 - loss 0.03056361 - time (sec): 47.29 - samples/sec: 2773.64 - lr: 0.000023 - momentum: 0.000000
2023-10-17 11:27:48,608 epoch 6 - iter 891/992 - loss 0.03117794 - time (sec): 53.02 - samples/sec: 2778.13 - lr: 0.000023 - momentum: 0.000000
2023-10-17 11:27:54,391 epoch 6 - iter 990/992 - loss 0.03126223 - time (sec): 58.80 - samples/sec: 2781.63 - lr: 0.000022 - momentum: 0.000000
2023-10-17 11:27:54,511 ----------------------------------------------------------------------------------------------------
2023-10-17 11:27:54,511 EPOCH 6 done: loss 0.0312 - lr: 0.000022
2023-10-17 11:27:59,227 DEV : loss 0.1633480340242386 - f1-score (micro avg) 0.7584
2023-10-17 11:27:59,263 ----------------------------------------------------------------------------------------------------
2023-10-17 11:28:05,215 epoch 7 - iter 99/992 - loss 0.01789778 - time (sec): 5.95 - samples/sec: 2739.15 - lr: 0.000022 - momentum: 0.000000
2023-10-17 11:28:11,218 epoch 7 - iter 198/992 - loss 0.01724841 - time (sec): 11.95 - samples/sec: 2745.60 - lr: 0.000021 - momentum: 0.000000
2023-10-17 11:28:17,621 epoch 7 - iter 297/992 - loss 0.01961745 - time (sec): 18.36 - samples/sec: 2700.02 - lr: 0.000021 - momentum: 0.000000
2023-10-17 11:28:23,473 epoch 7 - iter 396/992 - loss 0.01992616 - time (sec): 24.21 - samples/sec: 2712.23 - lr: 0.000020 - momentum: 0.000000
2023-10-17 11:28:29,388 epoch 7 - iter 495/992 - loss 0.02094923 - time (sec): 30.12 - samples/sec: 2720.79 - lr: 0.000019 - momentum: 0.000000
2023-10-17 11:28:35,340 epoch 7 - iter 594/992 - loss 0.01985550 - time (sec): 36.08 - samples/sec: 2728.57 - lr: 0.000019 - momentum: 0.000000
2023-10-17 11:28:41,393 epoch 7 - iter 693/992 - loss 0.02067558 - time (sec): 42.13 - samples/sec: 2724.08 - lr: 0.000018 - momentum: 0.000000
2023-10-17 11:28:47,206 epoch 7 - iter 792/992 - loss 0.02093216 - time (sec): 47.94 - samples/sec: 2718.22 - lr: 0.000018 - momentum: 0.000000
2023-10-17 11:28:53,073 epoch 7 - iter 891/992 - loss 0.02197206 - time (sec): 53.81 - samples/sec: 2739.75 - lr: 0.000017 - momentum: 0.000000
2023-10-17 11:28:58,840 epoch 7 - iter 990/992 - loss 0.02197388 - time (sec): 59.58 - samples/sec: 2747.71 - lr: 0.000017 - momentum: 0.000000
2023-10-17 11:28:58,961 ----------------------------------------------------------------------------------------------------
2023-10-17 11:28:58,961 EPOCH 7 done: loss 0.0219 - lr: 0.000017
2023-10-17 11:29:02,566 DEV : loss 0.1898750215768814 - f1-score (micro avg) 0.7665
2023-10-17 11:29:02,594 ----------------------------------------------------------------------------------------------------
2023-10-17 11:29:08,347 epoch 8 - iter 99/992 - loss 0.00858606 - time (sec): 5.75 - samples/sec: 2846.11 - lr: 0.000016 - momentum: 0.000000
2023-10-17 11:29:14,064 epoch 8 - iter 198/992 - loss 0.01171469 - time (sec): 11.47 - samples/sec: 2824.80 - lr: 0.000016 - momentum: 0.000000
2023-10-17 11:29:20,281 epoch 8 - iter 297/992 - loss 0.01211733 - time (sec): 17.69 - samples/sec: 2828.38 - lr: 0.000015 - momentum: 0.000000
2023-10-17 11:29:26,152 epoch 8 - iter 396/992 - loss 0.01169790 - time (sec): 23.56 - samples/sec: 2805.84 - lr: 0.000014 - momentum: 0.000000
2023-10-17 11:29:32,084 epoch 8 - iter 495/992 - loss 0.01211186 - time (sec): 29.49 - samples/sec: 2821.28 - lr: 0.000014 - momentum: 0.000000
2023-10-17 11:29:38,180 epoch 8 - iter 594/992 - loss 0.01413340 - time (sec): 35.58 - samples/sec: 2807.68 - lr: 0.000013 - momentum: 0.000000
2023-10-17 11:29:43,898 epoch 8 - iter 693/992 - loss 0.01436134 - time (sec): 41.30 - samples/sec: 2798.96 - lr: 0.000013 - momentum: 0.000000
2023-10-17 11:29:49,496 epoch 8 - iter 792/992 - loss 0.01496934 - time (sec): 46.90 - samples/sec: 2784.54 - lr: 0.000012 - momentum: 0.000000
2023-10-17 11:29:55,299 epoch 8 - iter 891/992 - loss 0.01507510 - time (sec): 52.70 - samples/sec: 2793.17 - lr: 0.000012 - momentum: 0.000000
2023-10-17 11:30:01,345 epoch 8 - iter 990/992 - loss 0.01559016 - time (sec): 58.75 - samples/sec: 2785.33 - lr: 0.000011 - momentum: 0.000000
2023-10-17 11:30:01,478 ----------------------------------------------------------------------------------------------------
2023-10-17 11:30:01,478 EPOCH 8 done: loss 0.0156 - lr: 0.000011
2023-10-17 11:30:05,096 DEV : loss 0.21736501157283783 - f1-score (micro avg) 0.7736
2023-10-17 11:30:05,127 ----------------------------------------------------------------------------------------------------
2023-10-17 11:30:11,217 epoch 9 - iter 99/992 - loss 0.01189736 - time (sec): 6.09 - samples/sec: 2601.72 - lr: 0.000011 - momentum: 0.000000
2023-10-17 11:30:17,689 epoch 9 - iter 198/992 - loss 0.01120784 - time (sec): 12.56 - samples/sec: 2643.77 - lr: 0.000010 - momentum: 0.000000
2023-10-17 11:30:23,899 epoch 9 - iter 297/992 - loss 0.01112974 - time (sec): 18.77 - samples/sec: 2685.89 - lr: 0.000009 - momentum: 0.000000
2023-10-17 11:30:29,897 epoch 9 - iter 396/992 - loss 0.00969053 - time (sec): 24.77 - samples/sec: 2692.32 - lr: 0.000009 - momentum: 0.000000
2023-10-17 11:30:35,679 epoch 9 - iter 495/992 - loss 0.00960030 - time (sec): 30.55 - samples/sec: 2702.51 - lr: 0.000008 - momentum: 0.000000
2023-10-17 11:30:41,692 epoch 9 - iter 594/992 - loss 0.00957094 - time (sec): 36.56 - samples/sec: 2700.06 - lr: 0.000008 - momentum: 0.000000
2023-10-17 11:30:47,567 epoch 9 - iter 693/992 - loss 0.00941102 - time (sec): 42.44 - samples/sec: 2704.48 - lr: 0.000007 - momentum: 0.000000
2023-10-17 11:30:53,670 epoch 9 - iter 792/992 - loss 0.01010825 - time (sec): 48.54 - samples/sec: 2709.90 - lr: 0.000007 - momentum: 0.000000
2023-10-17 11:30:59,631 epoch 9 - iter 891/992 - loss 0.01035941 - time (sec): 54.50 - samples/sec: 2710.26 - lr: 0.000006 - momentum: 0.000000
2023-10-17 11:31:05,562 epoch 9 - iter 990/992 - loss 0.01067263 - time (sec): 60.43 - samples/sec: 2708.56 - lr: 0.000006 - momentum: 0.000000
2023-10-17 11:31:05,685 ----------------------------------------------------------------------------------------------------
2023-10-17 11:31:05,685 EPOCH 9 done: loss 0.0107 - lr: 0.000006
2023-10-17 11:31:09,296 DEV : loss 0.2289990335702896 - f1-score (micro avg) 0.7598
2023-10-17 11:31:09,320 ----------------------------------------------------------------------------------------------------
2023-10-17 11:31:15,610 epoch 10 - iter 99/992 - loss 0.00433310 - time (sec): 6.29 - samples/sec: 2625.70 - lr: 0.000005 - momentum: 0.000000
2023-10-17 11:31:21,873 epoch 10 - iter 198/992 - loss 0.00533994 - time (sec): 12.55 - samples/sec: 2561.60 - lr: 0.000004 - momentum: 0.000000
2023-10-17 11:31:28,202 epoch 10 - iter 297/992 - loss 0.00637622 - time (sec): 18.88 - samples/sec: 2575.30 - lr: 0.000004 - momentum: 0.000000
2023-10-17 11:31:34,426 epoch 10 - iter 396/992 - loss 0.00717457 - time (sec): 25.10 - samples/sec: 2581.66 - lr: 0.000003 - momentum: 0.000000
2023-10-17 11:31:40,384 epoch 10 - iter 495/992 - loss 0.00716627 - time (sec): 31.06 - samples/sec: 2593.97 - lr: 0.000003 - momentum: 0.000000
2023-10-17 11:31:46,392 epoch 10 - iter 594/992 - loss 0.00678007 - time (sec): 37.07 - samples/sec: 2628.76 - lr: 0.000002 - momentum: 0.000000
2023-10-17 11:31:52,420 epoch 10 - iter 693/992 - loss 0.00705269 - time (sec): 43.10 - samples/sec: 2659.76 - lr: 0.000002 - momentum: 0.000000
2023-10-17 11:31:58,173 epoch 10 - iter 792/992 - loss 0.00695957 - time (sec): 48.85 - samples/sec: 2683.39 - lr: 0.000001 - momentum: 0.000000
2023-10-17 11:32:04,222 epoch 10 - iter 891/992 - loss 0.00746367 - time (sec): 54.90 - samples/sec: 2676.79 - lr: 0.000001 - momentum: 0.000000
2023-10-17 11:32:10,346 epoch 10 - iter 990/992 - loss 0.00797105 - time (sec): 61.02 - samples/sec: 2682.68 - lr: 0.000000 - momentum: 0.000000
2023-10-17 11:32:10,450 ----------------------------------------------------------------------------------------------------
2023-10-17 11:32:10,450 EPOCH 10 done: loss 0.0080 - lr: 0.000000
2023-10-17 11:32:15,038 DEV : loss 0.23297961056232452 - f1-score (micro avg) 0.7648
2023-10-17 11:32:15,550 ----------------------------------------------------------------------------------------------------
2023-10-17 11:32:15,552 Loading model from best epoch ...
2023-10-17 11:32:17,148 SequenceTagger predicts: Dictionary with 13 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG
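The 13 tags above follow the BIOES scheme (Single, Begin, Inside, End per entity type, plus O). Decoding such a tag sequence back into entity spans can be sketched as follows (a simplified decoder; Flair's own span extraction additionally handles malformed sequences):

```python
def bioes_to_spans(tags):
    """Turn a BIOES tag sequence into (label, start, end_exclusive) spans."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags):
        if tag == "O":
            start = label = None
            continue
        prefix, _, name = tag.partition("-")
        if prefix == "S":                      # single-token entity
            spans.append((name, i, i + 1))
            start = label = None
        elif prefix == "B":                    # entity begins
            start, label = i, name
        elif prefix == "E" and label == name:  # entity ends
            spans.append((name, start, i + 1))
            start = label = None
        # "I" tokens simply continue the open span
    return spans

print(bioes_to_spans(["O", "S-PER", "B-LOC", "I-LOC", "E-LOC"]))
# [('PER', 1, 2), ('LOC', 2, 5)]
```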
2023-10-17 11:32:21,019
Results:
- F-score (micro) 0.7587
- F-score (macro) 0.6925
- Accuracy 0.6447
By class:
              precision    recall  f1-score   support

         LOC     0.8560    0.7893    0.8213       655
         PER     0.6335    0.7982    0.7063       223
         ORG     0.5565    0.5433    0.5498       127

   micro avg     0.7572    0.7602    0.7587      1005
   macro avg     0.6820    0.7103    0.6925      1005
weighted avg     0.7687    0.7602    0.7615      1005
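The aggregate rows follow directly from the per-class values: macro averages are unweighted means over the three classes, the weighted average weights each class by its support, and micro-F1 is the harmonic mean of micro precision and recall. A quick arithmetic check against the report:

```python
# (precision, recall, f1, support) per class, copied from the evaluation report
classes = {
    "LOC": (0.8560, 0.7893, 0.8213, 655),
    "PER": (0.6335, 0.7982, 0.7063, 223),
    "ORG": (0.5565, 0.5433, 0.5498, 127),
}

macro_f1 = sum(f1 for _, _, f1, _ in classes.values()) / len(classes)
support = sum(s for *_, s in classes.values())
weighted_f1 = sum(f1 * s for _, _, f1, s in classes.values()) / support

micro_p, micro_r = 0.7572, 0.7602
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)  # harmonic mean

print(round(macro_f1, 4), round(weighted_f1, 4), round(micro_f1, 4))
# 0.6925 0.7615 0.7587
```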
2023-10-17 11:32:21,020 ----------------------------------------------------------------------------------------------------