| distributed init (rank 1): env://, gpu 1
| distributed init (rank 0): env://, gpu 0
[16:05:36.402158] job dir: /mnt/localDisk2/wgj/FSFM-3C/codespace/fsfm-3c/finuetune/cross_dataset_DfD
[16:05:36.402448] Namespace(aa='rand-m9-mstd0.5-inc1', accum_iter=1, apply_simple_augment=True, batch_size=32, blr=0.00025, clip_grad=None, color_jitter=None, cutmix=1.0, cutmix_minmax=None, data_path='../../../datasets/finetune_datasets/deepfakes_detection/FaceForensics/32_frames/DS_FF++_all_cls/c23', dataset_abs_path=None, device='cuda', dist_backend='nccl', dist_eval=True, dist_on_itp=False, dist_url='env://', distributed=True, drop_path=0.1, epochs=10, eval=False, finetune='../../pretrain/checkpoint/pretrained_models/VF2_ViT-B/checkpoint-400.pth', global_pool=True, gpu=0, input_size=224, layer_decay=0.65, local_rank=0, log_dir='./checkpoint/finetuned_models/FF++_c23_32frames', lr=None, min_lr=1e-06, mixup=0.8, mixup_mode='batch', mixup_prob=1.0, mixup_switch_prob=0.5, model='vit_base_patch16', nb_classes=2, normalize_from_IMN=False, num_workers=10, output_dir='./checkpoint/finetuned_models/FF++_c23_32frames', pin_mem=True, rank=0, recount=1, remode='pixel', reprob=0.25, resplit=False, resume='', seed=0, smoothing=0.1, start_epoch=0, train_split=None, val_split=None, warmup_epochs=5, weight_decay=0.05, world_size=2)
[16:05:37.133613] Dataset ImageFolder
    Number of datapoints: 184185
    Root location: ../../../datasets/finetune_datasets/deepfakes_detection/FaceForensics/32_frames/DS_FF++_all_cls/c23/train
    StandardTransform
Transform: Compose(
               RandomResizedCropAndInterpolation(size=(224, 224), scale=(0.08, 1.0), ratio=(0.75, 1.3333), interpolation=PIL.Image.BILINEAR)
               RandomHorizontalFlip(p=0.5)
               ToTensor()
               Normalize(mean=tensor([0.5482, 0.4234, 0.3655]), std=tensor([0.2789, 0.2439, 0.2349]))
           )
[16:05:37.276832] Dataset ImageFolder
    Number of datapoints: 35796
    Root location: ../../../datasets/finetune_datasets/deepfakes_detection/FaceForensics/32_frames/DS_FF++_all_cls/c23/val
    StandardTransform
Transform: Compose(
               Resize(size=256, interpolation=bicubic, max_size=None, antialias=None)
               CenterCrop(size=(224, 224))
               ToTensor()
               Normalize(mean=[0.5482207536697388, 0.42340534925460815, 0.3654651641845703], std=[0.2789176106452942, 0.2438540756702423, 0.23493893444538116])
           )
[16:05:37.277033] len(dataset_train): 184185
[16:05:37.277047] len(dataset_val): 35796
[16:05:37.277117] Sampler_train =
[16:05:37.277197] [INFO] log dir: ./checkpoint/finetuned_models/FF++_c23_32frames
[16:05:37.279776] Mixup is activated!
[16:05:46.767281] Load pre-trained checkpoint from: ../../pretrain/checkpoint/pretrained_models/VF2_ViT-B/checkpoint-400.pth
[16:05:46.982690] _IncompatibleKeys(
    missing_keys=['head.weight', 'head.bias', 'fc_norm.weight', 'fc_norm.bias'],
    unexpected_keys=['mask_token', 'rep_decoder_pos_embed', 'decoder_pos_embed', 'norm.weight', 'norm.bias',
        'projector.projection_head.0.weight', 'projector.projection_head.0.bias', 'projector.projection_head.1.weight', 'projector.projection_head.1.bias', 'projector.projection_head.3.weight', 'projector.projection_head.3.bias',
        'predictor.projection_head.0.weight', 'predictor.projection_head.0.bias', 'predictor.projection_head.1.weight', 'predictor.projection_head.1.bias', 'predictor.projection_head.3.weight', 'predictor.projection_head.3.bias',
        'rep_decoder_embed.weight', 'rep_decoder_embed.bias',
        'rep_decoder_blocks.0.norm1.weight', 'rep_decoder_blocks.0.norm1.bias', 'rep_decoder_blocks.0.attn.qkv.weight', 'rep_decoder_blocks.0.attn.qkv.bias', 'rep_decoder_blocks.0.attn.proj.weight', 'rep_decoder_blocks.0.attn.proj.bias', 'rep_decoder_blocks.0.norm2.weight', 'rep_decoder_blocks.0.norm2.bias', 'rep_decoder_blocks.0.mlp.fc1.weight', 'rep_decoder_blocks.0.mlp.fc1.bias', 'rep_decoder_blocks.0.mlp.fc2.weight', 'rep_decoder_blocks.0.mlp.fc2.bias',
        'rep_decoder_blocks.1.norm1.weight', 'rep_decoder_blocks.1.norm1.bias', 'rep_decoder_blocks.1.attn.qkv.weight', 'rep_decoder_blocks.1.attn.qkv.bias', 'rep_decoder_blocks.1.attn.proj.weight', 'rep_decoder_blocks.1.attn.proj.bias', 'rep_decoder_blocks.1.norm2.weight', 'rep_decoder_blocks.1.norm2.bias', 'rep_decoder_blocks.1.mlp.fc1.weight', 'rep_decoder_blocks.1.mlp.fc1.bias', 'rep_decoder_blocks.1.mlp.fc2.weight', 'rep_decoder_blocks.1.mlp.fc2.bias',
        'rep_decoder_norm.weight', 'rep_decoder_norm.bias', 'rep_decoder_pred.weight', 'rep_decoder_pred.bias',
        'decoder_embed.weight', 'decoder_embed.bias',
        'decoder_blocks.0.norm1.weight', 'decoder_blocks.0.norm1.bias', 'decoder_blocks.0.attn.qkv.weight', 'decoder_blocks.0.attn.qkv.bias', 'decoder_blocks.0.attn.proj.weight', 'decoder_blocks.0.attn.proj.bias', 'decoder_blocks.0.norm2.weight', 'decoder_blocks.0.norm2.bias', 'decoder_blocks.0.mlp.fc1.weight', 'decoder_blocks.0.mlp.fc1.bias', 'decoder_blocks.0.mlp.fc2.weight', 'decoder_blocks.0.mlp.fc2.bias',
        'decoder_blocks.1.norm1.weight', 'decoder_blocks.1.norm1.bias', 'decoder_blocks.1.attn.qkv.weight', 'decoder_blocks.1.attn.qkv.bias', 'decoder_blocks.1.attn.proj.weight', 'decoder_blocks.1.attn.proj.bias', 'decoder_blocks.1.norm2.weight', 'decoder_blocks.1.norm2.bias', 'decoder_blocks.1.mlp.fc1.weight', 'decoder_blocks.1.mlp.fc1.bias', 'decoder_blocks.1.mlp.fc2.weight', 'decoder_blocks.1.mlp.fc2.bias',
        'decoder_blocks.2.norm1.weight', 'decoder_blocks.2.norm1.bias', 'decoder_blocks.2.attn.qkv.weight', 'decoder_blocks.2.attn.qkv.bias', 'decoder_blocks.2.attn.proj.weight', 'decoder_blocks.2.attn.proj.bias', 'decoder_blocks.2.norm2.weight', 'decoder_blocks.2.norm2.bias', 'decoder_blocks.2.mlp.fc1.weight', 'decoder_blocks.2.mlp.fc1.bias', 'decoder_blocks.2.mlp.fc2.weight', 'decoder_blocks.2.mlp.fc2.bias',
        'decoder_blocks.3.norm1.weight', 'decoder_blocks.3.norm1.bias', 'decoder_blocks.3.attn.qkv.weight', 'decoder_blocks.3.attn.qkv.bias', 'decoder_blocks.3.attn.proj.weight', 'decoder_blocks.3.attn.proj.bias', 'decoder_blocks.3.norm2.weight', 'decoder_blocks.3.norm2.bias', 'decoder_blocks.3.mlp.fc1.weight', 'decoder_blocks.3.mlp.fc1.bias', 'decoder_blocks.3.mlp.fc2.weight', 'decoder_blocks.3.mlp.fc2.bias',
        'decoder_blocks.4.norm1.weight', 'decoder_blocks.4.norm1.bias', 'decoder_blocks.4.attn.qkv.weight', 'decoder_blocks.4.attn.qkv.bias', 'decoder_blocks.4.attn.proj.weight', 'decoder_blocks.4.attn.proj.bias', 'decoder_blocks.4.norm2.weight', 'decoder_blocks.4.norm2.bias', 'decoder_blocks.4.mlp.fc1.weight', 'decoder_blocks.4.mlp.fc1.bias', 'decoder_blocks.4.mlp.fc2.weight', 'decoder_blocks.4.mlp.fc2.bias',
        'decoder_blocks.5.norm1.weight', 'decoder_blocks.5.norm1.bias', 'decoder_blocks.5.attn.qkv.weight', 'decoder_blocks.5.attn.qkv.bias', 'decoder_blocks.5.attn.proj.weight', 'decoder_blocks.5.attn.proj.bias', 'decoder_blocks.5.norm2.weight', 'decoder_blocks.5.norm2.bias', 'decoder_blocks.5.mlp.fc1.weight', 'decoder_blocks.5.mlp.fc1.bias', 'decoder_blocks.5.mlp.fc2.weight', 'decoder_blocks.5.mlp.fc2.bias',
        'decoder_blocks.6.norm1.weight', 'decoder_blocks.6.norm1.bias', 'decoder_blocks.6.attn.qkv.weight', 'decoder_blocks.6.attn.qkv.bias', 'decoder_blocks.6.attn.proj.weight', 'decoder_blocks.6.attn.proj.bias', 'decoder_blocks.6.norm2.weight', 'decoder_blocks.6.norm2.bias', 'decoder_blocks.6.mlp.fc1.weight', 'decoder_blocks.6.mlp.fc1.bias', 'decoder_blocks.6.mlp.fc2.weight', 'decoder_blocks.6.mlp.fc2.bias',
        'decoder_blocks.7.norm1.weight', 'decoder_blocks.7.norm1.bias', 'decoder_blocks.7.attn.qkv.weight', 'decoder_blocks.7.attn.qkv.bias', 'decoder_blocks.7.attn.proj.weight', 'decoder_blocks.7.attn.proj.bias', 'decoder_blocks.7.norm2.weight', 'decoder_blocks.7.norm2.bias', 'decoder_blocks.7.mlp.fc1.weight', 'decoder_blocks.7.mlp.fc1.bias', 'decoder_blocks.7.mlp.fc2.weight', 'decoder_blocks.7.mlp.fc2.bias',
        'decoder_norm.weight', 'decoder_norm.bias', 'decoder_pred.weight', 'decoder_pred.bias'])
[16:05:48.573224]
==========================================================================================
Layer (type:depth-idx)                   Output Shape              Param #
==========================================================================================
├─PatchEmbed: 1-1                        [-1, 196, 768]            --
|    └─Conv2d: 2-1                       [-1, 768, 14, 14]         590,592
├─Dropout: 1-2                           [-1, 197, 768]            --
├─ModuleList: 1                          []                        --
|    └─Block: 2-2                        [-1, 197, 768]            --
|    |    └─LayerNorm: 3-1               [-1, 197, 768]            1,536
|    |    └─Attention: 3-2               [-1, 197, 768]            2,362,368
|    |    └─Identity: 3-3                [-1, 197, 768]            --
|    |    └─LayerNorm: 3-4               [-1, 197, 768]            1,536
|    |    └─Mlp: 3-5                     [-1, 197, 768]            4,722,432
|    |    └─Identity: 3-6                [-1, 197, 768]            --
|    └─Block: 2-3                        [-1, 197, 768]            --
|    |    └─LayerNorm: 3-7               [-1, 197, 768]            1,536
|    |    └─Attention: 3-8               [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-9                [-1, 197, 768]            --
|    |    └─LayerNorm: 3-10              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-11                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-12               [-1, 197, 768]            --
|    └─Block: 2-4                        [-1, 197, 768]            --
|    |    └─LayerNorm: 3-13              [-1, 197, 768]            1,536
|    |    └─Attention: 3-14              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-15               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-16              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-17                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-18               [-1, 197, 768]            --
|    └─Block: 2-5                        [-1, 197, 768]            --
|    |    └─LayerNorm: 3-19              [-1, 197, 768]            1,536
|    |    └─Attention: 3-20              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-21               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-22              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-23                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-24               [-1, 197, 768]            --
|    └─Block: 2-6                        [-1, 197, 768]            --
|    |    └─LayerNorm: 3-25              [-1, 197, 768]            1,536
|    |    └─Attention: 3-26              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-27               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-28              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-29                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-30               [-1, 197, 768]            --
|    └─Block: 2-7                        [-1, 197, 768]            --
|    |    └─LayerNorm: 3-31              [-1, 197, 768]            1,536
|    |    └─Attention: 3-32              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-33               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-34              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-35                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-36               [-1, 197, 768]            --
|    └─Block: 2-8                        [-1, 197, 768]            --
|    |    └─LayerNorm: 3-37              [-1, 197, 768]            1,536
|    |    └─Attention: 3-38              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-39               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-40              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-41                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-42               [-1, 197, 768]            --
|    └─Block: 2-9                        [-1, 197, 768]            --
|    |    └─LayerNorm: 3-43              [-1, 197, 768]            1,536
|    |    └─Attention: 3-44              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-45               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-46              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-47                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-48               [-1, 197, 768]            --
|    └─Block: 2-10                       [-1, 197, 768]            --
|    |    └─LayerNorm: 3-49              [-1, 197, 768]            1,536
|    |    └─Attention: 3-50              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-51               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-52              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-53                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-54               [-1, 197, 768]            --
|    └─Block: 2-11                       [-1, 197, 768]            --
|    |    └─LayerNorm: 3-55              [-1, 197, 768]            1,536
|    |    └─Attention: 3-56              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-57               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-58              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-59                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-60               [-1, 197, 768]            --
|    └─Block: 2-12                       [-1, 197, 768]            --
|    |    └─LayerNorm: 3-61              [-1, 197, 768]            1,536
|    |    └─Attention: 3-62              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-63               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-64              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-65                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-66               [-1, 197, 768]            --
|    └─Block: 2-13                       [-1, 197, 768]            --
|    |    └─LayerNorm: 3-67              [-1, 197, 768]            1,536
|    |    └─Attention: 3-68              [-1, 197, 768]            2,362,368
|    |    └─DropPath: 3-69               [-1, 197, 768]            --
|    |    └─LayerNorm: 3-70              [-1, 197, 768]            1,536
|    |    └─Mlp: 3-71                    [-1, 197, 768]            4,722,432
|    |    └─DropPath: 3-72               [-1, 197, 768]            --
├─LayerNorm: 1-3                         [-1, 768]                 1,536
├─Linear: 1-4                            [-1, 2]                   1,538
==========================================================================================
Total params: 85,648,130
Trainable params: 85,648,130
Non-trainable params: 0
Total mult-adds (M): 371.04
==========================================================================================
Input size (MB): 0.57
Forward/backward pass size (MB): 153.52
Params size (MB): 326.72
Estimated Total Size (MB): 480.82
==========================================================================================
[16:05:48.576649] Model = VisionTransformer(
  (patch_embed): PatchEmbed(
    (proj): Conv2d(3, 768, kernel_size=(16, 16), stride=(16, 16))
  )
  (pos_drop): Dropout(p=0.0, inplace=False)
  (blocks): ModuleList(
    (0): Block(
      (norm1): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (attn): Attention(
        (qkv): Linear(in_features=768, out_features=2304, bias=True)
        (attn_drop): Dropout(p=0.0, inplace=False)
        (proj): Linear(in_features=768, out_features=768, bias=True)
        (proj_drop): Dropout(p=0.0, inplace=False)
      )
      (drop_path): Identity()
      (norm2): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (mlp): Mlp(
        (fc1): Linear(in_features=768, out_features=3072, bias=True)
        (act): GELU(approximate='none')
        (fc2): Linear(in_features=3072, out_features=768, bias=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
    )
    (1): Block(
      (norm1): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (attn): Attention(
        (qkv): Linear(in_features=768, out_features=2304, bias=True)
        (attn_drop): Dropout(p=0.0, inplace=False)
        (proj): Linear(in_features=768, out_features=768, bias=True)
        (proj_drop): Dropout(p=0.0, inplace=False)
      )
      (drop_path): DropPath()
      (norm2): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (mlp): Mlp(
        (fc1): Linear(in_features=768, out_features=3072, bias=True)
        (act): GELU(approximate='none')
        (fc2): Linear(in_features=3072, out_features=768, bias=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
    )
    (2): Block(
      (norm1): LayerNorm((768,),
                           eps=1e-06, elementwise_affine=True)
      (attn): Attention(
        (qkv): Linear(in_features=768, out_features=2304, bias=True)
        (attn_drop): Dropout(p=0.0, inplace=False)
        (proj): Linear(in_features=768, out_features=768, bias=True)
        (proj_drop): Dropout(p=0.0, inplace=False)
      )
      (drop_path): DropPath()
      (norm2): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (mlp): Mlp(
        (fc1): Linear(in_features=768, out_features=3072, bias=True)
        (act): GELU(approximate='none')
        (fc2): Linear(in_features=3072, out_features=768, bias=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
    )
    (3-11): 9 x Block(
      (norm1): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (attn): Attention(
        (qkv): Linear(in_features=768, out_features=2304, bias=True)
        (attn_drop): Dropout(p=0.0, inplace=False)
        (proj): Linear(in_features=768, out_features=768, bias=True)
        (proj_drop): Dropout(p=0.0, inplace=False)
      )
      (drop_path): DropPath()
      (norm2): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (mlp): Mlp(
        (fc1): Linear(in_features=768, out_features=3072, bias=True)
        (act): GELU(approximate='none')
        (fc2): Linear(in_features=3072, out_features=768, bias=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
    )
  )
  (head): Linear(in_features=768, out_features=2, bias=True)
  (fc_norm): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
)
[16:05:48.576711] number of params (M): 85.80
[16:05:48.576729] base lr: 2.50e-04
[16:05:48.576738] actual lr: 6.25e-05
[16:05:48.576746] accumulate grad iterations: 1
[16:05:48.576753] effective batch size: 64
[16:05:48.606357] criterion = SoftTargetCrossEntropy()
[16:05:48.606387] Start training for 10 epochs
[16:05:48.607169] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames
[16:05:50.450515] Epoch: [0] [   0/2877] eta: 1:28:20 lr: 0.000000 loss: 0.6932 (0.6932) time: 1.8423 data: 1.0685 max mem: 4308
[16:06:04.032412] Epoch: [0] [ 100/2877] eta: 0:07:04 lr: 0.000000 loss: 0.6931 (0.6931) time: 0.1362 data: 0.0002 max mem: 5308
[16:06:17.523887] Epoch: [0] [ 200/2877] eta: 0:06:25 lr: 0.000001 loss: 0.6931 (0.6931) time: 0.1355 data: 0.0001 max mem: 5308
[16:06:30.957329] Epoch: [0] [ 300/2877] eta: 0:06:02 lr: 0.000001 loss: 0.6931 (0.6931) time: 0.1333 data: 0.0001 max mem: 5308
[16:06:44.319783] Epoch: [0] [ 400/2877] eta: 0:05:44 lr: 0.000002 loss: 0.6931 (0.6931) time: 0.1337 data: 0.0001 max mem: 5308
[16:06:57.975133] Epoch: [0] [ 500/2877] eta: 0:05:29 lr: 0.000002 loss: 0.6930 (0.6931) time: 0.1359 data: 0.0001 max mem: 5308
[16:07:11.632413] Epoch: [0] [ 600/2877] eta: 0:05:14 lr: 0.000003 loss: 0.6933 (0.6931) time: 0.1353 data: 0.0002 max mem: 5308
[16:07:25.178422] Epoch: [0] [ 700/2877] eta: 0:04:59 lr: 0.000003 loss: 0.6929 (0.6931) time: 0.1349 data: 0.0002 max mem: 5308
[16:07:38.707895] Epoch: [0] [ 800/2877] eta: 0:04:45 lr: 0.000003 loss: 0.6929 (0.6931) time: 0.1363 data: 0.0002 max mem: 5308
[16:07:52.301692] Epoch: [0] [ 900/2877] eta: 0:04:31 lr: 0.000004 loss: 0.6925 (0.6930) time: 0.1366 data: 0.0002 max mem: 5308
[16:08:06.004011] Epoch: [0] [1000/2877] eta: 0:04:17 lr: 0.000004 loss: 0.6929 (0.6930) time: 0.1365 data: 0.0002 max mem: 5308
[16:08:19.676187] Epoch: [0] [1100/2877] eta: 0:04:03 lr: 0.000005 loss: 0.6929 (0.6930) time: 0.1375 data: 0.0002 max mem: 5308
[16:08:33.261768] Epoch: [0] [1200/2877] eta: 0:03:49 lr: 0.000005 loss: 0.6928 (0.6930) time: 0.1351 data: 0.0001 max mem: 5308
[16:08:46.775194] Epoch: [0] [1300/2877] eta: 0:03:35 lr: 0.000006 loss: 0.6941 (0.6930) time: 0.1363 data: 0.0001 max mem: 5308
[16:09:00.521338] Epoch: [0] [1400/2877] eta: 0:03:22 lr: 0.000006 loss: 0.6927 (0.6929) time: 0.1384 data: 0.0002 max mem: 5308
[16:09:14.246663] Epoch: [0] [1500/2877] eta: 0:03:08 lr: 0.000007 loss: 0.6940 (0.6929) time: 0.1358 data: 0.0002 max mem: 5308
[16:09:27.850870] Epoch: [0] [1600/2877] eta: 0:02:54 lr: 0.000007 loss: 0.6919 (0.6928) time: 0.1357 data: 0.0002 max mem: 5308
[16:09:41.522406] Epoch: [0] [1700/2877] eta: 0:02:41 lr: 0.000007 loss: 0.6922 (0.6927) time: 0.1376 data: 0.0002 max mem: 5308
[16:09:55.123865] Epoch: [0] [1800/2877] eta: 0:02:27 lr: 0.000008 loss: 0.6916 (0.6926) time: 0.1350 data: 0.0001 max mem: 5308
[16:10:08.636991] Epoch: [0] [1900/2877] eta: 0:02:13 lr: 0.000008 loss: 0.6898 (0.6926) time: 0.1356 data: 0.0002 max mem: 5308
[16:10:22.310937] Epoch: [0] [2000/2877] eta: 0:01:59 lr: 0.000009 loss: 0.6908 (0.6925) time: 0.1373 data: 0.0002 max mem: 5308
[16:10:36.033887] Epoch: [0] [2100/2877] eta: 0:01:46 lr: 0.000009 loss: 0.6919 (0.6924) time: 0.1380 data: 0.0001 max mem: 5308
[16:10:49.725345] Epoch: [0] [2200/2877] eta: 0:01:32 lr: 0.000010 loss: 0.6892 (0.6922) time: 0.1372 data: 0.0002 max mem: 5308
[16:11:03.413572] Epoch: [0] [2300/2877] eta: 0:01:18 lr: 0.000010 loss: 0.6893 (0.6921) time: 0.1385 data: 0.0001 max mem: 5308
[16:11:17.136471] Epoch: [0] [2400/2877] eta: 0:01:05 lr: 0.000010 loss: 0.6832 (0.6919) time: 0.1371 data: 0.0002 max mem: 5308
[16:11:30.897210] Epoch: [0] [2500/2877] eta: 0:00:51 lr: 0.000011 loss: 0.6892 (0.6918) time: 0.1387 data: 0.0002 max mem: 5308
[16:11:44.587740] Epoch: [0] [2600/2877] eta: 0:00:37 lr: 0.000011 loss: 0.6890 (0.6917) time: 0.1381 data: 0.0002 max mem: 5308
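The tiny per-iteration lr values above follow from the linear-scaling rule used in MAE-style finetuning: with blr=0.00025 and an effective batch of 32 per GPU × 2 GPUs × 1 accumulation step = 64, the actual lr is 0.00025 × 64 / 256 = 6.25e-05, and the first 5 of 10 epochs ramp it linearly from zero, so epoch 0 never exceeds ~1.2e-05. A minimal sketch of that arithmetic (function names are illustrative, not taken from the training code):

```python
import math

def scaled_lr(blr: float, batch_size: int, world_size: int, accum_iter: int) -> float:
    # Linear scaling rule: lr = base_lr * effective_batch_size / 256
    return blr * (batch_size * world_size * accum_iter) / 256

def adjust_lr(lr: float, min_lr: float, epoch: float,
              warmup_epochs: int, total_epochs: int) -> float:
    # Linear warmup for the first `warmup_epochs`, then half-cycle cosine decay
    # down to `min_lr`. `epoch` may be fractional: epoch + iteration / iters_per_epoch.
    if epoch < warmup_epochs:
        return lr * epoch / warmup_epochs
    t = (epoch - warmup_epochs) / (total_epochs - warmup_epochs)
    return min_lr + (lr - min_lr) * 0.5 * (1.0 + math.cos(math.pi * t))

actual_lr = scaled_lr(blr=0.00025, batch_size=32, world_size=2, accum_iter=1)
print(actual_lr)  # 6.25e-05, the "actual lr" printed in the log
# Warmup value partway through epoch 0, cf. "lr: 0.000011" around iter 2600:
print(adjust_lr(actual_lr, min_lr=1e-06, epoch=2600 / 2877, warmup_epochs=5, total_epochs=10))
```

The same schedule explains why epoch 1 starts around lr 0.000013: warmup continues until epoch 5, after which the cosine half-cycle takes over.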
[16:11:58.272199] Epoch: [0] [2700/2877] eta: 0:00:24 lr: 0.000012 loss: 0.6835 (0.6915) time: 0.1378 data: 0.0002 max mem: 5308 [16:12:11.958213] Epoch: [0] [2800/2877] eta: 0:00:10 lr: 0.000012 loss: 0.6845 (0.6914) time: 0.1355 data: 0.0001 max mem: 5308 [16:12:22.285649] Epoch: [0] [2876/2877] eta: 0:00:00 lr: 0.000012 loss: 0.6867 (0.6913) time: 0.1363 data: 0.0002 max mem: 5308 [16:12:22.478694] Epoch: [0] Total time: 0:06:33 (0.1369 s / it) [16:12:22.505555] Averaged stats: lr: 0.000012 loss: 0.6867 (0.6914) [16:12:24.049852] Test: [ 0/560] eta: 0:14:22 loss: 0.6874 (0.6874) auc: 65.8824 (65.8824) time: 1.5410 data: 1.5043 max mem: 5308 [16:12:24.606570] Test: [ 10/560] eta: 0:01:44 loss: 0.6699 (0.6707) auc: 74.2915 (75.0876) time: 0.1906 data: 0.1608 max mem: 5308 [16:12:25.281255] Test: [ 20/560] eta: 0:01:11 loss: 0.6682 (0.6698) auc: 80.7359 (79.9045) time: 0.0615 data: 0.0324 max mem: 5308 [16:12:26.055792] Test: [ 30/560] eta: 0:01:00 loss: 0.6613 (0.6669) auc: 84.3254 (81.4501) time: 0.0724 data: 0.0434 max mem: 5308 [16:12:26.956516] Test: [ 40/560] eta: 0:00:56 loss: 0.6635 (0.6669) auc: 82.5397 (80.9105) time: 0.0837 data: 0.0547 max mem: 5308 [16:12:28.036498] Test: [ 50/560] eta: 0:00:55 loss: 0.6705 (0.6675) auc: 82.0833 (81.5161) time: 0.0990 data: 0.0698 max mem: 5308 [16:12:28.658092] Test: [ 60/560] eta: 0:00:50 loss: 0.6636 (0.6668) auc: 81.8182 (81.1055) time: 0.0850 data: 0.0559 max mem: 5308 [16:12:29.369572] Test: [ 70/560] eta: 0:00:47 loss: 0.6630 (0.6662) auc: 81.8182 (81.3349) time: 0.0666 data: 0.0375 max mem: 5308 [16:12:29.667863] Test: [ 80/560] eta: 0:00:42 loss: 0.6582 (0.6662) auc: 81.8182 (81.1841) time: 0.0502 data: 0.0210 max mem: 5308 [16:12:29.961627] Test: [ 90/560] eta: 0:00:38 loss: 0.6635 (0.6661) auc: 81.3725 (81.0894) time: 0.0293 data: 0.0002 max mem: 5308 [16:12:30.255592] Test: [100/560] eta: 0:00:35 loss: 0.6651 (0.6666) auc: 80.2734 (80.9505) time: 0.0293 data: 0.0002 max mem: 5308 [16:12:30.737302] Test: 
[110/560] eta: 0:00:33 loss: 0.6591 (0.6660) auc: 80.7692 (81.1747) time: 0.0387 data: 0.0097 max mem: 5308 [16:12:31.030364] Test: [120/560] eta: 0:00:30 loss: 0.6592 (0.6666) auc: 82.3413 (80.8463) time: 0.0387 data: 0.0096 max mem: 5308 [16:12:31.323254] Test: [130/560] eta: 0:00:28 loss: 0.6714 (0.6671) auc: 80.5556 (80.7944) time: 0.0292 data: 0.0001 max mem: 5308 [16:12:31.616078] Test: [140/560] eta: 0:00:27 loss: 0.6726 (0.6678) auc: 81.3492 (80.7031) time: 0.0292 data: 0.0001 max mem: 5308 [16:12:32.499121] Test: [150/560] eta: 0:00:27 loss: 0.6670 (0.6674) auc: 80.0000 (80.6251) time: 0.0587 data: 0.0297 max mem: 5308 [16:12:33.498265] Test: [160/560] eta: 0:00:27 loss: 0.6597 (0.6674) auc: 80.0000 (80.6803) time: 0.0940 data: 0.0649 max mem: 5308 [16:12:33.844053] Test: [170/560] eta: 0:00:25 loss: 0.6586 (0.6673) auc: 80.4545 (80.6816) time: 0.0672 data: 0.0379 max mem: 5308 [16:12:34.136093] Test: [180/560] eta: 0:00:24 loss: 0.6663 (0.6679) auc: 78.5425 (80.5652) time: 0.0318 data: 0.0026 max mem: 5308 [16:12:34.775514] Test: [190/560] eta: 0:00:23 loss: 0.6724 (0.6679) auc: 79.5455 (80.5298) time: 0.0465 data: 0.0175 max mem: 5308 [16:12:35.105373] Test: [200/560] eta: 0:00:22 loss: 0.6635 (0.6675) auc: 81.7460 (80.6410) time: 0.0484 data: 0.0191 max mem: 5308 [16:12:35.399734] Test: [210/560] eta: 0:00:21 loss: 0.6644 (0.6676) auc: 80.4167 (80.5790) time: 0.0311 data: 0.0018 max mem: 5308 [16:12:36.013776] Test: [220/560] eta: 0:00:20 loss: 0.6683 (0.6677) auc: 77.5000 (80.5544) time: 0.0453 data: 0.0162 max mem: 5308 [16:12:36.435416] Test: [230/560] eta: 0:00:19 loss: 0.6665 (0.6677) auc: 81.7460 (80.5966) time: 0.0517 data: 0.0228 max mem: 5308 [16:12:37.072250] Test: [240/560] eta: 0:00:19 loss: 0.6651 (0.6679) auc: 81.9444 (80.6255) time: 0.0529 data: 0.0239 max mem: 5308 [16:12:37.892736] Test: [250/560] eta: 0:00:18 loss: 0.6668 (0.6677) auc: 83.1349 (80.8067) time: 0.0728 data: 0.0438 max mem: 5308 [16:12:38.564210] Test: [260/560] eta: 
0:00:18 loss: 0.6521 (0.6672) auc: 85.2941 (80.9234) time: 0.0745 data: 0.0454 max mem: 5308 [16:12:39.306146] Test: [270/560] eta: 0:00:17 loss: 0.6680 (0.6676) auc: 78.9683 (80.6967) time: 0.0706 data: 0.0412 max mem: 5308 [16:12:39.834238] Test: [280/560] eta: 0:00:17 loss: 0.6686 (0.6676) auc: 78.9683 (80.7285) time: 0.0634 data: 0.0341 max mem: 5308 [16:12:40.128630] Test: [290/560] eta: 0:00:16 loss: 0.6686 (0.6681) auc: 80.3571 (80.7003) time: 0.0410 data: 0.0119 max mem: 5308 [16:12:40.424259] Test: [300/560] eta: 0:00:15 loss: 0.6693 (0.6678) auc: 82.9960 (80.8058) time: 0.0294 data: 0.0001 max mem: 5308 [16:12:40.721737] Test: [310/560] eta: 0:00:14 loss: 0.6672 (0.6678) auc: 82.7451 (80.7535) time: 0.0296 data: 0.0002 max mem: 5308 [16:12:41.018891] Test: [320/560] eta: 0:00:13 loss: 0.6692 (0.6679) auc: 76.9841 (80.7093) time: 0.0296 data: 0.0002 max mem: 5308 [16:12:41.314974] Test: [330/560] eta: 0:00:13 loss: 0.6665 (0.6678) auc: 77.7083 (80.6883) time: 0.0296 data: 0.0002 max mem: 5308 [16:12:41.612151] Test: [340/560] eta: 0:00:12 loss: 0.6671 (0.6679) auc: 81.9608 (80.8021) time: 0.0296 data: 0.0002 max mem: 5308 [16:12:41.908457] Test: [350/560] eta: 0:00:11 loss: 0.6654 (0.6677) auc: 85.1190 (80.8910) time: 0.0296 data: 0.0002 max mem: 5308 [16:12:42.204875] Test: [360/560] eta: 0:00:10 loss: 0.6701 (0.6679) auc: 83.1349 (80.8605) time: 0.0295 data: 0.0002 max mem: 5308 [16:12:42.497602] Test: [370/560] eta: 0:00:10 loss: 0.6742 (0.6681) auc: 79.9595 (80.8488) time: 0.0294 data: 0.0002 max mem: 5308 [16:12:42.790967] Test: [380/560] eta: 0:00:09 loss: 0.6709 (0.6681) auc: 80.3922 (80.9071) time: 0.0292 data: 0.0002 max mem: 5308 [16:12:43.086368] Test: [390/560] eta: 0:00:08 loss: 0.6762 (0.6683) auc: 77.7778 (80.7482) time: 0.0294 data: 0.0002 max mem: 5308 [16:12:43.379540] Test: [400/560] eta: 0:00:08 loss: 0.6762 (0.6684) auc: 77.1255 (80.7119) time: 0.0293 data: 0.0002 max mem: 5308 [16:12:43.677215] Test: [410/560] eta: 0:00:07 loss: 
0.6733 (0.6684) auc: 76.1905 (80.5823) time: 0.0295 data: 0.0002 max mem: 5308 [16:12:43.971819] Test: [420/560] eta: 0:00:07 loss: 0.6726 (0.6683) auc: 76.1719 (80.5809) time: 0.0295 data: 0.0002 max mem: 5308 [16:12:44.270619] Test: [430/560] eta: 0:00:06 loss: 0.6609 (0.6682) auc: 80.8594 (80.6182) time: 0.0296 data: 0.0002 max mem: 5308 [16:12:44.567943] Test: [440/560] eta: 0:00:05 loss: 0.6609 (0.6683) auc: 80.1932 (80.5704) time: 0.0297 data: 0.0002 max mem: 5308 [16:12:44.866051] Test: [450/560] eta: 0:00:05 loss: 0.6699 (0.6683) auc: 79.3651 (80.5627) time: 0.0297 data: 0.0002 max mem: 5308 [16:12:45.164818] Test: [460/560] eta: 0:00:04 loss: 0.6682 (0.6683) auc: 79.7571 (80.5176) time: 0.0297 data: 0.0002 max mem: 5308 [16:12:45.458745] Test: [470/560] eta: 0:00:04 loss: 0.6690 (0.6683) auc: 79.3651 (80.4494) time: 0.0295 data: 0.0002 max mem: 5308 [16:12:45.754666] Test: [480/560] eta: 0:00:03 loss: 0.6668 (0.6683) auc: 79.3522 (80.4269) time: 0.0294 data: 0.0002 max mem: 5308 [16:12:46.048831] Test: [490/560] eta: 0:00:03 loss: 0.6645 (0.6683) auc: 81.5686 (80.4755) time: 0.0294 data: 0.0002 max mem: 5308 [16:12:46.342329] Test: [500/560] eta: 0:00:02 loss: 0.6645 (0.6683) auc: 81.7814 (80.4898) time: 0.0293 data: 0.0001 max mem: 5308 [16:12:46.639511] Test: [510/560] eta: 0:00:02 loss: 0.6654 (0.6684) auc: 82.0346 (80.4497) time: 0.0295 data: 0.0002 max mem: 5308 [16:12:46.934005] Test: [520/560] eta: 0:00:01 loss: 0.6642 (0.6684) auc: 79.7619 (80.4525) time: 0.0295 data: 0.0002 max mem: 5308 [16:12:47.229027] Test: [530/560] eta: 0:00:01 loss: 0.6773 (0.6687) auc: 77.7056 (80.3659) time: 0.0294 data: 0.0002 max mem: 5308 [16:12:47.523797] Test: [540/560] eta: 0:00:00 loss: 0.6773 (0.6688) auc: 77.7056 (80.3787) time: 0.0294 data: 0.0002 max mem: 5308 [16:12:47.816427] Test: [550/560] eta: 0:00:00 loss: 0.6687 (0.6689) auc: 84.7222 (80.4725) time: 0.0293 data: 0.0002 max mem: 5308 [16:12:48.151312] Test: [559/560] eta: 0:00:00 loss: 0.6658 (0.6688) 
auc: 85.5159 (80.5176) time: 0.0328 data: 0.0001 max mem: 5308
[16:12:48.281775] Test: Total time: 0:00:25 (0.0460 s / it)
[16:12:48.283914] * Auc 80.462 loss 0.669
[16:12:48.284079] AUC of the network on the 35796 val images: 80.46%
[16:12:48.284101] Max auc: 80.46%
[16:12:48.284121] Save model with min_val_loss at epoch: 0
[16:12:49.589367] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames
[16:12:50.814360] Epoch: [1] [ 0/2877] eta: 0:58:40 lr: 0.000013 loss: 0.6833 (0.6833) time: 1.2237 data: 1.0531 max mem: 5308
[16:13:04.564163] Epoch: [1] [ 100/2877] eta: 0:06:51 lr: 0.000013 loss: 0.6868 (0.6877) time: 0.1379 data: 0.0001 max mem: 5308
[16:13:18.317716] Epoch: [1] [ 200/2877] eta: 0:06:22 lr: 0.000013 loss: 0.6849 (0.6860) time: 0.1376 data: 0.0002 max mem: 5308
[16:13:31.983257] Epoch: [1] [ 300/2877] eta: 0:06:02 lr: 0.000014 loss: 0.6855 (0.6861) time: 0.1385 data: 0.0002 max mem: 5308
[16:13:45.552327] Epoch: [1] [ 400/2877] eta: 0:05:45 lr: 0.000014 loss: 0.6858 (0.6863) time: 0.1351 data: 0.0002 max mem: 5308
[16:13:59.181856] Epoch: [1] [ 500/2877] eta: 0:05:30 lr: 0.000015 loss: 0.6854 (0.6862) time: 0.1365 data: 0.0001 max mem: 5308
[16:14:12.799206] Epoch: [1] [ 600/2877] eta: 0:05:15 lr: 0.000015 loss: 0.6831 (0.6861) time: 0.1358 data: 0.0001 max mem: 5308
[16:14:26.418644] Epoch: [1] [ 700/2877] eta: 0:05:00 lr: 0.000016 loss: 0.6785 (0.6856) time: 0.1353 data: 0.0002 max mem: 5308
[16:14:39.957483] Epoch: [1] [ 800/2877] eta: 0:04:46 lr: 0.000016 loss: 0.6807 (0.6855) time: 0.1349 data: 0.0001 max mem: 5308
[16:14:53.462317] Epoch: [1] [ 900/2877] eta: 0:04:31 lr: 0.000016 loss: 0.6776 (0.6851) time: 0.1354 data: 0.0001 max mem: 5308
[16:15:07.138044] Epoch: [1] [1000/2877] eta: 0:04:17 lr: 0.000017 loss: 0.6803 (0.6848) time: 0.1371 data: 0.0002 max mem: 5308
[16:15:20.713470] Epoch: [1] [1100/2877] eta: 0:04:03 lr: 0.000017 loss: 0.6775 (0.6846) time: 0.1345 data: 0.0001 max mem: 5308
[16:15:34.224270] Epoch: [1] [1200/2877] eta:
0:03:49 lr: 0.000018 loss: 0.6848 (0.6844) time: 0.1361 data: 0.0002 max mem: 5308 [16:15:47.828917] Epoch: [1] [1300/2877] eta: 0:03:36 lr: 0.000018 loss: 0.6839 (0.6843) time: 0.1352 data: 0.0001 max mem: 5308 [16:16:01.484432] Epoch: [1] [1400/2877] eta: 0:03:22 lr: 0.000019 loss: 0.6692 (0.6839) time: 0.1356 data: 0.0001 max mem: 5308 [16:16:15.073369] Epoch: [1] [1500/2877] eta: 0:03:08 lr: 0.000019 loss: 0.6901 (0.6836) time: 0.1349 data: 0.0001 max mem: 5308 [16:16:28.593575] Epoch: [1] [1600/2877] eta: 0:02:54 lr: 0.000019 loss: 0.6787 (0.6834) time: 0.1352 data: 0.0001 max mem: 5308 [16:16:42.286181] Epoch: [1] [1700/2877] eta: 0:02:40 lr: 0.000020 loss: 0.6732 (0.6832) time: 0.1371 data: 0.0001 max mem: 5308 [16:16:55.963633] Epoch: [1] [1800/2877] eta: 0:02:27 lr: 0.000020 loss: 0.6744 (0.6829) time: 0.1376 data: 0.0002 max mem: 5308 [16:17:09.528095] Epoch: [1] [1900/2877] eta: 0:02:13 lr: 0.000021 loss: 0.6760 (0.6827) time: 0.1348 data: 0.0001 max mem: 5308 [16:17:23.097459] Epoch: [1] [2000/2877] eta: 0:01:59 lr: 0.000021 loss: 0.6745 (0.6822) time: 0.1371 data: 0.0002 max mem: 5308 [16:17:36.824208] Epoch: [1] [2100/2877] eta: 0:01:46 lr: 0.000022 loss: 0.6872 (0.6821) time: 0.1365 data: 0.0004 max mem: 5308 [16:17:50.321987] Epoch: [1] [2200/2877] eta: 0:01:32 lr: 0.000022 loss: 0.6902 (0.6817) time: 0.1347 data: 0.0001 max mem: 5308 [16:18:03.852620] Epoch: [1] [2300/2877] eta: 0:01:18 lr: 0.000022 loss: 0.6730 (0.6813) time: 0.1347 data: 0.0002 max mem: 5308 [16:18:17.518695] Epoch: [1] [2400/2877] eta: 0:01:05 lr: 0.000023 loss: 0.6745 (0.6809) time: 0.1346 data: 0.0001 max mem: 5308 [16:18:31.016072] Epoch: [1] [2500/2877] eta: 0:00:51 lr: 0.000023 loss: 0.6791 (0.6807) time: 0.1345 data: 0.0001 max mem: 5308 [16:18:44.531931] Epoch: [1] [2600/2877] eta: 0:00:37 lr: 0.000024 loss: 0.6788 (0.6803) time: 0.1367 data: 0.0002 max mem: 5308 [16:18:58.038934] Epoch: [1] [2700/2877] eta: 0:00:24 lr: 0.000024 loss: 0.6632 (0.6798) time: 0.1349 data: 
0.0002 max mem: 5308
[16:19:11.602946] Epoch: [1] [2800/2877] eta: 0:00:10 lr: 0.000025 loss: 0.6751 (0.6796) time: 0.1350 data: 0.0001 max mem: 5308
[16:19:21.886110] Epoch: [1] [2876/2877] eta: 0:00:00 lr: 0.000025 loss: 0.6872 (0.6794) time: 0.1365 data: 0.0003 max mem: 5308
[16:19:22.139825] Epoch: [1] Total time: 0:06:32 (0.1364 s / it)
[16:19:22.141273] Averaged stats: lr: 0.000025 loss: 0.6872 (0.6792)
[16:19:23.862187] Test: [ 0/560] eta: 0:16:01 loss: 0.6199 (0.6199) auc: 77.2549 (77.2549) time: 1.7162 data: 1.6805 max mem: 5308
[16:19:24.156453] Test: [ 10/560] eta: 0:01:40 loss: 0.5863 (0.5673) auc: 79.7571 (81.0819) time: 0.1827 data: 0.1529 max mem: 5308
[16:19:24.450426] Test: [ 20/560] eta: 0:00:59 loss: 0.5400 (0.5459) auc: 85.4167 (85.0816) time: 0.0293 data: 0.0001 max mem: 5308
[16:19:24.744669] Test: [ 30/560] eta: 0:00:44 loss: 0.5356 (0.5336) auc: 89.6104 (86.6654) time: 0.0293 data: 0.0001 max mem: 5308
[16:19:25.039285] Test: [ 40/560] eta: 0:00:36 loss: 0.5345 (0.5379) auc: 87.0588 (85.5144) time: 0.0294 data: 0.0002 max mem: 5308
[16:19:25.335152] Test: [ 50/560] eta: 0:00:31 loss: 0.5345 (0.5371) auc: 86.2500 (86.0696) time: 0.0294 data: 0.0002 max mem: 5308
[16:19:25.629295] Test: [ 60/560] eta: 0:00:28 loss: 0.5321 (0.5359) auc: 84.5703 (86.0086) time: 0.0294 data: 0.0002 max mem: 5308
[16:19:25.924806] Test: [ 70/560] eta: 0:00:26 loss: 0.5244 (0.5342) auc: 86.4372 (86.3364) time: 0.0294 data: 0.0002 max mem: 5308
[16:19:26.219207] Test: [ 80/560] eta: 0:00:24 loss: 0.5179 (0.5336) auc: 87.4510 (86.3074) time: 0.0294 data: 0.0002 max mem: 5308
[16:19:26.514697] Test: [ 90/560] eta: 0:00:22 loss: 0.5179 (0.5330) auc: 86.8182 (86.3460) time: 0.0294 data: 0.0001 max mem: 5308
[16:19:26.809356] Test: [100/560] eta: 0:00:21 loss: 0.5150 (0.5330) auc: 86.6667 (86.3862) time: 0.0294 data: 0.0002 max mem: 5308
[16:19:27.105125] Test: [110/560] eta: 0:00:20 loss: 0.5150 (0.5317) auc: 86.6667 (86.4107) time: 0.0294 data: 0.0002 max mem: 5308
[16:19:27.401260] Test: [120/560] eta: 0:00:19 loss: 0.5234 (0.5346) auc: 85.7488 (86.1502) time: 0.0295 data: 0.0002 max mem: 5308 [16:19:27.701153] Test: [130/560] eta: 0:00:18 loss: 0.5421 (0.5351) auc: 85.8824 (86.1065) time: 0.0297 data: 0.0002 max mem: 5308 [16:19:28.002013] Test: [140/560] eta: 0:00:17 loss: 0.5461 (0.5368) auc: 85.9375 (85.9855) time: 0.0299 data: 0.0002 max mem: 5308 [16:19:28.297677] Test: [150/560] eta: 0:00:16 loss: 0.5461 (0.5370) auc: 85.7143 (85.9060) time: 0.0297 data: 0.0002 max mem: 5308 [16:19:28.596264] Test: [160/560] eta: 0:00:15 loss: 0.5056 (0.5359) auc: 87.8431 (86.0864) time: 0.0296 data: 0.0002 max mem: 5308 [16:19:28.891268] Test: [170/560] eta: 0:00:15 loss: 0.5083 (0.5358) auc: 89.0625 (86.1361) time: 0.0296 data: 0.0002 max mem: 5308 [16:19:29.188616] Test: [180/560] eta: 0:00:14 loss: 0.5421 (0.5372) auc: 85.0202 (86.0175) time: 0.0295 data: 0.0002 max mem: 5308 [16:19:29.484843] Test: [190/560] eta: 0:00:14 loss: 0.5465 (0.5377) auc: 85.4902 (85.9749) time: 0.0296 data: 0.0002 max mem: 5308 [16:19:29.780836] Test: [200/560] eta: 0:00:13 loss: 0.5329 (0.5368) auc: 86.9048 (86.0595) time: 0.0295 data: 0.0002 max mem: 5308 [16:19:30.076069] Test: [210/560] eta: 0:00:13 loss: 0.5329 (0.5370) auc: 85.9375 (86.0292) time: 0.0295 data: 0.0002 max mem: 5308 [16:19:30.371325] Test: [220/560] eta: 0:00:12 loss: 0.5339 (0.5369) auc: 86.2745 (86.0260) time: 0.0294 data: 0.0002 max mem: 5308 [16:19:30.677045] Test: [230/560] eta: 0:00:12 loss: 0.5339 (0.5367) auc: 87.4494 (86.0625) time: 0.0300 data: 0.0002 max mem: 5308 [16:19:30.981872] Test: [240/560] eta: 0:00:11 loss: 0.5358 (0.5365) auc: 87.6984 (86.1192) time: 0.0305 data: 0.0003 max mem: 5308 [16:19:31.278720] Test: [250/560] eta: 0:00:11 loss: 0.5170 (0.5357) auc: 87.4510 (86.2013) time: 0.0300 data: 0.0005 max mem: 5308 [16:19:31.580255] Test: [260/560] eta: 0:00:10 loss: 0.5075 (0.5342) auc: 88.4921 (86.3637) time: 0.0298 data: 0.0004 max mem: 5308 [16:19:31.893679] 
Test: [270/560] eta: 0:00:10 loss: 0.5216 (0.5353) auc: 85.8824 (86.2147) time: 0.0307 data: 0.0006 max mem: 5308 [16:19:32.227421] Test: [280/560] eta: 0:00:10 loss: 0.5390 (0.5350) auc: 87.4494 (86.3088) time: 0.0323 data: 0.0013 max mem: 5308 [16:19:32.527720] Test: [290/560] eta: 0:00:09 loss: 0.5389 (0.5358) auc: 87.3016 (86.2926) time: 0.0316 data: 0.0010 max mem: 5308 [16:19:32.843664] Test: [300/560] eta: 0:00:09 loss: 0.5389 (0.5347) auc: 87.1094 (86.4196) time: 0.0307 data: 0.0004 max mem: 5308 [16:19:33.188259] Test: [310/560] eta: 0:00:08 loss: 0.5244 (0.5353) auc: 87.4510 (86.3325) time: 0.0329 data: 0.0007 max mem: 5308 [16:19:33.517267] Test: [320/560] eta: 0:00:08 loss: 0.5555 (0.5357) auc: 82.7451 (86.3011) time: 0.0336 data: 0.0016 max mem: 5308 [16:19:33.838051] Test: [330/560] eta: 0:00:08 loss: 0.5249 (0.5353) auc: 86.1111 (86.3852) time: 0.0324 data: 0.0022 max mem: 5308 [16:19:34.143107] Test: [340/560] eta: 0:00:07 loss: 0.5117 (0.5347) auc: 89.0688 (86.4461) time: 0.0312 data: 0.0014 max mem: 5308 [16:19:34.438400] Test: [350/560] eta: 0:00:07 loss: 0.5117 (0.5342) auc: 90.0794 (86.5140) time: 0.0299 data: 0.0005 max mem: 5308 [16:19:34.736987] Test: [360/560] eta: 0:00:06 loss: 0.5313 (0.5349) auc: 87.0588 (86.4405) time: 0.0296 data: 0.0002 max mem: 5308 [16:19:35.047267] Test: [370/560] eta: 0:00:06 loss: 0.5425 (0.5353) auc: 85.7143 (86.4359) time: 0.0304 data: 0.0002 max mem: 5308 [16:19:35.361125] Test: [380/560] eta: 0:00:06 loss: 0.5366 (0.5350) auc: 87.5000 (86.5020) time: 0.0311 data: 0.0008 max mem: 5308 [16:19:35.657852] Test: [390/560] eta: 0:00:05 loss: 0.5415 (0.5357) auc: 85.5469 (86.3943) time: 0.0304 data: 0.0007 max mem: 5308 [16:19:35.958511] Test: [400/560] eta: 0:00:05 loss: 0.5587 (0.5363) auc: 81.3765 (86.3117) time: 0.0298 data: 0.0002 max mem: 5308 [16:19:36.263555] Test: [410/560] eta: 0:00:05 loss: 0.5478 (0.5367) auc: 80.4688 (86.2255) time: 0.0302 data: 0.0007 max mem: 5308 [16:19:36.579654] Test: [420/560] 
eta: 0:00:04 loss: 0.5345 (0.5367) auc: 85.7143 (86.2285) time: 0.0310 data: 0.0008 max mem: 5308 [16:19:36.878039] Test: [430/560] eta: 0:00:04 loss: 0.5312 (0.5366) auc: 87.1094 (86.2376) time: 0.0306 data: 0.0005 max mem: 5308 [16:19:37.175130] Test: [440/560] eta: 0:00:04 loss: 0.5059 (0.5365) auc: 87.3016 (86.2549) time: 0.0297 data: 0.0004 max mem: 5308 [16:19:37.475337] Test: [450/560] eta: 0:00:03 loss: 0.5143 (0.5365) auc: 86.7188 (86.2521) time: 0.0298 data: 0.0002 max mem: 5308 [16:19:37.775856] Test: [460/560] eta: 0:00:03 loss: 0.5445 (0.5369) auc: 86.6397 (86.2055) time: 0.0300 data: 0.0002 max mem: 5308 [16:19:38.071411] Test: [470/560] eta: 0:00:03 loss: 0.5445 (0.5371) auc: 86.6397 (86.1717) time: 0.0297 data: 0.0002 max mem: 5308 [16:19:38.365808] Test: [480/560] eta: 0:00:02 loss: 0.5358 (0.5372) auc: 87.0588 (86.1623) time: 0.0294 data: 0.0001 max mem: 5308 [16:19:38.660715] Test: [490/560] eta: 0:00:02 loss: 0.5359 (0.5370) auc: 86.2500 (86.1819) time: 0.0294 data: 0.0002 max mem: 5308 [16:19:38.954378] Test: [500/560] eta: 0:00:02 loss: 0.5387 (0.5368) auc: 86.2500 (86.2110) time: 0.0294 data: 0.0002 max mem: 5308 [16:19:39.247719] Test: [510/560] eta: 0:00:01 loss: 0.5268 (0.5370) auc: 86.8627 (86.1983) time: 0.0293 data: 0.0002 max mem: 5308 [16:19:39.541687] Test: [520/560] eta: 0:00:01 loss: 0.5404 (0.5372) auc: 85.3175 (86.1499) time: 0.0293 data: 0.0002 max mem: 5308 [16:19:39.837173] Test: [530/560] eta: 0:00:00 loss: 0.5661 (0.5379) auc: 83.3333 (86.0771) time: 0.0294 data: 0.0002 max mem: 5308 [16:19:40.131131] Test: [540/560] eta: 0:00:00 loss: 0.5661 (0.5380) auc: 84.3137 (86.0803) time: 0.0294 data: 0.0002 max mem: 5308 [16:19:40.419223] Test: [550/560] eta: 0:00:00 loss: 0.5323 (0.5378) auc: 90.6883 (86.1869) time: 0.0290 data: 0.0001 max mem: 5308 [16:19:40.663676] Test: [559/560] eta: 0:00:00 loss: 0.5252 (0.5374) auc: 90.4545 (86.1971) time: 0.0280 data: 0.0001 max mem: 5308 [16:19:40.819742] Test: Total time: 0:00:18 (0.0333 s 
/ it)
[16:19:40.820725] * Auc 86.264 loss 0.538
[16:19:40.820891] AUC of the network on the 35796 val images: 86.26%
[16:19:40.820907] Max auc: 86.26%
[16:19:40.820939] Save model with min_val_loss at epoch: 1
[16:19:46.339390] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames
[16:19:47.637979] Epoch: [2] [ 0/2877] eta: 1:02:10 lr: 0.000025 loss: 0.7165 (0.7165) time: 1.2968 data: 1.1477 max mem: 5308
[16:20:01.267124] Epoch: [2] [ 100/2877] eta: 0:06:50 lr: 0.000025 loss: 0.6604 (0.6752) time: 0.1364 data: 0.0001 max mem: 5308
[16:20:15.258477] Epoch: [2] [ 200/2877] eta: 0:06:25 lr: 0.000026 loss: 0.6904 (0.6746) time: 0.1441 data: 0.0002 max mem: 5308
[16:20:29.239041] Epoch: [2] [ 300/2877] eta: 0:06:07 lr: 0.000026 loss: 0.6594 (0.6719) time: 0.1358 data: 0.0001 max mem: 5308
[16:20:42.969540] Epoch: [2] [ 400/2877] eta: 0:05:49 lr: 0.000027 loss: 0.6765 (0.6732) time: 0.1372 data: 0.0001 max mem: 5308
[16:20:56.764980] Epoch: [2] [ 500/2877] eta: 0:05:34 lr: 0.000027 loss: 0.6765 (0.6721) time: 0.1347 data: 0.0001 max mem: 5308
[16:21:10.409059] Epoch: [2] [ 600/2877] eta: 0:05:18 lr: 0.000028 loss: 0.6658 (0.6726) time: 0.1362 data: 0.0002 max mem: 5308
[16:21:23.903876] Epoch: [2] [ 700/2877] eta: 0:05:02 lr: 0.000028 loss: 0.6737 (0.6726) time: 0.1364 data: 0.0002 max mem: 5308
[16:21:37.450678] Epoch: [2] [ 800/2877] eta: 0:04:48 lr: 0.000028 loss: 0.6704 (0.6724) time: 0.1350 data: 0.0002 max mem: 5308
[16:21:50.989475] Epoch: [2] [ 900/2877] eta: 0:04:33 lr: 0.000029 loss: 0.6576 (0.6718) time: 0.1349 data: 0.0001 max mem: 5308
[16:22:04.709834] Epoch: [2] [1000/2877] eta: 0:04:19 lr: 0.000029 loss: 0.6528 (0.6716) time: 0.1386 data: 0.0002 max mem: 5308
[16:22:18.403792] Epoch: [2] [1100/2877] eta: 0:04:05 lr: 0.000030 loss: 0.6631 (0.6712) time: 0.1371 data: 0.0002 max mem: 5308
[16:22:31.932525] Epoch: [2] [1200/2877] eta: 0:03:51 lr: 0.000030 loss: 0.6615 (0.6704) time: 0.1351 data: 0.0002 max mem: 5308
[16:22:45.480301] Epoch: [2]
[1300/2877] eta: 0:03:37 lr: 0.000031 loss: 0.6593 (0.6699) time: 0.1340 data: 0.0003 max mem: 5308 [16:22:59.098426] Epoch: [2] [1400/2877] eta: 0:03:23 lr: 0.000031 loss: 0.6724 (0.6699) time: 0.1381 data: 0.0001 max mem: 5308 [16:23:12.669181] Epoch: [2] [1500/2877] eta: 0:03:09 lr: 0.000032 loss: 0.6577 (0.6696) time: 0.1353 data: 0.0002 max mem: 5308 [16:23:26.308111] Epoch: [2] [1600/2877] eta: 0:02:55 lr: 0.000032 loss: 0.6637 (0.6692) time: 0.1360 data: 0.0001 max mem: 5308 [16:23:39.853575] Epoch: [2] [1700/2877] eta: 0:02:41 lr: 0.000032 loss: 0.6608 (0.6687) time: 0.1345 data: 0.0001 max mem: 5308 [16:23:53.416766] Epoch: [2] [1800/2877] eta: 0:02:27 lr: 0.000033 loss: 0.6728 (0.6684) time: 0.1360 data: 0.0002 max mem: 5308 [16:24:07.145006] Epoch: [2] [1900/2877] eta: 0:02:14 lr: 0.000033 loss: 0.6593 (0.6683) time: 0.1379 data: 0.0002 max mem: 5308 [16:24:20.744708] Epoch: [2] [2000/2877] eta: 0:02:00 lr: 0.000034 loss: 0.6531 (0.6680) time: 0.1366 data: 0.0002 max mem: 5308 [16:24:34.345699] Epoch: [2] [2100/2877] eta: 0:01:46 lr: 0.000034 loss: 0.6766 (0.6679) time: 0.1341 data: 0.0001 max mem: 5308 [16:24:47.709802] Epoch: [2] [2200/2877] eta: 0:01:32 lr: 0.000035 loss: 0.6689 (0.6678) time: 0.1339 data: 0.0002 max mem: 5308 [16:25:01.120341] Epoch: [2] [2300/2877] eta: 0:01:18 lr: 0.000035 loss: 0.6509 (0.6675) time: 0.1345 data: 0.0001 max mem: 5308 [16:25:14.925764] Epoch: [2] [2400/2877] eta: 0:01:05 lr: 0.000035 loss: 0.6481 (0.6673) time: 0.1367 data: 0.0001 max mem: 5308 [16:25:28.427998] Epoch: [2] [2500/2877] eta: 0:00:51 lr: 0.000036 loss: 0.6668 (0.6670) time: 0.1344 data: 0.0001 max mem: 5308 [16:25:41.875240] Epoch: [2] [2600/2877] eta: 0:00:37 lr: 0.000036 loss: 0.6613 (0.6667) time: 0.1346 data: 0.0002 max mem: 5308 [16:25:55.592308] Epoch: [2] [2700/2877] eta: 0:00:24 lr: 0.000037 loss: 0.6785 (0.6669) time: 0.1345 data: 0.0001 max mem: 5308 [16:26:08.957665] Epoch: [2] [2800/2877] eta: 0:00:10 lr: 0.000037 loss: 0.6531 (0.6667) 
time: 0.1343 data: 0.0001 max mem: 5308
[16:26:19.181296] Epoch: [2] [2876/2877] eta: 0:00:00 lr: 0.000037 loss: 0.6555 (0.6665) time: 0.1339 data: 0.0002 max mem: 5308
[16:26:19.477982] Epoch: [2] Total time: 0:06:33 (0.1366 s / it)
[16:26:19.479622] Averaged stats: lr: 0.000037 loss: 0.6555 (0.6670)
[16:26:21.111044] Test: [ 0/560] eta: 0:15:10 loss: 0.5531 (0.5531) auc: 84.3137 (84.3137) time: 1.6265 data: 1.5928 max mem: 5308
[16:26:21.405558] Test: [ 10/560] eta: 0:01:35 loss: 0.5306 (0.5092) auc: 84.6154 (85.2935) time: 0.1745 data: 0.1449 max mem: 5308
[16:26:21.699526] Test: [ 20/560] eta: 0:00:56 loss: 0.4709 (0.4845) auc: 88.4921 (88.1831) time: 0.0293 data: 0.0001 max mem: 5308
[16:26:21.994245] Test: [ 30/560] eta: 0:00:42 loss: 0.4340 (0.4645) auc: 92.1569 (89.8603) time: 0.0294 data: 0.0001 max mem: 5308
[16:26:22.288917] Test: [ 40/560] eta: 0:00:35 loss: 0.4404 (0.4692) auc: 92.1569 (89.1604) time: 0.0294 data: 0.0002 max mem: 5308
[16:26:22.582875] Test: [ 50/560] eta: 0:00:30 loss: 0.4447 (0.4692) auc: 89.0196 (89.4849) time: 0.0294 data: 0.0002 max mem: 5308
[16:26:22.876857] Test: [ 60/560] eta: 0:00:27 loss: 0.4571 (0.4679) auc: 89.2857 (89.4706) time: 0.0293 data: 0.0002 max mem: 5308
[16:26:23.171834] Test: [ 70/560] eta: 0:00:25 loss: 0.4538 (0.4656) auc: 90.4545 (89.7818) time: 0.0294 data: 0.0002 max mem: 5308
[16:26:23.468136] Test: [ 80/560] eta: 0:00:23 loss: 0.4326 (0.4630) auc: 90.6250 (90.0154) time: 0.0295 data: 0.0002 max mem: 5308
[16:26:23.764308] Test: [ 90/560] eta: 0:00:22 loss: 0.4345 (0.4628) auc: 90.9091 (90.1639) time: 0.0295 data: 0.0002 max mem: 5308
[16:26:24.059547] Test: [100/560] eta: 0:00:20 loss: 0.4393 (0.4615) auc: 92.1569 (90.3776) time: 0.0295 data: 0.0002 max mem: 5308
[16:26:24.354243] Test: [110/560] eta: 0:00:19 loss: 0.4393 (0.4594) auc: 91.6667 (90.4877) time: 0.0294 data: 0.0002 max mem: 5308
[16:26:24.648435] Test: [120/560] eta: 0:00:18 loss: 0.4542 (0.4640) auc: 89.3720 (90.1903) time: 0.0294 data:
0.0001 max mem: 5308 [16:26:24.943355] Test: [130/560] eta: 0:00:17 loss: 0.4643 (0.4642) auc: 89.0873 (90.2240) time: 0.0294 data: 0.0001 max mem: 5308 [16:26:25.238808] Test: [140/560] eta: 0:00:17 loss: 0.4643 (0.4665) auc: 89.4531 (90.1040) time: 0.0294 data: 0.0002 max mem: 5308 [16:26:25.533929] Test: [150/560] eta: 0:00:16 loss: 0.4592 (0.4659) auc: 89.4531 (90.0881) time: 0.0295 data: 0.0002 max mem: 5308 [16:26:25.828892] Test: [160/560] eta: 0:00:15 loss: 0.4181 (0.4635) auc: 92.9167 (90.3370) time: 0.0294 data: 0.0002 max mem: 5308 [16:26:26.123891] Test: [170/560] eta: 0:00:15 loss: 0.4133 (0.4626) auc: 92.9688 (90.4135) time: 0.0294 data: 0.0002 max mem: 5308 [16:26:26.418348] Test: [180/560] eta: 0:00:14 loss: 0.4463 (0.4639) auc: 91.9028 (90.4030) time: 0.0294 data: 0.0002 max mem: 5308 [16:26:26.711305] Test: [190/560] eta: 0:00:13 loss: 0.4716 (0.4649) auc: 89.4118 (90.3332) time: 0.0293 data: 0.0001 max mem: 5308 [16:26:27.004262] Test: [200/560] eta: 0:00:13 loss: 0.4675 (0.4636) auc: 89.7917 (90.4004) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:27.297394] Test: [210/560] eta: 0:00:12 loss: 0.4640 (0.4640) auc: 90.5882 (90.3557) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:27.591396] Test: [220/560] eta: 0:00:12 loss: 0.4640 (0.4636) auc: 89.8039 (90.3635) time: 0.0293 data: 0.0001 max mem: 5308 [16:26:27.884366] Test: [230/560] eta: 0:00:11 loss: 0.4636 (0.4636) auc: 90.8730 (90.3789) time: 0.0293 data: 0.0002 max mem: 5308 [16:26:28.177439] Test: [240/560] eta: 0:00:11 loss: 0.4665 (0.4633) auc: 90.8730 (90.4067) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:28.470520] Test: [250/560] eta: 0:00:11 loss: 0.4435 (0.4629) auc: 90.0794 (90.3966) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:28.763102] Test: [260/560] eta: 0:00:10 loss: 0.4049 (0.4607) auc: 94.7917 (90.5515) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:29.055926] Test: [270/560] eta: 0:00:10 loss: 0.4417 (0.4617) auc: 91.7647 (90.4599) time: 0.0292 data: 0.0001 max mem: 
5308 [16:26:29.349590] Test: [280/560] eta: 0:00:09 loss: 0.4523 (0.4609) auc: 90.8730 (90.5731) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:29.643414] Test: [290/560] eta: 0:00:09 loss: 0.4692 (0.4622) auc: 90.8333 (90.5487) time: 0.0293 data: 0.0001 max mem: 5308 [16:26:29.940927] Test: [300/560] eta: 0:00:09 loss: 0.4692 (0.4610) auc: 90.8333 (90.6489) time: 0.0295 data: 0.0001 max mem: 5308 [16:26:30.235024] Test: [310/560] eta: 0:00:08 loss: 0.4482 (0.4614) auc: 92.9688 (90.5925) time: 0.0295 data: 0.0002 max mem: 5308 [16:26:30.528499] Test: [320/560] eta: 0:00:08 loss: 0.4773 (0.4619) auc: 89.0196 (90.5660) time: 0.0293 data: 0.0002 max mem: 5308 [16:26:30.821331] Test: [330/560] eta: 0:00:07 loss: 0.4512 (0.4615) auc: 90.4762 (90.6389) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:31.115061] Test: [340/560] eta: 0:00:07 loss: 0.4437 (0.4608) auc: 93.3333 (90.6772) time: 0.0293 data: 0.0001 max mem: 5308 [16:26:31.408505] Test: [350/560] eta: 0:00:07 loss: 0.4366 (0.4602) auc: 92.5490 (90.7098) time: 0.0293 data: 0.0001 max mem: 5308 [16:26:31.701236] Test: [360/560] eta: 0:00:06 loss: 0.4639 (0.4611) auc: 90.1961 (90.6646) time: 0.0292 data: 0.0002 max mem: 5308 [16:26:31.995097] Test: [370/560] eta: 0:00:06 loss: 0.4716 (0.4615) auc: 90.1961 (90.6945) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:32.287887] Test: [380/560] eta: 0:00:06 loss: 0.4561 (0.4614) auc: 90.0810 (90.7008) time: 0.0292 data: 0.0001 max mem: 5308 [16:26:32.581467] Test: [390/560] eta: 0:00:05 loss: 0.4660 (0.4622) auc: 89.2157 (90.6406) time: 0.0293 data: 0.0002 max mem: 5308 [16:26:32.875088] Test: [400/560] eta: 0:00:05 loss: 0.4820 (0.4631) auc: 88.3333 (90.5659) time: 0.0293 data: 0.0002 max mem: 5308 [16:26:33.168588] Test: [410/560] eta: 0:00:04 loss: 0.4800 (0.4635) auc: 85.4902 (90.4969) time: 0.0293 data: 0.0001 max mem: 5308 [16:26:33.466102] Test: [420/560] eta: 0:00:04 loss: 0.4650 (0.4634) auc: 88.0952 (90.5133) time: 0.0295 data: 0.0002 max mem: 5308 
[16:26:33.760378] Test: [430/560] eta: 0:00:04 loss: 0.4337 (0.4628) auc: 90.2344 (90.5516) time: 0.0295 data: 0.0002 max mem: 5308 [16:26:34.056815] Test: [440/560] eta: 0:00:03 loss: 0.4285 (0.4625) auc: 92.9412 (90.5902) time: 0.0295 data: 0.0001 max mem: 5308 [16:26:34.351552] Test: [450/560] eta: 0:00:03 loss: 0.4416 (0.4626) auc: 92.5781 (90.5940) time: 0.0295 data: 0.0001 max mem: 5308 [16:26:34.653540] Test: [460/560] eta: 0:00:03 loss: 0.4745 (0.4634) auc: 91.4062 (90.5519) time: 0.0297 data: 0.0002 max mem: 5308 [16:26:34.957577] Test: [470/560] eta: 0:00:02 loss: 0.4949 (0.4636) auc: 91.2500 (90.5349) time: 0.0302 data: 0.0002 max mem: 5308 [16:26:35.264652] Test: [480/560] eta: 0:00:02 loss: 0.4733 (0.4636) auc: 90.9804 (90.5347) time: 0.0304 data: 0.0002 max mem: 5308 [16:26:35.562485] Test: [490/560] eta: 0:00:02 loss: 0.4581 (0.4632) auc: 90.8333 (90.5474) time: 0.0301 data: 0.0002 max mem: 5308 [16:26:35.858423] Test: [500/560] eta: 0:00:01 loss: 0.4391 (0.4628) auc: 91.2698 (90.5832) time: 0.0296 data: 0.0002 max mem: 5308 [16:26:36.153921] Test: [510/560] eta: 0:00:01 loss: 0.4369 (0.4629) auc: 91.0931 (90.5886) time: 0.0295 data: 0.0002 max mem: 5308 [16:26:36.447778] Test: [520/560] eta: 0:00:01 loss: 0.4962 (0.4635) auc: 88.4921 (90.5208) time: 0.0294 data: 0.0002 max mem: 5308 [16:26:36.743644] Test: [530/560] eta: 0:00:00 loss: 0.4980 (0.4645) auc: 86.7188 (90.4640) time: 0.0294 data: 0.0001 max mem: 5308 [16:26:37.041015] Test: [540/560] eta: 0:00:00 loss: 0.4834 (0.4649) auc: 90.8730 (90.4612) time: 0.0296 data: 0.0002 max mem: 5308 [16:26:37.334678] Test: [550/560] eta: 0:00:00 loss: 0.4619 (0.4647) auc: 93.7500 (90.5577) time: 0.0295 data: 0.0002 max mem: 5308 [16:26:37.578430] Test: [559/560] eta: 0:00:00 loss: 0.4506 (0.4643) auc: 93.1174 (90.5584) time: 0.0283 data: 0.0001 max mem: 5308 [16:26:37.739991] Test: Total time: 0:00:18 (0.0326 s / it) [16:26:37.935279] * Auc 90.655 loss 0.465 [16:26:37.935464] AUC of the network on the 35796 
val images: 90.66%
[16:26:37.935478] Max auc: 90.66%
[16:26:37.935494] Save model with min_val_loss at epoch: 2
[16:26:43.878768] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames
[16:26:45.050746] Epoch: [3] [ 0/2877] eta: 0:56:07 lr: 0.000038 loss: 0.6943 (0.6943) time: 1.1703 data: 1.0218 max mem: 5308
[16:26:58.596383] Epoch: [3] [ 100/2877] eta: 0:06:44 lr: 0.000038 loss: 0.6621 (0.6595) time: 0.1376 data: 0.0003 max mem: 5308
[16:27:12.195681] Epoch: [3] [ 200/2877] eta: 0:06:17 lr: 0.000038 loss: 0.6638 (0.6616) time: 0.1365 data: 0.0001 max mem: 5308
[16:27:25.812936] Epoch: [3] [ 300/2877] eta: 0:05:58 lr: 0.000039 loss: 0.6623 (0.6614) time: 0.1360 data: 0.0002 max mem: 5308
[16:27:39.436733] Epoch: [3] [ 400/2877] eta: 0:05:43 lr: 0.000039 loss: 0.6630 (0.6607) time: 0.1359 data: 0.0001 max mem: 5308
[16:27:53.086569] Epoch: [3] [ 500/2877] eta: 0:05:28 lr: 0.000040 loss: 0.6543 (0.6599) time: 0.1355 data: 0.0001 max mem: 5308
[16:28:06.627451] Epoch: [3] [ 600/2877] eta: 0:05:13 lr: 0.000040 loss: 0.6741 (0.6592) time: 0.1353 data: 0.0002 max mem: 5308
[16:28:20.210534] Epoch: [3] [ 700/2877] eta: 0:04:59 lr: 0.000041 loss: 0.6428 (0.6581) time: 0.1360 data: 0.0002 max mem: 5308
[16:28:33.950740] Epoch: [3] [ 800/2877] eta: 0:04:45 lr: 0.000041 loss: 0.6669 (0.6593) time: 0.1357 data: 0.0002 max mem: 5308
[16:28:47.619977] Epoch: [3] [ 900/2877] eta: 0:04:31 lr: 0.000041 loss: 0.6650 (0.6594) time: 0.1349 data: 0.0001 max mem: 5308
[16:29:01.214547] Epoch: [3] [1000/2877] eta: 0:04:17 lr: 0.000042 loss: 0.6498 (0.6593) time: 0.1353 data: 0.0001 max mem: 5308
[16:29:14.820381] Epoch: [3] [1100/2877] eta: 0:04:03 lr: 0.000042 loss: 0.6678 (0.6601) time: 0.1364 data: 0.0002 max mem: 5308
[16:29:28.539202] Epoch: [3] [1200/2877] eta: 0:03:49 lr: 0.000043 loss: 0.6594 (0.6600) time: 0.1388 data: 0.0002 max mem: 5308
[16:29:42.284180] Epoch: [3] [1300/2877] eta: 0:03:36 lr: 0.000043 loss: 0.6500 (0.6595) time: 0.1365 data: 0.0002 max mem: 5308
[16:29:56.322270] Epoch: [3] [1400/2877] eta: 0:03:22 lr: 0.000044 loss: 0.6544 (0.6592) time: 0.1418 data: 0.0002 max mem: 5308 [16:30:09.870311] Epoch: [3] [1500/2877] eta: 0:03:08 lr: 0.000044 loss: 0.6399 (0.6588) time: 0.1356 data: 0.0002 max mem: 5308 [16:30:23.425541] Epoch: [3] [1600/2877] eta: 0:02:55 lr: 0.000044 loss: 0.6411 (0.6585) time: 0.1346 data: 0.0001 max mem: 5308 [16:30:36.996074] Epoch: [3] [1700/2877] eta: 0:02:41 lr: 0.000045 loss: 0.6385 (0.6583) time: 0.1351 data: 0.0001 max mem: 5308 [16:30:50.562991] Epoch: [3] [1800/2877] eta: 0:02:27 lr: 0.000045 loss: 0.6388 (0.6580) time: 0.1363 data: 0.0002 max mem: 5308 [16:31:04.243200] Epoch: [3] [1900/2877] eta: 0:02:13 lr: 0.000046 loss: 0.6377 (0.6577) time: 0.1365 data: 0.0001 max mem: 5308 [16:31:17.823299] Epoch: [3] [2000/2877] eta: 0:02:00 lr: 0.000046 loss: 0.6447 (0.6573) time: 0.1360 data: 0.0002 max mem: 5308 [16:31:31.678856] Epoch: [3] [2100/2877] eta: 0:01:46 lr: 0.000047 loss: 0.6652 (0.6572) time: 0.1368 data: 0.0002 max mem: 5308 [16:31:45.098358] Epoch: [3] [2200/2877] eta: 0:01:32 lr: 0.000047 loss: 0.6440 (0.6570) time: 0.1342 data: 0.0001 max mem: 5308 [16:31:58.595272] Epoch: [3] [2300/2877] eta: 0:01:18 lr: 0.000047 loss: 0.6484 (0.6567) time: 0.1352 data: 0.0001 max mem: 5308 [16:32:12.193128] Epoch: [3] [2400/2877] eta: 0:01:05 lr: 0.000048 loss: 0.6542 (0.6568) time: 0.1345 data: 0.0001 max mem: 5308 [16:32:25.689839] Epoch: [3] [2500/2877] eta: 0:00:51 lr: 0.000048 loss: 0.6394 (0.6564) time: 0.1346 data: 0.0002 max mem: 5308 [16:32:39.157908] Epoch: [3] [2600/2877] eta: 0:00:37 lr: 0.000049 loss: 0.6660 (0.6563) time: 0.1344 data: 0.0001 max mem: 5308 [16:32:52.778353] Epoch: [3] [2700/2877] eta: 0:00:24 lr: 0.000049 loss: 0.6547 (0.6562) time: 0.1349 data: 0.0002 max mem: 5308 [16:33:06.511982] Epoch: [3] [2800/2877] eta: 0:00:10 lr: 0.000050 loss: 0.6472 (0.6562) time: 0.1368 data: 0.0002 max mem: 5308 [16:33:16.829379] Epoch: [3] [2876/2877] eta: 0:00:00 lr: 
0.000050 loss: 0.6689 (0.6563) time: 0.1347 data: 0.0003 max mem: 5308
[16:33:17.134921] Epoch: [3] Total time: 0:06:33 (0.1367 s / it)
[16:33:17.139310] Averaged stats: lr: 0.000050 loss: 0.6689 (0.6556)
[16:33:18.709785] Test: [ 0/560] eta: 0:14:35 loss: 0.4656 (0.4656) auc: 89.8039 (89.8039) time: 1.5641 data: 1.5292 max mem: 5308
[16:33:19.045121] Test: [ 10/560] eta: 0:01:34 loss: 0.4633 (0.4501) auc: 89.4737 (89.1802) time: 0.1726 data: 0.1427 max mem: 5308
[16:33:19.339624] Test: [ 20/560] eta: 0:00:56 loss: 0.4258 (0.4267) auc: 90.8730 (91.0200) time: 0.0314 data: 0.0021 max mem: 5308
[16:33:19.638472] Test: [ 30/560] eta: 0:00:42 loss: 0.3888 (0.4056) auc: 93.5065 (92.1459) time: 0.0296 data: 0.0002 max mem: 5308
[16:33:19.937800] Test: [ 40/560] eta: 0:00:35 loss: 0.3762 (0.4098) auc: 92.9412 (91.6432) time: 0.0298 data: 0.0002 max mem: 5308
[16:33:20.235685] Test: [ 50/560] eta: 0:00:30 loss: 0.3905 (0.4110) auc: 90.5882 (91.7305) time: 0.0298 data: 0.0002 max mem: 5308
[16:33:20.531242] Test: [ 60/560] eta: 0:00:27 loss: 0.4165 (0.4106) auc: 90.8730 (91.6471) time: 0.0296 data: 0.0002 max mem: 5308
[16:33:20.826597] Test: [ 70/560] eta: 0:00:25 loss: 0.4148 (0.4081) auc: 91.2500 (91.7978) time: 0.0295 data: 0.0002 max mem: 5308
[16:33:21.124013] Test: [ 80/560] eta: 0:00:23 loss: 0.3648 (0.4042) auc: 92.7083 (92.1027) time: 0.0296 data: 0.0002 max mem: 5308
[16:33:21.432703] Test: [ 90/560] eta: 0:00:22 loss: 0.3535 (0.4040) auc: 93.9394 (92.3078) time: 0.0302 data: 0.0003 max mem: 5308
[16:33:21.727987] Test: [100/560] eta: 0:00:20 loss: 0.3846 (0.4021) auc: 94.1176 (92.4474) time: 0.0301 data: 0.0003 max mem: 5308
[16:33:22.049702] Test: [110/560] eta: 0:00:19 loss: 0.3763 (0.4003) auc: 94.1176 (92.4311) time: 0.0308 data: 0.0010 max mem: 5308
[16:33:22.358594] Test: [120/560] eta: 0:00:18 loss: 0.3895 (0.4054) auc: 90.9804 (92.2035) time: 0.0309 data: 0.0010 max mem: 5308
[16:33:22.662434] Test: [130/560] eta: 0:00:18 loss: 0.4027 (0.4055) auc:
92.7451 (92.2564) time: 0.0300 data: 0.0004 max mem: 5308 [16:33:22.962125] Test: [140/560] eta: 0:00:17 loss: 0.4011 (0.4073) auc: 92.8571 (92.2185) time: 0.0301 data: 0.0004 max mem: 5308 [16:33:23.269137] Test: [150/560] eta: 0:00:16 loss: 0.4011 (0.4064) auc: 91.4062 (92.2495) time: 0.0302 data: 0.0003 max mem: 5308 [16:33:23.592149] Test: [160/560] eta: 0:00:15 loss: 0.3548 (0.4028) auc: 94.7917 (92.4752) time: 0.0314 data: 0.0009 max mem: 5308 [16:33:23.908228] Test: [170/560] eta: 0:00:15 loss: 0.3475 (0.4015) auc: 94.5312 (92.5682) time: 0.0319 data: 0.0014 max mem: 5308 [16:33:24.210706] Test: [180/560] eta: 0:00:14 loss: 0.3812 (0.4022) auc: 94.3320 (92.5973) time: 0.0309 data: 0.0008 max mem: 5308 [16:33:24.530997] Test: [190/560] eta: 0:00:14 loss: 0.4044 (0.4032) auc: 92.0635 (92.5135) time: 0.0311 data: 0.0006 max mem: 5308 [16:33:24.830979] Test: [200/560] eta: 0:00:13 loss: 0.4016 (0.4014) auc: 92.9412 (92.6151) time: 0.0309 data: 0.0007 max mem: 5308 [16:33:25.154002] Test: [210/560] eta: 0:00:13 loss: 0.4014 (0.4020) auc: 92.9167 (92.5588) time: 0.0311 data: 0.0005 max mem: 5308 [16:33:25.488443] Test: [220/560] eta: 0:00:12 loss: 0.4014 (0.4015) auc: 92.7126 (92.5921) time: 0.0327 data: 0.0006 max mem: 5308 [16:33:25.787408] Test: [230/560] eta: 0:00:12 loss: 0.3993 (0.4017) auc: 93.3333 (92.6013) time: 0.0315 data: 0.0004 max mem: 5308 [16:33:26.101067] Test: [240/560] eta: 0:00:11 loss: 0.3970 (0.4012) auc: 92.8571 (92.6410) time: 0.0305 data: 0.0004 max mem: 5308 [16:33:26.399829] Test: [250/560] eta: 0:00:11 loss: 0.3769 (0.4007) auc: 92.4603 (92.6418) time: 0.0305 data: 0.0004 max mem: 5308 [16:33:26.696777] Test: [260/560] eta: 0:00:10 loss: 0.3375 (0.3984) auc: 95.6863 (92.7598) time: 0.0297 data: 0.0002 max mem: 5308 [16:33:26.989731] Test: [270/560] eta: 0:00:10 loss: 0.3788 (0.3992) auc: 94.0476 (92.7416) time: 0.0294 data: 0.0002 max mem: 5308 [16:33:27.287478] Test: [280/560] eta: 0:00:10 loss: 0.3893 (0.3977) auc: 95.2381 (92.8583) 
time: 0.0295 data: 0.0002 max mem: 5308 [16:33:27.584979] Test: [290/560] eta: 0:00:09 loss: 0.3932 (0.3986) auc: 94.5833 (92.8585) time: 0.0297 data: 0.0002 max mem: 5308 [16:33:27.881259] Test: [300/560] eta: 0:00:09 loss: 0.3932 (0.3975) auc: 94.4444 (92.9134) time: 0.0296 data: 0.0002 max mem: 5308 [16:33:28.180188] Test: [310/560] eta: 0:00:08 loss: 0.3861 (0.3980) auc: 94.3359 (92.8552) time: 0.0297 data: 0.0002 max mem: 5308 [16:33:28.484366] Test: [320/560] eta: 0:00:08 loss: 0.4158 (0.3987) auc: 90.5882 (92.8233) time: 0.0301 data: 0.0002 max mem: 5308 [16:33:28.780174] Test: [330/560] eta: 0:00:08 loss: 0.3940 (0.3981) auc: 93.6508 (92.8770) time: 0.0299 data: 0.0002 max mem: 5308 [16:33:29.077141] Test: [340/560] eta: 0:00:07 loss: 0.3766 (0.3975) auc: 94.9219 (92.9039) time: 0.0296 data: 0.0002 max mem: 5308 [16:33:29.375631] Test: [350/560] eta: 0:00:07 loss: 0.3710 (0.3971) auc: 94.4444 (92.9181) time: 0.0296 data: 0.0002 max mem: 5308 [16:33:29.674596] Test: [360/560] eta: 0:00:06 loss: 0.3913 (0.3981) auc: 94.3320 (92.8836) time: 0.0297 data: 0.0002 max mem: 5308 [16:33:29.978078] Test: [370/560] eta: 0:00:06 loss: 0.4071 (0.3984) auc: 91.3725 (92.9106) time: 0.0300 data: 0.0002 max mem: 5308 [16:33:30.273528] Test: [380/560] eta: 0:00:06 loss: 0.3977 (0.3984) auc: 91.7749 (92.8958) time: 0.0298 data: 0.0002 max mem: 5308 [16:33:30.570540] Test: [390/560] eta: 0:00:05 loss: 0.3983 (0.3990) auc: 92.8571 (92.8694) time: 0.0295 data: 0.0002 max mem: 5308 [16:33:30.871713] Test: [400/560] eta: 0:00:05 loss: 0.4159 (0.4001) auc: 92.4603 (92.8002) time: 0.0298 data: 0.0002 max mem: 5308 [16:33:31.169944] Test: [410/560] eta: 0:00:05 loss: 0.4183 (0.4004) auc: 90.2778 (92.7698) time: 0.0299 data: 0.0002 max mem: 5308 [16:33:31.466163] Test: [420/560] eta: 0:00:04 loss: 0.3997 (0.4005) auc: 90.2778 (92.7564) time: 0.0296 data: 0.0002 max mem: 5308 [16:33:31.776466] Test: [430/560] eta: 0:00:04 loss: 0.3553 (0.3997) auc: 92.0635 (92.7807) time: 0.0302 data: 
0.0002 max mem: 5308 [16:33:32.077589] Test: [440/560] eta: 0:00:04 loss: 0.3489 (0.3991) auc: 96.0317 (92.8355) time: 0.0305 data: 0.0002 max mem: 5308 [16:33:32.376383] Test: [450/560] eta: 0:00:03 loss: 0.3726 (0.3992) auc: 94.5312 (92.8254) time: 0.0299 data: 0.0002 max mem: 5308 [16:33:32.681431] Test: [460/560] eta: 0:00:03 loss: 0.4115 (0.4002) auc: 92.7126 (92.7846) time: 0.0299 data: 0.0002 max mem: 5308 [16:33:32.978379] Test: [470/560] eta: 0:00:03 loss: 0.4388 (0.4004) auc: 92.9688 (92.7787) time: 0.0298 data: 0.0002 max mem: 5308 [16:33:33.273434] Test: [480/560] eta: 0:00:02 loss: 0.4187 (0.4003) auc: 92.9688 (92.7861) time: 0.0295 data: 0.0002 max mem: 5308 [16:33:33.568824] Test: [490/560] eta: 0:00:02 loss: 0.3820 (0.3999) auc: 92.9688 (92.8000) time: 0.0295 data: 0.0002 max mem: 5308 [16:33:33.863661] Test: [500/560] eta: 0:00:01 loss: 0.3784 (0.3994) auc: 93.6508 (92.8325) time: 0.0294 data: 0.0002 max mem: 5308 [16:33:34.158371] Test: [510/560] eta: 0:00:01 loss: 0.3516 (0.3993) auc: 94.9020 (92.8449) time: 0.0294 data: 0.0001 max mem: 5308 [16:33:34.453544] Test: [520/560] eta: 0:00:01 loss: 0.4263 (0.4003) auc: 90.2834 (92.7724) time: 0.0294 data: 0.0002 max mem: 5308 [16:33:34.750686] Test: [530/560] eta: 0:00:00 loss: 0.4544 (0.4013) auc: 89.0688 (92.7377) time: 0.0295 data: 0.0002 max mem: 5308 [16:33:35.046272] Test: [540/560] eta: 0:00:00 loss: 0.4042 (0.4018) auc: 92.1569 (92.7251) time: 0.0295 data: 0.0002 max mem: 5308 [16:33:35.340547] Test: [550/560] eta: 0:00:00 loss: 0.3983 (0.4016) auc: 95.1417 (92.7946) time: 0.0294 data: 0.0002 max mem: 5308 [16:33:35.587489] Test: [559/560] eta: 0:00:00 loss: 0.3917 (0.4013) auc: 94.7368 (92.7929) time: 0.0285 data: 0.0001 max mem: 5308 [16:33:35.754960] Test: Total time: 0:00:18 (0.0332 s / it) [16:33:35.888378] * Auc 92.783 loss 0.402 [16:33:35.888657] AUC of the network on the 35796 val images: 92.78% [16:33:35.888673] Max auc: 92.78% [16:33:35.888690] Save model with min_val_loss at epoch: 
3 [16:33:42.744857] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames [16:33:44.042219] Epoch: [4] [ 0/2877] eta: 1:02:08 lr: 0.000050 loss: 0.6511 (0.6511) time: 1.2961 data: 1.1590 max mem: 5308 [16:33:57.810826] Epoch: [4] [ 100/2877] eta: 0:06:54 lr: 0.000050 loss: 0.6458 (0.6455) time: 0.1369 data: 0.0001 max mem: 5308 [16:34:11.479424] Epoch: [4] [ 200/2877] eta: 0:06:22 lr: 0.000051 loss: 0.6566 (0.6468) time: 0.1372 data: 0.0002 max mem: 5308 [16:34:25.277174] Epoch: [4] [ 300/2877] eta: 0:06:04 lr: 0.000051 loss: 0.6598 (0.6468) time: 0.1358 data: 0.0002 max mem: 5308 [16:34:38.957024] Epoch: [4] [ 400/2877] eta: 0:05:47 lr: 0.000052 loss: 0.6701 (0.6481) time: 0.1371 data: 0.0002 max mem: 5308 [16:34:52.678582] Epoch: [4] [ 500/2877] eta: 0:05:31 lr: 0.000052 loss: 0.6552 (0.6482) time: 0.1378 data: 0.0002 max mem: 5308 [16:35:06.250517] Epoch: [4] [ 600/2877] eta: 0:05:16 lr: 0.000053 loss: 0.6406 (0.6481) time: 0.1355 data: 0.0001 max mem: 5308 [16:35:19.862980] Epoch: [4] [ 700/2877] eta: 0:05:01 lr: 0.000053 loss: 0.6201 (0.6467) time: 0.1359 data: 0.0001 max mem: 5308 [16:35:33.648142] Epoch: [4] [ 800/2877] eta: 0:04:47 lr: 0.000053 loss: 0.6587 (0.6466) time: 0.1381 data: 0.0002 max mem: 5308 [16:35:47.350552] Epoch: [4] [ 900/2877] eta: 0:04:33 lr: 0.000054 loss: 0.6448 (0.6464) time: 0.1366 data: 0.0002 max mem: 5308 [16:36:00.974971] Epoch: [4] [1000/2877] eta: 0:04:19 lr: 0.000054 loss: 0.6556 (0.6460) time: 0.1366 data: 0.0001 max mem: 5308 [16:36:14.764059] Epoch: [4] [1100/2877] eta: 0:04:05 lr: 0.000055 loss: 0.6480 (0.6463) time: 0.1361 data: 0.0002 max mem: 5308 [16:36:28.453421] Epoch: [4] [1200/2877] eta: 0:03:51 lr: 0.000055 loss: 0.6571 (0.6471) time: 0.1371 data: 0.0002 max mem: 5308 [16:36:42.243190] Epoch: [4] [1300/2877] eta: 0:03:37 lr: 0.000056 loss: 0.6494 (0.6471) time: 0.1416 data: 0.0002 max mem: 5308 [16:36:55.933950] Epoch: [4] [1400/2877] eta: 0:03:23 lr: 0.000056 loss: 0.6425 (0.6474) time: 0.1362 data: 0.0001 max 
mem: 5308 [16:37:09.685110] Epoch: [4] [1500/2877] eta: 0:03:09 lr: 0.000057 loss: 0.6421 (0.6470) time: 0.1362 data: 0.0002 max mem: 5308 [16:37:23.283264] Epoch: [4] [1600/2877] eta: 0:02:55 lr: 0.000057 loss: 0.6546 (0.6476) time: 0.1368 data: 0.0002 max mem: 5308 [16:37:36.763387] Epoch: [4] [1700/2877] eta: 0:02:41 lr: 0.000057 loss: 0.6416 (0.6474) time: 0.1354 data: 0.0001 max mem: 5308 [16:37:50.349995] Epoch: [4] [1800/2877] eta: 0:02:28 lr: 0.000058 loss: 0.6335 (0.6471) time: 0.1355 data: 0.0003 max mem: 5308 [16:38:04.030844] Epoch: [4] [1900/2877] eta: 0:02:14 lr: 0.000058 loss: 0.5991 (0.6469) time: 0.1357 data: 0.0002 max mem: 5308 [16:38:17.613326] Epoch: [4] [2000/2877] eta: 0:02:00 lr: 0.000059 loss: 0.6542 (0.6466) time: 0.1363 data: 0.0001 max mem: 5308 [16:38:31.268284] Epoch: [4] [2100/2877] eta: 0:01:46 lr: 0.000059 loss: 0.6430 (0.6465) time: 0.1359 data: 0.0002 max mem: 5308 [16:38:44.766551] Epoch: [4] [2200/2877] eta: 0:01:32 lr: 0.000060 loss: 0.6393 (0.6461) time: 0.1347 data: 0.0001 max mem: 5308 [16:38:58.292868] Epoch: [4] [2300/2877] eta: 0:01:19 lr: 0.000060 loss: 0.6165 (0.6461) time: 0.1355 data: 0.0002 max mem: 5308 [16:39:11.843930] Epoch: [4] [2400/2877] eta: 0:01:05 lr: 0.000060 loss: 0.6441 (0.6461) time: 0.1347 data: 0.0001 max mem: 5308 [16:39:25.410481] Epoch: [4] [2500/2877] eta: 0:00:51 lr: 0.000061 loss: 0.6400 (0.6460) time: 0.1351 data: 0.0002 max mem: 5308 [16:39:38.932735] Epoch: [4] [2600/2877] eta: 0:00:37 lr: 0.000061 loss: 0.6409 (0.6460) time: 0.1366 data: 0.0002 max mem: 5308 [16:39:52.658598] Epoch: [4] [2700/2877] eta: 0:00:24 lr: 0.000062 loss: 0.6399 (0.6460) time: 0.1369 data: 0.0002 max mem: 5308 [16:40:06.145199] Epoch: [4] [2800/2877] eta: 0:00:10 lr: 0.000062 loss: 0.6222 (0.6457) time: 0.1347 data: 0.0002 max mem: 5308 [16:40:16.356421] Epoch: [4] [2876/2877] eta: 0:00:00 lr: 0.000062 loss: 0.6469 (0.6455) time: 0.1342 data: 0.0004 max mem: 5308 [16:40:16.636055] Epoch: [4] Total time: 0:06:33 
(0.1369 s / it) [16:40:16.637632] Averaged stats: lr: 0.000062 loss: 0.6469 (0.6465) [16:40:18.243864] Test: [ 0/560] eta: 0:14:56 loss: 0.3983 (0.3983) auc: 91.9608 (91.9608) time: 1.6007 data: 1.5676 max mem: 5308 [16:40:18.541935] Test: [ 10/560] eta: 0:01:34 loss: 0.4247 (0.4286) auc: 90.3382 (90.6281) time: 0.1725 data: 0.1427 max mem: 5308 [16:40:18.847959] Test: [ 20/560] eta: 0:00:56 loss: 0.3992 (0.4044) auc: 92.0833 (92.5250) time: 0.0301 data: 0.0002 max mem: 5308 [16:40:19.148728] Test: [ 30/560] eta: 0:00:42 loss: 0.3631 (0.3804) auc: 95.2941 (93.5242) time: 0.0302 data: 0.0002 max mem: 5308 [16:40:19.445396] Test: [ 40/560] eta: 0:00:35 loss: 0.3395 (0.3816) auc: 95.4167 (93.3740) time: 0.0298 data: 0.0002 max mem: 5308 [16:40:19.745332] Test: [ 50/560] eta: 0:00:30 loss: 0.3426 (0.3834) auc: 94.4444 (93.4503) time: 0.0297 data: 0.0002 max mem: 5308 [16:40:20.044372] Test: [ 60/560] eta: 0:00:27 loss: 0.4023 (0.3844) auc: 92.2078 (93.2596) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:20.346662] Test: [ 70/560] eta: 0:00:25 loss: 0.3979 (0.3820) auc: 91.6667 (93.2924) time: 0.0300 data: 0.0003 max mem: 5308 [16:40:20.643227] Test: [ 80/560] eta: 0:00:23 loss: 0.3253 (0.3775) auc: 96.3563 (93.6133) time: 0.0299 data: 0.0003 max mem: 5308 [16:40:20.937792] Test: [ 90/560] eta: 0:00:22 loss: 0.3180 (0.3777) auc: 96.3563 (93.7296) time: 0.0295 data: 0.0002 max mem: 5308 [16:40:21.232956] Test: [100/560] eta: 0:00:20 loss: 0.3612 (0.3754) auc: 95.6710 (93.9056) time: 0.0294 data: 0.0002 max mem: 5308 [16:40:21.529422] Test: [110/560] eta: 0:00:19 loss: 0.3424 (0.3735) auc: 96.0938 (93.9680) time: 0.0295 data: 0.0002 max mem: 5308 [16:40:21.823675] Test: [120/560] eta: 0:00:18 loss: 0.3904 (0.3800) auc: 91.2698 (93.6949) time: 0.0294 data: 0.0002 max mem: 5308 [16:40:22.118582] Test: [130/560] eta: 0:00:17 loss: 0.3918 (0.3804) auc: 92.0635 (93.7517) time: 0.0294 data: 0.0002 max mem: 5308 [16:40:22.413047] Test: [140/560] eta: 0:00:17 loss: 0.3775 
(0.3816) auc: 94.7368 (93.7320) time: 0.0294 data: 0.0002 max mem: 5308 [16:40:22.708760] Test: [150/560] eta: 0:00:16 loss: 0.3741 (0.3801) auc: 95.1417 (93.8032) time: 0.0294 data: 0.0002 max mem: 5308 [16:40:23.007112] Test: [160/560] eta: 0:00:15 loss: 0.3328 (0.3759) auc: 96.8627 (94.0329) time: 0.0296 data: 0.0002 max mem: 5308 [16:40:23.309228] Test: [170/560] eta: 0:00:15 loss: 0.3082 (0.3742) auc: 96.8750 (94.1022) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:23.610047] Test: [180/560] eta: 0:00:14 loss: 0.3351 (0.3746) auc: 95.5466 (94.1337) time: 0.0300 data: 0.0002 max mem: 5308 [16:40:23.912681] Test: [190/560] eta: 0:00:14 loss: 0.3845 (0.3755) auc: 92.5490 (94.0857) time: 0.0301 data: 0.0002 max mem: 5308 [16:40:24.217596] Test: [200/560] eta: 0:00:13 loss: 0.3530 (0.3737) auc: 93.7255 (94.1707) time: 0.0303 data: 0.0002 max mem: 5308 [16:40:24.514705] Test: [210/560] eta: 0:00:13 loss: 0.3530 (0.3746) auc: 94.2460 (94.1157) time: 0.0300 data: 0.0002 max mem: 5308 [16:40:24.818878] Test: [220/560] eta: 0:00:12 loss: 0.3847 (0.3744) auc: 94.5312 (94.1312) time: 0.0300 data: 0.0002 max mem: 5308 [16:40:25.123082] Test: [230/560] eta: 0:00:12 loss: 0.3771 (0.3744) auc: 94.5098 (94.1270) time: 0.0303 data: 0.0003 max mem: 5308 [16:40:25.424692] Test: [240/560] eta: 0:00:11 loss: 0.3674 (0.3741) auc: 93.7255 (94.1363) time: 0.0302 data: 0.0003 max mem: 5308 [16:40:25.728324] Test: [250/560] eta: 0:00:11 loss: 0.3531 (0.3740) auc: 94.0476 (94.1160) time: 0.0302 data: 0.0002 max mem: 5308 [16:40:26.029705] Test: [260/560] eta: 0:00:10 loss: 0.3019 (0.3719) auc: 95.2941 (94.2010) time: 0.0301 data: 0.0002 max mem: 5308 [16:40:26.329324] Test: [270/560] eta: 0:00:10 loss: 0.3676 (0.3725) auc: 94.8413 (94.1887) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:26.629267] Test: [280/560] eta: 0:00:09 loss: 0.3593 (0.3707) auc: 95.6349 (94.2884) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:26.928617] Test: [290/560] eta: 0:00:09 loss: 0.3593 (0.3717) auc: 
96.5368 (94.2940) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:27.230075] Test: [300/560] eta: 0:00:09 loss: 0.3833 (0.3708) auc: 95.2381 (94.3361) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:27.530149] Test: [310/560] eta: 0:00:08 loss: 0.3677 (0.3712) auc: 94.3723 (94.2765) time: 0.0300 data: 0.0002 max mem: 5308 [16:40:27.831464] Test: [320/560] eta: 0:00:08 loss: 0.3743 (0.3717) auc: 92.9412 (94.2592) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:28.131337] Test: [330/560] eta: 0:00:07 loss: 0.3712 (0.3714) auc: 94.5833 (94.2804) time: 0.0300 data: 0.0002 max mem: 5308 [16:40:28.430530] Test: [340/560] eta: 0:00:07 loss: 0.3503 (0.3708) auc: 95.5466 (94.3077) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:28.731597] Test: [350/560] eta: 0:00:07 loss: 0.3503 (0.3704) auc: 95.3125 (94.3168) time: 0.0299 data: 0.0002 max mem: 5308 [16:40:29.033453] Test: [360/560] eta: 0:00:06 loss: 0.3745 (0.3716) auc: 94.0476 (94.2794) time: 0.0300 data: 0.0002 max mem: 5308 [16:40:29.335337] Test: [370/560] eta: 0:00:06 loss: 0.3907 (0.3719) auc: 94.1176 (94.3029) time: 0.0301 data: 0.0002 max mem: 5308 [16:40:29.637790] Test: [380/560] eta: 0:00:06 loss: 0.3695 (0.3719) auc: 93.3333 (94.2928) time: 0.0301 data: 0.0003 max mem: 5308 [16:40:29.938532] Test: [390/560] eta: 0:00:05 loss: 0.3512 (0.3723) auc: 94.1406 (94.2889) time: 0.0300 data: 0.0003 max mem: 5308 [16:40:30.242376] Test: [400/560] eta: 0:00:05 loss: 0.3946 (0.3734) auc: 94.4444 (94.2279) time: 0.0301 data: 0.0002 max mem: 5308 [16:40:30.542145] Test: [410/560] eta: 0:00:05 loss: 0.3965 (0.3739) auc: 91.2698 (94.1800) time: 0.0301 data: 0.0002 max mem: 5308 [16:40:30.837880] Test: [420/560] eta: 0:00:04 loss: 0.3843 (0.3738) auc: 92.5781 (94.1766) time: 0.0297 data: 0.0002 max mem: 5308 [16:40:31.131797] Test: [430/560] eta: 0:00:04 loss: 0.3280 (0.3728) auc: 95.4365 (94.2180) time: 0.0294 data: 0.0001 max mem: 5308 [16:40:31.426822] Test: [440/560] eta: 0:00:04 loss: 0.3213 (0.3721) auc: 97.2222 (94.2700) 
time: 0.0294 data: 0.0001 max mem: 5308 [16:40:31.721996] Test: [450/560] eta: 0:00:03 loss: 0.3544 (0.3724) auc: 95.5466 (94.2590) time: 0.0294 data: 0.0001 max mem: 5308 [16:40:32.044833] Test: [460/560] eta: 0:00:03 loss: 0.3787 (0.3733) auc: 94.1406 (94.2267) time: 0.0308 data: 0.0004 max mem: 5308 [16:40:32.355967] Test: [470/560] eta: 0:00:02 loss: 0.3958 (0.3735) auc: 94.1406 (94.2243) time: 0.0315 data: 0.0004 max mem: 5308 [16:40:32.654999] Test: [480/560] eta: 0:00:02 loss: 0.3958 (0.3735) auc: 94.1406 (94.2142) time: 0.0303 data: 0.0002 max mem: 5308 [16:40:32.957986] Test: [490/560] eta: 0:00:02 loss: 0.3516 (0.3729) auc: 94.3320 (94.2351) time: 0.0300 data: 0.0002 max mem: 5308 [16:40:33.262547] Test: [500/560] eta: 0:00:01 loss: 0.3226 (0.3724) auc: 94.9020 (94.2624) time: 0.0302 data: 0.0002 max mem: 5308 [16:40:33.573445] Test: [510/560] eta: 0:00:01 loss: 0.3202 (0.3721) auc: 95.2381 (94.2822) time: 0.0306 data: 0.0003 max mem: 5308 [16:40:33.878367] Test: [520/560] eta: 0:00:01 loss: 0.3958 (0.3731) auc: 91.9028 (94.2206) time: 0.0306 data: 0.0003 max mem: 5308 [16:40:34.186354] Test: [530/560] eta: 0:00:00 loss: 0.4133 (0.3742) auc: 90.6883 (94.1936) time: 0.0305 data: 0.0002 max mem: 5308 [16:40:34.491950] Test: [540/560] eta: 0:00:00 loss: 0.4007 (0.3748) auc: 94.6860 (94.1792) time: 0.0306 data: 0.0004 max mem: 5308 [16:40:34.807089] Test: [550/560] eta: 0:00:00 loss: 0.3773 (0.3748) auc: 96.0317 (94.2335) time: 0.0309 data: 0.0004 max mem: 5308 [16:40:35.060713] Test: [559/560] eta: 0:00:00 loss: 0.3609 (0.3746) auc: 95.9514 (94.2390) time: 0.0298 data: 0.0002 max mem: 5308 [16:40:35.251088] Test: Total time: 0:00:18 (0.0332 s / it) [16:40:35.252738] * Auc 94.252 loss 0.374 [16:40:35.253060] AUC of the network on the 35796 val images: 94.25% [16:40:35.253098] Max auc: 94.25% [16:40:35.253141] Save model with min_val_loss at epoch: 4 [16:40:42.176151] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames [16:40:43.468897] Epoch: [5] [ 
0/2877] eta: 1:01:54 lr: 0.000063 loss: 0.5772 (0.5772) time: 1.2910 data: 1.1455 max mem: 5308 [16:40:57.012389] Epoch: [5] [ 100/2877] eta: 0:06:47 lr: 0.000062 loss: 0.6558 (0.6514) time: 0.1356 data: 0.0002 max mem: 5308 [16:41:10.585197] Epoch: [5] [ 200/2877] eta: 0:06:18 lr: 0.000062 loss: 0.6386 (0.6489) time: 0.1352 data: 0.0002 max mem: 5308 [16:41:24.182124] Epoch: [5] [ 300/2877] eta: 0:05:59 lr: 0.000062 loss: 0.6481 (0.6462) time: 0.1342 data: 0.0001 max mem: 5308 [16:41:37.615728] Epoch: [5] [ 400/2877] eta: 0:05:42 lr: 0.000062 loss: 0.6311 (0.6430) time: 0.1341 data: 0.0001 max mem: 5308 [16:41:51.353112] Epoch: [5] [ 500/2877] eta: 0:05:28 lr: 0.000062 loss: 0.6430 (0.6411) time: 0.1382 data: 0.0002 max mem: 5308 [16:42:04.927749] Epoch: [5] [ 600/2877] eta: 0:05:13 lr: 0.000062 loss: 0.6408 (0.6417) time: 0.1348 data: 0.0002 max mem: 5308 [16:42:18.387655] Epoch: [5] [ 700/2877] eta: 0:04:58 lr: 0.000062 loss: 0.6562 (0.6414) time: 0.1345 data: 0.0002 max mem: 5308 [16:42:31.892206] Epoch: [5] [ 800/2877] eta: 0:04:44 lr: 0.000062 loss: 0.6654 (0.6425) time: 0.1353 data: 0.0001 max mem: 5308 [16:42:45.447795] Epoch: [5] [ 900/2877] eta: 0:04:30 lr: 0.000062 loss: 0.6338 (0.6422) time: 0.1361 data: 0.0002 max mem: 5308 [16:42:58.998026] Epoch: [5] [1000/2877] eta: 0:04:16 lr: 0.000062 loss: 0.6396 (0.6424) time: 0.1355 data: 0.0001 max mem: 5308 [16:43:12.538461] Epoch: [5] [1100/2877] eta: 0:04:02 lr: 0.000062 loss: 0.6217 (0.6406) time: 0.1359 data: 0.0001 max mem: 5308 [16:43:26.084964] Epoch: [5] [1200/2877] eta: 0:03:48 lr: 0.000061 loss: 0.6310 (0.6404) time: 0.1352 data: 0.0001 max mem: 5308 [16:43:39.714357] Epoch: [5] [1300/2877] eta: 0:03:35 lr: 0.000061 loss: 0.6452 (0.6403) time: 0.1373 data: 0.0002 max mem: 5308 [16:43:53.378611] Epoch: [5] [1400/2877] eta: 0:03:21 lr: 0.000061 loss: 0.6413 (0.6405) time: 0.1358 data: 0.0001 max mem: 5308 [16:44:07.108151] Epoch: [5] [1500/2877] eta: 0:03:07 lr: 0.000061 loss: 0.6363 (0.6406) time: 
0.1373 data: 0.0002 max mem: 5308 [16:44:20.632945] Epoch: [5] [1600/2877] eta: 0:02:54 lr: 0.000061 loss: 0.6070 (0.6396) time: 0.1346 data: 0.0002 max mem: 5308 [16:44:34.235762] Epoch: [5] [1700/2877] eta: 0:02:40 lr: 0.000060 loss: 0.6470 (0.6391) time: 0.1358 data: 0.0002 max mem: 5308 [16:44:47.837088] Epoch: [5] [1800/2877] eta: 0:02:26 lr: 0.000060 loss: 0.6069 (0.6387) time: 0.1347 data: 0.0002 max mem: 5308 [16:45:01.361648] Epoch: [5] [1900/2877] eta: 0:02:13 lr: 0.000060 loss: 0.6338 (0.6386) time: 0.1347 data: 0.0001 max mem: 5308 [16:45:14.952500] Epoch: [5] [2000/2877] eta: 0:01:59 lr: 0.000060 loss: 0.6231 (0.6382) time: 0.1364 data: 0.0001 max mem: 5308 [16:45:28.564329] Epoch: [5] [2100/2877] eta: 0:01:45 lr: 0.000059 loss: 0.6318 (0.6380) time: 0.1365 data: 0.0002 max mem: 5308 [16:45:42.300997] Epoch: [5] [2200/2877] eta: 0:01:32 lr: 0.000059 loss: 0.6595 (0.6381) time: 0.1361 data: 0.0002 max mem: 5308 [16:45:55.951614] Epoch: [5] [2300/2877] eta: 0:01:18 lr: 0.000059 loss: 0.6247 (0.6380) time: 0.1375 data: 0.0002 max mem: 5308 [16:46:09.693003] Epoch: [5] [2400/2877] eta: 0:01:05 lr: 0.000058 loss: 0.6491 (0.6379) time: 0.1376 data: 0.0002 max mem: 5308 [16:46:23.322810] Epoch: [5] [2500/2877] eta: 0:00:51 lr: 0.000058 loss: 0.6367 (0.6377) time: 0.1359 data: 0.0001 max mem: 5308 [16:46:36.865571] Epoch: [5] [2600/2877] eta: 0:00:37 lr: 0.000058 loss: 0.6260 (0.6374) time: 0.1364 data: 0.0002 max mem: 5308 [16:46:50.532848] Epoch: [5] [2700/2877] eta: 0:00:24 lr: 0.000057 loss: 0.6473 (0.6372) time: 0.1377 data: 0.0002 max mem: 5308 [16:47:04.219073] Epoch: [5] [2800/2877] eta: 0:00:10 lr: 0.000057 loss: 0.6279 (0.6370) time: 0.1362 data: 0.0001 max mem: 5308 [16:47:14.569461] Epoch: [5] [2876/2877] eta: 0:00:00 lr: 0.000057 loss: 0.6627 (0.6368) time: 0.1348 data: 0.0003 max mem: 5308 [16:47:14.788519] Epoch: [5] Total time: 0:06:32 (0.1365 s / it) [16:47:14.792874] Averaged stats: lr: 0.000057 loss: 0.6627 (0.6372) [16:47:16.343459] Test: [ 
0/560] eta: 0:14:25 loss: 0.3417 (0.3417) auc: 93.3333 (93.3333) time: 1.5456 data: 1.5114 max mem: 5308 [16:47:16.638769] Test: [ 10/560] eta: 0:01:32 loss: 0.3665 (0.3749) auc: 92.5000 (91.9170) time: 0.1673 data: 0.1375 max mem: 5308 [16:47:16.932525] Test: [ 20/560] eta: 0:00:54 loss: 0.3368 (0.3430) auc: 94.3320 (93.9786) time: 0.0294 data: 0.0002 max mem: 5308 [16:47:17.226436] Test: [ 30/560] eta: 0:00:41 loss: 0.2718 (0.3218) auc: 97.2222 (94.7001) time: 0.0293 data: 0.0001 max mem: 5308 [16:47:17.521685] Test: [ 40/560] eta: 0:00:34 loss: 0.2909 (0.3223) auc: 97.2222 (94.6276) time: 0.0294 data: 0.0002 max mem: 5308 [16:47:17.816253] Test: [ 50/560] eta: 0:00:30 loss: 0.3022 (0.3232) auc: 96.0317 (94.8007) time: 0.0294 data: 0.0001 max mem: 5308 [16:47:18.112901] Test: [ 60/560] eta: 0:00:27 loss: 0.3461 (0.3246) auc: 95.3125 (94.6625) time: 0.0295 data: 0.0001 max mem: 5308 [16:47:18.406924] Test: [ 70/560] eta: 0:00:24 loss: 0.3461 (0.3220) auc: 93.5065 (94.6922) time: 0.0295 data: 0.0001 max mem: 5308 [16:47:18.702249] Test: [ 80/560] eta: 0:00:23 loss: 0.2888 (0.3173) auc: 95.8333 (94.8807) time: 0.0294 data: 0.0001 max mem: 5308 [16:47:18.997653] Test: [ 90/560] eta: 0:00:21 loss: 0.2574 (0.3169) auc: 96.4286 (94.9502) time: 0.0294 data: 0.0002 max mem: 5308 [16:47:19.297397] Test: [100/560] eta: 0:00:20 loss: 0.2890 (0.3137) auc: 96.0938 (95.1163) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:19.596488] Test: [110/560] eta: 0:00:19 loss: 0.2890 (0.3131) auc: 96.0938 (95.0909) time: 0.0299 data: 0.0002 max mem: 5308 [16:47:19.897170] Test: [120/560] eta: 0:00:18 loss: 0.3217 (0.3185) auc: 92.9412 (94.9032) time: 0.0299 data: 0.0002 max mem: 5308 [16:47:20.191481] Test: [130/560] eta: 0:00:17 loss: 0.3217 (0.3193) auc: 95.2381 (94.9605) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:20.503939] Test: [140/560] eta: 0:00:16 loss: 0.3146 (0.3200) auc: 96.2500 (94.9655) time: 0.0303 data: 0.0002 max mem: 5308 [16:47:20.798073] Test: [150/560] eta: 0:00:16 
loss: 0.3055 (0.3189) auc: 96.2500 (95.0111) time: 0.0303 data: 0.0002 max mem: 5308 [16:47:21.093433] Test: [160/560] eta: 0:00:15 loss: 0.2829 (0.3145) auc: 97.2222 (95.1870) time: 0.0294 data: 0.0002 max mem: 5308 [16:47:21.392092] Test: [170/560] eta: 0:00:15 loss: 0.2550 (0.3128) auc: 96.8750 (95.2374) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:21.692040] Test: [180/560] eta: 0:00:14 loss: 0.2721 (0.3133) auc: 96.3563 (95.2388) time: 0.0298 data: 0.0002 max mem: 5308 [16:47:21.990112] Test: [190/560] eta: 0:00:13 loss: 0.3207 (0.3139) auc: 94.5098 (95.2146) time: 0.0298 data: 0.0002 max mem: 5308 [16:47:22.288257] Test: [200/560] eta: 0:00:13 loss: 0.2928 (0.3118) auc: 94.9219 (95.3148) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:22.587947] Test: [210/560] eta: 0:00:12 loss: 0.2958 (0.3127) auc: 95.6710 (95.2594) time: 0.0298 data: 0.0002 max mem: 5308 [16:47:22.887044] Test: [220/560] eta: 0:00:12 loss: 0.3158 (0.3127) auc: 95.5466 (95.2623) time: 0.0299 data: 0.0002 max mem: 5308 [16:47:23.186790] Test: [230/560] eta: 0:00:11 loss: 0.3048 (0.3126) auc: 95.5466 (95.2817) time: 0.0299 data: 0.0002 max mem: 5308 [16:47:23.486500] Test: [240/560] eta: 0:00:11 loss: 0.2938 (0.3122) auc: 96.2302 (95.2830) time: 0.0299 data: 0.0002 max mem: 5308 [16:47:23.787548] Test: [250/560] eta: 0:00:11 loss: 0.3221 (0.3123) auc: 95.4167 (95.2702) time: 0.0299 data: 0.0002 max mem: 5308 [16:47:24.083484] Test: [260/560] eta: 0:00:10 loss: 0.2594 (0.3105) auc: 96.0784 (95.3243) time: 0.0298 data: 0.0002 max mem: 5308 [16:47:24.379254] Test: [270/560] eta: 0:00:10 loss: 0.3123 (0.3113) auc: 94.6429 (95.3007) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:24.676299] Test: [280/560] eta: 0:00:09 loss: 0.2806 (0.3090) auc: 96.7611 (95.3904) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:24.974137] Test: [290/560] eta: 0:00:09 loss: 0.2896 (0.3099) auc: 96.7611 (95.3811) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:25.271567] Test: [300/560] eta: 0:00:09 loss: 0.3148 
(0.3091) auc: 95.2941 (95.4169) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:25.567666] Test: [310/560] eta: 0:00:08 loss: 0.2955 (0.3095) auc: 95.2941 (95.3757) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:25.865425] Test: [320/560] eta: 0:00:08 loss: 0.3148 (0.3099) auc: 94.9020 (95.3609) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:26.163346] Test: [330/560] eta: 0:00:07 loss: 0.3085 (0.3096) auc: 95.6349 (95.3773) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:26.461581] Test: [340/560] eta: 0:00:07 loss: 0.2823 (0.3092) auc: 96.4286 (95.3975) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:26.757227] Test: [350/560] eta: 0:00:07 loss: 0.2823 (0.3088) auc: 96.0938 (95.4076) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:27.053019] Test: [360/560] eta: 0:00:06 loss: 0.3106 (0.3097) auc: 95.6863 (95.3927) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:27.349069] Test: [370/560] eta: 0:00:06 loss: 0.3244 (0.3101) auc: 95.0000 (95.3994) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:27.645862] Test: [380/560] eta: 0:00:06 loss: 0.3064 (0.3100) auc: 95.0000 (95.3999) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:27.942192] Test: [390/560] eta: 0:00:05 loss: 0.2989 (0.3105) auc: 95.6863 (95.3849) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:28.238616] Test: [400/560] eta: 0:00:05 loss: 0.3050 (0.3115) auc: 96.0317 (95.3316) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:28.534610] Test: [410/560] eta: 0:00:05 loss: 0.3357 (0.3119) auc: 94.4444 (95.3020) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:28.830570] Test: [420/560] eta: 0:00:04 loss: 0.3204 (0.3122) auc: 94.4444 (95.2765) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:29.126578] Test: [430/560] eta: 0:00:04 loss: 0.2655 (0.3112) auc: 96.4706 (95.3185) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:29.423429] Test: [440/560] eta: 0:00:03 loss: 0.2513 (0.3105) auc: 98.0159 (95.3584) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:29.719809] Test: [450/560] eta: 0:00:03 loss: 0.2803 (0.3107) auc: 
96.7611 (95.3477) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:30.016181] Test: [460/560] eta: 0:00:03 loss: 0.3275 (0.3115) auc: 95.3125 (95.3159) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:30.312346] Test: [470/560] eta: 0:00:02 loss: 0.3496 (0.3118) auc: 95.2941 (95.3085) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:30.610200] Test: [480/560] eta: 0:00:02 loss: 0.3201 (0.3117) auc: 94.9219 (95.2830) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:30.905512] Test: [490/560] eta: 0:00:02 loss: 0.2932 (0.3112) auc: 94.9219 (95.3049) time: 0.0296 data: 0.0002 max mem: 5308 [16:47:31.202447] Test: [500/560] eta: 0:00:01 loss: 0.2709 (0.3107) auc: 97.6471 (95.3380) time: 0.0295 data: 0.0002 max mem: 5308 [16:47:31.500211] Test: [510/560] eta: 0:00:01 loss: 0.2687 (0.3102) auc: 97.2549 (95.3574) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:31.798296] Test: [520/560] eta: 0:00:01 loss: 0.3191 (0.3113) auc: 94.5098 (95.3149) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:32.095839] Test: [530/560] eta: 0:00:00 loss: 0.3565 (0.3122) auc: 94.1176 (95.3098) time: 0.0297 data: 0.0002 max mem: 5308 [16:47:32.392225] Test: [540/560] eta: 0:00:00 loss: 0.3362 (0.3127) auc: 95.2941 (95.2958) time: 0.0296 data: 0.0003 max mem: 5308 [16:47:32.680938] Test: [550/560] eta: 0:00:00 loss: 0.2815 (0.3122) auc: 97.9757 (95.3462) time: 0.0292 data: 0.0002 max mem: 5308 [16:47:32.924339] Test: [559/560] eta: 0:00:00 loss: 0.2856 (0.3124) auc: 97.9757 (95.3413) time: 0.0280 data: 0.0001 max mem: 5308 [16:47:33.084797] Test: Total time: 0:00:18 (0.0327 s / it) [16:47:33.086421] * Auc 95.324 loss 0.312 [16:47:33.086763] AUC of the network on the 35796 val images: 95.32% [16:47:33.086826] Max auc: 95.32% [16:47:33.086878] Save model with min_val_loss at epoch: 5 [16:47:39.406623] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames [16:47:40.599153] Epoch: [6] [ 0/2877] eta: 0:57:05 lr: 0.000057 loss: 0.6522 (0.6522) time: 1.1907 data: 1.0428 max mem: 5308 [16:47:54.080706] 
Epoch: [6] [ 100/2877] eta: 0:06:43 lr: 0.000056 loss: 0.6370 (0.6369) time: 0.1342 data: 0.0001 max mem: 5308 [16:48:07.638272] Epoch: [6] [ 200/2877] eta: 0:06:15 lr: 0.000056 loss: 0.6109 (0.6326) time: 0.1360 data: 0.0002 max mem: 5308 [16:48:21.278878] Epoch: [6] [ 300/2877] eta: 0:05:58 lr: 0.000055 loss: 0.6318 (0.6329) time: 0.1379 data: 0.0002 max mem: 5308 [16:48:35.185213] Epoch: [6] [ 400/2877] eta: 0:05:44 lr: 0.000055 loss: 0.6355 (0.6350) time: 0.1378 data: 0.0002 max mem: 5308 [16:48:48.909980] Epoch: [6] [ 500/2877] eta: 0:05:29 lr: 0.000055 loss: 0.6371 (0.6341) time: 0.1378 data: 0.0002 max mem: 5308 [16:49:02.676973] Epoch: [6] [ 600/2877] eta: 0:05:15 lr: 0.000054 loss: 0.6495 (0.6320) time: 0.1353 data: 0.0001 max mem: 5308 [16:49:16.152461] Epoch: [6] [ 700/2877] eta: 0:05:00 lr: 0.000054 loss: 0.6092 (0.6322) time: 0.1347 data: 0.0001 max mem: 5308 [16:49:29.648020] Epoch: [6] [ 800/2877] eta: 0:04:45 lr: 0.000053 loss: 0.6313 (0.6323) time: 0.1362 data: 0.0002 max mem: 5308 [16:49:43.445985] Epoch: [6] [ 900/2877] eta: 0:04:32 lr: 0.000053 loss: 0.6463 (0.6330) time: 0.1376 data: 0.0002 max mem: 5308 [16:49:57.115401] Epoch: [6] [1000/2877] eta: 0:04:18 lr: 0.000052 loss: 0.6471 (0.6339) time: 0.1389 data: 0.0002 max mem: 5308 [16:50:10.822280] Epoch: [6] [1100/2877] eta: 0:04:04 lr: 0.000052 loss: 0.6291 (0.6335) time: 0.1371 data: 0.0002 max mem: 5308 [16:50:24.466554] Epoch: [6] [1200/2877] eta: 0:03:50 lr: 0.000051 loss: 0.6272 (0.6325) time: 0.1369 data: 0.0002 max mem: 5308 [16:50:38.192054] Epoch: [6] [1300/2877] eta: 0:03:36 lr: 0.000051 loss: 0.6416 (0.6324) time: 0.1374 data: 0.0002 max mem: 5308 [16:50:51.799162] Epoch: [6] [1400/2877] eta: 0:03:22 lr: 0.000050 loss: 0.6062 (0.6314) time: 0.1356 data: 0.0001 max mem: 5308 [16:51:05.428294] Epoch: [6] [1500/2877] eta: 0:03:08 lr: 0.000049 loss: 0.6197 (0.6308) time: 0.1352 data: 0.0001 max mem: 5308 [16:51:19.189556] Epoch: [6] [1600/2877] eta: 0:02:55 lr: 0.000049 loss: 0.6336 
(0.6308) time: 0.1380 data: 0.0001 max mem: 5308 [16:51:32.793653] Epoch: [6] [1700/2877] eta: 0:02:41 lr: 0.000048 loss: 0.6170 (0.6309) time: 0.1356 data: 0.0002 max mem: 5308 [16:51:46.347403] Epoch: [6] [1800/2877] eta: 0:02:27 lr: 0.000048 loss: 0.6475 (0.6308) time: 0.1359 data: 0.0001 max mem: 5308 [16:51:59.925107] Epoch: [6] [1900/2877] eta: 0:02:13 lr: 0.000047 loss: 0.6297 (0.6306) time: 0.1350 data: 0.0002 max mem: 5308 [16:52:13.623957] Epoch: [6] [2000/2877] eta: 0:02:00 lr: 0.000047 loss: 0.6358 (0.6312) time: 0.1365 data: 0.0001 max mem: 5308 [16:52:27.116564] Epoch: [6] [2100/2877] eta: 0:01:46 lr: 0.000046 loss: 0.6350 (0.6306) time: 0.1357 data: 0.0001 max mem: 5308 [16:52:40.622149] Epoch: [6] [2200/2877] eta: 0:01:32 lr: 0.000045 loss: 0.6371 (0.6305) time: 0.1356 data: 0.0001 max mem: 5308 [16:52:54.351357] Epoch: [6] [2300/2877] eta: 0:01:18 lr: 0.000045 loss: 0.6300 (0.6302) time: 0.1362 data: 0.0001 max mem: 5308 [16:53:07.977226] Epoch: [6] [2400/2877] eta: 0:01:05 lr: 0.000044 loss: 0.6274 (0.6301) time: 0.1350 data: 0.0001 max mem: 5308 [16:53:21.601472] Epoch: [6] [2500/2877] eta: 0:00:51 lr: 0.000044 loss: 0.6433 (0.6301) time: 0.1381 data: 0.0002 max mem: 5308 [16:53:35.268222] Epoch: [6] [2600/2877] eta: 0:00:37 lr: 0.000043 loss: 0.6139 (0.6298) time: 0.1373 data: 0.0002 max mem: 5308 [16:53:48.920531] Epoch: [6] [2700/2877] eta: 0:00:24 lr: 0.000042 loss: 0.6323 (0.6299) time: 0.1353 data: 0.0001 max mem: 5308 [16:54:02.416730] Epoch: [6] [2800/2877] eta: 0:00:10 lr: 0.000042 loss: 0.6214 (0.6301) time: 0.1363 data: 0.0002 max mem: 5308 [16:54:12.985575] Epoch: [6] [2876/2877] eta: 0:00:00 lr: 0.000041 loss: 0.6351 (0.6300) time: 0.1388 data: 0.0004 max mem: 5308 [16:54:13.257016] Epoch: [6] Total time: 0:06:33 (0.1369 s / it) [16:54:13.263624] Averaged stats: lr: 0.000041 loss: 0.6351 (0.6286) [16:54:14.810265] Test: [ 0/560] eta: 0:14:24 loss: 0.3152 (0.3152) auc: 96.0784 (96.0784) time: 1.5433 data: 1.5067 max mem: 5308 
[16:54:15.152883] Test: [ 10/560] eta: 0:01:34 loss: 0.3369 (0.3541) auc: 94.1667 (92.9577) time: 0.1713 data: 0.1408 max mem: 5308
[16:54:15.453292] Test: [ 20/560] eta: 0:00:56 loss: 0.3322 (0.3237) auc: 94.6429 (94.7081) time: 0.0320 data: 0.0022 max mem: 5308
[16:54:15.755146] Test: [ 30/560] eta: 0:00:42 loss: 0.2635 (0.3032) auc: 97.5709 (95.3566) time: 0.0300 data: 0.0002 max mem: 5308
[16:54:16.062704] Test: [ 40/560] eta: 0:00:35 loss: 0.2684 (0.3025) auc: 96.6667 (95.3752) time: 0.0303 data: 0.0003 max mem: 5308
[16:54:16.366571] Test: [ 50/560] eta: 0:00:30 loss: 0.2769 (0.3024) auc: 95.2381 (95.5836) time: 0.0304 data: 0.0003 max mem: 5308
[16:54:16.664267] Test: [ 60/560] eta: 0:00:27 loss: 0.3153 (0.3037) auc: 95.4167 (95.4739) time: 0.0300 data: 0.0002 max mem: 5308
[16:54:16.966503] Test: [ 70/560] eta: 0:00:25 loss: 0.2950 (0.3009) auc: 95.4167 (95.5010) time: 0.0299 data: 0.0002 max mem: 5308
[16:54:17.267627] Test: [ 80/560] eta: 0:00:23 loss: 0.2617 (0.2951) auc: 96.0317 (95.6791) time: 0.0301 data: 0.0002 max mem: 5308
[16:54:17.568580] Test: [ 90/560] eta: 0:00:22 loss: 0.2432 (0.2947) auc: 96.8254 (95.7502) time: 0.0300 data: 0.0002 max mem: 5308
[16:54:17.870989] Test: [100/560] eta: 0:00:20 loss: 0.2532 (0.2906) auc: 96.8254 (95.9114) time: 0.0301 data: 0.0002 max mem: 5308
[16:54:18.172584] Test: [110/560] eta: 0:00:19 loss: 0.2608 (0.2904) auc: 96.9697 (95.8937) time: 0.0301 data: 0.0002 max mem: 5308
[16:54:18.471735] Test: [120/560] eta: 0:00:18 loss: 0.3035 (0.2965) auc: 93.7198 (95.6860) time: 0.0299 data: 0.0002 max mem: 5308
[16:54:18.771099] Test: [130/560] eta: 0:00:18 loss: 0.3088 (0.2978) auc: 94.6429 (95.7137) time: 0.0298 data: 0.0002 max mem: 5308
[16:54:19.070366] Test: [140/560] eta: 0:00:17 loss: 0.2994 (0.2984) auc: 96.4844 (95.7124) time: 0.0298 data: 0.0002 max mem: 5308
[16:54:19.369488] Test: [150/560] eta: 0:00:16 loss: 0.2794 (0.2969) auc: 95.9514 (95.7346) time: 0.0298 data: 0.0002 max mem: 5308
[16:54:19.668899] Test: [160/560] eta: 0:00:15 loss: 0.2472 (0.2923) auc: 98.0159 (95.9040) time: 0.0298 data: 0.0002 max mem: 5308
[16:54:19.967742] Test: [170/560] eta: 0:00:15 loss: 0.2297 (0.2909) auc: 97.9167 (95.9473) time: 0.0298 data: 0.0002 max mem: 5308
[16:54:20.266995] Test: [180/560] eta: 0:00:14 loss: 0.2545 (0.2916) auc: 97.5709 (95.9468) time: 0.0298 data: 0.0002 max mem: 5308
[16:54:20.561694] Test: [190/560] eta: 0:00:14 loss: 0.2947 (0.2923) auc: 96.0317 (95.9191) time: 0.0296 data: 0.0002 max mem: 5308
[16:54:20.855374] Test: [200/560] eta: 0:00:13 loss: 0.2833 (0.2904) auc: 95.6349 (95.9764) time: 0.0294 data: 0.0002 max mem: 5308
[16:54:21.148366] Test: [210/560] eta: 0:00:13 loss: 0.2893 (0.2913) auc: 96.7611 (95.9413) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:21.441679] Test: [220/560] eta: 0:00:12 loss: 0.2901 (0.2910) auc: 96.7611 (95.9664) time: 0.0292 data: 0.0002 max mem: 5308
[16:54:21.734709] Test: [230/560] eta: 0:00:12 loss: 0.2897 (0.2911) auc: 96.0784 (95.9685) time: 0.0292 data: 0.0001 max mem: 5308
[16:54:22.028462] Test: [240/560] eta: 0:00:11 loss: 0.2982 (0.2909) auc: 96.0317 (95.9659) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:22.321558] Test: [250/560] eta: 0:00:11 loss: 0.3017 (0.2913) auc: 95.6863 (95.9489) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:22.615472] Test: [260/560] eta: 0:00:10 loss: 0.2435 (0.2895) auc: 96.2500 (96.0034) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:22.908898] Test: [270/560] eta: 0:00:10 loss: 0.2808 (0.2904) auc: 95.0000 (95.9762) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:23.202246] Test: [280/560] eta: 0:00:09 loss: 0.2720 (0.2880) auc: 96.8254 (96.0529) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:23.495724] Test: [290/560] eta: 0:00:09 loss: 0.2720 (0.2888) auc: 97.9167 (96.0496) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:23.788968] Test: [300/560] eta: 0:00:09 loss: 0.2885 (0.2879) auc: 96.3563 (96.0825) time: 0.0293 data: 0.0001 max mem: 5308
[16:54:24.082978] Test: [310/560] eta: 0:00:08 loss: 0.2698 (0.2883) auc: 96.0784 (96.0512) time: 0.0293 data: 0.0001 max mem: 5308
[16:54:24.376990] Test: [320/560] eta: 0:00:08 loss: 0.2904 (0.2886) auc: 95.6863 (96.0351) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:24.669213] Test: [330/560] eta: 0:00:07 loss: 0.2789 (0.2882) auc: 96.0417 (96.0405) time: 0.0292 data: 0.0001 max mem: 5308
[16:54:24.962415] Test: [340/560] eta: 0:00:07 loss: 0.2700 (0.2878) auc: 96.0784 (96.0502) time: 0.0292 data: 0.0001 max mem: 5308
[16:54:25.255968] Test: [350/560] eta: 0:00:07 loss: 0.2623 (0.2876) auc: 96.0784 (96.0496) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:25.548928] Test: [360/560] eta: 0:00:06 loss: 0.2944 (0.2887) auc: 96.0784 (96.0334) time: 0.0292 data: 0.0002 max mem: 5308
[16:54:25.840865] Test: [370/560] eta: 0:00:06 loss: 0.3074 (0.2891) auc: 96.4286 (96.0230) time: 0.0292 data: 0.0002 max mem: 5308
[16:54:26.136132] Test: [380/560] eta: 0:00:06 loss: 0.2839 (0.2889) auc: 96.0784 (96.0354) time: 0.0293 data: 0.0002 max mem: 5308
[16:54:26.430535] Test: [390/560] eta: 0:00:05 loss: 0.2736 (0.2893) auc: 96.4286 (96.0216) time: 0.0294 data: 0.0002 max mem: 5308
[16:54:26.725593] Test: [400/560] eta: 0:00:05 loss: 0.2790 (0.2904) auc: 96.4286 (95.9658) time: 0.0294 data: 0.0001 max mem: 5308
[16:54:27.019421] Test: [410/560] eta: 0:00:05 loss: 0.3138 (0.2909) auc: 94.1176 (95.9280) time: 0.0294 data: 0.0001 max mem: 5308
[16:54:27.313532] Test: [420/560] eta: 0:00:04 loss: 0.3102 (0.2910) auc: 94.5098 (95.9083) time: 0.0293 data: 0.0001 max mem: 5308
[16:54:27.607818] Test: [430/560] eta: 0:00:04 loss: 0.2360 (0.2900) auc: 97.2549 (95.9382) time: 0.0293 data: 0.0001 max mem: 5308
[16:54:27.903179] Test: [440/560] eta: 0:00:03 loss: 0.2216 (0.2891) auc: 98.4127 (95.9747) time: 0.0294 data: 0.0002 max mem: 5308
[16:54:28.197255] Test: [450/560] eta: 0:00:03 loss: 0.2510 (0.2891) auc: 97.9757 (95.9819) time: 0.0294 data: 0.0002 max mem: 5308
[16:54:28.492515] Test: [460/560] eta: 0:00:03 loss: 0.3055 (0.2899) auc: 97.1660 (95.9681) time: 0.0294 data: 0.0001 max mem: 5308
[16:54:28.788490] Test: [470/560] eta: 0:00:02 loss: 0.3258 (0.2903) auc: 95.6349 (95.9551) time: 0.0295 data: 0.0002 max mem: 5308
[16:54:29.083999] Test: [480/560] eta: 0:00:02 loss: 0.2915 (0.2901) auc: 94.5098 (95.9295) time: 0.0295 data: 0.0002 max mem: 5308
[16:54:29.378853] Test: [490/560] eta: 0:00:02 loss: 0.2696 (0.2894) auc: 95.2381 (95.9511) time: 0.0294 data: 0.0002 max mem: 5308
[16:54:29.674679] Test: [500/560] eta: 0:00:01 loss: 0.2494 (0.2890) auc: 96.6667 (95.9653) time: 0.0295 data: 0.0002 max mem: 5308
[16:54:29.970441] Test: [510/560] eta: 0:00:01 loss: 0.2494 (0.2886) auc: 96.3636 (95.9733) time: 0.0295 data: 0.0002 max mem: 5308
[16:54:30.266166] Test: [520/560] eta: 0:00:01 loss: 0.2964 (0.2897) auc: 94.9219 (95.9298) time: 0.0295 data: 0.0002 max mem: 5308
[16:54:30.562651] Test: [530/560] eta: 0:00:00 loss: 0.3291 (0.2907) auc: 94.9219 (95.9282) time: 0.0295 data: 0.0002 max mem: 5308
[16:54:30.857014] Test: [540/560] eta: 0:00:00 loss: 0.3206 (0.2912) auc: 96.1353 (95.9222) time: 0.0295 data: 0.0002 max mem: 5308
[16:54:31.149239] Test: [550/560] eta: 0:00:00 loss: 0.2766 (0.2907) auc: 98.8095 (95.9683) time: 0.0293 data: 0.0001 max mem: 5308
[16:54:31.393272] Test: [559/560] eta: 0:00:00 loss: 0.2626 (0.2908) auc: 98.7854 (95.9745) time: 0.0282 data: 0.0001 max mem: 5308
[16:54:31.550379] Test: Total time: 0:00:18 (0.0327 s / it)
[16:54:31.739843] * Auc 95.957 loss 0.291
[16:54:31.740015] AUC of the network on the 35796 val images: 95.96%
[16:54:31.740029] Max auc: 95.96%
[16:54:31.740045] Save model with min_val_loss at epoch: 6
[16:54:37.628563] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames
[16:54:38.756981] Epoch: [7] [ 0/2877] eta: 0:54:01 lr: 0.000041 loss: 0.6367 (0.6367) time: 1.1265 data: 0.9808 max mem: 5308
[16:54:52.284020] Epoch: [7] [ 100/2877] eta: 0:06:42 lr: 0.000041 loss: 0.6129 (0.6186) time: 0.1345 data: 0.0001 max mem: 5308
[16:55:05.723895] Epoch: [7] [ 200/2877] eta: 0:06:14 lr: 0.000040 loss: 0.6395 (0.6242) time: 0.1354 data: 0.0001 max mem: 5308
[16:55:19.305505] Epoch: [7] [ 300/2877] eta: 0:05:56 lr: 0.000039 loss: 0.6434 (0.6244) time: 0.1346 data: 0.0001 max mem: 5308
[16:55:32.813436] Epoch: [7] [ 400/2877] eta: 0:05:40 lr: 0.000039 loss: 0.6400 (0.6259) time: 0.1364 data: 0.0002 max mem: 5308
[16:55:46.326043] Epoch: [7] [ 500/2877] eta: 0:05:25 lr: 0.000038 loss: 0.6243 (0.6278) time: 0.1342 data: 0.0001 max mem: 5308
[16:55:59.878577] Epoch: [7] [ 600/2877] eta: 0:05:11 lr: 0.000037 loss: 0.6134 (0.6268) time: 0.1345 data: 0.0001 max mem: 5308
[16:56:13.342151] Epoch: [7] [ 700/2877] eta: 0:04:57 lr: 0.000037 loss: 0.6317 (0.6275) time: 0.1373 data: 0.0002 max mem: 5308
[16:56:26.863728] Epoch: [7] [ 800/2877] eta: 0:04:43 lr: 0.000036 loss: 0.6211 (0.6270) time: 0.1355 data: 0.0002 max mem: 5308
[16:56:40.656291] Epoch: [7] [ 900/2877] eta: 0:04:29 lr: 0.000035 loss: 0.6230 (0.6256) time: 0.1384 data: 0.0001 max mem: 5308
[16:56:54.326976] Epoch: [7] [1000/2877] eta: 0:04:16 lr: 0.000035 loss: 0.6272 (0.6249) time: 0.1371 data: 0.0001 max mem: 5308
[16:57:07.937514] Epoch: [7] [1100/2877] eta: 0:04:02 lr: 0.000034 loss: 0.6261 (0.6251) time: 0.1343 data: 0.0001 max mem: 5308
[16:57:21.563019] Epoch: [7] [1200/2877] eta: 0:03:48 lr: 0.000033 loss: 0.6326 (0.6244) time: 0.1379 data: 0.0001 max mem: 5308
[16:57:35.544482] Epoch: [7] [1300/2877] eta: 0:03:35 lr: 0.000033 loss: 0.6146 (0.6245) time: 0.1404 data: 0.0003 max mem: 5308
[16:57:49.259913] Epoch: [7] [1400/2877] eta: 0:03:21 lr: 0.000032 loss: 0.6135 (0.6237) time: 0.1364 data: 0.0002 max mem: 5308
[16:58:02.838837] Epoch: [7] [1500/2877] eta: 0:03:08 lr: 0.000031 loss: 0.6286 (0.6233) time: 0.1348 data: 0.0002 max mem: 5308
[16:58:16.435000] Epoch: [7] [1600/2877] eta: 0:02:54 lr: 0.000031 loss: 0.6257 (0.6235) time: 0.1365 data: 0.0001 max mem: 5308
[16:58:29.953159] Epoch: [7] [1700/2877] eta: 0:02:40 lr: 0.000030 loss: 0.6087 (0.6238) time: 0.1357 data: 0.0002 max mem: 5308
[16:58:43.503593] Epoch: [7] [1800/2877] eta: 0:02:27 lr: 0.000029 loss: 0.6253 (0.6239) time: 0.1350 data: 0.0001 max mem: 5308
[16:58:57.016062] Epoch: [7] [1900/2877] eta: 0:02:13 lr: 0.000029 loss: 0.5862 (0.6233) time: 0.1348 data: 0.0001 max mem: 5308
[16:59:10.544350] Epoch: [7] [2000/2877] eta: 0:01:59 lr: 0.000028 loss: 0.6206 (0.6230) time: 0.1363 data: 0.0001 max mem: 5308
[16:59:24.138178] Epoch: [7] [2100/2877] eta: 0:01:45 lr: 0.000027 loss: 0.5977 (0.6233) time: 0.1351 data: 0.0001 max mem: 5308
[16:59:37.709028] Epoch: [7] [2200/2877] eta: 0:01:32 lr: 0.000027 loss: 0.6593 (0.6237) time: 0.1363 data: 0.0001 max mem: 5308
[16:59:51.352662] Epoch: [7] [2300/2877] eta: 0:01:18 lr: 0.000026 loss: 0.6463 (0.6241) time: 0.1363 data: 0.0001 max mem: 5308
[17:00:05.145415] Epoch: [7] [2400/2877] eta: 0:01:05 lr: 0.000025 loss: 0.6176 (0.6242) time: 0.1399 data: 0.0003 max mem: 5308
[17:00:18.809743] Epoch: [7] [2500/2877] eta: 0:00:51 lr: 0.000025 loss: 0.6550 (0.6244) time: 0.1365 data: 0.0002 max mem: 5308
[17:00:32.435589] Epoch: [7] [2600/2877] eta: 0:00:37 lr: 0.000024 loss: 0.6365 (0.6236) time: 0.1351 data: 0.0002 max mem: 5308
[17:00:46.026841] Epoch: [7] [2700/2877] eta: 0:00:24 lr: 0.000023 loss: 0.6156 (0.6236) time: 0.1368 data: 0.0002 max mem: 5308
[17:00:59.592778] Epoch: [7] [2800/2877] eta: 0:00:10 lr: 0.000023 loss: 0.6397 (0.6238) time: 0.1355 data: 0.0001 max mem: 5308
[17:01:09.900033] Epoch: [7] [2876/2877] eta: 0:00:00 lr: 0.000022 loss: 0.6353 (0.6237) time: 0.1356 data: 0.0002 max mem: 5308
[17:01:10.171286] Epoch: [7] Total time: 0:06:32 (0.1364 s / it)
[17:01:10.200073] Averaged stats: lr: 0.000022 loss: 0.6353 (0.6236)
[17:01:12.074082] Test: [ 0/560] eta: 0:17:27 loss: 0.2804 (0.2804) auc: 97.2549 (97.2549) time: 1.8710 data: 1.8355 max mem: 5308
[17:01:12.371786] Test: [ 10/560] eta: 0:01:48 loss: 0.3160 (0.3266) auc: 94.5833 (93.2085) time: 0.1971 data: 0.1671 max mem: 5308
[17:01:12.670280] Test: [ 20/560] eta: 0:01:03 loss: 0.2878 (0.2932) auc: 95.6349 (95.1651) time: 0.0297 data: 0.0002 max mem: 5308
[17:01:12.968683] Test: [ 30/560] eta: 0:00:47 loss: 0.2503 (0.2785) auc: 97.6471 (95.6137) time: 0.0297 data: 0.0002 max mem: 5308
[17:01:13.266324] Test: [ 40/560] eta: 0:00:38 loss: 0.2523 (0.2792) auc: 97.2222 (95.5100) time: 0.0297 data: 0.0002 max mem: 5308
[17:01:13.564210] Test: [ 50/560] eta: 0:00:33 loss: 0.2580 (0.2761) auc: 96.2302 (95.8369) time: 0.0297 data: 0.0002 max mem: 5308
[17:01:13.864809] Test: [ 60/560] eta: 0:00:29 loss: 0.2975 (0.2777) auc: 96.3636 (95.7366) time: 0.0298 data: 0.0002 max mem: 5308
[17:01:14.163253] Test: [ 70/560] eta: 0:00:27 loss: 0.2685 (0.2763) auc: 96.3636 (95.7890) time: 0.0299 data: 0.0002 max mem: 5308
[17:01:14.464013] Test: [ 80/560] eta: 0:00:25 loss: 0.2400 (0.2708) auc: 97.5709 (95.9685) time: 0.0299 data: 0.0002 max mem: 5308
[17:01:14.761651] Test: [ 90/560] eta: 0:00:23 loss: 0.2479 (0.2700) auc: 96.8750 (96.0333) time: 0.0298 data: 0.0002 max mem: 5308
[17:01:15.057378] Test: [100/560] eta: 0:00:22 loss: 0.2479 (0.2661) auc: 97.2222 (96.2029) time: 0.0296 data: 0.0002 max mem: 5308
[17:01:15.351853] Test: [110/560] eta: 0:00:20 loss: 0.2361 (0.2658) auc: 97.5709 (96.1847) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:15.649753] Test: [120/560] eta: 0:00:19 loss: 0.2871 (0.2712) auc: 94.2029 (95.9796) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:15.949925] Test: [130/560] eta: 0:00:18 loss: 0.2743 (0.2716) auc: 94.9393 (96.0059) time: 0.0298 data: 0.0002 max mem: 5308
[17:01:16.251168] Test: [140/560] eta: 0:00:17 loss: 0.2607 (0.2716) auc: 96.7611 (96.0276) time: 0.0300 data: 0.0002 max mem: 5308
[17:01:16.548945] Test: [150/560] eta: 0:00:17 loss: 0.2475 (0.2706) auc: 97.1660 (96.0421) time: 0.0299 data: 0.0002 max mem: 5308
[17:01:16.849005] Test: [160/560] eta: 0:00:16 loss: 0.2266 (0.2666) auc: 98.4314 (96.1835) time: 0.0298 data: 0.0002 max mem: 5308
[17:01:17.149362] Test: [170/560] eta: 0:00:15 loss: 0.2215 (0.2655) auc: 98.0392 (96.2151) time: 0.0299 data: 0.0002 max mem: 5308
[17:01:17.450371] Test: [180/560] eta: 0:00:15 loss: 0.2457 (0.2659) auc: 97.1660 (96.2138) time: 0.0300 data: 0.0002 max mem: 5308
[17:01:17.749662] Test: [190/560] eta: 0:00:14 loss: 0.2640 (0.2663) auc: 96.0784 (96.1995) time: 0.0299 data: 0.0002 max mem: 5308
[17:01:18.049966] Test: [200/560] eta: 0:00:14 loss: 0.2582 (0.2647) auc: 96.0938 (96.2578) time: 0.0299 data: 0.0002 max mem: 5308
[17:01:18.348970] Test: [210/560] eta: 0:00:13 loss: 0.2517 (0.2657) auc: 97.0833 (96.2255) time: 0.0299 data: 0.0002 max mem: 5308
[17:01:18.648805] Test: [220/560] eta: 0:00:12 loss: 0.2545 (0.2659) auc: 97.0833 (96.2420) time: 0.0298 data: 0.0002 max mem: 5308
[17:01:18.947814] Test: [230/560] eta: 0:00:12 loss: 0.2545 (0.2660) auc: 96.8627 (96.2288) time: 0.0298 data: 0.0002 max mem: 5308
[17:01:19.244693] Test: [240/560] eta: 0:00:11 loss: 0.2486 (0.2659) auc: 96.4706 (96.2313) time: 0.0297 data: 0.0002 max mem: 5308
[17:01:19.540252] Test: [250/560] eta: 0:00:11 loss: 0.2745 (0.2664) auc: 96.4706 (96.2166) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:19.834793] Test: [260/560] eta: 0:00:11 loss: 0.2434 (0.2651) auc: 96.6667 (96.2575) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:20.128372] Test: [270/560] eta: 0:00:10 loss: 0.2765 (0.2659) auc: 96.4286 (96.2298) time: 0.0293 data: 0.0002 max mem: 5308
[17:01:20.422110] Test: [280/560] eta: 0:00:10 loss: 0.2304 (0.2635) auc: 97.2222 (96.3077) time: 0.0293 data: 0.0001 max mem: 5308
[17:01:20.717473] Test: [290/560] eta: 0:00:09 loss: 0.2411 (0.2641) auc: 97.9167 (96.2991) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:21.011090] Test: [300/560] eta: 0:00:09 loss: 0.2502 (0.2634) auc: 96.5587 (96.3297) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:21.305291] Test: [310/560] eta: 0:00:08 loss: 0.2377 (0.2639) auc: 96.0784 (96.2910) time: 0.0293 data: 0.0002 max mem: 5308
[17:01:21.601052] Test: [320/560] eta: 0:00:08 loss: 0.2653 (0.2643) auc: 96.0784 (96.2756) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:21.897565] Test: [330/560] eta: 0:00:08 loss: 0.2607 (0.2642) auc: 96.7611 (96.2714) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:22.192778] Test: [340/560] eta: 0:00:07 loss: 0.2518 (0.2637) auc: 97.2222 (96.2886) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:22.487431] Test: [350/560] eta: 0:00:07 loss: 0.2518 (0.2638) auc: 96.6667 (96.2853) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:22.782388] Test: [360/560] eta: 0:00:06 loss: 0.2743 (0.2646) auc: 96.0317 (96.2739) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:23.078191] Test: [370/560] eta: 0:00:06 loss: 0.2743 (0.2649) auc: 96.2500 (96.2597) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:23.373292] Test: [380/560] eta: 0:00:06 loss: 0.2427 (0.2647) auc: 97.1660 (96.2774) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:23.668606] Test: [390/560] eta: 0:00:05 loss: 0.2339 (0.2648) auc: 97.6190 (96.2774) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:23.963794] Test: [400/560] eta: 0:00:05 loss: 0.2628 (0.2657) auc: 96.4844 (96.2288) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:24.258771] Test: [410/560] eta: 0:00:05 loss: 0.2862 (0.2662) auc: 95.2381 (96.1946) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:24.554077] Test: [420/560] eta: 0:00:04 loss: 0.2902 (0.2665) auc: 95.2381 (96.1666) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:24.849620] Test: [430/560] eta: 0:00:04 loss: 0.2337 (0.2656) auc: 97.2222 (96.2036) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:25.144998] Test: [440/560] eta: 0:00:04 loss: 0.2272 (0.2650) auc: 97.9757 (96.2268) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:25.440671] Test: [450/560] eta: 0:00:03 loss: 0.2293 (0.2651) auc: 97.7733 (96.2265) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:25.736536] Test: [460/560] eta: 0:00:03 loss: 0.2627 (0.2655) auc: 96.8254 (96.2162) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:26.030298] Test: [470/560] eta: 0:00:03 loss: 0.2862 (0.2660) auc: 96.0784 (96.2105) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:26.324483] Test: [480/560] eta: 0:00:02 loss: 0.2681 (0.2660) auc: 96.0784 (96.1890) time: 0.0293 data: 0.0001 max mem: 5308
[17:01:26.619365] Test: [490/560] eta: 0:00:02 loss: 0.2418 (0.2655) auc: 96.0938 (96.2103) time: 0.0294 data: 0.0002 max mem: 5308
[17:01:26.912880] Test: [500/560] eta: 0:00:01 loss: 0.2452 (0.2651) auc: 97.1660 (96.2262) time: 0.0294 data: 0.0001 max mem: 5308
[17:01:27.207593] Test: [510/560] eta: 0:00:01 loss: 0.2594 (0.2645) auc: 96.4706 (96.2345) time: 0.0293 data: 0.0001 max mem: 5308
[17:01:27.504055] Test: [520/560] eta: 0:00:01 loss: 0.2717 (0.2655) auc: 95.9514 (96.1982) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:27.799968] Test: [530/560] eta: 0:00:00 loss: 0.2986 (0.2660) auc: 95.6863 (96.1974) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:28.096310] Test: [540/560] eta: 0:00:00 loss: 0.2673 (0.2663) auc: 96.4286 (96.1839) time: 0.0295 data: 0.0002 max mem: 5308
[17:01:28.387414] Test: [550/560] eta: 0:00:00 loss: 0.2385 (0.2656) auc: 98.4127 (96.2246) time: 0.0293 data: 0.0001 max mem: 5308
[17:01:28.634075] Test: [559/560] eta: 0:00:00 loss: 0.2417 (0.2658) auc: 98.0159 (96.2217) time: 0.0283 data: 0.0001 max mem: 5308
[17:01:28.792698] Test: Total time: 0:00:18 (0.0332 s / it)
[17:01:28.794261] * Auc 96.194 loss 0.265
[17:01:28.794612] AUC of the network on the 35796 val images: 96.19%
[17:01:28.794651] Max auc: 96.19%
[17:01:28.794694] Save model with min_val_loss at epoch: 7
[17:01:34.948463] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames
[17:01:36.140535] Epoch: [8] [ 0/2877] eta: 0:57:05 lr: 0.000022 loss: 0.6962 (0.6962) time: 1.1907 data: 1.0442 max mem: 5308
[17:01:49.773067] Epoch: [8] [ 100/2877] eta: 0:06:47 lr: 0.000022 loss: 0.6390 (0.6219) time: 0.1369 data: 0.0001 max mem: 5308
[17:02:03.396414] Epoch: [8] [ 200/2877] eta: 0:06:18 lr: 0.000021 loss: 0.6071 (0.6208) time: 0.1361 data: 0.0002 max mem: 5308
[17:02:17.190507] Epoch: [8] [ 300/2877] eta: 0:06:01 lr: 0.000020 loss: 0.6445 (0.6232) time: 0.1385 data: 0.0003 max mem: 5308
[17:02:30.973527] Epoch: [8] [ 400/2877] eta: 0:05:46 lr: 0.000020 loss: 0.5857 (0.6225) time: 0.1358 data: 0.0001 max mem: 5308
[17:02:44.696351] Epoch: [8] [ 500/2877] eta: 0:05:30 lr: 0.000019 loss: 0.6168 (0.6209) time: 0.1365 data: 0.0002 max mem: 5308
[17:02:58.375106] Epoch: [8] [ 600/2877] eta: 0:05:16 lr: 0.000019 loss: 0.6445 (0.6232) time: 0.1353 data: 0.0002 max mem: 5308
[17:03:12.209276] Epoch: [8] [ 700/2877] eta: 0:05:02 lr: 0.000018 loss: 0.6126 (0.6242) time: 0.1438 data: 0.0002 max mem: 5308
[17:03:25.706107] Epoch: [8] [ 800/2877] eta: 0:04:47 lr: 0.000017 loss: 0.6363 (0.6234) time: 0.1345 data: 0.0001 max mem: 5308
[17:03:39.152352] Epoch: [8] [ 900/2877] eta: 0:04:32 lr: 0.000017 loss: 0.6205 (0.6228) time: 0.1345 data: 0.0001 max mem: 5308
[17:03:52.594851] Epoch: [8] [1000/2877] eta: 0:04:18 lr: 0.000016 loss: 0.6318 (0.6231) time: 0.1358 data: 0.0002 max mem: 5308
[17:04:06.191296] Epoch: [8] [1100/2877] eta: 0:04:04 lr: 0.000016 loss: 0.6062 (0.6225) time: 0.1361 data: 0.0002 max mem: 5308
[17:04:19.646104] Epoch: [8] [1200/2877] eta: 0:03:49 lr: 0.000015 loss: 0.6201 (0.6223) time: 0.1352 data: 0.0001 max mem: 5308
[17:04:33.306230] Epoch: [8] [1300/2877] eta: 0:03:36 lr: 0.000014 loss: 0.6430 (0.6222) time: 0.1364 data: 0.0002 max mem: 5308
[17:04:47.060889] Epoch: [8] [1400/2877] eta: 0:03:22 lr: 0.000014 loss: 0.6127 (0.6219) time: 0.1358 data: 0.0001 max mem: 5308
[17:05:00.655358] Epoch: [8] [1500/2877] eta: 0:03:08 lr: 0.000013 loss: 0.6248 (0.6217) time: 0.1360 data: 0.0002 max mem: 5308
[17:05:14.176957] Epoch: [8] [1600/2877] eta: 0:02:54 lr: 0.000013 loss: 0.6255 (0.6222) time: 0.1350 data: 0.0001 max mem: 5308
[17:05:27.913867] Epoch: [8] [1700/2877] eta: 0:02:41 lr: 0.000012 loss: 0.6227 (0.6221) time: 0.1375 data: 0.0002 max mem: 5308
[17:05:41.631218] Epoch: [8] [1800/2877] eta: 0:02:27 lr: 0.000012 loss: 0.6198 (0.6222) time: 0.1391 data: 0.0002 max mem: 5308
[17:05:55.447244] Epoch: [8] [1900/2877] eta: 0:02:13 lr: 0.000011 loss: 0.5774 (0.6218) time: 0.1385 data: 0.0002 max mem: 5308
[17:06:09.029080] Epoch: [8] [2000/2877] eta: 0:02:00 lr: 0.000011 loss: 0.6533 (0.6222) time: 0.1358 data: 0.0002 max mem: 5308
[17:06:22.686067] Epoch: [8] [2100/2877] eta: 0:01:46 lr: 0.000010 loss: 0.5891 (0.6218) time: 0.1360 data: 0.0001 max mem: 5308
[17:06:36.319359] Epoch: [8] [2200/2877] eta: 0:01:32 lr: 0.000010 loss: 0.6360 (0.6213) time: 0.1382 data: 0.0002 max mem: 5308
[17:06:49.997239] Epoch: [8] [2300/2877] eta: 0:01:18 lr: 0.000009 loss: 0.6396 (0.6213) time: 0.1365 data: 0.0001 max mem: 5308
[17:07:03.564143] Epoch: [8] [2400/2877] eta: 0:01:05 lr: 0.000009 loss: 0.6095 (0.6208) time: 0.1347 data: 0.0001 max mem: 5308
[17:07:17.081220] Epoch: [8] [2500/2877] eta: 0:00:51 lr: 0.000008 loss: 0.6199 (0.6206) time: 0.1351 data: 0.0001 max mem: 5308
[17:07:30.611089] Epoch: [8] [2600/2877] eta: 0:00:37 lr: 0.000008 loss: 0.6360 (0.6204) time: 0.1352 data: 0.0001 max mem: 5308
[17:07:44.119886] Epoch: [8] [2700/2877] eta: 0:00:24 lr: 0.000008 loss: 0.6387 (0.6202) time: 0.1357 data: 0.0001 max mem: 5308
[17:07:57.637464] Epoch: [8] [2800/2877] eta: 0:00:10 lr: 0.000007 loss: 0.6330 (0.6200) time: 0.1350 data: 0.0001 max mem: 5308
[17:08:07.901785] Epoch: [8] [2876/2877] eta: 0:00:00 lr: 0.000007 loss: 0.5983 (0.6200) time: 0.1342 data: 0.0002 max mem: 5308
[17:08:08.170562] Epoch: [8] Total time: 0:06:33 (0.1367 s / it)
[17:08:08.172203] Averaged stats: lr: 0.000007 loss: 0.5983 (0.6206)
[17:08:09.676327] Test: [ 0/560] eta: 0:13:59 loss: 0.2754 (0.2754) auc: 97.2549 (97.2549) time: 1.4993 data: 1.4633 max mem: 5308
[17:08:10.002217] Test: [ 10/560] eta: 0:01:31 loss: 0.3183 (0.3282) auc: 94.3750 (93.2773) time: 0.1658 data: 0.1358 max mem: 5308
[17:08:10.299564] Test: [ 20/560] eta: 0:00:54 loss: 0.2958 (0.2975) auc: 95.6349 (95.3388) time: 0.0310 data: 0.0016 max mem: 5308
[17:08:10.594951] Test: [ 30/560] eta: 0:00:41 loss: 0.2550 (0.2818) auc: 98.3806 (95.7652) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:10.889354] Test: [ 40/560] eta: 0:00:34 loss: 0.2596 (0.2808) auc: 97.0833 (95.7116) time: 0.0294 data: 0.0002 max mem: 5308
[17:08:11.185629] Test: [ 50/560] eta: 0:00:30 loss: 0.2617 (0.2794) auc: 96.4286 (96.0119) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:11.482017] Test: [ 60/560] eta: 0:00:27 loss: 0.2936 (0.2812) auc: 96.4286 (95.9175) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:11.779318] Test: [ 70/560] eta: 0:00:24 loss: 0.2743 (0.2793) auc: 96.8627 (95.9671) time: 0.0296 data: 0.0002 max mem: 5308
[17:08:12.073539] Test: [ 80/560] eta: 0:00:23 loss: 0.2441 (0.2734) auc: 97.5709 (96.1631) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:12.369856] Test: [ 90/560] eta: 0:00:21 loss: 0.2341 (0.2726) auc: 97.2222 (96.2109) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:12.675377] Test: [100/560] eta: 0:00:20 loss: 0.2347 (0.2688) auc: 96.9697 (96.3751) time: 0.0300 data: 0.0002 max mem: 5308
[17:08:12.971402] Test: [110/560] eta: 0:00:19 loss: 0.2347 (0.2684) auc: 97.1660 (96.3786) time: 0.0300 data: 0.0002 max mem: 5308
[17:08:13.269793] Test: [120/560] eta: 0:00:18 loss: 0.2726 (0.2741) auc: 94.5312 (96.1964) time: 0.0296 data: 0.0002 max mem: 5308
[17:08:13.564731] Test: [130/560] eta: 0:00:17 loss: 0.2768 (0.2750) auc: 95.3441 (96.2212) time: 0.0296 data: 0.0002 max mem: 5308
[17:08:13.862946] Test: [140/560] eta: 0:00:16 loss: 0.2729 (0.2753) auc: 96.8627 (96.2604) time: 0.0296 data: 0.0002 max mem: 5308
[17:08:14.160117] Test: [150/560] eta: 0:00:16 loss: 0.2596 (0.2740) auc: 97.5709 (96.2675) time: 0.0297 data: 0.0002 max mem: 5308
[17:08:14.458932] Test: [160/560] eta: 0:00:15 loss: 0.2313 (0.2694) auc: 98.4314 (96.4105) time: 0.0297 data: 0.0002 max mem: 5308
[17:08:14.753447] Test: [170/560] eta: 0:00:14 loss: 0.2190 (0.2682) auc: 98.3806 (96.4370) time: 0.0296 data: 0.0002 max mem: 5308
[17:08:15.048312] Test: [180/560] eta: 0:00:14 loss: 0.2434 (0.2686) auc: 97.5709 (96.4521) time: 0.0294 data: 0.0002 max mem: 5308
[17:08:15.347837] Test: [190/560] eta: 0:00:13 loss: 0.2718 (0.2691) auc: 96.4706 (96.4486) time: 0.0296 data: 0.0002 max mem: 5308
[17:08:15.640502] Test: [200/560] eta: 0:00:13 loss: 0.2584 (0.2673) auc: 96.3563 (96.5113) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:15.936175] Test: [210/560] eta: 0:00:12 loss: 0.2584 (0.2684) auc: 97.5000 (96.4752) time: 0.0293 data: 0.0002 max mem: 5308
[17:08:16.231221] Test: [220/560] eta: 0:00:12 loss: 0.2662 (0.2684) auc: 97.5000 (96.4914) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:16.522694] Test: [230/560] eta: 0:00:11 loss: 0.2622 (0.2685) auc: 96.8627 (96.4778) time: 0.0293 data: 0.0002 max mem: 5308
[17:08:16.814805] Test: [240/560] eta: 0:00:11 loss: 0.2622 (0.2684) auc: 96.8627 (96.4781) time: 0.0291 data: 0.0001 max mem: 5308
[17:08:17.106604] Test: [250/560] eta: 0:00:11 loss: 0.2870 (0.2691) auc: 96.6667 (96.4596) time: 0.0291 data: 0.0001 max mem: 5308
[17:08:17.401698] Test: [260/560] eta: 0:00:10 loss: 0.2332 (0.2676) auc: 97.2549 (96.4988) time: 0.0293 data: 0.0002 max mem: 5308
[17:08:17.695184] Test: [270/560] eta: 0:00:10 loss: 0.2728 (0.2684) auc: 96.4706 (96.4777) time: 0.0294 data: 0.0002 max mem: 5308
[17:08:17.987549] Test: [280/560] eta: 0:00:09 loss: 0.2354 (0.2659) auc: 97.6190 (96.5461) time: 0.0292 data: 0.0001 max mem: 5308
[17:08:18.282507] Test: [290/560] eta: 0:00:09 loss: 0.2471 (0.2667) auc: 97.9167 (96.5354) time: 0.0293 data: 0.0002 max mem: 5308
[17:08:18.577157] Test: [300/560] eta: 0:00:08 loss: 0.2603 (0.2659) auc: 97.1660 (96.5654) time: 0.0294 data: 0.0002 max mem: 5308
[17:08:18.872562] Test: [310/560] eta: 0:00:08 loss: 0.2490 (0.2663) auc: 96.4706 (96.5318) time: 0.0294 data: 0.0002 max mem: 5308
[17:08:19.167468] Test: [320/560] eta: 0:00:08 loss: 0.2725 (0.2666) auc: 96.4706 (96.5182) time: 0.0294 data: 0.0001 max mem: 5308
[17:08:19.462754] Test: [330/560] eta: 0:00:07 loss: 0.2695 (0.2665) auc: 97.1660 (96.5135) time: 0.0294 data: 0.0001 max mem: 5308
[17:08:19.755393] Test: [340/560] eta: 0:00:07 loss: 0.2572 (0.2661) auc: 97.1660 (96.5231) time: 0.0293 data: 0.0001 max mem: 5308
[17:08:20.048248] Test: [350/560] eta: 0:00:07 loss: 0.2522 (0.2661) auc: 96.2500 (96.5183) time: 0.0292 data: 0.0001 max mem: 5308
[17:08:20.342027] Test: [360/560] eta: 0:00:06 loss: 0.2813 (0.2670) auc: 96.2500 (96.5038) time: 0.0292 data: 0.0001 max mem: 5308
[17:08:20.642912] Test: [370/560] eta: 0:00:06 loss: 0.2865 (0.2674) auc: 96.4286 (96.4792) time: 0.0296 data: 0.0002 max mem: 5308
[17:08:20.938396] Test: [380/560] eta: 0:00:06 loss: 0.2473 (0.2672) auc: 97.1660 (96.4978) time: 0.0297 data: 0.0002 max mem: 5308
[17:08:21.234675] Test: [390/560] eta: 0:00:05 loss: 0.2379 (0.2674) auc: 97.6190 (96.4987) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:21.534730] Test: [400/560] eta: 0:00:05 loss: 0.2594 (0.2683) auc: 96.8254 (96.4631) time: 0.0297 data: 0.0002 max mem: 5308
[17:08:21.830597] Test: [410/560] eta: 0:00:04 loss: 0.2852 (0.2688) auc: 95.2381 (96.4242) time: 0.0297 data: 0.0002 max mem: 5308
[17:08:22.124395] Test: [420/560] eta: 0:00:04 loss: 0.2852 (0.2690) auc: 96.0784 (96.4048) time: 0.0294 data: 0.0002 max mem: 5308
[17:08:22.420086] Test: [430/560] eta: 0:00:04 loss: 0.2291 (0.2680) auc: 97.9757 (96.4436) time: 0.0293 data: 0.0001 max mem: 5308
[17:08:22.717324] Test: [440/560] eta: 0:00:03 loss: 0.2124 (0.2673) auc: 98.4127 (96.4713) time: 0.0295 data: 0.0001 max mem: 5308
[17:08:23.011438] Test: [450/560] eta: 0:00:03 loss: 0.2318 (0.2674) auc: 98.0159 (96.4662) time: 0.0295 data: 0.0002 max mem: 5308
[17:08:23.306639] Test: [460/560] eta: 0:00:03 loss: 0.2797 (0.2680) auc: 96.9697 (96.4568) time: 0.0294 data: 0.0001 max mem: 5308
[17:08:23.600078] Test: [470/560] eta: 0:00:02 loss: 0.2826 (0.2683) auc: 96.4706 (96.4567) time: 0.0294 data: 0.0001 max mem: 5308
[17:08:23.894253] Test: [480/560] eta: 0:00:02 loss: 0.2681 (0.2683) auc: 96.2500 (96.4371) time: 0.0293 data: 0.0001 max mem: 5308
[17:08:24.188039] Test: [490/560] eta: 0:00:02 loss: 0.2446 (0.2676) auc: 96.2891 (96.4565) time: 0.0293 data: 0.0001 max mem: 5308
[17:08:24.482017] Test: [500/560] eta: 0:00:01 loss: 0.2446 (0.2672) auc: 97.9757 (96.4736) time: 0.0293 data: 0.0001 max mem: 5308
[17:08:24.777451] Test: [510/560] eta: 0:00:01 loss: 0.2504 (0.2667) auc: 96.4706 (96.4854) time: 0.0294 data: 0.0002 max mem: 5308
[17:08:25.070204] Test: [520/560] eta: 0:00:01 loss: 0.2821 (0.2677) auc: 96.0784 (96.4505) time: 0.0293 data: 0.0002 max mem: 5308
[17:08:25.363899] Test: [530/560] eta: 0:00:00 loss: 0.2905 (0.2684) auc: 94.9219 (96.4542) time: 0.0292 data: 0.0001 max mem: 5308
[17:08:25.656778] Test: [540/560] eta: 0:00:00 loss: 0.2814 (0.2689) auc: 96.6667 (96.4368) time: 0.0293 data: 0.0002 max mem: 5308
[17:08:25.947386] Test: [550/560] eta: 0:00:00 loss: 0.2483 (0.2684) auc: 98.4375 (96.4758) time: 0.0291 data: 0.0002 max mem: 5308
[17:08:26.191558] Test: [559/560] eta: 0:00:00 loss: 0.2540 (0.2686) auc: 98.4127 (96.4731) time: 0.0281 data: 0.0001 max mem: 5308
[17:08:26.390012] Test: Total time: 0:00:18 (0.0325 s / it)
[17:08:50.621774] * Auc 96.445 loss 0.268
[17:08:50.622510] AUC of the network on the 35796 val images: 96.44%
[17:08:50.622527] Max auc: 96.44%
[17:08:50.627888] log_dir: ./checkpoint/finetuned_models/FF++_c23_32frames
[17:08:52.149322] Epoch: [9] [ 0/2877] eta: 1:12:51 lr: 0.000007 loss: 0.6032 (0.6032) time: 1.5196 data: 1.2606 max mem: 5308
[17:09:05.775742] Epoch: [9] [ 100/2877] eta: 0:06:56 lr: 0.000006 loss: 0.6102 (0.6082) time: 0.1353 data: 0.0001 max mem: 5308
[17:09:19.458680] Epoch: [9] [ 200/2877] eta: 0:06:23 lr: 0.000006 loss: 0.6095 (0.6100) time: 0.1394 data: 0.0001 max mem: 5308
[17:09:33.084087] Epoch: [9] [ 300/2877] eta: 0:06:03 lr: 0.000006 loss: 0.6123 (0.6104) time: 0.1359 data: 0.0001 max mem: 5308
[17:09:46.810966] Epoch: [9] [ 400/2877] eta: 0:05:46 lr: 0.000005 loss: 0.6216 (0.6157) time: 0.1367 data: 0.0002 max mem: 5308
[17:10:00.288635] Epoch: [9] [ 500/2877] eta: 0:05:30 lr: 0.000005 loss: 0.6054 (0.6169) time: 0.1345 data: 0.0002 max mem: 5308
[17:10:13.843967] Epoch: [9] [ 600/2877] eta: 0:05:15 lr: 0.000005 loss: 0.6183 (0.6168) time: 0.1355 data: 0.0002 max mem: 5308
[17:10:27.464085] Epoch: [9] [ 700/2877] eta: 0:05:00 lr: 0.000004 loss: 0.6176 (0.6184) time: 0.1373 data: 0.0002 max mem: 5308
[17:10:41.024085] Epoch: [9] [ 800/2877] eta: 0:04:46 lr: 0.000004 loss: 0.6251 (0.6185) time: 0.1348 data: 0.0001 max mem: 5308
[17:10:54.575950] Epoch: [9] [ 900/2877] eta: 0:04:31 lr: 0.000004 loss: 0.6224 (0.6186) time: 0.1359 data: 0.0001 max mem: 5308
[17:11:08.130035] Epoch: [9] [1000/2877] eta: 0:04:17 lr: 0.000004 loss: 0.6258 (0.6193) time: 0.1356 data: 0.0003 max mem: 5308
[17:11:21.765699] Epoch: [9] [1100/2877] eta: 0:04:03 lr: 0.000003 loss: 0.5994 (0.6199) time: 0.1375 data: 0.0001 max mem: 5308
[17:11:35.438211] Epoch: [9] [1200/2877] eta: 0:03:50 lr: 0.000003 loss: 0.6504 (0.6205) time: 0.1348 data: 0.0001 max mem: 5308
[17:11:49.021473] Epoch: [9] [1300/2877] eta: 0:03:36 lr: 0.000003 loss: 0.6244 (0.6211) time: 0.1378 data: 0.0002 max mem: 5308
[17:12:02.691884] Epoch: [9] [1400/2877] eta: 0:03:22 lr: 0.000003 loss: 0.6132 (0.6206) time: 0.1373 data: 0.0002 max mem: 5308
[17:12:16.246649] Epoch: [9] [1500/2877] eta: 0:03:08 lr: 0.000002 loss: 0.6366 (0.6204) time: 0.1362 data: 0.0002 max mem: 5308
[17:12:29.873774] Epoch: [9] [1600/2877] eta: 0:02:54 lr: 0.000002 loss: 0.6490 (0.6199) time: 0.1380 data: 0.0002 max mem: 5308
[17:12:43.536798] Epoch: [9] [1700/2877] eta: 0:02:41 lr: 0.000002 loss: 0.5923 (0.6198) time: 0.1371 data: 0.0002 max mem: 5308
[17:12:57.281224] Epoch: [9] [1800/2877] eta: 0:02:27 lr: 0.000002 loss: 0.6158 (0.6198) time: 0.1361 data: 0.0001 max mem: 5308
[17:13:11.025623] Epoch: [9] [1900/2877] eta: 0:02:13 lr: 0.000002 loss: 0.6244 (0.6196) time: 0.1392 data: 0.0002 max mem: 5308
[17:13:24.712475] Epoch: [9] [2000/2877] eta: 0:02:00 lr: 0.000002 loss: 0.6158 (0.6202) time: 0.1379 data: 0.0002 max mem: 5308
[17:13:38.306898] Epoch: [9] [2100/2877] eta: 0:01:46 lr: 0.000001 loss: 0.6236 (0.6196) time: 0.1358 data: 0.0001 max mem: 5308
[17:13:51.861112] Epoch: [9] [2200/2877] eta: 0:01:32 lr: 0.000001 loss: 0.6389 (0.6194) time: 0.1353 data: 0.0001 max mem: 5308
[17:14:05.479928] Epoch: [9] [2300/2877] eta: 0:01:18 lr: 0.000001 loss: 0.6032 (0.6192) time: 0.1358 data: 0.0001 max mem: 5308
[17:14:19.094618] Epoch: [9] [2400/2877] eta: 0:01:05 lr: 0.000001 loss: 0.6254 (0.6190) time: 0.1380 data: 0.0002 max mem: 5308
[17:14:32.699114] Epoch: [9] [2500/2877] eta: 0:00:51 lr: 0.000001 loss: 0.6126 (0.6189) time: 0.1344 data: 0.0001 max mem: 5308
[17:14:46.246451] Epoch: [9] [2600/2877] eta: 0:00:37 lr: 0.000001 loss: 0.5802 (0.6186) time: 0.1349 data: 0.0001 max mem: 5308
[17:14:59.899232] Epoch: [9] [2700/2877] eta: 0:00:24 lr: 0.000001 loss: 0.6309 (0.6190) time: 0.1381 data: 0.0002 max mem: 5308
[17:15:13.615289] Epoch: [9] [2800/2877] eta: 0:00:10 lr: 0.000001 loss: 0.6427 (0.6193) time: 0.1364 data: 0.0002 max mem: 5308
[17:15:24.066361] Epoch: [9] [2876/2877] eta: 0:00:00 lr: 0.000001 loss: 0.6170 (0.6194) time: 0.1359 data: 0.0003 max mem: 5308
[17:15:24.332226] Epoch: [9] Total time: 0:06:33 (0.1368 s / it)
[17:15:24.383123] Averaged stats: lr: 0.000001 loss: 0.6170 (0.6199)
[17:15:26.022391] Test: [ 0/560] eta: 0:15:15 loss: 0.2722 (0.2722) auc: 97.2549 (97.2549) time: 1.6355 data: 1.5981 max mem: 5308
[17:15:26.345784] Test: [ 10/560] eta: 0:01:37 loss: 0.3206 (0.3271) auc: 95.0000 (93.4459) time: 0.1780 data: 0.1477 max mem: 5308
[17:15:26.645646] Test: [ 20/560] eta: 0:00:58 loss: 0.2955 (0.2959) auc: 95.6349 (95.4811) time: 0.0311 data: 0.0014 max mem: 5308
[17:15:26.952988] Test: [ 30/560] eta: 0:00:43 loss: 0.2514 (0.2803) auc: 98.3806 (95.8764) time: 0.0302 data: 0.0002 max mem: 5308
[17:15:27.250217] Test: [ 40/560] eta: 0:00:36 loss: 0.2580 (0.2793) auc: 97.0833 (95.8053) time: 0.0301 data: 0.0002 max mem: 5308 [17:15:27.544844] Test: [ 50/560] eta: 0:00:31 loss: 0.2610 (0.2777) auc: 96.8254 (96.0871) time: 0.0295 data: 0.0002 max mem: 5308 [17:15:27.839547] Test: [ 60/560] eta: 0:00:28 loss: 0.2934 (0.2794) auc: 96.8254 (95.9960) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:28.133396] Test: [ 70/560] eta: 0:00:25 loss: 0.2725 (0.2776) auc: 96.8627 (96.0572) time: 0.0293 data: 0.0001 max mem: 5308 [17:15:28.427168] Test: [ 80/560] eta: 0:00:23 loss: 0.2436 (0.2715) auc: 97.9757 (96.2697) time: 0.0293 data: 0.0002 max mem: 5308 [17:15:28.721503] Test: [ 90/560] eta: 0:00:22 loss: 0.2328 (0.2708) auc: 97.2656 (96.3182) time: 0.0293 data: 0.0002 max mem: 5308 [17:15:29.019369] Test: [100/560] eta: 0:00:21 loss: 0.2356 (0.2671) auc: 97.1660 (96.4760) time: 0.0295 data: 0.0002 max mem: 5308 [17:15:29.314216] Test: [110/560] eta: 0:00:19 loss: 0.2380 (0.2668) auc: 97.1660 (96.4704) time: 0.0296 data: 0.0002 max mem: 5308 [17:15:29.608451] Test: [120/560] eta: 0:00:18 loss: 0.2711 (0.2727) auc: 94.5833 (96.2833) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:29.902725] Test: [130/560] eta: 0:00:18 loss: 0.2764 (0.2735) auc: 95.8333 (96.3002) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:30.198474] Test: [140/560] eta: 0:00:17 loss: 0.2708 (0.2739) auc: 96.8627 (96.3366) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:30.495214] Test: [150/560] eta: 0:00:16 loss: 0.2604 (0.2726) auc: 97.5709 (96.3466) time: 0.0295 data: 0.0002 max mem: 5308 [17:15:30.789025] Test: [160/560] eta: 0:00:15 loss: 0.2305 (0.2680) auc: 98.3333 (96.4796) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:31.084164] Test: [170/560] eta: 0:00:15 loss: 0.2185 (0.2668) auc: 98.3333 (96.5032) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:31.378995] Test: [180/560] eta: 0:00:14 loss: 0.2432 (0.2672) auc: 97.6562 (96.5190) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:31.677022] 
Test: [190/560] eta: 0:00:14 loss: 0.2711 (0.2677) auc: 96.8627 (96.5226) time: 0.0296 data: 0.0002 max mem: 5308 [17:15:31.970709] Test: [200/560] eta: 0:00:13 loss: 0.2630 (0.2660) auc: 96.3563 (96.5816) time: 0.0295 data: 0.0002 max mem: 5308 [17:15:32.264122] Test: [210/560] eta: 0:00:13 loss: 0.2630 (0.2671) auc: 97.1660 (96.5477) time: 0.0293 data: 0.0002 max mem: 5308 [17:15:32.559574] Test: [220/560] eta: 0:00:12 loss: 0.2652 (0.2671) auc: 97.1660 (96.5606) time: 0.0293 data: 0.0002 max mem: 5308 [17:15:32.853910] Test: [230/560] eta: 0:00:12 loss: 0.2627 (0.2673) auc: 97.5709 (96.5524) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:33.148498] Test: [240/560] eta: 0:00:11 loss: 0.2627 (0.2671) auc: 97.6471 (96.5505) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:33.444130] Test: [250/560] eta: 0:00:11 loss: 0.2834 (0.2678) auc: 96.6667 (96.5275) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:33.740958] Test: [260/560] eta: 0:00:10 loss: 0.2338 (0.2664) auc: 97.2549 (96.5704) time: 0.0295 data: 0.0002 max mem: 5308 [17:15:34.037485] Test: [270/560] eta: 0:00:10 loss: 0.2719 (0.2671) auc: 96.4706 (96.5534) time: 0.0296 data: 0.0002 max mem: 5308 [17:15:34.331805] Test: [280/560] eta: 0:00:09 loss: 0.2327 (0.2646) auc: 97.6190 (96.6211) time: 0.0295 data: 0.0002 max mem: 5308 [17:15:34.626515] Test: [290/560] eta: 0:00:09 loss: 0.2470 (0.2654) auc: 97.9167 (96.6098) time: 0.0294 data: 0.0002 max mem: 5308 [17:15:34.924812] Test: [300/560] eta: 0:00:09 loss: 0.2571 (0.2647) auc: 97.1660 (96.6394) time: 0.0296 data: 0.0002 max mem: 5308 [17:15:35.216619] Test: [310/560] eta: 0:00:08 loss: 0.2501 (0.2650) auc: 96.8627 (96.6078) time: 0.0294 data: 0.0001 max mem: 5308 [17:15:35.509710] Test: [320/560] eta: 0:00:08 loss: 0.2691 (0.2654) auc: 96.8627 (96.5913) time: 0.0292 data: 0.0001 max mem: 5308 [17:15:35.808104] Test: [330/560] eta: 0:00:07 loss: 0.2688 (0.2654) auc: 97.0833 (96.5869) time: 0.0295 data: 0.0001 max mem: 5308 [17:15:36.162883] Test: [340/560] 
eta: 0:00:07 loss: 0.2554 (0.2650) auc: 96.8254 (96.5908) time: 0.0325 data: 0.0002 max mem: 5308 [17:15:36.454381] Test: [350/560] eta: 0:00:07 loss: 0.2463 (0.2650) auc: 96.8254 (96.5862) time: 0.0322 data: 0.0002 max mem: 5308 [17:15:36.755903] Test: [360/560] eta: 0:00:06 loss: 0.2791 (0.2659) auc: 96.8254 (96.5709) time: 0.0296 data: 0.0002 max mem: 5308 [17:15:37.058823] Test: [370/560] eta: 0:00:06 loss: 0.2842 (0.2663) auc: 96.8254 (96.5491) time: 0.0301 data: 0.0002 max mem: 5308 [17:15:37.357357] Test: [380/560] eta: 0:00:06 loss: 0.2488 (0.2661) auc: 97.5709 (96.5647) time: 0.0300 data: 0.0002 max mem: 5308 [17:15:37.657345] Test: [390/560] eta: 0:00:05 loss: 0.2341 (0.2663) auc: 97.6190 (96.5660) time: 0.0298 data: 0.0002 max mem: 5308 [17:15:37.959345] Test: [400/560] eta: 0:00:05 loss: 0.2575 (0.2673) auc: 97.2222 (96.5297) time: 0.0300 data: 0.0002 max mem: 5308 [17:15:38.258105] Test: [410/560] eta: 0:00:05 loss: 0.2843 (0.2677) auc: 95.6349 (96.4911) time: 0.0299 data: 0.0002 max mem: 5308 [17:15:38.562521] Test: [420/560] eta: 0:00:04 loss: 0.2843 (0.2679) auc: 96.0784 (96.4682) time: 0.0300 data: 0.0002 max mem: 5308 [17:15:38.857080] Test: [430/560] eta: 0:00:04 loss: 0.2232 (0.2669) auc: 97.9757 (96.5079) time: 0.0298 data: 0.0002 max mem: 5308 [17:15:39.150906] Test: [440/560] eta: 0:00:04 loss: 0.2123 (0.2662) auc: 98.4127 (96.5350) time: 0.0293 data: 0.0001 max mem: 5308 [17:15:39.445237] Test: [450/560] eta: 0:00:03 loss: 0.2309 (0.2663) auc: 98.0469 (96.5324) time: 0.0293 data: 0.0001 max mem: 5308 [17:15:39.743831] Test: [460/560] eta: 0:00:03 loss: 0.2795 (0.2669) auc: 96.9697 (96.5207) time: 0.0295 data: 0.0002 max mem: 5308 [17:15:40.037987] Test: [470/560] eta: 0:00:02 loss: 0.2803 (0.2672) auc: 96.4706 (96.5188) time: 0.0296 data: 0.0002 max mem: 5308 [17:15:40.332330] Test: [480/560] eta: 0:00:02 loss: 0.2657 (0.2672) auc: 96.2500 (96.4992) time: 0.0294 data: 0.0001 max mem: 5308 [17:15:40.625663] Test: [490/560] eta: 0:00:02 loss: 
0.2419 (0.2665) auc: 96.2891 (96.5190) time: 0.0293 data: 0.0002 max mem: 5308 [17:15:40.919337] Test: [500/560] eta: 0:00:01 loss: 0.2419 (0.2661) auc: 97.5709 (96.5363) time: 0.0293 data: 0.0001 max mem: 5308 [17:15:41.210644] Test: [510/560] eta: 0:00:01 loss: 0.2526 (0.2656) auc: 96.8627 (96.5470) time: 0.0292 data: 0.0001 max mem: 5308 [17:15:41.501098] Test: [520/560] eta: 0:00:01 loss: 0.2786 (0.2666) auc: 96.0784 (96.5132) time: 0.0290 data: 0.0001 max mem: 5308 [17:15:41.793075] Test: [530/560] eta: 0:00:00 loss: 0.2901 (0.2673) auc: 95.3125 (96.5152) time: 0.0291 data: 0.0001 max mem: 5308 [17:15:42.085467] Test: [540/560] eta: 0:00:00 loss: 0.2839 (0.2678) auc: 97.0833 (96.4989) time: 0.0292 data: 0.0002 max mem: 5308 [17:15:42.374333] Test: [550/560] eta: 0:00:00 loss: 0.2509 (0.2673) auc: 98.4375 (96.5387) time: 0.0290 data: 0.0001 max mem: 5308 [17:15:42.615863] Test: [559/560] eta: 0:00:00 loss: 0.2576 (0.2676) auc: 98.4127 (96.5354) time: 0.0279 data: 0.0001 max mem: 5308 [17:15:42.770750] Test: Total time: 0:00:18 (0.0328 s / it) [17:15:48.938519] * Auc 96.507 loss 0.267 [17:15:48.939175] AUC of the network on the 35796 val images: 96.51% [17:15:48.939190] Max auc: 96.51% [17:15:49.852552] Training time 1:10:01
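A note on reading these entries: the paired numbers, e.g. `loss: 0.2576 (0.2676)` or `auc: 98.4127 (96.5354)`, follow the DeiT/MAE-style metric-logger convention, where the first number is a smoothed recent value and the parenthesized number is the running average over the whole epoch, and the per-batch `auc` values are averaged into the final `* Auc 96.507`. A minimal, dependency-free sketch of both pieces, assuming median-over-window smoothing (the exact window statistic is an assumption) and with illustrative names (`SmoothedValue`, `binary_auc`) that are not taken from the codebase:

```python
from collections import deque
import statistics


class SmoothedValue:
    """Track a metric; str() prints 'windowed-median (global average)',
    matching log entries like 'loss: 0.2576 (0.2676)'."""

    def __init__(self, window_size=20):
        self.window = deque(maxlen=window_size)  # recent values only
        self.total = 0.0                         # sum over the whole epoch
        self.count = 0

    def update(self, value):
        self.window.append(value)
        self.total += value
        self.count += 1

    @property
    def median(self):
        return statistics.median(self.window)

    @property
    def global_avg(self):
        return self.total / self.count

    def __str__(self):
        return f"{self.median:.4f} ({self.global_avg:.4f})"


def binary_auc(labels, scores):
    """ROC AUC via the rank-sum (Mann-Whitney U) formulation: the
    probability that a random positive outscores a random negative,
    counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("AUC needs at least one sample of each class")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Computing AUC per batch and averaging the per-batch values, as this logger appears to do, is only an approximation of AUC over the pooled validation scores; the two can differ when batches vary in class balance.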