Namespace(aux=False, aux_weight=0.5, backbone='resnet50', base_size=768, batch_size=16, checkname='default', crop_size=768, ctx=[gpu(0), gpu(1), gpu(2), gpu(3), gpu(4), gpu(5), gpu(6), gpu(7)], dataset='mhp', dtype='float32', epochs=120, eval=False, kvstore='device', log_interval=1, logging_file='train.log', lr=1e-05, mode=None, model='icnet', model_zoo=None, momentum=0.9, ngpus=8, no_cuda=False, no_val=False, no_wd=False, norm_kwargs={'num_devices': 8}, norm_layer=, resume=None, save_dir='runs/mhp/icnet/resnet50/', start_epoch=0, syncbn=True, test_batch_size=16, train_split='train', weight_decay=0.0001, workers=48) Model file not found. Downloading. ICNet( (conv1): HybridSequential( (0): Conv2D(3 -> 64, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_syncbatchnorm0_', in_channels=64) (2): Activation(relu) (3): Conv2D(64 -> 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (4): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_syncbatchnorm1_', in_channels=64) (5): Activation(relu) (6): Conv2D(64 -> 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) ) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_syncbatchnorm2_', in_channels=128) (relu): Activation(relu) (maxpool): MaxPool2D(size=(3, 3), stride=(2, 2), padding=(1, 1), ceil_mode=False) (layer1): HybridSequential( (0): BottleneckV1b( (conv1): Conv2D(128 -> 64, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm0_', in_channels=64) (relu1): Activation(relu) (conv2): Conv2D(64 -> 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm1_', in_channels=64) (relu2): Activation(relu) (conv3): Conv2D(64 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm2_', in_channels=256) (relu3): Activation(relu) (downsample): HybridSequential( (0): Conv2D(128 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_down1_syncbatchnorm0_', in_channels=256) ) ) (1): BottleneckV1b( (conv1): Conv2D(256 -> 64, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm3_', in_channels=64) (relu1): Activation(relu) (conv2): Conv2D(64 -> 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm4_', in_channels=64) (relu2): Activation(relu) (conv3): Conv2D(64 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm5_', in_channels=256) (relu3): Activation(relu) ) (2): BottleneckV1b( (conv1): Conv2D(256 -> 64, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm6_', in_channels=64) (relu1): Activation(relu) (conv2): Conv2D(64 -> 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm7_', in_channels=64) (relu2): Activation(relu) (conv3): Conv2D(64 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers1_syncbatchnorm8_', in_channels=256) (relu3): Activation(relu) ) ) (layer2): HybridSequential( (0): BottleneckV1b( (conv1): Conv2D(256 -> 128, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm0_', in_channels=128) (relu1): Activation(relu) (conv2): Conv2D(128 -> 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm1_', in_channels=128) (relu2): Activation(relu) (conv3): Conv2D(128 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm2_', in_channels=512) (relu3): Activation(relu) (downsample): HybridSequential( (0): Conv2D(256 -> 512, kernel_size=(1, 1), stride=(2, 2), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_down2_syncbatchnorm0_', in_channels=512) ) ) (1): BottleneckV1b( (conv1): Conv2D(512 -> 128, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm3_', in_channels=128) (relu1): Activation(relu) (conv2): Conv2D(128 -> 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm4_', in_channels=128) (relu2): Activation(relu) (conv3): Conv2D(128 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm5_', in_channels=512) (relu3): Activation(relu) ) (2): BottleneckV1b( (conv1): Conv2D(512 -> 128, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm6_', in_channels=128) (relu1): Activation(relu) (conv2): Conv2D(128 -> 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm7_', in_channels=128) (relu2): Activation(relu) (conv3): Conv2D(128 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm8_', in_channels=512) (relu3): Activation(relu) ) (3): BottleneckV1b( (conv1): Conv2D(512 -> 128, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm9_', in_channels=128) (relu1): Activation(relu) (conv2): Conv2D(128 -> 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm10_', in_channels=128) (relu2): Activation(relu) (conv3): Conv2D(128 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers2_syncbatchnorm11_', in_channels=512) (relu3): Activation(relu) ) ) (layer3): HybridSequential( (0): BottleneckV1b( (conv1): Conv2D(512 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm0_', in_channels=256) (relu1): Activation(relu) (conv2): Conv2D(256 -> 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm1_', in_channels=256) (relu2): Activation(relu) (conv3): Conv2D(256 -> 1024, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm2_', in_channels=1024) (relu3): Activation(relu) (downsample): HybridSequential( (0): Conv2D(512 -> 1024, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_down3_syncbatchnorm0_', in_channels=1024) ) ) (1): BottleneckV1b( (conv1): Conv2D(1024 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm3_', in_channels=256) (relu1): Activation(relu) (conv2): Conv2D(256 -> 256, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm4_', in_channels=256) (relu2): Activation(relu) (conv3): Conv2D(256 -> 1024, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm5_', in_channels=1024) (relu3): Activation(relu) ) (2): BottleneckV1b( (conv1): Conv2D(1024 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm6_', in_channels=256) (relu1): Activation(relu) (conv2): Conv2D(256 -> 256, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm7_', in_channels=256) (relu2): Activation(relu) (conv3): Conv2D(256 -> 1024, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm8_', in_channels=1024) (relu3): Activation(relu) ) (3): BottleneckV1b( (conv1): Conv2D(1024 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm9_', in_channels=256) (relu1): Activation(relu) (conv2): Conv2D(256 -> 256, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm10_', in_channels=256) (relu2): Activation(relu) (conv3): Conv2D(256 -> 1024, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm11_', in_channels=1024) (relu3): Activation(relu) ) (4): BottleneckV1b( (conv1): Conv2D(1024 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm12_', in_channels=256) (relu1): Activation(relu) (conv2): Conv2D(256 -> 256, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm13_', in_channels=256) (relu2): Activation(relu) (conv3): Conv2D(256 -> 1024, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm14_', in_channels=1024) (relu3): Activation(relu) ) (5): BottleneckV1b( (conv1): Conv2D(1024 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm15_', in_channels=256) (relu1): Activation(relu) (conv2): Conv2D(256 -> 256, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm16_', in_channels=256) (relu2): Activation(relu) (conv3): Conv2D(256 -> 1024, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers3_syncbatchnorm17_', in_channels=1024) (relu3): Activation(relu) ) ) (layer4): HybridSequential( (0): BottleneckV1b( (conv1): Conv2D(1024 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm0_', in_channels=512) (relu1): Activation(relu) (conv2): Conv2D(512 -> 512, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm1_', in_channels=512) (relu2): Activation(relu) (conv3): Conv2D(512 -> 2048, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm2_', in_channels=2048) (relu3): Activation(relu) (downsample): HybridSequential( (0): Conv2D(1024 -> 2048, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_down4_syncbatchnorm0_', in_channels=2048) ) ) (1): BottleneckV1b( (conv1): Conv2D(2048 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm3_', in_channels=512) (relu1): Activation(relu) (conv2): Conv2D(512 -> 512, kernel_size=(3, 3), stride=(1, 1), padding=(4, 4), dilation=(4, 4), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm4_', in_channels=512) (relu2): Activation(relu) (conv3): Conv2D(512 -> 2048, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm5_', in_channels=2048) (relu3): Activation(relu) ) (2): BottleneckV1b( (conv1): Conv2D(2048 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm6_', in_channels=512) (relu1): Activation(relu) (conv2): Conv2D(512 -> 512, kernel_size=(3, 3), stride=(1, 1), padding=(4, 4), dilation=(4, 4), bias=False) (bn2): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm7_', in_channels=512) (relu2): Activation(relu) (conv3): Conv2D(512 -> 2048, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn3): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_resnetv1s_layers4_syncbatchnorm8_', in_channels=2048) (relu3): Activation(relu) ) ) (conv_sub1): HybridSequential( (0): ConvBnRelu( (conv): Conv2D(3 -> 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False) (bn): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_hybridsequential0_convbnrelu0_syncbatchnorm0_', in_channels=32) (relu): Activation(relu) ) (1): ConvBnRelu( (conv): Conv2D(32 -> 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False) (bn): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_hybridsequential0_convbnrelu1_syncbatchnorm0_', in_channels=32) (relu): Activation(relu) ) (2): ConvBnRelu( (conv): Conv2D(32 -> 64, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False) (bn): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_hybridsequential0_convbnrelu2_syncbatchnorm0_', in_channels=64) (relu): Activation(relu) ) ) (psp_head): _PSPHead( (psp): _PyramidPooling( (conv1): HybridSequential( (0): Conv2D(2048 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0__pyramidpooling0_hybridsequential0_syncbatchnorm0_', in_channels=512) (2): Activation(relu) ) (conv2): HybridSequential( (0): Conv2D(2048 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0__pyramidpooling0_hybridsequential1_syncbatchnorm0_', in_channels=512) (2): Activation(relu) ) (conv3): HybridSequential( (0): Conv2D(2048 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0__pyramidpooling0_hybridsequential2_syncbatchnorm0_', in_channels=512) (2): Activation(relu) ) (conv4): HybridSequential( (0): Conv2D(2048 -> 512, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0__pyramidpooling0_hybridsequential3_syncbatchnorm0_', in_channels=512) (2): Activation(relu) ) ) (block): HybridSequential( (0): Conv2D(4096 -> 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0__psphead0_syncbatchnorm0_', in_channels=512) (2): Activation(relu) (3): Dropout(p = 0.1, axes=()) ) ) (head): _ICHead( (cff_12): CascadeFeatureFusion( (conv_low): HybridSequential( (0): Conv2D(128 -> 128, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_cascadefeaturefusion0_hybridsequential0_syncbatchnorm0_', in_channels=128) ) (conv_hign): HybridSequential( (0): Conv2D(64 -> 128, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_cascadefeaturefusion0_hybridsequential1_syncbatchnorm0_', in_channels=128) ) (conv_low_cls): Conv2D(128 -> 18, kernel_size=(1, 1), stride=(1, 1), bias=False) ) (cff_24): CascadeFeatureFusion( (conv_low): HybridSequential( (0): Conv2D(256 -> 128, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_cascadefeaturefusion1_hybridsequential0_syncbatchnorm0_', in_channels=128) ) (conv_hign): HybridSequential( (0): Conv2D(256 -> 128, kernel_size=(1, 1), stride=(1, 1), bias=False) (1): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_cascadefeaturefusion1_hybridsequential1_syncbatchnorm0_', in_channels=128) ) (conv_low_cls): Conv2D(128 -> 18, kernel_size=(1, 1), stride=(1, 1), bias=False) ) (conv_cls): Conv2D(128 -> 18, kernel_size=(1, 1), stride=(1, 1), bias=False) ) (conv_sub4): ConvBnRelu( (conv): Conv2D(512 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_convbnrelu0_syncbatchnorm0_', in_channels=256) (relu): Activation(relu) ) (conv_sub2): ConvBnRelu( (conv): Conv2D(512 -> 256, kernel_size=(1, 1), stride=(1, 1), bias=False) (bn): SyncBatchNorm(eps=1e-05, momentum=0.9, fix_gamma=False, use_global_stats=False, ndev=8, key='icnet0_convbnrelu1_syncbatchnorm0_', in_channels=256) (relu): Activation(relu) ) ) Starting Epoch: 0 Total Epochs: 120 Epoch 0 iteration 0001/0187: training loss 8.222 Epoch 0 iteration 0002/0187: training loss 8.143 Epoch 0 iteration 0003/0187: training loss 8.056 Epoch 0 iteration 0004/0187: training loss 7.960 Epoch 0 iteration 0005/0187: training loss 7.890 Epoch 0 iteration 0006/0187: training loss 7.822 Epoch 0 iteration 0007/0187: training loss 7.761 Epoch 0 iteration 0008/0187: training loss 7.704 Epoch 0 iteration 0009/0187: training loss 7.642 Epoch 0 iteration 0010/0187: training loss 7.608 Epoch 0 iteration 0011/0187: training loss 7.567 Epoch 0 iteration 0012/0187: training loss 7.529 Epoch 0 iteration 0013/0187: training loss 7.481 Epoch 0 iteration 0014/0187: training loss 7.419 Epoch 0 iteration 0015/0187: training loss 7.363 Epoch 0 iteration 0016/0187: training loss 7.324 Epoch 0 iteration 0017/0187: training loss 7.273 Epoch 0 iteration 0018/0187: training loss 7.234 Epoch 0 iteration 0019/0187: training loss 7.193 Epoch 0 iteration 0020/0187: training loss 7.146 Epoch 0 iteration 0021/0187: training loss 7.102 Epoch 0 iteration 0022/0187: training loss 7.073 Epoch 0 iteration 0023/0187: training loss 7.031 Epoch 0 iteration 0024/0187: training loss 6.985 Epoch 0 iteration 0025/0187: training loss 6.953 Epoch 0 iteration 0026/0187: training loss 6.914 Epoch 0 iteration 0027/0187: training loss 6.880 Epoch 0 iteration 0028/0187: training loss 6.845 Epoch 0 iteration 0029/0187: training loss 6.817 Epoch 0 iteration 0030/0187: training loss 6.776 Epoch 0 iteration 0031/0187: training loss 6.741 Epoch 0 iteration 0032/0187: training loss 6.707 Epoch 0 iteration 0033/0187: training loss 6.681 Epoch 0 iteration 0034/0187: training loss 6.653 Epoch 0 iteration 0035/0187: training loss 6.616 Epoch 0 iteration 0036/0187: training loss 6.584 Epoch 0 iteration 0037/0187: training loss 6.554 Epoch 0 iteration 0038/0187: training loss 6.530 Epoch 0 iteration 0039/0187: training loss 6.506 Epoch 0 iteration 0040/0187: training loss 6.479 Epoch 0 iteration 0041/0187: training loss 6.449 Epoch 0 iteration 0042/0187: training loss 6.426 Epoch 0 iteration 0043/0187: training loss 6.400 Epoch 0 iteration 0044/0187: training loss 6.372 Epoch 0 iteration 0045/0187: training loss 6.351 Epoch 0 iteration 0046/0187: training loss 6.327 Epoch 0 iteration 0047/0187: training loss 6.304 Epoch 0 iteration 0048/0187: training loss 6.284 Epoch 0 iteration 0049/0187: training loss 6.260 Epoch 0 iteration 0050/0187: training loss 6.234 Epoch 0 iteration 0051/0187: training loss 6.209 Epoch 0 iteration 0052/0187: training loss 6.186 Epoch 0 iteration 0053/0187: training loss 6.161 Epoch 0 iteration 0054/0187: training loss 6.135 Epoch 0 iteration 0055/0187: training loss 6.106 Epoch 0 iteration 0056/0187: training loss 6.086 Epoch 0 iteration 0057/0187: training loss 6.066 Epoch 0 iteration 0058/0187: training loss 6.048 Epoch 0 iteration 0059/0187: training loss 6.024 Epoch 0 iteration 0060/0187: training loss 6.007 Epoch 0 iteration 0061/0187: training loss 5.989 Epoch 0 iteration 0062/0187: training loss 5.972 Epoch 0 iteration 0063/0187: training loss 5.952 Epoch 0 iteration 0064/0187: training loss 5.938 Epoch 0 iteration 0065/0187: training loss 5.917 Epoch 0 iteration 0066/0187: training loss 5.901 Epoch 0 iteration 0067/0187: training loss 5.883 Epoch 0 iteration 0068/0187: training loss 5.862 Epoch 0 iteration 0069/0187: training loss 5.839 Epoch 0 iteration 0070/0187: training loss 5.818 Epoch 0 iteration 0071/0187: training loss 5.801 Epoch 0 iteration 0072/0187: training loss 5.784 Epoch 0 iteration 0073/0187: training loss 5.767 Epoch 0 iteration 0074/0187: training loss 5.746 Epoch 0 iteration 0075/0187: training loss 5.728 Epoch 0 iteration 0076/0187: training loss 5.710 Epoch 0 iteration 0077/0187: training loss 5.691 Epoch 0 iteration 0078/0187: training loss 5.674 Epoch 0 iteration 0079/0187: training loss 5.659 Epoch 0 iteration 0080/0187: training loss 5.642 Epoch 0 iteration 0081/0187: training loss 5.622 Epoch 0 iteration 0082/0187: training loss 5.603 Epoch 0 iteration 0083/0187: training loss 5.585 Epoch 0 iteration 0084/0187: training loss 5.565 Epoch 0 iteration 0085/0187: training loss 5.547 Epoch 0 iteration 0086/0187: training loss 5.533 Epoch 0 iteration 0087/0187: training loss 5.519 Epoch 0 iteration 0088/0187: training loss 5.505 Epoch 0 iteration 0089/0187: training loss 5.489 Epoch 0 iteration 0090/0187: training loss 5.479 Epoch 0 iteration 0091/0188: training loss 5.464 Epoch 0 iteration 0092/0188: training loss 5.445 Epoch 0 iteration 0093/0188: training loss 5.431 Epoch 0 iteration 0094/0188: training loss 5.413 Epoch 0 iteration 0095/0188: training loss 5.395 Epoch 0 iteration 0096/0188: training loss 5.381 Epoch 0 iteration 0097/0188: training loss 5.369 Epoch 0 iteration 0098/0188: training loss 5.355 Epoch 0 iteration 0099/0188: training loss 5.341 Epoch 0 iteration 0100/0188: training loss 5.327 Epoch 0 iteration 0101/0188: training loss 5.317 Epoch 0 iteration 0102/0188: training loss 5.300 Epoch 0 iteration 0103/0188: training loss 5.287 Epoch 0 iteration 0104/0188: training loss 5.272 Epoch 0 iteration 0105/0188: training loss 5.257 Epoch 0 iteration 0106/0188: training loss 5.243 Epoch 0 iteration 0107/0188: training loss 5.228 Epoch 0 iteration 0108/0188: training loss 5.216 Epoch 0 iteration 0109/0188: training loss 5.202 Epoch 0 iteration 0110/0188: training loss 5.188 Epoch 0 iteration 0111/0188: training loss 5.177 Epoch 0 iteration 0112/0188: training loss 5.162 Epoch 0 iteration 0113/0188: training loss 5.149 Epoch 0 iteration 0114/0188: training loss 5.136 Epoch 0 iteration 0115/0188: training loss 5.121 Epoch 0 iteration 0116/0188: training loss 5.110 Epoch 0 iteration 0117/0188: training loss 5.094 Epoch 0 iteration 0118/0188: training loss 5.081 Epoch 0 iteration 0119/0188: training loss 5.068 Epoch 0 iteration 0120/0188: training loss 5.058 Epoch 0 iteration 0121/0188: training loss 5.043 Epoch 0 iteration 0122/0188: training loss 5.032 Epoch 0 iteration 0123/0188: training loss 5.021 Epoch 0 iteration 0124/0188: training loss 5.009 Epoch 0 iteration 0125/0188: training loss 4.999 Epoch 0 iteration 0126/0188: training loss 4.986 Epoch 0 iteration 0127/0188: training loss 4.973 Epoch 0 iteration 0128/0188: training loss 4.964 Epoch 0 iteration 0129/0188: training loss 4.952 Epoch 0 iteration 0130/0188: training loss 4.942 Epoch 0 iteration 0131/0188: training loss 4.930 Epoch 0 iteration 0132/0188: training loss 4.920 Epoch 0 iteration 0133/0188: training loss 4.908 Epoch 0 iteration 0134/0188: training loss 4.898 Epoch 0 iteration 0135/0188: training loss 4.887 Epoch 0 iteration 0136/0188: training loss 4.878 Epoch 0 iteration 0137/0188: training loss 4.867 Epoch 0 iteration 0138/0188: training loss 4.857 Epoch 0 iteration 0139/0188: training loss 4.846 Epoch 0 iteration 0140/0188: training loss 4.836 Epoch 0 iteration 0141/0188: training loss 4.823 Epoch 0 iteration 0142/0188: training loss 4.815 Epoch 0 iteration 0143/0188: training loss 4.805 Epoch 0 iteration 0144/0188: training loss 4.793 Epoch 0 iteration 0145/0188: training loss 4.783 Epoch 0 iteration 0146/0188: training loss 4.774 Epoch 0 iteration 0147/0188: training loss 4.763 Epoch 0 iteration 0148/0188: training loss 4.754 Epoch 0 iteration 0149/0188: training loss 4.743 Epoch 0 iteration 0150/0188: training loss 4.734 Epoch 0 iteration 0151/0188: training loss 4.724 Epoch 0 iteration 0152/0188: training loss 4.713 Epoch 0 iteration 0153/0188: training loss 4.703 Epoch 0 iteration 0154/0188: training loss 4.696 Epoch 0 iteration 0155/0188: training loss 4.687 Epoch 0 iteration 0156/0188: training loss 4.677 Epoch 0 iteration 0157/0188: training loss 4.667 Epoch 0 iteration 0158/0188: training loss 4.658 Epoch 0 iteration 0159/0188: training loss 4.650 Epoch 0 iteration 0160/0188: training loss 4.638 Epoch 0 iteration 0161/0188: training loss 4.627 Epoch 0 iteration 0162/0188: training loss 4.617 Epoch 0 iteration 0163/0188: training loss 4.609 Epoch 0 iteration 0164/0188: training loss 4.601 Epoch 0 iteration 0165/0188: training loss 4.592 Epoch 0 iteration 0166/0188: training loss 4.583 Epoch 0 iteration 0167/0188: training loss 4.575 Epoch 0 iteration 0168/0188: training loss 4.563 Epoch 0 iteration 0169/0188: training loss 4.554 Epoch 0 iteration 0170/0188: training loss 4.545 Epoch 0 iteration 0171/0188: training loss 4.536 Epoch 0 iteration 0172/0188: training loss 4.527 Epoch 0 iteration 0173/0188: training loss 4.516 Epoch 0 iteration 0174/0188: training loss 4.509 Epoch 0 iteration 0175/0188: training loss 4.502 Epoch 0 iteration 0176/0188: training loss 4.493 Epoch 0 iteration 0177/0188: training loss 4.485 Epoch 0 iteration 0178/0188: training loss 4.476 Epoch 0 iteration 0179/0188: training loss 4.469 Epoch 0 iteration 0180/0188: training loss 4.462 Epoch 0 iteration 0181/0188: training loss 4.454 Epoch 0 iteration 0182/0188: training loss 4.444 Epoch 0 iteration 0183/0188: training loss 4.437 Epoch 0 iteration 0184/0188: training loss 4.429 Epoch 0 iteration 0185/0188: training loss 4.420 Epoch 0 iteration 0186/0188: training loss 4.410 Epoch 0 validation pixAcc: 0.775, mIoU: 0.178 Epoch 1 iteration 0001/0187: training loss 2.730 Epoch 1 iteration 0002/0187: training loss 2.823 Epoch 1 iteration 0003/0187: training loss 2.848 Epoch 1 iteration 0004/0187: training loss 2.925 Epoch 1 iteration 0005/0187: training loss 2.930 Epoch 1 iteration 0006/0187: training loss 2.882 Epoch 1 iteration 0007/0187: training loss 2.867 Epoch 1 iteration 0008/0187: training loss 2.833 Epoch 1 iteration 0009/0187: training loss 2.806 Epoch 1 iteration 0010/0187: training loss 2.774 Epoch 1 iteration 0011/0187: training loss 2.791 Epoch 1 iteration 0012/0187: training loss 2.793 Epoch 1 iteration 0013/0187: training loss 2.786 Epoch 1 iteration 0014/0187: training loss 2.801 Epoch 1 iteration 0015/0187: training loss 2.811 Epoch 1 iteration 0016/0187: training loss 2.806 Epoch 1 iteration 0017/0187: training loss 2.800 Epoch 1 iteration 0018/0187: training loss 2.818 Epoch 1 iteration 0019/0187: training loss 2.816 Epoch 1 iteration 0020/0187: training loss 2.821 Epoch 1 iteration 0021/0187: training loss 2.816 Epoch 1 iteration 0022/0187: training loss 2.805 Epoch 1 iteration 0023/0187: training loss 2.806 Epoch 1 iteration 0024/0187: training loss 2.791 Epoch 1 iteration 0025/0187: training loss 2.785 Epoch 1 iteration 0026/0187: training loss 2.807 Epoch 1 iteration 0027/0187: training loss 2.791 Epoch 1 iteration 0028/0187: training loss 2.795 Epoch 1 iteration 0029/0187: training loss 2.778 Epoch 1 iteration 0030/0187: training loss 2.782 Epoch 1 iteration 0031/0187: training loss 2.773 Epoch 1 iteration 0032/0187: training loss 2.763 Epoch 1 iteration 0033/0187: training loss 2.761 Epoch 1 iteration 0034/0187: training loss 2.751 Epoch 1 iteration 0035/0187: training loss 2.754 Epoch 1 iteration 0036/0187: training loss 2.750 Epoch 1 iteration 0037/0187: training loss 2.745 Epoch 1 iteration 0038/0187: training loss 2.739 Epoch 1 iteration 0039/0187: training loss 2.731 Epoch 1 iteration 0040/0187: training loss 2.725 Epoch 1 iteration 0041/0187: training loss 2.715 Epoch 1 iteration 0042/0187: training loss 2.717 Epoch 1 iteration 0043/0187: training loss 2.709 Epoch 1 iteration 0044/0187: training loss 2.696 Epoch 1 iteration 0045/0187: training loss 2.699 Epoch 1 iteration 0046/0187: training loss 2.695 Epoch 1 iteration 0047/0187: training loss 2.694 Epoch 1 iteration 0048/0187: training loss 2.688 Epoch 1 iteration 0049/0187: training loss 2.678 Epoch 1 iteration 0050/0187: training loss 2.672 Epoch 1 iteration 0051/0187: training loss 2.668 Epoch 1 iteration 0052/0187: training loss 2.667 Epoch 1 iteration 0053/0187: training loss 2.664 Epoch 1 iteration 0054/0187: training loss 2.656 Epoch 1 iteration 0055/0187: training loss 2.646 Epoch 1 iteration 0056/0187: training loss 2.642 Epoch 1 iteration 0057/0187: training loss 2.637 Epoch 1 iteration 0058/0187: training loss 2.635 Epoch 1 iteration 0059/0187: training loss 2.632 Epoch 1 iteration 0060/0187: training loss 2.627 Epoch 1 iteration 0061/0187: training loss 2.629 Epoch 1 iteration 0062/0187: training loss 2.628 Epoch 1 iteration 0063/0187: training loss 2.624 Epoch 1 iteration 0064/0187: training loss 2.623 Epoch 1 iteration 0065/0187: training loss 2.623 Epoch 1 iteration 0066/0187: training loss 2.621 Epoch 1 iteration 0067/0187: training loss 2.616 Epoch 1 iteration 0068/0187: training loss 2.610 Epoch 1 iteration 0069/0187: training loss 2.610 Epoch 1 iteration 0070/0187: training loss 2.612 Epoch 1 iteration 0071/0187: training loss 2.611 Epoch 1 iteration 0072/0187: training loss 2.607 Epoch 1 iteration 0073/0187: training loss 2.606 Epoch 1 iteration 0074/0187: training loss 2.599 Epoch 1 iteration 0075/0187: training loss 2.596 Epoch 1 iteration 0076/0187: training loss 2.596 Epoch 1 iteration 0077/0187: training loss 2.595 Epoch 1 iteration 0078/0187: training loss 2.592 Epoch 1 iteration 0079/0187: training loss 2.588 Epoch 1 iteration 0080/0187: training loss 2.587 Epoch 1 iteration 0081/0187: training loss 2.580 Epoch 1 iteration 0082/0187: training loss 2.580 Epoch 1 iteration 0083/0187: training loss 2.578 Epoch 1 iteration 0084/0187: training loss 2.575 Epoch 1 iteration 0085/0187: training loss 2.574 Epoch 1 iteration 0086/0187: training loss 2.571 Epoch 1 iteration 0087/0187: training loss 2.569 Epoch 1 iteration 0088/0187: training loss 2.565 Epoch 1 iteration 0089/0187: training loss 2.561 Epoch 1 iteration 0090/0187: training loss 2.559 Epoch 1 iteration 0091/0187: training loss 2.561 Epoch 1 iteration 0092/0187: training loss 2.561 Epoch 1 iteration 0093/0187: training loss 2.557 Epoch 1 iteration 0094/0187: training loss 2.556 Epoch 1 iteration 0095/0187: training loss 2.555 Epoch 1 iteration 0096/0187: training loss 2.555 Epoch 1 iteration 0097/0187: training loss 2.556 Epoch 1 iteration 0098/0187: training loss 2.555 Epoch 1 iteration 0099/0187: training loss 2.557 Epoch 1 iteration 0100/0187: training loss 2.553 Epoch 1 iteration 0101/0187: training loss 2.551 Epoch 1 iteration 0102/0187: training loss 2.550 Epoch 1 iteration 0103/0187: training loss 2.547 Epoch 1 iteration 0104/0187: training loss 2.545 Epoch 1 iteration 0105/0187: training loss 2.542 Epoch 1 iteration 0106/0187: training loss 2.541 Epoch 1 iteration 0107/0187: training loss 2.539 Epoch 1 iteration 0108/0187: training loss 2.537 Epoch 1 iteration 0109/0187: training loss 2.535 Epoch 1 iteration 0110/0187: training loss 2.533 Epoch 1 iteration 0111/0187: training loss 2.531 Epoch 1 iteration 0112/0187: training loss 2.528 Epoch 1 iteration 0113/0187: training loss 2.529 Epoch 1 iteration 0114/0187: training loss 2.526 Epoch 1 iteration 0115/0187: training loss 2.523 Epoch 1 iteration 0116/0187: training loss 2.520 Epoch 1 iteration 0117/0187: training loss 2.521 Epoch 1 iteration 0118/0187: training loss 2.522 Epoch 1 iteration 0119/0187: training loss 2.519 Epoch 1 iteration 0120/0187: training loss 2.518 Epoch 1 iteration 0121/0187: training loss 2.519 Epoch 1 iteration 0122/0187: training loss 2.515 Epoch 1 iteration 0123/0187: training loss 2.512 Epoch 1 iteration 0124/0187: training loss 2.510 Epoch 1 iteration 0125/0187: training loss 2.506 Epoch 1 iteration 0126/0187: training loss 2.504 Epoch 1 iteration 0127/0187: training loss 2.502 Epoch 1 iteration 0128/0187: training loss 2.498 Epoch 1 iteration 0129/0187: training loss 2.501 Epoch 1 iteration 0130/0187: training loss 2.500 Epoch 1 iteration 0131/0187: training loss 2.500 Epoch 1 iteration 0132/0187: training loss 2.498 Epoch 1 iteration 0133/0187: training loss 2.495 Epoch 1 iteration 0134/0187: training loss 2.490 Epoch 1 iteration 0135/0187: training loss 2.490 Epoch 1 iteration 0136/0187: training loss 2.489 Epoch 1 iteration 0137/0187: training loss 2.487 Epoch 1 iteration 0138/0187: training loss 2.488 Epoch 1 iteration 0139/0187: training loss 2.488 Epoch 1 iteration 0140/0187: training loss 2.484 Epoch 1 iteration 0141/0187: training loss 2.480 Epoch 1 iteration 0142/0187: training loss 2.477 Epoch 1 iteration 0143/0187: training loss 2.476 Epoch 1 iteration 0144/0187: training loss 2.475 Epoch 1 iteration 0145/0187: training loss 2.474 Epoch 1 iteration 0146/0187: training loss 2.470 Epoch 1 iteration 0147/0187: training loss 2.468 Epoch 1 iteration 0148/0187: training loss 2.466 Epoch 1 iteration 0149/0187: training loss 2.463 Epoch 1 iteration 0150/0187: training loss 2.462 Epoch 1 iteration 0151/0187: training loss 2.460 Epoch 1 iteration 0152/0187: training loss 2.459 Epoch 1 iteration 0153/0187: training loss 2.458 Epoch 1 iteration 0154/0187: training loss 2.456 Epoch 1 iteration 0155/0187: training loss 2.453 Epoch 1 iteration 0156/0187: training loss 2.452 Epoch 1 iteration 0157/0187: training loss 2.452 Epoch 1 iteration 0158/0187: training loss 2.450 Epoch 1 iteration 0159/0187: training loss 2.448 Epoch 1 iteration 0160/0187: training loss 2.446 Epoch 1 iteration 0161/0187: training loss 2.444 Epoch 1 iteration 0162/0187: training loss 2.440 Epoch 1 iteration 0163/0187: training loss 2.442 Epoch 1 iteration 0164/0187: training loss 2.439 Epoch 1 iteration 0165/0187: training loss 2.438 Epoch 1 iteration 0166/0187: training loss 2.435 Epoch 1 iteration 0167/0187: training loss 2.434 Epoch 1 iteration 0168/0187: training loss 2.431 Epoch 1 iteration 0169/0187: training loss 2.429 Epoch 1 iteration 0170/0187: training loss 2.427 Epoch 1 iteration 0171/0187: training loss 2.428 Epoch 1 iteration 0172/0187: training loss 2.428 Epoch 1 iteration 0173/0187: training loss 2.426 Epoch 1 iteration 0174/0187: training loss 2.426 Epoch 1 iteration 0175/0187: training loss 2.425 Epoch 1 iteration 0176/0187: training loss 2.425 Epoch 1 iteration 0177/0187: training loss 2.422 Epoch 1 iteration 0178/0187: training loss 2.419 Epoch 1 iteration 0179/0187: training loss 2.417 Epoch 1 iteration 0180/0187: training loss 2.414 Epoch 1 iteration 0181/0187: training loss 2.413 Epoch 1 iteration 0182/0187: training loss 2.411 Epoch 1 iteration 0183/0187: training loss 2.408 Epoch 1 iteration 0184/0187: training loss 2.406 Epoch 1 iteration 0185/0187: training loss 2.405 Epoch 1 iteration 0186/0187: training loss 2.405 Epoch 1 iteration 0187/0187: training loss 2.403 Epoch 1 validation pixAcc: 0.814, mIoU: 0.238 Epoch 2 iteration 0001/0187: training loss 2.121 Epoch 2 iteration 0002/0187: training loss 2.100 Epoch 2 iteration 0003/0187: training loss 2.098 Epoch 2 iteration 0004/0187: training loss 2.122 Epoch 2 iteration 0005/0187: training loss 2.121 Epoch 2 iteration 0006/0187: training loss 2.139 Epoch 2 iteration 0007/0187: training loss 2.122 Epoch 2 iteration 0008/0187: training loss 2.099 Epoch 2 iteration 0009/0187: training loss 2.076 Epoch 2 iteration 0010/0187: training loss 2.103 Epoch 2 iteration 0011/0187: training loss 2.076 Epoch 2 iteration 0012/0187: training loss 2.078 Epoch 2 iteration 0013/0187: training loss 2.049 Epoch 2 iteration 0014/0187: training loss 2.033 Epoch 2 iteration 0015/0187: training loss 2.018 Epoch 2 iteration 0016/0187: training loss 2.033 Epoch 2 iteration 0017/0187: training loss 2.046 Epoch 2 iteration 0018/0187: training loss 2.029 Epoch 2 iteration 0019/0187: training loss 2.015 Epoch 2 iteration 0020/0187: training loss 1.999 Epoch 2 iteration 0021/0187: training loss 1.993 Epoch 2 iteration 0022/0187: training loss 2.002 Epoch 2 iteration 0023/0187: training loss 2.003 Epoch 2 iteration 0024/0187: training loss 1.998 Epoch 2 iteration 0025/0187: training loss 2.016 Epoch 2 iteration 0026/0187: training loss 2.010 Epoch 2 iteration 0027/0187: training loss 2.018 Epoch 2 iteration 0028/0187: training loss 2.019 Epoch 2 iteration 0029/0187: training loss 2.021 Epoch 2 iteration 0030/0187: training loss 2.007 Epoch 2 iteration 0031/0187: training loss 2.011 Epoch 2 iteration 0032/0187: training loss 2.018 Epoch 2 iteration 0033/0187: training loss 2.012 Epoch 2 iteration 0034/0187: training loss 2.027 Epoch 2 iteration 0035/0187: training loss 2.027 Epoch 2 iteration 0036/0187: training loss 2.034 Epoch 2 iteration 0037/0187: training loss 2.028 Epoch 2 iteration 0038/0187: training loss 2.025 Epoch 2 iteration 0039/0187: training loss 2.019 Epoch 2 iteration 0040/0187: training loss 2.016 Epoch 2 iteration 0041/0187: training loss 2.018 Epoch 2 iteration 0042/0187: training loss 2.020 Epoch 2 iteration 0043/0187: training loss 2.019 Epoch 2 iteration 0044/0187: training loss 2.015 Epoch 2 iteration 0045/0187: training loss 2.012 Epoch 2 iteration 0046/0187: training loss 2.007 Epoch 2 iteration 0047/0187: training loss 2.010 Epoch 2 iteration 0048/0187: training loss 2.008 Epoch 2 iteration 0049/0187: training loss 2.006 Epoch 2 iteration 0050/0187: training loss 2.002 Epoch 2 iteration 0051/0187: training loss 2.001 Epoch 2 iteration 0052/0187: training loss 2.004 Epoch 2 iteration 0053/0187: training loss 2.006 Epoch 2 iteration 0054/0187: training loss 2.006 Epoch 2 iteration 0055/0187: training loss 2.002 Epoch 2 iteration 0056/0187: training loss 2.003 Epoch 2 iteration 0057/0187: training loss 2.003 Epoch 2 iteration 0058/0187: training loss 1.999 Epoch 2 iteration 0059/0187: training loss 1.997 Epoch 2 iteration 0060/0187: training loss 1.993 Epoch 2 iteration 0061/0187: training loss 1.991 Epoch 2 iteration 0062/0187: training loss 1.989 Epoch 2 iteration 0063/0187: training loss 1.985 Epoch 2 iteration 0064/0187: training loss 1.990 Epoch 2 iteration 0065/0187: training loss 1.989 Epoch 2 iteration 0066/0187: training loss 1.989 Epoch 2 iteration 0067/0187: training loss 1.985 Epoch 2 iteration 0068/0187: training loss 1.985 Epoch 2 iteration 0069/0187: training loss 1.983 Epoch 2 iteration 0070/0187: training loss 1.981 Epoch 2 iteration 0071/0187: training loss 1.984 Epoch 2 iteration 0072/0187: training loss 1.984 Epoch 2 iteration 0073/0187: training loss 1.984 Epoch 2 iteration 0074/0187: training loss 1.989 Epoch 2 iteration 0075/0187: training loss 1.987 Epoch 2 iteration 0076/0187: training loss 1.988 Epoch 2 iteration 0077/0187: training loss 1.986 Epoch 2 iteration 0078/0187: training loss 1.990 Epoch 2 iteration 0079/0187: training loss 1.988 Epoch 2 iteration 0080/0187: training loss 1.983 Epoch 2 iteration 0081/0187: training loss 1.984 Epoch 2 iteration 0082/0187: training loss 1.983 Epoch 2 iteration 0083/0187: training loss 1.981 Epoch 2 iteration 0084/0187: training loss 1.982 Epoch 2 iteration 0085/0187: training loss 1.982 Epoch 2 iteration 0086/0187: training loss 1.980 Epoch 2 iteration 0087/0187: training loss 1.981 Epoch 2 iteration 0088/0187: training loss 1.977 Epoch 2 iteration 0089/0187: training loss 1.975 Epoch 2 iteration 0090/0187: training loss 1.972 Epoch 2 iteration 0091/0188: training loss 1.970 Epoch 2 iteration 0092/0188: training loss 1.968 Epoch 2 iteration 0093/0188: training loss 1.966 Epoch 2 iteration 0094/0188: training loss 1.964 Epoch 2 iteration 0095/0188: training loss 1.961 Epoch 2 iteration 0096/0188: training loss 1.957 Epoch 2 iteration 0097/0188: training loss 1.953 Epoch 2 iteration 0098/0188: training loss 1.951 Epoch 2 iteration 0099/0188: training loss 1.951 Epoch 2 iteration 0100/0188: training loss 1.953 Epoch 2 iteration 0101/0188: training loss 1.951 Epoch 2 iteration 0102/0188: training loss 1.948 Epoch 2 iteration 0103/0188: training loss 1.946 Epoch 2 iteration 0104/0188: training loss 1.944 Epoch 2 iteration 0105/0188: training loss 1.944 Epoch 2 iteration 0106/0188: training loss 1.943 Epoch 2 iteration 0107/0188: training loss 1.941 Epoch 2 iteration 0108/0188: training loss 1.941 Epoch 2 iteration 0109/0188: training loss 1.943 Epoch 2 iteration 0110/0188: training loss 1.942 Epoch 2 iteration 0111/0188: training loss 1.941 Epoch 2 iteration 0112/0188: training loss 1.939 Epoch 2 iteration 0113/0188: training loss 1.938 Epoch 2 iteration 0114/0188: training loss 1.937 Epoch 2 iteration 0115/0188: training loss 1.935 Epoch 2 iteration 0116/0188: training loss 1.933 Epoch 2 iteration 0117/0188: training loss 1.931 Epoch 2 iteration 0118/0188: training loss 1.929 Epoch 2 iteration 0119/0188: training loss 1.927 Epoch 2 iteration 0120/0188: training loss 1.929 Epoch 2 iteration 0121/0188: training loss 1.926 Epoch 2 iteration 0122/0188: training loss 1.926 Epoch 2 iteration 0123/0188: training loss 1.925 Epoch 2 iteration 0124/0188: training loss 1.924 Epoch 2 iteration 0125/0188: training loss 1.922 Epoch 2 iteration 0126/0188: training loss 1.921 Epoch 2 iteration 0127/0188: training loss 1.920 Epoch 2 iteration 0128/0188: training loss 1.920 Epoch 2 iteration 0129/0188: training loss 1.919 Epoch 2 iteration 0130/0188: training loss 1.919 Epoch 2 iteration 0131/0188: training loss 1.919 Epoch 2 iteration 0132/0188: training loss 1.920 Epoch 2 iteration 0133/0188: training loss 1.923 Epoch 2 iteration 0134/0188: training loss 1.921 Epoch 2 iteration 0135/0188: training loss 1.920 Epoch 2 iteration 0136/0188: training loss 1.918 Epoch 2 iteration 0137/0188: training loss 1.920 Epoch 2 iteration 0138/0188: training loss 1.919 Epoch 2 iteration 0139/0188: training loss 1.919 Epoch 2 iteration 0140/0188: training loss 1.918 Epoch 2 iteration 0141/0188: training loss 1.919 Epoch 2 iteration 0142/0188: training loss 1.917 Epoch 2 iteration 0143/0188: training loss 1.916 Epoch 2 iteration 0144/0188: training loss 1.916 Epoch 2 iteration 0145/0188: training loss 1.915 Epoch 2 iteration 0146/0188: training loss 1.914 Epoch 2 iteration 0147/0188: training loss 1.913 Epoch 2 iteration 0148/0188: training loss 1.911 Epoch 2 iteration 0149/0188: training loss 1.910 Epoch 2 iteration 0150/0188: training loss 1.909 Epoch 2 iteration 0151/0188: training loss 1.909 Epoch 2 iteration 0152/0188: training loss 1.909 Epoch 2 iteration 0153/0188: training loss 1.908 Epoch 2 iteration 0154/0188: training loss 1.909 Epoch 2 iteration 0155/0188: training loss 1.907 Epoch 2 iteration 0156/0188: training loss 1.906 Epoch 2 iteration 0157/0188: training loss 1.905 Epoch 2 iteration 0158/0188: training loss 1.903 Epoch 2 iteration 0159/0188: training loss 1.904 Epoch 2 iteration 0160/0188: training loss 1.904 Epoch 2 iteration 0161/0188: training loss 1.902 Epoch 2 iteration 0162/0188: training loss 1.900 Epoch 2 iteration 0163/0188: training loss 1.898 Epoch 2 iteration 0164/0188: training loss 1.896 Epoch 2 iteration 0165/0188: training loss 1.895 Epoch 2 iteration 0166/0188: training loss 1.893 Epoch 2 iteration 0167/0188: training loss 1.892 Epoch 2 iteration 0168/0188: training loss 1.892 Epoch 2 iteration 0169/0188: training loss 1.891 Epoch 2 iteration 0170/0188: training loss 1.889 Epoch 2 iteration 0171/0188: training loss 1.889 Epoch 2 iteration 0172/0188: training loss 1.890 Epoch 2 iteration 0173/0188: training loss 1.889 Epoch 2 iteration 0174/0188: training loss 1.889 Epoch 2 iteration 0175/0188: training loss 1.888 Epoch 2 iteration 0176/0188: training loss 1.888 Epoch 2 iteration 0177/0188: training loss 1.890 Epoch 2 iteration 0178/0188: training loss 1.892 Epoch 2 iteration 0179/0188: training loss 1.892 Epoch 2 iteration 0180/0188: training loss 1.892 Epoch 2 iteration 0181/0188: training loss 1.891 Epoch 2 iteration 0182/0188: training loss 1.889 Epoch 2 iteration 0183/0188: training loss 1.887 Epoch 2 iteration 0184/0188: training loss 1.885 Epoch 2 iteration 0185/0188: training loss 1.885 Epoch 2 iteration 0186/0188: training loss 1.887 Epoch 2 validation pixAcc: 0.829, mIoU: 0.257 Epoch 3 iteration 0001/0187: training loss 1.638 Epoch 3 iteration 0002/0187: training loss 1.644 Epoch 3 iteration 0003/0187: training loss 1.704 Epoch 3 iteration 0004/0187: training loss 1.696 Epoch 3 iteration 0005/0187: training loss 1.710 Epoch 3 iteration 0006/0187: training loss 1.709 Epoch 3 iteration 0007/0187: training loss 1.728 Epoch 3 iteration 0008/0187: training loss 1.751 Epoch 3 iteration 0009/0187: training loss 1.730 Epoch 3 iteration 0010/0187: training loss 1.754 Epoch 3 iteration 0011/0187: training loss 1.737 Epoch 3 iteration 0012/0187: training loss 1.744 Epoch 3 iteration 0013/0187: training loss 1.733 Epoch 3 iteration 0014/0187: training loss 1.713 Epoch 3 iteration 0015/0187: training loss 1.712 Epoch 3 iteration 0016/0187: training loss 1.712 Epoch 3 iteration 0017/0187: training loss 1.693 Epoch 3 iteration 0018/0187: training loss 1.712 Epoch 3 iteration 0019/0187: training loss 1.727 Epoch 3 iteration 0020/0187: training loss 1.714 Epoch 3 iteration 0021/0187: training loss 1.708 Epoch 3 iteration 0022/0187: training loss 1.703 Epoch 3 iteration 0023/0187: training loss 1.699 Epoch 3 iteration 0024/0187: training loss 1.704 Epoch 3 iteration 0025/0187: training loss 1.700 Epoch 3 iteration 0026/0187: training loss 1.710 Epoch 3 iteration 0027/0187: training loss 1.704 Epoch 3 iteration 0028/0187: training loss 1.694 Epoch 3 iteration 0029/0187: training loss 1.690 Epoch 3 iteration 0030/0187: training loss 1.696 Epoch 3 iteration 0031/0187: training loss 1.687 Epoch 3 iteration 0032/0187: training loss 1.697 Epoch 3 iteration 0033/0187: training loss 1.694 Epoch 3 iteration 0034/0187: training loss 1.686 Epoch 3 iteration 0035/0187: training loss 1.689 Epoch 3 iteration 0036/0187: training loss 1.690 Epoch 3 iteration 0037/0187: training loss 1.688 Epoch 3 iteration 0038/0187: training loss 1.693 Epoch 3 iteration 0039/0187: training loss 1.687 Epoch 3 iteration 0040/0187: training loss 1.682 Epoch 3 iteration 0041/0187: training loss 1.678 Epoch 3 iteration 0042/0187: training loss 1.680 Epoch 3 iteration 0043/0187: training loss 1.678 Epoch 3 iteration 0044/0187: training loss 1.679 Epoch 3 iteration 0045/0187: training loss 1.684 Epoch 3 iteration 0046/0187: training loss 1.680 Epoch 3 iteration 0047/0187: training loss 1.667 Epoch 3 iteration 0048/0187: training loss 1.663 Epoch 3 iteration 0049/0187: training loss 1.661 Epoch 3 iteration 0050/0187: training loss 1.661 Epoch 3 iteration 0051/0187: training loss 1.660 Epoch 3 iteration 0052/0187: training loss 1.659 Epoch 3 iteration 0053/0187: training loss 1.662 Epoch 3 iteration 0054/0187: training loss 1.662 Epoch 3 iteration 0055/0187: training loss 1.661 Epoch 3 iteration 0056/0187: training loss 1.659 Epoch 3 iteration 0057/0187: training loss 1.656 Epoch 3 iteration 0058/0187: training loss 1.658 Epoch 3 iteration 0059/0187: training loss 1.655 Epoch 3 iteration 0060/0187: training loss 1.657 Epoch 3 iteration 0061/0187: training loss 1.653 Epoch 3 iteration 0062/0187: training loss 1.654 Epoch 3 iteration 0063/0187: training loss 1.654 Epoch 3 iteration 0064/0187: training loss 1.654 Epoch 3 iteration 0065/0187: training loss 1.649 Epoch 3 iteration 0066/0187: training loss 1.646 Epoch 3 iteration 0067/0187: training loss 1.647 Epoch 3 iteration 0068/0187: training loss 1.652 Epoch 3 iteration 0069/0187: training loss 1.650 Epoch 3 iteration 0070/0187: training loss 1.654 Epoch 3 iteration 0071/0187: training loss 1.652 Epoch 3 iteration 0072/0187: training loss 1.651 Epoch 3 iteration 0073/0187: training loss 1.654 Epoch 3 iteration 0074/0187: training loss 1.655 Epoch 3 iteration 0075/0187: training loss 1.654 Epoch 3 iteration 0076/0187: training loss 1.660 Epoch 3 iteration 0077/0187: training loss 1.665 Epoch 3 iteration 0078/0187: training loss 1.669 Epoch 3 iteration 0079/0187: training loss 1.671 Epoch 3 iteration 0080/0187: training loss 1.669 Epoch 3 iteration 0081/0187: training loss 1.667 Epoch 3 iteration 0082/0187: training loss 1.665 Epoch 3 iteration 0083/0187: training loss 1.667 Epoch 3 iteration 0084/0187: training loss 1.665 Epoch 3 iteration 0085/0187: training loss 1.666 Epoch 3 iteration 0086/0187: training loss 1.666 Epoch 3 iteration 0087/0187: training loss 1.664 Epoch 3 iteration 0088/0187: training loss 1.667 Epoch 3 iteration 0089/0187: training loss 1.666 Epoch 3 iteration 0090/0187: training loss 1.665 Epoch 3 iteration 0091/0187: training loss 1.665 Epoch 3 iteration 0092/0187: training loss 1.663 Epoch 3 iteration 0093/0187: training loss 1.676 Epoch 3 iteration 0094/0187: training loss 1.673 Epoch 3 iteration 0095/0187: training loss 1.673 Epoch 3 iteration 0096/0187: training loss 1.673 Epoch 3 iteration 0097/0187: training loss 1.671 Epoch 3 iteration 0098/0187: training loss 1.672 Epoch 3 iteration 0099/0187: training loss 1.670 Epoch 3 iteration 0100/0187: training loss 1.670 Epoch 3 iteration 0101/0187: training loss 1.669 Epoch 3 iteration 0102/0187: training loss 1.668 Epoch 3 iteration 0103/0187: training loss 1.666 Epoch 3 iteration 0104/0187: training loss 1.663 Epoch 3 iteration 0105/0187: training loss 1.662 Epoch 3 iteration 0106/0187: training loss 1.662 Epoch 3 iteration 0107/0187: training loss 1.662 Epoch 3 iteration 0108/0187: training loss 1.662 Epoch 3 iteration 0109/0187: training loss 1.662 Epoch 3 iteration 0110/0187: training loss 1.662 Epoch 3 iteration 0111/0187: training loss 1.660 Epoch 3 iteration 0112/0187: training loss 1.661 Epoch 3 iteration 0113/0187: training loss 1.661 Epoch 3 iteration 0114/0187: training loss 1.661 Epoch 3 iteration 0115/0187: training loss 1.659 Epoch 3 iteration 0116/0187: training loss 1.657 Epoch 3 iteration 0117/0187: training loss 1.658 Epoch 3 iteration 0118/0187: training loss 1.657 Epoch 3 iteration 0119/0187: training loss 1.655 Epoch 3 iteration 0120/0187: training loss 1.655 Epoch 3 iteration 0121/0187: training loss 1.656 Epoch 3 iteration 0122/0187: training loss 1.654 Epoch 3 iteration 0123/0187: training loss 1.652 Epoch 3 iteration 0124/0187: training loss 1.651 Epoch 3 iteration 0125/0187: training loss 1.649 Epoch 3 iteration 0126/0187: training loss 1.647 Epoch 3 iteration 0127/0187: training loss 1.647 Epoch 3 iteration 0128/0187: training loss 1.645 Epoch 3 iteration 0129/0187: training loss 1.644 Epoch 3 iteration 0130/0187: training loss 1.642 Epoch 3 iteration 0131/0187: training loss 1.641 Epoch 3 iteration 0132/0187: training loss 1.640 Epoch 3 iteration 0133/0187: training loss 1.640 Epoch 3 iteration 0134/0187: training loss 1.641 Epoch 3 iteration 0135/0187: training loss 1.640 Epoch 3 iteration 0136/0187: training loss 1.640 Epoch 3 iteration 0137/0187: training loss 1.639 Epoch 3 iteration 0138/0187: training loss 1.640 Epoch 3 iteration 0139/0187: training loss 1.646 Epoch 3 iteration 0140/0187: training loss 1.645 Epoch 3 iteration 0141/0187: training loss 1.645 Epoch 3 iteration 0142/0187: training loss 1.644 Epoch 3 iteration 0143/0187: training loss 1.644 Epoch 3 iteration 0144/0187: training loss 1.641 Epoch 3 iteration 0145/0187: training loss 1.641 Epoch 3 iteration 0146/0187: training loss 1.637 Epoch 3 iteration 0147/0187: training loss 1.636 Epoch 3 iteration 0148/0187: training loss 1.634 Epoch 3 iteration 0149/0187: training loss 1.633 Epoch 3 iteration 0150/0187: training loss 1.631 Epoch 3 iteration 0151/0187: training loss 1.631 Epoch 3 iteration 0152/0187: training loss 1.631 Epoch 3 iteration 0153/0187: training loss 1.633 Epoch 3 iteration 0154/0187: training loss 1.634 Epoch 3 iteration 0155/0187: training loss 1.636 Epoch 3 iteration 0156/0187: training loss 1.635 Epoch 3 iteration 0157/0187: training loss 1.636 Epoch 3 iteration 0158/0187: training loss 1.635 Epoch 3 iteration 0159/0187: training loss 1.635 Epoch 3 iteration 0160/0187: training loss 1.636 Epoch 3 iteration 0161/0187: training loss 1.638 Epoch 3 iteration 0162/0187: training loss 1.639 Epoch 3 iteration 0163/0187: training loss 1.639 Epoch 3 iteration 0164/0187: training loss 1.638 Epoch 3 iteration 0165/0187: training loss 1.637 Epoch 3 iteration 0166/0187: training loss 1.636 Epoch 3 iteration 0167/0187: training loss 1.637 Epoch 3 iteration 0168/0187: training loss 1.636 Epoch 3 iteration 0169/0187: training loss 1.635 Epoch 3 iteration 0170/0187: training loss 1.635 Epoch 3 iteration 0171/0187: training loss 1.635 Epoch 3 iteration 0172/0187: training loss 1.634 Epoch 3 iteration 0173/0187: training loss 1.633 Epoch 3 iteration 0174/0187: training loss 1.633 Epoch 3 iteration 0175/0187: training loss 1.633 Epoch 3 iteration 0176/0187: training loss 1.632 Epoch 3 iteration 0177/0187: training loss 1.631 Epoch 3 iteration 0178/0187: training loss 1.630 Epoch 3 iteration 0179/0187: training loss 1.629 Epoch 3 iteration 0180/0187: training loss 1.627 Epoch 3 iteration 0181/0187: training loss 1.628 Epoch 3 iteration 0182/0187: training loss 1.626 Epoch 3 iteration 0183/0187: training loss 1.625 Epoch 3 iteration 0184/0187: training loss 1.624 Epoch 3 iteration 0185/0187: training loss 1.623 Epoch 3 iteration 0186/0187: training loss 1.626 Epoch 3 iteration 0187/0187: training loss 1.625 Epoch 3 validation pixAcc: 0.837, mIoU: 0.274 Epoch 4 iteration 0001/0187: training loss 1.534 Epoch 4 iteration 0002/0187: training loss 1.567 Epoch 4 iteration 0003/0187: training loss 1.587 Epoch 4 iteration 0004/0187: training loss 1.605 Epoch 4 iteration 0005/0187: training loss 1.591 Epoch 4 iteration 0006/0187: training loss 1.639 Epoch 4 iteration 0007/0187: training loss 1.616 Epoch 4 iteration 0008/0187: training loss 1.625 Epoch 4 iteration 0009/0187: training loss 1.624 Epoch 4 iteration 0010/0187: training loss 1.597 Epoch 4 iteration 0011/0187: training loss 1.553 Epoch 4 iteration 0012/0187: training loss 1.551 Epoch 4 iteration 0013/0187: training loss 1.555 Epoch 4 iteration 0014/0187: training loss 1.568 Epoch 4 iteration 0015/0187: training loss 1.570 Epoch 4 iteration 0016/0187: training loss 1.575 Epoch 4 iteration 0017/0187: training loss 1.563 Epoch 4 iteration 0018/0187: training loss 1.563 Epoch 4 iteration 0019/0187: training loss 1.563 Epoch 4 iteration 0020/0187: training loss 1.554 Epoch 4 iteration 0021/0187: training loss 1.547 Epoch 4 iteration 0022/0187: training loss 1.538 Epoch 4 iteration 0023/0187: training loss 1.529 Epoch 4 iteration 0024/0187: training loss 1.528 Epoch 4 iteration 0025/0187: training loss 1.537 Epoch 4 iteration 0026/0187: training loss 1.537 Epoch 4 iteration 0027/0187: training loss 1.527 Epoch 4 iteration 0028/0187: training loss 1.519 Epoch 4 iteration 0029/0187: training loss 1.511 Epoch 4 iteration 0030/0187: training loss 1.521 Epoch 4 iteration 0031/0187: training loss 1.514 Epoch 4 iteration 0032/0187: training loss 1.507 Epoch 4 iteration 0033/0187: training loss 1.504 Epoch 4 iteration 0034/0187: training loss 1.501 Epoch 4 iteration 0035/0187: training loss 1.500 Epoch 4 iteration 0036/0187: training loss 1.503 Epoch 4 iteration 0037/0187: training loss 1.499 Epoch 4 iteration 0038/0187: training loss 1.511 Epoch 4 iteration 0039/0187: training loss 1.504 Epoch 4 iteration 0040/0187: training loss 1.511 Epoch 4 iteration 0041/0187: training loss 1.511 Epoch 4 iteration 0042/0187: training loss 1.504 Epoch 4 iteration 0043/0187: training loss 1.501 Epoch 4 iteration 0044/0187: training loss 1.501 Epoch 4 iteration 0045/0187: training loss 1.502 Epoch 4 iteration 0046/0187: training loss 1.508 Epoch 4 iteration 0047/0187: training loss 1.509 Epoch 4 iteration 0048/0187: training loss 1.513 Epoch 4 iteration 0049/0187: training loss 1.511 Epoch 4 iteration 0050/0187: training loss 1.515 Epoch 4 iteration 0051/0187: training loss 1.515 Epoch 4 iteration 0052/0187: training loss 1.510 Epoch 4 iteration 0053/0187: training loss 1.518 Epoch 4 iteration 0054/0187: training loss 1.517 Epoch 4 iteration 0055/0187: training loss 1.516 Epoch 4 iteration 0056/0187: training loss 1.515 Epoch 4 iteration 0057/0187: training loss 1.515 Epoch 4 iteration 0058/0187: training loss 1.511 Epoch 4 iteration 0059/0187: training loss 1.506 Epoch 4 iteration 0060/0187: training loss 1.509 Epoch 4 iteration 0061/0187: training loss 1.508 Epoch 4 iteration 0062/0187: training loss 1.506 Epoch 4 iteration 0063/0187: training loss 1.507 Epoch 4 iteration 0064/0187: training loss 1.508 Epoch 4 iteration 0065/0187: training loss 1.509 Epoch 4 iteration 0066/0187: training loss 1.509 Epoch 4 iteration 0067/0187: training loss 1.511 Epoch 4 iteration 0068/0187: training loss 1.510 Epoch 4 iteration 0069/0187: training loss 1.510 Epoch 4 iteration 0070/0187: training loss 1.506 Epoch 4 iteration 0071/0187: training loss 1.503 Epoch 4 iteration 0072/0187: training loss 1.502 Epoch 4 iteration 0073/0187: training loss 1.502 Epoch 4 iteration 0074/0187: training loss 1.505 Epoch 4 iteration 0075/0187: training loss 1.506 Epoch 4 iteration 0076/0187: training loss 1.506 Epoch 4 iteration 0077/0187: training loss 1.507 Epoch 4 iteration 0078/0187: training loss 1.506 Epoch 4 iteration 0079/0187: training loss 1.503 Epoch 4 iteration 0080/0187: training loss 1.505 Epoch 4 iteration 0081/0187: training loss 1.506 Epoch 4 iteration 0082/0187: training loss 1.507 Epoch 4 iteration 0083/0187: training loss 1.506 Epoch 4 iteration 0084/0187: training loss 1.506 Epoch 4 iteration 0085/0187: training loss 1.506 Epoch 4 iteration 0086/0187: training loss 1.508 Epoch 4 iteration 0087/0187: training loss 1.510 Epoch 4 iteration 0088/0187: training loss 1.509 Epoch 4 iteration 0089/0187: training loss 1.508 Epoch 4 iteration 0090/0187: training loss 1.506 Epoch 4 iteration 0091/0188: training loss 1.504 Epoch 4 iteration 0092/0188: training loss 1.502 Epoch 4 iteration 0093/0188: training loss 1.504 Epoch 4 iteration 0094/0188: training loss 1.502 Epoch 4 iteration 0095/0188: training loss 1.501 Epoch 4 iteration 0096/0188: training loss 1.501 Epoch 4 iteration 0097/0188: training loss 1.499 Epoch 4 iteration 0098/0188: training loss 1.497 Epoch 4 iteration 0099/0188: training loss 1.495 Epoch 4 iteration 0100/0188: training loss 1.496 Epoch 4 iteration 0101/0188: training loss 1.497 Epoch 4 iteration 0102/0188: training loss 1.495 Epoch 4 iteration 0103/0188: training loss 1.501 Epoch 4 iteration 0104/0188: training loss 1.502 Epoch 4 iteration 0105/0188: training loss 1.507 Epoch 4 iteration 0106/0188: training loss 1.504 Epoch 4 iteration 0107/0188: training loss 1.506 Epoch 4 iteration 0108/0188: training loss 1.506 Epoch 4 iteration 0109/0188: training loss 1.502 Epoch 4 iteration 0110/0188: training loss 1.506 Epoch 4 iteration 0111/0188: training loss 1.506 Epoch 4 iteration 0112/0188: training loss 1.505 Epoch 4 iteration 0113/0188: training loss 1.505 Epoch 4 iteration 0114/0188: training loss 1.503 Epoch 4 iteration 0115/0188: training loss 1.502 Epoch 4 iteration 0116/0188: training loss 1.503 Epoch 4 iteration 0117/0188: training loss 1.503 Epoch 4 iteration 0118/0188: training loss 1.519 Epoch 4 iteration 0119/0188: training loss 1.518 Epoch 4 iteration 0120/0188: training loss 1.517 Epoch 4 iteration 0121/0188: training loss 1.517 Epoch 4 iteration 0122/0188: training loss 1.517 Epoch 4 iteration 0123/0188: training loss 1.515 Epoch 4 iteration 0124/0188: training loss 1.514 Epoch 4 iteration 0125/0188: training loss 1.515 Epoch 4 iteration 0126/0188: training loss 1.513 Epoch 4 iteration 0127/0188: training loss 1.512 Epoch 4 iteration 0128/0188: training loss 1.513 Epoch 4 iteration 0129/0188: training loss 1.515 Epoch 4 iteration 0130/0188: training loss 1.515 Epoch 4 iteration 0131/0188: training loss 1.514 Epoch 4 iteration 0132/0188: training loss 1.514 Epoch 4 iteration 0133/0188: training loss 1.513 Epoch 4 iteration 0134/0188: training loss 1.516 Epoch 4 iteration 0135/0188: training loss 1.516 Epoch 4 iteration 0136/0188: training loss 1.514 Epoch 4 iteration 0137/0188: training loss 1.518 Epoch 4 iteration 0138/0188: training loss 1.516 Epoch 4 iteration 0139/0188: training loss 1.515 Epoch 4 iteration 0140/0188: training loss 1.514 Epoch 4 iteration 0141/0188: training loss 1.511 Epoch 4 iteration 0142/0188: training loss 1.512 Epoch 4 iteration 0143/0188: training loss 1.510 Epoch 4 iteration 0144/0188: training loss 1.510 Epoch 4 iteration 0145/0188: training loss 1.511 Epoch 4 iteration 0146/0188: training loss 1.510 Epoch 4 iteration 0147/0188: training loss 1.509 Epoch 4 iteration 0148/0188: training loss 1.508 Epoch 4 iteration 0149/0188: training loss 1.507 Epoch 4 iteration 0150/0188: training loss 1.506 Epoch 4 iteration 0151/0188: training loss 1.506 Epoch 4 iteration 0152/0188: training loss 1.509 Epoch 4 iteration 0153/0188: training loss 1.509 Epoch 4 iteration 0154/0188: training loss 1.508 Epoch 4 iteration 0155/0188: training loss 1.507 Epoch 4 iteration 0156/0188: training loss 1.504 Epoch 4 iteration 0157/0188: training loss 1.503 Epoch 4 iteration 0158/0188: training loss 1.503 Epoch 4 iteration 0159/0188: training loss 1.501 Epoch 4 iteration 0160/0188: training loss 1.501 Epoch 4 iteration 0161/0188: training loss 1.501 Epoch 4 iteration 0162/0188: training loss 1.501 Epoch 4 iteration 0163/0188: training loss 1.499 Epoch 4 iteration 0164/0188: training loss 1.500 Epoch 4 iteration 0165/0188: training loss 1.499 Epoch 4 iteration 0166/0188: training loss 1.499 Epoch 4 iteration 0167/0188: training loss 1.498 Epoch 4 iteration 0168/0188: training loss 1.497 Epoch 4 iteration 0169/0188: training loss 1.495 Epoch 4 iteration 0170/0188: training loss 1.494 Epoch 4 iteration 0171/0188: training loss 1.495 Epoch 4 iteration 0172/0188: training loss 1.495 Epoch 4 iteration 0173/0188: training loss 1.495 Epoch 4 iteration 0174/0188: training loss 1.493 Epoch 4 iteration 0175/0188: training loss 1.493 Epoch 4 iteration 0176/0188: training loss 1.492 Epoch 4 iteration 0177/0188: training loss 1.492 Epoch 4 iteration 0178/0188: training loss 1.490 Epoch 4 iteration 0179/0188: training loss 1.491 Epoch 4 iteration 0180/0188: training loss 1.492 Epoch 4 iteration 0181/0188: training loss 1.493 Epoch 4 iteration 0182/0188: training loss 1.492 Epoch 4 iteration 0183/0188: training loss 1.490 Epoch 4 iteration 0184/0188: training loss 1.490 Epoch 4 iteration 0185/0188: training loss 1.490 Epoch 4 iteration 0186/0188: training loss 1.489 Epoch 4 validation pixAcc: 0.844, mIoU: 0.290 Epoch 5 iteration 0001/0187: training loss 1.531 Epoch 5 iteration 0002/0187: training loss 1.444 Epoch 5 iteration 0003/0187: training loss 1.430 Epoch 5 iteration 0004/0187: training loss 1.414 Epoch 5 iteration 0005/0187: training loss 1.405 Epoch 5 iteration 0006/0187: training loss 1.409 Epoch 5 iteration 0007/0187: training loss 1.385 Epoch 5 iteration 0008/0187: training loss 1.387 Epoch 5 iteration 0009/0187: training loss 1.380 Epoch 5 iteration 0010/0187: training loss 1.417 Epoch 5 iteration 0011/0187: training loss 1.411 Epoch 5 iteration 0012/0187: training loss 1.417 Epoch 5 iteration 0013/0187: training loss 1.403 Epoch 5 iteration 0014/0187: training loss 1.403 Epoch 5 iteration 0015/0187: training loss 1.401 Epoch 5 iteration 0016/0187: training loss 1.408 Epoch 5 iteration 0017/0187: training loss 1.408 Epoch 5 iteration 0018/0187: training loss 1.433 Epoch 5 iteration 0019/0187: training loss 1.434 Epoch 5 iteration 0020/0187: training loss 1.439 Epoch 5 iteration 0021/0187: training loss 1.428 Epoch 5 iteration 0022/0187: training loss 1.432 Epoch 5 iteration 0023/0187: training loss 1.427 Epoch 5 iteration 0024/0187: training loss 1.444 Epoch 5 iteration 0025/0187: training loss 1.451 Epoch 5 iteration 0026/0187: training loss 1.449 Epoch 5 iteration 0027/0187: training loss 1.441 Epoch 5 iteration 0028/0187: training loss 1.432 Epoch 5 iteration 0029/0187: training loss 1.437 Epoch 5 iteration 0030/0187: training loss 1.428 Epoch 5 iteration 0031/0187: training loss 1.426 Epoch 5 iteration 0032/0187: training loss 1.425 Epoch 5 iteration 0033/0187: training loss 1.425 Epoch 5 iteration 0034/0187: training loss 1.434 Epoch 5 iteration 0035/0187: training loss 1.425 Epoch 5 iteration 0036/0187: training loss 1.421 Epoch 5 iteration 0037/0187: training loss 1.424 Epoch 5 iteration 0038/0187: training loss 1.425 Epoch 5 iteration 0039/0187: training loss 1.428 Epoch 5 iteration 0040/0187: training loss 1.427 Epoch 5 iteration 0041/0187: training loss 1.447 Epoch 5 iteration 0042/0187: training loss 1.446 Epoch 5 iteration 0043/0187: training loss 1.444 Epoch 5 iteration 0044/0187: training loss 1.440 Epoch 5 iteration 0045/0187: training loss 1.451 Epoch 5 iteration 0046/0187: training loss 1.447 Epoch 5 iteration 0047/0187: training loss 1.452 Epoch 5 iteration 0048/0187: training loss 1.449 Epoch 5 iteration 0049/0187: training loss 1.444 Epoch 5 iteration 0050/0187: training loss 1.441 Epoch 5 iteration 0051/0187: training loss 1.444 Epoch 5 iteration 0052/0187: training loss 1.443 Epoch 5 iteration 0053/0187: training loss 1.446 Epoch 5 iteration 0054/0187: training loss 1.448 Epoch 5 iteration 0055/0187: training loss 1.444 Epoch 5 iteration 0056/0187: training loss 1.443 Epoch 5 iteration 0057/0187: training loss 1.446 Epoch 5 iteration 0058/0187: training loss 1.444 Epoch 5 iteration 0059/0187: training loss 1.447 Epoch 5 iteration 0060/0187: training loss 1.447 Epoch 5 iteration 0061/0187: training loss 1.443 Epoch 5 iteration 0062/0187: training loss 1.440 Epoch 5 iteration 0063/0187: training loss 1.436 Epoch 5 iteration 0064/0187: training loss 1.434 Epoch 5 iteration 0065/0187: training loss 1.430 Epoch 5 iteration 0066/0187: training loss 1.430 Epoch 5 iteration 0067/0187: training loss 1.427 Epoch 5 iteration 0068/0187: training loss 1.425 Epoch 5 iteration 0069/0187: training loss 1.421 Epoch 5 iteration 0070/0187: training loss 1.422 Epoch 5 iteration 0071/0187: training loss 1.422 Epoch 5 iteration 0072/0187: training loss 1.422 Epoch 5 iteration 0073/0187: training loss 1.422 Epoch 5 iteration 0074/0187: training loss 1.422 Epoch 5 iteration 0075/0187: training loss 1.422 Epoch 5 iteration 0076/0187: training loss 1.422 Epoch 5 iteration 0077/0187: training loss 1.420 Epoch 5 iteration 0078/0187: training loss 1.419 Epoch 5 iteration 0079/0187: training loss 1.417 Epoch 5 iteration 0080/0187: training loss 1.420 Epoch 5 iteration 0081/0187: training loss 1.419 Epoch 5 iteration 0082/0187: training loss 1.417 Epoch 5 iteration 0083/0187: training loss 1.416 Epoch 5 iteration 0084/0187: training loss 1.416 Epoch 5 iteration 0085/0187: training loss 1.413 Epoch 5 iteration 0086/0187: training loss 1.413 Epoch 5 iteration 0087/0187: training loss 1.414 Epoch 5 iteration 0088/0187: training loss 1.414 Epoch 5 iteration 0089/0187: training loss 1.414 Epoch 5 iteration 0090/0187: training loss 1.413 Epoch 5 iteration 0091/0187: training loss 1.413 Epoch 5 iteration 0092/0187: training loss 1.412 Epoch 5 iteration 0093/0187: training loss 1.411 Epoch 5 iteration 0094/0187: training loss 1.409 Epoch 5 iteration 0095/0187: training loss 1.410 Epoch 5 iteration 0096/0187: training loss 1.412 Epoch 5 iteration 0097/0187: training loss 1.415 Epoch 5 iteration 0098/0187: training loss 1.416 Epoch 5 iteration 0099/0187: training loss 1.415 Epoch 5 iteration 0100/0187: training loss 1.413 Epoch 5 iteration 0101/0187: training loss 1.414 Epoch 5 iteration 0102/0187: training loss 1.412 Epoch 5 iteration 0103/0187: training loss 1.413 Epoch 5 iteration 0104/0187: training loss 1.413 Epoch 5 iteration 0105/0187: training loss 1.412 Epoch 5 iteration 0106/0187: training loss 1.411 Epoch 5 iteration 0107/0187: training loss 1.413 Epoch 5 iteration 0108/0187: training loss 1.414 Epoch 5 iteration 0109/0187: training loss 1.414 Epoch 5 iteration 0110/0187: training loss 1.414 Epoch 5 iteration 0111/0187: training loss 1.411 Epoch 5 iteration 0112/0187: training loss 1.410 Epoch 5 iteration 0113/0187: training loss 1.408 Epoch 5 iteration 0114/0187: training loss 1.407 Epoch 5 iteration 0115/0187: training loss 1.405 Epoch 5 iteration 0116/0187: training loss 1.407 Epoch 5 iteration 0117/0187: training loss 1.405 Epoch 5 iteration 0118/0187: training loss 1.406 Epoch 5 iteration 0119/0187: training loss 1.405 Epoch 5 iteration 0120/0187: training loss 1.406 Epoch 5 iteration 0121/0187: training loss 1.404 Epoch 5 iteration 0122/0187: training loss 1.405 Epoch 5 iteration 0123/0187: training loss 1.403 Epoch 5 iteration 0124/0187: training loss 1.402 Epoch 5 iteration 0125/0187: training loss 1.400 Epoch 5 iteration 0126/0187: training loss 1.401 Epoch 5 iteration 0127/0187: training loss 1.398 Epoch 5 iteration 0128/0187: training loss 1.397 Epoch 5 iteration 0129/0187: training loss 1.398 Epoch 5 iteration 0130/0187: training loss 1.397 Epoch 5 iteration 0131/0187: training loss 1.398 Epoch 5 iteration 0132/0187: training loss 1.397 Epoch 5 iteration 0133/0187: training loss 1.397 Epoch 5 iteration 0134/0187: training loss 1.394 Epoch 5 iteration 0135/0187: training loss 1.398 Epoch 5 iteration 0136/0187: training loss 1.398 Epoch 5 iteration 0137/0187: training loss 1.397 Epoch 5 iteration 0138/0187: training loss 1.397 Epoch 5 iteration 0139/0187: training loss 1.397 Epoch 5 iteration 0140/0187: training loss 1.398 Epoch 5 iteration 0141/0187: training loss 1.396 Epoch 5 iteration 0142/0187: training loss 1.397 Epoch 5 iteration 0143/0187: training loss 1.395 Epoch 5 iteration 0144/0187: training loss 1.395 Epoch 5 iteration 0145/0187: training loss 1.395 Epoch 5 iteration 0146/0187: training loss 1.395 Epoch 5 iteration 0147/0187: training loss 1.393 Epoch 5 iteration 0148/0187: training loss 1.392 Epoch 5 iteration 0149/0187: training loss 1.392 Epoch 5 iteration 0150/0187: training loss 1.391 Epoch 5 iteration 0151/0187: training loss 1.389 Epoch 5 iteration 0152/0187: training loss 1.389 Epoch 5 iteration 0153/0187: training loss 1.389 Epoch 5 iteration 0154/0187: training loss 1.389 Epoch 5 iteration 0155/0187: training loss 1.389 Epoch 5 iteration 0156/0187: training loss 1.391 Epoch 5 iteration 0157/0187: training loss 1.393 Epoch 5 iteration 0158/0187: training loss 1.394 Epoch 5 iteration 0159/0187: training loss 1.393 Epoch 5 iteration 0160/0187: training loss 1.393 Epoch 5 iteration 0161/0187: training loss 1.392 Epoch 5 iteration 0162/0187: training loss 1.392 Epoch 5 iteration 0163/0187: training loss 1.392 Epoch 5 iteration 0164/0187: training loss 1.390 Epoch 5 iteration 0165/0187: training loss 1.390 Epoch 5 iteration 0166/0187: training loss 1.389 Epoch 5 iteration 0167/0187: training loss 1.388 Epoch 5 iteration 0168/0187: training loss 1.387 Epoch 5 iteration 0169/0187: training loss 1.387 Epoch 5 iteration 0170/0187: training loss 1.388 Epoch 5 iteration 0171/0187: training loss 1.387 Epoch 5 iteration 0172/0187: training loss 1.387 Epoch 5 iteration 0173/0187: training loss 1.387 Epoch 5 iteration 0174/0187: training loss 1.387 Epoch 5 iteration 0175/0187: training loss 1.388 Epoch 5 iteration 0176/0187: training loss 1.386 Epoch 5 iteration 0177/0187: training loss 1.386 Epoch 5 iteration 0178/0187: training loss 1.384 Epoch 5 iteration 0179/0187: training loss 1.382 Epoch 5 iteration 0180/0187: training loss 1.383 Epoch 5 iteration 0181/0187: training loss 1.381 Epoch 5 iteration 0182/0187: training loss 1.380 Epoch 5 iteration 0183/0187: training loss 1.379 Epoch 5 iteration 0184/0187: training loss 1.380 Epoch 5 iteration 0185/0187: training loss 1.379 Epoch 5 iteration 0186/0187: training loss 1.379 Epoch 5 iteration 0187/0187: training loss 1.377 Epoch 5 validation pixAcc: 0.850, mIoU: 0.308 Epoch 6 iteration 0001/0187: training loss 1.270 Epoch 6 iteration 0002/0187: training loss 1.258 Epoch 6 iteration 0003/0187: training loss 1.245 Epoch 6 iteration 0004/0187: training loss 1.271 Epoch 6 iteration 0005/0187: training loss 1.257 Epoch 6 iteration 0006/0187: training loss 1.255 Epoch 6 iteration 0007/0187: training loss 1.266 Epoch 6 iteration 0008/0187: training loss 1.289 Epoch 6 iteration 0009/0187: training loss 1.263 Epoch 6 iteration 0010/0187: training loss 1.285 Epoch 6 iteration 0011/0187: training loss 1.333 Epoch 6 iteration 0012/0187: training loss 1.319 Epoch 6 iteration 0013/0187: training loss 1.300 Epoch 6 iteration 0014/0187: training loss 1.305 Epoch 6 iteration 0015/0187: training loss 1.324 Epoch 6 iteration 0016/0187: training loss 1.315 Epoch 6 iteration 0017/0187: training loss 1.305 Epoch 6 iteration 0018/0187: training loss 1.319 Epoch 6 iteration 0019/0187: training loss 1.315 Epoch 6 iteration 0020/0187: training loss 1.300 Epoch 6 iteration 0021/0187: training loss 1.293 Epoch 6 iteration 0022/0187: training loss 1.286 Epoch 6 iteration 0023/0187: training loss 1.279 Epoch 6 iteration 0024/0187: training loss 1.291 Epoch 6 iteration 0025/0187: training loss 1.289 Epoch 6 iteration 0026/0187: training loss 1.286 Epoch 6 iteration 0027/0187: training loss 1.275 Epoch 6 iteration 0028/0187: training loss 1.275 Epoch 6 iteration 0029/0187: training loss 1.281 Epoch 6 iteration 0030/0187: training loss 1.278 Epoch 6 iteration 0031/0187: training loss 1.278 Epoch 6 iteration 0032/0187: training loss 1.276 Epoch 6 iteration 0033/0187: training loss 1.275 Epoch 6 iteration 0034/0187: training loss 1.273 Epoch 6 iteration 0035/0187: training loss 1.281 Epoch 6 iteration 0036/0187: training loss 1.275 Epoch 6 iteration 0037/0187: training loss 1.278 Epoch 6 iteration 0038/0187: training loss 1.277 Epoch 6 iteration 0039/0187: training loss 1.275 Epoch 6 iteration 0040/0187: training loss 1.278 Epoch 6 iteration 0041/0187: training loss 1.276 Epoch 6 iteration 0042/0187: training loss 1.282 Epoch 6 iteration 0043/0187: training loss 1.286 Epoch 6 iteration 0044/0187: training loss 1.290 Epoch 6 iteration 0045/0187: training loss 1.285 Epoch 6 iteration 0046/0187: training loss 1.282 Epoch 6 iteration 0047/0187: training loss 1.288 Epoch 6 iteration 0048/0187: training loss 1.285 Epoch 6 iteration 0049/0187: training loss 1.285 Epoch 6 iteration 0050/0187: training loss 1.284 Epoch 6 iteration 0051/0187: training loss 1.283 Epoch 6 iteration 0052/0187: training loss 1.281 Epoch 6 iteration 0053/0187: training loss 1.280 Epoch 6 iteration 0054/0187: training loss 1.277 Epoch 6 iteration 0055/0187: training loss 1.283 Epoch 6 iteration 0056/0187: training loss 1.282 Epoch 6 iteration 0057/0187: training loss 1.281 Epoch 6 iteration 0058/0187: training loss 1.283 Epoch 6 iteration 0059/0187: training loss 1.288 Epoch 6 iteration 0060/0187: training loss 1.284 Epoch 6 iteration 0061/0187: training loss 1.278 Epoch 6 iteration 0062/0187: training loss 1.276 Epoch 6 iteration 0063/0187: training loss 1.274 Epoch 6 iteration 0064/0187: training loss 1.272 Epoch 6 iteration 0065/0187: training loss 1.267 Epoch 6 iteration 0066/0187: training loss 1.267 Epoch 6 iteration 0067/0187: training loss 1.267 Epoch 6 iteration 0068/0187: training loss 1.268 Epoch 6 iteration 0069/0187: training loss 1.269 Epoch 6 iteration 0070/0187: training loss 1.271 Epoch 6 iteration 0071/0187: training loss 1.272 Epoch 6 iteration 0072/0187: training loss 1.275 Epoch 6 iteration 0073/0187: training loss 1.273 Epoch 6 iteration 0074/0187: training loss 1.273 Epoch 6 iteration 0075/0187: training loss 1.272 Epoch 6 iteration 0076/0187: training loss 1.272 Epoch 6 iteration 0077/0187: training loss 1.278 Epoch 6 iteration 0078/0187: training loss 1.274 Epoch 6 iteration 0079/0187: training loss 1.275 Epoch 6 iteration 0080/0187: training loss 1.274 Epoch 6 iteration 0081/0187: training loss 1.277 Epoch 6 iteration 0082/0187: training loss 1.277 Epoch 6 iteration 0083/0187: training loss 1.276 Epoch 6 iteration 0084/0187: training loss 1.275 Epoch 6 iteration 0085/0187: training loss 1.274 Epoch 6 iteration 0086/0187: training loss 1.272 Epoch 6 iteration 0087/0187: training loss 1.271 Epoch 6 iteration 0088/0187: training loss 1.271 Epoch 6 iteration 0089/0187: training loss 1.276 Epoch 6 iteration 0090/0187: training loss 1.277 Epoch 6 iteration 0091/0188: training loss 1.278 Epoch 6 iteration 0092/0188: training loss 1.283 Epoch 6 iteration 0093/0188: training loss 1.285 Epoch 6 iteration 0094/0188: training loss 1.286 Epoch 6 iteration 0095/0188: training loss 1.288 Epoch 6 iteration 0096/0188: training loss 1.290 Epoch 6 iteration 0097/0188: training loss 1.293 Epoch 6 iteration 0098/0188: training loss 1.291 Epoch 6 iteration 0099/0188: training loss 1.290 Epoch 6 iteration 0100/0188: training loss 1.288 Epoch 6 iteration 0101/0188: training loss 1.291 Epoch 6 iteration 0102/0188: training loss 1.290 Epoch 6 iteration 0103/0188: training loss 1.288 Epoch 6 iteration 0104/0188: training loss 1.286 Epoch 6 iteration 0105/0188: training loss 1.286 Epoch 6 iteration 0106/0188: training loss 1.285 Epoch 6 iteration 0107/0188: training loss 1.288 Epoch 6 iteration 0108/0188: training loss 1.289 Epoch 6 iteration 0109/0188: training loss 1.289 Epoch 6 iteration 0110/0188: training loss 1.292 Epoch 6 iteration 0111/0188: training loss 1.292 Epoch 6 iteration 0112/0188: training loss 1.292 Epoch 6 iteration 0113/0188: training loss 1.293 Epoch 6 iteration 0114/0188: training loss 1.297 Epoch 6 iteration 0115/0188: training loss 1.295 Epoch 6 iteration 0116/0188: training loss 1.296 Epoch 6 iteration 0117/0188: training loss 1.296 Epoch 6 iteration 0118/0188: training loss 1.299 Epoch 6 iteration 0119/0188: training loss 1.300 Epoch 6 iteration 0120/0188: training loss 1.299 Epoch 6 iteration 0121/0188: training loss 1.303 Epoch 6 iteration 0122/0188: training loss 1.301 Epoch 6 iteration 0123/0188: training loss 1.303 Epoch 6 iteration 0124/0188: training loss 1.305 Epoch 6 iteration 0125/0188: training loss 1.306 Epoch 6 iteration 0126/0188: training loss 1.304 Epoch 6 iteration 0127/0188: training loss 1.303 Epoch 6 iteration 0128/0188: training loss 1.302 Epoch 6 iteration 0129/0188: training loss 1.301 Epoch 6 iteration 0130/0188: training loss 1.301 Epoch 6 iteration 0131/0188: training loss 1.300 Epoch 6 iteration 0132/0188: training loss 1.299 Epoch 6 iteration 0133/0188: training loss 1.297 Epoch 6 iteration 0134/0188: training loss 1.300 Epoch 6 iteration 0135/0188: training loss 1.299 Epoch 6 iteration 0136/0188: training loss 1.298 Epoch 6 iteration 0137/0188: training loss 1.297 Epoch 6 iteration 0138/0188: training loss 1.295 Epoch 6 iteration 0139/0188: training loss 1.294 Epoch 6 iteration 0140/0188: training loss 1.294 Epoch 6 iteration 0141/0188: training loss 1.297 Epoch 6 iteration 0142/0188: training loss 1.296 Epoch 6 iteration 0143/0188: training loss 1.297 Epoch 6 iteration 0144/0188: training loss 1.296 Epoch 6 iteration 0145/0188: training loss 1.296 Epoch 6 iteration 0146/0188: training loss 1.295 Epoch 6 iteration 0147/0188: training loss 1.294 Epoch 6 iteration 0148/0188: training loss 1.292 Epoch 6 iteration 0149/0188: training loss 1.293 Epoch 6 iteration 0150/0188: training loss 1.292 Epoch 6 iteration 0151/0188: training loss 1.290 Epoch 6 iteration 0152/0188: training loss 1.290 Epoch 6 iteration 0153/0188: training loss 1.289 Epoch 6 iteration 0154/0188: training loss 1.288 Epoch 6 iteration 0155/0188: training loss 1.288 Epoch 6 iteration 0156/0188: training loss 1.288 Epoch 6 iteration 0157/0188: training loss 1.290 Epoch 6 iteration 0158/0188: training loss 1.289 Epoch 6 iteration 0159/0188: training loss 1.288 Epoch 6 iteration 0160/0188: training loss 1.287 Epoch 6 iteration 0161/0188: training loss 1.288 Epoch 6 iteration 0162/0188: training loss 1.288 Epoch 6 iteration 0163/0188: training loss 1.287 Epoch 6 iteration 0164/0188: training loss 1.288 Epoch 6 iteration 0165/0188: training loss 1.288 Epoch 6 iteration 0166/0188: training loss 1.287 Epoch 6 iteration 0167/0188: training loss 1.287 Epoch 6 iteration 0168/0188: training loss 1.287 Epoch 6 iteration 0169/0188: training loss 1.287 Epoch 6 iteration 0170/0188: training loss 1.286 Epoch 6 iteration 0171/0188: training loss 1.286 Epoch 6 iteration 0172/0188: training loss 1.285 Epoch 6 iteration 0173/0188: training loss 1.284 Epoch 6 iteration 0174/0188: training loss 1.285 Epoch 6 iteration 0175/0188: training loss 1.285 Epoch 6 iteration 0176/0188: training loss 1.283 Epoch 6 iteration 0177/0188: training loss 1.283 Epoch 6 iteration 0178/0188: training loss 1.282 Epoch 6 iteration 0179/0188: training loss 1.283 Epoch 6 iteration 0180/0188: training loss 1.283 Epoch 6 iteration 0181/0188: training loss 1.282 Epoch 6 iteration 0182/0188: training loss 1.282 Epoch 6 iteration 0183/0188: training loss 1.282 Epoch 6 iteration 0184/0188: training loss 1.282 Epoch 6 iteration 0185/0188: training loss 1.283 Epoch 6 iteration 0186/0188: training loss 1.283 Epoch 6 validation pixAcc: 0.852, mIoU: 0.314 Epoch 7 iteration 0001/0187: training loss 1.305 Epoch 7 iteration 0002/0187: training loss 1.207 Epoch 7 iteration 0003/0187: training loss 1.179 Epoch 7 iteration 0004/0187: training loss 1.188 Epoch 7 iteration 0005/0187: training loss 1.182 Epoch 7 iteration 0006/0187: training loss 1.220 Epoch 7 iteration 0007/0187: training loss 1.224 Epoch 7 iteration 0008/0187: training loss 1.232 Epoch 7 iteration 0009/0187: training loss 1.207 Epoch 7 iteration 0010/0187: training loss 1.210 Epoch 7 iteration 0011/0187: training loss 1.208 Epoch 7 iteration 0012/0187: training loss 1.194 Epoch 7 iteration 0013/0187: training loss 1.219 Epoch 7 iteration 0014/0187: training loss 1.232 Epoch 7 iteration 0015/0187: training loss 1.213 Epoch 7 iteration 0016/0187: training loss 1.210 Epoch 7 iteration 0017/0187: training loss 1.193 Epoch 7 iteration 0018/0187: training loss 1.209 Epoch 7 iteration 0019/0187: training loss 1.200 Epoch 7 iteration 0020/0187: training loss 1.204 Epoch 7 iteration 0021/0187: training loss 1.201 Epoch 7 iteration 0022/0187: training loss 1.200 Epoch 7 iteration 0023/0187: training loss 1.191 Epoch 7 iteration 0024/0187: training loss 1.203 Epoch 7 iteration 0025/0187: training loss 1.196 Epoch 7 iteration 0026/0187: training loss 1.193 Epoch 7 iteration 0027/0187: training loss 1.193 Epoch 7 iteration 0028/0187: training loss 1.191 Epoch 7 iteration 0029/0187: training loss 1.188 Epoch 7 iteration 0030/0187: training loss 1.188 Epoch 7 iteration 0031/0187: training loss 1.188 Epoch 7 iteration 0032/0187: training loss 1.185 Epoch 7 iteration 0033/0187: training loss 1.188 Epoch 7 iteration 0034/0187: training loss 1.186 Epoch 7 iteration 0035/0187: training loss 1.186 Epoch 7 iteration 0036/0187: training loss 1.194 Epoch 7 iteration 0037/0187: training loss 1.194 Epoch 7 iteration 0038/0187: training loss 1.192 Epoch 7 iteration 0039/0187: training loss 1.192 Epoch 7 iteration 0040/0187: training loss 1.192 Epoch 7 iteration 0041/0187: training loss 1.193 Epoch 7 iteration 0042/0187: training loss 1.196 Epoch 7 iteration 0043/0187: training loss 1.196 Epoch 7 iteration 0044/0187: training loss 1.199 Epoch 7 iteration 0045/0187: training loss 1.198 Epoch 7 iteration 0046/0187: training loss 1.206 Epoch 7 iteration 0047/0187: training loss 1.209 Epoch 7 iteration 0048/0187: training loss 1.209 Epoch 7 iteration 0049/0187: training loss 1.216 Epoch 7 iteration 0050/0187: training loss 1.212 Epoch 7 iteration 0051/0187: training loss 1.214 Epoch 7 iteration 0052/0187: training loss 1.214 Epoch 7 iteration 0053/0187: training loss 1.219 Epoch 7 iteration 0054/0187: training loss 1.220 Epoch 7 iteration 0055/0187: training loss 1.222 Epoch 7 iteration 0056/0187: training loss 1.220 Epoch 7 iteration 0057/0187: training loss 1.221 Epoch 7 iteration 0058/0187: training loss 1.225 Epoch 7 iteration 0059/0187: training loss 1.233 Epoch 7 iteration 0060/0187: training loss 1.242 Epoch 7 iteration 0061/0187: training loss 1.243 Epoch 7 iteration 0062/0187: training loss 1.244 Epoch 7 iteration 0063/0187: training loss 1.241 Epoch 7 iteration 0064/0187: training loss 1.240 Epoch 7 iteration 0065/0187: training loss 1.240 Epoch 7 iteration 0066/0187: training loss 1.240 Epoch 7 iteration 0067/0187: training loss 1.237 Epoch 7 iteration 0068/0187: training loss 1.243 Epoch 7 iteration 0069/0187: training loss 1.242 Epoch 7 iteration 0070/0187: training loss 1.242 Epoch 7 iteration 0071/0187: training loss 1.242 Epoch 7 iteration 0072/0187: training loss 1.244 Epoch 7 iteration 0073/0187: training loss 1.243 Epoch 7 iteration 0074/0187: training loss 1.243 Epoch 7 iteration 0075/0187: training loss 1.242 Epoch 7 iteration 0076/0187: training loss 1.245 Epoch 7 iteration 0077/0187: training loss 1.245 Epoch 7 iteration 0078/0187: training loss 1.244 Epoch 7 iteration 0079/0187: training loss 1.241 Epoch 7 iteration 0080/0187: training loss 1.242 Epoch 7 iteration 0081/0187: training loss 1.237 Epoch 7 iteration 0082/0187: training loss 1.236 Epoch 7 iteration 0083/0187: training loss 1.233 Epoch 7 iteration 0084/0187: training loss 1.233 Epoch 7 iteration 0085/0187: training loss 1.235 Epoch 7 iteration 0086/0187: training loss 1.239 Epoch 7 iteration 0087/0187: training loss 1.237 Epoch 7 iteration 0088/0187: training loss 1.234 Epoch 7 iteration 0089/0187: training loss 1.232 Epoch 7 iteration 0090/0187: training loss 1.231 Epoch 7 iteration 0091/0187: training loss 1.230 Epoch 7 iteration 0092/0187: training loss 1.230 Epoch 7 iteration 0093/0187: training loss 1.229 Epoch 7 iteration 0094/0187: training loss 1.232 Epoch 7 iteration 0095/0187: training loss 1.232 Epoch 7 iteration 0096/0187: training loss 1.232 Epoch 7 iteration 0097/0187: training loss 1.233 Epoch 7 iteration 0098/0187: training loss 1.231 Epoch 7 iteration 0099/0187: training loss 1.231 Epoch 7 iteration 0100/0187: training loss 1.230 Epoch 7 iteration 0101/0187: training loss 1.232 Epoch 7 iteration 0102/0187: training loss 1.231 Epoch 7 iteration 0103/0187: training loss 1.234 Epoch 7 iteration 0104/0187: training loss 1.233 Epoch 7 iteration 0105/0187: training loss 1.233 Epoch 7 iteration 0106/0187: training loss 1.232 Epoch 7 iteration 0107/0187: training loss 1.233 Epoch 7 iteration 0108/0187: training loss 1.231 Epoch 7 iteration 0109/0187: training loss 1.230 Epoch 7 iteration 0110/0187: training loss 1.231 Epoch 7 iteration 0111/0187: training loss 1.231 Epoch 7 iteration 0112/0187: training loss 1.230 Epoch 7 iteration 0113/0187: training loss 1.229 Epoch 7 iteration 0114/0187: training loss 1.227 Epoch 7 iteration 0115/0187: training loss 1.227 Epoch 7 iteration 0116/0187: training loss 1.229 Epoch 7 iteration 0117/0187: training loss 1.229 Epoch 7 iteration 0118/0187: training loss 1.230 Epoch 7 iteration 0119/0187: training loss 1.230 Epoch 7 iteration 0120/0187: training loss 1.231 Epoch 7 iteration 0121/0187: training loss 1.229 Epoch 7 iteration 0122/0187: training loss 1.228 Epoch 7 iteration 0123/0187: training loss 1.230 Epoch 7 iteration 0124/0187: training loss 1.230 Epoch 7 iteration 0125/0187: training loss 1.230 Epoch 7 iteration 0126/0187: training loss 1.228 Epoch 7 iteration 0127/0187: training loss 1.231 Epoch 7 iteration 0128/0187: training loss 1.229 Epoch 7 iteration 0129/0187: training loss 1.229 Epoch 7 iteration 0130/0187: training loss 1.229 Epoch 7 iteration 0131/0187: training loss 1.227 Epoch 7 iteration 0132/0187: training loss 1.226 Epoch 7 iteration 0133/0187: training loss 1.224 Epoch 7 iteration 0134/0187: training loss 1.224 Epoch 7 iteration 0135/0187: training loss 1.225 Epoch 7 iteration 0136/0187: training loss 1.224 Epoch 7 iteration 0137/0187: training loss 1.225 Epoch 7 iteration 0138/0187: training loss 1.225 Epoch 7 iteration 0139/0187: training loss 1.225 Epoch 7 iteration 0140/0187: training loss 1.225 Epoch 7 iteration 0141/0187: training loss 1.224 Epoch 7 iteration 0142/0187: training loss 1.225 Epoch 7 iteration 0143/0187: training loss 1.224 Epoch 7 iteration 0144/0187: training loss 1.223 Epoch 7 iteration 0145/0187: training loss 1.221 Epoch 7 iteration 0146/0187: training loss 1.221 Epoch 7 iteration 0147/0187: training loss 1.220 Epoch 7 iteration 0148/0187: training loss 1.219 Epoch 7 iteration 0149/0187: training loss 1.217 Epoch 7 iteration 0150/0187: training loss 1.218 Epoch 7 iteration 0151/0187: training loss 1.217 Epoch 7 iteration 0152/0187: training loss 1.217 Epoch 7 iteration 0153/0187: training loss 1.217 Epoch 7 iteration 0154/0187: training loss 1.216 Epoch 7 iteration 0155/0187: training loss 1.217 Epoch 7 iteration 0156/0187: training loss 1.218 Epoch 7 iteration 0157/0187: training loss 1.218 Epoch 7 iteration 0158/0187: training loss 1.217 Epoch 7 iteration 0159/0187: training loss 1.217 Epoch 7 iteration 0160/0187: training loss 1.217 Epoch 7 iteration 0161/0187: training loss 1.217 Epoch 7 iteration 0162/0187: training loss 1.218 Epoch 7 iteration 0163/0187: training loss 1.217 Epoch 7 iteration 0164/0187: training loss 1.216 Epoch 7 iteration 0165/0187: training loss 1.218 Epoch 7 iteration 0166/0187: training loss 1.217 Epoch 7 iteration 0167/0187: training loss 1.216 Epoch 7 iteration 0168/0187: training loss 1.216 Epoch 7 iteration 0169/0187: training loss 1.214 Epoch 7 iteration 0170/0187: training loss 1.214 Epoch 7 iteration 0171/0187: training loss 1.213 Epoch 7 iteration 0172/0187: training loss 1.213 Epoch 7 iteration 0173/0187: training loss 1.213 Epoch 7 iteration 0174/0187: training loss 1.215 Epoch 7 iteration 0175/0187: training loss 1.218 Epoch 7 iteration 0176/0187: training loss 1.218 Epoch 7 iteration 0177/0187: training loss 1.219 Epoch 7 iteration 0178/0187: training loss 1.220 Epoch 7 iteration 0179/0187: training loss 1.220 Epoch 7 iteration 0180/0187: training loss 1.222 Epoch 7 iteration 0181/0187: training loss 1.221 Epoch 7 iteration 0182/0187: training loss 1.221 Epoch 7 iteration 0183/0187: training loss 1.220 Epoch 7 iteration 0184/0187: training loss 1.220 Epoch 7 iteration 0185/0187: training loss 1.220 Epoch 7 iteration 0186/0187: training loss 1.219 Epoch 7 iteration 0187/0187: training loss 1.221 Epoch 7 validation pixAcc: 0.855, mIoU: 0.319 Epoch 8 iteration 0001/0187: training loss 1.140 Epoch 8 iteration 0002/0187: training loss 1.123 Epoch 8 iteration 0003/0187: training loss 1.095 Epoch 8 iteration 0004/0187: training loss 1.151 Epoch 8 iteration 0005/0187: training loss 1.169 Epoch 8 iteration 0006/0187: training loss 1.163 Epoch 8 iteration 0007/0187: training loss 1.139 Epoch 8 iteration 0008/0187: training loss 1.145 Epoch 8 iteration 0009/0187: training loss 1.132 Epoch 8 iteration 0010/0187: training loss 1.164 Epoch 8 iteration 0011/0187: training loss 1.168 Epoch 8 iteration 0012/0187: training loss 1.170 Epoch 8 iteration 0013/0187: training loss 1.173 Epoch 8 iteration 0014/0187: training loss 1.168 Epoch 8 iteration 0015/0187: training loss 1.168 Epoch 8 iteration 0016/0187: training loss 1.191 Epoch 8 iteration 0017/0187: training loss 1.182 Epoch 8 iteration 0018/0187: training loss 1.192 Epoch 8 iteration 0019/0187: training loss 1.178 Epoch 8 iteration 0020/0187: training loss 1.187 Epoch 8 iteration 0021/0187: training loss 1.194 Epoch 8 iteration 0022/0187: training loss 1.194 Epoch 8 iteration 0023/0187: training loss 1.194 Epoch 8 iteration 0024/0187: training loss 1.198 Epoch 8 iteration 0025/0187: training loss 1.185 Epoch 8 iteration 0026/0187: training loss 1.193 Epoch 8 iteration 0027/0187: training loss 1.199 Epoch 8 iteration 0028/0187: training loss 1.195 Epoch 8 iteration 0029/0187: training loss 1.189 Epoch 8 iteration 0030/0187: training loss 1.188 Epoch 8 iteration 0031/0187: training loss 1.188 Epoch 8 iteration 0032/0187: training loss 1.183 Epoch 8 iteration 0033/0187: training loss 1.178 Epoch 8 iteration 0034/0187: training loss 1.175 Epoch 8 iteration 0035/0187: training loss 1.172 Epoch 8 iteration 0036/0187: training loss 1.170 Epoch 8 iteration 0037/0187: training loss 1.171 Epoch 8 iteration 0038/0187: training loss 1.173 Epoch 8 iteration 0039/0187: training loss 1.167 Epoch 8 iteration 0040/0187: training loss 1.164 Epoch 8 iteration 0041/0187: training loss 1.165 Epoch 8 iteration 0042/0187: training loss 1.161 Epoch 8 iteration 0043/0187: training loss 1.164 Epoch 8 iteration 0044/0187: training loss 1.170 Epoch 8 iteration 0045/0187: training loss 1.164 Epoch 8 iteration 0046/0187: training loss 1.168 Epoch 8 iteration 0047/0187: training loss 1.174 Epoch 8 iteration 0048/0187: training loss 1.182 Epoch 8 iteration 0049/0187: training loss 1.181 Epoch 8 iteration 0050/0187: training loss 1.185 Epoch 8 iteration 0051/0187: training loss 1.181 Epoch 8 iteration 0052/0187: training loss 1.182 Epoch 8 iteration 0053/0187: training loss 1.180 Epoch 8 iteration 0054/0187: training loss 1.176 Epoch 8 iteration 0055/0187: training loss 1.176 Epoch 8 iteration 0056/0187: training loss 1.174 Epoch 8 iteration 0057/0187: training loss 1.176 Epoch 8 iteration 0058/0187: training loss 1.177 Epoch 8 iteration 0059/0187: training loss 1.178 Epoch 8 iteration 0060/0187: training loss 1.179 Epoch 8 iteration 0061/0187: training loss 1.176 Epoch 8 iteration 0062/0187: training loss 1.176 Epoch 8 iteration 0063/0187: training loss 1.179 Epoch 8 iteration 0064/0187: training loss 1.182 Epoch 8 iteration 0065/0187: training loss 1.179 Epoch 8 iteration 0066/0187: training loss 1.182 Epoch 8 iteration 0067/0187: training loss 1.182 Epoch 8 iteration 0068/0187: training loss 1.182 Epoch 8 iteration 0069/0187: training loss 1.181 Epoch 8 iteration 0070/0187: training loss 1.178 Epoch 8 iteration 0071/0187: training loss 1.175 Epoch 8 iteration 0072/0187: training loss 1.172 Epoch 8 iteration 0073/0187: training loss 1.173 Epoch 8 iteration 0074/0187: training loss 1.172 Epoch 8 iteration 0075/0187: training loss 1.169 Epoch 8 iteration 0076/0187: training loss 1.166 Epoch 8 iteration 0077/0187: training loss 1.164 Epoch 8 iteration 0078/0187: training loss 1.164 Epoch 8 iteration 0079/0187: training loss 1.162 Epoch 8 iteration 0080/0187: training loss 1.163 Epoch 8 iteration 0081/0187: training loss 1.160 Epoch 8 iteration 0082/0187: training loss 1.157 Epoch 8 iteration 0083/0187: training loss 1.161 Epoch 8 iteration 0084/0187: training loss 1.166 Epoch 8 iteration 0085/0187: training loss 1.171 Epoch 8 iteration 0086/0187: training loss 1.171 Epoch 8 iteration 0087/0187: training loss 1.170 Epoch 8 iteration 0088/0187: training loss 1.169 Epoch 8 iteration 0089/0187: training loss 1.169 Epoch 8 iteration 0090/0187: training loss 1.170 Epoch 8 iteration 0091/0188: training loss 1.167 Epoch 8 iteration 0092/0188: training loss 1.167 Epoch 8 iteration 0093/0188: training loss 1.172 Epoch 8 iteration 0094/0188: training loss 1.172 Epoch 8 iteration 0095/0188: training loss 1.172 Epoch 8 iteration 0096/0188: training loss 1.172 Epoch 8 iteration 0097/0188: training loss 1.173 Epoch 8 iteration 0098/0188: training loss 1.172 Epoch 8 iteration 0099/0188: training loss 1.170 Epoch 8 iteration 0100/0188: training loss 1.169 Epoch 8 iteration 0101/0188: training loss 1.168 Epoch 8 iteration 0102/0188: training loss 1.167 Epoch 8 iteration 0103/0188: training loss 1.167 Epoch 8 iteration 0104/0188: training loss 1.169 Epoch 8 iteration 0105/0188: training loss 1.168 Epoch 8 iteration 0106/0188: training loss 1.166 Epoch 8 iteration 0107/0188: training loss 1.165 Epoch 8 iteration 0108/0188: training loss 1.165 Epoch 8 iteration 0109/0188: training loss 1.164 Epoch 8 iteration 0110/0188: training loss 1.163 Epoch 8 iteration 0111/0188: training loss 1.162 Epoch 8 iteration 0112/0188: training loss 1.161 Epoch 8 iteration 0113/0188: training loss 1.161 Epoch 8 iteration 0114/0188: training loss 1.163 Epoch 8 iteration 0115/0188: training loss 1.162 Epoch 8 iteration 0116/0188: training loss 1.164 Epoch 8 iteration 0117/0188: training loss 1.165 Epoch 8 iteration 0118/0188: training loss 1.163 Epoch 8 iteration 0119/0188: training loss 1.162 Epoch 8 iteration 0120/0188: training loss 1.162 Epoch 8 iteration 0121/0188: training loss 1.162 Epoch 8 iteration 0122/0188: training loss 1.162 Epoch 8 iteration 0123/0188: training loss 1.162 Epoch 8 iteration 0124/0188: training loss 1.163 Epoch 8 iteration 0125/0188: training loss 1.162 Epoch 8 iteration 0126/0188: training loss 1.161 Epoch 8 iteration 0127/0188: training loss 1.160 Epoch 8 iteration 0128/0188: training loss 1.161 Epoch 8 iteration 0129/0188: training loss 1.160 Epoch 8 iteration 0130/0188: training loss 1.162 Epoch 8 iteration 0131/0188: training loss 1.159 Epoch 8 iteration 0132/0188: training loss 1.159 Epoch 8 iteration 0133/0188: training loss 1.160 Epoch 8 iteration 0134/0188: training loss 1.160 Epoch 8 iteration 0135/0188: training loss 1.159 Epoch 8 iteration 0136/0188: training loss 1.158 Epoch 8 iteration 0137/0188: training loss 1.157 Epoch 8 iteration 0138/0188: training loss 1.156 Epoch 8 iteration 0139/0188: training loss 1.156 Epoch 8 iteration 0140/0188: training loss 1.157 Epoch 8 iteration 0141/0188: training loss 1.157 Epoch 8 iteration 0142/0188: training loss 1.155 Epoch 8 iteration 0143/0188: training loss 1.156 Epoch 8 iteration 0144/0188: training loss 1.156 Epoch 8 iteration 0145/0188: training loss 1.160 Epoch 8 iteration 0146/0188: training loss 1.161 Epoch 8 iteration 0147/0188: training loss 1.164 Epoch 8 iteration 0148/0188: training loss 1.165 Epoch 8 iteration 0149/0188: training loss 1.165 Epoch 8 iteration 0150/0188: training loss 1.163 Epoch 8 iteration 0151/0188: training loss 1.165 Epoch 8 iteration 0152/0188: training loss 1.164 Epoch 8 iteration 0153/0188: training loss 1.165 Epoch 8 iteration 0154/0188: training loss 1.164 Epoch 8 iteration 0155/0188: training loss 1.164 Epoch 8 iteration 0156/0188: training loss 1.163 Epoch 8 iteration 0157/0188: training loss 1.163 Epoch 8 iteration 0158/0188: training loss 1.162 Epoch 8 iteration 0159/0188: training loss 1.162 Epoch 8 iteration 0160/0188: training loss 1.162 Epoch 8 iteration 0161/0188: training loss 1.162 Epoch 8 iteration 0162/0188: training loss 1.160 Epoch 8 iteration 0163/0188: training loss 1.160 Epoch 8 iteration 0164/0188: training loss 1.161 Epoch 8 iteration 0165/0188: training loss 1.160 Epoch 8 iteration 0166/0188: training loss 1.161 Epoch 8 iteration 0167/0188: training loss 1.160 Epoch 8 iteration 0168/0188: training loss 1.160 Epoch 8 iteration 0169/0188: training loss 1.161 Epoch 8 iteration 0170/0188: training loss 1.161 Epoch 8 iteration 0171/0188: training loss 1.162 Epoch 8 iteration 0172/0188: training loss 1.160 Epoch 8 iteration 0173/0188: training loss 1.162 Epoch 8 iteration 0174/0188: training loss 1.164 Epoch 8 iteration 0175/0188: training loss 1.163 Epoch 8 iteration 0176/0188: training loss 1.164 Epoch 8 iteration 0177/0188: training loss 1.163 Epoch 8 iteration 0178/0188: training loss 1.162 Epoch 8 iteration 0179/0188: training loss 1.161 Epoch 8 iteration 0180/0188: training loss 1.162 Epoch 8 iteration 0181/0188: training loss 1.162 Epoch 8 iteration 0182/0188: training loss 1.162 Epoch 8 iteration 0183/0188: training loss 1.161 Epoch 8 iteration 0184/0188: training loss 1.161 Epoch 8 iteration 0185/0188: training loss 1.160 Epoch 8 iteration 0186/0188: training loss 1.160 Epoch 8 validation pixAcc: 0.854, mIoU: 0.329 Epoch 9 iteration 0001/0187: training loss 1.192 Epoch 9 iteration 0002/0187: training loss 1.207 Epoch 9 iteration 0003/0187: training loss 1.139 Epoch 9 iteration 0004/0187: training loss 1.112 Epoch 9 iteration 0005/0187: training loss 1.114 Epoch 9 iteration 0006/0187: training loss 1.117 Epoch 9 iteration 0007/0187: training loss 1.089 Epoch 9 iteration 0008/0187: training loss 1.085 Epoch 9 iteration 0009/0187: training loss 1.071 Epoch 9 iteration 0010/0187: training loss 1.084 Epoch 9 iteration 0011/0187: training loss 1.144 Epoch 9 iteration 0012/0187: training loss 1.156 Epoch 9 iteration 0013/0187: training loss 1.158 Epoch 9 iteration 0014/0187: training loss 1.158 Epoch 9 iteration 0015/0187: training loss 1.163 Epoch 9 iteration 0016/0187: training loss 1.149 Epoch 9 iteration 0017/0187: training loss 1.167 Epoch 9 iteration 0018/0187: training loss 1.163 Epoch 9 iteration 0019/0187: training loss 1.157 Epoch 9 iteration 0020/0187: training loss 1.153 Epoch 9 iteration 0021/0187: training loss 1.148 Epoch 9 iteration 0022/0187: training loss 1.151 Epoch 9 iteration 0023/0187: training loss 1.135 Epoch 9 iteration 0024/0187: training loss 1.124 Epoch 9 iteration 0025/0187: training loss 1.122 Epoch 9 iteration 0026/0187: training loss 1.124 Epoch 9 iteration 0027/0187: training loss 1.130 Epoch 9 iteration 0028/0187: training loss 1.126 Epoch 9 iteration 0029/0187: training loss 1.127 Epoch 9 iteration 0030/0187: training loss 1.126 Epoch 9 iteration 0031/0187: training loss 1.125 Epoch 9 iteration 0032/0187: training loss 1.122 Epoch 9 iteration 0033/0187: training loss 1.131 Epoch 9 iteration 0034/0187: training loss 1.128 Epoch 9 iteration 0035/0187: training loss 1.123 Epoch 9 iteration 0036/0187: training loss 1.119 Epoch 9 iteration 0037/0187: training loss 1.118 Epoch 9 iteration 0038/0187: training loss 1.125 Epoch 9 iteration 0039/0187: training loss 1.126 Epoch 9 iteration 0040/0187: training loss 1.118 Epoch 9 iteration 0041/0187: training loss 1.115 Epoch 9 iteration 0042/0187: training loss 1.111 Epoch 9 iteration 0043/0187: training loss 1.123 Epoch 9 iteration 0044/0187: training loss 1.125 Epoch 9 iteration 0045/0187: training loss 1.126 Epoch 9 iteration 0046/0187: training loss 1.125 Epoch 9 iteration 0047/0187: training loss 1.123 Epoch 9 iteration 0048/0187: training loss 1.118 Epoch 9 iteration 0049/0187: training loss 1.121 Epoch 9 iteration 0050/0187: training loss 1.118 Epoch 9 iteration 0051/0187: training loss 1.118 Epoch 9 iteration 0052/0187: training loss 1.121 Epoch 9 iteration 0053/0187: training loss 1.115 Epoch 9 iteration 0054/0187: training loss 1.118 Epoch 9 iteration 0055/0187: training loss 1.120 Epoch 9 iteration 0056/0187: training loss 1.118 Epoch 9 iteration 0057/0187: training loss 1.124 Epoch 9 iteration 0058/0187: training loss 1.120 Epoch 9 iteration 0059/0187: training loss 1.122 Epoch 9 iteration 0060/0187: training loss 1.120 Epoch 9 iteration 0061/0187: training loss 1.118 Epoch 9 iteration 0062/0187: training loss 1.123 Epoch 9 iteration 0063/0187: training loss 1.124 Epoch 9 iteration 0064/0187: training loss 1.126 Epoch 9 iteration 0065/0187: training loss 1.125 Epoch 9 iteration 0066/0187: training loss 1.129 Epoch 9 iteration 0067/0187: training loss 1.129 Epoch 9 iteration 0068/0187: training loss 1.129 Epoch 9 iteration 0069/0187: training loss 1.126 Epoch 9 iteration 0070/0187: training loss 1.124 Epoch 9 iteration 0071/0187: training loss 1.123 Epoch 9 iteration 0072/0187: training loss 1.124 Epoch 9 iteration 0073/0187: training loss 1.128 Epoch 9 iteration 0074/0187: training loss 1.127 Epoch 9 iteration 0075/0187: training loss 1.126 Epoch 9 iteration 0076/0187: training loss 1.123 Epoch 9 iteration 0077/0187: training loss 1.124 Epoch 9 iteration 0078/0187: training loss 1.126 Epoch 9 iteration 0079/0187: training loss 1.122 Epoch 9 iteration 0080/0187: training loss 1.121 Epoch 9 iteration 0081/0187: training loss 1.120 Epoch 9 iteration 0082/0187: training loss 1.123 Epoch 9 iteration 0083/0187: training loss 1.124 Epoch 9 iteration 0084/0187: training loss 1.123 Epoch 9 iteration 0085/0187: training loss 1.121 Epoch 9 iteration 0086/0187: training loss 1.123 Epoch 9 iteration 0087/0187: training loss 1.124 Epoch 9 iteration 0088/0187: training loss 1.123 Epoch 9 iteration 0089/0187: training loss 1.122 Epoch 9 iteration 0090/0187: training loss 1.122 Epoch 9 iteration 0091/0187: training loss 1.129 Epoch 9 iteration 0092/0187: training loss 1.129 Epoch 9 iteration 0093/0187: training loss 1.127 Epoch 9 iteration 0094/0187: training loss 1.133 Epoch 9 iteration 0095/0187: training loss 1.134 Epoch 9 iteration 0096/0187: training loss 1.133 Epoch 9 iteration 0097/0187: training loss 1.131 Epoch 9 iteration 0098/0187: training loss 1.133 Epoch 9 iteration 0099/0187: training loss 1.133 Epoch 9 iteration 0100/0187: training loss 1.134 Epoch 9 iteration 0101/0187: training loss 1.136 Epoch 9 iteration 0102/0187: training loss 1.134 Epoch 9 iteration 0103/0187: training loss 1.133 Epoch 9 iteration 0104/0187: training loss 1.130 Epoch 9 iteration 0105/0187: training loss 1.135 Epoch 9 iteration 0106/0187: training loss 1.136 Epoch 9 iteration 0107/0187: training loss 1.138 Epoch 9 iteration 0108/0187: training loss 1.138 Epoch 9 iteration 0109/0187: training loss 1.137 Epoch 9 iteration 0110/0187: training loss 1.135 Epoch 9 iteration 0111/0187: training loss 1.136 Epoch 9 iteration 0112/0187: training loss 1.134 Epoch 9 iteration 0113/0187: training loss 1.137 Epoch 9 iteration 0114/0187: training loss 1.136 Epoch 9 iteration 0115/0187: training loss 1.136 Epoch 9 iteration 0116/0187: training loss 1.138 Epoch 9 iteration 0117/0187: training loss 1.137 Epoch 9 iteration 0118/0187: training loss 1.136 Epoch 9 iteration 0119/0187: training loss 1.135 Epoch 9 iteration 0120/0187: training loss 1.135 Epoch 9 iteration 0121/0187: training loss 1.133 Epoch 9 iteration 0122/0187: training loss 1.134 Epoch 9 iteration 0123/0187: training loss 1.134 Epoch 9 iteration 0124/0187: training loss 1.134 Epoch 9 iteration 0125/0187: training loss 1.133 Epoch 9 iteration 0126/0187: training loss 1.137 Epoch 9 iteration 0127/0187: training loss 1.135 Epoch 9 iteration 0128/0187: training loss 1.134 Epoch 9 iteration 0129/0187: training loss 1.135 Epoch 9 iteration 0130/0187: training loss 1.135 Epoch 9 iteration 0131/0187: training loss 1.133 Epoch 9 iteration 0132/0187: training loss 1.133 Epoch 9 iteration 0133/0187: training loss 1.132 Epoch 9 iteration 0134/0187: training loss 1.132 Epoch 9 iteration 0135/0187: training loss 1.131 Epoch 9 iteration 0136/0187: training loss 1.130 Epoch 9 iteration 0137/0187: training loss 1.132 Epoch 9 iteration 0138/0187: training loss 1.134 Epoch 9 iteration 0139/0187: training loss 1.135 Epoch 9 iteration 0140/0187: training loss 1.136 Epoch 9 iteration 0141/0187: training loss 1.135 Epoch 9 iteration 0142/0187: training loss 1.133 Epoch 9 iteration 0143/0187: training loss 1.132 Epoch 9 iteration 0144/0187: training loss 1.131 Epoch 9 iteration 0145/0187: training loss 1.132 Epoch 9 iteration 0146/0187: training loss 1.132 Epoch 9 iteration 0147/0187: training loss 1.130 Epoch 9 iteration 0148/0187: training loss 1.131 Epoch 9 iteration 0149/0187: training loss 1.131 Epoch 9 iteration 0150/0187: training loss 1.130 Epoch 9 iteration 0151/0187: training loss 1.129 Epoch 9 iteration 0152/0187: training loss 1.128 Epoch 9 iteration 0153/0187: training loss 1.126 Epoch 9 iteration 0154/0187: training loss 1.127 Epoch 9 iteration 0155/0187: training loss 1.128 Epoch 9 iteration 0156/0187: training loss 1.127 Epoch 9 iteration 0157/0187: training loss 1.127 Epoch 9 iteration 0158/0187: training loss 1.127 Epoch 9 iteration 0159/0187: training loss 1.125 Epoch 9 iteration 0160/0187: training loss 1.125 Epoch 9 iteration 0161/0187: training loss 1.125 Epoch 9 iteration 0162/0187: training loss 1.126 Epoch 9 iteration 0163/0187: training loss 1.127 Epoch 9 iteration 0164/0187: training loss 1.128 Epoch 9 iteration 0165/0187: training loss 1.128 Epoch 9 iteration 0166/0187: training loss 1.128 Epoch 9 iteration 0167/0187: training loss 1.127 Epoch 9 iteration 0168/0187: training loss 1.127 Epoch 9 iteration 0169/0187: training loss 1.126 Epoch 9 iteration 0170/0187: training loss 1.125 Epoch 9 iteration 0171/0187: training loss 1.126 Epoch 9 iteration 0172/0187: training loss 1.125 Epoch 9 iteration 0173/0187: training loss 1.126 Epoch 9 iteration 0174/0187: training loss 1.125 Epoch 9 iteration 0175/0187: training loss 1.126 Epoch 9 iteration 0176/0187: training loss 1.127 Epoch 9 iteration 0177/0187: training loss 1.127 Epoch 9 iteration 0178/0187: training loss 1.128 Epoch 9 iteration 0179/0187: training loss 1.129 Epoch 9 iteration 0180/0187: training loss 1.130 Epoch 9 iteration 0181/0187: training loss 1.131 Epoch 9 iteration 0182/0187: training loss 1.131 Epoch 9 iteration 0183/0187: training loss 1.131 Epoch 9 iteration 0184/0187: training loss 1.131 Epoch 9 iteration 0185/0187: training loss 1.131 Epoch 9 iteration 0186/0187: training loss 1.132 Epoch 9 iteration 0187/0187: training loss 1.133 Epoch 9 validation pixAcc: 0.857, mIoU: 0.333 Epoch 10 iteration 0001/0187: training loss 1.005 Epoch 10 iteration 0002/0187: training loss 1.048 Epoch 10 iteration 0003/0187: training loss 1.055 Epoch 10 iteration 0004/0187: training loss 1.000 Epoch 10 iteration 0005/0187: training loss 1.005 Epoch 10 iteration 0006/0187: training loss 1.049 Epoch 10 iteration 0007/0187: training loss 1.071 Epoch 10 iteration 0008/0187: training loss 1.100 Epoch 10 iteration 0009/0187: training loss 1.078 Epoch 10 iteration 0010/0187: training loss 1.088 Epoch 10 iteration 0011/0187: training loss 1.083 Epoch 10 iteration 0012/0187: training loss 1.081 Epoch 10 iteration 0013/0187: training loss 1.078 Epoch 10 iteration 0014/0187: training loss 1.063 Epoch 10 iteration 0015/0187: training loss 1.069 Epoch 10 iteration 0016/0187: training loss 1.076 Epoch 10 iteration 0017/0187: training loss 1.085 Epoch 10 iteration 0018/0187: training loss 1.086 Epoch 10 iteration 0019/0187: training loss 1.085 Epoch 10 iteration 0020/0187: training loss 1.098 Epoch 10 iteration 0021/0187: training loss 1.104 Epoch 10 iteration 0022/0187: training loss 1.103 Epoch 10 iteration 0023/0187: training loss 1.097 Epoch 10 iteration 0024/0187: training loss 1.101 Epoch 10 iteration 0025/0187: training loss 1.093 Epoch 10 iteration 0026/0187: training loss 1.102 Epoch 10 iteration 0027/0187: training loss 1.095 Epoch 10 iteration 0028/0187: training loss 1.096 Epoch 10 iteration 0029/0187: training loss 1.093 Epoch 10 iteration 0030/0187: training loss 1.085 Epoch 10 iteration 0031/0187: training loss 1.081 Epoch 10 iteration 0032/0187: training loss 1.074 Epoch 10 iteration 0033/0187: training loss 1.076 Epoch 10 iteration 0034/0187: training loss 1.075 Epoch 10 iteration 0035/0187: training loss 1.077 Epoch 10 iteration 0036/0187: training loss 1.079 Epoch 10 iteration 0037/0187: training loss 1.076 Epoch 10 iteration 0038/0187: training loss 1.071 Epoch 10 iteration 0039/0187: training loss 1.082 Epoch 10 iteration 0040/0187: training loss 1.078 Epoch 10 iteration 0041/0187: training loss 1.074 Epoch 10 iteration 0042/0187: training loss 1.070 Epoch 10 iteration 0043/0187: training loss 1.066 Epoch 10 iteration 0044/0187: training loss 1.067 Epoch 10 iteration 0045/0187: training loss 1.062 Epoch 10 iteration 0046/0187: training loss 1.066 Epoch 10 iteration 0047/0187: training loss 1.064 Epoch 10 iteration 0048/0187: training loss 1.070 Epoch 10 iteration 0049/0187: training loss 1.069 Epoch 10 iteration 0050/0187: training loss 1.069 Epoch 10 iteration 0051/0187: training loss 1.070 Epoch 10 iteration 0052/0187: training loss 1.067 Epoch 10 iteration 0053/0187: training loss 1.063 Epoch 10 iteration 0054/0187: training loss 1.061 Epoch 10 iteration 0055/0187: training loss 1.063 Epoch 10 iteration 0056/0187: training loss 1.063 Epoch 10 iteration 0057/0187: training loss 1.063 Epoch 10 iteration 0058/0187: training loss 1.070 Epoch 10 iteration 0059/0187: training loss 1.072 Epoch 10 iteration 0060/0187: training loss 1.073 Epoch 10 iteration 0061/0187: training loss 1.071 Epoch 10 iteration 0062/0187: training loss 1.072 Epoch 10 iteration 0063/0187: training loss 1.070 Epoch 10 iteration 0064/0187: training loss 1.068 Epoch 10 iteration 0065/0187: training loss 1.065 Epoch 10 iteration 0066/0187: training loss 1.062 Epoch 10 iteration 0067/0187: training loss 1.061 Epoch 10 iteration 0068/0187: training loss 1.065 Epoch 10 iteration 0069/0187: training loss 1.066 Epoch 10 iteration 0070/0187: training loss 1.065 Epoch 10 iteration 0071/0187: training loss 1.065 Epoch 10 iteration 0072/0187: training loss 1.066 Epoch 10 iteration 0073/0187: training loss 1.064 Epoch 10 iteration 0074/0187: training loss 1.063 Epoch 10 iteration 0075/0187: training loss 1.062 Epoch 10 iteration 0076/0187: training loss 1.066 Epoch 10 iteration 0077/0187: training loss 1.065 Epoch 10 iteration 0078/0187: training loss 1.062 Epoch 10 iteration 0079/0187: training loss 1.063 Epoch 10 iteration 0080/0187: training loss 1.064 Epoch 10 iteration 0081/0187: training loss 1.063 Epoch 10 iteration 0082/0187: training loss 1.064 Epoch 10 iteration 0083/0187: training loss 1.063 Epoch 10 iteration 0084/0187: training loss 1.061 Epoch 10 iteration 0085/0187: training loss 1.061 Epoch 10 iteration 0086/0187: training loss 1.060 Epoch 10 iteration 0087/0187: training loss 1.059 Epoch 10 iteration 0088/0187: training loss 1.060 Epoch 10 iteration 0089/0187: training loss 1.062 Epoch 10 iteration 0090/0187: training loss 1.063 Epoch 10 iteration 0091/0188: training loss 1.065 Epoch 10 iteration 0092/0188: training loss 1.066 Epoch 10 iteration 0093/0188: training loss 1.068 Epoch 10 iteration 0094/0188: training loss 1.066 Epoch 10 iteration 0095/0188: training loss 1.065 Epoch 10 iteration 0096/0188: training loss 1.064 Epoch 10 iteration 0097/0188: training loss 1.067 Epoch 10 iteration 0098/0188: training loss 1.064 Epoch 10 iteration 0099/0188: training loss 1.061 Epoch 10 iteration 0100/0188: training loss 1.061 Epoch 10 iteration 0101/0188: training loss 1.061 Epoch 10 iteration 0102/0188: training loss 1.062 Epoch 10 iteration 0103/0188: training loss 1.061 Epoch 10 iteration 0104/0188: training loss 1.063 Epoch 10 iteration 0105/0188: training loss 1.060 Epoch 10 iteration 0106/0188: training loss 1.060 Epoch 10 iteration 0107/0188: training loss 1.060 Epoch 10 iteration 0108/0188: training loss 1.060 Epoch 10 iteration 0109/0188: training loss 1.058 Epoch 10 iteration 0110/0188: training loss 1.058 Epoch 10 iteration 0111/0188: training loss 1.059 Epoch 10 iteration 0112/0188: training loss 1.059 Epoch 10 iteration 0113/0188: training loss 1.059 Epoch 10 iteration 0114/0188: training loss nan Epoch 10 iteration 0115/0188: training loss nan Epoch 10 iteration 0116/0188: training loss nan Epoch 10 iteration 0117/0188: training loss nan Epoch 10 iteration 0118/0188: training loss nan Epoch 10 iteration 0119/0188: training loss nan Epoch 10 iteration 0120/0188: training loss nan Epoch 10 iteration 0121/0188: training loss nan Epoch 10 iteration 0122/0188: training loss nan Epoch 10 iteration 0123/0188: training loss nan Epoch 10 iteration 0124/0188: training loss nan Epoch 10 iteration 0125/0188: training loss nan Epoch 10 iteration 0126/0188: training loss nan Epoch 10 iteration 0127/0188: training loss nan Epoch 10 iteration 0128/0188: training loss nan Epoch 10 iteration 0129/0188: training loss nan Epoch 10 iteration 0130/0188: training loss nan Epoch 10 iteration 0131/0188: training loss nan Epoch 10 iteration 0132/0188: training loss nan Epoch 10 iteration 0133/0188: training loss nan Epoch 10 iteration 0134/0188: training loss nan Epoch 10 iteration 0135/0188: training loss nan Epoch 10 iteration 0136/0188: training loss nan Epoch 10 iteration 0137/0188: training loss nan Epoch 10 iteration 0138/0188: training loss nan Epoch 10 iteration 0139/0188: training loss nan Epoch 10 iteration 0140/0188: training loss nan Epoch 10 iteration 0141/0188: training loss nan Epoch 10 iteration 0142/0188: training loss nan Epoch 10 iteration 0143/0188: training loss nan Epoch 10 iteration 0144/0188: training loss nan Epoch 10 iteration 0145/0188: training loss nan Epoch 10 iteration 0146/0188: training loss nan Epoch 10 iteration 0147/0188: training loss nan Epoch 10 iteration 0148/0188: training loss nan Epoch 10 iteration 0149/0188: training loss nan Epoch 10 iteration 0150/0188: training loss nan Epoch 10 iteration 0151/0188: training loss nan Epoch 10 iteration 0152/0188: training loss nan Epoch 10 iteration 0153/0188: training loss nan Epoch 10 iteration 0154/0188: training loss nan Epoch 10 iteration 0155/0188: training loss nan Epoch 10 iteration 0156/0188: training loss nan Epoch 10 iteration 0157/0188: training loss nan Epoch 10 iteration 0158/0188: training loss nan Epoch 10 iteration 0159/0188: training loss nan Epoch 10 iteration 0160/0188: training loss nan Epoch 10 iteration 0161/0188: training loss nan Epoch 10 iteration 0162/0188: training loss nan Epoch 10 iteration 0163/0188: training loss nan Epoch 10 iteration 0164/0188: training loss nan Epoch 10 iteration 0165/0188: training loss nan Epoch 10 iteration 0166/0188: training loss nan Epoch 10 iteration 0167/0188: training loss nan Epoch 10 iteration 0168/0188: training loss nan Epoch 10 iteration 0169/0188: training loss nan Epoch 10 iteration 0170/0188: training loss nan Epoch 10 iteration 0171/0188: training loss nan Epoch 10 iteration 0172/0188: training loss nan Epoch 10 iteration 0173/0188: training loss nan Epoch 10 iteration 0174/0188: training loss nan Epoch 10 iteration 0175/0188: training loss nan Epoch 10 iteration 0176/0188: training loss nan Epoch 10 iteration 0177/0188: training loss nan Epoch 10 iteration 0178/0188: training loss nan Epoch 10 iteration 0179/0188: training loss nan Epoch 10 iteration 0180/0188: training loss nan Epoch 10 iteration 0181/0188: training loss nan Epoch 10 iteration 0182/0188: training loss nan Epoch 10 iteration 0183/0188: training loss nan Epoch 10 iteration 0184/0188: training loss nan Epoch 10 iteration 0185/0188: training loss nan Epoch 10 iteration 0186/0188: training loss nan Epoch 10 validation pixAcc: 0.861, mIoU: 0.346 Epoch 11 iteration 0001/0187: training loss nan Epoch 11 iteration 0002/0187: training loss nan Epoch 11 iteration 0003/0187: training loss nan Epoch 11 iteration 0004/0187: training loss nan Epoch 11 iteration 0005/0187: training loss nan Epoch 11 iteration 0006/0187: training loss nan Epoch 11 iteration 0007/0187: training loss nan Epoch 11 iteration 0008/0187: training loss nan Epoch 11 iteration 0009/0187: training loss nan Epoch 11 iteration 0010/0187: training loss nan Epoch 11 iteration 0011/0187: training loss nan Epoch 11 iteration 0012/0187: training loss nan Epoch 11 iteration 0013/0187: training loss nan Epoch 11 iteration 0014/0187: training loss nan Epoch 11 iteration 0015/0187: training loss nan Epoch 11 iteration 0016/0187: training loss nan Epoch 11 iteration 0017/0187: training loss nan Epoch 11 iteration 0018/0187: training loss nan Epoch 11 iteration 0019/0187: training loss nan Epoch 11 iteration 0020/0187: training loss nan Epoch 11 iteration 0021/0187: training loss nan Epoch 11 iteration 0022/0187: training loss nan Epoch 11 iteration 0023/0187: training loss nan Epoch 11 iteration 0024/0187: training loss nan Epoch 11 iteration 0025/0187: training loss nan Epoch 11 iteration 0026/0187: training loss nan Epoch 11 iteration 0027/0187: training loss nan Epoch 11 iteration 0028/0187: training loss nan Epoch 11 iteration 0029/0187: training loss nan Epoch 11 iteration 0030/0187: training loss nan Epoch 11 iteration 0031/0187: training loss nan Epoch 11 iteration 0032/0187: training loss nan Epoch 11 iteration 0033/0187: training loss nan Epoch 11 iteration 0034/0187: training loss nan Epoch 11 iteration 0035/0187: training loss nan Epoch 11 iteration 0036/0187: training loss nan Epoch 11 iteration 0037/0187: training loss nan Epoch 11 iteration 0038/0187: training loss nan Epoch 11 iteration 0039/0187: training loss nan Epoch 11 iteration 0040/0187: training loss nan Epoch 11 iteration 0041/0187: training loss nan Epoch 11 iteration 0042/0187: training loss nan Epoch 11 iteration 0043/0187: training loss nan Epoch 11 iteration 0044/0187: training loss nan Epoch 11 iteration 0045/0187: training loss nan Epoch 11 iteration 0046/0187: training loss nan Epoch 11 iteration 0047/0187: training loss nan Epoch 11 iteration 0048/0187: training loss nan Epoch 11 iteration 0049/0187: training loss nan Epoch 11 iteration 0050/0187: training loss nan Epoch 11 iteration 0051/0187: training loss nan Epoch 11 iteration 0052/0187: training loss nan Epoch 11 iteration 0053/0187: training loss nan Epoch 11 iteration 0054/0187: training loss nan Epoch 11 iteration 0055/0187: training loss nan Epoch 11 iteration 0056/0187: training loss nan Epoch 11 iteration 0057/0187: training loss nan Epoch 11 iteration 0058/0187: training loss nan Epoch 11 iteration 0059/0187: training loss nan Epoch 11 iteration 0060/0187: training loss nan Epoch 11 iteration 0061/0187: training loss nan Epoch 11 iteration 0062/0187: training loss nan Epoch 11 iteration 0063/0187: training loss nan Epoch 11 iteration 0064/0187: training loss nan Epoch 11 iteration 0065/0187: training loss nan Epoch 11 iteration 0066/0187: training loss nan Epoch 11 iteration 0067/0187: training loss nan Epoch 11 iteration 0068/0187: training loss nan Epoch 11 iteration 0069/0187: training loss nan Epoch 11 iteration 0070/0187: training loss nan Epoch 11 iteration 0071/0187: training loss nan Epoch 11 iteration 0072/0187: training loss nan Epoch 11 iteration 0073/0187: training loss nan Epoch 11 iteration 0074/0187: training loss nan Epoch 11 iteration 0075/0187: training loss nan Epoch 11 iteration 0076/0187: training loss nan Epoch 11 iteration 0077/0187: training loss nan Epoch 11 iteration 0078/0187: training loss nan Epoch 11 iteration 0079/0187: training loss nan Epoch 11 iteration 0080/0187: training loss nan Epoch 11 iteration 0081/0187: training loss nan Epoch 11 iteration 0082/0187: training loss nan Epoch 11 iteration 0083/0187: training loss nan Epoch 11 iteration 0084/0187: training loss nan Epoch 11 iteration 0085/0187: training loss nan Epoch 11 iteration 0086/0187: training loss nan Epoch 11 iteration 0087/0187: training loss nan Epoch 11 iteration 0088/0187: training loss nan Epoch 11 iteration 0089/0187: training loss nan Epoch 11 iteration 0090/0187: training loss nan Epoch 11 iteration 0091/0187: training loss nan Epoch 11 iteration 0092/0187: training loss nan Epoch 11 iteration 0093/0187: training loss nan Epoch 11 iteration 0094/0187: training loss nan Epoch 11 iteration 0095/0187: training loss nan Epoch 11 iteration 0096/0187: training loss nan Epoch 11 iteration 0097/0187: training loss nan Epoch 11 iteration 0098/0187: training loss nan Epoch 11 iteration 0099/0187: training loss nan Epoch 11 iteration 0100/0187: training loss nan Epoch 11 iteration 0101/0187: training loss nan Epoch 11 iteration 0102/0187: training loss nan Epoch 11 iteration 0103/0187: training loss nan Epoch 11 iteration 0104/0187: training loss nan Epoch 11 iteration 0105/0187: training loss nan Epoch 11 iteration 0106/0187: training loss nan Epoch 11 iteration 0107/0187: training loss nan Epoch 11 iteration 0108/0187: training loss nan Epoch 11 iteration 0109/0187: training loss nan Epoch 11 iteration 0110/0187: training loss nan Epoch 11 iteration 0111/0187: training loss nan Epoch 11 iteration 0112/0187: training loss nan Epoch 11 iteration 0113/0187: training loss nan Epoch 11 iteration 0114/0187: training loss nan Epoch 11 iteration 0115/0187: training loss nan Epoch 11 iteration 0116/0187: training loss nan Epoch 11 iteration 0117/0187: training loss nan Epoch 11 iteration 0118/0187: training loss nan Epoch 11 iteration 0119/0187: training loss nan Epoch 11 iteration 0120/0187: training loss nan Epoch 11 iteration 0121/0187: training loss nan Epoch 11 iteration 0122/0187: training loss nan Epoch 11 iteration 0123/0187: training loss nan Epoch 11 iteration 0124/0187: training loss nan Epoch 11 iteration 0125/0187: training loss nan Epoch 11 iteration 0126/0187: training loss nan Epoch 11 iteration 0127/0187: training loss nan Epoch 11 iteration 0128/0187: training loss nan Epoch 11 iteration 0129/0187: training loss nan Epoch 11 iteration 0130/0187: training loss nan Epoch 11 iteration 0131/0187: training loss nan Epoch 11 iteration 0132/0187: training loss nan Epoch 11 iteration 0133/0187: training loss nan Epoch 11 iteration 0134/0187: training loss nan Epoch 11 iteration 0135/0187: training loss nan Epoch 11 iteration 0136/0187: training loss nan Epoch 11 iteration 0137/0187: training loss nan Epoch 11 iteration 0138/0187: training loss nan Epoch 11 iteration 0139/0187: training loss nan Epoch 11 iteration 0140/0187: training loss nan Epoch 11 iteration 0141/0187: training loss nan Epoch 11 iteration 0142/0187: training loss nan Epoch 11 iteration 0143/0187: training loss nan Epoch 11 iteration 0144/0187: training loss nan Epoch 11 iteration 0145/0187: training loss nan Epoch 11 iteration 0146/0187: training loss nan Epoch 11 iteration 0147/0187: training loss nan Epoch 11 iteration 0148/0187: training loss nan Epoch 11 iteration 0149/0187: training loss nan Epoch 11 iteration 0150/0187: training loss nan Epoch 11 iteration 0151/0187: training loss nan Epoch 11 iteration 0152/0187: training loss nan Epoch 11 iteration 0153/0187: training loss nan Epoch 11 iteration 0154/0187: training loss nan Epoch 11 iteration 0155/0187: training loss nan Epoch 11 iteration 0156/0187: training loss nan Epoch 11 iteration 0157/0187: training loss nan Epoch 11 iteration 0158/0187: training loss nan Epoch 11 iteration 0159/0187: training loss nan Epoch 11 iteration 0160/0187: training loss nan Epoch 11 iteration 0161/0187: training loss nan Epoch 11 iteration 0162/0187: training loss nan Epoch 11 iteration 0163/0187: training loss nan Epoch 11 iteration 0164/0187: training loss nan Epoch 11 iteration 0165/0187: training loss nan Epoch 11 iteration 0166/0187: training loss nan Epoch 11 iteration 0167/0187: training loss nan Epoch 11 iteration 0168/0187: training loss nan Epoch 11 iteration 0169/0187: training loss nan Epoch 11 iteration 0170/0187: training loss nan Epoch 11 iteration 0171/0187: training loss nan Epoch 11 iteration 0172/0187: training loss nan Epoch 11 iteration 0173/0187: training loss nan Epoch 11 iteration 0174/0187: training loss nan Epoch 11 iteration 0175/0187: training loss nan Epoch 11 iteration 0176/0187: training loss nan Epoch 11 iteration 0177/0187: training loss nan Epoch 11 iteration 0178/0187: training loss nan Epoch 11 iteration 0179/0187: training loss nan Epoch 11 iteration 0180/0187: training loss nan Epoch 11 iteration 0181/0187: training loss nan Epoch 11 iteration 0182/0187: training loss nan Epoch 11 iteration 0183/0187: training loss nan Epoch 11 iteration 0184/0187: training loss nan Epoch 11 iteration 0185/0187: training loss nan Epoch 11 iteration 0186/0187: training loss nan Epoch 11 iteration 0187/0187: training loss nan Epoch 11 validation pixAcc: 0.861, mIoU: 0.352 Epoch 12 iteration 0001/0187: training loss 0.960 Epoch 12 iteration 0002/0187: training loss 0.938 Epoch 12 iteration 0003/0187: training loss 0.964 Epoch 12 iteration 0004/0187: training loss 1.001 Epoch 12 iteration 0005/0187: training loss 0.997 Epoch 12 iteration 0006/0187: training loss 0.995 Epoch 12 iteration 0007/0187: training loss 0.989 Epoch 12 iteration 0008/0187: training loss 0.992 Epoch 12 iteration 0009/0187: training loss 0.982 Epoch 12 iteration 0010/0187: training loss 0.980 Epoch 12 iteration 0011/0187: training loss 0.966 Epoch 12 iteration 0012/0187: training loss 0.970 Epoch 12 iteration 0013/0187: training loss 0.971 Epoch 12 iteration 0014/0187: training loss 0.978 Epoch 12 iteration 0015/0187: training loss 0.983 Epoch 12 iteration 0016/0187: training loss 1.001 Epoch 12 iteration 0017/0187: training loss 0.992 Epoch 12 iteration 0018/0187: training loss 0.996 Epoch 12 iteration 0019/0187: training loss 0.988 Epoch 12 iteration 0020/0187: training loss 0.989 Epoch 12 iteration 0021/0187: training loss 1.011 Epoch 12 iteration 0022/0187: training loss 1.002 Epoch 12 iteration 0023/0187: training loss 0.994 Epoch 12 iteration 0024/0187: training loss 1.009 Epoch 12 iteration 0025/0187: training loss 1.017 Epoch 12 iteration 0026/0187: training loss 1.015 Epoch 12 iteration 0027/0187: training loss 1.019 Epoch 12 iteration 0028/0187: training loss 1.016 Epoch 12 iteration 0029/0187: training loss 1.023 Epoch 12 iteration 0030/0187: training loss 1.020 Epoch 12 iteration 0031/0187: training loss 1.030 Epoch 12 iteration 0032/0187: training loss 1.027 Epoch 12 iteration 0033/0187: training loss 1.038 Epoch 12 iteration 0034/0187: training loss 1.033 Epoch 12 iteration 0035/0187: training loss 1.031 Epoch 12 iteration 0036/0187: training loss 1.034 Epoch 12 iteration 0037/0187: training loss 1.036 Epoch 12 iteration 0038/0187: training loss 1.035 Epoch 12 iteration 0039/0187: training loss 1.034 Epoch 12 iteration 0040/0187: training loss 1.031 Epoch 12 iteration 0041/0187: training loss 1.034 Epoch 12 iteration 0042/0187: training loss 1.031 Epoch 12 iteration 0043/0187: training loss 1.033 Epoch 12 iteration 0044/0187: training loss 1.029 Epoch 12 iteration 0045/0187: training loss 1.024 Epoch 12 iteration 0046/0187: training loss 1.021 Epoch 12 iteration 0047/0187: training loss 1.021 Epoch 12 iteration 0048/0187: training loss 1.019 Epoch 12 iteration 0049/0187: training loss 1.017 Epoch 12 iteration 0050/0187: training loss 1.018 Epoch 12 iteration 0051/0187: training loss 1.019 Epoch 12 iteration 0052/0187: training loss 1.020 Epoch 12 iteration 0053/0187: training loss 1.016 Epoch 12 iteration 0054/0187: training loss 1.014 Epoch 12 iteration 0055/0187: training loss 1.019 Epoch 12 iteration 0056/0187: training loss 1.021 Epoch 12 iteration 0057/0187: training loss 1.018 Epoch 12 iteration 0058/0187: training loss 1.020 Epoch 12 iteration 0059/0187: training loss 1.018 Epoch 12 iteration 0060/0187: training loss 1.017 Epoch 12 iteration 0061/0187: training loss 1.016 Epoch 12 iteration 0062/0187: training loss 1.020 Epoch 12 iteration 0063/0187: training loss 1.023 Epoch 12 iteration 0064/0187: training loss 1.021 Epoch 12 iteration 0065/0187: training loss 1.020 Epoch 12 iteration 0066/0187: training loss 1.023 Epoch 12 iteration 0067/0187: training loss 1.029 Epoch 12 iteration 0068/0187: training loss 1.028 Epoch 12 iteration 0069/0187: training loss 1.024 Epoch 12 iteration 0070/0187: training loss 1.019 Epoch 12 iteration 0071/0187: training loss 1.018 Epoch 12 iteration 0072/0187: training loss 1.019 Epoch 12 iteration 0073/0187: training loss 1.017 Epoch 12 iteration 0074/0187: training loss 1.014 Epoch 12 iteration 0075/0187: training loss 1.012 Epoch 12 iteration 0076/0187: training loss 1.012 Epoch 12 iteration 0077/0187: training loss 1.012 Epoch 12 iteration 0078/0187: training loss 1.014 Epoch 12 iteration 0079/0187: training loss 1.015 Epoch 12 iteration 0080/0187: training loss 1.016 Epoch 12 iteration 0081/0187: training loss 1.015 Epoch 12 iteration 0082/0187: training loss 1.013 Epoch 12 iteration 0083/0187: training loss 1.012 Epoch 12 iteration 0084/0187: training loss 1.011 Epoch 12 iteration 0085/0187: training loss 1.009 Epoch 12 iteration 0086/0187: training loss 1.007 Epoch 12 iteration 0087/0187: training loss 1.008 Epoch 12 iteration 0088/0187: training loss 1.007 Epoch 12 iteration 0089/0187: training loss 1.008 Epoch 12 iteration 0090/0187: training loss 1.012 Epoch 12 iteration 0091/0188: training loss 1.014 Epoch 12 iteration 0092/0188: training loss 1.011 Epoch 12 iteration 0093/0188: training loss 1.011 Epoch 12 iteration 0094/0188: training loss 1.011 Epoch 12 iteration 0095/0188: training loss 1.015 Epoch 12 iteration 0096/0188: training loss 1.017 Epoch 12 iteration 0097/0188: training loss 1.015 Epoch 12 iteration 0098/0188: training loss 1.013 Epoch 12 iteration 0099/0188: training loss 1.013 Epoch 12 iteration 0100/0188: training loss 1.012 Epoch 12 iteration 0101/0188: training loss 1.015 Epoch 12 iteration 0102/0188: training loss 1.018 Epoch 12 iteration 0103/0188: training loss 1.018 Epoch 12 iteration 0104/0188: training loss 1.017 Epoch 12 iteration 0105/0188: training loss 1.017 Epoch 12 iteration 0106/0188: training loss 1.020 Epoch 12 iteration 0107/0188: training loss 1.023 Epoch 12 iteration 0108/0188: training loss 1.024 Epoch 12 iteration 0109/0188: training loss 1.023 Epoch 12 iteration 0110/0188: training loss 1.023 Epoch 12 iteration 0111/0188: training loss 1.023 Epoch 12 iteration 0112/0188: training loss 1.020 Epoch 12 iteration 0113/0188: training loss 1.020 Epoch 12 iteration 0114/0188: training loss 1.020 Epoch 12 iteration 0115/0188: training loss 1.018 Epoch 12 iteration 0116/0188: training loss 1.019 Epoch 12 iteration 0117/0188: training loss 1.020 Epoch 12 iteration 0118/0188: training loss 1.020 Epoch 12 iteration 0119/0188: training loss 1.019 Epoch 12 iteration 0120/0188: training loss 1.019 Epoch 12 iteration 0121/0188: training loss 1.019 Epoch 12 iteration 0122/0188: training loss 1.020 Epoch 12 iteration 0123/0188: training loss 1.020 Epoch 12 iteration 0124/0188: training loss 1.020 Epoch 12 iteration 0125/0188: training loss 1.020 Epoch 12 iteration 0126/0188: training loss 1.019 Epoch 12 iteration 0127/0188: training loss 1.020 Epoch 12 iteration 0128/0188: training loss 1.023 Epoch 12 iteration 0129/0188: training loss 1.022 Epoch 12 iteration 0130/0188: training loss 1.024 Epoch 12 iteration 0131/0188: training loss 1.025 Epoch 12 iteration 0132/0188: training loss 1.024 Epoch 12 iteration 0133/0188: training loss 1.023 Epoch 12 iteration 0134/0188: training loss 1.025 Epoch 12 iteration 0135/0188: training loss 1.025 Epoch 12 iteration 0136/0188: training loss 1.024 Epoch 12 iteration 0137/0188: training loss 1.023 Epoch 12 iteration 0138/0188: training loss 1.023 Epoch 12 iteration 0139/0188: training loss 1.024 Epoch 12 iteration 0140/0188: training loss 1.025 Epoch 12 iteration 0141/0188: training loss 1.024 Epoch 12 iteration 0142/0188: training loss 1.024 Epoch 12 iteration 0143/0188: training loss 1.024 Epoch 12 iteration 0144/0188: training loss 1.025 Epoch 12 iteration 0145/0188: training loss 1.025 Epoch 12 iteration 0146/0188: training loss 1.024 Epoch 12 iteration 0147/0188: training loss 1.024 Epoch 12 iteration 0148/0188: training loss 1.024 Epoch 12 iteration 0149/0188: training loss 1.024 Epoch 12 iteration 0150/0188: training loss 1.026 Epoch 12 iteration 0151/0188: training loss 1.027 Epoch 12 iteration 0152/0188: training loss 1.027 Epoch 12 iteration 0153/0188: training loss 1.029 Epoch 12 iteration 0154/0188: training loss 1.029 Epoch 12 iteration 0155/0188: training loss 1.030 Epoch 12 iteration 0156/0188: training loss 1.033 Epoch 12 iteration 0157/0188: training loss 1.034 Epoch 12 iteration 0158/0188: training loss 1.034 Epoch 12 iteration 0159/0188: training loss 1.035 Epoch 12 iteration 0160/0188: training loss 1.034 Epoch 12 iteration 0161/0188: training loss 1.035 Epoch 12 iteration 0162/0188: training loss 1.034 Epoch 12 iteration 0163/0188: training loss 1.035 Epoch 12 iteration 0164/0188: training loss 1.034 Epoch 12 iteration 0165/0188: training loss 1.035 Epoch 12 iteration 0166/0188: training loss 1.034 Epoch 12 iteration 0167/0188: training loss 1.034 Epoch 12 iteration 0168/0188: training loss 1.034 Epoch 12 iteration 0169/0188: training loss 1.034 Epoch 12 iteration 0170/0188: training loss 1.034 Epoch 12 iteration 0171/0188: training loss 1.035 Epoch 12 iteration 0172/0188: training loss 1.035 Epoch 12 iteration 0173/0188: training loss 1.034 Epoch 12 iteration 0174/0188: training loss 1.034 Epoch 12 iteration 0175/0188: training loss 1.033 Epoch 12 iteration 0176/0188: training loss 1.034 Epoch 12 iteration 0177/0188: training loss 1.033 Epoch 12 iteration 0178/0188: training loss 1.032 Epoch 12 iteration 0179/0188: training loss 1.033 Epoch 12 iteration 0180/0188: training loss 1.032 Epoch 12 iteration 0181/0188: training loss 1.031 Epoch 12 iteration 0182/0188: training loss 1.032 Epoch 12 iteration 0183/0188: training loss 1.031 Epoch 12 iteration 0184/0188: training loss 1.031 Epoch 12 iteration 0185/0188: training loss 1.031 Epoch 12 iteration 0186/0188: training loss 1.032 Epoch 12 validation pixAcc: 0.863, mIoU: 0.357 Epoch 13 iteration 0001/0187: training loss 1.010 Epoch 13 iteration 0002/0187: training loss 0.959 Epoch 13 iteration 0003/0187: training loss 1.130 Epoch 13 iteration 0004/0187: training loss 1.104 Epoch 13 iteration 0005/0187: training loss 1.049 Epoch 13 iteration 0006/0187: training loss 1.033 Epoch 13 iteration 0007/0187: training loss 1.020 Epoch 13 iteration 0008/0187: training loss 1.034 Epoch 13 iteration 0009/0187: training loss 1.057 Epoch 13 iteration 0010/0187: training loss 1.039 Epoch 13 iteration 0011/0187: training loss 1.032 Epoch 13 iteration 0012/0187: training loss 1.018 Epoch 13 iteration 0013/0187: training loss 1.019 Epoch 13 iteration 0014/0187: training loss 1.016 Epoch 13 iteration 0015/0187: training loss 1.021 Epoch 13 iteration 0016/0187: training loss 1.025 Epoch 13 iteration 0017/0187: training loss 1.033 Epoch 13 iteration 0018/0187: training loss 1.022 Epoch 13 iteration 0019/0187: training loss 1.022 Epoch 13 iteration 0020/0187: training loss 1.017 Epoch 13 iteration 0021/0187: training loss 1.010 Epoch 13 iteration 0022/0187: training loss 1.007 Epoch 13 iteration 0023/0187: training loss 1.000 Epoch 13 iteration 0024/0187: training loss 1.003 Epoch 13 iteration 0025/0187: training loss 1.006 Epoch 13 iteration 0026/0187: training loss 1.006 Epoch 13 iteration 0027/0187: training loss 1.003 Epoch 13 iteration 0028/0187: training loss 1.001 Epoch 13 iteration 0029/0187: training loss 1.001 Epoch 13 iteration 0030/0187: training loss 0.992 Epoch 13 iteration 0031/0187: training loss 0.991 Epoch 13 iteration 0032/0187: training loss 0.991 Epoch 13 iteration 0033/0187: training loss 0.990 Epoch 13 iteration 0034/0187: training loss nan Epoch 13 iteration 0035/0187: training loss nan Epoch 13 iteration 0036/0187: training loss nan Epoch 13 iteration 0037/0187: training loss nan Epoch 13 iteration 0038/0187: training loss nan Epoch 13 iteration 0039/0187: training loss nan Epoch 13 iteration 0040/0187: training loss nan Epoch 13 iteration 0041/0187: training loss nan Epoch 13 iteration 0042/0187: training loss nan Epoch 13 iteration 0043/0187: training loss nan Epoch 13 iteration 0044/0187: training loss nan Epoch 13 iteration 0045/0187: training loss nan Epoch 13 iteration 0046/0187: training loss nan Epoch 13 iteration 0047/0187: training loss nan Epoch 13 iteration 0048/0187: training loss nan Epoch 13 iteration 0049/0187: training loss nan Epoch 13 iteration 0050/0187: training loss nan Epoch 13 iteration 0051/0187: training loss nan Epoch 13 iteration 0052/0187: training loss nan Epoch 13 iteration 0053/0187: training loss nan Epoch 13 iteration 0054/0187: training loss nan Epoch 13 iteration 0055/0187: training loss nan Epoch 13 iteration 0056/0187: training loss nan Epoch 13 iteration 0057/0187: training loss nan Epoch 13 iteration 0058/0187: training loss nan Epoch 13 iteration 0059/0187: training loss nan Epoch 13 iteration 0060/0187: training loss nan Epoch 13 iteration 0061/0187: training loss nan Epoch 13 iteration 0062/0187: training loss nan Epoch 13 iteration 0063/0187: training loss nan Epoch 13 iteration 0064/0187: training loss nan Epoch 13 iteration 0065/0187: training loss nan Epoch 13 iteration 0066/0187: training loss nan Epoch 13 iteration 0067/0187: training loss nan Epoch 13 iteration 0068/0187: training loss nan Epoch 13 iteration 0069/0187: training loss nan Epoch 13 iteration 0070/0187: training loss nan Epoch 13 iteration 0071/0187: training loss nan Epoch 13 iteration 0072/0187: training loss nan Epoch 13 iteration 0073/0187: training loss nan Epoch 13 iteration 0074/0187: training loss nan Epoch 13 iteration 0075/0187: training loss nan Epoch 13 iteration 0076/0187: training loss nan Epoch 13 iteration 0077/0187: training loss nan Epoch 13 iteration 0078/0187: training loss nan Epoch 13 iteration 0079/0187: training loss nan Epoch 13 iteration 0080/0187: training loss nan Epoch 13 iteration 0081/0187: training loss nan Epoch 13 iteration 0082/0187: training loss nan Epoch 13 iteration 0083/0187: training loss nan Epoch 13 iteration 0084/0187: training loss nan Epoch 13 iteration 0085/0187: training loss nan Epoch 13 iteration 0086/0187: training loss nan Epoch 13 iteration 0087/0187: training loss nan Epoch 13 iteration 0088/0187: training loss nan Epoch 13 iteration 0089/0187: training loss nan Epoch 13 iteration 0090/0187: training loss nan Epoch 13 iteration 0091/0187: training loss nan Epoch 13 iteration 0092/0187: training loss nan Epoch 13 iteration 0093/0187: training loss nan Epoch 13 iteration 0094/0187: training loss nan Epoch 13 iteration 0095/0187: training loss nan Epoch 13 iteration 0096/0187: training loss nan Epoch 13 iteration 0097/0187: training loss nan Epoch 13 iteration 0098/0187: training loss nan Epoch 13 iteration 0099/0187: training loss nan Epoch 13 iteration 0100/0187: training loss nan Epoch 13 iteration 0101/0187: training loss nan Epoch 13 iteration 0102/0187: training loss nan Epoch 13 iteration 0103/0187: training loss nan Epoch 13 iteration 0104/0187: training loss nan Epoch 13 iteration 0105/0187: training loss nan Epoch 13 iteration 0106/0187: training loss nan Epoch 13 iteration 0107/0187: training loss nan Epoch 13 iteration 0108/0187: training loss nan Epoch 13 iteration 0109/0187: training loss nan Epoch 13 iteration 0110/0187: training loss nan Epoch 13 iteration 0111/0187: training loss nan Epoch 13 iteration 0112/0187: training loss nan Epoch 13 iteration 0113/0187: training loss nan Epoch 13 iteration 0114/0187: training loss nan Epoch 13 iteration 0115/0187: training loss nan Epoch 13 iteration 0116/0187: training loss nan Epoch 13 iteration 0117/0187: training loss nan Epoch 13 iteration 0118/0187: training loss nan Epoch 13 iteration 0119/0187: training loss nan Epoch 13 iteration 0120/0187: training loss nan Epoch 13 iteration 0121/0187: training loss nan Epoch 13 iteration 0122/0187: training loss nan Epoch 13 iteration 0123/0187: training loss nan Epoch 13 iteration 0124/0187: training loss nan Epoch 13 iteration 0125/0187: training loss nan Epoch 13 iteration 0126/0187: training loss nan Epoch 13 iteration 0127/0187: training loss nan Epoch 13 iteration 0128/0187: training loss nan Epoch 13 iteration 0129/0187: training loss nan Epoch 13 iteration 0130/0187: training loss nan Epoch 13 iteration 0131/0187: training loss nan Epoch 13 iteration 0132/0187: training loss nan Epoch 13 iteration 0133/0187: training loss nan Epoch 13 iteration 0134/0187: training loss nan Epoch 13 iteration 0135/0187: training loss nan Epoch 13 iteration 0136/0187: training loss nan Epoch 13 iteration 0137/0187: training loss nan Epoch 13 iteration 0138/0187: training loss nan Epoch 13 iteration 0139/0187: training loss nan Epoch 13 iteration 0140/0187: training loss nan Epoch 13 iteration 0141/0187: training loss nan Epoch 13 iteration 0142/0187: training loss nan Epoch 13 iteration 0143/0187: training loss nan Epoch 13 iteration 0144/0187: training loss nan Epoch 13 iteration 0145/0187: training loss nan Epoch 13 iteration 0146/0187: training loss nan Epoch 13 iteration 0147/0187: training loss nan Epoch 13 iteration 0148/0187: training loss nan Epoch 13 iteration 0149/0187: training loss nan Epoch 13 iteration 0150/0187: training loss nan Epoch 13 iteration 0151/0187: training loss nan Epoch 13 iteration 0152/0187: training loss nan Epoch 13 iteration 0153/0187: training loss nan Epoch 13 iteration 0154/0187: training loss nan Epoch 13 iteration 0155/0187: training loss nan Epoch 13 iteration 0156/0187: training loss nan Epoch 13 iteration 0157/0187: training loss nan Epoch 13 iteration 0158/0187: training loss nan Epoch 13 iteration 0159/0187: training loss nan Epoch 13 iteration 0160/0187: training loss nan Epoch 13 iteration 0161/0187: training loss nan Epoch 13 iteration 0162/0187: training loss nan Epoch 13 iteration 0163/0187: training loss nan Epoch 13 iteration 0164/0187: training loss nan Epoch 13 iteration 0165/0187: training loss nan Epoch 13 iteration 0166/0187: training loss nan Epoch 13 iteration 0167/0187: training loss nan Epoch 13 iteration 0168/0187: training loss nan Epoch 13 iteration 0169/0187: training loss nan Epoch 13 iteration 0170/0187: training loss nan Epoch 13 iteration 0171/0187: training loss nan Epoch 13 iteration 0172/0187: training loss nan Epoch 13 iteration 0173/0187: training loss nan Epoch 13 iteration 0174/0187: training loss nan Epoch 13 iteration 0175/0187: training loss nan Epoch 13 iteration 0176/0187: training loss nan Epoch 13 iteration 0177/0187: training loss nan Epoch 13 iteration 0178/0187: training loss nan Epoch 13 iteration 0179/0187: training loss nan Epoch 13 iteration 0180/0187: training loss nan Epoch 13 iteration 0181/0187: training loss nan Epoch 13 iteration 0182/0187: training loss nan Epoch 13 iteration 0183/0187: training loss nan Epoch 13 iteration 0184/0187: training loss nan Epoch 13 iteration 0185/0187: training loss nan Epoch 13 iteration 0186/0187: training loss nan Epoch 13 iteration 0187/0187: training loss nan Epoch 13 validation pixAcc: 0.864, mIoU: 0.361 Epoch 14 iteration 0001/0187: training loss 0.997 Epoch 14 iteration 0002/0187: training loss 0.900 Epoch 14 iteration 0003/0187: training loss 0.919 Epoch 14 iteration 0004/0187: training loss 0.976 Epoch 14 iteration 0005/0187: training loss 0.939 Epoch 14 iteration 0006/0187: training loss 0.963 Epoch 14 iteration 0007/0187: training loss 0.948 Epoch 14 iteration 0008/0187: training loss 0.934 Epoch 14 iteration 0009/0187: training loss 0.938 Epoch 14 iteration 0010/0187: training loss 0.913 Epoch 14 iteration 0011/0187: training loss 0.910 Epoch 14 iteration 0012/0187: training loss 0.914 Epoch 14 iteration 0013/0187: training loss 0.928 Epoch 14 iteration 0014/0187: training loss 0.930 Epoch 14 iteration 0015/0187: training loss 0.941 Epoch 14 iteration 0016/0187: training loss 0.938 Epoch 14 iteration 0017/0187: training loss 0.945 Epoch 14 iteration 0018/0187: training loss 0.943 Epoch 14 iteration 0019/0187: training loss 0.937 Epoch 14 iteration 0020/0187: training loss 0.935 Epoch 14 iteration 0021/0187: training loss 0.938 Epoch 14 iteration 0022/0187: training loss 0.941 Epoch 14 iteration 0023/0187: training loss 0.940 Epoch 14 iteration 0024/0187: training loss 0.943 Epoch 14 iteration 0025/0187: training loss 0.947 Epoch 14 iteration 0026/0187: training loss 0.955 Epoch 14 iteration 0027/0187: training loss 0.946 Epoch 14 iteration 0028/0187: training loss 0.950 Epoch 14 iteration 0029/0187: training loss 0.946 Epoch 14 iteration 0030/0187: training loss 0.944 Epoch 14 iteration 0031/0187: training loss 0.945 Epoch 14 iteration 0032/0187: training loss 0.945 Epoch 14 iteration 0033/0187: training loss 0.944 Epoch 14 iteration 0034/0187: training loss 0.943 Epoch 14 iteration 0035/0187: training loss 0.937 Epoch 14 iteration 0036/0187: training loss 0.940 Epoch 14 iteration 0037/0187: training loss 0.943 Epoch 14 iteration 0038/0187: training loss 0.938 Epoch 14 iteration 0039/0187: training loss 0.945 Epoch 14 iteration 0040/0187: training loss 0.942 Epoch 14 iteration 0041/0187: training loss 0.943 Epoch 14 iteration 0042/0187: training loss 0.946 Epoch 14 iteration 0043/0187: training loss nan Epoch 14 iteration 0044/0187: training loss nan Epoch 14 iteration 0045/0187: training loss nan Epoch 14 iteration 0046/0187: training loss nan Epoch 14 iteration 0047/0187: training loss nan Epoch 14 iteration 0048/0187: training loss nan Epoch 14 iteration 0049/0187: training loss nan Epoch 14 iteration 0050/0187: training loss nan Epoch 14 iteration 0051/0187: training loss nan Epoch 14 iteration 0052/0187: training loss nan Epoch 14 iteration 0053/0187: training loss nan Epoch 14 iteration 0054/0187: training loss nan Epoch 14 iteration 0055/0187: training loss nan Epoch 14 iteration 0056/0187: training loss nan Epoch 14 iteration 0057/0187: training loss nan Epoch 14 iteration 0058/0187: training loss nan Epoch 14 iteration 0059/0187: training loss nan Epoch 14 iteration 0060/0187: training loss nan Epoch 14 iteration 0061/0187: training loss nan Epoch 14 iteration 0062/0187: training loss nan Epoch 14 iteration 0063/0187: training loss nan Epoch 14 iteration 0064/0187: training loss nan Epoch 14 iteration 0065/0187: training loss nan Epoch 14 iteration 0066/0187: training loss nan Epoch 14 iteration 0067/0187: training loss nan Epoch 14 iteration 0068/0187: training loss nan Epoch 14 iteration 0069/0187: training loss nan Epoch 14 iteration 0070/0187: training loss nan Epoch 14 iteration 0071/0187: training loss nan Epoch 14 iteration 0072/0187: training loss nan Epoch 14 iteration 0073/0187: training loss nan Epoch 14 iteration 0074/0187: training loss nan Epoch 14 iteration 0075/0187: training loss nan Epoch 14 iteration 0076/0187: training loss nan Epoch 14 iteration 0077/0187: training loss nan Epoch 14 iteration 0078/0187: training loss nan Epoch 14 iteration 0079/0187: training loss nan Epoch 14 iteration 0080/0187: training loss nan Epoch 14 iteration 0081/0187: training loss nan Epoch 14 iteration 0082/0187: training loss nan Epoch 14 iteration 0083/0187: training loss nan Epoch 14 iteration 0084/0187: training loss nan Epoch 14 iteration 0085/0187: training loss nan Epoch 14 iteration 0086/0187: training loss nan Epoch 14 iteration 0087/0187: training loss nan Epoch 14 iteration 0088/0187: training loss nan Epoch 14 iteration 0089/0187: training loss nan Epoch 14 iteration 0090/0187: training loss nan Epoch 14 iteration 0091/0188: training loss nan Epoch 14 iteration 0092/0188: training loss nan Epoch 14 iteration 0093/0188: training loss nan Epoch 14 iteration 0094/0188: training loss nan Epoch 14 iteration 0095/0188: training loss nan Epoch 14 iteration 0096/0188: training loss nan Epoch 14 iteration 0097/0188: training loss nan Epoch 14 iteration 0098/0188: training loss nan Epoch 14 iteration 0099/0188: training loss nan Epoch 14 iteration 0100/0188: training loss nan Epoch 14 iteration 0101/0188: training loss nan Epoch 14 iteration 0102/0188: training loss nan Epoch 14 iteration 0103/0188: training loss nan Epoch 14 iteration 0104/0188: training loss nan Epoch 14 iteration 0105/0188: training loss nan Epoch 14 iteration 0106/0188: training loss nan Epoch 14 iteration 0107/0188: training loss nan Epoch 14 iteration 0108/0188: training loss nan Epoch 14 iteration 0109/0188: training loss nan Epoch 14 iteration 0110/0188: training loss nan Epoch 14 iteration 0111/0188: training loss nan Epoch 14 iteration 0112/0188: training loss nan Epoch 14 iteration 0113/0188: training loss nan Epoch 14 iteration 0114/0188: training loss nan Epoch 14 iteration 0115/0188: training loss nan Epoch 14 iteration 0116/0188: training loss nan Epoch 14 iteration 0117/0188: training loss nan Epoch 14 iteration 0118/0188: training loss nan Epoch 14 iteration 0119/0188: training loss nan Epoch 14 iteration 0120/0188: training loss nan Epoch 14 iteration 0121/0188: training loss nan Epoch 14 iteration 0122/0188: training loss nan Epoch 14 iteration 0123/0188: training loss nan Epoch 14 iteration 0124/0188: training loss nan Epoch 14 iteration 0125/0188: training loss nan Epoch 14 iteration 0126/0188: training loss nan Epoch 14 iteration 0127/0188: training loss nan Epoch 14 iteration 0128/0188: training loss nan Epoch 14 iteration 0129/0188: training loss nan Epoch 14 iteration 0130/0188: training loss nan Epoch 14 iteration 0131/0188: training loss nan Epoch 14 iteration 0132/0188: training loss nan Epoch 14 iteration 0133/0188: training loss nan Epoch 14 iteration 0134/0188: training loss nan Epoch 14 iteration 0135/0188: training loss nan Epoch 14 iteration 0136/0188: training loss nan Epoch 14 iteration 0137/0188: training loss nan Epoch 14 iteration 0138/0188: training loss nan Epoch 14 iteration 0139/0188: training loss nan Epoch 14 iteration 0140/0188: training loss nan Epoch 14 iteration 0141/0188: training loss nan Epoch 14 iteration 0142/0188: training loss nan Epoch 14 iteration 0143/0188: training loss nan Epoch 14 iteration 0144/0188: training loss nan Epoch 14 iteration 0145/0188: training loss nan Epoch 14 iteration 0146/0188: training loss nan Epoch 14 iteration 0147/0188: training loss nan Epoch 14 iteration 0148/0188: training loss nan Epoch 14 iteration 0149/0188: training loss nan Epoch 14 iteration 0150/0188: training loss nan Epoch 14 iteration 0151/0188: training loss nan Epoch 14 iteration 0152/0188: training loss nan Epoch 14 iteration 0153/0188: training loss nan Epoch 14 iteration 0154/0188: training loss nan Epoch 14 iteration 0155/0188: training loss nan Epoch 14 iteration 0156/0188: training loss nan Epoch 14 iteration 0157/0188: training loss nan Epoch 14 iteration 0158/0188: training loss nan Epoch 14 iteration 0159/0188: training loss nan Epoch 14 iteration 0160/0188: training loss nan Epoch 14 iteration 0161/0188: training loss nan Epoch 14 iteration 0162/0188: training loss nan Epoch 14 iteration 0163/0188: training loss nan Epoch 14 iteration 0164/0188: training loss nan Epoch 14 iteration 0165/0188: training loss nan Epoch 14 iteration 0166/0188: training loss nan Epoch 14 iteration 0167/0188: training loss nan Epoch 14 iteration 0168/0188: training loss nan Epoch 14 iteration 0169/0188: training loss nan Epoch 14 iteration 0170/0188: training loss nan Epoch 14 iteration 0171/0188: training loss nan Epoch 14 iteration 0172/0188: training loss nan Epoch 14 iteration 0173/0188: training loss nan Epoch 14 iteration 0174/0188: training loss nan Epoch 14 iteration 0175/0188: training loss nan Epoch 14 iteration 0176/0188: training loss nan Epoch 14 iteration 0177/0188: training loss nan Epoch 14 iteration 0178/0188: training loss nan Epoch 14 iteration 0179/0188: training loss nan Epoch 14 iteration 0180/0188: training loss nan Epoch 14 iteration 0181/0188: training loss nan Epoch 14 iteration 0182/0188: training loss nan Epoch 14 iteration 0183/0188: training loss nan Epoch 14 iteration 0184/0188: training loss nan Epoch 14 iteration 0185/0188: training loss nan Epoch 14 iteration 0186/0188: training loss nan Epoch 14 validation pixAcc: 0.864, mIoU: 0.357 Epoch 15 iteration 0001/0187: training loss 1.001 Epoch 15 iteration 0002/0187: training loss 0.965 Epoch 15 iteration 0003/0187: training loss 1.023 Epoch 15 iteration 0004/0187: training loss 0.994 Epoch 15 iteration 0005/0187: training loss 0.998 Epoch 15 iteration 0006/0187: training loss 0.990 Epoch 15 iteration 0007/0187: training loss nan Epoch 15 iteration 0008/0187: training loss nan Epoch 15 iteration 0009/0187: training loss nan Epoch 15 iteration 0010/0187: training loss nan Epoch 15 iteration 0011/0187: training loss nan Epoch 15 iteration 0012/0187: training loss nan Epoch 15 iteration 0013/0187: training loss nan Epoch 15 iteration 0014/0187: training loss nan Epoch 15 iteration 0015/0187: training loss nan Epoch 15 iteration 0016/0187: training loss nan Epoch 15 iteration 0017/0187: training loss nan Epoch 15 iteration 0018/0187: training loss nan Epoch 15 iteration 0019/0187: training loss nan Epoch 15 iteration 0020/0187: training loss nan Epoch 15 iteration 0021/0187: training loss nan Epoch 15 iteration 0022/0187: training loss nan Epoch 15 iteration 0023/0187: training loss nan Epoch 15 iteration 0024/0187: training loss nan Epoch 15 iteration 0025/0187: training loss nan Epoch 15 iteration 0026/0187: training loss nan Epoch 15 iteration 0027/0187: training loss nan Epoch 15 iteration 0028/0187: training loss nan Epoch 15 iteration 0029/0187: training loss nan Epoch 15 iteration 0030/0187: training loss nan Epoch 15 iteration 0031/0187: training loss nan Epoch 15 iteration 0032/0187: training loss nan Epoch 15 iteration 0033/0187: training loss nan Epoch 15 iteration 0034/0187: training loss nan Epoch 15 iteration 0035/0187: training loss nan Epoch 15 iteration 0036/0187: training loss nan Epoch 15 iteration 0037/0187: training loss nan Epoch 15 iteration 0038/0187: training loss nan Epoch 15 iteration 0039/0187: training loss nan Epoch 15 iteration 0040/0187: training loss nan Epoch 15 iteration 0041/0187: training loss nan Epoch 15 iteration 0042/0187: training loss nan Epoch 15 iteration 0043/0187: training loss nan Epoch 15 iteration 0044/0187: training loss nan Epoch 15 iteration 0045/0187: training loss nan Epoch 15 iteration 0046/0187: training loss nan Epoch 15 iteration 0047/0187: training loss nan Epoch 15 iteration 0048/0187: training loss nan Epoch 15 iteration 0049/0187: training loss nan Epoch 15 iteration 0050/0187: training loss nan Epoch 15 iteration 0051/0187: training loss nan Epoch 15 iteration 0052/0187: training loss nan Epoch 15 iteration 0053/0187: training loss nan Epoch 15 iteration 0054/0187: training loss nan Epoch 15 iteration 0055/0187: training loss nan Epoch 15 iteration 0056/0187: training loss nan Epoch 15 iteration 0057/0187: training loss nan Epoch 15 iteration 0058/0187: training loss nan Epoch 15 iteration 0059/0187: training loss nan Epoch 15 iteration 0060/0187: training loss nan Epoch 15 iteration 0061/0187: training loss nan Epoch 15 iteration 0062/0187: training loss nan Epoch 15 iteration 0063/0187: training loss nan Epoch 15 iteration 0064/0187: training loss nan Epoch 15 iteration 0065/0187: training loss nan Epoch 15 iteration 0066/0187: training loss nan Epoch 15 iteration 0067/0187: training loss nan Epoch 15 iteration 0068/0187: training loss nan Epoch 15 iteration 0069/0187: training loss nan Epoch 15 iteration 0070/0187: training loss nan Epoch 15 iteration 0071/0187: training loss nan Epoch 15 iteration 0072/0187: training loss nan Epoch 15 iteration 0073/0187: training loss nan Epoch 15 iteration 0074/0187: training loss nan Epoch 15 iteration 0075/0187: training loss nan Epoch 15 iteration 0076/0187: training loss nan Epoch 15 iteration 0077/0187: training loss nan Epoch 15 iteration 0078/0187: training loss nan Epoch 15 iteration 0079/0187: training loss nan Epoch 15 iteration 0080/0187: training loss nan Epoch 15 iteration 0081/0187: training loss nan Epoch 15 iteration 0082/0187: training loss nan Epoch 15 iteration 0083/0187: training loss nan Epoch 15 iteration 0084/0187: training loss nan Epoch 15 iteration 0085/0187: training loss nan Epoch 15 iteration 0086/0187: training loss nan Epoch 15 iteration 0087/0187: training loss nan Epoch 15 iteration 0088/0187: training loss nan Epoch 15 iteration 0089/0187: training loss nan Epoch 15 iteration 0090/0187: training loss nan Epoch 15 iteration 0091/0187: training loss nan Epoch 15 iteration 0092/0187: training loss nan Epoch 15 iteration 0093/0187: training loss nan Epoch 15 iteration 0094/0187: training loss nan Epoch 15 iteration 0095/0187: training loss nan Epoch 15 iteration 0096/0187: training loss nan Epoch 15 iteration 0097/0187: training loss nan Epoch 15 iteration 0098/0187: training loss nan Epoch 15 iteration 0099/0187: training loss nan Epoch 15 iteration 0100/0187: training loss nan Epoch 15 iteration 0101/0187: training loss nan Epoch 15 iteration 0102/0187: training loss nan Epoch 15 iteration 0103/0187: training loss nan Epoch 15 iteration 0104/0187: training loss nan Epoch 15 iteration 0105/0187: training loss nan Epoch 15 iteration 0106/0187: training loss nan Epoch 15 iteration 0107/0187: training loss nan Epoch 15 iteration 0108/0187: training loss nan Epoch 15 iteration 0109/0187: training loss nan Epoch 15 iteration 0110/0187: training loss nan Epoch 15 iteration 0111/0187: training loss nan Epoch 15 iteration 0112/0187: training loss nan Epoch 15 iteration 0113/0187: training loss nan Epoch 15 iteration 0114/0187: training loss nan Epoch 15 iteration 0115/0187: training loss nan Epoch 15 iteration 0116/0187: training loss nan Epoch 15 iteration 0117/0187: training loss nan Epoch 15 iteration 0118/0187: training loss nan Epoch 15 iteration 0119/0187: training loss nan Epoch 15 iteration 0120/0187: training loss nan Epoch 15 iteration 0121/0187: training loss nan Epoch 15 iteration 0122/0187: training loss nan Epoch 15 iteration 0123/0187: training loss nan Epoch 15 iteration 0124/0187: training loss nan Epoch 15 iteration 0125/0187: training loss nan Epoch 15 iteration 0126/0187: training loss nan Epoch 15 iteration 0127/0187: training loss nan Epoch 15 iteration 0128/0187: training loss nan Epoch 15 iteration 0129/0187: training loss nan Epoch 15 iteration 0130/0187: training loss nan Epoch 15 iteration 0131/0187: training loss nan Epoch 15 iteration 0132/0187: training loss nan Epoch 15 iteration 0133/0187: training loss nan Epoch 15 iteration 0134/0187: training loss nan Epoch 15 iteration 0135/0187: training loss nan Epoch 15 iteration 0136/0187: training loss nan Epoch 15 iteration 0137/0187: training loss nan Epoch 15 iteration 0138/0187: training loss nan Epoch 15 iteration 0139/0187: training loss nan Epoch 15 iteration 0140/0187: training loss nan Epoch 15 iteration 0141/0187: training loss nan Epoch 15 iteration 0142/0187: training loss nan Epoch 15 iteration 0143/0187: training loss nan Epoch 15 iteration 0144/0187: training loss nan Epoch 15 iteration 0145/0187: training loss nan Epoch 15 iteration 0146/0187: training loss nan Epoch 15 iteration 0147/0187: training loss nan Epoch 15 iteration 0148/0187: training loss nan Epoch 15 iteration 0149/0187: training loss nan Epoch 15 iteration 0150/0187: training loss nan Epoch 15 iteration 0151/0187: training loss nan Epoch 15 iteration 0152/0187: training loss nan Epoch 15 iteration 0153/0187: training loss nan Epoch 15 iteration 0154/0187: training loss nan Epoch 15 iteration 0155/0187: training loss nan Epoch 15 iteration 0156/0187: training loss nan Epoch 15 iteration 0157/0187: training loss nan Epoch 15 iteration 0158/0187: training loss nan Epoch 15 iteration 0159/0187: training loss nan Epoch 15 iteration 0160/0187: training loss nan Epoch 15 iteration 0161/0187: training loss nan Epoch 15 iteration 0162/0187: training loss nan Epoch 15 iteration 0163/0187: training loss nan Epoch 15 iteration 0164/0187: training loss nan Epoch 15 iteration 0165/0187: training loss nan Epoch 15 iteration 0166/0187: training loss nan Epoch 15 iteration 0167/0187: training loss nan Epoch 15 iteration 0168/0187: training loss nan Epoch 15 iteration 0169/0187: training loss nan Epoch 15 iteration 0170/0187: training loss nan Epoch 15 iteration 0171/0187: training loss nan Epoch 15 iteration 0172/0187: training loss nan Epoch 15 iteration 0173/0187: training loss nan Epoch 15 iteration 0174/0187: training loss nan Epoch 15 iteration 0175/0187: training loss nan Epoch 15 iteration 0176/0187: training loss nan Epoch 15 iteration 0177/0187: training loss nan Epoch 15 iteration 0178/0187: training loss nan Epoch 15 iteration 0179/0187: training loss nan Epoch 15 iteration 0180/0187: training loss nan Epoch 15 iteration 0181/0187: training loss nan Epoch 15 iteration 0182/0187: training loss nan Epoch 15 iteration 0183/0187: training loss nan Epoch 15 iteration 0184/0187: training loss nan Epoch 15 iteration 0185/0187: training loss nan Epoch 15 iteration 0186/0187: training loss nan Epoch 15 iteration 0187/0187: training loss nan Epoch 15 validation pixAcc: 0.867, mIoU: 0.368 Epoch 16 iteration 0001/0187: training loss 0.731 Epoch 16 iteration 0002/0187: training loss 0.842 Epoch 16 iteration 0003/0187: training loss 0.927 Epoch 16 iteration 0004/0187: training loss 0.929 Epoch 16 iteration 0005/0187: training loss 0.912 Epoch 16 iteration 0006/0187: training loss 0.921 Epoch 16 iteration 0007/0187: training loss 0.942 Epoch 16 iteration 0008/0187: training loss 0.950 Epoch 16 iteration 0009/0187: training loss 0.953 Epoch 16 iteration 0010/0187: training loss 0.942 Epoch 16 iteration 0011/0187: training loss 0.926 Epoch 16 iteration 0012/0187: training loss 0.939 Epoch 16 iteration 0013/0187: training loss 0.943 Epoch 16 iteration 0014/0187: training loss 0.947 Epoch 16 iteration 0015/0187: training loss 0.945 Epoch 16 iteration 0016/0187: training loss 0.944 Epoch 16 iteration 0017/0187: training loss 0.940 Epoch 16 iteration 0018/0187: training loss 0.943 Epoch 16 iteration 0019/0187: training loss 0.941 Epoch 16 iteration 0020/0187: training loss 0.952 Epoch 16 iteration 0021/0187: training loss 0.940 Epoch 16 iteration 0022/0187: training loss 0.937 Epoch 16 iteration 0023/0187: training loss 0.932 Epoch 16 iteration 0024/0187: training loss nan Epoch 16 iteration 0025/0187: training loss nan Epoch 16 iteration 0026/0187: training loss nan Epoch 16 iteration 0027/0187: training loss nan Epoch 16 iteration 0028/0187: training loss nan Epoch 16 iteration 0029/0187: training loss nan Epoch 16 iteration 0030/0187: training loss nan Epoch 16 iteration 0031/0187: training loss nan Epoch 16 iteration 0032/0187: training loss nan Epoch 16 iteration 0033/0187: training loss nan Epoch 16 iteration 0034/0187: training loss nan Epoch 16 iteration 0035/0187: training loss nan Epoch 16 iteration 0036/0187: training loss nan Epoch 16 iteration 0037/0187: training loss nan Epoch 16 iteration 0038/0187: training loss nan Epoch 16 iteration 0039/0187: training loss nan Epoch 16 iteration 0040/0187: training loss nan Epoch 16 iteration 0041/0187: training loss nan Epoch 16 iteration 0042/0187: training loss nan Epoch 16 iteration 0043/0187: training loss nan Epoch 16 iteration 0044/0187: training loss nan Epoch 16 iteration 0045/0187: training loss nan Epoch 16 iteration 0046/0187: training loss nan Epoch 16 iteration 0047/0187: training loss nan Epoch 16 iteration 0048/0187: training loss nan Epoch 16 iteration 0049/0187: training loss nan Epoch 16 iteration 0050/0187: training loss nan Epoch 16 iteration 0051/0187: training loss nan Epoch 16 iteration 0052/0187: training loss nan Epoch 16 iteration 0053/0187: training loss nan Epoch 16 iteration 0054/0187: training loss nan Epoch 16 iteration 0055/0187: training loss nan Epoch 16 iteration 0056/0187: training loss nan Epoch 16 iteration 0057/0187: training loss nan Epoch 16 iteration 0058/0187: training loss nan Epoch 16 iteration 0059/0187: training loss nan Epoch 16 iteration 0060/0187: training loss nan Epoch 16 iteration 0061/0187: training loss nan Epoch 16 iteration 0062/0187: training loss nan Epoch 16 iteration 0063/0187: training loss nan Epoch 16 iteration 0064/0187: training loss nan Epoch 16 iteration 0065/0187: training loss nan Epoch 16 iteration 0066/0187: training loss nan Epoch 16 iteration 0067/0187: training loss nan Epoch 16 iteration 0068/0187: training loss nan Epoch 16 iteration 0069/0187: training loss nan Epoch 16 iteration 0070/0187: training loss nan Epoch 16 iteration 0071/0187: training loss nan Epoch 16 iteration 0072/0187: training loss nan Epoch 16 iteration 0073/0187: training loss nan Epoch 16 iteration 0074/0187: training loss nan Epoch 16 iteration 0075/0187: training loss nan Epoch 16 iteration 0076/0187: training loss nan Epoch 16 iteration 0077/0187: training loss nan Epoch 16 iteration 0078/0187: training loss nan Epoch 16 iteration 0079/0187: training loss nan Epoch 16 iteration 0080/0187: training loss nan Epoch 16 iteration 0081/0187: training loss nan Epoch 16 iteration 0082/0187: training loss nan Epoch 16 iteration 0083/0187: training loss nan Epoch 16 iteration 0084/0187: training loss nan Epoch 16 iteration 0085/0187: training loss nan Epoch 16 iteration 0086/0187: training loss nan Epoch 16 iteration 0087/0187: training loss nan Epoch 16 iteration 0088/0187: training loss nan Epoch 16 iteration 0089/0187: training loss nan Epoch 16 iteration 0090/0187: training loss nan Epoch 16 iteration 0091/0188: training loss nan Epoch 16 iteration 0092/0188: training loss nan Epoch 16 iteration 0093/0188: training loss nan Epoch 16 iteration 0094/0188: training loss nan Epoch 16 iteration 0095/0188: training loss nan Epoch 16 iteration 0096/0188: training loss nan Epoch 16 iteration 0097/0188: training loss nan Epoch 16 iteration 0098/0188: training loss nan Epoch 16 iteration 0099/0188: training loss nan Epoch 16 iteration 0100/0188: training loss nan Epoch 16 iteration 0101/0188: training loss nan Epoch 16 iteration 0102/0188: training loss nan Epoch 16 iteration 0103/0188: training loss nan Epoch 16 iteration 0104/0188: training loss nan Epoch 16 iteration 0105/0188: training loss nan Epoch 16 iteration 0106/0188: training loss nan Epoch 16 iteration 0107/0188: training loss nan Epoch 16 iteration 0108/0188: training loss nan Epoch 16 iteration 0109/0188: training loss nan Epoch 16 iteration 0110/0188: training loss nan Epoch 16 iteration 0111/0188: training loss nan Epoch 16 iteration 0112/0188: training loss nan Epoch 16 iteration 0113/0188: training loss nan Epoch 16 iteration 0114/0188: training loss nan Epoch 16 iteration 0115/0188: training loss nan Epoch 16 iteration 0116/0188: training loss nan Epoch 16 iteration 0117/0188: training loss nan Epoch 16 iteration 0118/0188: training loss nan Epoch 16 iteration 0119/0188: training loss nan Epoch 16 iteration 0120/0188: training loss nan Epoch 16 iteration 0121/0188: training loss nan Epoch 16 iteration 0122/0188: training loss nan Epoch 16 iteration 0123/0188: training loss nan Epoch 16 iteration 0124/0188: training loss nan Epoch 16 iteration 0125/0188: training loss nan Epoch 16 iteration 0126/0188: training loss nan Epoch 16 iteration 0127/0188: training loss nan Epoch 16 iteration 0128/0188: training loss nan Epoch 16 iteration 0129/0188: training loss nan Epoch 16 iteration 0130/0188: training loss nan Epoch 16 iteration 0131/0188: training loss nan Epoch 16 iteration 0132/0188: training loss nan Epoch 16 iteration 0133/0188: training loss nan Epoch 16 iteration 0134/0188: training loss nan Epoch 16 iteration 0135/0188: training loss nan Epoch 16 iteration 0136/0188: training loss nan Epoch 16 iteration 0137/0188: training loss nan Epoch 16 iteration 0138/0188: training loss nan Epoch 16 iteration 0139/0188: training loss nan Epoch 16 iteration 0140/0188: training loss nan Epoch 16 iteration 0141/0188: training loss nan Epoch 16 iteration 0142/0188: training loss nan Epoch 16 iteration 0143/0188: training loss nan Epoch 16 iteration 0144/0188: training loss nan Epoch 16 iteration 0145/0188: training loss nan Epoch 16 iteration 0146/0188: training loss nan Epoch 16 iteration 0147/0188: training loss nan Epoch 16 iteration 0148/0188: training loss nan Epoch 16 iteration 0149/0188: training loss nan Epoch 16 iteration 0150/0188: training loss nan Epoch 16 iteration 0151/0188: training loss nan Epoch 16 iteration 0152/0188: training loss nan Epoch 16 iteration 0153/0188: training loss nan Epoch 16 iteration 0154/0188: training loss nan Epoch 16 iteration 0155/0188: training loss nan Epoch 16 iteration 0156/0188: training loss nan Epoch 16 iteration 0157/0188: training loss nan Epoch 16 iteration 0158/0188: training loss nan Epoch 16 iteration 0159/0188: training loss nan Epoch 16 iteration 0160/0188: training loss nan Epoch 16 iteration 0161/0188: training loss nan Epoch 16 iteration 0162/0188: training loss nan Epoch 16 iteration 0163/0188: training loss nan Epoch 16 iteration 0164/0188: training loss nan Epoch 16 iteration 0165/0188: training loss nan Epoch 16 iteration 0166/0188: training loss nan Epoch 16 iteration 0167/0188: training loss nan Epoch 16 iteration 0168/0188: training loss nan Epoch 16 iteration 0169/0188: training loss nan Epoch 16 iteration 0170/0188: training loss nan Epoch 16 iteration 0171/0188: training loss nan Epoch 16 iteration 0172/0188: training loss nan Epoch 16 iteration 0173/0188: training loss nan Epoch 16 iteration 0174/0188: training loss nan Epoch 16 iteration 0175/0188: training loss nan Epoch 16 iteration 0176/0188: training loss nan Epoch 16 iteration 0177/0188: training loss nan Epoch 16 iteration 0178/0188: training loss nan Epoch 16 iteration 0179/0188: training loss nan Epoch 16 iteration 0180/0188: training loss nan Epoch 16 iteration 0181/0188: training loss nan Epoch 16 iteration 0182/0188: training loss nan Epoch 16 iteration 0183/0188: training loss nan Epoch 16 iteration 0184/0188: training loss nan Epoch 16 iteration 0185/0188: training loss nan Epoch 16 iteration 0186/0188: training loss nan Epoch 16 validation pixAcc: 0.866, mIoU: 0.368 Epoch 17 iteration 0001/0187: training loss 0.775 Epoch 17 iteration 0002/0187: training loss 0.805 Epoch 17 iteration 0003/0187: training loss 0.808 Epoch 17 iteration 0004/0187: training loss 0.814 Epoch 17 iteration 0005/0187: training loss 0.801 Epoch 17 iteration 0006/0187: training loss 0.817 Epoch 17 iteration 0007/0187: training loss 0.833 Epoch 17 iteration 0008/0187: training loss 0.829 Epoch 17 iteration 0009/0187: training loss 0.876 Epoch 17 iteration 0010/0187: training loss 0.897 Epoch 17 iteration 0011/0187: training loss 0.904 Epoch 17 iteration 0012/0187: training loss 0.909 Epoch 17 iteration 0013/0187: training loss 0.905 Epoch 17 iteration 0014/0187: training loss 0.905 Epoch 17 iteration 0015/0187: training loss 0.903 Epoch 17 iteration 0016/0187: training loss 0.897 Epoch 17 iteration 0017/0187: training loss 0.884 Epoch 17 iteration 0018/0187: training loss 0.897 Epoch 17 iteration 0019/0187: training loss 0.899 Epoch 17 iteration 0020/0187: training loss 0.901 Epoch 17 iteration 0021/0187: training loss 0.907 Epoch 17 iteration 0022/0187: training loss 0.910 Epoch 17 iteration 0023/0187: training loss 0.912 Epoch 17 iteration 0024/0187: training loss 0.909 Epoch 17 iteration 0025/0187: training loss 0.911 Epoch 17 iteration 0026/0187: training loss 0.907 Epoch 17 iteration 0027/0187: training loss 0.908 Epoch 17 iteration 0028/0187: training loss 0.907 Epoch 17 iteration 0029/0187: training loss 0.921 Epoch 17 iteration 0030/0187: training loss 0.916 Epoch 17 iteration 0031/0187: training loss 0.912 Epoch 17 iteration 0032/0187: training loss 0.911 Epoch 17 iteration 0033/0187: training loss 0.910 Epoch 17 iteration 0034/0187: training loss 0.910 Epoch 17 iteration 0035/0187: training loss 0.910 Epoch 17 iteration 0036/0187: training loss 0.907 Epoch 17 iteration 0037/0187: training loss 0.904 Epoch 17 iteration 0038/0187: training loss 0.910 Epoch 17 iteration 0039/0187: training loss 0.908 Epoch 17 iteration 0040/0187: training loss nan Epoch 17 iteration 0041/0187: training loss nan Epoch 17 iteration 0042/0187: training loss nan Epoch 17 iteration 0043/0187: training loss nan Epoch 17 iteration 0044/0187: training loss nan Epoch 17 iteration 0045/0187: training loss nan Epoch 17 iteration 0046/0187: training loss nan Epoch 17 iteration 0047/0187: training loss nan Epoch 17 iteration 0048/0187: training loss nan Epoch 17 iteration 0049/0187: training loss nan Epoch 17 iteration 0050/0187: training loss nan Epoch 17 iteration 0051/0187: training loss nan Epoch 17 iteration 0052/0187: training loss nan Epoch 17 iteration 0053/0187: training loss nan Epoch 17 iteration 0054/0187: training loss nan Epoch 17 iteration 0055/0187: training loss nan Epoch 17 iteration 0056/0187: training loss nan Epoch 17 iteration 0057/0187: training loss nan Epoch 17 iteration 0058/0187: training loss nan Epoch 17 iteration 0059/0187: training loss nan Epoch 17 iteration 0060/0187: training loss nan Epoch 17 iteration 0061/0187: training loss nan Epoch 17 iteration 0062/0187: training loss nan Epoch 17 iteration 0063/0187: training loss nan Epoch 17 iteration 0064/0187: training loss nan Epoch 17 iteration 0065/0187: training loss nan Epoch 17 iteration 0066/0187: training loss nan Epoch 17 iteration 0067/0187: training loss nan Epoch 17 iteration 0068/0187: training loss nan Epoch 17 iteration 0069/0187: training loss nan Epoch 17 iteration 0070/0187: training loss nan Epoch 17 iteration 0071/0187: training loss nan Epoch 17 iteration 0072/0187: training loss nan Epoch 17 iteration 0073/0187: training loss nan Epoch 17 iteration 0074/0187: training loss nan Epoch 17 iteration 0075/0187: training loss nan Epoch 17 iteration 0076/0187: training loss nan Epoch 17 iteration 0077/0187: training loss nan Epoch 17 iteration 0078/0187: training loss nan Epoch 17 iteration 0079/0187: training loss nan Epoch 17 iteration 0080/0187: training loss nan Epoch 17 iteration 0081/0187: training loss nan Epoch 17 iteration 0082/0187: training loss nan Epoch 17 iteration 0083/0187: training loss nan Epoch 17 iteration 0084/0187: training loss nan Epoch 17 iteration 0085/0187: training loss nan Epoch 17 iteration 0086/0187: training loss nan Epoch 17 iteration 0087/0187: training loss nan Epoch 17 iteration 0088/0187: training loss nan Epoch 17 iteration 0089/0187: training loss nan Epoch 17 iteration 0090/0187: training loss nan Epoch 17 iteration 0091/0187: training loss nan Epoch 17 iteration 0092/0187: training loss nan Epoch 17 iteration 0093/0187: training loss nan Epoch 17 iteration 0094/0187: training loss nan Epoch 17 iteration 0095/0187: training loss nan Epoch 17 iteration 0096/0187: training loss nan Epoch 17 iteration 0097/0187: training loss nan Epoch 17 iteration 0098/0187: training loss nan Epoch 17 iteration 0099/0187: training loss nan Epoch 17 iteration 0100/0187: training loss nan Epoch 17 iteration 0101/0187: training loss nan Epoch 17 iteration 0102/0187: training loss nan Epoch 17 iteration 0103/0187: training loss nan Epoch 17 iteration 0104/0187: training loss nan Epoch 17 iteration 0105/0187: training loss nan Epoch 17 iteration 0106/0187: training loss nan Epoch 17 iteration 0107/0187: training loss nan Epoch 17 iteration 0108/0187: training loss nan Epoch 17 iteration 0109/0187: training loss nan Epoch 17 iteration 0110/0187: training loss nan Epoch 17 iteration 0111/0187: training loss nan Epoch 17 iteration 0112/0187: training loss nan Epoch 17 iteration 0113/0187: training loss nan Epoch 17 iteration 0114/0187: training loss nan Epoch 17 iteration 0115/0187: training loss nan Epoch 17 iteration 0116/0187: training loss nan Epoch 17 iteration 0117/0187: training loss nan Epoch 17 iteration 0118/0187: training loss nan Epoch 17 iteration 0119/0187: training loss nan Epoch 17 iteration 0120/0187: training loss nan Epoch 17 iteration 0121/0187: training loss nan Epoch 17 iteration 0122/0187: training loss nan Epoch 17 iteration 0123/0187: training loss nan Epoch 17 iteration 0124/0187: training loss nan Epoch 17 iteration 0125/0187: training loss nan Epoch 17 iteration 0126/0187: training loss nan Epoch 17 iteration 0127/0187: training loss nan Epoch 17 iteration 0128/0187: training loss nan Epoch 17 iteration 0129/0187: training loss nan Epoch 17 iteration 0130/0187: training loss nan Epoch 17 iteration 0131/0187: training loss nan Epoch 17 iteration 0132/0187: training loss nan Epoch 17 iteration 0133/0187: training loss nan Epoch 17 iteration 0134/0187: training loss nan Epoch 17 iteration 0135/0187: training loss nan Epoch 17 iteration 0136/0187: training loss nan Epoch 17 iteration 0137/0187: training loss nan Epoch 17 iteration 0138/0187: training loss nan Epoch 17 iteration 0139/0187: training loss nan Epoch 17 iteration 0140/0187: training loss nan Epoch 17 iteration 0141/0187: training loss nan Epoch 17 iteration 0142/0187: training loss nan Epoch 17 iteration 0143/0187: training loss nan Epoch 17 iteration 0144/0187: training loss nan Epoch 17 iteration 0145/0187: training loss nan Epoch 17 iteration 0146/0187: training loss nan Epoch 17 iteration 0147/0187: training loss nan Epoch 17 iteration 0148/0187: training loss nan Epoch 17 iteration 0149/0187: training loss nan Epoch 17 iteration 0150/0187: training loss nan Epoch 17 iteration 0151/0187: training loss nan Epoch 17 iteration 0152/0187: training loss nan Epoch 17 iteration 0153/0187: training loss nan Epoch 17 iteration 0154/0187: training loss nan Epoch 17 iteration 0155/0187: training loss nan Epoch 17 iteration 0156/0187: training loss nan Epoch 17 iteration 0157/0187: training loss nan Epoch 17 iteration 0158/0187: training loss nan Epoch 17 iteration 0159/0187: training loss nan Epoch 17 iteration 0160/0187: training loss nan Epoch 17 iteration 0161/0187: training loss nan Epoch 17 iteration 0162/0187: training loss nan Epoch 17 iteration 0163/0187: training loss nan Epoch 17 iteration 0164/0187: training loss nan Epoch 17 iteration 0165/0187: training loss nan Epoch 17 iteration 0166/0187: training loss nan Epoch 17 iteration 0167/0187: training loss nan Epoch 17 iteration 0168/0187: training loss nan Epoch 17 iteration 0169/0187: training loss nan Epoch 17 iteration 0170/0187: training loss nan Epoch 17 iteration 0171/0187: training loss nan Epoch 17 iteration 0172/0187: training loss nan Epoch 17 iteration 0173/0187: training loss nan Epoch 17 iteration 0174/0187: training loss nan Epoch 17 iteration 0175/0187: training loss nan Epoch 17 iteration 0176/0187: training loss nan Epoch 17 iteration 0177/0187: training loss nan Epoch 17 iteration 0178/0187: training loss nan Epoch 17 iteration 0179/0187: training loss nan Epoch 17 iteration 0180/0187: training loss nan Epoch 17 iteration 0181/0187: training loss nan Epoch 17 iteration 0182/0187: training loss nan Epoch 17 iteration 0183/0187: training loss nan Epoch 17 iteration 0184/0187: training loss nan Epoch 17 iteration 0185/0187: training loss nan Epoch 17 iteration 0186/0187: training loss nan Epoch 17 iteration 0187/0187: training loss nan Epoch 17 validation pixAcc: 0.866, mIoU: 0.360 Epoch 18 iteration 0001/0187: training loss 0.837 Epoch 18 iteration 0002/0187: training loss 0.809 Epoch 18 iteration 0003/0187: training loss 0.843 Epoch 18 iteration 0004/0187: training loss 0.830 Epoch 18 iteration 0005/0187: training loss 0.860 Epoch 18 iteration 0006/0187: training loss 0.870 Epoch 18 iteration 0007/0187: training loss 0.886 Epoch 18 iteration 0008/0187: training loss 0.879 Epoch 18 iteration 0009/0187: training loss 0.899 Epoch 18 iteration 0010/0187: training loss 0.908 Epoch 18 iteration 0011/0187: training loss 0.917 Epoch 18 iteration 0012/0187: training loss 0.912 Epoch 18 iteration 0013/0187: training loss 0.901 Epoch 18 iteration 0014/0187: training loss 0.887 Epoch 18 iteration 0015/0187: training loss 0.891 Epoch 18 iteration 0016/0187: training loss 0.910 Epoch 18 iteration 0017/0187: training loss 0.908 Epoch 18 iteration 0018/0187: training loss 0.918 Epoch 18 iteration 0019/0187: training loss 0.918 Epoch 18 iteration 0020/0187: training loss 0.918 Epoch 18 iteration 0021/0187: training loss 0.916 Epoch 18 iteration 0022/0187: training loss 0.915 Epoch 18 iteration 0023/0187: training loss 0.913 Epoch 18 iteration 0024/0187: training loss 0.911 Epoch 18 iteration 0025/0187: training loss 0.901 Epoch 18 iteration 0026/0187: training loss 0.894 Epoch 18 iteration 0027/0187: training loss 0.891 Epoch 18 iteration 0028/0187: training loss 0.896 Epoch 18 iteration 0029/0187: training loss nan Epoch 18 iteration 0030/0187: training loss nan Epoch 18 iteration 0031/0187: training loss nan Epoch 18 iteration 0032/0187: training loss nan Epoch 18 iteration 0033/0187: training loss nan Epoch 18 iteration 0034/0187: training loss nan Epoch 18 iteration 0035/0187: training loss nan Epoch 18 iteration 0036/0187: training loss nan Epoch 18 iteration 0037/0187: training loss nan Epoch 18 iteration 0038/0187: training loss nan Epoch 18 iteration 0039/0187: training loss nan Epoch 18 iteration 0040/0187: training loss nan Epoch 18 iteration 0041/0187: training loss nan Epoch 18 iteration 0042/0187: training loss nan Epoch 18 iteration 0043/0187: training loss nan Epoch 18 iteration 0044/0187: training loss nan Epoch 18 iteration 0045/0187: training loss nan Epoch 18 iteration 0046/0187: training loss nan Epoch 18 iteration 0047/0187: training loss nan Epoch 18 iteration 0048/0187: training loss nan Epoch 18 iteration 0049/0187: training loss nan Epoch 18 iteration 0050/0187: training loss nan Epoch 18 iteration 0051/0187: training loss nan Epoch 18 iteration 0052/0187: training loss nan Epoch 18 iteration 0053/0187: training loss nan Epoch 18 iteration 0054/0187: training loss nan Epoch 18 iteration 0055/0187: training loss nan Epoch 18 iteration 0056/0187: training loss nan Epoch 18 iteration 0057/0187: training loss nan Epoch 18 iteration 0058/0187: training loss nan Epoch 18 iteration 0059/0187: training loss nan Epoch 18 iteration 0060/0187: training loss nan Epoch 18 iteration 0061/0187: training loss nan Epoch 18 iteration 0062/0187: training loss nan Epoch 18 iteration 0063/0187: training loss nan Epoch 18 iteration 0064/0187: training loss nan Epoch 18 iteration 0065/0187: training loss nan Epoch 18 iteration 0066/0187: training loss nan Epoch 18 iteration 0067/0187: training loss nan Epoch 18 iteration 0068/0187: training loss nan Epoch 18 iteration 0069/0187: training loss nan Epoch 18 iteration 0070/0187: training loss nan Epoch 18 iteration 0071/0187: training loss nan Epoch 18 iteration 0072/0187: training loss nan Epoch 18 iteration 0073/0187: training loss nan Epoch 18 iteration 0074/0187: training loss nan Epoch 18 iteration 0075/0187: training loss nan Epoch 18 iteration 0076/0187: training loss nan Epoch 18 iteration 0077/0187: training loss nan Epoch 18 iteration 0078/0187: training loss nan Epoch 18 iteration 0079/0187: training loss nan Epoch 18 iteration 0080/0187: training loss nan Epoch 18 iteration 0081/0187: training loss nan Epoch 18 iteration 0082/0187: training loss nan Epoch 18 iteration 0083/0187: training loss nan Epoch 18 iteration 0084/0187: training loss nan Epoch 18 iteration 0085/0187: training loss nan Epoch 18 iteration 0086/0187: training loss nan Epoch 18 iteration 0087/0187: training loss nan Epoch 18 iteration 0088/0187: training loss nan Epoch 18 iteration 0089/0187: training loss nan Epoch 18 iteration 0090/0187: training loss nan Epoch 18 iteration 0091/0188: training loss nan Epoch 18 iteration 0092/0188: training loss nan Epoch 18 iteration 0093/0188: training loss nan Epoch 18 iteration 0094/0188: training loss nan Epoch 18 iteration 0095/0188: training loss nan Epoch 18 iteration 0096/0188: training loss nan Epoch 18 iteration 0097/0188: training loss nan Epoch 18 iteration 0098/0188: training loss nan Epoch 18 iteration 0099/0188: training loss nan Epoch 18 iteration 0100/0188: training loss nan Epoch 18 iteration 0101/0188: training loss nan Epoch 18 iteration 0102/0188: training loss nan Epoch 18 iteration 0103/0188: training loss nan Epoch 18 iteration 0104/0188: training loss nan Epoch 18 iteration 0105/0188: training loss nan Epoch 18 iteration 0106/0188: training loss nan Epoch 18 iteration 0107/0188: training loss nan Epoch 18 iteration 0108/0188: training loss nan Epoch 18 iteration 0109/0188: training loss nan Epoch 18 iteration 0110/0188: training loss nan Epoch 18 iteration 0111/0188: training loss nan Epoch 18 iteration 0112/0188: training loss nan Epoch 18 iteration 0113/0188: training loss nan Epoch 18 iteration 0114/0188: training loss nan Epoch 18 iteration 0115/0188: training loss nan Epoch 18 iteration 0116/0188: training loss nan Epoch 18 iteration 0117/0188: training loss nan Epoch 18 iteration 0118/0188: training loss nan Epoch 18 iteration 0119/0188: training loss nan Epoch 18 iteration 0120/0188: training loss nan Epoch 18 iteration 0121/0188: training loss nan Epoch 18 iteration 0122/0188: training loss nan Epoch 18 iteration 0123/0188: training loss nan Epoch 18 iteration 0124/0188: training loss nan Epoch 18 iteration 0125/0188: training loss nan Epoch 18 iteration 0126/0188: training loss nan Epoch 18 iteration 0127/0188: training loss nan Epoch 18 iteration 0128/0188: training loss nan Epoch 18 iteration 0129/0188: training loss nan Epoch 18 iteration 0130/0188: training loss nan Epoch 18 iteration 0131/0188: training loss nan Epoch 18 iteration 0132/0188: training loss nan Epoch 18 iteration 0133/0188: training loss nan Epoch 18 iteration 0134/0188: training loss nan Epoch 18 iteration 0135/0188: training loss nan Epoch 18 iteration 0136/0188: training loss nan Epoch 18 iteration 0137/0188: training loss nan Epoch 18 iteration 0138/0188: training loss nan Epoch 18 iteration 0139/0188: training loss nan Epoch 18 iteration 0140/0188: training loss nan Epoch 18 iteration 0141/0188: training loss nan Epoch 18 iteration 0142/0188: training loss nan Epoch 18 iteration 0143/0188: training loss nan Epoch 18 iteration 0144/0188: training loss nan Epoch 18 iteration 0145/0188: training loss nan Epoch 18 iteration 0146/0188: training loss nan Epoch 18 iteration 0147/0188: training loss nan Epoch 18 iteration 0148/0188: training loss nan Epoch 18 iteration 0149/0188: training loss nan Epoch 18 iteration 0150/0188: training loss nan Epoch 18 iteration 0151/0188: training loss nan Epoch 18 iteration 0152/0188: training loss nan Epoch 18 iteration 0153/0188: training loss nan Epoch 18 iteration 0154/0188: training loss nan Epoch 18 iteration 0155/0188: training loss nan Epoch 18 iteration 0156/0188: training loss nan Epoch 18 iteration 0157/0188: training loss nan Epoch 18 iteration 0158/0188: training loss nan Epoch 18 iteration 0159/0188: training loss nan Epoch 18 iteration 0160/0188: training loss nan Epoch 18 iteration 0161/0188: training loss nan Epoch 18 iteration 0162/0188: training loss nan Epoch 18 iteration 0163/0188: training loss nan Epoch 18 iteration 0164/0188: training loss nan Epoch 18 iteration 0165/0188: training loss nan Epoch 18 iteration 0166/0188: training loss nan Epoch 18 iteration 0167/0188: training loss nan Epoch 18 iteration 0168/0188: training loss nan Epoch 18 iteration 0169/0188: training loss nan Epoch 18 iteration 0170/0188: training loss nan Epoch 18 iteration 0171/0188: training loss nan Epoch 18 iteration 0172/0188: training loss nan Epoch 18 iteration 0173/0188: training loss nan Epoch 18 iteration 0174/0188: training loss nan Epoch 18 iteration 0175/0188: training loss nan Epoch 18 iteration 0176/0188: training loss nan Epoch 18 iteration 0177/0188: training loss nan Epoch 18 iteration 0178/0188: training loss nan Epoch 18 iteration 0179/0188: training loss nan Epoch 18 iteration 0180/0188: training loss nan Epoch 18 iteration 0181/0188: training loss nan Epoch 18 iteration 0182/0188: training loss nan Epoch 18 iteration 0183/0188: training loss nan Epoch 18 iteration 0184/0188: training loss nan Epoch 18 iteration 0185/0188: training loss nan Epoch 18 iteration 0186/0188: training loss nan Epoch 18 validation pixAcc: 0.865, mIoU: 0.363 Epoch 19 iteration 0001/0187: training loss 1.041 Epoch 19 iteration 0002/0187: training loss 1.001 Epoch 19 iteration 0003/0187: training loss 0.961 Epoch 19 iteration 0004/0187: training loss 0.919 Epoch 19 iteration 0005/0187: training loss 0.908 Epoch 19 iteration 0006/0187: training loss 0.883 Epoch 19 iteration 0007/0187: training loss 0.878 Epoch 19 iteration 0008/0187: training loss 0.850 Epoch 19 iteration 0009/0187: training loss 0.894 Epoch 19 iteration 0010/0187: training loss 0.912 Epoch 19 iteration 0011/0187: training loss 0.911 Epoch 19 iteration 0012/0187: training loss 0.903 Epoch 19 iteration 0013/0187: training loss 0.914 Epoch 19 iteration 0014/0187: training loss 0.909 Epoch 19 iteration 0015/0187: training loss 0.912 Epoch 19 iteration 0016/0187: training loss 0.926 Epoch 19 iteration 0017/0187: training loss 0.936 Epoch 19 iteration 0018/0187: training loss 0.937 Epoch 19 iteration 0019/0187: training loss 0.942 Epoch 19 iteration 0020/0187: training loss 0.937 Epoch 19 iteration 0021/0187: training loss 0.943 Epoch 19 iteration 0022/0187: training loss 0.957 Epoch 19 iteration 0023/0187: training loss 0.959 Epoch 19 iteration 0024/0187: training loss 0.955 Epoch 19 iteration 0025/0187: training loss 0.954 Epoch 19 iteration 0026/0187: training loss 0.952 Epoch 19 iteration 0027/0187: training loss 0.958 Epoch 19 iteration 0028/0187: training loss 0.956 Epoch 19 iteration 0029/0187: training loss 0.952 Epoch 19 iteration 0030/0187: training loss 0.949 Epoch 19 iteration 0031/0187: training loss 0.945 Epoch 19 iteration 0032/0187: training loss 0.944 Epoch 19 iteration 0033/0187: training loss nan Epoch 19 iteration 0034/0187: training loss nan Epoch 19 iteration 0035/0187: training loss nan Epoch 19 iteration 0036/0187: training loss nan Epoch 19 iteration 0037/0187: training loss nan Epoch 19 iteration 0038/0187: training loss nan Epoch 19 iteration 0039/0187: training loss nan Epoch 19 iteration 0040/0187: training loss nan Epoch 19 iteration 0041/0187: training loss nan Epoch 19 iteration 0042/0187: training loss nan Epoch 19 iteration 0043/0187: training loss nan Epoch 19 iteration 0044/0187: training loss nan Epoch 19 iteration 0045/0187: training loss nan Epoch 19 iteration 0046/0187: training loss nan Epoch 19 iteration 0047/0187: training loss nan Epoch 19 iteration 0048/0187: training loss nan Epoch 19 iteration 0049/0187: training loss nan Epoch 19 iteration 0050/0187: training loss nan Epoch 19 iteration 0051/0187: training loss nan Epoch 19 iteration 0052/0187: training loss nan Epoch 19 iteration 0053/0187: training loss nan Epoch 19 iteration 0054/0187: training loss nan Epoch 19 iteration 0055/0187: training loss nan Epoch 19 iteration 0056/0187: training loss nan Epoch 19 iteration 0057/0187: training loss nan Epoch 19 iteration 0058/0187: training loss nan Epoch 19 iteration 0059/0187: training loss nan Epoch 19 iteration 0060/0187: training loss nan Epoch 19 iteration 0061/0187: training loss nan Epoch 19 iteration 0062/0187: training loss nan Epoch 19 iteration 0063/0187: training loss nan Epoch 19 iteration 0064/0187: training loss nan Epoch 19 iteration 0065/0187: training loss nan Epoch 19 iteration 0066/0187: training loss nan Epoch 19 iteration 0067/0187: training loss nan Epoch 19 iteration 0068/0187: training loss nan Epoch 19 iteration 0069/0187: training loss nan Epoch 19 iteration 0070/0187: training loss nan Epoch 19 iteration 0071/0187: training loss nan Epoch 19 iteration 0072/0187: training loss nan Epoch 19 iteration 0073/0187: training loss nan Epoch 19 iteration 0074/0187: training loss nan Epoch 19 iteration 0075/0187: training loss nan Epoch 19 iteration 0076/0187: training loss nan Epoch 19 iteration 0077/0187: training loss nan Epoch 19 iteration 0078/0187: training loss nan Epoch 19 iteration 0079/0187: training loss nan Epoch 19 iteration 0080/0187: training loss nan Epoch 19 iteration 0081/0187: training loss nan Epoch 19 iteration 0082/0187: training loss nan Epoch 19 iteration 0083/0187: training loss nan Epoch 19 iteration 0084/0187: training loss nan Epoch 19 iteration 0085/0187: training loss nan Epoch 19 iteration 0086/0187: training loss nan Epoch 19 iteration 0087/0187: training loss nan Epoch 19 iteration 0088/0187: training loss nan Epoch 19 iteration 0089/0187: training loss nan Epoch 19 iteration 0090/0187: training loss nan Epoch 19 iteration 0091/0187: training loss nan Epoch 19 iteration 0092/0187: training loss nan Epoch 19 iteration 0093/0187: training loss nan Epoch 19 iteration 0094/0187: training loss nan Epoch 19 iteration 0095/0187: training loss nan Epoch 19 iteration 0096/0187: training loss nan Epoch 19 iteration 0097/0187: training loss nan Epoch 19 iteration 0098/0187: training loss nan Epoch 19 iteration 0099/0187: training loss nan Epoch 19 iteration 0100/0187: training loss nan Epoch 19 iteration 0101/0187: training loss nan Epoch 19 iteration 0102/0187: training loss nan Epoch 19 iteration 0103/0187: training loss nan Epoch 19 iteration 0104/0187: training loss nan Epoch 19 iteration 0105/0187: training loss nan Epoch 19 iteration 0106/0187: training loss nan Epoch 19 iteration 0107/0187: training loss nan Epoch 19 iteration 0108/0187: training loss nan Epoch 19 iteration 0109/0187: training loss nan Epoch 19 iteration 0110/0187: training loss nan Epoch 19 iteration 0111/0187: training loss nan Epoch 19 iteration 0112/0187: training loss nan Epoch 19 iteration 0113/0187: training loss nan Epoch 19 iteration 0114/0187: training loss nan Epoch 19 iteration 0115/0187: training loss nan Epoch 19 iteration 0116/0187: training loss nan Epoch 19 iteration 0117/0187: training loss nan Epoch 19 iteration 0118/0187: training loss nan Epoch 19 iteration 0119/0187: training loss nan Epoch 19 iteration 0120/0187: training loss nan Epoch 19 iteration 0121/0187: training loss nan Epoch 19 iteration 0122/0187: training loss nan Epoch 19 iteration 0123/0187: training loss nan Epoch 19 iteration 0124/0187: training loss nan Epoch 19 iteration 0125/0187: training loss nan Epoch 19 iteration 0126/0187: training loss nan Epoch 19 iteration 0127/0187: training loss nan Epoch 19 iteration 0128/0187: training loss nan Epoch 19 iteration 0129/0187: training loss nan Epoch 19 iteration 0130/0187: training loss nan Epoch 19 iteration 0131/0187: training loss nan Epoch 19 iteration 0132/0187: training loss nan Epoch 19 iteration 0133/0187: training loss nan Epoch 19 iteration 0134/0187: training loss nan Epoch 19 iteration 0135/0187: training loss nan Epoch 19 iteration 0136/0187: training loss nan Epoch 19 iteration 0137/0187: training loss nan Epoch 19 iteration 0138/0187: training loss nan Epoch 19 iteration 0139/0187: training loss nan Epoch 19 iteration 0140/0187: training loss nan Epoch 19 iteration 0141/0187: training loss nan Epoch 19 iteration 0142/0187: training loss nan Epoch 19 iteration 0143/0187: training loss nan Epoch 19 iteration 0144/0187: training loss nan Epoch 19 iteration 0145/0187: training loss nan Epoch 19 iteration 0146/0187: training loss nan Epoch 19 iteration 0147/0187: training loss nan Epoch 19 iteration 0148/0187: training loss nan Epoch 19 iteration 0149/0187: training loss nan Epoch 19 iteration 0150/0187: training loss nan Epoch 19 iteration 0151/0187: training loss nan Epoch 19 iteration 0152/0187: training loss nan Epoch 19 iteration 0153/0187: training loss nan Epoch 19 iteration 0154/0187: training loss nan Epoch 19 iteration 0155/0187: training loss nan Epoch 19 iteration 0156/0187: training loss nan Epoch 19 iteration 0157/0187: training loss nan Epoch 19 iteration 0158/0187: training loss nan Epoch 19 iteration 0159/0187: training loss nan Epoch 19 iteration 0160/0187: training loss nan Epoch 19 iteration 0161/0187: training loss nan Epoch 19 iteration 0162/0187: training loss nan Epoch 19 iteration 0163/0187: training loss nan Epoch 19 iteration 0164/0187: training loss nan Epoch 19 iteration 0165/0187: training loss nan Epoch 19 iteration 0166/0187: training loss nan Epoch 19 iteration 0167/0187: training loss nan Epoch 19 iteration 0168/0187: training loss nan Epoch 19 iteration 0169/0187: training loss nan Epoch 19 iteration 0170/0187: training loss nan Epoch 19 iteration 0171/0187: training loss nan Epoch 19 iteration 0172/0187: training loss nan Epoch 19 iteration 0173/0187: training loss nan Epoch 19 iteration 0174/0187: training loss nan Epoch 19 iteration 0175/0187: training loss nan Epoch 19 iteration 0176/0187: training loss nan Epoch 19 iteration 0177/0187: training loss nan Epoch 19 iteration 0178/0187: training loss nan Epoch 19 iteration 0179/0187: training loss nan Epoch 19 iteration 0180/0187: training loss nan Epoch 19 iteration 0181/0187: training loss nan Epoch 19 iteration 0182/0187: training loss nan Epoch 19 iteration 0183/0187: training loss nan Epoch 19 iteration 0184/0187: training loss nan Epoch 19 iteration 0185/0187: training loss nan Epoch 19 iteration 0186/0187: training loss nan Epoch 19 iteration 0187/0187: training loss nan Epoch 19 validation pixAcc: 0.868, mIoU: 0.367 Epoch 20 iteration 0001/0187: training loss 0.899 Epoch 20 iteration 0002/0187: training loss 0.937 Epoch 20 iteration 0003/0187: training loss 0.912 Epoch 20 iteration 0004/0187: training loss 0.865 Epoch 20 iteration 0005/0187: training loss 0.894 Epoch 20 iteration 0006/0187: training loss 0.864 Epoch 20 iteration 0007/0187: training loss 0.845 Epoch 20 iteration 0008/0187: training loss 0.848 Epoch 20 iteration 0009/0187: training loss 0.853 Epoch 20 iteration 0010/0187: training loss 0.834 Epoch 20 iteration 0011/0187: training loss 0.817 Epoch 20 iteration 0012/0187: training loss 0.839 Epoch 20 iteration 0013/0187: training loss 0.856 Epoch 20 iteration 0014/0187: training loss 0.846 Epoch 20 iteration 0015/0187: training loss 0.849 Epoch 20 iteration 0016/0187: training loss 0.843 Epoch 20 iteration 0017/0187: training loss 0.835 Epoch 20 iteration 0018/0187: training loss 0.843 Epoch 20 iteration 0019/0187: training loss 0.846 Epoch 20 iteration 0020/0187: training loss 0.833 Epoch 20 iteration 0021/0187: training loss 0.837 Epoch 20 iteration 0022/0187: training loss 0.839 Epoch 20 iteration 0023/0187: training loss 0.841 Epoch 20 iteration 0024/0187: training loss 0.835 Epoch 20 iteration 0025/0187: training loss 0.831 Epoch 20 iteration 0026/0187: training loss 0.827 Epoch 20 iteration 0027/0187: training loss 0.834 Epoch 20 iteration 0028/0187: training loss 0.833 Epoch 20 iteration 0029/0187: training loss 0.834 Epoch 20 iteration 0030/0187: training loss 0.836 Epoch 20 iteration 0031/0187: training loss 0.835 Epoch 20 iteration 0032/0187: training loss 0.841 Epoch 20 iteration 0033/0187: training loss 0.846 Epoch 20 iteration 0034/0187: training loss 0.844 Epoch 20 iteration 0035/0187: training loss 0.843 Epoch 20 iteration 0036/0187: training loss 0.845 Epoch 20 iteration 0037/0187: training loss 0.848 Epoch 20 iteration 0038/0187: training loss 0.858 Epoch 20 iteration 0039/0187: training loss 0.881 Epoch 20 iteration 0040/0187: training loss 0.882 Epoch 20 iteration 0041/0187: training loss 0.884 Epoch 20 iteration 0042/0187: training loss 0.887 Epoch 20 iteration 0043/0187: training loss 0.886 Epoch 20 iteration 0044/0187: training loss 0.885 Epoch 20 iteration 0045/0187: training loss 0.881 Epoch 20 iteration 0046/0187: training loss 0.879 Epoch 20 iteration 0047/0187: training loss 0.877 Epoch 20 iteration 0048/0187: training loss 0.876 Epoch 20 iteration 0049/0187: training loss 0.877 Epoch 20 iteration 0050/0187: training loss 0.874 Epoch 20 iteration 0051/0187: training loss 0.874 Epoch 20 iteration 0052/0187: training loss 0.873 Epoch 20 iteration 0053/0187: training loss 0.876 Epoch 20 iteration 0054/0187: training loss 0.881 Epoch 20 iteration 0055/0187: training loss 0.881 Epoch 20 iteration 0056/0187: training loss 0.878 Epoch 20 iteration 0057/0187: training loss 0.880 Epoch 20 iteration 0058/0187: training loss 0.879 Epoch 20 iteration 0059/0187: training loss 0.884 Epoch 20 iteration 0060/0187: training loss 0.883 Epoch 20 iteration 0061/0187: training loss 0.888 Epoch 20 iteration 0062/0187: training loss 0.893 Epoch 20 iteration 0063/0187: training loss 0.892 Epoch 20 iteration 0064/0187: training loss 0.890 Epoch 20 iteration 0065/0187: training loss 0.892 Epoch 20 iteration 0066/0187: training loss 0.889 Epoch 20 iteration 0067/0187: training loss 0.886 Epoch 20 iteration 0068/0187: training loss 0.886 Epoch 20 iteration 0069/0187: training loss 0.885 Epoch 20 iteration 0070/0187: training loss 0.886 Epoch 20 iteration 0071/0187: training loss 0.882 Epoch 20 iteration 0072/0187: training loss 0.885 Epoch 20 iteration 0073/0187: training loss 0.885 Epoch 20 iteration 0074/0187: training loss 0.884 Epoch 20 iteration 0075/0187: training loss 0.885 Epoch 20 iteration 0076/0187: training loss 0.885 Epoch 20 iteration 0077/0187: training loss 0.888 Epoch 20 iteration 0078/0187: training loss 0.887 Epoch 20 iteration 0079/0187: training loss 0.887 Epoch 20 iteration 0080/0187: training loss 0.884 Epoch 20 iteration 0081/0187: training loss 0.883 Epoch 20 iteration 0082/0187: training loss 0.885 Epoch 20 iteration 0083/0187: training loss 0.888 Epoch 20 iteration 0084/0187: training loss 0.889 Epoch 20 iteration 0085/0187: training loss 0.887 Epoch 20 iteration 0086/0187: training loss 0.887 Epoch 20 iteration 0087/0187: training loss 0.888 Epoch 20 iteration 0088/0187: training loss 0.888 Epoch 20 iteration 0089/0187: training loss 0.887 Epoch 20 iteration 0090/0187: training loss 0.886 Epoch 20 iteration 0091/0188: training loss 0.888 Epoch 20 iteration 0092/0188: training loss 0.890 Epoch 20 iteration 0093/0188: training loss 0.890 Epoch 20 iteration 0094/0188: training loss 0.888 Epoch 20 iteration 0095/0188: training loss 0.887 Epoch 20 iteration 0096/0188: training loss 0.890 Epoch 20 iteration 0097/0188: training loss 0.889 Epoch 20 iteration 0098/0188: training loss 0.888 Epoch 20 iteration 0099/0188: training loss 0.887 Epoch 20 iteration 0100/0188: training loss 0.887 Epoch 20 iteration 0101/0188: training loss 0.887 Epoch 20 iteration 0102/0188: training loss 0.888 Epoch 20 iteration 0103/0188: training loss 0.889 Epoch 20 iteration 0104/0188: training loss 0.889 Epoch 20 iteration 0105/0188: training loss 0.888 Epoch 20 iteration 0106/0188: training loss 0.888 Epoch 20 iteration 0107/0188: training loss 0.887 Epoch 20 iteration 0108/0188: training loss 0.888 Epoch 20 iteration 0109/0188: training loss 0.887 Epoch 20 iteration 0110/0188: training loss 0.889 Epoch 20 iteration 0111/0188: training loss 0.888 Epoch 20 iteration 0112/0188: training loss 0.887 Epoch 20 iteration 0113/0188: training loss 0.888 Epoch 20 iteration 0114/0188: training loss 0.887 Epoch 20 iteration 0115/0188: training loss 0.886 Epoch 20 iteration 0116/0188: training loss 0.888 Epoch 20 iteration 0117/0188: training loss 0.890 Epoch 20 iteration 0118/0188: training loss 0.888 Epoch 20 iteration 0119/0188: training loss 0.889 Epoch 20 iteration 0120/0188: training loss 0.889 Epoch 20 iteration 0121/0188: training loss 0.889 Epoch 20 iteration 0122/0188: training loss 0.887 Epoch 20 iteration 0123/0188: training loss 0.886 Epoch 20 iteration 0124/0188: training loss 0.885 Epoch 20 iteration 0125/0188: training loss 0.884 Epoch 20 iteration 0126/0188: training loss 0.885 Epoch 20 iteration 0127/0188: training loss 0.885 Epoch 20 iteration 0128/0188: training loss 0.885 Epoch 20 iteration 0129/0188: training loss 0.884 Epoch 20 iteration 0130/0188: training loss 0.884 Epoch 20 iteration 0131/0188: training loss 0.885 Epoch 20 iteration 0132/0188: training loss 0.884 Epoch 20 iteration 0133/0188: training loss 0.884 Epoch 20 iteration 0134/0188: training loss 0.883 Epoch 20 iteration 0135/0188: training loss 0.884 Epoch 20 iteration 0136/0188: training loss 0.883 Epoch 20 iteration 0137/0188: training loss 0.883 Epoch 20 iteration 0138/0188: training loss 0.881 Epoch 20 iteration 0139/0188: training loss 0.882 Epoch 20 iteration 0140/0188: training loss 0.882 Epoch 20 iteration 0141/0188: training loss 0.882 Epoch 20 iteration 0142/0188: training loss 0.881 Epoch 20 iteration 0143/0188: training loss 0.879 Epoch 20 iteration 0144/0188: training loss 0.879 Epoch 20 iteration 0145/0188: training loss 0.878 Epoch 20 iteration 0146/0188: training loss 0.879 Epoch 20 iteration 0147/0188: training loss 0.878 Epoch 20 iteration 0148/0188: training loss 0.878 Epoch 20 iteration 0149/0188: training loss 0.878 Epoch 20 iteration 0150/0188: training loss 0.879 Epoch 20 iteration 0151/0188: training loss 0.879 Epoch 20 iteration 0152/0188: training loss nan Epoch 20 iteration 0153/0188: training loss nan Epoch 20 iteration 0154/0188: training loss nan Epoch 20 iteration 0155/0188: training loss nan Epoch 20 iteration 0156/0188: training loss nan Epoch 20 iteration 0157/0188: training loss nan Epoch 20 iteration 0158/0188: training loss nan Epoch 20 iteration 0159/0188: training loss nan Epoch 20 iteration 0160/0188: training loss nan Epoch 20 iteration 0161/0188: training loss nan Epoch 20 iteration 0162/0188: training loss nan Epoch 20 iteration 0163/0188: training loss nan Epoch 20 iteration 0164/0188: training loss nan Epoch 20 iteration 0165/0188: training loss nan Epoch 20 iteration 0166/0188: training loss nan Epoch 20 iteration 0167/0188: training loss nan Epoch 20 iteration 0168/0188: training loss nan Epoch 20 iteration 0169/0188: training loss nan Epoch 20 iteration 0170/0188: training loss nan Epoch 20 iteration 0171/0188: training loss nan Epoch 20 iteration 0172/0188: training loss nan Epoch 20 iteration 0173/0188: training loss nan Epoch 20 iteration 0174/0188: training loss nan Epoch 20 iteration 0175/0188: training loss nan Epoch 20 iteration 0176/0188: training loss nan Epoch 20 iteration 0177/0188: training loss nan Epoch 20 iteration 0178/0188: training loss nan Epoch 20 iteration 0179/0188: training loss nan Epoch 20 iteration 0180/0188: training loss nan Epoch 20 iteration 0181/0188: training loss nan Epoch 20 iteration 0182/0188: training loss nan Epoch 20 iteration 0183/0188: training loss nan Epoch 20 iteration 0184/0188: training loss nan Epoch 20 iteration 0185/0188: training loss nan Epoch 20 iteration 0186/0188: training loss nan Epoch 20 validation pixAcc: 0.867, mIoU: 0.366 Epoch 21 iteration 0001/0187: training loss 0.817 Epoch 21 iteration 0002/0187: training loss 0.865 Epoch 21 iteration 0003/0187: training loss 0.815 Epoch 21 iteration 0004/0187: training loss 0.890 Epoch 21 iteration 0005/0187: training loss 0.862 Epoch 21 iteration 0006/0187: training loss 0.851 Epoch 21 iteration 0007/0187: training loss 0.856 Epoch 21 iteration 0008/0187: training loss 0.851 Epoch 21 iteration 0009/0187: training loss 0.843 Epoch 21 iteration 0010/0187: training loss 0.844 Epoch 21 iteration 0011/0187: training loss 0.826 Epoch 21 iteration 0012/0187: training loss 0.828 Epoch 21 iteration 0013/0187: training loss 0.832 Epoch 21 iteration 0014/0187: training loss 0.825 Epoch 21 iteration 0015/0187: training loss 0.831 Epoch 21 iteration 0016/0187: training loss 0.839 Epoch 21 iteration 0017/0187: training loss 0.836 Epoch 21 iteration 0018/0187: training loss 0.834 Epoch 21 iteration 0019/0187: training loss 0.841 Epoch 21 iteration 0020/0187: training loss 0.841 Epoch 21 iteration 0021/0187: training loss 0.838 Epoch 21 iteration 0022/0187: training loss 0.847 Epoch 21 iteration 0023/0187: training loss 0.845 Epoch 21 iteration 0024/0187: training loss 0.845 Epoch 21 iteration 0025/0187: training loss 0.847 Epoch 21 iteration 0026/0187: training loss 0.849 Epoch 21 iteration 0027/0187: training loss 0.854 Epoch 21 iteration 0028/0187: training loss 0.854 Epoch 21 iteration 0029/0187: training loss 0.847 Epoch 21 iteration 0030/0187: training loss 0.844 Epoch 21 iteration 0031/0187: training loss 0.847 Epoch 21 iteration 0032/0187: training loss 0.849 Epoch 21 iteration 0033/0187: training loss 0.852 Epoch 21 iteration 0034/0187: training loss 0.853 Epoch 21 iteration 0035/0187: training loss 0.861 Epoch 21 iteration 0036/0187: training loss 0.862 Epoch 21 iteration 0037/0187: training loss 0.860 Epoch 21 iteration 0038/0187: training loss 0.860 Epoch 21 iteration 0039/0187: training loss 0.859 Epoch 21 iteration 0040/0187: training loss 0.861 Epoch 21 iteration 0041/0187: training loss 0.858 Epoch 21 iteration 0042/0187: training loss 0.858 Epoch 21 iteration 0043/0187: training loss 0.859 Epoch 21 iteration 0044/0187: training loss 0.857 Epoch 21 iteration 0045/0187: training loss 0.851 Epoch 21 iteration 0046/0187: training loss 0.852 Epoch 21 iteration 0047/0187: training loss 0.852 Epoch 21 iteration 0048/0187: training loss 0.851 Epoch 21 iteration 0049/0187: training loss 0.855 Epoch 21 iteration 0050/0187: training loss 0.854 Epoch 21 iteration 0051/0187: training loss 0.858 Epoch 21 iteration 0052/0187: training loss 0.857 Epoch 21 iteration 0053/0187: training loss 0.857 Epoch 21 iteration 0054/0187: training loss 0.855 Epoch 21 iteration 0055/0187: training loss 0.853 Epoch 21 iteration 0056/0187: training loss 0.857 Epoch 21 iteration 0057/0187: training loss 0.861 Epoch 21 iteration 0058/0187: training loss 0.860 Epoch 21 iteration 0059/0187: training loss 0.861 Epoch 21 iteration 0060/0187: training loss 0.861 Epoch 21 iteration 0061/0187: training loss 0.860 Epoch 21 iteration 0062/0187: training loss 0.860 Epoch 21 iteration 0063/0187: training loss 0.857 Epoch 21 iteration 0064/0187: training loss 0.858 Epoch 21 iteration 0065/0187: training loss 0.857 Epoch 21 iteration 0066/0187: training loss 0.858 Epoch 21 iteration 0067/0187: training loss 0.857 Epoch 21 iteration 0068/0187: training loss 0.857 Epoch 21 iteration 0069/0187: training loss 0.857 Epoch 21 iteration 0070/0187: training loss 0.858 Epoch 21 iteration 0071/0187: training loss 0.859 Epoch 21 iteration 0072/0187: training loss 0.859 Epoch 21 iteration 0073/0187: training loss 0.860 Epoch 21 iteration 0074/0187: training loss 0.863 Epoch 21 iteration 0075/0187: training loss 0.864 Epoch 21 iteration 0076/0187: training loss 0.866 Epoch 21 iteration 0077/0187: training loss 0.865 Epoch 21 iteration 0078/0187: training loss 0.865 Epoch 21 iteration 0079/0187: training loss 0.865 Epoch 21 iteration 0080/0187: training loss 0.862 Epoch 21 iteration 0081/0187: training loss 0.861 Epoch 21 iteration 0082/0187: training loss 0.861 Epoch 21 iteration 0083/0187: training loss 0.860 Epoch 21 iteration 0084/0187: training loss 0.861 Epoch 21 iteration 0085/0187: training loss 0.860 Epoch 21 iteration 0086/0187: training loss nan Epoch 21 iteration 0087/0187: training loss nan Epoch 21 iteration 0088/0187: training loss nan Epoch 21 iteration 0089/0187: training loss nan Epoch 21 iteration 0090/0187: training loss nan Epoch 21 iteration 0091/0187: training loss nan Epoch 21 iteration 0092/0187: training loss nan Epoch 21 iteration 0093/0187: training loss nan Epoch 21 iteration 0094/0187: training loss nan Epoch 21 iteration 0095/0187: training loss nan Epoch 21 iteration 0096/0187: training loss nan Epoch 21 iteration 0097/0187: training loss nan Epoch 21 iteration 0098/0187: training loss nan Epoch 21 iteration 0099/0187: training loss nan Epoch 21 iteration 0100/0187: training loss nan Epoch 21 iteration 0101/0187: training loss nan Epoch 21 iteration 0102/0187: training loss nan Epoch 21 iteration 0103/0187: training loss nan Epoch 21 iteration 0104/0187: training loss nan Epoch 21 iteration 0105/0187: training loss nan Epoch 21 iteration 0106/0187: training loss nan Epoch 21 iteration 0107/0187: training loss nan Epoch 21 iteration 0108/0187: training loss nan Epoch 21 iteration 0109/0187: training loss nan Epoch 21 iteration 0110/0187: training loss nan Epoch 21 iteration 0111/0187: training loss nan Epoch 21 iteration 0112/0187: training loss nan Epoch 21 iteration 0113/0187: training loss nan Epoch 21 iteration 0114/0187: training loss nan Epoch 21 iteration 0115/0187: training loss nan Epoch 21 iteration 0116/0187: training loss nan Epoch 21 iteration 0117/0187: training loss nan Epoch 21 iteration 0118/0187: training loss nan Epoch 21 iteration 0119/0187: training loss nan Epoch 21 iteration 0120/0187: training loss nan Epoch 21 iteration 0121/0187: training loss nan Epoch 21 iteration 0122/0187: training loss nan Epoch 21 iteration 0123/0187: training loss nan Epoch 21 iteration 0124/0187: training loss nan Epoch 21 iteration 0125/0187: training loss nan Epoch 21 iteration 0126/0187: training loss nan Epoch 21 iteration 0127/0187: training loss nan Epoch 21 iteration 0128/0187: training loss nan Epoch 21 iteration 0129/0187: training loss nan Epoch 21 iteration 0130/0187: training loss nan Epoch 21 iteration 0131/0187: training loss nan Epoch 21 iteration 0132/0187: training loss nan Epoch 21 iteration 0133/0187: training loss nan Epoch 21 iteration 0134/0187: training loss nan Epoch 21 iteration 0135/0187: training loss nan Epoch 21 iteration 0136/0187: training loss nan Epoch 21 iteration 0137/0187: training loss nan Epoch 21 iteration 0138/0187: training loss nan Epoch 21 iteration 0139/0187: training loss nan Epoch 21 iteration 0140/0187: training loss nan Epoch 21 iteration 0141/0187: training loss nan Epoch 21 iteration 0142/0187: training loss nan Epoch 21 iteration 0143/0187: training loss nan Epoch 21 iteration 0144/0187: training loss nan Epoch 21 iteration 0145/0187: training loss nan Epoch 21 iteration 0146/0187: training loss nan Epoch 21 iteration 0147/0187: training loss nan Epoch 21 iteration 0148/0187: training loss nan Epoch 21 iteration 0149/0187: training loss nan Epoch 21 iteration 0150/0187: training loss nan Epoch 21 iteration 0151/0187: training loss nan Epoch 21 iteration 0152/0187: training loss nan Epoch 21 iteration 0153/0187: training loss nan Epoch 21 iteration 0154/0187: training loss nan Epoch 21 iteration 0155/0187: training loss nan Epoch 21 iteration 0156/0187: training loss nan Epoch 21 iteration 0157/0187: training loss nan Epoch 21 iteration 0158/0187: training loss nan Epoch 21 iteration 0159/0187: training loss nan Epoch 21 iteration 0160/0187: training loss nan Epoch 21 iteration 0161/0187: training loss nan Epoch 21 iteration 0162/0187: training loss nan Epoch 21 iteration 0163/0187: training loss nan Epoch 21 iteration 0164/0187: training loss nan Epoch 21 iteration 0165/0187: training loss nan Epoch 21 iteration 0166/0187: training loss nan Epoch 21 iteration 0167/0187: training loss nan Epoch 21 iteration 0168/0187: training loss nan Epoch 21 iteration 0169/0187: training loss nan Epoch 21 iteration 0170/0187: training loss nan Epoch 21 iteration 0171/0187: training loss nan Epoch 21 iteration 0172/0187: training loss nan Epoch 21 iteration 0173/0187: training loss nan Epoch 21 iteration 0174/0187: training loss nan Epoch 21 iteration 0175/0187: training loss nan Epoch 21 iteration 0176/0187: training loss nan Epoch 21 iteration 0177/0187: training loss nan Epoch 21 iteration 0178/0187: training loss nan Epoch 21 iteration 0179/0187: training loss nan Epoch 21 iteration 0180/0187: training loss nan Epoch 21 iteration 0181/0187: training loss nan Epoch 21 iteration 0182/0187: training loss nan Epoch 21 iteration 0183/0187: training loss nan Epoch 21 iteration 0184/0187: training loss nan Epoch 21 iteration 0185/0187: training loss nan Epoch 21 iteration 0186/0187: training loss nan Epoch 21 iteration 0187/0187: training loss nan Epoch 21 validation pixAcc: 0.868, mIoU: 0.367 Epoch 22 iteration 0001/0187: training loss 0.832 Epoch 22 iteration 0002/0187: training loss 0.935 Epoch 22 iteration 0003/0187: training loss 0.894 Epoch 22 iteration 0004/0187: training loss 0.902 Epoch 22 iteration 0005/0187: training loss 0.859 Epoch 22 iteration 0006/0187: training loss 0.868 Epoch 22 iteration 0007/0187: training loss 0.849 Epoch 22 iteration 0008/0187: training loss 0.850 Epoch 22 iteration 0009/0187: training loss 0.824 Epoch 22 iteration 0010/0187: training loss 0.838 Epoch 22 iteration 0011/0187: training loss 0.856 Epoch 22 iteration 0012/0187: training loss 0.855 Epoch 22 iteration 0013/0187: training loss 0.851 Epoch 22 iteration 0014/0187: training loss 0.853 Epoch 22 iteration 0015/0187: training loss 0.848 Epoch 22 iteration 0016/0187: training loss 0.856 Epoch 22 iteration 0017/0187: training loss 0.857 Epoch 22 iteration 0018/0187: training loss 0.853 Epoch 22 iteration 0019/0187: training loss 0.844 Epoch 22 iteration 0020/0187: training loss 0.840 Epoch 22 iteration 0021/0187: training loss 0.838 Epoch 22 iteration 0022/0187: training loss 0.837 Epoch 22 iteration 0023/0187: training loss 0.840 Epoch 22 iteration 0024/0187: training loss 0.847 Epoch 22 iteration 0025/0187: training loss 0.842 Epoch 22 iteration 0026/0187: training loss 0.846 Epoch 22 iteration 0027/0187: training loss 0.848 Epoch 22 iteration 0028/0187: training loss 0.842 Epoch 22 iteration 0029/0187: training loss 0.837 Epoch 22 iteration 0030/0187: training loss 0.836 Epoch 22 iteration 0031/0187: training loss 0.837 Epoch 22 iteration 0032/0187: training loss 0.839 Epoch 22 iteration 0033/0187: training loss 0.841 Epoch 22 iteration 0034/0187: training loss 0.840 Epoch 22 iteration 0035/0187: training loss 0.836 Epoch 22 iteration 0036/0187: training loss 0.835 Epoch 22 iteration 0037/0187: training loss 0.840 Epoch 22 iteration 0038/0187: training loss 0.835 Epoch 22 iteration 0039/0187: training loss 0.838 Epoch 22 iteration 0040/0187: training loss 0.841 Epoch 22 iteration 0041/0187: training loss 0.845 Epoch 22 iteration 0042/0187: training loss 0.845 Epoch 22 iteration 0043/0187: training loss 0.845 Epoch 22 iteration 0044/0187: training loss 0.845 Epoch 22 iteration 0045/0187: training loss 0.842 Epoch 22 iteration 0046/0187: training loss 0.841 Epoch 22 iteration 0047/0187: training loss 0.848 Epoch 22 iteration 0048/0187: training loss 0.844 Epoch 22 iteration 0049/0187: training loss 0.846 Epoch 22 iteration 0050/0187: training loss 0.842 Epoch 22 iteration 0051/0187: training loss 0.840 Epoch 22 iteration 0052/0187: training loss 0.842 Epoch 22 iteration 0053/0187: training loss 0.843 Epoch 22 iteration 0054/0187: training loss 0.842 Epoch 22 iteration 0055/0187: training loss 0.844 Epoch 22 iteration 0056/0187: training loss 0.845 Epoch 22 iteration 0057/0187: training loss 0.842 Epoch 22 iteration 0058/0187: training loss 0.844 Epoch 22 iteration 0059/0187: training loss 0.845 Epoch 22 iteration 0060/0187: training loss 0.845 Epoch 22 iteration 0061/0187: training loss 0.844 Epoch 22 iteration 0062/0187: training loss 0.842 Epoch 22 iteration 0063/0187: training loss 0.842 Epoch 22 iteration 0064/0187: training loss 0.845 Epoch 22 iteration 0065/0187: training loss 0.848 Epoch 22 iteration 0066/0187: training loss 0.846 Epoch 22 iteration 0067/0187: training loss 0.847 Epoch 22 iteration 0068/0187: training loss 0.847 Epoch 22 iteration 0069/0187: training loss 0.846 Epoch 22 iteration 0070/0187: training loss 0.843 Epoch 22 iteration 0071/0187: training loss 0.845 Epoch 22 iteration 0072/0187: training loss 0.849 Epoch 22 iteration 0073/0187: training loss 0.851 Epoch 22 iteration 0074/0187: training loss 0.848 Epoch 22 iteration 0075/0187: training loss 0.852 Epoch 22 iteration 0076/0187: training loss 0.854 Epoch 22 iteration 0077/0187: training loss 0.850 Epoch 22 iteration 0078/0187: training loss 0.853 Epoch 22 iteration 0079/0187: training loss 0.851 Epoch 22 iteration 0080/0187: training loss 0.855 Epoch 22 iteration 0081/0187: training loss 0.854 Epoch 22 iteration 0082/0187: training loss 0.853 Epoch 22 iteration 0083/0187: training loss 0.854 Epoch 22 iteration 0084/0187: training loss 0.852 Epoch 22 iteration 0085/0187: training loss 0.851 Epoch 22 iteration 0086/0187: training loss 0.850 Epoch 22 iteration 0087/0187: training loss 0.853 Epoch 22 iteration 0088/0187: training loss 0.854 Epoch 22 iteration 0089/0187: training loss 0.852 Epoch 22 iteration 0090/0187: training loss 0.852 Epoch 22 iteration 0091/0188: training loss 0.853 Epoch 22 iteration 0092/0188: training loss 0.854 Epoch 22 iteration 0093/0188: training loss 0.856 Epoch 22 iteration 0094/0188: training loss 0.856 Epoch 22 iteration 0095/0188: training loss 0.856 Epoch 22 iteration 0096/0188: training loss 0.858 Epoch 22 iteration 0097/0188: training loss 0.860 Epoch 22 iteration 0098/0188: training loss 0.861 Epoch 22 iteration 0099/0188: training loss 0.859 Epoch 22 iteration 0100/0188: training loss 0.858 Epoch 22 iteration 0101/0188: training loss 0.859 Epoch 22 iteration 0102/0188: training loss 0.862 Epoch 22 iteration 0103/0188: training loss 0.859 Epoch 22 iteration 0104/0188: training loss 0.860 Epoch 22 iteration 0105/0188: training loss 0.862 Epoch 22 iteration 0106/0188: training loss 0.861 Epoch 22 iteration 0107/0188: training loss 0.860 Epoch 22 iteration 0108/0188: training loss 0.861 Epoch 22 iteration 0109/0188: training loss 0.860 Epoch 22 iteration 0110/0188: training loss 0.859 Epoch 22 iteration 0111/0188: training loss 0.858 Epoch 22 iteration 0112/0188: training loss 0.857 Epoch 22 iteration 0113/0188: training loss 0.859 Epoch 22 iteration 0114/0188: training loss 0.860 Epoch 22 iteration 0115/0188: training loss 0.862 Epoch 22 iteration 0116/0188: training loss 0.862 Epoch 22 iteration 0117/0188: training loss 0.861 Epoch 22 iteration 0118/0188: training loss 0.861 Epoch 22 iteration 0119/0188: training loss 0.860 Epoch 22 iteration 0120/0188: training loss nan Epoch 22 iteration 0121/0188: training loss nan Epoch 22 iteration 0122/0188: training loss nan Epoch 22 iteration 0123/0188: training loss nan Epoch 22 iteration 0124/0188: training loss nan Epoch 22 iteration 0125/0188: training loss nan Epoch 22 iteration 0126/0188: training loss nan Epoch 22 iteration 0127/0188: training loss nan Epoch 22 iteration 0128/0188: training loss nan Epoch 22 iteration 0129/0188: training loss nan Epoch 22 iteration 0130/0188: training loss nan Epoch 22 iteration 0131/0188: training loss nan Epoch 22 iteration 0132/0188: training loss nan Epoch 22 iteration 0133/0188: training loss nan Epoch 22 iteration 0134/0188: training loss nan Epoch 22 iteration 0135/0188: training loss nan Epoch 22 iteration 0136/0188: training loss nan Epoch 22 iteration 0137/0188: training loss nan Epoch 22 iteration 0138/0188: training loss nan Epoch 22 iteration 0139/0188: training loss nan Epoch 22 iteration 0140/0188: training loss nan Epoch 22 iteration 0141/0188: training loss nan Epoch 22 iteration 0142/0188: training loss nan Epoch 22 iteration 0143/0188: training loss nan Epoch 22 iteration 0144/0188: training loss nan Epoch 22 iteration 0145/0188: training loss nan Epoch 22 iteration 0146/0188: training loss nan Epoch 22 iteration 0147/0188: training loss nan Epoch 22 iteration 0148/0188: training loss nan Epoch 22 iteration 0149/0188: training loss nan Epoch 22 iteration 0150/0188: training loss nan Epoch 22 iteration 0151/0188: training loss nan Epoch 22 iteration 0152/0188: training loss nan Epoch 22 iteration 0153/0188: training loss nan Epoch 22 iteration 0154/0188: training loss nan Epoch 22 iteration 0155/0188: training loss nan Epoch 22 iteration 0156/0188: training loss nan Epoch 22 iteration 0157/0188: training loss nan Epoch 22 iteration 0158/0188: training loss nan Epoch 22 iteration 0159/0188: training loss nan Epoch 22 iteration 0160/0188: training loss nan Epoch 22 iteration 0161/0188: training loss nan Epoch 22 iteration 0162/0188: training loss nan Epoch 22 iteration 0163/0188: training loss nan Epoch 22 iteration 0164/0188: training loss nan Epoch 22 iteration 0165/0188: training loss nan Epoch 22 iteration 0166/0188: training loss nan Epoch 22 iteration 0167/0188: training loss nan Epoch 22 iteration 0168/0188: training loss nan Epoch 22 iteration 0169/0188: training loss nan Epoch 22 iteration 0170/0188: training loss nan Epoch 22 iteration 0171/0188: training loss nan Epoch 22 iteration 0172/0188: training loss nan Epoch 22 iteration 0173/0188: training loss nan Epoch 22 iteration 0174/0188: training loss nan Epoch 22 iteration 0175/0188: training loss nan Epoch 22 iteration 0176/0188: training loss nan Epoch 22 iteration 0177/0188: training loss nan Epoch 22 iteration 0178/0188: training loss nan Epoch 22 iteration 0179/0188: training loss nan Epoch 22 iteration 0180/0188: training loss nan Epoch 22 iteration 0181/0188: training loss nan Epoch 22 iteration 0182/0188: training loss nan Epoch 22 iteration 0183/0188: training loss nan Epoch 22 iteration 0184/0188: training loss nan Epoch 22 iteration 0185/0188: training loss nan Epoch 22 iteration 0186/0188: training loss nan Epoch 22 validation pixAcc: 0.869, mIoU: 0.373 Epoch 23 iteration 0001/0187: training loss 0.808 Epoch 23 iteration 0002/0187: training loss 0.776 Epoch 23 iteration 0003/0187: training loss 0.762 Epoch 23 iteration 0004/0187: training loss 0.775 Epoch 23 iteration 0005/0187: training loss 0.782 Epoch 23 iteration 0006/0187: training loss 0.792 Epoch 23 iteration 0007/0187: training loss 0.775 Epoch 23 iteration 0008/0187: training loss 0.779 Epoch 23 iteration 0009/0187: training loss 0.775 Epoch 23 iteration 0010/0187: training loss 0.765 Epoch 23 iteration 0011/0187: training loss 0.768 Epoch 23 iteration 0012/0187: training loss 0.770 Epoch 23 iteration 0013/0187: training loss 0.778 Epoch 23 iteration 0014/0187: training loss 0.795 Epoch 23 iteration 0015/0187: training loss 0.817 Epoch 23 iteration 0016/0187: training loss 0.821 Epoch 23 iteration 0017/0187: training loss 0.831 Epoch 23 iteration 0018/0187: training loss 0.828 Epoch 23 iteration 0019/0187: training loss 0.828 Epoch 23 iteration 0020/0187: training loss 0.830 Epoch 23 iteration 0021/0187: training loss 0.826 Epoch 23 iteration 0022/0187: training loss 0.826 Epoch 23 iteration 0023/0187: training loss 0.826 Epoch 23 iteration 0024/0187: training loss 0.825 Epoch 23 iteration 0025/0187: training loss 0.828 Epoch 23 iteration 0026/0187: training loss 0.825 Epoch 23 iteration 0027/0187: training loss 0.826 Epoch 23 iteration 0028/0187: training loss 0.820 Epoch 23 iteration 0029/0187: training loss 0.816 Epoch 23 iteration 0030/0187: training loss 0.822 Epoch 23 iteration 0031/0187: training loss 0.820 Epoch 23 iteration 0032/0187: training loss 0.827 Epoch 23 iteration 0033/0187: training loss 0.836 Epoch 23 iteration 0034/0187: training loss 0.829 Epoch 23 iteration 0035/0187: training loss 0.831 Epoch 23 iteration 0036/0187: training loss 0.834 Epoch 23 iteration 0037/0187: training loss 0.831 Epoch 23 iteration 0038/0187: training loss 0.831 Epoch 23 iteration 0039/0187: training loss 0.828 Epoch 23 iteration 0040/0187: training loss 0.824 Epoch 23 iteration 0041/0187: training loss 0.824 Epoch 23 iteration 0042/0187: training loss 0.827 Epoch 23 iteration 0043/0187: training loss 0.824 Epoch 23 iteration 0044/0187: training loss 0.825 Epoch 23 iteration 0045/0187: training loss 0.823 Epoch 23 iteration 0046/0187: training loss 0.821 Epoch 23 iteration 0047/0187: training loss 0.817 Epoch 23 iteration 0048/0187: training loss 0.815 Epoch 23 iteration 0049/0187: training loss 0.816 Epoch 23 iteration 0050/0187: training loss 0.817 Epoch 23 iteration 0051/0187: training loss 0.815 Epoch 23 iteration 0052/0187: training loss 0.820 Epoch 23 iteration 0053/0187: training loss 0.819 Epoch 23 iteration 0054/0187: training loss 0.820 Epoch 23 iteration 0055/0187: training loss 0.817 Epoch 23 iteration 0056/0187: training loss 0.817 Epoch 23 iteration 0057/0187: training loss 0.815 Epoch 23 iteration 0058/0187: training loss 0.817 Epoch 23 iteration 0059/0187: training loss 0.817 Epoch 23 iteration 0060/0187: training loss 0.819 Epoch 23 iteration 0061/0187: training loss 0.817 Epoch 23 iteration 0062/0187: training loss 0.819 Epoch 23 iteration 0063/0187: training loss 0.818 Epoch 23 iteration 0064/0187: training loss 0.819 Epoch 23 iteration 0065/0187: training loss 0.819 Epoch 23 iteration 0066/0187: training loss 0.817 Epoch 23 iteration 0067/0187: training loss 0.818 Epoch 23 iteration 0068/0187: training loss 0.820 Epoch 23 iteration 0069/0187: training loss 0.817 Epoch 23 iteration 0070/0187: training loss 0.821 Epoch 23 iteration 0071/0187: training loss 0.821 Epoch 23 iteration 0072/0187: training loss nan Epoch 23 iteration 0073/0187: training loss nan Epoch 23 iteration 0074/0187: training loss nan Epoch 23 iteration 0075/0187: training loss nan Epoch 23 iteration 0076/0187: training loss nan Epoch 23 iteration 0077/0187: training loss nan Epoch 23 iteration 0078/0187: training loss nan Epoch 23 iteration 0079/0187: training loss nan Epoch 23 iteration 0080/0187: training loss nan Epoch 23 iteration 0081/0187: training loss nan Epoch 23 iteration 0082/0187: training loss nan Epoch 23 iteration 0083/0187: training loss nan Epoch 23 iteration 0084/0187: training loss nan Epoch 23 iteration 0085/0187: training loss nan Epoch 23 iteration 0086/0187: training loss nan Epoch 23 iteration 0087/0187: training loss nan Epoch 23 iteration 0088/0187: training loss nan Epoch 23 iteration 0089/0187: training loss nan Epoch 23 iteration 0090/0187: training loss nan Epoch 23 iteration 0091/0187: training loss nan Epoch 23 iteration 0092/0187: training loss nan Epoch 23 iteration 0093/0187: training loss nan Epoch 23 iteration 0094/0187: training loss nan Epoch 23 iteration 0095/0187: training loss nan Epoch 23 iteration 0096/0187: training loss nan Epoch 23 iteration 0097/0187: training loss nan Epoch 23 iteration 0098/0187: training loss nan Epoch 23 iteration 0099/0187: training loss nan Epoch 23 iteration 0100/0187: training loss nan Epoch 23 iteration 0101/0187: training loss nan Epoch 23 iteration 0102/0187: training loss nan Epoch 23 iteration 0103/0187: training loss nan Epoch 23 iteration 0104/0187: training loss nan Epoch 23 iteration 0105/0187: training loss nan Epoch 23 iteration 0106/0187: training loss nan Epoch 23 iteration 0107/0187: training loss nan Epoch 23 iteration 0108/0187: training loss nan Epoch 23 iteration 0109/0187: training loss nan Epoch 23 iteration 0110/0187: training loss nan Epoch 23 iteration 0111/0187: training loss nan Epoch 23 iteration 0112/0187: training loss nan Epoch 23 iteration 0113/0187: training loss nan Epoch 23 iteration 0114/0187: training loss nan Epoch 23 iteration 0115/0187: training loss nan Epoch 23 iteration 0116/0187: training loss nan Epoch 23 iteration 0117/0187: training loss nan Epoch 23 iteration 0118/0187: training loss nan Epoch 23 iteration 0119/0187: training loss nan Epoch 23 iteration 0120/0187: training loss nan Epoch 23 iteration 0121/0187: training loss nan Epoch 23 iteration 0122/0187: training loss nan Epoch 23 iteration 0123/0187: training loss nan Epoch 23 iteration 0124/0187: training loss nan Epoch 23 iteration 0125/0187: training loss nan Epoch 23 iteration 0126/0187: training loss nan Epoch 23 iteration 0127/0187: training loss nan Epoch 23 iteration 0128/0187: training loss nan Epoch 23 iteration 0129/0187: training loss nan Epoch 23 iteration 0130/0187: training loss nan Epoch 23 iteration 0131/0187: training loss nan Epoch 23 iteration 0132/0187: training loss nan Epoch 23 iteration 0133/0187: training loss nan Epoch 23 iteration 0134/0187: training loss nan Epoch 23 iteration 0135/0187: training loss nan Epoch 23 iteration 0136/0187: training loss nan Epoch 23 iteration 0137/0187: training loss nan Epoch 23 iteration 0138/0187: training loss nan Epoch 23 iteration 0139/0187: training loss nan Epoch 23 iteration 0140/0187: training loss nan Epoch 23 iteration 0141/0187: training loss nan Epoch 23 iteration 0142/0187: training loss nan Epoch 23 iteration 0143/0187: training loss nan Epoch 23 iteration 0144/0187: training loss nan Epoch 23 iteration 0145/0187: training loss nan Epoch 23 iteration 0146/0187: training loss nan Epoch 23 iteration 0147/0187: training loss nan Epoch 23 iteration 0148/0187: training loss nan Epoch 23 iteration 0149/0187: training loss nan Epoch 23 iteration 0150/0187: training loss nan Epoch 23 iteration 0151/0187: training loss nan Epoch 23 iteration 0152/0187: training loss nan Epoch 23 iteration 0153/0187: training loss nan Epoch 23 iteration 0154/0187: training loss nan Epoch 23 iteration 0155/0187: training loss nan Epoch 23 iteration 0156/0187: training loss nan Epoch 23 iteration 0157/0187: training loss nan Epoch 23 iteration 0158/0187: training loss nan Epoch 23 iteration 0159/0187: training loss nan Epoch 23 iteration 0160/0187: training loss nan Epoch 23 iteration 0161/0187: training loss nan Epoch 23 iteration 0162/0187: training loss nan Epoch 23 iteration 0163/0187: training loss nan Epoch 23 iteration 0164/0187: training loss nan Epoch 23 iteration 0165/0187: training loss nan Epoch 23 iteration 0166/0187: training loss nan Epoch 23 iteration 0167/0187: training loss nan Epoch 23 iteration 0168/0187: training loss nan Epoch 23 iteration 0169/0187: training loss nan Epoch 23 iteration 0170/0187: training loss nan Epoch 23 iteration 0171/0187: training loss nan Epoch 23 iteration 0172/0187: training loss nan Epoch 23 iteration 0173/0187: training loss nan Epoch 23 iteration 0174/0187: training loss nan Epoch 23 iteration 0175/0187: training loss nan Epoch 23 iteration 0176/0187: training loss nan Epoch 23 iteration 0177/0187: training loss nan Epoch 23 iteration 0178/0187: training loss nan Epoch 23 iteration 0179/0187: training loss nan Epoch 23 iteration 0180/0187: training loss nan Epoch 23 iteration 0181/0187: training loss nan Epoch 23 iteration 0182/0187: training loss nan Epoch 23 iteration 0183/0187: training loss nan Epoch 23 iteration 0184/0187: training loss nan Epoch 23 iteration 0185/0187: training loss nan Epoch 23 iteration 0186/0187: training loss nan Epoch 23 iteration 0187/0187: training loss nan Epoch 23 validation pixAcc: 0.871, mIoU: 0.377 Epoch 24 iteration 0001/0187: training loss 0.804 Epoch 24 iteration 0002/0187: training loss 0.806 Epoch 24 iteration 0003/0187: training loss 0.790 Epoch 24 iteration 0004/0187: training loss 0.777 Epoch 24 iteration 0005/0187: training loss 0.802 Epoch 24 iteration 0006/0187: training loss 0.787 Epoch 24 iteration 0007/0187: training loss 0.794 Epoch 24 iteration 0008/0187: training loss 0.775 Epoch 24 iteration 0009/0187: training loss 0.771 Epoch 24 iteration 0010/0187: training loss 0.769 Epoch 24 iteration 0011/0187: training loss 0.786 Epoch 24 iteration 0012/0187: training loss 0.784 Epoch 24 iteration 0013/0187: training loss 0.782 Epoch 24 iteration 0014/0187: training loss 0.789 Epoch 24 iteration 0015/0187: training loss 0.789 Epoch 24 iteration 0016/0187: training loss 0.791 Epoch 24 iteration 0017/0187: training loss 0.802 Epoch 24 iteration 0018/0187: training loss 0.798 Epoch 24 iteration 0019/0187: training loss 0.803 Epoch 24 iteration 0020/0187: training loss 0.811 Epoch 24 iteration 0021/0187: training loss 0.823 Epoch 24 iteration 0022/0187: training loss 0.821 Epoch 24 iteration 0023/0187: training loss 0.819 Epoch 24 iteration 0024/0187: training loss 0.826 Epoch 24 iteration 0025/0187: training loss 0.824 Epoch 24 iteration 0026/0187: training loss 0.823 Epoch 24 iteration 0027/0187: training loss 0.817 Epoch 24 iteration 0028/0187: training loss 0.823 Epoch 24 iteration 0029/0187: training loss 0.817 Epoch 24 iteration 0030/0187: training loss 0.813 Epoch 24 iteration 0031/0187: training loss 0.810 Epoch 24 iteration 0032/0187: training loss 0.812 Epoch 24 iteration 0033/0187: training loss 0.809 Epoch 24 iteration 0034/0187: training loss 0.805 Epoch 24 iteration 0035/0187: training loss 0.812 Epoch 24 iteration 0036/0187: training loss 0.809 Epoch 24 iteration 0037/0187: training loss 0.805 Epoch 24 iteration 0038/0187: training loss 0.806 Epoch 24 iteration 0039/0187: training loss 0.812 Epoch 24 iteration 0040/0187: training loss 0.813 Epoch 24 iteration 0041/0187: training loss 0.814 Epoch 24 iteration 0042/0187: training loss 0.819 Epoch 24 iteration 0043/0187: training loss 0.825 Epoch 24 iteration 0044/0187: training loss 0.821 Epoch 24 iteration 0045/0187: training loss 0.820 Epoch 24 iteration 0046/0187: training loss 0.818 Epoch 24 iteration 0047/0187: training loss 0.821 Epoch 24 iteration 0048/0187: training loss 0.817 Epoch 24 iteration 0049/0187: training loss 0.816 Epoch 24 iteration 0050/0187: training loss 0.816 Epoch 24 iteration 0051/0187: training loss 0.814 Epoch 24 iteration 0052/0187: training loss 0.812 Epoch 24 iteration 0053/0187: training loss 0.813 Epoch 24 iteration 0054/0187: training loss 0.812 Epoch 24 iteration 0055/0187: training loss 0.814 Epoch 24 iteration 0056/0187: training loss 0.811 Epoch 24 iteration 0057/0187: training loss 0.810 Epoch 24 iteration 0058/0187: training loss 0.811 Epoch 24 iteration 0059/0187: training loss 0.812 Epoch 24 iteration 0060/0187: training loss 0.810 Epoch 24 iteration 0061/0187: training loss 0.813 Epoch 24 iteration 0062/0187: training loss 0.813 Epoch 24 iteration 0063/0187: training loss 0.813 Epoch 24 iteration 0064/0187: training loss 0.814 Epoch 24 iteration 0065/0187: training loss 0.814 Epoch 24 iteration 0066/0187: training loss 0.813 Epoch 24 iteration 0067/0187: training loss 0.814 Epoch 24 iteration 0068/0187: training loss 0.812 Epoch 24 iteration 0069/0187: training loss 0.813 Epoch 24 iteration 0070/0187: training loss 0.815 Epoch 24 iteration 0071/0187: training loss 0.815 Epoch 24 iteration 0072/0187: training loss 0.815 Epoch 24 iteration 0073/0187: training loss 0.813 Epoch 24 iteration 0074/0187: training loss 0.815 Epoch 24 iteration 0075/0187: training loss 0.820 Epoch 24 iteration 0076/0187: training loss 0.820 Epoch 24 iteration 0077/0187: training loss 0.818 Epoch 24 iteration 0078/0187: training loss 0.818 Epoch 24 iteration 0079/0187: training loss 0.819 Epoch 24 iteration 0080/0187: training loss 0.819 Epoch 24 iteration 0081/0187: training loss 0.820 Epoch 24 iteration 0082/0187: training loss 0.822 Epoch 24 iteration 0083/0187: training loss 0.822 Epoch 24 iteration 0084/0187: training loss 0.821 Epoch 24 iteration 0085/0187: training loss 0.821 Epoch 24 iteration 0086/0187: training loss 0.822 Epoch 24 iteration 0087/0187: training loss 0.822 Epoch 24 iteration 0088/0187: training loss 0.824 Epoch 24 iteration 0089/0187: training loss 0.822 Epoch 24 iteration 0090/0187: training loss 0.820 Epoch 24 iteration 0091/0188: training loss 0.820 Epoch 24 iteration 0092/0188: training loss 0.820 Epoch 24 iteration 0093/0188: training loss 0.821 Epoch 24 iteration 0094/0188: training loss 0.824 Epoch 24 iteration 0095/0188: training loss 0.822 Epoch 24 iteration 0096/0188: training loss 0.821 Epoch 24 iteration 0097/0188: training loss 0.822 Epoch 24 iteration 0098/0188: training loss 0.821 Epoch 24 iteration 0099/0188: training loss 0.820 Epoch 24 iteration 0100/0188: training loss 0.821 Epoch 24 iteration 0101/0188: training loss 0.821 Epoch 24 iteration 0102/0188: training loss 0.821 Epoch 24 iteration 0103/0188: training loss 0.822 Epoch 24 iteration 0104/0188: training loss 0.822 Epoch 24 iteration 0105/0188: training loss 0.822 Epoch 24 iteration 0106/0188: training loss 0.822 Epoch 24 iteration 0107/0188: training loss 0.821 Epoch 24 iteration 0108/0188: training loss 0.820 Epoch 24 iteration 0109/0188: training loss 0.821 Epoch 24 iteration 0110/0188: training loss 0.821 Epoch 24 iteration 0111/0188: training loss 0.821 Epoch 24 iteration 0112/0188: training loss 0.821 Epoch 24 iteration 0113/0188: training loss 0.819 Epoch 24 iteration 0114/0188: training loss 0.820 Epoch 24 iteration 0115/0188: training loss 0.818 Epoch 24 iteration 0116/0188: training loss 0.819 Epoch 24 iteration 0117/0188: training loss 0.819 Epoch 24 iteration 0118/0188: training loss 0.818 Epoch 24 iteration 0119/0188: training loss 0.821 Epoch 24 iteration 0120/0188: training loss 0.820 Epoch 24 iteration 0121/0188: training loss 0.819 Epoch 24 iteration 0122/0188: training loss 0.820 Epoch 24 iteration 0123/0188: training loss 0.820 Epoch 24 iteration 0124/0188: training loss 0.819 Epoch 24 iteration 0125/0188: training loss 0.820 Epoch 24 iteration 0126/0188: training loss 0.821 Epoch 24 iteration 0127/0188: training loss 0.822 Epoch 24 iteration 0128/0188: training loss 0.824 Epoch 24 iteration 0129/0188: training loss 0.824 Epoch 24 iteration 0130/0188: training loss 0.824 Epoch 24 iteration 0131/0188: training loss 0.824 Epoch 24 iteration 0132/0188: training loss 0.823 Epoch 24 iteration 0133/0188: training loss 0.822 Epoch 24 iteration 0134/0188: training loss 0.822 Epoch 24 iteration 0135/0188: training loss 0.822 Epoch 24 iteration 0136/0188: training loss 0.823 Epoch 24 iteration 0137/0188: training loss 0.823 Epoch 24 iteration 0138/0188: training loss 0.823 Epoch 24 iteration 0139/0188: training loss 0.822 Epoch 24 iteration 0140/0188: training loss 0.824 Epoch 24 iteration 0141/0188: training loss 0.825 Epoch 24 iteration 0142/0188: training loss 0.824 Epoch 24 iteration 0143/0188: training loss 0.824 Epoch 24 iteration 0144/0188: training loss 0.824 Epoch 24 iteration 0145/0188: training loss 0.825 Epoch 24 iteration 0146/0188: training loss 0.824 Epoch 24 iteration 0147/0188: training loss 0.824 Epoch 24 iteration 0148/0188: training loss 0.824 Epoch 24 iteration 0149/0188: training loss 0.826 Epoch 24 iteration 0150/0188: training loss 0.828 Epoch 24 iteration 0151/0188: training loss 0.827 Epoch 24 iteration 0152/0188: training loss 0.826 Epoch 24 iteration 0153/0188: training loss 0.826 Epoch 24 iteration 0154/0188: training loss 0.827 Epoch 24 iteration 0155/0188: training loss 0.827 Epoch 24 iteration 0156/0188: training loss 0.827 Epoch 24 iteration 0157/0188: training loss 0.827 Epoch 24 iteration 0158/0188: training loss 0.826 Epoch 24 iteration 0159/0188: training loss 0.826 Epoch 24 iteration 0160/0188: training loss 0.827 Epoch 24 iteration 0161/0188: training loss 0.828 Epoch 24 iteration 0162/0188: training loss 0.828 Epoch 24 iteration 0163/0188: training loss 0.827 Epoch 24 iteration 0164/0188: training loss 0.826 Epoch 24 iteration 0165/0188: training loss 0.827 Epoch 24 iteration 0166/0188: training loss 0.829 Epoch 24 iteration 0167/0188: training loss 0.830 Epoch 24 iteration 0168/0188: training loss 0.830 Epoch 24 iteration 0169/0188: training loss 0.831 Epoch 24 iteration 0170/0188: training loss 0.831 Epoch 24 iteration 0171/0188: training loss 0.831 Epoch 24 iteration 0172/0188: training loss 0.830 Epoch 24 iteration 0173/0188: training loss 0.830 Epoch 24 iteration 0174/0188: training loss 0.830 Epoch 24 iteration 0175/0188: training loss 0.830 Epoch 24 iteration 0176/0188: training loss 0.830 Epoch 24 iteration 0177/0188: training loss 0.830 Epoch 24 iteration 0178/0188: training loss 0.831 Epoch 24 iteration 0179/0188: training loss 0.831 Epoch 24 iteration 0180/0188: training loss 0.830 Epoch 24 iteration 0181/0188: training loss 0.829 Epoch 24 iteration 0182/0188: training loss 0.829 Epoch 24 iteration 0183/0188: training loss 0.829 Epoch 24 iteration 0184/0188: training loss 0.828 Epoch 24 iteration 0185/0188: training loss 0.828 Epoch 24 iteration 0186/0188: training loss 0.829 Epoch 24 validation pixAcc: 0.868, mIoU: 0.372 Epoch 25 iteration 0001/0187: training loss 0.754 Epoch 25 iteration 0002/0187: training loss 0.879 Epoch 25 iteration 0003/0187: training loss 0.863 Epoch 25 iteration 0004/0187: training loss 0.863 Epoch 25 iteration 0005/0187: training loss 0.841 Epoch 25 iteration 0006/0187: training loss 0.829 Epoch 25 iteration 0007/0187: training loss 0.824 Epoch 25 iteration 0008/0187: training loss 0.819 Epoch 25 iteration 0009/0187: training loss 0.816 Epoch 25 iteration 0010/0187: training loss 0.829 Epoch 25 iteration 0011/0187: training loss 0.816 Epoch 25 iteration 0012/0187: training loss 0.817 Epoch 25 iteration 0013/0187: training loss 0.825 Epoch 25 iteration 0014/0187: training loss 0.824 Epoch 25 iteration 0015/0187: training loss 0.835 Epoch 25 iteration 0016/0187: training loss 0.833 Epoch 25 iteration 0017/0187: training loss 0.840 Epoch 25 iteration 0018/0187: training loss 0.825 Epoch 25 iteration 0019/0187: training loss 0.817 Epoch 25 iteration 0020/0187: training loss 0.824 Epoch 25 iteration 0021/0187: training loss 0.826 Epoch 25 iteration 0022/0187: training loss 0.825 Epoch 25 iteration 0023/0187: training loss 0.826 Epoch 25 iteration 0024/0187: training loss 0.832 Epoch 25 iteration 0025/0187: training loss 0.830 Epoch 25 iteration 0026/0187: training loss 0.833 Epoch 25 iteration 0027/0187: training loss 0.832 Epoch 25 iteration 0028/0187: training loss 0.828 Epoch 25 iteration 0029/0187: training loss 0.823 Epoch 25 iteration 0030/0187: training loss 0.823 Epoch 25 iteration 0031/0187: training loss 0.826 Epoch 25 iteration 0032/0187: training loss 0.822 Epoch 25 iteration 0033/0187: training loss 0.817 Epoch 25 iteration 0034/0187: training loss 0.814 Epoch 25 iteration 0035/0187: training loss 0.816 Epoch 25 iteration 0036/0187: training loss 0.829 Epoch 25 iteration 0037/0187: training loss 0.829 Epoch 25 iteration 0038/0187: training loss 0.827 Epoch 25 iteration 0039/0187: training loss 0.823 Epoch 25 iteration 0040/0187: training loss 0.818 Epoch 25 iteration 0041/0187: training loss 0.814 Epoch 25 iteration 0042/0187: training loss 0.815 Epoch 25 iteration 0043/0187: training loss 0.814 Epoch 25 iteration 0044/0187: training loss 0.814 Epoch 25 iteration 0045/0187: training loss 0.811 Epoch 25 iteration 0046/0187: training loss 0.810 Epoch 25 iteration 0047/0187: training loss 0.812 Epoch 25 iteration 0048/0187: training loss 0.809 Epoch 25 iteration 0049/0187: training loss 0.808 Epoch 25 iteration 0050/0187: training loss 0.811 Epoch 25 iteration 0051/0187: training loss 0.809 Epoch 25 iteration 0052/0187: training loss 0.818 Epoch 25 iteration 0053/0187: training loss 0.820 Epoch 25 iteration 0054/0187: training loss 0.824 Epoch 25 iteration 0055/0187: training loss 0.820 Epoch 25 iteration 0056/0187: training loss 0.818 Epoch 25 iteration 0057/0187: training loss 0.816 Epoch 25 iteration 0058/0187: training loss 0.823 Epoch 25 iteration 0059/0187: training loss 0.821 Epoch 25 iteration 0060/0187: training loss 0.820 Epoch 25 iteration 0061/0187: training loss 0.817 Epoch 25 iteration 0062/0187: training loss 0.816 Epoch 25 iteration 0063/0187: training loss 0.822 Epoch 25 iteration 0064/0187: training loss 0.823 Epoch 25 iteration 0065/0187: training loss 0.823 Epoch 25 iteration 0066/0187: training loss 0.823 Epoch 25 iteration 0067/0187: training loss 0.827 Epoch 25 iteration 0068/0187: training loss 0.830 Epoch 25 iteration 0069/0187: training loss 0.828 Epoch 25 iteration 0070/0187: training loss 0.832 Epoch 25 iteration 0071/0187: training loss 0.832 Epoch 25 iteration 0072/0187: training loss 0.832 Epoch 25 iteration 0073/0187: training loss 0.832 Epoch 25 iteration 0074/0187: training loss 0.833 Epoch 25 iteration 0075/0187: training loss 0.831 Epoch 25 iteration 0076/0187: training loss 0.833 Epoch 25 iteration 0077/0187: training loss 0.831 Epoch 25 iteration 0078/0187: training loss 0.831 Epoch 25 iteration 0079/0187: training loss 0.830 Epoch 25 iteration 0080/0187: training loss 0.829 Epoch 25 iteration 0081/0187: training loss 0.829 Epoch 25 iteration 0082/0187: training loss 0.829 Epoch 25 iteration 0083/0187: training loss 0.828 Epoch 25 iteration 0084/0187: training loss 0.828 Epoch 25 iteration 0085/0187: training loss 0.830 Epoch 25 iteration 0086/0187: training loss 0.828 Epoch 25 iteration 0087/0187: training loss 0.831 Epoch 25 iteration 0088/0187: training loss 0.831 Epoch 25 iteration 0089/0187: training loss 0.831 Epoch 25 iteration 0090/0187: training loss 0.830 Epoch 25 iteration 0091/0187: training loss 0.829 Epoch 25 iteration 0092/0187: training loss 0.831 Epoch 25 iteration 0093/0187: training loss 0.834 Epoch 25 iteration 0094/0187: training loss 0.834 Epoch 25 iteration 0095/0187: training loss 0.834 Epoch 25 iteration 0096/0187: training loss 0.834 Epoch 25 iteration 0097/0187: training loss 0.834 Epoch 25 iteration 0098/0187: training loss 0.833 Epoch 25 iteration 0099/0187: training loss 0.834 Epoch 25 iteration 0100/0187: training loss 0.834 Epoch 25 iteration 0101/0187: training loss 0.833 Epoch 25 iteration 0102/0187: training loss 0.833 Epoch 25 iteration 0103/0187: training loss 0.832 Epoch 25 iteration 0104/0187: training loss 0.834 Epoch 25 iteration 0105/0187: training loss 0.834 Epoch 25 iteration 0106/0187: training loss 0.834 Epoch 25 iteration 0107/0187: training loss 0.832 Epoch 25 iteration 0108/0187: training loss 0.829 Epoch 25 iteration 0109/0187: training loss 0.829 Epoch 25 iteration 0110/0187: training loss 0.830 Epoch 25 iteration 0111/0187: training loss 0.830 Epoch 25 iteration 0112/0187: training loss 0.830 Epoch 25 iteration 0113/0187: training loss 0.830 Epoch 25 iteration 0114/0187: training loss 0.831 Epoch 25 iteration 0115/0187: training loss 0.829 Epoch 25 iteration 0116/0187: training loss 0.828 Epoch 25 iteration 0117/0187: training loss 0.831 Epoch 25 iteration 0118/0187: training loss 0.832 Epoch 25 iteration 0119/0187: training loss 0.832 Epoch 25 iteration 0120/0187: training loss 0.832 Epoch 25 iteration 0121/0187: training loss 0.830 Epoch 25 iteration 0122/0187: training loss 0.830 Epoch 25 iteration 0123/0187: training loss 0.827 Epoch 25 iteration 0124/0187: training loss 0.827 Epoch 25 iteration 0125/0187: training loss 0.826 Epoch 25 iteration 0126/0187: training loss 0.827 Epoch 25 iteration 0127/0187: training loss 0.829 Epoch 25 iteration 0128/0187: training loss 0.829 Epoch 25 iteration 0129/0187: training loss 0.828 Epoch 25 iteration 0130/0187: training loss 0.829 Epoch 25 iteration 0131/0187: training loss 0.830 Epoch 25 iteration 0132/0187: training loss 0.832 Epoch 25 iteration 0133/0187: training loss 0.831 Epoch 25 iteration 0134/0187: training loss 0.830 Epoch 25 iteration 0135/0187: training loss 0.829 Epoch 25 iteration 0136/0187: training loss 0.829 Epoch 25 iteration 0137/0187: training loss 0.829 Epoch 25 iteration 0138/0187: training loss 0.829 Epoch 25 iteration 0139/0187: training loss 0.828 Epoch 25 iteration 0140/0187: training loss 0.828 Epoch 25 iteration 0141/0187: training loss 0.828 Epoch 25 iteration 0142/0187: training loss 0.829 Epoch 25 iteration 0143/0187: training loss 0.829 Epoch 25 iteration 0144/0187: training loss 0.827 Epoch 25 iteration 0145/0187: training loss 0.825 Epoch 25 iteration 0146/0187: training loss 0.825 Epoch 25 iteration 0147/0187: training loss 0.825 Epoch 25 iteration 0148/0187: training loss 0.824 Epoch 25 iteration 0149/0187: training loss 0.825 Epoch 25 iteration 0150/0187: training loss 0.825 Epoch 25 iteration 0151/0187: training loss 0.825 Epoch 25 iteration 0152/0187: training loss 0.824 Epoch 25 iteration 0153/0187: training loss 0.824 Epoch 25 iteration 0154/0187: training loss 0.823 Epoch 25 iteration 0155/0187: training loss 0.823 Epoch 25 iteration 0156/0187: training loss 0.824 Epoch 25 iteration 0157/0187: training loss 0.824 Epoch 25 iteration 0158/0187: training loss 0.823 Epoch 25 iteration 0159/0187: training loss 0.822 Epoch 25 iteration 0160/0187: training loss 0.822 Epoch 25 iteration 0161/0187: training loss 0.822 Epoch 25 iteration 0162/0187: training loss 0.821 Epoch 25 iteration 0163/0187: training loss 0.822 Epoch 25 iteration 0164/0187: training loss 0.821 Epoch 25 iteration 0165/0187: training loss 0.821 Epoch 25 iteration 0166/0187: training loss 0.819 Epoch 25 iteration 0167/0187: training loss 0.818 Epoch 25 iteration 0168/0187: training loss 0.817 Epoch 25 iteration 0169/0187: training loss 0.817 Epoch 25 iteration 0170/0187: training loss 0.818 Epoch 25 iteration 0171/0187: training loss 0.818 Epoch 25 iteration 0172/0187: training loss 0.818 Epoch 25 iteration 0173/0187: training loss 0.817 Epoch 25 iteration 0174/0187: training loss 0.817 Epoch 25 iteration 0175/0187: training loss 0.816 Epoch 25 iteration 0176/0187: training loss 0.815 Epoch 25 iteration 0177/0187: training loss 0.814 Epoch 25 iteration 0178/0187: training loss 0.813 Epoch 25 iteration 0179/0187: training loss 0.815 Epoch 25 iteration 0180/0187: training loss 0.814 Epoch 25 iteration 0181/0187: training loss 0.814 Epoch 25 iteration 0182/0187: training loss 0.813 Epoch 25 iteration 0183/0187: training loss 0.814 Epoch 25 iteration 0184/0187: training loss 0.814 Epoch 25 iteration 0185/0187: training loss 0.813 Epoch 25 iteration 0186/0187: training loss 0.813 Epoch 25 iteration 0187/0187: training loss 0.814 Epoch 25 validation pixAcc: 0.872, mIoU: 0.378 Epoch 26 iteration 0001/0187: training loss 0.753 Epoch 26 iteration 0002/0187: training loss 0.752 Epoch 26 iteration 0003/0187: training loss 0.734 Epoch 26 iteration 0004/0187: training loss 0.761 Epoch 26 iteration 0005/0187: training loss 0.743 Epoch 26 iteration 0006/0187: training loss 0.782 Epoch 26 iteration 0007/0187: training loss 0.793 Epoch 26 iteration 0008/0187: training loss 0.784 Epoch 26 iteration 0009/0187: training loss 0.792 Epoch 26 iteration 0010/0187: training loss 0.789 Epoch 26 iteration 0011/0187: training loss 0.790 Epoch 26 iteration 0012/0187: training loss 0.798 Epoch 26 iteration 0013/0187: training loss 0.816 Epoch 26 iteration 0014/0187: training loss 0.831 Epoch 26 iteration 0015/0187: training loss 0.817 Epoch 26 iteration 0016/0187: training loss 0.820 Epoch 26 iteration 0017/0187: training loss 0.833 Epoch 26 iteration 0018/0187: training loss 0.822 Epoch 26 iteration 0019/0187: training loss 0.817 Epoch 26 iteration 0020/0187: training loss 0.819 Epoch 26 iteration 0021/0187: training loss 0.812 Epoch 26 iteration 0022/0187: training loss 0.826 Epoch 26 iteration 0023/0187: training loss 0.822 Epoch 26 iteration 0024/0187: training loss 0.823 Epoch 26 iteration 0025/0187: training loss 0.829 Epoch 26 iteration 0026/0187: training loss 0.830 Epoch 26 iteration 0027/0187: training loss 0.837 Epoch 26 iteration 0028/0187: training loss 0.836 Epoch 26 iteration 0029/0187: training loss 0.834 Epoch 26 iteration 0030/0187: training loss 0.830 Epoch 26 iteration 0031/0187: training loss 0.835 Epoch 26 iteration 0032/0187: training loss 0.830 Epoch 26 iteration 0033/0187: training loss 0.823 Epoch 26 iteration 0034/0187: training loss 0.822 Epoch 26 iteration 0035/0187: training loss 0.820 Epoch 26 iteration 0036/0187: training loss 0.815 Epoch 26 iteration 0037/0187: training loss 0.814 Epoch 26 iteration 0038/0187: training loss 0.825 Epoch 26 iteration 0039/0187: training loss 0.827 Epoch 26 iteration 0040/0187: training loss 0.825 Epoch 26 iteration 0041/0187: training loss 0.820 Epoch 26 iteration 0042/0187: training loss 0.818 Epoch 26 iteration 0043/0187: training loss 0.816 Epoch 26 iteration 0044/0187: training loss 0.817 Epoch 26 iteration 0045/0187: training loss 0.816 Epoch 26 iteration 0046/0187: training loss 0.814 Epoch 26 iteration 0047/0187: training loss 0.813 Epoch 26 iteration 0048/0187: training loss 0.815 Epoch 26 iteration 0049/0187: training loss 0.818 Epoch 26 iteration 0050/0187: training loss 0.820 Epoch 26 iteration 0051/0187: training loss 0.815 Epoch 26 iteration 0052/0187: training loss 0.814 Epoch 26 iteration 0053/0187: training loss 0.813 Epoch 26 iteration 0054/0187: training loss 0.813 Epoch 26 iteration 0055/0187: training loss 0.809 Epoch 26 iteration 0056/0187: training loss 0.810 Epoch 26 iteration 0057/0187: training loss 0.810 Epoch 26 iteration 0058/0187: training loss 0.818 Epoch 26 iteration 0059/0187: training loss 0.816 Epoch 26 iteration 0060/0187: training loss 0.816 Epoch 26 iteration 0061/0187: training loss 0.823 Epoch 26 iteration 0062/0187: training loss 0.822 Epoch 26 iteration 0063/0187: training loss 0.820 Epoch 26 iteration 0064/0187: training loss 0.819 Epoch 26 iteration 0065/0187: training loss 0.820 Epoch 26 iteration 0066/0187: training loss 0.822 Epoch 26 iteration 0067/0187: training loss 0.820 Epoch 26 iteration 0068/0187: training loss 0.819 Epoch 26 iteration 0069/0187: training loss 0.820 Epoch 26 iteration 0070/0187: training loss 0.820 Epoch 26 iteration 0071/0187: training loss 0.818 Epoch 26 iteration 0072/0187: training loss 0.819 Epoch 26 iteration 0073/0187: training loss 0.816 Epoch 26 iteration 0074/0187: training loss 0.815 Epoch 26 iteration 0075/0187: training loss 0.817 Epoch 26 iteration 0076/0187: training loss 0.817 Epoch 26 iteration 0077/0187: training loss 0.817 Epoch 26 iteration 0078/0187: training loss 0.818 Epoch 26 iteration 0079/0187: training loss 0.817 Epoch 26 iteration 0080/0187: training loss 0.817 Epoch 26 iteration 0081/0187: training loss 0.815 Epoch 26 iteration 0082/0187: training loss 0.814 Epoch 26 iteration 0083/0187: training loss 0.815 Epoch 26 iteration 0084/0187: training loss 0.814 Epoch 26 iteration 0085/0187: training loss 0.813 Epoch 26 iteration 0086/0187: training loss 0.812 Epoch 26 iteration 0087/0187: training loss 0.813 Epoch 26 iteration 0088/0187: training loss 0.813 Epoch 26 iteration 0089/0187: training loss 0.815 Epoch 26 iteration 0090/0187: training loss 0.813 Epoch 26 iteration 0091/0188: training loss 0.810 Epoch 26 iteration 0092/0188: training loss 0.812 Epoch 26 iteration 0093/0188: training loss 0.813 Epoch 26 iteration 0094/0188: training loss 0.814 Epoch 26 iteration 0095/0188: training loss 0.816 Epoch 26 iteration 0096/0188: training loss 0.820 Epoch 26 iteration 0097/0188: training loss 0.822 Epoch 26 iteration 0098/0188: training loss 0.822 Epoch 26 iteration 0099/0188: training loss 0.820 Epoch 26 iteration 0100/0188: training loss 0.822 Epoch 26 iteration 0101/0188: training loss 0.823 Epoch 26 iteration 0102/0188: training loss 0.824 Epoch 26 iteration 0103/0188: training loss 0.822 Epoch 26 iteration 0104/0188: training loss 0.821 Epoch 26 iteration 0105/0188: training loss 0.820 Epoch 26 iteration 0106/0188: training loss 0.821 Epoch 26 iteration 0107/0188: training loss 0.821 Epoch 26 iteration 0108/0188: training loss 0.820 Epoch 26 iteration 0109/0188: training loss 0.819 Epoch 26 iteration 0110/0188: training loss 0.819 Epoch 26 iteration 0111/0188: training loss 0.820 Epoch 26 iteration 0112/0188: training loss 0.821 Epoch 26 iteration 0113/0188: training loss 0.821 Epoch 26 iteration 0114/0188: training loss 0.820 Epoch 26 iteration 0115/0188: training loss 0.820 Epoch 26 iteration 0116/0188: training loss 0.821 Epoch 26 iteration 0117/0188: training loss 0.821 Epoch 26 iteration 0118/0188: training loss 0.821 Epoch 26 iteration 0119/0188: training loss 0.819 Epoch 26 iteration 0120/0188: training loss 0.818 Epoch 26 iteration 0121/0188: training loss 0.817 Epoch 26 iteration 0122/0188: training loss 0.817 Epoch 26 iteration 0123/0188: training loss 0.816 Epoch 26 iteration 0124/0188: training loss 0.814 Epoch 26 iteration 0125/0188: training loss 0.816 Epoch 26 iteration 0126/0188: training loss 0.814 Epoch 26 iteration 0127/0188: training loss 0.815 Epoch 26 iteration 0128/0188: training loss 0.815 Epoch 26 iteration 0129/0188: training loss 0.814 Epoch 26 iteration 0130/0188: training loss 0.814 Epoch 26 iteration 0131/0188: training loss 0.815 Epoch 26 iteration 0132/0188: training loss 0.815 Epoch 26 iteration 0133/0188: training loss 0.814 Epoch 26 iteration 0134/0188: training loss 0.814 Epoch 26 iteration 0135/0188: training loss 0.814 Epoch 26 iteration 0136/0188: training loss 0.814 Epoch 26 iteration 0137/0188: training loss 0.814 Epoch 26 iteration 0138/0188: training loss 0.814 Epoch 26 iteration 0139/0188: training loss 0.814 Epoch 26 iteration 0140/0188: training loss 0.815 Epoch 26 iteration 0141/0188: training loss 0.815 Epoch 26 iteration 0142/0188: training loss 0.815 Epoch 26 iteration 0143/0188: training loss 0.817 Epoch 26 iteration 0144/0188: training loss 0.817 Epoch 26 iteration 0145/0188: training loss 0.817 Epoch 26 iteration 0146/0188: training loss 0.817 Epoch 26 iteration 0147/0188: training loss 0.817 Epoch 26 iteration 0148/0188: training loss 0.817 Epoch 26 iteration 0149/0188: training loss 0.818 Epoch 26 iteration 0150/0188: training loss 0.818 Epoch 26 iteration 0151/0188: training loss 0.818 Epoch 26 iteration 0152/0188: training loss 0.817 Epoch 26 iteration 0153/0188: training loss 0.816 Epoch 26 iteration 0154/0188: training loss 0.816 Epoch 26 iteration 0155/0188: training loss 0.818 Epoch 26 iteration 0156/0188: training loss 0.818 Epoch 26 iteration 0157/0188: training loss 0.818 Epoch 26 iteration 0158/0188: training loss 0.818 Epoch 26 iteration 0159/0188: training loss 0.818 Epoch 26 iteration 0160/0188: training loss 0.818 Epoch 26 iteration 0161/0188: training loss 0.818 Epoch 26 iteration 0162/0188: training loss 0.819 Epoch 26 iteration 0163/0188: training loss 0.818 Epoch 26 iteration 0164/0188: training loss 0.820 Epoch 26 iteration 0165/0188: training loss 0.822 Epoch 26 iteration 0166/0188: training loss 0.822 Epoch 26 iteration 0167/0188: training loss 0.820 Epoch 26 iteration 0168/0188: training loss 0.820 Epoch 26 iteration 0169/0188: training loss 0.820 Epoch 26 iteration 0170/0188: training loss 0.821 Epoch 26 iteration 0171/0188: training loss 0.822 Epoch 26 iteration 0172/0188: training loss 0.822 Epoch 26 iteration 0173/0188: training loss 0.822 Epoch 26 iteration 0174/0188: training loss 0.822 Epoch 26 iteration 0175/0188: training loss 0.821 Epoch 26 iteration 0176/0188: training loss 0.820 Epoch 26 iteration 0177/0188: training loss 0.819 Epoch 26 iteration 0178/0188: training loss 0.819 Epoch 26 iteration 0179/0188: training loss 0.818 Epoch 26 iteration 0180/0188: training loss 0.817 Epoch 26 iteration 0181/0188: training loss 0.817 Epoch 26 iteration 0182/0188: training loss 0.817 Epoch 26 iteration 0183/0188: training loss 0.817 Epoch 26 iteration 0184/0188: training loss 0.816 Epoch 26 iteration 0185/0188: training loss 0.815 Epoch 26 iteration 0186/0188: training loss 0.815 Epoch 26 validation pixAcc: 0.871, mIoU: 0.370 Epoch 27 iteration 0001/0187: training loss 1.049 Epoch 27 iteration 0002/0187: training loss 1.058 Epoch 27 iteration 0003/0187: training loss 1.013 Epoch 27 iteration 0004/0187: training loss 1.003 Epoch 27 iteration 0005/0187: training loss 0.953 Epoch 27 iteration 0006/0187: training loss 0.928 Epoch 27 iteration 0007/0187: training loss 0.909 Epoch 27 iteration 0008/0187: training loss 0.931 Epoch 27 iteration 0009/0187: training loss 0.910 Epoch 27 iteration 0010/0187: training loss 0.920 Epoch 27 iteration 0011/0187: training loss 0.902 Epoch 27 iteration 0012/0187: training loss 0.877 Epoch 27 iteration 0013/0187: training loss 0.870 Epoch 27 iteration 0014/0187: training loss 0.871 Epoch 27 iteration 0015/0187: training loss 0.861 Epoch 27 iteration 0016/0187: training loss 0.867 Epoch 27 iteration 0017/0187: training loss 0.865 Epoch 27 iteration 0018/0187: training loss 0.865 Epoch 27 iteration 0019/0187: training loss 0.867 Epoch 27 iteration 0020/0187: training loss 0.866 Epoch 27 iteration 0021/0187: training loss 0.857 Epoch 27 iteration 0022/0187: training loss 0.860 Epoch 27 iteration 0023/0187: training loss 0.852 Epoch 27 iteration 0024/0187: training loss 0.844 Epoch 27 iteration 0025/0187: training loss 0.853 Epoch 27 iteration 0026/0187: training loss 0.846 Epoch 27 iteration 0027/0187: training loss 0.847 Epoch 27 iteration 0028/0187: training loss 0.844 Epoch 27 iteration 0029/0187: training loss 0.840 Epoch 27 iteration 0030/0187: training loss 0.844 Epoch 27 iteration 0031/0187: training loss 0.843 Epoch 27 iteration 0032/0187: training loss 0.851 Epoch 27 iteration 0033/0187: training loss 0.852 Epoch 27 iteration 0034/0187: training loss 0.854 Epoch 27 iteration 0035/0187: training loss 0.851 Epoch 27 iteration 0036/0187: training loss 0.846 Epoch 27 iteration 0037/0187: training loss 0.847 Epoch 27 iteration 0038/0187: training loss 0.845 Epoch 27 iteration 0039/0187: training loss 0.842 Epoch 27 iteration 0040/0187: training loss 0.838 Epoch 27 iteration 0041/0187: training loss 0.835 Epoch 27 iteration 0042/0187: training loss 0.839 Epoch 27 iteration 0043/0187: training loss 0.848 Epoch 27 iteration 0044/0187: training loss 0.846 Epoch 27 iteration 0045/0187: training loss 0.840 Epoch 27 iteration 0046/0187: training loss 0.835 Epoch 27 iteration 0047/0187: training loss 0.836 Epoch 27 iteration 0048/0187: training loss 0.834 Epoch 27 iteration 0049/0187: training loss 0.836 Epoch 27 iteration 0050/0187: training loss 0.834 Epoch 27 iteration 0051/0187: training loss 0.833 Epoch 27 iteration 0052/0187: training loss 0.831 Epoch 27 iteration 0053/0187: training loss 0.837 Epoch 27 iteration 0054/0187: training loss 0.838 Epoch 27 iteration 0055/0187: training loss 0.838 Epoch 27 iteration 0056/0187: training loss 0.838 Epoch 27 iteration 0057/0187: training loss 0.837 Epoch 27 iteration 0058/0187: training loss 0.836 Epoch 27 iteration 0059/0187: training loss 0.836 Epoch 27 iteration 0060/0187: training loss 0.837 Epoch 27 iteration 0061/0187: training loss 0.833 Epoch 27 iteration 0062/0187: training loss 0.833 Epoch 27 iteration 0063/0187: training loss 0.837 Epoch 27 iteration 0064/0187: training loss 0.835 Epoch 27 iteration 0065/0187: training loss 0.837 Epoch 27 iteration 0066/0187: training loss 0.838 Epoch 27 iteration 0067/0187: training loss 0.837 Epoch 27 iteration 0068/0187: training loss 0.833 Epoch 27 iteration 0069/0187: training loss 0.832 Epoch 27 iteration 0070/0187: training loss 0.832 Epoch 27 iteration 0071/0187: training loss 0.832 Epoch 27 iteration 0072/0187: training loss 0.831 Epoch 27 iteration 0073/0187: training loss 0.830 Epoch 27 iteration 0074/0187: training loss 0.832 Epoch 27 iteration 0075/0187: training loss 0.832 Epoch 27 iteration 0076/0187: training loss 0.832 Epoch 27 iteration 0077/0187: training loss 0.827 Epoch 27 iteration 0078/0187: training loss 0.826 Epoch 27 iteration 0079/0187: training loss 0.825 Epoch 27 iteration 0080/0187: training loss 0.824 Epoch 27 iteration 0081/0187: training loss 0.822 Epoch 27 iteration 0082/0187: training loss 0.821 Epoch 27 iteration 0083/0187: training loss 0.821 Epoch 27 iteration 0084/0187: training loss 0.823 Epoch 27 iteration 0085/0187: training loss 0.822 Epoch 27 iteration 0086/0187: training loss 0.823 Epoch 27 iteration 0087/0187: training loss 0.823 Epoch 27 iteration 0088/0187: training loss 0.823 Epoch 27 iteration 0089/0187: training loss 0.822 Epoch 27 iteration 0090/0187: training loss 0.820 Epoch 27 iteration 0091/0187: training loss 0.821 Epoch 27 iteration 0092/0187: training loss 0.822 Epoch 27 iteration 0093/0187: training loss 0.821 Epoch 27 iteration 0094/0187: training loss 0.821 Epoch 27 iteration 0095/0187: training loss 0.821 Epoch 27 iteration 0096/0187: training loss 0.820 Epoch 27 iteration 0097/0187: training loss 0.819 Epoch 27 iteration 0098/0187: training loss 0.820 Epoch 27 iteration 0099/0187: training loss 0.820 Epoch 27 iteration 0100/0187: training loss 0.820 Epoch 27 iteration 0101/0187: training loss 0.822 Epoch 27 iteration 0102/0187: training loss 0.821 Epoch 27 iteration 0103/0187: training loss 0.821 Epoch 27 iteration 0104/0187: training loss 0.822 Epoch 27 iteration 0105/0187: training loss 0.823 Epoch 27 iteration 0106/0187: training loss 0.822 Epoch 27 iteration 0107/0187: training loss 0.822 Epoch 27 iteration 0108/0187: training loss 0.821 Epoch 27 iteration 0109/0187: training loss 0.821 Epoch 27 iteration 0110/0187: training loss 0.821 Epoch 27 iteration 0111/0187: training loss 0.820 Epoch 27 iteration 0112/0187: training loss 0.824 Epoch 27 iteration 0113/0187: training loss 0.824 Epoch 27 iteration 0114/0187: training loss 0.824 Epoch 27 iteration 0115/0187: training loss 0.825 Epoch 27 iteration 0116/0187: training loss 0.828 Epoch 27 iteration 0117/0187: training loss 0.828 Epoch 27 iteration 0118/0187: training loss 0.828 Epoch 27 iteration 0119/0187: training loss 0.829 Epoch 27 iteration 0120/0187: training loss 0.829 Epoch 27 iteration 0121/0187: training loss 0.829 Epoch 27 iteration 0122/0187: training loss 0.829 Epoch 27 iteration 0123/0187: training loss 0.828 Epoch 27 iteration 0124/0187: training loss 0.828 Epoch 27 iteration 0125/0187: training loss 0.829 Epoch 27 iteration 0126/0187: training loss 0.827 Epoch 27 iteration 0127/0187: training loss 0.828 Epoch 27 iteration 0128/0187: training loss 0.828 Epoch 27 iteration 0129/0187: training loss 0.827 Epoch 27 iteration 0130/0187: training loss 0.828 Epoch 27 iteration 0131/0187: training loss 0.827 Epoch 27 iteration 0132/0187: training loss 0.826 Epoch 27 iteration 0133/0187: training loss 0.826 Epoch 27 iteration 0134/0187: training loss 0.824 Epoch 27 iteration 0135/0187: training loss 0.825 Epoch 27 iteration 0136/0187: training loss 0.824 Epoch 27 iteration 0137/0187: training loss 0.823 Epoch 27 iteration 0138/0187: training loss 0.823 Epoch 27 iteration 0139/0187: training loss 0.822 Epoch 27 iteration 0140/0187: training loss 0.821 Epoch 27 iteration 0141/0187: training loss 0.821 Epoch 27 iteration 0142/0187: training loss 0.821 Epoch 27 iteration 0143/0187: training loss 0.820 Epoch 27 iteration 0144/0187: training loss 0.820 Epoch 27 iteration 0145/0187: training loss 0.820 Epoch 27 iteration 0146/0187: training loss 0.819 Epoch 27 iteration 0147/0187: training loss 0.819 Epoch 27 iteration 0148/0187: training loss 0.819 Epoch 27 iteration 0149/0187: training loss 0.817 Epoch 27 iteration 0150/0187: training loss 0.817 Epoch 27 iteration 0151/0187: training loss 0.816 Epoch 27 iteration 0152/0187: training loss 0.817 Epoch 27 iteration 0153/0187: training loss 0.819 Epoch 27 iteration 0154/0187: training loss 0.818 Epoch 27 iteration 0155/0187: training loss 0.819 Epoch 27 iteration 0156/0187: training loss 0.818 Epoch 27 iteration 0157/0187: training loss 0.819 Epoch 27 iteration 0158/0187: training loss 0.818 Epoch 27 iteration 0159/0187: training loss 0.816 Epoch 27 iteration 0160/0187: training loss 0.815 Epoch 27 iteration 0161/0187: training loss 0.815 Epoch 27 iteration 0162/0187: training loss 0.815 Epoch 27 iteration 0163/0187: training loss 0.817 Epoch 27 iteration 0164/0187: training loss 0.817 Epoch 27 iteration 0165/0187: training loss 0.817 Epoch 27 iteration 0166/0187: training loss 0.817 Epoch 27 iteration 0167/0187: training loss 0.817 Epoch 27 iteration 0168/0187: training loss 0.816 Epoch 27 iteration 0169/0187: training loss 0.815 Epoch 27 iteration 0170/0187: training loss 0.814 Epoch 27 iteration 0171/0187: training loss 0.814 Epoch 27 iteration 0172/0187: training loss 0.815 Epoch 27 iteration 0173/0187: training loss 0.815 Epoch 27 iteration 0174/0187: training loss 0.816 Epoch 27 iteration 0175/0187: training loss 0.815 Epoch 27 iteration 0176/0187: training loss 0.816 Epoch 27 iteration 0177/0187: training loss 0.816 Epoch 27 iteration 0178/0187: training loss 0.816 Epoch 27 iteration 0179/0187: training loss 0.816 Epoch 27 iteration 0180/0187: training loss 0.816 Epoch 27 iteration 0181/0187: training loss 0.815 Epoch 27 iteration 0182/0187: training loss 0.815 Epoch 27 iteration 0183/0187: training loss 0.815 Epoch 27 iteration 0184/0187: training loss 0.814 Epoch 27 iteration 0185/0187: training loss 0.815 Epoch 27 iteration 0186/0187: training loss 0.816 Epoch 27 iteration 0187/0187: training loss 0.815 Epoch 27 validation pixAcc: 0.869, mIoU: 0.373 Epoch 28 iteration 0001/0187: training loss 0.846 Epoch 28 iteration 0002/0187: training loss 0.860 Epoch 28 iteration 0003/0187: training loss 0.941 Epoch 28 iteration 0004/0187: training loss 0.929 Epoch 28 iteration 0005/0187: training loss 0.912 Epoch 28 iteration 0006/0187: training loss 0.916 Epoch 28 iteration 0007/0187: training loss 0.895 Epoch 28 iteration 0008/0187: training loss 0.888 Epoch 28 iteration 0009/0187: training loss 0.893 Epoch 28 iteration 0010/0187: training loss 0.891 Epoch 28 iteration 0011/0187: training loss 0.912 Epoch 28 iteration 0012/0187: training loss 0.903 Epoch 28 iteration 0013/0187: training loss 0.897 Epoch 28 iteration 0014/0187: training loss 0.879 Epoch 28 iteration 0015/0187: training loss 0.879 Epoch 28 iteration 0016/0187: training loss 0.884 Epoch 28 iteration 0017/0187: training loss 0.872 Epoch 28 iteration 0018/0187: training loss 0.864 Epoch 28 iteration 0019/0187: training loss 0.864 Epoch 28 iteration 0020/0187: training loss 0.857 Epoch 28 iteration 0021/0187: training loss 0.860 Epoch 28 iteration 0022/0187: training loss 0.860 Epoch 28 iteration 0023/0187: training loss 0.851 Epoch 28 iteration 0024/0187: training loss 0.857 Epoch 28 iteration 0025/0187: training loss 0.860 Epoch 28 iteration 0026/0187: training loss 0.850 Epoch 28 iteration 0027/0187: training loss 0.846 Epoch 28 iteration 0028/0187: training loss 0.847 Epoch 28 iteration 0029/0187: training loss 0.847 Epoch 28 iteration 0030/0187: training loss 0.847 Epoch 28 iteration 0031/0187: training loss 0.844 Epoch 28 iteration 0032/0187: training loss 0.842 Epoch 28 iteration 0033/0187: training loss 0.840 Epoch 28 iteration 0034/0187: training loss 0.846 Epoch 28 iteration 0035/0187: training loss 0.839 Epoch 28 iteration 0036/0187: training loss 0.834 Epoch 28 iteration 0037/0187: training loss 0.837 Epoch 28 iteration 0038/0187: training loss 0.832 Epoch 28 iteration 0039/0187: training loss 0.827 Epoch 28 iteration 0040/0187: training loss 0.828 Epoch 28 iteration 0041/0187: training loss 0.825 Epoch 28 iteration 0042/0187: training loss 0.822 Epoch 28 iteration 0043/0187: training loss 0.823 Epoch 28 iteration 0044/0187: training loss 0.829 Epoch 28 iteration 0045/0187: training loss 0.830 Epoch 28 iteration 0046/0187: training loss 0.830 Epoch 28 iteration 0047/0187: training loss 0.833 Epoch 28 iteration 0048/0187: training loss 0.831 Epoch 28 iteration 0049/0187: training loss 0.830 Epoch 28 iteration 0050/0187: training loss 0.835 Epoch 28 iteration 0051/0187: training loss 0.833 Epoch 28 iteration 0052/0187: training loss 0.834 Epoch 28 iteration 0053/0187: training loss 0.838 Epoch 28 iteration 0054/0187: training loss 0.839 Epoch 28 iteration 0055/0187: training loss 0.833 Epoch 28 iteration 0056/0187: training loss 0.832 Epoch 28 iteration 0057/0187: training loss 0.833 Epoch 28 iteration 0058/0187: training loss 0.832 Epoch 28 iteration 0059/0187: training loss 0.830 Epoch 28 iteration 0060/0187: training loss 0.831 Epoch 28 iteration 0061/0187: training loss 0.835 Epoch 28 iteration 0062/0187: training loss 0.835 Epoch 28 iteration 0063/0187: training loss 0.833 Epoch 28 iteration 0064/0187: training loss 0.831 Epoch 28 iteration 0065/0187: training loss 0.831 Epoch 28 iteration 0066/0187: training loss 0.831 Epoch 28 iteration 0067/0187: training loss 0.830 Epoch 28 iteration 0068/0187: training loss 0.831 Epoch 28 iteration 0069/0187: training loss 0.831 Epoch 28 iteration 0070/0187: training loss 0.831 Epoch 28 iteration 0071/0187: training loss 0.827 Epoch 28 iteration 0072/0187: training loss 0.826 Epoch 28 iteration 0073/0187: training loss 0.824 Epoch 28 iteration 0074/0187: training loss 0.823 Epoch 28 iteration 0075/0187: training loss 0.824 Epoch 28 iteration 0076/0187: training loss 0.822 Epoch 28 iteration 0077/0187: training loss 0.822 Epoch 28 iteration 0078/0187: training loss 0.821 Epoch 28 iteration 0079/0187: training loss 0.820 Epoch 28 iteration 0080/0187: training loss 0.822 Epoch 28 iteration 0081/0187: training loss 0.820 Epoch 28 iteration 0082/0187: training loss 0.822 Epoch 28 iteration 0083/0187: training loss 0.819 Epoch 28 iteration 0084/0187: training loss 0.819 Epoch 28 iteration 0085/0187: training loss 0.818 Epoch 28 iteration 0086/0187: training loss 0.818 Epoch 28 iteration 0087/0187: training loss 0.818 Epoch 28 iteration 0088/0187: training loss 0.819 Epoch 28 iteration 0089/0187: training loss 0.817 Epoch 28 iteration 0090/0187: training loss 0.819 Epoch 28 iteration 0091/0188: training loss 0.818 Epoch 28 iteration 0092/0188: training loss 0.816 Epoch 28 iteration 0093/0188: training loss 0.816 Epoch 28 iteration 0094/0188: training loss 0.816 Epoch 28 iteration 0095/0188: training loss 0.816 Epoch 28 iteration 0096/0188: training loss 0.818 Epoch 28 iteration 0097/0188: training loss 0.818 Epoch 28 iteration 0098/0188: training loss 0.818 Epoch 28 iteration 0099/0188: training loss 0.816 Epoch 28 iteration 0100/0188: training loss 0.816 Epoch 28 iteration 0101/0188: training loss 0.814 Epoch 28 iteration 0102/0188: training loss 0.815 Epoch 28 iteration 0103/0188: training loss 0.816 Epoch 28 iteration 0104/0188: training loss 0.815 Epoch 28 iteration 0105/0188: training loss 0.814 Epoch 28 iteration 0106/0188: training loss 0.819 Epoch 28 iteration 0107/0188: training loss 0.817 Epoch 28 iteration 0108/0188: training loss 0.818 Epoch 28 iteration 0109/0188: training loss 0.816 Epoch 28 iteration 0110/0188: training loss 0.818 Epoch 28 iteration 0111/0188: training loss 0.819 Epoch 28 iteration 0112/0188: training loss 0.818 Epoch 28 iteration 0113/0188: training loss 0.818 Epoch 28 iteration 0114/0188: training loss 0.818 Epoch 28 iteration 0115/0188: training loss 0.818 Epoch 28 iteration 0116/0188: training loss 0.819 Epoch 28 iteration 0117/0188: training loss 0.820 Epoch 28 iteration 0118/0188: training loss 0.818 Epoch 28 iteration 0119/0188: training loss 0.817 Epoch 28 iteration 0120/0188: training loss 0.817 Epoch 28 iteration 0121/0188: training loss 0.816 Epoch 28 iteration 0122/0188: training loss 0.814 Epoch 28 iteration 0123/0188: training loss 0.813 Epoch 28 iteration 0124/0188: training loss 0.814 Epoch 28 iteration 0125/0188: training loss 0.815 Epoch 28 iteration 0126/0188: training loss 0.815 Epoch 28 iteration 0127/0188: training loss 0.815 Epoch 28 iteration 0128/0188: training loss 0.813 Epoch 28 iteration 0129/0188: training loss 0.816 Epoch 28 iteration 0130/0188: training loss 0.815 Epoch 28 iteration 0131/0188: training loss 0.814 Epoch 28 iteration 0132/0188: training loss 0.814 Epoch 28 iteration 0133/0188: training loss 0.814 Epoch 28 iteration 0134/0188: training loss 0.813 Epoch 28 iteration 0135/0188: training loss 0.814 Epoch 28 iteration 0136/0188: training loss 0.814 Epoch 28 iteration 0137/0188: training loss 0.813 Epoch 28 iteration 0138/0188: training loss 0.814 Epoch 28 iteration 0139/0188: training loss 0.815 Epoch 28 iteration 0140/0188: training loss 0.815 Epoch 28 iteration 0141/0188: training loss 0.815 Epoch 28 iteration 0142/0188: training loss 0.815 Epoch 28 iteration 0143/0188: training loss 0.816 Epoch 28 iteration 0144/0188: training loss 0.815 Epoch 28 iteration 0145/0188: training loss 0.817 Epoch 28 iteration 0146/0188: training loss 0.817 Epoch 28 iteration 0147/0188: training loss 0.816 Epoch 28 iteration 0148/0188: training loss 0.816 Epoch 28 iteration 0149/0188: training loss 0.818 Epoch 28 iteration 0150/0188: training loss 0.818 Epoch 28 iteration 0151/0188: training loss 0.818 Epoch 28 iteration 0152/0188: training loss 0.818 Epoch 28 iteration 0153/0188: training loss 0.817 Epoch 28 iteration 0154/0188: training loss 0.816 Epoch 28 iteration 0155/0188: training loss 0.816 Epoch 28 iteration 0156/0188: training loss 0.816 Epoch 28 iteration 0157/0188: training loss 0.817 Epoch 28 iteration 0158/0188: training loss 0.816 Epoch 28 iteration 0159/0188: training loss 0.816 Epoch 28 iteration 0160/0188: training loss 0.816 Epoch 28 iteration 0161/0188: training loss 0.817 Epoch 28 iteration 0162/0188: training loss 0.816 Epoch 28 iteration 0163/0188: training loss 0.816 Epoch 28 iteration 0164/0188: training loss 0.816 Epoch 28 iteration 0165/0188: training loss 0.817 Epoch 28 iteration 0166/0188: training loss 0.817 Epoch 28 iteration 0167/0188: training loss 0.816 Epoch 28 iteration 0168/0188: training loss 0.816 Epoch 28 iteration 0169/0188: training loss 0.816 Epoch 28 iteration 0170/0188: training loss 0.816 Epoch 28 iteration 0171/0188: training loss 0.816 Epoch 28 iteration 0172/0188: training loss 0.816 Epoch 28 iteration 0173/0188: training loss 0.816 Epoch 28 iteration 0174/0188: training loss 0.816 Epoch 28 iteration 0175/0188: training loss 0.816 Epoch 28 iteration 0176/0188: training loss 0.816 Epoch 28 iteration 0177/0188: training loss 0.816 Epoch 28 iteration 0178/0188: training loss 0.816 Epoch 28 iteration 0179/0188: training loss 0.815 Epoch 28 iteration 0180/0188: training loss 0.815 Epoch 28 iteration 0181/0188: training loss 0.815 Epoch 28 iteration 0182/0188: training loss 0.815 Epoch 28 iteration 0183/0188: training loss 0.815 Epoch 28 iteration 0184/0188: training loss 0.815 Epoch 28 iteration 0185/0188: training loss 0.814 Epoch 28 iteration 0186/0188: training loss 0.814 Epoch 28 validation pixAcc: 0.871, mIoU: 0.374 Epoch 29 iteration 0001/0187: training loss 0.704 Epoch 29 iteration 0002/0187: training loss 0.753 Epoch 29 iteration 0003/0187: training loss 0.727 Epoch 29 iteration 0004/0187: training loss 0.752 Epoch 29 iteration 0005/0187: training loss 0.734 Epoch 29 iteration 0006/0187: training loss 0.760 Epoch 29 iteration 0007/0187: training loss 0.756 Epoch 29 iteration 0008/0187: training loss 0.764 Epoch 29 iteration 0009/0187: training loss 0.753 Epoch 29 iteration 0010/0187: training loss 0.780 Epoch 29 iteration 0011/0187: training loss 0.774 Epoch 29 iteration 0012/0187: training loss 0.781 Epoch 29 iteration 0013/0187: training loss 0.777 Epoch 29 iteration 0014/0187: training loss 0.770 Epoch 29 iteration 0015/0187: training loss 0.766 Epoch 29 iteration 0016/0187: training loss 0.766 Epoch 29 iteration 0017/0187: training loss 0.771 Epoch 29 iteration 0018/0187: training loss 0.777 Epoch 29 iteration 0019/0187: training loss 0.792 Epoch 29 iteration 0020/0187: training loss 0.797 Epoch 29 iteration 0021/0187: training loss 0.793 Epoch 29 iteration 0022/0187: training loss 0.787 Epoch 29 iteration 0023/0187: training loss 0.785 Epoch 29 iteration 0024/0187: training loss 0.784 Epoch 29 iteration 0025/0187: training loss 0.781 Epoch 29 iteration 0026/0187: training loss 0.788 Epoch 29 iteration 0027/0187: training loss 0.786 Epoch 29 iteration 0028/0187: training loss 0.790 Epoch 29 iteration 0029/0187: training loss 0.784 Epoch 29 iteration 0030/0187: training loss 0.788 Epoch 29 iteration 0031/0187: training loss 0.793 Epoch 29 iteration 0032/0187: training loss 0.801 Epoch 29 iteration 0033/0187: training loss 0.813 Epoch 29 iteration 0034/0187: training loss 0.811 Epoch 29 iteration 0035/0187: training loss 0.813 Epoch 29 iteration 0036/0187: training loss 0.810 Epoch 29 iteration 0037/0187: training loss 0.808 Epoch 29 iteration 0038/0187: training loss 0.808 Epoch 29 iteration 0039/0187: training loss 0.808 Epoch 29 iteration 0040/0187: training loss 0.805 Epoch 29 iteration 0041/0187: training loss 0.804 Epoch 29 iteration 0042/0187: training loss 0.801 Epoch 29 iteration 0043/0187: training loss 0.805 Epoch 29 iteration 0044/0187: training loss 0.803 Epoch 29 iteration 0045/0187: training loss 0.801 Epoch 29 iteration 0046/0187: training loss 0.799 Epoch 29 iteration 0047/0187: training loss 0.798 Epoch 29 iteration 0048/0187: training loss 0.797 Epoch 29 iteration 0049/0187: training loss 0.795 Epoch 29 iteration 0050/0187: training loss 0.793 Epoch 29 iteration 0051/0187: training loss 0.794 Epoch 29 iteration 0052/0187: training loss 0.797 Epoch 29 iteration 0053/0187: training loss 0.795 Epoch 29 iteration 0054/0187: training loss 0.798 Epoch 29 iteration 0055/0187: training loss 0.795 Epoch 29 iteration 0056/0187: training loss 0.799 Epoch 29 iteration 0057/0187: training loss 0.797 Epoch 29 iteration 0058/0187: training loss 0.798 Epoch 29 iteration 0059/0187: training loss 0.796 Epoch 29 iteration 0060/0187: training loss 0.793 Epoch 29 iteration 0061/0187: training loss 0.793 Epoch 29 iteration 0062/0187: training loss 0.792 Epoch 29 iteration 0063/0187: training loss 0.794 Epoch 29 iteration 0064/0187: training loss 0.796 Epoch 29 iteration 0065/0187: training loss 0.796 Epoch 29 iteration 0066/0187: training loss 0.796 Epoch 29 iteration 0067/0187: training loss 0.796 Epoch 29 iteration 0068/0187: training loss 0.795 Epoch 29 iteration 0069/0187: training loss 0.795 Epoch 29 iteration 0070/0187: training loss 0.796 Epoch 29 iteration 0071/0187: training loss 0.797 Epoch 29 iteration 0072/0187: training loss 0.796 Epoch 29 iteration 0073/0187: training loss 0.795 Epoch 29 iteration 0074/0187: training loss 0.794 Epoch 29 iteration 0075/0187: training loss 0.793 Epoch 29 iteration 0076/0187: training loss 0.791 Epoch 29 iteration 0077/0187: training loss 0.789 Epoch 29 iteration 0078/0187: training loss 0.788 Epoch 29 iteration 0079/0187: training loss 0.789 Epoch 29 iteration 0080/0187: training loss 0.788 Epoch 29 iteration 0081/0187: training loss 0.788 Epoch 29 iteration 0082/0187: training loss 0.789 Epoch 29 iteration 0083/0187: training loss 0.791 Epoch 29 iteration 0084/0187: training loss 0.789 Epoch 29 iteration 0085/0187: training loss 0.789 Epoch 29 iteration 0086/0187: training loss 0.789 Epoch 29 iteration 0087/0187: training loss 0.789 Epoch 29 iteration 0088/0187: training loss 0.789 Epoch 29 iteration 0089/0187: training loss 0.788 Epoch 29 iteration 0090/0187: training loss 0.787 Epoch 29 iteration 0091/0187: training loss 0.789 Epoch 29 iteration 0092/0187: training loss 0.790 Epoch 29 iteration 0093/0187: training loss 0.791 Epoch 29 iteration 0094/0187: training loss 0.792 Epoch 29 iteration 0095/0187: training loss 0.791 Epoch 29 iteration 0096/0187: training loss 0.793 Epoch 29 iteration 0097/0187: training loss 0.794 Epoch 29 iteration 0098/0187: training loss 0.794 Epoch 29 iteration 0099/0187: training loss 0.793 Epoch 29 iteration 0100/0187: training loss 0.792 Epoch 29 iteration 0101/0187: training loss 0.793 Epoch 29 iteration 0102/0187: training loss 0.794 Epoch 29 iteration 0103/0187: training loss 0.794 Epoch 29 iteration 0104/0187: training loss 0.794 Epoch 29 iteration 0105/0187: training loss 0.793 Epoch 29 iteration 0106/0187: training loss 0.793 Epoch 29 iteration 0107/0187: training loss 0.794 Epoch 29 iteration 0108/0187: training loss 0.793 Epoch 29 iteration 0109/0187: training loss 0.794 Epoch 29 iteration 0110/0187: training loss 0.796 Epoch 29 iteration 0111/0187: training loss 0.798 Epoch 29 iteration 0112/0187: training loss 0.797 Epoch 29 iteration 0113/0187: training loss 0.797 Epoch 29 iteration 0114/0187: training loss 0.798 Epoch 29 iteration 0115/0187: training loss 0.797 Epoch 29 iteration 0116/0187: training loss 0.797 Epoch 29 iteration 0117/0187: training loss 0.799 Epoch 29 iteration 0118/0187: training loss 0.798 Epoch 29 iteration 0119/0187: training loss 0.798 Epoch 29 iteration 0120/0187: training loss 0.800 Epoch 29 iteration 0121/0187: training loss 0.800 Epoch 29 iteration 0122/0187: training loss 0.801 Epoch 29 iteration 0123/0187: training loss 0.801 Epoch 29 iteration 0124/0187: training loss 0.801 Epoch 29 iteration 0125/0187: training loss 0.801 Epoch 29 iteration 0126/0187: training loss 0.804 Epoch 29 iteration 0127/0187: training loss 0.802 Epoch 29 iteration 0128/0187: training loss 0.803 Epoch 29 iteration 0129/0187: training loss 0.807 Epoch 29 iteration 0130/0187: training loss 0.808 Epoch 29 iteration 0131/0187: training loss 0.807 Epoch 29 iteration 0132/0187: training loss 0.806 Epoch 29 iteration 0133/0187: training loss 0.806 Epoch 29 iteration 0134/0187: training loss 0.805 Epoch 29 iteration 0135/0187: training loss 0.806 Epoch 29 iteration 0136/0187: training loss 0.805 Epoch 29 iteration 0137/0187: training loss 0.805 Epoch 29 iteration 0138/0187: training loss 0.805 Epoch 29 iteration 0139/0187: training loss 0.806 Epoch 29 iteration 0140/0187: training loss 0.808 Epoch 29 iteration 0141/0187: training loss 0.807 Epoch 29 iteration 0142/0187: training loss 0.807 Epoch 29 iteration 0143/0187: training loss 0.806 Epoch 29 iteration 0144/0187: training loss 0.806 Epoch 29 iteration 0145/0187: training loss 0.806 Epoch 29 iteration 0146/0187: training loss 0.806 Epoch 29 iteration 0147/0187: training loss 0.806 Epoch 29 iteration 0148/0187: training loss 0.805 Epoch 29 iteration 0149/0187: training loss 0.804 Epoch 29 iteration 0150/0187: training loss 0.804 Epoch 29 iteration 0151/0187: training loss 0.803 Epoch 29 iteration 0152/0187: training loss 0.802 Epoch 29 iteration 0153/0187: training loss 0.802 Epoch 29 iteration 0154/0187: training loss 0.800 Epoch 29 iteration 0155/0187: training loss 0.800 Epoch 29 iteration 0156/0187: training loss 0.801 Epoch 29 iteration 0157/0187: training loss 0.800 Epoch 29 iteration 0158/0187: training loss 0.800 Epoch 29 iteration 0159/0187: training loss 0.799 Epoch 29 iteration 0160/0187: training loss 0.799 Epoch 29 iteration 0161/0187: training loss 0.799 Epoch 29 iteration 0162/0187: training loss 0.799 Epoch 29 iteration 0163/0187: training loss 0.799 Epoch 29 iteration 0164/0187: training loss 0.799 Epoch 29 iteration 0165/0187: training loss 0.800 Epoch 29 iteration 0166/0187: training loss 0.799 Epoch 29 iteration 0167/0187: training loss 0.799 Epoch 29 iteration 0168/0187: training loss 0.800 Epoch 29 iteration 0169/0187: training loss 0.801 Epoch 29 iteration 0170/0187: training loss 0.801 Epoch 29 iteration 0171/0187: training loss 0.801 Epoch 29 iteration 0172/0187: training loss 0.800 Epoch 29 iteration 0173/0187: training loss 0.800 Epoch 29 iteration 0174/0187: training loss 0.801 Epoch 29 iteration 0175/0187: training loss 0.801 Epoch 29 iteration 0176/0187: training loss 0.800 Epoch 29 iteration 0177/0187: training loss 0.800 Epoch 29 iteration 0178/0187: training loss 0.801 Epoch 29 iteration 0179/0187: training loss 0.801 Epoch 29 iteration 0180/0187: training loss 0.801 Epoch 29 iteration 0181/0187: training loss 0.801 Epoch 29 iteration 0182/0187: training loss 0.801 Epoch 29 iteration 0183/0187: training loss 0.800 Epoch 29 iteration 0184/0187: training loss 0.800 Epoch 29 iteration 0185/0187: training loss 0.799 Epoch 29 iteration 0186/0187: training loss 0.799 Epoch 29 iteration 0187/0187: training loss 0.800 Epoch 29 validation pixAcc: 0.871, mIoU: 0.372 Epoch 30 iteration 0001/0187: training loss 0.990 Epoch 30 iteration 0002/0187: training loss 0.884 Epoch 30 iteration 0003/0187: training loss 0.838 Epoch 30 iteration 0004/0187: training loss 0.826 Epoch 30 iteration 0005/0187: training loss 0.818 Epoch 30 iteration 0006/0187: training loss 0.843 Epoch 30 iteration 0007/0187: training loss 0.844 Epoch 30 iteration 0008/0187: training loss 0.834 Epoch 30 iteration 0009/0187: training loss 0.832 Epoch 30 iteration 0010/0187: training loss 0.842 Epoch 30 iteration 0011/0187: training loss 0.830 Epoch 30 iteration 0012/0187: training loss 0.832 Epoch 30 iteration 0013/0187: training loss 0.820 Epoch 30 iteration 0014/0187: training loss 0.819 Epoch 30 iteration 0015/0187: training loss 0.822 Epoch 30 iteration 0016/0187: training loss 0.820 Epoch 30 iteration 0017/0187: training loss 0.807 Epoch 30 iteration 0018/0187: training loss 0.799 Epoch 30 iteration 0019/0187: training loss 0.800 Epoch 30 iteration 0020/0187: training loss 0.804 Epoch 30 iteration 0021/0187: training loss 0.796 Epoch 30 iteration 0022/0187: training loss 0.797 Epoch 30 iteration 0023/0187: training loss 0.801 Epoch 30 iteration 0024/0187: training loss 0.804 Epoch 30 iteration 0025/0187: training loss 0.807 Epoch 30 iteration 0026/0187: training loss 0.798 Epoch 30 iteration 0027/0187: training loss 0.794 Epoch 30 iteration 0028/0187: training loss 0.790 Epoch 30 iteration 0029/0187: training loss 0.789 Epoch 30 iteration 0030/0187: training loss 0.789 Epoch 30 iteration 0031/0187: training loss 0.792 Epoch 30 iteration 0032/0187: training loss 0.794 Epoch 30 iteration 0033/0187: training loss 0.791 Epoch 30 iteration 0034/0187: training loss 0.792 Epoch 30 iteration 0035/0187: training loss 0.788 Epoch 30 iteration 0036/0187: training loss 0.788 Epoch 30 iteration 0037/0187: training loss 0.787 Epoch 30 iteration 0038/0187: training loss 0.788 Epoch 30 iteration 0039/0187: training loss 0.785 Epoch 30 iteration 0040/0187: training loss 0.782 Epoch 30 iteration 0041/0187: training loss 0.780 Epoch 30 iteration 0042/0187: training loss 0.780 Epoch 30 iteration 0043/0187: training loss 0.780 Epoch 30 iteration 0044/0187: training loss 0.777 Epoch 30 iteration 0045/0187: training loss 0.774 Epoch 30 iteration 0046/0187: training loss 0.772 Epoch 30 iteration 0047/0187: training loss 0.771 Epoch 30 iteration 0048/0187: training loss 0.774 Epoch 30 iteration 0049/0187: training loss 0.772 Epoch 30 iteration 0050/0187: training loss 0.772 Epoch 30 iteration 0051/0187: training loss 0.775 Epoch 30 iteration 0052/0187: training loss 0.775 Epoch 30 iteration 0053/0187: training loss 0.771 Epoch 30 iteration 0054/0187: training loss 0.771 Epoch 30 iteration 0055/0187: training loss 0.771 Epoch 30 iteration 0056/0187: training loss 0.770 Epoch 30 iteration 0057/0187: training loss 0.770 Epoch 30 iteration 0058/0187: training loss 0.773 Epoch 30 iteration 0059/0187: training loss 0.771 Epoch 30 iteration 0060/0187: training loss 0.772 Epoch 30 iteration 0061/0187: training loss 0.772 Epoch 30 iteration 0062/0187: training loss 0.770 Epoch 30 iteration 0063/0187: training loss 0.770 Epoch 30 iteration 0064/0187: training loss 0.781 Epoch 30 iteration 0065/0187: training loss 0.781 Epoch 30 iteration 0066/0187: training loss 0.782 Epoch 30 iteration 0067/0187: training loss 0.786 Epoch 30 iteration 0068/0187: training loss 0.784 Epoch 30 iteration 0069/0187: training loss 0.783 Epoch 30 iteration 0070/0187: training loss 0.786 Epoch 30 iteration 0071/0187: training loss 0.784 Epoch 30 iteration 0072/0187: training loss 0.785 Epoch 30 iteration 0073/0187: training loss 0.786 Epoch 30 iteration 0074/0187: training loss 0.787 Epoch 30 iteration 0075/0187: training loss 0.785 Epoch 30 iteration 0076/0187: training loss 0.786 Epoch 30 iteration 0077/0187: training loss 0.785 Epoch 30 iteration 0078/0187: training loss 0.788 Epoch 30 iteration 0079/0187: training loss 0.786 Epoch 30 iteration 0080/0187: training loss 0.786 Epoch 30 iteration 0081/0187: training loss 0.787 Epoch 30 iteration 0082/0187: training loss 0.788 Epoch 30 iteration 0083/0187: training loss 0.786 Epoch 30 iteration 0084/0187: training loss 0.789 Epoch 30 iteration 0085/0187: training loss 0.790 Epoch 30 iteration 0086/0187: training loss 0.790 Epoch 30 iteration 0087/0187: training loss 0.789 Epoch 30 iteration 0088/0187: training loss 0.787 Epoch 30 iteration 0089/0187: training loss 0.789 Epoch 30 iteration 0090/0187: training loss 0.791 Epoch 30 iteration 0091/0188: training loss 0.791 Epoch 30 iteration 0092/0188: training loss 0.790 Epoch 30 iteration 0093/0188: training loss 0.791 Epoch 30 iteration 0094/0188: training loss 0.792 Epoch 30 iteration 0095/0188: training loss 0.793 Epoch 30 iteration 0096/0188: training loss 0.793 Epoch 30 iteration 0097/0188: training loss 0.793 Epoch 30 iteration 0098/0188: training loss 0.795 Epoch 30 iteration 0099/0188: training loss 0.794 Epoch 30 iteration 0100/0188: training loss 0.796 Epoch 30 iteration 0101/0188: training loss 0.797 Epoch 30 iteration 0102/0188: training loss 0.795 Epoch 30 iteration 0103/0188: training loss 0.798 Epoch 30 iteration 0104/0188: training loss 0.799 Epoch 30 iteration 0105/0188: training loss 0.800 Epoch 30 iteration 0106/0188: training loss 0.799 Epoch 30 iteration 0107/0188: training loss 0.800 Epoch 30 iteration 0108/0188: training loss 0.801 Epoch 30 iteration 0109/0188: training loss 0.799 Epoch 30 iteration 0110/0188: training loss 0.798 Epoch 30 iteration 0111/0188: training loss 0.800 Epoch 30 iteration 0112/0188: training loss 0.798 Epoch 30 iteration 0113/0188: training loss 0.798 Epoch 30 iteration 0114/0188: training loss 0.798 Epoch 30 iteration 0115/0188: training loss 0.797 Epoch 30 iteration 0116/0188: training loss 0.798 Epoch 30 iteration 0117/0188: training loss 0.796 Epoch 30 iteration 0118/0188: training loss 0.796 Epoch 30 iteration 0119/0188: training loss 0.795 Epoch 30 iteration 0120/0188: training loss 0.796 Epoch 30 iteration 0121/0188: training loss 0.797 Epoch 30 iteration 0122/0188: training loss 0.796 Epoch 30 iteration 0123/0188: training loss 0.796 Epoch 30 iteration 0124/0188: training loss 0.797 Epoch 30 iteration 0125/0188: training loss 0.797 Epoch 30 iteration 0126/0188: training loss 0.796 Epoch 30 iteration 0127/0188: training loss 0.797 Epoch 30 iteration 0128/0188: training loss 0.797 Epoch 30 iteration 0129/0188: training loss 0.798 Epoch 30 iteration 0130/0188: training loss 0.798 Epoch 30 iteration 0131/0188: training loss 0.796 Epoch 30 iteration 0132/0188: training loss 0.796 Epoch 30 iteration 0133/0188: training loss 0.797 Epoch 30 iteration 0134/0188: training loss 0.796 Epoch 30 iteration 0135/0188: training loss 0.796 Epoch 30 iteration 0136/0188: training loss 0.798 Epoch 30 iteration 0137/0188: training loss 0.796 Epoch 30 iteration 0138/0188: training loss 0.795 Epoch 30 iteration 0139/0188: training loss 0.796 Epoch 30 iteration 0140/0188: training loss 0.797 Epoch 30 iteration 0141/0188: training loss 0.797 Epoch 30 iteration 0142/0188: training loss 0.797 Epoch 30 iteration 0143/0188: training loss 0.797 Epoch 30 iteration 0144/0188: training loss 0.796 Epoch 30 iteration 0145/0188: training loss 0.795 Epoch 30 iteration 0146/0188: training loss 0.794 Epoch 30 iteration 0147/0188: training loss 0.794 Epoch 30 iteration 0148/0188: training loss 0.793 Epoch 30 iteration 0149/0188: training loss 0.792 Epoch 30 iteration 0150/0188: training loss 0.791 Epoch 30 iteration 0151/0188: training loss 0.793 Epoch 30 iteration 0152/0188: training loss 0.793 Epoch 30 iteration 0153/0188: training loss 0.793 Epoch 30 iteration 0154/0188: training loss 0.793 Epoch 30 iteration 0155/0188: training loss 0.793 Epoch 30 iteration 0156/0188: training loss 0.794 Epoch 30 iteration 0157/0188: training loss 0.794 Epoch 30 iteration 0158/0188: training loss 0.794 Epoch 30 iteration 0159/0188: training loss 0.795 Epoch 30 iteration 0160/0188: training loss 0.796 Epoch 30 iteration 0161/0188: training loss 0.795 Epoch 30 iteration 0162/0188: training loss 0.796 Epoch 30 iteration 0163/0188: training loss 0.797 Epoch 30 iteration 0164/0188: training loss 0.797 Epoch 30 iteration 0165/0188: training loss 0.796 Epoch 30 iteration 0166/0188: training loss 0.795 Epoch 30 iteration 0167/0188: training loss 0.796 Epoch 30 iteration 0168/0188: training loss 0.795 Epoch 30 iteration 0169/0188: training loss 0.794 Epoch 30 iteration 0170/0188: training loss 0.794 Epoch 30 iteration 0171/0188: training loss 0.794 Epoch 30 iteration 0172/0188: training loss 0.795 Epoch 30 iteration 0173/0188: training loss 0.795 Epoch 30 iteration 0174/0188: training loss 0.795 Epoch 30 iteration 0175/0188: training loss 0.797 Epoch 30 iteration 0176/0188: training loss 0.796 Epoch 30 iteration 0177/0188: training loss 0.796 Epoch 30 iteration 0178/0188: training loss 0.796 Epoch 30 iteration 0179/0188: training loss 0.797 Epoch 30 iteration 0180/0188: training loss 0.796 Epoch 30 iteration 0181/0188: training loss 0.796 Epoch 30 iteration 0182/0188: training loss 0.796 Epoch 30 iteration 0183/0188: training loss 0.795 Epoch 30 iteration 0184/0188: training loss 0.795 Epoch 30 iteration 0185/0188: training loss 0.795 Epoch 30 iteration 0186/0188: training loss 0.794 Epoch 30 validation pixAcc: 0.873, mIoU: 0.381 Epoch 31 iteration 0001/0187: training loss 0.939 Epoch 31 iteration 0002/0187: training loss 0.874 Epoch 31 iteration 0003/0187: training loss 0.833 Epoch 31 iteration 0004/0187: training loss 0.833 Epoch 31 iteration 0005/0187: training loss 0.824 Epoch 31 iteration 0006/0187: training loss 0.855 Epoch 31 iteration 0007/0187: training loss 0.865 Epoch 31 iteration 0008/0187: training loss 0.868 Epoch 31 iteration 0009/0187: training loss 0.875 Epoch 31 iteration 0010/0187: training loss 0.864 Epoch 31 iteration 0011/0187: training loss 0.854 Epoch 31 iteration 0012/0187: training loss 0.858 Epoch 31 iteration 0013/0187: training loss 0.849 Epoch 31 iteration 0014/0187: training loss 0.843 Epoch 31 iteration 0015/0187: training loss 0.834 Epoch 31 iteration 0016/0187: training loss 0.839 Epoch 31 iteration 0017/0187: training loss 0.828 Epoch 31 iteration 0018/0187: training loss 0.824 Epoch 31 iteration 0019/0187: training loss 0.813 Epoch 31 iteration 0020/0187: training loss 0.799 Epoch 31 iteration 0021/0187: training loss 0.794 Epoch 31 iteration 0022/0187: training loss 0.787 Epoch 31 iteration 0023/0187: training loss 0.789 Epoch 31 iteration 0024/0187: training loss 0.794 Epoch 31 iteration 0025/0187: training loss 0.796 Epoch 31 iteration 0026/0187: training loss 0.789 Epoch 31 iteration 0027/0187: training loss 0.788 Epoch 31 iteration 0028/0187: training loss 0.789 Epoch 31 iteration 0029/0187: training loss 0.787 Epoch 31 iteration 0030/0187: training loss 0.790 Epoch 31 iteration 0031/0187: training loss 0.801 Epoch 31 iteration 0032/0187: training loss 0.798 Epoch 31 iteration 0033/0187: training loss 0.798 Epoch 31 iteration 0034/0187: training loss 0.800 Epoch 31 iteration 0035/0187: training loss 0.797 Epoch 31 iteration 0036/0187: training loss 0.799 Epoch 31 iteration 0037/0187: training loss 0.798 Epoch 31 iteration 0038/0187: training loss 0.804 Epoch 31 iteration 0039/0187: training loss 0.806 Epoch 31 iteration 0040/0187: training loss 0.805 Epoch 31 iteration 0041/0187: training loss 0.804 Epoch 31 iteration 0042/0187: training loss 0.805 Epoch 31 iteration 0043/0187: training loss 0.806 Epoch 31 iteration 0044/0187: training loss 0.808 Epoch 31 iteration 0045/0187: training loss 0.811 Epoch 31 iteration 0046/0187: training loss 0.809 Epoch 31 iteration 0047/0187: training loss 0.810 Epoch 31 iteration 0048/0187: training loss 0.811 Epoch 31 iteration 0049/0187: training loss 0.810 Epoch 31 iteration 0050/0187: training loss 0.812 Epoch 31 iteration 0051/0187: training loss 0.811 Epoch 31 iteration 0052/0187: training loss 0.810 Epoch 31 iteration 0053/0187: training loss 0.810 Epoch 31 iteration 0054/0187: training loss 0.810 Epoch 31 iteration 0055/0187: training loss 0.811 Epoch 31 iteration 0056/0187: training loss 0.812 Epoch 31 iteration 0057/0187: training loss 0.811 Epoch 31 iteration 0058/0187: training loss 0.812 Epoch 31 iteration 0059/0187: training loss 0.813 Epoch 31 iteration 0060/0187: training loss 0.812 Epoch 31 iteration 0061/0187: training loss 0.812 Epoch 31 iteration 0062/0187: training loss 0.808 Epoch 31 iteration 0063/0187: training loss 0.806 Epoch 31 iteration 0064/0187: training loss 0.804 Epoch 31 iteration 0065/0187: training loss 0.803 Epoch 31 iteration 0066/0187: training loss 0.802 Epoch 31 iteration 0067/0187: training loss 0.805 Epoch 31 iteration 0068/0187: training loss 0.803 Epoch 31 iteration 0069/0187: training loss 0.802 Epoch 31 iteration 0070/0187: training loss 0.808 Epoch 31 iteration 0071/0187: training loss 0.807 Epoch 31 iteration 0072/0187: training loss 0.805 Epoch 31 iteration 0073/0187: training loss 0.806 Epoch 31 iteration 0074/0187: training loss 0.804 Epoch 31 iteration 0075/0187: training loss 0.804 Epoch 31 iteration 0076/0187: training loss 0.802 Epoch 31 iteration 0077/0187: training loss 0.802 Epoch 31 iteration 0078/0187: training loss 0.802 Epoch 31 iteration 0079/0187: training loss 0.801 Epoch 31 iteration 0080/0187: training loss 0.801 Epoch 31 iteration 0081/0187: training loss 0.801 Epoch 31 iteration 0082/0187: training loss 0.799 Epoch 31 iteration 0083/0187: training loss 0.798 Epoch 31 iteration 0084/0187: training loss 0.799 Epoch 31 iteration 0085/0187: training loss 0.798 Epoch 31 iteration 0086/0187: training loss 0.796 Epoch 31 iteration 0087/0187: training loss 0.795 Epoch 31 iteration 0088/0187: training loss 0.795 Epoch 31 iteration 0089/0187: training loss 0.793 Epoch 31 iteration 0090/0187: training loss 0.793 Epoch 31 iteration 0091/0187: training loss 0.791 Epoch 31 iteration 0092/0187: training loss 0.792 Epoch 31 iteration 0093/0187: training loss 0.791 Epoch 31 iteration 0094/0187: training loss 0.791 Epoch 31 iteration 0095/0187: training loss 0.790 Epoch 31 iteration 0096/0187: training loss 0.793 Epoch 31 iteration 0097/0187: training loss 0.791 Epoch 31 iteration 0098/0187: training loss 0.791 Epoch 31 iteration 0099/0187: training loss 0.789 Epoch 31 iteration 0100/0187: training loss 0.790 Epoch 31 iteration 0101/0187: training loss 0.789 Epoch 31 iteration 0102/0187: training loss 0.790 Epoch 31 iteration 0103/0187: training loss 0.789 Epoch 31 iteration 0104/0187: training loss 0.789 Epoch 31 iteration 0105/0187: training loss 0.788 Epoch 31 iteration 0106/0187: training loss 0.788 Epoch 31 iteration 0107/0187: training loss 0.788 Epoch 31 iteration 0108/0187: training loss 0.788 Epoch 31 iteration 0109/0187: training loss 0.787 Epoch 31 iteration 0110/0187: training loss 0.786 Epoch 31 iteration 0111/0187: training loss 0.787 Epoch 31 iteration 0112/0187: training loss 0.786 Epoch 31 iteration 0113/0187: training loss 0.786 Epoch 31 iteration 0114/0187: training loss 0.786 Epoch 31 iteration 0115/0187: training loss 0.786 Epoch 31 iteration 0116/0187: training loss 0.787 Epoch 31 iteration 0117/0187: training loss 0.787 Epoch 31 iteration 0118/0187: training loss 0.786 Epoch 31 iteration 0119/0187: training loss 0.786 Epoch 31 iteration 0120/0187: training loss 0.787 Epoch 31 iteration 0121/0187: training loss 0.786 Epoch 31 iteration 0122/0187: training loss 0.786 Epoch 31 iteration 0123/0187: training loss 0.785 Epoch 31 iteration 0124/0187: training loss 0.786 Epoch 31 iteration 0125/0187: training loss 0.787 Epoch 31 iteration 0126/0187: training loss 0.787 Epoch 31 iteration 0127/0187: training loss 0.788 Epoch 31 iteration 0128/0187: training loss 0.789 Epoch 31 iteration 0129/0187: training loss 0.788 Epoch 31 iteration 0130/0187: training loss 0.790 Epoch 31 iteration 0131/0187: training loss 0.789 Epoch 31 iteration 0132/0187: training loss 0.790 Epoch 31 iteration 0133/0187: training loss 0.789 Epoch 31 iteration 0134/0187: training loss 0.789 Epoch 31 iteration 0135/0187: training loss 0.788 Epoch 31 iteration 0136/0187: training loss 0.788 Epoch 31 iteration 0137/0187: training loss 0.789 Epoch 31 iteration 0138/0187: training loss 0.789 Epoch 31 iteration 0139/0187: training loss 0.790 Epoch 31 iteration 0140/0187: training loss 0.789 Epoch 31 iteration 0141/0187: training loss 0.788 Epoch 31 iteration 0142/0187: training loss 0.787 Epoch 31 iteration 0143/0187: training loss 0.787 Epoch 31 iteration 0144/0187: training loss 0.788 Epoch 31 iteration 0145/0187: training loss 0.788 Epoch 31 iteration 0146/0187: training loss 0.788 Epoch 31 iteration 0147/0187: training loss 0.788 Epoch 31 iteration 0148/0187: training loss 0.787 Epoch 31 iteration 0149/0187: training loss 0.787 Epoch 31 iteration 0150/0187: training loss 0.785 Epoch 31 iteration 0151/0187: training loss 0.784 Epoch 31 iteration 0152/0187: training loss 0.784 Epoch 31 iteration 0153/0187: training loss 0.785 Epoch 31 iteration 0154/0187: training loss 0.786 Epoch 31 iteration 0155/0187: training loss 0.786 Epoch 31 iteration 0156/0187: training loss 0.786 Epoch 31 iteration 0157/0187: training loss 0.785 Epoch 31 iteration 0158/0187: training loss 0.786 Epoch 31 iteration 0159/0187: training loss 0.788 Epoch 31 iteration 0160/0187: training loss 0.789 Epoch 31 iteration 0161/0187: training loss 0.790 Epoch 31 iteration 0162/0187: training loss 0.790 Epoch 31 iteration 0163/0187: training loss 0.791 Epoch 31 iteration 0164/0187: training loss 0.790 Epoch 31 iteration 0165/0187: training loss 0.791 Epoch 31 iteration 0166/0187: training loss 0.790 Epoch 31 iteration 0167/0187: training loss 0.789 Epoch 31 iteration 0168/0187: training loss 0.788 Epoch 31 iteration 0169/0187: training loss 0.789 Epoch 31 iteration 0170/0187: training loss 0.790 Epoch 31 iteration 0171/0187: training loss 0.792 Epoch 31 iteration 0172/0187: training loss 0.792 Epoch 31 iteration 0173/0187: training loss 0.791 Epoch 31 iteration 0174/0187: training loss 0.791 Epoch 31 iteration 0175/0187: training loss 0.790 Epoch 31 iteration 0176/0187: training loss 0.791 Epoch 31 iteration 0177/0187: training loss 0.790 Epoch 31 iteration 0178/0187: training loss 0.791 Epoch 31 iteration 0179/0187: training loss 0.791 Epoch 31 iteration 0180/0187: training loss 0.791 Epoch 31 iteration 0181/0187: training loss 0.791 Epoch 31 iteration 0182/0187: training loss 0.790 Epoch 31 iteration 0183/0187: training loss 0.789 Epoch 31 iteration 0184/0187: training loss 0.789 Epoch 31 iteration 0185/0187: training loss 0.789 Epoch 31 iteration 0186/0187: training loss 0.789 Epoch 31 iteration 0187/0187: training loss 0.789 Epoch 31 validation pixAcc: 0.872, mIoU: 0.383 Epoch 32 iteration 0001/0187: training loss 0.706 Epoch 32 iteration 0002/0187: training loss 0.720 Epoch 32 iteration 0003/0187: training loss 0.771 Epoch 32 iteration 0004/0187: training loss 0.760 Epoch 32 iteration 0005/0187: training loss 0.765 Epoch 32 iteration 0006/0187: training loss 0.759 Epoch 32 iteration 0007/0187: training loss 0.753 Epoch 32 iteration 0008/0187: training loss 0.756 Epoch 32 iteration 0009/0187: training loss 0.740 Epoch 32 iteration 0010/0187: training loss 0.751 Epoch 32 iteration 0011/0187: training loss 0.747 Epoch 32 iteration 0012/0187: training loss 0.747 Epoch 32 iteration 0013/0187: training loss 0.745 Epoch 32 iteration 0014/0187: training loss 0.753 Epoch 32 iteration 0015/0187: training loss 0.747 Epoch 32 iteration 0016/0187: training loss 0.751 Epoch 32 iteration 0017/0187: training loss 0.745 Epoch 32 iteration 0018/0187: training loss 0.746 Epoch 32 iteration 0019/0187: training loss 0.747 Epoch 32 iteration 0020/0187: training loss 0.748 Epoch 32 iteration 0021/0187: training loss 0.755 Epoch 32 iteration 0022/0187: training loss 0.751 Epoch 32 iteration 0023/0187: training loss 0.748 Epoch 32 iteration 0024/0187: training loss 0.744 Epoch 32 iteration 0025/0187: training loss 0.745 Epoch 32 iteration 0026/0187: training loss 0.755 Epoch 32 iteration 0027/0187: training loss 0.757 Epoch 32 iteration 0028/0187: training loss 0.757 Epoch 32 iteration 0029/0187: training loss 0.757 Epoch 32 iteration 0030/0187: training loss 0.761 Epoch 32 iteration 0031/0187: training loss 0.770 Epoch 32 iteration 0032/0187: training loss 0.771 Epoch 32 iteration 0033/0187: training loss 0.773 Epoch 32 iteration 0034/0187: training loss 0.772 Epoch 32 iteration 0035/0187: training loss 0.773 Epoch 32 iteration 0036/0187: training loss 0.773 Epoch 32 iteration 0037/0187: training loss 0.775 Epoch 32 iteration 0038/0187: training loss 0.775 Epoch 32 iteration 0039/0187: training loss 0.775 Epoch 32 iteration 0040/0187: training loss 0.775 Epoch 32 iteration 0041/0187: training loss 0.778 Epoch 32 iteration 0042/0187: training loss 0.776 Epoch 32 iteration 0043/0187: training loss 0.775 Epoch 32 iteration 0044/0187: training loss 0.772 Epoch 32 iteration 0045/0187: training loss 0.772 Epoch 32 iteration 0046/0187: training loss 0.777 Epoch 32 iteration 0047/0187: training loss 0.777 Epoch 32 iteration 0048/0187: training loss 0.775 Epoch 32 iteration 0049/0187: training loss 0.774 Epoch 32 iteration 0050/0187: training loss 0.774 Epoch 32 iteration 0051/0187: training loss 0.776 Epoch 32 iteration 0052/0187: training loss 0.779 Epoch 32 iteration 0053/0187: training loss 0.780 Epoch 32 iteration 0054/0187: training loss 0.781 Epoch 32 iteration 0055/0187: training loss 0.782 Epoch 32 iteration 0056/0187: training loss 0.783 Epoch 32 iteration 0057/0187: training loss 0.787 Epoch 32 iteration 0058/0187: training loss 0.787 Epoch 32 iteration 0059/0187: training loss 0.783 Epoch 32 iteration 0060/0187: training loss 0.782 Epoch 32 iteration 0061/0187: training loss 0.783 Epoch 32 iteration 0062/0187: training loss 0.783 Epoch 32 iteration 0063/0187: training loss 0.784 Epoch 32 iteration 0064/0187: training loss 0.786 Epoch 32 iteration 0065/0187: training loss 0.786 Epoch 32 iteration 0066/0187: training loss 0.786 Epoch 32 iteration 0067/0187: training loss 0.786 Epoch 32 iteration 0068/0187: training loss 0.787 Epoch 32 iteration 0069/0187: training loss 0.786 Epoch 32 iteration 0070/0187: training loss 0.790 Epoch 32 iteration 0071/0187: training loss 0.792 Epoch 32 iteration 0072/0187: training loss 0.790 Epoch 32 iteration 0073/0187: training loss 0.791 Epoch 32 iteration 0074/0187: training loss 0.790 Epoch 32 iteration 0075/0187: training loss 0.791 Epoch 32 iteration 0076/0187: training loss 0.793 Epoch 32 iteration 0077/0187: training loss 0.793 Epoch 32 iteration 0078/0187: training loss 0.792 Epoch 32 iteration 0079/0187: training loss 0.791 Epoch 32 iteration 0080/0187: training loss 0.792 Epoch 32 iteration 0081/0187: training loss 0.795 Epoch 32 iteration 0082/0187: training loss 0.794 Epoch 32 iteration 0083/0187: training loss 0.791 Epoch 32 iteration 0084/0187: training loss 0.789 Epoch 32 iteration 0085/0187: training loss 0.790 Epoch 32 iteration 0086/0187: training loss 0.791 Epoch 32 iteration 0087/0187: training loss 0.790 Epoch 32 iteration 0088/0187: training loss 0.791 Epoch 32 iteration 0089/0187: training loss 0.793 Epoch 32 iteration 0090/0187: training loss 0.792 Epoch 32 iteration 0091/0188: training loss 0.794 Epoch 32 iteration 0092/0188: training loss 0.793 Epoch 32 iteration 0093/0188: training loss 0.792 Epoch 32 iteration 0094/0188: training loss 0.792 Epoch 32 iteration 0095/0188: training loss 0.794 Epoch 32 iteration 0096/0188: training loss 0.796 Epoch 32 iteration 0097/0188: training loss 0.796 Epoch 32 iteration 0098/0188: training loss 0.794 Epoch 32 iteration 0099/0188: training loss 0.796 Epoch 32 iteration 0100/0188: training loss 0.796 Epoch 32 iteration 0101/0188: training loss 0.798 Epoch 32 iteration 0102/0188: training loss 0.799 Epoch 32 iteration 0103/0188: training loss 0.799 Epoch 32 iteration 0104/0188: training loss 0.800 Epoch 32 iteration 0105/0188: training loss 0.800 Epoch 32 iteration 0106/0188: training loss 0.798 Epoch 32 iteration 0107/0188: training loss 0.797 Epoch 32 iteration 0108/0188: training loss 0.799 Epoch 32 iteration 0109/0188: training loss 0.799 Epoch 32 iteration 0110/0188: training loss 0.798 Epoch 32 iteration 0111/0188: training loss 0.799 Epoch 32 iteration 0112/0188: training loss 0.799 Epoch 32 iteration 0113/0188: training loss 0.799 Epoch 32 iteration 0114/0188: training loss 0.799 Epoch 32 iteration 0115/0188: training loss 0.798 Epoch 32 iteration 0116/0188: training loss 0.796 Epoch 32 iteration 0117/0188: training loss 0.797 Epoch 32 iteration 0118/0188: training loss 0.797 Epoch 32 iteration 0119/0188: training loss 0.797 Epoch 32 iteration 0120/0188: training loss 0.796 Epoch 32 iteration 0121/0188: training loss 0.795 Epoch 32 iteration 0122/0188: training loss 0.794 Epoch 32 iteration 0123/0188: training loss 0.796 Epoch 32 iteration 0124/0188: training loss 0.796 Epoch 32 iteration 0125/0188: training loss 0.795 Epoch 32 iteration 0126/0188: training loss 0.796 Epoch 32 iteration 0127/0188: training loss 0.795 Epoch 32 iteration 0128/0188: training loss 0.796 Epoch 32 iteration 0129/0188: training loss 0.796 Epoch 32 iteration 0130/0188: training loss 0.796 Epoch 32 iteration 0131/0188: training loss 0.796 Epoch 32 iteration 0132/0188: training loss 0.795 Epoch 32 iteration 0133/0188: training loss 0.795 Epoch 32 iteration 0134/0188: training loss 0.795 Epoch 32 iteration 0135/0188: training loss 0.796 Epoch 32 iteration 0136/0188: training loss 0.797 Epoch 32 iteration 0137/0188: training loss 0.796 Epoch 32 iteration 0138/0188: training loss 0.795 Epoch 32 iteration 0139/0188: training loss 0.795 Epoch 32 iteration 0140/0188: training loss 0.795 Epoch 32 iteration 0141/0188: training loss 0.795 Epoch 32 iteration 0142/0188: training loss 0.794 Epoch 32 iteration 0143/0188: training loss 0.795 Epoch 32 iteration 0144/0188: training loss 0.794 Epoch 32 iteration 0145/0188: training loss 0.796 Epoch 32 iteration 0146/0188: training loss 0.795 Epoch 32 iteration 0147/0188: training loss 0.795 Epoch 32 iteration 0148/0188: training loss 0.794 Epoch 32 iteration 0149/0188: training loss 0.793 Epoch 32 iteration 0150/0188: training loss 0.792 Epoch 32 iteration 0151/0188: training loss 0.791 Epoch 32 iteration 0152/0188: training loss 0.790 Epoch 32 iteration 0153/0188: training loss 0.791 Epoch 32 iteration 0154/0188: training loss 0.792 Epoch 32 iteration 0155/0188: training loss 0.791 Epoch 32 iteration 0156/0188: training loss 0.791 Epoch 32 iteration 0157/0188: training loss 0.791 Epoch 32 iteration 0158/0188: training loss 0.790 Epoch 32 iteration 0159/0188: training loss 0.791 Epoch 32 iteration 0160/0188: training loss 0.791 Epoch 32 iteration 0161/0188: training loss 0.790 Epoch 32 iteration 0162/0188: training loss 0.791 Epoch 32 iteration 0163/0188: training loss 0.790 Epoch 32 iteration 0164/0188: training loss 0.790 Epoch 32 iteration 0165/0188: training loss 0.789 Epoch 32 iteration 0166/0188: training loss 0.789 Epoch 32 iteration 0167/0188: training loss 0.790 Epoch 32 iteration 0168/0188: training loss 0.790 Epoch 32 iteration 0169/0188: training loss 0.789 Epoch 32 iteration 0170/0188: training loss 0.788 Epoch 32 iteration 0171/0188: training loss 0.787 Epoch 32 iteration 0172/0188: training loss 0.787 Epoch 32 iteration 0173/0188: training loss 0.786 Epoch 32 iteration 0174/0188: training loss 0.786 Epoch 32 iteration 0175/0188: training loss 0.786 Epoch 32 iteration 0176/0188: training loss 0.786 Epoch 32 iteration 0177/0188: training loss 0.786 Epoch 32 iteration 0178/0188: training loss 0.785 Epoch 32 iteration 0179/0188: training loss 0.785 Epoch 32 iteration 0180/0188: training loss 0.785 Epoch 32 iteration 0181/0188: training loss 0.785 Epoch 32 iteration 0182/0188: training loss 0.784 Epoch 32 iteration 0183/0188: training loss 0.785 Epoch 32 iteration 0184/0188: training loss 0.784 Epoch 32 iteration 0185/0188: training loss 0.785 Epoch 32 iteration 0186/0188: training loss 0.784 Epoch 32 validation pixAcc: 0.870, mIoU: 0.377 Epoch 33 iteration 0001/0187: training loss 0.697 Epoch 33 iteration 0002/0187: training loss 0.807 Epoch 33 iteration 0003/0187: training loss 0.780 Epoch 33 iteration 0004/0187: training loss 0.871 Epoch 33 iteration 0005/0187: training loss 0.843 Epoch 33 iteration 0006/0187: training loss 0.860 Epoch 33 iteration 0007/0187: training loss 0.834 Epoch 33 iteration 0008/0187: training loss 0.831 Epoch 33 iteration 0009/0187: training loss 0.802 Epoch 33 iteration 0010/0187: training loss 0.786 Epoch 33 iteration 0011/0187: training loss 0.783 Epoch 33 iteration 0012/0187: training loss 0.772 Epoch 33 iteration 0013/0187: training loss 0.772 Epoch 33 iteration 0014/0187: training loss 0.777 Epoch 33 iteration 0015/0187: training loss 0.777 Epoch 33 iteration 0016/0187: training loss 0.768 Epoch 33 iteration 0017/0187: training loss 0.782 Epoch 33 iteration 0018/0187: training loss 0.783 Epoch 33 iteration 0019/0187: training loss 0.789 Epoch 33 iteration 0020/0187: training loss 0.801 Epoch 33 iteration 0021/0187: training loss 0.808 Epoch 33 iteration 0022/0187: training loss 0.806 Epoch 33 iteration 0023/0187: training loss 0.807 Epoch 33 iteration 0024/0187: training loss 0.810 Epoch 33 iteration 0025/0187: training loss 0.799 Epoch 33 iteration 0026/0187: training loss 0.801 Epoch 33 iteration 0027/0187: training loss 0.797 Epoch 33 iteration 0028/0187: training loss 0.803 Epoch 33 iteration 0029/0187: training loss 0.800 Epoch 33 iteration 0030/0187: training loss 0.801 Epoch 33 iteration 0031/0187: training loss 0.798 Epoch 33 iteration 0032/0187: training loss 0.797 Epoch 33 iteration 0033/0187: training loss 0.793 Epoch 33 iteration 0034/0187: training loss 0.788 Epoch 33 iteration 0035/0187: training loss 0.793 Epoch 33 iteration 0036/0187: training loss 0.798 Epoch 33 iteration 0037/0187: training loss 0.796 Epoch 33 iteration 0038/0187: training loss 0.794 Epoch 33 iteration 0039/0187: training loss 0.791 Epoch 33 iteration 0040/0187: training loss 0.791 Epoch 33 iteration 0041/0187: training loss 0.794 Epoch 33 iteration 0042/0187: training loss 0.796 Epoch 33 iteration 0043/0187: training loss 0.800 Epoch 33 iteration 0044/0187: training loss 0.800 Epoch 33 iteration 0045/0187: training loss 0.805 Epoch 33 iteration 0046/0187: training loss 0.807 Epoch 33 iteration 0047/0187: training loss 0.810 Epoch 33 iteration 0048/0187: training loss 0.810 Epoch 33 iteration 0049/0187: training loss 0.810 Epoch 33 iteration 0050/0187: training loss 0.807 Epoch 33 iteration 0051/0187: training loss 0.810 Epoch 33 iteration 0052/0187: training loss 0.812 Epoch 33 iteration 0053/0187: training loss 0.813 Epoch 33 iteration 0054/0187: training loss 0.809 Epoch 33 iteration 0055/0187: training loss 0.808 Epoch 33 iteration 0056/0187: training loss 0.806 Epoch 33 iteration 0057/0187: training loss 0.804 Epoch 33 iteration 0058/0187: training loss 0.803 Epoch 33 iteration 0059/0187: training loss 0.805 Epoch 33 iteration 0060/0187: training loss 0.805 Epoch 33 iteration 0061/0187: training loss 0.805 Epoch 33 iteration 0062/0187: training loss 0.803 Epoch 33 iteration 0063/0187: training loss 0.803 Epoch 33 iteration 0064/0187: training loss 0.804 Epoch 33 iteration 0065/0187: training loss 0.807 Epoch 33 iteration 0066/0187: training loss 0.807 Epoch 33 iteration 0067/0187: training loss 0.806 Epoch 33 iteration 0068/0187: training loss 0.806 Epoch 33 iteration 0069/0187: training loss 0.807 Epoch 33 iteration 0070/0187: training loss 0.807 Epoch 33 iteration 0071/0187: training loss 0.811 Epoch 33 iteration 0072/0187: training loss 0.808 Epoch 33 iteration 0073/0187: training loss 0.808 Epoch 33 iteration 0074/0187: training loss 0.807 Epoch 33 iteration 0075/0187: training loss 0.806 Epoch 33 iteration 0076/0187: training loss 0.809 Epoch 33 iteration 0077/0187: training loss 0.808 Epoch 33 iteration 0078/0187: training loss 0.807 Epoch 33 iteration 0079/0187: training loss 0.808 Epoch 33 iteration 0080/0187: training loss 0.808 Epoch 33 iteration 0081/0187: training loss 0.809 Epoch 33 iteration 0082/0187: training loss 0.806 Epoch 33 iteration 0083/0187: training loss 0.806 Epoch 33 iteration 0084/0187: training loss 0.807 Epoch 33 iteration 0085/0187: training loss 0.808 Epoch 33 iteration 0086/0187: training loss 0.808 Epoch 33 iteration 0087/0187: training loss 0.808 Epoch 33 iteration 0088/0187: training loss 0.808 Epoch 33 iteration 0089/0187: training loss 0.807 Epoch 33 iteration 0090/0187: training loss 0.808 Epoch 33 iteration 0091/0187: training loss 0.805 Epoch 33 iteration 0092/0187: training loss 0.805 Epoch 33 iteration 0093/0187: training loss 0.804 Epoch 33 iteration 0094/0187: training loss 0.803 Epoch 33 iteration 0095/0187: training loss 0.803 Epoch 33 iteration 0096/0187: training loss 0.803 Epoch 33 iteration 0097/0187: training loss 0.802 Epoch 33 iteration 0098/0187: training loss 0.804 Epoch 33 iteration 0099/0187: training loss 0.802 Epoch 33 iteration 0100/0187: training loss 0.801 Epoch 33 iteration 0101/0187: training loss 0.801 Epoch 33 iteration 0102/0187: training loss 0.799 Epoch 33 iteration 0103/0187: training loss 0.797 Epoch 33 iteration 0104/0187: training loss 0.796 Epoch 33 iteration 0105/0187: training loss 0.794 Epoch 33 iteration 0106/0187: training loss 0.795 Epoch 33 iteration 0107/0187: training loss 0.795 Epoch 33 iteration 0108/0187: training loss 0.796 Epoch 33 iteration 0109/0187: training loss 0.795 Epoch 33 iteration 0110/0187: training loss 0.794 Epoch 33 iteration 0111/0187: training loss 0.793 Epoch 33 iteration 0112/0187: training loss 0.792 Epoch 33 iteration 0113/0187: training loss 0.791 Epoch 33 iteration 0114/0187: training loss 0.790 Epoch 33 iteration 0115/0187: training loss 0.790 Epoch 33 iteration 0116/0187: training loss 0.790 Epoch 33 iteration 0117/0187: training loss 0.789 Epoch 33 iteration 0118/0187: training loss 0.788 Epoch 33 iteration 0119/0187: training loss 0.789 Epoch 33 iteration 0120/0187: training loss 0.790 Epoch 33 iteration 0121/0187: training loss 0.789 Epoch 33 iteration 0122/0187: training loss 0.789 Epoch 33 iteration 0123/0187: training loss 0.791 Epoch 33 iteration 0124/0187: training loss 0.792 Epoch 33 iteration 0125/0187: training loss 0.792 Epoch 33 iteration 0126/0187: training loss 0.792 Epoch 33 iteration 0127/0187: training loss 0.792 Epoch 33 iteration 0128/0187: training loss 0.792 Epoch 33 iteration 0129/0187: training loss 0.792 Epoch 33 iteration 0130/0187: training loss 0.791 Epoch 33 iteration 0131/0187: training loss 0.791 Epoch 33 iteration 0132/0187: training loss 0.790 Epoch 33 iteration 0133/0187: training loss 0.790 Epoch 33 iteration 0134/0187: training loss 0.788 Epoch 33 iteration 0135/0187: training loss 0.788 Epoch 33 iteration 0136/0187: training loss 0.790 Epoch 33 iteration 0137/0187: training loss 0.790 Epoch 33 iteration 0138/0187: training loss 0.789 Epoch 33 iteration 0139/0187: training loss 0.789 Epoch 33 iteration 0140/0187: training loss 0.789 Epoch 33 iteration 0141/0187: training loss 0.787 Epoch 33 iteration 0142/0187: training loss 0.786 Epoch 33 iteration 0143/0187: training loss 0.787 Epoch 33 iteration 0144/0187: training loss 0.788 Epoch 33 iteration 0145/0187: training loss 0.789 Epoch 33 iteration 0146/0187: training loss 0.789 Epoch 33 iteration 0147/0187: training loss 0.788 Epoch 33 iteration 0148/0187: training loss 0.788 Epoch 33 iteration 0149/0187: training loss 0.788 Epoch 33 iteration 0150/0187: training loss 0.788 Epoch 33 iteration 0151/0187: training loss 0.788 Epoch 33 iteration 0152/0187: training loss 0.787 Epoch 33 iteration 0153/0187: training loss 0.787 Epoch 33 iteration 0154/0187: training loss 0.787 Epoch 33 iteration 0155/0187: training loss 0.786 Epoch 33 iteration 0156/0187: training loss 0.786 Epoch 33 iteration 0157/0187: training loss 0.785 Epoch 33 iteration 0158/0187: training loss 0.785 Epoch 33 iteration 0159/0187: training loss 0.783 Epoch 33 iteration 0160/0187: training loss 0.783 Epoch 33 iteration 0161/0187: training loss 0.782 Epoch 33 iteration 0162/0187: training loss 0.782 Epoch 33 iteration 0163/0187: training loss 0.782 Epoch 33 iteration 0164/0187: training loss 0.782 Epoch 33 iteration 0165/0187: training loss 0.783 Epoch 33 iteration 0166/0187: training loss 0.782 Epoch 33 iteration 0167/0187: training loss 0.781 Epoch 33 iteration 0168/0187: training loss 0.780 Epoch 33 iteration 0169/0187: training loss 0.780 Epoch 33 iteration 0170/0187: training loss 0.780 Epoch 33 iteration 0171/0187: training loss 0.780 Epoch 33 iteration 0172/0187: training loss 0.780 Epoch 33 iteration 0173/0187: training loss 0.779 Epoch 33 iteration 0174/0187: training loss 0.780 Epoch 33 iteration 0175/0187: training loss 0.781 Epoch 33 iteration 0176/0187: training loss 0.780 Epoch 33 iteration 0177/0187: training loss 0.780 Epoch 33 iteration 0178/0187: training loss 0.780 Epoch 33 iteration 0179/0187: training loss 0.779 Epoch 33 iteration 0180/0187: training loss 0.779 Epoch 33 iteration 0181/0187: training loss 0.780 Epoch 33 iteration 0182/0187: training loss 0.780 Epoch 33 iteration 0183/0187: training loss 0.779 Epoch 33 iteration 0184/0187: training loss 0.779 Epoch 33 iteration 0185/0187: training loss 0.779 Epoch 33 iteration 0186/0187: training loss 0.780 Epoch 33 iteration 0187/0187: training loss 0.780 Epoch 33 validation pixAcc: 0.873, mIoU: 0.377 Epoch 34 iteration 0001/0187: training loss 0.675 Epoch 34 iteration 0002/0187: training loss 0.676 Epoch 34 iteration 0003/0187: training loss 0.728 Epoch 34 iteration 0004/0187: training loss 0.702 Epoch 34 iteration 0005/0187: training loss 0.713 Epoch 34 iteration 0006/0187: training loss 0.745 Epoch 34 iteration 0007/0187: training loss 0.733 Epoch 34 iteration 0008/0187: training loss 0.734 Epoch 34 iteration 0009/0187: training loss 0.748 Epoch 34 iteration 0010/0187: training loss 0.734 Epoch 34 iteration 0011/0187: training loss 0.732 Epoch 34 iteration 0012/0187: training loss 0.730 Epoch 34 iteration 0013/0187: training loss 0.728 Epoch 34 iteration 0014/0187: training loss 0.726 Epoch 34 iteration 0015/0187: training loss 0.722 Epoch 34 iteration 0016/0187: training loss 0.725 Epoch 34 iteration 0017/0187: training loss 0.736 Epoch 34 iteration 0018/0187: training loss 0.734 Epoch 34 iteration 0019/0187: training loss 0.742 Epoch 34 iteration 0020/0187: training loss 0.746 Epoch 34 iteration 0021/0187: training loss 0.747 Epoch 34 iteration 0022/0187: training loss 0.741 Epoch 34 iteration 0023/0187: training loss 0.737 Epoch 34 iteration 0024/0187: training loss 0.736 Epoch 34 iteration 0025/0187: training loss 0.735 Epoch 34 iteration 0026/0187: training loss 0.737 Epoch 34 iteration 0027/0187: training loss 0.737 Epoch 34 iteration 0028/0187: training loss 0.733 Epoch 34 iteration 0029/0187: training loss 0.736 Epoch 34 iteration 0030/0187: training loss 0.736 Epoch 34 iteration 0031/0187: training loss 0.732 Epoch 34 iteration 0032/0187: training loss 0.726 Epoch 34 iteration 0033/0187: training loss 0.728 Epoch 34 iteration 0034/0187: training loss 0.726 Epoch 34 iteration 0035/0187: training loss 0.726 Epoch 34 iteration 0036/0187: training loss 0.728 Epoch 34 iteration 0037/0187: training loss 0.724 Epoch 34 iteration 0038/0187: training loss 0.725 Epoch 34 iteration 0039/0187: training loss 0.733 Epoch 34 iteration 0040/0187: training loss 0.740 Epoch 34 iteration 0041/0187: training loss 0.739 Epoch 34 iteration 0042/0187: training loss 0.739 Epoch 34 iteration 0043/0187: training loss 0.741 Epoch 34 iteration 0044/0187: training loss 0.739 Epoch 34 iteration 0045/0187: training loss 0.737 Epoch 34 iteration 0046/0187: training loss 0.738 Epoch 34 iteration 0047/0187: training loss 0.740 Epoch 34 iteration 0048/0187: training loss 0.739 Epoch 34 iteration 0049/0187: training loss 0.742 Epoch 34 iteration 0050/0187: training loss 0.741 Epoch 34 iteration 0051/0187: training loss 0.740 Epoch 34 iteration 0052/0187: training loss 0.739 Epoch 34 iteration 0053/0187: training loss 0.738 Epoch 34 iteration 0054/0187: training loss 0.741 Epoch 34 iteration 0055/0187: training loss 0.741 Epoch 34 iteration 0056/0187: training loss 0.741 Epoch 34 iteration 0057/0187: training loss 0.743 Epoch 34 iteration 0058/0187: training loss 0.742 Epoch 34 iteration 0059/0187: training loss 0.741 Epoch 34 iteration 0060/0187: training loss 0.742 Epoch 34 iteration 0061/0187: training loss 0.740 Epoch 34 iteration 0062/0187: training loss 0.741 Epoch 34 iteration 0063/0187: training loss 0.737 Epoch 34 iteration 0064/0187: training loss 0.738 Epoch 34 iteration 0065/0187: training loss 0.737 Epoch 34 iteration 0066/0187: training loss 0.733 Epoch 34 iteration 0067/0187: training loss 0.732 Epoch 34 iteration 0068/0187: training loss 0.733 Epoch 34 iteration 0069/0187: training loss 0.738 Epoch 34 iteration 0070/0187: training loss 0.737 Epoch 34 iteration 0071/0187: training loss 0.739 Epoch 34 iteration 0072/0187: training loss 0.740 Epoch 34 iteration 0073/0187: training loss 0.741 Epoch 34 iteration 0074/0187: training loss 0.739 Epoch 34 iteration 0075/0187: training loss 0.737 Epoch 34 iteration 0076/0187: training loss 0.739 Epoch 34 iteration 0077/0187: training loss 0.739 Epoch 34 iteration 0078/0187: training loss 0.738 Epoch 34 iteration 0079/0187: training loss 0.737 Epoch 34 iteration 0080/0187: training loss 0.739 Epoch 34 iteration 0081/0187: training loss 0.742 Epoch 34 iteration 0082/0187: training loss 0.741 Epoch 34 iteration 0083/0187: training loss 0.741 Epoch 34 iteration 0084/0187: training loss 0.741 Epoch 34 iteration 0085/0187: training loss 0.741 Epoch 34 iteration 0086/0187: training loss 0.739 Epoch 34 iteration 0087/0187: training loss 0.738 Epoch 34 iteration 0088/0187: training loss 0.737 Epoch 34 iteration 0089/0187: training loss 0.740 Epoch 34 iteration 0090/0187: training loss 0.739 Epoch 34 iteration 0091/0188: training loss 0.740 Epoch 34 iteration 0092/0188: training loss 0.743 Epoch 34 iteration 0093/0188: training loss 0.744 Epoch 34 iteration 0094/0188: training loss 0.743 Epoch 34 iteration 0095/0188: training loss 0.743 Epoch 34 iteration 0096/0188: training loss 0.747 Epoch 34 iteration 0097/0188: training loss 0.747 Epoch 34 iteration 0098/0188: training loss 0.748 Epoch 34 iteration 0099/0188: training loss 0.746 Epoch 34 iteration 0100/0188: training loss 0.745 Epoch 34 iteration 0101/0188: training loss 0.746 Epoch 34 iteration 0102/0188: training loss 0.746 Epoch 34 iteration 0103/0188: training loss 0.746 Epoch 34 iteration 0104/0188: training loss 0.744 Epoch 34 iteration 0105/0188: training loss 0.745 Epoch 34 iteration 0106/0188: training loss 0.746 Epoch 34 iteration 0107/0188: training loss 0.745 Epoch 34 iteration 0108/0188: training loss 0.745 Epoch 34 iteration 0109/0188: training loss 0.745 Epoch 34 iteration 0110/0188: training loss 0.744 Epoch 34 iteration 0111/0188: training loss 0.747 Epoch 34 iteration 0112/0188: training loss 0.748 Epoch 34 iteration 0113/0188: training loss 0.749 Epoch 34 iteration 0114/0188: training loss 0.751 Epoch 34 iteration 0115/0188: training loss 0.751 Epoch 34 iteration 0116/0188: training loss 0.749 Epoch 34 iteration 0117/0188: training loss 0.749 Epoch 34 iteration 0118/0188: training loss 0.749 Epoch 34 iteration 0119/0188: training loss 0.749 Epoch 34 iteration 0120/0188: training loss 0.750 Epoch 34 iteration 0121/0188: training loss 0.752 Epoch 34 iteration 0122/0188: training loss 0.750 Epoch 34 iteration 0123/0188: training loss 0.749 Epoch 34 iteration 0124/0188: training loss 0.751 Epoch 34 iteration 0125/0188: training loss 0.752 Epoch 34 iteration 0126/0188: training loss 0.752 Epoch 34 iteration 0127/0188: training loss 0.754 Epoch 34 iteration 0128/0188: training loss 0.754 Epoch 34 iteration 0129/0188: training loss 0.756 Epoch 34 iteration 0130/0188: training loss 0.758 Epoch 34 iteration 0131/0188: training loss 0.758 Epoch 34 iteration 0132/0188: training loss 0.757 Epoch 34 iteration 0133/0188: training loss 0.757 Epoch 34 iteration 0134/0188: training loss 0.756 Epoch 34 iteration 0135/0188: training loss 0.757 Epoch 34 iteration 0136/0188: training loss 0.757 Epoch 34 iteration 0137/0188: training loss 0.757 Epoch 34 iteration 0138/0188: training loss 0.756 Epoch 34 iteration 0139/0188: training loss 0.758 Epoch 34 iteration 0140/0188: training loss 0.757 Epoch 34 iteration 0141/0188: training loss 0.757 Epoch 34 iteration 0142/0188: training loss 0.756 Epoch 34 iteration 0143/0188: training loss 0.756 Epoch 34 iteration 0144/0188: training loss 0.756 Epoch 34 iteration 0145/0188: training loss 0.755 Epoch 34 iteration 0146/0188: training loss 0.755 Epoch 34 iteration 0147/0188: training loss 0.755 Epoch 34 iteration 0148/0188: training loss 0.755 Epoch 34 iteration 0149/0188: training loss 0.755 Epoch 34 iteration 0150/0188: training loss 0.755 Epoch 34 iteration 0151/0188: training loss 0.755 Epoch 34 iteration 0152/0188: training loss 0.756 Epoch 34 iteration 0153/0188: training loss 0.756 Epoch 34 iteration 0154/0188: training loss 0.756 Epoch 34 iteration 0155/0188: training loss 0.755 Epoch 34 iteration 0156/0188: training loss 0.755 Epoch 34 iteration 0157/0188: training loss 0.755 Epoch 34 iteration 0158/0188: training loss 0.754 Epoch 34 iteration 0159/0188: training loss 0.758 Epoch 34 iteration 0160/0188: training loss 0.758 Epoch 34 iteration 0161/0188: training loss 0.758 Epoch 34 iteration 0162/0188: training loss 0.757 Epoch 34 iteration 0163/0188: training loss 0.759 Epoch 34 iteration 0164/0188: training loss 0.759 Epoch 34 iteration 0165/0188: training loss 0.758 Epoch 34 iteration 0166/0188: training loss 0.760 Epoch 34 iteration 0167/0188: training loss 0.760 Epoch 34 iteration 0168/0188: training loss 0.760 Epoch 34 iteration 0169/0188: training loss 0.759 Epoch 34 iteration 0170/0188: training loss 0.758 Epoch 34 iteration 0171/0188: training loss 0.759 Epoch 34 iteration 0172/0188: training loss 0.759 Epoch 34 iteration 0173/0188: training loss 0.760 Epoch 34 iteration 0174/0188: training loss 0.761 Epoch 34 iteration 0175/0188: training loss 0.762 Epoch 34 iteration 0176/0188: training loss 0.763 Epoch 34 iteration 0177/0188: training loss 0.763 Epoch 34 iteration 0178/0188: training loss 0.763 Epoch 34 iteration 0179/0188: training loss 0.764 Epoch 34 iteration 0180/0188: training loss 0.763 Epoch 34 iteration 0181/0188: training loss 0.762 Epoch 34 iteration 0182/0188: training loss 0.762 Epoch 34 iteration 0183/0188: training loss 0.761 Epoch 34 iteration 0184/0188: training loss 0.762 Epoch 34 iteration 0185/0188: training loss 0.762 Epoch 34 iteration 0186/0188: training loss 0.762 Epoch 34 validation pixAcc: 0.870, mIoU: 0.375 Epoch 35 iteration 0001/0187: training loss 0.660 Epoch 35 iteration 0002/0187: training loss 0.716 Epoch 35 iteration 0003/0187: training loss 0.754 Epoch 35 iteration 0004/0187: training loss 0.780 Epoch 35 iteration 0005/0187: training loss 0.759 Epoch 35 iteration 0006/0187: training loss 0.758 Epoch 35 iteration 0007/0187: training loss 0.797 Epoch 35 iteration 0008/0187: training loss 0.766 Epoch 35 iteration 0009/0187: training loss 0.768 Epoch 35 iteration 0010/0187: training loss 0.769 Epoch 35 iteration 0011/0187: training loss 0.766 Epoch 35 iteration 0012/0187: training loss 0.762 Epoch 35 iteration 0013/0187: training loss 0.755 Epoch 35 iteration 0014/0187: training loss 0.767 Epoch 35 iteration 0015/0187: training loss 0.769 Epoch 35 iteration 0016/0187: training loss 0.762 Epoch 35 iteration 0017/0187: training loss 0.774 Epoch 35 iteration 0018/0187: training loss 0.771 Epoch 35 iteration 0019/0187: training loss 0.768 Epoch 35 iteration 0020/0187: training loss 0.769 Epoch 35 iteration 0021/0187: training loss 0.765 Epoch 35 iteration 0022/0187: training loss 0.773 Epoch 35 iteration 0023/0187: training loss 0.772 Epoch 35 iteration 0024/0187: training loss 0.776 Epoch 35 iteration 0025/0187: training loss 0.774 Epoch 35 iteration 0026/0187: training loss 0.776 Epoch 35 iteration 0027/0187: training loss 0.776 Epoch 35 iteration 0028/0187: training loss 0.782 Epoch 35 iteration 0029/0187: training loss 0.777 Epoch 35 iteration 0030/0187: training loss 0.777 Epoch 35 iteration 0031/0187: training loss 0.773 Epoch 35 iteration 0032/0187: training loss 0.767 Epoch 35 iteration 0033/0187: training loss 0.769 Epoch 35 iteration 0034/0187: training loss 0.770 Epoch 35 iteration 0035/0187: training loss 0.772 Epoch 35 iteration 0036/0187: training loss 0.770 Epoch 35 iteration 0037/0187: training loss 0.766 Epoch 35 iteration 0038/0187: training loss 0.764 Epoch 35 iteration 0039/0187: training loss 0.762 Epoch 35 iteration 0040/0187: training loss 0.765 Epoch 35 iteration 0041/0187: training loss 0.765 Epoch 35 iteration 0042/0187: training loss 0.764 Epoch 35 iteration 0043/0187: training loss 0.763 Epoch 35 iteration 0044/0187: training loss 0.762 Epoch 35 iteration 0045/0187: training loss 0.758 Epoch 35 iteration 0046/0187: training loss 0.756 Epoch 35 iteration 0047/0187: training loss 0.759 Epoch 35 iteration 0048/0187: training loss 0.756 Epoch 35 iteration 0049/0187: training loss 0.754 Epoch 35 iteration 0050/0187: training loss 0.753 Epoch 35 iteration 0051/0187: training loss 0.753 Epoch 35 iteration 0052/0187: training loss 0.751 Epoch 35 iteration 0053/0187: training loss 0.750 Epoch 35 iteration 0054/0187: training loss 0.750 Epoch 35 iteration 0055/0187: training loss 0.751 Epoch 35 iteration 0056/0187: training loss 0.750 Epoch 35 iteration 0057/0187: training loss 0.748 Epoch 35 iteration 0058/0187: training loss 0.749 Epoch 35 iteration 0059/0187: training loss 0.755 Epoch 35 iteration 0060/0187: training loss 0.758 Epoch 35 iteration 0061/0187: training loss 0.759 Epoch 35 iteration 0062/0187: training loss 0.757 Epoch 35 iteration 0063/0187: training loss 0.753 Epoch 35 iteration 0064/0187: training loss 0.752 Epoch 35 iteration 0065/0187: training loss 0.752 Epoch 35 iteration 0066/0187: training loss 0.752 Epoch 35 iteration 0067/0187: training loss 0.755 Epoch 35 iteration 0068/0187: training loss 0.756 Epoch 35 iteration 0069/0187: training loss 0.757 Epoch 35 iteration 0070/0187: training loss 0.757 Epoch 35 iteration 0071/0187: training loss 0.759 Epoch 35 iteration 0072/0187: training loss 0.757 Epoch 35 iteration 0073/0187: training loss 0.758 Epoch 35 iteration 0074/0187: training loss 0.760 Epoch 35 iteration 0075/0187: training loss 0.758 Epoch 35 iteration 0076/0187: training loss 0.757 Epoch 35 iteration 0077/0187: training loss 0.759 Epoch 35 iteration 0078/0187: training loss 0.757 Epoch 35 iteration 0079/0187: training loss 0.756 Epoch 35 iteration 0080/0187: training loss 0.756 Epoch 35 iteration 0081/0187: training loss 0.758 Epoch 35 iteration 0082/0187: training loss 0.758 Epoch 35 iteration 0083/0187: training loss 0.757 Epoch 35 iteration 0084/0187: training loss 0.757 Epoch 35 iteration 0085/0187: training loss 0.757 Epoch 35 iteration 0086/0187: training loss 0.759 Epoch 35 iteration 0087/0187: training loss 0.759 Epoch 35 iteration 0088/0187: training loss 0.758 Epoch 35 iteration 0089/0187: training loss 0.757 Epoch 35 iteration 0090/0187: training loss 0.760 Epoch 35 iteration 0091/0187: training loss 0.761 Epoch 35 iteration 0092/0187: training loss 0.761 Epoch 35 iteration 0093/0187: training loss 0.763 Epoch 35 iteration 0094/0187: training loss 0.761 Epoch 35 iteration 0095/0187: training loss 0.761 Epoch 35 iteration 0096/0187: training loss 0.759 Epoch 35 iteration 0097/0187: training loss 0.760 Epoch 35 iteration 0098/0187: training loss 0.761 Epoch 35 iteration 0099/0187: training loss 0.760 Epoch 35 iteration 0100/0187: training loss 0.761 Epoch 35 iteration 0101/0187: training loss 0.761 Epoch 35 iteration 0102/0187: training loss 0.762 Epoch 35 iteration 0103/0187: training loss 0.761 Epoch 35 iteration 0104/0187: training loss 0.760 Epoch 35 iteration 0105/0187: training loss 0.760 Epoch 35 iteration 0106/0187: training loss 0.759 Epoch 35 iteration 0107/0187: training loss 0.758 Epoch 35 iteration 0108/0187: training loss 0.758 Epoch 35 iteration 0109/0187: training loss 0.758 Epoch 35 iteration 0110/0187: training loss 0.758 Epoch 35 iteration 0111/0187: training loss 0.757 Epoch 35 iteration 0112/0187: training loss 0.756 Epoch 35 iteration 0113/0187: training loss 0.755 Epoch 35 iteration 0114/0187: training loss 0.754 Epoch 35 iteration 0115/0187: training loss 0.755 Epoch 35 iteration 0116/0187: training loss 0.755 Epoch 35 iteration 0117/0187: training loss 0.755 Epoch 35 iteration 0118/0187: training loss 0.755 Epoch 35 iteration 0119/0187: training loss 0.754 Epoch 35 iteration 0120/0187: training loss 0.755 Epoch 35 iteration 0121/0187: training loss 0.756 Epoch 35 iteration 0122/0187: training loss 0.757 Epoch 35 iteration 0123/0187: training loss 0.757 Epoch 35 iteration 0124/0187: training loss 0.757 Epoch 35 iteration 0125/0187: training loss 0.756 Epoch 35 iteration 0126/0187: training loss 0.755 Epoch 35 iteration 0127/0187: training loss 0.755 Epoch 35 iteration 0128/0187: training loss 0.758 Epoch 35 iteration 0129/0187: training loss 0.757 Epoch 35 iteration 0130/0187: training loss 0.756 Epoch 35 iteration 0131/0187: training loss 0.758 Epoch 35 iteration 0132/0187: training loss 0.757 Epoch 35 iteration 0133/0187: training loss 0.757 Epoch 35 iteration 0134/0187: training loss 0.755 Epoch 35 iteration 0135/0187: training loss 0.755 Epoch 35 iteration 0136/0187: training loss 0.755 Epoch 35 iteration 0137/0187: training loss 0.756 Epoch 35 iteration 0138/0187: training loss 0.756 Epoch 35 iteration 0139/0187: training loss 0.755 Epoch 35 iteration 0140/0187: training loss 0.754 Epoch 35 iteration 0141/0187: training loss 0.754 Epoch 35 iteration 0142/0187: training loss 0.755 Epoch 35 iteration 0143/0187: training loss 0.755 Epoch 35 iteration 0144/0187: training loss 0.754 Epoch 35 iteration 0145/0187: training loss 0.754 Epoch 35 iteration 0146/0187: training loss 0.755 Epoch 35 iteration 0147/0187: training loss 0.757 Epoch 35 iteration 0148/0187: training loss 0.756 Epoch 35 iteration 0149/0187: training loss 0.756 Epoch 35 iteration 0150/0187: training loss 0.757 Epoch 35 iteration 0151/0187: training loss 0.759 Epoch 35 iteration 0152/0187: training loss 0.759 Epoch 35 iteration 0153/0187: training loss 0.759 Epoch 35 iteration 0154/0187: training loss 0.760 Epoch 35 iteration 0155/0187: training loss 0.759 Epoch 35 iteration 0156/0187: training loss 0.759 Epoch 35 iteration 0157/0187: training loss 0.758 Epoch 35 iteration 0158/0187: training loss 0.758 Epoch 35 iteration 0159/0187: training loss 0.758 Epoch 35 iteration 0160/0187: training loss 0.758 Epoch 35 iteration 0161/0187: training loss 0.758 Epoch 35 iteration 0162/0187: training loss 0.758 Epoch 35 iteration 0163/0187: training loss 0.758 Epoch 35 iteration 0164/0187: training loss 0.759 Epoch 35 iteration 0165/0187: training loss 0.759 Epoch 35 iteration 0166/0187: training loss 0.760 Epoch 35 iteration 0167/0187: training loss 0.760 Epoch 35 iteration 0168/0187: training loss 0.760 Epoch 35 iteration 0169/0187: training loss 0.759 Epoch 35 iteration 0170/0187: training loss 0.759 Epoch 35 iteration 0171/0187: training loss 0.759 Epoch 35 iteration 0172/0187: training loss 0.759 Epoch 35 iteration 0173/0187: training loss 0.760 Epoch 35 iteration 0174/0187: training loss 0.759 Epoch 35 iteration 0175/0187: training loss 0.759 Epoch 35 iteration 0176/0187: training loss 0.758 Epoch 35 iteration 0177/0187: training loss 0.758 Epoch 35 iteration 0178/0187: training loss 0.759 Epoch 35 iteration 0179/0187: training loss 0.760 Epoch 35 iteration 0180/0187: training loss 0.760 Epoch 35 iteration 0181/0187: training loss 0.761 Epoch 35 iteration 0182/0187: training loss 0.762 Epoch 35 iteration 0183/0187: training loss 0.761 Epoch 35 iteration 0184/0187: training loss 0.761 Epoch 35 iteration 0185/0187: training loss 0.760 Epoch 35 iteration 0186/0187: training loss 0.763 Epoch 35 iteration 0187/0187: training loss 0.763 Epoch 35 validation pixAcc: 0.873, mIoU: 0.381 Epoch 36 iteration 0001/0187: training loss 0.772 Epoch 36 iteration 0002/0187: training loss 0.718 Epoch 36 iteration 0003/0187: training loss 0.705 Epoch 36 iteration 0004/0187: training loss 0.734 Epoch 36 iteration 0005/0187: training loss 0.757 Epoch 36 iteration 0006/0187: training loss 0.762 Epoch 36 iteration 0007/0187: training loss 0.748 Epoch 36 iteration 0008/0187: training loss 0.792 Epoch 36 iteration 0009/0187: training loss 0.808 Epoch 36 iteration 0010/0187: training loss 0.810 Epoch 36 iteration 0011/0187: training loss 0.800 Epoch 36 iteration 0012/0187: training loss 0.813 Epoch 36 iteration 0013/0187: training loss 0.804 Epoch 36 iteration 0014/0187: training loss 0.818 Epoch 36 iteration 0015/0187: training loss 0.823 Epoch 36 iteration 0016/0187: training loss 0.816 Epoch 36 iteration 0017/0187: training loss 0.813 Epoch 36 iteration 0018/0187: training loss 0.805 Epoch 36 iteration 0019/0187: training loss 0.809 Epoch 36 iteration 0020/0187: training loss 0.800 Epoch 36 iteration 0021/0187: training loss 0.790 Epoch 36 iteration 0022/0187: training loss 0.786 Epoch 36 iteration 0023/0187: training loss 0.788 Epoch 36 iteration 0024/0187: training loss 0.792 Epoch 36 iteration 0025/0187: training loss 0.802 Epoch 36 iteration 0026/0187: training loss 0.801 Epoch 36 iteration 0027/0187: training loss 0.798 Epoch 36 iteration 0028/0187: training loss 0.797 Epoch 36 iteration 0029/0187: training loss 0.801 Epoch 36 iteration 0030/0187: training loss 0.804 Epoch 36 iteration 0031/0187: training loss 0.805 Epoch 36 iteration 0032/0187: training loss 0.800 Epoch 36 iteration 0033/0187: training loss 0.797 Epoch 36 iteration 0034/0187: training loss 0.800 Epoch 36 iteration 0035/0187: training loss 0.794 Epoch 36 iteration 0036/0187: training loss 0.790 Epoch 36 iteration 0037/0187: training loss 0.788 Epoch 36 iteration 0038/0187: training loss 0.789 Epoch 36 iteration 0039/0187: training loss 0.788 Epoch 36 iteration 0040/0187: training loss 0.787 Epoch 36 iteration 0041/0187: training loss 0.791 Epoch 36 iteration 0042/0187: training loss 0.793 Epoch 36 iteration 0043/0187: training loss 0.795 Epoch 36 iteration 0044/0187: training loss 0.790 Epoch 36 iteration 0045/0187: training loss 0.791 Epoch 36 iteration 0046/0187: training loss 0.790 Epoch 36 iteration 0047/0187: training loss 0.791 Epoch 36 iteration 0048/0187: training loss 0.789 Epoch 36 iteration 0049/0187: training loss 0.790 Epoch 36 iteration 0050/0187: training loss 0.787 Epoch 36 iteration 0051/0187: training loss 0.785 Epoch 36 iteration 0052/0187: training loss 0.781 Epoch 36 iteration 0053/0187: training loss 0.781 Epoch 36 iteration 0054/0187: training loss 0.779 Epoch 36 iteration 0055/0187: training loss 0.778 Epoch 36 iteration 0056/0187: training loss 0.778 Epoch 36 iteration 0057/0187: training loss 0.775 Epoch 36 iteration 0058/0187: training loss 0.774 Epoch 36 iteration 0059/0187: training loss 0.774 Epoch 36 iteration 0060/0187: training loss 0.774 Epoch 36 iteration 0061/0187: training loss 0.778 Epoch 36 iteration 0062/0187: training loss 0.779 Epoch 36 iteration 0063/0187: training loss 0.778 Epoch 36 iteration 0064/0187: training loss 0.776 Epoch 36 iteration 0065/0187: training loss 0.773 Epoch 36 iteration 0066/0187: training loss 0.773 Epoch 36 iteration 0067/0187: training loss 0.773 Epoch 36 iteration 0068/0187: training loss 0.771 Epoch 36 iteration 0069/0187: training loss 0.772 Epoch 36 iteration 0070/0187: training loss 0.770 Epoch 36 iteration 0071/0187: training loss 0.771 Epoch 36 iteration 0072/0187: training loss 0.769 Epoch 36 iteration 0073/0187: training loss 0.770 Epoch 36 iteration 0074/0187: training loss 0.773 Epoch 36 iteration 0075/0187: training loss 0.773 Epoch 36 iteration 0076/0187: training loss 0.772 Epoch 36 iteration 0077/0187: training loss 0.772 Epoch 36 iteration 0078/0187: training loss 0.772 Epoch 36 iteration 0079/0187: training loss 0.773 Epoch 36 iteration 0080/0187: training loss 0.773 Epoch 36 iteration 0081/0187: training loss 0.771 Epoch 36 iteration 0082/0187: training loss 0.771 Epoch 36 iteration 0083/0187: training loss 0.771 Epoch 36 iteration 0084/0187: training loss 0.769 Epoch 36 iteration 0085/0187: training loss 0.769 Epoch 36 iteration 0086/0187: training loss 0.769 Epoch 36 iteration 0087/0187: training loss 0.768 Epoch 36 iteration 0088/0187: training loss 0.768 Epoch 36 iteration 0089/0187: training loss 0.766 Epoch 36 iteration 0090/0187: training loss 0.766 Epoch 36 iteration 0091/0188: training loss 0.765 Epoch 36 iteration 0092/0188: training loss 0.764 Epoch 36 iteration 0093/0188: training loss 0.765 Epoch 36 iteration 0094/0188: training loss 0.766 Epoch 36 iteration 0095/0188: training loss 0.767 Epoch 36 iteration 0096/0188: training loss 0.767 Epoch 36 iteration 0097/0188: training loss 0.767 Epoch 36 iteration 0098/0188: training loss 0.767 Epoch 36 iteration 0099/0188: training loss 0.770 Epoch 36 iteration 0100/0188: training loss 0.770 Epoch 36 iteration 0101/0188: training loss 0.769 Epoch 36 iteration 0102/0188: training loss 0.771 Epoch 36 iteration 0103/0188: training loss 0.770 Epoch 36 iteration 0104/0188: training loss 0.773 Epoch 36 iteration 0105/0188: training loss 0.772 Epoch 36 iteration 0106/0188: training loss 0.772 Epoch 36 iteration 0107/0188: training loss 0.772 Epoch 36 iteration 0108/0188: training loss 0.772 Epoch 36 iteration 0109/0188: training loss 0.771 Epoch 36 iteration 0110/0188: training loss 0.770 Epoch 36 iteration 0111/0188: training loss 0.772 Epoch 36 iteration 0112/0188: training loss 0.773 Epoch 36 iteration 0113/0188: training loss 0.772 Epoch 36 iteration 0114/0188: training loss 0.772 Epoch 36 iteration 0115/0188: training loss 0.771 Epoch 36 iteration 0116/0188: training loss 0.770 Epoch 36 iteration 0117/0188: training loss 0.772 Epoch 36 iteration 0118/0188: training loss 0.771 Epoch 36 iteration 0119/0188: training loss 0.772 Epoch 36 iteration 0120/0188: training loss 0.773 Epoch 36 iteration 0121/0188: training loss 0.775 Epoch 36 iteration 0122/0188: training loss 0.775 Epoch 36 iteration 0123/0188: training loss 0.774 Epoch 36 iteration 0124/0188: training loss 0.775 Epoch 36 iteration 0125/0188: training loss 0.775 Epoch 36 iteration 0126/0188: training loss 0.774 Epoch 36 iteration 0127/0188: training loss 0.772 Epoch 36 iteration 0128/0188: training loss 0.772 Epoch 36 iteration 0129/0188: training loss 0.771 Epoch 36 iteration 0130/0188: training loss 0.771 Epoch 36 iteration 0131/0188: training loss 0.770 Epoch 36 iteration 0132/0188: training loss 0.770 Epoch 36 iteration 0133/0188: training loss 0.771 Epoch 36 iteration 0134/0188: training loss 0.770 Epoch 36 iteration 0135/0188: training loss 0.770 Epoch 36 iteration 0136/0188: training loss 0.771 Epoch 36 iteration 0137/0188: training loss 0.770 Epoch 36 iteration 0138/0188: training loss 0.770 Epoch 36 iteration 0139/0188: training loss 0.770 Epoch 36 iteration 0140/0188: training loss 0.769 Epoch 36 iteration 0141/0188: training loss 0.770 Epoch 36 iteration 0142/0188: training loss 0.770 Epoch 36 iteration 0143/0188: training loss 0.769 Epoch 36 iteration 0144/0188: training loss 0.768 Epoch 36 iteration 0145/0188: training loss 0.768 Epoch 36 iteration 0146/0188: training loss 0.770 Epoch 36 iteration 0147/0188: training loss 0.770 Epoch 36 iteration 0148/0188: training loss 0.769 Epoch 36 iteration 0149/0188: training loss 0.769 Epoch 36 iteration 0150/0188: training loss 0.768 Epoch 36 iteration 0151/0188: training loss 0.769 Epoch 36 iteration 0152/0188: training loss 0.768 Epoch 36 iteration 0153/0188: training loss 0.768 Epoch 36 iteration 0154/0188: training loss 0.768 Epoch 36 iteration 0155/0188: training loss 0.768 Epoch 36 iteration 0156/0188: training loss 0.767 Epoch 36 iteration 0157/0188: training loss 0.767 Epoch 36 iteration 0158/0188: training loss 0.767 Epoch 36 iteration 0159/0188: training loss 0.766 Epoch 36 iteration 0160/0188: training loss 0.766 Epoch 36 iteration 0161/0188: training loss 0.767 Epoch 36 iteration 0162/0188: training loss 0.766 Epoch 36 iteration 0163/0188: training loss 0.768 Epoch 36 iteration 0164/0188: training loss 0.768 Epoch 36 iteration 0165/0188: training loss 0.768 Epoch 36 iteration 0166/0188: training loss 0.767 Epoch 36 iteration 0167/0188: training loss 0.767 Epoch 36 iteration 0168/0188: training loss 0.767 Epoch 36 iteration 0169/0188: training loss 0.768 Epoch 36 iteration 0170/0188: training loss 0.768 Epoch 36 iteration 0171/0188: training loss 0.768 Epoch 36 iteration 0172/0188: training loss 0.770 Epoch 36 iteration 0173/0188: training loss 0.769 Epoch 36 iteration 0174/0188: training loss 0.769 Epoch 36 iteration 0175/0188: training loss 0.771 Epoch 36 iteration 0176/0188: training loss 0.770 Epoch 36 iteration 0177/0188: training loss 0.771 Epoch 36 iteration 0178/0188: training loss 0.771 Epoch 36 iteration 0179/0188: training loss 0.771 Epoch 36 iteration 0180/0188: training loss 0.770 Epoch 36 iteration 0181/0188: training loss 0.770 Epoch 36 iteration 0182/0188: training loss 0.770 Epoch 36 iteration 0183/0188: training loss 0.769 Epoch 36 iteration 0184/0188: training loss 0.769 Epoch 36 iteration 0185/0188: training loss 0.769 Epoch 36 iteration 0186/0188: training loss 0.768 Epoch 36 validation pixAcc: 0.873, mIoU: 0.381 Epoch 37 iteration 0001/0187: training loss 0.650 Epoch 37 iteration 0002/0187: training loss 0.668 Epoch 37 iteration 0003/0187: training loss 0.699 Epoch 37 iteration 0004/0187: training loss 0.731 Epoch 37 iteration 0005/0187: training loss 0.742 Epoch 37 iteration 0006/0187: training loss 0.730 Epoch 37 iteration 0007/0187: training loss 0.751 Epoch 37 iteration 0008/0187: training loss 0.773 Epoch 37 iteration 0009/0187: training loss 0.765 Epoch 37 iteration 0010/0187: training loss 0.786 Epoch 37 iteration 0011/0187: training loss 0.772 Epoch 37 iteration 0012/0187: training loss 0.761 Epoch 37 iteration 0013/0187: training loss 0.782 Epoch 37 iteration 0014/0187: training loss 0.788 Epoch 37 iteration 0015/0187: training loss 0.791 Epoch 37 iteration 0016/0187: training loss 0.797 Epoch 37 iteration 0017/0187: training loss 0.785 Epoch 37 iteration 0018/0187: training loss 0.787 Epoch 37 iteration 0019/0187: training loss 0.789 Epoch 37 iteration 0020/0187: training loss 0.783 Epoch 37 iteration 0021/0187: training loss 0.788 Epoch 37 iteration 0022/0187: training loss 0.788 Epoch 37 iteration 0023/0187: training loss 0.784 Epoch 37 iteration 0024/0187: training loss 0.778 Epoch 37 iteration 0025/0187: training loss 0.780 Epoch 37 iteration 0026/0187: training loss 0.774 Epoch 37 iteration 0027/0187: training loss 0.769 Epoch 37 iteration 0028/0187: training loss 0.767 Epoch 37 iteration 0029/0187: training loss 0.767 Epoch 37 iteration 0030/0187: training loss 0.769 Epoch 37 iteration 0031/0187: training loss 0.771 Epoch 37 iteration 0032/0187: training loss 0.769 Epoch 37 iteration 0033/0187: training loss 0.763 Epoch 37 iteration 0034/0187: training loss 0.768 Epoch 37 iteration 0035/0187: training loss 0.771 Epoch 37 iteration 0036/0187: training loss 0.766 Epoch 37 iteration 0037/0187: training loss 0.764 Epoch 37 iteration 0038/0187: training loss 0.761 Epoch 37 iteration 0039/0187: training loss 0.758 Epoch 37 iteration 0040/0187: training loss 0.757 Epoch 37 iteration 0041/0187: training loss 0.753 Epoch 37 iteration 0042/0187: training loss 0.752 Epoch 37 iteration 0043/0187: training loss 0.752 Epoch 37 iteration 0044/0187: training loss 0.750 Epoch 37 iteration 0045/0187: training loss 0.751 Epoch 37 iteration 0046/0187: training loss 0.750 Epoch 37 iteration 0047/0187: training loss 0.751 Epoch 37 iteration 0048/0187: training loss 0.749 Epoch 37 iteration 0049/0187: training loss 0.751 Epoch 37 iteration 0050/0187: training loss 0.750 Epoch 37 iteration 0051/0187: training loss 0.749 Epoch 37 iteration 0052/0187: training loss 0.747 Epoch 37 iteration 0053/0187: training loss 0.750 Epoch 37 iteration 0054/0187: training loss 0.747 Epoch 37 iteration 0055/0187: training loss 0.748 Epoch 37 iteration 0056/0187: training loss 0.745 Epoch 37 iteration 0057/0187: training loss 0.744 Epoch 37 iteration 0058/0187: training loss 0.744 Epoch 37 iteration 0059/0187: training loss 0.747 Epoch 37 iteration 0060/0187: training loss 0.745 Epoch 37 iteration 0061/0187: training loss 0.742 Epoch 37 iteration 0062/0187: training loss 0.742 Epoch 37 iteration 0063/0187: training loss 0.744 Epoch 37 iteration 0064/0187: training loss 0.741 Epoch 37 iteration 0065/0187: training loss 0.740 Epoch 37 iteration 0066/0187: training loss 0.739 Epoch 37 iteration 0067/0187: training loss 0.741 Epoch 37 iteration 0068/0187: training loss 0.742 Epoch 37 iteration 0069/0187: training loss 0.741 Epoch 37 iteration 0070/0187: training loss 0.741 Epoch 37 iteration 0071/0187: training loss 0.741 Epoch 37 iteration 0072/0187: training loss 0.740 Epoch 37 iteration 0073/0187: training loss 0.739 Epoch 37 iteration 0074/0187: training loss 0.738 Epoch 37 iteration 0075/0187: training loss 0.738 Epoch 37 iteration 0076/0187: training loss 0.738 Epoch 37 iteration 0077/0187: training loss 0.738 Epoch 37 iteration 0078/0187: training loss 0.736 Epoch 37 iteration 0079/0187: training loss 0.737 Epoch 37 iteration 0080/0187: training loss 0.737 Epoch 37 iteration 0081/0187: training loss 0.736 Epoch 37 iteration 0082/0187: training loss 0.737 Epoch 37 iteration 0083/0187: training loss 0.736 Epoch 37 iteration 0084/0187: training loss 0.736 Epoch 37 iteration 0085/0187: training loss 0.736 Epoch 37 iteration 0086/0187: training loss 0.736 Epoch 37 iteration 0087/0187: training loss 0.738 Epoch 37 iteration 0088/0187: training loss 0.739 Epoch 37 iteration 0089/0187: training loss 0.739 Epoch 37 iteration 0090/0187: training loss 0.741 Epoch 37 iteration 0091/0187: training loss 0.740 Epoch 37 iteration 0092/0187: training loss 0.743 Epoch 37 iteration 0093/0187: training loss 0.742 Epoch 37 iteration 0094/0187: training loss 0.743 Epoch 37 iteration 0095/0187: training loss 0.743 Epoch 37 iteration 0096/0187: training loss 0.745 Epoch 37 iteration 0097/0187: training loss 0.748 Epoch 37 iteration 0098/0187: training loss 0.748 Epoch 37 iteration 0099/0187: training loss 0.751 Epoch 37 iteration 0100/0187: training loss 0.751 Epoch 37 iteration 0101/0187: training loss 0.750 Epoch 37 iteration 0102/0187: training loss 0.751 Epoch 37 iteration 0103/0187: training loss 0.749 Epoch 37 iteration 0104/0187: training loss 0.750 Epoch 37 iteration 0105/0187: training loss 0.748 Epoch 37 iteration 0106/0187: training loss 0.747 Epoch 37 iteration 0107/0187: training loss 0.746 Epoch 37 iteration 0108/0187: training loss 0.746 Epoch 37 iteration 0109/0187: training loss 0.745 Epoch 37 iteration 0110/0187: training loss 0.748 Epoch 37 iteration 0111/0187: training loss 0.747 Epoch 37 iteration 0112/0187: training loss 0.749 Epoch 37 iteration 0113/0187: training loss 0.748 Epoch 37 iteration 0114/0187: training loss 0.749 Epoch 37 iteration 0115/0187: training loss 0.749 Epoch 37 iteration 0116/0187: training loss 0.749 Epoch 37 iteration 0117/0187: training loss 0.750 Epoch 37 iteration 0118/0187: training loss 0.753 Epoch 37 iteration 0119/0187: training loss 0.753 Epoch 37 iteration 0120/0187: training loss 0.753 Epoch 37 iteration 0121/0187: training loss 0.754 Epoch 37 iteration 0122/0187: training loss 0.753 Epoch 37 iteration 0123/0187: training loss 0.753 Epoch 37 iteration 0124/0187: training loss 0.755 Epoch 37 iteration 0125/0187: training loss 0.757 Epoch 37 iteration 0126/0187: training loss 0.757 Epoch 37 iteration 0127/0187: training loss 0.757 Epoch 37 iteration 0128/0187: training loss 0.757 Epoch 37 iteration 0129/0187: training loss 0.758 Epoch 37 iteration 0130/0187: training loss 0.758 Epoch 37 iteration 0131/0187: training loss 0.758 Epoch 37 iteration 0132/0187: training loss 0.757 Epoch 37 iteration 0133/0187: training loss 0.757 Epoch 37 iteration 0134/0187: training loss 0.756 Epoch 37 iteration 0135/0187: training loss 0.754 Epoch 37 iteration 0136/0187: training loss 0.755 Epoch 37 iteration 0137/0187: training loss 0.757 Epoch 37 iteration 0138/0187: training loss 0.757 Epoch 37 iteration 0139/0187: training loss 0.758 Epoch 37 iteration 0140/0187: training loss 0.759 Epoch 37 iteration 0141/0187: training loss 0.759 Epoch 37 iteration 0142/0187: training loss 0.758 Epoch 37 iteration 0143/0187: training loss 0.758 Epoch 37 iteration 0144/0187: training loss 0.758 Epoch 37 iteration 0145/0187: training loss 0.759 Epoch 37 iteration 0146/0187: training loss 0.759 Epoch 37 iteration 0147/0187: training loss 0.759 Epoch 37 iteration 0148/0187: training loss 0.759 Epoch 37 iteration 0149/0187: training loss 0.758 Epoch 37 iteration 0150/0187: training loss 0.757 Epoch 37 iteration 0151/0187: training loss 0.756 Epoch 37 iteration 0152/0187: training loss 0.756 Epoch 37 iteration 0153/0187: training loss 0.755 Epoch 37 iteration 0154/0187: training loss 0.755 Epoch 37 iteration 0155/0187: training loss 0.755 Epoch 37 iteration 0156/0187: training loss 0.755 Epoch 37 iteration 0157/0187: training loss 0.755 Epoch 37 iteration 0158/0187: training loss 0.756 Epoch 37 iteration 0159/0187: training loss 0.755 Epoch 37 iteration 0160/0187: training loss 0.754 Epoch 37 iteration 0161/0187: training loss 0.754 Epoch 37 iteration 0162/0187: training loss 0.755 Epoch 37 iteration 0163/0187: training loss 0.754 Epoch 37 iteration 0164/0187: training loss 0.754 Epoch 37 iteration 0165/0187: training loss 0.753 Epoch 37 iteration 0166/0187: training loss 0.754 Epoch 37 iteration 0167/0187: training loss 0.754 Epoch 37 iteration 0168/0187: training loss 0.754 Epoch 37 iteration 0169/0187: training loss 0.754 Epoch 37 iteration 0170/0187: training loss 0.755 Epoch 37 iteration 0171/0187: training loss 0.754 Epoch 37 iteration 0172/0187: training loss 0.754 Epoch 37 iteration 0173/0187: training loss 0.753 Epoch 37 iteration 0174/0187: training loss 0.753 Epoch 37 iteration 0175/0187: training loss 0.753 Epoch 37 iteration 0176/0187: training loss 0.752 Epoch 37 iteration 0177/0187: training loss 0.753 Epoch 37 iteration 0178/0187: training loss 0.754 Epoch 37 iteration 0179/0187: training loss 0.753 Epoch 37 iteration 0180/0187: training loss 0.753 Epoch 37 iteration 0181/0187: training loss 0.753 Epoch 37 iteration 0182/0187: training loss 0.753 Epoch 37 iteration 0183/0187: training loss 0.754 Epoch 37 iteration 0184/0187: training loss 0.753 Epoch 37 iteration 0185/0187: training loss 0.752 Epoch 37 iteration 0186/0187: training loss 0.752 Epoch 37 iteration 0187/0187: training loss 0.751 Epoch 37 validation pixAcc: 0.873, mIoU: 0.378 Epoch 38 iteration 0001/0187: training loss 0.630 Epoch 38 iteration 0002/0187: training loss 0.629 Epoch 38 iteration 0003/0187: training loss 0.644 Epoch 38 iteration 0004/0187: training loss 0.670 Epoch 38 iteration 0005/0187: training loss 0.684 Epoch 38 iteration 0006/0187: training loss 0.696 Epoch 38 iteration 0007/0187: training loss 0.707 Epoch 38 iteration 0008/0187: training loss 0.708 Epoch 38 iteration 0009/0187: training loss 0.736 Epoch 38 iteration 0010/0187: training loss 0.744 Epoch 38 iteration 0011/0187: training loss 0.750 Epoch 38 iteration 0012/0187: training loss 0.748 Epoch 38 iteration 0013/0187: training loss 0.739 Epoch 38 iteration 0014/0187: training loss 0.726 Epoch 38 iteration 0015/0187: training loss 0.740 Epoch 38 iteration 0016/0187: training loss 0.736 Epoch 38 iteration 0017/0187: training loss 0.753 Epoch 38 iteration 0018/0187: training loss 0.750 Epoch 38 iteration 0019/0187: training loss 0.745 Epoch 38 iteration 0020/0187: training loss 0.748 Epoch 38 iteration 0021/0187: training loss 0.745 Epoch 38 iteration 0022/0187: training loss 0.738 Epoch 38 iteration 0023/0187: training loss 0.748 Epoch 38 iteration 0024/0187: training loss 0.752 Epoch 38 iteration 0025/0187: training loss 0.757 Epoch 38 iteration 0026/0187: training loss 0.757 Epoch 38 iteration 0027/0187: training loss 0.755 Epoch 38 iteration 0028/0187: training loss 0.754 Epoch 38 iteration 0029/0187: training loss 0.752 Epoch 38 iteration 0030/0187: training loss 0.754 Epoch 38 iteration 0031/0187: training loss 0.752 Epoch 38 iteration 0032/0187: training loss 0.748 Epoch 38 iteration 0033/0187: training loss 0.749 Epoch 38 iteration 0034/0187: training loss 0.748 Epoch 38 iteration 0035/0187: training loss 0.750 Epoch 38 iteration 0036/0187: training loss 0.746 Epoch 38 iteration 0037/0187: training loss 0.745 Epoch 38 iteration 0038/0187: training loss 0.745 Epoch 38 iteration 0039/0187: training loss 0.747 Epoch 38 iteration 0040/0187: training loss 0.751 Epoch 38 iteration 0041/0187: training loss 0.755 Epoch 38 iteration 0042/0187: training loss 0.753 Epoch 38 iteration 0043/0187: training loss 0.756 Epoch 38 iteration 0044/0187: training loss 0.756 Epoch 38 iteration 0045/0187: training loss 0.760 Epoch 38 iteration 0046/0187: training loss 0.756 Epoch 38 iteration 0047/0187: training loss 0.759 Epoch 38 iteration 0048/0187: training loss 0.757 Epoch 38 iteration 0049/0187: training loss 0.757 Epoch 38 iteration 0050/0187: training loss 0.753 Epoch 38 iteration 0051/0187: training loss 0.757 Epoch 38 iteration 0052/0187: training loss 0.759 Epoch 38 iteration 0053/0187: training loss 0.758 Epoch 38 iteration 0054/0187: training loss 0.757 Epoch 38 iteration 0055/0187: training loss 0.759 Epoch 38 iteration 0056/0187: training loss 0.757 Epoch 38 iteration 0057/0187: training loss 0.759 Epoch 38 iteration 0058/0187: training loss 0.758 Epoch 38 iteration 0059/0187: training loss 0.760 Epoch 38 iteration 0060/0187: training loss 0.758 Epoch 38 iteration 0061/0187: training loss 0.759 Epoch 38 iteration 0062/0187: training loss 0.762 Epoch 38 iteration 0063/0187: training loss 0.762 Epoch 38 iteration 0064/0187: training loss 0.761 Epoch 38 iteration 0065/0187: training loss 0.759 Epoch 38 iteration 0066/0187: training loss 0.757 Epoch 38 iteration 0067/0187: training loss 0.756 Epoch 38 iteration 0068/0187: training loss 0.755 Epoch 38 iteration 0069/0187: training loss 0.753 Epoch 38 iteration 0070/0187: training loss 0.755 Epoch 38 iteration 0071/0187: training loss 0.754 Epoch 38 iteration 0072/0187: training loss 0.755 Epoch 38 iteration 0073/0187: training loss 0.755 Epoch 38 iteration 0074/0187: training loss 0.757 Epoch 38 iteration 0075/0187: training loss 0.760 Epoch 38 iteration 0076/0187: training loss 0.761 Epoch 38 iteration 0077/0187: training loss 0.761 Epoch 38 iteration 0078/0187: training loss 0.761 Epoch 38 iteration 0079/0187: training loss 0.761 Epoch 38 iteration 0080/0187: training loss 0.761 Epoch 38 iteration 0081/0187: training loss 0.759 Epoch 38 iteration 0082/0187: training loss 0.760 Epoch 38 iteration 0083/0187: training loss 0.762 Epoch 38 iteration 0084/0187: training loss 0.761 Epoch 38 iteration 0085/0187: training loss 0.760 Epoch 38 iteration 0086/0187: training loss 0.759 Epoch 38 iteration 0087/0187: training loss 0.758 Epoch 38 iteration 0088/0187: training loss 0.757 Epoch 38 iteration 0089/0187: training loss 0.755 Epoch 38 iteration 0090/0187: training loss 0.755 Epoch 38 iteration 0091/0188: training loss 0.756 Epoch 38 iteration 0092/0188: training loss 0.757 Epoch 38 iteration 0093/0188: training loss 0.754 Epoch 38 iteration 0094/0188: training loss 0.753 Epoch 38 iteration 0095/0188: training loss 0.756 Epoch 38 iteration 0096/0188: training loss 0.756 Epoch 38 iteration 0097/0188: training loss 0.756 Epoch 38 iteration 0098/0188: training loss 0.755 Epoch 38 iteration 0099/0188: training loss 0.755 Epoch 38 iteration 0100/0188: training loss 0.754 Epoch 38 iteration 0101/0188: training loss 0.753 Epoch 38 iteration 0102/0188: training loss 0.752 Epoch 38 iteration 0103/0188: training loss 0.752 Epoch 38 iteration 0104/0188: training loss 0.753 Epoch 38 iteration 0105/0188: training loss 0.753 Epoch 38 iteration 0106/0188: training loss 0.755 Epoch 38 iteration 0107/0188: training loss 0.756 Epoch 38 iteration 0108/0188: training loss 0.756 Epoch 38 iteration 0109/0188: training loss 0.755 Epoch 38 iteration 0110/0188: training loss 0.757 Epoch 38 iteration 0111/0188: training loss 0.757 Epoch 38 iteration 0112/0188: training loss 0.757 Epoch 38 iteration 0113/0188: training loss 0.757 Epoch 38 iteration 0114/0188: training loss 0.757 Epoch 38 iteration 0115/0188: training loss 0.758 Epoch 38 iteration 0116/0188: training loss 0.757 Epoch 38 iteration 0117/0188: training loss 0.756 Epoch 38 iteration 0118/0188: training loss 0.756 Epoch 38 iteration 0119/0188: training loss 0.756 Epoch 38 iteration 0120/0188: training loss 0.755 Epoch 38 iteration 0121/0188: training loss 0.755 Epoch 38 iteration 0122/0188: training loss 0.755 Epoch 38 iteration 0123/0188: training loss 0.755 Epoch 38 iteration 0124/0188: training loss 0.755 Epoch 38 iteration 0125/0188: training loss 0.755 Epoch 38 iteration 0126/0188: training loss 0.754 Epoch 38 iteration 0127/0188: training loss 0.755 Epoch 38 iteration 0128/0188: training loss 0.755 Epoch 38 iteration 0129/0188: training loss 0.754 Epoch 38 iteration 0130/0188: training loss 0.754 Epoch 38 iteration 0131/0188: training loss 0.753 Epoch 38 iteration 0132/0188: training loss 0.753 Epoch 38 iteration 0133/0188: training loss 0.754 Epoch 38 iteration 0134/0188: training loss 0.753 Epoch 38 iteration 0135/0188: training loss 0.753 Epoch 38 iteration 0136/0188: training loss 0.752 Epoch 38 iteration 0137/0188: training loss 0.752 Epoch 38 iteration 0138/0188: training loss 0.753 Epoch 38 iteration 0139/0188: training loss 0.754 Epoch 38 iteration 0140/0188: training loss 0.754 Epoch 38 iteration 0141/0188: training loss 0.754 Epoch 38 iteration 0142/0188: training loss 0.756 Epoch 38 iteration 0143/0188: training loss 0.757 Epoch 38 iteration 0144/0188: training loss 0.757 Epoch 38 iteration 0145/0188: training loss 0.758 Epoch 38 iteration 0146/0188: training loss 0.758 Epoch 38 iteration 0147/0188: training loss 0.758 Epoch 38 iteration 0148/0188: training loss 0.758 Epoch 38 iteration 0149/0188: training loss 0.759 Epoch 38 iteration 0150/0188: training loss 0.758 Epoch 38 iteration 0151/0188: training loss 0.759 Epoch 38 iteration 0152/0188: training loss 0.759 Epoch 38 iteration 0153/0188: training loss 0.759 Epoch 38 iteration 0154/0188: training loss 0.760 Epoch 38 iteration 0155/0188: training loss 0.760 Epoch 38 iteration 0156/0188: training loss 0.759 Epoch 38 iteration 0157/0188: training loss 0.760 Epoch 38 iteration 0158/0188: training loss 0.759 Epoch 38 iteration 0159/0188: training loss 0.758 Epoch 38 iteration 0160/0188: training loss 0.758 Epoch 38 iteration 0161/0188: training loss 0.757 Epoch 38 iteration 0162/0188: training loss 0.756 Epoch 38 iteration 0163/0188: training loss 0.755 Epoch 38 iteration 0164/0188: training loss 0.755 Epoch 38 iteration 0165/0188: training loss 0.755 Epoch 38 iteration 0166/0188: training loss 0.754 Epoch 38 iteration 0167/0188: training loss 0.755 Epoch 38 iteration 0168/0188: training loss 0.756 Epoch 38 iteration 0169/0188: training loss 0.755 Epoch 38 iteration 0170/0188: training loss 0.755 Epoch 38 iteration 0171/0188: training loss 0.756 Epoch 38 iteration 0172/0188: training loss 0.755 Epoch 38 iteration 0173/0188: training loss 0.756 Epoch 38 iteration 0174/0188: training loss 0.757 Epoch 38 iteration 0175/0188: training loss 0.755 Epoch 38 iteration 0176/0188: training loss 0.755 Epoch 38 iteration 0177/0188: training loss 0.756 Epoch 38 iteration 0178/0188: training loss 0.755 Epoch 38 iteration 0179/0188: training loss 0.755 Epoch 38 iteration 0180/0188: training loss 0.755 Epoch 38 iteration 0181/0188: training loss 0.755 Epoch 38 iteration 0182/0188: training loss 0.754 Epoch 38 iteration 0183/0188: training loss 0.755 Epoch 38 iteration 0184/0188: training loss 0.758 Epoch 38 iteration 0185/0188: training loss 0.758 Epoch 38 iteration 0186/0188: training loss 0.758 Epoch 38 validation pixAcc: 0.872, mIoU: 0.378 Epoch 39 iteration 0001/0187: training loss 0.774 Epoch 39 iteration 0002/0187: training loss 0.726 Epoch 39 iteration 0003/0187: training loss 0.713 Epoch 39 iteration 0004/0187: training loss 0.695 Epoch 39 iteration 0005/0187: training loss 0.699 Epoch 39 iteration 0006/0187: training loss 0.693 Epoch 39 iteration 0007/0187: training loss 0.742 Epoch 39 iteration 0008/0187: training loss 0.734 Epoch 39 iteration 0009/0187: training loss 0.727 Epoch 39 iteration 0010/0187: training loss 0.725 Epoch 39 iteration 0011/0187: training loss 0.726 Epoch 39 iteration 0012/0187: training loss 0.723 Epoch 39 iteration 0013/0187: training loss 0.747 Epoch 39 iteration 0014/0187: training loss 0.740 Epoch 39 iteration 0015/0187: training loss 0.739 Epoch 39 iteration 0016/0187: training loss 0.734 Epoch 39 iteration 0017/0187: training loss 0.753 Epoch 39 iteration 0018/0187: training loss 0.751 Epoch 39 iteration 0019/0187: training loss 0.759 Epoch 39 iteration 0020/0187: training loss 0.776 Epoch 39 iteration 0021/0187: training loss 0.769 Epoch 39 iteration 0022/0187: training loss 0.773 Epoch 39 iteration 0023/0187: training loss 0.771 Epoch 39 iteration 0024/0187: training loss 0.775 Epoch 39 iteration 0025/0187: training loss 0.775 Epoch 39 iteration 0026/0187: training loss 0.778 Epoch 39 iteration 0027/0187: training loss 0.779 Epoch 39 iteration 0028/0187: training loss 0.774 Epoch 39 iteration 0029/0187: training loss 0.769 Epoch 39 iteration 0030/0187: training loss 0.765 Epoch 39 iteration 0031/0187: training loss 0.766 Epoch 39 iteration 0032/0187: training loss 0.768 Epoch 39 iteration 0033/0187: training loss 0.772 Epoch 39 iteration 0034/0187: training loss 0.772 Epoch 39 iteration 0035/0187: training loss 0.766 Epoch 39 iteration 0036/0187: training loss 0.765 Epoch 39 iteration 0037/0187: training loss 0.765 Epoch 39 iteration 0038/0187: training loss 0.773 Epoch 39 iteration 0039/0187: training loss 0.775 Epoch 39 iteration 0040/0187: training loss 0.782 Epoch 39 iteration 0041/0187: training loss 0.779 Epoch 39 iteration 0042/0187: training loss 0.780 Epoch 39 iteration 0043/0187: training loss 0.778 Epoch 39 iteration 0044/0187: training loss 0.780 Epoch 39 iteration 0045/0187: training loss 0.779 Epoch 39 iteration 0046/0187: training loss 0.773 Epoch 39 iteration 0047/0187: training loss 0.772 Epoch 39 iteration 0048/0187: training loss 0.769 Epoch 39 iteration 0049/0187: training loss 0.765 Epoch 39 iteration 0050/0187: training loss 0.762 Epoch 39 iteration 0051/0187: training loss 0.765 Epoch 39 iteration 0052/0187: training loss 0.763 Epoch 39 iteration 0053/0187: training loss 0.760 Epoch 39 iteration 0054/0187: training loss 0.759 Epoch 39 iteration 0055/0187: training loss 0.764 Epoch 39 iteration 0056/0187: training loss 0.767 Epoch 39 iteration 0057/0187: training loss 0.768 Epoch 39 iteration 0058/0187: training loss 0.769 Epoch 39 iteration 0059/0187: training loss 0.769 Epoch 39 iteration 0060/0187: training loss 0.768 Epoch 39 iteration 0061/0187: training loss 0.768 Epoch 39 iteration 0062/0187: training loss 0.766 Epoch 39 iteration 0063/0187: training loss 0.765 Epoch 39 iteration 0064/0187: training loss 0.763 Epoch 39 iteration 0065/0187: training loss 0.763 Epoch 39 iteration 0066/0187: training loss 0.762 Epoch 39 iteration 0067/0187: training loss 0.763 Epoch 39 iteration 0068/0187: training loss 0.762 Epoch 39 iteration 0069/0187: training loss 0.762 Epoch 39 iteration 0070/0187: training loss 0.763 Epoch 39 iteration 0071/0187: training loss 0.763 Epoch 39 iteration 0072/0187: training loss 0.762 Epoch 39 iteration 0073/0187: training loss 0.760 Epoch 39 iteration 0074/0187: training loss 0.759 Epoch 39 iteration 0075/0187: training loss 0.758 Epoch 39 iteration 0076/0187: training loss 0.758 Epoch 39 iteration 0077/0187: training loss 0.760 Epoch 39 iteration 0078/0187: training loss 0.761 Epoch 39 iteration 0079/0187: training loss 0.761 Epoch 39 iteration 0080/0187: training loss 0.760 Epoch 39 iteration 0081/0187: training loss 0.760 Epoch 39 iteration 0082/0187: training loss 0.760 Epoch 39 iteration 0083/0187: training loss 0.759 Epoch 39 iteration 0084/0187: training loss 0.758 Epoch 39 iteration 0085/0187: training loss 0.757 Epoch 39 iteration 0086/0187: training loss 0.758 Epoch 39 iteration 0087/0187: training loss 0.758 Epoch 39 iteration 0088/0187: training loss 0.759 Epoch 39 iteration 0089/0187: training loss 0.760 Epoch 39 iteration 0090/0187: training loss 0.759 Epoch 39 iteration 0091/0187: training loss 0.757 Epoch 39 iteration 0092/0187: training loss 0.759 Epoch 39 iteration 0093/0187: training loss 0.761 Epoch 39 iteration 0094/0187: training loss 0.761 Epoch 39 iteration 0095/0187: training loss 0.762 Epoch 39 iteration 0096/0187: training loss 0.762 Epoch 39 iteration 0097/0187: training loss 0.761 Epoch 39 iteration 0098/0187: training loss 0.761 Epoch 39 iteration 0099/0187: training loss 0.763 Epoch 39 iteration 0100/0187: training loss 0.762 Epoch 39 iteration 0101/0187: training loss 0.763 Epoch 39 iteration 0102/0187: training loss 0.762 Epoch 39 iteration 0103/0187: training loss 0.762 Epoch 39 iteration 0104/0187: training loss 0.763 Epoch 39 iteration 0105/0187: training loss 0.760 Epoch 39 iteration 0106/0187: training loss 0.758 Epoch 39 iteration 0107/0187: training loss 0.758 Epoch 39 iteration 0108/0187: training loss 0.758 Epoch 39 iteration 0109/0187: training loss 0.760 Epoch 39 iteration 0110/0187: training loss 0.759 Epoch 39 iteration 0111/0187: training loss 0.758 Epoch 39 iteration 0112/0187: training loss 0.757 Epoch 39 iteration 0113/0187: training loss 0.757 Epoch 39 iteration 0114/0187: training loss 0.756 Epoch 39 iteration 0115/0187: training loss 0.755 Epoch 39 iteration 0116/0187: training loss 0.755 Epoch 39 iteration 0117/0187: training loss 0.756 Epoch 39 iteration 0118/0187: training loss 0.755 Epoch 39 iteration 0119/0187: training loss 0.756 Epoch 39 iteration 0120/0187: training loss 0.756 Epoch 39 iteration 0121/0187: training loss 0.757 Epoch 39 iteration 0122/0187: training loss 0.757 Epoch 39 iteration 0123/0187: training loss 0.757 Epoch 39 iteration 0124/0187: training loss 0.756 Epoch 39 iteration 0125/0187: training loss 0.756 Epoch 39 iteration 0126/0187: training loss 0.755 Epoch 39 iteration 0127/0187: training loss 0.754 Epoch 39 iteration 0128/0187: training loss 0.754 Epoch 39 iteration 0129/0187: training loss 0.756 Epoch 39 iteration 0130/0187: training loss 0.756 Epoch 39 iteration 0131/0187: training loss 0.756 Epoch 39 iteration 0132/0187: training loss 0.755 Epoch 39 iteration 0133/0187: training loss 0.757 Epoch 39 iteration 0134/0187: training loss 0.756 Epoch 39 iteration 0135/0187: training loss 0.755 Epoch 39 iteration 0136/0187: training loss 0.755 Epoch 39 iteration 0137/0187: training loss 0.756 Epoch 39 iteration 0138/0187: training loss 0.755 Epoch 39 iteration 0139/0187: training loss 0.755 Epoch 39 iteration 0140/0187: training loss 0.756 Epoch 39 iteration 0141/0187: training loss 0.759 Epoch 39 iteration 0142/0187: training loss 0.759 Epoch 39 iteration 0143/0187: training loss 0.759 Epoch 39 iteration 0144/0187: training loss 0.758 Epoch 39 iteration 0145/0187: training loss 0.758 Epoch 39 iteration 0146/0187: training loss 0.757 Epoch 39 iteration 0147/0187: training loss 0.758 Epoch 39 iteration 0148/0187: training loss 0.759 Epoch 39 iteration 0149/0187: training loss 0.760 Epoch 39 iteration 0150/0187: training loss 0.761 Epoch 39 iteration 0151/0187: training loss 0.761 Epoch 39 iteration 0152/0187: training loss 0.761 Epoch 39 iteration 0153/0187: training loss 0.761 Epoch 39 iteration 0154/0187: training loss 0.761 Epoch 39 iteration 0155/0187: training loss 0.761 Epoch 39 iteration 0156/0187: training loss 0.760 Epoch 39 iteration 0157/0187: training loss 0.761 Epoch 39 iteration 0158/0187: training loss 0.761 Epoch 39 iteration 0159/0187: training loss 0.761 Epoch 39 iteration 0160/0187: training loss 0.762 Epoch 39 iteration 0161/0187: training loss 0.764 Epoch 39 iteration 0162/0187: training loss 0.764 Epoch 39 iteration 0163/0187: training loss 0.764 Epoch 39 iteration 0164/0187: training loss 0.763 Epoch 39 iteration 0165/0187: training loss 0.762 Epoch 39 iteration 0166/0187: training loss 0.762 Epoch 39 iteration 0167/0187: training loss 0.763 Epoch 39 iteration 0168/0187: training loss 0.763 Epoch 39 iteration 0169/0187: training loss 0.764 Epoch 39 iteration 0170/0187: training loss 0.764 Epoch 39 iteration 0171/0187: training loss 0.763 Epoch 39 iteration 0172/0187: training loss 0.763 Epoch 39 iteration 0173/0187: training loss 0.762 Epoch 39 iteration 0174/0187: training loss 0.761 Epoch 39 iteration 0175/0187: training loss 0.761 Epoch 39 iteration 0176/0187: training loss 0.761 Epoch 39 iteration 0177/0187: training loss 0.761 Epoch 39 iteration 0178/0187: training loss 0.762 Epoch 39 iteration 0179/0187: training loss 0.763 Epoch 39 iteration 0180/0187: training loss 0.762 Epoch 39 iteration 0181/0187: training loss 0.762 Epoch 39 iteration 0182/0187: training loss 0.762 Epoch 39 iteration 0183/0187: training loss 0.762 Epoch 39 iteration 0184/0187: training loss 0.762 Epoch 39 iteration 0185/0187: training loss 0.763 Epoch 39 iteration 0186/0187: training loss 0.762 Epoch 39 iteration 0187/0187: training loss 0.761 Epoch 39 validation pixAcc: 0.872, mIoU: 0.381 Epoch 40 iteration 0001/0187: training loss 0.893 Epoch 40 iteration 0002/0187: training loss 0.826 Epoch 40 iteration 0003/0187: training loss 0.828 Epoch 40 iteration 0004/0187: training loss 0.789 Epoch 40 iteration 0005/0187: training loss 0.814 Epoch 40 iteration 0006/0187: training loss 0.800 Epoch 40 iteration 0007/0187: training loss 0.770 Epoch 40 iteration 0008/0187: training loss 0.772 Epoch 40 iteration 0009/0187: training loss 0.749 Epoch 40 iteration 0010/0187: training loss 0.730 Epoch 40 iteration 0011/0187: training loss 0.750 Epoch 40 iteration 0012/0187: training loss 0.745 Epoch 40 iteration 0013/0187: training loss 0.731 Epoch 40 iteration 0014/0187: training loss 0.728 Epoch 40 iteration 0015/0187: training loss 0.730 Epoch 40 iteration 0016/0187: training loss 0.735 Epoch 40 iteration 0017/0187: training loss 0.738 Epoch 40 iteration 0018/0187: training loss 0.741 Epoch 40 iteration 0019/0187: training loss 0.745 Epoch 40 iteration 0020/0187: training loss 0.740 Epoch 40 iteration 0021/0187: training loss 0.737 Epoch 40 iteration 0022/0187: training loss 0.737 Epoch 40 iteration 0023/0187: training loss 0.734 Epoch 40 iteration 0024/0187: training loss 0.736 Epoch 40 iteration 0025/0187: training loss 0.729 Epoch 40 iteration 0026/0187: training loss 0.720 Epoch 40 iteration 0027/0187: training loss 0.719 Epoch 40 iteration 0028/0187: training loss 0.727 Epoch 40 iteration 0029/0187: training loss 0.725 Epoch 40 iteration 0030/0187: training loss 0.731 Epoch 40 iteration 0031/0187: training loss 0.729 Epoch 40 iteration 0032/0187: training loss 0.729 Epoch 40 iteration 0033/0187: training loss 0.731 Epoch 40 iteration 0034/0187: training loss 0.731 Epoch 40 iteration 0035/0187: training loss 0.737 Epoch 40 iteration 0036/0187: training loss 0.734 Epoch 40 iteration 0037/0187: training loss 0.733 Epoch 40 iteration 0038/0187: training loss 0.732 Epoch 40 iteration 0039/0187: training loss 0.729 Epoch 40 iteration 0040/0187: training loss 0.730 Epoch 40 iteration 0041/0187: training loss 0.732 Epoch 40 iteration 0042/0187: training loss 0.731 Epoch 40 iteration 0043/0187: training loss 0.727 Epoch 40 iteration 0044/0187: training loss 0.726 Epoch 40 iteration 0045/0187: training loss 0.728 Epoch 40 iteration 0046/0187: training loss 0.726 Epoch 40 iteration 0047/0187: training loss 0.729 Epoch 40 iteration 0048/0187: training loss 0.728 Epoch 40 iteration 0049/0187: training loss 0.732 Epoch 40 iteration 0050/0187: training loss 0.732 Epoch 40 iteration 0051/0187: training loss 0.734 Epoch 40 iteration 0052/0187: training loss 0.735 Epoch 40 iteration 0053/0187: training loss 0.734 Epoch 40 iteration 0054/0187: training loss 0.734 Epoch 40 iteration 0055/0187: training loss 0.733 Epoch 40 iteration 0056/0187: training loss 0.735 Epoch 40 iteration 0057/0187: training loss 0.735 Epoch 40 iteration 0058/0187: training loss 0.735 Epoch 40 iteration 0059/0187: training loss 0.738 Epoch 40 iteration 0060/0187: training loss 0.743 Epoch 40 iteration 0061/0187: training loss 0.744 Epoch 40 iteration 0062/0187: training loss 0.743 Epoch 40 iteration 0063/0187: training loss 0.745 Epoch 40 iteration 0064/0187: training loss 0.746 Epoch 40 iteration 0065/0187: training loss 0.746 Epoch 40 iteration 0066/0187: training loss 0.745 Epoch 40 iteration 0067/0187: training loss 0.744 Epoch 40 iteration 0068/0187: training loss 0.744 Epoch 40 iteration 0069/0187: training loss 0.743 Epoch 40 iteration 0070/0187: training loss 0.743 Epoch 40 iteration 0071/0187: training loss 0.747 Epoch 40 iteration 0072/0187: training loss 0.752 Epoch 40 iteration 0073/0187: training loss 0.752 Epoch 40 iteration 0074/0187: training loss 0.752 Epoch 40 iteration 0075/0187: training loss 0.750 Epoch 40 iteration 0076/0187: training loss 0.750 Epoch 40 iteration 0077/0187: training loss 0.749 Epoch 40 iteration 0078/0187: training loss 0.751 Epoch 40 iteration 0079/0187: training loss 0.755 Epoch 40 iteration 0080/0187: training loss 0.755 Epoch 40 iteration 0081/0187: training loss 0.753 Epoch 40 iteration 0082/0187: training loss 0.753 Epoch 40 iteration 0083/0187: training loss 0.753 Epoch 40 iteration 0084/0187: training loss 0.754 Epoch 40 iteration 0085/0187: training loss 0.752 Epoch 40 iteration 0086/0187: training loss 0.752 Epoch 40 iteration 0087/0187: training loss 0.752 Epoch 40 iteration 0088/0187: training loss 0.753 Epoch 40 iteration 0089/0187: training loss 0.752 Epoch 40 iteration 0090/0187: training loss 0.751 Epoch 40 iteration 0091/0188: training loss 0.751 Epoch 40 iteration 0092/0188: training loss 0.751 Epoch 40 iteration 0093/0188: training loss 0.751 Epoch 40 iteration 0094/0188: training loss 0.749 Epoch 40 iteration 0095/0188: training loss 0.747 Epoch 40 iteration 0096/0188: training loss 0.747 Epoch 40 iteration 0097/0188: training loss 0.747 Epoch 40 iteration 0098/0188: training loss 0.745 Epoch 40 iteration 0099/0188: training loss 0.745 Epoch 40 iteration 0100/0188: training loss 0.744 Epoch 40 iteration 0101/0188: training loss 0.745 Epoch 40 iteration 0102/0188: training loss 0.744 Epoch 40 iteration 0103/0188: training loss 0.743 Epoch 40 iteration 0104/0188: training loss 0.744 Epoch 40 iteration 0105/0188: training loss 0.743 Epoch 40 iteration 0106/0188: training loss 0.746 Epoch 40 iteration 0107/0188: training loss 0.745 Epoch 40 iteration 0108/0188: training loss 0.744 Epoch 40 iteration 0109/0188: training loss 0.746 Epoch 40 iteration 0110/0188: training loss 0.745 Epoch 40 iteration 0111/0188: training loss 0.746 Epoch 40 iteration 0112/0188: training loss 0.746 Epoch 40 iteration 0113/0188: training loss 0.746 Epoch 40 iteration 0114/0188: training loss 0.745 Epoch 40 iteration 0115/0188: training loss 0.746 Epoch 40 iteration 0116/0188: training loss 0.746 Epoch 40 iteration 0117/0188: training loss 0.747 Epoch 40 iteration 0118/0188: training loss 0.747 Epoch 40 iteration 0119/0188: training loss 0.748 Epoch 40 iteration 0120/0188: training loss 0.748 Epoch 40 iteration 0121/0188: training loss 0.748 Epoch 40 iteration 0122/0188: training loss 0.747 Epoch 40 iteration 0123/0188: training loss 0.747 Epoch 40 iteration 0124/0188: training loss 0.747 Epoch 40 iteration 0125/0188: training loss 0.745 Epoch 40 iteration 0126/0188: training loss 0.746 Epoch 40 iteration 0127/0188: training loss 0.747 Epoch 40 iteration 0128/0188: training loss 0.745 Epoch 40 iteration 0129/0188: training loss 0.745 Epoch 40 iteration 0130/0188: training loss 0.744 Epoch 40 iteration 0131/0188: training loss 0.745 Epoch 40 iteration 0132/0188: training loss 0.745 Epoch 40 iteration 0133/0188: training loss 0.745 Epoch 40 iteration 0134/0188: training loss 0.745 Epoch 40 iteration 0135/0188: training loss 0.744 Epoch 40 iteration 0136/0188: training loss 0.745 Epoch 40 iteration 0137/0188: training loss 0.746 Epoch 40 iteration 0138/0188: training loss 0.746 Epoch 40 iteration 0139/0188: training loss 0.745 Epoch 40 iteration 0140/0188: training loss 0.745 Epoch 40 iteration 0141/0188: training loss 0.745 Epoch 40 iteration 0142/0188: training loss 0.743 Epoch 40 iteration 0143/0188: training loss 0.743 Epoch 40 iteration 0144/0188: training loss 0.742 Epoch 40 iteration 0145/0188: training loss 0.742 Epoch 40 iteration 0146/0188: training loss 0.741 Epoch 40 iteration 0147/0188: training loss 0.742 Epoch 40 iteration 0148/0188: training loss 0.742 Epoch 40 iteration 0149/0188: training loss 0.742 Epoch 40 iteration 0150/0188: training loss 0.742 Epoch 40 iteration 0151/0188: training loss 0.742 Epoch 40 iteration 0152/0188: training loss 0.740 Epoch 40 iteration 0153/0188: training loss 0.741 Epoch 40 iteration 0154/0188: training loss 0.742 Epoch 40 iteration 0155/0188: training loss 0.742 Epoch 40 iteration 0156/0188: training loss 0.742 Epoch 40 iteration 0157/0188: training loss 0.741 Epoch 40 iteration 0158/0188: training loss 0.741 Epoch 40 iteration 0159/0188: training loss 0.740 Epoch 40 iteration 0160/0188: training loss 0.740 Epoch 40 iteration 0161/0188: training loss 0.740 Epoch 40 iteration 0162/0188: training loss 0.740 Epoch 40 iteration 0163/0188: training loss 0.741 Epoch 40 iteration 0164/0188: training loss 0.742 Epoch 40 iteration 0165/0188: training loss 0.741 Epoch 40 iteration 0166/0188: training loss 0.741 Epoch 40 iteration 0167/0188: training loss 0.741 Epoch 40 iteration 0168/0188: training loss 0.741 Epoch 40 iteration 0169/0188: training loss 0.740 Epoch 40 iteration 0170/0188: training loss 0.740 Epoch 40 iteration 0171/0188: training loss 0.741 Epoch 40 iteration 0172/0188: training loss 0.741 Epoch 40 iteration 0173/0188: training loss 0.741 Epoch 40 iteration 0174/0188: training loss 0.740 Epoch 40 iteration 0175/0188: training loss 0.740 Epoch 40 iteration 0176/0188: training loss 0.739 Epoch 40 iteration 0177/0188: training loss 0.740 Epoch 40 iteration 0178/0188: training loss 0.740 Epoch 40 iteration 0179/0188: training loss 0.740 Epoch 40 iteration 0180/0188: training loss 0.740 Epoch 40 iteration 0181/0188: training loss 0.739 Epoch 40 iteration 0182/0188: training loss 0.740 Epoch 40 iteration 0183/0188: training loss 0.740 Epoch 40 iteration 0184/0188: training loss 0.741 Epoch 40 iteration 0185/0188: training loss 0.740 Epoch 40 iteration 0186/0188: training loss 0.742 Epoch 40 validation pixAcc: 0.873, mIoU: 0.378 Epoch 41 iteration 0001/0187: training loss 0.745 Epoch 41 iteration 0002/0187: training loss 0.749 Epoch 41 iteration 0003/0187: training loss 0.741 Epoch 41 iteration 0004/0187: training loss 0.732 Epoch 41 iteration 0005/0187: training loss 0.728 Epoch 41 iteration 0006/0187: training loss 0.705 Epoch 41 iteration 0007/0187: training loss 0.703 Epoch 41 iteration 0008/0187: training loss 0.714 Epoch 41 iteration 0009/0187: training loss 0.713 Epoch 41 iteration 0010/0187: training loss 0.716 Epoch 41 iteration 0011/0187: training loss 0.709 Epoch 41 iteration 0012/0187: training loss 0.707 Epoch 41 iteration 0013/0187: training loss 0.704 Epoch 41 iteration 0014/0187: training loss 0.696 Epoch 41 iteration 0015/0187: training loss 0.690 Epoch 41 iteration 0016/0187: training loss 0.684 Epoch 41 iteration 0017/0187: training loss 0.693 Epoch 41 iteration 0018/0187: training loss 0.697 Epoch 41 iteration 0019/0187: training loss 0.709 Epoch 41 iteration 0020/0187: training loss 0.707 Epoch 41 iteration 0021/0187: training loss 0.718 Epoch 41 iteration 0022/0187: training loss 0.717 Epoch 41 iteration 0023/0187: training loss 0.713 Epoch 41 iteration 0024/0187: training loss 0.712 Epoch 41 iteration 0025/0187: training loss 0.719 Epoch 41 iteration 0026/0187: training loss 0.717 Epoch 41 iteration 0027/0187: training loss 0.710 Epoch 41 iteration 0028/0187: training loss 0.722 Epoch 41 iteration 0029/0187: training loss 0.723 Epoch 41 iteration 0030/0187: training loss 0.726 Epoch 41 iteration 0031/0187: training loss 0.728 Epoch 41 iteration 0032/0187: training loss 0.732 Epoch 41 iteration 0033/0187: training loss 0.735 Epoch 41 iteration 0034/0187: training loss 0.734 Epoch 41 iteration 0035/0187: training loss 0.732 Epoch 41 iteration 0036/0187: training loss 0.733 Epoch 41 iteration 0037/0187: training loss 0.737 Epoch 41 iteration 0038/0187: training loss 0.742 Epoch 41 iteration 0039/0187: training loss 0.744 Epoch 41 iteration 0040/0187: training loss 0.748 Epoch 41 iteration 0041/0187: training loss 0.745 Epoch 41 iteration 0042/0187: training loss 0.746 Epoch 41 iteration 0043/0187: training loss 0.748 Epoch 41 iteration 0044/0187: training loss 0.747 Epoch 41 iteration 0045/0187: training loss 0.744 Epoch 41 iteration 0046/0187: training loss 0.742 Epoch 41 iteration 0047/0187: training loss 0.744 Epoch 41 iteration 0048/0187: training loss 0.742 Epoch 41 iteration 0049/0187: training loss 0.744 Epoch 41 iteration 0050/0187: training loss 0.740 Epoch 41 iteration 0051/0187: training loss 0.742 Epoch 41 iteration 0052/0187: training loss 0.743 Epoch 41 iteration 0053/0187: training loss 0.744 Epoch 41 iteration 0054/0187: training loss 0.750 Epoch 41 iteration 0055/0187: training loss 0.748 Epoch 41 iteration 0056/0187: training loss 0.747 Epoch 41 iteration 0057/0187: training loss 0.744 Epoch 41 iteration 0058/0187: training loss 0.743 Epoch 41 iteration 0059/0187: training loss 0.744 Epoch 41 iteration 0060/0187: training loss 0.746 Epoch 41 iteration 0061/0187: training loss 0.746 Epoch 41 iteration 0062/0187: training loss 0.744 Epoch 41 iteration 0063/0187: training loss 0.746 Epoch 41 iteration 0064/0187: training loss 0.743 Epoch 41 iteration 0065/0187: training loss 0.743 Epoch 41 iteration 0066/0187: training loss 0.742 Epoch 41 iteration 0067/0187: training loss 0.745 Epoch 41 iteration 0068/0187: training loss 0.747 Epoch 41 iteration 0069/0187: training loss 0.747 Epoch 41 iteration 0070/0187: training loss 0.745 Epoch 41 iteration 0071/0187: training loss 0.746 Epoch 41 iteration 0072/0187: training loss 0.746 Epoch 41 iteration 0073/0187: training loss 0.746 Epoch 41 iteration 0074/0187: training loss 0.747 Epoch 41 iteration 0075/0187: training loss 0.747 Epoch 41 iteration 0076/0187: training loss 0.748 Epoch 41 iteration 0077/0187: training loss 0.747 Epoch 41 iteration 0078/0187: training loss 0.749 Epoch 41 iteration 0079/0187: training loss 0.749 Epoch 41 iteration 0080/0187: training loss 0.748 Epoch 41 iteration 0081/0187: training loss 0.748 Epoch 41 iteration 0082/0187: training loss 0.748 Epoch 41 iteration 0083/0187: training loss 0.747 Epoch 41 iteration 0084/0187: training loss 0.748 Epoch 41 iteration 0085/0187: training loss 0.748 Epoch 41 iteration 0086/0187: training loss 0.748 Epoch 41 iteration 0087/0187: training loss 0.749 Epoch 41 iteration 0088/0187: training loss 0.750 Epoch 41 iteration 0089/0187: training loss 0.749 Epoch 41 iteration 0090/0187: training loss 0.747 Epoch 41 iteration 0091/0187: training loss 0.748 Epoch 41 iteration 0092/0187: training loss 0.750 Epoch 41 iteration 0093/0187: training loss 0.749 Epoch 41 iteration 0094/0187: training loss 0.748 Epoch 41 iteration 0095/0187: training loss 0.747 Epoch 41 iteration 0096/0187: training loss 0.745 Epoch 41 iteration 0097/0187: training loss 0.745 Epoch 41 iteration 0098/0187: training loss 0.746 Epoch 41 iteration 0099/0187: training loss 0.745 Epoch 41 iteration 0100/0187: training loss 0.745 Epoch 41 iteration 0101/0187: training loss 0.743 Epoch 41 iteration 0102/0187: training loss 0.744 Epoch 41 iteration 0103/0187: training loss 0.745 Epoch 41 iteration 0104/0187: training loss 0.745 Epoch 41 iteration 0105/0187: training loss 0.748 Epoch 41 iteration 0106/0187: training loss 0.747 Epoch 41 iteration 0107/0187: training loss 0.747 Epoch 41 iteration 0108/0187: training loss 0.754 Epoch 41 iteration 0109/0187: training loss 0.756 Epoch 41 iteration 0110/0187: training loss 0.756 Epoch 41 iteration 0111/0187: training loss 0.755 Epoch 41 iteration 0112/0187: training loss 0.755 Epoch 41 iteration 0113/0187: training loss 0.757 Epoch 41 iteration 0114/0187: training loss 0.757 Epoch 41 iteration 0115/0187: training loss 0.756 Epoch 41 iteration 0116/0187: training loss 0.756 Epoch 41 iteration 0117/0187: training loss 0.757 Epoch 41 iteration 0118/0187: training loss 0.757 Epoch 41 iteration 0119/0187: training loss 0.759 Epoch 41 iteration 0120/0187: training loss 0.760 Epoch 41 iteration 0121/0187: training loss 0.759 Epoch 41 iteration 0122/0187: training loss 0.758 Epoch 41 iteration 0123/0187: training loss 0.757 Epoch 41 iteration 0124/0187: training loss 0.757 Epoch 41 iteration 0125/0187: training loss 0.758 Epoch 41 iteration 0126/0187: training loss 0.759 Epoch 41 iteration 0127/0187: training loss 0.759 Epoch 41 iteration 0128/0187: training loss 0.759 Epoch 41 iteration 0129/0187: training loss 0.758 Epoch 41 iteration 0130/0187: training loss 0.758 Epoch 41 iteration 0131/0187: training loss 0.757 Epoch 41 iteration 0132/0187: training loss 0.757 Epoch 41 iteration 0133/0187: training loss 0.757 Epoch 41 iteration 0134/0187: training loss 0.757 Epoch 41 iteration 0135/0187: training loss 0.756 Epoch 41 iteration 0136/0187: training loss 0.757 Epoch 41 iteration 0137/0187: training loss 0.756 Epoch 41 iteration 0138/0187: training loss 0.755 Epoch 41 iteration 0139/0187: training loss 0.754 Epoch 41 iteration 0140/0187: training loss 0.754 Epoch 41 iteration 0141/0187: training loss 0.754 Epoch 41 iteration 0142/0187: training loss 0.754 Epoch 41 iteration 0143/0187: training loss 0.755 Epoch 41 iteration 0144/0187: training loss 0.755 Epoch 41 iteration 0145/0187: training loss 0.754 Epoch 41 iteration 0146/0187: training loss 0.754 Epoch 41 iteration 0147/0187: training loss 0.754 Epoch 41 iteration 0148/0187: training loss 0.753 Epoch 41 iteration 0149/0187: training loss 0.754 Epoch 41 iteration 0150/0187: training loss 0.754 Epoch 41 iteration 0151/0187: training loss 0.753 Epoch 41 iteration 0152/0187: training loss 0.753 Epoch 41 iteration 0153/0187: training loss 0.753 Epoch 41 iteration 0154/0187: training loss 0.753 Epoch 41 iteration 0155/0187: training loss 0.754 Epoch 41 iteration 0156/0187: training loss 0.754 Epoch 41 iteration 0157/0187: training loss 0.753 Epoch 41 iteration 0158/0187: training loss 0.753 Epoch 41 iteration 0159/0187: training loss 0.752 Epoch 41 iteration 0160/0187: training loss 0.751 Epoch 41 iteration 0161/0187: training loss 0.750 Epoch 41 iteration 0162/0187: training loss 0.752 Epoch 41 iteration 0163/0187: training loss 0.752 Epoch 41 iteration 0164/0187: training loss 0.752 Epoch 41 iteration 0165/0187: training loss 0.754 Epoch 41 iteration 0166/0187: training loss 0.754 Epoch 41 iteration 0167/0187: training loss 0.754 Epoch 41 iteration 0168/0187: training loss 0.754 Epoch 41 iteration 0169/0187: training loss 0.753 Epoch 41 iteration 0170/0187: training loss 0.753 Epoch 41 iteration 0171/0187: training loss 0.753 Epoch 41 iteration 0172/0187: training loss 0.752 Epoch 41 iteration 0173/0187: training loss 0.751 Epoch 41 iteration 0174/0187: training loss 0.750 Epoch 41 iteration 0175/0187: training loss 0.751 Epoch 41 iteration 0176/0187: training loss 0.752 Epoch 41 iteration 0177/0187: training loss 0.753 Epoch 41 iteration 0178/0187: training loss 0.752 Epoch 41 iteration 0179/0187: training loss 0.753 Epoch 41 iteration 0180/0187: training loss 0.752 Epoch 41 iteration 0181/0187: training loss 0.752 Epoch 41 iteration 0182/0187: training loss 0.752 Epoch 41 iteration 0183/0187: training loss 0.751 Epoch 41 iteration 0184/0187: training loss 0.752 Epoch 41 iteration 0185/0187: training loss 0.752 Epoch 41 iteration 0186/0187: training loss 0.752 Epoch 41 iteration 0187/0187: training loss 0.753 Epoch 41 validation pixAcc: 0.872, mIoU: 0.380 Epoch 42 iteration 0001/0187: training loss 0.621 Epoch 42 iteration 0002/0187: training loss 0.652 Epoch 42 iteration 0003/0187: training loss 0.655 Epoch 42 iteration 0004/0187: training loss 0.697 Epoch 42 iteration 0005/0187: training loss 0.701 Epoch 42 iteration 0006/0187: training loss 0.696 Epoch 42 iteration 0007/0187: training loss 0.702 Epoch 42 iteration 0008/0187: training loss 0.702 Epoch 42 iteration 0009/0187: training loss 0.707 Epoch 42 iteration 0010/0187: training loss 0.709 Epoch 42 iteration 0011/0187: training loss 0.716 Epoch 42 iteration 0012/0187: training loss 0.707 Epoch 42 iteration 0013/0187: training loss 0.717 Epoch 42 iteration 0014/0187: training loss 0.727 Epoch 42 iteration 0015/0187: training loss 0.732 Epoch 42 iteration 0016/0187: training loss 0.727 Epoch 42 iteration 0017/0187: training loss 0.739 Epoch 42 iteration 0018/0187: training loss 0.740 Epoch 42 iteration 0019/0187: training loss 0.744 Epoch 42 iteration 0020/0187: training loss 0.746 Epoch 42 iteration 0021/0187: training loss 0.751 Epoch 42 iteration 0022/0187: training loss 0.748 Epoch 42 iteration 0023/0187: training loss 0.745 Epoch 42 iteration 0024/0187: training loss 0.738 Epoch 42 iteration 0025/0187: training loss 0.730 Epoch 42 iteration 0026/0187: training loss 0.730 Epoch 42 iteration 0027/0187: training loss 0.730 Epoch 42 iteration 0028/0187: training loss 0.730 Epoch 42 iteration 0029/0187: training loss 0.736 Epoch 42 iteration 0030/0187: training loss 0.738 Epoch 42 iteration 0031/0187: training loss 0.737 Epoch 42 iteration 0032/0187: training loss 0.735 Epoch 42 iteration 0033/0187: training loss 0.744 Epoch 42 iteration 0034/0187: training loss 0.745 Epoch 42 iteration 0035/0187: training loss 0.745 Epoch 42 iteration 0036/0187: training loss 0.747 Epoch 42 iteration 0037/0187: training loss 0.747 Epoch 42 iteration 0038/0187: training loss 0.747 Epoch 42 iteration 0039/0187: training loss 0.748 Epoch 42 iteration 0040/0187: training loss 0.746 Epoch 42 iteration 0041/0187: training loss 0.744 Epoch 42 iteration 0042/0187: training loss 0.746 Epoch 42 iteration 0043/0187: training loss 0.749 Epoch 42 iteration 0044/0187: training loss 0.743 Epoch 42 iteration 0045/0187: training loss 0.743 Epoch 42 iteration 0046/0187: training loss 0.748 Epoch 42 iteration 0047/0187: training loss 0.746 Epoch 42 iteration 0048/0187: training loss 0.745 Epoch 42 iteration 0049/0187: training loss 0.743 Epoch 42 iteration 0050/0187: training loss 0.741 Epoch 42 iteration 0051/0187: training loss 0.744 Epoch 42 iteration 0052/0187: training loss 0.746 Epoch 42 iteration 0053/0187: training loss 0.749 Epoch 42 iteration 0054/0187: training loss 0.748 Epoch 42 iteration 0055/0187: training loss 0.750 Epoch 42 iteration 0056/0187: training loss 0.751 Epoch 42 iteration 0057/0187: training loss 0.750 Epoch 42 iteration 0058/0187: training loss 0.748 Epoch 42 iteration 0059/0187: training loss 0.750 Epoch 42 iteration 0060/0187: training loss 0.747 Epoch 42 iteration 0061/0187: training loss 0.750 Epoch 42 iteration 0062/0187: training loss 0.749 Epoch 42 iteration 0063/0187: training loss 0.752 Epoch 42 iteration 0064/0187: training loss 0.752 Epoch 42 iteration 0065/0187: training loss 0.753 Epoch 42 iteration 0066/0187: training loss 0.751 Epoch 42 iteration 0067/0187: training loss 0.751 Epoch 42 iteration 0068/0187: training loss 0.749 Epoch 42 iteration 0069/0187: training loss 0.749 Epoch 42 iteration 0070/0187: training loss 0.746 Epoch 42 iteration 0071/0187: training loss 0.748 Epoch 42 iteration 0072/0187: training loss 0.747 Epoch 42 iteration 0073/0187: training loss 0.748 Epoch 42 iteration 0074/0187: training loss 0.748 Epoch 42 iteration 0075/0187: training loss 0.747 Epoch 42 iteration 0076/0187: training loss 0.746 Epoch 42 iteration 0077/0187: training loss 0.745 Epoch 42 iteration 0078/0187: training loss 0.743 Epoch 42 iteration 0079/0187: training loss 0.742 Epoch 42 iteration 0080/0187: training loss 0.742 Epoch 42 iteration 0081/0187: training loss 0.743 Epoch 42 iteration 0082/0187: training loss 0.742 Epoch 42 iteration 0083/0187: training loss 0.743 Epoch 42 iteration 0084/0187: training loss 0.743 Epoch 42 iteration 0085/0187: training loss 0.742 Epoch 42 iteration 0086/0187: training loss 0.740 Epoch 42 iteration 0087/0187: training loss 0.738 Epoch 42 iteration 0088/0187: training loss 0.736 Epoch 42 iteration 0089/0187: training loss 0.735 Epoch 42 iteration 0090/0187: training loss 0.737 Epoch 42 iteration 0091/0188: training loss 0.735 Epoch 42 iteration 0092/0188: training loss 0.735 Epoch 42 iteration 0093/0188: training loss 0.735 Epoch 42 iteration 0094/0188: training loss 0.735 Epoch 42 iteration 0095/0188: training loss 0.737 Epoch 42 iteration 0096/0188: training loss 0.738 Epoch 42 iteration 0097/0188: training loss 0.737 Epoch 42 iteration 0098/0188: training loss 0.738 Epoch 42 iteration 0099/0188: training loss 0.737 Epoch 42 iteration 0100/0188: training loss 0.736 Epoch 42 iteration 0101/0188: training loss 0.737 Epoch 42 iteration 0102/0188: training loss 0.738 Epoch 42 iteration 0103/0188: training loss 0.736 Epoch 42 iteration 0104/0188: training loss 0.736 Epoch 42 iteration 0105/0188: training loss 0.738 Epoch 42 iteration 0106/0188: training loss 0.738 Epoch 42 iteration 0107/0188: training loss 0.739 Epoch 42 iteration 0108/0188: training loss 0.738 Epoch 42 iteration 0109/0188: training loss 0.739 Epoch 42 iteration 0110/0188: training loss 0.740 Epoch 42 iteration 0111/0188: training loss 0.738 Epoch 42 iteration 0112/0188: training loss 0.739 Epoch 42 iteration 0113/0188: training loss 0.739 Epoch 42 iteration 0114/0188: training loss 0.740 Epoch 42 iteration 0115/0188: training loss 0.741 Epoch 42 iteration 0116/0188: training loss 0.740 Epoch 42 iteration 0117/0188: training loss 0.739 Epoch 42 iteration 0118/0188: training loss 0.740 Epoch 42 iteration 0119/0188: training loss 0.741 Epoch 42 iteration 0120/0188: training loss 0.740 Epoch 42 iteration 0121/0188: training loss 0.739 Epoch 42 iteration 0122/0188: training loss 0.741 Epoch 42 iteration 0123/0188: training loss 0.742 Epoch 42 iteration 0124/0188: training loss 0.741 Epoch 42 iteration 0125/0188: training loss 0.743 Epoch 42 iteration 0126/0188: training loss 0.741 Epoch 42 iteration 0127/0188: training loss 0.742 Epoch 42 iteration 0128/0188: training loss 0.742 Epoch 42 iteration 0129/0188: training loss 0.742 Epoch 42 iteration 0130/0188: training loss 0.745 Epoch 42 iteration 0131/0188: training loss 0.745 Epoch 42 iteration 0132/0188: training loss 0.746 Epoch 42 iteration 0133/0188: training loss 0.745 Epoch 42 iteration 0134/0188: training loss 0.745 Epoch 42 iteration 0135/0188: training loss 0.744 Epoch 42 iteration 0136/0188: training loss 0.745 Epoch 42 iteration 0137/0188: training loss 0.746 Epoch 42 iteration 0138/0188: training loss 0.746 Epoch 42 iteration 0139/0188: training loss 0.745 Epoch 42 iteration 0140/0188: training loss 0.745 Epoch 42 iteration 0141/0188: training loss 0.745 Epoch 42 iteration 0142/0188: training loss 0.746 Epoch 42 iteration 0143/0188: training loss 0.746 Epoch 42 iteration 0144/0188: training loss 0.747 Epoch 42 iteration 0145/0188: training loss 0.748 Epoch 42 iteration 0146/0188: training loss 0.748 Epoch 42 iteration 0147/0188: training loss 0.749 Epoch 42 iteration 0148/0188: training loss 0.748 Epoch 42 iteration 0149/0188: training loss 0.749 Epoch 42 iteration 0150/0188: training loss 0.747 Epoch 42 iteration 0151/0188: training loss 0.748 Epoch 42 iteration 0152/0188: training loss 0.748 Epoch 42 iteration 0153/0188: training loss 0.747 Epoch 42 iteration 0154/0188: training loss 0.747 Epoch 42 iteration 0155/0188: training loss 0.747 Epoch 42 iteration 0156/0188: training loss 0.747 Epoch 42 iteration 0157/0188: training loss 0.748 Epoch 42 iteration 0158/0188: training loss 0.748 Epoch 42 iteration 0159/0188: training loss 0.748 Epoch 42 iteration 0160/0188: training loss 0.748 Epoch 42 iteration 0161/0188: training loss 0.748 Epoch 42 iteration 0162/0188: training loss 0.747 Epoch 42 iteration 0163/0188: training loss 0.746 Epoch 42 iteration 0164/0188: training loss 0.747 Epoch 42 iteration 0165/0188: training loss 0.746 Epoch 42 iteration 0166/0188: training loss 0.745 Epoch 42 iteration 0167/0188: training loss 0.745 Epoch 42 iteration 0168/0188: training loss 0.745 Epoch 42 iteration 0169/0188: training loss 0.745 Epoch 42 iteration 0170/0188: training loss 0.745 Epoch 42 iteration 0171/0188: training loss 0.744 Epoch 42 iteration 0172/0188: training loss 0.744 Epoch 42 iteration 0173/0188: training loss 0.744 Epoch 42 iteration 0174/0188: training loss 0.744 Epoch 42 iteration 0175/0188: training loss 0.743 Epoch 42 iteration 0176/0188: training loss 0.743 Epoch 42 iteration 0177/0188: training loss 0.744 Epoch 42 iteration 0178/0188: training loss 0.745 Epoch 42 iteration 0179/0188: training loss 0.744 Epoch 42 iteration 0180/0188: training loss 0.744 Epoch 42 iteration 0181/0188: training loss 0.743 Epoch 42 iteration 0182/0188: training loss 0.744 Epoch 42 iteration 0183/0188: training loss 0.744 Epoch 42 iteration 0184/0188: training loss 0.744 Epoch 42 iteration 0185/0188: training loss 0.744 Epoch 42 iteration 0186/0188: training loss 0.743 Epoch 42 validation pixAcc: 0.874, mIoU: 0.380 Epoch 43 iteration 0001/0187: training loss 0.841 Epoch 43 iteration 0002/0187: training loss 0.796 Epoch 43 iteration 0003/0187: training loss 0.811 Epoch 43 iteration 0004/0187: training loss 0.862 Epoch 43 iteration 0005/0187: training loss 0.845 Epoch 43 iteration 0006/0187: training loss 0.860 Epoch 43 iteration 0007/0187: training loss 0.829 Epoch 43 iteration 0008/0187: training loss 0.812 Epoch 43 iteration 0009/0187: training loss 0.800 Epoch 43 iteration 0010/0187: training loss 0.786 Epoch 43 iteration 0011/0187: training loss 0.781 Epoch 43 iteration 0012/0187: training loss 0.779 Epoch 43 iteration 0013/0187: training loss 0.771 Epoch 43 iteration 0014/0187: training loss 0.775 Epoch 43 iteration 0015/0187: training loss 0.774 Epoch 43 iteration 0016/0187: training loss 0.774 Epoch 43 iteration 0017/0187: training loss 0.767 Epoch 43 iteration 0018/0187: training loss 0.767 Epoch 43 iteration 0019/0187: training loss 0.764 Epoch 43 iteration 0020/0187: training loss 0.767 Epoch 43 iteration 0021/0187: training loss 0.767 Epoch 43 iteration 0022/0187: training loss 0.762 Epoch 43 iteration 0023/0187: training loss 0.764 Epoch 43 iteration 0024/0187: training loss 0.760 Epoch 43 iteration 0025/0187: training loss 0.765 Epoch 43 iteration 0026/0187: training loss 0.758 Epoch 43 iteration 0027/0187: training loss 0.753 Epoch 43 iteration 0028/0187: training loss 0.751 Epoch 43 iteration 0029/0187: training loss 0.746 Epoch 43 iteration 0030/0187: training loss 0.743 Epoch 43 iteration 0031/0187: training loss 0.742 Epoch 43 iteration 0032/0187: training loss 0.740 Epoch 43 iteration 0033/0187: training loss 0.734 Epoch 43 iteration 0034/0187: training loss 0.731 Epoch 43 iteration 0035/0187: training loss 0.731 Epoch 43 iteration 0036/0187: training loss 0.725 Epoch 43 iteration 0037/0187: training loss 0.722 Epoch 43 iteration 0038/0187: training loss 0.722 Epoch 43 iteration 0039/0187: training loss 0.725 Epoch 43 iteration 0040/0187: training loss 0.730 Epoch 43 iteration 0041/0187: training loss 0.728 Epoch 43 iteration 0042/0187: training loss 0.725 Epoch 43 iteration 0043/0187: training loss 0.727 Epoch 43 iteration 0044/0187: training loss 0.722 Epoch 43 iteration 0045/0187: training loss 0.723 Epoch 43 iteration 0046/0187: training loss 0.725 Epoch 43 iteration 0047/0187: training loss 0.728 Epoch 43 iteration 0048/0187: training loss 0.727 Epoch 43 iteration 0049/0187: training loss 0.727 Epoch 43 iteration 0050/0187: training loss 0.733 Epoch 43 iteration 0051/0187: training loss 0.732 Epoch 43 iteration 0052/0187: training loss 0.731 Epoch 43 iteration 0053/0187: training loss 0.728 Epoch 43 iteration 0054/0187: training loss 0.729 Epoch 43 iteration 0055/0187: training loss 0.727 Epoch 43 iteration 0056/0187: training loss 0.729 Epoch 43 iteration 0057/0187: training loss 0.732 Epoch 43 iteration 0058/0187: training loss 0.732 Epoch 43 iteration 0059/0187: training loss 0.733 Epoch 43 iteration 0060/0187: training loss 0.731 Epoch 43 iteration 0061/0187: training loss 0.732 Epoch 43 iteration 0062/0187: training loss 0.730 Epoch 43 iteration 0063/0187: training loss 0.732 Epoch 43 iteration 0064/0187: training loss 0.729 Epoch 43 iteration 0065/0187: training loss 0.730 Epoch 43 iteration 0066/0187: training loss 0.730 Epoch 43 iteration 0067/0187: training loss 0.732 Epoch 43 iteration 0068/0187: training loss 0.733 Epoch 43 iteration 0069/0187: training loss 0.734 Epoch 43 iteration 0070/0187: training loss 0.733 Epoch 43 iteration 0071/0187: training loss 0.734 Epoch 43 iteration 0072/0187: training loss 0.733 Epoch 43 iteration 0073/0187: training loss 0.735 Epoch 43 iteration 0074/0187: training loss 0.733 Epoch 43 iteration 0075/0187: training loss 0.735 Epoch 43 iteration 0076/0187: training loss 0.733 Epoch 43 iteration 0077/0187: training loss 0.733 Epoch 43 iteration 0078/0187: training loss 0.734 Epoch 43 iteration 0079/0187: training loss 0.734 Epoch 43 iteration 0080/0187: training loss 0.734 Epoch 43 iteration 0081/0187: training loss 0.733 Epoch 43 iteration 0082/0187: training loss 0.732 Epoch 43 iteration 0083/0187: training loss 0.731 Epoch 43 iteration 0084/0187: training loss 0.730 Epoch 43 iteration 0085/0187: training loss 0.730 Epoch 43 iteration 0086/0187: training loss 0.730 Epoch 43 iteration 0087/0187: training loss 0.730 Epoch 43 iteration 0088/0187: training loss 0.732 Epoch 43 iteration 0089/0187: training loss 0.731 Epoch 43 iteration 0090/0187: training loss 0.730 Epoch 43 iteration 0091/0187: training loss 0.731 Epoch 43 iteration 0092/0187: training loss 0.730 Epoch 43 iteration 0093/0187: training loss 0.731 Epoch 43 iteration 0094/0187: training loss 0.732 Epoch 43 iteration 0095/0187: training loss 0.733 Epoch 43 iteration 0096/0187: training loss 0.731 Epoch 43 iteration 0097/0187: training loss 0.732 Epoch 43 iteration 0098/0187: training loss 0.731 Epoch 43 iteration 0099/0187: training loss 0.731 Epoch 43 iteration 0100/0187: training loss 0.731 Epoch 43 iteration 0101/0187: training loss 0.730 Epoch 43 iteration 0102/0187: training loss 0.730 Epoch 43 iteration 0103/0187: training loss 0.731 Epoch 43 iteration 0104/0187: training loss 0.731 Epoch 43 iteration 0105/0187: training loss 0.731 Epoch 43 iteration 0106/0187: training loss 0.731 Epoch 43 iteration 0107/0187: training loss 0.731 Epoch 43 iteration 0108/0187: training loss 0.732 Epoch 43 iteration 0109/0187: training loss 0.733 Epoch 43 iteration 0110/0187: training loss 0.732 Epoch 43 iteration 0111/0187: training loss 0.731 Epoch 43 iteration 0112/0187: training loss 0.731 Epoch 43 iteration 0113/0187: training loss 0.729 Epoch 43 iteration 0114/0187: training loss 0.729 Epoch 43 iteration 0115/0187: training loss 0.728 Epoch 43 iteration 0116/0187: training loss 0.727 Epoch 43 iteration 0117/0187: training loss 0.727 Epoch 43 iteration 0118/0187: training loss 0.728 Epoch 43 iteration 0119/0187: training loss 0.727 Epoch 43 iteration 0120/0187: training loss 0.728 Epoch 43 iteration 0121/0187: training loss 0.727 Epoch 43 iteration 0122/0187: training loss 0.727 Epoch 43 iteration 0123/0187: training loss 0.728 Epoch 43 iteration 0124/0187: training loss 0.727 Epoch 43 iteration 0125/0187: training loss 0.727 Epoch 43 iteration 0126/0187: training loss 0.726 Epoch 43 iteration 0127/0187: training loss 0.726 Epoch 43 iteration 0128/0187: training loss 0.726 Epoch 43 iteration 0129/0187: training loss 0.726 Epoch 43 iteration 0130/0187: training loss 0.725 Epoch 43 iteration 0131/0187: training loss 0.725 Epoch 43 iteration 0132/0187: training loss 0.725 Epoch 43 iteration 0133/0187: training loss 0.724 Epoch 43 iteration 0134/0187: training loss 0.724 Epoch 43 iteration 0135/0187: training loss 0.725 Epoch 43 iteration 0136/0187: training loss 0.725 Epoch 43 iteration 0137/0187: training loss 0.725 Epoch 43 iteration 0138/0187: training loss 0.724 Epoch 43 iteration 0139/0187: training loss 0.726 Epoch 43 iteration 0140/0187: training loss 0.726 Epoch 43 iteration 0141/0187: training loss 0.726 Epoch 43 iteration 0142/0187: training loss 0.725 Epoch 43 iteration 0143/0187: training loss 0.727 Epoch 43 iteration 0144/0187: training loss 0.726 Epoch 43 iteration 0145/0187: training loss 0.725 Epoch 43 iteration 0146/0187: training loss 0.725 Epoch 43 iteration 0147/0187: training loss 0.725 Epoch 43 iteration 0148/0187: training loss 0.725 Epoch 43 iteration 0149/0187: training loss 0.726 Epoch 43 iteration 0150/0187: training loss 0.727 Epoch 43 iteration 0151/0187: training loss 0.726 Epoch 43 iteration 0152/0187: training loss 0.726 Epoch 43 iteration 0153/0187: training loss 0.726 Epoch 43 iteration 0154/0187: training loss 0.725 Epoch 43 iteration 0155/0187: training loss 0.724 Epoch 43 iteration 0156/0187: training loss 0.724 Epoch 43 iteration 0157/0187: training loss 0.724 Epoch 43 iteration 0158/0187: training loss 0.724 Epoch 43 iteration 0159/0187: training loss 0.723 Epoch 43 iteration 0160/0187: training loss 0.726 Epoch 43 iteration 0161/0187: training loss 0.727 Epoch 43 iteration 0162/0187: training loss 0.726 Epoch 43 iteration 0163/0187: training loss 0.726 Epoch 43 iteration 0164/0187: training loss 0.725 Epoch 43 iteration 0165/0187: training loss 0.728 Epoch 43 iteration 0166/0187: training loss 0.728 Epoch 43 iteration 0167/0187: training loss 0.727 Epoch 43 iteration 0168/0187: training loss 0.727 Epoch 43 iteration 0169/0187: training loss 0.728 Epoch 43 iteration 0170/0187: training loss 0.727 Epoch 43 iteration 0171/0187: training loss 0.726 Epoch 43 iteration 0172/0187: training loss 0.726 Epoch 43 iteration 0173/0187: training loss 0.725 Epoch 43 iteration 0174/0187: training loss 0.724 Epoch 43 iteration 0175/0187: training loss 0.724 Epoch 43 iteration 0176/0187: training loss 0.723 Epoch 43 iteration 0177/0187: training loss 0.725 Epoch 43 iteration 0178/0187: training loss 0.725 Epoch 43 iteration 0179/0187: training loss 0.724 Epoch 43 iteration 0180/0187: training loss 0.725 Epoch 43 iteration 0181/0187: training loss 0.728 Epoch 43 iteration 0182/0187: training loss 0.728 Epoch 43 iteration 0183/0187: training loss 0.729 Epoch 43 iteration 0184/0187: training loss 0.728 Epoch 43 iteration 0185/0187: training loss 0.729 Epoch 43 iteration 0186/0187: training loss 0.729 Epoch 43 iteration 0187/0187: training loss 0.729 Epoch 43 validation pixAcc: 0.872, mIoU: 0.377 Epoch 44 iteration 0001/0187: training loss 0.773 Epoch 44 iteration 0002/0187: training loss 0.805 Epoch 44 iteration 0003/0187: training loss 0.735 Epoch 44 iteration 0004/0187: training loss 0.719 Epoch 44 iteration 0005/0187: training loss 0.746 Epoch 44 iteration 0006/0187: training loss 0.747 Epoch 44 iteration 0007/0187: training loss 0.733 Epoch 44 iteration 0008/0187: training loss 0.717 Epoch 44 iteration 0009/0187: training loss 0.730 Epoch 44 iteration 0010/0187: training loss 0.734 Epoch 44 iteration 0011/0187: training loss 0.722 Epoch 44 iteration 0012/0187: training loss 0.736 Epoch 44 iteration 0013/0187: training loss 0.750 Epoch 44 iteration 0014/0187: training loss 0.754 Epoch 44 iteration 0015/0187: training loss 0.750 Epoch 44 iteration 0016/0187: training loss 0.764 Epoch 44 iteration 0017/0187: training loss 0.759 Epoch 44 iteration 0018/0187: training loss 0.762 Epoch 44 iteration 0019/0187: training loss 0.769 Epoch 44 iteration 0020/0187: training loss 0.765 Epoch 44 iteration 0021/0187: training loss 0.768 Epoch 44 iteration 0022/0187: training loss 0.764 Epoch 44 iteration 0023/0187: training loss 0.765 Epoch 44 iteration 0024/0187: training loss 0.757 Epoch 44 iteration 0025/0187: training loss 0.764 Epoch 44 iteration 0026/0187: training loss 0.764 Epoch 44 iteration 0027/0187: training loss 0.756 Epoch 44 iteration 0028/0187: training loss 0.753 Epoch 44 iteration 0029/0187: training loss 0.750 Epoch 44 iteration 0030/0187: training loss 0.746 Epoch 44 iteration 0031/0187: training loss 0.746 Epoch 44 iteration 0032/0187: training loss 0.751 Epoch 44 iteration 0033/0187: training loss 0.747 Epoch 44 iteration 0034/0187: training loss 0.745 Epoch 44 iteration 0035/0187: training loss 0.744 Epoch 44 iteration 0036/0187: training loss 0.743 Epoch 44 iteration 0037/0187: training loss 0.743 Epoch 44 iteration 0038/0187: training loss 0.751 Epoch 44 iteration 0039/0187: training loss 0.755 Epoch 44 iteration 0040/0187: training loss 0.753 Epoch 44 iteration 0041/0187: training loss 0.752 Epoch 44 iteration 0042/0187: training loss 0.753 Epoch 44 iteration 0043/0187: training loss 0.750 Epoch 44 iteration 0044/0187: training loss 0.750 Epoch 44 iteration 0045/0187: training loss 0.753 Epoch 44 iteration 0046/0187: training loss 0.752 Epoch 44 iteration 0047/0187: training loss 0.752 Epoch 44 iteration 0048/0187: training loss 0.754 Epoch 44 iteration 0049/0187: training loss 0.752 Epoch 44 iteration 0050/0187: training loss 0.752 Epoch 44 iteration 0051/0187: training loss 0.751 Epoch 44 iteration 0052/0187: training loss 0.751 Epoch 44 iteration 0053/0187: training loss 0.752 Epoch 44 iteration 0054/0187: training loss 0.752 Epoch 44 iteration 0055/0187: training loss 0.751 Epoch 44 iteration 0056/0187: training loss 0.751 Epoch 44 iteration 0057/0187: training loss 0.751 Epoch 44 iteration 0058/0187: training loss 0.754 Epoch 44 iteration 0059/0187: training loss 0.750 Epoch 44 iteration 0060/0187: training loss 0.749 Epoch 44 iteration 0061/0187: training loss 0.747 Epoch 44 iteration 0062/0187: training loss 0.749 Epoch 44 iteration 0063/0187: training loss 0.749 Epoch 44 iteration 0064/0187: training loss 0.747 Epoch 44 iteration 0065/0187: training loss 0.747 Epoch 44 iteration 0066/0187: training loss 0.750 Epoch 44 iteration 0067/0187: training loss 0.749 Epoch 44 iteration 0068/0187: training loss 0.748 Epoch 44 iteration 0069/0187: training loss 0.750 Epoch 44 iteration 0070/0187: training loss 0.750 Epoch 44 iteration 0071/0187: training loss 0.750 Epoch 44 iteration 0072/0187: training loss 0.751 Epoch 44 iteration 0073/0187: training loss 0.750 Epoch 44 iteration 0074/0187: training loss 0.750 Epoch 44 iteration 0075/0187: training loss 0.750 Epoch 44 iteration 0076/0187: training loss 0.750 Epoch 44 iteration 0077/0187: training loss 0.755 Epoch 44 iteration 0078/0187: training loss 0.756 Epoch 44 iteration 0079/0187: training loss 0.757 Epoch 44 iteration 0080/0187: training loss 0.757 Epoch 44 iteration 0081/0187: training loss 0.758 Epoch 44 iteration 0082/0187: training loss 0.757 Epoch 44 iteration 0083/0187: training loss 0.756 Epoch 44 iteration 0084/0187: training loss 0.756 Epoch 44 iteration 0085/0187: training loss 0.756 Epoch 44 iteration 0086/0187: training loss 0.756 Epoch 44 iteration 0087/0187: training loss 0.756 Epoch 44 iteration 0088/0187: training loss 0.755 Epoch 44 iteration 0089/0187: training loss 0.754 Epoch 44 iteration 0090/0187: training loss 0.755 Epoch 44 iteration 0091/0188: training loss 0.756 Epoch 44 iteration 0092/0188: training loss 0.757 Epoch 44 iteration 0093/0188: training loss 0.756 Epoch 44 iteration 0094/0188: training loss 0.755 Epoch 44 iteration 0095/0188: training loss 0.753 Epoch 44 iteration 0096/0188: training loss 0.753 Epoch 44 iteration 0097/0188: training loss 0.751 Epoch 44 iteration 0098/0188: training loss 0.753 Epoch 44 iteration 0099/0188: training loss 0.753 Epoch 44 iteration 0100/0188: training loss 0.753 Epoch 44 iteration 0101/0188: training loss 0.752 Epoch 44 iteration 0102/0188: training loss 0.753 Epoch 44 iteration 0103/0188: training loss 0.752 Epoch 44 iteration 0104/0188: training loss 0.751 Epoch 44 iteration 0105/0188: training loss 0.752 Epoch 44 iteration 0106/0188: training loss 0.753 Epoch 44 iteration 0107/0188: training loss 0.753 Epoch 44 iteration 0108/0188: training loss 0.755 Epoch 44 iteration 0109/0188: training loss 0.754 Epoch 44 iteration 0110/0188: training loss 0.753 Epoch 44 iteration 0111/0188: training loss 0.751 Epoch 44 iteration 0112/0188: training loss 0.752 Epoch 44 iteration 0113/0188: training loss 0.752 Epoch 44 iteration 0114/0188: training loss 0.752 Epoch 44 iteration 0115/0188: training loss 0.750 Epoch 44 iteration 0116/0188: training loss 0.750 Epoch 44 iteration 0117/0188: training loss 0.750 Epoch 44 iteration 0118/0188: training loss 0.750 Epoch 44 iteration 0119/0188: training loss 0.749 Epoch 44 iteration 0120/0188: training loss 0.748 Epoch 44 iteration 0121/0188: training loss 0.749 Epoch 44 iteration 0122/0188: training loss 0.750 Epoch 44 iteration 0123/0188: training loss 0.750 Epoch 44 iteration 0124/0188: training loss 0.750 Epoch 44 iteration 0125/0188: training loss 0.750 Epoch 44 iteration 0126/0188: training loss 0.750 Epoch 44 iteration 0127/0188: training loss 0.751 Epoch 44 iteration 0128/0188: training loss 0.749 Epoch 44 iteration 0129/0188: training loss 0.748 Epoch 44 iteration 0130/0188: training loss 0.751 Epoch 44 iteration 0131/0188: training loss 0.751 Epoch 44 iteration 0132/0188: training loss 0.753 Epoch 44 iteration 0133/0188: training loss 0.752 Epoch 44 iteration 0134/0188: training loss 0.753 Epoch 44 iteration 0135/0188: training loss 0.753 Epoch 44 iteration 0136/0188: training loss 0.753 Epoch 44 iteration 0137/0188: training loss 0.752 Epoch 44 iteration 0138/0188: training loss 0.750 Epoch 44 iteration 0139/0188: training loss 0.751 Epoch 44 iteration 0140/0188: training loss 0.749 Epoch 44 iteration 0141/0188: training loss 0.749 Epoch 44 iteration 0142/0188: training loss 0.752 Epoch 44 iteration 0143/0188: training loss 0.752 Epoch 44 iteration 0144/0188: training loss 0.751 Epoch 44 iteration 0145/0188: training loss 0.750 Epoch 44 iteration 0146/0188: training loss 0.750 Epoch 44 iteration 0147/0188: training loss 0.748 Epoch 44 iteration 0148/0188: training loss 0.748 Epoch 44 iteration 0149/0188: training loss 0.749 Epoch 44 iteration 0150/0188: training loss 0.750 Epoch 44 iteration 0151/0188: training loss 0.749 Epoch 44 iteration 0152/0188: training loss 0.749 Epoch 44 iteration 0153/0188: training loss 0.749 Epoch 44 iteration 0154/0188: training loss 0.749 Epoch 44 iteration 0155/0188: training loss 0.749 Epoch 44 iteration 0156/0188: training loss 0.748 Epoch 44 iteration 0157/0188: training loss 0.748 Epoch 44 iteration 0158/0188: training loss 0.748 Epoch 44 iteration 0159/0188: training loss 0.750 Epoch 44 iteration 0160/0188: training loss 0.750 Epoch 44 iteration 0161/0188: training loss 0.750 Epoch 44 iteration 0162/0188: training loss 0.751 Epoch 44 iteration 0163/0188: training loss 0.749 Epoch 44 iteration 0164/0188: training loss 0.750 Epoch 44 iteration 0165/0188: training loss 0.749 Epoch 44 iteration 0166/0188: training loss 0.749 Epoch 44 iteration 0167/0188: training loss 0.749 Epoch 44 iteration 0168/0188: training loss 0.750 Epoch 44 iteration 0169/0188: training loss 0.750 Epoch 44 iteration 0170/0188: training loss 0.751 Epoch 44 iteration 0171/0188: training loss 0.751 Epoch 44 iteration 0172/0188: training loss 0.750 Epoch 44 iteration 0173/0188: training loss 0.750 Epoch 44 iteration 0174/0188: training loss 0.750 Epoch 44 iteration 0175/0188: training loss 0.750 Epoch 44 iteration 0176/0188: training loss 0.750 Epoch 44 iteration 0177/0188: training loss 0.751 Epoch 44 iteration 0178/0188: training loss 0.752 Epoch 44 iteration 0179/0188: training loss 0.751 Epoch 44 iteration 0180/0188: training loss 0.752 Epoch 44 iteration 0181/0188: training loss 0.752 Epoch 44 iteration 0182/0188: training loss 0.752 Epoch 44 iteration 0183/0188: training loss 0.752 Epoch 44 iteration 0184/0188: training loss 0.752 Epoch 44 iteration 0185/0188: training loss 0.752 Epoch 44 iteration 0186/0188: training loss 0.752 Epoch 44 validation pixAcc: 0.873, mIoU: 0.375 Epoch 45 iteration 0001/0187: training loss 0.603 Epoch 45 iteration 0002/0187: training loss 0.659 Epoch 45 iteration 0003/0187: training loss 0.666 Epoch 45 iteration 0004/0187: training loss 0.726 Epoch 45 iteration 0005/0187: training loss 0.719 Epoch 45 iteration 0006/0187: training loss 0.721 Epoch 45 iteration 0007/0187: training loss 0.715 Epoch 45 iteration 0008/0187: training loss 0.726 Epoch 45 iteration 0009/0187: training loss 0.725 Epoch 45 iteration 0010/0187: training loss 0.745 Epoch 45 iteration 0011/0187: training loss 0.763 Epoch 45 iteration 0012/0187: training loss 0.773 Epoch 45 iteration 0013/0187: training loss 0.766 Epoch 45 iteration 0014/0187: training loss 0.769 Epoch 45 iteration 0015/0187: training loss 0.770 Epoch 45 iteration 0016/0187: training loss 0.777 Epoch 45 iteration 0017/0187: training loss 0.770 Epoch 45 iteration 0018/0187: training loss 0.773 Epoch 45 iteration 0019/0187: training loss 0.770 Epoch 45 iteration 0020/0187: training loss 0.758 Epoch 45 iteration 0021/0187: training loss 0.760 Epoch 45 iteration 0022/0187: training loss 0.752 Epoch 45 iteration 0023/0187: training loss 0.749 Epoch 45 iteration 0024/0187: training loss 0.742 Epoch 45 iteration 0025/0187: training loss 0.736 Epoch 45 iteration 0026/0187: training loss 0.742 Epoch 45 iteration 0027/0187: training loss 0.746 Epoch 45 iteration 0028/0187: training loss 0.752 Epoch 45 iteration 0029/0187: training loss 0.751 Epoch 45 iteration 0030/0187: training loss 0.747 Epoch 45 iteration 0031/0187: training loss 0.751 Epoch 45 iteration 0032/0187: training loss 0.750 Epoch 45 iteration 0033/0187: training loss 0.754 Epoch 45 iteration 0034/0187: training loss 0.751 Epoch 45 iteration 0035/0187: training loss 0.750 Epoch 45 iteration 0036/0187: training loss 0.747 Epoch 45 iteration 0037/0187: training loss 0.750 Epoch 45 iteration 0038/0187: training loss 0.746 Epoch 45 iteration 0039/0187: training loss 0.744 Epoch 45 iteration 0040/0187: training loss 0.747 Epoch 45 iteration 0041/0187: training loss 0.746 Epoch 45 iteration 0042/0187: training loss 0.755 Epoch 45 iteration 0043/0187: training loss 0.751 Epoch 45 iteration 0044/0187: training loss 0.750 Epoch 45 iteration 0045/0187: training loss 0.748 Epoch 45 iteration 0046/0187: training loss 0.745 Epoch 45 iteration 0047/0187: training loss 0.747 Epoch 45 iteration 0048/0187: training loss 0.749 Epoch 45 iteration 0049/0187: training loss 0.747 Epoch 45 iteration 0050/0187: training loss 0.747 Epoch 45 iteration 0051/0187: training loss 0.746 Epoch 45 iteration 0052/0187: training loss 0.744 Epoch 45 iteration 0053/0187: training loss 0.749 Epoch 45 iteration 0054/0187: training loss 0.747 Epoch 45 iteration 0055/0187: training loss 0.750 Epoch 45 iteration 0056/0187: training loss 0.751 Epoch 45 iteration 0057/0187: training loss 0.750 Epoch 45 iteration 0058/0187: training loss 0.749 Epoch 45 iteration 0059/0187: training loss 0.751 Epoch 45 iteration 0060/0187: training loss 0.750 Epoch 45 iteration 0061/0187: training loss 0.753 Epoch 45 iteration 0062/0187: training loss 0.752 Epoch 45 iteration 0063/0187: training loss 0.754 Epoch 45 iteration 0064/0187: training loss 0.753 Epoch 45 iteration 0065/0187: training loss 0.752 Epoch 45 iteration 0066/0187: training loss 0.747 Epoch 45 iteration 0067/0187: training loss 0.747 Epoch 45 iteration 0068/0187: training loss 0.749 Epoch 45 iteration 0069/0187: training loss 0.750 Epoch 45 iteration 0070/0187: training loss 0.757 Epoch 45 iteration 0071/0187: training loss 0.758 Epoch 45 iteration 0072/0187: training loss 0.759 Epoch 45 iteration 0073/0187: training loss 0.762 Epoch 45 iteration 0074/0187: training loss 0.761 Epoch 45 iteration 0075/0187: training loss 0.760 Epoch 45 iteration 0076/0187: training loss 0.764 Epoch 45 iteration 0077/0187: training loss 0.765 Epoch 45 iteration 0078/0187: training loss 0.765 Epoch 45 iteration 0079/0187: training loss 0.763 Epoch 45 iteration 0080/0187: training loss 0.762 Epoch 45 iteration 0081/0187: training loss 0.764 Epoch 45 iteration 0082/0187: training loss 0.763 Epoch 45 iteration 0083/0187: training loss 0.763 Epoch 45 iteration 0084/0187: training loss 0.766 Epoch 45 iteration 0085/0187: training loss 0.764 Epoch 45 iteration 0086/0187: training loss 0.762 Epoch 45 iteration 0087/0187: training loss 0.762 Epoch 45 iteration 0088/0187: training loss 0.760 Epoch 45 iteration 0089/0187: training loss 0.763 Epoch 45 iteration 0090/0187: training loss 0.762 Epoch 45 iteration 0091/0187: training loss 0.759 Epoch 45 iteration 0092/0187: training loss 0.759 Epoch 45 iteration 0093/0187: training loss 0.760 Epoch 45 iteration 0094/0187: training loss 0.762 Epoch 45 iteration 0095/0187: training loss 0.760 Epoch 45 iteration 0096/0187: training loss 0.758 Epoch 45 iteration 0097/0187: training loss 0.758 Epoch 45 iteration 0098/0187: training loss 0.757 Epoch 45 iteration 0099/0187: training loss 0.758 Epoch 45 iteration 0100/0187: training loss 0.758 Epoch 45 iteration 0101/0187: training loss 0.759 Epoch 45 iteration 0102/0187: training loss 0.759 Epoch 45 iteration 0103/0187: training loss 0.758 Epoch 45 iteration 0104/0187: training loss 0.757 Epoch 45 iteration 0105/0187: training loss 0.758 Epoch 45 iteration 0106/0187: training loss 0.759 Epoch 45 iteration 0107/0187: training loss 0.758 Epoch 45 iteration 0108/0187: training loss 0.758 Epoch 45 iteration 0109/0187: training loss 0.756 Epoch 45 iteration 0110/0187: training loss 0.756 Epoch 45 iteration 0111/0187: training loss 0.755 Epoch 45 iteration 0112/0187: training loss 0.754 Epoch 45 iteration 0113/0187: training loss 0.754 Epoch 45 iteration 0114/0187: training loss 0.753 Epoch 45 iteration 0115/0187: training loss 0.753 Epoch 45 iteration 0116/0187: training loss 0.751 Epoch 45 iteration 0117/0187: training loss 0.751 Epoch 45 iteration 0118/0187: training loss 0.752 Epoch 45 iteration 0119/0187: training loss 0.753 Epoch 45 iteration 0120/0187: training loss 0.753 Epoch 45 iteration 0121/0187: training loss 0.754 Epoch 45 iteration 0122/0187: training loss 0.753 Epoch 45 iteration 0123/0187: training loss 0.754 Epoch 45 iteration 0124/0187: training loss 0.754 Epoch 45 iteration 0125/0187: training loss 0.755 Epoch 45 iteration 0126/0187: training loss 0.755 Epoch 45 iteration 0127/0187: training loss 0.755 Epoch 45 iteration 0128/0187: training loss 0.756 Epoch 45 iteration 0129/0187: training loss 0.756 Epoch 45 iteration 0130/0187: training loss 0.756 Epoch 45 iteration 0131/0187: training loss 0.756 Epoch 45 iteration 0132/0187: training loss 0.755 Epoch 45 iteration 0133/0187: training loss 0.755 Epoch 45 iteration 0134/0187: training loss 0.754 Epoch 45 iteration 0135/0187: training loss 0.754 Epoch 45 iteration 0136/0187: training loss 0.756 Epoch 45 iteration 0137/0187: training loss 0.757 Epoch 45 iteration 0138/0187: training loss 0.757 Epoch 45 iteration 0139/0187: training loss 0.758 Epoch 45 iteration 0140/0187: training loss 0.757 Epoch 45 iteration 0141/0187: training loss 0.756 Epoch 45 iteration 0142/0187: training loss 0.757 Epoch 45 iteration 0143/0187: training loss 0.757 Epoch 45 iteration 0144/0187: training loss 0.757 Epoch 45 iteration 0145/0187: training loss 0.757 Epoch 45 iteration 0146/0187: training loss 0.756 Epoch 45 iteration 0147/0187: training loss 0.754 Epoch 45 iteration 0148/0187: training loss 0.754 Epoch 45 iteration 0149/0187: training loss 0.753 Epoch 45 iteration 0150/0187: training loss 0.753 Epoch 45 iteration 0151/0187: training loss 0.753 Epoch 45 iteration 0152/0187: training loss 0.751 Epoch 45 iteration 0153/0187: training loss 0.750 Epoch 45 iteration 0154/0187: training loss 0.749 Epoch 45 iteration 0155/0187: training loss 0.750 Epoch 45 iteration 0156/0187: training loss 0.750 Epoch 45 iteration 0157/0187: training loss 0.751 Epoch 45 iteration 0158/0187: training loss 0.751 Epoch 45 iteration 0159/0187: training loss 0.750 Epoch 45 iteration 0160/0187: training loss 0.750 Epoch 45 iteration 0161/0187: training loss 0.749 Epoch 45 iteration 0162/0187: training loss 0.749 Epoch 45 iteration 0163/0187: training loss 0.749 Epoch 45 iteration 0164/0187: training loss 0.749 Epoch 45 iteration 0165/0187: training loss 0.748 Epoch 45 iteration 0166/0187: training loss 0.747 Epoch 45 iteration 0167/0187: training loss 0.747 Epoch 45 iteration 0168/0187: training loss 0.748 Epoch 45 iteration 0169/0187: training loss 0.749 Epoch 45 iteration 0170/0187: training loss 0.748 Epoch 45 iteration 0171/0187: training loss 0.747 Epoch 45 iteration 0172/0187: training loss 0.748 Epoch 45 iteration 0173/0187: training loss 0.748 Epoch 45 iteration 0174/0187: training loss 0.747 Epoch 45 iteration 0175/0187: training loss 0.747 Epoch 45 iteration 0176/0187: training loss 0.747 Epoch 45 iteration 0177/0187: training loss 0.748 Epoch 45 iteration 0178/0187: training loss 0.748 Epoch 45 iteration 0179/0187: training loss 0.748 Epoch 45 iteration 0180/0187: training loss 0.748 Epoch 45 iteration 0181/0187: training loss 0.747 Epoch 45 iteration 0182/0187: training loss 0.747 Epoch 45 iteration 0183/0187: training loss 0.748 Epoch 45 iteration 0184/0187: training loss 0.748 Epoch 45 iteration 0185/0187: training loss 0.749 Epoch 45 iteration 0186/0187: training loss 0.749 Epoch 45 iteration 0187/0187: training loss 0.750 Epoch 45 validation pixAcc: 0.873, mIoU: 0.382 Epoch 46 iteration 0001/0187: training loss 0.638 Epoch 46 iteration 0002/0187: training loss 0.692 Epoch 46 iteration 0003/0187: training loss 0.793 Epoch 46 iteration 0004/0187: training loss 0.789 Epoch 46 iteration 0005/0187: training loss 0.794 Epoch 46 iteration 0006/0187: training loss 0.778 Epoch 46 iteration 0007/0187: training loss 0.775 Epoch 46 iteration 0008/0187: training loss 0.766 Epoch 46 iteration 0009/0187: training loss 0.775 Epoch 46 iteration 0010/0187: training loss 0.794 Epoch 46 iteration 0011/0187: training loss 0.779 Epoch 46 iteration 0012/0187: training loss 0.772 Epoch 46 iteration 0013/0187: training loss 0.780 Epoch 46 iteration 0014/0187: training loss 0.784 Epoch 46 iteration 0015/0187: training loss 0.783 Epoch 46 iteration 0016/0187: training loss 0.781 Epoch 46 iteration 0017/0187: training loss 0.776 Epoch 46 iteration 0018/0187: training loss 0.783 Epoch 46 iteration 0019/0187: training loss 0.790 Epoch 46 iteration 0020/0187: training loss 0.784 Epoch 46 iteration 0021/0187: training loss 0.781 Epoch 46 iteration 0022/0187: training loss 0.787 Epoch 46 iteration 0023/0187: training loss 0.786 Epoch 46 iteration 0024/0187: training loss 0.788 Epoch 46 iteration 0025/0187: training loss 0.787 Epoch 46 iteration 0026/0187: training loss 0.785 Epoch 46 iteration 0027/0187: training loss 0.779 Epoch 46 iteration 0028/0187: training loss 0.775 Epoch 46 iteration 0029/0187: training loss 0.769 Epoch 46 iteration 0030/0187: training loss 0.771 Epoch 46 iteration 0031/0187: training loss 0.771 Epoch 46 iteration 0032/0187: training loss 0.769 Epoch 46 iteration 0033/0187: training loss 0.775 Epoch 46 iteration 0034/0187: training loss 0.773 Epoch 46 iteration 0035/0187: training loss 0.767 Epoch 46 iteration 0036/0187: training loss 0.764 Epoch 46 iteration 0037/0187: training loss 0.761 Epoch 46 iteration 0038/0187: training loss 0.760 Epoch 46 iteration 0039/0187: training loss 0.762 Epoch 46 iteration 0040/0187: training loss 0.759 Epoch 46 iteration 0041/0187: training loss 0.754 Epoch 46 iteration 0042/0187: training loss 0.757 Epoch 46 iteration 0043/0187: training loss 0.757 Epoch 46 iteration 0044/0187: training loss 0.754 Epoch 46 iteration 0045/0187: training loss 0.752 Epoch 46 iteration 0046/0187: training loss 0.753 Epoch 46 iteration 0047/0187: training loss 0.752 Epoch 46 iteration 0048/0187: training loss 0.752 Epoch 46 iteration 0049/0187: training loss 0.748 Epoch 46 iteration 0050/0187: training loss 0.746 Epoch 46 iteration 0051/0187: training loss 0.750 Epoch 46 iteration 0052/0187: training loss 0.748 Epoch 46 iteration 0053/0187: training loss 0.747 Epoch 46 iteration 0054/0187: training loss 0.745 Epoch 46 iteration 0055/0187: training loss 0.746 Epoch 46 iteration 0056/0187: training loss 0.749 Epoch 46 iteration 0057/0187: training loss 0.747 Epoch 46 iteration 0058/0187: training loss 0.746 Epoch 46 iteration 0059/0187: training loss 0.746 Epoch 46 iteration 0060/0187: training loss 0.746 Epoch 46 iteration 0061/0187: training loss 0.745 Epoch 46 iteration 0062/0187: training loss 0.745 Epoch 46 iteration 0063/0187: training loss 0.746 Epoch 46 iteration 0064/0187: training loss 0.747 Epoch 46 iteration 0065/0187: training loss 0.744 Epoch 46 iteration 0066/0187: training loss 0.744 Epoch 46 iteration 0067/0187: training loss 0.745 Epoch 46 iteration 0068/0187: training loss 0.745 Epoch 46 iteration 0069/0187: training loss 0.744 Epoch 46 iteration 0070/0187: training loss 0.744 Epoch 46 iteration 0071/0187: training loss 0.742 Epoch 46 iteration 0072/0187: training loss 0.740 Epoch 46 iteration 0073/0187: training loss 0.743 Epoch 46 iteration 0074/0187: training loss 0.743 Epoch 46 iteration 0075/0187: training loss 0.742 Epoch 46 iteration 0076/0187: training loss 0.742 Epoch 46 iteration 0077/0187: training loss 0.742 Epoch 46 iteration 0078/0187: training loss 0.741 Epoch 46 iteration 0079/0187: training loss 0.742 Epoch 46 iteration 0080/0187: training loss 0.741 Epoch 46 iteration 0081/0187: training loss 0.742 Epoch 46 iteration 0082/0187: training loss 0.742 Epoch 46 iteration 0083/0187: training loss 0.742 Epoch 46 iteration 0084/0187: training loss 0.740 Epoch 46 iteration 0085/0187: training loss 0.740 Epoch 46 iteration 0086/0187: training loss 0.742 Epoch 46 iteration 0087/0187: training loss 0.740 Epoch 46 iteration 0088/0187: training loss 0.741 Epoch 46 iteration 0089/0187: training loss 0.744 Epoch 46 iteration 0090/0187: training loss 0.743 Epoch 46 iteration 0091/0188: training loss 0.742 Epoch 46 iteration 0092/0188: training loss 0.743 Epoch 46 iteration 0093/0188: training loss 0.742 Epoch 46 iteration 0094/0188: training loss 0.743 Epoch 46 iteration 0095/0188: training loss 0.743 Epoch 46 iteration 0096/0188: training loss 0.742 Epoch 46 iteration 0097/0188: training loss 0.743 Epoch 46 iteration 0098/0188: training loss 0.743 Epoch 46 iteration 0099/0188: training loss 0.742 Epoch 46 iteration 0100/0188: training loss 0.742 Epoch 46 iteration 0101/0188: training loss 0.742 Epoch 46 iteration 0102/0188: training loss 0.741 Epoch 46 iteration 0103/0188: training loss 0.740 Epoch 46 iteration 0104/0188: training loss 0.741 Epoch 46 iteration 0105/0188: training loss 0.742 Epoch 46 iteration 0106/0188: training loss 0.740 Epoch 46 iteration 0107/0188: training loss 0.740 Epoch 46 iteration 0108/0188: training loss 0.741 Epoch 46 iteration 0109/0188: training loss 0.741 Epoch 46 iteration 0110/0188: training loss 0.740 Epoch 46 iteration 0111/0188: training loss 0.740 Epoch 46 iteration 0112/0188: training loss 0.738 Epoch 46 iteration 0113/0188: training loss 0.739 Epoch 46 iteration 0114/0188: training loss 0.739 Epoch 46 iteration 0115/0188: training loss 0.739 Epoch 46 iteration 0116/0188: training loss 0.738 Epoch 46 iteration 0117/0188: training loss 0.738 Epoch 46 iteration 0118/0188: training loss 0.737 Epoch 46 iteration 0119/0188: training loss 0.738 Epoch 46 iteration 0120/0188: training loss 0.737 Epoch 46 iteration 0121/0188: training loss 0.738 Epoch 46 iteration 0122/0188: training loss 0.738 Epoch 46 iteration 0123/0188: training loss 0.738 Epoch 46 iteration 0124/0188: training loss 0.739 Epoch 46 iteration 0125/0188: training loss 0.740 Epoch 46 iteration 0126/0188: training loss 0.739 Epoch 46 iteration 0127/0188: training loss 0.740 Epoch 46 iteration 0128/0188: training loss 0.740 Epoch 46 iteration 0129/0188: training loss 0.740 Epoch 46 iteration 0130/0188: training loss 0.740 Epoch 46 iteration 0131/0188: training loss 0.739 Epoch 46 iteration 0132/0188: training loss 0.740 Epoch 46 iteration 0133/0188: training loss 0.741 Epoch 46 iteration 0134/0188: training loss 0.740 Epoch 46 iteration 0135/0188: training loss 0.741 Epoch 46 iteration 0136/0188: training loss 0.740 Epoch 46 iteration 0137/0188: training loss 0.741 Epoch 46 iteration 0138/0188: training loss 0.740 Epoch 46 iteration 0139/0188: training loss 0.740 Epoch 46 iteration 0140/0188: training loss 0.740 Epoch 46 iteration 0141/0188: training loss 0.740 Epoch 46 iteration 0142/0188: training loss 0.739 Epoch 46 iteration 0143/0188: training loss 0.738 Epoch 46 iteration 0144/0188: training loss 0.739 Epoch 46 iteration 0145/0188: training loss 0.740 Epoch 46 iteration 0146/0188: training loss 0.740 Epoch 46 iteration 0147/0188: training loss 0.739 Epoch 46 iteration 0148/0188: training loss 0.739 Epoch 46 iteration 0149/0188: training loss 0.738 Epoch 46 iteration 0150/0188: training loss 0.737 Epoch 46 iteration 0151/0188: training loss 0.737 Epoch 46 iteration 0152/0188: training loss 0.738 Epoch 46 iteration 0153/0188: training loss 0.738 Epoch 46 iteration 0154/0188: training loss 0.737 Epoch 46 iteration 0155/0188: training loss 0.737 Epoch 46 iteration 0156/0188: training loss 0.736 Epoch 46 iteration 0157/0188: training loss 0.736 Epoch 46 iteration 0158/0188: training loss 0.735 Epoch 46 iteration 0159/0188: training loss 0.735 Epoch 46 iteration 0160/0188: training loss 0.735 Epoch 46 iteration 0161/0188: training loss 0.737 Epoch 46 iteration 0162/0188: training loss 0.737 Epoch 46 iteration 0163/0188: training loss 0.739 Epoch 46 iteration 0164/0188: training loss 0.740 Epoch 46 iteration 0165/0188: training loss 0.739 Epoch 46 iteration 0166/0188: training loss 0.739 Epoch 46 iteration 0167/0188: training loss 0.739 Epoch 46 iteration 0168/0188: training loss 0.740 Epoch 46 iteration 0169/0188: training loss 0.739 Epoch 46 iteration 0170/0188: training loss 0.740 Epoch 46 iteration 0171/0188: training loss 0.738 Epoch 46 iteration 0172/0188: training loss 0.737 Epoch 46 iteration 0173/0188: training loss 0.736 Epoch 46 iteration 0174/0188: training loss 0.736 Epoch 46 iteration 0175/0188: training loss 0.737 Epoch 46 iteration 0176/0188: training loss 0.736 Epoch 46 iteration 0177/0188: training loss 0.736 Epoch 46 iteration 0178/0188: training loss 0.737 Epoch 46 iteration 0179/0188: training loss 0.737 Epoch 46 iteration 0180/0188: training loss 0.738 Epoch 46 iteration 0181/0188: training loss 0.737 Epoch 46 iteration 0182/0188: training loss 0.736 Epoch 46 iteration 0183/0188: training loss 0.737 Epoch 46 iteration 0184/0188: training loss 0.738 Epoch 46 iteration 0185/0188: training loss 0.738 Epoch 46 iteration 0186/0188: training loss 0.737 Epoch 46 validation pixAcc: 0.874, mIoU: 0.380 Epoch 47 iteration 0001/0187: training loss 0.777 Epoch 47 iteration 0002/0187: training loss 0.746 Epoch 47 iteration 0003/0187: training loss 0.736 Epoch 47 iteration 0004/0187: training loss 0.728 Epoch 47 iteration 0005/0187: training loss 0.710 Epoch 47 iteration 0006/0187: training loss 0.705 Epoch 47 iteration 0007/0187: training loss 0.737 Epoch 47 iteration 0008/0187: training loss 0.742 Epoch 47 iteration 0009/0187: training loss 0.726 Epoch 47 iteration 0010/0187: training loss 0.732 Epoch 47 iteration 0011/0187: training loss 0.735 Epoch 47 iteration 0012/0187: training loss 0.729 Epoch 47 iteration 0013/0187: training loss 0.718 Epoch 47 iteration 0014/0187: training loss 0.701 Epoch 47 iteration 0015/0187: training loss 0.696 Epoch 47 iteration 0016/0187: training loss 0.704 Epoch 47 iteration 0017/0187: training loss 0.708 Epoch 47 iteration 0018/0187: training loss 0.715 Epoch 47 iteration 0019/0187: training loss 0.715 Epoch 47 iteration 0020/0187: training loss 0.718 Epoch 47 iteration 0021/0187: training loss 0.715 Epoch 47 iteration 0022/0187: training loss 0.716 Epoch 47 iteration 0023/0187: training loss 0.712 Epoch 47 iteration 0024/0187: training loss 0.716 Epoch 47 iteration 0025/0187: training loss 0.715 Epoch 47 iteration 0026/0187: training loss 0.725 Epoch 47 iteration 0027/0187: training loss 0.736 Epoch 47 iteration 0028/0187: training loss 0.732 Epoch 47 iteration 0029/0187: training loss 0.732 Epoch 47 iteration 0030/0187: training loss 0.730 Epoch 47 iteration 0031/0187: training loss 0.728 Epoch 47 iteration 0032/0187: training loss 0.729 Epoch 47 iteration 0033/0187: training loss 0.723 Epoch 47 iteration 0034/0187: training loss 0.722 Epoch 47 iteration 0035/0187: training loss 0.719 Epoch 47 iteration 0036/0187: training loss 0.716 Epoch 47 iteration 0037/0187: training loss 0.714 Epoch 47 iteration 0038/0187: training loss 0.717 Epoch 47 iteration 0039/0187: training loss 0.713 Epoch 47 iteration 0040/0187: training loss 0.711 Epoch 47 iteration 0041/0187: training loss 0.714 Epoch 47 iteration 0042/0187: training loss 0.714 Epoch 47 iteration 0043/0187: training loss 0.719 Epoch 47 iteration 0044/0187: training loss 0.717 Epoch 47 iteration 0045/0187: training loss 0.718 Epoch 47 iteration 0046/0187: training loss 0.715 Epoch 47 iteration 0047/0187: training loss 0.715 Epoch 47 iteration 0048/0187: training loss 0.714 Epoch 47 iteration 0049/0187: training loss 0.713 Epoch 47 iteration 0050/0187: training loss 0.716 Epoch 47 iteration 0051/0187: training loss 0.714 Epoch 47 iteration 0052/0187: training loss 0.716 Epoch 47 iteration 0053/0187: training loss 0.719 Epoch 47 iteration 0054/0187: training loss 0.717 Epoch 47 iteration 0055/0187: training loss 0.713 Epoch 47 iteration 0056/0187: training loss 0.714 Epoch 47 iteration 0057/0187: training loss 0.717 Epoch 47 iteration 0058/0187: training loss 0.719 Epoch 47 iteration 0059/0187: training loss 0.721 Epoch 47 iteration 0060/0187: training loss 0.720 Epoch 47 iteration 0061/0187: training loss 0.718 Epoch 47 iteration 0062/0187: training loss 0.717 Epoch 47 iteration 0063/0187: training loss 0.719 Epoch 47 iteration 0064/0187: training loss 0.724 Epoch 47 iteration 0065/0187: training loss 0.726 Epoch 47 iteration 0066/0187: training loss 0.722 Epoch 47 iteration 0067/0187: training loss 0.723 Epoch 47 iteration 0068/0187: training loss 0.724 Epoch 47 iteration 0069/0187: training loss 0.727 Epoch 47 iteration 0070/0187: training loss 0.731 Epoch 47 iteration 0071/0187: training loss 0.730 Epoch 47 iteration 0072/0187: training loss 0.735 Epoch 47 iteration 0073/0187: training loss 0.738 Epoch 47 iteration 0074/0187: training loss 0.739 Epoch 47 iteration 0075/0187: training loss 0.739 Epoch 47 iteration 0076/0187: training loss 0.739 Epoch 47 iteration 0077/0187: training loss 0.739 Epoch 47 iteration 0078/0187: training loss 0.740 Epoch 47 iteration 0079/0187: training loss 0.740 Epoch 47 iteration 0080/0187: training loss 0.740 Epoch 47 iteration 0081/0187: training loss 0.743 Epoch 47 iteration 0082/0187: training loss 0.742 Epoch 47 iteration 0083/0187: training loss 0.740 Epoch 47 iteration 0084/0187: training loss 0.742 Epoch 47 iteration 0085/0187: training loss 0.745 Epoch 47 iteration 0086/0187: training loss 0.744 Epoch 47 iteration 0087/0187: training loss 0.742 Epoch 47 iteration 0088/0187: training loss 0.741 Epoch 47 iteration 0089/0187: training loss 0.740 Epoch 47 iteration 0090/0187: training loss 0.741 Epoch 47 iteration 0091/0187: training loss 0.741 Epoch 47 iteration 0092/0187: training loss 0.740 Epoch 47 iteration 0093/0187: training loss 0.741 Epoch 47 iteration 0094/0187: training loss 0.742 Epoch 47 iteration 0095/0187: training loss 0.741 Epoch 47 iteration 0096/0187: training loss 0.741 Epoch 47 iteration 0097/0187: training loss 0.742 Epoch 47 iteration 0098/0187: training loss 0.742 Epoch 47 iteration 0099/0187: training loss 0.740 Epoch 47 iteration 0100/0187: training loss 0.740 Epoch 47 iteration 0101/0187: training loss 0.740 Epoch 47 iteration 0102/0187: training loss 0.741 Epoch 47 iteration 0103/0187: training loss 0.741 Epoch 47 iteration 0104/0187: training loss 0.740 Epoch 47 iteration 0105/0187: training loss 0.740 Epoch 47 iteration 0106/0187: training loss 0.740 Epoch 47 iteration 0107/0187: training loss 0.740 Epoch 47 iteration 0108/0187: training loss 0.739 Epoch 47 iteration 0109/0187: training loss 0.737 Epoch 47 iteration 0110/0187: training loss 0.737 Epoch 47 iteration 0111/0187: training loss 0.737 Epoch 47 iteration 0112/0187: training loss 0.736 Epoch 47 iteration 0113/0187: training loss 0.736 Epoch 47 iteration 0114/0187: training loss 0.735 Epoch 47 iteration 0115/0187: training loss 0.735 Epoch 47 iteration 0116/0187: training loss 0.735 Epoch 47 iteration 0117/0187: training loss 0.734 Epoch 47 iteration 0118/0187: training loss 0.734 Epoch 47 iteration 0119/0187: training loss 0.737 Epoch 47 iteration 0120/0187: training loss 0.737 Epoch 47 iteration 0121/0187: training loss 0.737 Epoch 47 iteration 0122/0187: training loss 0.735 Epoch 47 iteration 0123/0187: training loss 0.735 Epoch 47 iteration 0124/0187: training loss 0.734 Epoch 47 iteration 0125/0187: training loss 0.734 Epoch 47 iteration 0126/0187: training loss 0.733 Epoch 47 iteration 0127/0187: training loss 0.733 Epoch 47 iteration 0128/0187: training loss 0.732 Epoch 47 iteration 0129/0187: training loss 0.731 Epoch 47 iteration 0130/0187: training loss 0.730 Epoch 47 iteration 0131/0187: training loss 0.729 Epoch 47 iteration 0132/0187: training loss 0.730 Epoch 47 iteration 0133/0187: training loss 0.729 Epoch 47 iteration 0134/0187: training loss 0.729 Epoch 47 iteration 0135/0187: training loss 0.730 Epoch 47 iteration 0136/0187: training loss 0.730 Epoch 47 iteration 0137/0187: training loss 0.730 Epoch 47 iteration 0138/0187: training loss 0.730 Epoch 47 iteration 0139/0187: training loss 0.731 Epoch 47 iteration 0140/0187: training loss 0.731 Epoch 47 iteration 0141/0187: training loss 0.731 Epoch 47 iteration 0142/0187: training loss 0.731 Epoch 47 iteration 0143/0187: training loss 0.731 Epoch 47 iteration 0144/0187: training loss 0.730 Epoch 47 iteration 0145/0187: training loss 0.730 Epoch 47 iteration 0146/0187: training loss 0.730 Epoch 47 iteration 0147/0187: training loss 0.729 Epoch 47 iteration 0148/0187: training loss 0.730 Epoch 47 iteration 0149/0187: training loss 0.731 Epoch 47 iteration 0150/0187: training loss 0.731 Epoch 47 iteration 0151/0187: training loss 0.731 Epoch 47 iteration 0152/0187: training loss 0.730 Epoch 47 iteration 0153/0187: training loss 0.730 Epoch 47 iteration 0154/0187: training loss 0.730 Epoch 47 iteration 0155/0187: training loss 0.731 Epoch 47 iteration 0156/0187: training loss 0.732 Epoch 47 iteration 0157/0187: training loss 0.732 Epoch 47 iteration 0158/0187: training loss 0.732 Epoch 47 iteration 0159/0187: training loss 0.731 Epoch 47 iteration 0160/0187: training loss 0.731 Epoch 47 iteration 0161/0187: training loss 0.731 Epoch 47 iteration 0162/0187: training loss 0.731 Epoch 47 iteration 0163/0187: training loss 0.730 Epoch 47 iteration 0164/0187: training loss 0.730 Epoch 47 iteration 0165/0187: training loss 0.730 Epoch 47 iteration 0166/0187: training loss 0.731 Epoch 47 iteration 0167/0187: training loss 0.731 Epoch 47 iteration 0168/0187: training loss 0.732 Epoch 47 iteration 0169/0187: training loss 0.731 Epoch 47 iteration 0170/0187: training loss 0.730 Epoch 47 iteration 0171/0187: training loss 0.730 Epoch 47 iteration 0172/0187: training loss 0.730 Epoch 47 iteration 0173/0187: training loss 0.731 Epoch 47 iteration 0174/0187: training loss 0.731 Epoch 47 iteration 0175/0187: training loss 0.732 Epoch 47 iteration 0176/0187: training loss 0.731 Epoch 47 iteration 0177/0187: training loss 0.732 Epoch 47 iteration 0178/0187: training loss 0.731 Epoch 47 iteration 0179/0187: training loss 0.731 Epoch 47 iteration 0180/0187: training loss 0.732 Epoch 47 iteration 0181/0187: training loss 0.732 Epoch 47 iteration 0182/0187: training loss 0.732 Epoch 47 iteration 0183/0187: training loss 0.732 Epoch 47 iteration 0184/0187: training loss 0.731 Epoch 47 iteration 0185/0187: training loss 0.731 Epoch 47 iteration 0186/0187: training loss 0.731 Epoch 47 iteration 0187/0187: training loss 0.731 Epoch 47 validation pixAcc: 0.874, mIoU: 0.381 Epoch 48 iteration 0001/0187: training loss 0.659 Epoch 48 iteration 0002/0187: training loss 0.726 Epoch 48 iteration 0003/0187: training loss 0.715 Epoch 48 iteration 0004/0187: training loss 0.714 Epoch 48 iteration 0005/0187: training loss 0.716 Epoch 48 iteration 0006/0187: training loss 0.713 Epoch 48 iteration 0007/0187: training loss 0.718 Epoch 48 iteration 0008/0187: training loss 0.727 Epoch 48 iteration 0009/0187: training loss 0.723 Epoch 48 iteration 0010/0187: training loss 0.726 Epoch 48 iteration 0011/0187: training loss 0.723 Epoch 48 iteration 0012/0187: training loss 0.729 Epoch 48 iteration 0013/0187: training loss 0.730 Epoch 48 iteration 0014/0187: training loss 0.722 Epoch 48 iteration 0015/0187: training loss 0.721 Epoch 48 iteration 0016/0187: training loss 0.726 Epoch 48 iteration 0017/0187: training loss 0.730 Epoch 48 iteration 0018/0187: training loss 0.737 Epoch 48 iteration 0019/0187: training loss 0.745 Epoch 48 iteration 0020/0187: training loss 0.738 Epoch 48 iteration 0021/0187: training loss 0.734 Epoch 48 iteration 0022/0187: training loss 0.736 Epoch 48 iteration 0023/0187: training loss 0.740 Epoch 48 iteration 0024/0187: training loss 0.735 Epoch 48 iteration 0025/0187: training loss 0.733 Epoch 48 iteration 0026/0187: training loss 0.734 Epoch 48 iteration 0027/0187: training loss 0.733 Epoch 48 iteration 0028/0187: training loss 0.731 Epoch 48 iteration 0029/0187: training loss 0.734 Epoch 48 iteration 0030/0187: training loss 0.730 Epoch 48 iteration 0031/0187: training loss 0.729 Epoch 48 iteration 0032/0187: training loss 0.726 Epoch 48 iteration 0033/0187: training loss 0.729 Epoch 48 iteration 0034/0187: training loss 0.738 Epoch 48 iteration 0035/0187: training loss 0.736 Epoch 48 iteration 0036/0187: training loss 0.734 Epoch 48 iteration 0037/0187: training loss 0.739 Epoch 48 iteration 0038/0187: training loss 0.748 Epoch 48 iteration 0039/0187: training loss 0.748 Epoch 48 iteration 0040/0187: training loss 0.749 Epoch 48 iteration 0041/0187: training loss 0.748 Epoch 48 iteration 0042/0187: training loss 0.748 Epoch 48 iteration 0043/0187: training loss 0.748 Epoch 48 iteration 0044/0187: training loss 0.747 Epoch 48 iteration 0045/0187: training loss 0.751 Epoch 48 iteration 0046/0187: training loss 0.751 Epoch 48 iteration 0047/0187: training loss 0.753 Epoch 48 iteration 0048/0187: training loss 0.752 Epoch 48 iteration 0049/0187: training loss 0.752 Epoch 48 iteration 0050/0187: training loss 0.754 Epoch 48 iteration 0051/0187: training loss 0.752 Epoch 48 iteration 0052/0187: training loss 0.750 Epoch 48 iteration 0053/0187: training loss 0.745 Epoch 48 iteration 0054/0187: training loss 0.753 Epoch 48 iteration 0055/0187: training loss 0.755 Epoch 48 iteration 0056/0187: training loss 0.755 Epoch 48 iteration 0057/0187: training loss 0.755 Epoch 48 iteration 0058/0187: training loss 0.751 Epoch 48 iteration 0059/0187: training loss 0.749 Epoch 48 iteration 0060/0187: training loss 0.751 Epoch 48 iteration 0061/0187: training loss 0.750 Epoch 48 iteration 0062/0187: training loss 0.748 Epoch 48 iteration 0063/0187: training loss 0.747 Epoch 48 iteration 0064/0187: training loss 0.745 Epoch 48 iteration 0065/0187: training loss 0.743 Epoch 48 iteration 0066/0187: training loss 0.742 Epoch 48 iteration 0067/0187: training loss 0.739 Epoch 48 iteration 0068/0187: training loss 0.741 Epoch 48 iteration 0069/0187: training loss 0.741 Epoch 48 iteration 0070/0187: training loss 0.742 Epoch 48 iteration 0071/0187: training loss 0.743 Epoch 48 iteration 0072/0187: training loss 0.745 Epoch 48 iteration 0073/0187: training loss 0.745 Epoch 48 iteration 0074/0187: training loss 0.743 Epoch 48 iteration 0075/0187: training loss 0.742 Epoch 48 iteration 0076/0187: training loss 0.743 Epoch 48 iteration 0077/0187: training loss 0.742 Epoch 48 iteration 0078/0187: training loss 0.743 Epoch 48 iteration 0079/0187: training loss 0.743 Epoch 48 iteration 0080/0187: training loss 0.742 Epoch 48 iteration 0081/0187: training loss 0.742 Epoch 48 iteration 0082/0187: training loss 0.744 Epoch 48 iteration 0083/0187: training loss 0.744 Epoch 48 iteration 0084/0187: training loss 0.745 Epoch 48 iteration 0085/0187: training loss 0.744 Epoch 48 iteration 0086/0187: training loss 0.744 Epoch 48 iteration 0087/0187: training loss 0.745 Epoch 48 iteration 0088/0187: training loss 0.744 Epoch 48 iteration 0089/0187: training loss 0.744 Epoch 48 iteration 0090/0187: training loss 0.744 Epoch 48 iteration 0091/0188: training loss 0.743 Epoch 48 iteration 0092/0188: training loss 0.742 Epoch 48 iteration 0093/0188: training loss 0.744 Epoch 48 iteration 0094/0188: training loss 0.743 Epoch 48 iteration 0095/0188: training loss 0.743 Epoch 48 iteration 0096/0188: training loss 0.742 Epoch 48 iteration 0097/0188: training loss 0.742 Epoch 48 iteration 0098/0188: training loss 0.744 Epoch 48 iteration 0099/0188: training loss 0.744 Epoch 48 iteration 0100/0188: training loss 0.743 Epoch 48 iteration 0101/0188: training loss 0.742 Epoch 48 iteration 0102/0188: training loss 0.742 Epoch 48 iteration 0103/0188: training loss 0.741 Epoch 48 iteration 0104/0188: training loss 0.743 Epoch 48 iteration 0105/0188: training loss 0.744 Epoch 48 iteration 0106/0188: training loss 0.746 Epoch 48 iteration 0107/0188: training loss 0.745 Epoch 48 iteration 0108/0188: training loss 0.745 Epoch 48 iteration 0109/0188: training loss 0.746 Epoch 48 iteration 0110/0188: training loss 0.746 Epoch 48 iteration 0111/0188: training loss 0.748 Epoch 48 iteration 0112/0188: training loss 0.748 Epoch 48 iteration 0113/0188: training loss 0.747 Epoch 48 iteration 0114/0188: training loss 0.746 Epoch 48 iteration 0115/0188: training loss 0.745 Epoch 48 iteration 0116/0188: training loss 0.744 Epoch 48 iteration 0117/0188: training loss 0.746 Epoch 48 iteration 0118/0188: training loss 0.746 Epoch 48 iteration 0119/0188: training loss 0.745 Epoch 48 iteration 0120/0188: training loss 0.746 Epoch 48 iteration 0121/0188: training loss 0.745 Epoch 48 iteration 0122/0188: training loss 0.746 Epoch 48 iteration 0123/0188: training loss 0.744 Epoch 48 iteration 0124/0188: training loss 0.743 Epoch 48 iteration 0125/0188: training loss 0.743 Epoch 48 iteration 0126/0188: training loss 0.742 Epoch 48 iteration 0127/0188: training loss 0.744 Epoch 48 iteration 0128/0188: training loss 0.742 Epoch 48 iteration 0129/0188: training loss 0.741 Epoch 48 iteration 0130/0188: training loss 0.741 Epoch 48 iteration 0131/0188: training loss 0.741 Epoch 48 iteration 0132/0188: training loss 0.740 Epoch 48 iteration 0133/0188: training loss 0.739 Epoch 48 iteration 0134/0188: training loss 0.738 Epoch 48 iteration 0135/0188: training loss 0.739 Epoch 48 iteration 0136/0188: training loss 0.738 Epoch 48 iteration 0137/0188: training loss 0.739 Epoch 48 iteration 0138/0188: training loss 0.737 Epoch 48 iteration 0139/0188: training loss 0.736 Epoch 48 iteration 0140/0188: training loss 0.736 Epoch 48 iteration 0141/0188: training loss 0.736 Epoch 48 iteration 0142/0188: training loss 0.736 Epoch 48 iteration 0143/0188: training loss 0.736 Epoch 48 iteration 0144/0188: training loss 0.737 Epoch 48 iteration 0145/0188: training loss 0.738 Epoch 48 iteration 0146/0188: training loss 0.738 Epoch 48 iteration 0147/0188: training loss 0.739 Epoch 48 iteration 0148/0188: training loss 0.740 Epoch 48 iteration 0149/0188: training loss 0.739 Epoch 48 iteration 0150/0188: training loss 0.738 Epoch 48 iteration 0151/0188: training loss 0.738 Epoch 48 iteration 0152/0188: training loss 0.737 Epoch 48 iteration 0153/0188: training loss 0.739 Epoch 48 iteration 0154/0188: training loss 0.739 Epoch 48 iteration 0155/0188: training loss 0.739 Epoch 48 iteration 0156/0188: training loss 0.739 Epoch 48 iteration 0157/0188: training loss 0.739 Epoch 48 iteration 0158/0188: training loss 0.738 Epoch 48 iteration 0159/0188: training loss 0.738 Epoch 48 iteration 0160/0188: training loss 0.738 Epoch 48 iteration 0161/0188: training loss 0.738 Epoch 48 iteration 0162/0188: training loss 0.737 Epoch 48 iteration 0163/0188: training loss 0.738 Epoch 48 iteration 0164/0188: training loss 0.739 Epoch 48 iteration 0165/0188: training loss 0.740 Epoch 48 iteration 0166/0188: training loss 0.740 Epoch 48 iteration 0167/0188: training loss 0.740 Epoch 48 iteration 0168/0188: training loss 0.740 Epoch 48 iteration 0169/0188: training loss 0.740 Epoch 48 iteration 0170/0188: training loss 0.740 Epoch 48 iteration 0171/0188: training loss 0.740 Epoch 48 iteration 0172/0188: training loss 0.740 Epoch 48 iteration 0173/0188: training loss 0.740 Epoch 48 iteration 0174/0188: training loss 0.740 Epoch 48 iteration 0175/0188: training loss 0.741 Epoch 48 iteration 0176/0188: training loss 0.740 Epoch 48 iteration 0177/0188: training loss 0.741 Epoch 48 iteration 0178/0188: training loss 0.741 Epoch 48 iteration 0179/0188: training loss 0.741 Epoch 48 iteration 0180/0188: training loss 0.741 Epoch 48 iteration 0181/0188: training loss 0.740 Epoch 48 iteration 0182/0188: training loss 0.740 Epoch 48 iteration 0183/0188: training loss 0.740 Epoch 48 iteration 0184/0188: training loss 0.740 Epoch 48 iteration 0185/0188: training loss 0.740 Epoch 48 iteration 0186/0188: training loss 0.739 Epoch 48 validation pixAcc: 0.874, mIoU: 0.385 Epoch 49 iteration 0001/0187: training loss 0.706 Epoch 49 iteration 0002/0187: training loss 0.723 Epoch 49 iteration 0003/0187: training loss 0.751 Epoch 49 iteration 0004/0187: training loss 0.780 Epoch 49 iteration 0005/0187: training loss 0.776 Epoch 49 iteration 0006/0187: training loss 0.753 Epoch 49 iteration 0007/0187: training loss 0.756 Epoch 49 iteration 0008/0187: training loss 0.758 Epoch 49 iteration 0009/0187: training loss 0.758 Epoch 49 iteration 0010/0187: training loss 0.767 Epoch 49 iteration 0011/0187: training loss 0.757 Epoch 49 iteration 0012/0187: training loss 0.752 Epoch 49 iteration 0013/0187: training loss 0.748 Epoch 49 iteration 0014/0187: training loss 0.737 Epoch 49 iteration 0015/0187: training loss 0.739 Epoch 49 iteration 0016/0187: training loss 0.742 Epoch 49 iteration 0017/0187: training loss 0.748 Epoch 49 iteration 0018/0187: training loss 0.756 Epoch 49 iteration 0019/0187: training loss 0.752 Epoch 49 iteration 0020/0187: training loss 0.746 Epoch 49 iteration 0021/0187: training loss 0.741 Epoch 49 iteration 0022/0187: training loss 0.743 Epoch 49 iteration 0023/0187: training loss 0.739 Epoch 49 iteration 0024/0187: training loss 0.736 Epoch 49 iteration 0025/0187: training loss 0.745 Epoch 49 iteration 0026/0187: training loss 0.740 Epoch 49 iteration 0027/0187: training loss 0.736 Epoch 49 iteration 0028/0187: training loss 0.738 Epoch 49 iteration 0029/0187: training loss 0.748 Epoch 49 iteration 0030/0187: training loss 0.743 Epoch 49 iteration 0031/0187: training loss 0.741 Epoch 49 iteration 0032/0187: training loss 0.744 Epoch 49 iteration 0033/0187: training loss 0.742 Epoch 49 iteration 0034/0187: training loss 0.745 Epoch 49 iteration 0035/0187: training loss 0.746 Epoch 49 iteration 0036/0187: training loss 0.755 Epoch 49 iteration 0037/0187: training loss 0.753 Epoch 49 iteration 0038/0187: training loss 0.753 Epoch 49 iteration 0039/0187: training loss 0.747 Epoch 49 iteration 0040/0187: training loss 0.748 Epoch 49 iteration 0041/0187: training loss 0.750 Epoch 49 iteration 0042/0187: training loss 0.752 Epoch 49 iteration 0043/0187: training loss 0.752 Epoch 49 iteration 0044/0187: training loss 0.748 Epoch 49 iteration 0045/0187: training loss 0.748 Epoch 49 iteration 0046/0187: training loss 0.747 Epoch 49 iteration 0047/0187: training loss 0.748 Epoch 49 iteration 0048/0187: training loss 0.749 Epoch 49 iteration 0049/0187: training loss 0.745 Epoch 49 iteration 0050/0187: training loss 0.743 Epoch 49 iteration 0051/0187: training loss 0.740 Epoch 49 iteration 0052/0187: training loss 0.744 Epoch 49 iteration 0053/0187: training loss 0.745 Epoch 49 iteration 0054/0187: training loss 0.745 Epoch 49 iteration 0055/0187: training loss 0.744 Epoch 49 iteration 0056/0187: training loss 0.742 Epoch 49 iteration 0057/0187: training loss 0.742 Epoch 49 iteration 0058/0187: training loss 0.740 Epoch 49 iteration 0059/0187: training loss 0.741 Epoch 49 iteration 0060/0187: training loss 0.742 Epoch 49 iteration 0061/0187: training loss 0.745 Epoch 49 iteration 0062/0187: training loss 0.745 Epoch 49 iteration 0063/0187: training loss 0.746 Epoch 49 iteration 0064/0187: training loss 0.745 Epoch 49 iteration 0065/0187: training loss 0.743 Epoch 49 iteration 0066/0187: training loss 0.744 Epoch 49 iteration 0067/0187: training loss 0.743 Epoch 49 iteration 0068/0187: training loss 0.743 Epoch 49 iteration 0069/0187: training loss 0.742 Epoch 49 iteration 0070/0187: training loss 0.742 Epoch 49 iteration 0071/0187: training loss 0.741 Epoch 49 iteration 0072/0187: training loss 0.740 Epoch 49 iteration 0073/0187: training loss 0.740 Epoch 49 iteration 0074/0187: training loss 0.738 Epoch 49 iteration 0075/0187: training loss 0.741 Epoch 49 iteration 0076/0187: training loss 0.741 Epoch 49 iteration 0077/0187: training loss 0.742 Epoch 49 iteration 0078/0187: training loss 0.742 Epoch 49 iteration 0079/0187: training loss 0.741 Epoch 49 iteration 0080/0187: training loss 0.738 Epoch 49 iteration 0081/0187: training loss 0.743 Epoch 49 iteration 0082/0187: training loss 0.741 Epoch 49 iteration 0083/0187: training loss 0.740 Epoch 49 iteration 0084/0187: training loss 0.743 Epoch 49 iteration 0085/0187: training loss 0.745 Epoch 49 iteration 0086/0187: training loss 0.745 Epoch 49 iteration 0087/0187: training loss 0.744 Epoch 49 iteration 0088/0187: training loss 0.744 Epoch 49 iteration 0089/0187: training loss 0.741 Epoch 49 iteration 0090/0187: training loss 0.739 Epoch 49 iteration 0091/0187: training loss 0.738 Epoch 49 iteration 0092/0187: training loss 0.739 Epoch 49 iteration 0093/0187: training loss 0.740 Epoch 49 iteration 0094/0187: training loss 0.740 Epoch 49 iteration 0095/0187: training loss 0.742 Epoch 49 iteration 0096/0187: training loss 0.742 Epoch 49 iteration 0097/0187: training loss 0.743 Epoch 49 iteration 0098/0187: training loss 0.743 Epoch 49 iteration 0099/0187: training loss 0.742 Epoch 49 iteration 0100/0187: training loss 0.741 Epoch 49 iteration 0101/0187: training loss 0.741 Epoch 49 iteration 0102/0187: training loss 0.740 Epoch 49 iteration 0103/0187: training loss 0.740 Epoch 49 iteration 0104/0187: training loss 0.738 Epoch 49 iteration 0105/0187: training loss 0.737 Epoch 49 iteration 0106/0187: training loss 0.736 Epoch 49 iteration 0107/0187: training loss 0.735 Epoch 49 iteration 0108/0187: training loss 0.734 Epoch 49 iteration 0109/0187: training loss 0.733 Epoch 49 iteration 0110/0187: training loss 0.733 Epoch 49 iteration 0111/0187: training loss 0.734 Epoch 49 iteration 0112/0187: training loss 0.734 Epoch 49 iteration 0113/0187: training loss 0.735 Epoch 49 iteration 0114/0187: training loss 0.735 Epoch 49 iteration 0115/0187: training loss 0.735 Epoch 49 iteration 0116/0187: training loss 0.735 Epoch 49 iteration 0117/0187: training loss 0.734 Epoch 49 iteration 0118/0187: training loss 0.734 Epoch 49 iteration 0119/0187: training loss 0.733 Epoch 49 iteration 0120/0187: training loss 0.734 Epoch 49 iteration 0121/0187: training loss 0.732 Epoch 49 iteration 0122/0187: training loss 0.732 Epoch 49 iteration 0123/0187: training loss 0.731 Epoch 49 iteration 0124/0187: training loss 0.731 Epoch 49 iteration 0125/0187: training loss 0.732 Epoch 49 iteration 0126/0187: training loss 0.732 Epoch 49 iteration 0127/0187: training loss 0.733 Epoch 49 iteration 0128/0187: training loss 0.734 Epoch 49 iteration 0129/0187: training loss 0.733 Epoch 49 iteration 0130/0187: training loss 0.733 Epoch 49 iteration 0131/0187: training loss 0.733 Epoch 49 iteration 0132/0187: training loss 0.731 Epoch 49 iteration 0133/0187: training loss 0.732 Epoch 49 iteration 0134/0187: training loss 0.731 Epoch 49 iteration 0135/0187: training loss 0.731 Epoch 49 iteration 0136/0187: training loss 0.730 Epoch 49 iteration 0137/0187: training loss 0.732 Epoch 49 iteration 0138/0187: training loss 0.730 Epoch 49 iteration 0139/0187: training loss 0.729 Epoch 49 iteration 0140/0187: training loss 0.728 Epoch 49 iteration 0141/0187: training loss 0.728 Epoch 49 iteration 0142/0187: training loss 0.728 Epoch 49 iteration 0143/0187: training loss 0.728 Epoch 49 iteration 0144/0187: training loss 0.728 Epoch 49 iteration 0145/0187: training loss 0.727 Epoch 49 iteration 0146/0187: training loss 0.727 Epoch 49 iteration 0147/0187: training loss 0.728 Epoch 49 iteration 0148/0187: training loss 0.727 Epoch 49 iteration 0149/0187: training loss 0.726 Epoch 49 iteration 0150/0187: training loss 0.726 Epoch 49 iteration 0151/0187: training loss 0.726 Epoch 49 iteration 0152/0187: training loss 0.726 Epoch 49 iteration 0153/0187: training loss 0.726 Epoch 49 iteration 0154/0187: training loss 0.727 Epoch 49 iteration 0155/0187: training loss 0.726 Epoch 49 iteration 0156/0187: training loss 0.726 Epoch 49 iteration 0157/0187: training loss 0.727 Epoch 49 iteration 0158/0187: training loss 0.727 Epoch 49 iteration 0159/0187: training loss 0.729 Epoch 49 iteration 0160/0187: training loss 0.730 Epoch 49 iteration 0161/0187: training loss 0.730 Epoch 49 iteration 0162/0187: training loss 0.730 Epoch 49 iteration 0163/0187: training loss 0.728 Epoch 49 iteration 0164/0187: training loss 0.728 Epoch 49 iteration 0165/0187: training loss 0.727 Epoch 49 iteration 0166/0187: training loss 0.728 Epoch 49 iteration 0167/0187: training loss 0.729 Epoch 49 iteration 0168/0187: training loss 0.729 Epoch 49 iteration 0169/0187: training loss 0.729 Epoch 49 iteration 0170/0187: training loss 0.728 Epoch 49 iteration 0171/0187: training loss 0.728 Epoch 49 iteration 0172/0187: training loss 0.728 Epoch 49 iteration 0173/0187: training loss 0.727 Epoch 49 iteration 0174/0187: training loss 0.727 Epoch 49 iteration 0175/0187: training loss 0.725 Epoch 49 iteration 0176/0187: training loss 0.727 Epoch 49 iteration 0177/0187: training loss 0.726 Epoch 49 iteration 0178/0187: training loss 0.726 Epoch 49 iteration 0179/0187: training loss 0.727 Epoch 49 iteration 0180/0187: training loss 0.726 Epoch 49 iteration 0181/0187: training loss 0.725 Epoch 49 iteration 0182/0187: training loss 0.726 Epoch 49 iteration 0183/0187: training loss 0.727 Epoch 49 iteration 0184/0187: training loss 0.727 Epoch 49 iteration 0185/0187: training loss 0.728 Epoch 49 iteration 0186/0187: training loss 0.728 Epoch 49 iteration 0187/0187: training loss 0.727 Epoch 49 validation pixAcc: 0.873, mIoU: 0.387 Epoch 50 iteration 0001/0187: training loss 0.745 Epoch 50 iteration 0002/0187: training loss 0.743 Epoch 50 iteration 0003/0187: training loss 0.774 Epoch 50 iteration 0004/0187: training loss 0.752 Epoch 50 iteration 0005/0187: training loss 0.761 Epoch 50 iteration 0006/0187: training loss 0.767 Epoch 50 iteration 0007/0187: training loss 0.745 Epoch 50 iteration 0008/0187: training loss 0.723 Epoch 50 iteration 0009/0187: training loss 0.720 Epoch 50 iteration 0010/0187: training loss 0.720 Epoch 50 iteration 0011/0187: training loss 0.721 Epoch 50 iteration 0012/0187: training loss 0.727 Epoch 50 iteration 0013/0187: training loss 0.730 Epoch 50 iteration 0014/0187: training loss 0.733 Epoch 50 iteration 0015/0187: training loss 0.725 Epoch 50 iteration 0016/0187: training loss 0.722 Epoch 50 iteration 0017/0187: training loss 0.717 Epoch 50 iteration 0018/0187: training loss 0.711 Epoch 50 iteration 0019/0187: training loss 0.711 Epoch 50 iteration 0020/0187: training loss 0.718 Epoch 50 iteration 0021/0187: training loss 0.722 Epoch 50 iteration 0022/0187: training loss 0.723 Epoch 50 iteration 0023/0187: training loss 0.722 Epoch 50 iteration 0024/0187: training loss 0.717 Epoch 50 iteration 0025/0187: training loss 0.724 Epoch 50 iteration 0026/0187: training loss 0.720 Epoch 50 iteration 0027/0187: training loss 0.717 Epoch 50 iteration 0028/0187: training loss 0.718 Epoch 50 iteration 0029/0187: training loss 0.718 Epoch 50 iteration 0030/0187: training loss 0.720 Epoch 50 iteration 0031/0187: training loss 0.718 Epoch 50 iteration 0032/0187: training loss 0.720 Epoch 50 iteration 0033/0187: training loss 0.720 Epoch 50 iteration 0034/0187: training loss 0.717 Epoch 50 iteration 0035/0187: training loss 0.716 Epoch 50 iteration 0036/0187: training loss 0.721 Epoch 50 iteration 0037/0187: training loss 0.720 Epoch 50 iteration 0038/0187: training loss 0.727 Epoch 50 iteration 0039/0187: training loss 0.725 Epoch 50 iteration 0040/0187: training loss 0.723 Epoch 50 iteration 0041/0187: training loss 0.721 Epoch 50 iteration 0042/0187: training loss 0.722 Epoch 50 iteration 0043/0187: training loss 0.719 Epoch 50 iteration 0044/0187: training loss 0.717 Epoch 50 iteration 0045/0187: training loss 0.718 Epoch 50 iteration 0046/0187: training loss 0.715 Epoch 50 iteration 0047/0187: training loss 0.715 Epoch 50 iteration 0048/0187: training loss 0.716 Epoch 50 iteration 0049/0187: training loss 0.716 Epoch 50 iteration 0050/0187: training loss 0.717 Epoch 50 iteration 0051/0187: training loss 0.718 Epoch 50 iteration 0052/0187: training loss 0.722 Epoch 50 iteration 0053/0187: training loss 0.721 Epoch 50 iteration 0054/0187: training loss 0.722 Epoch 50 iteration 0055/0187: training loss 0.719 Epoch 50 iteration 0056/0187: training loss 0.717 Epoch 50 iteration 0057/0187: training loss 0.716 Epoch 50 iteration 0058/0187: training loss 0.713 Epoch 50 iteration 0059/0187: training loss 0.714 Epoch 50 iteration 0060/0187: training loss 0.714 Epoch 50 iteration 0061/0187: training loss 0.714 Epoch 50 iteration 0062/0187: training loss 0.717 Epoch 50 iteration 0063/0187: training loss 0.719 Epoch 50 iteration 0064/0187: training loss 0.718 Epoch 50 iteration 0065/0187: training loss 0.721 Epoch 50 iteration 0066/0187: training loss 0.719 Epoch 50 iteration 0067/0187: training loss 0.718 Epoch 50 iteration 0068/0187: training loss 0.719 Epoch 50 iteration 0069/0187: training loss 0.717 Epoch 50 iteration 0070/0187: training loss 0.717 Epoch 50 iteration 0071/0187: training loss 0.717 Epoch 50 iteration 0072/0187: training loss 0.718 Epoch 50 iteration 0073/0187: training loss 0.720 Epoch 50 iteration 0074/0187: training loss 0.720 Epoch 50 iteration 0075/0187: training loss 0.717 Epoch 50 iteration 0076/0187: training loss 0.718 Epoch 50 iteration 0077/0187: training loss 0.720 Epoch 50 iteration 0078/0187: training loss 0.721 Epoch 50 iteration 0079/0187: training loss 0.721 Epoch 50 iteration 0080/0187: training loss 0.721 Epoch 50 iteration 0081/0187: training loss 0.720 Epoch 50 iteration 0082/0187: training loss 0.723 Epoch 50 iteration 0083/0187: training loss 0.723 Epoch 50 iteration 0084/0187: training loss 0.724 Epoch 50 iteration 0085/0187: training loss 0.724 Epoch 50 iteration 0086/0187: training loss 0.724 Epoch 50 iteration 0087/0187: training loss 0.723 Epoch 50 iteration 0088/0187: training loss 0.723 Epoch 50 iteration 0089/0187: training loss 0.723 Epoch 50 iteration 0090/0187: training loss 0.723 Epoch 50 iteration 0091/0188: training loss 0.724 Epoch 50 iteration 0092/0188: training loss 0.724 Epoch 50 iteration 0093/0188: training loss 0.723 Epoch 50 iteration 0094/0188: training loss 0.724 Epoch 50 iteration 0095/0188: training loss 0.723 Epoch 50 iteration 0096/0188: training loss 0.722 Epoch 50 iteration 0097/0188: training loss 0.721 Epoch 50 iteration 0098/0188: training loss 0.721 Epoch 50 iteration 0099/0188: training loss 0.722 Epoch 50 iteration 0100/0188: training loss 0.722 Epoch 50 iteration 0101/0188: training loss 0.721 Epoch 50 iteration 0102/0188: training loss 0.721 Epoch 50 iteration 0103/0188: training loss 0.722 Epoch 50 iteration 0104/0188: training loss 0.723 Epoch 50 iteration 0105/0188: training loss 0.722 Epoch 50 iteration 0106/0188: training loss 0.722 Epoch 50 iteration 0107/0188: training loss 0.723 Epoch 50 iteration 0108/0188: training loss 0.723 Epoch 50 iteration 0109/0188: training loss 0.723 Epoch 50 iteration 0110/0188: training loss 0.723 Epoch 50 iteration 0111/0188: training loss 0.723 Epoch 50 iteration 0112/0188: training loss 0.721 Epoch 50 iteration 0113/0188: training loss 0.722 Epoch 50 iteration 0114/0188: training loss 0.722 Epoch 50 iteration 0115/0188: training loss 0.721 Epoch 50 iteration 0116/0188: training loss 0.722 Epoch 50 iteration 0117/0188: training loss 0.723 Epoch 50 iteration 0118/0188: training loss 0.722 Epoch 50 iteration 0119/0188: training loss 0.722 Epoch 50 iteration 0120/0188: training loss 0.723 Epoch 50 iteration 0121/0188: training loss 0.722 Epoch 50 iteration 0122/0188: training loss 0.721 Epoch 50 iteration 0123/0188: training loss 0.721 Epoch 50 iteration 0124/0188: training loss 0.721 Epoch 50 iteration 0125/0188: training loss 0.722 Epoch 50 iteration 0126/0188: training loss 0.721 Epoch 50 iteration 0127/0188: training loss 0.721 Epoch 50 iteration 0128/0188: training loss 0.721 Epoch 50 iteration 0129/0188: training loss 0.720 Epoch 50 iteration 0130/0188: training loss 0.719 Epoch 50 iteration 0131/0188: training loss 0.719 Epoch 50 iteration 0132/0188: training loss 0.719 Epoch 50 iteration 0133/0188: training loss 0.719 Epoch 50 iteration 0134/0188: training loss 0.719 Epoch 50 iteration 0135/0188: training loss 0.718 Epoch 50 iteration 0136/0188: training loss 0.719 Epoch 50 iteration 0137/0188: training loss 0.719 Epoch 50 iteration 0138/0188: training loss 0.719 Epoch 50 iteration 0139/0188: training loss 0.719 Epoch 50 iteration 0140/0188: training loss 0.720 Epoch 50 iteration 0141/0188: training loss 0.720 Epoch 50 iteration 0142/0188: training loss 0.720 Epoch 50 iteration 0143/0188: training loss 0.721 Epoch 50 iteration 0144/0188: training loss 0.719 Epoch 50 iteration 0145/0188: training loss 0.720 Epoch 50 iteration 0146/0188: training loss 0.720 Epoch 50 iteration 0147/0188: training loss 0.721 Epoch 50 iteration 0148/0188: training loss 0.722 Epoch 50 iteration 0149/0188: training loss 0.722 Epoch 50 iteration 0150/0188: training loss 0.723 Epoch 50 iteration 0151/0188: training loss 0.725 Epoch 50 iteration 0152/0188: training loss 0.725 Epoch 50 iteration 0153/0188: training loss 0.728 Epoch 50 iteration 0154/0188: training loss 0.729 Epoch 50 iteration 0155/0188: training loss 0.728 Epoch 50 iteration 0156/0188: training loss 0.728 Epoch 50 iteration 0157/0188: training loss 0.728 Epoch 50 iteration 0158/0188: training loss 0.728 Epoch 50 iteration 0159/0188: training loss 0.729 Epoch 50 iteration 0160/0188: training loss 0.729 Epoch 50 iteration 0161/0188: training loss 0.728 Epoch 50 iteration 0162/0188: training loss 0.729 Epoch 50 iteration 0163/0188: training loss 0.729 Epoch 50 iteration 0164/0188: training loss 0.728 Epoch 50 iteration 0165/0188: training loss 0.729 Epoch 50 iteration 0166/0188: training loss 0.729 Epoch 50 iteration 0167/0188: training loss 0.729 Epoch 50 iteration 0168/0188: training loss 0.729 Epoch 50 iteration 0169/0188: training loss 0.729 Epoch 50 iteration 0170/0188: training loss 0.728 Epoch 50 iteration 0171/0188: training loss 0.728 Epoch 50 iteration 0172/0188: training loss 0.728 Epoch 50 iteration 0173/0188: training loss 0.728 Epoch 50 iteration 0174/0188: training loss 0.727 Epoch 50 iteration 0175/0188: training loss 0.729 Epoch 50 iteration 0176/0188: training loss 0.729 Epoch 50 iteration 0177/0188: training loss 0.729 Epoch 50 iteration 0178/0188: training loss 0.729 Epoch 50 iteration 0179/0188: training loss 0.729 Epoch 50 iteration 0180/0188: training loss 0.730 Epoch 50 iteration 0181/0188: training loss 0.730 Epoch 50 iteration 0182/0188: training loss 0.730 Epoch 50 iteration 0183/0188: training loss 0.730 Epoch 50 iteration 0184/0188: training loss 0.731 Epoch 50 iteration 0185/0188: training loss 0.730 Epoch 50 iteration 0186/0188: training loss 0.730 Epoch 50 validation pixAcc: 0.872, mIoU: 0.382 Epoch 51 iteration 0001/0187: training loss 0.798 Epoch 51 iteration 0002/0187: training loss 0.780 Epoch 51 iteration 0003/0187: training loss 0.826 Epoch 51 iteration 0004/0187: training loss 0.803 Epoch 51 iteration 0005/0187: training loss 0.776 Epoch 51 iteration 0006/0187: training loss 0.768 Epoch 51 iteration 0007/0187: training loss 0.781 Epoch 51 iteration 0008/0187: training loss 0.764 Epoch 51 iteration 0009/0187: training loss 0.743 Epoch 51 iteration 0010/0187: training loss 0.734 Epoch 51 iteration 0011/0187: training loss 0.741 Epoch 51 iteration 0012/0187: training loss 0.739 Epoch 51 iteration 0013/0187: training loss 0.729 Epoch 51 iteration 0014/0187: training loss 0.728 Epoch 51 iteration 0015/0187: training loss 0.736 Epoch 51 iteration 0016/0187: training loss 0.743 Epoch 51 iteration 0017/0187: training loss 0.745 Epoch 51 iteration 0018/0187: training loss 0.749 Epoch 51 iteration 0019/0187: training loss 0.739 Epoch 51 iteration 0020/0187: training loss 0.742 Epoch 51 iteration 0021/0187: training loss 0.735 Epoch 51 iteration 0022/0187: training loss 0.734 Epoch 51 iteration 0023/0187: training loss 0.733 Epoch 51 iteration 0024/0187: training loss 0.729 Epoch 51 iteration 0025/0187: training loss 0.736 Epoch 51 iteration 0026/0187: training loss 0.736 Epoch 51 iteration 0027/0187: training loss 0.736 Epoch 51 iteration 0028/0187: training loss 0.745 Epoch 51 iteration 0029/0187: training loss 0.745 Epoch 51 iteration 0030/0187: training loss 0.742 Epoch 51 iteration 0031/0187: training loss 0.741 Epoch 51 iteration 0032/0187: training loss 0.739 Epoch 51 iteration 0033/0187: training loss 0.737 Epoch 51 iteration 0034/0187: training loss 0.740 Epoch 51 iteration 0035/0187: training loss 0.738 Epoch 51 iteration 0036/0187: training loss 0.745 Epoch 51 iteration 0037/0187: training loss 0.743 Epoch 51 iteration 0038/0187: training loss 0.741 Epoch 51 iteration 0039/0187: training loss 0.741 Epoch 51 iteration 0040/0187: training loss 0.734 Epoch 51 iteration 0041/0187: training loss 0.732 Epoch 51 iteration 0042/0187: training loss 0.729 Epoch 51 iteration 0043/0187: training loss 0.727 Epoch 51 iteration 0044/0187: training loss 0.725 Epoch 51 iteration 0045/0187: training loss 0.726 Epoch 51 iteration 0046/0187: training loss 0.723 Epoch 51 iteration 0047/0187: training loss 0.722 Epoch 51 iteration 0048/0187: training loss 0.722 Epoch 51 iteration 0049/0187: training loss 0.723 Epoch 51 iteration 0050/0187: training loss 0.719 Epoch 51 iteration 0051/0187: training loss 0.723 Epoch 51 iteration 0052/0187: training loss 0.722 Epoch 51 iteration 0053/0187: training loss 0.720 Epoch 51 iteration 0054/0187: training loss 0.719 Epoch 51 iteration 0055/0187: training loss 0.717 Epoch 51 iteration 0056/0187: training loss 0.718 Epoch 51 iteration 0057/0187: training loss 0.719 Epoch 51 iteration 0058/0187: training loss 0.719 Epoch 51 iteration 0059/0187: training loss 0.719 Epoch 51 iteration 0060/0187: training loss 0.718 Epoch 51 iteration 0061/0187: training loss 0.715 Epoch 51 iteration 0062/0187: training loss 0.712 Epoch 51 iteration 0063/0187: training loss 0.713 Epoch 51 iteration 0064/0187: training loss 0.713 Epoch 51 iteration 0065/0187: training loss 0.710 Epoch 51 iteration 0066/0187: training loss 0.710 Epoch 51 iteration 0067/0187: training loss 0.713 Epoch 51 iteration 0068/0187: training loss 0.714 Epoch 51 iteration 0069/0187: training loss 0.712 Epoch 51 iteration 0070/0187: training loss 0.710 Epoch 51 iteration 0071/0187: training loss 0.709 Epoch 51 iteration 0072/0187: training loss 0.710 Epoch 51 iteration 0073/0187: training loss 0.709 Epoch 51 iteration 0074/0187: training loss 0.709 Epoch 51 iteration 0075/0187: training loss 0.709 Epoch 51 iteration 0076/0187: training loss 0.716 Epoch 51 iteration 0077/0187: training loss 0.714 Epoch 51 iteration 0078/0187: training loss 0.714 Epoch 51 iteration 0079/0187: training loss 0.712 Epoch 51 iteration 0080/0187: training loss 0.713 Epoch 51 iteration 0081/0187: training loss 0.715 Epoch 51 iteration 0082/0187: training loss 0.714 Epoch 51 iteration 0083/0187: training loss 0.715 Epoch 51 iteration 0084/0187: training loss 0.714 Epoch 51 iteration 0085/0187: training loss 0.717 Epoch 51 iteration 0086/0187: training loss 0.715 Epoch 51 iteration 0087/0187: training loss 0.714 Epoch 51 iteration 0088/0187: training loss 0.715 Epoch 51 iteration 0089/0187: training loss 0.715 Epoch 51 iteration 0090/0187: training loss 0.715 Epoch 51 iteration 0091/0187: training loss 0.716 Epoch 51 iteration 0092/0187: training loss 0.715 Epoch 51 iteration 0093/0187: training loss 0.715 Epoch 51 iteration 0094/0187: training loss 0.713 Epoch 51 iteration 0095/0187: training loss 0.713 Epoch 51 iteration 0096/0187: training loss 0.713 Epoch 51 iteration 0097/0187: training loss 0.712 Epoch 51 iteration 0098/0187: training loss 0.713 Epoch 51 iteration 0099/0187: training loss 0.713 Epoch 51 iteration 0100/0187: training loss 0.713 Epoch 51 iteration 0101/0187: training loss 0.712 Epoch 51 iteration 0102/0187: training loss 0.712 Epoch 51 iteration 0103/0187: training loss 0.711 Epoch 51 iteration 0104/0187: training loss 0.712 Epoch 51 iteration 0105/0187: training loss 0.711 Epoch 51 iteration 0106/0187: training loss 0.710 Epoch 51 iteration 0107/0187: training loss 0.709 Epoch 51 iteration 0108/0187: training loss 0.708 Epoch 51 iteration 0109/0187: training loss 0.709 Epoch 51 iteration 0110/0187: training loss 0.710 Epoch 51 iteration 0111/0187: training loss 0.710 Epoch 51 iteration 0112/0187: training loss 0.709 Epoch 51 iteration 0113/0187: training loss 0.709 Epoch 51 iteration 0114/0187: training loss 0.711 Epoch 51 iteration 0115/0187: training loss 0.713 Epoch 51 iteration 0116/0187: training loss 0.715 Epoch 51 iteration 0117/0187: training loss 0.715 Epoch 51 iteration 0118/0187: training loss 0.716 Epoch 51 iteration 0119/0187: training loss 0.715 Epoch 51 iteration 0120/0187: training loss 0.715 Epoch 51 iteration 0121/0187: training loss 0.716 Epoch 51 iteration 0122/0187: training loss 0.716 Epoch 51 iteration 0123/0187: training loss 0.717 Epoch 51 iteration 0124/0187: training loss 0.716 Epoch 51 iteration 0125/0187: training loss 0.716 Epoch 51 iteration 0126/0187: training loss 0.716 Epoch 51 iteration 0127/0187: training loss 0.716 Epoch 51 iteration 0128/0187: training loss 0.716 Epoch 51 iteration 0129/0187: training loss 0.716 Epoch 51 iteration 0130/0187: training loss 0.715 Epoch 51 iteration 0131/0187: training loss 0.715 Epoch 51 iteration 0132/0187: training loss 0.714 Epoch 51 iteration 0133/0187: training loss 0.715 Epoch 51 iteration 0134/0187: training loss 0.716 Epoch 51 iteration 0135/0187: training loss 0.715 Epoch 51 iteration 0136/0187: training loss 0.716 Epoch 51 iteration 0137/0187: training loss 0.716 Epoch 51 iteration 0138/0187: training loss 0.716 Epoch 51 iteration 0139/0187: training loss 0.717 Epoch 51 iteration 0140/0187: training loss 0.717 Epoch 51 iteration 0141/0187: training loss 0.716 Epoch 51 iteration 0142/0187: training loss 0.715 Epoch 51 iteration 0143/0187: training loss 0.715 Epoch 51 iteration 0144/0187: training loss 0.716 Epoch 51 iteration 0145/0187: training loss 0.716 Epoch 51 iteration 0146/0187: training loss 0.716 Epoch 51 iteration 0147/0187: training loss 0.715 Epoch 51 iteration 0148/0187: training loss 0.716 Epoch 51 iteration 0149/0187: training loss 0.715 Epoch 51 iteration 0150/0187: training loss 0.715 Epoch 51 iteration 0151/0187: training loss 0.714 Epoch 51 iteration 0152/0187: training loss 0.714 Epoch 51 iteration 0153/0187: training loss 0.714 Epoch 51 iteration 0154/0187: training loss 0.714 Epoch 51 iteration 0155/0187: training loss 0.712 Epoch 51 iteration 0156/0187: training loss 0.712 Epoch 51 iteration 0157/0187: training loss 0.713 Epoch 51 iteration 0158/0187: training loss 0.713 Epoch 51 iteration 0159/0187: training loss 0.713 Epoch 51 iteration 0160/0187: training loss 0.712 Epoch 51 iteration 0161/0187: training loss 0.712 Epoch 51 iteration 0162/0187: training loss 0.712 Epoch 51 iteration 0163/0187: training loss 0.712 Epoch 51 iteration 0164/0187: training loss 0.712 Epoch 51 iteration 0165/0187: training loss 0.713 Epoch 51 iteration 0166/0187: training loss 0.712 Epoch 51 iteration 0167/0187: training loss 0.712 Epoch 51 iteration 0168/0187: training loss 0.712 Epoch 51 iteration 0169/0187: training loss 0.711 Epoch 51 iteration 0170/0187: training loss 0.712 Epoch 51 iteration 0171/0187: training loss 0.712 Epoch 51 iteration 0172/0187: training loss 0.712 Epoch 51 iteration 0173/0187: training loss 0.712 Epoch 51 iteration 0174/0187: training loss 0.713 Epoch 51 iteration 0175/0187: training loss 0.712 Epoch 51 iteration 0176/0187: training loss 0.712 Epoch 51 iteration 0177/0187: training loss 0.712 Epoch 51 iteration 0178/0187: training loss 0.712 Epoch 51 iteration 0179/0187: training loss 0.712 Epoch 51 iteration 0180/0187: training loss 0.712 Epoch 51 iteration 0181/0187: training loss 0.712 Epoch 51 iteration 0182/0187: training loss 0.712 Epoch 51 iteration 0183/0187: training loss 0.713 Epoch 51 iteration 0184/0187: training loss 0.713 Epoch 51 iteration 0185/0187: training loss 0.712 Epoch 51 iteration 0186/0187: training loss 0.712 Epoch 51 iteration 0187/0187: training loss 0.712 Epoch 51 validation pixAcc: 0.873, mIoU: 0.387 Epoch 52 iteration 0001/0187: training loss 0.701 Epoch 52 iteration 0002/0187: training loss 0.675 Epoch 52 iteration 0003/0187: training loss 0.665 Epoch 52 iteration 0004/0187: training loss 0.678 Epoch 52 iteration 0005/0187: training loss 0.679 Epoch 52 iteration 0006/0187: training loss 0.713 Epoch 52 iteration 0007/0187: training loss 0.707 Epoch 52 iteration 0008/0187: training loss 0.696 Epoch 52 iteration 0009/0187: training loss 0.710 Epoch 52 iteration 0010/0187: training loss 0.726 Epoch 52 iteration 0011/0187: training loss 0.719 Epoch 52 iteration 0012/0187: training loss 0.726 Epoch 52 iteration 0013/0187: training loss 0.723 Epoch 52 iteration 0014/0187: training loss 0.736 Epoch 52 iteration 0015/0187: training loss 0.754 Epoch 52 iteration 0016/0187: training loss 0.757 Epoch 52 iteration 0017/0187: training loss 0.751 Epoch 52 iteration 0018/0187: training loss 0.743 Epoch 52 iteration 0019/0187: training loss 0.735 Epoch 52 iteration 0020/0187: training loss 0.735 Epoch 52 iteration 0021/0187: training loss 0.730 Epoch 52 iteration 0022/0187: training loss 0.721 Epoch 52 iteration 0023/0187: training loss 0.728 Epoch 52 iteration 0024/0187: training loss 0.726 Epoch 52 iteration 0025/0187: training loss 0.728 Epoch 52 iteration 0026/0187: training loss 0.725 Epoch 52 iteration 0027/0187: training loss 0.734 Epoch 52 iteration 0028/0187: training loss 0.735 Epoch 52 iteration 0029/0187: training loss 0.735 Epoch 52 iteration 0030/0187: training loss 0.737 Epoch 52 iteration 0031/0187: training loss 0.736 Epoch 52 iteration 0032/0187: training loss 0.733 Epoch 52 iteration 0033/0187: training loss 0.738 Epoch 52 iteration 0034/0187: training loss 0.738 Epoch 52 iteration 0035/0187: training loss 0.736 Epoch 52 iteration 0036/0187: training loss 0.735 Epoch 52 iteration 0037/0187: training loss 0.734 Epoch 52 iteration 0038/0187: training loss 0.737 Epoch 52 iteration 0039/0187: training loss 0.740 Epoch 52 iteration 0040/0187: training loss 0.736 Epoch 52 iteration 0041/0187: training loss 0.734 Epoch 52 iteration 0042/0187: training loss 0.733 Epoch 52 iteration 0043/0187: training loss 0.737 Epoch 52 iteration 0044/0187: training loss 0.732 Epoch 52 iteration 0045/0187: training loss 0.735 Epoch 52 iteration 0046/0187: training loss 0.734 Epoch 52 iteration 0047/0187: training loss 0.732 Epoch 52 iteration 0048/0187: training loss 0.732 Epoch 52 iteration 0049/0187: training loss 0.733 Epoch 52 iteration 0050/0187: training loss 0.733 Epoch 52 iteration 0051/0187: training loss 0.733 Epoch 52 iteration 0052/0187: training loss 0.734 Epoch 52 iteration 0053/0187: training loss 0.733 Epoch 52 iteration 0054/0187: training loss 0.737 Epoch 52 iteration 0055/0187: training loss 0.736 Epoch 52 iteration 0056/0187: training loss 0.737 Epoch 52 iteration 0057/0187: training loss 0.734 Epoch 52 iteration 0058/0187: training loss 0.731 Epoch 52 iteration 0059/0187: training loss 0.732 Epoch 52 iteration 0060/0187: training loss 0.736 Epoch 52 iteration 0061/0187: training loss 0.733 Epoch 52 iteration 0062/0187: training loss 0.734 Epoch 52 iteration 0063/0187: training loss 0.733 Epoch 52 iteration 0064/0187: training loss 0.731 Epoch 52 iteration 0065/0187: training loss 0.729 Epoch 52 iteration 0066/0187: training loss 0.734 Epoch 52 iteration 0067/0187: training loss 0.733 Epoch 52 iteration 0068/0187: training loss 0.734 Epoch 52 iteration 0069/0187: training loss 0.733 Epoch 52 iteration 0070/0187: training loss 0.736 Epoch 52 iteration 0071/0187: training loss 0.736 Epoch 52 iteration 0072/0187: training loss 0.736 Epoch 52 iteration 0073/0187: training loss 0.734 Epoch 52 iteration 0074/0187: training loss 0.734 Epoch 52 iteration 0075/0187: training loss 0.734 Epoch 52 iteration 0076/0187: training loss 0.731 Epoch 52 iteration 0077/0187: training loss 0.731 Epoch 52 iteration 0078/0187: training loss 0.728 Epoch 52 iteration 0079/0187: training loss 0.728 Epoch 52 iteration 0080/0187: training loss 0.726 Epoch 52 iteration 0081/0187: training loss 0.726 Epoch 52 iteration 0082/0187: training loss 0.724 Epoch 52 iteration 0083/0187: training loss 0.724 Epoch 52 iteration 0084/0187: training loss 0.723 Epoch 52 iteration 0085/0187: training loss 0.726 Epoch 52 iteration 0086/0187: training loss 0.725 Epoch 52 iteration 0087/0187: training loss 0.728 Epoch 52 iteration 0088/0187: training loss 0.729 Epoch 52 iteration 0089/0187: training loss 0.732 Epoch 52 iteration 0090/0187: training loss 0.732 Epoch 52 iteration 0091/0188: training loss 0.733 Epoch 52 iteration 0092/0188: training loss 0.733 Epoch 52 iteration 0093/0188: training loss 0.734 Epoch 52 iteration 0094/0188: training loss 0.735 Epoch 52 iteration 0095/0188: training loss 0.736 Epoch 52 iteration 0096/0188: training loss 0.736 Epoch 52 iteration 0097/0188: training loss 0.738 Epoch 52 iteration 0098/0188: training loss 0.739 Epoch 52 iteration 0099/0188: training loss 0.738 Epoch 52 iteration 0100/0188: training loss 0.737 Epoch 52 iteration 0101/0188: training loss 0.735 Epoch 52 iteration 0102/0188: training loss 0.735 Epoch 52 iteration 0103/0188: training loss 0.736 Epoch 52 iteration 0104/0188: training loss 0.735 Epoch 52 iteration 0105/0188: training loss 0.735 Epoch 52 iteration 0106/0188: training loss 0.733 Epoch 52 iteration 0107/0188: training loss 0.733 Epoch 52 iteration 0108/0188: training loss 0.733 Epoch 52 iteration 0109/0188: training loss 0.735 Epoch 52 iteration 0110/0188: training loss 0.735 Epoch 52 iteration 0111/0188: training loss 0.734 Epoch 52 iteration 0112/0188: training loss 0.735 Epoch 52 iteration 0113/0188: training loss 0.734 Epoch 52 iteration 0114/0188: training loss 0.734 Epoch 52 iteration 0115/0188: training loss 0.736 Epoch 52 iteration 0116/0188: training loss 0.734 Epoch 52 iteration 0117/0188: training loss 0.733 Epoch 52 iteration 0118/0188: training loss 0.733 Epoch 52 iteration 0119/0188: training loss 0.732 Epoch 52 iteration 0120/0188: training loss 0.733 Epoch 52 iteration 0121/0188: training loss 0.733 Epoch 52 iteration 0122/0188: training loss 0.732 Epoch 52 iteration 0123/0188: training loss 0.731 Epoch 52 iteration 0124/0188: training loss 0.733 Epoch 52 iteration 0125/0188: training loss 0.732 Epoch 52 iteration 0126/0188: training loss 0.731 Epoch 52 iteration 0127/0188: training loss 0.730 Epoch 52 iteration 0128/0188: training loss 0.729 Epoch 52 iteration 0129/0188: training loss 0.728 Epoch 52 iteration 0130/0188: training loss 0.728 Epoch 52 iteration 0131/0188: training loss 0.728 Epoch 52 iteration 0132/0188: training loss 0.728 Epoch 52 iteration 0133/0188: training loss 0.727 Epoch 52 iteration 0134/0188: training loss 0.727 Epoch 52 iteration 0135/0188: training loss 0.727 Epoch 52 iteration 0136/0188: training loss 0.728 Epoch 52 iteration 0137/0188: training loss 0.728 Epoch 52 iteration 0138/0188: training loss 0.727 Epoch 52 iteration 0139/0188: training loss 0.728 Epoch 52 iteration 0140/0188: training loss 0.730 Epoch 52 iteration 0141/0188: training loss 0.731 Epoch 52 iteration 0142/0188: training loss 0.731 Epoch 52 iteration 0143/0188: training loss 0.732 Epoch 52 iteration 0144/0188: training loss 0.733 Epoch 52 iteration 0145/0188: training loss 0.733 Epoch 52 iteration 0146/0188: training loss 0.734 Epoch 52 iteration 0147/0188: training loss 0.734 Epoch 52 iteration 0148/0188: training loss 0.734 Epoch 52 iteration 0149/0188: training loss 0.734 Epoch 52 iteration 0150/0188: training loss 0.734 Epoch 52 iteration 0151/0188: training loss 0.734 Epoch 52 iteration 0152/0188: training loss 0.734 Epoch 52 iteration 0153/0188: training loss 0.734 Epoch 52 iteration 0154/0188: training loss 0.735 Epoch 52 iteration 0155/0188: training loss 0.735 Epoch 52 iteration 0156/0188: training loss 0.735 Epoch 52 iteration 0157/0188: training loss 0.734 Epoch 52 iteration 0158/0188: training loss 0.733 Epoch 52 iteration 0159/0188: training loss 0.732 Epoch 52 iteration 0160/0188: training loss 0.733 Epoch 52 iteration 0161/0188: training loss 0.733 Epoch 52 iteration 0162/0188: training loss 0.733 Epoch 52 iteration 0163/0188: training loss 0.735 Epoch 52 iteration 0164/0188: training loss 0.734 Epoch 52 iteration 0165/0188: training loss 0.735 Epoch 52 iteration 0166/0188: training loss 0.736 Epoch 52 iteration 0167/0188: training loss 0.736 Epoch 52 iteration 0168/0188: training loss 0.736 Epoch 52 iteration 0169/0188: training loss 0.736 Epoch 52 iteration 0170/0188: training loss 0.735 Epoch 52 iteration 0171/0188: training loss 0.737 Epoch 52 iteration 0172/0188: training loss 0.737 Epoch 52 iteration 0173/0188: training loss 0.736 Epoch 52 iteration 0174/0188: training loss 0.736 Epoch 52 iteration 0175/0188: training loss 0.735 Epoch 52 iteration 0176/0188: training loss 0.734 Epoch 52 iteration 0177/0188: training loss 0.737 Epoch 52 iteration 0178/0188: training loss 0.738 Epoch 52 iteration 0179/0188: training loss 0.737 Epoch 52 iteration 0180/0188: training loss 0.737 Epoch 52 iteration 0181/0188: training loss 0.736 Epoch 52 iteration 0182/0188: training loss 0.735 Epoch 52 iteration 0183/0188: training loss 0.736 Epoch 52 iteration 0184/0188: training loss 0.736 Epoch 52 iteration 0185/0188: training loss 0.736 Epoch 52 iteration 0186/0188: training loss 0.736 Epoch 52 validation pixAcc: 0.873, mIoU: 0.384 Epoch 53 iteration 0001/0187: training loss 0.737 Epoch 53 iteration 0002/0187: training loss 0.736 Epoch 53 iteration 0003/0187: training loss 0.704 Epoch 53 iteration 0004/0187: training loss 0.747 Epoch 53 iteration 0005/0187: training loss 0.736 Epoch 53 iteration 0006/0187: training loss 0.734 Epoch 53 iteration 0007/0187: training loss 0.720 Epoch 53 iteration 0008/0187: training loss 0.727 Epoch 53 iteration 0009/0187: training loss 0.719 Epoch 53 iteration 0010/0187: training loss 0.723 Epoch 53 iteration 0011/0187: training loss 0.729 Epoch 53 iteration 0012/0187: training loss 0.728 Epoch 53 iteration 0013/0187: training loss 0.722 Epoch 53 iteration 0014/0187: training loss 0.715 Epoch 53 iteration 0015/0187: training loss 0.721 Epoch 53 iteration 0016/0187: training loss 0.715 Epoch 53 iteration 0017/0187: training loss 0.716 Epoch 53 iteration 0018/0187: training loss 0.718 Epoch 53 iteration 0019/0187: training loss 0.718 Epoch 53 iteration 0020/0187: training loss 0.719 Epoch 53 iteration 0021/0187: training loss 0.725 Epoch 53 iteration 0022/0187: training loss 0.731 Epoch 53 iteration 0023/0187: training loss 0.731 Epoch 53 iteration 0024/0187: training loss 0.733 Epoch 53 iteration 0025/0187: training loss 0.734 Epoch 53 iteration 0026/0187: training loss 0.733 Epoch 53 iteration 0027/0187: training loss 0.727 Epoch 53 iteration 0028/0187: training loss 0.730 Epoch 53 iteration 0029/0187: training loss 0.735 Epoch 53 iteration 0030/0187: training loss 0.733 Epoch 53 iteration 0031/0187: training loss 0.732 Epoch 53 iteration 0032/0187: training loss 0.734 Epoch 53 iteration 0033/0187: training loss 0.731 Epoch 53 iteration 0034/0187: training loss 0.733 Epoch 53 iteration 0035/0187: training loss 0.732 Epoch 53 iteration 0036/0187: training loss 0.734 Epoch 53 iteration 0037/0187: training loss 0.733 Epoch 53 iteration 0038/0187: training loss 0.731 Epoch 53 iteration 0039/0187: training loss 0.731 Epoch 53 iteration 0040/0187: training loss 0.733 Epoch 53 iteration 0041/0187: training loss 0.733 Epoch 53 iteration 0042/0187: training loss 0.729 Epoch 53 iteration 0043/0187: training loss 0.730 Epoch 53 iteration 0044/0187: training loss 0.729 Epoch 53 iteration 0045/0187: training loss 0.726 Epoch 53 iteration 0046/0187: training loss 0.725 Epoch 53 iteration 0047/0187: training loss 0.730 Epoch 53 iteration 0048/0187: training loss 0.726 Epoch 53 iteration 0049/0187: training loss 0.726 Epoch 53 iteration 0050/0187: training loss 0.726 Epoch 53 iteration 0051/0187: training loss 0.728 Epoch 53 iteration 0052/0187: training loss 0.725 Epoch 53 iteration 0053/0187: training loss 0.730 Epoch 53 iteration 0054/0187: training loss 0.730 Epoch 53 iteration 0055/0187: training loss 0.730 Epoch 53 iteration 0056/0187: training loss 0.731 Epoch 53 iteration 0057/0187: training loss 0.730 Epoch 53 iteration 0058/0187: training loss 0.729 Epoch 53 iteration 0059/0187: training loss 0.731 Epoch 53 iteration 0060/0187: training loss 0.731 Epoch 53 iteration 0061/0187: training loss 0.732 Epoch 53 iteration 0062/0187: training loss 0.731 Epoch 53 iteration 0063/0187: training loss 0.728 Epoch 53 iteration 0064/0187: training loss 0.729 Epoch 53 iteration 0065/0187: training loss 0.731 Epoch 53 iteration 0066/0187: training loss 0.731 Epoch 53 iteration 0067/0187: training loss 0.731 Epoch 53 iteration 0068/0187: training loss 0.728 Epoch 53 iteration 0069/0187: training loss 0.730 Epoch 53 iteration 0070/0187: training loss 0.727 Epoch 53 iteration 0071/0187: training loss 0.726 Epoch 53 iteration 0072/0187: training loss 0.726 Epoch 53 iteration 0073/0187: training loss 0.725 Epoch 53 iteration 0074/0187: training loss 0.723 Epoch 53 iteration 0075/0187: training loss 0.722 Epoch 53 iteration 0076/0187: training loss 0.721 Epoch 53 iteration 0077/0187: training loss 0.723 Epoch 53 iteration 0078/0187: training loss 0.721 Epoch 53 iteration 0079/0187: training loss 0.724 Epoch 53 iteration 0080/0187: training loss 0.724 Epoch 53 iteration 0081/0187: training loss 0.726 Epoch 53 iteration 0082/0187: training loss 0.727 Epoch 53 iteration 0083/0187: training loss 0.726 Epoch 53 iteration 0084/0187: training loss 0.725 Epoch 53 iteration 0085/0187: training loss 0.723 Epoch 53 iteration 0086/0187: training loss 0.726 Epoch 53 iteration 0087/0187: training loss 0.726 Epoch 53 iteration 0088/0187: training loss 0.726 Epoch 53 iteration 0089/0187: training loss 0.725 Epoch 53 iteration 0090/0187: training loss 0.723 Epoch 53 iteration 0091/0187: training loss 0.723 Epoch 53 iteration 0092/0187: training loss 0.723 Epoch 53 iteration 0093/0187: training loss 0.723 Epoch 53 iteration 0094/0187: training loss 0.724 Epoch 53 iteration 0095/0187: training loss 0.722 Epoch 53 iteration 0096/0187: training loss 0.721 Epoch 53 iteration 0097/0187: training loss 0.721 Epoch 53 iteration 0098/0187: training loss 0.722 Epoch 53 iteration 0099/0187: training loss 0.722 Epoch 53 iteration 0100/0187: training loss 0.722 Epoch 53 iteration 0101/0187: training loss 0.722 Epoch 53 iteration 0102/0187: training loss 0.721 Epoch 53 iteration 0103/0187: training loss 0.720 Epoch 53 iteration 0104/0187: training loss 0.721 Epoch 53 iteration 0105/0187: training loss 0.720 Epoch 53 iteration 0106/0187: training loss 0.718 Epoch 53 iteration 0107/0187: training loss 0.720 Epoch 53 iteration 0108/0187: training loss 0.720 Epoch 53 iteration 0109/0187: training loss 0.719 Epoch 53 iteration 0110/0187: training loss 0.720 Epoch 53 iteration 0111/0187: training loss 0.719 Epoch 53 iteration 0112/0187: training loss 0.719 Epoch 53 iteration 0113/0187: training loss 0.719 Epoch 53 iteration 0114/0187: training loss 0.720 Epoch 53 iteration 0115/0187: training loss 0.719 Epoch 53 iteration 0116/0187: training loss 0.720 Epoch 53 iteration 0117/0187: training loss 0.719 Epoch 53 iteration 0118/0187: training loss 0.719 Epoch 53 iteration 0119/0187: training loss 0.722 Epoch 53 iteration 0120/0187: training loss 0.722 Epoch 53 iteration 0121/0187: training loss 0.721 Epoch 53 iteration 0122/0187: training loss 0.721 Epoch 53 iteration 0123/0187: training loss 0.721 Epoch 53 iteration 0124/0187: training loss 0.719 Epoch 53 iteration 0125/0187: training loss 0.719 Epoch 53 iteration 0126/0187: training loss 0.719 Epoch 53 iteration 0127/0187: training loss 0.721 Epoch 53 iteration 0128/0187: training loss 0.720 Epoch 53 iteration 0129/0187: training loss 0.722 Epoch 53 iteration 0130/0187: training loss 0.722 Epoch 53 iteration 0131/0187: training loss 0.721 Epoch 53 iteration 0132/0187: training loss 0.722 Epoch 53 iteration 0133/0187: training loss 0.721 Epoch 53 iteration 0134/0187: training loss 0.721 Epoch 53 iteration 0135/0187: training loss 0.722 Epoch 53 iteration 0136/0187: training loss 0.723 Epoch 53 iteration 0137/0187: training loss 0.723 Epoch 53 iteration 0138/0187: training loss 0.723 Epoch 53 iteration 0139/0187: training loss 0.721 Epoch 53 iteration 0140/0187: training loss 0.720 Epoch 53 iteration 0141/0187: training loss 0.720 Epoch 53 iteration 0142/0187: training loss 0.721 Epoch 53 iteration 0143/0187: training loss 0.720 Epoch 53 iteration 0144/0187: training loss 0.719 Epoch 53 iteration 0145/0187: training loss 0.719 Epoch 53 iteration 0146/0187: training loss 0.719 Epoch 53 iteration 0147/0187: training loss 0.719 Epoch 53 iteration 0148/0187: training loss 0.721 Epoch 53 iteration 0149/0187: training loss 0.720 Epoch 53 iteration 0150/0187: training loss 0.722 Epoch 53 iteration 0151/0187: training loss 0.721 Epoch 53 iteration 0152/0187: training loss 0.722 Epoch 53 iteration 0153/0187: training loss 0.724 Epoch 53 iteration 0154/0187: training loss 0.724 Epoch 53 iteration 0155/0187: training loss 0.724 Epoch 53 iteration 0156/0187: training loss 0.723 Epoch 53 iteration 0157/0187: training loss 0.723 Epoch 53 iteration 0158/0187: training loss 0.725 Epoch 53 iteration 0159/0187: training loss 0.725 Epoch 53 iteration 0160/0187: training loss 0.725 Epoch 53 iteration 0161/0187: training loss 0.725 Epoch 53 iteration 0162/0187: training loss 0.725 Epoch 53 iteration 0163/0187: training loss 0.726 Epoch 53 iteration 0164/0187: training loss 0.726 Epoch 53 iteration 0165/0187: training loss 0.726 Epoch 53 iteration 0166/0187: training loss 0.726 Epoch 53 iteration 0167/0187: training loss 0.725 Epoch 53 iteration 0168/0187: training loss 0.725 Epoch 53 iteration 0169/0187: training loss 0.725 Epoch 53 iteration 0170/0187: training loss 0.724 Epoch 53 iteration 0171/0187: training loss 0.724 Epoch 53 iteration 0172/0187: training loss 0.724 Epoch 53 iteration 0173/0187: training loss 0.725 Epoch 53 iteration 0174/0187: training loss 0.724 Epoch 53 iteration 0175/0187: training loss 0.724 Epoch 53 iteration 0176/0187: training loss 0.724 Epoch 53 iteration 0177/0187: training loss 0.723 Epoch 53 iteration 0178/0187: training loss 0.723 Epoch 53 iteration 0179/0187: training loss 0.724 Epoch 53 iteration 0180/0187: training loss 0.723 Epoch 53 iteration 0181/0187: training loss 0.724 Epoch 53 iteration 0182/0187: training loss 0.724 Epoch 53 iteration 0183/0187: training loss 0.725 Epoch 53 iteration 0184/0187: training loss 0.725 Epoch 53 iteration 0185/0187: training loss 0.725 Epoch 53 iteration 0186/0187: training loss 0.725 Epoch 53 iteration 0187/0187: training loss 0.725 Epoch 53 validation pixAcc: 0.874, mIoU: 0.388 Epoch 54 iteration 0001/0187: training loss 0.704 Epoch 54 iteration 0002/0187: training loss 0.668 Epoch 54 iteration 0003/0187: training loss 0.656 Epoch 54 iteration 0004/0187: training loss 0.678 Epoch 54 iteration 0005/0187: training loss 0.692 Epoch 54 iteration 0006/0187: training loss 0.686 Epoch 54 iteration 0007/0187: training loss 0.693 Epoch 54 iteration 0008/0187: training loss 0.692 Epoch 54 iteration 0009/0187: training loss 0.682 Epoch 54 iteration 0010/0187: training loss 0.664 Epoch 54 iteration 0011/0187: training loss 0.653 Epoch 54 iteration 0012/0187: training loss 0.643 Epoch 54 iteration 0013/0187: training loss 0.649 Epoch 54 iteration 0014/0187: training loss 0.662 Epoch 54 iteration 0015/0187: training loss 0.657 Epoch 54 iteration 0016/0187: training loss 0.670 Epoch 54 iteration 0017/0187: training loss 0.666 Epoch 54 iteration 0018/0187: training loss 0.670 Epoch 54 iteration 0019/0187: training loss 0.670 Epoch 54 iteration 0020/0187: training loss 0.675 Epoch 54 iteration 0021/0187: training loss 0.670 Epoch 54 iteration 0022/0187: training loss 0.669 Epoch 54 iteration 0023/0187: training loss 0.679 Epoch 54 iteration 0024/0187: training loss 0.689 Epoch 54 iteration 0025/0187: training loss 0.688 Epoch 54 iteration 0026/0187: training loss 0.691 Epoch 54 iteration 0027/0187: training loss 0.691 Epoch 54 iteration 0028/0187: training loss 0.687 Epoch 54 iteration 0029/0187: training loss 0.692 Epoch 54 iteration 0030/0187: training loss 0.694 Epoch 54 iteration 0031/0187: training loss 0.694 Epoch 54 iteration 0032/0187: training loss 0.696 Epoch 54 iteration 0033/0187: training loss 0.701 Epoch 54 iteration 0034/0187: training loss 0.701 Epoch 54 iteration 0035/0187: training loss 0.705 Epoch 54 iteration 0036/0187: training loss 0.703 Epoch 54 iteration 0037/0187: training loss 0.705 Epoch 54 iteration 0038/0187: training loss 0.704 Epoch 54 iteration 0039/0187: training loss 0.703 Epoch 54 iteration 0040/0187: training loss 0.709 Epoch 54 iteration 0041/0187: training loss 0.709 Epoch 54 iteration 0042/0187: training loss 0.710 Epoch 54 iteration 0043/0187: training loss 0.709 Epoch 54 iteration 0044/0187: training loss 0.712 Epoch 54 iteration 0045/0187: training loss 0.709 Epoch 54 iteration 0046/0187: training loss 0.710 Epoch 54 iteration 0047/0187: training loss 0.715 Epoch 54 iteration 0048/0187: training loss 0.716 Epoch 54 iteration 0049/0187: training loss 0.720 Epoch 54 iteration 0050/0187: training loss 0.723 Epoch 54 iteration 0051/0187: training loss 0.723 Epoch 54 iteration 0052/0187: training loss 0.722 Epoch 54 iteration 0053/0187: training loss 0.722 Epoch 54 iteration 0054/0187: training loss 0.727 Epoch 54 iteration 0055/0187: training loss 0.725 Epoch 54 iteration 0056/0187: training loss 0.722 Epoch 54 iteration 0057/0187: training loss 0.723 Epoch 54 iteration 0058/0187: training loss 0.724 Epoch 54 iteration 0059/0187: training loss 0.725 Epoch 54 iteration 0060/0187: training loss 0.725 Epoch 54 iteration 0061/0187: training loss 0.724 Epoch 54 iteration 0062/0187: training loss 0.723 Epoch 54 iteration 0063/0187: training loss 0.721 Epoch 54 iteration 0064/0187: training loss 0.721 Epoch 54 iteration 0065/0187: training loss 0.724 Epoch 54 iteration 0066/0187: training loss 0.727 Epoch 54 iteration 0067/0187: training loss 0.725 Epoch 54 iteration 0068/0187: training loss 0.727 Epoch 54 iteration 0069/0187: training loss 0.726 Epoch 54 iteration 0070/0187: training loss 0.726 Epoch 54 iteration 0071/0187: training loss 0.728 Epoch 54 iteration 0072/0187: training loss 0.732 Epoch 54 iteration 0073/0187: training loss 0.730 Epoch 54 iteration 0074/0187: training loss 0.730 Epoch 54 iteration 0075/0187: training loss 0.732 Epoch 54 iteration 0076/0187: training loss 0.733 Epoch 54 iteration 0077/0187: training loss 0.733 Epoch 54 iteration 0078/0187: training loss 0.731 Epoch 54 iteration 0079/0187: training loss 0.730 Epoch 54 iteration 0080/0187: training loss 0.729 Epoch 54 iteration 0081/0187: training loss 0.728 Epoch 54 iteration 0082/0187: training loss 0.727 Epoch 54 iteration 0083/0187: training loss 0.727 Epoch 54 iteration 0084/0187: training loss 0.725 Epoch 54 iteration 0085/0187: training loss 0.723 Epoch 54 iteration 0086/0187: training loss 0.721 Epoch 54 iteration 0087/0187: training loss 0.720 Epoch 54 iteration 0088/0187: training loss 0.721 Epoch 54 iteration 0089/0187: training loss 0.720 Epoch 54 iteration 0090/0187: training loss 0.718 Epoch 54 iteration 0091/0188: training loss 0.718 Epoch 54 iteration 0092/0188: training loss 0.718 Epoch 54 iteration 0093/0188: training loss 0.717 Epoch 54 iteration 0094/0188: training loss 0.717 Epoch 54 iteration 0095/0188: training loss 0.716 Epoch 54 iteration 0096/0188: training loss 0.717 Epoch 54 iteration 0097/0188: training loss 0.715 Epoch 54 iteration 0098/0188: training loss 0.715 Epoch 54 iteration 0099/0188: training loss 0.716 Epoch 54 iteration 0100/0188: training loss 0.715 Epoch 54 iteration 0101/0188: training loss 0.717 Epoch 54 iteration 0102/0188: training loss 0.715 Epoch 54 iteration 0103/0188: training loss 0.715 Epoch 54 iteration 0104/0188: training loss 0.715 Epoch 54 iteration 0105/0188: training loss 0.717 Epoch 54 iteration 0106/0188: training loss 0.715 Epoch 54 iteration 0107/0188: training loss 0.719 Epoch 54 iteration 0108/0188: training loss 0.717 Epoch 54 iteration 0109/0188: training loss 0.719 Epoch 54 iteration 0110/0188: training loss 0.717 Epoch 54 iteration 0111/0188: training loss 0.717 Epoch 54 iteration 0112/0188: training loss 0.716 Epoch 54 iteration 0113/0188: training loss 0.715 Epoch 54 iteration 0114/0188: training loss 0.717 Epoch 54 iteration 0115/0188: training loss 0.716 Epoch 54 iteration 0116/0188: training loss 0.715 Epoch 54 iteration 0117/0188: training loss 0.714 Epoch 54 iteration 0118/0188: training loss 0.716 Epoch 54 iteration 0119/0188: training loss 0.716 Epoch 54 iteration 0120/0188: training loss 0.716 Epoch 54 iteration 0121/0188: training loss 0.715 Epoch 54 iteration 0122/0188: training loss 0.715 Epoch 54 iteration 0123/0188: training loss 0.716 Epoch 54 iteration 0124/0188: training loss 0.714 Epoch 54 iteration 0125/0188: training loss 0.714 Epoch 54 iteration 0126/0188: training loss 0.713 Epoch 54 iteration 0127/0188: training loss 0.713 Epoch 54 iteration 0128/0188: training loss 0.713 Epoch 54 iteration 0129/0188: training loss 0.713 Epoch 54 iteration 0130/0188: training loss 0.712 Epoch 54 iteration 0131/0188: training loss 0.711 Epoch 54 iteration 0132/0188: training loss 0.711 Epoch 54 iteration 0133/0188: training loss 0.712 Epoch 54 iteration 0134/0188: training loss 0.713 Epoch 54 iteration 0135/0188: training loss 0.713 Epoch 54 iteration 0136/0188: training loss 0.714 Epoch 54 iteration 0137/0188: training loss 0.713 Epoch 54 iteration 0138/0188: training loss 0.713 Epoch 54 iteration 0139/0188: training loss 0.711 Epoch 54 iteration 0140/0188: training loss 0.711 Epoch 54 iteration 0141/0188: training loss 0.712 Epoch 54 iteration 0142/0188: training loss 0.711 Epoch 54 iteration 0143/0188: training loss 0.710 Epoch 54 iteration 0144/0188: training loss 0.710 Epoch 54 iteration 0145/0188: training loss 0.710 Epoch 54 iteration 0146/0188: training loss 0.711 Epoch 54 iteration 0147/0188: training loss 0.711 Epoch 54 iteration 0148/0188: training loss 0.709 Epoch 54 iteration 0149/0188: training loss 0.709 Epoch 54 iteration 0150/0188: training loss 0.708 Epoch 54 iteration 0151/0188: training loss 0.708 Epoch 54 iteration 0152/0188: training loss 0.708 Epoch 54 iteration 0153/0188: training loss 0.708 Epoch 54 iteration 0154/0188: training loss 0.709 Epoch 54 iteration 0155/0188: training loss 0.708 Epoch 54 iteration 0156/0188: training loss 0.709 Epoch 54 iteration 0157/0188: training loss 0.710 Epoch 54 iteration 0158/0188: training loss 0.710 Epoch 54 iteration 0159/0188: training loss 0.710 Epoch 54 iteration 0160/0188: training loss 0.709 Epoch 54 iteration 0161/0188: training loss 0.710 Epoch 54 iteration 0162/0188: training loss 0.710 Epoch 54 iteration 0163/0188: training loss 0.711 Epoch 54 iteration 0164/0188: training loss 0.712 Epoch 54 iteration 0165/0188: training loss 0.712 Epoch 54 iteration 0166/0188: training loss 0.713 Epoch 54 iteration 0167/0188: training loss 0.714 Epoch 54 iteration 0168/0188: training loss 0.714 Epoch 54 iteration 0169/0188: training loss 0.714 Epoch 54 iteration 0170/0188: training loss 0.715 Epoch 54 iteration 0171/0188: training loss 0.714 Epoch 54 iteration 0172/0188: training loss 0.715 Epoch 54 iteration 0173/0188: training loss 0.716 Epoch 54 iteration 0174/0188: training loss 0.716 Epoch 54 iteration 0175/0188: training loss 0.716 Epoch 54 iteration 0176/0188: training loss 0.716 Epoch 54 iteration 0177/0188: training loss 0.716 Epoch 54 iteration 0178/0188: training loss 0.716 Epoch 54 iteration 0179/0188: training loss 0.717 Epoch 54 iteration 0180/0188: training loss 0.716 Epoch 54 iteration 0181/0188: training loss 0.716 Epoch 54 iteration 0182/0188: training loss 0.716 Epoch 54 iteration 0183/0188: training loss 0.716 Epoch 54 iteration 0184/0188: training loss 0.716 Epoch 54 iteration 0185/0188: training loss 0.716 Epoch 54 iteration 0186/0188: training loss 0.716 Epoch 54 validation pixAcc: 0.874, mIoU: 0.382 Epoch 55 iteration 0001/0187: training loss 0.782 Epoch 55 iteration 0002/0187: training loss 0.793 Epoch 55 iteration 0003/0187: training loss 0.799 Epoch 55 iteration 0004/0187: training loss 0.783 Epoch 55 iteration 0005/0187: training loss 0.804 Epoch 55 iteration 0006/0187: training loss 0.799 Epoch 55 iteration 0007/0187: training loss 0.775 Epoch 55 iteration 0008/0187: training loss 0.788 Epoch 55 iteration 0009/0187: training loss 0.771 Epoch 55 iteration 0010/0187: training loss 0.789 Epoch 55 iteration 0011/0187: training loss 0.776 Epoch 55 iteration 0012/0187: training loss 0.773 Epoch 55 iteration 0013/0187: training loss 0.765 Epoch 55 iteration 0014/0187: training loss 0.760 Epoch 55 iteration 0015/0187: training loss 0.761 Epoch 55 iteration 0016/0187: training loss 0.754 Epoch 55 iteration 0017/0187: training loss 0.749 Epoch 55 iteration 0018/0187: training loss 0.754 Epoch 55 iteration 0019/0187: training loss 0.754 Epoch 55 iteration 0020/0187: training loss 0.769 Epoch 55 iteration 0021/0187: training loss 0.775 Epoch 55 iteration 0022/0187: training loss 0.766 Epoch 55 iteration 0023/0187: training loss 0.762 Epoch 55 iteration 0024/0187: training loss 0.758 Epoch 55 iteration 0025/0187: training loss 0.759 Epoch 55 iteration 0026/0187: training loss 0.752 Epoch 55 iteration 0027/0187: training loss 0.749 Epoch 55 iteration 0028/0187: training loss 0.742 Epoch 55 iteration 0029/0187: training loss 0.738 Epoch 55 iteration 0030/0187: training loss 0.740 Epoch 55 iteration 0031/0187: training loss 0.743 Epoch 55 iteration 0032/0187: training loss 0.745 Epoch 55 iteration 0033/0187: training loss 0.743 Epoch 55 iteration 0034/0187: training loss 0.746 Epoch 55 iteration 0035/0187: training loss 0.743 Epoch 55 iteration 0036/0187: training loss 0.747 Epoch 55 iteration 0037/0187: training loss 0.746 Epoch 55 iteration 0038/0187: training loss 0.744 Epoch 55 iteration 0039/0187: training loss 0.740 Epoch 55 iteration 0040/0187: training loss 0.738 Epoch 55 iteration 0041/0187: training loss 0.735 Epoch 55 iteration 0042/0187: training loss 0.734 Epoch 55 iteration 0043/0187: training loss 0.733 Epoch 55 iteration 0044/0187: training loss 0.734 Epoch 55 iteration 0045/0187: training loss 0.738 Epoch 55 iteration 0046/0187: training loss 0.735 Epoch 55 iteration 0047/0187: training loss 0.737 Epoch 55 iteration 0048/0187: training loss 0.736 Epoch 55 iteration 0049/0187: training loss 0.739 Epoch 55 iteration 0050/0187: training loss 0.735 Epoch 55 iteration 0051/0187: training loss 0.739 Epoch 55 iteration 0052/0187: training loss 0.737 Epoch 55 iteration 0053/0187: training loss 0.737 Epoch 55 iteration 0054/0187: training loss 0.737 Epoch 55 iteration 0055/0187: training loss 0.737 Epoch 55 iteration 0056/0187: training loss 0.739 Epoch 55 iteration 0057/0187: training loss 0.739 Epoch 55 iteration 0058/0187: training loss 0.738 Epoch 55 iteration 0059/0187: training loss 0.742 Epoch 55 iteration 0060/0187: training loss 0.739 Epoch 55 iteration 0061/0187: training loss 0.743 Epoch 55 iteration 0062/0187: training loss 0.739 Epoch 55 iteration 0063/0187: training loss 0.739 Epoch 55 iteration 0064/0187: training loss 0.740 Epoch 55 iteration 0065/0187: training loss 0.738 Epoch 55 iteration 0066/0187: training loss 0.738 Epoch 55 iteration 0067/0187: training loss 0.739 Epoch 55 iteration 0068/0187: training loss 0.740 Epoch 55 iteration 0069/0187: training loss 0.738 Epoch 55 iteration 0070/0187: training loss 0.738 Epoch 55 iteration 0071/0187: training loss 0.740 Epoch 55 iteration 0072/0187: training loss 0.740 Epoch 55 iteration 0073/0187: training loss 0.739 Epoch 55 iteration 0074/0187: training loss 0.739 Epoch 55 iteration 0075/0187: training loss 0.738 Epoch 55 iteration 0076/0187: training loss 0.738 Epoch 55 iteration 0077/0187: training loss 0.737 Epoch 55 iteration 0078/0187: training loss 0.736 Epoch 55 iteration 0079/0187: training loss 0.736 Epoch 55 iteration 0080/0187: training loss 0.737 Epoch 55 iteration 0081/0187: training loss 0.736 Epoch 55 iteration 0082/0187: training loss 0.737 Epoch 55 iteration 0083/0187: training loss 0.738 Epoch 55 iteration 0084/0187: training loss 0.739 Epoch 55 iteration 0085/0187: training loss 0.739 Epoch 55 iteration 0086/0187: training loss 0.740 Epoch 55 iteration 0087/0187: training loss 0.737 Epoch 55 iteration 0088/0187: training loss 0.736 Epoch 55 iteration 0089/0187: training loss 0.735 Epoch 55 iteration 0090/0187: training loss 0.734 Epoch 55 iteration 0091/0187: training loss 0.735 Epoch 55 iteration 0092/0187: training loss 0.734 Epoch 55 iteration 0093/0187: training loss 0.736 Epoch 55 iteration 0094/0187: training loss 0.738 Epoch 55 iteration 0095/0187: training loss 0.736 Epoch 55 iteration 0096/0187: training loss 0.737 Epoch 55 iteration 0097/0187: training loss 0.736 Epoch 55 iteration 0098/0187: training loss 0.736 Epoch 55 iteration 0099/0187: training loss 0.737 Epoch 55 iteration 0100/0187: training loss 0.737 Epoch 55 iteration 0101/0187: training loss 0.738 Epoch 55 iteration 0102/0187: training loss 0.737 Epoch 55 iteration 0103/0187: training loss 0.737 Epoch 55 iteration 0104/0187: training loss 0.736 Epoch 55 iteration 0105/0187: training loss 0.736 Epoch 55 iteration 0106/0187: training loss 0.736 Epoch 55 iteration 0107/0187: training loss 0.735 Epoch 55 iteration 0108/0187: training loss 0.735 Epoch 55 iteration 0109/0187: training loss 0.735 Epoch 55 iteration 0110/0187: training loss 0.733 Epoch 55 iteration 0111/0187: training loss 0.732 Epoch 55 iteration 0112/0187: training loss 0.731 Epoch 55 iteration 0113/0187: training loss 0.731 Epoch 55 iteration 0114/0187: training loss 0.730 Epoch 55 iteration 0115/0187: training loss 0.729 Epoch 55 iteration 0116/0187: training loss 0.731 Epoch 55 iteration 0117/0187: training loss 0.731 Epoch 55 iteration 0118/0187: training loss 0.732 Epoch 55 iteration 0119/0187: training loss 0.732 Epoch 55 iteration 0120/0187: training loss 0.732 Epoch 55 iteration 0121/0187: training loss 0.732 Epoch 55 iteration 0122/0187: training loss 0.731 Epoch 55 iteration 0123/0187: training loss 0.732 Epoch 55 iteration 0124/0187: training loss 0.731 Epoch 55 iteration 0125/0187: training loss 0.731 Epoch 55 iteration 0126/0187: training loss 0.732 Epoch 55 iteration 0127/0187: training loss 0.731 Epoch 55 iteration 0128/0187: training loss 0.730 Epoch 55 iteration 0129/0187: training loss 0.731 Epoch 55 iteration 0130/0187: training loss 0.730 Epoch 55 iteration 0131/0187: training loss 0.729 Epoch 55 iteration 0132/0187: training loss 0.730 Epoch 55 iteration 0133/0187: training loss 0.731 Epoch 55 iteration 0134/0187: training loss 0.731 Epoch 55 iteration 0135/0187: training loss 0.730 Epoch 55 iteration 0136/0187: training loss 0.730 Epoch 55 iteration 0137/0187: training loss 0.730 Epoch 55 iteration 0138/0187: training loss 0.731 Epoch 55 iteration 0139/0187: training loss 0.730 Epoch 55 iteration 0140/0187: training loss 0.729 Epoch 55 iteration 0141/0187: training loss 0.728 Epoch 55 iteration 0142/0187: training loss 0.728 Epoch 55 iteration 0143/0187: training loss 0.727 Epoch 55 iteration 0144/0187: training loss 0.727 Epoch 55 iteration 0145/0187: training loss 0.727 Epoch 55 iteration 0146/0187: training loss 0.727 Epoch 55 iteration 0147/0187: training loss 0.728 Epoch 55 iteration 0148/0187: training loss 0.729 Epoch 55 iteration 0149/0187: training loss 0.729 Epoch 55 iteration 0150/0187: training loss 0.729 Epoch 55 iteration 0151/0187: training loss 0.729 Epoch 55 iteration 0152/0187: training loss 0.727 Epoch 55 iteration 0153/0187: training loss 0.727 Epoch 55 iteration 0154/0187: training loss 0.725 Epoch 55 iteration 0155/0187: training loss 0.724 Epoch 55 iteration 0156/0187: training loss 0.724 Epoch 55 iteration 0157/0187: training loss 0.726 Epoch 55 iteration 0158/0187: training loss 0.726 Epoch 55 iteration 0159/0187: training loss 0.725 Epoch 55 iteration 0160/0187: training loss 0.727 Epoch 55 iteration 0161/0187: training loss 0.727 Epoch 55 iteration 0162/0187: training loss 0.727 Epoch 55 iteration 0163/0187: training loss 0.727 Epoch 55 iteration 0164/0187: training loss 0.725 Epoch 55 iteration 0165/0187: training loss 0.725 Epoch 55 iteration 0166/0187: training loss 0.725 Epoch 55 iteration 0167/0187: training loss 0.726 Epoch 55 iteration 0168/0187: training loss 0.725 Epoch 55 iteration 0169/0187: training loss 0.725 Epoch 55 iteration 0170/0187: training loss 0.726 Epoch 55 iteration 0171/0187: training loss 0.726 Epoch 55 iteration 0172/0187: training loss 0.726 Epoch 55 iteration 0173/0187: training loss 0.726 Epoch 55 iteration 0174/0187: training loss 0.726 Epoch 55 iteration 0175/0187: training loss 0.726 Epoch 55 iteration 0176/0187: training loss 0.725 Epoch 55 iteration 0177/0187: training loss 0.725 Epoch 55 iteration 0178/0187: training loss 0.725 Epoch 55 iteration 0179/0187: training loss 0.725 Epoch 55 iteration 0180/0187: training loss 0.725 Epoch 55 iteration 0181/0187: training loss 0.725 Epoch 55 iteration 0182/0187: training loss 0.727 Epoch 55 iteration 0183/0187: training loss 0.726 Epoch 55 iteration 0184/0187: training loss 0.726 Epoch 55 iteration 0185/0187: training loss 0.726 Epoch 55 iteration 0186/0187: training loss 0.726 Epoch 55 iteration 0187/0187: training loss 0.726 Epoch 55 validation pixAcc: 0.872, mIoU: 0.384 Epoch 56 iteration 0001/0187: training loss 0.676 Epoch 56 iteration 0002/0187: training loss 0.686 Epoch 56 iteration 0003/0187: training loss 0.770 Epoch 56 iteration 0004/0187: training loss 0.781 Epoch 56 iteration 0005/0187: training loss 0.747 Epoch 56 iteration 0006/0187: training loss 0.726 Epoch 56 iteration 0007/0187: training loss 0.717 Epoch 56 iteration 0008/0187: training loss 0.712 Epoch 56 iteration 0009/0187: training loss 0.703 Epoch 56 iteration 0010/0187: training loss 0.711 Epoch 56 iteration 0011/0187: training loss 0.718 Epoch 56 iteration 0012/0187: training loss 0.724 Epoch 56 iteration 0013/0187: training loss 0.725 Epoch 56 iteration 0014/0187: training loss 0.729 Epoch 56 iteration 0015/0187: training loss 0.723 Epoch 56 iteration 0016/0187: training loss 0.728 Epoch 56 iteration 0017/0187: training loss 0.734 Epoch 56 iteration 0018/0187: training loss 0.724 Epoch 56 iteration 0019/0187: training loss 0.725 Epoch 56 iteration 0020/0187: training loss 0.726 Epoch 56 iteration 0021/0187: training loss 0.725 Epoch 56 iteration 0022/0187: training loss 0.725 Epoch 56 iteration 0023/0187: training loss 0.732 Epoch 56 iteration 0024/0187: training loss 0.736 Epoch 56 iteration 0025/0187: training loss 0.733 Epoch 56 iteration 0026/0187: training loss 0.734 Epoch 56 iteration 0027/0187: training loss 0.734 Epoch 56 iteration 0028/0187: training loss 0.741 Epoch 56 iteration 0029/0187: training loss 0.740 Epoch 56 iteration 0030/0187: training loss 0.745 Epoch 56 iteration 0031/0187: training loss 0.749 Epoch 56 iteration 0032/0187: training loss 0.750 Epoch 56 iteration 0033/0187: training loss 0.753 Epoch 56 iteration 0034/0187: training loss 0.753 Epoch 56 iteration 0035/0187: training loss 0.756 Epoch 56 iteration 0036/0187: training loss 0.753 Epoch 56 iteration 0037/0187: training loss 0.756 Epoch 56 iteration 0038/0187: training loss 0.752 Epoch 56 iteration 0039/0187: training loss 0.750 Epoch 56 iteration 0040/0187: training loss 0.750 Epoch 56 iteration 0041/0187: training loss 0.749 Epoch 56 iteration 0042/0187: training loss 0.754 Epoch 56 iteration 0043/0187: training loss 0.763 Epoch 56 iteration 0044/0187: training loss 0.759 Epoch 56 iteration 0045/0187: training loss 0.760 Epoch 56 iteration 0046/0187: training loss 0.757 Epoch 56 iteration 0047/0187: training loss 0.755 Epoch 56 iteration 0048/0187: training loss 0.754 Epoch 56 iteration 0049/0187: training loss 0.754 Epoch 56 iteration 0050/0187: training loss 0.753 Epoch 56 iteration 0051/0187: training loss 0.752 Epoch 56 iteration 0052/0187: training loss 0.751 Epoch 56 iteration 0053/0187: training loss 0.749 Epoch 56 iteration 0054/0187: training loss 0.747 Epoch 56 iteration 0055/0187: training loss 0.749 Epoch 56 iteration 0056/0187: training loss 0.749 Epoch 56 iteration 0057/0187: training loss 0.745 Epoch 56 iteration 0058/0187: training loss 0.748 Epoch 56 iteration 0059/0187: training loss 0.750 Epoch 56 iteration 0060/0187: training loss 0.749 Epoch 56 iteration 0061/0187: training loss 0.753 Epoch 56 iteration 0062/0187: training loss 0.753 Epoch 56 iteration 0063/0187: training loss 0.751 Epoch 56 iteration 0064/0187: training loss 0.750 Epoch 56 iteration 0065/0187: training loss 0.748 Epoch 56 iteration 0066/0187: training loss 0.747 Epoch 56 iteration 0067/0187: training loss 0.746 Epoch 56 iteration 0068/0187: training loss 0.749 Epoch 56 iteration 0069/0187: training loss 0.749 Epoch 56 iteration 0070/0187: training loss 0.747 Epoch 56 iteration 0071/0187: training loss 0.749 Epoch 56 iteration 0072/0187: training loss 0.748 Epoch 56 iteration 0073/0187: training loss 0.750 Epoch 56 iteration 0074/0187: training loss 0.748 Epoch 56 iteration 0075/0187: training loss 0.747 Epoch 56 iteration 0076/0187: training loss 0.747 Epoch 56 iteration 0077/0187: training loss 0.746 Epoch 56 iteration 0078/0187: training loss 0.745 Epoch 56 iteration 0079/0187: training loss 0.744 Epoch 56 iteration 0080/0187: training loss 0.743 Epoch 56 iteration 0081/0187: training loss 0.742 Epoch 56 iteration 0082/0187: training loss 0.742 Epoch 56 iteration 0083/0187: training loss 0.741 Epoch 56 iteration 0084/0187: training loss 0.740 Epoch 56 iteration 0085/0187: training loss 0.739 Epoch 56 iteration 0086/0187: training loss 0.737 Epoch 56 iteration 0087/0187: training loss 0.737 Epoch 56 iteration 0088/0187: training loss 0.739 Epoch 56 iteration 0089/0187: training loss 0.738 Epoch 56 iteration 0090/0187: training loss 0.736 Epoch 56 iteration 0091/0188: training loss 0.736 Epoch 56 iteration 0092/0188: training loss 0.735 Epoch 56 iteration 0093/0188: training loss 0.734 Epoch 56 iteration 0094/0188: training loss 0.733 Epoch 56 iteration 0095/0188: training loss 0.732 Epoch 56 iteration 0096/0188: training loss 0.732 Epoch 56 iteration 0097/0188: training loss 0.730 Epoch 56 iteration 0098/0188: training loss 0.731 Epoch 56 iteration 0099/0188: training loss 0.729 Epoch 56 iteration 0100/0188: training loss 0.728 Epoch 56 iteration 0101/0188: training loss 0.727 Epoch 56 iteration 0102/0188: training loss 0.725 Epoch 56 iteration 0103/0188: training loss 0.725 Epoch 56 iteration 0104/0188: training loss 0.725 Epoch 56 iteration 0105/0188: training loss 0.724 Epoch 56 iteration 0106/0188: training loss 0.723 Epoch 56 iteration 0107/0188: training loss 0.721 Epoch 56 iteration 0108/0188: training loss 0.722 Epoch 56 iteration 0109/0188: training loss 0.721 Epoch 56 iteration 0110/0188: training loss 0.720 Epoch 56 iteration 0111/0188: training loss 0.718 Epoch 56 iteration 0112/0188: training loss 0.717 Epoch 56 iteration 0113/0188: training loss 0.715 Epoch 56 iteration 0114/0188: training loss 0.714 Epoch 56 iteration 0115/0188: training loss 0.712 Epoch 56 iteration 0116/0188: training loss 0.711 Epoch 56 iteration 0117/0188: training loss 0.711 Epoch 56 iteration 0118/0188: training loss 0.711 Epoch 56 iteration 0119/0188: training loss 0.713 Epoch 56 iteration 0120/0188: training loss 0.712 Epoch 56 iteration 0121/0188: training loss 0.711 Epoch 56 iteration 0122/0188: training loss 0.711 Epoch 56 iteration 0123/0188: training loss 0.711 Epoch 56 iteration 0124/0188: training loss 0.710 Epoch 56 iteration 0125/0188: training loss 0.711 Epoch 56 iteration 0126/0188: training loss 0.710 Epoch 56 iteration 0127/0188: training loss 0.710 Epoch 56 iteration 0128/0188: training loss 0.710 Epoch 56 iteration 0129/0188: training loss 0.709 Epoch 56 iteration 0130/0188: training loss 0.709 Epoch 56 iteration 0131/0188: training loss 0.709 Epoch 56 iteration 0132/0188: training loss 0.712 Epoch 56 iteration 0133/0188: training loss 0.712 Epoch 56 iteration 0134/0188: training loss 0.711 Epoch 56 iteration 0135/0188: training loss 0.712 Epoch 56 iteration 0136/0188: training loss 0.714 Epoch 56 iteration 0137/0188: training loss 0.714 Epoch 56 iteration 0138/0188: training loss 0.714 Epoch 56 iteration 0139/0188: training loss 0.713 Epoch 56 iteration 0140/0188: training loss 0.714 Epoch 56 iteration 0141/0188: training loss 0.714 Epoch 56 iteration 0142/0188: training loss 0.714 Epoch 56 iteration 0143/0188: training loss 0.716 Epoch 56 iteration 0144/0188: training loss 0.717 Epoch 56 iteration 0145/0188: training loss 0.718 Epoch 56 iteration 0146/0188: training loss 0.718 Epoch 56 iteration 0147/0188: training loss 0.719 Epoch 56 iteration 0148/0188: training loss 0.718 Epoch 56 iteration 0149/0188: training loss 0.719 Epoch 56 iteration 0150/0188: training loss 0.720 Epoch 56 iteration 0151/0188: training loss 0.720 Epoch 56 iteration 0152/0188: training loss 0.720 Epoch 56 iteration 0153/0188: training loss 0.720 Epoch 56 iteration 0154/0188: training loss 0.719 Epoch 56 iteration 0155/0188: training loss 0.718 Epoch 56 iteration 0156/0188: training loss 0.718 Epoch 56 iteration 0157/0188: training loss 0.717 Epoch 56 iteration 0158/0188: training loss 0.717 Epoch 56 iteration 0159/0188: training loss 0.717 Epoch 56 iteration 0160/0188: training loss 0.717 Epoch 56 iteration 0161/0188: training loss 0.717 Epoch 56 iteration 0162/0188: training loss 0.717 Epoch 56 iteration 0163/0188: training loss 0.717 Epoch 56 iteration 0164/0188: training loss 0.717 Epoch 56 iteration 0165/0188: training loss 0.718 Epoch 56 iteration 0166/0188: training loss 0.718 Epoch 56 iteration 0167/0188: training loss 0.719 Epoch 56 iteration 0168/0188: training loss 0.718 Epoch 56 iteration 0169/0188: training loss 0.719 Epoch 56 iteration 0170/0188: training loss 0.719 Epoch 56 iteration 0171/0188: training loss 0.718 Epoch 56 iteration 0172/0188: training loss 0.718 Epoch 56 iteration 0173/0188: training loss 0.718 Epoch 56 iteration 0174/0188: training loss 0.717 Epoch 56 iteration 0175/0188: training loss 0.720 Epoch 56 iteration 0176/0188: training loss 0.718 Epoch 56 iteration 0177/0188: training loss 0.718 Epoch 56 iteration 0178/0188: training loss 0.718 Epoch 56 iteration 0179/0188: training loss 0.717 Epoch 56 iteration 0180/0188: training loss 0.718 Epoch 56 iteration 0181/0188: training loss 0.718 Epoch 56 iteration 0182/0188: training loss 0.718 Epoch 56 iteration 0183/0188: training loss 0.717 Epoch 56 iteration 0184/0188: training loss 0.717 Epoch 56 iteration 0185/0188: training loss 0.719 Epoch 56 iteration 0186/0188: training loss 0.718 Epoch 56 validation pixAcc: 0.874, mIoU: 0.384 Epoch 57 iteration 0001/0187: training loss 0.719 Epoch 57 iteration 0002/0187: training loss 0.784 Epoch 57 iteration 0003/0187: training loss 0.787 Epoch 57 iteration 0004/0187: training loss 0.779 Epoch 57 iteration 0005/0187: training loss 0.774 Epoch 57 iteration 0006/0187: training loss 0.768 Epoch 57 iteration 0007/0187: training loss 0.758 Epoch 57 iteration 0008/0187: training loss 0.755 Epoch 57 iteration 0009/0187: training loss 0.773 Epoch 57 iteration 0010/0187: training loss 0.775 Epoch 57 iteration 0011/0187: training loss 0.775 Epoch 57 iteration 0012/0187: training loss 0.769 Epoch 57 iteration 0013/0187: training loss 0.770 Epoch 57 iteration 0014/0187: training loss 0.771 Epoch 57 iteration 0015/0187: training loss 0.764 Epoch 57 iteration 0016/0187: training loss 0.761 Epoch 57 iteration 0017/0187: training loss 0.755 Epoch 57 iteration 0018/0187: training loss 0.762 Epoch 57 iteration 0019/0187: training loss 0.754 Epoch 57 iteration 0020/0187: training loss 0.752 Epoch 57 iteration 0021/0187: training loss 0.753 Epoch 57 iteration 0022/0187: training loss 0.751 Epoch 57 iteration 0023/0187: training loss 0.749 Epoch 57 iteration 0024/0187: training loss 0.751 Epoch 57 iteration 0025/0187: training loss 0.750 Epoch 57 iteration 0026/0187: training loss 0.758 Epoch 57 iteration 0027/0187: training loss 0.761 Epoch 57 iteration 0028/0187: training loss 0.770 Epoch 57 iteration 0029/0187: training loss 0.764 Epoch 57 iteration 0030/0187: training loss 0.761 Epoch 57 iteration 0031/0187: training loss 0.760 Epoch 57 iteration 0032/0187: training loss 0.758 Epoch 57 iteration 0033/0187: training loss 0.755 Epoch 57 iteration 0034/0187: training loss 0.752 Epoch 57 iteration 0035/0187: training loss 0.747 Epoch 57 iteration 0036/0187: training loss 0.748 Epoch 57 iteration 0037/0187: training loss 0.749 Epoch 57 iteration 0038/0187: training loss 0.754 Epoch 57 iteration 0039/0187: training loss 0.751 Epoch 57 iteration 0040/0187: training loss 0.751 Epoch 57 iteration 0041/0187: training loss 0.748 Epoch 57 iteration 0042/0187: training loss 0.749 Epoch 57 iteration 0043/0187: training loss 0.749 Epoch 57 iteration 0044/0187: training loss 0.747 Epoch 57 iteration 0045/0187: training loss 0.747 Epoch 57 iteration 0046/0187: training loss 0.744 Epoch 57 iteration 0047/0187: training loss 0.741 Epoch 57 iteration 0048/0187: training loss 0.742 Epoch 57 iteration 0049/0187: training loss 0.741 Epoch 57 iteration 0050/0187: training loss 0.741 Epoch 57 iteration 0051/0187: training loss 0.736 Epoch 57 iteration 0052/0187: training loss 0.741 Epoch 57 iteration 0053/0187: training loss 0.738 Epoch 57 iteration 0054/0187: training loss 0.739 Epoch 57 iteration 0055/0187: training loss 0.740 Epoch 57 iteration 0056/0187: training loss 0.739 Epoch 57 iteration 0057/0187: training loss 0.740 Epoch 57 iteration 0058/0187: training loss 0.740 Epoch 57 iteration 0059/0187: training loss 0.737 Epoch 57 iteration 0060/0187: training loss 0.737 Epoch 57 iteration 0061/0187: training loss 0.736 Epoch 57 iteration 0062/0187: training loss 0.736 Epoch 57 iteration 0063/0187: training loss 0.734 Epoch 57 iteration 0064/0187: training loss 0.735 Epoch 57 iteration 0065/0187: training loss 0.737 Epoch 57 iteration 0066/0187: training loss 0.737 Epoch 57 iteration 0067/0187: training loss 0.742 Epoch 57 iteration 0068/0187: training loss 0.742 Epoch 57 iteration 0069/0187: training loss 0.742 Epoch 57 iteration 0070/0187: training loss 0.742 Epoch 57 iteration 0071/0187: training loss 0.740 Epoch 57 iteration 0072/0187: training loss 0.740 Epoch 57 iteration 0073/0187: training loss 0.738 Epoch 57 iteration 0074/0187: training loss 0.735 Epoch 57 iteration 0075/0187: training loss 0.736 Epoch 57 iteration 0076/0187: training loss 0.736 Epoch 57 iteration 0077/0187: training loss 0.736 Epoch 57 iteration 0078/0187: training loss 0.734 Epoch 57 iteration 0079/0187: training loss 0.731 Epoch 57 iteration 0080/0187: training loss 0.732 Epoch 57 iteration 0081/0187: training loss 0.732 Epoch 57 iteration 0082/0187: training loss 0.732 Epoch 57 iteration 0083/0187: training loss 0.730 Epoch 57 iteration 0084/0187: training loss 0.732 Epoch 57 iteration 0085/0187: training loss 0.731 Epoch 57 iteration 0086/0187: training loss 0.729 Epoch 57 iteration 0087/0187: training loss 0.729 Epoch 57 iteration 0088/0187: training loss 0.728 Epoch 57 iteration 0089/0187: training loss 0.726 Epoch 57 iteration 0090/0187: training loss 0.726 Epoch 57 iteration 0091/0187: training loss 0.726 Epoch 57 iteration 0092/0187: training loss 0.727 Epoch 57 iteration 0093/0187: training loss 0.729 Epoch 57 iteration 0094/0187: training loss 0.729 Epoch 57 iteration 0095/0187: training loss 0.728 Epoch 57 iteration 0096/0187: training loss 0.730 Epoch 57 iteration 0097/0187: training loss 0.731 Epoch 57 iteration 0098/0187: training loss 0.731 Epoch 57 iteration 0099/0187: training loss 0.729 Epoch 57 iteration 0100/0187: training loss 0.729 Epoch 57 iteration 0101/0187: training loss 0.729 Epoch 57 iteration 0102/0187: training loss 0.732 Epoch 57 iteration 0103/0187: training loss 0.731 Epoch 57 iteration 0104/0187: training loss 0.730 Epoch 57 iteration 0105/0187: training loss 0.730 Epoch 57 iteration 0106/0187: training loss 0.731 Epoch 57 iteration 0107/0187: training loss 0.731 Epoch 57 iteration 0108/0187: training loss 0.731 Epoch 57 iteration 0109/0187: training loss 0.729 Epoch 57 iteration 0110/0187: training loss 0.729 Epoch 57 iteration 0111/0187: training loss 0.727 Epoch 57 iteration 0112/0187: training loss 0.729 Epoch 57 iteration 0113/0187: training loss 0.730 Epoch 57 iteration 0114/0187: training loss 0.729 Epoch 57 iteration 0115/0187: training loss 0.727 Epoch 57 iteration 0116/0187: training loss 0.726 Epoch 57 iteration 0117/0187: training loss 0.725 Epoch 57 iteration 0118/0187: training loss 0.725 Epoch 57 iteration 0119/0187: training loss 0.725 Epoch 57 iteration 0120/0187: training loss 0.724 Epoch 57 iteration 0121/0187: training loss 0.723 Epoch 57 iteration 0122/0187: training loss 0.723 Epoch 57 iteration 0123/0187: training loss 0.722 Epoch 57 iteration 0124/0187: training loss 0.723 Epoch 57 iteration 0125/0187: training loss 0.722 Epoch 57 iteration 0126/0187: training loss 0.722 Epoch 57 iteration 0127/0187: training loss 0.723 Epoch 57 iteration 0128/0187: training loss 0.722 Epoch 57 iteration 0129/0187: training loss 0.722 Epoch 57 iteration 0130/0187: training loss 0.723 Epoch 57 iteration 0131/0187: training loss 0.723 Epoch 57 iteration 0132/0187: training loss 0.723 Epoch 57 iteration 0133/0187: training loss 0.723 Epoch 57 iteration 0134/0187: training loss 0.723 Epoch 57 iteration 0135/0187: training loss 0.724 Epoch 57 iteration 0136/0187: training loss 0.723 Epoch 57 iteration 0137/0187: training loss 0.723 Epoch 57 iteration 0138/0187: training loss 0.721 Epoch 57 iteration 0139/0187: training loss 0.720 Epoch 57 iteration 0140/0187: training loss 0.721 Epoch 57 iteration 0141/0187: training loss 0.720 Epoch 57 iteration 0142/0187: training loss 0.719 Epoch 57 iteration 0143/0187: training loss 0.720 Epoch 57 iteration 0144/0187: training loss 0.719 Epoch 57 iteration 0145/0187: training loss 0.718 Epoch 57 iteration 0146/0187: training loss 0.718 Epoch 57 iteration 0147/0187: training loss 0.718 Epoch 57 iteration 0148/0187: training loss 0.718 Epoch 57 iteration 0149/0187: training loss 0.718 Epoch 57 iteration 0150/0187: training loss 0.716 Epoch 57 iteration 0151/0187: training loss 0.716 Epoch 57 iteration 0152/0187: training loss 0.717 Epoch 57 iteration 0153/0187: training loss 0.718 Epoch 57 iteration 0154/0187: training loss 0.719 Epoch 57 iteration 0155/0187: training loss 0.719 Epoch 57 iteration 0156/0187: training loss 0.720 Epoch 57 iteration 0157/0187: training loss 0.722 Epoch 57 iteration 0158/0187: training loss 0.722 Epoch 57 iteration 0159/0187: training loss 0.722 Epoch 57 iteration 0160/0187: training loss 0.722 Epoch 57 iteration 0161/0187: training loss 0.723 Epoch 57 iteration 0162/0187: training loss 0.725 Epoch 57 iteration 0163/0187: training loss 0.725 Epoch 57 iteration 0164/0187: training loss 0.726 Epoch 57 iteration 0165/0187: training loss 0.727 Epoch 57 iteration 0166/0187: training loss 0.729 Epoch 57 iteration 0167/0187: training loss 0.728 Epoch 57 iteration 0168/0187: training loss 0.728 Epoch 57 iteration 0169/0187: training loss 0.728 Epoch 57 iteration 0170/0187: training loss 0.728 Epoch 57 iteration 0171/0187: training loss 0.729 Epoch 57 iteration 0172/0187: training loss 0.729 Epoch 57 iteration 0173/0187: training loss 0.728 Epoch 57 iteration 0174/0187: training loss 0.727 Epoch 57 iteration 0175/0187: training loss 0.728 Epoch 57 iteration 0176/0187: training loss 0.728 Epoch 57 iteration 0177/0187: training loss 0.728 Epoch 57 iteration 0178/0187: training loss 0.729 Epoch 57 iteration 0179/0187: training loss 0.729 Epoch 57 iteration 0180/0187: training loss 0.729 Epoch 57 iteration 0181/0187: training loss 0.729 Epoch 57 iteration 0182/0187: training loss 0.728 Epoch 57 iteration 0183/0187: training loss 0.728 Epoch 57 iteration 0184/0187: training loss 0.729 Epoch 57 iteration 0185/0187: training loss 0.728 Epoch 57 iteration 0186/0187: training loss 0.728 Epoch 57 iteration 0187/0187: training loss 0.727 Epoch 57 validation pixAcc: 0.874, mIoU: 0.383 Epoch 58 iteration 0001/0187: training loss 0.930 Epoch 58 iteration 0002/0187: training loss 0.796 Epoch 58 iteration 0003/0187: training loss 0.756 Epoch 58 iteration 0004/0187: training loss 0.737 Epoch 58 iteration 0005/0187: training loss 0.767 Epoch 58 iteration 0006/0187: training loss 0.793 Epoch 58 iteration 0007/0187: training loss 0.795 Epoch 58 iteration 0008/0187: training loss 0.785 Epoch 58 iteration 0009/0187: training loss 0.796 Epoch 58 iteration 0010/0187: training loss 0.801 Epoch 58 iteration 0011/0187: training loss 0.794 Epoch 58 iteration 0012/0187: training loss 0.790 Epoch 58 iteration 0013/0187: training loss 0.775 Epoch 58 iteration 0014/0187: training loss 0.763 Epoch 58 iteration 0015/0187: training loss 0.763 Epoch 58 iteration 0016/0187: training loss 0.771 Epoch 58 iteration 0017/0187: training loss 0.757 Epoch 58 iteration 0018/0187: training loss 0.766 Epoch 58 iteration 0019/0187: training loss 0.777 Epoch 58 iteration 0020/0187: training loss 0.774 Epoch 58 iteration 0021/0187: training loss 0.769 Epoch 58 iteration 0022/0187: training loss 0.768 Epoch 58 iteration 0023/0187: training loss 0.763 Epoch 58 iteration 0024/0187: training loss 0.756 Epoch 58 iteration 0025/0187: training loss 0.755 Epoch 58 iteration 0026/0187: training loss 0.751 Epoch 58 iteration 0027/0187: training loss 0.747 Epoch 58 iteration 0028/0187: training loss 0.744 Epoch 58 iteration 0029/0187: training loss 0.745 Epoch 58 iteration 0030/0187: training loss 0.743 Epoch 58 iteration 0031/0187: training loss 0.740 Epoch 58 iteration 0032/0187: training loss 0.741 Epoch 58 iteration 0033/0187: training loss 0.741 Epoch 58 iteration 0034/0187: training loss 0.736 Epoch 58 iteration 0035/0187: training loss 0.736 Epoch 58 iteration 0036/0187: training loss 0.736 Epoch 58 iteration 0037/0187: training loss 0.731 Epoch 58 iteration 0038/0187: training loss 0.735 Epoch 58 iteration 0039/0187: training loss 0.736 Epoch 58 iteration 0040/0187: training loss 0.736 Epoch 58 iteration 0041/0187: training loss 0.738 Epoch 58 iteration 0042/0187: training loss 0.739 Epoch 58 iteration 0043/0187: training loss 0.738 Epoch 58 iteration 0044/0187: training loss 0.739 Epoch 58 iteration 0045/0187: training loss 0.738 Epoch 58 iteration 0046/0187: training loss 0.737 Epoch 58 iteration 0047/0187: training loss 0.740 Epoch 58 iteration 0048/0187: training loss 0.739 Epoch 58 iteration 0049/0187: training loss 0.738 Epoch 58 iteration 0050/0187: training loss 0.740 Epoch 58 iteration 0051/0187: training loss 0.740 Epoch 58 iteration 0052/0187: training loss 0.739 Epoch 58 iteration 0053/0187: training loss 0.739 Epoch 58 iteration 0054/0187: training loss 0.738 Epoch 58 iteration 0055/0187: training loss 0.737 Epoch 58 iteration 0056/0187: training loss 0.739 Epoch 58 iteration 0057/0187: training loss 0.741 Epoch 58 iteration 0058/0187: training loss 0.740 Epoch 58 iteration 0059/0187: training loss 0.739 Epoch 58 iteration 0060/0187: training loss 0.741 Epoch 58 iteration 0061/0187: training loss 0.739 Epoch 58 iteration 0062/0187: training loss 0.740 Epoch 58 iteration 0063/0187: training loss 0.737 Epoch 58 iteration 0064/0187: training loss 0.740 Epoch 58 iteration 0065/0187: training loss 0.738 Epoch 58 iteration 0066/0187: training loss 0.735 Epoch 58 iteration 0067/0187: training loss 0.733 Epoch 58 iteration 0068/0187: training loss 0.734 Epoch 58 iteration 0069/0187: training loss 0.732 Epoch 58 iteration 0070/0187: training loss 0.734 Epoch 58 iteration 0071/0187: training loss 0.735 Epoch 58 iteration 0072/0187: training loss 0.736 Epoch 58 iteration 0073/0187: training loss 0.735 Epoch 58 iteration 0074/0187: training loss 0.735 Epoch 58 iteration 0075/0187: training loss 0.733 Epoch 58 iteration 0076/0187: training loss 0.732 Epoch 58 iteration 0077/0187: training loss 0.735 Epoch 58 iteration 0078/0187: training loss 0.733 Epoch 58 iteration 0079/0187: training loss 0.733 Epoch 58 iteration 0080/0187: training loss 0.734 Epoch 58 iteration 0081/0187: training loss 0.734 Epoch 58 iteration 0082/0187: training loss 0.735 Epoch 58 iteration 0083/0187: training loss 0.734 Epoch 58 iteration 0084/0187: training loss 0.734 Epoch 58 iteration 0085/0187: training loss 0.734 Epoch 58 iteration 0086/0187: training loss 0.734 Epoch 58 iteration 0087/0187: training loss 0.734 Epoch 58 iteration 0088/0187: training loss 0.733 Epoch 58 iteration 0089/0187: training loss 0.734 Epoch 58 iteration 0090/0187: training loss 0.735 Epoch 58 iteration 0091/0188: training loss 0.734 Epoch 58 iteration 0092/0188: training loss 0.736 Epoch 58 iteration 0093/0188: training loss 0.734 Epoch 58 iteration 0094/0188: training loss 0.735 Epoch 58 iteration 0095/0188: training loss 0.734 Epoch 58 iteration 0096/0188: training loss 0.733 Epoch 58 iteration 0097/0188: training loss 0.733 Epoch 58 iteration 0098/0188: training loss 0.731 Epoch 58 iteration 0099/0188: training loss 0.733 Epoch 58 iteration 0100/0188: training loss 0.732 Epoch 58 iteration 0101/0188: training loss 0.732 Epoch 58 iteration 0102/0188: training loss 0.731 Epoch 58 iteration 0103/0188: training loss 0.731 Epoch 58 iteration 0104/0188: training loss 0.732 Epoch 58 iteration 0105/0188: training loss 0.731 Epoch 58 iteration 0106/0188: training loss 0.731 Epoch 58 iteration 0107/0188: training loss 0.733 Epoch 58 iteration 0108/0188: training loss 0.733 Epoch 58 iteration 0109/0188: training loss 0.731 Epoch 58 iteration 0110/0188: training loss 0.732 Epoch 58 iteration 0111/0188: training loss 0.731 Epoch 58 iteration 0112/0188: training loss 0.733 Epoch 58 iteration 0113/0188: training loss 0.733 Epoch 58 iteration 0114/0188: training loss 0.731 Epoch 58 iteration 0115/0188: training loss 0.731 Epoch 58 iteration 0116/0188: training loss 0.730 Epoch 58 iteration 0117/0188: training loss 0.731 Epoch 58 iteration 0118/0188: training loss 0.731 Epoch 58 iteration 0119/0188: training loss 0.730 Epoch 58 iteration 0120/0188: training loss 0.730 Epoch 58 iteration 0121/0188: training loss 0.731 Epoch 58 iteration 0122/0188: training loss 0.730 Epoch 58 iteration 0123/0188: training loss 0.729 Epoch 58 iteration 0124/0188: training loss 0.729 Epoch 58 iteration 0125/0188: training loss 0.728 Epoch 58 iteration 0126/0188: training loss 0.728 Epoch 58 iteration 0127/0188: training loss 0.728 Epoch 58 iteration 0128/0188: training loss 0.726 Epoch 58 iteration 0129/0188: training loss 0.727 Epoch 58 iteration 0130/0188: training loss 0.728 Epoch 58 iteration 0131/0188: training loss 0.728 Epoch 58 iteration 0132/0188: training loss 0.728 Epoch 58 iteration 0133/0188: training loss 0.726 Epoch 58 iteration 0134/0188: training loss 0.726 Epoch 58 iteration 0135/0188: training loss 0.725 Epoch 58 iteration 0136/0188: training loss 0.725 Epoch 58 iteration 0137/0188: training loss 0.726 Epoch 58 iteration 0138/0188: training loss 0.725 Epoch 58 iteration 0139/0188: training loss 0.725 Epoch 58 iteration 0140/0188: training loss 0.724 Epoch 58 iteration 0141/0188: training loss 0.724 Epoch 58 iteration 0142/0188: training loss 0.724 Epoch 58 iteration 0143/0188: training loss 0.723 Epoch 58 iteration 0144/0188: training loss 0.723 Epoch 58 iteration 0145/0188: training loss 0.723 Epoch 58 iteration 0146/0188: training loss 0.724 Epoch 58 iteration 0147/0188: training loss 0.724 Epoch 58 iteration 0148/0188: training loss 0.723 Epoch 58 iteration 0149/0188: training loss 0.722 Epoch 58 iteration 0150/0188: training loss 0.723 Epoch 58 iteration 0151/0188: training loss 0.723 Epoch 58 iteration 0152/0188: training loss 0.724 Epoch 58 iteration 0153/0188: training loss 0.723 Epoch 58 iteration 0154/0188: training loss 0.722 Epoch 58 iteration 0155/0188: training loss 0.721 Epoch 58 iteration 0156/0188: training loss 0.721 Epoch 58 iteration 0157/0188: training loss 0.721 Epoch 58 iteration 0158/0188: training loss 0.720 Epoch 58 iteration 0159/0188: training loss 0.719 Epoch 58 iteration 0160/0188: training loss 0.719 Epoch 58 iteration 0161/0188: training loss 0.720 Epoch 58 iteration 0162/0188: training loss 0.719 Epoch 58 iteration 0163/0188: training loss 0.719 Epoch 58 iteration 0164/0188: training loss 0.718 Epoch 58 iteration 0165/0188: training loss 0.718 Epoch 58 iteration 0166/0188: training loss 0.718 Epoch 58 iteration 0167/0188: training loss 0.718 Epoch 58 iteration 0168/0188: training loss 0.718 Epoch 58 iteration 0169/0188: training loss 0.717 Epoch 58 iteration 0170/0188: training loss 0.717 Epoch 58 iteration 0171/0188: training loss 0.718 Epoch 58 iteration 0172/0188: training loss 0.718 Epoch 58 iteration 0173/0188: training loss 0.719 Epoch 58 iteration 0174/0188: training loss 0.719 Epoch 58 iteration 0175/0188: training loss 0.718 Epoch 58 iteration 0176/0188: training loss 0.718 Epoch 58 iteration 0177/0188: training loss 0.719 Epoch 58 iteration 0178/0188: training loss 0.719 Epoch 58 iteration 0179/0188: training loss 0.721 Epoch 58 iteration 0180/0188: training loss 0.721 Epoch 58 iteration 0181/0188: training loss 0.721 Epoch 58 iteration 0182/0188: training loss 0.721 Epoch 58 iteration 0183/0188: training loss 0.720 Epoch 58 iteration 0184/0188: training loss 0.721 Epoch 58 iteration 0185/0188: training loss 0.720 Epoch 58 iteration 0186/0188: training loss 0.720 Epoch 58 validation pixAcc: 0.874, mIoU: 0.385 Epoch 59 iteration 0001/0187: training loss 0.785 Epoch 59 iteration 0002/0187: training loss 0.781 Epoch 59 iteration 0003/0187: training loss 0.779 Epoch 59 iteration 0004/0187: training loss 0.747 Epoch 59 iteration 0005/0187: training loss 0.759 Epoch 59 iteration 0006/0187: training loss 0.770 Epoch 59 iteration 0007/0187: training loss 0.755 Epoch 59 iteration 0008/0187: training loss 0.739 Epoch 59 iteration 0009/0187: training loss 0.742 Epoch 59 iteration 0010/0187: training loss 0.736 Epoch 59 iteration 0011/0187: training loss 0.743 Epoch 59 iteration 0012/0187: training loss 0.733 Epoch 59 iteration 0013/0187: training loss 0.724 Epoch 59 iteration 0014/0187: training loss 0.722 Epoch 59 iteration 0015/0187: training loss 0.725 Epoch 59 iteration 0016/0187: training loss 0.723 Epoch 59 iteration 0017/0187: training loss 0.718 Epoch 59 iteration 0018/0187: training loss 0.712 Epoch 59 iteration 0019/0187: training loss 0.719 Epoch 59 iteration 0020/0187: training loss 0.716 Epoch 59 iteration 0021/0187: training loss 0.714 Epoch 59 iteration 0022/0187: training loss 0.709 Epoch 59 iteration 0023/0187: training loss 0.715 Epoch 59 iteration 0024/0187: training loss 0.711 Epoch 59 iteration 0025/0187: training loss 0.709 Epoch 59 iteration 0026/0187: training loss 0.715 Epoch 59 iteration 0027/0187: training loss 0.714 Epoch 59 iteration 0028/0187: training loss 0.721 Epoch 59 iteration 0029/0187: training loss 0.725 Epoch 59 iteration 0030/0187: training loss 0.727 Epoch 59 iteration 0031/0187: training loss 0.724 Epoch 59 iteration 0032/0187: training loss 0.725 Epoch 59 iteration 0033/0187: training loss 0.721 Epoch 59 iteration 0034/0187: training loss 0.719 Epoch 59 iteration 0035/0187: training loss 0.724 Epoch 59 iteration 0036/0187: training loss 0.720 Epoch 59 iteration 0037/0187: training loss 0.719 Epoch 59 iteration 0038/0187: training loss 0.715 Epoch 59 iteration 0039/0187: training loss 0.715 Epoch 59 iteration 0040/0187: training loss 0.710 Epoch 59 iteration 0041/0187: training loss 0.710 Epoch 59 iteration 0042/0187: training loss 0.713 Epoch 59 iteration 0043/0187: training loss 0.713 Epoch 59 iteration 0044/0187: training loss 0.710 Epoch 59 iteration 0045/0187: training loss 0.708 Epoch 59 iteration 0046/0187: training loss 0.705 Epoch 59 iteration 0047/0187: training loss 0.702 Epoch 59 iteration 0048/0187: training loss 0.699 Epoch 59 iteration 0049/0187: training loss 0.698 Epoch 59 iteration 0050/0187: training loss 0.696 Epoch 59 iteration 0051/0187: training loss 0.699 Epoch 59 iteration 0052/0187: training loss 0.702 Epoch 59 iteration 0053/0187: training loss 0.699 Epoch 59 iteration 0054/0187: training loss 0.701 Epoch 59 iteration 0055/0187: training loss 0.704 Epoch 59 iteration 0056/0187: training loss 0.708 Epoch 59 iteration 0057/0187: training loss 0.706 Epoch 59 iteration 0058/0187: training loss 0.704 Epoch 59 iteration 0059/0187: training loss 0.708 Epoch 59 iteration 0060/0187: training loss 0.708 Epoch 59 iteration 0061/0187: training loss 0.708 Epoch 59 iteration 0062/0187: training loss 0.709 Epoch 59 iteration 0063/0187: training loss 0.708 Epoch 59 iteration 0064/0187: training loss 0.709 Epoch 59 iteration 0065/0187: training loss 0.711 Epoch 59 iteration 0066/0187: training loss 0.712 Epoch 59 iteration 0067/0187: training loss 0.712 Epoch 59 iteration 0068/0187: training loss 0.711 Epoch 59 iteration 0069/0187: training loss 0.712 Epoch 59 iteration 0070/0187: training loss 0.713 Epoch 59 iteration 0071/0187: training loss 0.712 Epoch 59 iteration 0072/0187: training loss 0.711 Epoch 59 iteration 0073/0187: training loss 0.710 Epoch 59 iteration 0074/0187: training loss 0.712 Epoch 59 iteration 0075/0187: training loss 0.713 Epoch 59 iteration 0076/0187: training loss 0.713 Epoch 59 iteration 0077/0187: training loss 0.714 Epoch 59 iteration 0078/0187: training loss 0.712 Epoch 59 iteration 0079/0187: training loss 0.712 Epoch 59 iteration 0080/0187: training loss 0.713 Epoch 59 iteration 0081/0187: training loss 0.715 Epoch 59 iteration 0082/0187: training loss 0.714 Epoch 59 iteration 0083/0187: training loss 0.713 Epoch 59 iteration 0084/0187: training loss 0.714 Epoch 59 iteration 0085/0187: training loss 0.714 Epoch 59 iteration 0086/0187: training loss 0.712 Epoch 59 iteration 0087/0187: training loss 0.713 Epoch 59 iteration 0088/0187: training loss 0.714 Epoch 59 iteration 0089/0187: training loss 0.712 Epoch 59 iteration 0090/0187: training loss 0.713 Epoch 59 iteration 0091/0187: training loss 0.713 Epoch 59 iteration 0092/0187: training loss 0.714 Epoch 59 iteration 0093/0187: training loss 0.715 Epoch 59 iteration 0094/0187: training loss 0.715 Epoch 59 iteration 0095/0187: training loss 0.717 Epoch 59 iteration 0096/0187: training loss 0.716 Epoch 59 iteration 0097/0187: training loss 0.717 Epoch 59 iteration 0098/0187: training loss 0.716 Epoch 59 iteration 0099/0187: training loss 0.716 Epoch 59 iteration 0100/0187: training loss 0.716 Epoch 59 iteration 0101/0187: training loss 0.717 Epoch 59 iteration 0102/0187: training loss 0.716 Epoch 59 iteration 0103/0187: training loss 0.717 Epoch 59 iteration 0104/0187: training loss 0.717 Epoch 59 iteration 0105/0187: training loss 0.717 Epoch 59 iteration 0106/0187: training loss 0.715 Epoch 59 iteration 0107/0187: training loss 0.716 Epoch 59 iteration 0108/0187: training loss 0.716 Epoch 59 iteration 0109/0187: training loss 0.717 Epoch 59 iteration 0110/0187: training loss 0.717 Epoch 59 iteration 0111/0187: training loss 0.716 Epoch 59 iteration 0112/0187: training loss 0.717 Epoch 59 iteration 0113/0187: training loss 0.716 Epoch 59 iteration 0114/0187: training loss 0.717 Epoch 59 iteration 0115/0187: training loss 0.717 Epoch 59 iteration 0116/0187: training loss 0.716 Epoch 59 iteration 0117/0187: training loss 0.717 Epoch 59 iteration 0118/0187: training loss 0.716 Epoch 59 iteration 0119/0187: training loss 0.715 Epoch 59 iteration 0120/0187: training loss 0.715 Epoch 59 iteration 0121/0187: training loss 0.714 Epoch 59 iteration 0122/0187: training loss 0.714 Epoch 59 iteration 0123/0187: training loss 0.713 Epoch 59 iteration 0124/0187: training loss 0.712 Epoch 59 iteration 0125/0187: training loss 0.713 Epoch 59 iteration 0126/0187: training loss 0.714 Epoch 59 iteration 0127/0187: training loss 0.715 Epoch 59 iteration 0128/0187: training loss 0.715 Epoch 59 iteration 0129/0187: training loss 0.714 Epoch 59 iteration 0130/0187: training loss 0.716 Epoch 59 iteration 0131/0187: training loss 0.716 Epoch 59 iteration 0132/0187: training loss 0.715 Epoch 59 iteration 0133/0187: training loss 0.714 Epoch 59 iteration 0134/0187: training loss 0.715 Epoch 59 iteration 0135/0187: training loss 0.714 Epoch 59 iteration 0136/0187: training loss 0.714 Epoch 59 iteration 0137/0187: training loss 0.716 Epoch 59 iteration 0138/0187: training loss 0.716 Epoch 59 iteration 0139/0187: training loss 0.715 Epoch 59 iteration 0140/0187: training loss 0.715 Epoch 59 iteration 0141/0187: training loss 0.716 Epoch 59 iteration 0142/0187: training loss 0.716 Epoch 59 iteration 0143/0187: training loss 0.717 Epoch 59 iteration 0144/0187: training loss 0.717 Epoch 59 iteration 0145/0187: training loss 0.716 Epoch 59 iteration 0146/0187: training loss 0.716 Epoch 59 iteration 0147/0187: training loss 0.715 Epoch 59 iteration 0148/0187: training loss 0.715 Epoch 59 iteration 0149/0187: training loss 0.718 Epoch 59 iteration 0150/0187: training loss 0.719 Epoch 59 iteration 0151/0187: training loss 0.719 Epoch 59 iteration 0152/0187: training loss 0.719 Epoch 59 iteration 0153/0187: training loss 0.718 Epoch 59 iteration 0154/0187: training loss 0.718 Epoch 59 iteration 0155/0187: training loss 0.717 Epoch 59 iteration 0156/0187: training loss 0.716 Epoch 59 iteration 0157/0187: training loss 0.717 Epoch 59 iteration 0158/0187: training loss 0.717 Epoch 59 iteration 0159/0187: training loss 0.717 Epoch 59 iteration 0160/0187: training loss 0.716 Epoch 59 iteration 0161/0187: training loss 0.715 Epoch 59 iteration 0162/0187: training loss 0.715 Epoch 59 iteration 0163/0187: training loss 0.716 Epoch 59 iteration 0164/0187: training loss 0.716 Epoch 59 iteration 0165/0187: training loss 0.717 Epoch 59 iteration 0166/0187: training loss 0.716 Epoch 59 iteration 0167/0187: training loss 0.717 Epoch 59 iteration 0168/0187: training loss 0.716 Epoch 59 iteration 0169/0187: training loss 0.716 Epoch 59 iteration 0170/0187: training loss 0.717 Epoch 59 iteration 0171/0187: training loss 0.716 Epoch 59 iteration 0172/0187: training loss 0.716 Epoch 59 iteration 0173/0187: training loss 0.717 Epoch 59 iteration 0174/0187: training loss 0.717 Epoch 59 iteration 0175/0187: training loss 0.716 Epoch 59 iteration 0176/0187: training loss 0.716 Epoch 59 iteration 0177/0187: training loss 0.717 Epoch 59 iteration 0178/0187: training loss 0.716 Epoch 59 iteration 0179/0187: training loss 0.716 Epoch 59 iteration 0180/0187: training loss 0.715 Epoch 59 iteration 0181/0187: training loss 0.716 Epoch 59 iteration 0182/0187: training loss 0.715 Epoch 59 iteration 0183/0187: training loss 0.715 Epoch 59 iteration 0184/0187: training loss 0.716 Epoch 59 iteration 0185/0187: training loss 0.715 Epoch 59 iteration 0186/0187: training loss 0.715 Epoch 59 iteration 0187/0187: training loss 0.715 Epoch 59 validation pixAcc: 0.874, mIoU: 0.388 Epoch 60 iteration 0001/0187: training loss 0.601 Epoch 60 iteration 0002/0187: training loss 0.651 Epoch 60 iteration 0003/0187: training loss 0.729 Epoch 60 iteration 0004/0187: training loss 0.707 Epoch 60 iteration 0005/0187: training loss 0.704 Epoch 60 iteration 0006/0187: training loss 0.717 Epoch 60 iteration 0007/0187: training loss 0.704 Epoch 60 iteration 0008/0187: training loss 0.725 Epoch 60 iteration 0009/0187: training loss 0.749 Epoch 60 iteration 0010/0187: training loss 0.727 Epoch 60 iteration 0011/0187: training loss 0.718 Epoch 60 iteration 0012/0187: training loss 0.712 Epoch 60 iteration 0013/0187: training loss 0.720 Epoch 60 iteration 0014/0187: training loss 0.712 Epoch 60 iteration 0015/0187: training loss 0.700 Epoch 60 iteration 0016/0187: training loss 0.697 Epoch 60 iteration 0017/0187: training loss 0.698 Epoch 60 iteration 0018/0187: training loss 0.701 Epoch 60 iteration 0019/0187: training loss 0.711 Epoch 60 iteration 0020/0187: training loss 0.711 Epoch 60 iteration 0021/0187: training loss 0.719 Epoch 60 iteration 0022/0187: training loss 0.723 Epoch 60 iteration 0023/0187: training loss 0.718 Epoch 60 iteration 0024/0187: training loss 0.719 Epoch 60 iteration 0025/0187: training loss 0.717 Epoch 60 iteration 0026/0187: training loss 0.715 Epoch 60 iteration 0027/0187: training loss 0.711 Epoch 60 iteration 0028/0187: training loss 0.709 Epoch 60 iteration 0029/0187: training loss 0.708 Epoch 60 iteration 0030/0187: training loss 0.706 Epoch 60 iteration 0031/0187: training loss 0.710 Epoch 60 iteration 0032/0187: training loss 0.715 Epoch 60 iteration 0033/0187: training loss 0.713 Epoch 60 iteration 0034/0187: training loss 0.716 Epoch 60 iteration 0035/0187: training loss 0.715 Epoch 60 iteration 0036/0187: training loss 0.716 Epoch 60 iteration 0037/0187: training loss 0.719 Epoch 60 iteration 0038/0187: training loss 0.723 Epoch 60 iteration 0039/0187: training loss 0.719 Epoch 60 iteration 0040/0187: training loss 0.718 Epoch 60 iteration 0041/0187: training loss 0.716 Epoch 60 iteration 0042/0187: training loss 0.717 Epoch 60 iteration 0043/0187: training loss 0.716 Epoch 60 iteration 0044/0187: training loss 0.714 Epoch 60 iteration 0045/0187: training loss 0.713 Epoch 60 iteration 0046/0187: training loss 0.715 Epoch 60 iteration 0047/0187: training loss 0.712 Epoch 60 iteration 0048/0187: training loss 0.713 Epoch 60 iteration 0049/0187: training loss 0.715 Epoch 60 iteration 0050/0187: training loss 0.716 Epoch 60 iteration 0051/0187: training loss 0.713 Epoch 60 iteration 0052/0187: training loss 0.714 Epoch 60 iteration 0053/0187: training loss 0.715 Epoch 60 iteration 0054/0187: training loss 0.714 Epoch 60 iteration 0055/0187: training loss 0.711 Epoch 60 iteration 0056/0187: training loss 0.709 Epoch 60 iteration 0057/0187: training loss 0.707 Epoch 60 iteration 0058/0187: training loss 0.708 Epoch 60 iteration 0059/0187: training loss 0.707 Epoch 60 iteration 0060/0187: training loss 0.712 Epoch 60 iteration 0061/0187: training loss 0.712 Epoch 60 iteration 0062/0187: training loss 0.712 Epoch 60 iteration 0063/0187: training loss 0.710 Epoch 60 iteration 0064/0187: training loss 0.708 Epoch 60 iteration 0065/0187: training loss 0.710 Epoch 60 iteration 0066/0187: training loss 0.709 Epoch 60 iteration 0067/0187: training loss 0.710 Epoch 60 iteration 0068/0187: training loss 0.709 Epoch 60 iteration 0069/0187: training loss 0.709 Epoch 60 iteration 0070/0187: training loss 0.708 Epoch 60 iteration 0071/0187: training loss 0.705 Epoch 60 iteration 0072/0187: training loss 0.707 Epoch 60 iteration 0073/0187: training loss 0.704 Epoch 60 iteration 0074/0187: training loss 0.704 Epoch 60 iteration 0075/0187: training loss 0.702 Epoch 60 iteration 0076/0187: training loss 0.703 Epoch 60 iteration 0077/0187: training loss 0.702 Epoch 60 iteration 0078/0187: training loss 0.703 Epoch 60 iteration 0079/0187: training loss 0.703 Epoch 60 iteration 0080/0187: training loss 0.701 Epoch 60 iteration 0081/0187: training loss 0.701 Epoch 60 iteration 0082/0187: training loss 0.699 Epoch 60 iteration 0083/0187: training loss 0.698 Epoch 60 iteration 0084/0187: training loss 0.701 Epoch 60 iteration 0085/0187: training loss 0.703 Epoch 60 iteration 0086/0187: training loss 0.704 Epoch 60 iteration 0087/0187: training loss 0.705 Epoch 60 iteration 0088/0187: training loss 0.704 Epoch 60 iteration 0089/0187: training loss 0.705 Epoch 60 iteration 0090/0187: training loss 0.704 Epoch 60 iteration 0091/0188: training loss 0.705 Epoch 60 iteration 0092/0188: training loss 0.705 Epoch 60 iteration 0093/0188: training loss 0.705 Epoch 60 iteration 0094/0188: training loss 0.704 Epoch 60 iteration 0095/0188: training loss 0.706 Epoch 60 iteration 0096/0188: training loss 0.706 Epoch 60 iteration 0097/0188: training loss 0.706 Epoch 60 iteration 0098/0188: training loss 0.707 Epoch 60 iteration 0099/0188: training loss 0.709 Epoch 60 iteration 0100/0188: training loss 0.707 Epoch 60 iteration 0101/0188: training loss 0.707 Epoch 60 iteration 0102/0188: training loss 0.707 Epoch 60 iteration 0103/0188: training loss 0.707 Epoch 60 iteration 0104/0188: training loss 0.706 Epoch 60 iteration 0105/0188: training loss 0.707 Epoch 60 iteration 0106/0188: training loss 0.707 Epoch 60 iteration 0107/0188: training loss 0.707 Epoch 60 iteration 0108/0188: training loss 0.707 Epoch 60 iteration 0109/0188: training loss 0.707 Epoch 60 iteration 0110/0188: training loss 0.710 Epoch 60 iteration 0111/0188: training loss 0.709 Epoch 60 iteration 0112/0188: training loss 0.707 Epoch 60 iteration 0113/0188: training loss 0.706 Epoch 60 iteration 0114/0188: training loss 0.705 Epoch 60 iteration 0115/0188: training loss 0.705 Epoch 60 iteration 0116/0188: training loss 0.705 Epoch 60 iteration 0117/0188: training loss 0.705 Epoch 60 iteration 0118/0188: training loss 0.705 Epoch 60 iteration 0119/0188: training loss 0.704 Epoch 60 iteration 0120/0188: training loss 0.706 Epoch 60 iteration 0121/0188: training loss 0.706 Epoch 60 iteration 0122/0188: training loss 0.707 Epoch 60 iteration 0123/0188: training loss 0.707 Epoch 60 iteration 0124/0188: training loss 0.707 Epoch 60 iteration 0125/0188: training loss 0.708 Epoch 60 iteration 0126/0188: training loss 0.708 Epoch 60 iteration 0127/0188: training loss 0.709 Epoch 60 iteration 0128/0188: training loss 0.708 Epoch 60 iteration 0129/0188: training loss 0.708 Epoch 60 iteration 0130/0188: training loss 0.708 Epoch 60 iteration 0131/0188: training loss 0.707 Epoch 60 iteration 0132/0188: training loss 0.709 Epoch 60 iteration 0133/0188: training loss 0.710 Epoch 60 iteration 0134/0188: training loss 0.710 Epoch 60 iteration 0135/0188: training loss 0.714 Epoch 60 iteration 0136/0188: training loss 0.714 Epoch 60 iteration 0137/0188: training loss 0.713 Epoch 60 iteration 0138/0188: training loss 0.712 Epoch 60 iteration 0139/0188: training loss 0.712 Epoch 60 iteration 0140/0188: training loss 0.713 Epoch 60 iteration 0141/0188: training loss 0.712 Epoch 60 iteration 0142/0188: training loss 0.712 Epoch 60 iteration 0143/0188: training loss 0.711 Epoch 60 iteration 0144/0188: training loss 0.713 Epoch 60 iteration 0145/0188: training loss 0.714 Epoch 60 iteration 0146/0188: training loss 0.714 Epoch 60 iteration 0147/0188: training loss 0.716 Epoch 60 iteration 0148/0188: training loss 0.717 Epoch 60 iteration 0149/0188: training loss 0.719 Epoch 60 iteration 0150/0188: training loss 0.718 Epoch 60 iteration 0151/0188: training loss 0.719 Epoch 60 iteration 0152/0188: training loss 0.719 Epoch 60 iteration 0153/0188: training loss 0.719 Epoch 60 iteration 0154/0188: training loss 0.719 Epoch 60 iteration 0155/0188: training loss 0.720 Epoch 60 iteration 0156/0188: training loss 0.720 Epoch 60 iteration 0157/0188: training loss 0.720 Epoch 60 iteration 0158/0188: training loss 0.720 Epoch 60 iteration 0159/0188: training loss 0.719 Epoch 60 iteration 0160/0188: training loss 0.719 Epoch 60 iteration 0161/0188: training loss 0.720 Epoch 60 iteration 0162/0188: training loss 0.718 Epoch 60 iteration 0163/0188: training loss 0.718 Epoch 60 iteration 0164/0188: training loss 0.717 Epoch 60 iteration 0165/0188: training loss 0.717 Epoch 60 iteration 0166/0188: training loss 0.716 Epoch 60 iteration 0167/0188: training loss 0.718 Epoch 60 iteration 0168/0188: training loss 0.717 Epoch 60 iteration 0169/0188: training loss 0.718 Epoch 60 iteration 0170/0188: training loss 0.719 Epoch 60 iteration 0171/0188: training loss 0.719 Epoch 60 iteration 0172/0188: training loss 0.718 Epoch 60 iteration 0173/0188: training loss 0.719 Epoch 60 iteration 0174/0188: training loss 0.721 Epoch 60 iteration 0175/0188: training loss 0.722 Epoch 60 iteration 0176/0188: training loss 0.721 Epoch 60 iteration 0177/0188: training loss 0.722 Epoch 60 iteration 0178/0188: training loss 0.721 Epoch 60 iteration 0179/0188: training loss 0.722 Epoch 60 iteration 0180/0188: training loss 0.722 Epoch 60 iteration 0181/0188: training loss 0.723 Epoch 60 iteration 0182/0188: training loss 0.723 Epoch 60 iteration 0183/0188: training loss 0.722 Epoch 60 iteration 0184/0188: training loss 0.723 Epoch 60 iteration 0185/0188: training loss 0.722 Epoch 60 iteration 0186/0188: training loss 0.722 Epoch 60 validation pixAcc: 0.875, mIoU: 0.386 Epoch 61 iteration 0001/0187: training loss 0.804 Epoch 61 iteration 0002/0187: training loss 0.742 Epoch 61 iteration 0003/0187: training loss 0.779 Epoch 61 iteration 0004/0187: training loss 0.742 Epoch 61 iteration 0005/0187: training loss 0.717 Epoch 61 iteration 0006/0187: training loss 0.733 Epoch 61 iteration 0007/0187: training loss 0.718 Epoch 61 iteration 0008/0187: training loss 0.696 Epoch 61 iteration 0009/0187: training loss 0.689 Epoch 61 iteration 0010/0187: training loss 0.688 Epoch 61 iteration 0011/0187: training loss 0.695 Epoch 61 iteration 0012/0187: training loss 0.700 Epoch 61 iteration 0013/0187: training loss 0.718 Epoch 61 iteration 0014/0187: training loss 0.719 Epoch 61 iteration 0015/0187: training loss 0.724 Epoch 61 iteration 0016/0187: training loss 0.717 Epoch 61 iteration 0017/0187: training loss 0.720 Epoch 61 iteration 0018/0187: training loss 0.713 Epoch 61 iteration 0019/0187: training loss 0.718 Epoch 61 iteration 0020/0187: training loss 0.717 Epoch 61 iteration 0021/0187: training loss 0.719 Epoch 61 iteration 0022/0187: training loss 0.727 Epoch 61 iteration 0023/0187: training loss 0.725 Epoch 61 iteration 0024/0187: training loss 0.728 Epoch 61 iteration 0025/0187: training loss 0.721 Epoch 61 iteration 0026/0187: training loss 0.722 Epoch 61 iteration 0027/0187: training loss 0.722 Epoch 61 iteration 0028/0187: training loss 0.731 Epoch 61 iteration 0029/0187: training loss 0.728 Epoch 61 iteration 0030/0187: training loss 0.722 Epoch 61 iteration 0031/0187: training loss 0.719 Epoch 61 iteration 0032/0187: training loss 0.714 Epoch 61 iteration 0033/0187: training loss 0.712 Epoch 61 iteration 0034/0187: training loss 0.712 Epoch 61 iteration 0035/0187: training loss 0.714 Epoch 61 iteration 0036/0187: training loss 0.713 Epoch 61 iteration 0037/0187: training loss 0.712 Epoch 61 iteration 0038/0187: training loss 0.715 Epoch 61 iteration 0039/0187: training loss 0.715 Epoch 61 iteration 0040/0187: training loss 0.712 Epoch 61 iteration 0041/0187: training loss 0.710 Epoch 61 iteration 0042/0187: training loss 0.710 Epoch 61 iteration 0043/0187: training loss 0.709 Epoch 61 iteration 0044/0187: training loss 0.710 Epoch 61 iteration 0045/0187: training loss 0.712 Epoch 61 iteration 0046/0187: training loss 0.711 Epoch 61 iteration 0047/0187: training loss 0.712 Epoch 61 iteration 0048/0187: training loss 0.713 Epoch 61 iteration 0049/0187: training loss 0.714 Epoch 61 iteration 0050/0187: training loss 0.714 Epoch 61 iteration 0051/0187: training loss 0.716 Epoch 61 iteration 0052/0187: training loss 0.717 Epoch 61 iteration 0053/0187: training loss 0.715 Epoch 61 iteration 0054/0187: training loss 0.713 Epoch 61 iteration 0055/0187: training loss 0.713 Epoch 61 iteration 0056/0187: training loss 0.711 Epoch 61 iteration 0057/0187: training loss 0.710 Epoch 61 iteration 0058/0187: training loss 0.709 Epoch 61 iteration 0059/0187: training loss 0.711 Epoch 61 iteration 0060/0187: training loss 0.711 Epoch 61 iteration 0061/0187: training loss 0.711 Epoch 61 iteration 0062/0187: training loss 0.710 Epoch 61 iteration 0063/0187: training loss 0.711 Epoch 61 iteration 0064/0187: training loss 0.712 Epoch 61 iteration 0065/0187: training loss 0.711 Epoch 61 iteration 0066/0187: training loss 0.711 Epoch 61 iteration 0067/0187: training loss 0.712 Epoch 61 iteration 0068/0187: training loss 0.714 Epoch 61 iteration 0069/0187: training loss 0.715 Epoch 61 iteration 0070/0187: training loss 0.717 Epoch 61 iteration 0071/0187: training loss 0.716 Epoch 61 iteration 0072/0187: training loss 0.714 Epoch 61 iteration 0073/0187: training loss 0.716 Epoch 61 iteration 0074/0187: training loss 0.715 Epoch 61 iteration 0075/0187: training loss 0.717 Epoch 61 iteration 0076/0187: training loss 0.719 Epoch 61 iteration 0077/0187: training loss 0.717 Epoch 61 iteration 0078/0187: training loss 0.719 Epoch 61 iteration 0079/0187: training loss 0.718 Epoch 61 iteration 0080/0187: training loss 0.718 Epoch 61 iteration 0081/0187: training loss 0.718 Epoch 61 iteration 0082/0187: training loss 0.719 Epoch 61 iteration 0083/0187: training loss 0.717 Epoch 61 iteration 0084/0187: training loss 0.715 Epoch 61 iteration 0085/0187: training loss 0.714 Epoch 61 iteration 0086/0187: training loss 0.713 Epoch 61 iteration 0087/0187: training loss 0.712 Epoch 61 iteration 0088/0187: training loss 0.711 Epoch 61 iteration 0089/0187: training loss 0.710 Epoch 61 iteration 0090/0187: training loss 0.710 Epoch 61 iteration 0091/0187: training loss 0.709 Epoch 61 iteration 0092/0187: training loss 0.709 Epoch 61 iteration 0093/0187: training loss 0.708 Epoch 61 iteration 0094/0187: training loss 0.710 Epoch 61 iteration 0095/0187: training loss 0.709 Epoch 61 iteration 0096/0187: training loss 0.709 Epoch 61 iteration 0097/0187: training loss 0.709 Epoch 61 iteration 0098/0187: training loss 0.708 Epoch 61 iteration 0099/0187: training loss 0.708 Epoch 61 iteration 0100/0187: training loss 0.710 Epoch 61 iteration 0101/0187: training loss 0.710 Epoch 61 iteration 0102/0187: training loss 0.712 Epoch 61 iteration 0103/0187: training loss 0.713 Epoch 61 iteration 0104/0187: training loss 0.713 Epoch 61 iteration 0105/0187: training loss 0.714 Epoch 61 iteration 0106/0187: training loss 0.712 Epoch 61 iteration 0107/0187: training loss 0.714 Epoch 61 iteration 0108/0187: training loss 0.713 Epoch 61 iteration 0109/0187: training loss 0.713 Epoch 61 iteration 0110/0187: training loss 0.713 Epoch 61 iteration 0111/0187: training loss 0.712 Epoch 61 iteration 0112/0187: training loss 0.713 Epoch 61 iteration 0113/0187: training loss 0.713 Epoch 61 iteration 0114/0187: training loss 0.712 Epoch 61 iteration 0115/0187: training loss 0.713 Epoch 61 iteration 0116/0187: training loss 0.713 Epoch 61 iteration 0117/0187: training loss 0.714 Epoch 61 iteration 0118/0187: training loss 0.714 Epoch 61 iteration 0119/0187: training loss 0.714 Epoch 61 iteration 0120/0187: training loss 0.714 Epoch 61 iteration 0121/0187: training loss 0.714 Epoch 61 iteration 0122/0187: training loss 0.715 Epoch 61 iteration 0123/0187: training loss 0.714 Epoch 61 iteration 0124/0187: training loss 0.714 Epoch 61 iteration 0125/0187: training loss 0.715 Epoch 61 iteration 0126/0187: training loss 0.715 Epoch 61 iteration 0127/0187: training loss 0.715 Epoch 61 iteration 0128/0187: training loss 0.714 Epoch 61 iteration 0129/0187: training loss 0.715 Epoch 61 iteration 0130/0187: training loss 0.716 Epoch 61 iteration 0131/0187: training loss 0.715 Epoch 61 iteration 0132/0187: training loss 0.715 Epoch 61 iteration 0133/0187: training loss 0.715 Epoch 61 iteration 0134/0187: training loss 0.716 Epoch 61 iteration 0135/0187: training loss 0.716 Epoch 61 iteration 0136/0187: training loss 0.715 Epoch 61 iteration 0137/0187: training loss 0.714 Epoch 61 iteration 0138/0187: training loss 0.715 Epoch 61 iteration 0139/0187: training loss 0.715 Epoch 61 iteration 0140/0187: training loss 0.716 Epoch 61 iteration 0141/0187: training loss 0.717 Epoch 61 iteration 0142/0187: training loss 0.717 Epoch 61 iteration 0143/0187: training loss 0.716 Epoch 61 iteration 0144/0187: training loss 0.715 Epoch 61 iteration 0145/0187: training loss 0.715 Epoch 61 iteration 0146/0187: training loss 0.715 Epoch 61 iteration 0147/0187: training loss 0.714 Epoch 61 iteration 0148/0187: training loss 0.715 Epoch 61 iteration 0149/0187: training loss 0.715 Epoch 61 iteration 0150/0187: training loss 0.714 Epoch 61 iteration 0151/0187: training loss 0.715 Epoch 61 iteration 0152/0187: training loss 0.715 Epoch 61 iteration 0153/0187: training loss 0.715 Epoch 61 iteration 0154/0187: training loss 0.714 Epoch 61 iteration 0155/0187: training loss 0.714 Epoch 61 iteration 0156/0187: training loss 0.713 Epoch 61 iteration 0157/0187: training loss 0.713 Epoch 61 iteration 0158/0187: training loss 0.712 Epoch 61 iteration 0159/0187: training loss 0.713 Epoch 61 iteration 0160/0187: training loss 0.712 Epoch 61 iteration 0161/0187: training loss 0.712 Epoch 61 iteration 0162/0187: training loss 0.712 Epoch 61 iteration 0163/0187: training loss 0.713 Epoch 61 iteration 0164/0187: training loss 0.713 Epoch 61 iteration 0165/0187: training loss 0.713 Epoch 61 iteration 0166/0187: training loss 0.712 Epoch 61 iteration 0167/0187: training loss 0.712 Epoch 61 iteration 0168/0187: training loss 0.713 Epoch 61 iteration 0169/0187: training loss 0.714 Epoch 61 iteration 0170/0187: training loss 0.715 Epoch 61 iteration 0171/0187: training loss 0.716 Epoch 61 iteration 0172/0187: training loss 0.716 Epoch 61 iteration 0173/0187: training loss 0.715 Epoch 61 iteration 0174/0187: training loss 0.715 Epoch 61 iteration 0175/0187: training loss 0.715 Epoch 61 iteration 0176/0187: training loss 0.715 Epoch 61 iteration 0177/0187: training loss 0.715 Epoch 61 iteration 0178/0187: training loss 0.715 Epoch 61 iteration 0179/0187: training loss 0.715 Epoch 61 iteration 0180/0187: training loss 0.715 Epoch 61 iteration 0181/0187: training loss 0.715 Epoch 61 iteration 0182/0187: training loss 0.715 Epoch 61 iteration 0183/0187: training loss 0.716 Epoch 61 iteration 0184/0187: training loss 0.716 Epoch 61 iteration 0185/0187: training loss 0.715 Epoch 61 iteration 0186/0187: training loss 0.715 Epoch 61 iteration 0187/0187: training loss 0.715 Epoch 61 validation pixAcc: 0.873, mIoU: 0.385 Epoch 62 iteration 0001/0187: training loss 0.604 Epoch 62 iteration 0002/0187: training loss 0.665 Epoch 62 iteration 0003/0187: training loss 0.699 Epoch 62 iteration 0004/0187: training loss 0.695 Epoch 62 iteration 0005/0187: training loss 0.684 Epoch 62 iteration 0006/0187: training loss 0.666 Epoch 62 iteration 0007/0187: training loss 0.661 Epoch 62 iteration 0008/0187: training loss 0.665 Epoch 62 iteration 0009/0187: training loss 0.661 Epoch 62 iteration 0010/0187: training loss 0.663 Epoch 62 iteration 0011/0187: training loss 0.661 Epoch 62 iteration 0012/0187: training loss 0.679 Epoch 62 iteration 0013/0187: training loss 0.683 Epoch 62 iteration 0014/0187: training loss 0.685 Epoch 62 iteration 0015/0187: training loss 0.680 Epoch 62 iteration 0016/0187: training loss 0.674 Epoch 62 iteration 0017/0187: training loss 0.677 Epoch 62 iteration 0018/0187: training loss 0.675 Epoch 62 iteration 0019/0187: training loss 0.670 Epoch 62 iteration 0020/0187: training loss 0.674 Epoch 62 iteration 0021/0187: training loss 0.677 Epoch 62 iteration 0022/0187: training loss 0.675 Epoch 62 iteration 0023/0187: training loss 0.679 Epoch 62 iteration 0024/0187: training loss 0.677 Epoch 62 iteration 0025/0187: training loss 0.681 Epoch 62 iteration 0026/0187: training loss 0.683 Epoch 62 iteration 0027/0187: training loss 0.675 Epoch 62 iteration 0028/0187: training loss 0.688 Epoch 62 iteration 0029/0187: training loss 0.688 Epoch 62 iteration 0030/0187: training loss 0.695 Epoch 62 iteration 0031/0187: training loss 0.700 Epoch 62 iteration 0032/0187: training loss 0.699 Epoch 62 iteration 0033/0187: training loss 0.700 Epoch 62 iteration 0034/0187: training loss 0.700 Epoch 62 iteration 0035/0187: training loss 0.701 Epoch 62 iteration 0036/0187: training loss 0.699 Epoch 62 iteration 0037/0187: training loss 0.698 Epoch 62 iteration 0038/0187: training loss 0.695 Epoch 62 iteration 0039/0187: training loss 0.696 Epoch 62 iteration 0040/0187: training loss 0.695 Epoch 62 iteration 0041/0187: training loss 0.694 Epoch 62 iteration 0042/0187: training loss 0.702 Epoch 62 iteration 0043/0187: training loss 0.702 Epoch 62 iteration 0044/0187: training loss 0.701 Epoch 62 iteration 0045/0187: training loss 0.698 Epoch 62 iteration 0046/0187: training loss 0.697 Epoch 62 iteration 0047/0187: training loss 0.697 Epoch 62 iteration 0048/0187: training loss 0.696 Epoch 62 iteration 0049/0187: training loss 0.697 Epoch 62 iteration 0050/0187: training loss 0.695 Epoch 62 iteration 0051/0187: training loss 0.697 Epoch 62 iteration 0052/0187: training loss 0.698 Epoch 62 iteration 0053/0187: training loss 0.703 Epoch 62 iteration 0054/0187: training loss 0.701 Epoch 62 iteration 0055/0187: training loss 0.701 Epoch 62 iteration 0056/0187: training loss 0.701 Epoch 62 iteration 0057/0187: training loss 0.702 Epoch 62 iteration 0058/0187: training loss 0.706 Epoch 62 iteration 0059/0187: training loss 0.709 Epoch 62 iteration 0060/0187: training loss 0.711 Epoch 62 iteration 0061/0187: training loss 0.711 Epoch 62 iteration 0062/0187: training loss 0.713 Epoch 62 iteration 0063/0187: training loss 0.710 Epoch 62 iteration 0064/0187: training loss 0.708 Epoch 62 iteration 0065/0187: training loss 0.709 Epoch 62 iteration 0066/0187: training loss 0.710 Epoch 62 iteration 0067/0187: training loss 0.709 Epoch 62 iteration 0068/0187: training loss 0.708 Epoch 62 iteration 0069/0187: training loss 0.706 Epoch 62 iteration 0070/0187: training loss 0.706 Epoch 62 iteration 0071/0187: training loss 0.705 Epoch 62 iteration 0072/0187: training loss 0.704 Epoch 62 iteration 0073/0187: training loss 0.705 Epoch 62 iteration 0074/0187: training loss 0.705 Epoch 62 iteration 0075/0187: training loss 0.709 Epoch 62 iteration 0076/0187: training loss 0.708 Epoch 62 iteration 0077/0187: training loss 0.708 Epoch 62 iteration 0078/0187: training loss 0.707 Epoch 62 iteration 0079/0187: training loss 0.707 Epoch 62 iteration 0080/0187: training loss 0.709 Epoch 62 iteration 0081/0187: training loss 0.709 Epoch 62 iteration 0082/0187: training loss 0.710 Epoch 62 iteration 0083/0187: training loss 0.708 Epoch 62 iteration 0084/0187: training loss 0.708 Epoch 62 iteration 0085/0187: training loss 0.709 Epoch 62 iteration 0086/0187: training loss 0.709 Epoch 62 iteration 0087/0187: training loss 0.710 Epoch 62 iteration 0088/0187: training loss 0.709 Epoch 62 iteration 0089/0187: training loss 0.707 Epoch 62 iteration 0090/0187: training loss 0.707 Epoch 62 iteration 0091/0188: training loss 0.709 Epoch 62 iteration 0092/0188: training loss 0.710 Epoch 62 iteration 0093/0188: training loss 0.709 Epoch 62 iteration 0094/0188: training loss 0.711 Epoch 62 iteration 0095/0188: training loss 0.711 Epoch 62 iteration 0096/0188: training loss 0.711 Epoch 62 iteration 0097/0188: training loss 0.710 Epoch 62 iteration 0098/0188: training loss 0.709 Epoch 62 iteration 0099/0188: training loss 0.709 Epoch 62 iteration 0100/0188: training loss 0.709 Epoch 62 iteration 0101/0188: training loss 0.709 Epoch 62 iteration 0102/0188: training loss 0.708 Epoch 62 iteration 0103/0188: training loss 0.707 Epoch 62 iteration 0104/0188: training loss 0.707 Epoch 62 iteration 0105/0188: training loss 0.706 Epoch 62 iteration 0106/0188: training loss 0.706 Epoch 62 iteration 0107/0188: training loss 0.706 Epoch 62 iteration 0108/0188: training loss 0.706 Epoch 62 iteration 0109/0188: training loss 0.707 Epoch 62 iteration 0110/0188: training loss 0.708 Epoch 62 iteration 0111/0188: training loss 0.709 Epoch 62 iteration 0112/0188: training loss 0.710 Epoch 62 iteration 0113/0188: training loss 0.710 Epoch 62 iteration 0114/0188: training loss 0.710 Epoch 62 iteration 0115/0188: training loss 0.710 Epoch 62 iteration 0116/0188: training loss 0.711 Epoch 62 iteration 0117/0188: training loss 0.709 Epoch 62 iteration 0118/0188: training loss 0.710 Epoch 62 iteration 0119/0188: training loss 0.709 Epoch 62 iteration 0120/0188: training loss 0.709 Epoch 62 iteration 0121/0188: training loss 0.710 Epoch 62 iteration 0122/0188: training loss 0.710 Epoch 62 iteration 0123/0188: training loss 0.710 Epoch 62 iteration 0124/0188: training loss 0.710 Epoch 62 iteration 0125/0188: training loss 0.711 Epoch 62 iteration 0126/0188: training loss 0.711 Epoch 62 iteration 0127/0188: training loss 0.712 Epoch 62 iteration 0128/0188: training loss 0.710 Epoch 62 iteration 0129/0188: training loss 0.711 Epoch 62 iteration 0130/0188: training loss 0.711 Epoch 62 iteration 0131/0188: training loss 0.711 Epoch 62 iteration 0132/0188: training loss 0.711 Epoch 62 iteration 0133/0188: training loss 0.710 Epoch 62 iteration 0134/0188: training loss 0.710 Epoch 62 iteration 0135/0188: training loss 0.710 Epoch 62 iteration 0136/0188: training loss 0.710 Epoch 62 iteration 0137/0188: training loss 0.710 Epoch 62 iteration 0138/0188: training loss 0.709 Epoch 62 iteration 0139/0188: training loss 0.708 Epoch 62 iteration 0140/0188: training loss 0.708 Epoch 62 iteration 0141/0188: training loss 0.709 Epoch 62 iteration 0142/0188: training loss 0.709 Epoch 62 iteration 0143/0188: training loss 0.709 Epoch 62 iteration 0144/0188: training loss 0.710 Epoch 62 iteration 0145/0188: training loss 0.709 Epoch 62 iteration 0146/0188: training loss 0.710 Epoch 62 iteration 0147/0188: training loss 0.710 Epoch 62 iteration 0148/0188: training loss 0.710 Epoch 62 iteration 0149/0188: training loss 0.710 Epoch 62 iteration 0150/0188: training loss 0.711 Epoch 62 iteration 0151/0188: training loss 0.711 Epoch 62 iteration 0152/0188: training loss 0.711 Epoch 62 iteration 0153/0188: training loss 0.713 Epoch 62 iteration 0154/0188: training loss 0.713 Epoch 62 iteration 0155/0188: training loss 0.712 Epoch 62 iteration 0156/0188: training loss 0.713 Epoch 62 iteration 0157/0188: training loss 0.712 Epoch 62 iteration 0158/0188: training loss 0.712 Epoch 62 iteration 0159/0188: training loss 0.711 Epoch 62 iteration 0160/0188: training loss 0.711 Epoch 62 iteration 0161/0188: training loss 0.711 Epoch 62 iteration 0162/0188: training loss 0.711 Epoch 62 iteration 0163/0188: training loss 0.711 Epoch 62 iteration 0164/0188: training loss 0.711 Epoch 62 iteration 0165/0188: training loss 0.711 Epoch 62 iteration 0166/0188: training loss 0.712 Epoch 62 iteration 0167/0188: training loss 0.713 Epoch 62 iteration 0168/0188: training loss 0.713 Epoch 62 iteration 0169/0188: training loss 0.713 Epoch 62 iteration 0170/0188: training loss 0.713 Epoch 62 iteration 0171/0188: training loss 0.712 Epoch 62 iteration 0172/0188: training loss 0.712 Epoch 62 iteration 0173/0188: training loss 0.714 Epoch 62 iteration 0174/0188: training loss 0.714 Epoch 62 iteration 0175/0188: training loss 0.714 Epoch 62 iteration 0176/0188: training loss 0.715 Epoch 62 iteration 0177/0188: training loss 0.716 Epoch 62 iteration 0178/0188: training loss 0.716 Epoch 62 iteration 0179/0188: training loss 0.716 Epoch 62 iteration 0180/0188: training loss 0.717 Epoch 62 iteration 0181/0188: training loss 0.717 Epoch 62 iteration 0182/0188: training loss 0.716 Epoch 62 iteration 0183/0188: training loss 0.717 Epoch 62 iteration 0184/0188: training loss 0.717 Epoch 62 iteration 0185/0188: training loss 0.716 Epoch 62 iteration 0186/0188: training loss 0.716 Epoch 62 validation pixAcc: 0.875, mIoU: 0.383 Epoch 63 iteration 0001/0187: training loss 0.680 Epoch 63 iteration 0002/0187: training loss 0.644 Epoch 63 iteration 0003/0187: training loss 0.630 Epoch 63 iteration 0004/0187: training loss 0.655 Epoch 63 iteration 0005/0187: training loss 0.669 Epoch 63 iteration 0006/0187: training loss 0.647 Epoch 63 iteration 0007/0187: training loss 0.658 Epoch 63 iteration 0008/0187: training loss 0.647 Epoch 63 iteration 0009/0187: training loss 0.653 Epoch 63 iteration 0010/0187: training loss 0.654 Epoch 63 iteration 0011/0187: training loss 0.661 Epoch 63 iteration 0012/0187: training loss 0.657 Epoch 63 iteration 0013/0187: training loss 0.658 Epoch 63 iteration 0014/0187: training loss 0.649 Epoch 63 iteration 0015/0187: training loss 0.638 Epoch 63 iteration 0016/0187: training loss 0.648 Epoch 63 iteration 0017/0187: training loss 0.650 Epoch 63 iteration 0018/0187: training loss 0.662 Epoch 63 iteration 0019/0187: training loss 0.661 Epoch 63 iteration 0020/0187: training loss 0.666 Epoch 63 iteration 0021/0187: training loss 0.669 Epoch 63 iteration 0022/0187: training loss 0.681 Epoch 63 iteration 0023/0187: training loss 0.681 Epoch 63 iteration 0024/0187: training loss 0.677 Epoch 63 iteration 0025/0187: training loss 0.680 Epoch 63 iteration 0026/0187: training loss 0.683 Epoch 63 iteration 0027/0187: training loss 0.681 Epoch 63 iteration 0028/0187: training loss 0.678 Epoch 63 iteration 0029/0187: training loss 0.675 Epoch 63 iteration 0030/0187: training loss 0.676 Epoch 63 iteration 0031/0187: training loss 0.675 Epoch 63 iteration 0032/0187: training loss 0.679 Epoch 63 iteration 0033/0187: training loss 0.679 Epoch 63 iteration 0034/0187: training loss 0.683 Epoch 63 iteration 0035/0187: training loss 0.683 Epoch 63 iteration 0036/0187: training loss 0.685 Epoch 63 iteration 0037/0187: training loss 0.682 Epoch 63 iteration 0038/0187: training loss 0.683 Epoch 63 iteration 0039/0187: training loss 0.683 Epoch 63 iteration 0040/0187: training loss 0.682 Epoch 63 iteration 0041/0187: training loss 0.682 Epoch 63 iteration 0042/0187: training loss 0.681 Epoch 63 iteration 0043/0187: training loss 0.678 Epoch 63 iteration 0044/0187: training loss 0.679 Epoch 63 iteration 0045/0187: training loss 0.678 Epoch 63 iteration 0046/0187: training loss 0.675 Epoch 63 iteration 0047/0187: training loss 0.676 Epoch 63 iteration 0048/0187: training loss 0.675 Epoch 63 iteration 0049/0187: training loss 0.676 Epoch 63 iteration 0050/0187: training loss 0.680 Epoch 63 iteration 0051/0187: training loss 0.681 Epoch 63 iteration 0052/0187: training loss 0.682 Epoch 63 iteration 0053/0187: training loss 0.684 Epoch 63 iteration 0054/0187: training loss 0.686 Epoch 63 iteration 0055/0187: training loss 0.686 Epoch 63 iteration 0056/0187: training loss 0.685 Epoch 63 iteration 0057/0187: training loss 0.684 Epoch 63 iteration 0058/0187: training loss 0.682 Epoch 63 iteration 0059/0187: training loss 0.680 Epoch 63 iteration 0060/0187: training loss 0.679 Epoch 63 iteration 0061/0187: training loss 0.677 Epoch 63 iteration 0062/0187: training loss 0.677 Epoch 63 iteration 0063/0187: training loss 0.678 Epoch 63 iteration 0064/0187: training loss 0.678 Epoch 63 iteration 0065/0187: training loss 0.678 Epoch 63 iteration 0066/0187: training loss 0.679 Epoch 63 iteration 0067/0187: training loss 0.681 Epoch 63 iteration 0068/0187: training loss 0.679 Epoch 63 iteration 0069/0187: training loss 0.682 Epoch 63 iteration 0070/0187: training loss 0.680 Epoch 63 iteration 0071/0187: training loss 0.682 Epoch 63 iteration 0072/0187: training loss 0.683 Epoch 63 iteration 0073/0187: training loss 0.681 Epoch 63 iteration 0074/0187: training loss 0.683 Epoch 63 iteration 0075/0187: training loss 0.688 Epoch 63 iteration 0076/0187: training loss 0.689 Epoch 63 iteration 0077/0187: training loss 0.689 Epoch 63 iteration 0078/0187: training loss 0.690 Epoch 63 iteration 0079/0187: training loss 0.692 Epoch 63 iteration 0080/0187: training loss 0.692 Epoch 63 iteration 0081/0187: training loss 0.693 Epoch 63 iteration 0082/0187: training loss 0.694 Epoch 63 iteration 0083/0187: training loss 0.694 Epoch 63 iteration 0084/0187: training loss 0.696 Epoch 63 iteration 0085/0187: training loss 0.694 Epoch 63 iteration 0086/0187: training loss 0.695 Epoch 63 iteration 0087/0187: training loss 0.695 Epoch 63 iteration 0088/0187: training loss 0.696 Epoch 63 iteration 0089/0187: training loss 0.695 Epoch 63 iteration 0090/0187: training loss 0.695 Epoch 63 iteration 0091/0187: training loss 0.696 Epoch 63 iteration 0092/0187: training loss 0.696 Epoch 63 iteration 0093/0187: training loss 0.696 Epoch 63 iteration 0094/0187: training loss 0.695 Epoch 63 iteration 0095/0187: training loss 0.694 Epoch 63 iteration 0096/0187: training loss 0.694 Epoch 63 iteration 0097/0187: training loss 0.694 Epoch 63 iteration 0098/0187: training loss 0.693 Epoch 63 iteration 0099/0187: training loss 0.693 Epoch 63 iteration 0100/0187: training loss 0.692 Epoch 63 iteration 0101/0187: training loss 0.692 Epoch 63 iteration 0102/0187: training loss 0.693 Epoch 63 iteration 0103/0187: training loss 0.693 Epoch 63 iteration 0104/0187: training loss 0.692 Epoch 63 iteration 0105/0187: training loss 0.692 Epoch 63 iteration 0106/0187: training loss 0.692 Epoch 63 iteration 0107/0187: training loss 0.692 Epoch 63 iteration 0108/0187: training loss 0.692 Epoch 63 iteration 0109/0187: training loss 0.692 Epoch 63 iteration 0110/0187: training loss 0.692 Epoch 63 iteration 0111/0187: training loss 0.691 Epoch 63 iteration 0112/0187: training loss 0.691 Epoch 63 iteration 0113/0187: training loss 0.690 Epoch 63 iteration 0114/0187: training loss 0.691 Epoch 63 iteration 0115/0187: training loss 0.690 Epoch 63 iteration 0116/0187: training loss 0.689 Epoch 63 iteration 0117/0187: training loss 0.690 Epoch 63 iteration 0118/0187: training loss 0.689 Epoch 63 iteration 0119/0187: training loss 0.689 Epoch 63 iteration 0120/0187: training loss 0.690 Epoch 63 iteration 0121/0187: training loss 0.693 Epoch 63 iteration 0122/0187: training loss 0.694 Epoch 63 iteration 0123/0187: training loss 0.694 Epoch 63 iteration 0124/0187: training loss 0.693 Epoch 63 iteration 0125/0187: training loss 0.693 Epoch 63 iteration 0126/0187: training loss 0.693 Epoch 63 iteration 0127/0187: training loss 0.694 Epoch 63 iteration 0128/0187: training loss 0.695 Epoch 63 iteration 0129/0187: training loss 0.694 Epoch 63 iteration 0130/0187: training loss 0.695 Epoch 63 iteration 0131/0187: training loss 0.695 Epoch 63 iteration 0132/0187: training loss 0.695 Epoch 63 iteration 0133/0187: training loss 0.696 Epoch 63 iteration 0134/0187: training loss 0.696 Epoch 63 iteration 0135/0187: training loss 0.696 Epoch 63 iteration 0136/0187: training loss 0.695 Epoch 63 iteration 0137/0187: training loss 0.694 Epoch 63 iteration 0138/0187: training loss 0.694 Epoch 63 iteration 0139/0187: training loss 0.694 Epoch 63 iteration 0140/0187: training loss 0.693 Epoch 63 iteration 0141/0187: training loss 0.694 Epoch 63 iteration 0142/0187: training loss 0.694 Epoch 63 iteration 0143/0187: training loss 0.693 Epoch 63 iteration 0144/0187: training loss 0.693 Epoch 63 iteration 0145/0187: training loss 0.693 Epoch 63 iteration 0146/0187: training loss 0.693 Epoch 63 iteration 0147/0187: training loss 0.693 Epoch 63 iteration 0148/0187: training loss 0.693 Epoch 63 iteration 0149/0187: training loss 0.693 Epoch 63 iteration 0150/0187: training loss 0.693 Epoch 63 iteration 0151/0187: training loss 0.693 Epoch 63 iteration 0152/0187: training loss 0.694 Epoch 63 iteration 0153/0187: training loss 0.694 Epoch 63 iteration 0154/0187: training loss 0.693 Epoch 63 iteration 0155/0187: training loss 0.692 Epoch 63 iteration 0156/0187: training loss 0.692 Epoch 63 iteration 0157/0187: training loss 0.692 Epoch 63 iteration 0158/0187: training loss 0.692 Epoch 63 iteration 0159/0187: training loss 0.692 Epoch 63 iteration 0160/0187: training loss 0.692 Epoch 63 iteration 0161/0187: training loss 0.694 Epoch 63 iteration 0162/0187: training loss 0.694 Epoch 63 iteration 0163/0187: training loss 0.693 Epoch 63 iteration 0164/0187: training loss 0.693 Epoch 63 iteration 0165/0187: training loss 0.694 Epoch 63 iteration 0166/0187: training loss 0.693 Epoch 63 iteration 0167/0187: training loss 0.692 Epoch 63 iteration 0168/0187: training loss 0.692 Epoch 63 iteration 0169/0187: training loss 0.692 Epoch 63 iteration 0170/0187: training loss 0.692 Epoch 63 iteration 0171/0187: training loss 0.692 Epoch 63 iteration 0172/0187: training loss 0.692 Epoch 63 iteration 0173/0187: training loss 0.691 Epoch 63 iteration 0174/0187: training loss 0.692 Epoch 63 iteration 0175/0187: training loss 0.691 Epoch 63 iteration 0176/0187: training loss 0.692 Epoch 63 iteration 0177/0187: training loss 0.692 Epoch 63 iteration 0178/0187: training loss 0.692 Epoch 63 iteration 0179/0187: training loss 0.692 Epoch 63 iteration 0180/0187: training loss 0.692 Epoch 63 iteration 0181/0187: training loss 0.693 Epoch 63 iteration 0182/0187: training loss 0.693 Epoch 63 iteration 0183/0187: training loss 0.694 Epoch 63 iteration 0184/0187: training loss 0.695 Epoch 63 iteration 0185/0187: training loss 0.695 Epoch 63 iteration 0186/0187: training loss 0.696 Epoch 63 iteration 0187/0187: training loss 0.696 Epoch 63 validation pixAcc: 0.875, mIoU: 0.386 Epoch 64 iteration 0001/0187: training loss 0.856 Epoch 64 iteration 0002/0187: training loss 0.756 Epoch 64 iteration 0003/0187: training loss 0.727 Epoch 64 iteration 0004/0187: training loss 0.713 Epoch 64 iteration 0005/0187: training loss 0.740 Epoch 64 iteration 0006/0187: training loss 0.744 Epoch 64 iteration 0007/0187: training loss 0.745 Epoch 64 iteration 0008/0187: training loss 0.744 Epoch 64 iteration 0009/0187: training loss 0.752 Epoch 64 iteration 0010/0187: training loss 0.742 Epoch 64 iteration 0011/0187: training loss 0.736 Epoch 64 iteration 0012/0187: training loss 0.726 Epoch 64 iteration 0013/0187: training loss 0.718 Epoch 64 iteration 0014/0187: training loss 0.716 Epoch 64 iteration 0015/0187: training loss 0.726 Epoch 64 iteration 0016/0187: training loss 0.730 Epoch 64 iteration 0017/0187: training loss 0.734 Epoch 64 iteration 0018/0187: training loss 0.734 Epoch 64 iteration 0019/0187: training loss 0.733 Epoch 64 iteration 0020/0187: training loss 0.734 Epoch 64 iteration 0021/0187: training loss 0.729 Epoch 64 iteration 0022/0187: training loss 0.730 Epoch 64 iteration 0023/0187: training loss 0.724 Epoch 64 iteration 0024/0187: training loss 0.727 Epoch 64 iteration 0025/0187: training loss 0.721 Epoch 64 iteration 0026/0187: training loss 0.714 Epoch 64 iteration 0027/0187: training loss 0.715 Epoch 64 iteration 0028/0187: training loss 0.710 Epoch 64 iteration 0029/0187: training loss 0.707 Epoch 64 iteration 0030/0187: training loss 0.707 Epoch 64 iteration 0031/0187: training loss 0.710 Epoch 64 iteration 0032/0187: training loss 0.712 Epoch 64 iteration 0033/0187: training loss 0.712 Epoch 64 iteration 0034/0187: training loss 0.712 Epoch 64 iteration 0035/0187: training loss 0.711 Epoch 64 iteration 0036/0187: training loss 0.716 Epoch 64 iteration 0037/0187: training loss 0.712 Epoch 64 iteration 0038/0187: training loss 0.716 Epoch 64 iteration 0039/0187: training loss 0.719 Epoch 64 iteration 0040/0187: training loss 0.716 Epoch 64 iteration 0041/0187: training loss 0.716 Epoch 64 iteration 0042/0187: training loss 0.722 Epoch 64 iteration 0043/0187: training loss 0.722 Epoch 64 iteration 0044/0187: training loss 0.720 Epoch 64 iteration 0045/0187: training loss 0.721 Epoch 64 iteration 0046/0187: training loss 0.723 Epoch 64 iteration 0047/0187: training loss 0.720 Epoch 64 iteration 0048/0187: training loss 0.720 Epoch 64 iteration 0049/0187: training loss 0.723 Epoch 64 iteration 0050/0187: training loss 0.726 Epoch 64 iteration 0051/0187: training loss 0.730 Epoch 64 iteration 0052/0187: training loss 0.729 Epoch 64 iteration 0053/0187: training loss 0.731 Epoch 64 iteration 0054/0187: training loss 0.735 Epoch 64 iteration 0055/0187: training loss 0.736 Epoch 64 iteration 0056/0187: training loss 0.738 Epoch 64 iteration 0057/0187: training loss 0.740 Epoch 64 iteration 0058/0187: training loss 0.738 Epoch 64 iteration 0059/0187: training loss 0.737 Epoch 64 iteration 0060/0187: training loss 0.736 Epoch 64 iteration 0061/0187: training loss 0.736 Epoch 64 iteration 0062/0187: training loss 0.737 Epoch 64 iteration 0063/0187: training loss 0.736 Epoch 64 iteration 0064/0187: training loss 0.739 Epoch 64 iteration 0065/0187: training loss 0.738 Epoch 64 iteration 0066/0187: training loss 0.739 Epoch 64 iteration 0067/0187: training loss 0.739 Epoch 64 iteration 0068/0187: training loss 0.737 Epoch 64 iteration 0069/0187: training loss 0.738 Epoch 64 iteration 0070/0187: training loss 0.737 Epoch 64 iteration 0071/0187: training loss 0.736 Epoch 64 iteration 0072/0187: training loss 0.738 Epoch 64 iteration 0073/0187: training loss 0.738 Epoch 64 iteration 0074/0187: training loss 0.738 Epoch 64 iteration 0075/0187: training loss 0.739 Epoch 64 iteration 0076/0187: training loss 0.740 Epoch 64 iteration 0077/0187: training loss 0.738 Epoch 64 iteration 0078/0187: training loss 0.736 Epoch 64 iteration 0079/0187: training loss 0.737 Epoch 64 iteration 0080/0187: training loss 0.736 Epoch 64 iteration 0081/0187: training loss 0.735 Epoch 64 iteration 0082/0187: training loss 0.736 Epoch 64 iteration 0083/0187: training loss 0.736 Epoch 64 iteration 0084/0187: training loss 0.736 Epoch 64 iteration 0085/0187: training loss 0.737 Epoch 64 iteration 0086/0187: training loss 0.734 Epoch 64 iteration 0087/0187: training loss 0.735 Epoch 64 iteration 0088/0187: training loss 0.735 Epoch 64 iteration 0089/0187: training loss 0.734 Epoch 64 iteration 0090/0187: training loss 0.732 Epoch 64 iteration 0091/0188: training loss 0.731 Epoch 64 iteration 0092/0188: training loss 0.730 Epoch 64 iteration 0093/0188: training loss 0.732 Epoch 64 iteration 0094/0188: training loss 0.732 Epoch 64 iteration 0095/0188: training loss 0.733 Epoch 64 iteration 0096/0188: training loss 0.734 Epoch 64 iteration 0097/0188: training loss 0.732 Epoch 64 iteration 0098/0188: training loss 0.732 Epoch 64 iteration 0099/0188: training loss 0.732 Epoch 64 iteration 0100/0188: training loss 0.731 Epoch 64 iteration 0101/0188: training loss 0.730 Epoch 64 iteration 0102/0188: training loss 0.731 Epoch 64 iteration 0103/0188: training loss 0.730 Epoch 64 iteration 0104/0188: training loss 0.729 Epoch 64 iteration 0105/0188: training loss 0.729 Epoch 64 iteration 0106/0188: training loss 0.729 Epoch 64 iteration 0107/0188: training loss 0.731 Epoch 64 iteration 0108/0188: training loss 0.731 Epoch 64 iteration 0109/0188: training loss 0.732 Epoch 64 iteration 0110/0188: training loss 0.731 Epoch 64 iteration 0111/0188: training loss 0.731 Epoch 64 iteration 0112/0188: training loss 0.731 Epoch 64 iteration 0113/0188: training loss 0.730 Epoch 64 iteration 0114/0188: training loss 0.730 Epoch 64 iteration 0115/0188: training loss 0.730 Epoch 64 iteration 0116/0188: training loss 0.729 Epoch 64 iteration 0117/0188: training loss 0.729 Epoch 64 iteration 0118/0188: training loss 0.731 Epoch 64 iteration 0119/0188: training loss 0.729 Epoch 64 iteration 0120/0188: training loss 0.729 Epoch 64 iteration 0121/0188: training loss 0.731 Epoch 64 iteration 0122/0188: training loss 0.731 Epoch 64 iteration 0123/0188: training loss 0.732 Epoch 64 iteration 0124/0188: training loss 0.732 Epoch 64 iteration 0125/0188: training loss 0.733 Epoch 64 iteration 0126/0188: training loss 0.733 Epoch 64 iteration 0127/0188: training loss 0.734 Epoch 64 iteration 0128/0188: training loss 0.733 Epoch 64 iteration 0129/0188: training loss 0.732 Epoch 64 iteration 0130/0188: training loss 0.733 Epoch 64 iteration 0131/0188: training loss 0.732 Epoch 64 iteration 0132/0188: training loss 0.732 Epoch 64 iteration 0133/0188: training loss 0.730 Epoch 64 iteration 0134/0188: training loss 0.730 Epoch 64 iteration 0135/0188: training loss 0.729 Epoch 64 iteration 0136/0188: training loss 0.729 Epoch 64 iteration 0137/0188: training loss 0.728 Epoch 64 iteration 0138/0188: training loss 0.727 Epoch 64 iteration 0139/0188: training loss 0.727 Epoch 64 iteration 0140/0188: training loss 0.727 Epoch 64 iteration 0141/0188: training loss 0.727 Epoch 64 iteration 0142/0188: training loss 0.727 Epoch 64 iteration 0143/0188: training loss 0.727 Epoch 64 iteration 0144/0188: training loss 0.728 Epoch 64 iteration 0145/0188: training loss 0.727 Epoch 64 iteration 0146/0188: training loss 0.728 Epoch 64 iteration 0147/0188: training loss 0.726 Epoch 64 iteration 0148/0188: training loss 0.725 Epoch 64 iteration 0149/0188: training loss 0.725 Epoch 64 iteration 0150/0188: training loss 0.724 Epoch 64 iteration 0151/0188: training loss 0.725 Epoch 64 iteration 0152/0188: training loss 0.725 Epoch 64 iteration 0153/0188: training loss 0.724 Epoch 64 iteration 0154/0188: training loss 0.724 Epoch 64 iteration 0155/0188: training loss 0.724 Epoch 64 iteration 0156/0188: training loss 0.724 Epoch 64 iteration 0157/0188: training loss 0.723 Epoch 64 iteration 0158/0188: training loss 0.723 Epoch 64 iteration 0159/0188: training loss 0.723 Epoch 64 iteration 0160/0188: training loss 0.722 Epoch 64 iteration 0161/0188: training loss 0.723 Epoch 64 iteration 0162/0188: training loss 0.722 Epoch 64 iteration 0163/0188: training loss 0.721 Epoch 64 iteration 0164/0188: training loss 0.721 Epoch 64 iteration 0165/0188: training loss 0.721 Epoch 64 iteration 0166/0188: training loss 0.721 Epoch 64 iteration 0167/0188: training loss 0.721 Epoch 64 iteration 0168/0188: training loss 0.721 Epoch 64 iteration 0169/0188: training loss 0.721 Epoch 64 iteration 0170/0188: training loss 0.722 Epoch 64 iteration 0171/0188: training loss 0.724 Epoch 64 iteration 0172/0188: training loss 0.723 Epoch 64 iteration 0173/0188: training loss 0.722 Epoch 64 iteration 0174/0188: training loss 0.722 Epoch 64 iteration 0175/0188: training loss 0.721 Epoch 64 iteration 0176/0188: training loss 0.721 Epoch 64 iteration 0177/0188: training loss 0.721 Epoch 64 iteration 0178/0188: training loss 0.721 Epoch 64 iteration 0179/0188: training loss 0.721 Epoch 64 iteration 0180/0188: training loss 0.721 Epoch 64 iteration 0181/0188: training loss 0.721 Epoch 64 iteration 0182/0188: training loss 0.721 Epoch 64 iteration 0183/0188: training loss 0.721 Epoch 64 iteration 0184/0188: training loss 0.720 Epoch 64 iteration 0185/0188: training loss 0.720 Epoch 64 iteration 0186/0188: training loss 0.720 Epoch 64 validation pixAcc: 0.875, mIoU: 0.388 Epoch 65 iteration 0001/0187: training loss 0.896 Epoch 65 iteration 0002/0187: training loss 0.837 Epoch 65 iteration 0003/0187: training loss 0.887 Epoch 65 iteration 0004/0187: training loss 0.844 Epoch 65 iteration 0005/0187: training loss 0.806 Epoch 65 iteration 0006/0187: training loss 0.815 Epoch 65 iteration 0007/0187: training loss 0.784 Epoch 65 iteration 0008/0187: training loss 0.765 Epoch 65 iteration 0009/0187: training loss 0.759 Epoch 65 iteration 0010/0187: training loss 0.762 Epoch 65 iteration 0011/0187: training loss 0.747 Epoch 65 iteration 0012/0187: training loss 0.740 Epoch 65 iteration 0013/0187: training loss 0.748 Epoch 65 iteration 0014/0187: training loss 0.744 Epoch 65 iteration 0015/0187: training loss 0.742 Epoch 65 iteration 0016/0187: training loss 0.731 Epoch 65 iteration 0017/0187: training loss 0.737 Epoch 65 iteration 0018/0187: training loss 0.729 Epoch 65 iteration 0019/0187: training loss 0.728 Epoch 65 iteration 0020/0187: training loss 0.726 Epoch 65 iteration 0021/0187: training loss 0.727 Epoch 65 iteration 0022/0187: training loss 0.727 Epoch 65 iteration 0023/0187: training loss 0.723 Epoch 65 iteration 0024/0187: training loss 0.718 Epoch 65 iteration 0025/0187: training loss 0.729 Epoch 65 iteration 0026/0187: training loss 0.731 Epoch 65 iteration 0027/0187: training loss 0.726 Epoch 65 iteration 0028/0187: training loss 0.724 Epoch 65 iteration 0029/0187: training loss 0.721 Epoch 65 iteration 0030/0187: training loss 0.722 Epoch 65 iteration 0031/0187: training loss 0.722 Epoch 65 iteration 0032/0187: training loss 0.719 Epoch 65 iteration 0033/0187: training loss 0.716 Epoch 65 iteration 0034/0187: training loss 0.716 Epoch 65 iteration 0035/0187: training loss 0.717 Epoch 65 iteration 0036/0187: training loss 0.716 Epoch 65 iteration 0037/0187: training loss 0.726 Epoch 65 iteration 0038/0187: training loss 0.728 Epoch 65 iteration 0039/0187: training loss 0.728 Epoch 65 iteration 0040/0187: training loss 0.728 Epoch 65 iteration 0041/0187: training loss 0.727 Epoch 65 iteration 0042/0187: training loss 0.727 Epoch 65 iteration 0043/0187: training loss 0.726 Epoch 65 iteration 0044/0187: training loss 0.729 Epoch 65 iteration 0045/0187: training loss 0.727 Epoch 65 iteration 0046/0187: training loss 0.724 Epoch 65 iteration 0047/0187: training loss 0.721 Epoch 65 iteration 0048/0187: training loss 0.719 Epoch 65 iteration 0049/0187: training loss 0.721 Epoch 65 iteration 0050/0187: training loss 0.720 Epoch 65 iteration 0051/0187: training loss 0.722 Epoch 65 iteration 0052/0187: training loss 0.719 Epoch 65 iteration 0053/0187: training loss 0.719 Epoch 65 iteration 0054/0187: training loss 0.717 Epoch 65 iteration 0055/0187: training loss 0.719 Epoch 65 iteration 0056/0187: training loss 0.719 Epoch 65 iteration 0057/0187: training loss 0.719 Epoch 65 iteration 0058/0187: training loss 0.718 Epoch 65 iteration 0059/0187: training loss 0.716 Epoch 65 iteration 0060/0187: training loss 0.717 Epoch 65 iteration 0061/0187: training loss 0.716 Epoch 65 iteration 0062/0187: training loss 0.715 Epoch 65 iteration 0063/0187: training loss 0.714 Epoch 65 iteration 0064/0187: training loss 0.719 Epoch 65 iteration 0065/0187: training loss 0.717 Epoch 65 iteration 0066/0187: training loss 0.716 Epoch 65 iteration 0067/0187: training loss 0.717 Epoch 65 iteration 0068/0187: training loss 0.717 Epoch 65 iteration 0069/0187: training loss 0.716 Epoch 65 iteration 0070/0187: training loss 0.717 Epoch 65 iteration 0071/0187: training loss 0.715 Epoch 65 iteration 0072/0187: training loss 0.715 Epoch 65 iteration 0073/0187: training loss 0.714 Epoch 65 iteration 0074/0187: training loss 0.714 Epoch 65 iteration 0075/0187: training loss 0.715 Epoch 65 iteration 0076/0187: training loss 0.714 Epoch 65 iteration 0077/0187: training loss 0.712 Epoch 65 iteration 0078/0187: training loss 0.716 Epoch 65 iteration 0079/0187: training loss 0.715 Epoch 65 iteration 0080/0187: training loss 0.714 Epoch 65 iteration 0081/0187: training loss 0.713 Epoch 65 iteration 0082/0187: training loss 0.712 Epoch 65 iteration 0083/0187: training loss 0.712 Epoch 65 iteration 0084/0187: training loss 0.713 Epoch 65 iteration 0085/0187: training loss 0.713 Epoch 65 iteration 0086/0187: training loss 0.711 Epoch 65 iteration 0087/0187: training loss 0.710 Epoch 65 iteration 0088/0187: training loss 0.709 Epoch 65 iteration 0089/0187: training loss 0.712 Epoch 65 iteration 0090/0187: training loss 0.711 Epoch 65 iteration 0091/0187: training loss 0.714 Epoch 65 iteration 0092/0187: training loss 0.713 Epoch 65 iteration 0093/0187: training loss 0.713 Epoch 65 iteration 0094/0187: training loss 0.712 Epoch 65 iteration 0095/0187: training loss 0.711 Epoch 65 iteration 0096/0187: training loss 0.711 Epoch 65 iteration 0097/0187: training loss 0.709 Epoch 65 iteration 0098/0187: training loss 0.710 Epoch 65 iteration 0099/0187: training loss 0.709 Epoch 65 iteration 0100/0187: training loss 0.708 Epoch 65 iteration 0101/0187: training loss 0.708 Epoch 65 iteration 0102/0187: training loss 0.707 Epoch 65 iteration 0103/0187: training loss 0.706 Epoch 65 iteration 0104/0187: training loss 0.706 Epoch 65 iteration 0105/0187: training loss 0.706 Epoch 65 iteration 0106/0187: training loss 0.707 Epoch 65 iteration 0107/0187: training loss 0.705 Epoch 65 iteration 0108/0187: training loss 0.705 Epoch 65 iteration 0109/0187: training loss 0.704 Epoch 65 iteration 0110/0187: training loss 0.704 Epoch 65 iteration 0111/0187: training loss 0.704 Epoch 65 iteration 0112/0187: training loss 0.704 Epoch 65 iteration 0113/0187: training loss 0.704 Epoch 65 iteration 0114/0187: training loss 0.704 Epoch 65 iteration 0115/0187: training loss 0.703 Epoch 65 iteration 0116/0187: training loss 0.703 Epoch 65 iteration 0117/0187: training loss 0.702 Epoch 65 iteration 0118/0187: training loss 0.702 Epoch 65 iteration 0119/0187: training loss 0.703 Epoch 65 iteration 0120/0187: training loss 0.702 Epoch 65 iteration 0121/0187: training loss 0.704 Epoch 65 iteration 0122/0187: training loss 0.705 Epoch 65 iteration 0123/0187: training loss 0.706 Epoch 65 iteration 0124/0187: training loss 0.706 Epoch 65 iteration 0125/0187: training loss 0.706 Epoch 65 iteration 0126/0187: training loss 0.705 Epoch 65 iteration 0127/0187: training loss 0.705 Epoch 65 iteration 0128/0187: training loss 0.706 Epoch 65 iteration 0129/0187: training loss 0.706 Epoch 65 iteration 0130/0187: training loss 0.706 Epoch 65 iteration 0131/0187: training loss 0.706 Epoch 65 iteration 0132/0187: training loss 0.705 Epoch 65 iteration 0133/0187: training loss 0.706 Epoch 65 iteration 0134/0187: training loss 0.707 Epoch 65 iteration 0135/0187: training loss 0.708 Epoch 65 iteration 0136/0187: training loss 0.706 Epoch 65 iteration 0137/0187: training loss 0.708 Epoch 65 iteration 0138/0187: training loss 0.709 Epoch 65 iteration 0139/0187: training loss 0.707 Epoch 65 iteration 0140/0187: training loss 0.708 Epoch 65 iteration 0141/0187: training loss 0.707 Epoch 65 iteration 0142/0187: training loss 0.707 Epoch 65 iteration 0143/0187: training loss 0.706 Epoch 65 iteration 0144/0187: training loss 0.707 Epoch 65 iteration 0145/0187: training loss 0.708 Epoch 65 iteration 0146/0187: training loss 0.707 Epoch 65 iteration 0147/0187: training loss 0.707 Epoch 65 iteration 0148/0187: training loss 0.706 Epoch 65 iteration 0149/0187: training loss 0.706 Epoch 65 iteration 0150/0187: training loss 0.705 Epoch 65 iteration 0151/0187: training loss 0.704 Epoch 65 iteration 0152/0187: training loss 0.704 Epoch 65 iteration 0153/0187: training loss 0.703 Epoch 65 iteration 0154/0187: training loss 0.703 Epoch 65 iteration 0155/0187: training loss 0.704 Epoch 65 iteration 0156/0187: training loss 0.704 Epoch 65 iteration 0157/0187: training loss 0.708 Epoch 65 iteration 0158/0187: training loss 0.707 Epoch 65 iteration 0159/0187: training loss 0.708 Epoch 65 iteration 0160/0187: training loss 0.708 Epoch 65 iteration 0161/0187: training loss 0.709 Epoch 65 iteration 0162/0187: training loss 0.709 Epoch 65 iteration 0163/0187: training loss 0.708 Epoch 65 iteration 0164/0187: training loss 0.708 Epoch 65 iteration 0165/0187: training loss 0.708 Epoch 65 iteration 0166/0187: training loss 0.708 Epoch 65 iteration 0167/0187: training loss 0.708 Epoch 65 iteration 0168/0187: training loss 0.707 Epoch 65 iteration 0169/0187: training loss 0.708 Epoch 65 iteration 0170/0187: training loss 0.708 Epoch 65 iteration 0171/0187: training loss 0.708 Epoch 65 iteration 0172/0187: training loss 0.707 Epoch 65 iteration 0173/0187: training loss 0.708 Epoch 65 iteration 0174/0187: training loss 0.708 Epoch 65 iteration 0175/0187: training loss 0.709 Epoch 65 iteration 0176/0187: training loss 0.709 Epoch 65 iteration 0177/0187: training loss 0.708 Epoch 65 iteration 0178/0187: training loss 0.708 Epoch 65 iteration 0179/0187: training loss 0.708 Epoch 65 iteration 0180/0187: training loss 0.707 Epoch 65 iteration 0181/0187: training loss 0.708 Epoch 65 iteration 0182/0187: training loss 0.707 Epoch 65 iteration 0183/0187: training loss 0.707 Epoch 65 iteration 0184/0187: training loss 0.706 Epoch 65 iteration 0185/0187: training loss 0.707 Epoch 65 iteration 0186/0187: training loss 0.707 Epoch 65 iteration 0187/0187: training loss 0.706 Epoch 65 validation pixAcc: 0.875, mIoU: 0.389 Epoch 66 iteration 0001/0187: training loss 0.740 Epoch 66 iteration 0002/0187: training loss 0.735 Epoch 66 iteration 0003/0187: training loss 0.724 Epoch 66 iteration 0004/0187: training loss 0.714 Epoch 66 iteration 0005/0187: training loss 0.707 Epoch 66 iteration 0006/0187: training loss 0.689 Epoch 66 iteration 0007/0187: training loss 0.680 Epoch 66 iteration 0008/0187: training loss 0.698 Epoch 66 iteration 0009/0187: training loss 0.691 Epoch 66 iteration 0010/0187: training loss 0.704 Epoch 66 iteration 0011/0187: training loss 0.706 Epoch 66 iteration 0012/0187: training loss 0.706 Epoch 66 iteration 0013/0187: training loss 0.705 Epoch 66 iteration 0014/0187: training loss 0.697 Epoch 66 iteration 0015/0187: training loss 0.693 Epoch 66 iteration 0016/0187: training loss 0.690 Epoch 66 iteration 0017/0187: training loss 0.702 Epoch 66 iteration 0018/0187: training loss 0.692 Epoch 66 iteration 0019/0187: training loss 0.696 Epoch 66 iteration 0020/0187: training loss 0.698 Epoch 66 iteration 0021/0187: training loss 0.698 Epoch 66 iteration 0022/0187: training loss 0.700 Epoch 66 iteration 0023/0187: training loss 0.698 Epoch 66 iteration 0024/0187: training loss 0.695 Epoch 66 iteration 0025/0187: training loss 0.691 Epoch 66 iteration 0026/0187: training loss 0.688 Epoch 66 iteration 0027/0187: training loss 0.696 Epoch 66 iteration 0028/0187: training loss 0.695 Epoch 66 iteration 0029/0187: training loss 0.692 Epoch 66 iteration 0030/0187: training loss 0.697 Epoch 66 iteration 0031/0187: training loss 0.697 Epoch 66 iteration 0032/0187: training loss 0.699 Epoch 66 iteration 0033/0187: training loss 0.703 Epoch 66 iteration 0034/0187: training loss 0.704 Epoch 66 iteration 0035/0187: training loss 0.702 Epoch 66 iteration 0036/0187: training loss 0.698 Epoch 66 iteration 0037/0187: training loss 0.696 Epoch 66 iteration 0038/0187: training loss 0.700 Epoch 66 iteration 0039/0187: training loss 0.697 Epoch 66 iteration 0040/0187: training loss 0.691 Epoch 66 iteration 0041/0187: training loss 0.697 Epoch 66 iteration 0042/0187: training loss 0.702 Epoch 66 iteration 0043/0187: training loss 0.702 Epoch 66 iteration 0044/0187: training loss 0.698 Epoch 66 iteration 0045/0187: training loss 0.695 Epoch 66 iteration 0046/0187: training loss 0.696 Epoch 66 iteration 0047/0187: training loss 0.697 Epoch 66 iteration 0048/0187: training loss 0.699 Epoch 66 iteration 0049/0187: training loss 0.699 Epoch 66 iteration 0050/0187: training loss 0.696 Epoch 66 iteration 0051/0187: training loss 0.699 Epoch 66 iteration 0052/0187: training loss 0.698 Epoch 66 iteration 0053/0187: training loss 0.701 Epoch 66 iteration 0054/0187: training loss 0.699 Epoch 66 iteration 0055/0187: training loss 0.701 Epoch 66 iteration 0056/0187: training loss 0.701 Epoch 66 iteration 0057/0187: training loss 0.702 Epoch 66 iteration 0058/0187: training loss 0.702 Epoch 66 iteration 0059/0187: training loss 0.703 Epoch 66 iteration 0060/0187: training loss 0.706 Epoch 66 iteration 0061/0187: training loss 0.706 Epoch 66 iteration 0062/0187: training loss 0.707 Epoch 66 iteration 0063/0187: training loss 0.708 Epoch 66 iteration 0064/0187: training loss 0.709 Epoch 66 iteration 0065/0187: training loss 0.709 Epoch 66 iteration 0066/0187: training loss 0.708 Epoch 66 iteration 0067/0187: training loss 0.708 Epoch 66 iteration 0068/0187: training loss 0.708 Epoch 66 iteration 0069/0187: training loss 0.709 Epoch 66 iteration 0070/0187: training loss 0.712 Epoch 66 iteration 0071/0187: training loss 0.711 Epoch 66 iteration 0072/0187: training loss 0.711 Epoch 66 iteration 0073/0187: training loss 0.712 Epoch 66 iteration 0074/0187: training loss 0.714 Epoch 66 iteration 0075/0187: training loss 0.716 Epoch 66 iteration 0076/0187: training loss 0.715 Epoch 66 iteration 0077/0187: training loss 0.714 Epoch 66 iteration 0078/0187: training loss 0.714 Epoch 66 iteration 0079/0187: training loss 0.713 Epoch 66 iteration 0080/0187: training loss 0.712 Epoch 66 iteration 0081/0187: training loss 0.713 Epoch 66 iteration 0082/0187: training loss 0.714 Epoch 66 iteration 0083/0187: training loss 0.715 Epoch 66 iteration 0084/0187: training loss 0.715 Epoch 66 iteration 0085/0187: training loss 0.716 Epoch 66 iteration 0086/0187: training loss 0.716 Epoch 66 iteration 0087/0187: training loss 0.717 Epoch 66 iteration 0088/0187: training loss 0.715 Epoch 66 iteration 0089/0187: training loss 0.713 Epoch 66 iteration 0090/0187: training loss 0.712 Epoch 66 iteration 0091/0188: training loss 0.714 Epoch 66 iteration 0092/0188: training loss 0.714 Epoch 66 iteration 0093/0188: training loss 0.714 Epoch 66 iteration 0094/0188: training loss 0.716 Epoch 66 iteration 0095/0188: training loss 0.715 Epoch 66 iteration 0096/0188: training loss 0.715 Epoch 66 iteration 0097/0188: training loss 0.716 Epoch 66 iteration 0098/0188: training loss 0.715 Epoch 66 iteration 0099/0188: training loss 0.713 Epoch 66 iteration 0100/0188: training loss 0.713 Epoch 66 iteration 0101/0188: training loss 0.713 Epoch 66 iteration 0102/0188: training loss 0.712 Epoch 66 iteration 0103/0188: training loss 0.712 Epoch 66 iteration 0104/0188: training loss 0.714 Epoch 66 iteration 0105/0188: training loss 0.714 Epoch 66 iteration 0106/0188: training loss 0.713 Epoch 66 iteration 0107/0188: training loss 0.713 Epoch 66 iteration 0108/0188: training loss 0.714 Epoch 66 iteration 0109/0188: training loss 0.714 Epoch 66 iteration 0110/0188: training loss 0.713 Epoch 66 iteration 0111/0188: training loss 0.712 Epoch 66 iteration 0112/0188: training loss 0.713 Epoch 66 iteration 0113/0188: training loss 0.713 Epoch 66 iteration 0114/0188: training loss 0.714 Epoch 66 iteration 0115/0188: training loss 0.714 Epoch 66 iteration 0116/0188: training loss 0.714 Epoch 66 iteration 0117/0188: training loss 0.715 Epoch 66 iteration 0118/0188: training loss 0.715 Epoch 66 iteration 0119/0188: training loss 0.715 Epoch 66 iteration 0120/0188: training loss 0.715 Epoch 66 iteration 0121/0188: training loss 0.715 Epoch 66 iteration 0122/0188: training loss 0.714 Epoch 66 iteration 0123/0188: training loss 0.713 Epoch 66 iteration 0124/0188: training loss 0.712 Epoch 66 iteration 0125/0188: training loss 0.712 Epoch 66 iteration 0126/0188: training loss 0.712 Epoch 66 iteration 0127/0188: training loss 0.712 Epoch 66 iteration 0128/0188: training loss 0.711 Epoch 66 iteration 0129/0188: training loss 0.711 Epoch 66 iteration 0130/0188: training loss 0.712 Epoch 66 iteration 0131/0188: training loss 0.713 Epoch 66 iteration 0132/0188: training loss 0.712 Epoch 66 iteration 0133/0188: training loss 0.711 Epoch 66 iteration 0134/0188: training loss 0.711 Epoch 66 iteration 0135/0188: training loss 0.710 Epoch 66 iteration 0136/0188: training loss 0.710 Epoch 66 iteration 0137/0188: training loss 0.709 Epoch 66 iteration 0138/0188: training loss 0.709 Epoch 66 iteration 0139/0188: training loss 0.709 Epoch 66 iteration 0140/0188: training loss 0.708 Epoch 66 iteration 0141/0188: training loss 0.707 Epoch 66 iteration 0142/0188: training loss 0.706 Epoch 66 iteration 0143/0188: training loss 0.706 Epoch 66 iteration 0144/0188: training loss 0.705 Epoch 66 iteration 0145/0188: training loss 0.705 Epoch 66 iteration 0146/0188: training loss 0.706 Epoch 66 iteration 0147/0188: training loss 0.706 Epoch 66 iteration 0148/0188: training loss 0.705 Epoch 66 iteration 0149/0188: training loss 0.706 Epoch 66 iteration 0150/0188: training loss 0.704 Epoch 66 iteration 0151/0188: training loss 0.706 Epoch 66 iteration 0152/0188: training loss 0.706 Epoch 66 iteration 0153/0188: training loss 0.706 Epoch 66 iteration 0154/0188: training loss 0.706 Epoch 66 iteration 0155/0188: training loss 0.706 Epoch 66 iteration 0156/0188: training loss 0.705 Epoch 66 iteration 0157/0188: training loss 0.705 Epoch 66 iteration 0158/0188: training loss 0.704 Epoch 66 iteration 0159/0188: training loss 0.704 Epoch 66 iteration 0160/0188: training loss 0.703 Epoch 66 iteration 0161/0188: training loss 0.702 Epoch 66 iteration 0162/0188: training loss 0.703 Epoch 66 iteration 0163/0188: training loss 0.704 Epoch 66 iteration 0164/0188: training loss 0.705 Epoch 66 iteration 0165/0188: training loss 0.705 Epoch 66 iteration 0166/0188: training loss 0.706 Epoch 66 iteration 0167/0188: training loss 0.706 Epoch 66 iteration 0168/0188: training loss 0.706 Epoch 66 iteration 0169/0188: training loss 0.705 Epoch 66 iteration 0170/0188: training loss 0.704 Epoch 66 iteration 0171/0188: training loss 0.704 Epoch 66 iteration 0172/0188: training loss 0.704 Epoch 66 iteration 0173/0188: training loss 0.703 Epoch 66 iteration 0174/0188: training loss 0.703 Epoch 66 iteration 0175/0188: training loss 0.703 Epoch 66 iteration 0176/0188: training loss 0.704 Epoch 66 iteration 0177/0188: training loss 0.703 Epoch 66 iteration 0178/0188: training loss 0.704 Epoch 66 iteration 0179/0188: training loss 0.704 Epoch 66 iteration 0180/0188: training loss 0.703 Epoch 66 iteration 0181/0188: training loss 0.702 Epoch 66 iteration 0182/0188: training loss 0.702 Epoch 66 iteration 0183/0188: training loss 0.702 Epoch 66 iteration 0184/0188: training loss 0.703 Epoch 66 iteration 0185/0188: training loss 0.702 Epoch 66 iteration 0186/0188: training loss 0.704 Epoch 66 validation pixAcc: 0.875, mIoU: 0.388 Epoch 67 iteration 0001/0187: training loss 0.632 Epoch 67 iteration 0002/0187: training loss 0.636 Epoch 67 iteration 0003/0187: training loss 0.645 Epoch 67 iteration 0004/0187: training loss 0.649 Epoch 67 iteration 0005/0187: training loss 0.652 Epoch 67 iteration 0006/0187: training loss 0.650 Epoch 67 iteration 0007/0187: training loss 0.675 Epoch 67 iteration 0008/0187: training loss 0.689 Epoch 67 iteration 0009/0187: training loss 0.676 Epoch 67 iteration 0010/0187: training loss 0.672 Epoch 67 iteration 0011/0187: training loss 0.665 Epoch 67 iteration 0012/0187: training loss 0.665 Epoch 67 iteration 0013/0187: training loss 0.673 Epoch 67 iteration 0014/0187: training loss 0.684 Epoch 67 iteration 0015/0187: training loss 0.686 Epoch 67 iteration 0016/0187: training loss 0.684 Epoch 67 iteration 0017/0187: training loss 0.689 Epoch 67 iteration 0018/0187: training loss 0.697 Epoch 67 iteration 0019/0187: training loss 0.700 Epoch 67 iteration 0020/0187: training loss 0.699 Epoch 67 iteration 0021/0187: training loss 0.690 Epoch 67 iteration 0022/0187: training loss 0.695 Epoch 67 iteration 0023/0187: training loss 0.700 Epoch 67 iteration 0024/0187: training loss 0.698 Epoch 67 iteration 0025/0187: training loss 0.694 Epoch 67 iteration 0026/0187: training loss 0.695 Epoch 67 iteration 0027/0187: training loss 0.695 Epoch 67 iteration 0028/0187: training loss 0.697 Epoch 67 iteration 0029/0187: training loss 0.703 Epoch 67 iteration 0030/0187: training loss 0.705 Epoch 67 iteration 0031/0187: training loss 0.706 Epoch 67 iteration 0032/0187: training loss 0.707 Epoch 67 iteration 0033/0187: training loss 0.709 Epoch 67 iteration 0034/0187: training loss 0.714 Epoch 67 iteration 0035/0187: training loss 0.714 Epoch 67 iteration 0036/0187: training loss 0.715 Epoch 67 iteration 0037/0187: training loss 0.710 Epoch 67 iteration 0038/0187: training loss 0.709 Epoch 67 iteration 0039/0187: training loss 0.710 Epoch 67 iteration 0040/0187: training loss 0.711 Epoch 67 iteration 0041/0187: training loss 0.708 Epoch 67 iteration 0042/0187: training loss 0.707 Epoch 67 iteration 0043/0187: training loss 0.710 Epoch 67 iteration 0044/0187: training loss 0.707 Epoch 67 iteration 0045/0187: training loss 0.707 Epoch 67 iteration 0046/0187: training loss 0.705 Epoch 67 iteration 0047/0187: training loss 0.703 Epoch 67 iteration 0048/0187: training loss 0.703 Epoch 67 iteration 0049/0187: training loss 0.706 Epoch 67 iteration 0050/0187: training loss 0.704 Epoch 67 iteration 0051/0187: training loss 0.701 Epoch 67 iteration 0052/0187: training loss 0.701 Epoch 67 iteration 0053/0187: training loss 0.700 Epoch 67 iteration 0054/0187: training loss 0.698 Epoch 67 iteration 0055/0187: training loss 0.698 Epoch 67 iteration 0056/0187: training loss 0.696 Epoch 67 iteration 0057/0187: training loss 0.693 Epoch 67 iteration 0058/0187: training loss 0.692 Epoch 67 iteration 0059/0187: training loss 0.691 Epoch 67 iteration 0060/0187: training loss 0.695 Epoch 67 iteration 0061/0187: training loss 0.694 Epoch 67 iteration 0062/0187: training loss 0.693 Epoch 67 iteration 0063/0187: training loss 0.693 Epoch 67 iteration 0064/0187: training loss 0.691 Epoch 67 iteration 0065/0187: training loss 0.694 Epoch 67 iteration 0066/0187: training loss 0.695 Epoch 67 iteration 0067/0187: training loss 0.694 Epoch 67 iteration 0068/0187: training loss 0.694 Epoch 67 iteration 0069/0187: training loss 0.692 Epoch 67 iteration 0070/0187: training loss 0.692 Epoch 67 iteration 0071/0187: training loss 0.692 Epoch 67 iteration 0072/0187: training loss 0.692 Epoch 67 iteration 0073/0187: training loss 0.693 Epoch 67 iteration 0074/0187: training loss 0.690 Epoch 67 iteration 0075/0187: training loss 0.688 Epoch 67 iteration 0076/0187: training loss 0.687 Epoch 67 iteration 0077/0187: training loss 0.687 Epoch 67 iteration 0078/0187: training loss 0.688 Epoch 67 iteration 0079/0187: training loss 0.689 Epoch 67 iteration 0080/0187: training loss 0.690 Epoch 67 iteration 0081/0187: training loss 0.693 Epoch 67 iteration 0082/0187: training loss 0.692 Epoch 67 iteration 0083/0187: training loss 0.691 Epoch 67 iteration 0084/0187: training loss 0.693 Epoch 67 iteration 0085/0187: training loss 0.692 Epoch 67 iteration 0086/0187: training loss 0.692 Epoch 67 iteration 0087/0187: training loss 0.692 Epoch 67 iteration 0088/0187: training loss 0.693 Epoch 67 iteration 0089/0187: training loss 0.696 Epoch 67 iteration 0090/0187: training loss 0.698 Epoch 67 iteration 0091/0187: training loss 0.699 Epoch 67 iteration 0092/0187: training loss 0.702 Epoch 67 iteration 0093/0187: training loss 0.703 Epoch 67 iteration 0094/0187: training loss 0.703 Epoch 67 iteration 0095/0187: training loss 0.701 Epoch 67 iteration 0096/0187: training loss 0.702 Epoch 67 iteration 0097/0187: training loss 0.702 Epoch 67 iteration 0098/0187: training loss 0.702 Epoch 67 iteration 0099/0187: training loss 0.701 Epoch 67 iteration 0100/0187: training loss 0.702 Epoch 67 iteration 0101/0187: training loss 0.701 Epoch 67 iteration 0102/0187: training loss 0.702 Epoch 67 iteration 0103/0187: training loss 0.702 Epoch 67 iteration 0104/0187: training loss 0.702 Epoch 67 iteration 0105/0187: training loss 0.705 Epoch 67 iteration 0106/0187: training loss 0.703 Epoch 67 iteration 0107/0187: training loss 0.703 Epoch 67 iteration 0108/0187: training loss 0.705 Epoch 67 iteration 0109/0187: training loss 0.706 Epoch 67 iteration 0110/0187: training loss 0.705 Epoch 67 iteration 0111/0187: training loss 0.704 Epoch 67 iteration 0112/0187: training loss 0.709 Epoch 67 iteration 0113/0187: training loss 0.709 Epoch 67 iteration 0114/0187: training loss 0.711 Epoch 67 iteration 0115/0187: training loss 0.711 Epoch 67 iteration 0116/0187: training loss 0.711 Epoch 67 iteration 0117/0187: training loss 0.711 Epoch 67 iteration 0118/0187: training loss 0.710 Epoch 67 iteration 0119/0187: training loss 0.709 Epoch 67 iteration 0120/0187: training loss 0.709 Epoch 67 iteration 0121/0187: training loss 0.707 Epoch 67 iteration 0122/0187: training loss 0.706 Epoch 67 iteration 0123/0187: training loss 0.705 Epoch 67 iteration 0124/0187: training loss 0.706 Epoch 67 iteration 0125/0187: training loss 0.706 Epoch 67 iteration 0126/0187: training loss 0.705 Epoch 67 iteration 0127/0187: training loss 0.705 Epoch 67 iteration 0128/0187: training loss 0.705 Epoch 67 iteration 0129/0187: training loss 0.704 Epoch 67 iteration 0130/0187: training loss 0.704 Epoch 67 iteration 0131/0187: training loss 0.704 Epoch 67 iteration 0132/0187: training loss 0.704 Epoch 67 iteration 0133/0187: training loss 0.704 Epoch 67 iteration 0134/0187: training loss 0.704 Epoch 67 iteration 0135/0187: training loss 0.704 Epoch 67 iteration 0136/0187: training loss 0.703 Epoch 67 iteration 0137/0187: training loss 0.703 Epoch 67 iteration 0138/0187: training loss 0.703 Epoch 67 iteration 0139/0187: training loss 0.702 Epoch 67 iteration 0140/0187: training loss 0.703 Epoch 67 iteration 0141/0187: training loss 0.702 Epoch 67 iteration 0142/0187: training loss 0.703 Epoch 67 iteration 0143/0187: training loss 0.703 Epoch 67 iteration 0144/0187: training loss 0.702 Epoch 67 iteration 0145/0187: training loss 0.702 Epoch 67 iteration 0146/0187: training loss 0.701 Epoch 67 iteration 0147/0187: training loss 0.700 Epoch 67 iteration 0148/0187: training loss 0.700 Epoch 67 iteration 0149/0187: training loss 0.700 Epoch 67 iteration 0150/0187: training loss 0.700 Epoch 67 iteration 0151/0187: training loss 0.699 Epoch 67 iteration 0152/0187: training loss 0.699 Epoch 67 iteration 0153/0187: training loss 0.698 Epoch 67 iteration 0154/0187: training loss 0.699 Epoch 67 iteration 0155/0187: training loss 0.699 Epoch 67 iteration 0156/0187: training loss 0.699 Epoch 67 iteration 0157/0187: training loss 0.698 Epoch 67 iteration 0158/0187: training loss 0.700 Epoch 67 iteration 0159/0187: training loss 0.700 Epoch 67 iteration 0160/0187: training loss 0.701 Epoch 67 iteration 0161/0187: training loss 0.700 Epoch 67 iteration 0162/0187: training loss 0.700 Epoch 67 iteration 0163/0187: training loss 0.700 Epoch 67 iteration 0164/0187: training loss 0.700 Epoch 67 iteration 0165/0187: training loss 0.699 Epoch 67 iteration 0166/0187: training loss 0.700 Epoch 67 iteration 0167/0187: training loss 0.699 Epoch 67 iteration 0168/0187: training loss 0.699 Epoch 67 iteration 0169/0187: training loss 0.698 Epoch 67 iteration 0170/0187: training loss 0.698 Epoch 67 iteration 0171/0187: training loss 0.697 Epoch 67 iteration 0172/0187: training loss 0.698 Epoch 67 iteration 0173/0187: training loss 0.698 Epoch 67 iteration 0174/0187: training loss 0.698 Epoch 67 iteration 0175/0187: training loss 0.699 Epoch 67 iteration 0176/0187: training loss 0.699 Epoch 67 iteration 0177/0187: training loss 0.699 Epoch 67 iteration 0178/0187: training loss 0.699 Epoch 67 iteration 0179/0187: training loss 0.700 Epoch 67 iteration 0180/0187: training loss 0.701 Epoch 67 iteration 0181/0187: training loss 0.700 Epoch 67 iteration 0182/0187: training loss 0.701 Epoch 67 iteration 0183/0187: training loss 0.701 Epoch 67 iteration 0184/0187: training loss 0.701 Epoch 67 iteration 0185/0187: training loss 0.700 Epoch 67 iteration 0186/0187: training loss 0.701 Epoch 67 iteration 0187/0187: training loss 0.702 Epoch 67 validation pixAcc: 0.876, mIoU: 0.389 Epoch 68 iteration 0001/0187: training loss 0.632 Epoch 68 iteration 0002/0187: training loss 0.632 Epoch 68 iteration 0003/0187: training loss 0.611 Epoch 68 iteration 0004/0187: training loss 0.627 Epoch 68 iteration 0005/0187: training loss 0.647 Epoch 68 iteration 0006/0187: training loss 0.638 Epoch 68 iteration 0007/0187: training loss 0.659 Epoch 68 iteration 0008/0187: training loss 0.682 Epoch 68 iteration 0009/0187: training loss 0.677 Epoch 68 iteration 0010/0187: training loss 0.688 Epoch 68 iteration 0011/0187: training loss 0.688 Epoch 68 iteration 0012/0187: training loss 0.696 Epoch 68 iteration 0013/0187: training loss 0.698 Epoch 68 iteration 0014/0187: training loss 0.703 Epoch 68 iteration 0015/0187: training loss 0.696 Epoch 68 iteration 0016/0187: training loss 0.696 Epoch 68 iteration 0017/0187: training loss 0.696 Epoch 68 iteration 0018/0187: training loss 0.708 Epoch 68 iteration 0019/0187: training loss 0.708 Epoch 68 iteration 0020/0187: training loss 0.707 Epoch 68 iteration 0021/0187: training loss 0.710 Epoch 68 iteration 0022/0187: training loss 0.707 Epoch 68 iteration 0023/0187: training loss 0.703 Epoch 68 iteration 0024/0187: training loss 0.701 Epoch 68 iteration 0025/0187: training loss 0.699 Epoch 68 iteration 0026/0187: training loss 0.702 Epoch 68 iteration 0027/0187: training loss 0.707 Epoch 68 iteration 0028/0187: training loss 0.709 Epoch 68 iteration 0029/0187: training loss 0.713 Epoch 68 iteration 0030/0187: training loss 0.709 Epoch 68 iteration 0031/0187: training loss 0.712 Epoch 68 iteration 0032/0187: training loss 0.712 Epoch 68 iteration 0033/0187: training loss 0.709 Epoch 68 iteration 0034/0187: training loss 0.703 Epoch 68 iteration 0035/0187: training loss 0.705 Epoch 68 iteration 0036/0187: training loss 0.705 Epoch 68 iteration 0037/0187: training loss 0.706 Epoch 68 iteration 0038/0187: training loss 0.701 Epoch 68 iteration 0039/0187: training loss 0.702 Epoch 68 iteration 0040/0187: training loss 0.701 Epoch 68 iteration 0041/0187: training loss 0.696 Epoch 68 iteration 0042/0187: training loss 0.699 Epoch 68 iteration 0043/0187: training loss 0.700 Epoch 68 iteration 0044/0187: training loss 0.696 Epoch 68 iteration 0045/0187: training loss 0.700 Epoch 68 iteration 0046/0187: training loss 0.701 Epoch 68 iteration 0047/0187: training loss 0.702 Epoch 68 iteration 0048/0187: training loss 0.702 Epoch 68 iteration 0049/0187: training loss 0.698 Epoch 68 iteration 0050/0187: training loss 0.697 Epoch 68 iteration 0051/0187: training loss 0.699 Epoch 68 iteration 0052/0187: training loss 0.696 Epoch 68 iteration 0053/0187: training loss 0.699 Epoch 68 iteration 0054/0187: training loss 0.701 Epoch 68 iteration 0055/0187: training loss 0.700 Epoch 68 iteration 0056/0187: training loss 0.699 Epoch 68 iteration 0057/0187: training loss 0.700 Epoch 68 iteration 0058/0187: training loss 0.702 Epoch 68 iteration 0059/0187: training loss 0.701 Epoch 68 iteration 0060/0187: training loss 0.703 Epoch 68 iteration 0061/0187: training loss 0.701 Epoch 68 iteration 0062/0187: training loss 0.700 Epoch 68 iteration 0063/0187: training loss 0.705 Epoch 68 iteration 0064/0187: training loss 0.704 Epoch 68 iteration 0065/0187: training loss 0.708 Epoch 68 iteration 0066/0187: training loss 0.707 Epoch 68 iteration 0067/0187: training loss 0.706 Epoch 68 iteration 0068/0187: training loss 0.707 Epoch 68 iteration 0069/0187: training loss 0.709 Epoch 68 iteration 0070/0187: training loss 0.706 Epoch 68 iteration 0071/0187: training loss 0.705 Epoch 68 iteration 0072/0187: training loss 0.708 Epoch 68 iteration 0073/0187: training loss 0.709 Epoch 68 iteration 0074/0187: training loss 0.708 Epoch 68 iteration 0075/0187: training loss 0.708 Epoch 68 iteration 0076/0187: training loss 0.709 Epoch 68 iteration 0077/0187: training loss 0.707 Epoch 68 iteration 0078/0187: training loss 0.706 Epoch 68 iteration 0079/0187: training loss 0.707 Epoch 68 iteration 0080/0187: training loss 0.707 Epoch 68 iteration 0081/0187: training loss 0.705 Epoch 68 iteration 0082/0187: training loss 0.705 Epoch 68 iteration 0083/0187: training loss 0.704 Epoch 68 iteration 0084/0187: training loss 0.706 Epoch 68 iteration 0085/0187: training loss 0.707 Epoch 68 iteration 0086/0187: training loss 0.707 Epoch 68 iteration 0087/0187: training loss 0.711 Epoch 68 iteration 0088/0187: training loss 0.711 Epoch 68 iteration 0089/0187: training loss 0.713 Epoch 68 iteration 0090/0187: training loss 0.713 Epoch 68 iteration 0091/0188: training loss 0.711 Epoch 68 iteration 0092/0188: training loss 0.712 Epoch 68 iteration 0093/0188: training loss 0.713 Epoch 68 iteration 0094/0188: training loss 0.712 Epoch 68 iteration 0095/0188: training loss 0.710 Epoch 68 iteration 0096/0188: training loss 0.709 Epoch 68 iteration 0097/0188: training loss 0.709 Epoch 68 iteration 0098/0188: training loss 0.710 Epoch 68 iteration 0099/0188: training loss 0.710 Epoch 68 iteration 0100/0188: training loss 0.709 Epoch 68 iteration 0101/0188: training loss 0.712 Epoch 68 iteration 0102/0188: training loss 0.713 Epoch 68 iteration 0103/0188: training loss 0.713 Epoch 68 iteration 0104/0188: training loss 0.713 Epoch 68 iteration 0105/0188: training loss 0.713 Epoch 68 iteration 0106/0188: training loss 0.711 Epoch 68 iteration 0107/0188: training loss 0.711 Epoch 68 iteration 0108/0188: training loss 0.712 Epoch 68 iteration 0109/0188: training loss 0.711 Epoch 68 iteration 0110/0188: training loss 0.712 Epoch 68 iteration 0111/0188: training loss 0.712 Epoch 68 iteration 0112/0188: training loss 0.712 Epoch 68 iteration 0113/0188: training loss 0.711 Epoch 68 iteration 0114/0188: training loss 0.711 Epoch 68 iteration 0115/0188: training loss 0.711 Epoch 68 iteration 0116/0188: training loss 0.710 Epoch 68 iteration 0117/0188: training loss 0.713 Epoch 68 iteration 0118/0188: training loss 0.712 Epoch 68 iteration 0119/0188: training loss 0.711 Epoch 68 iteration 0120/0188: training loss 0.711 Epoch 68 iteration 0121/0188: training loss 0.710 Epoch 68 iteration 0122/0188: training loss 0.711 Epoch 68 iteration 0123/0188: training loss 0.710 Epoch 68 iteration 0124/0188: training loss 0.709 Epoch 68 iteration 0125/0188: training loss 0.709 Epoch 68 iteration 0126/0188: training loss 0.709 Epoch 68 iteration 0127/0188: training loss 0.710 Epoch 68 iteration 0128/0188: training loss 0.709 Epoch 68 iteration 0129/0188: training loss 0.709 Epoch 68 iteration 0130/0188: training loss 0.709 Epoch 68 iteration 0131/0188: training loss 0.709 Epoch 68 iteration 0132/0188: training loss 0.709 Epoch 68 iteration 0133/0188: training loss 0.708 Epoch 68 iteration 0134/0188: training loss 0.707 Epoch 68 iteration 0135/0188: training loss 0.708 Epoch 68 iteration 0136/0188: training loss 0.710 Epoch 68 iteration 0137/0188: training loss 0.709 Epoch 68 iteration 0138/0188: training loss 0.709 Epoch 68 iteration 0139/0188: training loss 0.708 Epoch 68 iteration 0140/0188: training loss 0.709 Epoch 68 iteration 0141/0188: training loss 0.710 Epoch 68 iteration 0142/0188: training loss 0.710 Epoch 68 iteration 0143/0188: training loss 0.711 Epoch 68 iteration 0144/0188: training loss 0.710 Epoch 68 iteration 0145/0188: training loss 0.712 Epoch 68 iteration 0146/0188: training loss 0.711 Epoch 68 iteration 0147/0188: training loss 0.711 Epoch 68 iteration 0148/0188: training loss 0.710 Epoch 68 iteration 0149/0188: training loss 0.711 Epoch 68 iteration 0150/0188: training loss 0.710 Epoch 68 iteration 0151/0188: training loss 0.711 Epoch 68 iteration 0152/0188: training loss 0.711 Epoch 68 iteration 0153/0188: training loss 0.712 Epoch 68 iteration 0154/0188: training loss 0.712 Epoch 68 iteration 0155/0188: training loss 0.711 Epoch 68 iteration 0156/0188: training loss 0.712 Epoch 68 iteration 0157/0188: training loss 0.712 Epoch 68 iteration 0158/0188: training loss 0.711 Epoch 68 iteration 0159/0188: training loss 0.712 Epoch 68 iteration 0160/0188: training loss 0.711 Epoch 68 iteration 0161/0188: training loss 0.711 Epoch 68 iteration 0162/0188: training loss 0.712 Epoch 68 iteration 0163/0188: training loss 0.711 Epoch 68 iteration 0164/0188: training loss 0.711 Epoch 68 iteration 0165/0188: training loss 0.710 Epoch 68 iteration 0166/0188: training loss 0.712 Epoch 68 iteration 0167/0188: training loss 0.712 Epoch 68 iteration 0168/0188: training loss 0.712 Epoch 68 iteration 0169/0188: training loss 0.713 Epoch 68 iteration 0170/0188: training loss 0.712 Epoch 68 iteration 0171/0188: training loss 0.711 Epoch 68 iteration 0172/0188: training loss 0.711 Epoch 68 iteration 0173/0188: training loss 0.711 Epoch 68 iteration 0174/0188: training loss 0.710 Epoch 68 iteration 0175/0188: training loss 0.710 Epoch 68 iteration 0176/0188: training loss 0.710 Epoch 68 iteration 0177/0188: training loss 0.710 Epoch 68 iteration 0178/0188: training loss 0.709 Epoch 68 iteration 0179/0188: training loss 0.709 Epoch 68 iteration 0180/0188: training loss 0.709 Epoch 68 iteration 0181/0188: training loss 0.709 Epoch 68 iteration 0182/0188: training loss 0.708 Epoch 68 iteration 0183/0188: training loss 0.707 Epoch 68 iteration 0184/0188: training loss 0.707 Epoch 68 iteration 0185/0188: training loss 0.706 Epoch 68 iteration 0186/0188: training loss 0.706 Epoch 68 validation pixAcc: 0.875, mIoU: 0.390 Epoch 69 iteration 0001/0187: training loss 0.650 Epoch 69 iteration 0002/0187: training loss 0.641 Epoch 69 iteration 0003/0187: training loss 0.644 Epoch 69 iteration 0004/0187: training loss 0.662 Epoch 69 iteration 0005/0187: training loss 0.660 Epoch 69 iteration 0006/0187: training loss 0.668 Epoch 69 iteration 0007/0187: training loss 0.651 Epoch 69 iteration 0008/0187: training loss 0.645 Epoch 69 iteration 0009/0187: training loss 0.652 Epoch 69 iteration 0010/0187: training loss 0.654 Epoch 69 iteration 0011/0187: training loss 0.658 Epoch 69 iteration 0012/0187: training loss 0.655 Epoch 69 iteration 0013/0187: training loss 0.657 Epoch 69 iteration 0014/0187: training loss 0.654 Epoch 69 iteration 0015/0187: training loss 0.660 Epoch 69 iteration 0016/0187: training loss 0.659 Epoch 69 iteration 0017/0187: training loss 0.663 Epoch 69 iteration 0018/0187: training loss 0.665 Epoch 69 iteration 0019/0187: training loss 0.668 Epoch 69 iteration 0020/0187: training loss 0.671 Epoch 69 iteration 0021/0187: training loss 0.675 Epoch 69 iteration 0022/0187: training loss 0.674 Epoch 69 iteration 0023/0187: training loss 0.674 Epoch 69 iteration 0024/0187: training loss 0.675 Epoch 69 iteration 0025/0187: training loss 0.677 Epoch 69 iteration 0026/0187: training loss 0.677 Epoch 69 iteration 0027/0187: training loss 0.680 Epoch 69 iteration 0028/0187: training loss 0.685 Epoch 69 iteration 0029/0187: training loss 0.686 Epoch 69 iteration 0030/0187: training loss 0.685 Epoch 69 iteration 0031/0187: training loss 0.690 Epoch 69 iteration 0032/0187: training loss 0.691 Epoch 69 iteration 0033/0187: training loss 0.700 Epoch 69 iteration 0034/0187: training loss 0.701 Epoch 69 iteration 0035/0187: training loss 0.702 Epoch 69 iteration 0036/0187: training loss 0.706 Epoch 69 iteration 0037/0187: training loss 0.704 Epoch 69 iteration 0038/0187: training loss 0.705 Epoch 69 iteration 0039/0187: training loss 0.705 Epoch 69 iteration 0040/0187: training loss 0.708 Epoch 69 iteration 0041/0187: training loss 0.703 Epoch 69 iteration 0042/0187: training loss 0.702 Epoch 69 iteration 0043/0187: training loss 0.707 Epoch 69 iteration 0044/0187: training loss 0.705 Epoch 69 iteration 0045/0187: training loss 0.705 Epoch 69 iteration 0046/0187: training loss 0.704 Epoch 69 iteration 0047/0187: training loss 0.702 Epoch 69 iteration 0048/0187: training loss 0.701 Epoch 69 iteration 0049/0187: training loss 0.702 Epoch 69 iteration 0050/0187: training loss 0.700 Epoch 69 iteration 0051/0187: training loss 0.699 Epoch 69 iteration 0052/0187: training loss 0.699 Epoch 69 iteration 0053/0187: training loss 0.699 Epoch 69 iteration 0054/0187: training loss 0.697 Epoch 69 iteration 0055/0187: training loss 0.697 Epoch 69 iteration 0056/0187: training loss 0.698 Epoch 69 iteration 0057/0187: training loss 0.704 Epoch 69 iteration 0058/0187: training loss 0.703 Epoch 69 iteration 0059/0187: training loss 0.702 Epoch 69 iteration 0060/0187: training loss 0.702 Epoch 69 iteration 0061/0187: training loss 0.700 Epoch 69 iteration 0062/0187: training loss 0.702 Epoch 69 iteration 0063/0187: training loss 0.701 Epoch 69 iteration 0064/0187: training loss 0.698 Epoch 69 iteration 0065/0187: training loss 0.697 Epoch 69 iteration 0066/0187: training loss 0.698 Epoch 69 iteration 0067/0187: training loss 0.698 Epoch 69 iteration 0068/0187: training loss 0.695 Epoch 69 iteration 0069/0187: training loss 0.693 Epoch 69 iteration 0070/0187: training loss 0.696 Epoch 69 iteration 0071/0187: training loss 0.698 Epoch 69 iteration 0072/0187: training loss 0.698 Epoch 69 iteration 0073/0187: training loss 0.698 Epoch 69 iteration 0074/0187: training loss 0.698 Epoch 69 iteration 0075/0187: training loss 0.698 Epoch 69 iteration 0076/0187: training loss 0.698 Epoch 69 iteration 0077/0187: training loss 0.697 Epoch 69 iteration 0078/0187: training loss 0.697 Epoch 69 iteration 0079/0187: training loss 0.699 Epoch 69 iteration 0080/0187: training loss 0.703 Epoch 69 iteration 0081/0187: training loss 0.703 Epoch 69 iteration 0082/0187: training loss 0.702 Epoch 69 iteration 0083/0187: training loss 0.701 Epoch 69 iteration 0084/0187: training loss 0.703 Epoch 69 iteration 0085/0187: training loss 0.703 Epoch 69 iteration 0086/0187: training loss 0.704 Epoch 69 iteration 0087/0187: training loss 0.703 Epoch 69 iteration 0088/0187: training loss 0.703 Epoch 69 iteration 0089/0187: training loss 0.709 Epoch 69 iteration 0090/0187: training loss 0.708 Epoch 69 iteration 0091/0187: training loss 0.707 Epoch 69 iteration 0092/0187: training loss 0.706 Epoch 69 iteration 0093/0187: training loss 0.703 Epoch 69 iteration 0094/0187: training loss 0.703 Epoch 69 iteration 0095/0187: training loss 0.702 Epoch 69 iteration 0096/0187: training loss 0.702 Epoch 69 iteration 0097/0187: training loss 0.702 Epoch 69 iteration 0098/0187: training loss 0.702 Epoch 69 iteration 0099/0187: training loss 0.702 Epoch 69 iteration 0100/0187: training loss 0.702 Epoch 69 iteration 0101/0187: training loss 0.702 Epoch 69 iteration 0102/0187: training loss 0.701 Epoch 69 iteration 0103/0187: training loss 0.700 Epoch 69 iteration 0104/0187: training loss 0.700 Epoch 69 iteration 0105/0187: training loss 0.701 Epoch 69 iteration 0106/0187: training loss 0.700 Epoch 69 iteration 0107/0187: training loss 0.701 Epoch 69 iteration 0108/0187: training loss 0.701 Epoch 69 iteration 0109/0187: training loss 0.701 Epoch 69 iteration 0110/0187: training loss 0.700 Epoch 69 iteration 0111/0187: training loss 0.702 Epoch 69 iteration 0112/0187: training loss 0.701 Epoch 69 iteration 0113/0187: training loss 0.701 Epoch 69 iteration 0114/0187: training loss 0.701 Epoch 69 iteration 0115/0187: training loss 0.701 Epoch 69 iteration 0116/0187: training loss 0.701 Epoch 69 iteration 0117/0187: training loss 0.701 Epoch 69 iteration 0118/0187: training loss 0.700 Epoch 69 iteration 0119/0187: training loss 0.701 Epoch 69 iteration 0120/0187: training loss 0.699 Epoch 69 iteration 0121/0187: training loss 0.699 Epoch 69 iteration 0122/0187: training loss 0.700 Epoch 69 iteration 0123/0187: training loss 0.701 Epoch 69 iteration 0124/0187: training loss 0.701 Epoch 69 iteration 0125/0187: training loss 0.702 Epoch 69 iteration 0126/0187: training loss 0.702 Epoch 69 iteration 0127/0187: training loss 0.702 Epoch 69 iteration 0128/0187: training loss 0.702 Epoch 69 iteration 0129/0187: training loss 0.701 Epoch 69 iteration 0130/0187: training loss 0.702 Epoch 69 iteration 0131/0187: training loss 0.701 Epoch 69 iteration 0132/0187: training loss 0.700 Epoch 69 iteration 0133/0187: training loss 0.701 Epoch 69 iteration 0134/0187: training loss 0.700 Epoch 69 iteration 0135/0187: training loss 0.699 Epoch 69 iteration 0136/0187: training loss 0.700 Epoch 69 iteration 0137/0187: training loss 0.701 Epoch 69 iteration 0138/0187: training loss 0.701 Epoch 69 iteration 0139/0187: training loss 0.700 Epoch 69 iteration 0140/0187: training loss 0.700 Epoch 69 iteration 0141/0187: training loss 0.700 Epoch 69 iteration 0142/0187: training loss 0.700 Epoch 69 iteration 0143/0187: training loss 0.700 Epoch 69 iteration 0144/0187: training loss 0.699 Epoch 69 iteration 0145/0187: training loss 0.699 Epoch 69 iteration 0146/0187: training loss 0.698 Epoch 69 iteration 0147/0187: training loss 0.699 Epoch 69 iteration 0148/0187: training loss 0.699 Epoch 69 iteration 0149/0187: training loss 0.699 Epoch 69 iteration 0150/0187: training loss 0.699 Epoch 69 iteration 0151/0187: training loss 0.699 Epoch 69 iteration 0152/0187: training loss 0.699 Epoch 69 iteration 0153/0187: training loss 0.699 Epoch 69 iteration 0154/0187: training loss 0.700 Epoch 69 iteration 0155/0187: training loss 0.699 Epoch 69 iteration 0156/0187: training loss 0.700 Epoch 69 iteration 0157/0187: training loss 0.700 Epoch 69 iteration 0158/0187: training loss 0.701 Epoch 69 iteration 0159/0187: training loss 0.701 Epoch 69 iteration 0160/0187: training loss 0.701 Epoch 69 iteration 0161/0187: training loss 0.701 Epoch 69 iteration 0162/0187: training loss 0.703 Epoch 69 iteration 0163/0187: training loss 0.703 Epoch 69 iteration 0164/0187: training loss 0.703 Epoch 69 iteration 0165/0187: training loss 0.704 Epoch 69 iteration 0166/0187: training loss 0.704 Epoch 69 iteration 0167/0187: training loss 0.704 Epoch 69 iteration 0168/0187: training loss 0.703 Epoch 69 iteration 0169/0187: training loss 0.705 Epoch 69 iteration 0170/0187: training loss 0.704 Epoch 69 iteration 0171/0187: training loss 0.704 Epoch 69 iteration 0172/0187: training loss 0.704 Epoch 69 iteration 0173/0187: training loss 0.705 Epoch 69 iteration 0174/0187: training loss 0.704 Epoch 69 iteration 0175/0187: training loss 0.704 Epoch 69 iteration 0176/0187: training loss 0.704 Epoch 69 iteration 0177/0187: training loss 0.703 Epoch 69 iteration 0178/0187: training loss 0.704 Epoch 69 iteration 0179/0187: training loss 0.703 Epoch 69 iteration 0180/0187: training loss 0.703 Epoch 69 iteration 0181/0187: training loss 0.703 Epoch 69 iteration 0182/0187: training loss 0.703 Epoch 69 iteration 0183/0187: training loss 0.704 Epoch 69 iteration 0184/0187: training loss 0.703 Epoch 69 iteration 0185/0187: training loss 0.704 Epoch 69 iteration 0186/0187: training loss 0.704 Epoch 69 iteration 0187/0187: training loss 0.704 Epoch 69 validation pixAcc: 0.874, mIoU: 0.384 Epoch 70 iteration 0001/0187: training loss 0.680 Epoch 70 iteration 0002/0187: training loss 0.630 Epoch 70 iteration 0003/0187: training loss 0.642 Epoch 70 iteration 0004/0187: training loss 0.638 Epoch 70 iteration 0005/0187: training loss 0.641 Epoch 70 iteration 0006/0187: training loss 0.652 Epoch 70 iteration 0007/0187: training loss 0.647 Epoch 70 iteration 0008/0187: training loss 0.641 Epoch 70 iteration 0009/0187: training loss 0.643 Epoch 70 iteration 0010/0187: training loss 0.639 Epoch 70 iteration 0011/0187: training loss 0.646 Epoch 70 iteration 0012/0187: training loss 0.665 Epoch 70 iteration 0013/0187: training loss 0.674 Epoch 70 iteration 0014/0187: training loss 0.667 Epoch 70 iteration 0015/0187: training loss 0.678 Epoch 70 iteration 0016/0187: training loss 0.676 Epoch 70 iteration 0017/0187: training loss 0.680 Epoch 70 iteration 0018/0187: training loss 0.690 Epoch 70 iteration 0019/0187: training loss 0.689 Epoch 70 iteration 0020/0187: training loss 0.687 Epoch 70 iteration 0021/0187: training loss 0.682 Epoch 70 iteration 0022/0187: training loss 0.685 Epoch 70 iteration 0023/0187: training loss 0.681 Epoch 70 iteration 0024/0187: training loss 0.682 Epoch 70 iteration 0025/0187: training loss 0.690 Epoch 70 iteration 0026/0187: training loss 0.686 Epoch 70 iteration 0027/0187: training loss 0.686 Epoch 70 iteration 0028/0187: training loss 0.685 Epoch 70 iteration 0029/0187: training loss 0.684 Epoch 70 iteration 0030/0187: training loss 0.679 Epoch 70 iteration 0031/0187: training loss 0.679 Epoch 70 iteration 0032/0187: training loss 0.680 Epoch 70 iteration 0033/0187: training loss 0.678 Epoch 70 iteration 0034/0187: training loss 0.677 Epoch 70 iteration 0035/0187: training loss 0.680 Epoch 70 iteration 0036/0187: training loss 0.679 Epoch 70 iteration 0037/0187: training loss 0.676 Epoch 70 iteration 0038/0187: training loss 0.676 Epoch 70 iteration 0039/0187: training loss 0.678 Epoch 70 iteration 0040/0187: training loss 0.677 Epoch 70 iteration 0041/0187: training loss 0.683 Epoch 70 iteration 0042/0187: training loss 0.683 Epoch 70 iteration 0043/0187: training loss 0.681 Epoch 70 iteration 0044/0187: training loss 0.679 Epoch 70 iteration 0045/0187: training loss 0.677 Epoch 70 iteration 0046/0187: training loss 0.679 Epoch 70 iteration 0047/0187: training loss 0.681 Epoch 70 iteration 0048/0187: training loss 0.680 Epoch 70 iteration 0049/0187: training loss 0.681 Epoch 70 iteration 0050/0187: training loss 0.681 Epoch 70 iteration 0051/0187: training loss 0.685 Epoch 70 iteration 0052/0187: training loss 0.683 Epoch 70 iteration 0053/0187: training loss 0.684 Epoch 70 iteration 0054/0187: training loss 0.684 Epoch 70 iteration 0055/0187: training loss 0.684 Epoch 70 iteration 0056/0187: training loss 0.683 Epoch 70 iteration 0057/0187: training loss 0.683 Epoch 70 iteration 0058/0187: training loss 0.681 Epoch 70 iteration 0059/0187: training loss 0.682 Epoch 70 iteration 0060/0187: training loss 0.682 Epoch 70 iteration 0061/0187: training loss 0.683 Epoch 70 iteration 0062/0187: training loss 0.688 Epoch 70 iteration 0063/0187: training loss 0.688 Epoch 70 iteration 0064/0187: training loss 0.687 Epoch 70 iteration 0065/0187: training loss 0.688 Epoch 70 iteration 0066/0187: training loss 0.688 Epoch 70 iteration 0067/0187: training loss 0.691 Epoch 70 iteration 0068/0187: training loss 0.692 Epoch 70 iteration 0069/0187: training loss 0.692 Epoch 70 iteration 0070/0187: training loss 0.690 Epoch 70 iteration 0071/0187: training loss 0.692 Epoch 70 iteration 0072/0187: training loss 0.691 Epoch 70 iteration 0073/0187: training loss 0.691 Epoch 70 iteration 0074/0187: training loss 0.691 Epoch 70 iteration 0075/0187: training loss 0.693 Epoch 70 iteration 0076/0187: training loss 0.692 Epoch 70 iteration 0077/0187: training loss 0.693 Epoch 70 iteration 0078/0187: training loss 0.694 Epoch 70 iteration 0079/0187: training loss 0.694 Epoch 70 iteration 0080/0187: training loss 0.695 Epoch 70 iteration 0081/0187: training loss 0.695 Epoch 70 iteration 0082/0187: training loss 0.695 Epoch 70 iteration 0083/0187: training loss 0.695 Epoch 70 iteration 0084/0187: training loss 0.698 Epoch 70 iteration 0085/0187: training loss 0.699 Epoch 70 iteration 0086/0187: training loss 0.698 Epoch 70 iteration 0087/0187: training loss 0.697 Epoch 70 iteration 0088/0187: training loss 0.697 Epoch 70 iteration 0089/0187: training loss 0.698 Epoch 70 iteration 0090/0187: training loss 0.700 Epoch 70 iteration 0091/0188: training loss 0.701 Epoch 70 iteration 0092/0188: training loss 0.702 Epoch 70 iteration 0093/0188: training loss 0.701 Epoch 70 iteration 0094/0188: training loss 0.703 Epoch 70 iteration 0095/0188: training loss 0.706 Epoch 70 iteration 0096/0188: training loss 0.706 Epoch 70 iteration 0097/0188: training loss 0.706 Epoch 70 iteration 0098/0188: training loss 0.707 Epoch 70 iteration 0099/0188: training loss 0.706 Epoch 70 iteration 0100/0188: training loss 0.704 Epoch 70 iteration 0101/0188: training loss 0.703 Epoch 70 iteration 0102/0188: training loss 0.703 Epoch 70 iteration 0103/0188: training loss 0.702 Epoch 70 iteration 0104/0188: training loss 0.701 Epoch 70 iteration 0105/0188: training loss 0.702 Epoch 70 iteration 0106/0188: training loss 0.701 Epoch 70 iteration 0107/0188: training loss 0.701 Epoch 70 iteration 0108/0188: training loss 0.702 Epoch 70 iteration 0109/0188: training loss 0.702 Epoch 70 iteration 0110/0188: training loss 0.701 Epoch 70 iteration 0111/0188: training loss 0.700 Epoch 70 iteration 0112/0188: training loss 0.701 Epoch 70 iteration 0113/0188: training loss 0.702 Epoch 70 iteration 0114/0188: training loss 0.701 Epoch 70 iteration 0115/0188: training loss 0.699 Epoch 70 iteration 0116/0188: training loss 0.699 Epoch 70 iteration 0117/0188: training loss 0.698 Epoch 70 iteration 0118/0188: training loss 0.698 Epoch 70 iteration 0119/0188: training loss 0.699 Epoch 70 iteration 0120/0188: training loss 0.701 Epoch 70 iteration 0121/0188: training loss 0.701 Epoch 70 iteration 0122/0188: training loss 0.700 Epoch 70 iteration 0123/0188: training loss 0.701 Epoch 70 iteration 0124/0188: training loss 0.701 Epoch 70 iteration 0125/0188: training loss 0.701 Epoch 70 iteration 0126/0188: training loss 0.703 Epoch 70 iteration 0127/0188: training loss 0.704 Epoch 70 iteration 0128/0188: training loss 0.703 Epoch 70 iteration 0129/0188: training loss 0.702 Epoch 70 iteration 0130/0188: training loss 0.703 Epoch 70 iteration 0131/0188: training loss 0.703 Epoch 70 iteration 0132/0188: training loss 0.701 Epoch 70 iteration 0133/0188: training loss 0.701 Epoch 70 iteration 0134/0188: training loss 0.701 Epoch 70 iteration 0135/0188: training loss 0.701 Epoch 70 iteration 0136/0188: training loss 0.702 Epoch 70 iteration 0137/0188: training loss 0.704 Epoch 70 iteration 0138/0188: training loss 0.706 Epoch 70 iteration 0139/0188: training loss 0.706 Epoch 70 iteration 0140/0188: training loss 0.706 Epoch 70 iteration 0141/0188: training loss 0.705 Epoch 70 iteration 0142/0188: training loss 0.703 Epoch 70 iteration 0143/0188: training loss 0.703 Epoch 70 iteration 0144/0188: training loss 0.705 Epoch 70 iteration 0145/0188: training loss 0.704 Epoch 70 iteration 0146/0188: training loss 0.706 Epoch 70 iteration 0147/0188: training loss 0.705 Epoch 70 iteration 0148/0188: training loss 0.706 Epoch 70 iteration 0149/0188: training loss 0.705 Epoch 70 iteration 0150/0188: training loss 0.705 Epoch 70 iteration 0151/0188: training loss 0.706 Epoch 70 iteration 0152/0188: training loss 0.706 Epoch 70 iteration 0153/0188: training loss 0.704 Epoch 70 iteration 0154/0188: training loss 0.705 Epoch 70 iteration 0155/0188: training loss 0.705 Epoch 70 iteration 0156/0188: training loss 0.705 Epoch 70 iteration 0157/0188: training loss 0.704 Epoch 70 iteration 0158/0188: training loss 0.705 Epoch 70 iteration 0159/0188: training loss 0.706 Epoch 70 iteration 0160/0188: training loss 0.705 Epoch 70 iteration 0161/0188: training loss 0.704 Epoch 70 iteration 0162/0188: training loss 0.704 Epoch 70 iteration 0163/0188: training loss 0.704 Epoch 70 iteration 0164/0188: training loss 0.704 Epoch 70 iteration 0165/0188: training loss 0.704 Epoch 70 iteration 0166/0188: training loss 0.703 Epoch 70 iteration 0167/0188: training loss 0.703 Epoch 70 iteration 0168/0188: training loss 0.703 Epoch 70 iteration 0169/0188: training loss 0.702 Epoch 70 iteration 0170/0188: training loss 0.703 Epoch 70 iteration 0171/0188: training loss 0.703 Epoch 70 iteration 0172/0188: training loss 0.703 Epoch 70 iteration 0173/0188: training loss 0.704 Epoch 70 iteration 0174/0188: training loss 0.704 Epoch 70 iteration 0175/0188: training loss 0.704 Epoch 70 iteration 0176/0188: training loss 0.704 Epoch 70 iteration 0177/0188: training loss 0.704 Epoch 70 iteration 0178/0188: training loss 0.704 Epoch 70 iteration 0179/0188: training loss 0.704 Epoch 70 iteration 0180/0188: training loss 0.704 Epoch 70 iteration 0181/0188: training loss 0.705 Epoch 70 iteration 0182/0188: training loss 0.704 Epoch 70 iteration 0183/0188: training loss 0.704 Epoch 70 iteration 0184/0188: training loss 0.706 Epoch 70 iteration 0185/0188: training loss 0.706 Epoch 70 iteration 0186/0188: training loss 0.706 Epoch 70 validation pixAcc: 0.875, mIoU: 0.389 Epoch 71 iteration 0001/0187: training loss 0.810 Epoch 71 iteration 0002/0187: training loss 0.759 Epoch 71 iteration 0003/0187: training loss 0.718 Epoch 71 iteration 0004/0187: training loss 0.709 Epoch 71 iteration 0005/0187: training loss 0.681 Epoch 71 iteration 0006/0187: training loss 0.710 Epoch 71 iteration 0007/0187: training loss 0.710 Epoch 71 iteration 0008/0187: training loss 0.695 Epoch 71 iteration 0009/0187: training loss 0.679 Epoch 71 iteration 0010/0187: training loss 0.688 Epoch 71 iteration 0011/0187: training loss 0.712 Epoch 71 iteration 0012/0187: training loss 0.709 Epoch 71 iteration 0013/0187: training loss 0.697 Epoch 71 iteration 0014/0187: training loss 0.699 Epoch 71 iteration 0015/0187: training loss 0.697 Epoch 71 iteration 0016/0187: training loss 0.693 Epoch 71 iteration 0017/0187: training loss 0.695 Epoch 71 iteration 0018/0187: training loss 0.700 Epoch 71 iteration 0019/0187: training loss 0.709 Epoch 71 iteration 0020/0187: training loss 0.711 Epoch 71 iteration 0021/0187: training loss 0.710 Epoch 71 iteration 0022/0187: training loss 0.713 Epoch 71 iteration 0023/0187: training loss 0.708 Epoch 71 iteration 0024/0187: training loss 0.710 Epoch 71 iteration 0025/0187: training loss 0.704 Epoch 71 iteration 0026/0187: training loss 0.705 Epoch 71 iteration 0027/0187: training loss 0.712 Epoch 71 iteration 0028/0187: training loss 0.711 Epoch 71 iteration 0029/0187: training loss 0.716 Epoch 71 iteration 0030/0187: training loss 0.713 Epoch 71 iteration 0031/0187: training loss 0.712 Epoch 71 iteration 0032/0187: training loss 0.705 Epoch 71 iteration 0033/0187: training loss 0.701 Epoch 71 iteration 0034/0187: training loss 0.703 Epoch 71 iteration 0035/0187: training loss 0.703 Epoch 71 iteration 0036/0187: training loss 0.702 Epoch 71 iteration 0037/0187: training loss 0.699 Epoch 71 iteration 0038/0187: training loss 0.704 Epoch 71 iteration 0039/0187: training loss 0.705 Epoch 71 iteration 0040/0187: training loss 0.706 Epoch 71 iteration 0041/0187: training loss 0.707 Epoch 71 iteration 0042/0187: training loss 0.707 Epoch 71 iteration 0043/0187: training loss 0.707 Epoch 71 iteration 0044/0187: training loss 0.707 Epoch 71 iteration 0045/0187: training loss 0.705 Epoch 71 iteration 0046/0187: training loss 0.712 Epoch 71 iteration 0047/0187: training loss 0.713 Epoch 71 iteration 0048/0187: training loss 0.712 Epoch 71 iteration 0049/0187: training loss 0.709 Epoch 71 iteration 0050/0187: training loss 0.707 Epoch 71 iteration 0051/0187: training loss 0.707 Epoch 71 iteration 0052/0187: training loss 0.707 Epoch 71 iteration 0053/0187: training loss 0.708 Epoch 71 iteration 0054/0187: training loss 0.709 Epoch 71 iteration 0055/0187: training loss 0.708 Epoch 71 iteration 0056/0187: training loss 0.705 Epoch 71 iteration 0057/0187: training loss 0.708 Epoch 71 iteration 0058/0187: training loss 0.704 Epoch 71 iteration 0059/0187: training loss 0.704 Epoch 71 iteration 0060/0187: training loss 0.703 Epoch 71 iteration 0061/0187: training loss 0.701 Epoch 71 iteration 0062/0187: training loss 0.704 Epoch 71 iteration 0063/0187: training loss 0.702 Epoch 71 iteration 0064/0187: training loss 0.700 Epoch 71 iteration 0065/0187: training loss 0.700 Epoch 71 iteration 0066/0187: training loss 0.701 Epoch 71 iteration 0067/0187: training loss 0.702 Epoch 71 iteration 0068/0187: training loss 0.702 Epoch 71 iteration 0069/0187: training loss 0.701 Epoch 71 iteration 0070/0187: training loss 0.699 Epoch 71 iteration 0071/0187: training loss 0.700 Epoch 71 iteration 0072/0187: training loss 0.699 Epoch 71 iteration 0073/0187: training loss 0.698 Epoch 71 iteration 0074/0187: training loss 0.699 Epoch 71 iteration 0075/0187: training loss 0.698 Epoch 71 iteration 0076/0187: training loss 0.700 Epoch 71 iteration 0077/0187: training loss 0.700 Epoch 71 iteration 0078/0187: training loss 0.700 Epoch 71 iteration 0079/0187: training loss 0.700 Epoch 71 iteration 0080/0187: training loss 0.699 Epoch 71 iteration 0081/0187: training loss 0.696 Epoch 71 iteration 0082/0187: training loss 0.695 Epoch 71 iteration 0083/0187: training loss 0.694 Epoch 71 iteration 0084/0187: training loss 0.697 Epoch 71 iteration 0085/0187: training loss 0.696 Epoch 71 iteration 0086/0187: training loss 0.695 Epoch 71 iteration 0087/0187: training loss 0.695 Epoch 71 iteration 0088/0187: training loss 0.697 Epoch 71 iteration 0089/0187: training loss 0.696 Epoch 71 iteration 0090/0187: training loss 0.694 Epoch 71 iteration 0091/0187: training loss 0.693 Epoch 71 iteration 0092/0187: training loss 0.694 Epoch 71 iteration 0093/0187: training loss 0.693 Epoch 71 iteration 0094/0187: training loss 0.692 Epoch 71 iteration 0095/0187: training loss 0.694 Epoch 71 iteration 0096/0187: training loss 0.692 Epoch 71 iteration 0097/0187: training loss 0.693 Epoch 71 iteration 0098/0187: training loss 0.693 Epoch 71 iteration 0099/0187: training loss 0.692 Epoch 71 iteration 0100/0187: training loss 0.693 Epoch 71 iteration 0101/0187: training loss 0.691 Epoch 71 iteration 0102/0187: training loss 0.692 Epoch 71 iteration 0103/0187: training loss 0.691 Epoch 71 iteration 0104/0187: training loss 0.694 Epoch 71 iteration 0105/0187: training loss 0.694 Epoch 71 iteration 0106/0187: training loss 0.692 Epoch 71 iteration 0107/0187: training loss 0.693 Epoch 71 iteration 0108/0187: training loss 0.693 Epoch 71 iteration 0109/0187: training loss 0.692 Epoch 71 iteration 0110/0187: training loss 0.692 Epoch 71 iteration 0111/0187: training loss 0.692 Epoch 71 iteration 0112/0187: training loss 0.693 Epoch 71 iteration 0113/0187: training loss 0.692 Epoch 71 iteration 0114/0187: training loss 0.691 Epoch 71 iteration 0115/0187: training loss 0.691 Epoch 71 iteration 0116/0187: training loss 0.694 Epoch 71 iteration 0117/0187: training loss 0.694 Epoch 71 iteration 0118/0187: training loss 0.694 Epoch 71 iteration 0119/0187: training loss 0.694 Epoch 71 iteration 0120/0187: training loss 0.695 Epoch 71 iteration 0121/0187: training loss 0.695 Epoch 71 iteration 0122/0187: training loss 0.695 Epoch 71 iteration 0123/0187: training loss 0.695 Epoch 71 iteration 0124/0187: training loss 0.695 Epoch 71 iteration 0125/0187: training loss 0.695 Epoch 71 iteration 0126/0187: training loss 0.694 Epoch 71 iteration 0127/0187: training loss 0.695 Epoch 71 iteration 0128/0187: training loss 0.696 Epoch 71 iteration 0129/0187: training loss 0.696 Epoch 71 iteration 0130/0187: training loss 0.698 Epoch 71 iteration 0131/0187: training loss 0.698 Epoch 71 iteration 0132/0187: training loss 0.698 Epoch 71 iteration 0133/0187: training loss 0.697 Epoch 71 iteration 0134/0187: training loss 0.696 Epoch 71 iteration 0135/0187: training loss 0.696 Epoch 71 iteration 0136/0187: training loss 0.696 Epoch 71 iteration 0137/0187: training loss 0.695 Epoch 71 iteration 0138/0187: training loss 0.697 Epoch 71 iteration 0139/0187: training loss 0.698 Epoch 71 iteration 0140/0187: training loss 0.699 Epoch 71 iteration 0141/0187: training loss 0.699 Epoch 71 iteration 0142/0187: training loss 0.699 Epoch 71 iteration 0143/0187: training loss 0.700 Epoch 71 iteration 0144/0187: training loss 0.699 Epoch 71 iteration 0145/0187: training loss 0.699 Epoch 71 iteration 0146/0187: training loss 0.700 Epoch 71 iteration 0147/0187: training loss 0.700 Epoch 71 iteration 0148/0187: training loss 0.701 Epoch 71 iteration 0149/0187: training loss 0.700 Epoch 71 iteration 0150/0187: training loss 0.700 Epoch 71 iteration 0151/0187: training loss 0.701 Epoch 71 iteration 0152/0187: training loss 0.702 Epoch 71 iteration 0153/0187: training loss 0.703 Epoch 71 iteration 0154/0187: training loss 0.703 Epoch 71 iteration 0155/0187: training loss 0.704 Epoch 71 iteration 0156/0187: training loss 0.703 Epoch 71 iteration 0157/0187: training loss 0.704 Epoch 71 iteration 0158/0187: training loss 0.704 Epoch 71 iteration 0159/0187: training loss 0.705 Epoch 71 iteration 0160/0187: training loss 0.704 Epoch 71 iteration 0161/0187: training loss 0.704 Epoch 71 iteration 0162/0187: training loss 0.705 Epoch 71 iteration 0163/0187: training loss 0.704 Epoch 71 iteration 0164/0187: training loss 0.703 Epoch 71 iteration 0165/0187: training loss 0.704 Epoch 71 iteration 0166/0187: training loss 0.703 Epoch 71 iteration 0167/0187: training loss 0.704 Epoch 71 iteration 0168/0187: training loss 0.704 Epoch 71 iteration 0169/0187: training loss 0.703 Epoch 71 iteration 0170/0187: training loss 0.703 Epoch 71 iteration 0171/0187: training loss 0.703 Epoch 71 iteration 0172/0187: training loss 0.703 Epoch 71 iteration 0173/0187: training loss 0.704 Epoch 71 iteration 0174/0187: training loss 0.704 Epoch 71 iteration 0175/0187: training loss 0.704 Epoch 71 iteration 0176/0187: training loss 0.704 Epoch 71 iteration 0177/0187: training loss 0.703 Epoch 71 iteration 0178/0187: training loss 0.703 Epoch 71 iteration 0179/0187: training loss 0.703 Epoch 71 iteration 0180/0187: training loss 0.702 Epoch 71 iteration 0181/0187: training loss 0.703 Epoch 71 iteration 0182/0187: training loss 0.703 Epoch 71 iteration 0183/0187: training loss 0.704 Epoch 71 iteration 0184/0187: training loss 0.703 Epoch 71 iteration 0185/0187: training loss 0.702 Epoch 71 iteration 0186/0187: training loss 0.702 Epoch 71 iteration 0187/0187: training loss 0.702 Epoch 71 validation pixAcc: 0.875, mIoU: 0.390 Epoch 72 iteration 0001/0187: training loss 0.719 Epoch 72 iteration 0002/0187: training loss 0.717 Epoch 72 iteration 0003/0187: training loss 0.699 Epoch 72 iteration 0004/0187: training loss 0.739 Epoch 72 iteration 0005/0187: training loss 0.736 Epoch 72 iteration 0006/0187: training loss 0.749 Epoch 72 iteration 0007/0187: training loss 0.744 Epoch 72 iteration 0008/0187: training loss 0.739 Epoch 72 iteration 0009/0187: training loss 0.732 Epoch 72 iteration 0010/0187: training loss 0.710 Epoch 72 iteration 0011/0187: training loss 0.713 Epoch 72 iteration 0012/0187: training loss 0.717 Epoch 72 iteration 0013/0187: training loss 0.714 Epoch 72 iteration 0014/0187: training loss 0.709 Epoch 72 iteration 0015/0187: training loss 0.699 Epoch 72 iteration 0016/0187: training loss 0.704 Epoch 72 iteration 0017/0187: training loss 0.701 Epoch 72 iteration 0018/0187: training loss 0.700 Epoch 72 iteration 0019/0187: training loss 0.699 Epoch 72 iteration 0020/0187: training loss 0.701 Epoch 72 iteration 0021/0187: training loss 0.699 Epoch 72 iteration 0022/0187: training loss 0.691 Epoch 72 iteration 0023/0187: training loss 0.698 Epoch 72 iteration 0024/0187: training loss 0.707 Epoch 72 iteration 0025/0187: training loss 0.704 Epoch 72 iteration 0026/0187: training loss 0.703 Epoch 72 iteration 0027/0187: training loss 0.700 Epoch 72 iteration 0028/0187: training loss 0.697 Epoch 72 iteration 0029/0187: training loss 0.699 Epoch 72 iteration 0030/0187: training loss 0.695 Epoch 72 iteration 0031/0187: training loss 0.702 Epoch 72 iteration 0032/0187: training loss 0.710 Epoch 72 iteration 0033/0187: training loss 0.706 Epoch 72 iteration 0034/0187: training loss 0.709 Epoch 72 iteration 0035/0187: training loss 0.703 Epoch 72 iteration 0036/0187: training loss 0.701 Epoch 72 iteration 0037/0187: training loss 0.699 Epoch 72 iteration 0038/0187: training loss 0.702 Epoch 72 iteration 0039/0187: training loss 0.700 Epoch 72 iteration 0040/0187: training loss 0.702 Epoch 72 iteration 0041/0187: training loss 0.699 Epoch 72 iteration 0042/0187: training loss 0.701 Epoch 72 iteration 0043/0187: training loss 0.702 Epoch 72 iteration 0044/0187: training loss 0.703 Epoch 72 iteration 0045/0187: training loss 0.705 Epoch 72 iteration 0046/0187: training loss 0.704 Epoch 72 iteration 0047/0187: training loss 0.709 Epoch 72 iteration 0048/0187: training loss 0.710 Epoch 72 iteration 0049/0187: training loss 0.710 Epoch 72 iteration 0050/0187: training loss 0.715 Epoch 72 iteration 0051/0187: training loss 0.715 Epoch 72 iteration 0052/0187: training loss 0.712 Epoch 72 iteration 0053/0187: training loss 0.711 Epoch 72 iteration 0054/0187: training loss 0.708 Epoch 72 iteration 0055/0187: training loss 0.708 Epoch 72 iteration 0056/0187: training loss 0.707 Epoch 72 iteration 0057/0187: training loss 0.705 Epoch 72 iteration 0058/0187: training loss 0.704 Epoch 72 iteration 0059/0187: training loss 0.705 Epoch 72 iteration 0060/0187: training loss 0.710 Epoch 72 iteration 0061/0187: training loss 0.707 Epoch 72 iteration 0062/0187: training loss 0.706 Epoch 72 iteration 0063/0187: training loss 0.703 Epoch 72 iteration 0064/0187: training loss 0.702 Epoch 72 iteration 0065/0187: training loss 0.702 Epoch 72 iteration 0066/0187: training loss 0.704 Epoch 72 iteration 0067/0187: training loss 0.702 Epoch 72 iteration 0068/0187: training loss 0.702 Epoch 72 iteration 0069/0187: training loss 0.702 Epoch 72 iteration 0070/0187: training loss 0.703 Epoch 72 iteration 0071/0187: training loss 0.703 Epoch 72 iteration 0072/0187: training loss 0.705 Epoch 72 iteration 0073/0187: training loss 0.704 Epoch 72 iteration 0074/0187: training loss 0.703 Epoch 72 iteration 0075/0187: training loss 0.703 Epoch 72 iteration 0076/0187: training loss 0.702 Epoch 72 iteration 0077/0187: training loss 0.701 Epoch 72 iteration 0078/0187: training loss 0.700 Epoch 72 iteration 0079/0187: training loss 0.697 Epoch 72 iteration 0080/0187: training loss 0.696 Epoch 72 iteration 0081/0187: training loss 0.698 Epoch 72 iteration 0082/0187: training loss 0.699 Epoch 72 iteration 0083/0187: training loss 0.698 Epoch 72 iteration 0084/0187: training loss 0.697 Epoch 72 iteration 0085/0187: training loss 0.698 Epoch 72 iteration 0086/0187: training loss 0.701 Epoch 72 iteration 0087/0187: training loss 0.701 Epoch 72 iteration 0088/0187: training loss 0.702 Epoch 72 iteration 0089/0187: training loss 0.703 Epoch 72 iteration 0090/0187: training loss 0.703 Epoch 72 iteration 0091/0188: training loss 0.703 Epoch 72 iteration 0092/0188: training loss 0.703 Epoch 72 iteration 0093/0188: training loss 0.704 Epoch 72 iteration 0094/0188: training loss 0.703 Epoch 72 iteration 0095/0188: training loss 0.703 Epoch 72 iteration 0096/0188: training loss 0.703 Epoch 72 iteration 0097/0188: training loss 0.704 Epoch 72 iteration 0098/0188: training loss 0.702 Epoch 72 iteration 0099/0188: training loss 0.705 Epoch 72 iteration 0100/0188: training loss 0.703 Epoch 72 iteration 0101/0188: training loss 0.704 Epoch 72 iteration 0102/0188: training loss 0.705 Epoch 72 iteration 0103/0188: training loss 0.704 Epoch 72 iteration 0104/0188: training loss 0.704 Epoch 72 iteration 0105/0188: training loss 0.704 Epoch 72 iteration 0106/0188: training loss 0.704 Epoch 72 iteration 0107/0188: training loss 0.704 Epoch 72 iteration 0108/0188: training loss 0.705 Epoch 72 iteration 0109/0188: training loss 0.705 Epoch 72 iteration 0110/0188: training loss 0.705 Epoch 72 iteration 0111/0188: training loss 0.704 Epoch 72 iteration 0112/0188: training loss 0.707 Epoch 72 iteration 0113/0188: training loss 0.706 Epoch 72 iteration 0114/0188: training loss 0.706 Epoch 72 iteration 0115/0188: training loss 0.706 Epoch 72 iteration 0116/0188: training loss 0.704 Epoch 72 iteration 0117/0188: training loss 0.704 Epoch 72 iteration 0118/0188: training loss 0.705 Epoch 72 iteration 0119/0188: training loss 0.705 Epoch 72 iteration 0120/0188: training loss 0.704 Epoch 72 iteration 0121/0188: training loss 0.705 Epoch 72 iteration 0122/0188: training loss 0.706 Epoch 72 iteration 0123/0188: training loss 0.706 Epoch 72 iteration 0124/0188: training loss 0.710 Epoch 72 iteration 0125/0188: training loss 0.711 Epoch 72 iteration 0126/0188: training loss 0.710 Epoch 72 iteration 0127/0188: training loss 0.709 Epoch 72 iteration 0128/0188: training loss 0.712 Epoch 72 iteration 0129/0188: training loss 0.711 Epoch 72 iteration 0130/0188: training loss 0.711 Epoch 72 iteration 0131/0188: training loss 0.710 Epoch 72 iteration 0132/0188: training loss 0.711 Epoch 72 iteration 0133/0188: training loss 0.711 Epoch 72 iteration 0134/0188: training loss 0.710 Epoch 72 iteration 0135/0188: training loss 0.710 Epoch 72 iteration 0136/0188: training loss 0.710 Epoch 72 iteration 0137/0188: training loss 0.709 Epoch 72 iteration 0138/0188: training loss 0.708 Epoch 72 iteration 0139/0188: training loss 0.708 Epoch 72 iteration 0140/0188: training loss 0.708 Epoch 72 iteration 0141/0188: training loss 0.707 Epoch 72 iteration 0142/0188: training loss 0.707 Epoch 72 iteration 0143/0188: training loss 0.706 Epoch 72 iteration 0144/0188: training loss 0.706 Epoch 72 iteration 0145/0188: training loss 0.706 Epoch 72 iteration 0146/0188: training loss 0.705 Epoch 72 iteration 0147/0188: training loss 0.705 Epoch 72 iteration 0148/0188: training loss 0.705 Epoch 72 iteration 0149/0188: training loss 0.704 Epoch 72 iteration 0150/0188: training loss 0.704 Epoch 72 iteration 0151/0188: training loss 0.704 Epoch 72 iteration 0152/0188: training loss 0.703 Epoch 72 iteration 0153/0188: training loss 0.704 Epoch 72 iteration 0154/0188: training loss 0.703 Epoch 72 iteration 0155/0188: training loss 0.702 Epoch 72 iteration 0156/0188: training loss 0.702 Epoch 72 iteration 0157/0188: training loss 0.702 Epoch 72 iteration 0158/0188: training loss 0.702 Epoch 72 iteration 0159/0188: training loss 0.702 Epoch 72 iteration 0160/0188: training loss 0.702 Epoch 72 iteration 0161/0188: training loss 0.701 Epoch 72 iteration 0162/0188: training loss 0.702 Epoch 72 iteration 0163/0188: training loss 0.703 Epoch 72 iteration 0164/0188: training loss 0.704 Epoch 72 iteration 0165/0188: training loss 0.704 Epoch 72 iteration 0166/0188: training loss 0.704 Epoch 72 iteration 0167/0188: training loss 0.704 Epoch 72 iteration 0168/0188: training loss 0.704 Epoch 72 iteration 0169/0188: training loss 0.704 Epoch 72 iteration 0170/0188: training loss 0.704 Epoch 72 iteration 0171/0188: training loss 0.704 Epoch 72 iteration 0172/0188: training loss 0.704 Epoch 72 iteration 0173/0188: training loss 0.703 Epoch 72 iteration 0174/0188: training loss 0.703 Epoch 72 iteration 0175/0188: training loss 0.703 Epoch 72 iteration 0176/0188: training loss 0.702 Epoch 72 iteration 0177/0188: training loss 0.702 Epoch 72 iteration 0178/0188: training loss 0.701 Epoch 72 iteration 0179/0188: training loss 0.701 Epoch 72 iteration 0180/0188: training loss 0.702 Epoch 72 iteration 0181/0188: training loss 0.702 Epoch 72 iteration 0182/0188: training loss 0.701 Epoch 72 iteration 0183/0188: training loss 0.700 Epoch 72 iteration 0184/0188: training loss 0.701 Epoch 72 iteration 0185/0188: training loss 0.701 Epoch 72 iteration 0186/0188: training loss 0.701 Epoch 72 validation pixAcc: 0.876, mIoU: 0.386 Epoch 73 iteration 0001/0187: training loss 0.822 Epoch 73 iteration 0002/0187: training loss 0.763 Epoch 73 iteration 0003/0187: training loss 0.797 Epoch 73 iteration 0004/0187: training loss 0.744 Epoch 73 iteration 0005/0187: training loss 0.758 Epoch 73 iteration 0006/0187: training loss 0.775 Epoch 73 iteration 0007/0187: training loss 0.778 Epoch 73 iteration 0008/0187: training loss 0.776 Epoch 73 iteration 0009/0187: training loss 0.759 Epoch 73 iteration 0010/0187: training loss 0.735 Epoch 73 iteration 0011/0187: training loss 0.729 Epoch 73 iteration 0012/0187: training loss 0.734 Epoch 73 iteration 0013/0187: training loss 0.730 Epoch 73 iteration 0014/0187: training loss 0.728 Epoch 73 iteration 0015/0187: training loss 0.725 Epoch 73 iteration 0016/0187: training loss 0.746 Epoch 73 iteration 0017/0187: training loss 0.747 Epoch 73 iteration 0018/0187: training loss 0.747 Epoch 73 iteration 0019/0187: training loss 0.744 Epoch 73 iteration 0020/0187: training loss 0.738 Epoch 73 iteration 0021/0187: training loss 0.733 Epoch 73 iteration 0022/0187: training loss 0.735 Epoch 73 iteration 0023/0187: training loss 0.737 Epoch 73 iteration 0024/0187: training loss 0.738 Epoch 73 iteration 0025/0187: training loss 0.742 Epoch 73 iteration 0026/0187: training loss 0.739 Epoch 73 iteration 0027/0187: training loss 0.735 Epoch 73 iteration 0028/0187: training loss 0.729 Epoch 73 iteration 0029/0187: training loss 0.725 Epoch 73 iteration 0030/0187: training loss 0.722 Epoch 73 iteration 0031/0187: training loss 0.719 Epoch 73 iteration 0032/0187: training loss 0.717 Epoch 73 iteration 0033/0187: training loss 0.717 Epoch 73 iteration 0034/0187: training loss 0.715 Epoch 73 iteration 0035/0187: training loss 0.713 Epoch 73 iteration 0036/0187: training loss 0.715 Epoch 73 iteration 0037/0187: training loss 0.713 Epoch 73 iteration 0038/0187: training loss 0.709 Epoch 73 iteration 0039/0187: training loss 0.707 Epoch 73 iteration 0040/0187: training loss 0.707 Epoch 73 iteration 0041/0187: training loss 0.706 Epoch 73 iteration 0042/0187: training loss 0.708 Epoch 73 iteration 0043/0187: training loss 0.710 Epoch 73 iteration 0044/0187: training loss 0.710 Epoch 73 iteration 0045/0187: training loss 0.708 Epoch 73 iteration 0046/0187: training loss 0.708 Epoch 73 iteration 0047/0187: training loss 0.710 Epoch 73 iteration 0048/0187: training loss 0.714 Epoch 73 iteration 0049/0187: training loss 0.714 Epoch 73 iteration 0050/0187: training loss 0.712 Epoch 73 iteration 0051/0187: training loss 0.713 Epoch 73 iteration 0052/0187: training loss 0.711 Epoch 73 iteration 0053/0187: training loss 0.712 Epoch 73 iteration 0054/0187: training loss 0.710 Epoch 73 iteration 0055/0187: training loss 0.710 Epoch 73 iteration 0056/0187: training loss 0.708 Epoch 73 iteration 0057/0187: training loss 0.707 Epoch 73 iteration 0058/0187: training loss 0.706 Epoch 73 iteration 0059/0187: training loss 0.708 Epoch 73 iteration 0060/0187: training loss 0.708 Epoch 73 iteration 0061/0187: training loss 0.707 Epoch 73 iteration 0062/0187: training loss 0.710 Epoch 73 iteration 0063/0187: training loss 0.709 Epoch 73 iteration 0064/0187: training loss 0.707 Epoch 73 iteration 0065/0187: training loss 0.706 Epoch 73 iteration 0066/0187: training loss 0.707 Epoch 73 iteration 0067/0187: training loss 0.707 Epoch 73 iteration 0068/0187: training loss 0.707 Epoch 73 iteration 0069/0187: training loss 0.709 Epoch 73 iteration 0070/0187: training loss 0.710 Epoch 73 iteration 0071/0187: training loss 0.712 Epoch 73 iteration 0072/0187: training loss 0.714 Epoch 73 iteration 0073/0187: training loss 0.712 Epoch 73 iteration 0074/0187: training loss 0.712 Epoch 73 iteration 0075/0187: training loss 0.713 Epoch 73 iteration 0076/0187: training loss 0.713 Epoch 73 iteration 0077/0187: training loss 0.716 Epoch 73 iteration 0078/0187: training loss 0.715 Epoch 73 iteration 0079/0187: training loss 0.715 Epoch 73 iteration 0080/0187: training loss 0.714 Epoch 73 iteration 0081/0187: training loss 0.712 Epoch 73 iteration 0082/0187: training loss 0.712 Epoch 73 iteration 0083/0187: training loss 0.712 Epoch 73 iteration 0084/0187: training loss 0.713 Epoch 73 iteration 0085/0187: training loss 0.712 Epoch 73 iteration 0086/0187: training loss 0.712 Epoch 73 iteration 0087/0187: training loss 0.710 Epoch 73 iteration 0088/0187: training loss 0.710 Epoch 73 iteration 0089/0187: training loss 0.709 Epoch 73 iteration 0090/0187: training loss 0.709 Epoch 73 iteration 0091/0187: training loss 0.708 Epoch 73 iteration 0092/0187: training loss 0.708 Epoch 73 iteration 0093/0187: training loss 0.708 Epoch 73 iteration 0094/0187: training loss 0.708 Epoch 73 iteration 0095/0187: training loss 0.708 Epoch 73 iteration 0096/0187: training loss 0.706 Epoch 73 iteration 0097/0187: training loss 0.705 Epoch 73 iteration 0098/0187: training loss 0.703 Epoch 73 iteration 0099/0187: training loss 0.702 Epoch 73 iteration 0100/0187: training loss 0.701 Epoch 73 iteration 0101/0187: training loss 0.701 Epoch 73 iteration 0102/0187: training loss 0.700 Epoch 73 iteration 0103/0187: training loss 0.700 Epoch 73 iteration 0104/0187: training loss 0.699 Epoch 73 iteration 0105/0187: training loss 0.698 Epoch 73 iteration 0106/0187: training loss 0.699 Epoch 73 iteration 0107/0187: training loss 0.699 Epoch 73 iteration 0108/0187: training loss 0.698 Epoch 73 iteration 0109/0187: training loss 0.699 Epoch 73 iteration 0110/0187: training loss 0.699 Epoch 73 iteration 0111/0187: training loss 0.700 Epoch 73 iteration 0112/0187: training loss 0.702 Epoch 73 iteration 0113/0187: training loss 0.702 Epoch 73 iteration 0114/0187: training loss 0.703 Epoch 73 iteration 0115/0187: training loss 0.702 Epoch 73 iteration 0116/0187: training loss 0.703 Epoch 73 iteration 0117/0187: training loss 0.702 Epoch 73 iteration 0118/0187: training loss 0.703 Epoch 73 iteration 0119/0187: training loss 0.704 Epoch 73 iteration 0120/0187: training loss 0.703 Epoch 73 iteration 0121/0187: training loss 0.703 Epoch 73 iteration 0122/0187: training loss 0.703 Epoch 73 iteration 0123/0187: training loss 0.703 Epoch 73 iteration 0124/0187: training loss 0.703 Epoch 73 iteration 0125/0187: training loss 0.702 Epoch 73 iteration 0126/0187: training loss 0.703 Epoch 73 iteration 0127/0187: training loss 0.704 Epoch 73 iteration 0128/0187: training loss 0.704 Epoch 73 iteration 0129/0187: training loss 0.703 Epoch 73 iteration 0130/0187: training loss 0.702 Epoch 73 iteration 0131/0187: training loss 0.701 Epoch 73 iteration 0132/0187: training loss 0.703 Epoch 73 iteration 0133/0187: training loss 0.702 Epoch 73 iteration 0134/0187: training loss 0.701 Epoch 73 iteration 0135/0187: training loss 0.701 Epoch 73 iteration 0136/0187: training loss 0.700 Epoch 73 iteration 0137/0187: training loss 0.701 Epoch 73 iteration 0138/0187: training loss 0.701 Epoch 73 iteration 0139/0187: training loss 0.701 Epoch 73 iteration 0140/0187: training loss 0.701 Epoch 73 iteration 0141/0187: training loss 0.703 Epoch 73 iteration 0142/0187: training loss 0.704 Epoch 73 iteration 0143/0187: training loss 0.705 Epoch 73 iteration 0144/0187: training loss 0.706 Epoch 73 iteration 0145/0187: training loss 0.705 Epoch 73 iteration 0146/0187: training loss 0.704 Epoch 73 iteration 0147/0187: training loss 0.705 Epoch 73 iteration 0148/0187: training loss 0.705 Epoch 73 iteration 0149/0187: training loss 0.704 Epoch 73 iteration 0150/0187: training loss 0.704 Epoch 73 iteration 0151/0187: training loss 0.704 Epoch 73 iteration 0152/0187: training loss 0.704 Epoch 73 iteration 0153/0187: training loss 0.703 Epoch 73 iteration 0154/0187: training loss 0.703 Epoch 73 iteration 0155/0187: training loss 0.703 Epoch 73 iteration 0156/0187: training loss 0.703 Epoch 73 iteration 0157/0187: training loss 0.702 Epoch 73 iteration 0158/0187: training loss 0.701 Epoch 73 iteration 0159/0187: training loss 0.701 Epoch 73 iteration 0160/0187: training loss 0.700 Epoch 73 iteration 0161/0187: training loss 0.699 Epoch 73 iteration 0162/0187: training loss 0.699 Epoch 73 iteration 0163/0187: training loss 0.699 Epoch 73 iteration 0164/0187: training loss 0.698 Epoch 73 iteration 0165/0187: training loss 0.698 Epoch 73 iteration 0166/0187: training loss 0.699 Epoch 73 iteration 0167/0187: training loss 0.699 Epoch 73 iteration 0168/0187: training loss 0.698 Epoch 73 iteration 0169/0187: training loss 0.699 Epoch 73 iteration 0170/0187: training loss 0.699 Epoch 73 iteration 0171/0187: training loss 0.699 Epoch 73 iteration 0172/0187: training loss 0.699 Epoch 73 iteration 0173/0187: training loss 0.700 Epoch 73 iteration 0174/0187: training loss 0.700 Epoch 73 iteration 0175/0187: training loss 0.700 Epoch 73 iteration 0176/0187: training loss 0.700 Epoch 73 iteration 0177/0187: training loss 0.700 Epoch 73 iteration 0178/0187: training loss 0.700 Epoch 73 iteration 0179/0187: training loss 0.700 Epoch 73 iteration 0180/0187: training loss 0.700 Epoch 73 iteration 0181/0187: training loss 0.701 Epoch 73 iteration 0182/0187: training loss 0.700 Epoch 73 iteration 0183/0187: training loss 0.699 Epoch 73 iteration 0184/0187: training loss 0.699 Epoch 73 iteration 0185/0187: training loss 0.700 Epoch 73 iteration 0186/0187: training loss 0.700 Epoch 73 iteration 0187/0187: training loss 0.700 Epoch 73 validation pixAcc: 0.875, mIoU: 0.390 Epoch 74 iteration 0001/0187: training loss 0.770 Epoch 74 iteration 0002/0187: training loss 0.761 Epoch 74 iteration 0003/0187: training loss 0.736 Epoch 74 iteration 0004/0187: training loss 0.708 Epoch 74 iteration 0005/0187: training loss 0.702 Epoch 74 iteration 0006/0187: training loss 0.700 Epoch 74 iteration 0007/0187: training loss 0.707 Epoch 74 iteration 0008/0187: training loss 0.692 Epoch 74 iteration 0009/0187: training loss 0.701 Epoch 74 iteration 0010/0187: training loss 0.684 Epoch 74 iteration 0011/0187: training loss 0.676 Epoch 74 iteration 0012/0187: training loss 0.677 Epoch 74 iteration 0013/0187: training loss 0.670 Epoch 74 iteration 0014/0187: training loss 0.676 Epoch 74 iteration 0015/0187: training loss 0.668 Epoch 74 iteration 0016/0187: training loss 0.673 Epoch 74 iteration 0017/0187: training loss 0.665 Epoch 74 iteration 0018/0187: training loss 0.659 Epoch 74 iteration 0019/0187: training loss 0.658 Epoch 74 iteration 0020/0187: training loss 0.660 Epoch 74 iteration 0021/0187: training loss 0.664 Epoch 74 iteration 0022/0187: training loss 0.670 Epoch 74 iteration 0023/0187: training loss 0.667 Epoch 74 iteration 0024/0187: training loss 0.664 Epoch 74 iteration 0025/0187: training loss 0.672 Epoch 74 iteration 0026/0187: training loss 0.675 Epoch 74 iteration 0027/0187: training loss 0.672 Epoch 74 iteration 0028/0187: training loss 0.671 Epoch 74 iteration 0029/0187: training loss 0.668 Epoch 74 iteration 0030/0187: training loss 0.666 Epoch 74 iteration 0031/0187: training loss 0.664 Epoch 74 iteration 0032/0187: training loss 0.666 Epoch 74 iteration 0033/0187: training loss 0.674 Epoch 74 iteration 0034/0187: training loss 0.678 Epoch 74 iteration 0035/0187: training loss 0.679 Epoch 74 iteration 0036/0187: training loss 0.681 Epoch 74 iteration 0037/0187: training loss 0.681 Epoch 74 iteration 0038/0187: training loss 0.678 Epoch 74 iteration 0039/0187: training loss 0.674 Epoch 74 iteration 0040/0187: training loss 0.673 Epoch 74 iteration 0041/0187: training loss 0.679 Epoch 74 iteration 0042/0187: training loss 0.681 Epoch 74 iteration 0043/0187: training loss 0.681 Epoch 74 iteration 0044/0187: training loss 0.684 Epoch 74 iteration 0045/0187: training loss 0.683 Epoch 74 iteration 0046/0187: training loss 0.685 Epoch 74 iteration 0047/0187: training loss 0.683 Epoch 74 iteration 0048/0187: training loss 0.682 Epoch 74 iteration 0049/0187: training loss 0.684 Epoch 74 iteration 0050/0187: training loss 0.681 Epoch 74 iteration 0051/0187: training loss 0.687 Epoch 74 iteration 0052/0187: training loss 0.688 Epoch 74 iteration 0053/0187: training loss 0.690 Epoch 74 iteration 0054/0187: training loss 0.689 Epoch 74 iteration 0055/0187: training loss 0.689 Epoch 74 iteration 0056/0187: training loss 0.689 Epoch 74 iteration 0057/0187: training loss 0.689 Epoch 74 iteration 0058/0187: training loss 0.687 Epoch 74 iteration 0059/0187: training loss 0.688 Epoch 74 iteration 0060/0187: training loss 0.686 Epoch 74 iteration 0061/0187: training loss 0.687 Epoch 74 iteration 0062/0187: training loss 0.695 Epoch 74 iteration 0063/0187: training loss 0.694 Epoch 74 iteration 0064/0187: training loss 0.695 Epoch 74 iteration 0065/0187: training loss 0.694 Epoch 74 iteration 0066/0187: training loss 0.694 Epoch 74 iteration 0067/0187: training loss 0.693 Epoch 74 iteration 0068/0187: training loss 0.693 Epoch 74 iteration 0069/0187: training loss 0.693 Epoch 74 iteration 0070/0187: training loss 0.694 Epoch 74 iteration 0071/0187: training loss 0.696 Epoch 74 iteration 0072/0187: training loss 0.697 Epoch 74 iteration 0073/0187: training loss 0.698 Epoch 74 iteration 0074/0187: training loss 0.697 Epoch 74 iteration 0075/0187: training loss 0.696 Epoch 74 iteration 0076/0187: training loss 0.700 Epoch 74 iteration 0077/0187: training loss 0.700 Epoch 74 iteration 0078/0187: training loss 0.700 Epoch 74 iteration 0079/0187: training loss 0.700 Epoch 74 iteration 0080/0187: training loss 0.700 Epoch 74 iteration 0081/0187: training loss 0.699 Epoch 74 iteration 0082/0187: training loss 0.699 Epoch 74 iteration 0083/0187: training loss 0.700 Epoch 74 iteration 0084/0187: training loss 0.701 Epoch 74 iteration 0085/0187: training loss 0.701 Epoch 74 iteration 0086/0187: training loss 0.700 Epoch 74 iteration 0087/0187: training loss 0.701 Epoch 74 iteration 0088/0187: training loss 0.702 Epoch 74 iteration 0089/0187: training loss 0.702 Epoch 74 iteration 0090/0187: training loss 0.703 Epoch 74 iteration 0091/0188: training loss 0.703 Epoch 74 iteration 0092/0188: training loss 0.704 Epoch 74 iteration 0093/0188: training loss 0.703 Epoch 74 iteration 0094/0188: training loss 0.703 Epoch 74 iteration 0095/0188: training loss 0.703 Epoch 74 iteration 0096/0188: training loss 0.705 Epoch 74 iteration 0097/0188: training loss 0.704 Epoch 74 iteration 0098/0188: training loss 0.704 Epoch 74 iteration 0099/0188: training loss 0.703 Epoch 74 iteration 0100/0188: training loss 0.702 Epoch 74 iteration 0101/0188: training loss 0.703 Epoch 74 iteration 0102/0188: training loss 0.703 Epoch 74 iteration 0103/0188: training loss 0.702 Epoch 74 iteration 0104/0188: training loss 0.703 Epoch 74 iteration 0105/0188: training loss 0.702 Epoch 74 iteration 0106/0188: training loss 0.702 Epoch 74 iteration 0107/0188: training loss 0.701 Epoch 74 iteration 0108/0188: training loss 0.700 Epoch 74 iteration 0109/0188: training loss 0.701 Epoch 74 iteration 0110/0188: training loss 0.702 Epoch 74 iteration 0111/0188: training loss 0.702 Epoch 74 iteration 0112/0188: training loss 0.702 Epoch 74 iteration 0113/0188: training loss 0.703 Epoch 74 iteration 0114/0188: training loss 0.703 Epoch 74 iteration 0115/0188: training loss 0.704 Epoch 74 iteration 0116/0188: training loss 0.704 Epoch 74 iteration 0117/0188: training loss 0.705 Epoch 74 iteration 0118/0188: training loss 0.705 Epoch 74 iteration 0119/0188: training loss 0.705 Epoch 74 iteration 0120/0188: training loss 0.707 Epoch 74 iteration 0121/0188: training loss 0.710 Epoch 74 iteration 0122/0188: training loss 0.710 Epoch 74 iteration 0123/0188: training loss 0.710 Epoch 74 iteration 0124/0188: training loss 0.709 Epoch 74 iteration 0125/0188: training loss 0.708 Epoch 74 iteration 0126/0188: training loss 0.709 Epoch 74 iteration 0127/0188: training loss 0.709 Epoch 74 iteration 0128/0188: training loss 0.708 Epoch 74 iteration 0129/0188: training loss 0.707 Epoch 74 iteration 0130/0188: training loss 0.708 Epoch 74 iteration 0131/0188: training loss 0.707 Epoch 74 iteration 0132/0188: training loss 0.707 Epoch 74 iteration 0133/0188: training loss 0.705 Epoch 74 iteration 0134/0188: training loss 0.705 Epoch 74 iteration 0135/0188: training loss 0.706 Epoch 74 iteration 0136/0188: training loss 0.707 Epoch 74 iteration 0137/0188: training loss 0.705 Epoch 74 iteration 0138/0188: training loss 0.707 Epoch 74 iteration 0139/0188: training loss 0.706 Epoch 74 iteration 0140/0188: training loss 0.706 Epoch 74 iteration 0141/0188: training loss 0.706 Epoch 74 iteration 0142/0188: training loss 0.706 Epoch 74 iteration 0143/0188: training loss 0.705 Epoch 74 iteration 0144/0188: training loss 0.705 Epoch 74 iteration 0145/0188: training loss 0.706 Epoch 74 iteration 0146/0188: training loss 0.705 Epoch 74 iteration 0147/0188: training loss 0.706 Epoch 74 iteration 0148/0188: training loss 0.706 Epoch 74 iteration 0149/0188: training loss 0.706 Epoch 74 iteration 0150/0188: training loss 0.705 Epoch 74 iteration 0151/0188: training loss 0.704 Epoch 74 iteration 0152/0188: training loss 0.705 Epoch 74 iteration 0153/0188: training loss 0.705 Epoch 74 iteration 0154/0188: training loss 0.706 Epoch 74 iteration 0155/0188: training loss 0.705 Epoch 74 iteration 0156/0188: training loss 0.705 Epoch 74 iteration 0157/0188: training loss 0.705 Epoch 74 iteration 0158/0188: training loss 0.704 Epoch 74 iteration 0159/0188: training loss 0.705 Epoch 74 iteration 0160/0188: training loss 0.705 Epoch 74 iteration 0161/0188: training loss 0.705 Epoch 74 iteration 0162/0188: training loss 0.705 Epoch 74 iteration 0163/0188: training loss 0.706 Epoch 74 iteration 0164/0188: training loss 0.705 Epoch 74 iteration 0165/0188: training loss 0.705 Epoch 74 iteration 0166/0188: training loss 0.706 Epoch 74 iteration 0167/0188: training loss 0.705 Epoch 74 iteration 0168/0188: training loss 0.705 Epoch 74 iteration 0169/0188: training loss 0.705 Epoch 74 iteration 0170/0188: training loss 0.705 Epoch 74 iteration 0171/0188: training loss 0.704 Epoch 74 iteration 0172/0188: training loss 0.704 Epoch 74 iteration 0173/0188: training loss 0.705 Epoch 74 iteration 0174/0188: training loss 0.705 Epoch 74 iteration 0175/0188: training loss 0.705 Epoch 74 iteration 0176/0188: training loss 0.706 Epoch 74 iteration 0177/0188: training loss 0.706 Epoch 74 iteration 0178/0188: training loss 0.706 Epoch 74 iteration 0179/0188: training loss 0.705 Epoch 74 iteration 0180/0188: training loss 0.705 Epoch 74 iteration 0181/0188: training loss 0.705 Epoch 74 iteration 0182/0188: training loss 0.705 Epoch 74 iteration 0183/0188: training loss 0.704 Epoch 74 iteration 0184/0188: training loss 0.704 Epoch 74 iteration 0185/0188: training loss 0.704 Epoch 74 iteration 0186/0188: training loss 0.704 Epoch 74 validation pixAcc: 0.875, mIoU: 0.389 Epoch 75 iteration 0001/0187: training loss 0.581 Epoch 75 iteration 0002/0187: training loss 0.693 Epoch 75 iteration 0003/0187: training loss 0.708 Epoch 75 iteration 0004/0187: training loss 0.714 Epoch 75 iteration 0005/0187: training loss 0.719 Epoch 75 iteration 0006/0187: training loss 0.725 Epoch 75 iteration 0007/0187: training loss 0.723 Epoch 75 iteration 0008/0187: training loss 0.715 Epoch 75 iteration 0009/0187: training loss 0.719 Epoch 75 iteration 0010/0187: training loss 0.715 Epoch 75 iteration 0011/0187: training loss 0.705 Epoch 75 iteration 0012/0187: training loss 0.705 Epoch 75 iteration 0013/0187: training loss 0.704 Epoch 75 iteration 0014/0187: training loss 0.714 Epoch 75 iteration 0015/0187: training loss 0.728 Epoch 75 iteration 0016/0187: training loss 0.727 Epoch 75 iteration 0017/0187: training loss 0.728 Epoch 75 iteration 0018/0187: training loss 0.724 Epoch 75 iteration 0019/0187: training loss 0.718 Epoch 75 iteration 0020/0187: training loss 0.720 Epoch 75 iteration 0021/0187: training loss 0.714 Epoch 75 iteration 0022/0187: training loss 0.716 Epoch 75 iteration 0023/0187: training loss 0.715 Epoch 75 iteration 0024/0187: training loss 0.722 Epoch 75 iteration 0025/0187: training loss 0.722 Epoch 75 iteration 0026/0187: training loss 0.716 Epoch 75 iteration 0027/0187: training loss 0.712 Epoch 75 iteration 0028/0187: training loss 0.715 Epoch 75 iteration 0029/0187: training loss 0.719 Epoch 75 iteration 0030/0187: training loss 0.720 Epoch 75 iteration 0031/0187: training loss 0.723 Epoch 75 iteration 0032/0187: training loss 0.722 Epoch 75 iteration 0033/0187: training loss 0.717 Epoch 75 iteration 0034/0187: training loss 0.717 Epoch 75 iteration 0035/0187: training loss 0.723 Epoch 75 iteration 0036/0187: training loss 0.720 Epoch 75 iteration 0037/0187: training loss 0.717 Epoch 75 iteration 0038/0187: training loss 0.718 Epoch 75 iteration 0039/0187: training loss 0.718 Epoch 75 iteration 0040/0187: training loss 0.716 Epoch 75 iteration 0041/0187: training loss 0.712 Epoch 75 iteration 0042/0187: training loss 0.714 Epoch 75 iteration 0043/0187: training loss 0.713 Epoch 75 iteration 0044/0187: training loss 0.708 Epoch 75 iteration 0045/0187: training loss 0.708 Epoch 75 iteration 0046/0187: training loss 0.706 Epoch 75 iteration 0047/0187: training loss 0.706 Epoch 75 iteration 0048/0187: training loss 0.705 Epoch 75 iteration 0049/0187: training loss 0.705 Epoch 75 iteration 0050/0187: training loss 0.704 Epoch 75 iteration 0051/0187: training loss 0.702 Epoch 75 iteration 0052/0187: training loss 0.703 Epoch 75 iteration 0053/0187: training loss 0.705 Epoch 75 iteration 0054/0187: training loss 0.706 Epoch 75 iteration 0055/0187: training loss 0.704 Epoch 75 iteration 0056/0187: training loss 0.702 Epoch 75 iteration 0057/0187: training loss 0.703 Epoch 75 iteration 0058/0187: training loss 0.703 Epoch 75 iteration 0059/0187: training loss 0.706 Epoch 75 iteration 0060/0187: training loss 0.706 Epoch 75 iteration 0061/0187: training loss 0.710 Epoch 75 iteration 0062/0187: training loss 0.710 Epoch 75 iteration 0063/0187: training loss 0.710 Epoch 75 iteration 0064/0187: training loss 0.710 Epoch 75 iteration 0065/0187: training loss 0.712 Epoch 75 iteration 0066/0187: training loss 0.710 Epoch 75 iteration 0067/0187: training loss 0.709 Epoch 75 iteration 0068/0187: training loss 0.708 Epoch 75 iteration 0069/0187: training loss 0.706 Epoch 75 iteration 0070/0187: training loss 0.706 Epoch 75 iteration 0071/0187: training loss 0.705 Epoch 75 iteration 0072/0187: training loss 0.704 Epoch 75 iteration 0073/0187: training loss 0.703 Epoch 75 iteration 0074/0187: training loss 0.706 Epoch 75 iteration 0075/0187: training loss 0.705 Epoch 75 iteration 0076/0187: training loss 0.703 Epoch 75 iteration 0077/0187: training loss 0.704 Epoch 75 iteration 0078/0187: training loss 0.706 Epoch 75 iteration 0079/0187: training loss 0.704 Epoch 75 iteration 0080/0187: training loss 0.703 Epoch 75 iteration 0081/0187: training loss 0.704 Epoch 75 iteration 0082/0187: training loss 0.703 Epoch 75 iteration 0083/0187: training loss 0.703 Epoch 75 iteration 0084/0187: training loss 0.702 Epoch 75 iteration 0085/0187: training loss 0.702 Epoch 75 iteration 0086/0187: training loss 0.699 Epoch 75 iteration 0087/0187: training loss 0.699 Epoch 75 iteration 0088/0187: training loss 0.699 Epoch 75 iteration 0089/0187: training loss 0.699 Epoch 75 iteration 0090/0187: training loss 0.699 Epoch 75 iteration 0091/0187: training loss 0.700 Epoch 75 iteration 0092/0187: training loss 0.698 Epoch 75 iteration 0093/0187: training loss 0.699 Epoch 75 iteration 0094/0187: training loss 0.700 Epoch 75 iteration 0095/0187: training loss 0.701 Epoch 75 iteration 0096/0187: training loss 0.702 Epoch 75 iteration 0097/0187: training loss 0.703 Epoch 75 iteration 0098/0187: training loss 0.704 Epoch 75 iteration 0099/0187: training loss 0.702 Epoch 75 iteration 0100/0187: training loss 0.704 Epoch 75 iteration 0101/0187: training loss 0.703 Epoch 75 iteration 0102/0187: training loss 0.702 Epoch 75 iteration 0103/0187: training loss 0.702 Epoch 75 iteration 0104/0187: training loss 0.706 Epoch 75 iteration 0105/0187: training loss 0.705 Epoch 75 iteration 0106/0187: training loss 0.706 Epoch 75 iteration 0107/0187: training loss 0.706 Epoch 75 iteration 0108/0187: training loss 0.704 Epoch 75 iteration 0109/0187: training loss 0.702 Epoch 75 iteration 0110/0187: training loss 0.704 Epoch 75 iteration 0111/0187: training loss 0.703 Epoch 75 iteration 0112/0187: training loss 0.702 Epoch 75 iteration 0113/0187: training loss 0.701 Epoch 75 iteration 0114/0187: training loss 0.700 Epoch 75 iteration 0115/0187: training loss 0.701 Epoch 75 iteration 0116/0187: training loss 0.702 Epoch 75 iteration 0117/0187: training loss 0.701 Epoch 75 iteration 0118/0187: training loss 0.703 Epoch 75 iteration 0119/0187: training loss 0.705 Epoch 75 iteration 0120/0187: training loss 0.704 Epoch 75 iteration 0121/0187: training loss 0.704 Epoch 75 iteration 0122/0187: training loss 0.703 Epoch 75 iteration 0123/0187: training loss 0.702 Epoch 75 iteration 0124/0187: training loss 0.702 Epoch 75 iteration 0125/0187: training loss 0.702 Epoch 75 iteration 0126/0187: training loss 0.702 Epoch 75 iteration 0127/0187: training loss 0.701 Epoch 75 iteration 0128/0187: training loss 0.700 Epoch 75 iteration 0129/0187: training loss 0.700 Epoch 75 iteration 0130/0187: training loss 0.699 Epoch 75 iteration 0131/0187: training loss 0.699 Epoch 75 iteration 0132/0187: training loss 0.699 Epoch 75 iteration 0133/0187: training loss 0.698 Epoch 75 iteration 0134/0187: training loss 0.697 Epoch 75 iteration 0135/0187: training loss 0.698 Epoch 75 iteration 0136/0187: training loss 0.699 Epoch 75 iteration 0137/0187: training loss 0.699 Epoch 75 iteration 0138/0187: training loss 0.699 Epoch 75 iteration 0139/0187: training loss 0.699 Epoch 75 iteration 0140/0187: training loss 0.700 Epoch 75 iteration 0141/0187: training loss 0.699 Epoch 75 iteration 0142/0187: training loss 0.699 Epoch 75 iteration 0143/0187: training loss 0.698 Epoch 75 iteration 0144/0187: training loss 0.699 Epoch 75 iteration 0145/0187: training loss 0.699 Epoch 75 iteration 0146/0187: training loss 0.698 Epoch 75 iteration 0147/0187: training loss 0.698 Epoch 75 iteration 0148/0187: training loss 0.697 Epoch 75 iteration 0149/0187: training loss 0.698 Epoch 75 iteration 0150/0187: training loss 0.698 Epoch 75 iteration 0151/0187: training loss 0.697 Epoch 75 iteration 0152/0187: training loss 0.696 Epoch 75 iteration 0153/0187: training loss 0.697 Epoch 75 iteration 0154/0187: training loss 0.696 Epoch 75 iteration 0155/0187: training loss 0.696 Epoch 75 iteration 0156/0187: training loss 0.696 Epoch 75 iteration 0157/0187: training loss 0.695 Epoch 75 iteration 0158/0187: training loss 0.695 Epoch 75 iteration 0159/0187: training loss 0.695 Epoch 75 iteration 0160/0187: training loss 0.695 Epoch 75 iteration 0161/0187: training loss 0.696 Epoch 75 iteration 0162/0187: training loss 0.697 Epoch 75 iteration 0163/0187: training loss 0.696 Epoch 75 iteration 0164/0187: training loss 0.696 Epoch 75 iteration 0165/0187: training loss 0.697 Epoch 75 iteration 0166/0187: training loss 0.697 Epoch 75 iteration 0167/0187: training loss 0.697 Epoch 75 iteration 0168/0187: training loss 0.696 Epoch 75 iteration 0169/0187: training loss 0.697 Epoch 75 iteration 0170/0187: training loss 0.698 Epoch 75 iteration 0171/0187: training loss 0.698 Epoch 75 iteration 0172/0187: training loss 0.698 Epoch 75 iteration 0173/0187: training loss 0.699 Epoch 75 iteration 0174/0187: training loss 0.699 Epoch 75 iteration 0175/0187: training loss 0.699 Epoch 75 iteration 0176/0187: training loss 0.700 Epoch 75 iteration 0177/0187: training loss 0.701 Epoch 75 iteration 0178/0187: training loss 0.701 Epoch 75 iteration 0179/0187: training loss 0.701 Epoch 75 iteration 0180/0187: training loss 0.702 Epoch 75 iteration 0181/0187: training loss 0.702 Epoch 75 iteration 0182/0187: training loss 0.703 Epoch 75 iteration 0183/0187: training loss 0.703 Epoch 75 iteration 0184/0187: training loss 0.703 Epoch 75 iteration 0185/0187: training loss 0.704 Epoch 75 iteration 0186/0187: training loss 0.703 Epoch 75 iteration 0187/0187: training loss 0.704 Epoch 75 validation pixAcc: 0.875, mIoU: 0.388 Epoch 76 iteration 0001/0187: training loss 0.710 Epoch 76 iteration 0002/0187: training loss 0.682 Epoch 76 iteration 0003/0187: training loss 0.641 Epoch 76 iteration 0004/0187: training loss 0.674 Epoch 76 iteration 0005/0187: training loss 0.689 Epoch 76 iteration 0006/0187: training loss 0.687 Epoch 76 iteration 0007/0187: training loss 0.692 Epoch 76 iteration 0008/0187: training loss 0.706 Epoch 76 iteration 0009/0187: training loss 0.714 Epoch 76 iteration 0010/0187: training loss 0.712 Epoch 76 iteration 0011/0187: training loss 0.705 Epoch 76 iteration 0012/0187: training loss 0.726 Epoch 76 iteration 0013/0187: training loss 0.729 Epoch 76 iteration 0014/0187: training loss 0.733 Epoch 76 iteration 0015/0187: training loss 0.724 Epoch 76 iteration 0016/0187: training loss 0.725 Epoch 76 iteration 0017/0187: training loss 0.725 Epoch 76 iteration 0018/0187: training loss 0.728 Epoch 76 iteration 0019/0187: training loss 0.731 Epoch 76 iteration 0020/0187: training loss 0.726 Epoch 76 iteration 0021/0187: training loss 0.729 Epoch 76 iteration 0022/0187: training loss 0.734 Epoch 76 iteration 0023/0187: training loss 0.728 Epoch 76 iteration 0024/0187: training loss 0.727 Epoch 76 iteration 0025/0187: training loss 0.724 Epoch 76 iteration 0026/0187: training loss 0.727 Epoch 76 iteration 0027/0187: training loss 0.725 Epoch 76 iteration 0028/0187: training loss 0.720 Epoch 76 iteration 0029/0187: training loss 0.718 Epoch 76 iteration 0030/0187: training loss 0.717 Epoch 76 iteration 0031/0187: training loss 0.709 Epoch 76 iteration 0032/0187: training loss 0.716 Epoch 76 iteration 0033/0187: training loss 0.719 Epoch 76 iteration 0034/0187: training loss 0.719 Epoch 76 iteration 0035/0187: training loss 0.723 Epoch 76 iteration 0036/0187: training loss 0.723 Epoch 76 iteration 0037/0187: training loss 0.722 Epoch 76 iteration 0038/0187: training loss 0.720 Epoch 76 iteration 0039/0187: training loss 0.726 Epoch 76 iteration 0040/0187: training loss 0.722 Epoch 76 iteration 0041/0187: training loss 0.721 Epoch 76 iteration 0042/0187: training loss 0.721 Epoch 76 iteration 0043/0187: training loss 0.724 Epoch 76 iteration 0044/0187: training loss 0.720 Epoch 76 iteration 0045/0187: training loss 0.724 Epoch 76 iteration 0046/0187: training loss 0.723 Epoch 76 iteration 0047/0187: training loss 0.721 Epoch 76 iteration 0048/0187: training loss 0.718 Epoch 76 iteration 0049/0187: training loss 0.719 Epoch 76 iteration 0050/0187: training loss 0.720 Epoch 76 iteration 0051/0187: training loss 0.719 Epoch 76 iteration 0052/0187: training loss 0.720 Epoch 76 iteration 0053/0187: training loss 0.721 Epoch 76 iteration 0054/0187: training loss 0.722 Epoch 76 iteration 0055/0187: training loss 0.721 Epoch 76 iteration 0056/0187: training loss 0.722 Epoch 76 iteration 0057/0187: training loss 0.720 Epoch 76 iteration 0058/0187: training loss 0.718 Epoch 76 iteration 0059/0187: training loss 0.717 Epoch 76 iteration 0060/0187: training loss 0.715 Epoch 76 iteration 0061/0187: training loss 0.716 Epoch 76 iteration 0062/0187: training loss 0.716 Epoch 76 iteration 0063/0187: training loss 0.715 Epoch 76 iteration 0064/0187: training loss 0.714 Epoch 76 iteration 0065/0187: training loss 0.713 Epoch 76 iteration 0066/0187: training loss 0.712 Epoch 76 iteration 0067/0187: training loss 0.710 Epoch 76 iteration 0068/0187: training loss 0.710 Epoch 76 iteration 0069/0187: training loss 0.708 Epoch 76 iteration 0070/0187: training loss 0.709 Epoch 76 iteration 0071/0187: training loss 0.708 Epoch 76 iteration 0072/0187: training loss 0.708 Epoch 76 iteration 0073/0187: training loss 0.708 Epoch 76 iteration 0074/0187: training loss 0.708 Epoch 76 iteration 0075/0187: training loss 0.709 Epoch 76 iteration 0076/0187: training loss 0.709 Epoch 76 iteration 0077/0187: training loss 0.707 Epoch 76 iteration 0078/0187: training loss 0.707 Epoch 76 iteration 0079/0187: training loss 0.705 Epoch 76 iteration 0080/0187: training loss 0.711 Epoch 76 iteration 0081/0187: training loss 0.708 Epoch 76 iteration 0082/0187: training loss 0.710 Epoch 76 iteration 0083/0187: training loss 0.710 Epoch 76 iteration 0084/0187: training loss 0.709 Epoch 76 iteration 0085/0187: training loss 0.709 Epoch 76 iteration 0086/0187: training loss 0.710 Epoch 76 iteration 0087/0187: training loss 0.711 Epoch 76 iteration 0088/0187: training loss 0.710 Epoch 76 iteration 0089/0187: training loss 0.712 Epoch 76 iteration 0090/0187: training loss 0.712 Epoch 76 iteration 0091/0188: training loss 0.712 Epoch 76 iteration 0092/0188: training loss 0.711 Epoch 76 iteration 0093/0188: training loss 0.710 Epoch 76 iteration 0094/0188: training loss 0.710 Epoch 76 iteration 0095/0188: training loss 0.708 Epoch 76 iteration 0096/0188: training loss 0.708 Epoch 76 iteration 0097/0188: training loss 0.707 Epoch 76 iteration 0098/0188: training loss 0.707 Epoch 76 iteration 0099/0188: training loss 0.707 Epoch 76 iteration 0100/0188: training loss 0.707 Epoch 76 iteration 0101/0188: training loss 0.709 Epoch 76 iteration 0102/0188: training loss 0.708 Epoch 76 iteration 0103/0188: training loss 0.707 Epoch 76 iteration 0104/0188: training loss 0.707 Epoch 76 iteration 0105/0188: training loss 0.706 Epoch 76 iteration 0106/0188: training loss 0.706 Epoch 76 iteration 0107/0188: training loss 0.706 Epoch 76 iteration 0108/0188: training loss 0.705 Epoch 76 iteration 0109/0188: training loss 0.704 Epoch 76 iteration 0110/0188: training loss 0.704 Epoch 76 iteration 0111/0188: training loss 0.704 Epoch 76 iteration 0112/0188: training loss 0.702 Epoch 76 iteration 0113/0188: training loss 0.703 Epoch 76 iteration 0114/0188: training loss 0.704 Epoch 76 iteration 0115/0188: training loss 0.704 Epoch 76 iteration 0116/0188: training loss 0.702 Epoch 76 iteration 0117/0188: training loss 0.702 Epoch 76 iteration 0118/0188: training loss 0.703 Epoch 76 iteration 0119/0188: training loss 0.702 Epoch 76 iteration 0120/0188: training loss 0.702 Epoch 76 iteration 0121/0188: training loss 0.701 Epoch 76 iteration 0122/0188: training loss 0.700 Epoch 76 iteration 0123/0188: training loss 0.702 Epoch 76 iteration 0124/0188: training loss 0.701 Epoch 76 iteration 0125/0188: training loss 0.700 Epoch 76 iteration 0126/0188: training loss 0.701 Epoch 76 iteration 0127/0188: training loss 0.701 Epoch 76 iteration 0128/0188: training loss 0.701 Epoch 76 iteration 0129/0188: training loss 0.701 Epoch 76 iteration 0130/0188: training loss 0.700 Epoch 76 iteration 0131/0188: training loss 0.699 Epoch 76 iteration 0132/0188: training loss 0.700 Epoch 76 iteration 0133/0188: training loss 0.700 Epoch 76 iteration 0134/0188: training loss 0.699 Epoch 76 iteration 0135/0188: training loss 0.699 Epoch 76 iteration 0136/0188: training loss 0.699 Epoch 76 iteration 0137/0188: training loss 0.698 Epoch 76 iteration 0138/0188: training loss 0.699 Epoch 76 iteration 0139/0188: training loss 0.699 Epoch 76 iteration 0140/0188: training loss 0.699 Epoch 76 iteration 0141/0188: training loss 0.698 Epoch 76 iteration 0142/0188: training loss 0.698 Epoch 76 iteration 0143/0188: training loss 0.698 Epoch 76 iteration 0144/0188: training loss 0.697 Epoch 76 iteration 0145/0188: training loss 0.698 Epoch 76 iteration 0146/0188: training loss 0.699 Epoch 76 iteration 0147/0188: training loss 0.699 Epoch 76 iteration 0148/0188: training loss 0.698 Epoch 76 iteration 0149/0188: training loss 0.698 Epoch 76 iteration 0150/0188: training loss 0.698 Epoch 76 iteration 0151/0188: training loss 0.700 Epoch 76 iteration 0152/0188: training loss 0.700 Epoch 76 iteration 0153/0188: training loss 0.701 Epoch 76 iteration 0154/0188: training loss 0.701 Epoch 76 iteration 0155/0188: training loss 0.701 Epoch 76 iteration 0156/0188: training loss 0.701 Epoch 76 iteration 0157/0188: training loss 0.701 Epoch 76 iteration 0158/0188: training loss 0.700 Epoch 76 iteration 0159/0188: training loss 0.700 Epoch 76 iteration 0160/0188: training loss 0.699 Epoch 76 iteration 0161/0188: training loss 0.700 Epoch 76 iteration 0162/0188: training loss 0.699 Epoch 76 iteration 0163/0188: training loss 0.699 Epoch 76 iteration 0164/0188: training loss 0.698 Epoch 76 iteration 0165/0188: training loss 0.699 Epoch 76 iteration 0166/0188: training loss 0.700 Epoch 76 iteration 0167/0188: training loss 0.700 Epoch 76 iteration 0168/0188: training loss 0.700 Epoch 76 iteration 0169/0188: training loss 0.701 Epoch 76 iteration 0170/0188: training loss 0.700 Epoch 76 iteration 0171/0188: training loss 0.700 Epoch 76 iteration 0172/0188: training loss 0.700 Epoch 76 iteration 0173/0188: training loss 0.700 Epoch 76 iteration 0174/0188: training loss 0.700 Epoch 76 iteration 0175/0188: training loss 0.701 Epoch 76 iteration 0176/0188: training loss 0.701 Epoch 76 iteration 0177/0188: training loss 0.701 Epoch 76 iteration 0178/0188: training loss 0.700 Epoch 76 iteration 0179/0188: training loss 0.700 Epoch 76 iteration 0180/0188: training loss 0.700 Epoch 76 iteration 0181/0188: training loss 0.699 Epoch 76 iteration 0182/0188: training loss 0.699 Epoch 76 iteration 0183/0188: training loss 0.699 Epoch 76 iteration 0184/0188: training loss 0.699 Epoch 76 iteration 0185/0188: training loss 0.700 Epoch 76 iteration 0186/0188: training loss 0.699 Epoch 76 validation pixAcc: 0.876, mIoU: 0.387 Epoch 77 iteration 0001/0187: training loss 0.640 Epoch 77 iteration 0002/0187: training loss 0.668 Epoch 77 iteration 0003/0187: training loss 0.693 Epoch 77 iteration 0004/0187: training loss 0.688 Epoch 77 iteration 0005/0187: training loss 0.687 Epoch 77 iteration 0006/0187: training loss 0.667 Epoch 77 iteration 0007/0187: training loss 0.685 Epoch 77 iteration 0008/0187: training loss 0.669 Epoch 77 iteration 0009/0187: training loss 0.685 Epoch 77 iteration 0010/0187: training loss 0.672 Epoch 77 iteration 0011/0187: training loss 0.663 Epoch 77 iteration 0012/0187: training loss 0.670 Epoch 77 iteration 0013/0187: training loss 0.663 Epoch 77 iteration 0014/0187: training loss 0.664 Epoch 77 iteration 0015/0187: training loss 0.669 Epoch 77 iteration 0016/0187: training loss 0.667 Epoch 77 iteration 0017/0187: training loss 0.668 Epoch 77 iteration 0018/0187: training loss 0.670 Epoch 77 iteration 0019/0187: training loss 0.668 Epoch 77 iteration 0020/0187: training loss 0.668 Epoch 77 iteration 0021/0187: training loss 0.667 Epoch 77 iteration 0022/0187: training loss 0.673 Epoch 77 iteration 0023/0187: training loss 0.682 Epoch 77 iteration 0024/0187: training loss 0.682 Epoch 77 iteration 0025/0187: training loss 0.678 Epoch 77 iteration 0026/0187: training loss 0.675 Epoch 77 iteration 0027/0187: training loss 0.675 Epoch 77 iteration 0028/0187: training loss 0.680 Epoch 77 iteration 0029/0187: training loss 0.676 Epoch 77 iteration 0030/0187: training loss 0.677 Epoch 77 iteration 0031/0187: training loss 0.680 Epoch 77 iteration 0032/0187: training loss 0.684 Epoch 77 iteration 0033/0187: training loss 0.680 Epoch 77 iteration 0034/0187: training loss 0.681 Epoch 77 iteration 0035/0187: training loss 0.685 Epoch 77 iteration 0036/0187: training loss 0.687 Epoch 77 iteration 0037/0187: training loss 0.685 Epoch 77 iteration 0038/0187: training loss 0.682 Epoch 77 iteration 0039/0187: training loss 0.680 Epoch 77 iteration 0040/0187: training loss 0.685 Epoch 77 iteration 0041/0187: training loss 0.680 Epoch 77 iteration 0042/0187: training loss 0.681 Epoch 77 iteration 0043/0187: training loss 0.684 Epoch 77 iteration 0044/0187: training loss 0.684 Epoch 77 iteration 0045/0187: training loss 0.684 Epoch 77 iteration 0046/0187: training loss 0.681 Epoch 77 iteration 0047/0187: training loss 0.688 Epoch 77 iteration 0048/0187: training loss 0.691 Epoch 77 iteration 0049/0187: training loss 0.690 Epoch 77 iteration 0050/0187: training loss 0.697 Epoch 77 iteration 0051/0187: training loss 0.697 Epoch 77 iteration 0052/0187: training loss 0.696 Epoch 77 iteration 0053/0187: training loss 0.696 Epoch 77 iteration 0054/0187: training loss 0.696 Epoch 77 iteration 0055/0187: training loss 0.695 Epoch 77 iteration 0056/0187: training loss 0.694 Epoch 77 iteration 0057/0187: training loss 0.694 Epoch 77 iteration 0058/0187: training loss 0.697 Epoch 77 iteration 0059/0187: training loss 0.696 Epoch 77 iteration 0060/0187: training loss 0.694 Epoch 77 iteration 0061/0187: training loss 0.694 Epoch 77 iteration 0062/0187: training loss 0.695 Epoch 77 iteration 0063/0187: training loss 0.693 Epoch 77 iteration 0064/0187: training loss 0.692 Epoch 77 iteration 0065/0187: training loss 0.694 Epoch 77 iteration 0066/0187: training loss 0.693 Epoch 77 iteration 0067/0187: training loss 0.695 Epoch 77 iteration 0068/0187: training loss 0.694 Epoch 77 iteration 0069/0187: training loss 0.695 Epoch 77 iteration 0070/0187: training loss 0.697 Epoch 77 iteration 0071/0187: training loss 0.700 Epoch 77 iteration 0072/0187: training loss 0.699 Epoch 77 iteration 0073/0187: training loss 0.701 Epoch 77 iteration 0074/0187: training loss 0.705 Epoch 77 iteration 0075/0187: training loss 0.715 Epoch 77 iteration 0076/0187: training loss 0.717 Epoch 77 iteration 0077/0187: training loss 0.716 Epoch 77 iteration 0078/0187: training loss 0.714 Epoch 77 iteration 0079/0187: training loss 0.715 Epoch 77 iteration 0080/0187: training loss 0.716 Epoch 77 iteration 0081/0187: training loss 0.716 Epoch 77 iteration 0082/0187: training loss 0.717 Epoch 77 iteration 0083/0187: training loss 0.717 Epoch 77 iteration 0084/0187: training loss 0.719 Epoch 77 iteration 0085/0187: training loss 0.718 Epoch 77 iteration 0086/0187: training loss 0.717 Epoch 77 iteration 0087/0187: training loss 0.717 Epoch 77 iteration 0088/0187: training loss 0.719 Epoch 77 iteration 0089/0187: training loss 0.718 Epoch 77 iteration 0090/0187: training loss 0.718 Epoch 77 iteration 0091/0187: training loss 0.716 Epoch 77 iteration 0092/0187: training loss 0.715 Epoch 77 iteration 0093/0187: training loss 0.715 Epoch 77 iteration 0094/0187: training loss 0.715 Epoch 77 iteration 0095/0187: training loss 0.717 Epoch 77 iteration 0096/0187: training loss 0.716 Epoch 77 iteration 0097/0187: training loss 0.717 Epoch 77 iteration 0098/0187: training loss 0.718 Epoch 77 iteration 0099/0187: training loss 0.716 Epoch 77 iteration 0100/0187: training loss 0.717 Epoch 77 iteration 0101/0187: training loss 0.716 Epoch 77 iteration 0102/0187: training loss 0.717 Epoch 77 iteration 0103/0187: training loss 0.717 Epoch 77 iteration 0104/0187: training loss 0.720 Epoch 77 iteration 0105/0187: training loss 0.720 Epoch 77 iteration 0106/0187: training loss 0.721 Epoch 77 iteration 0107/0187: training loss 0.720 Epoch 77 iteration 0108/0187: training loss 0.719 Epoch 77 iteration 0109/0187: training loss 0.721 Epoch 77 iteration 0110/0187: training loss 0.722 Epoch 77 iteration 0111/0187: training loss 0.721 Epoch 77 iteration 0112/0187: training loss 0.720 Epoch 77 iteration 0113/0187: training loss 0.719 Epoch 77 iteration 0114/0187: training loss 0.720 Epoch 77 iteration 0115/0187: training loss 0.720 Epoch 77 iteration 0116/0187: training loss 0.718 Epoch 77 iteration 0117/0187: training loss 0.719 Epoch 77 iteration 0118/0187: training loss 0.719 Epoch 77 iteration 0119/0187: training loss 0.719 Epoch 77 iteration 0120/0187: training loss 0.718 Epoch 77 iteration 0121/0187: training loss 0.719 Epoch 77 iteration 0122/0187: training loss 0.719 Epoch 77 iteration 0123/0187: training loss 0.719 Epoch 77 iteration 0124/0187: training loss 0.719 Epoch 77 iteration 0125/0187: training loss 0.718 Epoch 77 iteration 0126/0187: training loss 0.719 Epoch 77 iteration 0127/0187: training loss 0.720 Epoch 77 iteration 0128/0187: training loss 0.719 Epoch 77 iteration 0129/0187: training loss 0.719 Epoch 77 iteration 0130/0187: training loss 0.719 Epoch 77 iteration 0131/0187: training loss 0.718 Epoch 77 iteration 0132/0187: training loss 0.719 Epoch 77 iteration 0133/0187: training loss 0.720 Epoch 77 iteration 0134/0187: training loss 0.720 Epoch 77 iteration 0135/0187: training loss 0.718 Epoch 77 iteration 0136/0187: training loss 0.718 Epoch 77 iteration 0137/0187: training loss 0.718 Epoch 77 iteration 0138/0187: training loss 0.718 Epoch 77 iteration 0139/0187: training loss 0.717 Epoch 77 iteration 0140/0187: training loss 0.719 Epoch 77 iteration 0141/0187: training loss 0.718 Epoch 77 iteration 0142/0187: training loss 0.717 Epoch 77 iteration 0143/0187: training loss 0.717 Epoch 77 iteration 0144/0187: training loss 0.717 Epoch 77 iteration 0145/0187: training loss 0.717 Epoch 77 iteration 0146/0187: training loss 0.717 Epoch 77 iteration 0147/0187: training loss 0.717 Epoch 77 iteration 0148/0187: training loss 0.716 Epoch 77 iteration 0149/0187: training loss 0.716 Epoch 77 iteration 0150/0187: training loss 0.716 Epoch 77 iteration 0151/0187: training loss 0.716 Epoch 77 iteration 0152/0187: training loss 0.716 Epoch 77 iteration 0153/0187: training loss 0.715 Epoch 77 iteration 0154/0187: training loss 0.714 Epoch 77 iteration 0155/0187: training loss 0.714 Epoch 77 iteration 0156/0187: training loss 0.713 Epoch 77 iteration 0157/0187: training loss 0.713 Epoch 77 iteration 0158/0187: training loss 0.713 Epoch 77 iteration 0159/0187: training loss 0.712 Epoch 77 iteration 0160/0187: training loss 0.712 Epoch 77 iteration 0161/0187: training loss 0.711 Epoch 77 iteration 0162/0187: training loss 0.711 Epoch 77 iteration 0163/0187: training loss 0.711 Epoch 77 iteration 0164/0187: training loss 0.711 Epoch 77 iteration 0165/0187: training loss 0.710 Epoch 77 iteration 0166/0187: training loss 0.710 Epoch 77 iteration 0167/0187: training loss 0.709 Epoch 77 iteration 0168/0187: training loss 0.708 Epoch 77 iteration 0169/0187: training loss 0.708 Epoch 77 iteration 0170/0187: training loss 0.707 Epoch 77 iteration 0171/0187: training loss 0.708 Epoch 77 iteration 0172/0187: training loss 0.708 Epoch 77 iteration 0173/0187: training loss 0.708 Epoch 77 iteration 0174/0187: training loss 0.707 Epoch 77 iteration 0175/0187: training loss 0.707 Epoch 77 iteration 0176/0187: training loss 0.707 Epoch 77 iteration 0177/0187: training loss 0.707 Epoch 77 iteration 0178/0187: training loss 0.706 Epoch 77 iteration 0179/0187: training loss 0.707 Epoch 77 iteration 0180/0187: training loss 0.707 Epoch 77 iteration 0181/0187: training loss 0.707 Epoch 77 iteration 0182/0187: training loss 0.707 Epoch 77 iteration 0183/0187: training loss 0.707 Epoch 77 iteration 0184/0187: training loss 0.707 Epoch 77 iteration 0185/0187: training loss 0.707 Epoch 77 iteration 0186/0187: training loss 0.708 Epoch 77 iteration 0187/0187: training loss 0.708 Epoch 77 validation pixAcc: 0.875, mIoU: 0.392 Epoch 78 iteration 0001/0187: training loss 0.617 Epoch 78 iteration 0002/0187: training loss 0.666 Epoch 78 iteration 0003/0187: training loss 0.635 Epoch 78 iteration 0004/0187: training loss 0.622 Epoch 78 iteration 0005/0187: training loss 0.611 Epoch 78 iteration 0006/0187: training loss 0.599 Epoch 78 iteration 0007/0187: training loss 0.598 Epoch 78 iteration 0008/0187: training loss 0.627 Epoch 78 iteration 0009/0187: training loss 0.635 Epoch 78 iteration 0010/0187: training loss 0.632 Epoch 78 iteration 0011/0187: training loss 0.632 Epoch 78 iteration 0012/0187: training loss 0.641 Epoch 78 iteration 0013/0187: training loss 0.642 Epoch 78 iteration 0014/0187: training loss 0.652 Epoch 78 iteration 0015/0187: training loss 0.660 Epoch 78 iteration 0016/0187: training loss 0.667 Epoch 78 iteration 0017/0187: training loss 0.694 Epoch 78 iteration 0018/0187: training loss 0.687 Epoch 78 iteration 0019/0187: training loss 0.688 Epoch 78 iteration 0020/0187: training loss 0.685 Epoch 78 iteration 0021/0187: training loss 0.683 Epoch 78 iteration 0022/0187: training loss 0.687 Epoch 78 iteration 0023/0187: training loss 0.685 Epoch 78 iteration 0024/0187: training loss 0.679 Epoch 78 iteration 0025/0187: training loss 0.681 Epoch 78 iteration 0026/0187: training loss 0.679 Epoch 78 iteration 0027/0187: training loss 0.681 Epoch 78 iteration 0028/0187: training loss 0.682 Epoch 78 iteration 0029/0187: training loss 0.681 Epoch 78 iteration 0030/0187: training loss 0.684 Epoch 78 iteration 0031/0187: training loss 0.686 Epoch 78 iteration 0032/0187: training loss 0.690 Epoch 78 iteration 0033/0187: training loss 0.689 Epoch 78 iteration 0034/0187: training loss 0.696 Epoch 78 iteration 0035/0187: training loss 0.696 Epoch 78 iteration 0036/0187: training loss 0.696 Epoch 78 iteration 0037/0187: training loss 0.698 Epoch 78 iteration 0038/0187: training loss 0.698 Epoch 78 iteration 0039/0187: training loss 0.701 Epoch 78 iteration 0040/0187: training loss 0.699 Epoch 78 iteration 0041/0187: training loss 0.702 Epoch 78 iteration 0042/0187: training loss 0.704 Epoch 78 iteration 0043/0187: training loss 0.706 Epoch 78 iteration 0044/0187: training loss 0.704 Epoch 78 iteration 0045/0187: training loss 0.704 Epoch 78 iteration 0046/0187: training loss 0.702 Epoch 78 iteration 0047/0187: training loss 0.703 Epoch 78 iteration 0048/0187: training loss 0.701 Epoch 78 iteration 0049/0187: training loss 0.701 Epoch 78 iteration 0050/0187: training loss 0.700 Epoch 78 iteration 0051/0187: training loss 0.701 Epoch 78 iteration 0052/0187: training loss 0.701 Epoch 78 iteration 0053/0187: training loss 0.701 Epoch 78 iteration 0054/0187: training loss 0.701 Epoch 78 iteration 0055/0187: training loss 0.704 Epoch 78 iteration 0056/0187: training loss 0.706 Epoch 78 iteration 0057/0187: training loss 0.704 Epoch 78 iteration 0058/0187: training loss 0.700 Epoch 78 iteration 0059/0187: training loss 0.700 Epoch 78 iteration 0060/0187: training loss 0.700 Epoch 78 iteration 0061/0187: training loss 0.701 Epoch 78 iteration 0062/0187: training loss 0.702 Epoch 78 iteration 0063/0187: training loss 0.703 Epoch 78 iteration 0064/0187: training loss 0.701 Epoch 78 iteration 0065/0187: training loss 0.698 Epoch 78 iteration 0066/0187: training loss 0.698 Epoch 78 iteration 0067/0187: training loss 0.699 Epoch 78 iteration 0068/0187: training loss 0.697 Epoch 78 iteration 0069/0187: training loss 0.698 Epoch 78 iteration 0070/0187: training loss 0.697 Epoch 78 iteration 0071/0187: training loss 0.697 Epoch 78 iteration 0072/0187: training loss 0.698 Epoch 78 iteration 0073/0187: training loss 0.698 Epoch 78 iteration 0074/0187: training loss 0.700 Epoch 78 iteration 0075/0187: training loss 0.701 Epoch 78 iteration 0076/0187: training loss 0.702 Epoch 78 iteration 0077/0187: training loss 0.702 Epoch 78 iteration 0078/0187: training loss 0.703 Epoch 78 iteration 0079/0187: training loss 0.704 Epoch 78 iteration 0080/0187: training loss 0.703 Epoch 78 iteration 0081/0187: training loss 0.704 Epoch 78 iteration 0082/0187: training loss 0.705 Epoch 78 iteration 0083/0187: training loss 0.706 Epoch 78 iteration 0084/0187: training loss 0.708 Epoch 78 iteration 0085/0187: training loss 0.708 Epoch 78 iteration 0086/0187: training loss 0.706 Epoch 78 iteration 0087/0187: training loss 0.708 Epoch 78 iteration 0088/0187: training loss 0.707 Epoch 78 iteration 0089/0187: training loss 0.709 Epoch 78 iteration 0090/0187: training loss 0.708 Epoch 78 iteration 0091/0188: training loss 0.707 Epoch 78 iteration 0092/0188: training loss 0.708 Epoch 78 iteration 0093/0188: training loss 0.708 Epoch 78 iteration 0094/0188: training loss 0.707 Epoch 78 iteration 0095/0188: training loss 0.707 Epoch 78 iteration 0096/0188: training loss 0.707 Epoch 78 iteration 0097/0188: training loss 0.706 Epoch 78 iteration 0098/0188: training loss 0.707 Epoch 78 iteration 0099/0188: training loss 0.708 Epoch 78 iteration 0100/0188: training loss 0.707 Epoch 78 iteration 0101/0188: training loss 0.708 Epoch 78 iteration 0102/0188: training loss 0.709 Epoch 78 iteration 0103/0188: training loss 0.709 Epoch 78 iteration 0104/0188: training loss 0.707 Epoch 78 iteration 0105/0188: training loss 0.707 Epoch 78 iteration 0106/0188: training loss 0.708 Epoch 78 iteration 0107/0188: training loss 0.707 Epoch 78 iteration 0108/0188: training loss 0.708 Epoch 78 iteration 0109/0188: training loss 0.709 Epoch 78 iteration 0110/0188: training loss 0.708 Epoch 78 iteration 0111/0188: training loss 0.708 Epoch 78 iteration 0112/0188: training loss 0.707 Epoch 78 iteration 0113/0188: training loss 0.707 Epoch 78 iteration 0114/0188: training loss 0.708 Epoch 78 iteration 0115/0188: training loss 0.708 Epoch 78 iteration 0116/0188: training loss 0.708 Epoch 78 iteration 0117/0188: training loss 0.708 Epoch 78 iteration 0118/0188: training loss 0.709 Epoch 78 iteration 0119/0188: training loss 0.708 Epoch 78 iteration 0120/0188: training loss 0.708 Epoch 78 iteration 0121/0188: training loss 0.710 Epoch 78 iteration 0122/0188: training loss 0.709 Epoch 78 iteration 0123/0188: training loss 0.708 Epoch 78 iteration 0124/0188: training loss 0.707 Epoch 78 iteration 0125/0188: training loss 0.708 Epoch 78 iteration 0126/0188: training loss 0.708 Epoch 78 iteration 0127/0188: training loss 0.706 Epoch 78 iteration 0128/0188: training loss 0.706 Epoch 78 iteration 0129/0188: training loss 0.704 Epoch 78 iteration 0130/0188: training loss 0.704 Epoch 78 iteration 0131/0188: training loss 0.705 Epoch 78 iteration 0132/0188: training loss 0.706 Epoch 78 iteration 0133/0188: training loss 0.706 Epoch 78 iteration 0134/0188: training loss 0.705 Epoch 78 iteration 0135/0188: training loss 0.705 Epoch 78 iteration 0136/0188: training loss 0.704 Epoch 78 iteration 0137/0188: training loss 0.702 Epoch 78 iteration 0138/0188: training loss 0.702 Epoch 78 iteration 0139/0188: training loss 0.702 Epoch 78 iteration 0140/0188: training loss 0.702 Epoch 78 iteration 0141/0188: training loss 0.704 Epoch 78 iteration 0142/0188: training loss 0.704 Epoch 78 iteration 0143/0188: training loss 0.704 Epoch 78 iteration 0144/0188: training loss 0.706 Epoch 78 iteration 0145/0188: training loss 0.705 Epoch 78 iteration 0146/0188: training loss 0.705 Epoch 78 iteration 0147/0188: training loss 0.705 Epoch 78 iteration 0148/0188: training loss 0.705 Epoch 78 iteration 0149/0188: training loss 0.705 Epoch 78 iteration 0150/0188: training loss 0.705 Epoch 78 iteration 0151/0188: training loss 0.704 Epoch 78 iteration 0152/0188: training loss 0.705 Epoch 78 iteration 0153/0188: training loss 0.705 Epoch 78 iteration 0154/0188: training loss 0.705 Epoch 78 iteration 0155/0188: training loss 0.706 Epoch 78 iteration 0156/0188: training loss 0.705 Epoch 78 iteration 0157/0188: training loss 0.705 Epoch 78 iteration 0158/0188: training loss 0.705 Epoch 78 iteration 0159/0188: training loss 0.704 Epoch 78 iteration 0160/0188: training loss 0.704 Epoch 78 iteration 0161/0188: training loss 0.704 Epoch 78 iteration 0162/0188: training loss 0.703 Epoch 78 iteration 0163/0188: training loss 0.703 Epoch 78 iteration 0164/0188: training loss 0.703 Epoch 78 iteration 0165/0188: training loss 0.703 Epoch 78 iteration 0166/0188: training loss 0.702 Epoch 78 iteration 0167/0188: training loss 0.702 Epoch 78 iteration 0168/0188: training loss 0.703 Epoch 78 iteration 0169/0188: training loss 0.704 Epoch 78 iteration 0170/0188: training loss 0.702 Epoch 78 iteration 0171/0188: training loss 0.702 Epoch 78 iteration 0172/0188: training loss 0.702 Epoch 78 iteration 0173/0188: training loss 0.703 Epoch 78 iteration 0174/0188: training loss 0.703 Epoch 78 iteration 0175/0188: training loss 0.702 Epoch 78 iteration 0176/0188: training loss 0.702 Epoch 78 iteration 0177/0188: training loss 0.703 Epoch 78 iteration 0178/0188: training loss 0.702 Epoch 78 iteration 0179/0188: training loss 0.701 Epoch 78 iteration 0180/0188: training loss 0.702 Epoch 78 iteration 0181/0188: training loss 0.701 Epoch 78 iteration 0182/0188: training loss 0.700 Epoch 78 iteration 0183/0188: training loss 0.701 Epoch 78 iteration 0184/0188: training loss 0.701 Epoch 78 iteration 0185/0188: training loss 0.701 Epoch 78 iteration 0186/0188: training loss 0.701 Epoch 78 validation pixAcc: 0.875, mIoU: 0.390 Epoch 79 iteration 0001/0187: training loss 0.659 Epoch 79 iteration 0002/0187: training loss 0.708 Epoch 79 iteration 0003/0187: training loss 0.752 Epoch 79 iteration 0004/0187: training loss 0.727 Epoch 79 iteration 0005/0187: training loss 0.699 Epoch 79 iteration 0006/0187: training loss 0.696 Epoch 79 iteration 0007/0187: training loss 0.706 Epoch 79 iteration 0008/0187: training loss 0.691 Epoch 79 iteration 0009/0187: training loss 0.705 Epoch 79 iteration 0010/0187: training loss 0.713 Epoch 79 iteration 0011/0187: training loss 0.714 Epoch 79 iteration 0012/0187: training loss 0.722 Epoch 79 iteration 0013/0187: training loss 0.725 Epoch 79 iteration 0014/0187: training loss 0.713 Epoch 79 iteration 0015/0187: training loss 0.709 Epoch 79 iteration 0016/0187: training loss 0.703 Epoch 79 iteration 0017/0187: training loss 0.702 Epoch 79 iteration 0018/0187: training loss 0.706 Epoch 79 iteration 0019/0187: training loss 0.701 Epoch 79 iteration 0020/0187: training loss 0.696 Epoch 79 iteration 0021/0187: training loss 0.698 Epoch 79 iteration 0022/0187: training loss 0.706 Epoch 79 iteration 0023/0187: training loss 0.713 Epoch 79 iteration 0024/0187: training loss 0.719 Epoch 79 iteration 0025/0187: training loss 0.723 Epoch 79 iteration 0026/0187: training loss 0.719 Epoch 79 iteration 0027/0187: training loss 0.712 Epoch 79 iteration 0028/0187: training loss 0.711 Epoch 79 iteration 0029/0187: training loss 0.706 Epoch 79 iteration 0030/0187: training loss 0.705 Epoch 79 iteration 0031/0187: training loss 0.702 Epoch 79 iteration 0032/0187: training loss 0.696 Epoch 79 iteration 0033/0187: training loss 0.694 Epoch 79 iteration 0034/0187: training loss 0.692 Epoch 79 iteration 0035/0187: training loss 0.690 Epoch 79 iteration 0036/0187: training loss 0.693 Epoch 79 iteration 0037/0187: training loss 0.698 Epoch 79 iteration 0038/0187: training loss 0.705 Epoch 79 iteration 0039/0187: training loss 0.704 Epoch 79 iteration 0040/0187: training loss 0.702 Epoch 79 iteration 0041/0187: training loss 0.700 Epoch 79 iteration 0042/0187: training loss 0.702 Epoch 79 iteration 0043/0187: training loss 0.702 Epoch 79 iteration 0044/0187: training loss 0.705 Epoch 79 iteration 0045/0187: training loss 0.705 Epoch 79 iteration 0046/0187: training loss 0.704 Epoch 79 iteration 0047/0187: training loss 0.700 Epoch 79 iteration 0048/0187: training loss 0.698 Epoch 79 iteration 0049/0187: training loss 0.694 Epoch 79 iteration 0050/0187: training loss 0.695 Epoch 79 iteration 0051/0187: training loss 0.693 Epoch 79 iteration 0052/0187: training loss 0.693 Epoch 79 iteration 0053/0187: training loss 0.692 Epoch 79 iteration 0054/0187: training loss 0.691 Epoch 79 iteration 0055/0187: training loss 0.690 Epoch 79 iteration 0056/0187: training loss 0.691 Epoch 79 iteration 0057/0187: training loss 0.691 Epoch 79 iteration 0058/0187: training loss 0.690 Epoch 79 iteration 0059/0187: training loss 0.690 Epoch 79 iteration 0060/0187: training loss 0.691 Epoch 79 iteration 0061/0187: training loss 0.696 Epoch 79 iteration 0062/0187: training loss 0.696 Epoch 79 iteration 0063/0187: training loss 0.696 Epoch 79 iteration 0064/0187: training loss 0.697 Epoch 79 iteration 0065/0187: training loss 0.697 Epoch 79 iteration 0066/0187: training loss 0.698 Epoch 79 iteration 0067/0187: training loss 0.703 Epoch 79 iteration 0068/0187: training loss 0.703 Epoch 79 iteration 0069/0187: training loss 0.703 Epoch 79 iteration 0070/0187: training loss 0.701 Epoch 79 iteration 0071/0187: training loss 0.706 Epoch 79 iteration 0072/0187: training loss 0.706 Epoch 79 iteration 0073/0187: training loss 0.705 Epoch 79 iteration 0074/0187: training loss 0.704 Epoch 79 iteration 0075/0187: training loss 0.706 Epoch 79 iteration 0076/0187: training loss 0.704 Epoch 79 iteration 0077/0187: training loss 0.705 Epoch 79 iteration 0078/0187: training loss 0.703 Epoch 79 iteration 0079/0187: training loss 0.703 Epoch 79 iteration 0080/0187: training loss 0.703 Epoch 79 iteration 0081/0187: training loss 0.701 Epoch 79 iteration 0082/0187: training loss 0.700 Epoch 79 iteration 0083/0187: training loss 0.701 Epoch 79 iteration 0084/0187: training loss 0.701 Epoch 79 iteration 0085/0187: training loss 0.701 Epoch 79 iteration 0086/0187: training loss 0.701 Epoch 79 iteration 0087/0187: training loss 0.702 Epoch 79 iteration 0088/0187: training loss 0.703 Epoch 79 iteration 0089/0187: training loss 0.704 Epoch 79 iteration 0090/0187: training loss 0.704 Epoch 79 iteration 0091/0187: training loss 0.703 Epoch 79 iteration 0092/0187: training loss 0.703 Epoch 79 iteration 0093/0187: training loss 0.706 Epoch 79 iteration 0094/0187: training loss 0.704 Epoch 79 iteration 0095/0187: training loss 0.703 Epoch 79 iteration 0096/0187: training loss 0.705 Epoch 79 iteration 0097/0187: training loss 0.705 Epoch 79 iteration 0098/0187: training loss 0.705 Epoch 79 iteration 0099/0187: training loss 0.706 Epoch 79 iteration 0100/0187: training loss 0.704 Epoch 79 iteration 0101/0187: training loss 0.705 Epoch 79 iteration 0102/0187: training loss 0.703 Epoch 79 iteration 0103/0187: training loss 0.704 Epoch 79 iteration 0104/0187: training loss 0.703 Epoch 79 iteration 0105/0187: training loss 0.704 Epoch 79 iteration 0106/0187: training loss 0.703 Epoch 79 iteration 0107/0187: training loss 0.703 Epoch 79 iteration 0108/0187: training loss 0.703 Epoch 79 iteration 0109/0187: training loss 0.704 Epoch 79 iteration 0110/0187: training loss 0.704 Epoch 79 iteration 0111/0187: training loss 0.704 Epoch 79 iteration 0112/0187: training loss 0.704 Epoch 79 iteration 0113/0187: training loss 0.703 Epoch 79 iteration 0114/0187: training loss 0.707 Epoch 79 iteration 0115/0187: training loss 0.706 Epoch 79 iteration 0116/0187: training loss 0.707 Epoch 79 iteration 0117/0187: training loss 0.708 Epoch 79 iteration 0118/0187: training loss 0.709 Epoch 79 iteration 0119/0187: training loss 0.709 Epoch 79 iteration 0120/0187: training loss 0.709 Epoch 79 iteration 0121/0187: training loss 0.710 Epoch 79 iteration 0122/0187: training loss 0.712 Epoch 79 iteration 0123/0187: training loss 0.713 Epoch 79 iteration 0124/0187: training loss 0.713 Epoch 79 iteration 0125/0187: training loss 0.712 Epoch 79 iteration 0126/0187: training loss 0.713 Epoch 79 iteration 0127/0187: training loss 0.714 Epoch 79 iteration 0128/0187: training loss 0.713 Epoch 79 iteration 0129/0187: training loss 0.714 Epoch 79 iteration 0130/0187: training loss 0.714 Epoch 79 iteration 0131/0187: training loss 0.713 Epoch 79 iteration 0132/0187: training loss 0.713 Epoch 79 iteration 0133/0187: training loss 0.713 Epoch 79 iteration 0134/0187: training loss 0.712 Epoch 79 iteration 0135/0187: training loss 0.712 Epoch 79 iteration 0136/0187: training loss 0.711 Epoch 79 iteration 0137/0187: training loss 0.710 Epoch 79 iteration 0138/0187: training loss 0.710 Epoch 79 iteration 0139/0187: training loss 0.710 Epoch 79 iteration 0140/0187: training loss 0.711 Epoch 79 iteration 0141/0187: training loss 0.711 Epoch 79 iteration 0142/0187: training loss 0.711 Epoch 79 iteration 0143/0187: training loss 0.711 Epoch 79 iteration 0144/0187: training loss 0.711 Epoch 79 iteration 0145/0187: training loss 0.710 Epoch 79 iteration 0146/0187: training loss 0.711 Epoch 79 iteration 0147/0187: training loss 0.710 Epoch 79 iteration 0148/0187: training loss 0.712 Epoch 79 iteration 0149/0187: training loss 0.712 Epoch 79 iteration 0150/0187: training loss 0.712 Epoch 79 iteration 0151/0187: training loss 0.711 Epoch 79 iteration 0152/0187: training loss 0.711 Epoch 79 iteration 0153/0187: training loss 0.712 Epoch 79 iteration 0154/0187: training loss 0.713 Epoch 79 iteration 0155/0187: training loss 0.713 Epoch 79 iteration 0156/0187: training loss 0.713 Epoch 79 iteration 0157/0187: training loss 0.714 Epoch 79 iteration 0158/0187: training loss 0.714 Epoch 79 iteration 0159/0187: training loss 0.714 Epoch 79 iteration 0160/0187: training loss 0.715 Epoch 79 iteration 0161/0187: training loss 0.715 Epoch 79 iteration 0162/0187: training loss 0.716 Epoch 79 iteration 0163/0187: training loss 0.716 Epoch 79 iteration 0164/0187: training loss 0.716 Epoch 79 iteration 0165/0187: training loss 0.716 Epoch 79 iteration 0166/0187: training loss 0.717 Epoch 79 iteration 0167/0187: training loss 0.717 Epoch 79 iteration 0168/0187: training loss 0.716 Epoch 79 iteration 0169/0187: training loss 0.716 Epoch 79 iteration 0170/0187: training loss 0.715 Epoch 79 iteration 0171/0187: training loss 0.715 Epoch 79 iteration 0172/0187: training loss 0.715 Epoch 79 iteration 0173/0187: training loss 0.714 Epoch 79 iteration 0174/0187: training loss 0.714 Epoch 79 iteration 0175/0187: training loss 0.714 Epoch 79 iteration 0176/0187: training loss 0.713 Epoch 79 iteration 0177/0187: training loss 0.713 Epoch 79 iteration 0178/0187: training loss 0.712 Epoch 79 iteration 0179/0187: training loss 0.712 Epoch 79 iteration 0180/0187: training loss 0.711 Epoch 79 iteration 0181/0187: training loss 0.711 Epoch 79 iteration 0182/0187: training loss 0.710 Epoch 79 iteration 0183/0187: training loss 0.710 Epoch 79 iteration 0184/0187: training loss 0.710 Epoch 79 iteration 0185/0187: training loss 0.710 Epoch 79 iteration 0186/0187: training loss 0.709 Epoch 79 iteration 0187/0187: training loss 0.710 Epoch 79 validation pixAcc: 0.875, mIoU: 0.391 Epoch 80 iteration 0001/0187: training loss 0.829 Epoch 80 iteration 0002/0187: training loss 0.776 Epoch 80 iteration 0003/0187: training loss 0.720 Epoch 80 iteration 0004/0187: training loss 0.724 Epoch 80 iteration 0005/0187: training loss 0.724 Epoch 80 iteration 0006/0187: training loss 0.711 Epoch 80 iteration 0007/0187: training loss 0.702 Epoch 80 iteration 0008/0187: training loss 0.703 Epoch 80 iteration 0009/0187: training loss 0.694 Epoch 80 iteration 0010/0187: training loss 0.683 Epoch 80 iteration 0011/0187: training loss 0.669 Epoch 80 iteration 0012/0187: training loss 0.662 Epoch 80 iteration 0013/0187: training loss 0.680 Epoch 80 iteration 0014/0187: training loss 0.682 Epoch 80 iteration 0015/0187: training loss 0.686 Epoch 80 iteration 0016/0187: training loss 0.683 Epoch 80 iteration 0017/0187: training loss 0.689 Epoch 80 iteration 0018/0187: training loss 0.689 Epoch 80 iteration 0019/0187: training loss 0.690 Epoch 80 iteration 0020/0187: training loss 0.696 Epoch 80 iteration 0021/0187: training loss 0.697 Epoch 80 iteration 0022/0187: training loss 0.702 Epoch 80 iteration 0023/0187: training loss 0.700 Epoch 80 iteration 0024/0187: training loss 0.697 Epoch 80 iteration 0025/0187: training loss 0.703 Epoch 80 iteration 0026/0187: training loss 0.702 Epoch 80 iteration 0027/0187: training loss 0.703 Epoch 80 iteration 0028/0187: training loss 0.699 Epoch 80 iteration 0029/0187: training loss 0.702 Epoch 80 iteration 0030/0187: training loss 0.707 Epoch 80 iteration 0031/0187: training loss 0.706 Epoch 80 iteration 0032/0187: training loss 0.709 Epoch 80 iteration 0033/0187: training loss 0.707 Epoch 80 iteration 0034/0187: training loss 0.706 Epoch 80 iteration 0035/0187: training loss 0.707 Epoch 80 iteration 0036/0187: training loss 0.705 Epoch 80 iteration 0037/0187: training loss 0.705 Epoch 80 iteration 0038/0187: training loss 0.702 Epoch 80 iteration 0039/0187: training loss 0.703 Epoch 80 iteration 0040/0187: training loss 0.706 Epoch 80 iteration 0041/0187: training loss 0.703 Epoch 80 iteration 0042/0187: training loss 0.701 Epoch 80 iteration 0043/0187: training loss 0.700 Epoch 80 iteration 0044/0187: training loss 0.701 Epoch 80 iteration 0045/0187: training loss 0.697 Epoch 80 iteration 0046/0187: training loss 0.696 Epoch 80 iteration 0047/0187: training loss 0.697 Epoch 80 iteration 0048/0187: training loss 0.696 Epoch 80 iteration 0049/0187: training loss 0.692 Epoch 80 iteration 0050/0187: training loss 0.690 Epoch 80 iteration 0051/0187: training loss 0.690 Epoch 80 iteration 0052/0187: training loss 0.692 Epoch 80 iteration 0053/0187: training loss 0.690 Epoch 80 iteration 0054/0187: training loss 0.688 Epoch 80 iteration 0055/0187: training loss 0.688 Epoch 80 iteration 0056/0187: training loss 0.690 Epoch 80 iteration 0057/0187: training loss 0.690 Epoch 80 iteration 0058/0187: training loss 0.690 Epoch 80 iteration 0059/0187: training loss 0.690 Epoch 80 iteration 0060/0187: training loss 0.689 Epoch 80 iteration 0061/0187: training loss 0.687 Epoch 80 iteration 0062/0187: training loss 0.686 Epoch 80 iteration 0063/0187: training loss 0.686 Epoch 80 iteration 0064/0187: training loss 0.686 Epoch 80 iteration 0065/0187: training loss 0.686 Epoch 80 iteration 0066/0187: training loss 0.688 Epoch 80 iteration 0067/0187: training loss 0.689 Epoch 80 iteration 0068/0187: training loss 0.685 Epoch 80 iteration 0069/0187: training loss 0.683 Epoch 80 iteration 0070/0187: training loss 0.683 Epoch 80 iteration 0071/0187: training loss 0.685 Epoch 80 iteration 0072/0187: training loss 0.685 Epoch 80 iteration 0073/0187: training loss 0.685 Epoch 80 iteration 0074/0187: training loss 0.685 Epoch 80 iteration 0075/0187: training loss 0.685 Epoch 80 iteration 0076/0187: training loss 0.684 Epoch 80 iteration 0077/0187: training loss 0.683 Epoch 80 iteration 0078/0187: training loss 0.682 Epoch 80 iteration 0079/0187: training loss 0.682 Epoch 80 iteration 0080/0187: training loss 0.682 Epoch 80 iteration 0081/0187: training loss 0.682 Epoch 80 iteration 0082/0187: training loss 0.682 Epoch 80 iteration 0083/0187: training loss 0.682 Epoch 80 iteration 0084/0187: training loss 0.683 Epoch 80 iteration 0085/0187: training loss 0.683 Epoch 80 iteration 0086/0187: training loss 0.683 Epoch 80 iteration 0087/0187: training loss 0.684 Epoch 80 iteration 0088/0187: training loss 0.687 Epoch 80 iteration 0089/0187: training loss 0.686 Epoch 80 iteration 0090/0187: training loss 0.686 Epoch 80 iteration 0091/0188: training loss 0.686 Epoch 80 iteration 0092/0188: training loss 0.686 Epoch 80 iteration 0093/0188: training loss 0.687 Epoch 80 iteration 0094/0188: training loss 0.687 Epoch 80 iteration 0095/0188: training loss 0.689 Epoch 80 iteration 0096/0188: training loss 0.687 Epoch 80 iteration 0097/0188: training loss 0.689 Epoch 80 iteration 0098/0188: training loss 0.687 Epoch 80 iteration 0099/0188: training loss 0.687 Epoch 80 iteration 0100/0188: training loss 0.687 Epoch 80 iteration 0101/0188: training loss 0.685 Epoch 80 iteration 0102/0188: training loss 0.685 Epoch 80 iteration 0103/0188: training loss 0.686 Epoch 80 iteration 0104/0188: training loss 0.686 Epoch 80 iteration 0105/0188: training loss 0.685 Epoch 80 iteration 0106/0188: training loss 0.684 Epoch 80 iteration 0107/0188: training loss 0.684 Epoch 80 iteration 0108/0188: training loss 0.684 Epoch 80 iteration 0109/0188: training loss 0.683 Epoch 80 iteration 0110/0188: training loss 0.681 Epoch 80 iteration 0111/0188: training loss 0.682 Epoch 80 iteration 0112/0188: training loss 0.683 Epoch 80 iteration 0113/0188: training loss 0.681 Epoch 80 iteration 0114/0188: training loss 0.682 Epoch 80 iteration 0115/0188: training loss 0.682 Epoch 80 iteration 0116/0188: training loss 0.682 Epoch 80 iteration 0117/0188: training loss 0.683 Epoch 80 iteration 0118/0188: training loss 0.683 Epoch 80 iteration 0119/0188: training loss 0.683 Epoch 80 iteration 0120/0188: training loss 0.684 Epoch 80 iteration 0121/0188: training loss 0.685 Epoch 80 iteration 0122/0188: training loss 0.684 Epoch 80 iteration 0123/0188: training loss 0.686 Epoch 80 iteration 0124/0188: training loss 0.686 Epoch 80 iteration 0125/0188: training loss 0.686 Epoch 80 iteration 0126/0188: training loss 0.686 Epoch 80 iteration 0127/0188: training loss 0.685 Epoch 80 iteration 0128/0188: training loss 0.686 Epoch 80 iteration 0129/0188: training loss 0.686 Epoch 80 iteration 0130/0188: training loss 0.686 Epoch 80 iteration 0131/0188: training loss 0.686 Epoch 80 iteration 0132/0188: training loss 0.685 Epoch 80 iteration 0133/0188: training loss 0.684 Epoch 80 iteration 0134/0188: training loss 0.684 Epoch 80 iteration 0135/0188: training loss 0.683 Epoch 80 iteration 0136/0188: training loss 0.683 Epoch 80 iteration 0137/0188: training loss 0.685 Epoch 80 iteration 0138/0188: training loss 0.684 Epoch 80 iteration 0139/0188: training loss 0.683 Epoch 80 iteration 0140/0188: training loss 0.683 Epoch 80 iteration 0141/0188: training loss 0.684 Epoch 80 iteration 0142/0188: training loss 0.683 Epoch 80 iteration 0143/0188: training loss 0.684 Epoch 80 iteration 0144/0188: training loss 0.684 Epoch 80 iteration 0145/0188: training loss 0.684 Epoch 80 iteration 0146/0188: training loss 0.684 Epoch 80 iteration 0147/0188: training loss 0.684 Epoch 80 iteration 0148/0188: training loss 0.684 Epoch 80 iteration 0149/0188: training loss 0.684 Epoch 80 iteration 0150/0188: training loss 0.685 Epoch 80 iteration 0151/0188: training loss 0.686 Epoch 80 iteration 0152/0188: training loss 0.687 Epoch 80 iteration 0153/0188: training loss 0.687 Epoch 80 iteration 0154/0188: training loss 0.687 Epoch 80 iteration 0155/0188: training loss 0.687 Epoch 80 iteration 0156/0188: training loss 0.687 Epoch 80 iteration 0157/0188: training loss 0.685 Epoch 80 iteration 0158/0188: training loss 0.686 Epoch 80 iteration 0159/0188: training loss 0.685 Epoch 80 iteration 0160/0188: training loss 0.685 Epoch 80 iteration 0161/0188: training loss 0.685 Epoch 80 iteration 0162/0188: training loss 0.686 Epoch 80 iteration 0163/0188: training loss 0.685 Epoch 80 iteration 0164/0188: training loss 0.684 Epoch 80 iteration 0165/0188: training loss 0.683 Epoch 80 iteration 0166/0188: training loss 0.683 Epoch 80 iteration 0167/0188: training loss 0.684 Epoch 80 iteration 0168/0188: training loss 0.685 Epoch 80 iteration 0169/0188: training loss 0.685 Epoch 80 iteration 0170/0188: training loss 0.685 Epoch 80 iteration 0171/0188: training loss 0.685 Epoch 80 iteration 0172/0188: training loss 0.686 Epoch 80 iteration 0173/0188: training loss 0.685 Epoch 80 iteration 0174/0188: training loss 0.687 Epoch 80 iteration 0175/0188: training loss 0.687 Epoch 80 iteration 0176/0188: training loss 0.687 Epoch 80 iteration 0177/0188: training loss 0.688 Epoch 80 iteration 0178/0188: training loss 0.687 Epoch 80 iteration 0179/0188: training loss 0.687 Epoch 80 iteration 0180/0188: training loss 0.687 Epoch 80 iteration 0181/0188: training loss 0.688 Epoch 80 iteration 0182/0188: training loss 0.687 Epoch 80 iteration 0183/0188: training loss 0.689 Epoch 80 iteration 0184/0188: training loss 0.689 Epoch 80 iteration 0185/0188: training loss 0.691 Epoch 80 iteration 0186/0188: training loss 0.692 Epoch 80 validation pixAcc: 0.874, mIoU: 0.389 Epoch 81 iteration 0001/0187: training loss 0.746 Epoch 81 iteration 0002/0187: training loss 0.719 Epoch 81 iteration 0003/0187: training loss 0.712 Epoch 81 iteration 0004/0187: training loss 0.693 Epoch 81 iteration 0005/0187: training loss 0.680 Epoch 81 iteration 0006/0187: training loss 0.672 Epoch 81 iteration 0007/0187: training loss 0.680 Epoch 81 iteration 0008/0187: training loss 0.679 Epoch 81 iteration 0009/0187: training loss 0.723 Epoch 81 iteration 0010/0187: training loss 0.716 Epoch 81 iteration 0011/0187: training loss 0.725 Epoch 81 iteration 0012/0187: training loss 0.722 Epoch 81 iteration 0013/0187: training loss 0.720 Epoch 81 iteration 0014/0187: training loss 0.724 Epoch 81 iteration 0015/0187: training loss 0.725 Epoch 81 iteration 0016/0187: training loss 0.720 Epoch 81 iteration 0017/0187: training loss 0.709 Epoch 81 iteration 0018/0187: training loss 0.711 Epoch 81 iteration 0019/0187: training loss 0.719 Epoch 81 iteration 0020/0187: training loss 0.719 Epoch 81 iteration 0021/0187: training loss 0.723 Epoch 81 iteration 0022/0187: training loss 0.720 Epoch 81 iteration 0023/0187: training loss 0.718 Epoch 81 iteration 0024/0187: training loss 0.716 Epoch 81 iteration 0025/0187: training loss 0.711 Epoch 81 iteration 0026/0187: training loss 0.710 Epoch 81 iteration 0027/0187: training loss 0.712 Epoch 81 iteration 0028/0187: training loss 0.709 Epoch 81 iteration 0029/0187: training loss 0.706 Epoch 81 iteration 0030/0187: training loss 0.706 Epoch 81 iteration 0031/0187: training loss 0.704 Epoch 81 iteration 0032/0187: training loss 0.700 Epoch 81 iteration 0033/0187: training loss 0.698 Epoch 81 iteration 0034/0187: training loss 0.693 Epoch 81 iteration 0035/0187: training loss 0.693 Epoch 81 iteration 0036/0187: training loss 0.697 Epoch 81 iteration 0037/0187: training loss 0.699 Epoch 81 iteration 0038/0187: training loss 0.699 Epoch 81 iteration 0039/0187: training loss 0.700 Epoch 81 iteration 0040/0187: training loss 0.699 Epoch 81 iteration 0041/0187: training loss 0.700 Epoch 81 iteration 0042/0187: training loss 0.700 Epoch 81 iteration 0043/0187: training loss 0.703 Epoch 81 iteration 0044/0187: training loss 0.703 Epoch 81 iteration 0045/0187: training loss 0.702 Epoch 81 iteration 0046/0187: training loss 0.703 Epoch 81 iteration 0047/0187: training loss 0.702 Epoch 81 iteration 0048/0187: training loss 0.700 Epoch 81 iteration 0049/0187: training loss 0.699 Epoch 81 iteration 0050/0187: training loss 0.698 Epoch 81 iteration 0051/0187: training loss 0.696 Epoch 81 iteration 0052/0187: training loss 0.698 Epoch 81 iteration 0053/0187: training loss 0.702 Epoch 81 iteration 0054/0187: training loss 0.702 Epoch 81 iteration 0055/0187: training loss 0.701 Epoch 81 iteration 0056/0187: training loss 0.700 Epoch 81 iteration 0057/0187: training loss 0.698 Epoch 81 iteration 0058/0187: training loss 0.697 Epoch 81 iteration 0059/0187: training loss 0.694 Epoch 81 iteration 0060/0187: training loss 0.694 Epoch 81 iteration 0061/0187: training loss 0.694 Epoch 81 iteration 0062/0187: training loss 0.693 Epoch 81 iteration 0063/0187: training loss 0.691 Epoch 81 iteration 0064/0187: training loss 0.694 Epoch 81 iteration 0065/0187: training loss 0.692 Epoch 81 iteration 0066/0187: training loss 0.691 Epoch 81 iteration 0067/0187: training loss 0.691 Epoch 81 iteration 0068/0187: training loss 0.693 Epoch 81 iteration 0069/0187: training loss 0.692 Epoch 81 iteration 0070/0187: training loss 0.694 Epoch 81 iteration 0071/0187: training loss 0.694 Epoch 81 iteration 0072/0187: training loss 0.694 Epoch 81 iteration 0073/0187: training loss 0.693 Epoch 81 iteration 0074/0187: training loss 0.692 Epoch 81 iteration 0075/0187: training loss 0.692 Epoch 81 iteration 0076/0187: training loss 0.692 Epoch 81 iteration 0077/0187: training loss 0.692 Epoch 81 iteration 0078/0187: training loss 0.691 Epoch 81 iteration 0079/0187: training loss 0.693 Epoch 81 iteration 0080/0187: training loss 0.694 Epoch 81 iteration 0081/0187: training loss 0.698 Epoch 81 iteration 0082/0187: training loss 0.697 Epoch 81 iteration 0083/0187: training loss 0.696 Epoch 81 iteration 0084/0187: training loss 0.695 Epoch 81 iteration 0085/0187: training loss 0.694 Epoch 81 iteration 0086/0187: training loss 0.695 Epoch 81 iteration 0087/0187: training loss 0.693 Epoch 81 iteration 0088/0187: training loss 0.696 Epoch 81 iteration 0089/0187: training loss 0.695 Epoch 81 iteration 0090/0187: training loss 0.698 Epoch 81 iteration 0091/0187: training loss 0.699 Epoch 81 iteration 0092/0187: training loss 0.698 Epoch 81 iteration 0093/0187: training loss 0.700 Epoch 81 iteration 0094/0187: training loss 0.700 Epoch 81 iteration 0095/0187: training loss 0.699 Epoch 81 iteration 0096/0187: training loss 0.700 Epoch 81 iteration 0097/0187: training loss 0.699 Epoch 81 iteration 0098/0187: training loss 0.700 Epoch 81 iteration 0099/0187: training loss 0.699 Epoch 81 iteration 0100/0187: training loss 0.699 Epoch 81 iteration 0101/0187: training loss 0.698 Epoch 81 iteration 0102/0187: training loss 0.698 Epoch 81 iteration 0103/0187: training loss 0.698 Epoch 81 iteration 0104/0187: training loss 0.697 Epoch 81 iteration 0105/0187: training loss 0.696 Epoch 81 iteration 0106/0187: training loss 0.696 Epoch 81 iteration 0107/0187: training loss 0.696 Epoch 81 iteration 0108/0187: training loss 0.697 Epoch 81 iteration 0109/0187: training loss 0.696 Epoch 81 iteration 0110/0187: training loss 0.695 Epoch 81 iteration 0111/0187: training loss 0.696 Epoch 81 iteration 0112/0187: training loss 0.696 Epoch 81 iteration 0113/0187: training loss 0.695 Epoch 81 iteration 0114/0187: training loss 0.693 Epoch 81 iteration 0115/0187: training loss 0.693 Epoch 81 iteration 0116/0187: training loss 0.694 Epoch 81 iteration 0117/0187: training loss 0.693 Epoch 81 iteration 0118/0187: training loss 0.694 Epoch 81 iteration 0119/0187: training loss 0.693 Epoch 81 iteration 0120/0187: training loss 0.694 Epoch 81 iteration 0121/0187: training loss 0.693 Epoch 81 iteration 0122/0187: training loss 0.694 Epoch 81 iteration 0123/0187: training loss 0.694 Epoch 81 iteration 0124/0187: training loss 0.695 Epoch 81 iteration 0125/0187: training loss 0.694 Epoch 81 iteration 0126/0187: training loss 0.694 Epoch 81 iteration 0127/0187: training loss 0.695 Epoch 81 iteration 0128/0187: training loss 0.697 Epoch 81 iteration 0129/0187: training loss 0.697 Epoch 81 iteration 0130/0187: training loss 0.697 Epoch 81 iteration 0131/0187: training loss 0.697 Epoch 81 iteration 0132/0187: training loss 0.696 Epoch 81 iteration 0133/0187: training loss 0.697 Epoch 81 iteration 0134/0187: training loss 0.697 Epoch 81 iteration 0135/0187: training loss 0.698 Epoch 81 iteration 0136/0187: training loss 0.697 Epoch 81 iteration 0137/0187: training loss 0.698 Epoch 81 iteration 0138/0187: training loss 0.697 Epoch 81 iteration 0139/0187: training loss 0.697 Epoch 81 iteration 0140/0187: training loss 0.696 Epoch 81 iteration 0141/0187: training loss 0.696 Epoch 81 iteration 0142/0187: training loss 0.696 Epoch 81 iteration 0143/0187: training loss 0.696 Epoch 81 iteration 0144/0187: training loss 0.696 Epoch 81 iteration 0145/0187: training loss 0.696 Epoch 81 iteration 0146/0187: training loss 0.696 Epoch 81 iteration 0147/0187: training loss 0.695 Epoch 81 iteration 0148/0187: training loss 0.695 Epoch 81 iteration 0149/0187: training loss 0.695 Epoch 81 iteration 0150/0187: training loss 0.695 Epoch 81 iteration 0151/0187: training loss 0.695 Epoch 81 iteration 0152/0187: training loss 0.695 Epoch 81 iteration 0153/0187: training loss 0.695 Epoch 81 iteration 0154/0187: training loss 0.695 Epoch 81 iteration 0155/0187: training loss 0.695 Epoch 81 iteration 0156/0187: training loss 0.695 Epoch 81 iteration 0157/0187: training loss 0.694 Epoch 81 iteration 0158/0187: training loss 0.694 Epoch 81 iteration 0159/0187: training loss 0.693 Epoch 81 iteration 0160/0187: training loss 0.694 Epoch 81 iteration 0161/0187: training loss 0.693 Epoch 81 iteration 0162/0187: training loss 0.693 Epoch 81 iteration 0163/0187: training loss 0.692 Epoch 81 iteration 0164/0187: training loss 0.692 Epoch 81 iteration 0165/0187: training loss 0.692 Epoch 81 iteration 0166/0187: training loss 0.692 Epoch 81 iteration 0167/0187: training loss 0.692 Epoch 81 iteration 0168/0187: training loss 0.692 Epoch 81 iteration 0169/0187: training loss 0.692 Epoch 81 iteration 0170/0187: training loss 0.692 Epoch 81 iteration 0171/0187: training loss 0.693 Epoch 81 iteration 0172/0187: training loss 0.692 Epoch 81 iteration 0173/0187: training loss 0.691 Epoch 81 iteration 0174/0187: training loss 0.691 Epoch 81 iteration 0175/0187: training loss 0.690 Epoch 81 iteration 0176/0187: training loss 0.690 Epoch 81 iteration 0177/0187: training loss 0.689 Epoch 81 iteration 0178/0187: training loss 0.689 Epoch 81 iteration 0179/0187: training loss 0.689 Epoch 81 iteration 0180/0187: training loss 0.690 Epoch 81 iteration 0181/0187: training loss 0.690 Epoch 81 iteration 0182/0187: training loss 0.689 Epoch 81 iteration 0183/0187: training loss 0.689 Epoch 81 iteration 0184/0187: training loss 0.688 Epoch 81 iteration 0185/0187: training loss 0.689 Epoch 81 iteration 0186/0187: training loss 0.688 Epoch 81 iteration 0187/0187: training loss 0.688 Epoch 81 validation pixAcc: 0.874, mIoU: 0.389 Epoch 82 iteration 0001/0187: training loss 0.660 Epoch 82 iteration 0002/0187: training loss 0.693 Epoch 82 iteration 0003/0187: training loss 0.704 Epoch 82 iteration 0004/0187: training loss 0.681 Epoch 82 iteration 0005/0187: training loss 0.679 Epoch 82 iteration 0006/0187: training loss 0.721 Epoch 82 iteration 0007/0187: training loss 0.711 Epoch 82 iteration 0008/0187: training loss 0.723 Epoch 82 iteration 0009/0187: training loss 0.754 Epoch 82 iteration 0010/0187: training loss 0.744 Epoch 82 iteration 0011/0187: training loss 0.730 Epoch 82 iteration 0012/0187: training loss 0.726 Epoch 82 iteration 0013/0187: training loss 0.742 Epoch 82 iteration 0014/0187: training loss 0.740 Epoch 82 iteration 0015/0187: training loss 0.734 Epoch 82 iteration 0016/0187: training loss 0.729 Epoch 82 iteration 0017/0187: training loss 0.729 Epoch 82 iteration 0018/0187: training loss 0.727 Epoch 82 iteration 0019/0187: training loss 0.734 Epoch 82 iteration 0020/0187: training loss 0.734 Epoch 82 iteration 0021/0187: training loss 0.724 Epoch 82 iteration 0022/0187: training loss 0.718 Epoch 82 iteration 0023/0187: training loss 0.719 Epoch 82 iteration 0024/0187: training loss 0.717 Epoch 82 iteration 0025/0187: training loss 0.722 Epoch 82 iteration 0026/0187: training loss 0.720 Epoch 82 iteration 0027/0187: training loss 0.713 Epoch 82 iteration 0028/0187: training loss 0.712 Epoch 82 iteration 0029/0187: training loss 0.710 Epoch 82 iteration 0030/0187: training loss 0.709 Epoch 82 iteration 0031/0187: training loss 0.712 Epoch 82 iteration 0032/0187: training loss 0.709 Epoch 82 iteration 0033/0187: training loss 0.705 Epoch 82 iteration 0034/0187: training loss 0.700 Epoch 82 iteration 0035/0187: training loss 0.702 Epoch 82 iteration 0036/0187: training loss 0.703 Epoch 82 iteration 0037/0187: training loss 0.700 Epoch 82 iteration 0038/0187: training loss 0.699 Epoch 82 iteration 0039/0187: training loss 0.699 Epoch 82 iteration 0040/0187: training loss 0.704 Epoch 82 iteration 0041/0187: training loss 0.704 Epoch 82 iteration 0042/0187: training loss 0.709 Epoch 82 iteration 0043/0187: training loss 0.710 Epoch 82 iteration 0044/0187: training loss 0.712 Epoch 82 iteration 0045/0187: training loss 0.710 Epoch 82 iteration 0046/0187: training loss 0.709 Epoch 82 iteration 0047/0187: training loss 0.711 Epoch 82 iteration 0048/0187: training loss 0.712 Epoch 82 iteration 0049/0187: training loss 0.711 Epoch 82 iteration 0050/0187: training loss 0.710 Epoch 82 iteration 0051/0187: training loss 0.711 Epoch 82 iteration 0052/0187: training loss 0.712 Epoch 82 iteration 0053/0187: training loss 0.714 Epoch 82 iteration 0054/0187: training loss 0.710 Epoch 82 iteration 0055/0187: training loss 0.714 Epoch 82 iteration 0056/0187: training loss 0.717 Epoch 82 iteration 0057/0187: training loss 0.717 Epoch 82 iteration 0058/0187: training loss 0.717 Epoch 82 iteration 0059/0187: training loss 0.718 Epoch 82 iteration 0060/0187: training loss 0.717 Epoch 82 iteration 0061/0187: training loss 0.716 Epoch 82 iteration 0062/0187: training loss 0.714 Epoch 82 iteration 0063/0187: training loss 0.714 Epoch 82 iteration 0064/0187: training loss 0.713 Epoch 82 iteration 0065/0187: training loss 0.716 Epoch 82 iteration 0066/0187: training loss 0.715 Epoch 82 iteration 0067/0187: training loss 0.716 Epoch 82 iteration 0068/0187: training loss 0.715 Epoch 82 iteration 0069/0187: training loss 0.714 Epoch 82 iteration 0070/0187: training loss 0.713 Epoch 82 iteration 0071/0187: training loss 0.711 Epoch 82 iteration 0072/0187: training loss 0.712 Epoch 82 iteration 0073/0187: training loss 0.711 Epoch 82 iteration 0074/0187: training loss 0.710 Epoch 82 iteration 0075/0187: training loss 0.710 Epoch 82 iteration 0076/0187: training loss 0.709 Epoch 82 iteration 0077/0187: training loss 0.711 Epoch 82 iteration 0078/0187: training loss 0.712 Epoch 82 iteration 0079/0187: training loss 0.715 Epoch 82 iteration 0080/0187: training loss 0.714 Epoch 82 iteration 0081/0187: training loss 0.713 Epoch 82 iteration 0082/0187: training loss 0.713 Epoch 82 iteration 0083/0187: training loss 0.711 Epoch 82 iteration 0084/0187: training loss 0.710 Epoch 82 iteration 0085/0187: training loss 0.709 Epoch 82 iteration 0086/0187: training loss 0.708 Epoch 82 iteration 0087/0187: training loss 0.707 Epoch 82 iteration 0088/0187: training loss 0.707 Epoch 82 iteration 0089/0187: training loss 0.704 Epoch 82 iteration 0090/0187: training loss 0.705 Epoch 82 iteration 0091/0188: training loss 0.705 Epoch 82 iteration 0092/0188: training loss 0.704 Epoch 82 iteration 0093/0188: training loss 0.704 Epoch 82 iteration 0094/0188: training loss 0.704 Epoch 82 iteration 0095/0188: training loss 0.704 Epoch 82 iteration 0096/0188: training loss 0.703 Epoch 82 iteration 0097/0188: training loss 0.705 Epoch 82 iteration 0098/0188: training loss 0.706 Epoch 82 iteration 0099/0188: training loss 0.706 Epoch 82 iteration 0100/0188: training loss 0.706 Epoch 82 iteration 0101/0188: training loss 0.706 Epoch 82 iteration 0102/0188: training loss 0.706 Epoch 82 iteration 0103/0188: training loss 0.706 Epoch 82 iteration 0104/0188: training loss 0.705 Epoch 82 iteration 0105/0188: training loss 0.706 Epoch 82 iteration 0106/0188: training loss 0.705 Epoch 82 iteration 0107/0188: training loss 0.705 Epoch 82 iteration 0108/0188: training loss 0.705 Epoch 82 iteration 0109/0188: training loss 0.704 Epoch 82 iteration 0110/0188: training loss 0.704 Epoch 82 iteration 0111/0188: training loss 0.703 Epoch 82 iteration 0112/0188: training loss 0.703 Epoch 82 iteration 0113/0188: training loss 0.701 Epoch 82 iteration 0114/0188: training loss 0.701 Epoch 82 iteration 0115/0188: training loss 0.701 Epoch 82 iteration 0116/0188: training loss 0.701 Epoch 82 iteration 0117/0188: training loss 0.701 Epoch 82 iteration 0118/0188: training loss 0.701 Epoch 82 iteration 0119/0188: training loss 0.702 Epoch 82 iteration 0120/0188: training loss 0.702 Epoch 82 iteration 0121/0188: training loss 0.702 Epoch 82 iteration 0122/0188: training loss 0.701 Epoch 82 iteration 0123/0188: training loss 0.701 Epoch 82 iteration 0124/0188: training loss 0.700 Epoch 82 iteration 0125/0188: training loss 0.699 Epoch 82 iteration 0126/0188: training loss 0.699 Epoch 82 iteration 0127/0188: training loss 0.699 Epoch 82 iteration 0128/0188: training loss 0.699 Epoch 82 iteration 0129/0188: training loss 0.699 Epoch 82 iteration 0130/0188: training loss 0.699 Epoch 82 iteration 0131/0188: training loss 0.699 Epoch 82 iteration 0132/0188: training loss 0.699 Epoch 82 iteration 0133/0188: training loss 0.699 Epoch 82 iteration 0134/0188: training loss 0.700 Epoch 82 iteration 0135/0188: training loss 0.700 Epoch 82 iteration 0136/0188: training loss 0.699 Epoch 82 iteration 0137/0188: training loss 0.700 Epoch 82 iteration 0138/0188: training loss 0.700 Epoch 82 iteration 0139/0188: training loss 0.700 Epoch 82 iteration 0140/0188: training loss 0.701 Epoch 82 iteration 0141/0188: training loss 0.702 Epoch 82 iteration 0142/0188: training loss 0.703 Epoch 82 iteration 0143/0188: training loss 0.703 Epoch 82 iteration 0144/0188: training loss 0.703 Epoch 82 iteration 0145/0188: training loss 0.702 Epoch 82 iteration 0146/0188: training loss 0.702 Epoch 82 iteration 0147/0188: training loss 0.705 Epoch 82 iteration 0148/0188: training loss 0.704 Epoch 82 iteration 0149/0188: training loss 0.703 Epoch 82 iteration 0150/0188: training loss 0.704 Epoch 82 iteration 0151/0188: training loss 0.703 Epoch 82 iteration 0152/0188: training loss 0.702 Epoch 82 iteration 0153/0188: training loss 0.702 Epoch 82 iteration 0154/0188: training loss 0.702 Epoch 82 iteration 0155/0188: training loss 0.702 Epoch 82 iteration 0156/0188: training loss 0.701 Epoch 82 iteration 0157/0188: training loss 0.700 Epoch 82 iteration 0158/0188: training loss 0.700 Epoch 82 iteration 0159/0188: training loss 0.699 Epoch 82 iteration 0160/0188: training loss 0.699 Epoch 82 iteration 0161/0188: training loss 0.700 Epoch 82 iteration 0162/0188: training loss 0.700 Epoch 82 iteration 0163/0188: training loss 0.699 Epoch 82 iteration 0164/0188: training loss 0.700 Epoch 82 iteration 0165/0188: training loss 0.700 Epoch 82 iteration 0166/0188: training loss 0.700 Epoch 82 iteration 0167/0188: training loss 0.700 Epoch 82 iteration 0168/0188: training loss 0.700 Epoch 82 iteration 0169/0188: training loss 0.699 Epoch 82 iteration 0170/0188: training loss 0.699 Epoch 82 iteration 0171/0188: training loss 0.698 Epoch 82 iteration 0172/0188: training loss 0.697 Epoch 82 iteration 0173/0188: training loss 0.698 Epoch 82 iteration 0174/0188: training loss 0.698 Epoch 82 iteration 0175/0188: training loss 0.697 Epoch 82 iteration 0176/0188: training loss 0.697 Epoch 82 iteration 0177/0188: training loss 0.697 Epoch 82 iteration 0178/0188: training loss 0.698 Epoch 82 iteration 0179/0188: training loss 0.697 Epoch 82 iteration 0180/0188: training loss 0.697 Epoch 82 iteration 0181/0188: training loss 0.696 Epoch 82 iteration 0182/0188: training loss 0.696 Epoch 82 iteration 0183/0188: training loss 0.696 Epoch 82 iteration 0184/0188: training loss 0.697 Epoch 82 iteration 0185/0188: training loss 0.698 Epoch 82 iteration 0186/0188: training loss 0.697 Epoch 82 validation pixAcc: 0.874, mIoU: 0.387 Epoch 83 iteration 0001/0187: training loss 0.719 Epoch 83 iteration 0002/0187: training loss 0.657 Epoch 83 iteration 0003/0187: training loss 0.635 Epoch 83 iteration 0004/0187: training loss 0.618 Epoch 83 iteration 0005/0187: training loss 0.657 Epoch 83 iteration 0006/0187: training loss 0.680 Epoch 83 iteration 0007/0187: training loss 0.680 Epoch 83 iteration 0008/0187: training loss 0.697 Epoch 83 iteration 0009/0187: training loss 0.702 Epoch 83 iteration 0010/0187: training loss 0.702 Epoch 83 iteration 0011/0187: training loss 0.706 Epoch 83 iteration 0012/0187: training loss 0.702 Epoch 83 iteration 0013/0187: training loss 0.704 Epoch 83 iteration 0014/0187: training loss 0.721 Epoch 83 iteration 0015/0187: training loss 0.721 Epoch 83 iteration 0016/0187: training loss 0.713 Epoch 83 iteration 0017/0187: training loss 0.720 Epoch 83 iteration 0018/0187: training loss 0.718 Epoch 83 iteration 0019/0187: training loss 0.716 Epoch 83 iteration 0020/0187: training loss 0.711 Epoch 83 iteration 0021/0187: training loss 0.708 Epoch 83 iteration 0022/0187: training loss 0.712 Epoch 83 iteration 0023/0187: training loss 0.711 Epoch 83 iteration 0024/0187: training loss 0.709 Epoch 83 iteration 0025/0187: training loss 0.706 Epoch 83 iteration 0026/0187: training loss 0.703 Epoch 83 iteration 0027/0187: training loss 0.703 Epoch 83 iteration 0028/0187: training loss 0.700 Epoch 83 iteration 0029/0187: training loss 0.699 Epoch 83 iteration 0030/0187: training loss 0.696 Epoch 83 iteration 0031/0187: training loss 0.698 Epoch 83 iteration 0032/0187: training loss 0.701 Epoch 83 iteration 0033/0187: training loss 0.703 Epoch 83 iteration 0034/0187: training loss 0.701 Epoch 83 iteration 0035/0187: training loss 0.701 Epoch 83 iteration 0036/0187: training loss 0.701 Epoch 83 iteration 0037/0187: training loss 0.704 Epoch 83 iteration 0038/0187: training loss 0.706 Epoch 83 iteration 0039/0187: training loss 0.705 Epoch 83 iteration 0040/0187: training loss 0.702 Epoch 83 iteration 0041/0187: training loss 0.700 Epoch 83 iteration 0042/0187: training loss 0.700 Epoch 83 iteration 0043/0187: training loss 0.703 Epoch 83 iteration 0044/0187: training loss 0.701 Epoch 83 iteration 0045/0187: training loss 0.703 Epoch 83 iteration 0046/0187: training loss 0.699 Epoch 83 iteration 0047/0187: training loss 0.700 Epoch 83 iteration 0048/0187: training loss 0.701 Epoch 83 iteration 0049/0187: training loss 0.699 Epoch 83 iteration 0050/0187: training loss 0.697 Epoch 83 iteration 0051/0187: training loss 0.695 Epoch 83 iteration 0052/0187: training loss 0.695 Epoch 83 iteration 0053/0187: training loss 0.693 Epoch 83 iteration 0054/0187: training loss 0.696 Epoch 83 iteration 0055/0187: training loss 0.700 Epoch 83 iteration 0056/0187: training loss 0.700 Epoch 83 iteration 0057/0187: training loss 0.700 Epoch 83 iteration 0058/0187: training loss 0.702 Epoch 83 iteration 0059/0187: training loss 0.703 Epoch 83 iteration 0060/0187: training loss 0.704 Epoch 83 iteration 0061/0187: training loss 0.702 Epoch 83 iteration 0062/0187: training loss 0.701 Epoch 83 iteration 0063/0187: training loss 0.699 Epoch 83 iteration 0064/0187: training loss 0.700 Epoch 83 iteration 0065/0187: training loss 0.703 Epoch 83 iteration 0066/0187: training loss 0.702 Epoch 83 iteration 0067/0187: training loss 0.702 Epoch 83 iteration 0068/0187: training loss 0.702 Epoch 83 iteration 0069/0187: training loss 0.702 Epoch 83 iteration 0070/0187: training loss 0.700 Epoch 83 iteration 0071/0187: training loss 0.699 Epoch 83 iteration 0072/0187: training loss 0.698 Epoch 83 iteration 0073/0187: training loss 0.697 Epoch 83 iteration 0074/0187: training loss 0.699 Epoch 83 iteration 0075/0187: training loss 0.700 Epoch 83 iteration 0076/0187: training loss 0.700 Epoch 83 iteration 0077/0187: training loss 0.698 Epoch 83 iteration 0078/0187: training loss 0.699 Epoch 83 iteration 0079/0187: training loss 0.700 Epoch 83 iteration 0080/0187: training loss 0.701 Epoch 83 iteration 0081/0187: training loss 0.699 Epoch 83 iteration 0082/0187: training loss 0.700 Epoch 83 iteration 0083/0187: training loss 0.698 Epoch 83 iteration 0084/0187: training loss 0.697 Epoch 83 iteration 0085/0187: training loss 0.698 Epoch 83 iteration 0086/0187: training loss 0.698 Epoch 83 iteration 0087/0187: training loss 0.700 Epoch 83 iteration 0088/0187: training loss 0.701 Epoch 83 iteration 0089/0187: training loss 0.700 Epoch 83 iteration 0090/0187: training loss 0.699 Epoch 83 iteration 0091/0187: training loss 0.698 Epoch 83 iteration 0092/0187: training loss 0.699 Epoch 83 iteration 0093/0187: training loss 0.698 Epoch 83 iteration 0094/0187: training loss 0.698 Epoch 83 iteration 0095/0187: training loss 0.698 Epoch 83 iteration 0096/0187: training loss 0.699 Epoch 83 iteration 0097/0187: training loss 0.699 Epoch 83 iteration 0098/0187: training loss 0.700 Epoch 83 iteration 0099/0187: training loss 0.699 Epoch 83 iteration 0100/0187: training loss 0.699 Epoch 83 iteration 0101/0187: training loss 0.698 Epoch 83 iteration 0102/0187: training loss 0.699 Epoch 83 iteration 0103/0187: training loss 0.699 Epoch 83 iteration 0104/0187: training loss 0.699 Epoch 83 iteration 0105/0187: training loss 0.698 Epoch 83 iteration 0106/0187: training loss 0.697 Epoch 83 iteration 0107/0187: training loss 0.697 Epoch 83 iteration 0108/0187: training loss 0.697 Epoch 83 iteration 0109/0187: training loss 0.696 Epoch 83 iteration 0110/0187: training loss 0.695 Epoch 83 iteration 0111/0187: training loss 0.695 Epoch 83 iteration 0112/0187: training loss 0.695 Epoch 83 iteration 0113/0187: training loss 0.696 Epoch 83 iteration 0114/0187: training loss 0.696 Epoch 83 iteration 0115/0187: training loss 0.695 Epoch 83 iteration 0116/0187: training loss 0.697 Epoch 83 iteration 0117/0187: training loss 0.697 Epoch 83 iteration 0118/0187: training loss 0.696 Epoch 83 iteration 0119/0187: training loss 0.695 Epoch 83 iteration 0120/0187: training loss 0.695 Epoch 83 iteration 0121/0187: training loss 0.696 Epoch 83 iteration 0122/0187: training loss 0.696 Epoch 83 iteration 0123/0187: training loss 0.696 Epoch 83 iteration 0124/0187: training loss 0.695 Epoch 83 iteration 0125/0187: training loss 0.696 Epoch 83 iteration 0126/0187: training loss 0.696 Epoch 83 iteration 0127/0187: training loss 0.695 Epoch 83 iteration 0128/0187: training loss 0.696 Epoch 83 iteration 0129/0187: training loss 0.695 Epoch 83 iteration 0130/0187: training loss 0.695 Epoch 83 iteration 0131/0187: training loss 0.695 Epoch 83 iteration 0132/0187: training loss 0.695 Epoch 83 iteration 0133/0187: training loss 0.697 Epoch 83 iteration 0134/0187: training loss 0.697 Epoch 83 iteration 0135/0187: training loss 0.697 Epoch 83 iteration 0136/0187: training loss 0.696 Epoch 83 iteration 0137/0187: training loss 0.696 Epoch 83 iteration 0138/0187: training loss 0.695 Epoch 83 iteration 0139/0187: training loss 0.695 Epoch 83 iteration 0140/0187: training loss 0.695 Epoch 83 iteration 0141/0187: training loss 0.696 Epoch 83 iteration 0142/0187: training loss 0.696 Epoch 83 iteration 0143/0187: training loss 0.695 Epoch 83 iteration 0144/0187: training loss 0.695 Epoch 83 iteration 0145/0187: training loss 0.694 Epoch 83 iteration 0146/0187: training loss 0.694 Epoch 83 iteration 0147/0187: training loss 0.693 Epoch 83 iteration 0148/0187: training loss 0.692 Epoch 83 iteration 0149/0187: training loss 0.691 Epoch 83 iteration 0150/0187: training loss 0.691 Epoch 83 iteration 0151/0187: training loss 0.692 Epoch 83 iteration 0152/0187: training loss 0.693 Epoch 83 iteration 0153/0187: training loss 0.694 Epoch 83 iteration 0154/0187: training loss 0.694 Epoch 83 iteration 0155/0187: training loss 0.694 Epoch 83 iteration 0156/0187: training loss 0.693 Epoch 83 iteration 0157/0187: training loss 0.693 Epoch 83 iteration 0158/0187: training loss 0.693 Epoch 83 iteration 0159/0187: training loss 0.693 Epoch 83 iteration 0160/0187: training loss 0.692 Epoch 83 iteration 0161/0187: training loss 0.692 Epoch 83 iteration 0162/0187: training loss 0.691 Epoch 83 iteration 0163/0187: training loss 0.692 Epoch 83 iteration 0164/0187: training loss 0.692 Epoch 83 iteration 0165/0187: training loss 0.693 Epoch 83 iteration 0166/0187: training loss 0.692 Epoch 83 iteration 0167/0187: training loss 0.691 Epoch 83 iteration 0168/0187: training loss 0.691 Epoch 83 iteration 0169/0187: training loss 0.691 Epoch 83 iteration 0170/0187: training loss 0.690 Epoch 83 iteration 0171/0187: training loss 0.691 Epoch 83 iteration 0172/0187: training loss 0.694 Epoch 83 iteration 0173/0187: training loss 0.693 Epoch 83 iteration 0174/0187: training loss 0.692 Epoch 83 iteration 0175/0187: training loss 0.692 Epoch 83 iteration 0176/0187: training loss 0.691 Epoch 83 iteration 0177/0187: training loss 0.692 Epoch 83 iteration 0178/0187: training loss 0.692 Epoch 83 iteration 0179/0187: training loss 0.692 Epoch 83 iteration 0180/0187: training loss 0.692 Epoch 83 iteration 0181/0187: training loss 0.691 Epoch 83 iteration 0182/0187: training loss 0.691 Epoch 83 iteration 0183/0187: training loss 0.691 Epoch 83 iteration 0184/0187: training loss 0.691 Epoch 83 iteration 0185/0187: training loss 0.691 Epoch 83 iteration 0186/0187: training loss 0.690 Epoch 83 iteration 0187/0187: training loss 0.690 Epoch 83 validation pixAcc: 0.874, mIoU: 0.384 Epoch 84 iteration 0001/0187: training loss 0.666 Epoch 84 iteration 0002/0187: training loss 0.672 Epoch 84 iteration 0003/0187: training loss 0.655 Epoch 84 iteration 0004/0187: training loss 0.687 Epoch 84 iteration 0005/0187: training loss 0.693 Epoch 84 iteration 0006/0187: training loss 0.676 Epoch 84 iteration 0007/0187: training loss 0.678 Epoch 84 iteration 0008/0187: training loss 0.695 Epoch 84 iteration 0009/0187: training loss 0.727 Epoch 84 iteration 0010/0187: training loss 0.719 Epoch 84 iteration 0011/0187: training loss 0.716 Epoch 84 iteration 0012/0187: training loss 0.706 Epoch 84 iteration 0013/0187: training loss 0.704 Epoch 84 iteration 0014/0187: training loss 0.699 Epoch 84 iteration 0015/0187: training loss 0.695 Epoch 84 iteration 0016/0187: training loss 0.689 Epoch 84 iteration 0017/0187: training loss 0.689 Epoch 84 iteration 0018/0187: training loss 0.688 Epoch 84 iteration 0019/0187: training loss 0.682 Epoch 84 iteration 0020/0187: training loss 0.692 Epoch 84 iteration 0021/0187: training loss 0.687 Epoch 84 iteration 0022/0187: training loss 0.691 Epoch 84 iteration 0023/0187: training loss 0.688 Epoch 84 iteration 0024/0187: training loss 0.696 Epoch 84 iteration 0025/0187: training loss 0.693 Epoch 84 iteration 0026/0187: training loss 0.693 Epoch 84 iteration 0027/0187: training loss 0.687 Epoch 84 iteration 0028/0187: training loss 0.684 Epoch 84 iteration 0029/0187: training loss 0.684 Epoch 84 iteration 0030/0187: training loss 0.681 Epoch 84 iteration 0031/0187: training loss 0.679 Epoch 84 iteration 0032/0187: training loss 0.675 Epoch 84 iteration 0033/0187: training loss 0.673 Epoch 84 iteration 0034/0187: training loss 0.675 Epoch 84 iteration 0035/0187: training loss 0.675 Epoch 84 iteration 0036/0187: training loss 0.675 Epoch 84 iteration 0037/0187: training loss 0.680 Epoch 84 iteration 0038/0187: training loss 0.685 Epoch 84 iteration 0039/0187: training loss 0.685 Epoch 84 iteration 0040/0187: training loss 0.683 Epoch 84 iteration 0041/0187: training loss 0.687 Epoch 84 iteration 0042/0187: training loss 0.691 Epoch 84 iteration 0043/0187: training loss 0.689 Epoch 84 iteration 0044/0187: training loss 0.689 Epoch 84 iteration 0045/0187: training loss 0.693 Epoch 84 iteration 0046/0187: training loss 0.695 Epoch 84 iteration 0047/0187: training loss 0.695 Epoch 84 iteration 0048/0187: training loss 0.695 Epoch 84 iteration 0049/0187: training loss 0.693 Epoch 84 iteration 0050/0187: training loss 0.693 Epoch 84 iteration 0051/0187: training loss 0.694 Epoch 84 iteration 0052/0187: training loss 0.693 Epoch 84 iteration 0053/0187: training loss 0.693 Epoch 84 iteration 0054/0187: training loss 0.695 Epoch 84 iteration 0055/0187: training loss 0.695 Epoch 84 iteration 0056/0187: training loss 0.695 Epoch 84 iteration 0057/0187: training loss 0.696 Epoch 84 iteration 0058/0187: training loss 0.695 Epoch 84 iteration 0059/0187: training loss 0.696 Epoch 84 iteration 0060/0187: training loss 0.697 Epoch 84 iteration 0061/0187: training loss 0.694 Epoch 84 iteration 0062/0187: training loss 0.694 Epoch 84 iteration 0063/0187: training loss 0.693 Epoch 84 iteration 0064/0187: training loss 0.694 Epoch 84 iteration 0065/0187: training loss 0.693 Epoch 84 iteration 0066/0187: training loss 0.691 Epoch 84 iteration 0067/0187: training loss 0.690 Epoch 84 iteration 0068/0187: training loss 0.693 Epoch 84 iteration 0069/0187: training loss 0.693 Epoch 84 iteration 0070/0187: training loss 0.694 Epoch 84 iteration 0071/0187: training loss 0.692 Epoch 84 iteration 0072/0187: training loss 0.694 Epoch 84 iteration 0073/0187: training loss 0.694 Epoch 84 iteration 0074/0187: training loss 0.695 Epoch 84 iteration 0075/0187: training loss 0.695 Epoch 84 iteration 0076/0187: training loss 0.694 Epoch 84 iteration 0077/0187: training loss 0.692 Epoch 84 iteration 0078/0187: training loss 0.692 Epoch 84 iteration 0079/0187: training loss 0.694 Epoch 84 iteration 0080/0187: training loss 0.694 Epoch 84 iteration 0081/0187: training loss 0.696 Epoch 84 iteration 0082/0187: training loss 0.697 Epoch 84 iteration 0083/0187: training loss 0.698 Epoch 84 iteration 0084/0187: training loss 0.695 Epoch 84 iteration 0085/0187: training loss 0.697 Epoch 84 iteration 0086/0187: training loss 0.696 Epoch 84 iteration 0087/0187: training loss 0.698 Epoch 84 iteration 0088/0187: training loss 0.697 Epoch 84 iteration 0089/0187: training loss 0.697 Epoch 84 iteration 0090/0187: training loss 0.695 Epoch 84 iteration 0091/0188: training loss 0.695 Epoch 84 iteration 0092/0188: training loss 0.693 Epoch 84 iteration 0093/0188: training loss 0.693 Epoch 84 iteration 0094/0188: training loss 0.695 Epoch 84 iteration 0095/0188: training loss 0.696 Epoch 84 iteration 0096/0188: training loss 0.697 Epoch 84 iteration 0097/0188: training loss 0.698 Epoch 84 iteration 0098/0188: training loss 0.697 Epoch 84 iteration 0099/0188: training loss 0.697 Epoch 84 iteration 0100/0188: training loss 0.696 Epoch 84 iteration 0101/0188: training loss 0.696 Epoch 84 iteration 0102/0188: training loss 0.696 Epoch 84 iteration 0103/0188: training loss 0.695 Epoch 84 iteration 0104/0188: training loss 0.694 Epoch 84 iteration 0105/0188: training loss 0.695 Epoch 84 iteration 0106/0188: training loss 0.695 Epoch 84 iteration 0107/0188: training loss 0.694 Epoch 84 iteration 0108/0188: training loss 0.694 Epoch 84 iteration 0109/0188: training loss 0.694 Epoch 84 iteration 0110/0188: training loss 0.693 Epoch 84 iteration 0111/0188: training loss 0.692 Epoch 84 iteration 0112/0188: training loss 0.691 Epoch 84 iteration 0113/0188: training loss 0.691 Epoch 84 iteration 0114/0188: training loss 0.691 Epoch 84 iteration 0115/0188: training loss 0.691 Epoch 84 iteration 0116/0188: training loss 0.692 Epoch 84 iteration 0117/0188: training loss 0.691 Epoch 84 iteration 0118/0188: training loss 0.692 Epoch 84 iteration 0119/0188: training loss 0.691 Epoch 84 iteration 0120/0188: training loss 0.690 Epoch 84 iteration 0121/0188: training loss 0.691 Epoch 84 iteration 0122/0188: training loss 0.691 Epoch 84 iteration 0123/0188: training loss 0.690 Epoch 84 iteration 0124/0188: training loss 0.691 Epoch 84 iteration 0125/0188: training loss 0.690 Epoch 84 iteration 0126/0188: training loss 0.691 Epoch 84 iteration 0127/0188: training loss 0.690 Epoch 84 iteration 0128/0188: training loss 0.690 Epoch 84 iteration 0129/0188: training loss 0.690 Epoch 84 iteration 0130/0188: training loss 0.690 Epoch 84 iteration 0131/0188: training loss 0.689 Epoch 84 iteration 0132/0188: training loss 0.689 Epoch 84 iteration 0133/0188: training loss 0.688 Epoch 84 iteration 0134/0188: training loss 0.687 Epoch 84 iteration 0135/0188: training loss 0.687 Epoch 84 iteration 0136/0188: training loss 0.689 Epoch 84 iteration 0137/0188: training loss 0.689 Epoch 84 iteration 0138/0188: training loss 0.688 Epoch 84 iteration 0139/0188: training loss 0.686 Epoch 84 iteration 0140/0188: training loss 0.688 Epoch 84 iteration 0141/0188: training loss 0.688 Epoch 84 iteration 0142/0188: training loss 0.688 Epoch 84 iteration 0143/0188: training loss 0.688 Epoch 84 iteration 0144/0188: training loss 0.687 Epoch 84 iteration 0145/0188: training loss 0.686 Epoch 84 iteration 0146/0188: training loss 0.687 Epoch 84 iteration 0147/0188: training loss 0.688 Epoch 84 iteration 0148/0188: training loss 0.687 Epoch 84 iteration 0149/0188: training loss 0.687 Epoch 84 iteration 0150/0188: training loss 0.688 Epoch 84 iteration 0151/0188: training loss 0.688 Epoch 84 iteration 0152/0188: training loss 0.688 Epoch 84 iteration 0153/0188: training loss 0.688 Epoch 84 iteration 0154/0188: training loss 0.689 Epoch 84 iteration 0155/0188: training loss 0.688 Epoch 84 iteration 0156/0188: training loss 0.689 Epoch 84 iteration 0157/0188: training loss 0.689 Epoch 84 iteration 0158/0188: training loss 0.690 Epoch 84 iteration 0159/0188: training loss 0.690 Epoch 84 iteration 0160/0188: training loss 0.690 Epoch 84 iteration 0161/0188: training loss 0.690 Epoch 84 iteration 0162/0188: training loss 0.690 Epoch 84 iteration 0163/0188: training loss 0.690 Epoch 84 iteration 0164/0188: training loss 0.689 Epoch 84 iteration 0165/0188: training loss 0.689 Epoch 84 iteration 0166/0188: training loss 0.688 Epoch 84 iteration 0167/0188: training loss 0.687 Epoch 84 iteration 0168/0188: training loss 0.688 Epoch 84 iteration 0169/0188: training loss 0.687 Epoch 84 iteration 0170/0188: training loss 0.689 Epoch 84 iteration 0171/0188: training loss 0.687 Epoch 84 iteration 0172/0188: training loss 0.688 Epoch 84 iteration 0173/0188: training loss 0.688 Epoch 84 iteration 0174/0188: training loss 0.688 Epoch 84 iteration 0175/0188: training loss 0.689 Epoch 84 iteration 0176/0188: training loss 0.689 Epoch 84 iteration 0177/0188: training loss 0.689 Epoch 84 iteration 0178/0188: training loss 0.689 Epoch 84 iteration 0179/0188: training loss 0.688 Epoch 84 iteration 0180/0188: training loss 0.688 Epoch 84 iteration 0181/0188: training loss 0.688 Epoch 84 iteration 0182/0188: training loss 0.687 Epoch 84 iteration 0183/0188: training loss 0.687 Epoch 84 iteration 0184/0188: training loss 0.686 Epoch 84 iteration 0185/0188: training loss 0.686 Epoch 84 iteration 0186/0188: training loss 0.686 Epoch 84 validation pixAcc: 0.874, mIoU: 0.387 Epoch 85 iteration 0001/0187: training loss 0.642 Epoch 85 iteration 0002/0187: training loss 0.620 Epoch 85 iteration 0003/0187: training loss 0.627 Epoch 85 iteration 0004/0187: training loss 0.667 Epoch 85 iteration 0005/0187: training loss 0.671 Epoch 85 iteration 0006/0187: training loss 0.679 Epoch 85 iteration 0007/0187: training loss 0.679 Epoch 85 iteration 0008/0187: training loss 0.665 Epoch 85 iteration 0009/0187: training loss 0.671 Epoch 85 iteration 0010/0187: training loss 0.662 Epoch 85 iteration 0011/0187: training loss 0.658 Epoch 85 iteration 0012/0187: training loss 0.651 Epoch 85 iteration 0013/0187: training loss 0.666 Epoch 85 iteration 0014/0187: training loss 0.668 Epoch 85 iteration 0015/0187: training loss 0.664 Epoch 85 iteration 0016/0187: training loss 0.661 Epoch 85 iteration 0017/0187: training loss 0.664 Epoch 85 iteration 0018/0187: training loss 0.662 Epoch 85 iteration 0019/0187: training loss 0.662 Epoch 85 iteration 0020/0187: training loss 0.655 Epoch 85 iteration 0021/0187: training loss 0.649 Epoch 85 iteration 0022/0187: training loss 0.652 Epoch 85 iteration 0023/0187: training loss 0.651 Epoch 85 iteration 0024/0187: training loss 0.651 Epoch 85 iteration 0025/0187: training loss 0.647 Epoch 85 iteration 0026/0187: training loss 0.646 Epoch 85 iteration 0027/0187: training loss 0.651 Epoch 85 iteration 0028/0187: training loss 0.650 Epoch 85 iteration 0029/0187: training loss 0.648 Epoch 85 iteration 0030/0187: training loss 0.646 Epoch 85 iteration 0031/0187: training loss 0.645 Epoch 85 iteration 0032/0187: training loss 0.647 Epoch 85 iteration 0033/0187: training loss 0.647 Epoch 85 iteration 0034/0187: training loss 0.648 Epoch 85 iteration 0035/0187: training loss 0.650 Epoch 85 iteration 0036/0187: training loss 0.651 Epoch 85 iteration 0037/0187: training loss 0.653 Epoch 85 iteration 0038/0187: training loss 0.653 Epoch 85 iteration 0039/0187: training loss 0.655 Epoch 85 iteration 0040/0187: training loss 0.651 Epoch 85 iteration 0041/0187: training loss 0.653 Epoch 85 iteration 0042/0187: training loss 0.652 Epoch 85 iteration 0043/0187: training loss 0.653 Epoch 85 iteration 0044/0187: training loss 0.654 Epoch 85 iteration 0045/0187: training loss 0.656 Epoch 85 iteration 0046/0187: training loss 0.656 Epoch 85 iteration 0047/0187: training loss 0.657 Epoch 85 iteration 0048/0187: training loss 0.656 Epoch 85 iteration 0049/0187: training loss 0.657 Epoch 85 iteration 0050/0187: training loss 0.658 Epoch 85 iteration 0051/0187: training loss 0.658 Epoch 85 iteration 0052/0187: training loss 0.663 Epoch 85 iteration 0053/0187: training loss 0.662 Epoch 85 iteration 0054/0187: training loss 0.661 Epoch 85 iteration 0055/0187: training loss 0.660 Epoch 85 iteration 0056/0187: training loss 0.657 Epoch 85 iteration 0057/0187: training loss 0.658 Epoch 85 iteration 0058/0187: training loss 0.660 Epoch 85 iteration 0059/0187: training loss 0.659 Epoch 85 iteration 0060/0187: training loss 0.658 Epoch 85 iteration 0061/0187: training loss 0.659 Epoch 85 iteration 0062/0187: training loss 0.658 Epoch 85 iteration 0063/0187: training loss 0.661 Epoch 85 iteration 0064/0187: training loss 0.660 Epoch 85 iteration 0065/0187: training loss 0.660 Epoch 85 iteration 0066/0187: training loss 0.661 Epoch 85 iteration 0067/0187: training loss 0.659 Epoch 85 iteration 0068/0187: training loss 0.660 Epoch 85 iteration 0069/0187: training loss 0.661 Epoch 85 iteration 0070/0187: training loss 0.662 Epoch 85 iteration 0071/0187: training loss 0.662 Epoch 85 iteration 0072/0187: training loss 0.661 Epoch 85 iteration 0073/0187: training loss 0.661 Epoch 85 iteration 0074/0187: training loss 0.661 Epoch 85 iteration 0075/0187: training loss 0.663 Epoch 85 iteration 0076/0187: training loss 0.666 Epoch 85 iteration 0077/0187: training loss 0.666 Epoch 85 iteration 0078/0187: training loss 0.665 Epoch 85 iteration 0079/0187: training loss 0.665 Epoch 85 iteration 0080/0187: training loss 0.665 Epoch 85 iteration 0081/0187: training loss 0.663 Epoch 85 iteration 0082/0187: training loss 0.663 Epoch 85 iteration 0083/0187: training loss 0.664 Epoch 85 iteration 0084/0187: training loss 0.666 Epoch 85 iteration 0085/0187: training loss 0.666 Epoch 85 iteration 0086/0187: training loss 0.665 Epoch 85 iteration 0087/0187: training loss 0.666 Epoch 85 iteration 0088/0187: training loss 0.667 Epoch 85 iteration 0089/0187: training loss 0.667 Epoch 85 iteration 0090/0187: training loss 0.666 Epoch 85 iteration 0091/0187: training loss 0.667 Epoch 85 iteration 0092/0187: training loss 0.666 Epoch 85 iteration 0093/0187: training loss 0.666 Epoch 85 iteration 0094/0187: training loss 0.665 Epoch 85 iteration 0095/0187: training loss 0.665 Epoch 85 iteration 0096/0187: training loss 0.665 Epoch 85 iteration 0097/0187: training loss 0.666 Epoch 85 iteration 0098/0187: training loss 0.666 Epoch 85 iteration 0099/0187: training loss 0.667 Epoch 85 iteration 0100/0187: training loss 0.666 Epoch 85 iteration 0101/0187: training loss 0.666 Epoch 85 iteration 0102/0187: training loss 0.667 Epoch 85 iteration 0103/0187: training loss 0.665 Epoch 85 iteration 0104/0187: training loss 0.666 Epoch 85 iteration 0105/0187: training loss 0.669 Epoch 85 iteration 0106/0187: training loss 0.670 Epoch 85 iteration 0107/0187: training loss 0.671 Epoch 85 iteration 0108/0187: training loss 0.671 Epoch 85 iteration 0109/0187: training loss 0.672 Epoch 85 iteration 0110/0187: training loss 0.672 Epoch 85 iteration 0111/0187: training loss 0.672 Epoch 85 iteration 0112/0187: training loss 0.672 Epoch 85 iteration 0113/0187: training loss 0.672 Epoch 85 iteration 0114/0187: training loss 0.672 Epoch 85 iteration 0115/0187: training loss 0.672 Epoch 85 iteration 0116/0187: training loss 0.672 Epoch 85 iteration 0117/0187: training loss 0.672 Epoch 85 iteration 0118/0187: training loss 0.673 Epoch 85 iteration 0119/0187: training loss 0.674 Epoch 85 iteration 0120/0187: training loss 0.674 Epoch 85 iteration 0121/0187: training loss 0.675 Epoch 85 iteration 0122/0187: training loss 0.676 Epoch 85 iteration 0123/0187: training loss 0.677 Epoch 85 iteration 0124/0187: training loss 0.676 Epoch 85 iteration 0125/0187: training loss 0.677 Epoch 85 iteration 0126/0187: training loss 0.675 Epoch 85 iteration 0127/0187: training loss 0.676 Epoch 85 iteration 0128/0187: training loss 0.676 Epoch 85 iteration 0129/0187: training loss 0.677 Epoch 85 iteration 0130/0187: training loss 0.676 Epoch 85 iteration 0131/0187: training loss 0.675 Epoch 85 iteration 0132/0187: training loss 0.676 Epoch 85 iteration 0133/0187: training loss 0.676 Epoch 85 iteration 0134/0187: training loss 0.677 Epoch 85 iteration 0135/0187: training loss 0.678 Epoch 85 iteration 0136/0187: training loss 0.680 Epoch 85 iteration 0137/0187: training loss 0.680 Epoch 85 iteration 0138/0187: training loss 0.680 Epoch 85 iteration 0139/0187: training loss 0.680 Epoch 85 iteration 0140/0187: training loss 0.680 Epoch 85 iteration 0141/0187: training loss 0.681 Epoch 85 iteration 0142/0187: training loss 0.681 Epoch 85 iteration 0143/0187: training loss 0.681 Epoch 85 iteration 0144/0187: training loss 0.682 Epoch 85 iteration 0145/0187: training loss 0.682 Epoch 85 iteration 0146/0187: training loss 0.682 Epoch 85 iteration 0147/0187: training loss 0.682 Epoch 85 iteration 0148/0187: training loss 0.682 Epoch 85 iteration 0149/0187: training loss 0.682 Epoch 85 iteration 0150/0187: training loss 0.683 Epoch 85 iteration 0151/0187: training loss 0.682 Epoch 85 iteration 0152/0187: training loss 0.683 Epoch 85 iteration 0153/0187: training loss 0.682 Epoch 85 iteration 0154/0187: training loss 0.683 Epoch 85 iteration 0155/0187: training loss 0.684 Epoch 85 iteration 0156/0187: training loss 0.683 Epoch 85 iteration 0157/0187: training loss 0.683 Epoch 85 iteration 0158/0187: training loss 0.684 Epoch 85 iteration 0159/0187: training loss 0.685 Epoch 85 iteration 0160/0187: training loss 0.685 Epoch 85 iteration 0161/0187: training loss 0.684 Epoch 85 iteration 0162/0187: training loss 0.684 Epoch 85 iteration 0163/0187: training loss 0.684 Epoch 85 iteration 0164/0187: training loss 0.683 Epoch 85 iteration 0165/0187: training loss 0.682 Epoch 85 iteration 0166/0187: training loss 0.682 Epoch 85 iteration 0167/0187: training loss 0.682 Epoch 85 iteration 0168/0187: training loss 0.682 Epoch 85 iteration 0169/0187: training loss 0.683 Epoch 85 iteration 0170/0187: training loss 0.682 Epoch 85 iteration 0171/0187: training loss 0.682 Epoch 85 iteration 0172/0187: training loss 0.682 Epoch 85 iteration 0173/0187: training loss 0.681 Epoch 85 iteration 0174/0187: training loss 0.681 Epoch 85 iteration 0175/0187: training loss 0.681 Epoch 85 iteration 0176/0187: training loss 0.680 Epoch 85 iteration 0177/0187: training loss 0.680 Epoch 85 iteration 0178/0187: training loss 0.680 Epoch 85 iteration 0179/0187: training loss 0.681 Epoch 85 iteration 0180/0187: training loss 0.682 Epoch 85 iteration 0181/0187: training loss 0.682 Epoch 85 iteration 0182/0187: training loss 0.681 Epoch 85 iteration 0183/0187: training loss 0.681 Epoch 85 iteration 0184/0187: training loss 0.681 Epoch 85 iteration 0185/0187: training loss 0.683 Epoch 85 iteration 0186/0187: training loss 0.684 Epoch 85 iteration 0187/0187: training loss 0.684 Epoch 85 validation pixAcc: 0.875, mIoU: 0.390 Epoch 86 iteration 0001/0187: training loss 0.635 Epoch 86 iteration 0002/0187: training loss 0.638 Epoch 86 iteration 0003/0187: training loss 0.663 Epoch 86 iteration 0004/0187: training loss 0.668 Epoch 86 iteration 0005/0187: training loss 0.678 Epoch 86 iteration 0006/0187: training loss 0.669 Epoch 86 iteration 0007/0187: training loss 0.655 Epoch 86 iteration 0008/0187: training loss 0.673 Epoch 86 iteration 0009/0187: training loss 0.683 Epoch 86 iteration 0010/0187: training loss 0.683 Epoch 86 iteration 0011/0187: training loss 0.674 Epoch 86 iteration 0012/0187: training loss 0.675 Epoch 86 iteration 0013/0187: training loss 0.681 Epoch 86 iteration 0014/0187: training loss 0.687 Epoch 86 iteration 0015/0187: training loss 0.691 Epoch 86 iteration 0016/0187: training loss 0.684 Epoch 86 iteration 0017/0187: training loss 0.681 Epoch 86 iteration 0018/0187: training loss 0.679 Epoch 86 iteration 0019/0187: training loss 0.684 Epoch 86 iteration 0020/0187: training loss 0.692 Epoch 86 iteration 0021/0187: training loss 0.694 Epoch 86 iteration 0022/0187: training loss 0.697 Epoch 86 iteration 0023/0187: training loss 0.699 Epoch 86 iteration 0024/0187: training loss 0.695 Epoch 86 iteration 0025/0187: training loss 0.691 Epoch 86 iteration 0026/0187: training loss 0.688 Epoch 86 iteration 0027/0187: training loss 0.688 Epoch 86 iteration 0028/0187: training loss 0.687 Epoch 86 iteration 0029/0187: training loss 0.682 Epoch 86 iteration 0030/0187: training loss 0.686 Epoch 86 iteration 0031/0187: training loss 0.696 Epoch 86 iteration 0032/0187: training loss 0.697 Epoch 86 iteration 0033/0187: training loss 0.696 Epoch 86 iteration 0034/0187: training loss 0.689 Epoch 86 iteration 0035/0187: training loss 0.690 Epoch 86 iteration 0036/0187: training loss 0.684 Epoch 86 iteration 0037/0187: training loss 0.682 Epoch 86 iteration 0038/0187: training loss 0.687 Epoch 86 iteration 0039/0187: training loss 0.687 Epoch 86 iteration 0040/0187: training loss 0.687 Epoch 86 iteration 0041/0187: training loss 0.691 Epoch 86 iteration 0042/0187: training loss 0.689 Epoch 86 iteration 0043/0187: training loss 0.685 Epoch 86 iteration 0044/0187: training loss 0.686 Epoch 86 iteration 0045/0187: training loss 0.685 Epoch 86 iteration 0046/0187: training loss 0.683 Epoch 86 iteration 0047/0187: training loss 0.682 Epoch 86 iteration 0048/0187: training loss 0.685 Epoch 86 iteration 0049/0187: training loss 0.685 Epoch 86 iteration 0050/0187: training loss 0.682 Epoch 86 iteration 0051/0187: training loss 0.685 Epoch 86 iteration 0052/0187: training loss 0.686 Epoch 86 iteration 0053/0187: training loss 0.687 Epoch 86 iteration 0054/0187: training loss 0.690 Epoch 86 iteration 0055/0187: training loss 0.691 Epoch 86 iteration 0056/0187: training loss 0.693 Epoch 86 iteration 0057/0187: training loss 0.691 Epoch 86 iteration 0058/0187: training loss 0.691 Epoch 86 iteration 0059/0187: training loss 0.691 Epoch 86 iteration 0060/0187: training loss 0.690 Epoch 86 iteration 0061/0187: training loss 0.690 Epoch 86 iteration 0062/0187: training loss 0.690 Epoch 86 iteration 0063/0187: training loss 0.690 Epoch 86 iteration 0064/0187: training loss 0.690 Epoch 86 iteration 0065/0187: training loss 0.690 Epoch 86 iteration 0066/0187: training loss 0.690 Epoch 86 iteration 0067/0187: training loss 0.687 Epoch 86 iteration 0068/0187: training loss 0.690 Epoch 86 iteration 0069/0187: training loss 0.689 Epoch 86 iteration 0070/0187: training loss 0.690 Epoch 86 iteration 0071/0187: training loss 0.689 Epoch 86 iteration 0072/0187: training loss 0.689 Epoch 86 iteration 0073/0187: training loss 0.690 Epoch 86 iteration 0074/0187: training loss 0.693 Epoch 86 iteration 0075/0187: training loss 0.692 Epoch 86 iteration 0076/0187: training loss 0.691 Epoch 86 iteration 0077/0187: training loss 0.690 Epoch 86 iteration 0078/0187: training loss 0.690 Epoch 86 iteration 0079/0187: training loss 0.689 Epoch 86 iteration 0080/0187: training loss 0.689 Epoch 86 iteration 0081/0187: training loss 0.687 Epoch 86 iteration 0082/0187: training loss 0.691 Epoch 86 iteration 0083/0187: training loss 0.690 Epoch 86 iteration 0084/0187: training loss 0.690 Epoch 86 iteration 0085/0187: training loss 0.689 Epoch 86 iteration 0086/0187: training loss 0.689 Epoch 86 iteration 0087/0187: training loss 0.689 Epoch 86 iteration 0088/0187: training loss 0.690 Epoch 86 iteration 0089/0187: training loss 0.690 Epoch 86 iteration 0090/0187: training loss 0.690 Epoch 86 iteration 0091/0188: training loss 0.690 Epoch 86 iteration 0092/0188: training loss 0.689 Epoch 86 iteration 0093/0188: training loss 0.689 Epoch 86 iteration 0094/0188: training loss 0.689 Epoch 86 iteration 0095/0188: training loss 0.690 Epoch 86 iteration 0096/0188: training loss 0.690 Epoch 86 iteration 0097/0188: training loss 0.691 Epoch 86 iteration 0098/0188: training loss 0.693 Epoch 86 iteration 0099/0188: training loss 0.692 Epoch 86 iteration 0100/0188: training loss 0.692 Epoch 86 iteration 0101/0188: training loss 0.692 Epoch 86 iteration 0102/0188: training loss 0.691 Epoch 86 iteration 0103/0188: training loss 0.692 Epoch 86 iteration 0104/0188: training loss 0.693 Epoch 86 iteration 0105/0188: training loss 0.691 Epoch 86 iteration 0106/0188: training loss 0.690 Epoch 86 iteration 0107/0188: training loss 0.694 Epoch 86 iteration 0108/0188: training loss 0.694 Epoch 86 iteration 0109/0188: training loss 0.693 Epoch 86 iteration 0110/0188: training loss 0.692 Epoch 86 iteration 0111/0188: training loss 0.691 Epoch 86 iteration 0112/0188: training loss 0.691 Epoch 86 iteration 0113/0188: training loss 0.690 Epoch 86 iteration 0114/0188: training loss 0.689 Epoch 86 iteration 0115/0188: training loss 0.689 Epoch 86 iteration 0116/0188: training loss 0.689 Epoch 86 iteration 0117/0188: training loss 0.690 Epoch 86 iteration 0118/0188: training loss 0.689 Epoch 86 iteration 0119/0188: training loss 0.688 Epoch 86 iteration 0120/0188: training loss 0.688 Epoch 86 iteration 0121/0188: training loss 0.688 Epoch 86 iteration 0122/0188: training loss 0.687 Epoch 86 iteration 0123/0188: training loss 0.687 Epoch 86 iteration 0124/0188: training loss 0.686 Epoch 86 iteration 0125/0188: training loss 0.686 Epoch 86 iteration 0126/0188: training loss 0.686 Epoch 86 iteration 0127/0188: training loss 0.685 Epoch 86 iteration 0128/0188: training loss 0.685 Epoch 86 iteration 0129/0188: training loss 0.685 Epoch 86 iteration 0130/0188: training loss 0.685 Epoch 86 iteration 0131/0188: training loss 0.685 Epoch 86 iteration 0132/0188: training loss 0.687 Epoch 86 iteration 0133/0188: training loss 0.686 Epoch 86 iteration 0134/0188: training loss 0.686 Epoch 86 iteration 0135/0188: training loss 0.685 Epoch 86 iteration 0136/0188: training loss 0.685 Epoch 86 iteration 0137/0188: training loss 0.684 Epoch 86 iteration 0138/0188: training loss 0.685 Epoch 86 iteration 0139/0188: training loss 0.684 Epoch 86 iteration 0140/0188: training loss 0.684 Epoch 86 iteration 0141/0188: training loss 0.684 Epoch 86 iteration 0142/0188: training loss 0.683 Epoch 86 iteration 0143/0188: training loss 0.683 Epoch 86 iteration 0144/0188: training loss 0.683 Epoch 86 iteration 0145/0188: training loss 0.683 Epoch 86 iteration 0146/0188: training loss 0.683 Epoch 86 iteration 0147/0188: training loss 0.683 Epoch 86 iteration 0148/0188: training loss 0.683 Epoch 86 iteration 0149/0188: training loss 0.684 Epoch 86 iteration 0150/0188: training loss 0.685 Epoch 86 iteration 0151/0188: training loss 0.685 Epoch 86 iteration 0152/0188: training loss 0.686 Epoch 86 iteration 0153/0188: training loss 0.685 Epoch 86 iteration 0154/0188: training loss 0.686 Epoch 86 iteration 0155/0188: training loss 0.687 Epoch 86 iteration 0156/0188: training loss 0.686 Epoch 86 iteration 0157/0188: training loss 0.687 Epoch 86 iteration 0158/0188: training loss 0.686 Epoch 86 iteration 0159/0188: training loss 0.686 Epoch 86 iteration 0160/0188: training loss 0.687 Epoch 86 iteration 0161/0188: training loss 0.687 Epoch 86 iteration 0162/0188: training loss 0.688 Epoch 86 iteration 0163/0188: training loss 0.688 Epoch 86 iteration 0164/0188: training loss 0.688 Epoch 86 iteration 0165/0188: training loss 0.688 Epoch 86 iteration 0166/0188: training loss 0.689 Epoch 86 iteration 0167/0188: training loss 0.691 Epoch 86 iteration 0168/0188: training loss 0.690 Epoch 86 iteration 0169/0188: training loss 0.691 Epoch 86 iteration 0170/0188: training loss 0.691 Epoch 86 iteration 0171/0188: training loss 0.691 Epoch 86 iteration 0172/0188: training loss 0.691 Epoch 86 iteration 0173/0188: training loss 0.691 Epoch 86 iteration 0174/0188: training loss 0.692 Epoch 86 iteration 0175/0188: training loss 0.691 Epoch 86 iteration 0176/0188: training loss 0.691 Epoch 86 iteration 0177/0188: training loss 0.691 Epoch 86 iteration 0178/0188: training loss 0.690 Epoch 86 iteration 0179/0188: training loss 0.690 Epoch 86 iteration 0180/0188: training loss 0.689 Epoch 86 iteration 0181/0188: training loss 0.689 Epoch 86 iteration 0182/0188: training loss 0.689 Epoch 86 iteration 0183/0188: training loss 0.689 Epoch 86 iteration 0184/0188: training loss 0.689 Epoch 86 iteration 0185/0188: training loss 0.689 Epoch 86 iteration 0186/0188: training loss 0.690 Epoch 86 validation pixAcc: 0.875, mIoU: 0.390 Epoch 87 iteration 0001/0187: training loss 0.689 Epoch 87 iteration 0002/0187: training loss 0.686 Epoch 87 iteration 0003/0187: training loss 0.662 Epoch 87 iteration 0004/0187: training loss 0.650 Epoch 87 iteration 0005/0187: training loss 0.650 Epoch 87 iteration 0006/0187: training loss 0.655 Epoch 87 iteration 0007/0187: training loss 0.672 Epoch 87 iteration 0008/0187: training loss 0.678 Epoch 87 iteration 0009/0187: training loss 0.699 Epoch 87 iteration 0010/0187: training loss 0.701 Epoch 87 iteration 0011/0187: training loss 0.695 Epoch 87 iteration 0012/0187: training loss 0.703 Epoch 87 iteration 0013/0187: training loss 0.697 Epoch 87 iteration 0014/0187: training loss 0.689 Epoch 87 iteration 0015/0187: training loss 0.684 Epoch 87 iteration 0016/0187: training loss 0.678 Epoch 87 iteration 0017/0187: training loss 0.676 Epoch 87 iteration 0018/0187: training loss 0.683 Epoch 87 iteration 0019/0187: training loss 0.687 Epoch 87 iteration 0020/0187: training loss 0.694 Epoch 87 iteration 0021/0187: training loss 0.692 Epoch 87 iteration 0022/0187: training loss 0.700 Epoch 87 iteration 0023/0187: training loss 0.702 Epoch 87 iteration 0024/0187: training loss 0.701 Epoch 87 iteration 0025/0187: training loss 0.695 Epoch 87 iteration 0026/0187: training loss 0.689 Epoch 87 iteration 0027/0187: training loss 0.694 Epoch 87 iteration 0028/0187: training loss 0.692 Epoch 87 iteration 0029/0187: training loss 0.693 Epoch 87 iteration 0030/0187: training loss 0.700 Epoch 87 iteration 0031/0187: training loss 0.694 Epoch 87 iteration 0032/0187: training loss 0.702 Epoch 87 iteration 0033/0187: training loss 0.703 Epoch 87 iteration 0034/0187: training loss 0.701 Epoch 87 iteration 0035/0187: training loss 0.700 Epoch 87 iteration 0036/0187: training loss 0.698 Epoch 87 iteration 0037/0187: training loss 0.695 Epoch 87 iteration 0038/0187: training loss 0.697 Epoch 87 iteration 0039/0187: training loss 0.699 Epoch 87 iteration 0040/0187: training loss 0.698 Epoch 87 iteration 0041/0187: training loss 0.700 Epoch 87 iteration 0042/0187: training loss 0.700 Epoch 87 iteration 0043/0187: training loss 0.698 Epoch 87 iteration 0044/0187: training loss 0.695 Epoch 87 iteration 0045/0187: training loss 0.693 Epoch 87 iteration 0046/0187: training loss 0.692 Epoch 87 iteration 0047/0187: training loss 0.694 Epoch 87 iteration 0048/0187: training loss 0.698 Epoch 87 iteration 0049/0187: training loss 0.701 Epoch 87 iteration 0050/0187: training loss 0.700 Epoch 87 iteration 0051/0187: training loss 0.698 Epoch 87 iteration 0052/0187: training loss 0.696 Epoch 87 iteration 0053/0187: training loss 0.695 Epoch 87 iteration 0054/0187: training loss 0.693 Epoch 87 iteration 0055/0187: training loss 0.692 Epoch 87 iteration 0056/0187: training loss 0.691 Epoch 87 iteration 0057/0187: training loss 0.691 Epoch 87 iteration 0058/0187: training loss 0.689 Epoch 87 iteration 0059/0187: training loss 0.688 Epoch 87 iteration 0060/0187: training loss 0.687 Epoch 87 iteration 0061/0187: training loss 0.685 Epoch 87 iteration 0062/0187: training loss 0.684 Epoch 87 iteration 0063/0187: training loss 0.684 Epoch 87 iteration 0064/0187: training loss 0.683 Epoch 87 iteration 0065/0187: training loss 0.683 Epoch 87 iteration 0066/0187: training loss 0.683 Epoch 87 iteration 0067/0187: training loss 0.684 Epoch 87 iteration 0068/0187: training loss 0.684 Epoch 87 iteration 0069/0187: training loss 0.684 Epoch 87 iteration 0070/0187: training loss 0.684 Epoch 87 iteration 0071/0187: training loss 0.685 Epoch 87 iteration 0072/0187: training loss 0.685 Epoch 87 iteration 0073/0187: training loss 0.685 Epoch 87 iteration 0074/0187: training loss 0.683 Epoch 87 iteration 0075/0187: training loss 0.682 Epoch 87 iteration 0076/0187: training loss 0.684 Epoch 87 iteration 0077/0187: training loss 0.683 Epoch 87 iteration 0078/0187: training loss 0.682 Epoch 87 iteration 0079/0187: training loss 0.684 Epoch 87 iteration 0080/0187: training loss 0.681 Epoch 87 iteration 0081/0187: training loss 0.682 Epoch 87 iteration 0082/0187: training loss 0.682 Epoch 87 iteration 0083/0187: training loss 0.681 Epoch 87 iteration 0084/0187: training loss 0.681 Epoch 87 iteration 0085/0187: training loss 0.680 Epoch 87 iteration 0086/0187: training loss 0.679 Epoch 87 iteration 0087/0187: training loss 0.679 Epoch 87 iteration 0088/0187: training loss 0.681 Epoch 87 iteration 0089/0187: training loss 0.680 Epoch 87 iteration 0090/0187: training loss 0.681 Epoch 87 iteration 0091/0187: training loss 0.682 Epoch 87 iteration 0092/0187: training loss 0.681 Epoch 87 iteration 0093/0187: training loss 0.682 Epoch 87 iteration 0094/0187: training loss 0.682 Epoch 87 iteration 0095/0187: training loss 0.682 Epoch 87 iteration 0096/0187: training loss 0.680 Epoch 87 iteration 0097/0187: training loss 0.680 Epoch 87 iteration 0098/0187: training loss 0.681 Epoch 87 iteration 0099/0187: training loss 0.681 Epoch 87 iteration 0100/0187: training loss 0.680 Epoch 87 iteration 0101/0187: training loss 0.679 Epoch 87 iteration 0102/0187: training loss 0.678 Epoch 87 iteration 0103/0187: training loss 0.677 Epoch 87 iteration 0104/0187: training loss 0.677 Epoch 87 iteration 0105/0187: training loss 0.678 Epoch 87 iteration 0106/0187: training loss 0.678 Epoch 87 iteration 0107/0187: training loss 0.679 Epoch 87 iteration 0108/0187: training loss 0.680 Epoch 87 iteration 0109/0187: training loss 0.680 Epoch 87 iteration 0110/0187: training loss 0.681 Epoch 87 iteration 0111/0187: training loss 0.681 Epoch 87 iteration 0112/0187: training loss 0.680 Epoch 87 iteration 0113/0187: training loss 0.680 Epoch 87 iteration 0114/0187: training loss 0.682 Epoch 87 iteration 0115/0187: training loss 0.682 Epoch 87 iteration 0116/0187: training loss 0.683 Epoch 87 iteration 0117/0187: training loss 0.683 Epoch 87 iteration 0118/0187: training loss 0.683 Epoch 87 iteration 0119/0187: training loss 0.683 Epoch 87 iteration 0120/0187: training loss 0.683 Epoch 87 iteration 0121/0187: training loss 0.684 Epoch 87 iteration 0122/0187: training loss 0.684 Epoch 87 iteration 0123/0187: training loss 0.685 Epoch 87 iteration 0124/0187: training loss 0.686 Epoch 87 iteration 0125/0187: training loss 0.687 Epoch 87 iteration 0126/0187: training loss 0.687 Epoch 87 iteration 0127/0187: training loss 0.687 Epoch 87 iteration 0128/0187: training loss 0.687 Epoch 87 iteration 0129/0187: training loss 0.687 Epoch 87 iteration 0130/0187: training loss 0.687 Epoch 87 iteration 0131/0187: training loss 0.687 Epoch 87 iteration 0132/0187: training loss 0.687 Epoch 87 iteration 0133/0187: training loss 0.686 Epoch 87 iteration 0134/0187: training loss 0.686 Epoch 87 iteration 0135/0187: training loss 0.686 Epoch 87 iteration 0136/0187: training loss 0.685 Epoch 87 iteration 0137/0187: training loss 0.685 Epoch 87 iteration 0138/0187: training loss 0.685 Epoch 87 iteration 0139/0187: training loss 0.685 Epoch 87 iteration 0140/0187: training loss 0.685 Epoch 87 iteration 0141/0187: training loss 0.684 Epoch 87 iteration 0142/0187: training loss 0.683 Epoch 87 iteration 0143/0187: training loss 0.683 Epoch 87 iteration 0144/0187: training loss 0.683 Epoch 87 iteration 0145/0187: training loss 0.683 Epoch 87 iteration 0146/0187: training loss 0.683 Epoch 87 iteration 0147/0187: training loss 0.683 Epoch 87 iteration 0148/0187: training loss 0.683 Epoch 87 iteration 0149/0187: training loss 0.685 Epoch 87 iteration 0150/0187: training loss 0.685 Epoch 87 iteration 0151/0187: training loss 0.686 Epoch 87 iteration 0152/0187: training loss 0.685 Epoch 87 iteration 0153/0187: training loss 0.685 Epoch 87 iteration 0154/0187: training loss 0.685 Epoch 87 iteration 0155/0187: training loss 0.684 Epoch 87 iteration 0156/0187: training loss 0.684 Epoch 87 iteration 0157/0187: training loss 0.684 Epoch 87 iteration 0158/0187: training loss 0.685 Epoch 87 iteration 0159/0187: training loss 0.685 Epoch 87 iteration 0160/0187: training loss 0.684 Epoch 87 iteration 0161/0187: training loss 0.685 Epoch 87 iteration 0162/0187: training loss 0.685 Epoch 87 iteration 0163/0187: training loss 0.684 Epoch 87 iteration 0164/0187: training loss 0.686 Epoch 87 iteration 0165/0187: training loss 0.686 Epoch 87 iteration 0166/0187: training loss 0.688 Epoch 87 iteration 0167/0187: training loss 0.687 Epoch 87 iteration 0168/0187: training loss 0.688 Epoch 87 iteration 0169/0187: training loss 0.688 Epoch 87 iteration 0170/0187: training loss 0.689 Epoch 87 iteration 0171/0187: training loss 0.688 Epoch 87 iteration 0172/0187: training loss 0.688 Epoch 87 iteration 0173/0187: training loss 0.688 Epoch 87 iteration 0174/0187: training loss 0.688 Epoch 87 iteration 0175/0187: training loss 0.688 Epoch 87 iteration 0176/0187: training loss 0.688 Epoch 87 iteration 0177/0187: training loss 0.688 Epoch 87 iteration 0178/0187: training loss 0.688 Epoch 87 iteration 0179/0187: training loss 0.688 Epoch 87 iteration 0180/0187: training loss 0.688 Epoch 87 iteration 0181/0187: training loss 0.688 Epoch 87 iteration 0182/0187: training loss 0.688 Epoch 87 iteration 0183/0187: training loss 0.688 Epoch 87 iteration 0184/0187: training loss 0.688 Epoch 87 iteration 0185/0187: training loss 0.688 Epoch 87 iteration 0186/0187: training loss 0.688 Epoch 87 iteration 0187/0187: training loss 0.688 Epoch 87 validation pixAcc: 0.875, mIoU: 0.395 Epoch 88 iteration 0001/0187: training loss 0.592 Epoch 88 iteration 0002/0187: training loss 0.554 Epoch 88 iteration 0003/0187: training loss 0.554 Epoch 88 iteration 0004/0187: training loss 0.559 Epoch 88 iteration 0005/0187: training loss 0.610 Epoch 88 iteration 0006/0187: training loss 0.640 Epoch 88 iteration 0007/0187: training loss 0.658 Epoch 88 iteration 0008/0187: training loss 0.650 Epoch 88 iteration 0009/0187: training loss 0.656 Epoch 88 iteration 0010/0187: training loss 0.659 Epoch 88 iteration 0011/0187: training loss 0.661 Epoch 88 iteration 0012/0187: training loss 0.648 Epoch 88 iteration 0013/0187: training loss 0.649 Epoch 88 iteration 0014/0187: training loss 0.665 Epoch 88 iteration 0015/0187: training loss 0.661 Epoch 88 iteration 0016/0187: training loss 0.662 Epoch 88 iteration 0017/0187: training loss 0.658 Epoch 88 iteration 0018/0187: training loss 0.660 Epoch 88 iteration 0019/0187: training loss 0.672 Epoch 88 iteration 0020/0187: training loss 0.673 Epoch 88 iteration 0021/0187: training loss 0.670 Epoch 88 iteration 0022/0187: training loss 0.671 Epoch 88 iteration 0023/0187: training loss 0.667 Epoch 88 iteration 0024/0187: training loss 0.667 Epoch 88 iteration 0025/0187: training loss 0.669 Epoch 88 iteration 0026/0187: training loss 0.673 Epoch 88 iteration 0027/0187: training loss 0.672 Epoch 88 iteration 0028/0187: training loss 0.671 Epoch 88 iteration 0029/0187: training loss 0.672 Epoch 88 iteration 0030/0187: training loss 0.675 Epoch 88 iteration 0031/0187: training loss 0.672 Epoch 88 iteration 0032/0187: training loss 0.671 Epoch 88 iteration 0033/0187: training loss 0.670 Epoch 88 iteration 0034/0187: training loss 0.674 Epoch 88 iteration 0035/0187: training loss 0.676 Epoch 88 iteration 0036/0187: training loss 0.674 Epoch 88 iteration 0037/0187: training loss 0.674 Epoch 88 iteration 0038/0187: training loss 0.674 Epoch 88 iteration 0039/0187: training loss 0.675 Epoch 88 iteration 0040/0187: training loss 0.676 Epoch 88 iteration 0041/0187: training loss 0.678 Epoch 88 iteration 0042/0187: training loss 0.678 Epoch 88 iteration 0043/0187: training loss 0.677 Epoch 88 iteration 0044/0187: training loss 0.685 Epoch 88 iteration 0045/0187: training loss 0.689 Epoch 88 iteration 0046/0187: training loss 0.690 Epoch 88 iteration 0047/0187: training loss 0.691 Epoch 88 iteration 0048/0187: training loss 0.689 Epoch 88 iteration 0049/0187: training loss 0.689 Epoch 88 iteration 0050/0187: training loss 0.688 Epoch 88 iteration 0051/0187: training loss 0.689 Epoch 88 iteration 0052/0187: training loss 0.688 Epoch 88 iteration 0053/0187: training loss 0.689 Epoch 88 iteration 0054/0187: training loss 0.689 Epoch 88 iteration 0055/0187: training loss 0.687 Epoch 88 iteration 0056/0187: training loss 0.686 Epoch 88 iteration 0057/0187: training loss 0.684 Epoch 88 iteration 0058/0187: training loss 0.682 Epoch 88 iteration 0059/0187: training loss 0.681 Epoch 88 iteration 0060/0187: training loss 0.679 Epoch 88 iteration 0061/0187: training loss 0.680 Epoch 88 iteration 0062/0187: training loss 0.681 Epoch 88 iteration 0063/0187: training loss 0.679 Epoch 88 iteration 0064/0187: training loss 0.678 Epoch 88 iteration 0065/0187: training loss 0.679 Epoch 88 iteration 0066/0187: training loss 0.681 Epoch 88 iteration 0067/0187: training loss 0.679 Epoch 88 iteration 0068/0187: training loss 0.676 Epoch 88 iteration 0069/0187: training loss 0.675 Epoch 88 iteration 0070/0187: training loss 0.674 Epoch 88 iteration 0071/0187: training loss 0.675 Epoch 88 iteration 0072/0187: training loss 0.677 Epoch 88 iteration 0073/0187: training loss 0.675 Epoch 88 iteration 0074/0187: training loss 0.673 Epoch 88 iteration 0075/0187: training loss 0.674 Epoch 88 iteration 0076/0187: training loss 0.674 Epoch 88 iteration 0077/0187: training loss 0.675 Epoch 88 iteration 0078/0187: training loss 0.675 Epoch 88 iteration 0079/0187: training loss 0.678 Epoch 88 iteration 0080/0187: training loss 0.678 Epoch 88 iteration 0081/0187: training loss 0.678 Epoch 88 iteration 0082/0187: training loss 0.678 Epoch 88 iteration 0083/0187: training loss 0.679 Epoch 88 iteration 0084/0187: training loss 0.679 Epoch 88 iteration 0085/0187: training loss 0.680 Epoch 88 iteration 0086/0187: training loss 0.680 Epoch 88 iteration 0087/0187: training loss 0.681 Epoch 88 iteration 0088/0187: training loss 0.682 Epoch 88 iteration 0089/0187: training loss 0.682 Epoch 88 iteration 0090/0187: training loss 0.683 Epoch 88 iteration 0091/0188: training loss 0.682 Epoch 88 iteration 0092/0188: training loss 0.681 Epoch 88 iteration 0093/0188: training loss 0.681 Epoch 88 iteration 0094/0188: training loss 0.681 Epoch 88 iteration 0095/0188: training loss 0.681 Epoch 88 iteration 0096/0188: training loss 0.682 Epoch 88 iteration 0097/0188: training loss 0.682 Epoch 88 iteration 0098/0188: training loss 0.683 Epoch 88 iteration 0099/0188: training loss 0.685 Epoch 88 iteration 0100/0188: training loss 0.685 Epoch 88 iteration 0101/0188: training loss 0.684 Epoch 88 iteration 0102/0188: training loss 0.684 Epoch 88 iteration 0103/0188: training loss 0.684 Epoch 88 iteration 0104/0188: training loss 0.684 Epoch 88 iteration 0105/0188: training loss 0.684 Epoch 88 iteration 0106/0188: training loss 0.686 Epoch 88 iteration 0107/0188: training loss 0.685 Epoch 88 iteration 0108/0188: training loss 0.686 Epoch 88 iteration 0109/0188: training loss 0.686 Epoch 88 iteration 0110/0188: training loss 0.686 Epoch 88 iteration 0111/0188: training loss 0.685 Epoch 88 iteration 0112/0188: training loss 0.683 Epoch 88 iteration 0113/0188: training loss 0.685 Epoch 88 iteration 0114/0188: training loss 0.687 Epoch 88 iteration 0115/0188: training loss 0.686 Epoch 88 iteration 0116/0188: training loss 0.687 Epoch 88 iteration 0117/0188: training loss 0.688 Epoch 88 iteration 0118/0188: training loss 0.688 Epoch 88 iteration 0119/0188: training loss 0.689 Epoch 88 iteration 0120/0188: training loss 0.689 Epoch 88 iteration 0121/0188: training loss 0.689 Epoch 88 iteration 0122/0188: training loss 0.689 Epoch 88 iteration 0123/0188: training loss 0.689 Epoch 88 iteration 0124/0188: training loss 0.688 Epoch 88 iteration 0125/0188: training loss 0.688 Epoch 88 iteration 0126/0188: training loss 0.689 Epoch 88 iteration 0127/0188: training loss 0.690 Epoch 88 iteration 0128/0188: training loss 0.688 Epoch 88 iteration 0129/0188: training loss 0.690 Epoch 88 iteration 0130/0188: training loss 0.689 Epoch 88 iteration 0131/0188: training loss 0.689 Epoch 88 iteration 0132/0188: training loss 0.690 Epoch 88 iteration 0133/0188: training loss 0.694 Epoch 88 iteration 0134/0188: training loss 0.692 Epoch 88 iteration 0135/0188: training loss 0.692 Epoch 88 iteration 0136/0188: training loss 0.693 Epoch 88 iteration 0137/0188: training loss 0.694 Epoch 88 iteration 0138/0188: training loss 0.694 Epoch 88 iteration 0139/0188: training loss 0.696 Epoch 88 iteration 0140/0188: training loss 0.697 Epoch 88 iteration 0141/0188: training loss 0.696 Epoch 88 iteration 0142/0188: training loss 0.696 Epoch 88 iteration 0143/0188: training loss 0.696 Epoch 88 iteration 0144/0188: training loss 0.698 Epoch 88 iteration 0145/0188: training loss 0.698 Epoch 88 iteration 0146/0188: training loss 0.697 Epoch 88 iteration 0147/0188: training loss 0.697 Epoch 88 iteration 0148/0188: training loss 0.697 Epoch 88 iteration 0149/0188: training loss 0.699 Epoch 88 iteration 0150/0188: training loss 0.698 Epoch 88 iteration 0151/0188: training loss 0.698 Epoch 88 iteration 0152/0188: training loss 0.697 Epoch 88 iteration 0153/0188: training loss 0.697 Epoch 88 iteration 0154/0188: training loss 0.697 Epoch 88 iteration 0155/0188: training loss 0.697 Epoch 88 iteration 0156/0188: training loss 0.696 Epoch 88 iteration 0157/0188: training loss 0.696 Epoch 88 iteration 0158/0188: training loss 0.696 Epoch 88 iteration 0159/0188: training loss 0.695 Epoch 88 iteration 0160/0188: training loss 0.695 Epoch 88 iteration 0161/0188: training loss 0.695 Epoch 88 iteration 0162/0188: training loss 0.694 Epoch 88 iteration 0163/0188: training loss 0.696 Epoch 88 iteration 0164/0188: training loss 0.695 Epoch 88 iteration 0165/0188: training loss 0.695 Epoch 88 iteration 0166/0188: training loss 0.695 Epoch 88 iteration 0167/0188: training loss 0.694 Epoch 88 iteration 0168/0188: training loss 0.694 Epoch 88 iteration 0169/0188: training loss 0.695 Epoch 88 iteration 0170/0188: training loss 0.694 Epoch 88 iteration 0171/0188: training loss 0.694 Epoch 88 iteration 0172/0188: training loss 0.694 Epoch 88 iteration 0173/0188: training loss 0.693 Epoch 88 iteration 0174/0188: training loss 0.693 Epoch 88 iteration 0175/0188: training loss 0.693 Epoch 88 iteration 0176/0188: training loss 0.692 Epoch 88 iteration 0177/0188: training loss 0.692 Epoch 88 iteration 0178/0188: training loss 0.691 Epoch 88 iteration 0179/0188: training loss 0.691 Epoch 88 iteration 0180/0188: training loss 0.691 Epoch 88 iteration 0181/0188: training loss 0.692 Epoch 88 iteration 0182/0188: training loss 0.692 Epoch 88 iteration 0183/0188: training loss 0.691 Epoch 88 iteration 0184/0188: training loss 0.692 Epoch 88 iteration 0185/0188: training loss 0.693 Epoch 88 iteration 0186/0188: training loss 0.692 Epoch 88 validation pixAcc: 0.875, mIoU: 0.390 Epoch 89 iteration 0001/0187: training loss 0.578 Epoch 89 iteration 0002/0187: training loss 0.644 Epoch 89 iteration 0003/0187: training loss 0.634 Epoch 89 iteration 0004/0187: training loss 0.627 Epoch 89 iteration 0005/0187: training loss 0.635 Epoch 89 iteration 0006/0187: training loss 0.629 Epoch 89 iteration 0007/0187: training loss 0.631 Epoch 89 iteration 0008/0187: training loss 0.631 Epoch 89 iteration 0009/0187: training loss 0.642 Epoch 89 iteration 0010/0187: training loss 0.665 Epoch 89 iteration 0011/0187: training loss 0.676 Epoch 89 iteration 0012/0187: training loss 0.671 Epoch 89 iteration 0013/0187: training loss 0.667 Epoch 89 iteration 0014/0187: training loss 0.673 Epoch 89 iteration 0015/0187: training loss 0.667 Epoch 89 iteration 0016/0187: training loss 0.674 Epoch 89 iteration 0017/0187: training loss 0.669 Epoch 89 iteration 0018/0187: training loss 0.675 Epoch 89 iteration 0019/0187: training loss 0.683 Epoch 89 iteration 0020/0187: training loss 0.690 Epoch 89 iteration 0021/0187: training loss 0.686 Epoch 89 iteration 0022/0187: training loss 0.680 Epoch 89 iteration 0023/0187: training loss 0.689 Epoch 89 iteration 0024/0187: training loss 0.689 Epoch 89 iteration 0025/0187: training loss 0.689 Epoch 89 iteration 0026/0187: training loss 0.690 Epoch 89 iteration 0027/0187: training loss 0.691 Epoch 89 iteration 0028/0187: training loss 0.688 Epoch 89 iteration 0029/0187: training loss 0.684 Epoch 89 iteration 0030/0187: training loss 0.693 Epoch 89 iteration 0031/0187: training loss 0.695 Epoch 89 iteration 0032/0187: training loss 0.699 Epoch 89 iteration 0033/0187: training loss 0.695 Epoch 89 iteration 0034/0187: training loss 0.695 Epoch 89 iteration 0035/0187: training loss 0.693 Epoch 89 iteration 0036/0187: training loss 0.693 Epoch 89 iteration 0037/0187: training loss 0.693 Epoch 89 iteration 0038/0187: training loss 0.692 Epoch 89 iteration 0039/0187: training loss 0.692 Epoch 89 iteration 0040/0187: training loss 0.690 Epoch 89 iteration 0041/0187: training loss 0.693 Epoch 89 iteration 0042/0187: training loss 0.693 Epoch 89 iteration 0043/0187: training loss 0.689 Epoch 89 iteration 0044/0187: training loss 0.691 Epoch 89 iteration 0045/0187: training loss 0.689 Epoch 89 iteration 0046/0187: training loss 0.687 Epoch 89 iteration 0047/0187: training loss 0.690 Epoch 89 iteration 0048/0187: training loss 0.688 Epoch 89 iteration 0049/0187: training loss 0.685 Epoch 89 iteration 0050/0187: training loss 0.685 Epoch 89 iteration 0051/0187: training loss 0.684 Epoch 89 iteration 0052/0187: training loss 0.687 Epoch 89 iteration 0053/0187: training loss 0.686 Epoch 89 iteration 0054/0187: training loss 0.684 Epoch 89 iteration 0055/0187: training loss 0.685 Epoch 89 iteration 0056/0187: training loss 0.685 Epoch 89 iteration 0057/0187: training loss 0.685 Epoch 89 iteration 0058/0187: training loss 0.683 Epoch 89 iteration 0059/0187: training loss 0.682 Epoch 89 iteration 0060/0187: training loss 0.681 Epoch 89 iteration 0061/0187: training loss 0.681 Epoch 89 iteration 0062/0187: training loss 0.682 Epoch 89 iteration 0063/0187: training loss 0.681 Epoch 89 iteration 0064/0187: training loss 0.682 Epoch 89 iteration 0065/0187: training loss 0.683 Epoch 89 iteration 0066/0187: training loss 0.680 Epoch 89 iteration 0067/0187: training loss 0.680 Epoch 89 iteration 0068/0187: training loss 0.682 Epoch 89 iteration 0069/0187: training loss 0.684 Epoch 89 iteration 0070/0187: training loss 0.683 Epoch 89 iteration 0071/0187: training loss 0.685 Epoch 89 iteration 0072/0187: training loss 0.684 Epoch 89 iteration 0073/0187: training loss 0.686 Epoch 89 iteration 0074/0187: training loss 0.684 Epoch 89 iteration 0075/0187: training loss 0.687 Epoch 89 iteration 0076/0187: training loss 0.686 Epoch 89 iteration 0077/0187: training loss 0.684 Epoch 89 iteration 0078/0187: training loss 0.686 Epoch 89 iteration 0079/0187: training loss 0.684 Epoch 89 iteration 0080/0187: training loss 0.685 Epoch 89 iteration 0081/0187: training loss 0.684 Epoch 89 iteration 0082/0187: training loss 0.686 Epoch 89 iteration 0083/0187: training loss 0.687 Epoch 89 iteration 0084/0187: training loss 0.687 Epoch 89 iteration 0085/0187: training loss 0.685 Epoch 89 iteration 0086/0187: training loss 0.685 Epoch 89 iteration 0087/0187: training loss 0.685 Epoch 89 iteration 0088/0187: training loss 0.685 Epoch 89 iteration 0089/0187: training loss 0.687 Epoch 89 iteration 0090/0187: training loss 0.686 Epoch 89 iteration 0091/0187: training loss 0.686 Epoch 89 iteration 0092/0187: training loss 0.686 Epoch 89 iteration 0093/0187: training loss 0.689 Epoch 89 iteration 0094/0187: training loss 0.689 Epoch 89 iteration 0095/0187: training loss 0.687 Epoch 89 iteration 0096/0187: training loss 0.688 Epoch 89 iteration 0097/0187: training loss 0.689 Epoch 89 iteration 0098/0187: training loss 0.688 Epoch 89 iteration 0099/0187: training loss 0.689 Epoch 89 iteration 0100/0187: training loss 0.688 Epoch 89 iteration 0101/0187: training loss 0.690 Epoch 89 iteration 0102/0187: training loss 0.690 Epoch 89 iteration 0103/0187: training loss 0.690 Epoch 89 iteration 0104/0187: training loss 0.688 Epoch 89 iteration 0105/0187: training loss 0.691 Epoch 89 iteration 0106/0187: training loss 0.690 Epoch 89 iteration 0107/0187: training loss 0.691 Epoch 89 iteration 0108/0187: training loss 0.691 Epoch 89 iteration 0109/0187: training loss 0.691 Epoch 89 iteration 0110/0187: training loss 0.691 Epoch 89 iteration 0111/0187: training loss 0.690 Epoch 89 iteration 0112/0187: training loss 0.689 Epoch 89 iteration 0113/0187: training loss 0.689 Epoch 89 iteration 0114/0187: training loss 0.689 Epoch 89 iteration 0115/0187: training loss 0.688 Epoch 89 iteration 0116/0187: training loss 0.687 Epoch 89 iteration 0117/0187: training loss 0.687 Epoch 89 iteration 0118/0187: training loss 0.688 Epoch 89 iteration 0119/0187: training loss 0.688 Epoch 89 iteration 0120/0187: training loss 0.688 Epoch 89 iteration 0121/0187: training loss 0.689 Epoch 89 iteration 0122/0187: training loss 0.688 Epoch 89 iteration 0123/0187: training loss 0.690 Epoch 89 iteration 0124/0187: training loss 0.690 Epoch 89 iteration 0125/0187: training loss 0.691 Epoch 89 iteration 0126/0187: training loss 0.690 Epoch 89 iteration 0127/0187: training loss 0.689 Epoch 89 iteration 0128/0187: training loss 0.689 Epoch 89 iteration 0129/0187: training loss 0.691 Epoch 89 iteration 0130/0187: training loss 0.690 Epoch 89 iteration 0131/0187: training loss 0.689 Epoch 89 iteration 0132/0187: training loss 0.687 Epoch 89 iteration 0133/0187: training loss 0.687 Epoch 89 iteration 0134/0187: training loss 0.688 Epoch 89 iteration 0135/0187: training loss 0.689 Epoch 89 iteration 0136/0187: training loss 0.689 Epoch 89 iteration 0137/0187: training loss 0.689 Epoch 89 iteration 0138/0187: training loss 0.689 Epoch 89 iteration 0139/0187: training loss 0.688 Epoch 89 iteration 0140/0187: training loss 0.688 Epoch 89 iteration 0141/0187: training loss 0.688 Epoch 89 iteration 0142/0187: training loss 0.689 Epoch 89 iteration 0143/0187: training loss 0.689 Epoch 89 iteration 0144/0187: training loss 0.689 Epoch 89 iteration 0145/0187: training loss 0.688 Epoch 89 iteration 0146/0187: training loss 0.689 Epoch 89 iteration 0147/0187: training loss 0.689 Epoch 89 iteration 0148/0187: training loss 0.690 Epoch 89 iteration 0149/0187: training loss 0.689 Epoch 89 iteration 0150/0187: training loss 0.689 Epoch 89 iteration 0151/0187: training loss 0.688 Epoch 89 iteration 0152/0187: training loss 0.689 Epoch 89 iteration 0153/0187: training loss 0.689 Epoch 89 iteration 0154/0187: training loss 0.689 Epoch 89 iteration 0155/0187: training loss 0.689 Epoch 89 iteration 0156/0187: training loss 0.690 Epoch 89 iteration 0157/0187: training loss 0.689 Epoch 89 iteration 0158/0187: training loss 0.689 Epoch 89 iteration 0159/0187: training loss 0.689 Epoch 89 iteration 0160/0187: training loss 0.689 Epoch 89 iteration 0161/0187: training loss 0.690 Epoch 89 iteration 0162/0187: training loss 0.690 Epoch 89 iteration 0163/0187: training loss 0.689 Epoch 89 iteration 0164/0187: training loss 0.689 Epoch 89 iteration 0165/0187: training loss 0.690 Epoch 89 iteration 0166/0187: training loss 0.689 Epoch 89 iteration 0167/0187: training loss 0.689 Epoch 89 iteration 0168/0187: training loss 0.688 Epoch 89 iteration 0169/0187: training loss 0.687 Epoch 89 iteration 0170/0187: training loss 0.688 Epoch 89 iteration 0171/0187: training loss 0.688 Epoch 89 iteration 0172/0187: training loss 0.688 Epoch 89 iteration 0173/0187: training loss 0.689 Epoch 89 iteration 0174/0187: training loss 0.689 Epoch 89 iteration 0175/0187: training loss 0.688 Epoch 89 iteration 0176/0187: training loss 0.688 Epoch 89 iteration 0177/0187: training loss 0.689 Epoch 89 iteration 0178/0187: training loss 0.688 Epoch 89 iteration 0179/0187: training loss 0.688 Epoch 89 iteration 0180/0187: training loss 0.687 Epoch 89 iteration 0181/0187: training loss 0.687 Epoch 89 iteration 0182/0187: training loss 0.688 Epoch 89 iteration 0183/0187: training loss 0.687 Epoch 89 iteration 0184/0187: training loss 0.687 Epoch 89 iteration 0185/0187: training loss 0.687 Epoch 89 iteration 0186/0187: training loss 0.687 Epoch 89 iteration 0187/0187: training loss 0.688 Epoch 89 validation pixAcc: 0.875, mIoU: 0.393 Epoch 90 iteration 0001/0187: training loss 0.736 Epoch 90 iteration 0002/0187: training loss 0.750 Epoch 90 iteration 0003/0187: training loss 0.745 Epoch 90 iteration 0004/0187: training loss 0.769 Epoch 90 iteration 0005/0187: training loss 0.761 Epoch 90 iteration 0006/0187: training loss 0.747 Epoch 90 iteration 0007/0187: training loss 0.735 Epoch 90 iteration 0008/0187: training loss 0.721 Epoch 90 iteration 0009/0187: training loss 0.718 Epoch 90 iteration 0010/0187: training loss 0.720 Epoch 90 iteration 0011/0187: training loss 0.713 Epoch 90 iteration 0012/0187: training loss 0.710 Epoch 90 iteration 0013/0187: training loss 0.706 Epoch 90 iteration 0014/0187: training loss 0.696 Epoch 90 iteration 0015/0187: training loss 0.694 Epoch 90 iteration 0016/0187: training loss 0.684 Epoch 90 iteration 0017/0187: training loss 0.687 Epoch 90 iteration 0018/0187: training loss 0.696 Epoch 90 iteration 0019/0187: training loss 0.688 Epoch 90 iteration 0020/0187: training loss 0.687 Epoch 90 iteration 0021/0187: training loss 0.689 Epoch 90 iteration 0022/0187: training loss 0.693 Epoch 90 iteration 0023/0187: training loss 0.687 Epoch 90 iteration 0024/0187: training loss 0.680 Epoch 90 iteration 0025/0187: training loss 0.679 Epoch 90 iteration 0026/0187: training loss 0.676 Epoch 90 iteration 0027/0187: training loss 0.672 Epoch 90 iteration 0028/0187: training loss 0.667 Epoch 90 iteration 0029/0187: training loss 0.686 Epoch 90 iteration 0030/0187: training loss 0.685 Epoch 90 iteration 0031/0187: training loss 0.682 Epoch 90 iteration 0032/0187: training loss 0.686 Epoch 90 iteration 0033/0187: training loss 0.684 Epoch 90 iteration 0034/0187: training loss 0.684 Epoch 90 iteration 0035/0187: training loss 0.682 Epoch 90 iteration 0036/0187: training loss 0.685 Epoch 90 iteration 0037/0187: training loss 0.685 Epoch 90 iteration 0038/0187: training loss 0.683 Epoch 90 iteration 0039/0187: training loss 0.682 Epoch 90 iteration 0040/0187: training loss 0.681 Epoch 90 iteration 0041/0187: training loss 0.688 Epoch 90 iteration 0042/0187: training loss 0.687 Epoch 90 iteration 0043/0187: training loss 0.685 Epoch 90 iteration 0044/0187: training loss 0.686 Epoch 90 iteration 0045/0187: training loss 0.684 Epoch 90 iteration 0046/0187: training loss 0.683 Epoch 90 iteration 0047/0187: training loss 0.687 Epoch 90 iteration 0048/0187: training loss 0.692 Epoch 90 iteration 0049/0187: training loss 0.690 Epoch 90 iteration 0050/0187: training loss 0.688 Epoch 90 iteration 0051/0187: training loss 0.687 Epoch 90 iteration 0052/0187: training loss 0.688 Epoch 90 iteration 0053/0187: training loss 0.685 Epoch 90 iteration 0054/0187: training loss 0.682 Epoch 90 iteration 0055/0187: training loss 0.685 Epoch 90 iteration 0056/0187: training loss 0.686 Epoch 90 iteration 0057/0187: training loss 0.684 Epoch 90 iteration 0058/0187: training loss 0.685 Epoch 90 iteration 0059/0187: training loss 0.689 Epoch 90 iteration 0060/0187: training loss 0.685 Epoch 90 iteration 0061/0187: training loss 0.684 Epoch 90 iteration 0062/0187: training loss 0.682 Epoch 90 iteration 0063/0187: training loss 0.684 Epoch 90 iteration 0064/0187: training loss 0.685 Epoch 90 iteration 0065/0187: training loss 0.683 Epoch 90 iteration 0066/0187: training loss 0.680 Epoch 90 iteration 0067/0187: training loss 0.682 Epoch 90 iteration 0068/0187: training loss 0.681 Epoch 90 iteration 0069/0187: training loss 0.685 Epoch 90 iteration 0070/0187: training loss 0.685 Epoch 90 iteration 0071/0187: training loss 0.685 Epoch 90 iteration 0072/0187: training loss 0.686 Epoch 90 iteration 0073/0187: training loss 0.686 Epoch 90 iteration 0074/0187: training loss 0.685 Epoch 90 iteration 0075/0187: training loss 0.684 Epoch 90 iteration 0076/0187: training loss 0.684 Epoch 90 iteration 0077/0187: training loss 0.682 Epoch 90 iteration 0078/0187: training loss 0.682 Epoch 90 iteration 0079/0187: training loss 0.681 Epoch 90 iteration 0080/0187: training loss 0.681 Epoch 90 iteration 0081/0187: training loss 0.681 Epoch 90 iteration 0082/0187: training loss 0.680 Epoch 90 iteration 0083/0187: training loss 0.683 Epoch 90 iteration 0084/0187: training loss 0.684 Epoch 90 iteration 0085/0187: training loss 0.687 Epoch 90 iteration 0086/0187: training loss 0.686 Epoch 90 iteration 0087/0187: training loss 0.684 Epoch 90 iteration 0088/0187: training loss 0.686 Epoch 90 iteration 0089/0187: training loss 0.685 Epoch 90 iteration 0090/0187: training loss 0.686 Epoch 90 iteration 0091/0188: training loss 0.687 Epoch 90 iteration 0092/0188: training loss 0.687 Epoch 90 iteration 0093/0188: training loss 0.687 Epoch 90 iteration 0094/0188: training loss 0.686 Epoch 90 iteration 0095/0188: training loss 0.687 Epoch 90 iteration 0096/0188: training loss 0.686 Epoch 90 iteration 0097/0188: training loss 0.686 Epoch 90 iteration 0098/0188: training loss 0.688 Epoch 90 iteration 0099/0188: training loss 0.687 Epoch 90 iteration 0100/0188: training loss 0.686 Epoch 90 iteration 0101/0188: training loss 0.687 Epoch 90 iteration 0102/0188: training loss 0.689 Epoch 90 iteration 0103/0188: training loss 0.688 Epoch 90 iteration 0104/0188: training loss 0.687 Epoch 90 iteration 0105/0188: training loss 0.686 Epoch 90 iteration 0106/0188: training loss 0.686 Epoch 90 iteration 0107/0188: training loss 0.686 Epoch 90 iteration 0108/0188: training loss 0.687 Epoch 90 iteration 0109/0188: training loss 0.686 Epoch 90 iteration 0110/0188: training loss 0.686 Epoch 90 iteration 0111/0188: training loss 0.686 Epoch 90 iteration 0112/0188: training loss 0.686 Epoch 90 iteration 0113/0188: training loss 0.685 Epoch 90 iteration 0114/0188: training loss 0.686 Epoch 90 iteration 0115/0188: training loss 0.685 Epoch 90 iteration 0116/0188: training loss 0.684 Epoch 90 iteration 0117/0188: training loss 0.686 Epoch 90 iteration 0118/0188: training loss 0.687 Epoch 90 iteration 0119/0188: training loss 0.686 Epoch 90 iteration 0120/0188: training loss 0.686 Epoch 90 iteration 0121/0188: training loss 0.687 Epoch 90 iteration 0122/0188: training loss 0.686 Epoch 90 iteration 0123/0188: training loss 0.686 Epoch 90 iteration 0124/0188: training loss 0.687 Epoch 90 iteration 0125/0188: training loss 0.685 Epoch 90 iteration 0126/0188: training loss 0.684 Epoch 90 iteration 0127/0188: training loss 0.685 Epoch 90 iteration 0128/0188: training loss 0.685 Epoch 90 iteration 0129/0188: training loss 0.686 Epoch 90 iteration 0130/0188: training loss 0.686 Epoch 90 iteration 0131/0188: training loss 0.686 Epoch 90 iteration 0132/0188: training loss 0.686 Epoch 90 iteration 0133/0188: training loss 0.687 Epoch 90 iteration 0134/0188: training loss 0.686 Epoch 90 iteration 0135/0188: training loss 0.687 Epoch 90 iteration 0136/0188: training loss 0.686 Epoch 90 iteration 0137/0188: training loss 0.687 Epoch 90 iteration 0138/0188: training loss 0.686 Epoch 90 iteration 0139/0188: training loss 0.687 Epoch 90 iteration 0140/0188: training loss 0.686 Epoch 90 iteration 0141/0188: training loss 0.688 Epoch 90 iteration 0142/0188: training loss 0.688 Epoch 90 iteration 0143/0188: training loss 0.688 Epoch 90 iteration 0144/0188: training loss 0.688 Epoch 90 iteration 0145/0188: training loss 0.689 Epoch 90 iteration 0146/0188: training loss 0.689 Epoch 90 iteration 0147/0188: training loss 0.689 Epoch 90 iteration 0148/0188: training loss 0.688 Epoch 90 iteration 0149/0188: training loss 0.689 Epoch 90 iteration 0150/0188: training loss 0.689 Epoch 90 iteration 0151/0188: training loss 0.689 Epoch 90 iteration 0152/0188: training loss 0.689 Epoch 90 iteration 0153/0188: training loss 0.689 Epoch 90 iteration 0154/0188: training loss 0.689 Epoch 90 iteration 0155/0188: training loss 0.689 Epoch 90 iteration 0156/0188: training loss 0.688 Epoch 90 iteration 0157/0188: training loss 0.688 Epoch 90 iteration 0158/0188: training loss 0.688 Epoch 90 iteration 0159/0188: training loss 0.688 Epoch 90 iteration 0160/0188: training loss 0.689 Epoch 90 iteration 0161/0188: training loss 0.690 Epoch 90 iteration 0162/0188: training loss 0.690 Epoch 90 iteration 0163/0188: training loss 0.690 Epoch 90 iteration 0164/0188: training loss 0.690 Epoch 90 iteration 0165/0188: training loss 0.691 Epoch 90 iteration 0166/0188: training loss 0.691 Epoch 90 iteration 0167/0188: training loss 0.690 Epoch 90 iteration 0168/0188: training loss 0.690 Epoch 90 iteration 0169/0188: training loss 0.690 Epoch 90 iteration 0170/0188: training loss 0.689 Epoch 90 iteration 0171/0188: training loss 0.689 Epoch 90 iteration 0172/0188: training loss 0.689 Epoch 90 iteration 0173/0188: training loss 0.689 Epoch 90 iteration 0174/0188: training loss 0.689 Epoch 90 iteration 0175/0188: training loss 0.689 Epoch 90 iteration 0176/0188: training loss 0.689 Epoch 90 iteration 0177/0188: training loss 0.690 Epoch 90 iteration 0178/0188: training loss 0.690 Epoch 90 iteration 0179/0188: training loss 0.689 Epoch 90 iteration 0180/0188: training loss 0.690 Epoch 90 iteration 0181/0188: training loss 0.691 Epoch 90 iteration 0182/0188: training loss 0.691 Epoch 90 iteration 0183/0188: training loss 0.690 Epoch 90 iteration 0184/0188: training loss 0.689 Epoch 90 iteration 0185/0188: training loss 0.689 Epoch 90 iteration 0186/0188: training loss 0.689 Epoch 90 validation pixAcc: 0.875, mIoU: 0.391 Epoch 91 iteration 0001/0187: training loss 0.774 Epoch 91 iteration 0002/0187: training loss 0.725 Epoch 91 iteration 0003/0187: training loss 0.755 Epoch 91 iteration 0004/0187: training loss 0.741 Epoch 91 iteration 0005/0187: training loss 0.720 Epoch 91 iteration 0006/0187: training loss 0.709 Epoch 91 iteration 0007/0187: training loss 0.690 Epoch 91 iteration 0008/0187: training loss 0.669 Epoch 91 iteration 0009/0187: training loss 0.666 Epoch 91 iteration 0010/0187: training loss 0.665 Epoch 91 iteration 0011/0187: training loss 0.660 Epoch 91 iteration 0012/0187: training loss 0.666 Epoch 91 iteration 0013/0187: training loss 0.656 Epoch 91 iteration 0014/0187: training loss 0.660 Epoch 91 iteration 0015/0187: training loss 0.658 Epoch 91 iteration 0016/0187: training loss 0.664 Epoch 91 iteration 0017/0187: training loss 0.664 Epoch 91 iteration 0018/0187: training loss 0.676 Epoch 91 iteration 0019/0187: training loss 0.673 Epoch 91 iteration 0020/0187: training loss 0.672 Epoch 91 iteration 0021/0187: training loss 0.670 Epoch 91 iteration 0022/0187: training loss 0.659 Epoch 91 iteration 0023/0187: training loss 0.656 Epoch 91 iteration 0024/0187: training loss 0.657 Epoch 91 iteration 0025/0187: training loss 0.656 Epoch 91 iteration 0026/0187: training loss 0.660 Epoch 91 iteration 0027/0187: training loss 0.662 Epoch 91 iteration 0028/0187: training loss 0.658 Epoch 91 iteration 0029/0187: training loss 0.655 Epoch 91 iteration 0030/0187: training loss 0.657 Epoch 91 iteration 0031/0187: training loss 0.657 Epoch 91 iteration 0032/0187: training loss 0.658 Epoch 91 iteration 0033/0187: training loss 0.658 Epoch 91 iteration 0034/0187: training loss 0.659 Epoch 91 iteration 0035/0187: training loss 0.655 Epoch 91 iteration 0036/0187: training loss 0.660 Epoch 91 iteration 0037/0187: training loss 0.657 Epoch 91 iteration 0038/0187: training loss 0.658 Epoch 91 iteration 0039/0187: training loss 0.660 Epoch 91 iteration 0040/0187: training loss 0.664 Epoch 91 iteration 0041/0187: training loss 0.662 Epoch 91 iteration 0042/0187: training loss 0.660 Epoch 91 iteration 0043/0187: training loss 0.660 Epoch 91 iteration 0044/0187: training loss 0.657 Epoch 91 iteration 0045/0187: training loss 0.657 Epoch 91 iteration 0046/0187: training loss 0.658 Epoch 91 iteration 0047/0187: training loss 0.660 Epoch 91 iteration 0048/0187: training loss 0.661 Epoch 91 iteration 0049/0187: training loss 0.660 Epoch 91 iteration 0050/0187: training loss 0.664 Epoch 91 iteration 0051/0187: training loss 0.663 Epoch 91 iteration 0052/0187: training loss 0.663 Epoch 91 iteration 0053/0187: training loss 0.661 Epoch 91 iteration 0054/0187: training loss 0.662 Epoch 91 iteration 0055/0187: training loss 0.661 Epoch 91 iteration 0056/0187: training loss 0.660 Epoch 91 iteration 0057/0187: training loss 0.664 Epoch 91 iteration 0058/0187: training loss 0.664 Epoch 91 iteration 0059/0187: training loss 0.665 Epoch 91 iteration 0060/0187: training loss 0.665 Epoch 91 iteration 0061/0187: training loss 0.667 Epoch 91 iteration 0062/0187: training loss 0.668 Epoch 91 iteration 0063/0187: training loss 0.668 Epoch 91 iteration 0064/0187: training loss 0.668 Epoch 91 iteration 0065/0187: training loss 0.669 Epoch 91 iteration 0066/0187: training loss 0.667 Epoch 91 iteration 0067/0187: training loss 0.666 Epoch 91 iteration 0068/0187: training loss 0.666 Epoch 91 iteration 0069/0187: training loss 0.665 Epoch 91 iteration 0070/0187: training loss 0.664 Epoch 91 iteration 0071/0187: training loss 0.662 Epoch 91 iteration 0072/0187: training loss 0.663 Epoch 91 iteration 0073/0187: training loss 0.664 Epoch 91 iteration 0074/0187: training loss 0.663 Epoch 91 iteration 0075/0187: training loss 0.662 Epoch 91 iteration 0076/0187: training loss 0.662 Epoch 91 iteration 0077/0187: training loss 0.662 Epoch 91 iteration 0078/0187: training loss 0.661 Epoch 91 iteration 0079/0187: training loss 0.663 Epoch 91 iteration 0080/0187: training loss 0.663 Epoch 91 iteration 0081/0187: training loss 0.663 Epoch 91 iteration 0082/0187: training loss 0.664 Epoch 91 iteration 0083/0187: training loss 0.665 Epoch 91 iteration 0084/0187: training loss 0.664 Epoch 91 iteration 0085/0187: training loss 0.665 Epoch 91 iteration 0086/0187: training loss 0.664 Epoch 91 iteration 0087/0187: training loss 0.665 Epoch 91 iteration 0088/0187: training loss 0.667 Epoch 91 iteration 0089/0187: training loss 0.667 Epoch 91 iteration 0090/0187: training loss 0.668 Epoch 91 iteration 0091/0187: training loss 0.667 Epoch 91 iteration 0092/0187: training loss 0.667 Epoch 91 iteration 0093/0187: training loss 0.668 Epoch 91 iteration 0094/0187: training loss 0.668 Epoch 91 iteration 0095/0187: training loss 0.667 Epoch 91 iteration 0096/0187: training loss 0.666 Epoch 91 iteration 0097/0187: training loss 0.666 Epoch 91 iteration 0098/0187: training loss 0.667 Epoch 91 iteration 0099/0187: training loss 0.666 Epoch 91 iteration 0100/0187: training loss 0.667 Epoch 91 iteration 0101/0187: training loss 0.668 Epoch 91 iteration 0102/0187: training loss 0.668 Epoch 91 iteration 0103/0187: training loss 0.668 Epoch 91 iteration 0104/0187: training loss 0.667 Epoch 91 iteration 0105/0187: training loss 0.668 Epoch 91 iteration 0106/0187: training loss 0.670 Epoch 91 iteration 0107/0187: training loss 0.670 Epoch 91 iteration 0108/0187: training loss 0.671 Epoch 91 iteration 0109/0187: training loss 0.672 Epoch 91 iteration 0110/0187: training loss 0.673 Epoch 91 iteration 0111/0187: training loss 0.675 Epoch 91 iteration 0112/0187: training loss 0.675 Epoch 91 iteration 0113/0187: training loss 0.675 Epoch 91 iteration 0114/0187: training loss 0.677 Epoch 91 iteration 0115/0187: training loss 0.677 Epoch 91 iteration 0116/0187: training loss 0.678 Epoch 91 iteration 0117/0187: training loss 0.680 Epoch 91 iteration 0118/0187: training loss 0.680 Epoch 91 iteration 0119/0187: training loss 0.680 Epoch 91 iteration 0120/0187: training loss 0.680 Epoch 91 iteration 0121/0187: training loss 0.680 Epoch 91 iteration 0122/0187: training loss 0.680 Epoch 91 iteration 0123/0187: training loss 0.681 Epoch 91 iteration 0124/0187: training loss 0.681 Epoch 91 iteration 0125/0187: training loss 0.680 Epoch 91 iteration 0126/0187: training loss 0.680 Epoch 91 iteration 0127/0187: training loss 0.680 Epoch 91 iteration 0128/0187: training loss 0.680 Epoch 91 iteration 0129/0187: training loss 0.681 Epoch 91 iteration 0130/0187: training loss 0.682 Epoch 91 iteration 0131/0187: training loss 0.682 Epoch 91 iteration 0132/0187: training loss 0.682 Epoch 91 iteration 0133/0187: training loss 0.683 Epoch 91 iteration 0134/0187: training loss 0.684 Epoch 91 iteration 0135/0187: training loss 0.685 Epoch 91 iteration 0136/0187: training loss 0.686 Epoch 91 iteration 0137/0187: training loss 0.686 Epoch 91 iteration 0138/0187: training loss 0.685 Epoch 91 iteration 0139/0187: training loss 0.686 Epoch 91 iteration 0140/0187: training loss 0.685 Epoch 91 iteration 0141/0187: training loss 0.684 Epoch 91 iteration 0142/0187: training loss 0.684 Epoch 91 iteration 0143/0187: training loss 0.683 Epoch 91 iteration 0144/0187: training loss 0.682 Epoch 91 iteration 0145/0187: training loss 0.683 Epoch 91 iteration 0146/0187: training loss 0.683 Epoch 91 iteration 0147/0187: training loss 0.682 Epoch 91 iteration 0148/0187: training loss 0.681 Epoch 91 iteration 0149/0187: training loss 0.681 Epoch 91 iteration 0150/0187: training loss 0.682 Epoch 91 iteration 0151/0187: training loss 0.683 Epoch 91 iteration 0152/0187: training loss 0.681 Epoch 91 iteration 0153/0187: training loss 0.683 Epoch 91 iteration 0154/0187: training loss 0.682 Epoch 91 iteration 0155/0187: training loss 0.682 Epoch 91 iteration 0156/0187: training loss 0.682 Epoch 91 iteration 0157/0187: training loss 0.682 Epoch 91 iteration 0158/0187: training loss 0.682 Epoch 91 iteration 0159/0187: training loss 0.682 Epoch 91 iteration 0160/0187: training loss 0.682 Epoch 91 iteration 0161/0187: training loss 0.682 Epoch 91 iteration 0162/0187: training loss 0.682 Epoch 91 iteration 0163/0187: training loss 0.681 Epoch 91 iteration 0164/0187: training loss 0.681 Epoch 91 iteration 0165/0187: training loss 0.682 Epoch 91 iteration 0166/0187: training loss 0.682 Epoch 91 iteration 0167/0187: training loss 0.682 Epoch 91 iteration 0168/0187: training loss 0.682 Epoch 91 iteration 0169/0187: training loss 0.682 Epoch 91 iteration 0170/0187: training loss 0.681 Epoch 91 iteration 0171/0187: training loss 0.680 Epoch 91 iteration 0172/0187: training loss 0.680 Epoch 91 iteration 0173/0187: training loss 0.680 Epoch 91 iteration 0174/0187: training loss 0.681 Epoch 91 iteration 0175/0187: training loss 0.681 Epoch 91 iteration 0176/0187: training loss 0.681 Epoch 91 iteration 0177/0187: training loss 0.681 Epoch 91 iteration 0178/0187: training loss 0.681 Epoch 91 iteration 0179/0187: training loss 0.680 Epoch 91 iteration 0180/0187: training loss 0.681 Epoch 91 iteration 0181/0187: training loss 0.682 Epoch 91 iteration 0182/0187: training loss 0.682 Epoch 91 iteration 0183/0187: training loss 0.682 Epoch 91 iteration 0184/0187: training loss 0.682 Epoch 91 iteration 0185/0187: training loss 0.682 Epoch 91 iteration 0186/0187: training loss 0.682 Epoch 91 iteration 0187/0187: training loss 0.682 Epoch 91 validation pixAcc: 0.874, mIoU: 0.391 Epoch 92 iteration 0001/0187: training loss 0.722 Epoch 92 iteration 0002/0187: training loss 0.695 Epoch 92 iteration 0003/0187: training loss 0.705 Epoch 92 iteration 0004/0187: training loss 0.729 Epoch 92 iteration 0005/0187: training loss 0.732 Epoch 92 iteration 0006/0187: training loss 0.724 Epoch 92 iteration 0007/0187: training loss 0.724 Epoch 92 iteration 0008/0187: training loss 0.707 Epoch 92 iteration 0009/0187: training loss 0.712 Epoch 92 iteration 0010/0187: training loss 0.696 Epoch 92 iteration 0011/0187: training loss 0.696 Epoch 92 iteration 0012/0187: training loss 0.692 Epoch 92 iteration 0013/0187: training loss 0.696 Epoch 92 iteration 0014/0187: training loss 0.685 Epoch 92 iteration 0015/0187: training loss 0.683 Epoch 92 iteration 0016/0187: training loss 0.684 Epoch 92 iteration 0017/0187: training loss 0.679 Epoch 92 iteration 0018/0187: training loss 0.675 Epoch 92 iteration 0019/0187: training loss 0.679 Epoch 92 iteration 0020/0187: training loss 0.679 Epoch 92 iteration 0021/0187: training loss 0.676 Epoch 92 iteration 0022/0187: training loss 0.677 Epoch 92 iteration 0023/0187: training loss 0.679 Epoch 92 iteration 0024/0187: training loss 0.684 Epoch 92 iteration 0025/0187: training loss 0.694 Epoch 92 iteration 0026/0187: training loss 0.693 Epoch 92 iteration 0027/0187: training loss 0.691 Epoch 92 iteration 0028/0187: training loss 0.693 Epoch 92 iteration 0029/0187: training loss 0.690 Epoch 92 iteration 0030/0187: training loss 0.684 Epoch 92 iteration 0031/0187: training loss 0.685 Epoch 92 iteration 0032/0187: training loss 0.684 Epoch 92 iteration 0033/0187: training loss 0.680 Epoch 92 iteration 0034/0187: training loss 0.676 Epoch 92 iteration 0035/0187: training loss 0.675 Epoch 92 iteration 0036/0187: training loss 0.679 Epoch 92 iteration 0037/0187: training loss 0.690 Epoch 92 iteration 0038/0187: training loss 0.690 Epoch 92 iteration 0039/0187: training loss 0.694 Epoch 92 iteration 0040/0187: training loss 0.695 Epoch 92 iteration 0041/0187: training loss 0.696 Epoch 92 iteration 0042/0187: training loss 0.698 Epoch 92 iteration 0043/0187: training loss 0.700 Epoch 92 iteration 0044/0187: training loss 0.695 Epoch 92 iteration 0045/0187: training loss 0.698 Epoch 92 iteration 0046/0187: training loss 0.700 Epoch 92 iteration 0047/0187: training loss 0.701 Epoch 92 iteration 0048/0187: training loss 0.697 Epoch 92 iteration 0049/0187: training loss 0.699 Epoch 92 iteration 0050/0187: training loss 0.702 Epoch 92 iteration 0051/0187: training loss 0.702 Epoch 92 iteration 0052/0187: training loss 0.701 Epoch 92 iteration 0053/0187: training loss 0.697 Epoch 92 iteration 0054/0187: training loss 0.698 Epoch 92 iteration 0055/0187: training loss 0.696 Epoch 92 iteration 0056/0187: training loss 0.695 Epoch 92 iteration 0057/0187: training loss 0.693 Epoch 92 iteration 0058/0187: training loss 0.690 Epoch 92 iteration 0059/0187: training loss 0.688 Epoch 92 iteration 0060/0187: training loss 0.692 Epoch 92 iteration 0061/0187: training loss 0.697 Epoch 92 iteration 0062/0187: training loss 0.695 Epoch 92 iteration 0063/0187: training loss 0.694 Epoch 92 iteration 0064/0187: training loss 0.692 Epoch 92 iteration 0065/0187: training loss 0.697 Epoch 92 iteration 0066/0187: training loss 0.697 Epoch 92 iteration 0067/0187: training loss 0.697 Epoch 92 iteration 0068/0187: training loss 0.695 Epoch 92 iteration 0069/0187: training loss 0.697 Epoch 92 iteration 0070/0187: training loss 0.698 Epoch 92 iteration 0071/0187: training loss 0.699 Epoch 92 iteration 0072/0187: training loss 0.698 Epoch 92 iteration 0073/0187: training loss 0.698 Epoch 92 iteration 0074/0187: training loss 0.698 Epoch 92 iteration 0075/0187: training loss 0.699 Epoch 92 iteration 0076/0187: training loss 0.699 Epoch 92 iteration 0077/0187: training loss 0.698 Epoch 92 iteration 0078/0187: training loss 0.698 Epoch 92 iteration 0079/0187: training loss 0.700 Epoch 92 iteration 0080/0187: training loss 0.699 Epoch 92 iteration 0081/0187: training loss 0.699 Epoch 92 iteration 0082/0187: training loss 0.702 Epoch 92 iteration 0083/0187: training loss 0.705 Epoch 92 iteration 0084/0187: training loss 0.705 Epoch 92 iteration 0085/0187: training loss 0.703 Epoch 92 iteration 0086/0187: training loss 0.705 Epoch 92 iteration 0087/0187: training loss 0.707 Epoch 92 iteration 0088/0187: training loss 0.706 Epoch 92 iteration 0089/0187: training loss 0.704 Epoch 92 iteration 0090/0187: training loss 0.704 Epoch 92 iteration 0091/0188: training loss 0.704 Epoch 92 iteration 0092/0188: training loss 0.705 Epoch 92 iteration 0093/0188: training loss 0.705 Epoch 92 iteration 0094/0188: training loss 0.704 Epoch 92 iteration 0095/0188: training loss 0.703 Epoch 92 iteration 0096/0188: training loss 0.702 Epoch 92 iteration 0097/0188: training loss 0.703 Epoch 92 iteration 0098/0188: training loss 0.703 Epoch 92 iteration 0099/0188: training loss 0.703 Epoch 92 iteration 0100/0188: training loss 0.701 Epoch 92 iteration 0101/0188: training loss 0.701 Epoch 92 iteration 0102/0188: training loss 0.700 Epoch 92 iteration 0103/0188: training loss 0.702 Epoch 92 iteration 0104/0188: training loss 0.701 Epoch 92 iteration 0105/0188: training loss 0.700 Epoch 92 iteration 0106/0188: training loss 0.700 Epoch 92 iteration 0107/0188: training loss 0.701 Epoch 92 iteration 0108/0188: training loss 0.700 Epoch 92 iteration 0109/0188: training loss 0.699 Epoch 92 iteration 0110/0188: training loss 0.700 Epoch 92 iteration 0111/0188: training loss 0.700 Epoch 92 iteration 0112/0188: training loss 0.699 Epoch 92 iteration 0113/0188: training loss 0.700 Epoch 92 iteration 0114/0188: training loss 0.700 Epoch 92 iteration 0115/0188: training loss 0.700 Epoch 92 iteration 0116/0188: training loss 0.700 Epoch 92 iteration 0117/0188: training loss 0.700 Epoch 92 iteration 0118/0188: training loss 0.700 Epoch 92 iteration 0119/0188: training loss 0.700 Epoch 92 iteration 0120/0188: training loss 0.699 Epoch 92 iteration 0121/0188: training loss 0.698 Epoch 92 iteration 0122/0188: training loss 0.699 Epoch 92 iteration 0123/0188: training loss 0.699 Epoch 92 iteration 0124/0188: training loss 0.698 Epoch 92 iteration 0125/0188: training loss 0.697 Epoch 92 iteration 0126/0188: training loss 0.697 Epoch 92 iteration 0127/0188: training loss 0.697 Epoch 92 iteration 0128/0188: training loss 0.698 Epoch 92 iteration 0129/0188: training loss 0.697 Epoch 92 iteration 0130/0188: training loss 0.697 Epoch 92 iteration 0131/0188: training loss 0.696 Epoch 92 iteration 0132/0188: training loss 0.696 Epoch 92 iteration 0133/0188: training loss 0.696 Epoch 92 iteration 0134/0188: training loss 0.695 Epoch 92 iteration 0135/0188: training loss 0.694 Epoch 92 iteration 0136/0188: training loss 0.694 Epoch 92 iteration 0137/0188: training loss 0.694 Epoch 92 iteration 0138/0188: training loss 0.694 Epoch 92 iteration 0139/0188: training loss 0.694 Epoch 92 iteration 0140/0188: training loss 0.695 Epoch 92 iteration 0141/0188: training loss 0.694 Epoch 92 iteration 0142/0188: training loss 0.693 Epoch 92 iteration 0143/0188: training loss 0.693 Epoch 92 iteration 0144/0188: training loss 0.692 Epoch 92 iteration 0145/0188: training loss 0.693 Epoch 92 iteration 0146/0188: training loss 0.693 Epoch 92 iteration 0147/0188: training loss 0.692 Epoch 92 iteration 0148/0188: training loss 0.692 Epoch 92 iteration 0149/0188: training loss 0.692 Epoch 92 iteration 0150/0188: training loss 0.692 Epoch 92 iteration 0151/0188: training loss 0.691 Epoch 92 iteration 0152/0188: training loss 0.691 Epoch 92 iteration 0153/0188: training loss 0.692 Epoch 92 iteration 0154/0188: training loss 0.692 Epoch 92 iteration 0155/0188: training loss 0.691 Epoch 92 iteration 0156/0188: training loss 0.692 Epoch 92 iteration 0157/0188: training loss 0.691 Epoch 92 iteration 0158/0188: training loss 0.691 Epoch 92 iteration 0159/0188: training loss 0.691 Epoch 92 iteration 0160/0188: training loss 0.691 Epoch 92 iteration 0161/0188: training loss 0.689 Epoch 92 iteration 0162/0188: training loss 0.689 Epoch 92 iteration 0163/0188: training loss 0.689 Epoch 92 iteration 0164/0188: training loss 0.689 Epoch 92 iteration 0165/0188: training loss 0.689 Epoch 92 iteration 0166/0188: training loss 0.688 Epoch 92 iteration 0167/0188: training loss 0.689 Epoch 92 iteration 0168/0188: training loss 0.688 Epoch 92 iteration 0169/0188: training loss 0.689 Epoch 92 iteration 0170/0188: training loss 0.688 Epoch 92 iteration 0171/0188: training loss 0.688 Epoch 92 iteration 0172/0188: training loss 0.688 Epoch 92 iteration 0173/0188: training loss 0.688 Epoch 92 iteration 0174/0188: training loss 0.688 Epoch 92 iteration 0175/0188: training loss 0.688 Epoch 92 iteration 0176/0188: training loss 0.688 Epoch 92 iteration 0177/0188: training loss 0.687 Epoch 92 iteration 0178/0188: training loss 0.686 Epoch 92 iteration 0179/0188: training loss 0.686 Epoch 92 iteration 0180/0188: training loss 0.687 Epoch 92 iteration 0181/0188: training loss 0.687 Epoch 92 iteration 0182/0188: training loss 0.687 Epoch 92 iteration 0183/0188: training loss 0.687 Epoch 92 iteration 0184/0188: training loss 0.687 Epoch 92 iteration 0185/0188: training loss 0.687 Epoch 92 iteration 0186/0188: training loss 0.686 Epoch 92 validation pixAcc: 0.875, mIoU: 0.393 Epoch 93 iteration 0001/0187: training loss 0.737 Epoch 93 iteration 0002/0187: training loss 0.667 Epoch 93 iteration 0003/0187: training loss 0.670 Epoch 93 iteration 0004/0187: training loss 0.671 Epoch 93 iteration 0005/0187: training loss 0.639 Epoch 93 iteration 0006/0187: training loss 0.625 Epoch 93 iteration 0007/0187: training loss 0.622 Epoch 93 iteration 0008/0187: training loss 0.619 Epoch 93 iteration 0009/0187: training loss 0.621 Epoch 93 iteration 0010/0187: training loss 0.620 Epoch 93 iteration 0011/0187: training loss 0.617 Epoch 93 iteration 0012/0187: training loss 0.613 Epoch 93 iteration 0013/0187: training loss 0.619 Epoch 93 iteration 0014/0187: training loss 0.616 Epoch 93 iteration 0015/0187: training loss 0.631 Epoch 93 iteration 0016/0187: training loss 0.634 Epoch 93 iteration 0017/0187: training loss 0.643 Epoch 93 iteration 0018/0187: training loss 0.641 Epoch 93 iteration 0019/0187: training loss 0.656 Epoch 93 iteration 0020/0187: training loss 0.657 Epoch 93 iteration 0021/0187: training loss 0.654 Epoch 93 iteration 0022/0187: training loss 0.650 Epoch 93 iteration 0023/0187: training loss 0.652 Epoch 93 iteration 0024/0187: training loss 0.658 Epoch 93 iteration 0025/0187: training loss 0.663 Epoch 93 iteration 0026/0187: training loss 0.662 Epoch 93 iteration 0027/0187: training loss 0.657 Epoch 93 iteration 0028/0187: training loss 0.652 Epoch 93 iteration 0029/0187: training loss 0.651 Epoch 93 iteration 0030/0187: training loss 0.652 Epoch 93 iteration 0031/0187: training loss 0.654 Epoch 93 iteration 0032/0187: training loss 0.660 Epoch 93 iteration 0033/0187: training loss 0.661 Epoch 93 iteration 0034/0187: training loss 0.659 Epoch 93 iteration 0035/0187: training loss 0.657 Epoch 93 iteration 0036/0187: training loss 0.652 Epoch 93 iteration 0037/0187: training loss 0.657 Epoch 93 iteration 0038/0187: training loss 0.660 Epoch 93 iteration 0039/0187: training loss 0.660 Epoch 93 iteration 0040/0187: training loss 0.662 Epoch 93 iteration 0041/0187: training loss 0.661 Epoch 93 iteration 0042/0187: training loss 0.669 Epoch 93 iteration 0043/0187: training loss 0.675 Epoch 93 iteration 0044/0187: training loss 0.673 Epoch 93 iteration 0045/0187: training loss 0.673 Epoch 93 iteration 0046/0187: training loss 0.673 Epoch 93 iteration 0047/0187: training loss 0.675 Epoch 93 iteration 0048/0187: training loss 0.673 Epoch 93 iteration 0049/0187: training loss 0.674 Epoch 93 iteration 0050/0187: training loss 0.677 Epoch 93 iteration 0051/0187: training loss 0.676 Epoch 93 iteration 0052/0187: training loss 0.676 Epoch 93 iteration 0053/0187: training loss 0.675 Epoch 93 iteration 0054/0187: training loss 0.676 Epoch 93 iteration 0055/0187: training loss 0.680 Epoch 93 iteration 0056/0187: training loss 0.681 Epoch 93 iteration 0057/0187: training loss 0.680 Epoch 93 iteration 0058/0187: training loss 0.680 Epoch 93 iteration 0059/0187: training loss 0.680 Epoch 93 iteration 0060/0187: training loss 0.681 Epoch 93 iteration 0061/0187: training loss 0.681 Epoch 93 iteration 0062/0187: training loss 0.679 Epoch 93 iteration 0063/0187: training loss 0.681 Epoch 93 iteration 0064/0187: training loss 0.678 Epoch 93 iteration 0065/0187: training loss 0.677 Epoch 93 iteration 0066/0187: training loss 0.676 Epoch 93 iteration 0067/0187: training loss 0.676 Epoch 93 iteration 0068/0187: training loss 0.675 Epoch 93 iteration 0069/0187: training loss 0.676 Epoch 93 iteration 0070/0187: training loss 0.673 Epoch 93 iteration 0071/0187: training loss 0.675 Epoch 93 iteration 0072/0187: training loss 0.673 Epoch 93 iteration 0073/0187: training loss 0.674 Epoch 93 iteration 0074/0187: training loss 0.675 Epoch 93 iteration 0075/0187: training loss 0.677 Epoch 93 iteration 0076/0187: training loss 0.680 Epoch 93 iteration 0077/0187: training loss 0.679 Epoch 93 iteration 0078/0187: training loss 0.679 Epoch 93 iteration 0079/0187: training loss 0.678 Epoch 93 iteration 0080/0187: training loss 0.680 Epoch 93 iteration 0081/0187: training loss 0.678 Epoch 93 iteration 0082/0187: training loss 0.678 Epoch 93 iteration 0083/0187: training loss 0.679 Epoch 93 iteration 0084/0187: training loss 0.680 Epoch 93 iteration 0085/0187: training loss 0.680 Epoch 93 iteration 0086/0187: training loss 0.680 Epoch 93 iteration 0087/0187: training loss 0.683 Epoch 93 iteration 0088/0187: training loss 0.683 Epoch 93 iteration 0089/0187: training loss 0.681 Epoch 93 iteration 0090/0187: training loss 0.680 Epoch 93 iteration 0091/0187: training loss 0.680 Epoch 93 iteration 0092/0187: training loss 0.682 Epoch 93 iteration 0093/0187: training loss 0.681 Epoch 93 iteration 0094/0187: training loss 0.682 Epoch 93 iteration 0095/0187: training loss 0.682 Epoch 93 iteration 0096/0187: training loss 0.683 Epoch 93 iteration 0097/0187: training loss 0.684 Epoch 93 iteration 0098/0187: training loss 0.686 Epoch 93 iteration 0099/0187: training loss 0.686 Epoch 93 iteration 0100/0187: training loss 0.686 Epoch 93 iteration 0101/0187: training loss 0.687 Epoch 93 iteration 0102/0187: training loss 0.685 Epoch 93 iteration 0103/0187: training loss 0.686 Epoch 93 iteration 0104/0187: training loss 0.685 Epoch 93 iteration 0105/0187: training loss 0.685 Epoch 93 iteration 0106/0187: training loss 0.684 Epoch 93 iteration 0107/0187: training loss 0.684 Epoch 93 iteration 0108/0187: training loss 0.683 Epoch 93 iteration 0109/0187: training loss 0.682 Epoch 93 iteration 0110/0187: training loss 0.683 Epoch 93 iteration 0111/0187: training loss 0.685 Epoch 93 iteration 0112/0187: training loss 0.685 Epoch 93 iteration 0113/0187: training loss 0.686 Epoch 93 iteration 0114/0187: training loss 0.687 Epoch 93 iteration 0115/0187: training loss 0.687 Epoch 93 iteration 0116/0187: training loss 0.686 Epoch 93 iteration 0117/0187: training loss 0.686 Epoch 93 iteration 0118/0187: training loss 0.687 Epoch 93 iteration 0119/0187: training loss 0.690 Epoch 93 iteration 0120/0187: training loss 0.691 Epoch 93 iteration 0121/0187: training loss 0.690 Epoch 93 iteration 0122/0187: training loss 0.688 Epoch 93 iteration 0123/0187: training loss 0.689 Epoch 93 iteration 0124/0187: training loss 0.688 Epoch 93 iteration 0125/0187: training loss 0.688 Epoch 93 iteration 0126/0187: training loss 0.689 Epoch 93 iteration 0127/0187: training loss 0.688 Epoch 93 iteration 0128/0187: training loss 0.687 Epoch 93 iteration 0129/0187: training loss 0.687 Epoch 93 iteration 0130/0187: training loss 0.687 Epoch 93 iteration 0131/0187: training loss 0.686 Epoch 93 iteration 0132/0187: training loss 0.686 Epoch 93 iteration 0133/0187: training loss 0.685 Epoch 93 iteration 0134/0187: training loss 0.685 Epoch 93 iteration 0135/0187: training loss 0.684 Epoch 93 iteration 0136/0187: training loss 0.684 Epoch 93 iteration 0137/0187: training loss 0.685 Epoch 93 iteration 0138/0187: training loss 0.685 Epoch 93 iteration 0139/0187: training loss 0.685 Epoch 93 iteration 0140/0187: training loss 0.685 Epoch 93 iteration 0141/0187: training loss 0.686 Epoch 93 iteration 0142/0187: training loss 0.687 Epoch 93 iteration 0143/0187: training loss 0.686 Epoch 93 iteration 0144/0187: training loss 0.687 Epoch 93 iteration 0145/0187: training loss 0.688 Epoch 93 iteration 0146/0187: training loss 0.688 Epoch 93 iteration 0147/0187: training loss 0.689 Epoch 93 iteration 0148/0187: training loss 0.688 Epoch 93 iteration 0149/0187: training loss 0.690 Epoch 93 iteration 0150/0187: training loss 0.689 Epoch 93 iteration 0151/0187: training loss 0.691 Epoch 93 iteration 0152/0187: training loss 0.690 Epoch 93 iteration 0153/0187: training loss 0.689 Epoch 93 iteration 0154/0187: training loss 0.690 Epoch 93 iteration 0155/0187: training loss 0.689 Epoch 93 iteration 0156/0187: training loss 0.688 Epoch 93 iteration 0157/0187: training loss 0.688 Epoch 93 iteration 0158/0187: training loss 0.688 Epoch 93 iteration 0159/0187: training loss 0.689 Epoch 93 iteration 0160/0187: training loss 0.689 Epoch 93 iteration 0161/0187: training loss 0.690 Epoch 93 iteration 0162/0187: training loss 0.690 Epoch 93 iteration 0163/0187: training loss 0.690 Epoch 93 iteration 0164/0187: training loss 0.689 Epoch 93 iteration 0165/0187: training loss 0.689 Epoch 93 iteration 0166/0187: training loss 0.688 Epoch 93 iteration 0167/0187: training loss 0.689 Epoch 93 iteration 0168/0187: training loss 0.689 Epoch 93 iteration 0169/0187: training loss 0.688 Epoch 93 iteration 0170/0187: training loss 0.687 Epoch 93 iteration 0171/0187: training loss 0.687 Epoch 93 iteration 0172/0187: training loss 0.687 Epoch 93 iteration 0173/0187: training loss 0.687 Epoch 93 iteration 0174/0187: training loss 0.687 Epoch 93 iteration 0175/0187: training loss 0.688 Epoch 93 iteration 0176/0187: training loss 0.689 Epoch 93 iteration 0177/0187: training loss 0.688 Epoch 93 iteration 0178/0187: training loss 0.688 Epoch 93 iteration 0179/0187: training loss 0.688 Epoch 93 iteration 0180/0187: training loss 0.689 Epoch 93 iteration 0181/0187: training loss 0.689 Epoch 93 iteration 0182/0187: training loss 0.688 Epoch 93 iteration 0183/0187: training loss 0.688 Epoch 93 iteration 0184/0187: training loss 0.688 Epoch 93 iteration 0185/0187: training loss 0.688 Epoch 93 iteration 0186/0187: training loss 0.687 Epoch 93 iteration 0187/0187: training loss 0.687 Epoch 93 validation pixAcc: 0.874, mIoU: 0.391 Epoch 94 iteration 0001/0187: training loss 0.687 Epoch 94 iteration 0002/0187: training loss 0.762 Epoch 94 iteration 0003/0187: training loss 0.758 Epoch 94 iteration 0004/0187: training loss 0.755 Epoch 94 iteration 0005/0187: training loss 0.761 Epoch 94 iteration 0006/0187: training loss 0.732 Epoch 94 iteration 0007/0187: training loss 0.738 Epoch 94 iteration 0008/0187: training loss 0.758 Epoch 94 iteration 0009/0187: training loss 0.756 Epoch 94 iteration 0010/0187: training loss 0.774 Epoch 94 iteration 0011/0187: training loss 0.771 Epoch 94 iteration 0012/0187: training loss 0.757 Epoch 94 iteration 0013/0187: training loss 0.743 Epoch 94 iteration 0014/0187: training loss 0.750 Epoch 94 iteration 0015/0187: training loss 0.742 Epoch 94 iteration 0016/0187: training loss 0.738 Epoch 94 iteration 0017/0187: training loss 0.730 Epoch 94 iteration 0018/0187: training loss 0.731 Epoch 94 iteration 0019/0187: training loss 0.731 Epoch 94 iteration 0020/0187: training loss 0.741 Epoch 94 iteration 0021/0187: training loss 0.735 Epoch 94 iteration 0022/0187: training loss 0.731 Epoch 94 iteration 0023/0187: training loss 0.729 Epoch 94 iteration 0024/0187: training loss 0.725 Epoch 94 iteration 0025/0187: training loss 0.722 Epoch 94 iteration 0026/0187: training loss 0.717 Epoch 94 iteration 0027/0187: training loss 0.722 Epoch 94 iteration 0028/0187: training loss 0.727 Epoch 94 iteration 0029/0187: training loss 0.731 Epoch 94 iteration 0030/0187: training loss 0.728 Epoch 94 iteration 0031/0187: training loss 0.733 Epoch 94 iteration 0032/0187: training loss 0.733 Epoch 94 iteration 0033/0187: training loss 0.728 Epoch 94 iteration 0034/0187: training loss 0.725 Epoch 94 iteration 0035/0187: training loss 0.724 Epoch 94 iteration 0036/0187: training loss 0.724 Epoch 94 iteration 0037/0187: training loss 0.720 Epoch 94 iteration 0038/0187: training loss 0.720 Epoch 94 iteration 0039/0187: training loss 0.719 Epoch 94 iteration 0040/0187: training loss 0.716 Epoch 94 iteration 0041/0187: training loss 0.712 Epoch 94 iteration 0042/0187: training loss 0.711 Epoch 94 iteration 0043/0187: training loss 0.708 Epoch 94 iteration 0044/0187: training loss 0.707 Epoch 94 iteration 0045/0187: training loss 0.709 Epoch 94 iteration 0046/0187: training loss 0.706 Epoch 94 iteration 0047/0187: training loss 0.709 Epoch 94 iteration 0048/0187: training loss 0.705 Epoch 94 iteration 0049/0187: training loss 0.704 Epoch 94 iteration 0050/0187: training loss 0.702 Epoch 94 iteration 0051/0187: training loss 0.700 Epoch 94 iteration 0052/0187: training loss 0.699 Epoch 94 iteration 0053/0187: training loss 0.702 Epoch 94 iteration 0054/0187: training loss 0.703 Epoch 94 iteration 0055/0187: training loss 0.702 Epoch 94 iteration 0056/0187: training loss 0.700 Epoch 94 iteration 0057/0187: training loss 0.697 Epoch 94 iteration 0058/0187: training loss 0.698 Epoch 94 iteration 0059/0187: training loss 0.698 Epoch 94 iteration 0060/0187: training loss 0.697 Epoch 94 iteration 0061/0187: training loss 0.697 Epoch 94 iteration 0062/0187: training loss 0.698 Epoch 94 iteration 0063/0187: training loss 0.699 Epoch 94 iteration 0064/0187: training loss 0.697 Epoch 94 iteration 0065/0187: training loss 0.696 Epoch 94 iteration 0066/0187: training loss 0.696 Epoch 94 iteration 0067/0187: training loss 0.697 Epoch 94 iteration 0068/0187: training loss 0.695 Epoch 94 iteration 0069/0187: training loss 0.696 Epoch 94 iteration 0070/0187: training loss 0.694 Epoch 94 iteration 0071/0187: training loss 0.694 Epoch 94 iteration 0072/0187: training loss 0.697 Epoch 94 iteration 0073/0187: training loss 0.699 Epoch 94 iteration 0074/0187: training loss 0.700 Epoch 94 iteration 0075/0187: training loss 0.702 Epoch 94 iteration 0076/0187: training loss 0.702 Epoch 94 iteration 0077/0187: training loss 0.701 Epoch 94 iteration 0078/0187: training loss 0.699 Epoch 94 iteration 0079/0187: training loss 0.700 Epoch 94 iteration 0080/0187: training loss 0.699 Epoch 94 iteration 0081/0187: training loss 0.699 Epoch 94 iteration 0082/0187: training loss 0.697 Epoch 94 iteration 0083/0187: training loss 0.699 Epoch 94 iteration 0084/0187: training loss 0.700 Epoch 94 iteration 0085/0187: training loss 0.699 Epoch 94 iteration 0086/0187: training loss 0.701 Epoch 94 iteration 0087/0187: training loss 0.702 Epoch 94 iteration 0088/0187: training loss 0.700 Epoch 94 iteration 0089/0187: training loss 0.700 Epoch 94 iteration 0090/0187: training loss 0.699 Epoch 94 iteration 0091/0188: training loss 0.700 Epoch 94 iteration 0092/0188: training loss 0.700 Epoch 94 iteration 0093/0188: training loss 0.701 Epoch 94 iteration 0094/0188: training loss 0.700 Epoch 94 iteration 0095/0188: training loss 0.699 Epoch 94 iteration 0096/0188: training loss 0.699 Epoch 94 iteration 0097/0188: training loss 0.698 Epoch 94 iteration 0098/0188: training loss 0.699 Epoch 94 iteration 0099/0188: training loss 0.697 Epoch 94 iteration 0100/0188: training loss 0.697 Epoch 94 iteration 0101/0188: training loss 0.697 Epoch 94 iteration 0102/0188: training loss 0.698 Epoch 94 iteration 0103/0188: training loss 0.697 Epoch 94 iteration 0104/0188: training loss 0.698 Epoch 94 iteration 0105/0188: training loss 0.697 Epoch 94 iteration 0106/0188: training loss 0.697 Epoch 94 iteration 0107/0188: training loss 0.698 Epoch 94 iteration 0108/0188: training loss 0.697 Epoch 94 iteration 0109/0188: training loss 0.695 Epoch 94 iteration 0110/0188: training loss 0.695 Epoch 94 iteration 0111/0188: training loss 0.694 Epoch 94 iteration 0112/0188: training loss 0.693 Epoch 94 iteration 0113/0188: training loss 0.694 Epoch 94 iteration 0114/0188: training loss 0.693 Epoch 94 iteration 0115/0188: training loss 0.693 Epoch 94 iteration 0116/0188: training loss 0.693 Epoch 94 iteration 0117/0188: training loss 0.694 Epoch 94 iteration 0118/0188: training loss 0.692 Epoch 94 iteration 0119/0188: training loss 0.691 Epoch 94 iteration 0120/0188: training loss 0.691 Epoch 94 iteration 0121/0188: training loss 0.691 Epoch 94 iteration 0122/0188: training loss 0.691 Epoch 94 iteration 0123/0188: training loss 0.691 Epoch 94 iteration 0124/0188: training loss 0.691 Epoch 94 iteration 0125/0188: training loss 0.690 Epoch 94 iteration 0126/0188: training loss 0.690 Epoch 94 iteration 0127/0188: training loss 0.690 Epoch 94 iteration 0128/0188: training loss 0.691 Epoch 94 iteration 0129/0188: training loss 0.690 Epoch 94 iteration 0130/0188: training loss 0.692 Epoch 94 iteration 0131/0188: training loss 0.691 Epoch 94 iteration 0132/0188: training loss 0.692 Epoch 94 iteration 0133/0188: training loss 0.691 Epoch 94 iteration 0134/0188: training loss 0.691 Epoch 94 iteration 0135/0188: training loss 0.690 Epoch 94 iteration 0136/0188: training loss 0.689 Epoch 94 iteration 0137/0188: training loss 0.690 Epoch 94 iteration 0138/0188: training loss 0.688 Epoch 94 iteration 0139/0188: training loss 0.688 Epoch 94 iteration 0140/0188: training loss 0.688 Epoch 94 iteration 0141/0188: training loss 0.688 Epoch 94 iteration 0142/0188: training loss 0.687 Epoch 94 iteration 0143/0188: training loss 0.688 Epoch 94 iteration 0144/0188: training loss 0.688 Epoch 94 iteration 0145/0188: training loss 0.688 Epoch 94 iteration 0146/0188: training loss 0.688 Epoch 94 iteration 0147/0188: training loss 0.689 Epoch 94 iteration 0148/0188: training loss 0.689 Epoch 94 iteration 0149/0188: training loss 0.688 Epoch 94 iteration 0150/0188: training loss 0.688 Epoch 94 iteration 0151/0188: training loss 0.687 Epoch 94 iteration 0152/0188: training loss 0.688 Epoch 94 iteration 0153/0188: training loss 0.687 Epoch 94 iteration 0154/0188: training loss 0.686 Epoch 94 iteration 0155/0188: training loss 0.687 Epoch 94 iteration 0156/0188: training loss 0.687 Epoch 94 iteration 0157/0188: training loss 0.687 Epoch 94 iteration 0158/0188: training loss 0.686 Epoch 94 iteration 0159/0188: training loss 0.685 Epoch 94 iteration 0160/0188: training loss 0.685 Epoch 94 iteration 0161/0188: training loss 0.685 Epoch 94 iteration 0162/0188: training loss 0.686 Epoch 94 iteration 0163/0188: training loss 0.685 Epoch 94 iteration 0164/0188: training loss 0.685 Epoch 94 iteration 0165/0188: training loss 0.684 Epoch 94 iteration 0166/0188: training loss 0.684 Epoch 94 iteration 0167/0188: training loss 0.683 Epoch 94 iteration 0168/0188: training loss 0.683 Epoch 94 iteration 0169/0188: training loss 0.684 Epoch 94 iteration 0170/0188: training loss 0.685 Epoch 94 iteration 0171/0188: training loss 0.685 Epoch 94 iteration 0172/0188: training loss 0.685 Epoch 94 iteration 0173/0188: training loss 0.684 Epoch 94 iteration 0174/0188: training loss 0.684 Epoch 94 iteration 0175/0188: training loss 0.684 Epoch 94 iteration 0176/0188: training loss 0.684 Epoch 94 iteration 0177/0188: training loss 0.684 Epoch 94 iteration 0178/0188: training loss 0.684 Epoch 94 iteration 0179/0188: training loss 0.685 Epoch 94 iteration 0180/0188: training loss 0.684 Epoch 94 iteration 0181/0188: training loss 0.684 Epoch 94 iteration 0182/0188: training loss 0.684 Epoch 94 iteration 0183/0188: training loss 0.683 Epoch 94 iteration 0184/0188: training loss 0.683 Epoch 94 iteration 0185/0188: training loss 0.683 Epoch 94 iteration 0186/0188: training loss 0.683 Epoch 94 validation pixAcc: 0.874, mIoU: 0.393 Epoch 95 iteration 0001/0187: training loss 0.676 Epoch 95 iteration 0002/0187: training loss 0.656 Epoch 95 iteration 0003/0187: training loss 0.679 Epoch 95 iteration 0004/0187: training loss 0.671 Epoch 95 iteration 0005/0187: training loss 0.669 Epoch 95 iteration 0006/0187: training loss 0.690 Epoch 95 iteration 0007/0187: training loss 0.671 Epoch 95 iteration 0008/0187: training loss 0.684 Epoch 95 iteration 0009/0187: training loss 0.690 Epoch 95 iteration 0010/0187: training loss 0.696 Epoch 95 iteration 0011/0187: training loss 0.707 Epoch 95 iteration 0012/0187: training loss 0.698 Epoch 95 iteration 0013/0187: training loss 0.701 Epoch 95 iteration 0014/0187: training loss 0.695 Epoch 95 iteration 0015/0187: training loss 0.698 Epoch 95 iteration 0016/0187: training loss 0.708 Epoch 95 iteration 0017/0187: training loss 0.709 Epoch 95 iteration 0018/0187: training loss 0.712 Epoch 95 iteration 0019/0187: training loss 0.699 Epoch 95 iteration 0020/0187: training loss 0.697 Epoch 95 iteration 0021/0187: training loss 0.712 Epoch 95 iteration 0022/0187: training loss 0.703 Epoch 95 iteration 0023/0187: training loss 0.712 Epoch 95 iteration 0024/0187: training loss 0.716 Epoch 95 iteration 0025/0187: training loss 0.716 Epoch 95 iteration 0026/0187: training loss 0.714 Epoch 95 iteration 0027/0187: training loss 0.710 Epoch 95 iteration 0028/0187: training loss 0.710 Epoch 95 iteration 0029/0187: training loss 0.714 Epoch 95 iteration 0030/0187: training loss 0.712 Epoch 95 iteration 0031/0187: training loss 0.716 Epoch 95 iteration 0032/0187: training loss 0.710 Epoch 95 iteration 0033/0187: training loss 0.710 Epoch 95 iteration 0034/0187: training loss 0.714 Epoch 95 iteration 0035/0187: training loss 0.714 Epoch 95 iteration 0036/0187: training loss 0.716 Epoch 95 iteration 0037/0187: training loss 0.712 Epoch 95 iteration 0038/0187: training loss 0.709 Epoch 95 iteration 0039/0187: training loss 0.706 Epoch 95 iteration 0040/0187: training loss 0.701 Epoch 95 iteration 0041/0187: training loss 0.700 Epoch 95 iteration 0042/0187: training loss 0.698 Epoch 95 iteration 0043/0187: training loss 0.698 Epoch 95 iteration 0044/0187: training loss 0.696 Epoch 95 iteration 0045/0187: training loss 0.695 Epoch 95 iteration 0046/0187: training loss 0.693 Epoch 95 iteration 0047/0187: training loss 0.696 Epoch 95 iteration 0048/0187: training loss 0.694 Epoch 95 iteration 0049/0187: training loss 0.697 Epoch 95 iteration 0050/0187: training loss 0.703 Epoch 95 iteration 0051/0187: training loss 0.705 Epoch 95 iteration 0052/0187: training loss 0.707 Epoch 95 iteration 0053/0187: training loss 0.706 Epoch 95 iteration 0054/0187: training loss 0.708 Epoch 95 iteration 0055/0187: training loss 0.707 Epoch 95 iteration 0056/0187: training loss 0.708 Epoch 95 iteration 0057/0187: training loss 0.706 Epoch 95 iteration 0058/0187: training loss 0.707 Epoch 95 iteration 0059/0187: training loss 0.706 Epoch 95 iteration 0060/0187: training loss 0.708 Epoch 95 iteration 0061/0187: training loss 0.707 Epoch 95 iteration 0062/0187: training loss 0.709 Epoch 95 iteration 0063/0187: training loss 0.708 Epoch 95 iteration 0064/0187: training loss 0.709 Epoch 95 iteration 0065/0187: training loss 0.707 Epoch 95 iteration 0066/0187: training loss 0.706 Epoch 95 iteration 0067/0187: training loss 0.710 Epoch 95 iteration 0068/0187: training loss 0.711 Epoch 95 iteration 0069/0187: training loss 0.709 Epoch 95 iteration 0070/0187: training loss 0.709 Epoch 95 iteration 0071/0187: training loss 0.708 Epoch 95 iteration 0072/0187: training loss 0.706 Epoch 95 iteration 0073/0187: training loss 0.704 Epoch 95 iteration 0074/0187: training loss 0.703 Epoch 95 iteration 0075/0187: training loss 0.704 Epoch 95 iteration 0076/0187: training loss 0.705 Epoch 95 iteration 0077/0187: training loss 0.705 Epoch 95 iteration 0078/0187: training loss 0.705 Epoch 95 iteration 0079/0187: training loss 0.704 Epoch 95 iteration 0080/0187: training loss 0.702 Epoch 95 iteration 0081/0187: training loss 0.701 Epoch 95 iteration 0082/0187: training loss 0.700 Epoch 95 iteration 0083/0187: training loss 0.700 Epoch 95 iteration 0084/0187: training loss 0.702 Epoch 95 iteration 0085/0187: training loss 0.702 Epoch 95 iteration 0086/0187: training loss 0.700 Epoch 95 iteration 0087/0187: training loss 0.699 Epoch 95 iteration 0088/0187: training loss 0.699 Epoch 95 iteration 0089/0187: training loss 0.697 Epoch 95 iteration 0090/0187: training loss 0.696 Epoch 95 iteration 0091/0187: training loss 0.694 Epoch 95 iteration 0092/0187: training loss 0.694 Epoch 95 iteration 0093/0187: training loss 0.694 Epoch 95 iteration 0094/0187: training loss 0.692 Epoch 95 iteration 0095/0187: training loss 0.692 Epoch 95 iteration 0096/0187: training loss 0.692 Epoch 95 iteration 0097/0187: training loss 0.694 Epoch 95 iteration 0098/0187: training loss 0.693 Epoch 95 iteration 0099/0187: training loss 0.693 Epoch 95 iteration 0100/0187: training loss 0.692 Epoch 95 iteration 0101/0187: training loss 0.692 Epoch 95 iteration 0102/0187: training loss 0.692 Epoch 95 iteration 0103/0187: training loss 0.692 Epoch 95 iteration 0104/0187: training loss 0.693 Epoch 95 iteration 0105/0187: training loss 0.693 Epoch 95 iteration 0106/0187: training loss 0.693 Epoch 95 iteration 0107/0187: training loss 0.692 Epoch 95 iteration 0108/0187: training loss 0.691 Epoch 95 iteration 0109/0187: training loss 0.690 Epoch 95 iteration 0110/0187: training loss 0.689 Epoch 95 iteration 0111/0187: training loss 0.688 Epoch 95 iteration 0112/0187: training loss 0.688 Epoch 95 iteration 0113/0187: training loss 0.689 Epoch 95 iteration 0114/0187: training loss 0.690 Epoch 95 iteration 0115/0187: training loss 0.689 Epoch 95 iteration 0116/0187: training loss 0.689 Epoch 95 iteration 0117/0187: training loss 0.689 Epoch 95 iteration 0118/0187: training loss 0.688 Epoch 95 iteration 0119/0187: training loss 0.687 Epoch 95 iteration 0120/0187: training loss 0.688 Epoch 95 iteration 0121/0187: training loss 0.688 Epoch 95 iteration 0122/0187: training loss 0.688 Epoch 95 iteration 0123/0187: training loss 0.689 Epoch 95 iteration 0124/0187: training loss 0.690 Epoch 95 iteration 0125/0187: training loss 0.691 Epoch 95 iteration 0126/0187: training loss 0.689 Epoch 95 iteration 0127/0187: training loss 0.689 Epoch 95 iteration 0128/0187: training loss 0.689 Epoch 95 iteration 0129/0187: training loss 0.690 Epoch 95 iteration 0130/0187: training loss 0.690 Epoch 95 iteration 0131/0187: training loss 0.689 Epoch 95 iteration 0132/0187: training loss 0.689 Epoch 95 iteration 0133/0187: training loss 0.689 Epoch 95 iteration 0134/0187: training loss 0.690 Epoch 95 iteration 0135/0187: training loss 0.689 Epoch 95 iteration 0136/0187: training loss 0.689 Epoch 95 iteration 0137/0187: training loss 0.687 Epoch 95 iteration 0138/0187: training loss 0.687 Epoch 95 iteration 0139/0187: training loss 0.686 Epoch 95 iteration 0140/0187: training loss 0.684 Epoch 95 iteration 0141/0187: training loss 0.685 Epoch 95 iteration 0142/0187: training loss 0.686 Epoch 95 iteration 0143/0187: training loss 0.686 Epoch 95 iteration 0144/0187: training loss 0.686 Epoch 95 iteration 0145/0187: training loss 0.686 Epoch 95 iteration 0146/0187: training loss 0.688 Epoch 95 iteration 0147/0187: training loss 0.687 Epoch 95 iteration 0148/0187: training loss 0.688 Epoch 95 iteration 0149/0187: training loss 0.687 Epoch 95 iteration 0150/0187: training loss 0.687 Epoch 95 iteration 0151/0187: training loss 0.686 Epoch 95 iteration 0152/0187: training loss 0.687 Epoch 95 iteration 0153/0187: training loss 0.686 Epoch 95 iteration 0154/0187: training loss 0.685 Epoch 95 iteration 0155/0187: training loss 0.685 Epoch 95 iteration 0156/0187: training loss 0.685 Epoch 95 iteration 0157/0187: training loss 0.685 Epoch 95 iteration 0158/0187: training loss 0.684 Epoch 95 iteration 0159/0187: training loss 0.684 Epoch 95 iteration 0160/0187: training loss 0.684 Epoch 95 iteration 0161/0187: training loss 0.685 Epoch 95 iteration 0162/0187: training loss 0.684 Epoch 95 iteration 0163/0187: training loss 0.686 Epoch 95 iteration 0164/0187: training loss 0.686 Epoch 95 iteration 0165/0187: training loss 0.687 Epoch 95 iteration 0166/0187: training loss 0.687 Epoch 95 iteration 0167/0187: training loss 0.687 Epoch 95 iteration 0168/0187: training loss 0.686 Epoch 95 iteration 0169/0187: training loss 0.687 Epoch 95 iteration 0170/0187: training loss 0.687 Epoch 95 iteration 0171/0187: training loss 0.687 Epoch 95 iteration 0172/0187: training loss 0.686 Epoch 95 iteration 0173/0187: training loss 0.686 Epoch 95 iteration 0174/0187: training loss 0.685 Epoch 95 iteration 0175/0187: training loss 0.685 Epoch 95 iteration 0176/0187: training loss 0.684 Epoch 95 iteration 0177/0187: training loss 0.685 Epoch 95 iteration 0178/0187: training loss 0.685 Epoch 95 iteration 0179/0187: training loss 0.685 Epoch 95 iteration 0180/0187: training loss 0.685 Epoch 95 iteration 0181/0187: training loss 0.685 Epoch 95 iteration 0182/0187: training loss 0.684 Epoch 95 iteration 0183/0187: training loss 0.685 Epoch 95 iteration 0184/0187: training loss 0.685 Epoch 95 iteration 0185/0187: training loss 0.685 Epoch 95 iteration 0186/0187: training loss 0.686 Epoch 95 iteration 0187/0187: training loss 0.686 Epoch 95 validation pixAcc: 0.876, mIoU: 0.392 Epoch 96 iteration 0001/0187: training loss 0.715 Epoch 96 iteration 0002/0187: training loss 0.762 Epoch 96 iteration 0003/0187: training loss 0.749 Epoch 96 iteration 0004/0187: training loss 0.744 Epoch 96 iteration 0005/0187: training loss 0.734 Epoch 96 iteration 0006/0187: training loss 0.751 Epoch 96 iteration 0007/0187: training loss 0.747 Epoch 96 iteration 0008/0187: training loss 0.730 Epoch 96 iteration 0009/0187: training loss 0.732 Epoch 96 iteration 0010/0187: training loss 0.743 Epoch 96 iteration 0011/0187: training loss 0.748 Epoch 96 iteration 0012/0187: training loss 0.733 Epoch 96 iteration 0013/0187: training loss 0.728 Epoch 96 iteration 0014/0187: training loss 0.741 Epoch 96 iteration 0015/0187: training loss 0.732 Epoch 96 iteration 0016/0187: training loss 0.736 Epoch 96 iteration 0017/0187: training loss 0.731 Epoch 96 iteration 0018/0187: training loss 0.731 Epoch 96 iteration 0019/0187: training loss 0.726 Epoch 96 iteration 0020/0187: training loss 0.721 Epoch 96 iteration 0021/0187: training loss 0.722 Epoch 96 iteration 0022/0187: training loss 0.718 Epoch 96 iteration 0023/0187: training loss 0.719 Epoch 96 iteration 0024/0187: training loss 0.724 Epoch 96 iteration 0025/0187: training loss 0.720 Epoch 96 iteration 0026/0187: training loss 0.721 Epoch 96 iteration 0027/0187: training loss 0.726 Epoch 96 iteration 0028/0187: training loss 0.724 Epoch 96 iteration 0029/0187: training loss 0.724 Epoch 96 iteration 0030/0187: training loss 0.719 Epoch 96 iteration 0031/0187: training loss 0.718 Epoch 96 iteration 0032/0187: training loss 0.717 Epoch 96 iteration 0033/0187: training loss 0.711 Epoch 96 iteration 0034/0187: training loss 0.710 Epoch 96 iteration 0035/0187: training loss 0.707 Epoch 96 iteration 0036/0187: training loss 0.707 Epoch 96 iteration 0037/0187: training loss 0.706 Epoch 96 iteration 0038/0187: training loss 0.707 Epoch 96 iteration 0039/0187: training loss 0.705 Epoch 96 iteration 0040/0187: training loss 0.706 Epoch 96 iteration 0041/0187: training loss 0.707 Epoch 96 iteration 0042/0187: training loss 0.707 Epoch 96 iteration 0043/0187: training loss 0.708 Epoch 96 iteration 0044/0187: training loss 0.712 Epoch 96 iteration 0045/0187: training loss 0.709 Epoch 96 iteration 0046/0187: training loss 0.708 Epoch 96 iteration 0047/0187: training loss 0.708 Epoch 96 iteration 0048/0187: training loss 0.708 Epoch 96 iteration 0049/0187: training loss 0.708 Epoch 96 iteration 0050/0187: training loss 0.710 Epoch 96 iteration 0051/0187: training loss 0.708 Epoch 96 iteration 0052/0187: training loss 0.709 Epoch 96 iteration 0053/0187: training loss 0.706 Epoch 96 iteration 0054/0187: training loss 0.709 Epoch 96 iteration 0055/0187: training loss 0.707 Epoch 96 iteration 0056/0187: training loss 0.705 Epoch 96 iteration 0057/0187: training loss 0.707 Epoch 96 iteration 0058/0187: training loss 0.706 Epoch 96 iteration 0059/0187: training loss 0.707 Epoch 96 iteration 0060/0187: training loss 0.704 Epoch 96 iteration 0061/0187: training loss 0.704 Epoch 96 iteration 0062/0187: training loss 0.704 Epoch 96 iteration 0063/0187: training loss 0.702 Epoch 96 iteration 0064/0187: training loss 0.704 Epoch 96 iteration 0065/0187: training loss 0.702 Epoch 96 iteration 0066/0187: training loss 0.704 Epoch 96 iteration 0067/0187: training loss 0.703 Epoch 96 iteration 0068/0187: training loss 0.706 Epoch 96 iteration 0069/0187: training loss 0.704 Epoch 96 iteration 0070/0187: training loss 0.701 Epoch 96 iteration 0071/0187: training loss 0.699 Epoch 96 iteration 0072/0187: training loss 0.698 Epoch 96 iteration 0073/0187: training loss 0.699 Epoch 96 iteration 0074/0187: training loss 0.701 Epoch 96 iteration 0075/0187: training loss 0.699 Epoch 96 iteration 0076/0187: training loss 0.699 Epoch 96 iteration 0077/0187: training loss 0.698 Epoch 96 iteration 0078/0187: training loss 0.697 Epoch 96 iteration 0079/0187: training loss 0.699 Epoch 96 iteration 0080/0187: training loss 0.698 Epoch 96 iteration 0081/0187: training loss 0.697 Epoch 96 iteration 0082/0187: training loss 0.697 Epoch 96 iteration 0083/0187: training loss 0.697 Epoch 96 iteration 0084/0187: training loss 0.695 Epoch 96 iteration 0085/0187: training loss 0.696 Epoch 96 iteration 0086/0187: training loss 0.697 Epoch 96 iteration 0087/0187: training loss 0.698 Epoch 96 iteration 0088/0187: training loss 0.698 Epoch 96 iteration 0089/0187: training loss 0.698 Epoch 96 iteration 0090/0187: training loss 0.698 Epoch 96 iteration 0091/0188: training loss 0.698 Epoch 96 iteration 0092/0188: training loss 0.698 Epoch 96 iteration 0093/0188: training loss 0.696 Epoch 96 iteration 0094/0188: training loss 0.696 Epoch 96 iteration 0095/0188: training loss 0.696 Epoch 96 iteration 0096/0188: training loss 0.695 Epoch 96 iteration 0097/0188: training loss 0.693 Epoch 96 iteration 0098/0188: training loss 0.692 Epoch 96 iteration 0099/0188: training loss 0.691 Epoch 96 iteration 0100/0188: training loss 0.692 Epoch 96 iteration 0101/0188: training loss 0.689 Epoch 96 iteration 0102/0188: training loss 0.689 Epoch 96 iteration 0103/0188: training loss 0.688 Epoch 96 iteration 0104/0188: training loss 0.687 Epoch 96 iteration 0105/0188: training loss 0.687 Epoch 96 iteration 0106/0188: training loss 0.687 Epoch 96 iteration 0107/0188: training loss 0.688 Epoch 96 iteration 0108/0188: training loss 0.687 Epoch 96 iteration 0109/0188: training loss 0.686 Epoch 96 iteration 0110/0188: training loss 0.686 Epoch 96 iteration 0111/0188: training loss 0.687 Epoch 96 iteration 0112/0188: training loss 0.688 Epoch 96 iteration 0113/0188: training loss 0.687 Epoch 96 iteration 0114/0188: training loss 0.689 Epoch 96 iteration 0115/0188: training loss 0.687 Epoch 96 iteration 0116/0188: training loss 0.687 Epoch 96 iteration 0117/0188: training loss 0.686 Epoch 96 iteration 0118/0188: training loss 0.686 Epoch 96 iteration 0119/0188: training loss 0.686 Epoch 96 iteration 0120/0188: training loss 0.687 Epoch 96 iteration 0121/0188: training loss 0.686 Epoch 96 iteration 0122/0188: training loss 0.686 Epoch 96 iteration 0123/0188: training loss 0.686 Epoch 96 iteration 0124/0188: training loss 0.686 Epoch 96 iteration 0125/0188: training loss 0.685 Epoch 96 iteration 0126/0188: training loss 0.685 Epoch 96 iteration 0127/0188: training loss 0.685 Epoch 96 iteration 0128/0188: training loss 0.685 Epoch 96 iteration 0129/0188: training loss 0.686 Epoch 96 iteration 0130/0188: training loss 0.685 Epoch 96 iteration 0131/0188: training loss 0.685 Epoch 96 iteration 0132/0188: training loss 0.686 Epoch 96 iteration 0133/0188: training loss 0.686 Epoch 96 iteration 0134/0188: training loss 0.686 Epoch 96 iteration 0135/0188: training loss 0.686 Epoch 96 iteration 0136/0188: training loss 0.686 Epoch 96 iteration 0137/0188: training loss 0.686 Epoch 96 iteration 0138/0188: training loss 0.686 Epoch 96 iteration 0139/0188: training loss 0.685 Epoch 96 iteration 0140/0188: training loss 0.687 Epoch 96 iteration 0141/0188: training loss 0.686 Epoch 96 iteration 0142/0188: training loss 0.687 Epoch 96 iteration 0143/0188: training loss 0.686 Epoch 96 iteration 0144/0188: training loss 0.685 Epoch 96 iteration 0145/0188: training loss 0.685 Epoch 96 iteration 0146/0188: training loss 0.684 Epoch 96 iteration 0147/0188: training loss 0.684 Epoch 96 iteration 0148/0188: training loss 0.684 Epoch 96 iteration 0149/0188: training loss 0.684 Epoch 96 iteration 0150/0188: training loss 0.684 Epoch 96 iteration 0151/0188: training loss 0.684 Epoch 96 iteration 0152/0188: training loss 0.683 Epoch 96 iteration 0153/0188: training loss 0.683 Epoch 96 iteration 0154/0188: training loss 0.683 Epoch 96 iteration 0155/0188: training loss 0.683 Epoch 96 iteration 0156/0188: training loss 0.683 Epoch 96 iteration 0157/0188: training loss 0.683 Epoch 96 iteration 0158/0188: training loss 0.684 Epoch 96 iteration 0159/0188: training loss 0.684 Epoch 96 iteration 0160/0188: training loss 0.684 Epoch 96 iteration 0161/0188: training loss 0.684 Epoch 96 iteration 0162/0188: training loss 0.684 Epoch 96 iteration 0163/0188: training loss 0.684 Epoch 96 iteration 0164/0188: training loss 0.684 Epoch 96 iteration 0165/0188: training loss 0.684 Epoch 96 iteration 0166/0188: training loss 0.684 Epoch 96 iteration 0167/0188: training loss 0.684 Epoch 96 iteration 0168/0188: training loss 0.684 Epoch 96 iteration 0169/0188: training loss 0.684 Epoch 96 iteration 0170/0188: training loss 0.685 Epoch 96 iteration 0171/0188: training loss 0.684 Epoch 96 iteration 0172/0188: training loss 0.684 Epoch 96 iteration 0173/0188: training loss 0.684 Epoch 96 iteration 0174/0188: training loss 0.685 Epoch 96 iteration 0175/0188: training loss 0.684 Epoch 96 iteration 0176/0188: training loss 0.684 Epoch 96 iteration 0177/0188: training loss 0.684 Epoch 96 iteration 0178/0188: training loss 0.684 Epoch 96 iteration 0179/0188: training loss 0.684 Epoch 96 iteration 0180/0188: training loss 0.684 Epoch 96 iteration 0181/0188: training loss 0.683 Epoch 96 iteration 0182/0188: training loss 0.684 Epoch 96 iteration 0183/0188: training loss 0.684 Epoch 96 iteration 0184/0188: training loss 0.683 Epoch 96 iteration 0185/0188: training loss 0.683 Epoch 96 iteration 0186/0188: training loss 0.683 Epoch 96 validation pixAcc: 0.876, mIoU: 0.393 Epoch 97 iteration 0001/0187: training loss 0.726 Epoch 97 iteration 0002/0187: training loss 0.684 Epoch 97 iteration 0003/0187: training loss 0.661 Epoch 97 iteration 0004/0187: training loss 0.698 Epoch 97 iteration 0005/0187: training loss 0.682 Epoch 97 iteration 0006/0187: training loss 0.680 Epoch 97 iteration 0007/0187: training loss 0.672 Epoch 97 iteration 0008/0187: training loss 0.666 Epoch 97 iteration 0009/0187: training loss 0.656 Epoch 97 iteration 0010/0187: training loss 0.658 Epoch 97 iteration 0011/0187: training loss 0.658 Epoch 97 iteration 0012/0187: training loss 0.658 Epoch 97 iteration 0013/0187: training loss 0.659 Epoch 97 iteration 0014/0187: training loss 0.663 Epoch 97 iteration 0015/0187: training loss 0.670 Epoch 97 iteration 0016/0187: training loss 0.672 Epoch 97 iteration 0017/0187: training loss 0.675 Epoch 97 iteration 0018/0187: training loss 0.670 Epoch 97 iteration 0019/0187: training loss 0.667 Epoch 97 iteration 0020/0187: training loss 0.665 Epoch 97 iteration 0021/0187: training loss 0.668 Epoch 97 iteration 0022/0187: training loss 0.668 Epoch 97 iteration 0023/0187: training loss 0.667 Epoch 97 iteration 0024/0187: training loss 0.669 Epoch 97 iteration 0025/0187: training loss 0.667 Epoch 97 iteration 0026/0187: training loss 0.672 Epoch 97 iteration 0027/0187: training loss 0.671 Epoch 97 iteration 0028/0187: training loss 0.667 Epoch 97 iteration 0029/0187: training loss 0.668 Epoch 97 iteration 0030/0187: training loss 0.668 Epoch 97 iteration 0031/0187: training loss 0.666 Epoch 97 iteration 0032/0187: training loss 0.667 Epoch 97 iteration 0033/0187: training loss 0.671 Epoch 97 iteration 0034/0187: training loss 0.672 Epoch 97 iteration 0035/0187: training loss 0.673 Epoch 97 iteration 0036/0187: training loss 0.680 Epoch 97 iteration 0037/0187: training loss 0.680 Epoch 97 iteration 0038/0187: training loss 0.677 Epoch 97 iteration 0039/0187: training loss 0.676 Epoch 97 iteration 0040/0187: training loss 0.675 Epoch 97 iteration 0041/0187: training loss 0.676 Epoch 97 iteration 0042/0187: training loss 0.676 Epoch 97 iteration 0043/0187: training loss 0.673 Epoch 97 iteration 0044/0187: training loss 0.671 Epoch 97 iteration 0045/0187: training loss 0.674 Epoch 97 iteration 0046/0187: training loss 0.671 Epoch 97 iteration 0047/0187: training loss 0.670 Epoch 97 iteration 0048/0187: training loss 0.669 Epoch 97 iteration 0049/0187: training loss 0.672 Epoch 97 iteration 0050/0187: training loss 0.675 Epoch 97 iteration 0051/0187: training loss 0.673 Epoch 97 iteration 0052/0187: training loss 0.676 Epoch 97 iteration 0053/0187: training loss 0.675 Epoch 97 iteration 0054/0187: training loss 0.673 Epoch 97 iteration 0055/0187: training loss 0.677 Epoch 97 iteration 0056/0187: training loss 0.677 Epoch 97 iteration 0057/0187: training loss 0.676 Epoch 97 iteration 0058/0187: training loss 0.679 Epoch 97 iteration 0059/0187: training loss 0.679 Epoch 97 iteration 0060/0187: training loss 0.677 Epoch 97 iteration 0061/0187: training loss 0.678 Epoch 97 iteration 0062/0187: training loss 0.676 Epoch 97 iteration 0063/0187: training loss 0.677 Epoch 97 iteration 0064/0187: training loss 0.677 Epoch 97 iteration 0065/0187: training loss 0.675 Epoch 97 iteration 0066/0187: training loss 0.676 Epoch 97 iteration 0067/0187: training loss 0.679 Epoch 97 iteration 0068/0187: training loss 0.676 Epoch 97 iteration 0069/0187: training loss 0.675 Epoch 97 iteration 0070/0187: training loss 0.676 Epoch 97 iteration 0071/0187: training loss 0.675 Epoch 97 iteration 0072/0187: training loss 0.675 Epoch 97 iteration 0073/0187: training loss 0.677 Epoch 97 iteration 0074/0187: training loss 0.676 Epoch 97 iteration 0075/0187: training loss 0.676 Epoch 97 iteration 0076/0187: training loss 0.679 Epoch 97 iteration 0077/0187: training loss 0.679 Epoch 97 iteration 0078/0187: training loss 0.679 Epoch 97 iteration 0079/0187: training loss 0.679 Epoch 97 iteration 0080/0187: training loss 0.679 Epoch 97 iteration 0081/0187: training loss 0.678 Epoch 97 iteration 0082/0187: training loss 0.678 Epoch 97 iteration 0083/0187: training loss 0.677 Epoch 97 iteration 0084/0187: training loss 0.676 Epoch 97 iteration 0085/0187: training loss 0.676 Epoch 97 iteration 0086/0187: training loss 0.676 Epoch 97 iteration 0087/0187: training loss 0.674 Epoch 97 iteration 0088/0187: training loss 0.675 Epoch 97 iteration 0089/0187: training loss 0.675 Epoch 97 iteration 0090/0187: training loss 0.673 Epoch 97 iteration 0091/0187: training loss 0.676 Epoch 97 iteration 0092/0187: training loss 0.676 Epoch 97 iteration 0093/0187: training loss 0.676 Epoch 97 iteration 0094/0187: training loss 0.675 Epoch 97 iteration 0095/0187: training loss 0.675 Epoch 97 iteration 0096/0187: training loss 0.674 Epoch 97 iteration 0097/0187: training loss 0.674 Epoch 97 iteration 0098/0187: training loss 0.674 Epoch 97 iteration 0099/0187: training loss 0.673 Epoch 97 iteration 0100/0187: training loss 0.673 Epoch 97 iteration 0101/0187: training loss 0.672 Epoch 97 iteration 0102/0187: training loss 0.671 Epoch 97 iteration 0103/0187: training loss 0.671 Epoch 97 iteration 0104/0187: training loss 0.672 Epoch 97 iteration 0105/0187: training loss 0.674 Epoch 97 iteration 0106/0187: training loss 0.674 Epoch 97 iteration 0107/0187: training loss 0.673 Epoch 97 iteration 0108/0187: training loss 0.673 Epoch 97 iteration 0109/0187: training loss 0.674 Epoch 97 iteration 0110/0187: training loss 0.674 Epoch 97 iteration 0111/0187: training loss 0.674 Epoch 97 iteration 0112/0187: training loss 0.674 Epoch 97 iteration 0113/0187: training loss 0.673 Epoch 97 iteration 0114/0187: training loss 0.674 Epoch 97 iteration 0115/0187: training loss 0.675 Epoch 97 iteration 0116/0187: training loss 0.676 Epoch 97 iteration 0117/0187: training loss 0.676 Epoch 97 iteration 0118/0187: training loss 0.676 Epoch 97 iteration 0119/0187: training loss 0.679 Epoch 97 iteration 0120/0187: training loss 0.680 Epoch 97 iteration 0121/0187: training loss 0.680 Epoch 97 iteration 0122/0187: training loss 0.681 Epoch 97 iteration 0123/0187: training loss 0.679 Epoch 97 iteration 0124/0187: training loss 0.679 Epoch 97 iteration 0125/0187: training loss 0.680 Epoch 97 iteration 0126/0187: training loss 0.679 Epoch 97 iteration 0127/0187: training loss 0.681 Epoch 97 iteration 0128/0187: training loss 0.681 Epoch 97 iteration 0129/0187: training loss 0.680 Epoch 97 iteration 0130/0187: training loss 0.680 Epoch 97 iteration 0131/0187: training loss 0.680 Epoch 97 iteration 0132/0187: training loss 0.680 Epoch 97 iteration 0133/0187: training loss 0.680 Epoch 97 iteration 0134/0187: training loss 0.680 Epoch 97 iteration 0135/0187: training loss 0.679 Epoch 97 iteration 0136/0187: training loss 0.679 Epoch 97 iteration 0137/0187: training loss 0.678 Epoch 97 iteration 0138/0187: training loss 0.678 Epoch 97 iteration 0139/0187: training loss 0.677 Epoch 97 iteration 0140/0187: training loss 0.676 Epoch 97 iteration 0141/0187: training loss 0.676 Epoch 97 iteration 0142/0187: training loss 0.676 Epoch 97 iteration 0143/0187: training loss 0.676 Epoch 97 iteration 0144/0187: training loss 0.676 Epoch 97 iteration 0145/0187: training loss 0.676 Epoch 97 iteration 0146/0187: training loss 0.676 Epoch 97 iteration 0147/0187: training loss 0.676 Epoch 97 iteration 0148/0187: training loss 0.678 Epoch 97 iteration 0149/0187: training loss 0.678 Epoch 97 iteration 0150/0187: training loss 0.679 Epoch 97 iteration 0151/0187: training loss 0.680 Epoch 97 iteration 0152/0187: training loss 0.681 Epoch 97 iteration 0153/0187: training loss 0.681 Epoch 97 iteration 0154/0187: training loss 0.681 Epoch 97 iteration 0155/0187: training loss 0.681 Epoch 97 iteration 0156/0187: training loss 0.681 Epoch 97 iteration 0157/0187: training loss 0.681 Epoch 97 iteration 0158/0187: training loss 0.681 Epoch 97 iteration 0159/0187: training loss 0.680 Epoch 97 iteration 0160/0187: training loss 0.680 Epoch 97 iteration 0161/0187: training loss 0.680 Epoch 97 iteration 0162/0187: training loss 0.681 Epoch 97 iteration 0163/0187: training loss 0.680 Epoch 97 iteration 0164/0187: training loss 0.681 Epoch 97 iteration 0165/0187: training loss 0.682 Epoch 97 iteration 0166/0187: training loss 0.682 Epoch 97 iteration 0167/0187: training loss 0.682 Epoch 97 iteration 0168/0187: training loss 0.682 Epoch 97 iteration 0169/0187: training loss 0.681 Epoch 97 iteration 0170/0187: training loss 0.681 Epoch 97 iteration 0171/0187: training loss 0.682 Epoch 97 iteration 0172/0187: training loss 0.682 Epoch 97 iteration 0173/0187: training loss 0.681 Epoch 97 iteration 0174/0187: training loss 0.682 Epoch 97 iteration 0175/0187: training loss 0.682 Epoch 97 iteration 0176/0187: training loss 0.682 Epoch 97 iteration 0177/0187: training loss 0.682 Epoch 97 iteration 0178/0187: training loss 0.683 Epoch 97 iteration 0179/0187: training loss 0.683 Epoch 97 iteration 0180/0187: training loss 0.683 Epoch 97 iteration 0181/0187: training loss 0.682 Epoch 97 iteration 0182/0187: training loss 0.682 Epoch 97 iteration 0183/0187: training loss 0.682 Epoch 97 iteration 0184/0187: training loss 0.683 Epoch 97 iteration 0185/0187: training loss 0.683 Epoch 97 iteration 0186/0187: training loss 0.683 Epoch 97 iteration 0187/0187: training loss 0.682 Epoch 97 validation pixAcc: 0.875, mIoU: 0.392 Epoch 98 iteration 0001/0187: training loss 0.703 Epoch 98 iteration 0002/0187: training loss 0.663 Epoch 98 iteration 0003/0187: training loss 0.653 Epoch 98 iteration 0004/0187: training loss 0.711 Epoch 98 iteration 0005/0187: training loss 0.690 Epoch 98 iteration 0006/0187: training loss 0.713 Epoch 98 iteration 0007/0187: training loss 0.738 Epoch 98 iteration 0008/0187: training loss 0.728 Epoch 98 iteration 0009/0187: training loss 0.735 Epoch 98 iteration 0010/0187: training loss 0.747 Epoch 98 iteration 0011/0187: training loss 0.744 Epoch 98 iteration 0012/0187: training loss 0.738 Epoch 98 iteration 0013/0187: training loss 0.743 Epoch 98 iteration 0014/0187: training loss 0.735 Epoch 98 iteration 0015/0187: training loss 0.733 Epoch 98 iteration 0016/0187: training loss 0.734 Epoch 98 iteration 0017/0187: training loss 0.733 Epoch 98 iteration 0018/0187: training loss 0.727 Epoch 98 iteration 0019/0187: training loss 0.724 Epoch 98 iteration 0020/0187: training loss 0.719 Epoch 98 iteration 0021/0187: training loss 0.721 Epoch 98 iteration 0022/0187: training loss 0.722 Epoch 98 iteration 0023/0187: training loss 0.724 Epoch 98 iteration 0024/0187: training loss 0.718 Epoch 98 iteration 0025/0187: training loss 0.720 Epoch 98 iteration 0026/0187: training loss 0.718 Epoch 98 iteration 0027/0187: training loss 0.716 Epoch 98 iteration 0028/0187: training loss 0.714 Epoch 98 iteration 0029/0187: training loss 0.714 Epoch 98 iteration 0030/0187: training loss 0.714 Epoch 98 iteration 0031/0187: training loss 0.711 Epoch 98 iteration 0032/0187: training loss 0.708 Epoch 98 iteration 0033/0187: training loss 0.709 Epoch 98 iteration 0034/0187: training loss 0.706 Epoch 98 iteration 0035/0187: training loss 0.706 Epoch 98 iteration 0036/0187: training loss 0.709 Epoch 98 iteration 0037/0187: training loss 0.713 Epoch 98 iteration 0038/0187: training loss 0.709 Epoch 98 iteration 0039/0187: training loss 0.705 Epoch 98 iteration 0040/0187: training loss 0.703 Epoch 98 iteration 0041/0187: training loss 0.706 Epoch 98 iteration 0042/0187: training loss 0.703 Epoch 98 iteration 0043/0187: training loss 0.700 Epoch 98 iteration 0044/0187: training loss 0.699 Epoch 98 iteration 0045/0187: training loss 0.701 Epoch 98 iteration 0046/0187: training loss 0.699 Epoch 98 iteration 0047/0187: training loss 0.698 Epoch 98 iteration 0048/0187: training loss 0.698 Epoch 98 iteration 0049/0187: training loss 0.697 Epoch 98 iteration 0050/0187: training loss 0.695 Epoch 98 iteration 0051/0187: training loss 0.701 Epoch 98 iteration 0052/0187: training loss 0.698 Epoch 98 iteration 0053/0187: training loss 0.696 Epoch 98 iteration 0054/0187: training loss 0.693 Epoch 98 iteration 0055/0187: training loss 0.695 Epoch 98 iteration 0056/0187: training loss 0.693 Epoch 98 iteration 0057/0187: training loss 0.694 Epoch 98 iteration 0058/0187: training loss 0.692 Epoch 98 iteration 0059/0187: training loss 0.689 Epoch 98 iteration 0060/0187: training loss 0.690 Epoch 98 iteration 0061/0187: training loss 0.689 Epoch 98 iteration 0062/0187: training loss 0.690 Epoch 98 iteration 0063/0187: training loss 0.688 Epoch 98 iteration 0064/0187: training loss 0.686 Epoch 98 iteration 0065/0187: training loss 0.687 Epoch 98 iteration 0066/0187: training loss 0.688 Epoch 98 iteration 0067/0187: training loss 0.686 Epoch 98 iteration 0068/0187: training loss 0.687 Epoch 98 iteration 0069/0187: training loss 0.686 Epoch 98 iteration 0070/0187: training loss 0.685 Epoch 98 iteration 0071/0187: training loss 0.685 Epoch 98 iteration 0072/0187: training loss 0.686 Epoch 98 iteration 0073/0187: training loss 0.686 Epoch 98 iteration 0074/0187: training loss 0.685 Epoch 98 iteration 0075/0187: training loss 0.684 Epoch 98 iteration 0076/0187: training loss 0.685 Epoch 98 iteration 0077/0187: training loss 0.685 Epoch 98 iteration 0078/0187: training loss 0.685 Epoch 98 iteration 0079/0187: training loss 0.685 Epoch 98 iteration 0080/0187: training loss 0.683 Epoch 98 iteration 0081/0187: training loss 0.684 Epoch 98 iteration 0082/0187: training loss 0.684 Epoch 98 iteration 0083/0187: training loss 0.684 Epoch 98 iteration 0084/0187: training loss 0.685 Epoch 98 iteration 0085/0187: training loss 0.686 Epoch 98 iteration 0086/0187: training loss 0.684 Epoch 98 iteration 0087/0187: training loss 0.684 Epoch 98 iteration 0088/0187: training loss 0.684 Epoch 98 iteration 0089/0187: training loss 0.683 Epoch 98 iteration 0090/0187: training loss 0.682 Epoch 98 iteration 0091/0188: training loss 0.683 Epoch 98 iteration 0092/0188: training loss 0.682 Epoch 98 iteration 0093/0188: training loss 0.682 Epoch 98 iteration 0094/0188: training loss 0.684 Epoch 98 iteration 0095/0188: training loss 0.683 Epoch 98 iteration 0096/0188: training loss 0.682 Epoch 98 iteration 0097/0188: training loss 0.680 Epoch 98 iteration 0098/0188: training loss 0.679 Epoch 98 iteration 0099/0188: training loss 0.679 Epoch 98 iteration 0100/0188: training loss 0.678 Epoch 98 iteration 0101/0188: training loss 0.680 Epoch 98 iteration 0102/0188: training loss 0.680 Epoch 98 iteration 0103/0188: training loss 0.679 Epoch 98 iteration 0104/0188: training loss 0.682 Epoch 98 iteration 0105/0188: training loss 0.682 Epoch 98 iteration 0106/0188: training loss 0.681 Epoch 98 iteration 0107/0188: training loss 0.681 Epoch 98 iteration 0108/0188: training loss 0.680 Epoch 98 iteration 0109/0188: training loss 0.679 Epoch 98 iteration 0110/0188: training loss 0.681 Epoch 98 iteration 0111/0188: training loss 0.681 Epoch 98 iteration 0112/0188: training loss 0.680 Epoch 98 iteration 0113/0188: training loss 0.679 Epoch 98 iteration 0114/0188: training loss 0.679 Epoch 98 iteration 0115/0188: training loss 0.679 Epoch 98 iteration 0116/0188: training loss 0.680 Epoch 98 iteration 0117/0188: training loss 0.681 Epoch 98 iteration 0118/0188: training loss 0.681 Epoch 98 iteration 0119/0188: training loss 0.680 Epoch 98 iteration 0120/0188: training loss 0.679 Epoch 98 iteration 0121/0188: training loss 0.680 Epoch 98 iteration 0122/0188: training loss 0.681 Epoch 98 iteration 0123/0188: training loss 0.683 Epoch 98 iteration 0124/0188: training loss 0.684 Epoch 98 iteration 0125/0188: training loss 0.683 Epoch 98 iteration 0126/0188: training loss 0.683 Epoch 98 iteration 0127/0188: training loss 0.683 Epoch 98 iteration 0128/0188: training loss 0.682 Epoch 98 iteration 0129/0188: training loss 0.683 Epoch 98 iteration 0130/0188: training loss 0.683 Epoch 98 iteration 0131/0188: training loss 0.682 Epoch 98 iteration 0132/0188: training loss 0.682 Epoch 98 iteration 0133/0188: training loss 0.682 Epoch 98 iteration 0134/0188: training loss 0.682 Epoch 98 iteration 0135/0188: training loss 0.682 Epoch 98 iteration 0136/0188: training loss 0.682 Epoch 98 iteration 0137/0188: training loss 0.681 Epoch 98 iteration 0138/0188: training loss 0.681 Epoch 98 iteration 0139/0188: training loss 0.681 Epoch 98 iteration 0140/0188: training loss 0.681 Epoch 98 iteration 0141/0188: training loss 0.680 Epoch 98 iteration 0142/0188: training loss 0.681 Epoch 98 iteration 0143/0188: training loss 0.680 Epoch 98 iteration 0144/0188: training loss 0.680 Epoch 98 iteration 0145/0188: training loss 0.679 Epoch 98 iteration 0146/0188: training loss 0.679 Epoch 98 iteration 0147/0188: training loss 0.678 Epoch 98 iteration 0148/0188: training loss 0.679 Epoch 98 iteration 0149/0188: training loss 0.678 Epoch 98 iteration 0150/0188: training loss 0.678 Epoch 98 iteration 0151/0188: training loss 0.678 Epoch 98 iteration 0152/0188: training loss 0.678 Epoch 98 iteration 0153/0188: training loss 0.678 Epoch 98 iteration 0154/0188: training loss 0.678 Epoch 98 iteration 0155/0188: training loss 0.677 Epoch 98 iteration 0156/0188: training loss 0.676 Epoch 98 iteration 0157/0188: training loss 0.677 Epoch 98 iteration 0158/0188: training loss 0.677 Epoch 98 iteration 0159/0188: training loss 0.677 Epoch 98 iteration 0160/0188: training loss 0.677 Epoch 98 iteration 0161/0188: training loss 0.677 Epoch 98 iteration 0162/0188: training loss 0.677 Epoch 98 iteration 0163/0188: training loss 0.678 Epoch 98 iteration 0164/0188: training loss 0.677 Epoch 98 iteration 0165/0188: training loss 0.678 Epoch 98 iteration 0166/0188: training loss 0.679 Epoch 98 iteration 0167/0188: training loss 0.680 Epoch 98 iteration 0168/0188: training loss 0.679 Epoch 98 iteration 0169/0188: training loss 0.679 Epoch 98 iteration 0170/0188: training loss 0.680 Epoch 98 iteration 0171/0188: training loss 0.680 Epoch 98 iteration 0172/0188: training loss 0.680 Epoch 98 iteration 0173/0188: training loss 0.679 Epoch 98 iteration 0174/0188: training loss 0.678 Epoch 98 iteration 0175/0188: training loss 0.679 Epoch 98 iteration 0176/0188: training loss 0.679 Epoch 98 iteration 0177/0188: training loss 0.678 Epoch 98 iteration 0178/0188: training loss 0.678 Epoch 98 iteration 0179/0188: training loss 0.678 Epoch 98 iteration 0180/0188: training loss 0.677 Epoch 98 iteration 0181/0188: training loss 0.677 Epoch 98 iteration 0182/0188: training loss 0.677 Epoch 98 iteration 0183/0188: training loss 0.677 Epoch 98 iteration 0184/0188: training loss 0.677 Epoch 98 iteration 0185/0188: training loss 0.677 Epoch 98 iteration 0186/0188: training loss 0.677 Epoch 98 validation pixAcc: 0.875, mIoU: 0.395 Epoch 99 iteration 0001/0187: training loss 0.696 Epoch 99 iteration 0002/0187: training loss 0.682 Epoch 99 iteration 0003/0187: training loss 0.690 Epoch 99 iteration 0004/0187: training loss 0.697 Epoch 99 iteration 0005/0187: training loss 0.678 Epoch 99 iteration 0006/0187: training loss 0.658 Epoch 99 iteration 0007/0187: training loss 0.664 Epoch 99 iteration 0008/0187: training loss 0.663 Epoch 99 iteration 0009/0187: training loss 0.670 Epoch 99 iteration 0010/0187: training loss 0.673 Epoch 99 iteration 0011/0187: training loss 0.673 Epoch 99 iteration 0012/0187: training loss 0.674 Epoch 99 iteration 0013/0187: training loss 0.672 Epoch 99 iteration 0014/0187: training loss 0.668 Epoch 99 iteration 0015/0187: training loss 0.671 Epoch 99 iteration 0016/0187: training loss 0.671 Epoch 99 iteration 0017/0187: training loss 0.672 Epoch 99 iteration 0018/0187: training loss 0.682 Epoch 99 iteration 0019/0187: training loss 0.678 Epoch 99 iteration 0020/0187: training loss 0.675 Epoch 99 iteration 0021/0187: training loss 0.673 Epoch 99 iteration 0022/0187: training loss 0.676 Epoch 99 iteration 0023/0187: training loss 0.673 Epoch 99 iteration 0024/0187: training loss 0.682 Epoch 99 iteration 0025/0187: training loss 0.681 Epoch 99 iteration 0026/0187: training loss 0.678 Epoch 99 iteration 0027/0187: training loss 0.679 Epoch 99 iteration 0028/0187: training loss 0.674 Epoch 99 iteration 0029/0187: training loss 0.676 Epoch 99 iteration 0030/0187: training loss 0.679 Epoch 99 iteration 0031/0187: training loss 0.681 Epoch 99 iteration 0032/0187: training loss 0.679 Epoch 99 iteration 0033/0187: training loss 0.681 Epoch 99 iteration 0034/0187: training loss 0.689 Epoch 99 iteration 0035/0187: training loss 0.685 Epoch 99 iteration 0036/0187: training loss 0.686 Epoch 99 iteration 0037/0187: training loss 0.689 Epoch 99 iteration 0038/0187: training loss 0.691 Epoch 99 iteration 0039/0187: training loss 0.687 Epoch 99 iteration 0040/0187: training loss 0.684 Epoch 99 iteration 0041/0187: training loss 0.685 Epoch 99 iteration 0042/0187: training loss 0.684 Epoch 99 iteration 0043/0187: training loss 0.683 Epoch 99 iteration 0044/0187: training loss 0.684 Epoch 99 iteration 0045/0187: training loss 0.684 Epoch 99 iteration 0046/0187: training loss 0.680 Epoch 99 iteration 0047/0187: training loss 0.682 Epoch 99 iteration 0048/0187: training loss 0.687 Epoch 99 iteration 0049/0187: training loss 0.689 Epoch 99 iteration 0050/0187: training loss 0.686 Epoch 99 iteration 0051/0187: training loss 0.683 Epoch 99 iteration 0052/0187: training loss 0.686 Epoch 99 iteration 0053/0187: training loss 0.685 Epoch 99 iteration 0054/0187: training loss 0.683 Epoch 99 iteration 0055/0187: training loss 0.681 Epoch 99 iteration 0056/0187: training loss 0.681 Epoch 99 iteration 0057/0187: training loss 0.677 Epoch 99 iteration 0058/0187: training loss 0.678 Epoch 99 iteration 0059/0187: training loss 0.679 Epoch 99 iteration 0060/0187: training loss 0.682 Epoch 99 iteration 0061/0187: training loss 0.679 Epoch 99 iteration 0062/0187: training loss 0.680 Epoch 99 iteration 0063/0187: training loss 0.682 Epoch 99 iteration 0064/0187: training loss 0.683 Epoch 99 iteration 0065/0187: training loss 0.681 Epoch 99 iteration 0066/0187: training loss 0.681 Epoch 99 iteration 0067/0187: training loss 0.680 Epoch 99 iteration 0068/0187: training loss 0.679 Epoch 99 iteration 0069/0187: training loss 0.680 Epoch 99 iteration 0070/0187: training loss 0.679 Epoch 99 iteration 0071/0187: training loss 0.680 Epoch 99 iteration 0072/0187: training loss 0.678 Epoch 99 iteration 0073/0187: training loss 0.677 Epoch 99 iteration 0074/0187: training loss 0.678 Epoch 99 iteration 0075/0187: training loss 0.677 Epoch 99 iteration 0076/0187: training loss 0.676 Epoch 99 iteration 0077/0187: training loss 0.676 Epoch 99 iteration 0078/0187: training loss 0.678 Epoch 99 iteration 0079/0187: training loss 0.677 Epoch 99 iteration 0080/0187: training loss 0.675 Epoch 99 iteration 0081/0187: training loss 0.678 Epoch 99 iteration 0082/0187: training loss 0.679 Epoch 99 iteration 0083/0187: training loss 0.679 Epoch 99 iteration 0084/0187: training loss 0.677 Epoch 99 iteration 0085/0187: training loss 0.677 Epoch 99 iteration 0086/0187: training loss 0.678 Epoch 99 iteration 0087/0187: training loss 0.679 Epoch 99 iteration 0088/0187: training loss 0.678 Epoch 99 iteration 0089/0187: training loss 0.678 Epoch 99 iteration 0090/0187: training loss 0.676 Epoch 99 iteration 0091/0187: training loss 0.676 Epoch 99 iteration 0092/0187: training loss 0.675 Epoch 99 iteration 0093/0187: training loss 0.676 Epoch 99 iteration 0094/0187: training loss 0.678 Epoch 99 iteration 0095/0187: training loss 0.679 Epoch 99 iteration 0096/0187: training loss 0.679 Epoch 99 iteration 0097/0187: training loss 0.678 Epoch 99 iteration 0098/0187: training loss 0.678 Epoch 99 iteration 0099/0187: training loss 0.678 Epoch 99 iteration 0100/0187: training loss 0.677 Epoch 99 iteration 0101/0187: training loss 0.677 Epoch 99 iteration 0102/0187: training loss 0.675 Epoch 99 iteration 0103/0187: training loss 0.674 Epoch 99 iteration 0104/0187: training loss 0.675 Epoch 99 iteration 0105/0187: training loss 0.674 Epoch 99 iteration 0106/0187: training loss 0.674 Epoch 99 iteration 0107/0187: training loss 0.675 Epoch 99 iteration 0108/0187: training loss 0.676 Epoch 99 iteration 0109/0187: training loss 0.676 Epoch 99 iteration 0110/0187: training loss 0.674 Epoch 99 iteration 0111/0187: training loss 0.673 Epoch 99 iteration 0112/0187: training loss 0.674 Epoch 99 iteration 0113/0187: training loss 0.674 Epoch 99 iteration 0114/0187: training loss 0.674 Epoch 99 iteration 0115/0187: training loss 0.673 Epoch 99 iteration 0116/0187: training loss 0.673 Epoch 99 iteration 0117/0187: training loss 0.673 Epoch 99 iteration 0118/0187: training loss 0.673 Epoch 99 iteration 0119/0187: training loss 0.672 Epoch 99 iteration 0120/0187: training loss 0.671 Epoch 99 iteration 0121/0187: training loss 0.672 Epoch 99 iteration 0122/0187: training loss 0.672 Epoch 99 iteration 0123/0187: training loss 0.672 Epoch 99 iteration 0124/0187: training loss 0.671 Epoch 99 iteration 0125/0187: training loss 0.671 Epoch 99 iteration 0126/0187: training loss 0.671 Epoch 99 iteration 0127/0187: training loss 0.670 Epoch 99 iteration 0128/0187: training loss 0.671 Epoch 99 iteration 0129/0187: training loss 0.671 Epoch 99 iteration 0130/0187: training loss 0.671 Epoch 99 iteration 0131/0187: training loss 0.670 Epoch 99 iteration 0132/0187: training loss 0.670 Epoch 99 iteration 0133/0187: training loss 0.670 Epoch 99 iteration 0134/0187: training loss 0.671 Epoch 99 iteration 0135/0187: training loss 0.672 Epoch 99 iteration 0136/0187: training loss 0.672 Epoch 99 iteration 0137/0187: training loss 0.672 Epoch 99 iteration 0138/0187: training loss 0.672 Epoch 99 iteration 0139/0187: training loss 0.672 Epoch 99 iteration 0140/0187: training loss 0.673 Epoch 99 iteration 0141/0187: training loss 0.673 Epoch 99 iteration 0142/0187: training loss 0.673 Epoch 99 iteration 0143/0187: training loss 0.672 Epoch 99 iteration 0144/0187: training loss 0.671 Epoch 99 iteration 0145/0187: training loss 0.671 Epoch 99 iteration 0146/0187: training loss 0.671 Epoch 99 iteration 0147/0187: training loss 0.672 Epoch 99 iteration 0148/0187: training loss 0.672 Epoch 99 iteration 0149/0187: training loss 0.673 Epoch 99 iteration 0150/0187: training loss 0.673 Epoch 99 iteration 0151/0187: training loss 0.674 Epoch 99 iteration 0152/0187: training loss 0.674 Epoch 99 iteration 0153/0187: training loss 0.673 Epoch 99 iteration 0154/0187: training loss 0.672 Epoch 99 iteration 0155/0187: training loss 0.672 Epoch 99 iteration 0156/0187: training loss 0.672 Epoch 99 iteration 0157/0187: training loss 0.673 Epoch 99 iteration 0158/0187: training loss 0.673 Epoch 99 iteration 0159/0187: training loss 0.672 Epoch 99 iteration 0160/0187: training loss 0.672 Epoch 99 iteration 0161/0187: training loss 0.672 Epoch 99 iteration 0162/0187: training loss 0.672 Epoch 99 iteration 0163/0187: training loss 0.674 Epoch 99 iteration 0164/0187: training loss 0.675 Epoch 99 iteration 0165/0187: training loss 0.675 Epoch 99 iteration 0166/0187: training loss 0.675 Epoch 99 iteration 0167/0187: training loss 0.674 Epoch 99 iteration 0168/0187: training loss 0.674 Epoch 99 iteration 0169/0187: training loss 0.674 Epoch 99 iteration 0170/0187: training loss 0.674 Epoch 99 iteration 0171/0187: training loss 0.675 Epoch 99 iteration 0172/0187: training loss 0.674 Epoch 99 iteration 0173/0187: training loss 0.674 Epoch 99 iteration 0174/0187: training loss 0.673 Epoch 99 iteration 0175/0187: training loss 0.673 Epoch 99 iteration 0176/0187: training loss 0.675 Epoch 99 iteration 0177/0187: training loss 0.674 Epoch 99 iteration 0178/0187: training loss 0.674 Epoch 99 iteration 0179/0187: training loss 0.674 Epoch 99 iteration 0180/0187: training loss 0.674 Epoch 99 iteration 0181/0187: training loss 0.674 Epoch 99 iteration 0182/0187: training loss 0.674 Epoch 99 iteration 0183/0187: training loss 0.674 Epoch 99 iteration 0184/0187: training loss 0.674 Epoch 99 iteration 0185/0187: training loss 0.675 Epoch 99 iteration 0186/0187: training loss 0.674 Epoch 99 iteration 0187/0187: training loss 0.673 Epoch 99 validation pixAcc: 0.876, mIoU: 0.393 Epoch 100 iteration 0001/0187: training loss 0.833 Epoch 100 iteration 0002/0187: training loss 0.860 Epoch 100 iteration 0003/0187: training loss 0.810 Epoch 100 iteration 0004/0187: training loss 0.789 Epoch 100 iteration 0005/0187: training loss 0.771 Epoch 100 iteration 0006/0187: training loss 0.742 Epoch 100 iteration 0007/0187: training loss 0.725 Epoch 100 iteration 0008/0187: training loss 0.717 Epoch 100 iteration 0009/0187: training loss 0.705 Epoch 100 iteration 0010/0187: training loss 0.702 Epoch 100 iteration 0011/0187: training loss 0.710 Epoch 100 iteration 0012/0187: training loss 0.727 Epoch 100 iteration 0013/0187: training loss 0.715 Epoch 100 iteration 0014/0187: training loss 0.705 Epoch 100 iteration 0015/0187: training loss 0.704 Epoch 100 iteration 0016/0187: training loss 0.704 Epoch 100 iteration 0017/0187: training loss 0.699 Epoch 100 iteration 0018/0187: training loss 0.693 Epoch 100 iteration 0019/0187: training loss 0.687 Epoch 100 iteration 0020/0187: training loss 0.690 Epoch 100 iteration 0021/0187: training loss 0.700 Epoch 100 iteration 0022/0187: training loss 0.696 Epoch 100 iteration 0023/0187: training loss 0.690 Epoch 100 iteration 0024/0187: training loss 0.691 Epoch 100 iteration 0025/0187: training loss 0.686 Epoch 100 iteration 0026/0187: training loss 0.681 Epoch 100 iteration 0027/0187: training loss 0.680 Epoch 100 iteration 0028/0187: training loss 0.676 Epoch 100 iteration 0029/0187: training loss 0.679 Epoch 100 iteration 0030/0187: training loss 0.674 Epoch 100 iteration 0031/0187: training loss 0.675 Epoch 100 iteration 0032/0187: training loss 0.671 Epoch 100 iteration 0033/0187: training loss 0.676 Epoch 100 iteration 0034/0187: training loss 0.674 Epoch 100 iteration 0035/0187: training loss 0.670 Epoch 100 iteration 0036/0187: training loss 0.672 Epoch 100 iteration 0037/0187: training loss 0.670 Epoch 100 iteration 0038/0187: training loss 0.675 Epoch 100 iteration 0039/0187: training loss 0.675 Epoch 100 iteration 0040/0187: training loss 0.675 Epoch 100 iteration 0041/0187: training loss 0.678 Epoch 100 iteration 0042/0187: training loss 0.680 Epoch 100 iteration 0043/0187: training loss 0.676 Epoch 100 iteration 0044/0187: training loss 0.677 Epoch 100 iteration 0045/0187: training loss 0.674 Epoch 100 iteration 0046/0187: training loss 0.677 Epoch 100 iteration 0047/0187: training loss 0.677 Epoch 100 iteration 0048/0187: training loss 0.677 Epoch 100 iteration 0049/0187: training loss 0.676 Epoch 100 iteration 0050/0187: training loss 0.675 Epoch 100 iteration 0051/0187: training loss 0.676 Epoch 100 iteration 0052/0187: training loss 0.674 Epoch 100 iteration 0053/0187: training loss 0.674 Epoch 100 iteration 0054/0187: training loss 0.672 Epoch 100 iteration 0055/0187: training loss 0.672 Epoch 100 iteration 0056/0187: training loss 0.672 Epoch 100 iteration 0057/0187: training loss 0.671 Epoch 100 iteration 0058/0187: training loss 0.669 Epoch 100 iteration 0059/0187: training loss 0.669 Epoch 100 iteration 0060/0187: training loss 0.670 Epoch 100 iteration 0061/0187: training loss 0.668 Epoch 100 iteration 0062/0187: training loss 0.669 Epoch 100 iteration 0063/0187: training loss 0.671 Epoch 100 iteration 0064/0187: training loss 0.672 Epoch 100 iteration 0065/0187: training loss 0.673 Epoch 100 iteration 0066/0187: training loss 0.675 Epoch 100 iteration 0067/0187: training loss 0.674 Epoch 100 iteration 0068/0187: training loss 0.673 Epoch 100 iteration 0069/0187: training loss 0.673 Epoch 100 iteration 0070/0187: training loss 0.673 Epoch 100 iteration 0071/0187: training loss 0.679 Epoch 100 iteration 0072/0187: training loss 0.680 Epoch 100 iteration 0073/0187: training loss 0.681 Epoch 100 iteration 0074/0187: training loss 0.681 Epoch 100 iteration 0075/0187: training loss 0.679 Epoch 100 iteration 0076/0187: training loss 0.680 Epoch 100 iteration 0077/0187: training loss 0.680 Epoch 100 iteration 0078/0187: training loss 0.679 Epoch 100 iteration 0079/0187: training loss 0.678 Epoch 100 iteration 0080/0187: training loss 0.677 Epoch 100 iteration 0081/0187: training loss 0.677 Epoch 100 iteration 0082/0187: training loss 0.675 Epoch 100 iteration 0083/0187: training loss 0.674 Epoch 100 iteration 0084/0187: training loss 0.672 Epoch 100 iteration 0085/0187: training loss 0.674 Epoch 100 iteration 0086/0187: training loss 0.674 Epoch 100 iteration 0087/0187: training loss 0.674 Epoch 100 iteration 0088/0187: training loss 0.674 Epoch 100 iteration 0089/0187: training loss 0.674 Epoch 100 iteration 0090/0187: training loss 0.674 Epoch 100 iteration 0091/0188: training loss 0.674 Epoch 100 iteration 0092/0188: training loss 0.675 Epoch 100 iteration 0093/0188: training loss 0.676 Epoch 100 iteration 0094/0188: training loss 0.677 Epoch 100 iteration 0095/0188: training loss 0.677 Epoch 100 iteration 0096/0188: training loss 0.676 Epoch 100 iteration 0097/0188: training loss 0.674 Epoch 100 iteration 0098/0188: training loss 0.676 Epoch 100 iteration 0099/0188: training loss 0.676 Epoch 100 iteration 0100/0188: training loss 0.675 Epoch 100 iteration 0101/0188: training loss 0.675 Epoch 100 iteration 0102/0188: training loss 0.675 Epoch 100 iteration 0103/0188: training loss 0.674 Epoch 100 iteration 0104/0188: training loss 0.674 Epoch 100 iteration 0105/0188: training loss 0.675 Epoch 100 iteration 0106/0188: training loss 0.675 Epoch 100 iteration 0107/0188: training loss 0.675 Epoch 100 iteration 0108/0188: training loss 0.676 Epoch 100 iteration 0109/0188: training loss 0.675 Epoch 100 iteration 0110/0188: training loss 0.678 Epoch 100 iteration 0111/0188: training loss 0.678 Epoch 100 iteration 0112/0188: training loss 0.678 Epoch 100 iteration 0113/0188: training loss 0.677 Epoch 100 iteration 0114/0188: training loss 0.679 Epoch 100 iteration 0115/0188: training loss 0.678 Epoch 100 iteration 0116/0188: training loss 0.680 Epoch 100 iteration 0117/0188: training loss 0.679 Epoch 100 iteration 0118/0188: training loss 0.679 Epoch 100 iteration 0119/0188: training loss 0.679 Epoch 100 iteration 0120/0188: training loss 0.678 Epoch 100 iteration 0121/0188: training loss 0.677 Epoch 100 iteration 0122/0188: training loss 0.678 Epoch 100 iteration 0123/0188: training loss 0.677 Epoch 100 iteration 0124/0188: training loss 0.676 Epoch 100 iteration 0125/0188: training loss 0.677 Epoch 100 iteration 0126/0188: training loss 0.676 Epoch 100 iteration 0127/0188: training loss 0.676 Epoch 100 iteration 0128/0188: training loss 0.676 Epoch 100 iteration 0129/0188: training loss 0.677 Epoch 100 iteration 0130/0188: training loss 0.677 Epoch 100 iteration 0131/0188: training loss 0.679 Epoch 100 iteration 0132/0188: training loss 0.677 Epoch 100 iteration 0133/0188: training loss 0.678 Epoch 100 iteration 0134/0188: training loss 0.678 Epoch 100 iteration 0135/0188: training loss 0.678 Epoch 100 iteration 0136/0188: training loss 0.677 Epoch 100 iteration 0137/0188: training loss 0.678 Epoch 100 iteration 0138/0188: training loss 0.677 Epoch 100 iteration 0139/0188: training loss 0.676 Epoch 100 iteration 0140/0188: training loss 0.677 Epoch 100 iteration 0141/0188: training loss 0.678 Epoch 100 iteration 0142/0188: training loss 0.677 Epoch 100 iteration 0143/0188: training loss 0.677 Epoch 100 iteration 0144/0188: training loss 0.678 Epoch 100 iteration 0145/0188: training loss 0.677 Epoch 100 iteration 0146/0188: training loss 0.678 Epoch 100 iteration 0147/0188: training loss 0.677 Epoch 100 iteration 0148/0188: training loss 0.677 Epoch 100 iteration 0149/0188: training loss 0.677 Epoch 100 iteration 0150/0188: training loss 0.678 Epoch 100 iteration 0151/0188: training loss 0.678 Epoch 100 iteration 0152/0188: training loss 0.680 Epoch 100 iteration 0153/0188: training loss 0.681 Epoch 100 iteration 0154/0188: training loss 0.681 Epoch 100 iteration 0155/0188: training loss 0.680 Epoch 100 iteration 0156/0188: training loss 0.679 Epoch 100 iteration 0157/0188: training loss 0.680 Epoch 100 iteration 0158/0188: training loss 0.681 Epoch 100 iteration 0159/0188: training loss 0.681 Epoch 100 iteration 0160/0188: training loss 0.680 Epoch 100 iteration 0161/0188: training loss 0.681 Epoch 100 iteration 0162/0188: training loss 0.681 Epoch 100 iteration 0163/0188: training loss 0.680 Epoch 100 iteration 0164/0188: training loss 0.682 Epoch 100 iteration 0165/0188: training loss 0.682 Epoch 100 iteration 0166/0188: training loss 0.682 Epoch 100 iteration 0167/0188: training loss 0.683 Epoch 100 iteration 0168/0188: training loss 0.682 Epoch 100 iteration 0169/0188: training loss 0.682 Epoch 100 iteration 0170/0188: training loss 0.682 Epoch 100 iteration 0171/0188: training loss 0.682 Epoch 100 iteration 0172/0188: training loss 0.683 Epoch 100 iteration 0173/0188: training loss 0.683 Epoch 100 iteration 0174/0188: training loss 0.683 Epoch 100 iteration 0175/0188: training loss 0.683 Epoch 100 iteration 0176/0188: training loss 0.683 Epoch 100 iteration 0177/0188: training loss 0.683 Epoch 100 iteration 0178/0188: training loss 0.683 Epoch 100 iteration 0179/0188: training loss 0.683 Epoch 100 iteration 0180/0188: training loss 0.682 Epoch 100 iteration 0181/0188: training loss 0.683 Epoch 100 iteration 0182/0188: training loss 0.683 Epoch 100 iteration 0183/0188: training loss 0.684 Epoch 100 iteration 0184/0188: training loss 0.684 Epoch 100 iteration 0185/0188: training loss 0.685 Epoch 100 iteration 0186/0188: training loss 0.685 Epoch 100 validation pixAcc: 0.875, mIoU: 0.392 Epoch 101 iteration 0001/0187: training loss 0.693 Epoch 101 iteration 0002/0187: training loss 0.724 Epoch 101 iteration 0003/0187: training loss 0.693 Epoch 101 iteration 0004/0187: training loss 0.677 Epoch 101 iteration 0005/0187: training loss 0.671 Epoch 101 iteration 0006/0187: training loss 0.663 Epoch 101 iteration 0007/0187: training loss 0.682 Epoch 101 iteration 0008/0187: training loss 0.681 Epoch 101 iteration 0009/0187: training loss 0.672 Epoch 101 iteration 0010/0187: training loss 0.683 Epoch 101 iteration 0011/0187: training loss 0.692 Epoch 101 iteration 0012/0187: training loss 0.694 Epoch 101 iteration 0013/0187: training loss 0.686 Epoch 101 iteration 0014/0187: training loss 0.679 Epoch 101 iteration 0015/0187: training loss 0.669 Epoch 101 iteration 0016/0187: training loss 0.676 Epoch 101 iteration 0017/0187: training loss 0.677 Epoch 101 iteration 0018/0187: training loss 0.680 Epoch 101 iteration 0019/0187: training loss 0.686 Epoch 101 iteration 0020/0187: training loss 0.681 Epoch 101 iteration 0021/0187: training loss 0.677 Epoch 101 iteration 0022/0187: training loss 0.677 Epoch 101 iteration 0023/0187: training loss 0.672 Epoch 101 iteration 0024/0187: training loss 0.671 Epoch 101 iteration 0025/0187: training loss 0.669 Epoch 101 iteration 0026/0187: training loss 0.665 Epoch 101 iteration 0027/0187: training loss 0.668 Epoch 101 iteration 0028/0187: training loss 0.669 Epoch 101 iteration 0029/0187: training loss 0.672 Epoch 101 iteration 0030/0187: training loss 0.670 Epoch 101 iteration 0031/0187: training loss 0.670 Epoch 101 iteration 0032/0187: training loss 0.667 Epoch 101 iteration 0033/0187: training loss 0.669 Epoch 101 iteration 0034/0187: training loss 0.666 Epoch 101 iteration 0035/0187: training loss 0.669 Epoch 101 iteration 0036/0187: training loss 0.665 Epoch 101 iteration 0037/0187: training loss 0.664 Epoch 101 iteration 0038/0187: training loss 0.664 Epoch 101 iteration 0039/0187: training loss 0.664 Epoch 101 iteration 0040/0187: training loss 0.671 Epoch 101 iteration 0041/0187: training loss 0.676 Epoch 101 iteration 0042/0187: training loss 0.682 Epoch 101 iteration 0043/0187: training loss 0.681 Epoch 101 iteration 0044/0187: training loss 0.684 Epoch 101 iteration 0045/0187: training loss 0.685 Epoch 101 iteration 0046/0187: training loss 0.689 Epoch 101 iteration 0047/0187: training loss 0.686 Epoch 101 iteration 0048/0187: training loss 0.688 Epoch 101 iteration 0049/0187: training loss 0.687 Epoch 101 iteration 0050/0187: training loss 0.685 Epoch 101 iteration 0051/0187: training loss 0.685 Epoch 101 iteration 0052/0187: training loss 0.683 Epoch 101 iteration 0053/0187: training loss 0.686 Epoch 101 iteration 0054/0187: training loss 0.686 Epoch 101 iteration 0055/0187: training loss 0.687 Epoch 101 iteration 0056/0187: training loss 0.689 Epoch 101 iteration 0057/0187: training loss 0.689 Epoch 101 iteration 0058/0187: training loss 0.690 Epoch 101 iteration 0059/0187: training loss 0.688 Epoch 101 iteration 0060/0187: training loss 0.687 Epoch 101 iteration 0061/0187: training loss 0.685 Epoch 101 iteration 0062/0187: training loss 0.685 Epoch 101 iteration 0063/0187: training loss 0.688 Epoch 101 iteration 0064/0187: training loss 0.688 Epoch 101 iteration 0065/0187: training loss 0.687 Epoch 101 iteration 0066/0187: training loss 0.685 Epoch 101 iteration 0067/0187: training loss 0.686 Epoch 101 iteration 0068/0187: training loss 0.686 Epoch 101 iteration 0069/0187: training loss 0.686 Epoch 101 iteration 0070/0187: training loss 0.684 Epoch 101 iteration 0071/0187: training loss 0.684 Epoch 101 iteration 0072/0187: training loss 0.686 Epoch 101 iteration 0073/0187: training loss 0.689 Epoch 101 iteration 0074/0187: training loss 0.687 Epoch 101 iteration 0075/0187: training loss 0.686 Epoch 101 iteration 0076/0187: training loss 0.686 Epoch 101 iteration 0077/0187: training loss 0.686 Epoch 101 iteration 0078/0187: training loss 0.687 Epoch 101 iteration 0079/0187: training loss 0.688 Epoch 101 iteration 0080/0187: training loss 0.689 Epoch 101 iteration 0081/0187: training loss 0.687 Epoch 101 iteration 0082/0187: training loss 0.688 Epoch 101 iteration 0083/0187: training loss 0.688 Epoch 101 iteration 0084/0187: training loss 0.690 Epoch 101 iteration 0085/0187: training loss 0.693 Epoch 101 iteration 0086/0187: training loss 0.696 Epoch 101 iteration 0087/0187: training loss 0.698 Epoch 101 iteration 0088/0187: training loss 0.698 Epoch 101 iteration 0089/0187: training loss 0.696 Epoch 101 iteration 0090/0187: training loss 0.696 Epoch 101 iteration 0091/0187: training loss 0.697 Epoch 101 iteration 0092/0187: training loss 0.697 Epoch 101 iteration 0093/0187: training loss 0.698 Epoch 101 iteration 0094/0187: training loss 0.699 Epoch 101 iteration 0095/0187: training loss 0.697 Epoch 101 iteration 0096/0187: training loss 0.697 Epoch 101 iteration 0097/0187: training loss 0.695 Epoch 101 iteration 0098/0187: training loss 0.693 Epoch 101 iteration 0099/0187: training loss 0.693 Epoch 101 iteration 0100/0187: training loss 0.692 Epoch 101 iteration 0101/0187: training loss 0.692 Epoch 101 iteration 0102/0187: training loss 0.693 Epoch 101 iteration 0103/0187: training loss 0.693 Epoch 101 iteration 0104/0187: training loss 0.692 Epoch 101 iteration 0105/0187: training loss 0.692 Epoch 101 iteration 0106/0187: training loss 0.692 Epoch 101 iteration 0107/0187: training loss 0.691 Epoch 101 iteration 0108/0187: training loss 0.691 Epoch 101 iteration 0109/0187: training loss 0.690 Epoch 101 iteration 0110/0187: training loss 0.690 Epoch 101 iteration 0111/0187: training loss 0.690 Epoch 101 iteration 0112/0187: training loss 0.691 Epoch 101 iteration 0113/0187: training loss 0.690 Epoch 101 iteration 0114/0187: training loss 0.691 Epoch 101 iteration 0115/0187: training loss 0.692 Epoch 101 iteration 0116/0187: training loss 0.692 Epoch 101 iteration 0117/0187: training loss 0.692 Epoch 101 iteration 0118/0187: training loss 0.691 Epoch 101 iteration 0119/0187: training loss 0.691 Epoch 101 iteration 0120/0187: training loss 0.690 Epoch 101 iteration 0121/0187: training loss 0.691 Epoch 101 iteration 0122/0187: training loss 0.691 Epoch 101 iteration 0123/0187: training loss 0.690 Epoch 101 iteration 0124/0187: training loss 0.688 Epoch 101 iteration 0125/0187: training loss 0.689 Epoch 101 iteration 0126/0187: training loss 0.690 Epoch 101 iteration 0127/0187: training loss 0.690 Epoch 101 iteration 0128/0187: training loss 0.690 Epoch 101 iteration 0129/0187: training loss 0.691 Epoch 101 iteration 0130/0187: training loss 0.693 Epoch 101 iteration 0131/0187: training loss 0.692 Epoch 101 iteration 0132/0187: training loss 0.692 Epoch 101 iteration 0133/0187: training loss 0.691 Epoch 101 iteration 0134/0187: training loss 0.690 Epoch 101 iteration 0135/0187: training loss 0.689 Epoch 101 iteration 0136/0187: training loss 0.689 Epoch 101 iteration 0137/0187: training loss 0.689 Epoch 101 iteration 0138/0187: training loss 0.688 Epoch 101 iteration 0139/0187: training loss 0.688 Epoch 101 iteration 0140/0187: training loss 0.688 Epoch 101 iteration 0141/0187: training loss 0.688 Epoch 101 iteration 0142/0187: training loss 0.688 Epoch 101 iteration 0143/0187: training loss 0.689 Epoch 101 iteration 0144/0187: training loss 0.689 Epoch 101 iteration 0145/0187: training loss 0.689 Epoch 101 iteration 0146/0187: training loss 0.689 Epoch 101 iteration 0147/0187: training loss nan Epoch 101 iteration 0148/0187: training loss nan Epoch 101 iteration 0149/0187: training loss nan Epoch 101 iteration 0150/0187: training loss nan Epoch 101 iteration 0151/0187: training loss nan Epoch 101 iteration 0152/0187: training loss nan Epoch 101 iteration 0153/0187: training loss nan Epoch 101 iteration 0154/0187: training loss nan Epoch 101 iteration 0155/0187: training loss nan Epoch 101 iteration 0156/0187: training loss nan Epoch 101 iteration 0157/0187: training loss nan Epoch 101 iteration 0158/0187: training loss nan Epoch 101 iteration 0159/0187: training loss nan Epoch 101 iteration 0160/0187: training loss nan Epoch 101 iteration 0161/0187: training loss nan Epoch 101 iteration 0162/0187: training loss nan Epoch 101 iteration 0163/0187: training loss nan Epoch 101 iteration 0164/0187: training loss nan Epoch 101 iteration 0165/0187: training loss nan Epoch 101 iteration 0166/0187: training loss nan Epoch 101 iteration 0167/0187: training loss nan Epoch 101 iteration 0168/0187: training loss nan Epoch 101 iteration 0169/0187: training loss nan Epoch 101 iteration 0170/0187: training loss nan Epoch 101 iteration 0171/0187: training loss nan Epoch 101 iteration 0172/0187: training loss nan Epoch 101 iteration 0173/0187: training loss nan Epoch 101 iteration 0174/0187: training loss nan Epoch 101 iteration 0175/0187: training loss nan Epoch 101 iteration 0176/0187: training loss nan Epoch 101 iteration 0177/0187: training loss nan Epoch 101 iteration 0178/0187: training loss nan Epoch 101 iteration 0179/0187: training loss nan Epoch 101 iteration 0180/0187: training loss nan Epoch 101 iteration 0181/0187: training loss nan Epoch 101 iteration 0182/0187: training loss nan Epoch 101 iteration 0183/0187: training loss nan Epoch 101 iteration 0184/0187: training loss nan Epoch 101 iteration 0185/0187: training loss nan Epoch 101 iteration 0186/0187: training loss nan Epoch 101 iteration 0187/0187: training loss nan Epoch 101 validation pixAcc: 0.875, mIoU: 0.393 Epoch 102 iteration 0001/0187: training loss 0.659 Epoch 102 iteration 0002/0187: training loss 0.688 Epoch 102 iteration 0003/0187: training loss 0.683 Epoch 102 iteration 0004/0187: training loss 0.681 Epoch 102 iteration 0005/0187: training loss 0.652 Epoch 102 iteration 0006/0187: training loss 0.638 Epoch 102 iteration 0007/0187: training loss 0.634 Epoch 102 iteration 0008/0187: training loss 0.673 Epoch 102 iteration 0009/0187: training loss 0.670 Epoch 102 iteration 0010/0187: training loss 0.671 Epoch 102 iteration 0011/0187: training loss 0.670 Epoch 102 iteration 0012/0187: training loss 0.663 Epoch 102 iteration 0013/0187: training loss 0.676 Epoch 102 iteration 0014/0187: training loss 0.677 Epoch 102 iteration 0015/0187: training loss 0.677 Epoch 102 iteration 0016/0187: training loss 0.694 Epoch 102 iteration 0017/0187: training loss 0.694 Epoch 102 iteration 0018/0187: training loss 0.690 Epoch 102 iteration 0019/0187: training loss 0.687 Epoch 102 iteration 0020/0187: training loss 0.680 Epoch 102 iteration 0021/0187: training loss 0.684 Epoch 102 iteration 0022/0187: training loss 0.681 Epoch 102 iteration 0023/0187: training loss 0.680 Epoch 102 iteration 0024/0187: training loss 0.684 Epoch 102 iteration 0025/0187: training loss 0.691 Epoch 102 iteration 0026/0187: training loss 0.687 Epoch 102 iteration 0027/0187: training loss 0.686 Epoch 102 iteration 0028/0187: training loss 0.688 Epoch 102 iteration 0029/0187: training loss 0.686 Epoch 102 iteration 0030/0187: training loss 0.680 Epoch 102 iteration 0031/0187: training loss 0.675 Epoch 102 iteration 0032/0187: training loss 0.674 Epoch 102 iteration 0033/0187: training loss 0.675 Epoch 102 iteration 0034/0187: training loss 0.676 Epoch 102 iteration 0035/0187: training loss 0.678 Epoch 102 iteration 0036/0187: training loss 0.679 Epoch 102 iteration 0037/0187: training loss 0.683 Epoch 102 iteration 0038/0187: training loss 0.681 Epoch 102 iteration 0039/0187: training loss 0.680 Epoch 102 iteration 0040/0187: training loss 0.685 Epoch 102 iteration 0041/0187: training loss 0.685 Epoch 102 iteration 0042/0187: training loss 0.682 Epoch 102 iteration 0043/0187: training loss 0.684 Epoch 102 iteration 0044/0187: training loss 0.685 Epoch 102 iteration 0045/0187: training loss 0.688 Epoch 102 iteration 0046/0187: training loss 0.689 Epoch 102 iteration 0047/0187: training loss 0.687 Epoch 102 iteration 0048/0187: training loss 0.687 Epoch 102 iteration 0049/0187: training loss 0.688 Epoch 102 iteration 0050/0187: training loss 0.688 Epoch 102 iteration 0051/0187: training loss 0.690 Epoch 102 iteration 0052/0187: training loss 0.688 Epoch 102 iteration 0053/0187: training loss 0.687 Epoch 102 iteration 0054/0187: training loss 0.687 Epoch 102 iteration 0055/0187: training loss 0.686 Epoch 102 iteration 0056/0187: training loss 0.683 Epoch 102 iteration 0057/0187: training loss 0.680 Epoch 102 iteration 0058/0187: training loss 0.683 Epoch 102 iteration 0059/0187: training loss 0.684 Epoch 102 iteration 0060/0187: training loss 0.685 Epoch 102 iteration 0061/0187: training loss 0.685 Epoch 102 iteration 0062/0187: training loss 0.685 Epoch 102 iteration 0063/0187: training loss 0.685 Epoch 102 iteration 0064/0187: training loss 0.687 Epoch 102 iteration 0065/0187: training loss 0.689 Epoch 102 iteration 0066/0187: training loss 0.688 Epoch 102 iteration 0067/0187: training loss 0.689 Epoch 102 iteration 0068/0187: training loss 0.689 Epoch 102 iteration 0069/0187: training loss 0.687 Epoch 102 iteration 0070/0187: training loss 0.688 Epoch 102 iteration 0071/0187: training loss 0.685 Epoch 102 iteration 0072/0187: training loss 0.687 Epoch 102 iteration 0073/0187: training loss 0.689 Epoch 102 iteration 0074/0187: training loss 0.690 Epoch 102 iteration 0075/0187: training loss 0.689 Epoch 102 iteration 0076/0187: training loss 0.686 Epoch 102 iteration 0077/0187: training loss 0.686 Epoch 102 iteration 0078/0187: training loss 0.686 Epoch 102 iteration 0079/0187: training loss 0.689 Epoch 102 iteration 0080/0187: training loss 0.687 Epoch 102 iteration 0081/0187: training loss 0.686 Epoch 102 iteration 0082/0187: training loss 0.686 Epoch 102 iteration 0083/0187: training loss 0.685 Epoch 102 iteration 0084/0187: training loss 0.685 Epoch 102 iteration 0085/0187: training loss 0.685 Epoch 102 iteration 0086/0187: training loss 0.687 Epoch 102 iteration 0087/0187: training loss 0.686 Epoch 102 iteration 0088/0187: training loss 0.685 Epoch 102 iteration 0089/0187: training loss 0.684 Epoch 102 iteration 0090/0187: training loss 0.683 Epoch 102 iteration 0091/0188: training loss 0.683 Epoch 102 iteration 0092/0188: training loss 0.682 Epoch 102 iteration 0093/0188: training loss 0.683 Epoch 102 iteration 0094/0188: training loss 0.683 Epoch 102 iteration 0095/0188: training loss 0.683 Epoch 102 iteration 0096/0188: training loss 0.681 Epoch 102 iteration 0097/0188: training loss 0.682 Epoch 102 iteration 0098/0188: training loss 0.684 Epoch 102 iteration 0099/0188: training loss 0.685 Epoch 102 iteration 0100/0188: training loss 0.685 Epoch 102 iteration 0101/0188: training loss 0.684 Epoch 102 iteration 0102/0188: training loss 0.685 Epoch 102 iteration 0103/0188: training loss 0.685 Epoch 102 iteration 0104/0188: training loss 0.684 Epoch 102 iteration 0105/0188: training loss 0.684 Epoch 102 iteration 0106/0188: training loss 0.684 Epoch 102 iteration 0107/0188: training loss 0.682 Epoch 102 iteration 0108/0188: training loss 0.683 Epoch 102 iteration 0109/0188: training loss 0.682 Epoch 102 iteration 0110/0188: training loss 0.681 Epoch 102 iteration 0111/0188: training loss 0.681 Epoch 102 iteration 0112/0188: training loss 0.681 Epoch 102 iteration 0113/0188: training loss 0.682 Epoch 102 iteration 0114/0188: training loss 0.682 Epoch 102 iteration 0115/0188: training loss 0.682 Epoch 102 iteration 0116/0188: training loss 0.682 Epoch 102 iteration 0117/0188: training loss 0.681 Epoch 102 iteration 0118/0188: training loss 0.681 Epoch 102 iteration 0119/0188: training loss 0.682 Epoch 102 iteration 0120/0188: training loss 0.681 Epoch 102 iteration 0121/0188: training loss 0.681 Epoch 102 iteration 0122/0188: training loss 0.680 Epoch 102 iteration 0123/0188: training loss 0.680 Epoch 102 iteration 0124/0188: training loss 0.680 Epoch 102 iteration 0125/0188: training loss 0.679 Epoch 102 iteration 0126/0188: training loss 0.679 Epoch 102 iteration 0127/0188: training loss 0.680 Epoch 102 iteration 0128/0188: training loss 0.681 Epoch 102 iteration 0129/0188: training loss 0.680 Epoch 102 iteration 0130/0188: training loss 0.680 Epoch 102 iteration 0131/0188: training loss 0.681 Epoch 102 iteration 0132/0188: training loss 0.680 Epoch 102 iteration 0133/0188: training loss 0.680 Epoch 102 iteration 0134/0188: training loss 0.682 Epoch 102 iteration 0135/0188: training loss 0.682 Epoch 102 iteration 0136/0188: training loss 0.684 Epoch 102 iteration 0137/0188: training loss 0.685 Epoch 102 iteration 0138/0188: training loss 0.685 Epoch 102 iteration 0139/0188: training loss 0.684 Epoch 102 iteration 0140/0188: training loss 0.684 Epoch 102 iteration 0141/0188: training loss 0.684 Epoch 102 iteration 0142/0188: training loss 0.684 Epoch 102 iteration 0143/0188: training loss 0.684 Epoch 102 iteration 0144/0188: training loss 0.684 Epoch 102 iteration 0145/0188: training loss 0.684 Epoch 102 iteration 0146/0188: training loss 0.683 Epoch 102 iteration 0147/0188: training loss 0.683 Epoch 102 iteration 0148/0188: training loss 0.683 Epoch 102 iteration 0149/0188: training loss 0.684 Epoch 102 iteration 0150/0188: training loss 0.685 Epoch 102 iteration 0151/0188: training loss 0.684 Epoch 102 iteration 0152/0188: training loss 0.684 Epoch 102 iteration 0153/0188: training loss 0.685 Epoch 102 iteration 0154/0188: training loss 0.684 Epoch 102 iteration 0155/0188: training loss 0.684 Epoch 102 iteration 0156/0188: training loss 0.684 Epoch 102 iteration 0157/0188: training loss 0.686 Epoch 102 iteration 0158/0188: training loss 0.687 Epoch 102 iteration 0159/0188: training loss 0.687 Epoch 102 iteration 0160/0188: training loss 0.688 Epoch 102 iteration 0161/0188: training loss 0.687 Epoch 102 iteration 0162/0188: training loss 0.687 Epoch 102 iteration 0163/0188: training loss 0.687 Epoch 102 iteration 0164/0188: training loss 0.686 Epoch 102 iteration 0165/0188: training loss 0.686 Epoch 102 iteration 0166/0188: training loss 0.686 Epoch 102 iteration 0167/0188: training loss 0.685 Epoch 102 iteration 0168/0188: training loss 0.685 Epoch 102 iteration 0169/0188: training loss 0.685 Epoch 102 iteration 0170/0188: training loss 0.684 Epoch 102 iteration 0171/0188: training loss 0.686 Epoch 102 iteration 0172/0188: training loss 0.686 Epoch 102 iteration 0173/0188: training loss 0.686 Epoch 102 iteration 0174/0188: training loss 0.686 Epoch 102 iteration 0175/0188: training loss 0.685 Epoch 102 iteration 0176/0188: training loss 0.685 Epoch 102 iteration 0177/0188: training loss 0.685 Epoch 102 iteration 0178/0188: training loss 0.685 Epoch 102 iteration 0179/0188: training loss 0.685 Epoch 102 iteration 0180/0188: training loss 0.685 Epoch 102 iteration 0181/0188: training loss 0.684 Epoch 102 iteration 0182/0188: training loss 0.685 Epoch 102 iteration 0183/0188: training loss 0.686 Epoch 102 iteration 0184/0188: training loss 0.686 Epoch 102 iteration 0185/0188: training loss 0.687 Epoch 102 iteration 0186/0188: training loss 0.687 Epoch 102 validation pixAcc: 0.875, mIoU: 0.392 Epoch 103 iteration 0001/0187: training loss 0.749 Epoch 103 iteration 0002/0187: training loss 0.681 Epoch 103 iteration 0003/0187: training loss 0.669 Epoch 103 iteration 0004/0187: training loss 0.683 Epoch 103 iteration 0005/0187: training loss 0.662 Epoch 103 iteration 0006/0187: training loss 0.660 Epoch 103 iteration 0007/0187: training loss 0.651 Epoch 103 iteration 0008/0187: training loss 0.655 Epoch 103 iteration 0009/0187: training loss 0.664 Epoch 103 iteration 0010/0187: training loss 0.669 Epoch 103 iteration 0011/0187: training loss 0.665 Epoch 103 iteration 0012/0187: training loss 0.673 Epoch 103 iteration 0013/0187: training loss 0.671 Epoch 103 iteration 0014/0187: training loss 0.669 Epoch 103 iteration 0015/0187: training loss 0.675 Epoch 103 iteration 0016/0187: training loss 0.672 Epoch 103 iteration 0017/0187: training loss 0.670 Epoch 103 iteration 0018/0187: training loss 0.666 Epoch 103 iteration 0019/0187: training loss 0.665 Epoch 103 iteration 0020/0187: training loss 0.661 Epoch 103 iteration 0021/0187: training loss 0.673 Epoch 103 iteration 0022/0187: training loss 0.677 Epoch 103 iteration 0023/0187: training loss 0.679 Epoch 103 iteration 0024/0187: training loss 0.683 Epoch 103 iteration 0025/0187: training loss 0.681 Epoch 103 iteration 0026/0187: training loss 0.676 Epoch 103 iteration 0027/0187: training loss 0.672 Epoch 103 iteration 0028/0187: training loss 0.672 Epoch 103 iteration 0029/0187: training loss 0.670 Epoch 103 iteration 0030/0187: training loss 0.673 Epoch 103 iteration 0031/0187: training loss 0.669 Epoch 103 iteration 0032/0187: training loss 0.670 Epoch 103 iteration 0033/0187: training loss 0.671 Epoch 103 iteration 0034/0187: training loss 0.674 Epoch 103 iteration 0035/0187: training loss 0.673 Epoch 103 iteration 0036/0187: training loss 0.672 Epoch 103 iteration 0037/0187: training loss 0.671 Epoch 103 iteration 0038/0187: training loss 0.669 Epoch 103 iteration 0039/0187: training loss 0.669 Epoch 103 iteration 0040/0187: training loss 0.671 Epoch 103 iteration 0041/0187: training loss 0.671 Epoch 103 iteration 0042/0187: training loss 0.675 Epoch 103 iteration 0043/0187: training loss 0.674 Epoch 103 iteration 0044/0187: training loss 0.673 Epoch 103 iteration 0045/0187: training loss 0.671 Epoch 103 iteration 0046/0187: training loss 0.672 Epoch 103 iteration 0047/0187: training loss 0.674 Epoch 103 iteration 0048/0187: training loss 0.676 Epoch 103 iteration 0049/0187: training loss 0.674 Epoch 103 iteration 0050/0187: training loss 0.673 Epoch 103 iteration 0051/0187: training loss 0.673 Epoch 103 iteration 0052/0187: training loss 0.677 Epoch 103 iteration 0053/0187: training loss 0.675 Epoch 103 iteration 0054/0187: training loss 0.673 Epoch 103 iteration 0055/0187: training loss 0.676 Epoch 103 iteration 0056/0187: training loss 0.676 Epoch 103 iteration 0057/0187: training loss 0.676 Epoch 103 iteration 0058/0187: training loss 0.677 Epoch 103 iteration 0059/0187: training loss 0.678 Epoch 103 iteration 0060/0187: training loss 0.679 Epoch 103 iteration 0061/0187: training loss 0.681 Epoch 103 iteration 0062/0187: training loss 0.681 Epoch 103 iteration 0063/0187: training loss 0.681 Epoch 103 iteration 0064/0187: training loss 0.681 Epoch 103 iteration 0065/0187: training loss 0.681 Epoch 103 iteration 0066/0187: training loss 0.680 Epoch 103 iteration 0067/0187: training loss 0.678 Epoch 103 iteration 0068/0187: training loss 0.678 Epoch 103 iteration 0069/0187: training loss 0.678 Epoch 103 iteration 0070/0187: training loss 0.678 Epoch 103 iteration 0071/0187: training loss 0.676 Epoch 103 iteration 0072/0187: training loss 0.676 Epoch 103 iteration 0073/0187: training loss 0.676 Epoch 103 iteration 0074/0187: training loss 0.675 Epoch 103 iteration 0075/0187: training loss 0.675 Epoch 103 iteration 0076/0187: training loss 0.676 Epoch 103 iteration 0077/0187: training loss 0.675 Epoch 103 iteration 0078/0187: training loss 0.673 Epoch 103 iteration 0079/0187: training loss 0.672 Epoch 103 iteration 0080/0187: training loss 0.670 Epoch 103 iteration 0081/0187: training loss 0.670 Epoch 103 iteration 0082/0187: training loss 0.670 Epoch 103 iteration 0083/0187: training loss 0.670 Epoch 103 iteration 0084/0187: training loss 0.670 Epoch 103 iteration 0085/0187: training loss 0.671 Epoch 103 iteration 0086/0187: training loss 0.671 Epoch 103 iteration 0087/0187: training loss 0.671 Epoch 103 iteration 0088/0187: training loss 0.673 Epoch 103 iteration 0089/0187: training loss 0.671 Epoch 103 iteration 0090/0187: training loss 0.674 Epoch 103 iteration 0091/0187: training loss 0.673 Epoch 103 iteration 0092/0187: training loss 0.674 Epoch 103 iteration 0093/0187: training loss 0.674 Epoch 103 iteration 0094/0187: training loss 0.673 Epoch 103 iteration 0095/0187: training loss 0.673 Epoch 103 iteration 0096/0187: training loss 0.675 Epoch 103 iteration 0097/0187: training loss 0.676 Epoch 103 iteration 0098/0187: training loss 0.674 Epoch 103 iteration 0099/0187: training loss 0.674 Epoch 103 iteration 0100/0187: training loss 0.675 Epoch 103 iteration 0101/0187: training loss 0.675 Epoch 103 iteration 0102/0187: training loss 0.676 Epoch 103 iteration 0103/0187: training loss 0.675 Epoch 103 iteration 0104/0187: training loss 0.676 Epoch 103 iteration 0105/0187: training loss 0.675 Epoch 103 iteration 0106/0187: training loss 0.676 Epoch 103 iteration 0107/0187: training loss 0.678 Epoch 103 iteration 0108/0187: training loss 0.679 Epoch 103 iteration 0109/0187: training loss 0.678 Epoch 103 iteration 0110/0187: training loss 0.677 Epoch 103 iteration 0111/0187: training loss 0.677 Epoch 103 iteration 0112/0187: training loss 0.679 Epoch 103 iteration 0113/0187: training loss 0.681 Epoch 103 iteration 0114/0187: training loss 0.681 Epoch 103 iteration 0115/0187: training loss 0.681 Epoch 103 iteration 0116/0187: training loss 0.682 Epoch 103 iteration 0117/0187: training loss 0.682 Epoch 103 iteration 0118/0187: training loss 0.682 Epoch 103 iteration 0119/0187: training loss 0.684 Epoch 103 iteration 0120/0187: training loss 0.683 Epoch 103 iteration 0121/0187: training loss 0.684 Epoch 103 iteration 0122/0187: training loss 0.684 Epoch 103 iteration 0123/0187: training loss 0.683 Epoch 103 iteration 0124/0187: training loss 0.684 Epoch 103 iteration 0125/0187: training loss 0.684 Epoch 103 iteration 0126/0187: training loss 0.683 Epoch 103 iteration 0127/0187: training loss 0.684 Epoch 103 iteration 0128/0187: training loss 0.688 Epoch 103 iteration 0129/0187: training loss 0.688 Epoch 103 iteration 0130/0187: training loss 0.688 Epoch 103 iteration 0131/0187: training loss 0.688 Epoch 103 iteration 0132/0187: training loss 0.688 Epoch 103 iteration 0133/0187: training loss 0.689 Epoch 103 iteration 0134/0187: training loss 0.688 Epoch 103 iteration 0135/0187: training loss 0.688 Epoch 103 iteration 0136/0187: training loss 0.687 Epoch 103 iteration 0137/0187: training loss 0.687 Epoch 103 iteration 0138/0187: training loss 0.688 Epoch 103 iteration 0139/0187: training loss 0.690 Epoch 103 iteration 0140/0187: training loss 0.689 Epoch 103 iteration 0141/0187: training loss 0.688 Epoch 103 iteration 0142/0187: training loss 0.688 Epoch 103 iteration 0143/0187: training loss 0.688 Epoch 103 iteration 0144/0187: training loss 0.689 Epoch 103 iteration 0145/0187: training loss 0.688 Epoch 103 iteration 0146/0187: training loss 0.688 Epoch 103 iteration 0147/0187: training loss 0.687 Epoch 103 iteration 0148/0187: training loss 0.687 Epoch 103 iteration 0149/0187: training loss 0.686 Epoch 103 iteration 0150/0187: training loss 0.687 Epoch 103 iteration 0151/0187: training loss 0.687 Epoch 103 iteration 0152/0187: training loss 0.687 Epoch 103 iteration 0153/0187: training loss 0.688 Epoch 103 iteration 0154/0187: training loss 0.688 Epoch 103 iteration 0155/0187: training loss 0.688 Epoch 103 iteration 0156/0187: training loss 0.688 Epoch 103 iteration 0157/0187: training loss 0.687 Epoch 103 iteration 0158/0187: training loss 0.687 Epoch 103 iteration 0159/0187: training loss 0.687 Epoch 103 iteration 0160/0187: training loss 0.687 Epoch 103 iteration 0161/0187: training loss 0.687 Epoch 103 iteration 0162/0187: training loss 0.688 Epoch 103 iteration 0163/0187: training loss 0.688 Epoch 103 iteration 0164/0187: training loss 0.688 Epoch 103 iteration 0165/0187: training loss 0.687 Epoch 103 iteration 0166/0187: training loss 0.689 Epoch 103 iteration 0167/0187: training loss 0.689 Epoch 103 iteration 0168/0187: training loss 0.688 Epoch 103 iteration 0169/0187: training loss 0.687 Epoch 103 iteration 0170/0187: training loss 0.687 Epoch 103 iteration 0171/0187: training loss 0.687 Epoch 103 iteration 0172/0187: training loss 0.686 Epoch 103 iteration 0173/0187: training loss 0.686 Epoch 103 iteration 0174/0187: training loss 0.686 Epoch 103 iteration 0175/0187: training loss 0.688 Epoch 103 iteration 0176/0187: training loss 0.687 Epoch 103 iteration 0177/0187: training loss 0.686 Epoch 103 iteration 0178/0187: training loss 0.686 Epoch 103 iteration 0179/0187: training loss 0.686 Epoch 103 iteration 0180/0187: training loss 0.685 Epoch 103 iteration 0181/0187: training loss 0.684 Epoch 103 iteration 0182/0187: training loss 0.684 Epoch 103 iteration 0183/0187: training loss 0.684 Epoch 103 iteration 0184/0187: training loss 0.684 Epoch 103 iteration 0185/0187: training loss 0.683 Epoch 103 iteration 0186/0187: training loss 0.683 Epoch 103 iteration 0187/0187: training loss 0.682 Epoch 103 validation pixAcc: 0.875, mIoU: 0.395 Epoch 104 iteration 0001/0187: training loss 0.656 Epoch 104 iteration 0002/0187: training loss 0.731 Epoch 104 iteration 0003/0187: training loss 0.728 Epoch 104 iteration 0004/0187: training loss 0.766 Epoch 104 iteration 0005/0187: training loss 0.753 Epoch 104 iteration 0006/0187: training loss 0.757 Epoch 104 iteration 0007/0187: training loss 0.741 Epoch 104 iteration 0008/0187: training loss 0.739 Epoch 104 iteration 0009/0187: training loss 0.737 Epoch 104 iteration 0010/0187: training loss 0.735 Epoch 104 iteration 0011/0187: training loss 0.723 Epoch 104 iteration 0012/0187: training loss 0.736 Epoch 104 iteration 0013/0187: training loss 0.746 Epoch 104 iteration 0014/0187: training loss 0.741 Epoch 104 iteration 0015/0187: training loss 0.736 Epoch 104 iteration 0016/0187: training loss 0.731 Epoch 104 iteration 0017/0187: training loss 0.722 Epoch 104 iteration 0018/0187: training loss 0.715 Epoch 104 iteration 0019/0187: training loss 0.713 Epoch 104 iteration 0020/0187: training loss 0.714 Epoch 104 iteration 0021/0187: training loss 0.710 Epoch 104 iteration 0022/0187: training loss 0.710 Epoch 104 iteration 0023/0187: training loss 0.704 Epoch 104 iteration 0024/0187: training loss 0.699 Epoch 104 iteration 0025/0187: training loss 0.697 Epoch 104 iteration 0026/0187: training loss 0.693 Epoch 104 iteration 0027/0187: training loss 0.690 Epoch 104 iteration 0028/0187: training loss 0.686 Epoch 104 iteration 0029/0187: training loss 0.683 Epoch 104 iteration 0030/0187: training loss 0.685 Epoch 104 iteration 0031/0187: training loss 0.687 Epoch 104 iteration 0032/0187: training loss 0.688 Epoch 104 iteration 0033/0187: training loss 0.691 Epoch 104 iteration 0034/0187: training loss 0.691 Epoch 104 iteration 0035/0187: training loss 0.691 Epoch 104 iteration 0036/0187: training loss 0.693 Epoch 104 iteration 0037/0187: training loss 0.691 Epoch 104 iteration 0038/0187: training loss 0.697 Epoch 104 iteration 0039/0187: training loss 0.693 Epoch 104 iteration 0040/0187: training loss 0.694 Epoch 104 iteration 0041/0187: training loss 0.691 Epoch 104 iteration 0042/0187: training loss 0.691 Epoch 104 iteration 0043/0187: training loss 0.695 Epoch 104 iteration 0044/0187: training loss 0.694 Epoch 104 iteration 0045/0187: training loss 0.697 Epoch 104 iteration 0046/0187: training loss 0.699 Epoch 104 iteration 0047/0187: training loss 0.701 Epoch 104 iteration 0048/0187: training loss 0.700 Epoch 104 iteration 0049/0187: training loss 0.701 Epoch 104 iteration 0050/0187: training loss 0.700 Epoch 104 iteration 0051/0187: training loss 0.697 Epoch 104 iteration 0052/0187: training loss 0.698 Epoch 104 iteration 0053/0187: training loss 0.700 Epoch 104 iteration 0054/0187: training loss 0.699 Epoch 104 iteration 0055/0187: training loss 0.697 Epoch 104 iteration 0056/0187: training loss 0.696 Epoch 104 iteration 0057/0187: training loss 0.696 Epoch 104 iteration 0058/0187: training loss 0.696 Epoch 104 iteration 0059/0187: training loss 0.697 Epoch 104 iteration 0060/0187: training loss 0.699 Epoch 104 iteration 0061/0187: training loss 0.697 Epoch 104 iteration 0062/0187: training loss 0.696 Epoch 104 iteration 0063/0187: training loss 0.695 Epoch 104 iteration 0064/0187: training loss 0.698 Epoch 104 iteration 0065/0187: training loss 0.696 Epoch 104 iteration 0066/0187: training loss 0.693 Epoch 104 iteration 0067/0187: training loss 0.691 Epoch 104 iteration 0068/0187: training loss 0.691 Epoch 104 iteration 0069/0187: training loss 0.689 Epoch 104 iteration 0070/0187: training loss 0.688 Epoch 104 iteration 0071/0187: training loss 0.687 Epoch 104 iteration 0072/0187: training loss 0.687 Epoch 104 iteration 0073/0187: training loss 0.686 Epoch 104 iteration 0074/0187: training loss 0.685 Epoch 104 iteration 0075/0187: training loss 0.684 Epoch 104 iteration 0076/0187: training loss 0.684 Epoch 104 iteration 0077/0187: training loss 0.684 Epoch 104 iteration 0078/0187: training loss 0.682 Epoch 104 iteration 0079/0187: training loss 0.681 Epoch 104 iteration 0080/0187: training loss 0.682 Epoch 104 iteration 0081/0187: training loss 0.682 Epoch 104 iteration 0082/0187: training loss 0.685 Epoch 104 iteration 0083/0187: training loss 0.686 Epoch 104 iteration 0084/0187: training loss 0.685 Epoch 104 iteration 0085/0187: training loss 0.685 Epoch 104 iteration 0086/0187: training loss 0.685 Epoch 104 iteration 0087/0187: training loss 0.686 Epoch 104 iteration 0088/0187: training loss 0.687 Epoch 104 iteration 0089/0187: training loss 0.685 Epoch 104 iteration 0090/0187: training loss 0.686 Epoch 104 iteration 0091/0188: training loss 0.687 Epoch 104 iteration 0092/0188: training loss 0.686 Epoch 104 iteration 0093/0188: training loss 0.686 Epoch 104 iteration 0094/0188: training loss 0.686 Epoch 104 iteration 0095/0188: training loss 0.688 Epoch 104 iteration 0096/0188: training loss 0.690 Epoch 104 iteration 0097/0188: training loss 0.690 Epoch 104 iteration 0098/0188: training loss 0.692 Epoch 104 iteration 0099/0188: training loss 0.691 Epoch 104 iteration 0100/0188: training loss 0.690 Epoch 104 iteration 0101/0188: training loss 0.691 Epoch 104 iteration 0102/0188: training loss 0.691 Epoch 104 iteration 0103/0188: training loss 0.689 Epoch 104 iteration 0104/0188: training loss 0.689 Epoch 104 iteration 0105/0188: training loss 0.689 Epoch 104 iteration 0106/0188: training loss 0.687 Epoch 104 iteration 0107/0188: training loss 0.686 Epoch 104 iteration 0108/0188: training loss 0.688 Epoch 104 iteration 0109/0188: training loss 0.689 Epoch 104 iteration 0110/0188: training loss 0.689 Epoch 104 iteration 0111/0188: training loss 0.688 Epoch 104 iteration 0112/0188: training loss 0.688 Epoch 104 iteration 0113/0188: training loss 0.688 Epoch 104 iteration 0114/0188: training loss 0.689 Epoch 104 iteration 0115/0188: training loss 0.688 Epoch 104 iteration 0116/0188: training loss 0.687 Epoch 104 iteration 0117/0188: training loss 0.688 Epoch 104 iteration 0118/0188: training loss 0.690 Epoch 104 iteration 0119/0188: training loss 0.689 Epoch 104 iteration 0120/0188: training loss 0.690 Epoch 104 iteration 0121/0188: training loss 0.689 Epoch 104 iteration 0122/0188: training loss 0.689 Epoch 104 iteration 0123/0188: training loss 0.689 Epoch 104 iteration 0124/0188: training loss 0.689 Epoch 104 iteration 0125/0188: training loss 0.689 Epoch 104 iteration 0126/0188: training loss 0.688 Epoch 104 iteration 0127/0188: training loss 0.689 Epoch 104 iteration 0128/0188: training loss 0.689 Epoch 104 iteration 0129/0188: training loss 0.689 Epoch 104 iteration 0130/0188: training loss 0.689 Epoch 104 iteration 0131/0188: training loss 0.688 Epoch 104 iteration 0132/0188: training loss 0.688 Epoch 104 iteration 0133/0188: training loss 0.687 Epoch 104 iteration 0134/0188: training loss 0.687 Epoch 104 iteration 0135/0188: training loss 0.686 Epoch 104 iteration 0136/0188: training loss 0.684 Epoch 104 iteration 0137/0188: training loss 0.684 Epoch 104 iteration 0138/0188: training loss 0.684 Epoch 104 iteration 0139/0188: training loss 0.684 Epoch 104 iteration 0140/0188: training loss 0.685 Epoch 104 iteration 0141/0188: training loss 0.684 Epoch 104 iteration 0142/0188: training loss 0.683 Epoch 104 iteration 0143/0188: training loss 0.683 Epoch 104 iteration 0144/0188: training loss 0.683 Epoch 104 iteration 0145/0188: training loss 0.683 Epoch 104 iteration 0146/0188: training loss 0.684 Epoch 104 iteration 0147/0188: training loss 0.685 Epoch 104 iteration 0148/0188: training loss 0.685 Epoch 104 iteration 0149/0188: training loss 0.686 Epoch 104 iteration 0150/0188: training loss 0.686 Epoch 104 iteration 0151/0188: training loss 0.686 Epoch 104 iteration 0152/0188: training loss 0.687 Epoch 104 iteration 0153/0188: training loss 0.686 Epoch 104 iteration 0154/0188: training loss 0.686 Epoch 104 iteration 0155/0188: training loss 0.686 Epoch 104 iteration 0156/0188: training loss 0.686 Epoch 104 iteration 0157/0188: training loss 0.685 Epoch 104 iteration 0158/0188: training loss 0.684 Epoch 104 iteration 0159/0188: training loss 0.684 Epoch 104 iteration 0160/0188: training loss 0.684 Epoch 104 iteration 0161/0188: training loss 0.684 Epoch 104 iteration 0162/0188: training loss 0.684 Epoch 104 iteration 0163/0188: training loss 0.683 Epoch 104 iteration 0164/0188: training loss 0.683 Epoch 104 iteration 0165/0188: training loss 0.684 Epoch 104 iteration 0166/0188: training loss 0.684 Epoch 104 iteration 0167/0188: training loss 0.684 Epoch 104 iteration 0168/0188: training loss 0.683 Epoch 104 iteration 0169/0188: training loss 0.683 Epoch 104 iteration 0170/0188: training loss 0.683 Epoch 104 iteration 0171/0188: training loss 0.682 Epoch 104 iteration 0172/0188: training loss 0.683 Epoch 104 iteration 0173/0188: training loss 0.682 Epoch 104 iteration 0174/0188: training loss 0.682 Epoch 104 iteration 0175/0188: training loss 0.682 Epoch 104 iteration 0176/0188: training loss 0.682 Epoch 104 iteration 0177/0188: training loss 0.682 Epoch 104 iteration 0178/0188: training loss 0.682 Epoch 104 iteration 0179/0188: training loss 0.683 Epoch 104 iteration 0180/0188: training loss 0.683 Epoch 104 iteration 0181/0188: training loss 0.682 Epoch 104 iteration 0182/0188: training loss 0.683 Epoch 104 iteration 0183/0188: training loss 0.682 Epoch 104 iteration 0184/0188: training loss 0.682 Epoch 104 iteration 0185/0188: training loss 0.682 Epoch 104 iteration 0186/0188: training loss 0.682 Epoch 104 validation pixAcc: 0.876, mIoU: 0.394 Epoch 105 iteration 0001/0187: training loss 0.611 Epoch 105 iteration 0002/0187: training loss 0.638 Epoch 105 iteration 0003/0187: training loss 0.603 Epoch 105 iteration 0004/0187: training loss 0.609 Epoch 105 iteration 0005/0187: training loss 0.596 Epoch 105 iteration 0006/0187: training loss 0.614 Epoch 105 iteration 0007/0187: training loss 0.616 Epoch 105 iteration 0008/0187: training loss 0.620 Epoch 105 iteration 0009/0187: training loss 0.631 Epoch 105 iteration 0010/0187: training loss 0.627 Epoch 105 iteration 0011/0187: training loss 0.631 Epoch 105 iteration 0012/0187: training loss 0.624 Epoch 105 iteration 0013/0187: training loss 0.622 Epoch 105 iteration 0014/0187: training loss 0.626 Epoch 105 iteration 0015/0187: training loss 0.629 Epoch 105 iteration 0016/0187: training loss 0.625 Epoch 105 iteration 0017/0187: training loss 0.626 Epoch 105 iteration 0018/0187: training loss 0.626 Epoch 105 iteration 0019/0187: training loss 0.631 Epoch 105 iteration 0020/0187: training loss 0.632 Epoch 105 iteration 0021/0187: training loss 0.636 Epoch 105 iteration 0022/0187: training loss 0.634 Epoch 105 iteration 0023/0187: training loss 0.639 Epoch 105 iteration 0024/0187: training loss 0.640 Epoch 105 iteration 0025/0187: training loss 0.648 Epoch 105 iteration 0026/0187: training loss 0.645 Epoch 105 iteration 0027/0187: training loss 0.643 Epoch 105 iteration 0028/0187: training loss 0.648 Epoch 105 iteration 0029/0187: training loss 0.650 Epoch 105 iteration 0030/0187: training loss 0.646 Epoch 105 iteration 0031/0187: training loss 0.646 Epoch 105 iteration 0032/0187: training loss 0.650 Epoch 105 iteration 0033/0187: training loss 0.650 Epoch 105 iteration 0034/0187: training loss 0.654 Epoch 105 iteration 0035/0187: training loss 0.659 Epoch 105 iteration 0036/0187: training loss 0.658 Epoch 105 iteration 0037/0187: training loss 0.657 Epoch 105 iteration 0038/0187: training loss 0.657 Epoch 105 iteration 0039/0187: training loss 0.655 Epoch 105 iteration 0040/0187: training loss 0.654 Epoch 105 iteration 0041/0187: training loss 0.653 Epoch 105 iteration 0042/0187: training loss 0.658 Epoch 105 iteration 0043/0187: training loss 0.656 Epoch 105 iteration 0044/0187: training loss 0.659 Epoch 105 iteration 0045/0187: training loss 0.658 Epoch 105 iteration 0046/0187: training loss 0.657 Epoch 105 iteration 0047/0187: training loss 0.659 Epoch 105 iteration 0048/0187: training loss 0.663 Epoch 105 iteration 0049/0187: training loss 0.667 Epoch 105 iteration 0050/0187: training loss 0.664 Epoch 105 iteration 0051/0187: training loss 0.665 Epoch 105 iteration 0052/0187: training loss 0.663 Epoch 105 iteration 0053/0187: training loss 0.662 Epoch 105 iteration 0054/0187: training loss 0.660 Epoch 105 iteration 0055/0187: training loss 0.661 Epoch 105 iteration 0056/0187: training loss 0.660 Epoch 105 iteration 0057/0187: training loss 0.663 Epoch 105 iteration 0058/0187: training loss 0.668 Epoch 105 iteration 0059/0187: training loss 0.672 Epoch 105 iteration 0060/0187: training loss 0.674 Epoch 105 iteration 0061/0187: training loss 0.672 Epoch 105 iteration 0062/0187: training loss 0.671 Epoch 105 iteration 0063/0187: training loss 0.672 Epoch 105 iteration 0064/0187: training loss 0.672 Epoch 105 iteration 0065/0187: training loss 0.676 Epoch 105 iteration 0066/0187: training loss 0.677 Epoch 105 iteration 0067/0187: training loss 0.676 Epoch 105 iteration 0068/0187: training loss 0.674 Epoch 105 iteration 0069/0187: training loss 0.674 Epoch 105 iteration 0070/0187: training loss 0.676 Epoch 105 iteration 0071/0187: training loss 0.675 Epoch 105 iteration 0072/0187: training loss 0.677 Epoch 105 iteration 0073/0187: training loss 0.677 Epoch 105 iteration 0074/0187: training loss 0.677 Epoch 105 iteration 0075/0187: training loss 0.675 Epoch 105 iteration 0076/0187: training loss 0.677 Epoch 105 iteration 0077/0187: training loss 0.675 Epoch 105 iteration 0078/0187: training loss 0.674 Epoch 105 iteration 0079/0187: training loss 0.675 Epoch 105 iteration 0080/0187: training loss 0.675 Epoch 105 iteration 0081/0187: training loss 0.676 Epoch 105 iteration 0082/0187: training loss 0.676 Epoch 105 iteration 0083/0187: training loss 0.678 Epoch 105 iteration 0084/0187: training loss 0.677 Epoch 105 iteration 0085/0187: training loss 0.677 Epoch 105 iteration 0086/0187: training loss 0.679 Epoch 105 iteration 0087/0187: training loss 0.678 Epoch 105 iteration 0088/0187: training loss 0.677 Epoch 105 iteration 0089/0187: training loss 0.678 Epoch 105 iteration 0090/0187: training loss 0.677 Epoch 105 iteration 0091/0187: training loss 0.676 Epoch 105 iteration 0092/0187: training loss 0.678 Epoch 105 iteration 0093/0187: training loss 0.677 Epoch 105 iteration 0094/0187: training loss 0.675 Epoch 105 iteration 0095/0187: training loss 0.675 Epoch 105 iteration 0096/0187: training loss 0.674 Epoch 105 iteration 0097/0187: training loss 0.674 Epoch 105 iteration 0098/0187: training loss 0.674 Epoch 105 iteration 0099/0187: training loss 0.675 Epoch 105 iteration 0100/0187: training loss 0.675 Epoch 105 iteration 0101/0187: training loss 0.675 Epoch 105 iteration 0102/0187: training loss 0.674 Epoch 105 iteration 0103/0187: training loss 0.674 Epoch 105 iteration 0104/0187: training loss 0.674 Epoch 105 iteration 0105/0187: training loss 0.676 Epoch 105 iteration 0106/0187: training loss 0.679 Epoch 105 iteration 0107/0187: training loss 0.679 Epoch 105 iteration 0108/0187: training loss 0.679 Epoch 105 iteration 0109/0187: training loss 0.678 Epoch 105 iteration 0110/0187: training loss 0.677 Epoch 105 iteration 0111/0187: training loss 0.678 Epoch 105 iteration 0112/0187: training loss 0.678 Epoch 105 iteration 0113/0187: training loss 0.677 Epoch 105 iteration 0114/0187: training loss 0.677 Epoch 105 iteration 0115/0187: training loss 0.677 Epoch 105 iteration 0116/0187: training loss 0.677 Epoch 105 iteration 0117/0187: training loss 0.678 Epoch 105 iteration 0118/0187: training loss 0.678 Epoch 105 iteration 0119/0187: training loss 0.679 Epoch 105 iteration 0120/0187: training loss 0.678 Epoch 105 iteration 0121/0187: training loss 0.680 Epoch 105 iteration 0122/0187: training loss 0.680 Epoch 105 iteration 0123/0187: training loss 0.680 Epoch 105 iteration 0124/0187: training loss 0.680 Epoch 105 iteration 0125/0187: training loss 0.678 Epoch 105 iteration 0126/0187: training loss 0.677 Epoch 105 iteration 0127/0187: training loss 0.678 Epoch 105 iteration 0128/0187: training loss 0.676 Epoch 105 iteration 0129/0187: training loss 0.677 Epoch 105 iteration 0130/0187: training loss 0.677 Epoch 105 iteration 0131/0187: training loss 0.676 Epoch 105 iteration 0132/0187: training loss 0.677 Epoch 105 iteration 0133/0187: training loss 0.677 Epoch 105 iteration 0134/0187: training loss 0.677 Epoch 105 iteration 0135/0187: training loss 0.678 Epoch 105 iteration 0136/0187: training loss 0.678 Epoch 105 iteration 0137/0187: training loss 0.677 Epoch 105 iteration 0138/0187: training loss 0.678 Epoch 105 iteration 0139/0187: training loss 0.677 Epoch 105 iteration 0140/0187: training loss 0.677 Epoch 105 iteration 0141/0187: training loss 0.679 Epoch 105 iteration 0142/0187: training loss 0.678 Epoch 105 iteration 0143/0187: training loss 0.677 Epoch 105 iteration 0144/0187: training loss 0.676 Epoch 105 iteration 0145/0187: training loss 0.676 Epoch 105 iteration 0146/0187: training loss 0.677 Epoch 105 iteration 0147/0187: training loss 0.677 Epoch 105 iteration 0148/0187: training loss 0.678 Epoch 105 iteration 0149/0187: training loss 0.678 Epoch 105 iteration 0150/0187: training loss 0.678 Epoch 105 iteration 0151/0187: training loss 0.678 Epoch 105 iteration 0152/0187: training loss 0.679 Epoch 105 iteration 0153/0187: training loss 0.680 Epoch 105 iteration 0154/0187: training loss 0.680 Epoch 105 iteration 0155/0187: training loss 0.681 Epoch 105 iteration 0156/0187: training loss 0.681 Epoch 105 iteration 0157/0187: training loss 0.681 Epoch 105 iteration 0158/0187: training loss 0.681 Epoch 105 iteration 0159/0187: training loss 0.681 Epoch 105 iteration 0160/0187: training loss 0.681 Epoch 105 iteration 0161/0187: training loss 0.682 Epoch 105 iteration 0162/0187: training loss 0.682 Epoch 105 iteration 0163/0187: training loss 0.683 Epoch 105 iteration 0164/0187: training loss 0.683 Epoch 105 iteration 0165/0187: training loss 0.682 Epoch 105 iteration 0166/0187: training loss 0.682 Epoch 105 iteration 0167/0187: training loss 0.683 Epoch 105 iteration 0168/0187: training loss 0.682 Epoch 105 iteration 0169/0187: training loss 0.682 Epoch 105 iteration 0170/0187: training loss 0.681 Epoch 105 iteration 0171/0187: training loss 0.681 Epoch 105 iteration 0172/0187: training loss 0.680 Epoch 105 iteration 0173/0187: training loss 0.680 Epoch 105 iteration 0174/0187: training loss nan Epoch 105 iteration 0175/0187: training loss nan Epoch 105 iteration 0176/0187: training loss nan Epoch 105 iteration 0177/0187: training loss nan Epoch 105 iteration 0178/0187: training loss nan Epoch 105 iteration 0179/0187: training loss nan Epoch 105 iteration 0180/0187: training loss nan Epoch 105 iteration 0181/0187: training loss nan Epoch 105 iteration 0182/0187: training loss nan Epoch 105 iteration 0183/0187: training loss nan Epoch 105 iteration 0184/0187: training loss nan Epoch 105 iteration 0185/0187: training loss nan Epoch 105 iteration 0186/0187: training loss nan Epoch 105 iteration 0187/0187: training loss nan Epoch 105 validation pixAcc: 0.876, mIoU: 0.397 Epoch 106 iteration 0001/0187: training loss 0.701 Epoch 106 iteration 0002/0187: training loss 0.710 Epoch 106 iteration 0003/0187: training loss 0.674 Epoch 106 iteration 0004/0187: training loss 0.642 Epoch 106 iteration 0005/0187: training loss 0.662 Epoch 106 iteration 0006/0187: training loss 0.640 Epoch 106 iteration 0007/0187: training loss 0.613 Epoch 106 iteration 0008/0187: training loss 0.630 Epoch 106 iteration 0009/0187: training loss 0.654 Epoch 106 iteration 0010/0187: training loss 0.655 Epoch 106 iteration 0011/0187: training loss 0.652 Epoch 106 iteration 0012/0187: training loss 0.654 Epoch 106 iteration 0013/0187: training loss 0.654 Epoch 106 iteration 0014/0187: training loss 0.660 Epoch 106 iteration 0015/0187: training loss 0.665 Epoch 106 iteration 0016/0187: training loss 0.663 Epoch 106 iteration 0017/0187: training loss 0.660 Epoch 106 iteration 0018/0187: training loss 0.652 Epoch 106 iteration 0019/0187: training loss 0.658 Epoch 106 iteration 0020/0187: training loss 0.663 Epoch 106 iteration 0021/0187: training loss 0.666 Epoch 106 iteration 0022/0187: training loss 0.662 Epoch 106 iteration 0023/0187: training loss 0.663 Epoch 106 iteration 0024/0187: training loss 0.661 Epoch 106 iteration 0025/0187: training loss 0.660 Epoch 106 iteration 0026/0187: training loss 0.659 Epoch 106 iteration 0027/0187: training loss 0.657 Epoch 106 iteration 0028/0187: training loss 0.656 Epoch 106 iteration 0029/0187: training loss 0.656 Epoch 106 iteration 0030/0187: training loss 0.662 Epoch 106 iteration 0031/0187: training loss 0.662 Epoch 106 iteration 0032/0187: training loss 0.657 Epoch 106 iteration 0033/0187: training loss 0.659 Epoch 106 iteration 0034/0187: training loss 0.657 Epoch 106 iteration 0035/0187: training loss 0.653 Epoch 106 iteration 0036/0187: training loss 0.653 Epoch 106 iteration 0037/0187: training loss 0.649 Epoch 106 iteration 0038/0187: training loss 0.647 Epoch 106 iteration 0039/0187: training loss 0.648 Epoch 106 iteration 0040/0187: training loss 0.646 Epoch 106 iteration 0041/0187: training loss 0.649 Epoch 106 iteration 0042/0187: training loss 0.652 Epoch 106 iteration 0043/0187: training loss 0.654 Epoch 106 iteration 0044/0187: training loss 0.656 Epoch 106 iteration 0045/0187: training loss 0.656 Epoch 106 iteration 0046/0187: training loss 0.658 Epoch 106 iteration 0047/0187: training loss 0.658 Epoch 106 iteration 0048/0187: training loss 0.658 Epoch 106 iteration 0049/0187: training loss 0.659 Epoch 106 iteration 0050/0187: training loss 0.661 Epoch 106 iteration 0051/0187: training loss 0.663 Epoch 106 iteration 0052/0187: training loss 0.664 Epoch 106 iteration 0053/0187: training loss 0.664 Epoch 106 iteration 0054/0187: training loss 0.665 Epoch 106 iteration 0055/0187: training loss 0.665 Epoch 106 iteration 0056/0187: training loss 0.665 Epoch 106 iteration 0057/0187: training loss 0.664 Epoch 106 iteration 0058/0187: training loss 0.664 Epoch 106 iteration 0059/0187: training loss 0.662 Epoch 106 iteration 0060/0187: training loss 0.662 Epoch 106 iteration 0061/0187: training loss 0.665 Epoch 106 iteration 0062/0187: training loss 0.666 Epoch 106 iteration 0063/0187: training loss 0.665 Epoch 106 iteration 0064/0187: training loss 0.666 Epoch 106 iteration 0065/0187: training loss 0.669 Epoch 106 iteration 0066/0187: training loss 0.667 Epoch 106 iteration 0067/0187: training loss 0.668 Epoch 106 iteration 0068/0187: training loss 0.670 Epoch 106 iteration 0069/0187: training loss 0.669 Epoch 106 iteration 0070/0187: training loss 0.670 Epoch 106 iteration 0071/0187: training loss 0.670 Epoch 106 iteration 0072/0187: training loss 0.670 Epoch 106 iteration 0073/0187: training loss 0.669 Epoch 106 iteration 0074/0187: training loss 0.669 Epoch 106 iteration 0075/0187: training loss 0.671 Epoch 106 iteration 0076/0187: training loss 0.671 Epoch 106 iteration 0077/0187: training loss 0.671 Epoch 106 iteration 0078/0187: training loss 0.670 Epoch 106 iteration 0079/0187: training loss 0.671 Epoch 106 iteration 0080/0187: training loss 0.673 Epoch 106 iteration 0081/0187: training loss 0.673 Epoch 106 iteration 0082/0187: training loss 0.672 Epoch 106 iteration 0083/0187: training loss 0.673 Epoch 106 iteration 0084/0187: training loss 0.672 Epoch 106 iteration 0085/0187: training loss 0.674 Epoch 106 iteration 0086/0187: training loss 0.674 Epoch 106 iteration 0087/0187: training loss 0.674 Epoch 106 iteration 0088/0187: training loss 0.673 Epoch 106 iteration 0089/0187: training loss 0.675 Epoch 106 iteration 0090/0187: training loss 0.674 Epoch 106 iteration 0091/0188: training loss 0.675 Epoch 106 iteration 0092/0188: training loss 0.675 Epoch 106 iteration 0093/0188: training loss 0.676 Epoch 106 iteration 0094/0188: training loss 0.677 Epoch 106 iteration 0095/0188: training loss 0.677 Epoch 106 iteration 0096/0188: training loss 0.675 Epoch 106 iteration 0097/0188: training loss 0.674 Epoch 106 iteration 0098/0188: training loss 0.675 Epoch 106 iteration 0099/0188: training loss 0.674 Epoch 106 iteration 0100/0188: training loss 0.675 Epoch 106 iteration 0101/0188: training loss 0.675 Epoch 106 iteration 0102/0188: training loss 0.675 Epoch 106 iteration 0103/0188: training loss 0.678 Epoch 106 iteration 0104/0188: training loss 0.680 Epoch 106 iteration 0105/0188: training loss 0.680 Epoch 106 iteration 0106/0188: training loss 0.679 Epoch 106 iteration 0107/0188: training loss 0.678 Epoch 106 iteration 0108/0188: training loss 0.678 Epoch 106 iteration 0109/0188: training loss 0.679 Epoch 106 iteration 0110/0188: training loss 0.679 Epoch 106 iteration 0111/0188: training loss 0.679 Epoch 106 iteration 0112/0188: training loss 0.679 Epoch 106 iteration 0113/0188: training loss 0.680 Epoch 106 iteration 0114/0188: training loss 0.681 Epoch 106 iteration 0115/0188: training loss 0.683 Epoch 106 iteration 0116/0188: training loss 0.682 Epoch 106 iteration 0117/0188: training loss 0.680 Epoch 106 iteration 0118/0188: training loss 0.680 Epoch 106 iteration 0119/0188: training loss 0.680 Epoch 106 iteration 0120/0188: training loss 0.680 Epoch 106 iteration 0121/0188: training loss 0.679 Epoch 106 iteration 0122/0188: training loss 0.679 Epoch 106 iteration 0123/0188: training loss 0.680 Epoch 106 iteration 0124/0188: training loss 0.679 Epoch 106 iteration 0125/0188: training loss 0.679 Epoch 106 iteration 0126/0188: training loss 0.680 Epoch 106 iteration 0127/0188: training loss 0.680 Epoch 106 iteration 0128/0188: training loss 0.679 Epoch 106 iteration 0129/0188: training loss 0.679 Epoch 106 iteration 0130/0188: training loss 0.679 Epoch 106 iteration 0131/0188: training loss 0.681 Epoch 106 iteration 0132/0188: training loss 0.680 Epoch 106 iteration 0133/0188: training loss 0.680 Epoch 106 iteration 0134/0188: training loss 0.682 Epoch 106 iteration 0135/0188: training loss 0.681 Epoch 106 iteration 0136/0188: training loss 0.681 Epoch 106 iteration 0137/0188: training loss 0.682 Epoch 106 iteration 0138/0188: training loss 0.682 Epoch 106 iteration 0139/0188: training loss 0.682 Epoch 106 iteration 0140/0188: training loss 0.681 Epoch 106 iteration 0141/0188: training loss 0.683 Epoch 106 iteration 0142/0188: training loss 0.682 Epoch 106 iteration 0143/0188: training loss 0.682 Epoch 106 iteration 0144/0188: training loss 0.681 Epoch 106 iteration 0145/0188: training loss 0.681 Epoch 106 iteration 0146/0188: training loss 0.682 Epoch 106 iteration 0147/0188: training loss 0.682 Epoch 106 iteration 0148/0188: training loss 0.682 Epoch 106 iteration 0149/0188: training loss 0.681 Epoch 106 iteration 0150/0188: training loss 0.682 Epoch 106 iteration 0151/0188: training loss 0.682 Epoch 106 iteration 0152/0188: training loss 0.682 Epoch 106 iteration 0153/0188: training loss 0.680 Epoch 106 iteration 0154/0188: training loss 0.680 Epoch 106 iteration 0155/0188: training loss 0.680 Epoch 106 iteration 0156/0188: training loss 0.679 Epoch 106 iteration 0157/0188: training loss 0.679 Epoch 106 iteration 0158/0188: training loss 0.678 Epoch 106 iteration 0159/0188: training loss 0.677 Epoch 106 iteration 0160/0188: training loss 0.678 Epoch 106 iteration 0161/0188: training loss 0.678 Epoch 106 iteration 0162/0188: training loss 0.677 Epoch 106 iteration 0163/0188: training loss 0.677 Epoch 106 iteration 0164/0188: training loss 0.677 Epoch 106 iteration 0165/0188: training loss 0.677 Epoch 106 iteration 0166/0188: training loss 0.677 Epoch 106 iteration 0167/0188: training loss 0.678 Epoch 106 iteration 0168/0188: training loss 0.678 Epoch 106 iteration 0169/0188: training loss 0.678 Epoch 106 iteration 0170/0188: training loss 0.678 Epoch 106 iteration 0171/0188: training loss 0.677 Epoch 106 iteration 0172/0188: training loss 0.677 Epoch 106 iteration 0173/0188: training loss 0.677 Epoch 106 iteration 0174/0188: training loss 0.677 Epoch 106 iteration 0175/0188: training loss 0.677 Epoch 106 iteration 0176/0188: training loss 0.679 Epoch 106 iteration 0177/0188: training loss 0.680 Epoch 106 iteration 0178/0188: training loss 0.681 Epoch 106 iteration 0179/0188: training loss 0.682 Epoch 106 iteration 0180/0188: training loss 0.681 Epoch 106 iteration 0181/0188: training loss 0.680 Epoch 106 iteration 0182/0188: training loss 0.680 Epoch 106 iteration 0183/0188: training loss 0.679 Epoch 106 iteration 0184/0188: training loss 0.679 Epoch 106 iteration 0185/0188: training loss 0.679 Epoch 106 iteration 0186/0188: training loss 0.678 Epoch 106 validation pixAcc: 0.876, mIoU: 0.396 Epoch 107 iteration 0001/0187: training loss 0.661 Epoch 107 iteration 0002/0187: training loss 0.753 Epoch 107 iteration 0003/0187: training loss 0.766 Epoch 107 iteration 0004/0187: training loss 0.719 Epoch 107 iteration 0005/0187: training loss 0.718 Epoch 107 iteration 0006/0187: training loss 0.699 Epoch 107 iteration 0007/0187: training loss 0.688 Epoch 107 iteration 0008/0187: training loss 0.679 Epoch 107 iteration 0009/0187: training loss 0.669 Epoch 107 iteration 0010/0187: training loss 0.664 Epoch 107 iteration 0011/0187: training loss 0.667 Epoch 107 iteration 0012/0187: training loss 0.682 Epoch 107 iteration 0013/0187: training loss 0.681 Epoch 107 iteration 0014/0187: training loss 0.686 Epoch 107 iteration 0015/0187: training loss 0.673 Epoch 107 iteration 0016/0187: training loss 0.679 Epoch 107 iteration 0017/0187: training loss 0.673 Epoch 107 iteration 0018/0187: training loss 0.678 Epoch 107 iteration 0019/0187: training loss 0.684 Epoch 107 iteration 0020/0187: training loss 0.692 Epoch 107 iteration 0021/0187: training loss 0.688 Epoch 107 iteration 0022/0187: training loss 0.685 Epoch 107 iteration 0023/0187: training loss 0.686 Epoch 107 iteration 0024/0187: training loss 0.682 Epoch 107 iteration 0025/0187: training loss 0.680 Epoch 107 iteration 0026/0187: training loss 0.681 Epoch 107 iteration 0027/0187: training loss 0.678 Epoch 107 iteration 0028/0187: training loss 0.682 Epoch 107 iteration 0029/0187: training loss 0.680 Epoch 107 iteration 0030/0187: training loss 0.674 Epoch 107 iteration 0031/0187: training loss 0.672 Epoch 107 iteration 0032/0187: training loss 0.673 Epoch 107 iteration 0033/0187: training loss 0.678 Epoch 107 iteration 0034/0187: training loss 0.680 Epoch 107 iteration 0035/0187: training loss 0.676 Epoch 107 iteration 0036/0187: training loss 0.677 Epoch 107 iteration 0037/0187: training loss 0.680 Epoch 107 iteration 0038/0187: training loss 0.675 Epoch 107 iteration 0039/0187: training loss 0.675 Epoch 107 iteration 0040/0187: training loss 0.675 Epoch 107 iteration 0041/0187: training loss 0.671 Epoch 107 iteration 0042/0187: training loss 0.673 Epoch 107 iteration 0043/0187: training loss 0.673 Epoch 107 iteration 0044/0187: training loss 0.672 Epoch 107 iteration 0045/0187: training loss 0.673 Epoch 107 iteration 0046/0187: training loss 0.677 Epoch 107 iteration 0047/0187: training loss 0.677 Epoch 107 iteration 0048/0187: training loss 0.674 Epoch 107 iteration 0049/0187: training loss 0.675 Epoch 107 iteration 0050/0187: training loss 0.677 Epoch 107 iteration 0051/0187: training loss 0.674 Epoch 107 iteration 0052/0187: training loss 0.676 Epoch 107 iteration 0053/0187: training loss 0.677 Epoch 107 iteration 0054/0187: training loss 0.678 Epoch 107 iteration 0055/0187: training loss 0.680 Epoch 107 iteration 0056/0187: training loss 0.678 Epoch 107 iteration 0057/0187: training loss 0.681 Epoch 107 iteration 0058/0187: training loss 0.680 Epoch 107 iteration 0059/0187: training loss 0.679 Epoch 107 iteration 0060/0187: training loss 0.679 Epoch 107 iteration 0061/0187: training loss 0.679 Epoch 107 iteration 0062/0187: training loss 0.678 Epoch 107 iteration 0063/0187: training loss 0.679 Epoch 107 iteration 0064/0187: training loss 0.679 Epoch 107 iteration 0065/0187: training loss 0.679 Epoch 107 iteration 0066/0187: training loss 0.678 Epoch 107 iteration 0067/0187: training loss 0.679 Epoch 107 iteration 0068/0187: training loss 0.680 Epoch 107 iteration 0069/0187: training loss 0.682 Epoch 107 iteration 0070/0187: training loss 0.683 Epoch 107 iteration 0071/0187: training loss 0.681 Epoch 107 iteration 0072/0187: training loss 0.681 Epoch 107 iteration 0073/0187: training loss 0.685 Epoch 107 iteration 0074/0187: training loss 0.683 Epoch 107 iteration 0075/0187: training loss 0.682 Epoch 107 iteration 0076/0187: training loss 0.683 Epoch 107 iteration 0077/0187: training loss 0.681 Epoch 107 iteration 0078/0187: training loss 0.679 Epoch 107 iteration 0079/0187: training loss 0.680 Epoch 107 iteration 0080/0187: training loss 0.681 Epoch 107 iteration 0081/0187: training loss 0.682 Epoch 107 iteration 0082/0187: training loss 0.681 Epoch 107 iteration 0083/0187: training loss 0.680 Epoch 107 iteration 0084/0187: training loss 0.679 Epoch 107 iteration 0085/0187: training loss 0.677 Epoch 107 iteration 0086/0187: training loss 0.677 Epoch 107 iteration 0087/0187: training loss 0.677 Epoch 107 iteration 0088/0187: training loss 0.675 Epoch 107 iteration 0089/0187: training loss 0.674 Epoch 107 iteration 0090/0187: training loss 0.677 Epoch 107 iteration 0091/0187: training loss 0.678 Epoch 107 iteration 0092/0187: training loss 0.678 Epoch 107 iteration 0093/0187: training loss 0.677 Epoch 107 iteration 0094/0187: training loss 0.677 Epoch 107 iteration 0095/0187: training loss 0.678 Epoch 107 iteration 0096/0187: training loss 0.678 Epoch 107 iteration 0097/0187: training loss 0.677 Epoch 107 iteration 0098/0187: training loss 0.677 Epoch 107 iteration 0099/0187: training loss 0.677 Epoch 107 iteration 0100/0187: training loss 0.675 Epoch 107 iteration 0101/0187: training loss 0.674 Epoch 107 iteration 0102/0187: training loss 0.674 Epoch 107 iteration 0103/0187: training loss 0.673 Epoch 107 iteration 0104/0187: training loss 0.672 Epoch 107 iteration 0105/0187: training loss 0.671 Epoch 107 iteration 0106/0187: training loss 0.670 Epoch 107 iteration 0107/0187: training loss 0.670 Epoch 107 iteration 0108/0187: training loss 0.673 Epoch 107 iteration 0109/0187: training loss 0.672 Epoch 107 iteration 0110/0187: training loss 0.673 Epoch 107 iteration 0111/0187: training loss 0.672 Epoch 107 iteration 0112/0187: training loss 0.672 Epoch 107 iteration 0113/0187: training loss 0.672 Epoch 107 iteration 0114/0187: training loss 0.673 Epoch 107 iteration 0115/0187: training loss 0.673 Epoch 107 iteration 0116/0187: training loss 0.673 Epoch 107 iteration 0117/0187: training loss 0.671 Epoch 107 iteration 0118/0187: training loss 0.671 Epoch 107 iteration 0119/0187: training loss 0.672 Epoch 107 iteration 0120/0187: training loss 0.672 Epoch 107 iteration 0121/0187: training loss 0.671 Epoch 107 iteration 0122/0187: training loss 0.671 Epoch 107 iteration 0123/0187: training loss 0.671 Epoch 107 iteration 0124/0187: training loss 0.671 Epoch 107 iteration 0125/0187: training loss 0.672 Epoch 107 iteration 0126/0187: training loss 0.671 Epoch 107 iteration 0127/0187: training loss 0.671 Epoch 107 iteration 0128/0187: training loss 0.670 Epoch 107 iteration 0129/0187: training loss 0.671 Epoch 107 iteration 0130/0187: training loss 0.671 Epoch 107 iteration 0131/0187: training loss 0.670 Epoch 107 iteration 0132/0187: training loss 0.671 Epoch 107 iteration 0133/0187: training loss 0.671 Epoch 107 iteration 0134/0187: training loss 0.673 Epoch 107 iteration 0135/0187: training loss 0.673 Epoch 107 iteration 0136/0187: training loss 0.676 Epoch 107 iteration 0137/0187: training loss 0.676 Epoch 107 iteration 0138/0187: training loss 0.678 Epoch 107 iteration 0139/0187: training loss 0.678 Epoch 107 iteration 0140/0187: training loss 0.676 Epoch 107 iteration 0141/0187: training loss 0.677 Epoch 107 iteration 0142/0187: training loss 0.678 Epoch 107 iteration 0143/0187: training loss 0.678 Epoch 107 iteration 0144/0187: training loss 0.677 Epoch 107 iteration 0145/0187: training loss 0.677 Epoch 107 iteration 0146/0187: training loss 0.676 Epoch 107 iteration 0147/0187: training loss 0.676 Epoch 107 iteration 0148/0187: training loss 0.677 Epoch 107 iteration 0149/0187: training loss 0.676 Epoch 107 iteration 0150/0187: training loss 0.677 Epoch 107 iteration 0151/0187: training loss 0.677 Epoch 107 iteration 0152/0187: training loss 0.677 Epoch 107 iteration 0153/0187: training loss 0.679 Epoch 107 iteration 0154/0187: training loss 0.679 Epoch 107 iteration 0155/0187: training loss 0.678 Epoch 107 iteration 0156/0187: training loss 0.678 Epoch 107 iteration 0157/0187: training loss 0.678 Epoch 107 iteration 0158/0187: training loss 0.677 Epoch 107 iteration 0159/0187: training loss 0.677 Epoch 107 iteration 0160/0187: training loss 0.676 Epoch 107 iteration 0161/0187: training loss 0.676 Epoch 107 iteration 0162/0187: training loss 0.675 Epoch 107 iteration 0163/0187: training loss 0.675 Epoch 107 iteration 0164/0187: training loss 0.675 Epoch 107 iteration 0165/0187: training loss 0.676 Epoch 107 iteration 0166/0187: training loss 0.676 Epoch 107 iteration 0167/0187: training loss 0.676 Epoch 107 iteration 0168/0187: training loss 0.676 Epoch 107 iteration 0169/0187: training loss 0.676 Epoch 107 iteration 0170/0187: training loss 0.676 Epoch 107 iteration 0171/0187: training loss 0.675 Epoch 107 iteration 0172/0187: training loss 0.675 Epoch 107 iteration 0173/0187: training loss 0.676 Epoch 107 iteration 0174/0187: training loss 0.676 Epoch 107 iteration 0175/0187: training loss 0.676 Epoch 107 iteration 0176/0187: training loss 0.676 Epoch 107 iteration 0177/0187: training loss 0.675 Epoch 107 iteration 0178/0187: training loss 0.675 Epoch 107 iteration 0179/0187: training loss 0.674 Epoch 107 iteration 0180/0187: training loss 0.674 Epoch 107 iteration 0181/0187: training loss 0.673 Epoch 107 iteration 0182/0187: training loss 0.673 Epoch 107 iteration 0183/0187: training loss 0.672 Epoch 107 iteration 0184/0187: training loss 0.673 Epoch 107 iteration 0185/0187: training loss 0.674 Epoch 107 iteration 0186/0187: training loss 0.675 Epoch 107 iteration 0187/0187: training loss 0.676 Epoch 107 validation pixAcc: 0.875, mIoU: 0.396 Epoch 108 iteration 0001/0187: training loss 0.635 Epoch 108 iteration 0002/0187: training loss 0.629 Epoch 108 iteration 0003/0187: training loss 0.638 Epoch 108 iteration 0004/0187: training loss 0.651 Epoch 108 iteration 0005/0187: training loss 0.661 Epoch 108 iteration 0006/0187: training loss 0.700 Epoch 108 iteration 0007/0187: training loss 0.709 Epoch 108 iteration 0008/0187: training loss 0.706 Epoch 108 iteration 0009/0187: training loss 0.724 Epoch 108 iteration 0010/0187: training loss 0.712 Epoch 108 iteration 0011/0187: training loss 0.706 Epoch 108 iteration 0012/0187: training loss 0.702 Epoch 108 iteration 0013/0187: training loss 0.713 Epoch 108 iteration 0014/0187: training loss 0.698 Epoch 108 iteration 0015/0187: training loss 0.689 Epoch 108 iteration 0016/0187: training loss 0.697 Epoch 108 iteration 0017/0187: training loss 0.703 Epoch 108 iteration 0018/0187: training loss 0.703 Epoch 108 iteration 0019/0187: training loss 0.706 Epoch 108 iteration 0020/0187: training loss 0.714 Epoch 108 iteration 0021/0187: training loss 0.712 Epoch 108 iteration 0022/0187: training loss 0.712 Epoch 108 iteration 0023/0187: training loss 0.707 Epoch 108 iteration 0024/0187: training loss 0.713 Epoch 108 iteration 0025/0187: training loss 0.718 Epoch 108 iteration 0026/0187: training loss 0.715 Epoch 108 iteration 0027/0187: training loss 0.720 Epoch 108 iteration 0028/0187: training loss 0.718 Epoch 108 iteration 0029/0187: training loss 0.718 Epoch 108 iteration 0030/0187: training loss 0.718 Epoch 108 iteration 0031/0187: training loss 0.721 Epoch 108 iteration 0032/0187: training loss 0.724 Epoch 108 iteration 0033/0187: training loss 0.724 Epoch 108 iteration 0034/0187: training loss 0.724 Epoch 108 iteration 0035/0187: training loss 0.721 Epoch 108 iteration 0036/0187: training loss 0.716 Epoch 108 iteration 0037/0187: training loss 0.718 Epoch 108 iteration 0038/0187: training loss 0.717 Epoch 108 iteration 0039/0187: training loss 0.721 Epoch 108 iteration 0040/0187: training loss 0.719 Epoch 108 iteration 0041/0187: training loss 0.719 Epoch 108 iteration 0042/0187: training loss 0.715 Epoch 108 iteration 0043/0187: training loss 0.719 Epoch 108 iteration 0044/0187: training loss 0.720 Epoch 108 iteration 0045/0187: training loss 0.719 Epoch 108 iteration 0046/0187: training loss 0.715 Epoch 108 iteration 0047/0187: training loss 0.713 Epoch 108 iteration 0048/0187: training loss 0.714 Epoch 108 iteration 0049/0187: training loss 0.717 Epoch 108 iteration 0050/0187: training loss 0.717 Epoch 108 iteration 0051/0187: training loss 0.715 Epoch 108 iteration 0052/0187: training loss 0.713 Epoch 108 iteration 0053/0187: training loss 0.713 Epoch 108 iteration 0054/0187: training loss 0.714 Epoch 108 iteration 0055/0187: training loss 0.715 Epoch 108 iteration 0056/0187: training loss 0.713 Epoch 108 iteration 0057/0187: training loss 0.716 Epoch 108 iteration 0058/0187: training loss 0.715 Epoch 108 iteration 0059/0187: training loss 0.714 Epoch 108 iteration 0060/0187: training loss 0.713 Epoch 108 iteration 0061/0187: training loss 0.712 Epoch 108 iteration 0062/0187: training loss 0.712 Epoch 108 iteration 0063/0187: training loss 0.714 Epoch 108 iteration 0064/0187: training loss 0.715 Epoch 108 iteration 0065/0187: training loss 0.712 Epoch 108 iteration 0066/0187: training loss 0.711 Epoch 108 iteration 0067/0187: training loss 0.710 Epoch 108 iteration 0068/0187: training loss 0.708 Epoch 108 iteration 0069/0187: training loss 0.709 Epoch 108 iteration 0070/0187: training loss 0.705 Epoch 108 iteration 0071/0187: training loss 0.705 Epoch 108 iteration 0072/0187: training loss 0.704 Epoch 108 iteration 0073/0187: training loss 0.703 Epoch 108 iteration 0074/0187: training loss 0.702 Epoch 108 iteration 0075/0187: training loss 0.703 Epoch 108 iteration 0076/0187: training loss 0.701 Epoch 108 iteration 0077/0187: training loss 0.698 Epoch 108 iteration 0078/0187: training loss 0.698 Epoch 108 iteration 0079/0187: training loss 0.696 Epoch 108 iteration 0080/0187: training loss 0.695 Epoch 108 iteration 0081/0187: training loss 0.693 Epoch 108 iteration 0082/0187: training loss 0.693 Epoch 108 iteration 0083/0187: training loss 0.694 Epoch 108 iteration 0084/0187: training loss 0.695 Epoch 108 iteration 0085/0187: training loss 0.694 Epoch 108 iteration 0086/0187: training loss 0.692 Epoch 108 iteration 0087/0187: training loss 0.691 Epoch 108 iteration 0088/0187: training loss 0.691 Epoch 108 iteration 0089/0187: training loss 0.694 Epoch 108 iteration 0090/0187: training loss 0.692 Epoch 108 iteration 0091/0188: training loss 0.693 Epoch 108 iteration 0092/0188: training loss 0.692 Epoch 108 iteration 0093/0188: training loss 0.691 Epoch 108 iteration 0094/0188: training loss 0.691 Epoch 108 iteration 0095/0188: training loss 0.691 Epoch 108 iteration 0096/0188: training loss 0.691 Epoch 108 iteration 0097/0188: training loss 0.692 Epoch 108 iteration 0098/0188: training loss 0.691 Epoch 108 iteration 0099/0188: training loss 0.690 Epoch 108 iteration 0100/0188: training loss 0.689 Epoch 108 iteration 0101/0188: training loss 0.689 Epoch 108 iteration 0102/0188: training loss 0.688 Epoch 108 iteration 0103/0188: training loss 0.690 Epoch 108 iteration 0104/0188: training loss 0.689 Epoch 108 iteration 0105/0188: training loss 0.689 Epoch 108 iteration 0106/0188: training loss 0.690 Epoch 108 iteration 0107/0188: training loss 0.690 Epoch 108 iteration 0108/0188: training loss 0.689 Epoch 108 iteration 0109/0188: training loss 0.689 Epoch 108 iteration 0110/0188: training loss 0.688 Epoch 108 iteration 0111/0188: training loss 0.688 Epoch 108 iteration 0112/0188: training loss 0.687 Epoch 108 iteration 0113/0188: training loss 0.689 Epoch 108 iteration 0114/0188: training loss 0.689 Epoch 108 iteration 0115/0188: training loss 0.689 Epoch 108 iteration 0116/0188: training loss 0.689 Epoch 108 iteration 0117/0188: training loss 0.688 Epoch 108 iteration 0118/0188: training loss 0.689 Epoch 108 iteration 0119/0188: training loss 0.689 Epoch 108 iteration 0120/0188: training loss 0.690 Epoch 108 iteration 0121/0188: training loss 0.690 Epoch 108 iteration 0122/0188: training loss 0.691 Epoch 108 iteration 0123/0188: training loss 0.690 Epoch 108 iteration 0124/0188: training loss 0.689 Epoch 108 iteration 0125/0188: training loss 0.689 Epoch 108 iteration 0126/0188: training loss 0.688 Epoch 108 iteration 0127/0188: training loss 0.688 Epoch 108 iteration 0128/0188: training loss 0.688 Epoch 108 iteration 0129/0188: training loss 0.689 Epoch 108 iteration 0130/0188: training loss 0.689 Epoch 108 iteration 0131/0188: training loss 0.688 Epoch 108 iteration 0132/0188: training loss 0.687 Epoch 108 iteration 0133/0188: training loss 0.686 Epoch 108 iteration 0134/0188: training loss 0.686 Epoch 108 iteration 0135/0188: training loss 0.686 Epoch 108 iteration 0136/0188: training loss 0.684 Epoch 108 iteration 0137/0188: training loss 0.684 Epoch 108 iteration 0138/0188: training loss 0.683 Epoch 108 iteration 0139/0188: training loss 0.683 Epoch 108 iteration 0140/0188: training loss 0.683 Epoch 108 iteration 0141/0188: training loss 0.683 Epoch 108 iteration 0142/0188: training loss 0.684 Epoch 108 iteration 0143/0188: training loss 0.684 Epoch 108 iteration 0144/0188: training loss 0.683 Epoch 108 iteration 0145/0188: training loss 0.683 Epoch 108 iteration 0146/0188: training loss 0.684 Epoch 108 iteration 0147/0188: training loss 0.683 Epoch 108 iteration 0148/0188: training loss 0.683 Epoch 108 iteration 0149/0188: training loss 0.683 Epoch 108 iteration 0150/0188: training loss 0.683 Epoch 108 iteration 0151/0188: training loss 0.682 Epoch 108 iteration 0152/0188: training loss 0.681 Epoch 108 iteration 0153/0188: training loss 0.681 Epoch 108 iteration 0154/0188: training loss 0.682 Epoch 108 iteration 0155/0188: training loss 0.682 Epoch 108 iteration 0156/0188: training loss 0.681 Epoch 108 iteration 0157/0188: training loss 0.680 Epoch 108 iteration 0158/0188: training loss 0.681 Epoch 108 iteration 0159/0188: training loss 0.680 Epoch 108 iteration 0160/0188: training loss 0.680 Epoch 108 iteration 0161/0188: training loss 0.680 Epoch 108 iteration 0162/0188: training loss 0.680 Epoch 108 iteration 0163/0188: training loss 0.679 Epoch 108 iteration 0164/0188: training loss 0.679 Epoch 108 iteration 0165/0188: training loss 0.678 Epoch 108 iteration 0166/0188: training loss 0.679 Epoch 108 iteration 0167/0188: training loss 0.679 Epoch 108 iteration 0168/0188: training loss 0.680 Epoch 108 iteration 0169/0188: training loss 0.680 Epoch 108 iteration 0170/0188: training loss 0.679 Epoch 108 iteration 0171/0188: training loss 0.679 Epoch 108 iteration 0172/0188: training loss 0.678 Epoch 108 iteration 0173/0188: training loss 0.677 Epoch 108 iteration 0174/0188: training loss 0.677 Epoch 108 iteration 0175/0188: training loss 0.676 Epoch 108 iteration 0176/0188: training loss 0.677 Epoch 108 iteration 0177/0188: training loss 0.677 Epoch 108 iteration 0178/0188: training loss 0.677 Epoch 108 iteration 0179/0188: training loss 0.678 Epoch 108 iteration 0180/0188: training loss 0.678 Epoch 108 iteration 0181/0188: training loss 0.679 Epoch 108 iteration 0182/0188: training loss 0.679 Epoch 108 iteration 0183/0188: training loss 0.679 Epoch 108 iteration 0184/0188: training loss 0.679 Epoch 108 iteration 0185/0188: training loss 0.679 Epoch 108 iteration 0186/0188: training loss 0.678 Epoch 108 validation pixAcc: 0.875, mIoU: 0.393 Epoch 109 iteration 0001/0187: training loss 0.654 Epoch 109 iteration 0002/0187: training loss 0.657 Epoch 109 iteration 0003/0187: training loss 0.656 Epoch 109 iteration 0004/0187: training loss 0.637 Epoch 109 iteration 0005/0187: training loss 0.629 Epoch 109 iteration 0006/0187: training loss 0.629 Epoch 109 iteration 0007/0187: training loss 0.641 Epoch 109 iteration 0008/0187: training loss 0.636 Epoch 109 iteration 0009/0187: training loss 0.634 Epoch 109 iteration 0010/0187: training loss 0.633 Epoch 109 iteration 0011/0187: training loss 0.650 Epoch 109 iteration 0012/0187: training loss 0.643 Epoch 109 iteration 0013/0187: training loss 0.640 Epoch 109 iteration 0014/0187: training loss 0.645 Epoch 109 iteration 0015/0187: training loss 0.643 Epoch 109 iteration 0016/0187: training loss 0.643 Epoch 109 iteration 0017/0187: training loss 0.650 Epoch 109 iteration 0018/0187: training loss 0.648 Epoch 109 iteration 0019/0187: training loss 0.646 Epoch 109 iteration 0020/0187: training loss 0.649 Epoch 109 iteration 0021/0187: training loss 0.649 Epoch 109 iteration 0022/0187: training loss 0.650 Epoch 109 iteration 0023/0187: training loss 0.649 Epoch 109 iteration 0024/0187: training loss 0.648 Epoch 109 iteration 0025/0187: training loss 0.652 Epoch 109 iteration 0026/0187: training loss 0.650 Epoch 109 iteration 0027/0187: training loss 0.653 Epoch 109 iteration 0028/0187: training loss 0.650 Epoch 109 iteration 0029/0187: training loss 0.651 Epoch 109 iteration 0030/0187: training loss 0.649 Epoch 109 iteration 0031/0187: training loss 0.654 Epoch 109 iteration 0032/0187: training loss 0.662 Epoch 109 iteration 0033/0187: training loss 0.680 Epoch 109 iteration 0034/0187: training loss 0.682 Epoch 109 iteration 0035/0187: training loss 0.681 Epoch 109 iteration 0036/0187: training loss 0.686 Epoch 109 iteration 0037/0187: training loss 0.692 Epoch 109 iteration 0038/0187: training loss 0.690 Epoch 109 iteration 0039/0187: training loss 0.693 Epoch 109 iteration 0040/0187: training loss 0.691 Epoch 109 iteration 0041/0187: training loss 0.698 Epoch 109 iteration 0042/0187: training loss 0.696 Epoch 109 iteration 0043/0187: training loss 0.701 Epoch 109 iteration 0044/0187: training loss 0.703 Epoch 109 iteration 0045/0187: training loss 0.706 Epoch 109 iteration 0046/0187: training loss 0.707 Epoch 109 iteration 0047/0187: training loss 0.705 Epoch 109 iteration 0048/0187: training loss 0.703 Epoch 109 iteration 0049/0187: training loss 0.705 Epoch 109 iteration 0050/0187: training loss 0.701 Epoch 109 iteration 0051/0187: training loss 0.701 Epoch 109 iteration 0052/0187: training loss 0.700 Epoch 109 iteration 0053/0187: training loss 0.701 Epoch 109 iteration 0054/0187: training loss 0.699 Epoch 109 iteration 0055/0187: training loss 0.699 Epoch 109 iteration 0056/0187: training loss 0.696 Epoch 109 iteration 0057/0187: training loss 0.695 Epoch 109 iteration 0058/0187: training loss 0.696 Epoch 109 iteration 0059/0187: training loss 0.697 Epoch 109 iteration 0060/0187: training loss 0.698 Epoch 109 iteration 0061/0187: training loss 0.702 Epoch 109 iteration 0062/0187: training loss 0.703 Epoch 109 iteration 0063/0187: training loss 0.701 Epoch 109 iteration 0064/0187: training loss 0.699 Epoch 109 iteration 0065/0187: training loss 0.700 Epoch 109 iteration 0066/0187: training loss 0.700 Epoch 109 iteration 0067/0187: training loss 0.700 Epoch 109 iteration 0068/0187: training loss 0.698 Epoch 109 iteration 0069/0187: training loss 0.697 Epoch 109 iteration 0070/0187: training loss 0.698 Epoch 109 iteration 0071/0187: training loss 0.699 Epoch 109 iteration 0072/0187: training loss 0.702 Epoch 109 iteration 0073/0187: training loss 0.701 Epoch 109 iteration 0074/0187: training loss 0.703 Epoch 109 iteration 0075/0187: training loss 0.703 Epoch 109 iteration 0076/0187: training loss 0.703 Epoch 109 iteration 0077/0187: training loss 0.701 Epoch 109 iteration 0078/0187: training loss 0.701 Epoch 109 iteration 0079/0187: training loss 0.702 Epoch 109 iteration 0080/0187: training loss 0.701 Epoch 109 iteration 0081/0187: training loss 0.697 Epoch 109 iteration 0082/0187: training loss 0.696 Epoch 109 iteration 0083/0187: training loss 0.696 Epoch 109 iteration 0084/0187: training loss 0.693 Epoch 109 iteration 0085/0187: training loss 0.694 Epoch 109 iteration 0086/0187: training loss 0.694 Epoch 109 iteration 0087/0187: training loss 0.693 Epoch 109 iteration 0088/0187: training loss 0.695 Epoch 109 iteration 0089/0187: training loss 0.695 Epoch 109 iteration 0090/0187: training loss 0.696 Epoch 109 iteration 0091/0187: training loss 0.696 Epoch 109 iteration 0092/0187: training loss 0.695 Epoch 109 iteration 0093/0187: training loss 0.694 Epoch 109 iteration 0094/0187: training loss 0.693 Epoch 109 iteration 0095/0187: training loss 0.692 Epoch 109 iteration 0096/0187: training loss 0.690 Epoch 109 iteration 0097/0187: training loss 0.690 Epoch 109 iteration 0098/0187: training loss 0.691 Epoch 109 iteration 0099/0187: training loss 0.691 Epoch 109 iteration 0100/0187: training loss 0.690 Epoch 109 iteration 0101/0187: training loss 0.689 Epoch 109 iteration 0102/0187: training loss 0.688 Epoch 109 iteration 0103/0187: training loss 0.686 Epoch 109 iteration 0104/0187: training loss 0.685 Epoch 109 iteration 0105/0187: training loss 0.685 Epoch 109 iteration 0106/0187: training loss 0.684 Epoch 109 iteration 0107/0187: training loss 0.684 Epoch 109 iteration 0108/0187: training loss 0.684 Epoch 109 iteration 0109/0187: training loss 0.685 Epoch 109 iteration 0110/0187: training loss 0.685 Epoch 109 iteration 0111/0187: training loss 0.684 Epoch 109 iteration 0112/0187: training loss 0.684 Epoch 109 iteration 0113/0187: training loss 0.683 Epoch 109 iteration 0114/0187: training loss 0.682 Epoch 109 iteration 0115/0187: training loss 0.681 Epoch 109 iteration 0116/0187: training loss 0.681 Epoch 109 iteration 0117/0187: training loss 0.681 Epoch 109 iteration 0118/0187: training loss 0.682 Epoch 109 iteration 0119/0187: training loss 0.681 Epoch 109 iteration 0120/0187: training loss 0.681 Epoch 109 iteration 0121/0187: training loss 0.681 Epoch 109 iteration 0122/0187: training loss 0.680 Epoch 109 iteration 0123/0187: training loss 0.680 Epoch 109 iteration 0124/0187: training loss 0.681 Epoch 109 iteration 0125/0187: training loss 0.682 Epoch 109 iteration 0126/0187: training loss 0.682 Epoch 109 iteration 0127/0187: training loss 0.681 Epoch 109 iteration 0128/0187: training loss 0.682 Epoch 109 iteration 0129/0187: training loss 0.681 Epoch 109 iteration 0130/0187: training loss 0.680 Epoch 109 iteration 0131/0187: training loss 0.680 Epoch 109 iteration 0132/0187: training loss 0.681 Epoch 109 iteration 0133/0187: training loss 0.681 Epoch 109 iteration 0134/0187: training loss 0.680 Epoch 109 iteration 0135/0187: training loss 0.680 Epoch 109 iteration 0136/0187: training loss 0.681 Epoch 109 iteration 0137/0187: training loss 0.680 Epoch 109 iteration 0138/0187: training loss 0.679 Epoch 109 iteration 0139/0187: training loss 0.678 Epoch 109 iteration 0140/0187: training loss 0.678 Epoch 109 iteration 0141/0187: training loss 0.679 Epoch 109 iteration 0142/0187: training loss 0.678 Epoch 109 iteration 0143/0187: training loss 0.677 Epoch 109 iteration 0144/0187: training loss 0.678 Epoch 109 iteration 0145/0187: training loss 0.677 Epoch 109 iteration 0146/0187: training loss 0.676 Epoch 109 iteration 0147/0187: training loss 0.677 Epoch 109 iteration 0148/0187: training loss 0.675 Epoch 109 iteration 0149/0187: training loss 0.675 Epoch 109 iteration 0150/0187: training loss 0.675 Epoch 109 iteration 0151/0187: training loss 0.676 Epoch 109 iteration 0152/0187: training loss 0.674 Epoch 109 iteration 0153/0187: training loss 0.673 Epoch 109 iteration 0154/0187: training loss 0.673 Epoch 109 iteration 0155/0187: training loss 0.674 Epoch 109 iteration 0156/0187: training loss 0.674 Epoch 109 iteration 0157/0187: training loss 0.674 Epoch 109 iteration 0158/0187: training loss 0.673 Epoch 109 iteration 0159/0187: training loss 0.673 Epoch 109 iteration 0160/0187: training loss 0.673 Epoch 109 iteration 0161/0187: training loss 0.673 Epoch 109 iteration 0162/0187: training loss 0.671 Epoch 109 iteration 0163/0187: training loss 0.671 Epoch 109 iteration 0164/0187: training loss 0.670 Epoch 109 iteration 0165/0187: training loss 0.670 Epoch 109 iteration 0166/0187: training loss 0.670 Epoch 109 iteration 0167/0187: training loss 0.669 Epoch 109 iteration 0168/0187: training loss 0.669 Epoch 109 iteration 0169/0187: training loss 0.672 Epoch 109 iteration 0170/0187: training loss 0.672 Epoch 109 iteration 0171/0187: training loss 0.673 Epoch 109 iteration 0172/0187: training loss 0.675 Epoch 109 iteration 0173/0187: training loss 0.674 Epoch 109 iteration 0174/0187: training loss 0.674 Epoch 109 iteration 0175/0187: training loss 0.674 Epoch 109 iteration 0176/0187: training loss 0.674 Epoch 109 iteration 0177/0187: training loss 0.674 Epoch 109 iteration 0178/0187: training loss 0.674 Epoch 109 iteration 0179/0187: training loss 0.675 Epoch 109 iteration 0180/0187: training loss 0.675 Epoch 109 iteration 0181/0187: training loss 0.676 Epoch 109 iteration 0182/0187: training loss 0.676 Epoch 109 iteration 0183/0187: training loss 0.677 Epoch 109 iteration 0184/0187: training loss 0.676 Epoch 109 iteration 0185/0187: training loss 0.677 Epoch 109 iteration 0186/0187: training loss 0.677 Epoch 109 iteration 0187/0187: training loss 0.678 Epoch 109 validation pixAcc: 0.875, mIoU: 0.392 Epoch 110 iteration 0001/0187: training loss 0.520 Epoch 110 iteration 0002/0187: training loss 0.537 Epoch 110 iteration 0003/0187: training loss 0.584 Epoch 110 iteration 0004/0187: training loss 0.593 Epoch 110 iteration 0005/0187: training loss 0.594 Epoch 110 iteration 0006/0187: training loss 0.591 Epoch 110 iteration 0007/0187: training loss 0.590 Epoch 110 iteration 0008/0187: training loss 0.595 Epoch 110 iteration 0009/0187: training loss 0.611 Epoch 110 iteration 0010/0187: training loss 0.621 Epoch 110 iteration 0011/0187: training loss 0.643 Epoch 110 iteration 0012/0187: training loss 0.653 Epoch 110 iteration 0013/0187: training loss 0.657 Epoch 110 iteration 0014/0187: training loss 0.651 Epoch 110 iteration 0015/0187: training loss 0.650 Epoch 110 iteration 0016/0187: training loss 0.655 Epoch 110 iteration 0017/0187: training loss 0.646 Epoch 110 iteration 0018/0187: training loss 0.655 Epoch 110 iteration 0019/0187: training loss 0.661 Epoch 110 iteration 0020/0187: training loss 0.662 Epoch 110 iteration 0021/0187: training loss 0.660 Epoch 110 iteration 0022/0187: training loss 0.659 Epoch 110 iteration 0023/0187: training loss 0.652 Epoch 110 iteration 0024/0187: training loss 0.648 Epoch 110 iteration 0025/0187: training loss 0.658 Epoch 110 iteration 0026/0187: training loss 0.661 Epoch 110 iteration 0027/0187: training loss 0.663 Epoch 110 iteration 0028/0187: training loss 0.663 Epoch 110 iteration 0029/0187: training loss 0.665 Epoch 110 iteration 0030/0187: training loss 0.667 Epoch 110 iteration 0031/0187: training loss 0.662 Epoch 110 iteration 0032/0187: training loss 0.665 Epoch 110 iteration 0033/0187: training loss 0.669 Epoch 110 iteration 0034/0187: training loss 0.667 Epoch 110 iteration 0035/0187: training loss 0.673 Epoch 110 iteration 0036/0187: training loss 0.677 Epoch 110 iteration 0037/0187: training loss 0.674 Epoch 110 iteration 0038/0187: training loss 0.670 Epoch 110 iteration 0039/0187: training loss 0.673 Epoch 110 iteration 0040/0187: training loss 0.670 Epoch 110 iteration 0041/0187: training loss 0.669 Epoch 110 iteration 0042/0187: training loss 0.668 Epoch 110 iteration 0043/0187: training loss 0.667 Epoch 110 iteration 0044/0187: training loss 0.664 Epoch 110 iteration 0045/0187: training loss 0.662 Epoch 110 iteration 0046/0187: training loss 0.660 Epoch 110 iteration 0047/0187: training loss 0.659 Epoch 110 iteration 0048/0187: training loss 0.659 Epoch 110 iteration 0049/0187: training loss 0.660 Epoch 110 iteration 0050/0187: training loss 0.661 Epoch 110 iteration 0051/0187: training loss 0.660 Epoch 110 iteration 0052/0187: training loss 0.659 Epoch 110 iteration 0053/0187: training loss 0.659 Epoch 110 iteration 0054/0187: training loss 0.658 Epoch 110 iteration 0055/0187: training loss 0.658 Epoch 110 iteration 0056/0187: training loss 0.660 Epoch 110 iteration 0057/0187: training loss 0.661 Epoch 110 iteration 0058/0187: training loss 0.665 Epoch 110 iteration 0059/0187: training loss 0.664 Epoch 110 iteration 0060/0187: training loss 0.666 Epoch 110 iteration 0061/0187: training loss 0.673 Epoch 110 iteration 0062/0187: training loss 0.673 Epoch 110 iteration 0063/0187: training loss 0.672 Epoch 110 iteration 0064/0187: training loss 0.672 Epoch 110 iteration 0065/0187: training loss 0.671 Epoch 110 iteration 0066/0187: training loss 0.671 Epoch 110 iteration 0067/0187: training loss 0.673 Epoch 110 iteration 0068/0187: training loss 0.672 Epoch 110 iteration 0069/0187: training loss 0.674 Epoch 110 iteration 0070/0187: training loss 0.674 Epoch 110 iteration 0071/0187: training loss 0.676 Epoch 110 iteration 0072/0187: training loss 0.676 Epoch 110 iteration 0073/0187: training loss 0.677 Epoch 110 iteration 0074/0187: training loss 0.679 Epoch 110 iteration 0075/0187: training loss 0.684 Epoch 110 iteration 0076/0187: training loss 0.682 Epoch 110 iteration 0077/0187: training loss 0.682 Epoch 110 iteration 0078/0187: training loss 0.683 Epoch 110 iteration 0079/0187: training loss 0.682 Epoch 110 iteration 0080/0187: training loss 0.683 Epoch 110 iteration 0081/0187: training loss 0.687 Epoch 110 iteration 0082/0187: training loss 0.686 Epoch 110 iteration 0083/0187: training loss 0.686 Epoch 110 iteration 0084/0187: training loss 0.686 Epoch 110 iteration 0085/0187: training loss 0.685 Epoch 110 iteration 0086/0187: training loss 0.686 Epoch 110 iteration 0087/0187: training loss 0.686 Epoch 110 iteration 0088/0187: training loss 0.687 Epoch 110 iteration 0089/0187: training loss 0.688 Epoch 110 iteration 0090/0187: training loss 0.687 Epoch 110 iteration 0091/0188: training loss 0.687 Epoch 110 iteration 0092/0188: training loss 0.687 Epoch 110 iteration 0093/0188: training loss 0.688 Epoch 110 iteration 0094/0188: training loss 0.688 Epoch 110 iteration 0095/0188: training loss 0.688 Epoch 110 iteration 0096/0188: training loss 0.689 Epoch 110 iteration 0097/0188: training loss 0.689 Epoch 110 iteration 0098/0188: training loss 0.689 Epoch 110 iteration 0099/0188: training loss 0.691 Epoch 110 iteration 0100/0188: training loss 0.691 Epoch 110 iteration 0101/0188: training loss 0.690 Epoch 110 iteration 0102/0188: training loss 0.690 Epoch 110 iteration 0103/0188: training loss 0.690 Epoch 110 iteration 0104/0188: training loss 0.689 Epoch 110 iteration 0105/0188: training loss 0.691 Epoch 110 iteration 0106/0188: training loss 0.690 Epoch 110 iteration 0107/0188: training loss 0.695 Epoch 110 iteration 0108/0188: training loss 0.694 Epoch 110 iteration 0109/0188: training loss 0.694 Epoch 110 iteration 0110/0188: training loss 0.692 Epoch 110 iteration 0111/0188: training loss 0.693 Epoch 110 iteration 0112/0188: training loss 0.692 Epoch 110 iteration 0113/0188: training loss 0.692 Epoch 110 iteration 0114/0188: training loss 0.692 Epoch 110 iteration 0115/0188: training loss 0.692 Epoch 110 iteration 0116/0188: training loss 0.692 Epoch 110 iteration 0117/0188: training loss 0.693 Epoch 110 iteration 0118/0188: training loss 0.693 Epoch 110 iteration 0119/0188: training loss 0.692 Epoch 110 iteration 0120/0188: training loss 0.695 Epoch 110 iteration 0121/0188: training loss 0.695 Epoch 110 iteration 0122/0188: training loss 0.695 Epoch 110 iteration 0123/0188: training loss 0.694 Epoch 110 iteration 0124/0188: training loss 0.694 Epoch 110 iteration 0125/0188: training loss 0.694 Epoch 110 iteration 0126/0188: training loss 0.694 Epoch 110 iteration 0127/0188: training loss 0.693 Epoch 110 iteration 0128/0188: training loss 0.696 Epoch 110 iteration 0129/0188: training loss 0.695 Epoch 110 iteration 0130/0188: training loss 0.695 Epoch 110 iteration 0131/0188: training loss 0.695 Epoch 110 iteration 0132/0188: training loss 0.694 Epoch 110 iteration 0133/0188: training loss 0.693 Epoch 110 iteration 0134/0188: training loss 0.694 Epoch 110 iteration 0135/0188: training loss 0.694 Epoch 110 iteration 0136/0188: training loss 0.694 Epoch 110 iteration 0137/0188: training loss 0.692 Epoch 110 iteration 0138/0188: training loss 0.693 Epoch 110 iteration 0139/0188: training loss 0.692 Epoch 110 iteration 0140/0188: training loss 0.691 Epoch 110 iteration 0141/0188: training loss 0.691 Epoch 110 iteration 0142/0188: training loss 0.690 Epoch 110 iteration 0143/0188: training loss 0.689 Epoch 110 iteration 0144/0188: training loss 0.689 Epoch 110 iteration 0145/0188: training loss 0.688 Epoch 110 iteration 0146/0188: training loss 0.687 Epoch 110 iteration 0147/0188: training loss 0.688 Epoch 110 iteration 0148/0188: training loss 0.689 Epoch 110 iteration 0149/0188: training loss 0.689 Epoch 110 iteration 0150/0188: training loss 0.688 Epoch 110 iteration 0151/0188: training loss 0.687 Epoch 110 iteration 0152/0188: training loss 0.687 Epoch 110 iteration 0153/0188: training loss 0.687 Epoch 110 iteration 0154/0188: training loss 0.686 Epoch 110 iteration 0155/0188: training loss 0.686 Epoch 110 iteration 0156/0188: training loss 0.685 Epoch 110 iteration 0157/0188: training loss 0.685 Epoch 110 iteration 0158/0188: training loss 0.686 Epoch 110 iteration 0159/0188: training loss 0.685 Epoch 110 iteration 0160/0188: training loss 0.685 Epoch 110 iteration 0161/0188: training loss 0.685 Epoch 110 iteration 0162/0188: training loss 0.685 Epoch 110 iteration 0163/0188: training loss 0.684 Epoch 110 iteration 0164/0188: training loss 0.684 Epoch 110 iteration 0165/0188: training loss 0.683 Epoch 110 iteration 0166/0188: training loss 0.683 Epoch 110 iteration 0167/0188: training loss 0.682 Epoch 110 iteration 0168/0188: training loss 0.682 Epoch 110 iteration 0169/0188: training loss 0.681 Epoch 110 iteration 0170/0188: training loss 0.681 Epoch 110 iteration 0171/0188: training loss 0.680 Epoch 110 iteration 0172/0188: training loss 0.680 Epoch 110 iteration 0173/0188: training loss 0.680 Epoch 110 iteration 0174/0188: training loss 0.680 Epoch 110 iteration 0175/0188: training loss 0.680 Epoch 110 iteration 0176/0188: training loss 0.680 Epoch 110 iteration 0177/0188: training loss 0.679 Epoch 110 iteration 0178/0188: training loss 0.679 Epoch 110 iteration 0179/0188: training loss 0.679 Epoch 110 iteration 0180/0188: training loss 0.678 Epoch 110 iteration 0181/0188: training loss 0.677 Epoch 110 iteration 0182/0188: training loss 0.677 Epoch 110 iteration 0183/0188: training loss 0.677 Epoch 110 iteration 0184/0188: training loss 0.677 Epoch 110 iteration 0185/0188: training loss 0.678 Epoch 110 iteration 0186/0188: training loss 0.679 Epoch 110 validation pixAcc: 0.876, mIoU: 0.393 Epoch 111 iteration 0001/0187: training loss 0.732 Epoch 111 iteration 0002/0187: training loss 0.704 Epoch 111 iteration 0003/0187: training loss 0.746 Epoch 111 iteration 0004/0187: training loss 0.722 Epoch 111 iteration 0005/0187: training loss 0.726 Epoch 111 iteration 0006/0187: training loss 0.702 Epoch 111 iteration 0007/0187: training loss 0.749 Epoch 111 iteration 0008/0187: training loss 0.732 Epoch 111 iteration 0009/0187: training loss 0.720 Epoch 111 iteration 0010/0187: training loss 0.718 Epoch 111 iteration 0011/0187: training loss 0.724 Epoch 111 iteration 0012/0187: training loss 0.726 Epoch 111 iteration 0013/0187: training loss 0.722 Epoch 111 iteration 0014/0187: training loss 0.726 Epoch 111 iteration 0015/0187: training loss 0.736 Epoch 111 iteration 0016/0187: training loss 0.745 Epoch 111 iteration 0017/0187: training loss 0.740 Epoch 111 iteration 0018/0187: training loss 0.739 Epoch 111 iteration 0019/0187: training loss 0.734 Epoch 111 iteration 0020/0187: training loss 0.734 Epoch 111 iteration 0021/0187: training loss 0.731 Epoch 111 iteration 0022/0187: training loss 0.732 Epoch 111 iteration 0023/0187: training loss 0.736 Epoch 111 iteration 0024/0187: training loss 0.736 Epoch 111 iteration 0025/0187: training loss 0.735 Epoch 111 iteration 0026/0187: training loss 0.738 Epoch 111 iteration 0027/0187: training loss 0.733 Epoch 111 iteration 0028/0187: training loss 0.733 Epoch 111 iteration 0029/0187: training loss 0.734 Epoch 111 iteration 0030/0187: training loss 0.729 Epoch 111 iteration 0031/0187: training loss 0.728 Epoch 111 iteration 0032/0187: training loss 0.721 Epoch 111 iteration 0033/0187: training loss 0.719 Epoch 111 iteration 0034/0187: training loss 0.720 Epoch 111 iteration 0035/0187: training loss 0.721 Epoch 111 iteration 0036/0187: training loss 0.719 Epoch 111 iteration 0037/0187: training loss 0.721 Epoch 111 iteration 0038/0187: training loss 0.718 Epoch 111 iteration 0039/0187: training loss 0.720 Epoch 111 iteration 0040/0187: training loss 0.718 Epoch 111 iteration 0041/0187: training loss 0.719 Epoch 111 iteration 0042/0187: training loss 0.716 Epoch 111 iteration 0043/0187: training loss 0.714 Epoch 111 iteration 0044/0187: training loss 0.714 Epoch 111 iteration 0045/0187: training loss 0.713 Epoch 111 iteration 0046/0187: training loss 0.709 Epoch 111 iteration 0047/0187: training loss 0.705 Epoch 111 iteration 0048/0187: training loss 0.703 Epoch 111 iteration 0049/0187: training loss 0.705 Epoch 111 iteration 0050/0187: training loss 0.705 Epoch 111 iteration 0051/0187: training loss 0.706 Epoch 111 iteration 0052/0187: training loss 0.704 Epoch 111 iteration 0053/0187: training loss 0.703 Epoch 111 iteration 0054/0187: training loss 0.702 Epoch 111 iteration 0055/0187: training loss 0.700 Epoch 111 iteration 0056/0187: training loss 0.698 Epoch 111 iteration 0057/0187: training loss 0.698 Epoch 111 iteration 0058/0187: training loss 0.697 Epoch 111 iteration 0059/0187: training loss 0.696 Epoch 111 iteration 0060/0187: training loss 0.698 Epoch 111 iteration 0061/0187: training loss 0.699 Epoch 111 iteration 0062/0187: training loss 0.699 Epoch 111 iteration 0063/0187: training loss 0.702 Epoch 111 iteration 0064/0187: training loss 0.702 Epoch 111 iteration 0065/0187: training loss 0.704 Epoch 111 iteration 0066/0187: training loss 0.701 Epoch 111 iteration 0067/0187: training loss 0.701 Epoch 111 iteration 0068/0187: training loss 0.700 Epoch 111 iteration 0069/0187: training loss 0.698 Epoch 111 iteration 0070/0187: training loss 0.697 Epoch 111 iteration 0071/0187: training loss 0.695 Epoch 111 iteration 0072/0187: training loss 0.695 Epoch 111 iteration 0073/0187: training loss 0.695 Epoch 111 iteration 0074/0187: training loss 0.695 Epoch 111 iteration 0075/0187: training loss 0.694 Epoch 111 iteration 0076/0187: training loss 0.694 Epoch 111 iteration 0077/0187: training loss 0.693 Epoch 111 iteration 0078/0187: training loss 0.694 Epoch 111 iteration 0079/0187: training loss 0.693 Epoch 111 iteration 0080/0187: training loss 0.692 Epoch 111 iteration 0081/0187: training loss 0.693 Epoch 111 iteration 0082/0187: training loss 0.694 Epoch 111 iteration 0083/0187: training loss 0.693 Epoch 111 iteration 0084/0187: training loss 0.694 Epoch 111 iteration 0085/0187: training loss 0.693 Epoch 111 iteration 0086/0187: training loss 0.692 Epoch 111 iteration 0087/0187: training loss 0.692 Epoch 111 iteration 0088/0187: training loss 0.693 Epoch 111 iteration 0089/0187: training loss 0.691 Epoch 111 iteration 0090/0187: training loss 0.691 Epoch 111 iteration 0091/0187: training loss 0.692 Epoch 111 iteration 0092/0187: training loss 0.693 Epoch 111 iteration 0093/0187: training loss 0.694 Epoch 111 iteration 0094/0187: training loss 0.694 Epoch 111 iteration 0095/0187: training loss 0.693 Epoch 111 iteration 0096/0187: training loss 0.691 Epoch 111 iteration 0097/0187: training loss 0.690 Epoch 111 iteration 0098/0187: training loss 0.690 Epoch 111 iteration 0099/0187: training loss 0.691 Epoch 111 iteration 0100/0187: training loss 0.690 Epoch 111 iteration 0101/0187: training loss 0.689 Epoch 111 iteration 0102/0187: training loss 0.688 Epoch 111 iteration 0103/0187: training loss 0.688 Epoch 111 iteration 0104/0187: training loss 0.689 Epoch 111 iteration 0105/0187: training loss 0.689 Epoch 111 iteration 0106/0187: training loss 0.688 Epoch 111 iteration 0107/0187: training loss 0.690 Epoch 111 iteration 0108/0187: training loss 0.693 Epoch 111 iteration 0109/0187: training loss 0.693 Epoch 111 iteration 0110/0187: training loss 0.693 Epoch 111 iteration 0111/0187: training loss 0.693 Epoch 111 iteration 0112/0187: training loss 0.694 Epoch 111 iteration 0113/0187: training loss 0.693 Epoch 111 iteration 0114/0187: training loss 0.693 Epoch 111 iteration 0115/0187: training loss 0.695 Epoch 111 iteration 0116/0187: training loss 0.694 Epoch 111 iteration 0117/0187: training loss 0.693 Epoch 111 iteration 0118/0187: training loss 0.693 Epoch 111 iteration 0119/0187: training loss 0.694 Epoch 111 iteration 0120/0187: training loss 0.694 Epoch 111 iteration 0121/0187: training loss 0.695 Epoch 111 iteration 0122/0187: training loss 0.697 Epoch 111 iteration 0123/0187: training loss 0.697 Epoch 111 iteration 0124/0187: training loss 0.698 Epoch 111 iteration 0125/0187: training loss 0.697 Epoch 111 iteration 0126/0187: training loss 0.696 Epoch 111 iteration 0127/0187: training loss 0.697 Epoch 111 iteration 0128/0187: training loss 0.697 Epoch 111 iteration 0129/0187: training loss 0.696 Epoch 111 iteration 0130/0187: training loss 0.696 Epoch 111 iteration 0131/0187: training loss 0.696 Epoch 111 iteration 0132/0187: training loss 0.695 Epoch 111 iteration 0133/0187: training loss 0.695 Epoch 111 iteration 0134/0187: training loss 0.695 Epoch 111 iteration 0135/0187: training loss 0.694 Epoch 111 iteration 0136/0187: training loss 0.693 Epoch 111 iteration 0137/0187: training loss 0.693 Epoch 111 iteration 0138/0187: training loss 0.694 Epoch 111 iteration 0139/0187: training loss 0.694 Epoch 111 iteration 0140/0187: training loss 0.693 Epoch 111 iteration 0141/0187: training loss 0.694 Epoch 111 iteration 0142/0187: training loss 0.693 Epoch 111 iteration 0143/0187: training loss 0.693 Epoch 111 iteration 0144/0187: training loss 0.693 Epoch 111 iteration 0145/0187: training loss 0.692 Epoch 111 iteration 0146/0187: training loss 0.691 Epoch 111 iteration 0147/0187: training loss 0.690 Epoch 111 iteration 0148/0187: training loss 0.691 Epoch 111 iteration 0149/0187: training loss 0.691 Epoch 111 iteration 0150/0187: training loss 0.693 Epoch 111 iteration 0151/0187: training loss 0.693 Epoch 111 iteration 0152/0187: training loss 0.693 Epoch 111 iteration 0153/0187: training loss 0.693 Epoch 111 iteration 0154/0187: training loss 0.692 Epoch 111 iteration 0155/0187: training loss 0.693 Epoch 111 iteration 0156/0187: training loss 0.694 Epoch 111 iteration 0157/0187: training loss 0.694 Epoch 111 iteration 0158/0187: training loss 0.693 Epoch 111 iteration 0159/0187: training loss 0.693 Epoch 111 iteration 0160/0187: training loss 0.693 Epoch 111 iteration 0161/0187: training loss 0.694 Epoch 111 iteration 0162/0187: training loss 0.694 Epoch 111 iteration 0163/0187: training loss 0.694 Epoch 111 iteration 0164/0187: training loss 0.693 Epoch 111 iteration 0165/0187: training loss 0.694 Epoch 111 iteration 0166/0187: training loss 0.694 Epoch 111 iteration 0167/0187: training loss 0.693 Epoch 111 iteration 0168/0187: training loss 0.693 Epoch 111 iteration 0169/0187: training loss 0.692 Epoch 111 iteration 0170/0187: training loss 0.693 Epoch 111 iteration 0171/0187: training loss 0.692 Epoch 111 iteration 0172/0187: training loss 0.691 Epoch 111 iteration 0173/0187: training loss 0.691 Epoch 111 iteration 0174/0187: training loss 0.690 Epoch 111 iteration 0175/0187: training loss 0.690 Epoch 111 iteration 0176/0187: training loss 0.690 Epoch 111 iteration 0177/0187: training loss 0.689 Epoch 111 iteration 0178/0187: training loss 0.689 Epoch 111 iteration 0179/0187: training loss 0.689 Epoch 111 iteration 0180/0187: training loss 0.689 Epoch 111 iteration 0181/0187: training loss 0.688 Epoch 111 iteration 0182/0187: training loss 0.689 Epoch 111 iteration 0183/0187: training loss 0.688 Epoch 111 iteration 0184/0187: training loss 0.687 Epoch 111 iteration 0185/0187: training loss 0.688 Epoch 111 iteration 0186/0187: training loss 0.688 Epoch 111 iteration 0187/0187: training loss 0.689 Epoch 111 validation pixAcc: 0.875, mIoU: 0.393 Epoch 112 iteration 0001/0187: training loss 0.701 Epoch 112 iteration 0002/0187: training loss 0.778 Epoch 112 iteration 0003/0187: training loss 0.787 Epoch 112 iteration 0004/0187: training loss 0.773 Epoch 112 iteration 0005/0187: training loss 0.763 Epoch 112 iteration 0006/0187: training loss 0.754 Epoch 112 iteration 0007/0187: training loss 0.759 Epoch 112 iteration 0008/0187: training loss 0.737 Epoch 112 iteration 0009/0187: training loss 0.725 Epoch 112 iteration 0010/0187: training loss 0.709 Epoch 112 iteration 0011/0187: training loss 0.704 Epoch 112 iteration 0012/0187: training loss 0.711 Epoch 112 iteration 0013/0187: training loss 0.699 Epoch 112 iteration 0014/0187: training loss 0.702 Epoch 112 iteration 0015/0187: training loss 0.705 Epoch 112 iteration 0016/0187: training loss 0.699 Epoch 112 iteration 0017/0187: training loss 0.701 Epoch 112 iteration 0018/0187: training loss 0.691 Epoch 112 iteration 0019/0187: training loss 0.687 Epoch 112 iteration 0020/0187: training loss 0.688 Epoch 112 iteration 0021/0187: training loss 0.692 Epoch 112 iteration 0022/0187: training loss 0.683 Epoch 112 iteration 0023/0187: training loss 0.691 Epoch 112 iteration 0024/0187: training loss 0.686 Epoch 112 iteration 0025/0187: training loss 0.689 Epoch 112 iteration 0026/0187: training loss 0.690 Epoch 112 iteration 0027/0187: training loss 0.701 Epoch 112 iteration 0028/0187: training loss 0.702 Epoch 112 iteration 0029/0187: training loss 0.698 Epoch 112 iteration 0030/0187: training loss 0.694 Epoch 112 iteration 0031/0187: training loss 0.697 Epoch 112 iteration 0032/0187: training loss 0.693 Epoch 112 iteration 0033/0187: training loss 0.687 Epoch 112 iteration 0034/0187: training loss 0.684 Epoch 112 iteration 0035/0187: training loss 0.682 Epoch 112 iteration 0036/0187: training loss 0.681 Epoch 112 iteration 0037/0187: training loss 0.681 Epoch 112 iteration 0038/0187: training loss 0.678 Epoch 112 iteration 0039/0187: training loss 0.678 Epoch 112 iteration 0040/0187: training loss 0.679 Epoch 112 iteration 0041/0187: training loss 0.677 Epoch 112 iteration 0042/0187: training loss 0.679 Epoch 112 iteration 0043/0187: training loss 0.678 Epoch 112 iteration 0044/0187: training loss 0.677 Epoch 112 iteration 0045/0187: training loss 0.677 Epoch 112 iteration 0046/0187: training loss 0.674 Epoch 112 iteration 0047/0187: training loss 0.675 Epoch 112 iteration 0048/0187: training loss 0.674 Epoch 112 iteration 0049/0187: training loss 0.674 Epoch 112 iteration 0050/0187: training loss 0.672 Epoch 112 iteration 0051/0187: training loss 0.671 Epoch 112 iteration 0052/0187: training loss 0.671 Epoch 112 iteration 0053/0187: training loss 0.676 Epoch 112 iteration 0054/0187: training loss 0.674 Epoch 112 iteration 0055/0187: training loss 0.673 Epoch 112 iteration 0056/0187: training loss 0.670 Epoch 112 iteration 0057/0187: training loss 0.669 Epoch 112 iteration 0058/0187: training loss 0.669 Epoch 112 iteration 0059/0187: training loss 0.669 Epoch 112 iteration 0060/0187: training loss 0.672 Epoch 112 iteration 0061/0187: training loss 0.671 Epoch 112 iteration 0062/0187: training loss 0.672 Epoch 112 iteration 0063/0187: training loss 0.672 Epoch 112 iteration 0064/0187: training loss 0.676 Epoch 112 iteration 0065/0187: training loss 0.674 Epoch 112 iteration 0066/0187: training loss 0.674 Epoch 112 iteration 0067/0187: training loss 0.673 Epoch 112 iteration 0068/0187: training loss 0.671 Epoch 112 iteration 0069/0187: training loss 0.668 Epoch 112 iteration 0070/0187: training loss 0.670 Epoch 112 iteration 0071/0187: training loss 0.672 Epoch 112 iteration 0072/0187: training loss 0.671 Epoch 112 iteration 0073/0187: training loss 0.672 Epoch 112 iteration 0074/0187: training loss 0.670 Epoch 112 iteration 0075/0187: training loss 0.669 Epoch 112 iteration 0076/0187: training loss 0.668 Epoch 112 iteration 0077/0187: training loss 0.667 Epoch 112 iteration 0078/0187: training loss 0.667 Epoch 112 iteration 0079/0187: training loss 0.667 Epoch 112 iteration 0080/0187: training loss 0.668 Epoch 112 iteration 0081/0187: training loss 0.670 Epoch 112 iteration 0082/0187: training loss 0.670 Epoch 112 iteration 0083/0187: training loss 0.670 Epoch 112 iteration 0084/0187: training loss 0.670 Epoch 112 iteration 0085/0187: training loss 0.672 Epoch 112 iteration 0086/0187: training loss 0.673 Epoch 112 iteration 0087/0187: training loss 0.675 Epoch 112 iteration 0088/0187: training loss 0.674 Epoch 112 iteration 0089/0187: training loss 0.673 Epoch 112 iteration 0090/0187: training loss 0.672 Epoch 112 iteration 0091/0188: training loss 0.673 Epoch 112 iteration 0092/0188: training loss 0.673 Epoch 112 iteration 0093/0188: training loss 0.673 Epoch 112 iteration 0094/0188: training loss 0.672 Epoch 112 iteration 0095/0188: training loss 0.671 Epoch 112 iteration 0096/0188: training loss 0.671 Epoch 112 iteration 0097/0188: training loss 0.672 Epoch 112 iteration 0098/0188: training loss 0.673 Epoch 112 iteration 0099/0188: training loss 0.674 Epoch 112 iteration 0100/0188: training loss 0.673 Epoch 112 iteration 0101/0188: training loss 0.673 Epoch 112 iteration 0102/0188: training loss 0.674 Epoch 112 iteration 0103/0188: training loss 0.676 Epoch 112 iteration 0104/0188: training loss 0.675 Epoch 112 iteration 0105/0188: training loss 0.674 Epoch 112 iteration 0106/0188: training loss 0.675 Epoch 112 iteration 0107/0188: training loss 0.673 Epoch 112 iteration 0108/0188: training loss 0.674 Epoch 112 iteration 0109/0188: training loss 0.675 Epoch 112 iteration 0110/0188: training loss 0.675 Epoch 112 iteration 0111/0188: training loss 0.675 Epoch 112 iteration 0112/0188: training loss 0.676 Epoch 112 iteration 0113/0188: training loss 0.675 Epoch 112 iteration 0114/0188: training loss 0.676 Epoch 112 iteration 0115/0188: training loss 0.675 Epoch 112 iteration 0116/0188: training loss 0.676 Epoch 112 iteration 0117/0188: training loss 0.676 Epoch 112 iteration 0118/0188: training loss 0.676 Epoch 112 iteration 0119/0188: training loss 0.675 Epoch 112 iteration 0120/0188: training loss 0.675 Epoch 112 iteration 0121/0188: training loss 0.676 Epoch 112 iteration 0122/0188: training loss 0.679 Epoch 112 iteration 0123/0188: training loss 0.678 Epoch 112 iteration 0124/0188: training loss 0.678 Epoch 112 iteration 0125/0188: training loss 0.678 Epoch 112 iteration 0126/0188: training loss 0.677 Epoch 112 iteration 0127/0188: training loss 0.676 Epoch 112 iteration 0128/0188: training loss 0.676 Epoch 112 iteration 0129/0188: training loss 0.676 Epoch 112 iteration 0130/0188: training loss 0.677 Epoch 112 iteration 0131/0188: training loss 0.676 Epoch 112 iteration 0132/0188: training loss 0.675 Epoch 112 iteration 0133/0188: training loss 0.675 Epoch 112 iteration 0134/0188: training loss 0.676 Epoch 112 iteration 0135/0188: training loss 0.675 Epoch 112 iteration 0136/0188: training loss 0.676 Epoch 112 iteration 0137/0188: training loss 0.677 Epoch 112 iteration 0138/0188: training loss 0.676 Epoch 112 iteration 0139/0188: training loss 0.675 Epoch 112 iteration 0140/0188: training loss 0.675 Epoch 112 iteration 0141/0188: training loss 0.676 Epoch 112 iteration 0142/0188: training loss 0.676 Epoch 112 iteration 0143/0188: training loss 0.675 Epoch 112 iteration 0144/0188: training loss 0.676 Epoch 112 iteration 0145/0188: training loss 0.676 Epoch 112 iteration 0146/0188: training loss 0.675 Epoch 112 iteration 0147/0188: training loss 0.675 Epoch 112 iteration 0148/0188: training loss 0.675 Epoch 112 iteration 0149/0188: training loss 0.675 Epoch 112 iteration 0150/0188: training loss 0.676 Epoch 112 iteration 0151/0188: training loss 0.676 Epoch 112 iteration 0152/0188: training loss 0.676 Epoch 112 iteration 0153/0188: training loss 0.676 Epoch 112 iteration 0154/0188: training loss 0.676 Epoch 112 iteration 0155/0188: training loss 0.676 Epoch 112 iteration 0156/0188: training loss 0.676 Epoch 112 iteration 0157/0188: training loss 0.676 Epoch 112 iteration 0158/0188: training loss 0.676 Epoch 112 iteration 0159/0188: training loss 0.676 Epoch 112 iteration 0160/0188: training loss 0.676 Epoch 112 iteration 0161/0188: training loss 0.675 Epoch 112 iteration 0162/0188: training loss 0.676 Epoch 112 iteration 0163/0188: training loss 0.677 Epoch 112 iteration 0164/0188: training loss 0.677 Epoch 112 iteration 0165/0188: training loss 0.677 Epoch 112 iteration 0166/0188: training loss 0.677 Epoch 112 iteration 0167/0188: training loss 0.677 Epoch 112 iteration 0168/0188: training loss 0.677 Epoch 112 iteration 0169/0188: training loss 0.678 Epoch 112 iteration 0170/0188: training loss 0.679 Epoch 112 iteration 0171/0188: training loss 0.679 Epoch 112 iteration 0172/0188: training loss 0.679 Epoch 112 iteration 0173/0188: training loss 0.678 Epoch 112 iteration 0174/0188: training loss 0.678 Epoch 112 iteration 0175/0188: training loss 0.678 Epoch 112 iteration 0176/0188: training loss 0.680 Epoch 112 iteration 0177/0188: training loss 0.680 Epoch 112 iteration 0178/0188: training loss 0.679 Epoch 112 iteration 0179/0188: training loss 0.679 Epoch 112 iteration 0180/0188: training loss 0.679 Epoch 112 iteration 0181/0188: training loss 0.679 Epoch 112 iteration 0182/0188: training loss 0.679 Epoch 112 iteration 0183/0188: training loss 0.679 Epoch 112 iteration 0184/0188: training loss 0.678 Epoch 112 iteration 0185/0188: training loss 0.679 Epoch 112 iteration 0186/0188: training loss 0.678 Epoch 112 validation pixAcc: 0.875, mIoU: 0.393 Epoch 113 iteration 0001/0187: training loss 0.617 Epoch 113 iteration 0002/0187: training loss 0.606 Epoch 113 iteration 0003/0187: training loss 0.612 Epoch 113 iteration 0004/0187: training loss 0.617 Epoch 113 iteration 0005/0187: training loss 0.609 Epoch 113 iteration 0006/0187: training loss 0.602 Epoch 113 iteration 0007/0187: training loss 0.611 Epoch 113 iteration 0008/0187: training loss 0.621 Epoch 113 iteration 0009/0187: training loss 0.625 Epoch 113 iteration 0010/0187: training loss 0.631 Epoch 113 iteration 0011/0187: training loss 0.638 Epoch 113 iteration 0012/0187: training loss 0.633 Epoch 113 iteration 0013/0187: training loss 0.645 Epoch 113 iteration 0014/0187: training loss 0.651 Epoch 113 iteration 0015/0187: training loss 0.659 Epoch 113 iteration 0016/0187: training loss 0.659 Epoch 113 iteration 0017/0187: training loss 0.660 Epoch 113 iteration 0018/0187: training loss 0.661 Epoch 113 iteration 0019/0187: training loss 0.672 Epoch 113 iteration 0020/0187: training loss 0.667 Epoch 113 iteration 0021/0187: training loss 0.666 Epoch 113 iteration 0022/0187: training loss 0.679 Epoch 113 iteration 0023/0187: training loss 0.675 Epoch 113 iteration 0024/0187: training loss 0.676 Epoch 113 iteration 0025/0187: training loss 0.679 Epoch 113 iteration 0026/0187: training loss 0.678 Epoch 113 iteration 0027/0187: training loss 0.677 Epoch 113 iteration 0028/0187: training loss 0.681 Epoch 113 iteration 0029/0187: training loss 0.684 Epoch 113 iteration 0030/0187: training loss 0.685 Epoch 113 iteration 0031/0187: training loss 0.690 Epoch 113 iteration 0032/0187: training loss 0.690 Epoch 113 iteration 0033/0187: training loss 0.692 Epoch 113 iteration 0034/0187: training loss 0.691 Epoch 113 iteration 0035/0187: training loss 0.688 Epoch 113 iteration 0036/0187: training loss 0.690 Epoch 113 iteration 0037/0187: training loss 0.688 Epoch 113 iteration 0038/0187: training loss 0.690 Epoch 113 iteration 0039/0187: training loss 0.691 Epoch 113 iteration 0040/0187: training loss 0.691 Epoch 113 iteration 0041/0187: training loss 0.690 Epoch 113 iteration 0042/0187: training loss 0.688 Epoch 113 iteration 0043/0187: training loss 0.688 Epoch 113 iteration 0044/0187: training loss 0.688 Epoch 113 iteration 0045/0187: training loss 0.685 Epoch 113 iteration 0046/0187: training loss 0.684 Epoch 113 iteration 0047/0187: training loss 0.682 Epoch 113 iteration 0048/0187: training loss 0.681 Epoch 113 iteration 0049/0187: training loss 0.679 Epoch 113 iteration 0050/0187: training loss 0.677 Epoch 113 iteration 0051/0187: training loss 0.677 Epoch 113 iteration 0052/0187: training loss 0.678 Epoch 113 iteration 0053/0187: training loss 0.679 Epoch 113 iteration 0054/0187: training loss 0.682 Epoch 113 iteration 0055/0187: training loss 0.680 Epoch 113 iteration 0056/0187: training loss 0.680 Epoch 113 iteration 0057/0187: training loss 0.681 Epoch 113 iteration 0058/0187: training loss 0.681 Epoch 113 iteration 0059/0187: training loss 0.680 Epoch 113 iteration 0060/0187: training loss 0.680 Epoch 113 iteration 0061/0187: training loss 0.680 Epoch 113 iteration 0062/0187: training loss 0.681 Epoch 113 iteration 0063/0187: training loss 0.684 Epoch 113 iteration 0064/0187: training loss 0.686 Epoch 113 iteration 0065/0187: training loss 0.682 Epoch 113 iteration 0066/0187: training loss 0.685 Epoch 113 iteration 0067/0187: training loss 0.686 Epoch 113 iteration 0068/0187: training loss 0.689 Epoch 113 iteration 0069/0187: training loss 0.691 Epoch 113 iteration 0070/0187: training loss 0.692 Epoch 113 iteration 0071/0187: training loss 0.692 Epoch 113 iteration 0072/0187: training loss 0.692 Epoch 113 iteration 0073/0187: training loss 0.690 Epoch 113 iteration 0074/0187: training loss 0.689 Epoch 113 iteration 0075/0187: training loss 0.689 Epoch 113 iteration 0076/0187: training loss 0.690 Epoch 113 iteration 0077/0187: training loss 0.689 Epoch 113 iteration 0078/0187: training loss 0.689 Epoch 113 iteration 0079/0187: training loss 0.687 Epoch 113 iteration 0080/0187: training loss 0.686 Epoch 113 iteration 0081/0187: training loss 0.688 Epoch 113 iteration 0082/0187: training loss 0.687 Epoch 113 iteration 0083/0187: training loss 0.687 Epoch 113 iteration 0084/0187: training loss 0.687 Epoch 113 iteration 0085/0187: training loss 0.687 Epoch 113 iteration 0086/0187: training loss 0.687 Epoch 113 iteration 0087/0187: training loss 0.685 Epoch 113 iteration 0088/0187: training loss 0.684 Epoch 113 iteration 0089/0187: training loss 0.682 Epoch 113 iteration 0090/0187: training loss 0.682 Epoch 113 iteration 0091/0187: training loss 0.684 Epoch 113 iteration 0092/0187: training loss 0.685 Epoch 113 iteration 0093/0187: training loss 0.686 Epoch 113 iteration 0094/0187: training loss 0.685 Epoch 113 iteration 0095/0187: training loss 0.684 Epoch 113 iteration 0096/0187: training loss 0.683 Epoch 113 iteration 0097/0187: training loss 0.681 Epoch 113 iteration 0098/0187: training loss 0.682 Epoch 113 iteration 0099/0187: training loss 0.682 Epoch 113 iteration 0100/0187: training loss 0.680 Epoch 113 iteration 0101/0187: training loss 0.680 Epoch 113 iteration 0102/0187: training loss 0.681 Epoch 113 iteration 0103/0187: training loss 0.681 Epoch 113 iteration 0104/0187: training loss 0.681 Epoch 113 iteration 0105/0187: training loss 0.681 Epoch 113 iteration 0106/0187: training loss 0.680 Epoch 113 iteration 0107/0187: training loss 0.679 Epoch 113 iteration 0108/0187: training loss 0.680 Epoch 113 iteration 0109/0187: training loss 0.679 Epoch 113 iteration 0110/0187: training loss 0.679 Epoch 113 iteration 0111/0187: training loss 0.680 Epoch 113 iteration 0112/0187: training loss 0.678 Epoch 113 iteration 0113/0187: training loss 0.677 Epoch 113 iteration 0114/0187: training loss 0.676 Epoch 113 iteration 0115/0187: training loss 0.675 Epoch 113 iteration 0116/0187: training loss 0.675 Epoch 113 iteration 0117/0187: training loss 0.676 Epoch 113 iteration 0118/0187: training loss 0.675 Epoch 113 iteration 0119/0187: training loss 0.676 Epoch 113 iteration 0120/0187: training loss 0.677 Epoch 113 iteration 0121/0187: training loss 0.677 Epoch 113 iteration 0122/0187: training loss 0.677 Epoch 113 iteration 0123/0187: training loss 0.677 Epoch 113 iteration 0124/0187: training loss 0.676 Epoch 113 iteration 0125/0187: training loss 0.677 Epoch 113 iteration 0126/0187: training loss 0.677 Epoch 113 iteration 0127/0187: training loss 0.679 Epoch 113 iteration 0128/0187: training loss 0.678 Epoch 113 iteration 0129/0187: training loss 0.678 Epoch 113 iteration 0130/0187: training loss 0.677 Epoch 113 iteration 0131/0187: training loss 0.677 Epoch 113 iteration 0132/0187: training loss 0.677 Epoch 113 iteration 0133/0187: training loss 0.677 Epoch 113 iteration 0134/0187: training loss 0.676 Epoch 113 iteration 0135/0187: training loss 0.677 Epoch 113 iteration 0136/0187: training loss 0.677 Epoch 113 iteration 0137/0187: training loss 0.677 Epoch 113 iteration 0138/0187: training loss 0.676 Epoch 113 iteration 0139/0187: training loss 0.675 Epoch 113 iteration 0140/0187: training loss 0.676 Epoch 113 iteration 0141/0187: training loss 0.676 Epoch 113 iteration 0142/0187: training loss 0.676 Epoch 113 iteration 0143/0187: training loss 0.675 Epoch 113 iteration 0144/0187: training loss 0.676 Epoch 113 iteration 0145/0187: training loss 0.675 Epoch 113 iteration 0146/0187: training loss 0.676 Epoch 113 iteration 0147/0187: training loss 0.678 Epoch 113 iteration 0148/0187: training loss 0.678 Epoch 113 iteration 0149/0187: training loss 0.678 Epoch 113 iteration 0150/0187: training loss 0.677 Epoch 113 iteration 0151/0187: training loss 0.677 Epoch 113 iteration 0152/0187: training loss 0.677 Epoch 113 iteration 0153/0187: training loss 0.676 Epoch 113 iteration 0154/0187: training loss 0.676 Epoch 113 iteration 0155/0187: training loss 0.676 Epoch 113 iteration 0156/0187: training loss 0.677 Epoch 113 iteration 0157/0187: training loss 0.677 Epoch 113 iteration 0158/0187: training loss 0.676 Epoch 113 iteration 0159/0187: training loss 0.678 Epoch 113 iteration 0160/0187: training loss 0.677 Epoch 113 iteration 0161/0187: training loss 0.677 Epoch 113 iteration 0162/0187: training loss 0.676 Epoch 113 iteration 0163/0187: training loss 0.677 Epoch 113 iteration 0164/0187: training loss 0.677 Epoch 113 iteration 0165/0187: training loss 0.677 Epoch 113 iteration 0166/0187: training loss 0.676 Epoch 113 iteration 0167/0187: training loss 0.676 Epoch 113 iteration 0168/0187: training loss 0.676 Epoch 113 iteration 0169/0187: training loss 0.676 Epoch 113 iteration 0170/0187: training loss 0.675 Epoch 113 iteration 0171/0187: training loss 0.676 Epoch 113 iteration 0172/0187: training loss 0.676 Epoch 113 iteration 0173/0187: training loss 0.677 Epoch 113 iteration 0174/0187: training loss 0.676 Epoch 113 iteration 0175/0187: training loss 0.676 Epoch 113 iteration 0176/0187: training loss 0.677 Epoch 113 iteration 0177/0187: training loss 0.677 Epoch 113 iteration 0178/0187: training loss 0.678 Epoch 113 iteration 0179/0187: training loss 0.677 Epoch 113 iteration 0180/0187: training loss 0.677 Epoch 113 iteration 0181/0187: training loss 0.678 Epoch 113 iteration 0182/0187: training loss 0.679 Epoch 113 iteration 0183/0187: training loss 0.679 Epoch 113 iteration 0184/0187: training loss 0.679 Epoch 113 iteration 0185/0187: training loss 0.679 Epoch 113 iteration 0186/0187: training loss 0.678 Epoch 113 iteration 0187/0187: training loss 0.678 Epoch 113 validation pixAcc: 0.876, mIoU: 0.394 Epoch 114 iteration 0001/0187: training loss 0.768 Epoch 114 iteration 0002/0187: training loss 0.802 Epoch 114 iteration 0003/0187: training loss 0.782 Epoch 114 iteration 0004/0187: training loss 0.735 Epoch 114 iteration 0005/0187: training loss 0.698 Epoch 114 iteration 0006/0187: training loss 0.721 Epoch 114 iteration 0007/0187: training loss 0.752 Epoch 114 iteration 0008/0187: training loss 0.751 Epoch 114 iteration 0009/0187: training loss 0.734 Epoch 114 iteration 0010/0187: training loss 0.727 Epoch 114 iteration 0011/0187: training loss 0.717 Epoch 114 iteration 0012/0187: training loss 0.734 Epoch 114 iteration 0013/0187: training loss 0.730 Epoch 114 iteration 0014/0187: training loss 0.725 Epoch 114 iteration 0015/0187: training loss 0.715 Epoch 114 iteration 0016/0187: training loss 0.706 Epoch 114 iteration 0017/0187: training loss 0.697 Epoch 114 iteration 0018/0187: training loss 0.693 Epoch 114 iteration 0019/0187: training loss 0.707 Epoch 114 iteration 0020/0187: training loss 0.706 Epoch 114 iteration 0021/0187: training loss 0.700 Epoch 114 iteration 0022/0187: training loss 0.698 Epoch 114 iteration 0023/0187: training loss 0.699 Epoch 114 iteration 0024/0187: training loss 0.693 Epoch 114 iteration 0025/0187: training loss 0.690 Epoch 114 iteration 0026/0187: training loss 0.688 Epoch 114 iteration 0027/0187: training loss 0.696 Epoch 114 iteration 0028/0187: training loss 0.697 Epoch 114 iteration 0029/0187: training loss 0.705 Epoch 114 iteration 0030/0187: training loss 0.699 Epoch 114 iteration 0031/0187: training loss 0.699 Epoch 114 iteration 0032/0187: training loss 0.697 Epoch 114 iteration 0033/0187: training loss 0.700 Epoch 114 iteration 0034/0187: training loss 0.702 Epoch 114 iteration 0035/0187: training loss 0.698 Epoch 114 iteration 0036/0187: training loss 0.700 Epoch 114 iteration 0037/0187: training loss 0.699 Epoch 114 iteration 0038/0187: training loss 0.695 Epoch 114 iteration 0039/0187: training loss 0.695 Epoch 114 iteration 0040/0187: training loss 0.699 Epoch 114 iteration 0041/0187: training loss 0.695 Epoch 114 iteration 0042/0187: training loss 0.699 Epoch 114 iteration 0043/0187: training loss 0.698 Epoch 114 iteration 0044/0187: training loss 0.696 Epoch 114 iteration 0045/0187: training loss 0.696 Epoch 114 iteration 0046/0187: training loss 0.696 Epoch 114 iteration 0047/0187: training loss 0.698 Epoch 114 iteration 0048/0187: training loss 0.696 Epoch 114 iteration 0049/0187: training loss 0.695 Epoch 114 iteration 0050/0187: training loss 0.697 Epoch 114 iteration 0051/0187: training loss 0.694 Epoch 114 iteration 0052/0187: training loss 0.690 Epoch 114 iteration 0053/0187: training loss 0.691 Epoch 114 iteration 0054/0187: training loss 0.689 Epoch 114 iteration 0055/0187: training loss 0.691 Epoch 114 iteration 0056/0187: training loss 0.694 Epoch 114 iteration 0057/0187: training loss 0.694 Epoch 114 iteration 0058/0187: training loss 0.692 Epoch 114 iteration 0059/0187: training loss 0.694 Epoch 114 iteration 0060/0187: training loss 0.692 Epoch 114 iteration 0061/0187: training loss 0.691 Epoch 114 iteration 0062/0187: training loss 0.691 Epoch 114 iteration 0063/0187: training loss 0.692 Epoch 114 iteration 0064/0187: training loss 0.692 Epoch 114 iteration 0065/0187: training loss 0.692 Epoch 114 iteration 0066/0187: training loss 0.697 Epoch 114 iteration 0067/0187: training loss 0.698 Epoch 114 iteration 0068/0187: training loss 0.697 Epoch 114 iteration 0069/0187: training loss 0.694 Epoch 114 iteration 0070/0187: training loss 0.694 Epoch 114 iteration 0071/0187: training loss 0.696 Epoch 114 iteration 0072/0187: training loss 0.694 Epoch 114 iteration 0073/0187: training loss 0.694 Epoch 114 iteration 0074/0187: training loss 0.694 Epoch 114 iteration 0075/0187: training loss 0.693 Epoch 114 iteration 0076/0187: training loss 0.690 Epoch 114 iteration 0077/0187: training loss 0.687 Epoch 114 iteration 0078/0187: training loss 0.685 Epoch 114 iteration 0079/0187: training loss 0.685 Epoch 114 iteration 0080/0187: training loss 0.684 Epoch 114 iteration 0081/0187: training loss 0.683 Epoch 114 iteration 0082/0187: training loss 0.683 Epoch 114 iteration 0083/0187: training loss 0.685 Epoch 114 iteration 0084/0187: training loss 0.684 Epoch 114 iteration 0085/0187: training loss 0.684 Epoch 114 iteration 0086/0187: training loss 0.684 Epoch 114 iteration 0087/0187: training loss 0.684 Epoch 114 iteration 0088/0187: training loss 0.684 Epoch 114 iteration 0089/0187: training loss 0.683 Epoch 114 iteration 0090/0187: training loss 0.682 Epoch 114 iteration 0091/0188: training loss 0.681 Epoch 114 iteration 0092/0188: training loss 0.680 Epoch 114 iteration 0093/0188: training loss 0.679 Epoch 114 iteration 0094/0188: training loss 0.678 Epoch 114 iteration 0095/0188: training loss 0.677 Epoch 114 iteration 0096/0188: training loss 0.676 Epoch 114 iteration 0097/0188: training loss 0.675 Epoch 114 iteration 0098/0188: training loss 0.675 Epoch 114 iteration 0099/0188: training loss 0.675 Epoch 114 iteration 0100/0188: training loss 0.675 Epoch 114 iteration 0101/0188: training loss 0.674 Epoch 114 iteration 0102/0188: training loss 0.674 Epoch 114 iteration 0103/0188: training loss 0.674 Epoch 114 iteration 0104/0188: training loss 0.675 Epoch 114 iteration 0105/0188: training loss 0.674 Epoch 114 iteration 0106/0188: training loss 0.676 Epoch 114 iteration 0107/0188: training loss 0.676 Epoch 114 iteration 0108/0188: training loss 0.676 Epoch 114 iteration 0109/0188: training loss 0.677 Epoch 114 iteration 0110/0188: training loss 0.677 Epoch 114 iteration 0111/0188: training loss 0.677 Epoch 114 iteration 0112/0188: training loss 0.678 Epoch 114 iteration 0113/0188: training loss 0.677 Epoch 114 iteration 0114/0188: training loss 0.678 Epoch 114 iteration 0115/0188: training loss 0.677 Epoch 114 iteration 0116/0188: training loss 0.677 Epoch 114 iteration 0117/0188: training loss 0.677 Epoch 114 iteration 0118/0188: training loss 0.678 Epoch 114 iteration 0119/0188: training loss 0.680 Epoch 114 iteration 0120/0188: training loss 0.679 Epoch 114 iteration 0121/0188: training loss 0.679 Epoch 114 iteration 0122/0188: training loss 0.679 Epoch 114 iteration 0123/0188: training loss 0.678 Epoch 114 iteration 0124/0188: training loss 0.677 Epoch 114 iteration 0125/0188: training loss 0.677 Epoch 114 iteration 0126/0188: training loss 0.677 Epoch 114 iteration 0127/0188: training loss 0.678 Epoch 114 iteration 0128/0188: training loss 0.677 Epoch 114 iteration 0129/0188: training loss 0.678 Epoch 114 iteration 0130/0188: training loss 0.678 Epoch 114 iteration 0131/0188: training loss 0.677 Epoch 114 iteration 0132/0188: training loss 0.677 Epoch 114 iteration 0133/0188: training loss 0.676 Epoch 114 iteration 0134/0188: training loss 0.676 Epoch 114 iteration 0135/0188: training loss 0.675 Epoch 114 iteration 0136/0188: training loss 0.676 Epoch 114 iteration 0137/0188: training loss 0.677 Epoch 114 iteration 0138/0188: training loss 0.677 Epoch 114 iteration 0139/0188: training loss 0.677 Epoch 114 iteration 0140/0188: training loss 0.677 Epoch 114 iteration 0141/0188: training loss 0.676 Epoch 114 iteration 0142/0188: training loss 0.678 Epoch 114 iteration 0143/0188: training loss 0.678 Epoch 114 iteration 0144/0188: training loss 0.678 Epoch 114 iteration 0145/0188: training loss 0.679 Epoch 114 iteration 0146/0188: training loss 0.679 Epoch 114 iteration 0147/0188: training loss 0.679 Epoch 114 iteration 0148/0188: training loss 0.679 Epoch 114 iteration 0149/0188: training loss 0.678 Epoch 114 iteration 0150/0188: training loss 0.678 Epoch 114 iteration 0151/0188: training loss 0.677 Epoch 114 iteration 0152/0188: training loss 0.677 Epoch 114 iteration 0153/0188: training loss 0.677 Epoch 114 iteration 0154/0188: training loss 0.678 Epoch 114 iteration 0155/0188: training loss 0.678 Epoch 114 iteration 0156/0188: training loss 0.677 Epoch 114 iteration 0157/0188: training loss 0.676 Epoch 114 iteration 0158/0188: training loss 0.676 Epoch 114 iteration 0159/0188: training loss 0.676 Epoch 114 iteration 0160/0188: training loss 0.677 Epoch 114 iteration 0161/0188: training loss 0.676 Epoch 114 iteration 0162/0188: training loss 0.676 Epoch 114 iteration 0163/0188: training loss 0.677 Epoch 114 iteration 0164/0188: training loss 0.677 Epoch 114 iteration 0165/0188: training loss 0.677 Epoch 114 iteration 0166/0188: training loss 0.677 Epoch 114 iteration 0167/0188: training loss 0.676 Epoch 114 iteration 0168/0188: training loss 0.676 Epoch 114 iteration 0169/0188: training loss 0.676 Epoch 114 iteration 0170/0188: training loss 0.676 Epoch 114 iteration 0171/0188: training loss 0.676 Epoch 114 iteration 0172/0188: training loss 0.677 Epoch 114 iteration 0173/0188: training loss 0.676 Epoch 114 iteration 0174/0188: training loss 0.676 Epoch 114 iteration 0175/0188: training loss 0.678 Epoch 114 iteration 0176/0188: training loss 0.678 Epoch 114 iteration 0177/0188: training loss 0.679 Epoch 114 iteration 0178/0188: training loss 0.678 Epoch 114 iteration 0179/0188: training loss 0.678 Epoch 114 iteration 0180/0188: training loss 0.678 Epoch 114 iteration 0181/0188: training loss 0.677 Epoch 114 iteration 0182/0188: training loss 0.677 Epoch 114 iteration 0183/0188: training loss 0.678 Epoch 114 iteration 0184/0188: training loss 0.678 Epoch 114 iteration 0185/0188: training loss 0.679 Epoch 114 iteration 0186/0188: training loss 0.678 Epoch 114 validation pixAcc: 0.875, mIoU: 0.393 Epoch 115 iteration 0001/0187: training loss 0.548 Epoch 115 iteration 0002/0187: training loss 0.654 Epoch 115 iteration 0003/0187: training loss 0.658 Epoch 115 iteration 0004/0187: training loss 0.663 Epoch 115 iteration 0005/0187: training loss 0.668 Epoch 115 iteration 0006/0187: training loss 0.653 Epoch 115 iteration 0007/0187: training loss 0.638 Epoch 115 iteration 0008/0187: training loss 0.638 Epoch 115 iteration 0009/0187: training loss 0.641 Epoch 115 iteration 0010/0187: training loss 0.659 Epoch 115 iteration 0011/0187: training loss 0.649 Epoch 115 iteration 0012/0187: training loss 0.649 Epoch 115 iteration 0013/0187: training loss 0.650 Epoch 115 iteration 0014/0187: training loss 0.656 Epoch 115 iteration 0015/0187: training loss 0.657 Epoch 115 iteration 0016/0187: training loss 0.655 Epoch 115 iteration 0017/0187: training loss 0.660 Epoch 115 iteration 0018/0187: training loss 0.662 Epoch 115 iteration 0019/0187: training loss 0.665 Epoch 115 iteration 0020/0187: training loss 0.666 Epoch 115 iteration 0021/0187: training loss 0.667 Epoch 115 iteration 0022/0187: training loss 0.664 Epoch 115 iteration 0023/0187: training loss 0.660 Epoch 115 iteration 0024/0187: training loss 0.662 Epoch 115 iteration 0025/0187: training loss 0.660 Epoch 115 iteration 0026/0187: training loss 0.661 Epoch 115 iteration 0027/0187: training loss 0.656 Epoch 115 iteration 0028/0187: training loss 0.655 Epoch 115 iteration 0029/0187: training loss 0.662 Epoch 115 iteration 0030/0187: training loss 0.661 Epoch 115 iteration 0031/0187: training loss 0.660 Epoch 115 iteration 0032/0187: training loss 0.659 Epoch 115 iteration 0033/0187: training loss 0.662 Epoch 115 iteration 0034/0187: training loss 0.661 Epoch 115 iteration 0035/0187: training loss 0.659 Epoch 115 iteration 0036/0187: training loss 0.657 Epoch 115 iteration 0037/0187: training loss 0.655 Epoch 115 iteration 0038/0187: training loss 0.654 Epoch 115 iteration 0039/0187: training loss 0.658 Epoch 115 iteration 0040/0187: training loss 0.659 Epoch 115 iteration 0041/0187: training loss 0.658 Epoch 115 iteration 0042/0187: training loss 0.656 Epoch 115 iteration 0043/0187: training loss 0.660 Epoch 115 iteration 0044/0187: training loss 0.663 Epoch 115 iteration 0045/0187: training loss 0.663 Epoch 115 iteration 0046/0187: training loss 0.661 Epoch 115 iteration 0047/0187: training loss 0.659 Epoch 115 iteration 0048/0187: training loss 0.663 Epoch 115 iteration 0049/0187: training loss 0.664 Epoch 115 iteration 0050/0187: training loss 0.661 Epoch 115 iteration 0051/0187: training loss 0.662 Epoch 115 iteration 0052/0187: training loss 0.661 Epoch 115 iteration 0053/0187: training loss 0.662 Epoch 115 iteration 0054/0187: training loss 0.664 Epoch 115 iteration 0055/0187: training loss 0.665 Epoch 115 iteration 0056/0187: training loss 0.668 Epoch 115 iteration 0057/0187: training loss 0.670 Epoch 115 iteration 0058/0187: training loss 0.673 Epoch 115 iteration 0059/0187: training loss 0.672 Epoch 115 iteration 0060/0187: training loss 0.673 Epoch 115 iteration 0061/0187: training loss 0.674 Epoch 115 iteration 0062/0187: training loss 0.672 Epoch 115 iteration 0063/0187: training loss 0.675 Epoch 115 iteration 0064/0187: training loss 0.679 Epoch 115 iteration 0065/0187: training loss 0.677 Epoch 115 iteration 0066/0187: training loss 0.677 Epoch 115 iteration 0067/0187: training loss 0.678 Epoch 115 iteration 0068/0187: training loss 0.678 Epoch 115 iteration 0069/0187: training loss 0.681 Epoch 115 iteration 0070/0187: training loss 0.681 Epoch 115 iteration 0071/0187: training loss 0.680 Epoch 115 iteration 0072/0187: training loss 0.679 Epoch 115 iteration 0073/0187: training loss 0.682 Epoch 115 iteration 0074/0187: training loss 0.683 Epoch 115 iteration 0075/0187: training loss 0.681 Epoch 115 iteration 0076/0187: training loss 0.680 Epoch 115 iteration 0077/0187: training loss 0.680 Epoch 115 iteration 0078/0187: training loss 0.680 Epoch 115 iteration 0079/0187: training loss 0.679 Epoch 115 iteration 0080/0187: training loss 0.681 Epoch 115 iteration 0081/0187: training loss 0.681 Epoch 115 iteration 0082/0187: training loss 0.682 Epoch 115 iteration 0083/0187: training loss 0.681 Epoch 115 iteration 0084/0187: training loss 0.682 Epoch 115 iteration 0085/0187: training loss 0.680 Epoch 115 iteration 0086/0187: training loss 0.679 Epoch 115 iteration 0087/0187: training loss 0.681 Epoch 115 iteration 0088/0187: training loss 0.681 Epoch 115 iteration 0089/0187: training loss 0.681 Epoch 115 iteration 0090/0187: training loss 0.680 Epoch 115 iteration 0091/0187: training loss 0.680 Epoch 115 iteration 0092/0187: training loss 0.680 Epoch 115 iteration 0093/0187: training loss 0.681 Epoch 115 iteration 0094/0187: training loss 0.683 Epoch 115 iteration 0095/0187: training loss 0.682 Epoch 115 iteration 0096/0187: training loss 0.682 Epoch 115 iteration 0097/0187: training loss 0.681 Epoch 115 iteration 0098/0187: training loss 0.681 Epoch 115 iteration 0099/0187: training loss 0.679 Epoch 115 iteration 0100/0187: training loss 0.679 Epoch 115 iteration 0101/0187: training loss 0.679 Epoch 115 iteration 0102/0187: training loss 0.679 Epoch 115 iteration 0103/0187: training loss 0.679 Epoch 115 iteration 0104/0187: training loss 0.680 Epoch 115 iteration 0105/0187: training loss 0.679 Epoch 115 iteration 0106/0187: training loss 0.679 Epoch 115 iteration 0107/0187: training loss 0.679 Epoch 115 iteration 0108/0187: training loss 0.680 Epoch 115 iteration 0109/0187: training loss 0.678 Epoch 115 iteration 0110/0187: training loss 0.677 Epoch 115 iteration 0111/0187: training loss 0.676 Epoch 115 iteration 0112/0187: training loss 0.677 Epoch 115 iteration 0113/0187: training loss 0.678 Epoch 115 iteration 0114/0187: training loss 0.678 Epoch 115 iteration 0115/0187: training loss 0.677 Epoch 115 iteration 0116/0187: training loss 0.677 Epoch 115 iteration 0117/0187: training loss 0.677 Epoch 115 iteration 0118/0187: training loss 0.679 Epoch 115 iteration 0119/0187: training loss 0.679 Epoch 115 iteration 0120/0187: training loss 0.677 Epoch 115 iteration 0121/0187: training loss 0.676 Epoch 115 iteration 0122/0187: training loss 0.674 Epoch 115 iteration 0123/0187: training loss 0.674 Epoch 115 iteration 0124/0187: training loss 0.673 Epoch 115 iteration 0125/0187: training loss 0.674 Epoch 115 iteration 0126/0187: training loss 0.674 Epoch 115 iteration 0127/0187: training loss 0.674 Epoch 115 iteration 0128/0187: training loss 0.673 Epoch 115 iteration 0129/0187: training loss 0.672 Epoch 115 iteration 0130/0187: training loss 0.675 Epoch 115 iteration 0131/0187: training loss 0.675 Epoch 115 iteration 0132/0187: training loss 0.675 Epoch 115 iteration 0133/0187: training loss 0.677 Epoch 115 iteration 0134/0187: training loss 0.675 Epoch 115 iteration 0135/0187: training loss 0.676 Epoch 115 iteration 0136/0187: training loss 0.674 Epoch 115 iteration 0137/0187: training loss 0.676 Epoch 115 iteration 0138/0187: training loss 0.676 Epoch 115 iteration 0139/0187: training loss 0.676 Epoch 115 iteration 0140/0187: training loss 0.676 Epoch 115 iteration 0141/0187: training loss 0.675 Epoch 115 iteration 0142/0187: training loss 0.677 Epoch 115 iteration 0143/0187: training loss 0.677 Epoch 115 iteration 0144/0187: training loss 0.678 Epoch 115 iteration 0145/0187: training loss 0.678 Epoch 115 iteration 0146/0187: training loss 0.677 Epoch 115 iteration 0147/0187: training loss 0.677 Epoch 115 iteration 0148/0187: training loss 0.679 Epoch 115 iteration 0149/0187: training loss 0.678 Epoch 115 iteration 0150/0187: training loss 0.678 Epoch 115 iteration 0151/0187: training loss 0.677 Epoch 115 iteration 0152/0187: training loss 0.677 Epoch 115 iteration 0153/0187: training loss 0.677 Epoch 115 iteration 0154/0187: training loss 0.677 Epoch 115 iteration 0155/0187: training loss 0.677 Epoch 115 iteration 0156/0187: training loss 0.676 Epoch 115 iteration 0157/0187: training loss 0.676 Epoch 115 iteration 0158/0187: training loss 0.675 Epoch 115 iteration 0159/0187: training loss 0.675 Epoch 115 iteration 0160/0187: training loss 0.675 Epoch 115 iteration 0161/0187: training loss 0.674 Epoch 115 iteration 0162/0187: training loss 0.675 Epoch 115 iteration 0163/0187: training loss 0.675 Epoch 115 iteration 0164/0187: training loss 0.675 Epoch 115 iteration 0165/0187: training loss 0.675 Epoch 115 iteration 0166/0187: training loss 0.675 Epoch 115 iteration 0167/0187: training loss 0.676 Epoch 115 iteration 0168/0187: training loss 0.678 Epoch 115 iteration 0169/0187: training loss 0.678 Epoch 115 iteration 0170/0187: training loss 0.677 Epoch 115 iteration 0171/0187: training loss 0.678 Epoch 115 iteration 0172/0187: training loss 0.678 Epoch 115 iteration 0173/0187: training loss 0.678 Epoch 115 iteration 0174/0187: training loss 0.679 Epoch 115 iteration 0175/0187: training loss 0.679 Epoch 115 iteration 0176/0187: training loss 0.680 Epoch 115 iteration 0177/0187: training loss 0.681 Epoch 115 iteration 0178/0187: training loss 0.680 Epoch 115 iteration 0179/0187: training loss 0.679 Epoch 115 iteration 0180/0187: training loss 0.679 Epoch 115 iteration 0181/0187: training loss 0.679 Epoch 115 iteration 0182/0187: training loss 0.679 Epoch 115 iteration 0183/0187: training loss 0.678 Epoch 115 iteration 0184/0187: training loss 0.678 Epoch 115 iteration 0185/0187: training loss 0.678 Epoch 115 iteration 0186/0187: training loss 0.679 Epoch 115 iteration 0187/0187: training loss 0.679 Epoch 115 validation pixAcc: 0.875, mIoU: 0.393 Epoch 116 iteration 0001/0187: training loss 0.793 Epoch 116 iteration 0002/0187: training loss 0.702 Epoch 116 iteration 0003/0187: training loss 0.763 Epoch 116 iteration 0004/0187: training loss 0.770 Epoch 116 iteration 0005/0187: training loss 0.747 Epoch 116 iteration 0006/0187: training loss 0.730 Epoch 116 iteration 0007/0187: training loss 0.721 Epoch 116 iteration 0008/0187: training loss 0.703 Epoch 116 iteration 0009/0187: training loss 0.688 Epoch 116 iteration 0010/0187: training loss 0.687 Epoch 116 iteration 0011/0187: training loss 0.677 Epoch 116 iteration 0012/0187: training loss 0.675 Epoch 116 iteration 0013/0187: training loss 0.693 Epoch 116 iteration 0014/0187: training loss 0.683 Epoch 116 iteration 0015/0187: training loss 0.682 Epoch 116 iteration 0016/0187: training loss 0.681 Epoch 116 iteration 0017/0187: training loss 0.685 Epoch 116 iteration 0018/0187: training loss 0.687 Epoch 116 iteration 0019/0187: training loss 0.690 Epoch 116 iteration 0020/0187: training loss 0.688 Epoch 116 iteration 0021/0187: training loss 0.682 Epoch 116 iteration 0022/0187: training loss 0.680 Epoch 116 iteration 0023/0187: training loss 0.678 Epoch 116 iteration 0024/0187: training loss 0.685 Epoch 116 iteration 0025/0187: training loss 0.684 Epoch 116 iteration 0026/0187: training loss 0.678 Epoch 116 iteration 0027/0187: training loss 0.677 Epoch 116 iteration 0028/0187: training loss 0.673 Epoch 116 iteration 0029/0187: training loss 0.678 Epoch 116 iteration 0030/0187: training loss 0.672 Epoch 116 iteration 0031/0187: training loss 0.669 Epoch 116 iteration 0032/0187: training loss 0.676 Epoch 116 iteration 0033/0187: training loss 0.675 Epoch 116 iteration 0034/0187: training loss 0.675 Epoch 116 iteration 0035/0187: training loss 0.673 Epoch 116 iteration 0036/0187: training loss 0.674 Epoch 116 iteration 0037/0187: training loss 0.673 Epoch 116 iteration 0038/0187: training loss 0.673 Epoch 116 iteration 0039/0187: training loss 0.671 Epoch 116 iteration 0040/0187: training loss 0.673 Epoch 116 iteration 0041/0187: training loss 0.675 Epoch 116 iteration 0042/0187: training loss 0.673 Epoch 116 iteration 0043/0187: training loss 0.672 Epoch 116 iteration 0044/0187: training loss 0.675 Epoch 116 iteration 0045/0187: training loss 0.683 Epoch 116 iteration 0046/0187: training loss 0.683 Epoch 116 iteration 0047/0187: training loss 0.684 Epoch 116 iteration 0048/0187: training loss 0.685 Epoch 116 iteration 0049/0187: training loss 0.685 Epoch 116 iteration 0050/0187: training loss 0.685 Epoch 116 iteration 0051/0187: training loss 0.685 Epoch 116 iteration 0052/0187: training loss 0.686 Epoch 116 iteration 0053/0187: training loss 0.689 Epoch 116 iteration 0054/0187: training loss 0.688 Epoch 116 iteration 0055/0187: training loss 0.689 Epoch 116 iteration 0056/0187: training loss 0.690 Epoch 116 iteration 0057/0187: training loss 0.690 Epoch 116 iteration 0058/0187: training loss 0.688 Epoch 116 iteration 0059/0187: training loss 0.689 Epoch 116 iteration 0060/0187: training loss 0.690 Epoch 116 iteration 0061/0187: training loss 0.687 Epoch 116 iteration 0062/0187: training loss 0.686 Epoch 116 iteration 0063/0187: training loss 0.686 Epoch 116 iteration 0064/0187: training loss 0.684 Epoch 116 iteration 0065/0187: training loss 0.683 Epoch 116 iteration 0066/0187: training loss 0.684 Epoch 116 iteration 0067/0187: training loss 0.685 Epoch 116 iteration 0068/0187: training loss 0.685 Epoch 116 iteration 0069/0187: training loss 0.685 Epoch 116 iteration 0070/0187: training loss 0.687 Epoch 116 iteration 0071/0187: training loss 0.688 Epoch 116 iteration 0072/0187: training loss 0.688 Epoch 116 iteration 0073/0187: training loss 0.688 Epoch 116 iteration 0074/0187: training loss 0.686 Epoch 116 iteration 0075/0187: training loss 0.684 Epoch 116 iteration 0076/0187: training loss 0.682 Epoch 116 iteration 0077/0187: training loss 0.681 Epoch 116 iteration 0078/0187: training loss 0.679 Epoch 116 iteration 0079/0187: training loss 0.680 Epoch 116 iteration 0080/0187: training loss 0.679 Epoch 116 iteration 0081/0187: training loss 0.681 Epoch 116 iteration 0082/0187: training loss 0.680 Epoch 116 iteration 0083/0187: training loss 0.679 Epoch 116 iteration 0084/0187: training loss 0.679 Epoch 116 iteration 0085/0187: training loss 0.679 Epoch 116 iteration 0086/0187: training loss 0.678 Epoch 116 iteration 0087/0187: training loss 0.677 Epoch 116 iteration 0088/0187: training loss 0.676 Epoch 116 iteration 0089/0187: training loss 0.674 Epoch 116 iteration 0090/0187: training loss 0.673 Epoch 116 iteration 0091/0188: training loss 0.673 Epoch 116 iteration 0092/0188: training loss 0.673 Epoch 116 iteration 0093/0188: training loss 0.674 Epoch 116 iteration 0094/0188: training loss 0.673 Epoch 116 iteration 0095/0188: training loss 0.672 Epoch 116 iteration 0096/0188: training loss 0.672 Epoch 116 iteration 0097/0188: training loss 0.671 Epoch 116 iteration 0098/0188: training loss 0.672 Epoch 116 iteration 0099/0188: training loss 0.672 Epoch 116 iteration 0100/0188: training loss 0.672 Epoch 116 iteration 0101/0188: training loss 0.674 Epoch 116 iteration 0102/0188: training loss 0.674 Epoch 116 iteration 0103/0188: training loss 0.672 Epoch 116 iteration 0104/0188: training loss 0.673 Epoch 116 iteration 0105/0188: training loss 0.674 Epoch 116 iteration 0106/0188: training loss 0.673 Epoch 116 iteration 0107/0188: training loss 0.673 Epoch 116 iteration 0108/0188: training loss 0.673 Epoch 116 iteration 0109/0188: training loss 0.673 Epoch 116 iteration 0110/0188: training loss 0.673 Epoch 116 iteration 0111/0188: training loss 0.672 Epoch 116 iteration 0112/0188: training loss 0.671 Epoch 116 iteration 0113/0188: training loss 0.671 Epoch 116 iteration 0114/0188: training loss 0.672 Epoch 116 iteration 0115/0188: training loss 0.671 Epoch 116 iteration 0116/0188: training loss 0.671 Epoch 116 iteration 0117/0188: training loss 0.670 Epoch 116 iteration 0118/0188: training loss 0.669 Epoch 116 iteration 0119/0188: training loss 0.671 Epoch 116 iteration 0120/0188: training loss 0.671 Epoch 116 iteration 0121/0188: training loss 0.672 Epoch 116 iteration 0122/0188: training loss 0.671 Epoch 116 iteration 0123/0188: training loss 0.669 Epoch 116 iteration 0124/0188: training loss 0.670 Epoch 116 iteration 0125/0188: training loss 0.671 Epoch 116 iteration 0126/0188: training loss 0.673 Epoch 116 iteration 0127/0188: training loss 0.673 Epoch 116 iteration 0128/0188: training loss 0.672 Epoch 116 iteration 0129/0188: training loss 0.672 Epoch 116 iteration 0130/0188: training loss 0.673 Epoch 116 iteration 0131/0188: training loss 0.672 Epoch 116 iteration 0132/0188: training loss 0.671 Epoch 116 iteration 0133/0188: training loss 0.672 Epoch 116 iteration 0134/0188: training loss 0.672 Epoch 116 iteration 0135/0188: training loss 0.672 Epoch 116 iteration 0136/0188: training loss 0.671 Epoch 116 iteration 0137/0188: training loss 0.671 Epoch 116 iteration 0138/0188: training loss 0.671 Epoch 116 iteration 0139/0188: training loss 0.671 Epoch 116 iteration 0140/0188: training loss 0.671 Epoch 116 iteration 0141/0188: training loss 0.673 Epoch 116 iteration 0142/0188: training loss 0.674 Epoch 116 iteration 0143/0188: training loss 0.673 Epoch 116 iteration 0144/0188: training loss 0.672 Epoch 116 iteration 0145/0188: training loss 0.672 Epoch 116 iteration 0146/0188: training loss 0.671 Epoch 116 iteration 0147/0188: training loss 0.671 Epoch 116 iteration 0148/0188: training loss 0.671 Epoch 116 iteration 0149/0188: training loss 0.671 Epoch 116 iteration 0150/0188: training loss 0.671 Epoch 116 iteration 0151/0188: training loss 0.670 Epoch 116 iteration 0152/0188: training loss 0.670 Epoch 116 iteration 0153/0188: training loss 0.671 Epoch 116 iteration 0154/0188: training loss 0.670 Epoch 116 iteration 0155/0188: training loss 0.670 Epoch 116 iteration 0156/0188: training loss 0.670 Epoch 116 iteration 0157/0188: training loss 0.669 Epoch 116 iteration 0158/0188: training loss 0.669 Epoch 116 iteration 0159/0188: training loss 0.669 Epoch 116 iteration 0160/0188: training loss 0.669 Epoch 116 iteration 0161/0188: training loss 0.669 Epoch 116 iteration 0162/0188: training loss 0.668 Epoch 116 iteration 0163/0188: training loss 0.668 Epoch 116 iteration 0164/0188: training loss 0.667 Epoch 116 iteration 0165/0188: training loss 0.666 Epoch 116 iteration 0166/0188: training loss 0.666 Epoch 116 iteration 0167/0188: training loss 0.666 Epoch 116 iteration 0168/0188: training loss 0.666 Epoch 116 iteration 0169/0188: training loss 0.667 Epoch 116 iteration 0170/0188: training loss 0.667 Epoch 116 iteration 0171/0188: training loss 0.668 Epoch 116 iteration 0172/0188: training loss 0.668 Epoch 116 iteration 0173/0188: training loss 0.668 Epoch 116 iteration 0174/0188: training loss 0.667 Epoch 116 iteration 0175/0188: training loss 0.668 Epoch 116 iteration 0176/0188: training loss 0.669 Epoch 116 iteration 0177/0188: training loss 0.668 Epoch 116 iteration 0178/0188: training loss 0.668 Epoch 116 iteration 0179/0188: training loss 0.668 Epoch 116 iteration 0180/0188: training loss 0.669 Epoch 116 iteration 0181/0188: training loss 0.668 Epoch 116 iteration 0182/0188: training loss 0.668 Epoch 116 iteration 0183/0188: training loss 0.667 Epoch 116 iteration 0184/0188: training loss 0.667 Epoch 116 iteration 0185/0188: training loss 0.667 Epoch 116 iteration 0186/0188: training loss 0.666 Epoch 116 validation pixAcc: 0.875, mIoU: 0.395 Epoch 117 iteration 0001/0187: training loss 0.684 Epoch 117 iteration 0002/0187: training loss 0.726 Epoch 117 iteration 0003/0187: training loss 0.744 Epoch 117 iteration 0004/0187: training loss 0.717 Epoch 117 iteration 0005/0187: training loss 0.699 Epoch 117 iteration 0006/0187: training loss 0.712 Epoch 117 iteration 0007/0187: training loss 0.708 Epoch 117 iteration 0008/0187: training loss 0.688 Epoch 117 iteration 0009/0187: training loss 0.674 Epoch 117 iteration 0010/0187: training loss 0.666 Epoch 117 iteration 0011/0187: training loss 0.660 Epoch 117 iteration 0012/0187: training loss 0.664 Epoch 117 iteration 0013/0187: training loss 0.680 Epoch 117 iteration 0014/0187: training loss 0.670 Epoch 117 iteration 0015/0187: training loss 0.682 Epoch 117 iteration 0016/0187: training loss 0.675 Epoch 117 iteration 0017/0187: training loss 0.670 Epoch 117 iteration 0018/0187: training loss 0.672 Epoch 117 iteration 0019/0187: training loss 0.676 Epoch 117 iteration 0020/0187: training loss 0.680 Epoch 117 iteration 0021/0187: training loss 0.677 Epoch 117 iteration 0022/0187: training loss 0.682 Epoch 117 iteration 0023/0187: training loss 0.677 Epoch 117 iteration 0024/0187: training loss 0.674 Epoch 117 iteration 0025/0187: training loss 0.674 Epoch 117 iteration 0026/0187: training loss 0.671 Epoch 117 iteration 0027/0187: training loss 0.672 Epoch 117 iteration 0028/0187: training loss 0.674 Epoch 117 iteration 0029/0187: training loss 0.676 Epoch 117 iteration 0030/0187: training loss 0.673 Epoch 117 iteration 0031/0187: training loss 0.672 Epoch 117 iteration 0032/0187: training loss 0.671 Epoch 117 iteration 0033/0187: training loss 0.674 Epoch 117 iteration 0034/0187: training loss 0.675 Epoch 117 iteration 0035/0187: training loss 0.673 Epoch 117 iteration 0036/0187: training loss 0.672 Epoch 117 iteration 0037/0187: training loss 0.673 Epoch 117 iteration 0038/0187: training loss 0.670 Epoch 117 iteration 0039/0187: training loss 0.671 Epoch 117 iteration 0040/0187: training loss 0.670 Epoch 117 iteration 0041/0187: training loss 0.676 Epoch 117 iteration 0042/0187: training loss 0.675 Epoch 117 iteration 0043/0187: training loss 0.675 Epoch 117 iteration 0044/0187: training loss 0.674 Epoch 117 iteration 0045/0187: training loss 0.679 Epoch 117 iteration 0046/0187: training loss 0.680 Epoch 117 iteration 0047/0187: training loss 0.683 Epoch 117 iteration 0048/0187: training loss 0.680 Epoch 117 iteration 0049/0187: training loss 0.677 Epoch 117 iteration 0050/0187: training loss 0.676 Epoch 117 iteration 0051/0187: training loss 0.677 Epoch 117 iteration 0052/0187: training loss 0.676 Epoch 117 iteration 0053/0187: training loss 0.675 Epoch 117 iteration 0054/0187: training loss 0.676 Epoch 117 iteration 0055/0187: training loss 0.675 Epoch 117 iteration 0056/0187: training loss 0.674 Epoch 117 iteration 0057/0187: training loss 0.674 Epoch 117 iteration 0058/0187: training loss 0.674 Epoch 117 iteration 0059/0187: training loss 0.678 Epoch 117 iteration 0060/0187: training loss 0.678 Epoch 117 iteration 0061/0187: training loss 0.679 Epoch 117 iteration 0062/0187: training loss 0.679 Epoch 117 iteration 0063/0187: training loss 0.678 Epoch 117 iteration 0064/0187: training loss 0.678 Epoch 117 iteration 0065/0187: training loss 0.680 Epoch 117 iteration 0066/0187: training loss 0.678 Epoch 117 iteration 0067/0187: training loss 0.676 Epoch 117 iteration 0068/0187: training loss 0.675 Epoch 117 iteration 0069/0187: training loss 0.675 Epoch 117 iteration 0070/0187: training loss 0.674 Epoch 117 iteration 0071/0187: training loss 0.673 Epoch 117 iteration 0072/0187: training loss 0.675 Epoch 117 iteration 0073/0187: training loss 0.676 Epoch 117 iteration 0074/0187: training loss 0.674 Epoch 117 iteration 0075/0187: training loss 0.673 Epoch 117 iteration 0076/0187: training loss 0.673 Epoch 117 iteration 0077/0187: training loss 0.671 Epoch 117 iteration 0078/0187: training loss 0.672 Epoch 117 iteration 0079/0187: training loss 0.672 Epoch 117 iteration 0080/0187: training loss 0.673 Epoch 117 iteration 0081/0187: training loss 0.673 Epoch 117 iteration 0082/0187: training loss 0.673 Epoch 117 iteration 0083/0187: training loss 0.673 Epoch 117 iteration 0084/0187: training loss 0.672 Epoch 117 iteration 0085/0187: training loss 0.674 Epoch 117 iteration 0086/0187: training loss 0.673 Epoch 117 iteration 0087/0187: training loss 0.673 Epoch 117 iteration 0088/0187: training loss 0.672 Epoch 117 iteration 0089/0187: training loss 0.670 Epoch 117 iteration 0090/0187: training loss 0.671 Epoch 117 iteration 0091/0187: training loss 0.677 Epoch 117 iteration 0092/0187: training loss 0.678 Epoch 117 iteration 0093/0187: training loss 0.677 Epoch 117 iteration 0094/0187: training loss 0.678 Epoch 117 iteration 0095/0187: training loss 0.681 Epoch 117 iteration 0096/0187: training loss 0.681 Epoch 117 iteration 0097/0187: training loss 0.681 Epoch 117 iteration 0098/0187: training loss 0.682 Epoch 117 iteration 0099/0187: training loss 0.683 Epoch 117 iteration 0100/0187: training loss 0.683 Epoch 117 iteration 0101/0187: training loss 0.682 Epoch 117 iteration 0102/0187: training loss 0.683 Epoch 117 iteration 0103/0187: training loss 0.682 Epoch 117 iteration 0104/0187: training loss 0.683 Epoch 117 iteration 0105/0187: training loss 0.683 Epoch 117 iteration 0106/0187: training loss 0.682 Epoch 117 iteration 0107/0187: training loss 0.680 Epoch 117 iteration 0108/0187: training loss 0.680 Epoch 117 iteration 0109/0187: training loss 0.679 Epoch 117 iteration 0110/0187: training loss 0.679 Epoch 117 iteration 0111/0187: training loss 0.680 Epoch 117 iteration 0112/0187: training loss 0.680 Epoch 117 iteration 0113/0187: training loss 0.679 Epoch 117 iteration 0114/0187: training loss 0.679 Epoch 117 iteration 0115/0187: training loss 0.678 Epoch 117 iteration 0116/0187: training loss 0.679 Epoch 117 iteration 0117/0187: training loss 0.678 Epoch 117 iteration 0118/0187: training loss 0.677 Epoch 117 iteration 0119/0187: training loss 0.677 Epoch 117 iteration 0120/0187: training loss 0.677 Epoch 117 iteration 0121/0187: training loss 0.676 Epoch 117 iteration 0122/0187: training loss 0.676 Epoch 117 iteration 0123/0187: training loss 0.676 Epoch 117 iteration 0124/0187: training loss 0.676 Epoch 117 iteration 0125/0187: training loss 0.678 Epoch 117 iteration 0126/0187: training loss 0.678 Epoch 117 iteration 0127/0187: training loss 0.676 Epoch 117 iteration 0128/0187: training loss 0.676 Epoch 117 iteration 0129/0187: training loss 0.675 Epoch 117 iteration 0130/0187: training loss 0.676 Epoch 117 iteration 0131/0187: training loss 0.677 Epoch 117 iteration 0132/0187: training loss 0.676 Epoch 117 iteration 0133/0187: training loss 0.676 Epoch 117 iteration 0134/0187: training loss 0.675 Epoch 117 iteration 0135/0187: training loss 0.675 Epoch 117 iteration 0136/0187: training loss 0.674 Epoch 117 iteration 0137/0187: training loss 0.673 Epoch 117 iteration 0138/0187: training loss 0.674 Epoch 117 iteration 0139/0187: training loss 0.673 Epoch 117 iteration 0140/0187: training loss 0.674 Epoch 117 iteration 0141/0187: training loss 0.673 Epoch 117 iteration 0142/0187: training loss 0.673 Epoch 117 iteration 0143/0187: training loss 0.673 Epoch 117 iteration 0144/0187: training loss 0.673 Epoch 117 iteration 0145/0187: training loss 0.672 Epoch 117 iteration 0146/0187: training loss 0.673 Epoch 117 iteration 0147/0187: training loss 0.672 Epoch 117 iteration 0148/0187: training loss 0.674 Epoch 117 iteration 0149/0187: training loss 0.674 Epoch 117 iteration 0150/0187: training loss 0.674 Epoch 117 iteration 0151/0187: training loss 0.675 Epoch 117 iteration 0152/0187: training loss 0.676 Epoch 117 iteration 0153/0187: training loss 0.676 Epoch 117 iteration 0154/0187: training loss 0.677 Epoch 117 iteration 0155/0187: training loss 0.676 Epoch 117 iteration 0156/0187: training loss 0.676 Epoch 117 iteration 0157/0187: training loss 0.675 Epoch 117 iteration 0158/0187: training loss 0.675 Epoch 117 iteration 0159/0187: training loss 0.674 Epoch 117 iteration 0160/0187: training loss 0.675 Epoch 117 iteration 0161/0187: training loss 0.675 Epoch 117 iteration 0162/0187: training loss 0.675 Epoch 117 iteration 0163/0187: training loss 0.674 Epoch 117 iteration 0164/0187: training loss 0.673 Epoch 117 iteration 0165/0187: training loss 0.674 Epoch 117 iteration 0166/0187: training loss 0.673 Epoch 117 iteration 0167/0187: training loss 0.673 Epoch 117 iteration 0168/0187: training loss 0.672 Epoch 117 iteration 0169/0187: training loss 0.672 Epoch 117 iteration 0170/0187: training loss 0.672 Epoch 117 iteration 0171/0187: training loss 0.672 Epoch 117 iteration 0172/0187: training loss 0.672 Epoch 117 iteration 0173/0187: training loss 0.672 Epoch 117 iteration 0174/0187: training loss 0.673 Epoch 117 iteration 0175/0187: training loss 0.672 Epoch 117 iteration 0176/0187: training loss 0.672 Epoch 117 iteration 0177/0187: training loss 0.672 Epoch 117 iteration 0178/0187: training loss 0.673 Epoch 117 iteration 0179/0187: training loss 0.673 Epoch 117 iteration 0180/0187: training loss 0.674 Epoch 117 iteration 0181/0187: training loss 0.674 Epoch 117 iteration 0182/0187: training loss 0.674 Epoch 117 iteration 0183/0187: training loss 0.674 Epoch 117 iteration 0184/0187: training loss 0.676 Epoch 117 iteration 0185/0187: training loss 0.676 Epoch 117 iteration 0186/0187: training loss 0.676 Epoch 117 iteration 0187/0187: training loss 0.677 Epoch 117 validation pixAcc: 0.876, mIoU: 0.393 Epoch 118 iteration 0001/0187: training loss 0.617 Epoch 118 iteration 0002/0187: training loss 0.606 Epoch 118 iteration 0003/0187: training loss 0.590 Epoch 118 iteration 0004/0187: training loss 0.599 Epoch 118 iteration 0005/0187: training loss 0.635 Epoch 118 iteration 0006/0187: training loss 0.649 Epoch 118 iteration 0007/0187: training loss 0.636 Epoch 118 iteration 0008/0187: training loss 0.641 Epoch 118 iteration 0009/0187: training loss 0.647 Epoch 118 iteration 0010/0187: training loss 0.667 Epoch 118 iteration 0011/0187: training loss 0.661 Epoch 118 iteration 0012/0187: training loss 0.655 Epoch 118 iteration 0013/0187: training loss 0.655 Epoch 118 iteration 0014/0187: training loss 0.660 Epoch 118 iteration 0015/0187: training loss 0.668 Epoch 118 iteration 0016/0187: training loss 0.666 Epoch 118 iteration 0017/0187: training loss 0.664 Epoch 118 iteration 0018/0187: training loss 0.660 Epoch 118 iteration 0019/0187: training loss 0.658 Epoch 118 iteration 0020/0187: training loss 0.662 Epoch 118 iteration 0021/0187: training loss 0.663 Epoch 118 iteration 0022/0187: training loss 0.661 Epoch 118 iteration 0023/0187: training loss 0.661 Epoch 118 iteration 0024/0187: training loss 0.661 Epoch 118 iteration 0025/0187: training loss 0.664 Epoch 118 iteration 0026/0187: training loss 0.668 Epoch 118 iteration 0027/0187: training loss 0.667 Epoch 118 iteration 0028/0187: training loss 0.663 Epoch 118 iteration 0029/0187: training loss 0.665 Epoch 118 iteration 0030/0187: training loss 0.666 Epoch 118 iteration 0031/0187: training loss 0.669 Epoch 118 iteration 0032/0187: training loss 0.665 Epoch 118 iteration 0033/0187: training loss 0.665 Epoch 118 iteration 0034/0187: training loss 0.664 Epoch 118 iteration 0035/0187: training loss 0.662 Epoch 118 iteration 0036/0187: training loss 0.662 Epoch 118 iteration 0037/0187: training loss 0.662 Epoch 118 iteration 0038/0187: training loss 0.660 Epoch 118 iteration 0039/0187: training loss 0.659 Epoch 118 iteration 0040/0187: training loss 0.661 Epoch 118 iteration 0041/0187: training loss 0.660 Epoch 118 iteration 0042/0187: training loss 0.657 Epoch 118 iteration 0043/0187: training loss 0.655 Epoch 118 iteration 0044/0187: training loss 0.660 Epoch 118 iteration 0045/0187: training loss 0.661 Epoch 118 iteration 0046/0187: training loss 0.660 Epoch 118 iteration 0047/0187: training loss 0.663 Epoch 118 iteration 0048/0187: training loss 0.661 Epoch 118 iteration 0049/0187: training loss 0.659 Epoch 118 iteration 0050/0187: training loss 0.661 Epoch 118 iteration 0051/0187: training loss 0.661 Epoch 118 iteration 0052/0187: training loss 0.659 Epoch 118 iteration 0053/0187: training loss 0.660 Epoch 118 iteration 0054/0187: training loss 0.659 Epoch 118 iteration 0055/0187: training loss 0.657 Epoch 118 iteration 0056/0187: training loss 0.656 Epoch 118 iteration 0057/0187: training loss 0.658 Epoch 118 iteration 0058/0187: training loss 0.660 Epoch 118 iteration 0059/0187: training loss 0.661 Epoch 118 iteration 0060/0187: training loss 0.662 Epoch 118 iteration 0061/0187: training loss 0.664 Epoch 118 iteration 0062/0187: training loss 0.660 Epoch 118 iteration 0063/0187: training loss 0.664 Epoch 118 iteration 0064/0187: training loss 0.662 Epoch 118 iteration 0065/0187: training loss 0.661 Epoch 118 iteration 0066/0187: training loss 0.661 Epoch 118 iteration 0067/0187: training loss 0.663 Epoch 118 iteration 0068/0187: training loss 0.662 Epoch 118 iteration 0069/0187: training loss 0.660 Epoch 118 iteration 0070/0187: training loss 0.659 Epoch 118 iteration 0071/0187: training loss 0.658 Epoch 118 iteration 0072/0187: training loss 0.658 Epoch 118 iteration 0073/0187: training loss 0.657 Epoch 118 iteration 0074/0187: training loss 0.657 Epoch 118 iteration 0075/0187: training loss 0.657 Epoch 118 iteration 0076/0187: training loss 0.658 Epoch 118 iteration 0077/0187: training loss 0.658 Epoch 118 iteration 0078/0187: training loss 0.659 Epoch 118 iteration 0079/0187: training loss 0.659 Epoch 118 iteration 0080/0187: training loss 0.659 Epoch 118 iteration 0081/0187: training loss 0.658 Epoch 118 iteration 0082/0187: training loss 0.658 Epoch 118 iteration 0083/0187: training loss 0.657 Epoch 118 iteration 0084/0187: training loss 0.655 Epoch 118 iteration 0085/0187: training loss 0.656 Epoch 118 iteration 0086/0187: training loss 0.656 Epoch 118 iteration 0087/0187: training loss 0.656 Epoch 118 iteration 0088/0187: training loss 0.658 Epoch 118 iteration 0089/0187: training loss 0.658 Epoch 118 iteration 0090/0187: training loss 0.660 Epoch 118 iteration 0091/0188: training loss 0.661 Epoch 118 iteration 0092/0188: training loss 0.661 Epoch 118 iteration 0093/0188: training loss 0.660 Epoch 118 iteration 0094/0188: training loss 0.660 Epoch 118 iteration 0095/0188: training loss 0.660 Epoch 118 iteration 0096/0188: training loss 0.660 Epoch 118 iteration 0097/0188: training loss 0.661 Epoch 118 iteration 0098/0188: training loss 0.663 Epoch 118 iteration 0099/0188: training loss 0.661 Epoch 118 iteration 0100/0188: training loss 0.661 Epoch 118 iteration 0101/0188: training loss 0.661 Epoch 118 iteration 0102/0188: training loss 0.662 Epoch 118 iteration 0103/0188: training loss 0.663 Epoch 118 iteration 0104/0188: training loss 0.662 Epoch 118 iteration 0105/0188: training loss 0.662 Epoch 118 iteration 0106/0188: training loss 0.663 Epoch 118 iteration 0107/0188: training loss 0.663 Epoch 118 iteration 0108/0188: training loss 0.664 Epoch 118 iteration 0109/0188: training loss 0.668 Epoch 118 iteration 0110/0188: training loss 0.667 Epoch 118 iteration 0111/0188: training loss 0.667 Epoch 118 iteration 0112/0188: training loss 0.666 Epoch 118 iteration 0113/0188: training loss 0.667 Epoch 118 iteration 0114/0188: training loss 0.666 Epoch 118 iteration 0115/0188: training loss 0.665 Epoch 118 iteration 0116/0188: training loss 0.666 Epoch 118 iteration 0117/0188: training loss 0.665 Epoch 118 iteration 0118/0188: training loss 0.665 Epoch 118 iteration 0119/0188: training loss 0.665 Epoch 118 iteration 0120/0188: training loss 0.664 Epoch 118 iteration 0121/0188: training loss 0.664 Epoch 118 iteration 0122/0188: training loss 0.665 Epoch 118 iteration 0123/0188: training loss 0.664 Epoch 118 iteration 0124/0188: training loss 0.663 Epoch 118 iteration 0125/0188: training loss 0.666 Epoch 118 iteration 0126/0188: training loss 0.666 Epoch 118 iteration 0127/0188: training loss 0.665 Epoch 118 iteration 0128/0188: training loss 0.665 Epoch 118 iteration 0129/0188: training loss 0.665 Epoch 118 iteration 0130/0188: training loss 0.667 Epoch 118 iteration 0131/0188: training loss 0.667 Epoch 118 iteration 0132/0188: training loss 0.667 Epoch 118 iteration 0133/0188: training loss 0.668 Epoch 118 iteration 0134/0188: training loss 0.668 Epoch 118 iteration 0135/0188: training loss 0.668 Epoch 118 iteration 0136/0188: training loss 0.668 Epoch 118 iteration 0137/0188: training loss 0.668 Epoch 118 iteration 0138/0188: training loss 0.668 Epoch 118 iteration 0139/0188: training loss 0.668 Epoch 118 iteration 0140/0188: training loss 0.668 Epoch 118 iteration 0141/0188: training loss 0.667 Epoch 118 iteration 0142/0188: training loss 0.668 Epoch 118 iteration 0143/0188: training loss 0.668 Epoch 118 iteration 0144/0188: training loss 0.668 Epoch 118 iteration 0145/0188: training loss 0.668 Epoch 118 iteration 0146/0188: training loss 0.667 Epoch 118 iteration 0147/0188: training loss 0.669 Epoch 118 iteration 0148/0188: training loss 0.669 Epoch 118 iteration 0149/0188: training loss 0.669 Epoch 118 iteration 0150/0188: training loss 0.668 Epoch 118 iteration 0151/0188: training loss 0.668 Epoch 118 iteration 0152/0188: training loss 0.669 Epoch 118 iteration 0153/0188: training loss 0.669 Epoch 118 iteration 0154/0188: training loss 0.669 Epoch 118 iteration 0155/0188: training loss 0.668 Epoch 118 iteration 0156/0188: training loss 0.668 Epoch 118 iteration 0157/0188: training loss 0.669 Epoch 118 iteration 0158/0188: training loss 0.668 Epoch 118 iteration 0159/0188: training loss 0.667 Epoch 118 iteration 0160/0188: training loss 0.668 Epoch 118 iteration 0161/0188: training loss 0.667 Epoch 118 iteration 0162/0188: training loss 0.666 Epoch 118 iteration 0163/0188: training loss 0.666 Epoch 118 iteration 0164/0188: training loss 0.666 Epoch 118 iteration 0165/0188: training loss 0.666 Epoch 118 iteration 0166/0188: training loss 0.666 Epoch 118 iteration 0167/0188: training loss 0.667 Epoch 118 iteration 0168/0188: training loss 0.668 Epoch 118 iteration 0169/0188: training loss 0.668 Epoch 118 iteration 0170/0188: training loss 0.669 Epoch 118 iteration 0171/0188: training loss 0.670 Epoch 118 iteration 0172/0188: training loss 0.669 Epoch 118 iteration 0173/0188: training loss 0.668 Epoch 118 iteration 0174/0188: training loss 0.668 Epoch 118 iteration 0175/0188: training loss 0.668 Epoch 118 iteration 0176/0188: training loss 0.667 Epoch 118 iteration 0177/0188: training loss 0.667 Epoch 118 iteration 0178/0188: training loss 0.667 Epoch 118 iteration 0179/0188: training loss 0.667 Epoch 118 iteration 0180/0188: training loss 0.666 Epoch 118 iteration 0181/0188: training loss 0.667 Epoch 118 iteration 0182/0188: training loss 0.667 Epoch 118 iteration 0183/0188: training loss 0.666 Epoch 118 iteration 0184/0188: training loss 0.666 Epoch 118 iteration 0185/0188: training loss 0.665 Epoch 118 iteration 0186/0188: training loss 0.666 Epoch 118 validation pixAcc: 0.875, mIoU: 0.395 Epoch 119 iteration 0001/0187: training loss 0.661 Epoch 119 iteration 0002/0187: training loss 0.682 Epoch 119 iteration 0003/0187: training loss 0.689 Epoch 119 iteration 0004/0187: training loss 0.703 Epoch 119 iteration 0005/0187: training loss 0.698 Epoch 119 iteration 0006/0187: training loss 0.685 Epoch 119 iteration 0007/0187: training loss 0.702 Epoch 119 iteration 0008/0187: training loss 0.700 Epoch 119 iteration 0009/0187: training loss 0.691 Epoch 119 iteration 0010/0187: training loss 0.722 Epoch 119 iteration 0011/0187: training loss 0.713 Epoch 119 iteration 0012/0187: training loss 0.737 Epoch 119 iteration 0013/0187: training loss 0.725 Epoch 119 iteration 0014/0187: training loss 0.724 Epoch 119 iteration 0015/0187: training loss 0.715 Epoch 119 iteration 0016/0187: training loss 0.711 Epoch 119 iteration 0017/0187: training loss 0.710 Epoch 119 iteration 0018/0187: training loss 0.705 Epoch 119 iteration 0019/0187: training loss 0.702 Epoch 119 iteration 0020/0187: training loss 0.708 Epoch 119 iteration 0021/0187: training loss 0.700 Epoch 119 iteration 0022/0187: training loss 0.699 Epoch 119 iteration 0023/0187: training loss 0.703 Epoch 119 iteration 0024/0187: training loss 0.701 Epoch 119 iteration 0025/0187: training loss 0.696 Epoch 119 iteration 0026/0187: training loss 0.693 Epoch 119 iteration 0027/0187: training loss 0.694 Epoch 119 iteration 0028/0187: training loss 0.691 Epoch 119 iteration 0029/0187: training loss 0.688 Epoch 119 iteration 0030/0187: training loss 0.685 Epoch 119 iteration 0031/0187: training loss 0.684 Epoch 119 iteration 0032/0187: training loss 0.680 Epoch 119 iteration 0033/0187: training loss 0.680 Epoch 119 iteration 0034/0187: training loss 0.677 Epoch 119 iteration 0035/0187: training loss 0.677 Epoch 119 iteration 0036/0187: training loss 0.674 Epoch 119 iteration 0037/0187: training loss 0.670 Epoch 119 iteration 0038/0187: training loss 0.670 Epoch 119 iteration 0039/0187: training loss 0.669 Epoch 119 iteration 0040/0187: training loss 0.669 Epoch 119 iteration 0041/0187: training loss 0.672 Epoch 119 iteration 0042/0187: training loss 0.670 Epoch 119 iteration 0043/0187: training loss 0.670 Epoch 119 iteration 0044/0187: training loss 0.672 Epoch 119 iteration 0045/0187: training loss 0.672 Epoch 119 iteration 0046/0187: training loss 0.677 Epoch 119 iteration 0047/0187: training loss 0.679 Epoch 119 iteration 0048/0187: training loss 0.679 Epoch 119 iteration 0049/0187: training loss 0.682 Epoch 119 iteration 0050/0187: training loss 0.679 Epoch 119 iteration 0051/0187: training loss 0.680 Epoch 119 iteration 0052/0187: training loss 0.679 Epoch 119 iteration 0053/0187: training loss 0.678 Epoch 119 iteration 0054/0187: training loss 0.677 Epoch 119 iteration 0055/0187: training loss 0.679 Epoch 119 iteration 0056/0187: training loss 0.679 Epoch 119 iteration 0057/0187: training loss 0.678 Epoch 119 iteration 0058/0187: training loss 0.678 Epoch 119 iteration 0059/0187: training loss 0.676 Epoch 119 iteration 0060/0187: training loss 0.674 Epoch 119 iteration 0061/0187: training loss 0.673 Epoch 119 iteration 0062/0187: training loss 0.673 Epoch 119 iteration 0063/0187: training loss 0.671 Epoch 119 iteration 0064/0187: training loss 0.673 Epoch 119 iteration 0065/0187: training loss 0.673 Epoch 119 iteration 0066/0187: training loss 0.671 Epoch 119 iteration 0067/0187: training loss 0.673 Epoch 119 iteration 0068/0187: training loss 0.673 Epoch 119 iteration 0069/0187: training loss 0.673 Epoch 119 iteration 0070/0187: training loss 0.672 Epoch 119 iteration 0071/0187: training loss 0.671 Epoch 119 iteration 0072/0187: training loss 0.671 Epoch 119 iteration 0073/0187: training loss 0.670 Epoch 119 iteration 0074/0187: training loss 0.669 Epoch 119 iteration 0075/0187: training loss 0.668 Epoch 119 iteration 0076/0187: training loss 0.668 Epoch 119 iteration 0077/0187: training loss 0.668 Epoch 119 iteration 0078/0187: training loss 0.668 Epoch 119 iteration 0079/0187: training loss 0.670 Epoch 119 iteration 0080/0187: training loss 0.670 Epoch 119 iteration 0081/0187: training loss 0.669 Epoch 119 iteration 0082/0187: training loss 0.670 Epoch 119 iteration 0083/0187: training loss 0.671 Epoch 119 iteration 0084/0187: training loss 0.670 Epoch 119 iteration 0085/0187: training loss 0.669 Epoch 119 iteration 0086/0187: training loss 0.669 Epoch 119 iteration 0087/0187: training loss 0.670 Epoch 119 iteration 0088/0187: training loss 0.669 Epoch 119 iteration 0089/0187: training loss 0.670 Epoch 119 iteration 0090/0187: training loss 0.670 Epoch 119 iteration 0091/0187: training loss 0.669 Epoch 119 iteration 0092/0187: training loss 0.670 Epoch 119 iteration 0093/0187: training loss 0.670 Epoch 119 iteration 0094/0187: training loss 0.674 Epoch 119 iteration 0095/0187: training loss 0.674 Epoch 119 iteration 0096/0187: training loss 0.675 Epoch 119 iteration 0097/0187: training loss 0.674 Epoch 119 iteration 0098/0187: training loss 0.674 Epoch 119 iteration 0099/0187: training loss 0.676 Epoch 119 iteration 0100/0187: training loss 0.676 Epoch 119 iteration 0101/0187: training loss 0.677 Epoch 119 iteration 0102/0187: training loss 0.678 Epoch 119 iteration 0103/0187: training loss 0.678 Epoch 119 iteration 0104/0187: training loss 0.679 Epoch 119 iteration 0105/0187: training loss 0.678 Epoch 119 iteration 0106/0187: training loss 0.679 Epoch 119 iteration 0107/0187: training loss 0.678 Epoch 119 iteration 0108/0187: training loss 0.678 Epoch 119 iteration 0109/0187: training loss 0.678 Epoch 119 iteration 0110/0187: training loss 0.678 Epoch 119 iteration 0111/0187: training loss 0.679 Epoch 119 iteration 0112/0187: training loss 0.682 Epoch 119 iteration 0113/0187: training loss 0.682 Epoch 119 iteration 0114/0187: training loss 0.682 Epoch 119 iteration 0115/0187: training loss 0.681 Epoch 119 iteration 0116/0187: training loss 0.682 Epoch 119 iteration 0117/0187: training loss 0.682 Epoch 119 iteration 0118/0187: training loss 0.682 Epoch 119 iteration 0119/0187: training loss 0.682 Epoch 119 iteration 0120/0187: training loss 0.682 Epoch 119 iteration 0121/0187: training loss 0.682 Epoch 119 iteration 0122/0187: training loss 0.682 Epoch 119 iteration 0123/0187: training loss 0.681 Epoch 119 iteration 0124/0187: training loss 0.680 Epoch 119 iteration 0125/0187: training loss 0.679 Epoch 119 iteration 0126/0187: training loss 0.680 Epoch 119 iteration 0127/0187: training loss 0.680 Epoch 119 iteration 0128/0187: training loss 0.682 Epoch 119 iteration 0129/0187: training loss 0.682 Epoch 119 iteration 0130/0187: training loss 0.682 Epoch 119 iteration 0131/0187: training loss 0.681 Epoch 119 iteration 0132/0187: training loss 0.681 Epoch 119 iteration 0133/0187: training loss 0.682 Epoch 119 iteration 0134/0187: training loss 0.681 Epoch 119 iteration 0135/0187: training loss 0.682 Epoch 119 iteration 0136/0187: training loss 0.681 Epoch 119 iteration 0137/0187: training loss 0.681 Epoch 119 iteration 0138/0187: training loss 0.681 Epoch 119 iteration 0139/0187: training loss 0.681 Epoch 119 iteration 0140/0187: training loss 0.680 Epoch 119 iteration 0141/0187: training loss 0.681 Epoch 119 iteration 0142/0187: training loss 0.680 Epoch 119 iteration 0143/0187: training loss 0.680 Epoch 119 iteration 0144/0187: training loss 0.680 Epoch 119 iteration 0145/0187: training loss 0.679 Epoch 119 iteration 0146/0187: training loss 0.679 Epoch 119 iteration 0147/0187: training loss 0.679 Epoch 119 iteration 0148/0187: training loss 0.679 Epoch 119 iteration 0149/0187: training loss 0.678 Epoch 119 iteration 0150/0187: training loss 0.678 Epoch 119 iteration 0151/0187: training loss 0.676 Epoch 119 iteration 0152/0187: training loss 0.676 Epoch 119 iteration 0153/0187: training loss 0.676 Epoch 119 iteration 0154/0187: training loss 0.677 Epoch 119 iteration 0155/0187: training loss 0.677 Epoch 119 iteration 0156/0187: training loss 0.677 Epoch 119 iteration 0157/0187: training loss 0.677 Epoch 119 iteration 0158/0187: training loss 0.677 Epoch 119 iteration 0159/0187: training loss 0.676 Epoch 119 iteration 0160/0187: training loss 0.676 Epoch 119 iteration 0161/0187: training loss 0.676 Epoch 119 iteration 0162/0187: training loss 0.677 Epoch 119 iteration 0163/0187: training loss 0.677 Epoch 119 iteration 0164/0187: training loss 0.677 Epoch 119 iteration 0165/0187: training loss 0.676 Epoch 119 iteration 0166/0187: training loss 0.677 Epoch 119 iteration 0167/0187: training loss 0.676 Epoch 119 iteration 0168/0187: training loss 0.676 Epoch 119 iteration 0169/0187: training loss 0.678 Epoch 119 iteration 0170/0187: training loss 0.678 Epoch 119 iteration 0171/0187: training loss 0.677 Epoch 119 iteration 0172/0187: training loss 0.677 Epoch 119 iteration 0173/0187: training loss 0.677 Epoch 119 iteration 0174/0187: training loss 0.676 Epoch 119 iteration 0175/0187: training loss 0.676 Epoch 119 iteration 0176/0187: training loss 0.676 Epoch 119 iteration 0177/0187: training loss 0.676 Epoch 119 iteration 0178/0187: training loss 0.677 Epoch 119 iteration 0179/0187: training loss 0.677 Epoch 119 iteration 0180/0187: training loss 0.677 Epoch 119 iteration 0181/0187: training loss 0.676 Epoch 119 iteration 0182/0187: training loss 0.677 Epoch 119 iteration 0183/0187: training loss 0.678 Epoch 119 iteration 0184/0187: training loss 0.678 Epoch 119 iteration 0185/0187: training loss 0.679 Epoch 119 iteration 0186/0187: training loss 0.679 Epoch 119 iteration 0187/0187: training loss 0.679 Epoch 119 validation pixAcc: 0.876, mIoU: 0.394