Nvidia Semantic Segmentation monorepo

Overview

Paper | YouTube | Cityscapes Score

PyTorch implementation of our paper Hierarchical Multi-Scale Attention for Semantic Segmentation.

Please refer to the sdcnet branch if you are looking for the code corresponding to Improving Semantic Segmentation via Video Prediction and Label Relaxation.

Installation

  • The code is tested with PyTorch 1.3 and Python 3.6
  • You can use ./Dockerfile to build an image.

Download Weights

  • Create a directory where you can keep large files. Ideally, not in this directory.
  > mkdir <large_asset_dir>
  • Update __C.ASSETS_PATH in config.py to point at that directory

    __C.ASSETS_PATH=<large_asset_dir>

  • Download pretrained weights from Google Drive and put them into <large_asset_dir>/seg_weights

Download/Prepare Data

If using Cityscapes, download Cityscapes data, then update config.py to set the path:

__C.DATASET.CITYSCAPES_DIR=<path_to_cityscapes>

If using Cityscapes Autolabelled Images, download Cityscapes data, then update config.py to set the path:

__C.DATASET.CITYSCAPES_CUSTOMCOARSE=<path_to_cityscapes>

If using Mapillary, download Mapillary data, then update config.py to set the path:

__C.DATASET.MAPILLARY_DIR=<path_to_mapillary>
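Taken together, the assets path and the dataset paths above are plain Python assignments in config.py. As an illustrative sketch only (the paths below are placeholders, not defaults), the edited entries might look like:

    # config.py (excerpt) -- placeholder paths, adjust to your machine
    __C.ASSETS_PATH = '/data/large_assets'
    __C.DATASET.CITYSCAPES_DIR = '/data/cityscapes'
    __C.DATASET.CITYSCAPES_CUSTOMCOARSE = '/data/cityscapes/autolabelled'
    __C.DATASET.MAPILLARY_DIR = '/data/mapillary'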

Running the code

The instructions below make use of a tool called runx, which we find useful for automating experiment running and summarization. For more information about this tool, please see runx. In general, you can either use the runx-style command lines shown below, or call python train.py <args ...> directly if you like.

Run inference on Cityscapes

Dry run:

> python -m runx.runx scripts/eval_cityscapes.yml -i -n

This will just print out the command but not run it. It's a good way to inspect the command line.

Real run:

> python -m runx.runx scripts/eval_cityscapes.yml -i

The reported IOU should be 86.92. This evaluates with scales of 0.5, 1.0, and 2.0. You will find evaluation results in ./logs/eval_cityscapes/...
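For intuition, a simplified sketch of multi-scale inference over those scales is shown below. This plain averaging is only an illustration (the repo's actual method combines scales with learned hierarchical attention), and net and image are placeholders:

    import torch
    import torch.nn.functional as F

    def multi_scale_predict(net, image, scales=(0.5, 1.0, 2.0)):
        """Run the network at several input scales and average the upsampled logits (illustrative only)."""
        h, w = image.shape[-2:]
        total = None
        for s in scales:
            scaled = F.interpolate(image, scale_factor=s, mode='bilinear', align_corners=False)
            with torch.no_grad():
                logits = net(scaled)
            logits = F.interpolate(logits, size=(h, w), mode='bilinear', align_corners=False)
            total = logits if total is None else total + logits
        return (total / len(scales)).argmax(dim=1)   # per-pixel class prediction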

Run inference on Mapillary

> python -m runx.runx scripts/eval_mapillary.yml -i

The reported IOU should be 61.05. Note that this must be run on a 32GB node, and the use of 'O3' mode for AMP is critical in order to avoid running out of GPU memory. Results are written to logs/eval_mapillary/...
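The 'O3' referred to above is an Apex AMP opt level (pure FP16). A minimal sketch of wrapping a model with it, assuming Apex is installed (build_network is a hypothetical placeholder, not this repo's API):

    from apex import amp

    model = build_network().cuda()                  # hypothetical builder; any nn.Module works here
    model = amp.initialize(model, opt_level='O3')   # 'O3' casts the model to FP16, cutting activation memory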

Dump images for Cityscapes

> python -m runx.runx scripts/dump_cityscapes.yml -i

This will dump network output and composited images from running evaluation with the Cityscapes validation set.

Run inference and dump images on a folder of images

> python -m runx.runx scripts/dump_folder.yml -i

You should end up seeing images that look like the following:

(example segmentation output image)

Train a model

Train Cityscapes using HRNet + OCR + multi-scale attention, with fine data and a Mapillary-pretrained model:

> python -m runx.runx scripts/train_cityscapes.yml -i

The first time this command is run, a centroid file has to be built for the dataset. It'll take about 10 minutes. The centroid file is used during training to know how to sample from the dataset in a class-uniform way.
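Conceptually, the centroid file maps each class to a list of records pointing at images (and locations) where that class appears, so that each epoch can be rebuilt with roughly uniform class coverage. A simplified sketch of the idea, not the repo's actual implementation:

    import random

    def build_class_uniform_epoch(all_images, centroids, epoch_size, class_uniform_pct=0.5):
        """Mix uniformly random images with per-class samples drawn from the centroid index."""
        num_random = int(epoch_size * (1 - class_uniform_pct))
        epoch = [random.choice(all_images) for _ in range(num_random)]
        per_class = int(epoch_size * class_uniform_pct / max(len(centroids), 1))
        for class_id, records in centroids.items():
            if not records:
                continue
            # each record identifies an image (and a centroid location) containing class_id
            epoch.extend(random.choice(records) for _ in range(per_class))
        random.shuffle(epoch)
        return epoch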

This training run should deliver a model that achieves 84.7 IOU.

Train SOTA default train-val split

> python -m runx.runx scripts/train_cityscapes_sota.yml -i

Again, use -n to do a dry run and just print out the command. This should result in a model with 86.8 IOU. If you run out of memory, try lowering the crop size or turning off rmi_loss.

Comments
  • need your help

    Hi authors, thank you for sharing your great work! I ran into a problem and need your help when running your code (./scripts/submit_cityscapes_WideResNet38.sh ./cityscapes_best.pth ./result):

    RuntimeError: CUDA out of memory. Tried to allocate 1.50 GiB (GPU 0; 10.91 GiB total capacity; 8.29 GiB already allocated; 1.06 GiB free; 877.60 MiB cached)

    Could you please tell me how to solve it? Looking forward to your help, thank you very much!

    opened by sde123 39
  • Pretrained models' training condition

    What is the training condition of your cityscapes_best.pth? Is it

    1. Pretrained on Mapillary?
    2. Use label relaxation loss?
    3. Use sdc-aug label propagation?

    I want to do comparison with my models. Thank you.

    opened by kwea123 20
  • tarfile.EmptyHeaderError: empty header

    Hi

    I am having a problem when I try to run demo.py. When I run the following command:

    python3 demo.py --demo-image test.png --snapshot pretrained_models/cityscapes_best.pth

    I get the error:

    Using regular batch norm
    Net built.
    Traceback (most recent call last):
      File "/usr/lib/python3.6/tarfile.py", line 2297, in next
        tarinfo = self.tarinfo.fromtarfile(self)
      File "/usr/lib/python3.6/tarfile.py", line 1093, in fromtarfile
        obj = cls.frombuf(buf, tarfile.encoding, tarfile.errors)
      File "/usr/lib/python3.6/tarfile.py", line 1029, in frombuf
        raise EmptyHeaderError("empty header")
    tarfile.EmptyHeaderError: empty header
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/lib/python3.6/dist-packages/torch/serialization.py", line 595, in _load
        return legacy_load(f)
      File "/usr/local/lib/python3.6/dist-packages/torch/serialization.py", line 506, in legacy_load
        with closing(tarfile.open(fileobj=f, mode='r:', format=tarfile.PAX_FORMAT)) as tar, \
      File "/usr/lib/python3.6/tarfile.py", line 1589, in open
        return func(name, filemode, fileobj, **kwargs)
      File "/usr/lib/python3.6/tarfile.py", line 1619, in taropen
        return cls(name, mode, fileobj, **kwargs)
      File "/usr/lib/python3.6/tarfile.py", line 1482, in __init__
        self.firstmember = self.next()
      File "/usr/lib/python3.6/tarfile.py", line 2312, in next
        raise ReadError("empty file")
    tarfile.ReadError: empty file
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "demo.py", line 32, in <module>
        net, _ = restore_snapshot(net, optimizer=None, snapshot=args.snapshot, restore_optimizer_bool=False)
      File "/home/sieuwe/Downloads/semantic-segmentation-master/optimizer.py", line 69, in restore_snapshot
        checkpoint = torch.load(snapshot, map_location=torch.device('cpu'))
      File "/usr/local/lib/python3.6/dist-packages/torch/serialization.py", line 426, in load
        return _load(f, map_location, pickle_module, **pickle_load_args)
      File "/usr/local/lib/python3.6/dist-packages/torch/serialization.py", line 597, in _load
        if _is_zipfile(f):
      File "/usr/local/lib/python3.6/dist-packages/torch/serialization.py", line 75, in _is_zipfile
        if ord(magic_byte) != ord(read_byte):
    TypeError: ord() expected a character, but string of length 0 found
    
    

    I believe the error occurs when it is loading pretrained_models/wider_resnet38.pth.tar, but I don't know what I have done wrong.

    The steps I followed were:

    • Clone the repo
    • Install all dependencies
    • Download cityscapes_best.pth and wider_resnet38.pth.tar
    • Make a folder called pretrained_models in the repo root and add the two mentioned files
    • Run the above-mentioned command to run the demo

    My system: Ubuntu 18.04 LTS, Python 3.6, CUDA 10.2, driver 440.33.01, GTX 1080.

    PS: I am also wondering if there is a Python video implementation of this network which can take a normal video as input, since demo.py is only for images, right? Or can I make a loop which iterates over a video and makes a prediction per frame using the net(img) function? If that is possible, what FPS could such an implementation reach on a GTX 1080?
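    For reference, a rough sketch of such a per-frame loop with OpenCV (the preprocessing here is a hypothetical placeholder and net is assumed to be the already-loaded model, so adapt it to the repo's actual transforms):

        import cv2
        import torch

        cap = cv2.VideoCapture('input_video.mp4')   # hypothetical input path
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # placeholder preprocessing: BGR -> RGB, scale to [0, 1], add batch dimension
            img = torch.from_numpy(frame[:, :, ::-1].copy()).permute(2, 0, 1).float().unsqueeze(0) / 255.0
            with torch.no_grad():
                pred = net(img.cuda()).argmax(dim=1)  # per-pixel class ids
            # colorize `pred` and write frames out with cv2.VideoWriter as needed
        cap.release()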

    Thanks

    Sieuwe

    opened by sieuwe1 19
  • package usage

    Hello,

    Thank you for sharing the project. I wonder how to use the package after executing docker build -t nvidia-segmgentation -f Dockerfile .? Are there any commands to run before trying the trained model using CUDA_VISIBLE_DEVICES=0 python demo.py --demo-image YOUR_IMG --snapshot ./pretrained_models/cityscapes_best_wideresnet38.pth --save-dir YOUR_SAVE_DIR? Thank you very much.

    Regards, Jay

    opened by jwangjie 19
  • Loss

    Hello, I have been reading this paper recently and found the Label Relaxation part very useful, so I would like to adopt the Label Relaxation portion in my own work. I previously used a cross-entropy loss, and today I tried your ImgWtLossSoftNLL, but it raised an error. I tested with input [1,3,480,640] and target [1,480,640]. Could you explain in detail how to use your loss? Thank you very much, waiting online!

    opened by songyadong106 19
  • Finetuning on KITTI

    Hi, I am impressed by the outcome of your work on multiple datasets. However, I wonder if you can offer some training details about finetuning on KITTI to achieve 72.8% on the test set, since the amount of data in KITTI is highly limited. Sincere thanks.

    opened by HanqingXu 15
  • Centroid file causing issues with 2 GPUs but not with 1 GPU

    I am trying to train this model on the IDD dataset (https://idd.insaan.iiit.ac.in/). I wrote an idd.py and idd_labels.py to correspond with training on Cityscapes. If I use only 1 GPU I can train the model, but if I use 2 GPUs the program fails after building the centroid file and before training begins. If I turn off the centroid file by setting class_uniform_pct=0, then I can make the model train with 2 GPUs.

    What could be causing things to work on 1 GPU but not 2 with the centroid file? Thanks!

    opened by rod409 13
  • Error when attempting training: 'No Support for SyncBN without Apex'

    Hello, I am trying to use the train_kitti script within Docker but am getting the output below regarding 'No Support for SyncBN without Apex'. I am not sure what I need to do, can you help? In case it is related, the original Dockerfile failed to build during the Apex setup, so I changed the "Install Apex" line to check out an earlier commit of Apex before running the setup. Thanks!

    Total world size: 1
    Traceback (most recent call last):
      File "train.py", line 318, in <module>
        main()
      File "train.py", line 156, in main
        assert_and_infer_cfg(args)
      File "/home/test/semantic-segmentation/config.py", line 101, in assert_and_infer_cfg
        raise Exception('No Support for SyncBN without Apex')
    Exception: No Support for SyncBN without Apex
    Traceback (most recent call last):
      File "/opt/conda/lib/python3.6/runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "/opt/conda/lib/python3.6/runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "/opt/conda/lib/python3.6/site-packages/torch/distributed/launch.py", line 235, in <module>
        main()
      File "/opt/conda/lib/python3.6/site-packages/torch/distributed/launch.py", line 231, in main
        cmd=process.args)
    subprocess.CalledProcessError: Command '['/opt/conda/bin/python', '-u', 'train.py', '--local_rank=0', '--dataset', 'kitti', '--cv', '2', '--arch', 'network.deepv3.DeepWV3Plus', '--snapshot', './pretrained_models/cityscapes_best.pth', '--class_uniform_pct', '0.5', '--class_uniform_tile', '300', '--lr', '0.001', '--lr_schedule', 'poly', '--poly_exp', '1.0', '--syncbn', '--sgd', '--crop_size', '360', '--scale_min', '1.0', '--scale_max', '2.0', '--color_aug', '0.25', '--max_epoch', '90', '--img_wt_loss', '--wt_bound', '1.0', '--bs_mult', '2', '--apex', '--exp', 'kitti_ft', '--ckpt', './logs/', '--tb_path', './logs/']' returned non-zero exit status 1.

    opened by rod409 11
  • module 'optimizer' has no attribute 'load_state_dict'

    Thank you for providing the code! Something goes wrong when I resume from a checkpoint.

    Traceback (most recent call last):
      File "train.py", line 326, in <module>
        main()
      File "train.py", line 169, in main
        args.snapshot, args.restore_optimizer)
      File "/home/fuyi02/vos/semantic-segmentation/optimizer.py", line 61, in load_weights
        net, optimizer = restore_snapshot(net, optimizer, snapshot_file, restore_optimizer_bool)
      File "/home/fuyi02/vos/semantic-segmentation/optimizer.py", line 77, in restore_snapshot
        optimizer.load_state_dict(checkpoint['optimizer'])
    AttributeError: module 'optimizer' has no attribute 'load_state_dict'

    It seems that the definition of optimizer does not have 'load_state_dict'.

    opened by Ximoi 11
  • How can I get the video data of CamVid?

    Thank you for your great work! Could you please help with the data pre-processing? How can I get the video data of CamVid? I only found some MXF videos on the official database page, but the link to the pre-processing code is broken.

    Thanks, yifan

    opened by irfanICMLL 9
  • Failed to achieve 80+ mIOU on the pretrained model

    Hello, I am trying to test the inference code on the Cityscapes dataset. I have downloaded pretrained_models/cityscapes_best.pth [1071MB] and put it in the pretrained_model folder. I have also managed to change the directory of the Cityscapes dataset in config.py. The only thing I have changed is in eval.py, where I reduced the crop_size from 1024 to 512 due to GPU memory constraints (I have an RTX 2080 Ti).

    phong@phong-Server:~/data/Work/Paper2/Code/semantic-segmentation$ sudo ./scripts/eval_cityscapes.sh pretrained_models/cityscapes_best.pth ./results/ Running inference on pretrained_models/cityscapes_best.pth Saving Results : ./results/ Using regular batch norm Logging : ./results/val/eval_2019_07_23_11_14_50_rank_0.log 07-23 11:14:50.179 Network Arch: network.deepv3.DeepSRNX50V3PlusD_m1 07-23 11:14:50.186 CV split: 0 07-23 11:14:50.186 Exp_name: 07-23 11:14:50.186 Ckpt path: ./results/ 07-23 11:14:50.186 Scales : 1.0 07-23 11:14:50.186 Inference mode: sliding 07-23 11:14:50.187 val fine cities: ['val/frankfurt', 'val/lindau', 'val/munster'] /media/phong/Data/dataset/cityscapes/leftImg8bit_trainvaltest/leftImg8bit /media/phong/Data/dataset/cityscapes/gtFine_trainvaltest/gtFine 07-23 11:14:50.188 Cityscapes-val: 500 images 07-23 11:14:50.189 Load model file: pretrained_models/cityscapes_best.pth 07-23 11:14:50.647 Model params = 42.4M 07-23 11:14:53.781 Checkpoint Load Compelete 07-23 11:14:53.782 Skipped loading parameter module.layer0.conv1.weight 07-23 11:14:53.782 Skipped loading parameter module.layer0.bn1.weight 07-23 11:14:53.782 Skipped loading parameter module.layer0.bn1.bias 07-23 11:14:53.782 Skipped loading parameter module.layer0.bn1.running_mean 07-23 11:14:53.782 Skipped loading parameter module.layer0.bn1.running_var 07-23 11:14:53.782 Skipped loading parameter module.layer0.bn1.num_batches_tracked 07-23 11:14:53.782 Skipped loading parameter module.layer1.0.conv1.weight 07-23 11:14:53.782 Skipped loading parameter module.layer1.0.bn1.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn1.bias 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn1.running_mean 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn1.running_var 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn1.num_batches_tracked 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.conv2.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn2.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn2.bias 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn2.running_mean 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn2.running_var 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn2.num_batches_tracked 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.conv3.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn3.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn3.bias 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn3.running_mean 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn3.running_var 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.bn3.num_batches_tracked 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.se_module.fc1.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.se_module.fc1.bias 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.se_module.fc2.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.se_module.fc2.bias 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.downsample.0.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.downsample.1.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.downsample.1.bias 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.downsample.1.running_mean 07-23 11:14:53.783 Skipped loading parameter module.layer1.0.downsample.1.running_var 07-23 
11:14:53.783 Skipped loading parameter module.layer1.0.downsample.1.num_batches_tracked 07-23 11:14:53.783 Skipped loading parameter module.layer1.1.conv1.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.1.bn1.weight 07-23 11:14:53.783 Skipped loading parameter module.layer1.1.bn1.bias 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn1.running_mean 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn1.running_var 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn1.num_batches_tracked 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.conv2.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn2.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn2.bias 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn2.running_mean 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn2.running_var 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn2.num_batches_tracked 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.conv3.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn3.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn3.bias 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn3.running_mean 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn3.running_var 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.bn3.num_batches_tracked 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.se_module.fc1.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.se_module.fc1.bias 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.se_module.fc2.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.1.se_module.fc2.bias 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.conv1.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.bn1.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.bn1.bias 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.bn1.running_mean 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.bn1.running_var 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.bn1.num_batches_tracked 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.conv2.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.bn2.weight 07-23 11:14:53.784 Skipped loading parameter module.layer1.2.bn2.bias 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.bn2.running_mean 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.bn2.running_var 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.bn2.num_batches_tracked 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.conv3.weight 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.bn3.weight 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.bn3.bias 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.bn3.running_mean 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.bn3.running_var 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.bn3.num_batches_tracked 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.se_module.fc1.weight 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.se_module.fc1.bias 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.se_module.fc2.weight 07-23 11:14:53.785 Skipped loading parameter module.layer1.2.se_module.fc2.bias 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.conv1.weight 07-23 
11:14:53.785 Skipped loading parameter module.layer2.0.bn1.weight 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn1.bias 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn1.running_mean 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn1.running_var 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn1.num_batches_tracked 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.conv2.weight 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn2.weight 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn2.bias 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn2.running_mean 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn2.running_var 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn2.num_batches_tracked 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.conv3.weight 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn3.weight 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn3.bias 07-23 11:14:53.785 Skipped loading parameter module.layer2.0.bn3.running_mean 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.bn3.running_var 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.bn3.num_batches_tracked 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.se_module.fc1.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.se_module.fc1.bias 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.se_module.fc2.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.se_module.fc2.bias 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.downsample.0.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.downsample.1.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.downsample.1.bias 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.downsample.1.running_mean 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.downsample.1.running_var 07-23 11:14:53.786 Skipped loading parameter module.layer2.0.downsample.1.num_batches_tracked 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.conv1.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn1.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn1.bias 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn1.running_mean 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn1.running_var 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn1.num_batches_tracked 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.conv2.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn2.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn2.bias 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn2.running_mean 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn2.running_var 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn2.num_batches_tracked 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.conv3.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn3.weight 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn3.bias 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn3.running_mean 07-23 11:14:53.786 Skipped loading parameter module.layer2.1.bn3.running_var 07-23 11:14:53.787 Skipped loading parameter module.layer2.1.bn3.num_batches_tracked 07-23 11:14:53.787 Skipped loading parameter 
module.layer2.1.se_module.fc1.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.1.se_module.fc1.bias 07-23 11:14:53.787 Skipped loading parameter module.layer2.1.se_module.fc2.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.1.se_module.fc2.bias 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.conv1.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn1.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn1.bias 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn1.running_mean 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn1.running_var 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn1.num_batches_tracked 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.conv2.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn2.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn2.bias 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn2.running_mean 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn2.running_var 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn2.num_batches_tracked 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.conv3.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn3.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn3.bias 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn3.running_mean 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn3.running_var 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.bn3.num_batches_tracked 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.se_module.fc1.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.se_module.fc1.bias 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.se_module.fc2.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.2.se_module.fc2.bias 07-23 11:14:53.787 Skipped loading parameter module.layer2.3.conv1.weight 07-23 11:14:53.787 Skipped loading parameter module.layer2.3.bn1.weight 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn1.bias 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn1.running_mean 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn1.running_var 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn1.num_batches_tracked 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.conv2.weight 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn2.weight 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn2.bias 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn2.running_mean 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn2.running_var 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn2.num_batches_tracked 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.conv3.weight 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn3.weight 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn3.bias 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn3.running_mean 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn3.running_var 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.bn3.num_batches_tracked 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.se_module.fc1.weight 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.se_module.fc1.bias 07-23 11:14:53.788 Skipped loading parameter 
module.layer2.3.se_module.fc2.weight 07-23 11:14:53.788 Skipped loading parameter module.layer2.3.se_module.fc2.bias 07-23 11:14:53.788 Skipped loading parameter module.layer3.0.conv1.weight 07-23 11:14:53.788 Skipped loading parameter module.layer3.0.bn1.weight 07-23 11:14:53.788 Skipped loading parameter module.layer3.0.bn1.bias 07-23 11:14:53.788 Skipped loading parameter module.layer3.0.bn1.running_mean 07-23 11:14:53.788 Skipped loading parameter module.layer3.0.bn1.running_var 07-23 11:14:53.788 Skipped loading parameter module.layer3.0.bn1.num_batches_tracked 07-23 11:14:53.788 Skipped loading parameter module.layer3.0.conv2.weight 07-23 11:14:53.788 Skipped loading parameter module.layer3.0.bn2.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn2.bias 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn2.running_mean 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn2.running_var 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn2.num_batches_tracked 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.conv3.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn3.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn3.bias 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn3.running_mean 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn3.running_var 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.bn3.num_batches_tracked 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.se_module.fc1.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.se_module.fc1.bias 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.se_module.fc2.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.se_module.fc2.bias 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.downsample.0.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.downsample.1.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.downsample.1.bias 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.downsample.1.running_mean 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.downsample.1.running_var 07-23 11:14:53.789 Skipped loading parameter module.layer3.0.downsample.1.num_batches_tracked 07-23 11:14:53.789 Skipped loading parameter module.layer3.1.conv1.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.1.bn1.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.1.bn1.bias 07-23 11:14:53.789 Skipped loading parameter module.layer3.1.bn1.running_mean 07-23 11:14:53.789 Skipped loading parameter module.layer3.1.bn1.running_var 07-23 11:14:53.789 Skipped loading parameter module.layer3.1.bn1.num_batches_tracked 07-23 11:14:53.789 Skipped loading parameter module.layer3.1.conv2.weight 07-23 11:14:53.789 Skipped loading parameter module.layer3.1.bn2.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.bn2.bias 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.bn2.running_mean 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.bn2.running_var 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.bn2.num_batches_tracked 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.conv3.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.bn3.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.bn3.bias 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.bn3.running_mean 07-23 11:14:53.790 Skipped 
loading parameter module.layer3.1.bn3.running_var 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.bn3.num_batches_tracked 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.se_module.fc1.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.se_module.fc1.bias 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.se_module.fc2.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.1.se_module.fc2.bias 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.conv1.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn1.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn1.bias 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn1.running_mean 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn1.running_var 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn1.num_batches_tracked 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.conv2.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn2.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn2.bias 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn2.running_mean 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn2.running_var 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn2.num_batches_tracked 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.conv3.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn3.weight 07-23 11:14:53.790 Skipped loading parameter module.layer3.2.bn3.bias 07-23 11:14:53.791 Skipped loading parameter module.layer3.2.bn3.running_mean 07-23 11:14:53.791 Skipped loading parameter module.layer3.2.bn3.running_var 07-23 11:14:53.791 Skipped loading parameter module.layer3.2.bn3.num_batches_tracked 07-23 11:14:53.791 Skipped loading parameter module.layer3.2.se_module.fc1.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.2.se_module.fc1.bias 07-23 11:14:53.791 Skipped loading parameter module.layer3.2.se_module.fc2.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.2.se_module.fc2.bias 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.conv1.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn1.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn1.bias 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn1.running_mean 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn1.running_var 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn1.num_batches_tracked 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.conv2.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn2.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn2.bias 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn2.running_mean 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn2.running_var 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn2.num_batches_tracked 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.conv3.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn3.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn3.bias 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn3.running_mean 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn3.running_var 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.bn3.num_batches_tracked 07-23 11:14:53.791 Skipped loading 
parameter module.layer3.3.se_module.fc1.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.se_module.fc1.bias 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.se_module.fc2.weight 07-23 11:14:53.791 Skipped loading parameter module.layer3.3.se_module.fc2.bias 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.conv1.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn1.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn1.bias 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn1.running_mean 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn1.running_var 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn1.num_batches_tracked 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.conv2.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn2.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn2.bias 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn2.running_mean 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn2.running_var 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn2.num_batches_tracked 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.conv3.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn3.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn3.bias 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn3.running_mean 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn3.running_var 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.bn3.num_batches_tracked 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.se_module.fc1.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.se_module.fc1.bias 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.se_module.fc2.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.4.se_module.fc2.bias 07-23 11:14:53.792 Skipped loading parameter module.layer3.5.conv1.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.5.bn1.weight 07-23 11:14:53.792 Skipped loading parameter module.layer3.5.bn1.bias 07-23 11:14:53.792 Skipped loading parameter module.layer3.5.bn1.running_mean 07-23 11:14:53.792 Skipped loading parameter module.layer3.5.bn1.running_var 07-23 11:14:53.792 Skipped loading parameter module.layer3.5.bn1.num_batches_tracked 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.conv2.weight 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn2.weight 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn2.bias 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn2.running_mean 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn2.running_var 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn2.num_batches_tracked 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.conv3.weight 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn3.weight 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn3.bias 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn3.running_mean 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn3.running_var 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.bn3.num_batches_tracked 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.se_module.fc1.weight 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.se_module.fc1.bias 07-23 11:14:53.793 Skipped loading 
parameter module.layer3.5.se_module.fc2.weight 07-23 11:14:53.793 Skipped loading parameter module.layer3.5.se_module.fc2.bias 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.conv1.weight 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn1.weight 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn1.bias 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn1.running_mean 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn1.running_var 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn1.num_batches_tracked 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.conv2.weight 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn2.weight 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn2.bias 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn2.running_mean 07-23 11:14:53.793 Skipped loading parameter module.layer4.0.bn2.running_var 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.bn2.num_batches_tracked 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.conv3.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.bn3.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.bn3.bias 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.bn3.running_mean 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.bn3.running_var 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.bn3.num_batches_tracked 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.se_module.fc1.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.se_module.fc1.bias 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.se_module.fc2.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.se_module.fc2.bias 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.downsample.0.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.downsample.1.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.downsample.1.bias 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.downsample.1.running_mean 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.downsample.1.running_var 07-23 11:14:53.794 Skipped loading parameter module.layer4.0.downsample.1.num_batches_tracked 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.conv1.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn1.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn1.bias 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn1.running_mean 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn1.running_var 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn1.num_batches_tracked 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.conv2.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn2.weight 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn2.bias 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn2.running_mean 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn2.running_var 07-23 11:14:53.794 Skipped loading parameter module.layer4.1.bn2.num_batches_tracked 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.conv3.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.bn3.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.bn3.bias 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.bn3.running_mean 07-23 11:14:53.795 
Skipped loading parameter module.layer4.1.bn3.running_var 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.bn3.num_batches_tracked 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.se_module.fc1.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.se_module.fc1.bias 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.se_module.fc2.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.1.se_module.fc2.bias 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.conv1.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn1.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn1.bias 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn1.running_mean 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn1.running_var 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn1.num_batches_tracked 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.conv2.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn2.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn2.bias 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn2.running_mean 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn2.running_var 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn2.num_batches_tracked 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.conv3.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn3.weight 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn3.bias 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn3.running_mean 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn3.running_var 07-23 11:14:53.795 Skipped loading parameter module.layer4.2.bn3.num_batches_tracked 07-23 11:14:53.796 Skipped loading parameter module.layer4.2.se_module.fc1.weight 07-23 11:14:53.796 Skipped loading parameter module.layer4.2.se_module.fc1.bias 07-23 11:14:53.796 Skipped loading parameter module.layer4.2.se_module.fc2.weight 07-23 11:14:53.796 Skipped loading parameter module.layer4.2.se_module.fc2.bias 07-23 11:14:53.796 Skipped loading parameter module.aspp.features.0.0.weight 07-23 11:14:53.796 Skipped loading parameter module.aspp.features.1.0.weight 07-23 11:14:53.796 Skipped loading parameter module.aspp.features.2.0.weight 07-23 11:14:53.796 Skipped loading parameter module.aspp.features.3.0.weight 07-23 11:14:53.796 Skipped loading parameter module.aspp.img_conv.0.weight 07-23 11:14:53.796 Skipped loading parameter module.bot_fine.weight eval val: 0%| | 0/500 [00:00<?, ?it/s]/home/phong/data/Work/Paper2/Code/semantic-segmentation/utils/misc.py:68: RuntimeWarning: invalid value encountered in true_divide return np.diag(hist) / (hist.sum(1) + hist.sum(0) - np.diag(hist)) Mean IOU: 2.63: 7%|███████████ | 37/500 [03:36<44:12, 5.73s/it] This is the result I have: frankfurt_000000_000294_leftImg8bit_compose

    I don't know what I have done wrong here, please help! Thank you!

    opened by phongnhhn92 9
  • There are some problems that may prevent the input parameter "max_cu_epoch" from working as you hope

    1. datasets/cityscapes.py:143: self.centroids = self.fine_centroids should probably be changed to self.centroids = copy.deepcopy(self.fine_centroids). As I see it, self.centroids is a combination of self.fine_centroids and coarse_centroids, but if you just use = here, self.fine_centroids is changed whenever self.centroids is changed by the following code: self.centroids[cid].extend(self.coarse_centroids[cid]), at line 163 of the same file. So when disable_coarse() is called, self.centroids is not really changed, because self.fine_centroids is the same object as self.centroids (see the sketch after this list).
    2. train.py:442: train_obj.disable_coarse(). In my test, even though we call this function and it does execute, the dataset held by the DataLoader is not changed. If you print id(self.imgs) inside build_epoch() and __getitem__() separately after calling disable_coarse(), you will find their ids are different, which means self.imgs is changed in build_epoch() but not used in __getitem__(). Maybe this is because of the implementation of DataLoader.
    3. When epoch > max_cu_epoch, the newly generated self.imgs should all come from gtFine_trainvaltest, but because of these two problems, it won't work.
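    A minimal, self-contained sketch of the aliasing behaviour described in point 1 (with made-up centroid values, not the repo's actual data structures):

        import copy

        fine_centroids = {0: [(10, 20)], 1: [(30, 40)]}
        coarse_centroids = {0: [(50, 60)], 1: [(70, 80)]}

        centroids = fine_centroids                 # plain assignment: both names refer to the same dict
        for cid in centroids:
            centroids[cid].extend(coarse_centroids[cid])
        print(fine_centroids[0])                   # [(10, 20), (50, 60)] -- fine_centroids was mutated too

        fine_centroids = {0: [(10, 20)], 1: [(30, 40)]}
        centroids = copy.deepcopy(fine_centroids)  # deep copy: mutating centroids leaves fine_centroids intact
        for cid in centroids:
            centroids[cid].extend(coarse_centroids[cid])
        print(fine_centroids[0])                   # [(10, 20)] -- unchanged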
    opened by MoriartyShan 0
  • using the available trained models for fine-tuning over a new dataset

    I am going to fine-tune the available semantic segmentation models (in this repo). For fine-tuning I am going to use the Aeroscapes dataset, which has both aerial images and masks. I don't know how to add the new data to be used for training. What folder structure is needed (train, valid)? Please let me know if you used the available yml files and models from this repo for fine-tuning.

    opened by nattaran 0
  • inference issue

    We had some issues running inference on test data, and eventually we realized that we need to run this command: "python -m runx.runx scripts/dump_folder.yml -i" rather than what is available in the repo.

    opened by nattaran 0
  • Auto-Labelling ignore label

    Hi, thanks for the great work. I noticed from the paper that the label assignment is set as follows:

    Instead, we adopt a hard labelling strategy, whereby for a given pixel, we select the top class prediction of the teacher
    network. We threshold the label based on teacher network output probability. Teacher predictions that exceed the
    threshold become true labels, otherwise the pixel is labelled as ignore class. In practice we use a threshold of 0.9.
    

    However, when I downloaded the auto-labelled data, I noticed that all pixels are annotated and there are no ignore labels in this subset.
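    For reference, a minimal sketch of the thresholding strategy quoted above (the 0.9 threshold and the ignore class come from the quoted text; the tensor names and ignore index value are assumptions):

        import torch

        def hard_pseudo_labels(teacher_logits, threshold=0.9, ignore_index=255):
            """Turn teacher logits [N, C, H, W] into hard labels, marking low-confidence pixels as ignore."""
            probs = torch.softmax(teacher_logits, dim=1)
            confidence, labels = probs.max(dim=1)           # per-pixel top class and its probability
            labels[confidence < threshold] = ignore_index   # below-threshold pixels become the ignore class
            return labels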

    opened by lkdci 0
  • Demo for HSMA

    Hi, thanks for sharing the code.

    I would like to generate the inference result on a video sequence using the pre-trained HRNet_Mscale. Could you please give me some hints?

    opened by HieuPhan33 0
  • Why does eval report out of memory while n_scale is set the same in train and eval?

    n_scale is set the same in train and eval; batch size 2 or 4 can be used in training, but eval reports an out-of-memory error when run with batch size 2 or 4.

    opened by xu19971109 0