Lepard: Learning Partial point cloud matching in Rigid and Deformable scenes

Overview

Lepard: Learning Partial point cloud matching in Rigid and Deformable scenes [Paper]


Method overview


4DMatch Benchmark

4DMatch is a benchmark for matching and registration of partial point clouds with time-varying geometry. It is constructed from 1,761 sequences randomly selected from DeformingThings4D. Shown below are point cloud pairs with different overlap ratios.

(figure: point cloud pairs at different overlap ratios)
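For intuition, the overlap ratio of a pair can be estimated as the fraction of (warped) source points that have a target neighbor within an inlier radius. Below is a minimal NumPy sketch; the `warp` callable and the radius `tau` are assumptions for illustration, not the benchmark's exact definition:

```python
import numpy as np

def overlap_ratio(src, tgt, warp, tau=0.05):
    """Fraction of warped source points with a target point within `tau` meters.

    src, tgt: (N, 3) and (M, 3) point clouds; `warp` maps source points into
    the target frame (rigid or deformable). Brute force, fine for small clouds.
    """
    src_warped = warp(src)
    # pairwise squared distances, shape (N, M)
    d2 = ((src_warped[:, None, :] - tgt[None, :, :]) ** 2).sum(-1)
    return float((d2.min(axis=1) < tau ** 2).mean())
```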

Installation

We tested the code with Python 3.8.10 and PyTorch 1.7.1 or 1.9.0+cu111, on GeForce RTX 2080 and NVIDIA A100 GPUs.

conda env create -f environment.yml
conda activate lepard
cd cpp_wrappers; sh compile_wrappers.sh; cd ..

Download data and pretrained model

Train and evaluation on 4DMatch

Download and extract the 4DMatch split to a folder of your choice, then update data_root in configs/train/4dmatch.yaml and configs/test/4dmatch.yaml.
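For reference, the edited field might look like this (the path is a placeholder for your own folder):

```yaml
# configs/train/4dmatch.yaml (excerpt)
data_root: /path/to/4dmatch   # folder containing the extracted split
```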

  • Evaluate the pre-trained model
python main.py configs/test/4dmatch.yaml

(To switch between the 4DMatch and 4DLoMatch benchmarks, modify the split configuration in configs/test/4dmatch.yaml.)

  • Train from scratch
python main.py configs/train/4dmatch.yaml

Train and evaluation on 3DMatch

Download and extract the 3DMatch split to a folder of your choice, then update data_root in configs/train/3dmatch.yaml and configs/test/3dmatch.yaml.

  • Evaluate the pre-trained model
python main.py configs/test/3dmatch.yaml

(To switch between the 3DMatch and 3DLoMatch benchmarks, modify the split configuration in configs/test/3dmatch.yaml.)

  • Train from scratch
python main.py configs/train/3dmatch.yaml

Citation

If you use the Lepard code or 4DMatch data, please cite:

@article{lepard2021,
    title={Lepard: Learning partial point cloud matching in rigid and deformable scenes},
    author={Yang Li and Tatsuya Harada},
    journal={arXiv preprint arXiv:2111.12591},
    year={2021}
}
Comments
  • when i trained using 3dmatch , the dataloader have one problem

    In main.py line 99, `config.train_loader, neighborhood_limits = get_dataloader(train_set, config, shuffle=True)` fails: the train_set expects the file '3dmatch/3dmatch/train/rgbd-scenes-v2-scene_01/cloud_bin_1.pth', but I can't find 'cloud_bin_1.pth'. I have already downloaded 3dmatch.zip and unzipped it, but the extracted files do not contain 'cloud_bin_1.pth'. I hope you can help me. Thank you very much.

    opened by WuHaoYu21 9
  • Some questions about the hyperparameters

    Hello! Here are some questions about the code for the 3DMatch dataset.

    1. The output dimensionality of the KPConv backbone is set to 528. Theoretically, any number divisible by both 6 and 4 should work here, since divisibility by 6 serves the rotary positional embedding and divisibility by 4 serves multi-head attention. So why not choose 516 as the output dimensionality? Is there a reason behind this choice?
    2. The rotary positional embedding implementation is impressive. I noticed that you first voxelize the raw 3D coordinates (without flooring the result) to scale them; I believe vol_bnds = [-3.6, -2.4, 1.14] is the minimal coordinate over the whole 3DMatch train/val/test set? Also, does scaling the coordinates before positional embedding give better results than using the raw 3D coordinates directly? Could you give some hints about it?
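The divisibility-by-6 question above can be made concrete with a minimal NumPy sketch of an axis-wise 3D rotary embedding. This is a hypothetical illustration, not the repo's code: each spatial axis owns d/3 channels, rotated in sin/cos pairs, so d must be divisible by 3 × 2 = 6 (and by the head count for multi-head attention).

```python
import numpy as np

def rotary_pe_3d(feat, xyz, base=10000.0):
    """Axis-wise 3D rotary embedding (hypothetical sketch).

    feat: (N, d) features with d divisible by 6; xyz: (N, 3) scaled coords.
    Each axis owns d//3 channels, rotated in sin/cos pairs by angles
    proportional to that coordinate.
    """
    n, d = feat.shape
    assert d % 6 == 0
    d_axis = d // 3
    out = np.empty_like(feat)
    for a in range(3):
        # channels for axis `a`, grouped into rotation pairs
        f = feat[:, a * d_axis:(a + 1) * d_axis].reshape(n, d_axis // 2, 2)
        k = np.arange(d_axis // 2)
        theta = xyz[:, a:a + 1] / (base ** (2 * k / d_axis))  # (N, d_axis//2)
        cos, sin = np.cos(theta), np.sin(theta)
        # 2D rotation of each channel pair
        rot = np.stack([f[..., 0] * cos - f[..., 1] * sin,
                        f[..., 0] * sin + f[..., 1] * cos], axis=-1)
        out[:, a * d_axis:(a + 1) * d_axis] = rot.reshape(n, d_axis)
    return out
```

A useful sanity check is that each pairwise rotation preserves feature norms, which is the key property behind rotary embeddings.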

    Thank you very much for your help.

    opened by qsisi 7
  • OOM error; mem cost keep growing!

    Thanks for your excellent work! This paper is really nice and amazing. When I try to train Lepard myself (I only changed batchsize=1 and num_workers=0), I found that memory usage keeps growing (I have 126 GB of RAM; Lepard consumed almost 100 GB and was still rising) until the process was finally killed by Linux.

    Have you encountered this kind of problem before? I did not change your code; I guess the problem comes from the dataloader. Can you give me some suggestions?

    opened by 35p32 7
  • EPE3D, Acc5 and Acc10 computation on 4DMatch

    Hi Yang, how are EPE3D, Acc5 and Acc10 computed on 4DMatch? Are they computed on all points in the source point cloud, or only on the points from "metric_index" in the dataset?
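For context, a common convention for these scene-flow-style metrics is sketched below. This is a hypothetical helper; the repo's exact evaluation, including any "metric_index" subsetting and the precise thresholds, may differ:

```python
import numpy as np

def flow_metrics(pred_flow, gt_flow):
    """EPE3D / accuracy sketch for scene-flow-style evaluation.

    EPE3D: mean end-point error in meters over the evaluated points.
    acc_strict / acc_relaxed: fraction of points within a distance OR
    relative-error budget (the thresholds below are assumptions, not the
    repo's exact values).
    """
    err = np.linalg.norm(pred_flow - gt_flow, axis=1)
    gt_norm = np.maximum(np.linalg.norm(gt_flow, axis=1), 1e-9)
    rel = err / gt_norm
    epe3d = float(err.mean())
    acc_strict = float(((err < 0.025) | (rel < 0.05)).mean())   # "Acc5"-style
    acc_relaxed = float(((err < 0.05) | (rel < 0.10)).mean())   # "Acc10"-style
    return epe3d, acc_strict, acc_relaxed
```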

    opened by qinzheng93 4
  • Results of pretrained weights are different from the paper.

    Hi Yang,

    I tried running the pretrained weights on 4DMatch/4DLoMatch, but the results are lower than those reported in the paper. Do I need further modifications to reproduce the paper's results?

    opened by ghost 3
  • Some question about the paper

    Hi,

    Thank you for sharing your excellent work! I read the paper recently and have some questions about the position encoding. In the paper, you add position encoding in three places: self-attention, cross-attention, and the confidence matrix. As you mention, position encoding in cross-attention and matching may mislead the registration at the beginning. So I wonder whether you have compared performance when removing the position encoding from the cross-attention and confidence-matrix parts in the first stage but retaining them in the second stage.

    Also, the figures in your paper are very beautiful. Could you tell me how you drew the point clouds and correspondences in Figure 7 and the point cloud video in the readme? I would appreciate any hints on the above questions. Thank you very much!

    opened by Gilgamesh666666 3
  • loading state_dict error during 3dmatch inference

    Thanks for sharing such inspiring work. I was trying to re-test inference performance on the 3DMatch dataset and followed the instructions, starting the test with `python main.py configs/test/3dmatch.yaml`; however, inference failed with the error below:

    File "/home/hadley/Development/lepard/lib/trainer.py", line 80, in _load_pretrain
        if os.path.isfile(resume):
      File "/home/hadley/anaconda3/envs/torch_12/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1604, in load_state_dict
        raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
    RuntimeError: Error(s) in loading state_dict for Pipeline:
    	Missing key(s) in state_dict: "backbone.encoder_blocks.0.KPConv.weights", "backbone.encoder_blocks.0.KPConv.kernel_points", "backbone.encoder_blocks.1.unary1.mlp.weight", "backbone.encoder_blocks.1.KPConv.weights", "backbone.encoder_blocks.1.KPConv.kernel_points", "backbone.encoder_blocks.1.unary2.mlp.weight", "backbone.encoder_blocks.1.unary_shortcut.mlp.weight", "backbone.encoder_blocks.2.unary1.mlp.weight", "backbone.encoder_blocks.2.KPConv.weights", "backbone.encoder_blocks.2.KPConv.kernel_points", "backbone.encoder_blocks.2.unary2.mlp.weight", "backbone.encoder_blocks.3.unary1.mlp.weight", "backbone.encoder_blocks.3.KPConv.weights", "backbone.encoder_blocks.3.KPConv.kernel_points", "backbone.encoder_blocks.3.unary2.mlp.weight", "backbone.encoder_blocks.3.unary_shortcut.mlp.weight", "backbone.encoder_blocks.4.unary1.mlp.weight", "backbone.encoder_blocks.4.KPConv.weights", "backbone.encoder_blocks.4.KPConv.kernel_points", "backbone.encoder_blocks.4.unary2.mlp.weight", "backbone.encoder_blocks.5.unary1.mlp.weight", "backbone.encoder_blocks.5.KPConv.weights", "backbone.encoder_blocks.5.KPConv.kernel_points", "backbone.encoder_blocks.5.unary2.mlp.weight", "backbone.encoder_blocks.6.unary1.mlp.weight", "backbone.encoder_blocks.6.KPConv.weights", "backbone.encoder_blocks.6.KPConv.kernel_points", "backbone.encoder_blocks.6.unary2.mlp.weight", "backbone.encoder_blocks.6.unary_shortcut.mlp.weight", "backbone.encoder_blocks.7.unary1.mlp.weight", "backbone.encoder_blocks.7.KPConv.weights", "backbone.encoder_blocks.7.KPConv.kernel_points", "backbone.encoder_blocks.7.unary2.mlp.weight", "backbone.encoder_blocks.8.unary1.mlp.weight", "backbone.encoder_blocks.8.KPConv.weights", "backbone.encoder_blocks.8.KPConv.kernel_points", "backbone.encoder_blocks.8.unary2.mlp.weight", "backbone.encoder_blocks.9.unary1.mlp.weight", "backbone.encoder_blocks.9.KPConv.weights", "backbone.encoder_blocks.9.KPConv.kernel_points", "backbone.encoder_blocks.9.unary2.mlp.weight", 
"backbone.encoder_blocks.9.unary_shortcut.mlp.weight", "backbone.encoder_blocks.10.unary1.mlp.weight", "backbone.encoder_blocks.10.KPConv.weights", "backbone.encoder_blocks.10.KPConv.kernel_points", "backbone.encoder_blocks.10.unary2.mlp.weight", "backbone.coarse_out.weight", "backbone.coarse_out.bias", "backbone.coarse_in.weight", "backbone.coarse_in.bias", "backbone.decoder_blocks.1.mlp.weight", "backbone.decoder_blocks.3.mlp.weight", "backbone.decoder_blocks.5.mlp.weight", "backbone.fine_out.weight", "backbone.fine_out.bias", "coarse_transformer.layers.0.q_proj.weight", "coarse_transformer.layers.0.k_proj.weight", "coarse_transformer.layers.0.v_proj.weight", "coarse_transformer.layers.0.merge.weight", "coarse_transformer.layers.0.mlp.0.weight", "coarse_transformer.layers.0.mlp.2.weight", "coarse_transformer.layers.0.norm1.weight", "coarse_transformer.layers.0.norm1.bias", "coarse_transformer.layers.0.norm2.weight", "coarse_transformer.layers.0.norm2.bias", "coarse_transformer.layers.1.q_proj.weight", "coarse_transformer.layers.1.k_proj.weight", "coarse_transformer.layers.1.v_proj.weight", "coarse_transformer.layers.1.merge.weight", "coarse_transformer.layers.1.mlp.0.weight", "coarse_transformer.layers.1.mlp.2.weight", "coarse_transformer.layers.1.norm1.weight", "coarse_transformer.layers.1.norm1.bias", "coarse_transformer.layers.1.norm2.weight", "coarse_transformer.layers.1.norm2.bias", "coarse_transformer.layers.2.0.src_proj.weight", "coarse_transformer.layers.2.0.tgt_proj.weight", "coarse_transformer.layers.2.0.instNormLayer.weight", "coarse_transformer.layers.2.0.instNormLayer.bias", "coarse_transformer.layers.2.0.edgeNormLayer.weight", "coarse_transformer.layers.2.0.edgeNormLayer.bias", "coarse_transformer.layers.3.q_proj.weight", "coarse_transformer.layers.3.k_proj.weight", "coarse_transformer.layers.3.v_proj.weight", "coarse_transformer.layers.3.merge.weight", "coarse_transformer.layers.3.mlp.0.weight", "coarse_transformer.layers.3.mlp.2.weight", 
"coarse_transformer.layers.3.norm1.weight", "coarse_transformer.layers.3.norm1.bias", "coarse_transformer.layers.3.norm2.weight", "coarse_transformer.layers.3.norm2.bias", "coarse_transformer.layers.4.q_proj.weight", "coarse_transformer.layers.4.k_proj.weight", "coarse_transformer.layers.4.v_proj.weight", "coarse_transformer.layers.4.merge.weight", "coarse_transformer.layers.4.mlp.0.weight", "coarse_transformer.layers.4.mlp.2.weight", "coarse_transformer.layers.4.norm1.weight", "coarse_transformer.layers.4.norm1.bias", "coarse_transformer.layers.4.norm2.weight", "coarse_transformer.layers.4.norm2.bias", "coarse_matching.src_proj.weight", "coarse_matching.tgt_proj.weight", "coarse_matching.instNormLayer.weight", "coarse_matching.instNormLayer.bias", "coarse_matching.edgeNormLayer.weight", "coarse_matching.edgeNormLayer.bias". 
    	Unexpected key(s) in state_dict: "kpf_encoder.encoder_blocks.0.KPConv.weights", "kpf_encoder.encoder_blocks.0.KPConv.kernel_points", "kpf_encoder.encoder_blocks.1.unary1.mlp.weight", "kpf_encoder.encoder_blocks.1.KPConv.weights", "kpf_encoder.encoder_blocks.1.KPConv.kernel_points", "kpf_encoder.encoder_blocks.1.unary2.mlp.weight", "kpf_encoder.encoder_blocks.1.unary_shortcut.mlp.weight", "kpf_encoder.encoder_blocks.2.unary1.mlp.weight", "kpf_encoder.encoder_blocks.2.KPConv.weights", "kpf_encoder.encoder_blocks.2.KPConv.kernel_points", "kpf_encoder.encoder_blocks.2.unary2.mlp.weight", "kpf_encoder.encoder_blocks.3.unary1.mlp.weight", "kpf_encoder.encoder_blocks.3.KPConv.weights", "kpf_encoder.encoder_blocks.3.KPConv.kernel_points", "kpf_encoder.encoder_blocks.3.unary2.mlp.weight", "kpf_encoder.encoder_blocks.3.unary_shortcut.mlp.weight", "kpf_encoder.encoder_blocks.4.unary1.mlp.weight", "kpf_encoder.encoder_blocks.4.KPConv.weights", "kpf_encoder.encoder_blocks.4.KPConv.kernel_points", "kpf_encoder.encoder_blocks.4.unary2.mlp.weight", "kpf_encoder.encoder_blocks.5.unary1.mlp.weight", "kpf_encoder.encoder_blocks.5.KPConv.weights", "kpf_encoder.encoder_blocks.5.KPConv.kernel_points", "kpf_encoder.encoder_blocks.5.unary2.mlp.weight", "kpf_encoder.encoder_blocks.6.unary1.mlp.weight", "kpf_encoder.encoder_blocks.6.KPConv.weights", "kpf_encoder.encoder_blocks.6.KPConv.kernel_points", "kpf_encoder.encoder_blocks.6.unary2.mlp.weight", "kpf_encoder.encoder_blocks.6.unary_shortcut.mlp.weight", "kpf_encoder.encoder_blocks.7.unary1.mlp.weight", "kpf_encoder.encoder_blocks.7.KPConv.weights", "kpf_encoder.encoder_blocks.7.KPConv.kernel_points", "kpf_encoder.encoder_blocks.7.unary2.mlp.weight", "kpf_encoder.encoder_blocks.8.unary1.mlp.weight", "kpf_encoder.encoder_blocks.8.KPConv.weights", "kpf_encoder.encoder_blocks.8.KPConv.kernel_points", "kpf_encoder.encoder_blocks.8.unary2.mlp.weight", "kpf_encoder.encoder_blocks.9.unary1.mlp.weight", 
"kpf_encoder.encoder_blocks.9.KPConv.weights", "kpf_encoder.encoder_blocks.9.KPConv.kernel_points", "kpf_encoder.encoder_blocks.9.unary2.mlp.weight", "kpf_encoder.encoder_blocks.9.unary_shortcut.mlp.weight", "kpf_encoder.encoder_blocks.10.unary1.mlp.weight", "kpf_encoder.encoder_blocks.10.KPConv.weights", "kpf_encoder.encoder_blocks.10.KPConv.kernel_points", "kpf_encoder.encoder_blocks.10.unary2.mlp.weight", "feat_proj.weight", "feat_proj.bias", "transformer_encoder.layers.0.self_attn.in_proj_weight", "transformer_encoder.layers.0.self_attn.in_proj_bias", "transformer_encoder.layers.0.self_attn.out_proj.weight", "transformer_encoder.layers.0.self_attn.out_proj.bias", "transformer_encoder.layers.0.multihead_attn.in_proj_weight", "transformer_encoder.layers.0.multihead_attn.in_proj_bias", "transformer_encoder.layers.0.multihead_attn.out_proj.weight", "transformer_encoder.layers.0.multihead_attn.out_proj.bias", "transformer_encoder.layers.0.linear1.weight", "transformer_encoder.layers.0.linear1.bias", "transformer_encoder.layers.0.linear2.weight", "transformer_encoder.layers.0.linear2.bias", "transformer_encoder.layers.0.norm1.weight", "transformer_encoder.layers.0.norm1.bias", "transformer_encoder.layers.0.norm2.weight", "transformer_encoder.layers.0.norm2.bias", "transformer_encoder.layers.0.norm3.weight", "transformer_encoder.layers.0.norm3.bias", "transformer_encoder.layers.1.self_attn.in_proj_weight", "transformer_encoder.layers.1.self_attn.in_proj_bias", "transformer_encoder.layers.1.self_attn.out_proj.weight", "transformer_encoder.layers.1.self_attn.out_proj.bias", "transformer_encoder.layers.1.multihead_attn.in_proj_weight", "transformer_encoder.layers.1.multihead_attn.in_proj_bias", "transformer_encoder.layers.1.multihead_attn.out_proj.weight", "transformer_encoder.layers.1.multihead_attn.out_proj.bias", "transformer_encoder.layers.1.linear1.weight", "transformer_encoder.layers.1.linear1.bias", "transformer_encoder.layers.1.linear2.weight", 
"transformer_encoder.layers.1.linear2.bias", "transformer_encoder.layers.1.norm1.weight", "transformer_encoder.layers.1.norm1.bias", "transformer_encoder.layers.1.norm2.weight", "transformer_encoder.layers.1.norm2.bias", "transformer_encoder.layers.1.norm3.weight", "transformer_encoder.layers.1.norm3.bias", "transformer_encoder.layers.2.self_attn.in_proj_weight", "transformer_encoder.layers.2.self_attn.in_proj_bias", "transformer_encoder.layers.2.self_attn.out_proj.weight", "transformer_encoder.layers.2.self_attn.out_proj.bias", "transformer_encoder.layers.2.multihead_attn.in_proj_weight", "transformer_encoder.layers.2.multihead_attn.in_proj_bias", "transformer_encoder.layers.2.multihead_attn.out_proj.weight", "transformer_encoder.layers.2.multihead_attn.out_proj.bias", "transformer_encoder.layers.2.linear1.weight", "transformer_encoder.layers.2.linear1.bias", "transformer_encoder.layers.2.linear2.weight", "transformer_encoder.layers.2.linear2.bias", "transformer_encoder.layers.2.norm1.weight", "transformer_encoder.layers.2.norm1.bias", "transformer_encoder.layers.2.norm2.weight", "transformer_encoder.layers.2.norm2.bias", "transformer_encoder.layers.2.norm3.weight", "transformer_encoder.layers.2.norm3.bias", "transformer_encoder.layers.3.self_attn.in_proj_weight", "transformer_encoder.layers.3.self_attn.in_proj_bias", "transformer_encoder.layers.3.self_attn.out_proj.weight", "transformer_encoder.layers.3.self_attn.out_proj.bias", "transformer_encoder.layers.3.multihead_attn.in_proj_weight", "transformer_encoder.layers.3.multihead_attn.in_proj_bias", "transformer_encoder.layers.3.multihead_attn.out_proj.weight", "transformer_encoder.layers.3.multihead_attn.out_proj.bias", "transformer_encoder.layers.3.linear1.weight", "transformer_encoder.layers.3.linear1.bias", "transformer_encoder.layers.3.linear2.weight", "transformer_encoder.layers.3.linear2.bias", "transformer_encoder.layers.3.norm1.weight", "transformer_encoder.layers.3.norm1.bias", 
"transformer_encoder.layers.3.norm2.weight", "transformer_encoder.layers.3.norm2.bias", "transformer_encoder.layers.3.norm3.weight", "transformer_encoder.layers.3.norm3.bias", "transformer_encoder.layers.4.self_attn.in_proj_weight", "transformer_encoder.layers.4.self_attn.in_proj_bias", "transformer_encoder.layers.4.self_attn.out_proj.weight", "transformer_encoder.layers.4.self_attn.out_proj.bias", "transformer_encoder.layers.4.multihead_attn.in_proj_weight", "transformer_encoder.layers.4.multihead_attn.in_proj_bias", "transformer_encoder.layers.4.multihead_attn.out_proj.weight", "transformer_encoder.layers.4.multihead_attn.out_proj.bias", "transformer_encoder.layers.4.linear1.weight", "transformer_encoder.layers.4.linear1.bias", "transformer_encoder.layers.4.linear2.weight", "transformer_encoder.layers.4.linear2.bias", "transformer_encoder.layers.4.norm1.weight", "transformer_encoder.layers.4.norm1.bias", "transformer_encoder.layers.4.norm2.weight", "transformer_encoder.layers.4.norm2.bias", "transformer_encoder.layers.4.norm3.weight", "transformer_encoder.layers.4.norm3.bias", "transformer_encoder.layers.5.self_attn.in_proj_weight", "transformer_encoder.layers.5.self_attn.in_proj_bias", "transformer_encoder.layers.5.self_attn.out_proj.weight", "transformer_encoder.layers.5.self_attn.out_proj.bias", "transformer_encoder.layers.5.multihead_attn.in_proj_weight", "transformer_encoder.layers.5.multihead_attn.in_proj_bias", "transformer_encoder.layers.5.multihead_attn.out_proj.weight", "transformer_encoder.layers.5.multihead_attn.out_proj.bias", "transformer_encoder.layers.5.linear1.weight", "transformer_encoder.layers.5.linear1.bias", "transformer_encoder.layers.5.linear2.weight", "transformer_encoder.layers.5.linear2.bias", "transformer_encoder.layers.5.norm1.weight", "transformer_encoder.layers.5.norm1.bias", "transformer_encoder.layers.5.norm2.weight", "transformer_encoder.layers.5.norm2.bias", "transformer_encoder.layers.5.norm3.weight", 
"transformer_encoder.layers.5.norm3.bias", "transformer_encoder.norm.weight", "transformer_encoder.norm.bias", "correspondence_decoder.coor_mlp.0.weight", "correspondence_decoder.coor_mlp.0.bias", "correspondence_decoder.coor_mlp.2.weight", "correspondence_decoder.coor_mlp.2.bias", "correspondence_decoder.coor_mlp.4.weight", "correspondence_decoder.coor_mlp.4.bias", "correspondence_decoder.conf_logits_decoder.weight", "correspondence_decoder.conf_logits_decoder.bias", "feature_criterion.W", "feature_criterion_un.W".
    

    What could possibly be the issue here? It seems that the backbone or other block names changed between training and testing, e.g. 'backbone.encoder_blocks...' keys are missing while 'kpf_encoder.encoder_blocks...' keys are unexpected. Any hint or help would be appreciated!
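If the mismatch were purely a module rename (e.g. 'kpf_encoder.' vs 'backbone.'), one workaround is to remap the checkpoint's key prefixes before calling load_state_dict. This is a generic sketch with a hypothetical mapping; whether the underlying tensors are actually shape-compatible (here the architectures also appear to differ) must still be verified:

```python
def remap_prefix(state_dict, mapping):
    """Rename parameter-key prefixes so an old checkpoint can load into a
    renamed model. `mapping` is hypothetical, e.g. {"kpf_encoder.": "backbone."}.
    Returns a new dict; tensor shape compatibility is not checked here.
    """
    out = {}
    for key, value in state_dict.items():
        for old, new in mapping.items():
            if key.startswith(old):
                key = new + key[len(old):]
                break  # apply at most one prefix rewrite per key
        out[key] = value
    return out
```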

    opened by hadleyhzy34 2
  • 4DMatch data and deformation graph generation.

    Hi Yang,

    I notice the data files in 4DMatch are named "camA_xxxx_camB_xxxx.npz". However, there is only one sequence for each shape in DeformingThings4D, so I am wondering what "cam1" and "cam2" mean. I guess "cam1" is the original camera in DeformingThings4D and "cam2" is generated with a random rigid transformation (fixed for the same shape); is this right?

    I also see in the supplementary material that the deformation graph is generated according to geodesic distances computed from the depth images; where can I find the implementation details for this?

    Thanks a lot.

    opened by qinzheng93 2
  • Which recall did you report in the paper?

    Hello! Thanks for open-sourcing this great work. I have a question about the registration recall reported in the article. From the code https://github.com/rabbityl/lepard/blob/bcae7f2ee1a2043372f2582b140645f2d6ade9f2/lib/tester.py#L119 it seems that you calculate pair-level recall instead of scene-level recall, but the methods you compare with, such as D3Feat and PREDATOR, compute recall at scene level rather than pair level.

    Could you give some hints about it?
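For context, the two conventions differ only in how they average: pair-level recall averages over all test pairs, while scene-level recall averages within each scene first, so small scenes weigh as much as large ones. A small sketch with hypothetical helpers (not the repo's code):

```python
import numpy as np

def pair_level_recall(successes):
    """successes: list of bools, one per test pair (e.g. RMSE below threshold).
    Pair-level recall averages over all pairs."""
    return float(np.mean(successes))

def scene_level_recall(successes, scene_ids):
    """Scene-level recall averages within each scene first, then across
    scenes, so every scene contributes equally regardless of pair count."""
    per_scene = {}
    for ok, sid in zip(successes, scene_ids):
        per_scene.setdefault(sid, []).append(ok)
    return float(np.mean([np.mean(v) for v in per_scene.values()]))
```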

    opened by qsisi 2
  • Some questions about training process

    Hi, I'd like to ask some questions about the training process:

    1. Did you use only one GPU during training?
    2. How long does one epoch take to train?
    3. In configs/train/3dmatch.yaml you set batch_size = 8 and num_workers = 16. On a single RTX 3090, even batch_size = 2 with num_workers = 4 runs out of GPU memory, so I can only train with batch_size = 1.
    4. Also, max_epoch is 1500; is training for 1500 epochs really necessary? Thank you very much for your help.
    opened by littlewater3 1
  • Error while executing sh compile_wrappers.sh for installing C++ extension

    Hi

    I am trying to bring up Lepard but I am facing issues while installing the C++ extension. Logs are attached below.

    My system specifications

    1. Windows 11
    2. WSL (Windows Subsystem for Linux)

    Can you please provide some input for solving this?

    `running build_ext building 'grid_subsampling' extension Warning: Can't read registry to find the necessary compiler setting Make sure that Python modules winreg, win32api or win32con are installed. C compiler: gcc -pthread -B /home/grblr06/anaconda3/envs/lepard/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/home/grblr06/anaconda3/envs/lepard/lib/python3.8/site-packages/numpy/core/include -I/home/grblr06/anaconda3/envs/lepard/include/python3.8 -c' extra options: '-std=c++11 -D_GLIBCXX_USE_CXX11_ABI=0' gcc: ../cpp_utils/cloud/cloud.cpp gcc: wrapper.cpp gcc: grid_subsampling/grid_subsampling.cpp error: Command "gcc -pthread -B /home/grblr06/anaconda3/envs/lepard/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/grblr06/anaconda3/envs/lepard/lib/python3.8/site-packages/numpy/core/include -I/home/grblr06/anaconda3/envs/lepard/include/python3.8 -c grid_subsampling/grid_subsampling.cpp -o build/temp.linux-x86_64-3.8/grid_subsampling/grid_subsampling.o -std=c++11 -D_GLIBCXX_USE_CXX11_ABI=0" failed with exit status 127 running build_ext building 'radius_neighbors' extension Warning: Can't read registry to find the necessary compiler setting Make sure that Python modules winreg, win32api or win32con are installed. C compiler: gcc -pthread -B /home/grblr06/anaconda3/envs/lepard/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/home/grblr06/anaconda3/envs/lepard/lib/python3.8/site-packages/numpy/core/include -I/home/grblr06/anaconda3/envs/lepard/include/python3.8 -c' extra options: '-std=c++11 -D_GLIBCXX_USE_CXX11_ABI=0' gcc: ../cpp_utils/cloud/cloud.cpp gcc: wrapper.cpp gcc: neighbors/neighbors.cpp error: Command "gcc -pthread -B /home/grblr06/anaconda3/envs/lepard/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/grblr06/anaconda3/envs/lepard/lib/python3.8/site-packages/numpy/core/include -I/home/grblr06/anaconda3/envs/lepard/include/python3.8 -c neighbors/neighbors.cpp -o build/temp.linux-x86_64-3.8/neighbors/neighbors.o -std=c++11 -D_GLIBCXX_USE_CXX11_ABI=0" failed with exit status 127`

    opened by mnrmja007 1
  • question about training process

    Hi, excellent work! I tried to re-train your network on the 3DMatch dataset. During the process I found that the parameter 'src_mask' is all True. I think True/False in 'src_mask' indicates whether a point in the source point cloud can possibly correspond to points in the target point cloud. Am I wrong, or does this parameter change during 4DMatch training? Hoping for your reply!

    opened by wra1229 0
  • question about the difference between get_match and get_topk_match

    Hi, thank you for sharing such brilliant work! I found that the definition of `get_match(conf_matrix, the, mutual=True)` is identical to that of `get_topk_match(conf_matrix, the, mutual=True)`. Is there any difference between `get_match` and `get_topk_match`? The paper says matches with confidence above a threshold θc are selected, with mutual nearest neighbor (MNN) criteria further enforced; does that refer to `get_topk_match`?
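For context, the thresholding-plus-MNN selection described in the paper can be sketched as follows. This is a hypothetical NumPy version for illustration, not the repo's implementation:

```python
import numpy as np

def select_matches(conf, thr, mutual=True):
    """Select (row, col) index pairs from a confidence matrix.

    Keeps entries with confidence above `thr`; with mutual=True, also
    requires the entry to be both its row's and its column's maximum
    (mutual nearest neighbor). Returns an (K, 2) array of indices.
    """
    mask = conf > thr
    if mutual:
        row_best = conf == conf.max(axis=1, keepdims=True)
        col_best = conf == conf.max(axis=0, keepdims=True)
        mask &= row_best & col_best
    return np.argwhere(mask)
```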

    opened by 2019EPWL 0
  • How to perform ICP on 3DMatch ?

    Hi,

    Thank you for sharing your excellent work! I read your paper recently, and I'm wondering which ICP implementation you used in the paper, and what point-to-point ICP refers to.

    opened by Gardlin 0
  • How to visualize the  point cloud features?

    Hi,

    Thank you for sharing your excellent work! I read your paper recently, and I'm wondering how to draw point cloud feature maps like yours, which I find really amazing. I would appreciate any ideas on how to produce such visualizations. Thank you very much!

    opened by Gardlin 2
  • Problem about retraining

    Amazing work! I tried to retrain on the dataset, but the loss did not converge. Have you encountered such a situation? The log file is shown below:

    #parameters 37.549191 M train Epoch: 1 [ 101/9204]focal_coarse: 3.13 recall_coarse: 0.00 precision_coarse: 0.02 loss: 3.13 train Epoch: 1 [ 202/9204]focal_coarse: 2.97 recall_coarse: 0.00 precision_coarse: 0.02 loss: 2.97 train Epoch: 1 [ 303/9204]focal_coarse: 2.81 recall_coarse: 0.00 precision_coarse: 0.02 loss: 2.81 train Epoch: 1 [ 404/9204]focal_coarse: 2.72 recall_coarse: 0.00 precision_coarse: 0.02 loss: 2.72 train Epoch: 1 [ 505/9204]focal_coarse: 2.65 recall_coarse: 0.00 precision_coarse: 0.02 loss: 2.65 train Epoch: 1 [ 606/9204]focal_coarse: 2.62 recall_coarse: 0.00 precision_coarse: 0.02 loss: 2.62 train Epoch: 1 [ 707/9204]focal_coarse: 2.60 recall_coarse: 0.00 precision_coarse: 0.02 loss: 2.60 train Epoch: 1 [ 808/9204]focal_coarse: 2.56 recall_coarse: 0.00 precision_coarse: 0.02 loss: 2.56 train Epoch: 1 [ 909/9204]focal_coarse: 2.53 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.53 train Epoch: 1 [1010/9204]focal_coarse: 2.50 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.50 train Epoch: 1 [1111/9204]focal_coarse: 2.48 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.48 train Epoch: 1 [1212/9204]focal_coarse: 2.46 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.46 train Epoch: 1 [1313/9204]focal_coarse: 2.44 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.44 train Epoch: 1 [1414/9204]focal_coarse: 2.43 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.43 train Epoch: 1 [1515/9204]focal_coarse: 2.42 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.42 train Epoch: 1 [1616/9204]focal_coarse: 2.41 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.41 train Epoch: 1 [1717/9204]focal_coarse: 2.39 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.39 train Epoch: 1 [1818/9204]focal_coarse: 2.38 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.38 train Epoch: 1 [1919/9204]focal_coarse: 2.37 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.37 train Epoch: 1 [2020/9204]focal_coarse: 2.36 recall_coarse: 0.00 precision_coarse: 0.03 
The training loss decreases through epoch 1, briefly improves early in epoch 2, then diverges and stays stuck at ~3.45 with zero recall and near-zero precision from epoch 3 onward. Condensed log (repeated near-identical entries elided):

train Epoch: 1 [2121/9204] focal_coarse: 2.35 recall_coarse: 0.00 precision_coarse: 0.03 loss: 2.35
...
train Epoch: 1 [9191/9204] focal_coarse: 1.95 recall_coarse: 0.00 precision_coarse: 0.11 loss: 1.95
Save model to experiments/3dmatch-last/2D3D_no_circle_dualmatching/checkpoints/model_1.pth
val Epoch: 1 focal_coarse: 1.81 recall_coarse: 0.00 precision_coarse: 0.09 loss: 1.81
Save model to experiments/3dmatch-last/2D3D_no_circle_dualmatching/checkpoints/model_best_loss.pth
train Epoch: 2 [ 101/9204] focal_coarse: 2.00 recall_coarse: 0.00 precision_coarse: 0.09 loss: 2.00
train Epoch: 2 [1818/9204] focal_coarse: 1.80 recall_coarse: 0.00 precision_coarse: 0.15 loss: 1.80
...
train Epoch: 2 [9191/9204] focal_coarse: 3.06 recall_coarse: 0.00 precision_coarse: 0.04 loss: 3.06
Save model to experiments/3dmatch-last/2D3D_no_circle_dualmatching/checkpoints/model_2.pth
val Epoch: 2 focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45
train Epoch: 3 [ 101/9204] focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45
...
train Epoch: 3 [9191/9204] focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45
Save model to experiments/3dmatch-last/2D3D_no_circle_dualmatching/checkpoints/model_3.pth
val Epoch: 3 focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45
train Epoch: 4 [ 101/9204] focal_coarse: 3.42 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.42
...
train Epoch: 4 [9191/9204] focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45
Save model to experiments/3dmatch-last/2D3D_no_circle_dualmatching/checkpoints/model_4.pth
val Epoch: 4 focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45
train Epoch: 5 [ 101/9204] focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45
...
train Epoch: 5 [6767/9204] focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45
[6868/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [6969/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7070/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7171/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7272/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7373/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7474/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7575/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7676/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7777/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7878/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [7979/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8080/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8181/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8282/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8383/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8484/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8585/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8686/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8787/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 
[8888/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [8989/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [9090/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5 [9191/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 5focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 Save model to experiments/3dmatch-last/2D3D_no_circle_dualmatching/checkpoints/model_5.pth val Epoch: 5focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [ 101/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [ 202/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [ 303/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [ 404/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [ 505/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [ 606/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [ 707/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [ 808/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [ 909/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [1010/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [1111/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [1212/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [1313/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [1414/9204]focal_coarse: 3.45 recall_coarse: 0.00 
precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [1515/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [1616/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [1717/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [1818/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [1919/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [2020/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [2121/9204]focal_coarse: 3.44 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.44 train Epoch: 6 [2222/9204]focal_coarse: 3.44 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.44 train Epoch: 6 [2323/9204]focal_coarse: 3.44 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.44 train Epoch: 6 [2424/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [2525/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [2626/9204]focal_coarse: 3.44 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.44 train Epoch: 6 [2727/9204]focal_coarse: 3.44 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.44 train Epoch: 6 [2828/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [2929/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [3030/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [3131/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [3232/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.01 loss: 3.45 train Epoch: 6 [3333/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [3434/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 
0.00 loss: 3.45 train Epoch: 6 [3535/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [3636/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [3737/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [3838/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [3939/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [4040/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [4141/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [4242/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45 train Epoch: 6 [4343/9204]focal_coarse: 3.45 recall_coarse: 0.00 precision_coarse: 0.00 loss: 3.45
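A flat loss like the one above is easy to confirm programmatically. This is a minimal sketch, not part of the Lepard repo: the regex, the `stuck_loss` helper, and the tolerance value are all assumptions made for illustration.

```python
import re

# Matches one log record, e.g.
# "train Epoch: 5 [1919/9204]focal_coarse: 3.45 ... loss: 3.45"
# Groups: epoch, iteration, total iterations, loss value.
RECORD = re.compile(r"Epoch:\s*(\d+)\s*\[\s*(\d+)/(\d+)\].*?loss:\s*([\d.]+)")

def stuck_loss(log_text, tol=0.01):
    """Return True if the loss never moves more than `tol` across the log."""
    losses = [float(m.group(4)) for m in RECORD.finditer(log_text)]
    return bool(losses) and max(losses) - min(losses) <= tol

sample = (
    "train Epoch: 5 [1919/9204]focal_coarse: 3.45 recall_coarse: 0.00 "
    "precision_coarse: 0.01 loss: 3.45 "
    "train Epoch: 5 [2020/9204]focal_coarse: 3.45 recall_coarse: 0.00 "
    "precision_coarse: 0.01 loss: 3.45"
)
print(stuck_loss(sample))  # True: the loss is flat at 3.45
```

If this check returns True over several epochs, the usual suspects are the data pipeline (e.g. ground-truth correspondences not reaching the loss) rather than the optimizer, which matches the reporter's suspicion that the dataloader is at fault.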

    opened by Pterosaur-Yao 0