Hi, in medicaltorch, inappropriate dependency version constraints can introduce risks.
Below are the dependencies and version constraints the project currently declares:
nibabel>=2.2.1
scipy>=1.0.0
numpy>=1.14.1
torch>=0.4.0
torchvision>=0.2.1
tqdm>=4.23.0
scikit-image==0.15.0
The version constraint == introduces a risk of dependency conflicts because it pins the dependency too strictly.
Constraints with no upper bound (or *) introduce a risk of missing-API errors, because the latest version of a dependency may have removed APIs the project still calls.
After further analysis of this project:
The version constraint of the dependency scipy can be changed to >=0.19.0,<=1.7.3.
The version constraint of the dependency tqdm can be changed to >=4.36.0,<=4.64.0.
These suggestions reduce the chance of dependency conflicts as much as possible,
while allowing the newest versions that do not trigger call errors in the project.
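For concreteness, here is a sketch of the suggested pins, assuming the project declares its dependencies via install_requires in setup.py (the project does call setuptools.setup, but the exact layout of its setup.py is an assumption here); only the scipy and tqdm entries change:

```python
# Sketch of the suggested pins, assuming dependencies are declared via
# install_requires in setup.py; only scipy and tqdm change, the remaining
# pins are kept as the project currently declares them.
install_requires = [
    "nibabel>=2.2.1",
    "scipy>=0.19.0,<=1.7.3",
    "numpy>=1.14.1",
    "torch>=0.4.0",
    "torchvision>=0.2.1",
    "tqdm>=4.36.0,<=4.64.0",
    "scikit-image==0.15.0",
]
```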
The project's invocations include all of the following methods.
The calling methods from scipy (a minimal usage sketch follows the list):
scipy.spatial.distance.directed_hausdorff
scipy.ndimage.filters.gaussian_filter
scipy.ndimage.interpolation.map_coordinates
scipy.spatial.distance.dice
scipy.spatial.distance.jaccard
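Below is a minimal sketch (a standalone check script of my own, not project code) that exercises exactly these scipy entry points; all of them resolve within the suggested >=0.19.0,<=1.7.3 range, while the legacy scipy.ndimage.filters / scipy.ndimage.interpolation namespaces are deprecated in later scipy releases, which is why the upper bound matters:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff, dice, jaccard
from scipy.ndimage.filters import gaussian_filter
from scipy.ndimage.interpolation import map_coordinates

# Hausdorff distance between two point sets (as used for segmentation metrics)
a, b = np.random.rand(8, 2), np.random.rand(8, 2)
print(directed_hausdorff(a, b)[0])

# Gaussian smoothing and coordinate remapping (as used in elastic transforms)
img = np.random.rand(16, 16)
print(gaussian_filter(img, sigma=1.0).shape)
coords = np.indices(img.shape).astype(np.float64)
print(map_coordinates(img, coords, order=1).shape)

# Dice / Jaccard dissimilarities on boolean masks
u = np.array([1, 0, 1, 1], dtype=bool)
v = np.array([1, 1, 0, 1], dtype=bool)
print(dice(u, v), jaccard(u, v))
```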
The calling methods from tqdm (a minimal usage sketch follows the list):
tqdm.tqdm.set_postfix
tqdm.tqdm
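And a minimal sketch (a hypothetical progress loop, not project code) showing the two tqdm entry points the project relies on, tqdm.tqdm and set_postfix, both of which are available throughout the suggested >=4.36.0,<=4.64.0 range:

```python
from tqdm import tqdm

# Wrap any iterable to get a progress bar, then attach live metrics to it.
pbar = tqdm(range(100), desc="train")
for step in pbar:
    loss = 1.0 / (step + 1)                   # placeholder metric for illustration
    pbar.set_postfix(loss="{:.4f}".format(loss))
```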
All calling methods used in the project:
self.up3
self.mp3
self.conv1a
f.read
re.search
self.branch4a_bn
DownConv
isinstance
numpy.arange
ValueError
scipy.spatial.distance.directed_hausdorff
self.conv3
self.dc5
torch.LongTensor
numpy.any
numpy.copy
range
numpy.allclose
torch.from_numpy
self.branch4a_drop
self.ec2
self.mp1
index.self.handlers.get_pair_data
torch.nn.BatchNorm2d
numpy.sqrt
self.branch5b_bn
self.metadata.keys
training_mean.input_data.pow.sum
torch.stack
torch.nn.LeakyReLU
self.input_handle.header.get_zooms
self.conv2_bn
torchvision.transforms.functional.pad
numpy.float32
input.view
self.conv1b_bn
numpy.zeros
input_data.np.flip.copy
torchvision.transforms.functional.rotate
self.sample_transform
type
self.slice_filter_fn
numpy.random.uniform
len
tflat.iflat.sum
medicaltorch.transforms.ToTensor
self.conv9
self.up_conv
self.branch1a
SegmentationPair2D.get_pair_slice
prediction.flatten
self.dc4
self.branch2a
self.branch4b_bn
noise.astype.astype
self.result_dict.items
target.index_select
self.threshold.target.torch.gt.float.view
f.read.splitlines
mt_collate
self.branch3b_bn
self.branch1a_bn
numpy.random.random
self.branch1b_drop
self.branch3a
self.branch3b_drop
self.input_handle.header.get_data_shape
self._build_train_input_filename
self.gt_handle.header.get_data_shape
self.conv2a_bn
PIL.Image.fromarray.resize
torch.nn.functional.avg_pool2d
self.ec0
sample_data.numpy
self.branch3b
self.amort
self.conv2b_drop
self.branch1a_drop
error_msg.format
os.path.dirname
self.up1
torchvision.transforms.functional.center_crop
self.input_handle.get_fdata
target.index_select.view
numpy.squeeze
self.branch4b_drop
int
self.ec3
Mock
nibabel.as_closest_canonical
self.branch3a_bn
os.path.exists
self.branch1b
SegmentationPair2D
UpConv
numpy.divide
target.view
self.input_handle.get_fdata.numel
torch.nn.Conv2d
PIL.Image.fromarray.mean
self.propagate_params
self.Unet.super.__init__
self.batch.items
self.branch2a_bn
collections.defaultdict
self.input_handle.get_fdata.sum
self.down_conv
torch.gt
sys.path.insert
numeric_score
input.size
masking.squeeze.sum
self.branch2b_drop
i.self.handlers.get_pair_data
self.up2
self.branch4a
coord.self.handlers.get_pair_data
tqdm.tqdm
NotImplementedError
self.indexes.append
self.mp2
self.dc3
torch.nn.functional.relu
indices.image.map_coordinates.reshape
self.conv4
self._prepare_indexes
self.get_pair_data
DatasetManager
self.branch2b
self.branch5b
torchvision.transforms.functional.to_tensor
self.conv2b_bn
self.dc1
SampleMetadata
self.gt_handle.header.get_zooms
labeled_target.view.sum
self.dc8
skimage.exposure.equalize_adapthist
torch.is_tensor
self.UNet3D.super.__init__
torch.cat
format
numpy.random.randint
self.transform
PIL.Image.fromarray.std
self.ec7
self.branch3a_drop
setuptools.setup
self.downconv.size
setuptools.find_packages
elem.dtype.name.startswith
scipy.ndimage.filters.gaussian_filter
torch.nn.Dropout2d
masking.sum.sum
self.conv1b_drop
self.conv2b
scipy.spatial.distance.dice
numpy.isnan
elem.dtype.name.__numpy_type_map
self.conv2a_drop
self.conv1a_bn
torch.DoubleTensor
numpy.reshape
torch.nn.ConvTranspose3d
codecs.open
self.branch5a
torch.nn.Conv3d
torch.nn.MaxPool3d
RuntimeError
masking.squeeze.nonzero
list
self.prediction
self.conv2_drop
os.path.join
groundtruth.flatten
numpy.meshgrid
self.amort_bn
numpy.random.rand
torchvision.transforms.functional.affine
numpy.round
input.index_select
self.dc2
self.sample_augment.append
self.dc0
scipy.ndimage.interpolation.map_coordinates
masking.nonzero.squeeze
self.conv2a
self.ec5
map
TypeError
tqdm.tqdm.set_postfix
self.sample_augment
self.branch1b_bn
self.transform.undo_transform
self._load_filenames
torch.nn.Sequential
self.label_augment
self.get_params
input.index_select.view
scipy.spatial.distance.jaccard
self.conv1a_drop
self.DownConv.super.__init__
round
self.handlers.append
self.UpConv.super.__init__
self.dc9
SegmentationPair2D.get_pair_shapes
numpy.transpose
self.downconv
os.path.abspath
numpy.percentile
self.gt_handle.get_fdata
numpy.array
self.conv2
self.pool0
numpy.flip
self.conv1_drop
self.ec1
self.filename_pairs.append
torchvision.transforms.functional.normalize
self.branch5a_bn
self.branch5b_drop
self.ec4
self.elastic_transform
numpy.sum
self.branch2b_bn
super.__init__
self.concat_bn
torch.sigmoid
diff_conf.mean
self.ec6
global_pool.expand.expand
t.undo_transform
self.threshold.target.torch.gt.float
self.branch2a_drop
numpy.random.normal
self.branch4b
labeled_input.view.sum
self.conv1
self.get_pair_shapes
self.dc6
PIL.Image.fromarray
self.branch5a_drop
self.amort_drop
nibabel.load
numpy.sqrt.item
self.conv1_bn
torch.nn.MaxPool2d
sample.update
self.dc7
self.pool2
self.concat_drop
training_mean.input_data.pow
metric_fn
self.conv1b
self.pool1
training_mean.item
zip
unittest.mock.MagicMock
super
numpy.asarray
masking.squeeze.squeeze
gt_data.np.flip.copy
@developer
Could you please help me check this issue?
May I open a pull request to fix it?
Thank you very much.