Overview

MONAI Label


MONAI Label is a server-client system that facilitates interactive medical image annotation using AI. It is an open-source, easy-to-install ecosystem that can run locally on a machine with one or two GPUs. The server and client can run on the same machine or on different machines; however, multi-user support is currently limited. It shares the same principles as MONAI.

Brief Demo

Features

The codebase is currently under active development.

  • framework for developing and deploying MONAI Label Apps to train and infer AI models
  • compositional & portable APIs for ease of integration in existing workflows
  • customizable design for varying user expertise
  • 3D Slicer support

Installation

MONAI Label supports the following operating systems with GPU/CUDA enabled.

  • Ubuntu
  • Windows

To install the current release, you can simply run:

  pip install monailabel
  
  # download sample apps/dataset
  monailabel apps --download --name deepedit_left_atrium --output apps
  monailabel datasets --download --name Task02_Heart --output datasets
  
  # run server
  monailabel start_server --app apps\deepedit_left_atrium --studies datasets\Task02_Heart\imagesTr
  

For other installation methods (using the default GitHub branch, using Docker, etc.), please refer to the installation guide.
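
If the server and client run on different machines (as mentioned above), the bind address and port can be set explicitly when starting the server; a small sketch using the --host and --port flags that also appear in the issue logs further down this page:

  # make the server reachable from other machines on the network (sketch; adjust port as needed)
  monailabel start_server --app apps/deepedit_left_atrium --studies datasets/Task02_Heart/imagesTr --host 0.0.0.0 --port 8000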

Once you start the MONAI Label server, it is up and serving at http://127.0.0.1:8000/ by default. Open that URL in a browser to see the list of available REST APIs.
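
As a quick sanity check, you can also query the REST API from the command line; a minimal sketch with curl, assuming the default port and the /info/ endpoint listed on that page:

  # ask the running MONAI Label server to describe the loaded app
  curl -s http://127.0.0.1:8000/info/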

3D Slicer

Download the Preview Release from https://download.slicer.org/ and install the MONAI Label plugin from the Slicer Extension Manager.

Refer to the 3D Slicer plugin documentation for other options to install and run the MONAI Label plugin in 3D Slicer.

Contributing

For guidance on making a contribution to MONAI Label, see the contributing guidelines.

Community

Join the conversation on Twitter @ProjectMONAI or join our Slack channel.

Ask and answer questions over on MONAI Label's GitHub Discussions tab.

Links

Comments
  • Segmentation_[anatomy] model does not show up under Auto Segmentation header.

    Describe the bug
    When using a fairly straightforward non-interactive segmentation model, I expected to be able to provide a handful of manual segmentations, train the model, and simply run the auto segmentation to finish up the rest of my dataset.

    However, I unfortunately can't select the model in the Auto Segmentation header.

    I am using a custom implementation of the segmentation_spleen.py example files, with only the label name and class names changed.

    To Reproduce
    Steps to reproduce the behavior:

    1. Provide a number of manual segmentations to start off the segmentation model
    2. Train model successfully
    3. Notice that there is no way to actually use the trained model

    Expected behavior
    After providing a few manual segmentations, I expect to be able to use the now trained model in segmenting my anatomy of choice.

    Screenshots
    (attached screenshot: WhereModel)

    Environment

    Ensuring you use the relevant python executable, please paste the output of:

    ================================
    Printing MONAI config...
    ================================
    MONAI version: 0.9.0
    Numpy version: 1.22.4
    Pytorch version: 1.11.0+cpu
    MONAI flags: HAS_EXT = False, USE_COMPILED = False
    MONAI rev id: af0e0e9f757558d144b655c63afcea3a4e0a06f5
    MONAI __file__: C:\Users\[Username]\AppData\Local\pypoetry\Cache\virtualenvs\nnunetdocker-uHdzYLSm-py3.9\lib\site-packages\monai\__init__.py
    
    Optional dependencies:
    Pytorch Ignite version: 0.4.8
    Nibabel version: 3.2.2
    scikit-image version: 0.19.3
    Pillow version: 9.1.1
    Tensorboard version: 2.9.1
    gdown version: 4.4.0
    TorchVision version: 0.12.0+cpu
    tqdm version: 4.64.0
    lmdb version: 1.3.0
    psutil version: 5.9.1
    pandas version: 1.4.2
    einops version: 0.4.1
    transformers version: NOT INSTALLED or UNKNOWN VERSION.
    mlflow version: NOT INSTALLED or UNKNOWN VERSION.
    pynrrd version: 0.4.3
    
    For details about installing the optional dependencies, please visit:
        https://docs.monai.io/en/latest/installation.html#installing-the-recommended-dependencies
    
    
    ================================
    Printing system config...
    ================================
    System: Windows
    Win32 version: ('10', '10.0.19044', 'SP0', 'Multiprocessor Free')
    Win32 edition: Enterprise
    Platform: Windows-10-10.0.19044-SP0
    Processor: Intel64 Family 6 Model 85 Stepping 4, GenuineIntel
    Machine: AMD64
    Python version: 3.9.13
    Process name: python.exe
    Command: ['C:\\Users\\[Username]\\AppData\\Local\\Programs\\Python\\Python39\\python.exe', '-c', 'import monai; monai.config.print_debug_info()']
    Open files: [popenfile(path='C:\\Windows\\System32\\en-US\\kernel32.dll.mui', fd=-1), popenfile(path='C:\\Windows\\System32\\en-US\\tzres.dll.mui', fd=-1), popenfile(path='C:\\Windows\\System32\\en-US\\KernelBase.dll.mui', fd=-1)]  
    Num physical CPUs: 12
    Num logical CPUs: 24
    Num usable CPUs: 24
    CPU usage (%): [5.0, 3.8, 1.9, 0.6, 0.0, 1.2, 1.9, 1.9, 0.6, 0.0, 0.6, 0.6, 2.5, 72.3, 0.6, 0.6, 0.6, 1.2, 1.9, 0.0, 1.2, 1.9, 0.0, 13.8]
    CPU freq. (MHz): 3504
    Load avg. in last 1, 5, 15 mins (%): [0.0, 0.0, 0.0]
    Disk usage (%): 66.6
    Avg. sensor temp. (Celsius): UNKNOWN for given OS
    Total physical memory (GB): 31.7
    Available memory (GB): 15.7
    Used memory (GB): 16.0
    
    ================================
    Printing GPU config...
    ================================
    Num GPUs: 0
    Has CUDA: False
    cuDNN enabled: False
    

    Additional context
    None, however CUDA 11.6 and 11.7 are installed, and nvidia-smi works, while MONAI tells me CUDA is not available. This is (probably) not relevant to my issue though, and I might open a new issue for that.

    opened by MathijsdeBoer 29
  • MONAI Label: The OHIF flow as described in the README does not work

    The OHIF flow as described in the README does not work. I have followed the instructions and tried to get OHIF up and running, but without success. Can anyone confirm that this flow is currently not working, or update the README file?

    opened by patricio-astudillo 26
  • OHIF Viewer : Couldn't connect to Orthanc Server

    Describe the bug
    I am trying to use the OHIF viewer. When I run monailabel start_server --app apps\deepedit --studies http://127.0.0.1:8042/dicom-web with this URL, a dialog box pops up asking for a username and password.

    I tried some default usernames and passwords, but I could not sign in to the Orthanc server.

    To Reproduce
    Steps to reproduce the behavior (I followed the steps mentioned in the documentation):

    1. Installed Orthanc using apt-get install orthanc orthanc-dicomweb.
    2. Upgraded to latest version by following steps mentioned here

    Now the Orthanc server setup is complete.

    3. Run the command monailabel start_server --app apps\deepedit --studies http://127.0.0.1:8042/dicom-web

    Expected behavior

    I expect the Orthanc server content to load without the authentication pop-up.

    Screenshots

    (attached screenshot: orthancerror)

    Environment

    ================================
    Printing MONAI config...
    ================================
    MONAI version: 0.9.dev2149
    Numpy version: 1.21.4
    Pytorch version: 1.9.0+cu111
    MONAI flags: HAS_EXT = False, USE_COMPILED = False
    MONAI rev id: 1ad68787c35e259cb7704b56d679659104d2494c
    
    Optional dependencies:
    Pytorch Ignite version: 0.4.6
    Nibabel version: 3.2.1
    scikit-image version: 0.18.3
    Pillow version: 8.4.0
    Tensorboard version: 2.7.0
    gdown version: 4.2.0
    TorchVision version: 0.10.0+cu111
    tqdm version: 4.62.3
    lmdb version: 1.2.1
    psutil version: 5.8.0
    pandas version: NOT INSTALLED or UNKNOWN VERSION.
    einops version: NOT INSTALLED or UNKNOWN VERSION.
    transformers version: NOT INSTALLED or UNKNOWN VERSION.
    mlflow version: NOT INSTALLED or UNKNOWN VERSION.
    
    For details about installing the optional dependencies, please visit:
        https://docs.monai.io/en/latest/installation.html#installing-the-recommended-dependencies
    
    
    ================================
    Printing system config...
    ================================
    System: Linux
    Linux version: Ubuntu 20.04.3 LTS
    Platform: Linux-4.15.0-136-generic-x86_64-with-glibc2.10
    Processor: x86_64
    Machine: x86_64
    Python version: 3.8.10
    Process name: python
    Command: ['python', '-c', 'import monai; monai.config.print_debug_info()']
    Open files: []
    Num physical CPUs: 20
    Num logical CPUs: 40
    Num usable CPUs: 40
    CPU usage (%): [10.3, 11.3, 11.3, 11.9, 9.5, 11.8, 100.0, 9.4, 12.6, 8.9, 11.3, 10.7, 10.1, 13.3, 13.2, 10.2, 10.7, 16.9, 9.6, 9.5, 11.5, 10.1, 11.3, 9.5, 11.9, 8.9, 9.4, 10.1, 16.9, 11.9, 9.5, 10.1, 10.7, 10.1, 10.1, 10.1, 10.7, 10.1, 10.3, 14.6]
    CPU freq. (MHz): 1461
    Load avg. in last 1, 5, 15 mins (%): [1.7, 2.8, 4.1]
    Disk usage (%): 73.7
    Avg. sensor temp. (Celsius): UNKNOWN for given OS
    Total physical memory (GB): 251.8
    Available memory (GB): 237.8
    Used memory (GB): 12.8
    
    ================================
    Printing GPU config...
    ================================
    Num GPUs: 4
    Has CUDA: True
    CUDA version: 11.1
    cuDNN enabled: True
    cuDNN version: 8005
    Current device: 0
    Library compiled for CUDA architectures: ['sm_37', 'sm_50', 'sm_60', 'sm_70', 'sm_75', 'sm_80', 'sm_86']
    GPU 0 Name: Tesla V100-DGXS-32GB
    GPU 0 Is integrated: False
    GPU 0 Is multi GPU board: False
    GPU 0 Multi processor count: 80
    GPU 0 Total memory (GB): 31.7
    GPU 0 CUDA capability (maj.min): 7.0
    GPU 1 Name: Tesla V100-DGXS-32GB
    GPU 1 Is integrated: False
    GPU 1 Is multi GPU board: False
    GPU 1 Multi processor count: 80
    GPU 1 Total memory (GB): 31.7
    GPU 1 CUDA capability (maj.min): 7.0
    GPU 2 Name: Tesla V100-DGXS-32GB
    GPU 2 Is integrated: False
    GPU 2 Is multi GPU board: False
    GPU 2 Multi processor count: 80
    GPU 2 Total memory (GB): 31.7
    GPU 2 CUDA capability (maj.min): 7.0
    GPU 3 Name: Tesla V100-DGXS-32GB
    GPU 3 Is integrated: False
    GPU 3 Is multi GPU board: False
    GPU 3 Multi processor count: 80
    GPU 3 Total memory (GB): 31.7
    GPU 3 CUDA capability (maj.min): 7.0
    

    Additional context
    I installed Orthanc in a DGX Docker container.

    Any suggestions and help are appreciated. Thank you.
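
    For what it's worth, the ENV SETTINGS printed by the server (visible in the training logs further down this page) include MONAI_LABEL_DICOMWEB_USERNAME and MONAI_LABEL_DICOMWEB_PASSWORD. A hedged sketch of passing Orthanc credentials that way before starting the server (the values below are placeholders; use whatever credentials your Orthanc instance is configured with):

    # placeholders: substitute the username/password configured in your Orthanc installation
    export MONAI_LABEL_DICOMWEB_USERNAME=<orthanc-user>
    export MONAI_LABEL_DICOMWEB_PASSWORD=<orthanc-password>
    monailabel start_server --app apps/deepedit --studies http://127.0.0.1:8042/dicom-web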

    documentation help wanted 
    opened by Suchi97 24
  • Incorrect filename truncation

    Describe the bug

    When I use the Next Sample button, the displayed name of the retrieved volume is incorrect. I believe the parser incorrectly cuts the volume name at the "-" sign. Below is the log from the Python interactor, which shows the correct volume name. In Slicer, regardless of the volume, all retrieved volumes are displayed as 15.5_baseline, which confuses the user.

    Check if file exists/shared locally: /workspace/undeterminedSex_AAMN_N305-17-e15.5_baseline.nii.gz => False
    http://127.0.0.1:8000/datastore/image?image=undeterminedSex_AAMN_N305-17-e15.5_baseline
    


    0.4.0 
    opened by SlicerMorph 23
  • Client needs to be able to send data for annotation to remote-server (MONAILabel server)

    Is your feature request related to a problem? Please describe.
    MONAILabel V1 only supports server-side hosting of data.

    For clinical use, we want our research assistants to be able to use MONAILabel from individual computers. A workstation with a dedicated GPU was assigned to be the remote-server, receiving client requests for annotating data and returning the annotation to the client.

    Describe the solution you'd like
    Clients should be able to send images for automatic annotation to the MONAILabel server.

    Additional context
    https://github.com/JolleyLabCHOP/DeepHeart/issues/1
    https://github.com/JolleyLabCHOP/DeepHeart/issues/2
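
    For context, the server log further down this page shows a datastore upload endpoint receiving an image id and an UploadFile (monailabel.endpoints.datastore - "Image: CV17056; File: <...UploadFile...>"), so pushing a study from a client to a remote MONAILabel server over REST already appears possible. A hedged curl sketch, with the endpoint and field names inferred from that log; they should be confirmed against the server's /docs page:

    # upload a local volume into the remote MONAI Label datastore (names inferred; verify on /docs)
    curl -X PUT "http://<server>:8000/datastore/?image=case001" \
         -F "params={}" \
         -F "file=@case001.nii.gz"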

    enhancement 
    opened by che85 23
  • Inability to train model on monailabel

    Dear Experts,

    Greetings from Mumbai, India.

    At the outset thank you for the open source application for image annotation and segmentation.

    I attempted segmentation model training on a MONAI Label server deployed on WSL; however, training failed. Please find the entire log below for perusal and inputs:

    monailabel start_server --app apps/radiology --studies /mnt/h/Cases/TMH/Liver/ --conf models segmentation --host 0.0.0.0 --port 8080
    Using PYTHONPATH=/home/amitjc:
    
    Failed to load image Python extension: libtorch_cuda_cu.so: cannot open shared object file: No such file or directory
    2022-09-25 04:23:11,658 - USING:: version = False
    2022-09-25 04:23:11,658 - USING:: app = /home/amitjc/apps/radiology
    2022-09-25 04:23:11,658 - USING:: studies = /mnt/h/Cases/TMH/Liver
    2022-09-25 04:23:11,658 - USING:: verbose = INFO
    2022-09-25 04:23:11,658 - USING:: conf = [['models', 'segmentation']]
    2022-09-25 04:23:11,658 - USING:: host = 0.0.0.0
    2022-09-25 04:23:11,658 - USING:: port = 8080
    2022-09-25 04:23:11,658 - USING:: uvicorn_app = monailabel.app:app
    2022-09-25 04:23:11,658 - USING:: ssl_keyfile = None
    2022-09-25 04:23:11,658 - USING:: ssl_certfile = None
    2022-09-25 04:23:11,658 - USING:: ssl_keyfile_password = None
    2022-09-25 04:23:11,658 - USING:: ssl_ca_certs = None
    2022-09-25 04:23:11,658 - USING:: workers = None
    2022-09-25 04:23:11,658 - USING:: limit_concurrency = None
    2022-09-25 04:23:11,658 - USING:: access_log = False
    2022-09-25 04:23:11,658 - USING:: log_config = None
    2022-09-25 04:23:11,658 - USING:: dryrun = False
    2022-09-25 04:23:11,658 - USING:: action = start_server
    2022-09-25 04:23:11,658 - ENV SETTINGS:: MONAI_LABEL_API_STR =
    2022-09-25 04:23:11,658 - ENV SETTINGS:: MONAI_LABEL_PROJECT_NAME = MONAILabel
    2022-09-25 04:23:11,658 - ENV SETTINGS:: MONAI_LABEL_APP_DIR =
    2022-09-25 04:23:11,658 - ENV SETTINGS:: MONAI_LABEL_STUDIES =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_AUTH_ENABLE = False
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_AUTH_DB =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_APP_CONF = '{}'
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_TASKS_TRAIN = True
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_TASKS_STRATEGY = True
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_TASKS_SCORING = True
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_TASKS_BATCH_INFER = True
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_URL =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_USERNAME =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_PASSWORD =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_API_KEY =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_CACHE_PATH =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_PROJECT =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_ASSET_PATH =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_DSA_ANNOTATION_GROUPS =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_USERNAME =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_PASSWORD =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CACHE_PATH =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_QIDO_PREFIX =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_WADO_PREFIX =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_STOW_PREFIX =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_FETCH_BY_FRAME = False
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_SEARCH_FILTER = '{"Modality": "CT"}'
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CACHE_EXPIRY = 180
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_READ_TIMEOUT = 5.0
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_AUTO_RELOAD = True
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_FILE_EXT = '["*.nii.gz", "*.nii", "*.nrrd", "*.jpg", "*.png", "*.tif", "*.svs", "*.xml"]'
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_SERVER_PORT = 8000
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_CORS_ORIGINS = '[]'
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_SESSIONS = True
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_SESSION_PATH =
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_SESSION_EXPIRY = 3600
    2022-09-25 04:23:11,659 - ENV SETTINGS:: MONAI_LABEL_INFER_CONCURRENCY = -1
    2022-09-25 04:23:11,660 - ENV SETTINGS:: MONAI_LABEL_INFER_TIMEOUT = 600
    2022-09-25 04:23:11,660 - ENV SETTINGS:: MONAI_LABEL_AUTO_UPDATE_SCORING = True
    2022-09-25 04:23:11,660 -
    Allow Origins: ['*']
    [2022-09-25 04:23:12,084] [393] [MainThread] [INFO] (uvicorn.error:75) - Started server process [393]
    [2022-09-25 04:23:12,084] [393] [MainThread] [INFO] (uvicorn.error:45) - Waiting for application startup.
    [2022-09-25 04:23:12,084] [393] [MainThread] [INFO] (monailabel.interfaces.utils.app:38) - Initializing App from: /home/amitjc/apps/radiology; studies: /mnt/h/Cases/TMH/Liver; conf: {'models': 'segmentation'}
    [2022-09-25 04:23:12,354] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for MONAILabelApp Found: <class 'main.MyApp'>
    [2022-09-25 04:23:12,361] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepgrow_3d.Deepgrow3D'>
    [2022-09-25 04:23:12,361] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.localization_spine.LocalizationSpine'>
    [2022-09-25 04:23:12,362] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation.Segmentation'>
    [2022-09-25 04:23:12,362] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.localization_vertebra.LocalizationVertebra'>
    [2022-09-25 04:23:12,366] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_spleen.SegmentationSpleen'>
    [2022-09-25 04:23:12,366] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_vertebra.SegmentationVertebra'>
    [2022-09-25 04:23:12,367] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepgrow_2d.Deepgrow2D'>
    [2022-09-25 04:23:12,367] [393] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepedit.DeepEdit'>
    [2022-09-25 04:23:12,367] [393] [MainThread] [INFO] (main:86) - +++ Adding Model: segmentation => lib.configs.segmentation.Segmentation
    [2022-09-25 04:23:12,406] [393] [MainThread] [INFO] (main:90) - +++ Using Models: ['segmentation']
    [2022-09-25 04:23:12,406] [393] [MainThread] [INFO] (monailabel.interfaces.app:128) - Init Datastore for: /mnt/h/Cases/TMH/Liver
    [2022-09-25 04:23:12,406] [393] [MainThread] [INFO] (monailabel.datastore.local:126) - Auto Reload: True; Extensions: ['*.nii.gz', '*.nii', '*.nrrd', '*.jpg', '*.png', '*.tif', '*.svs', '*.xml']
    [2022-09-25 04:23:12,474] [393] [MainThread] [INFO] (monailabel.datastore.local:540) - Invalidate count: 0
    [2022-09-25 04:23:12,474] [393] [MainThread] [INFO] (monailabel.datastore.local:146) - Start observing external modifications on datastore (AUTO RELOAD)
    [2022-09-25 04:23:12,602] [393] [MainThread] [INFO] (main:116) - +++ Adding Inferer:: segmentation => <lib.infers.segmentation.Segmentation object at 0x7f413c5adeb0>
    [2022-09-25 04:23:12,602] [393] [MainThread] [INFO] (main:172) - {'segmentation': <lib.infers.segmentation.Segmentation object at 0x7f413c5adeb0>, 'Histogram+GraphCut': <monailabel.scribbles.infer.HistogramBasedGraphCut object at 0x7f4134049850>, 'GMM+GraphCut': <monailabel.scribbles.infer.GMMBasedGraphCut object at 0x7f4134049880>}
    [2022-09-25 04:23:12,602] [393] [MainThread] [INFO] (main:185) - +++ Adding Trainer:: segmentation => <lib.trainers.segmentation.Segmentation object at 0x7f4133ff67c0>
    [2022-09-25 04:23:12,603] [393] [MainThread] [INFO] (monailabel.utils.sessions:51) - Session Path: /home/amitjc/.cache/monailabel/sessions
    [2022-09-25 04:23:12,603] [393] [MainThread] [INFO] (monailabel.utils.sessions:52) - Session Expiry (max): 3600
    [2022-09-25 04:23:12,603] [393] [MainThread] [INFO] (monailabel.interfaces.app:465) - App Init - completed
    [2022-09-25 04:23:12,603] [timeloop] [INFO] Starting Timeloop..
    [2022-09-25 04:23:12,603] [393] [MainThread] [INFO] (timeloop:60) - Starting Timeloop..
    [2022-09-25 04:23:12,603] [timeloop] [INFO] Registered job <function MONAILabelApp.on_init_complete.<locals>.run_scheduler at 0x7f4133fff160>
    [2022-09-25 04:23:12,603] [393] [MainThread] [INFO] (timeloop:42) - Registered job <function MONAILabelApp.on_init_complete.<locals>.run_scheduler at 0x7f4133fff160>
    [2022-09-25 04:23:12,603] [timeloop] [INFO] Timeloop now started. Jobs will run based on the interval set
    [2022-09-25 04:23:12,603] [393] [MainThread] [INFO] (timeloop:63) - Timeloop now started. Jobs will run based on the interval set
    [2022-09-25 04:23:12,603] [393] [MainThread] [INFO] (uvicorn.error:59) - Application startup complete.
    [2022-09-25 04:23:12,603] [393] [MainThread] [INFO] (uvicorn.error:206) - Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
    [2022-09-25 04:31:28,445] [393] [MainThread] [INFO] (monailabel.endpoints.datastore:67) - Image: CV17056; File: <starlette.datastructures.UploadFile object at 0x7f414e2283d0>; params: {"client_id": "user-xyz"}
    [2022-09-25 04:31:28,513] [393] [MainThread] [INFO] (monailabel.datastore.local:402) - Adding Image: CV17056 => /tmp/tmpyt7z7vu_.nii.gz
    [2022-09-25 04:36:30,193] [393] [MainThread] [INFO] (monailabel.endpoints.datastore:100) - Saving Label for CV17056 for tag: final by admin
    [2022-09-25 04:36:30,194] [393] [MainThread] [INFO] (monailabel.endpoints.datastore:111) - Save Label params: {"label_info": [{"name": "spleen", "idx": 1}, {"name": "right kidney", "idx": 2}, {"name": "left kidney", "idx": 3}, {"name": "gallbladder", "idx": 4}, {"name": "esophagus", "idx": 5}, {"name": "liver", "idx": 6}, {"name": "stomach", "idx": 7}, {"name": "aorta", "idx": 8}, {"name": "inferior vena cava", "idx": 9}, {"name": "portal vein and splenic vein", "idx": 10}, {"name": "pancreas", "idx": 11}, {"name": "right adrenal gland", "idx": 12}, {"name": "left adrenal gland", "idx": 13}], "client_id": "user-xyz"}
    [2022-09-25 04:36:30,194] [393] [MainThread] [INFO] (monailabel.datastore.local:449) - Saving Label for Image: CV17056; Tag: final; Info: {'label_info': [{'name': 'spleen', 'idx': 1}, {'name': 'right kidney', 'idx': 2}, {'name': 'left kidney', 'idx': 3}, {'name': 'gallbladder', 'idx': 4}, {'name': 'esophagus', 'idx': 5}, {'name': 'liver', 'idx': 6}, {'name': 'stomach', 'idx': 7}, {'name': 'aorta', 'idx': 8}, {'name': 'inferior vena cava', 'idx': 9}, {'name': 'portal vein and splenic vein', 'idx': 10}, {'name': 'pancreas', 'idx': 11}, {'name': 'right adrenal gland', 'idx': 12}, {'name': 'left adrenal gland', 'idx': 13}], 'client_id': 'user-xyz'}
    [2022-09-25 04:36:30,194] [393] [MainThread] [INFO] (monailabel.datastore.local:457) - Adding Label: CV17056 => final => /tmp/tmpndizli05.nii.gz
    [2022-09-25 04:36:30,212] [393] [MainThread] [INFO] (monailabel.datastore.local:473) - Label Info: {'label_info': [{'name': 'spleen', 'idx': 1}, {'name': 'right kidney', 'idx': 2}, {'name': 'left kidney', 'idx': 3}, {'name': 'gallbladder', 'idx': 4}, {'name': 'esophagus', 'idx': 5}, {'name': 'liver', 'idx': 6}, {'name': 'stomach', 'idx': 7}, {'name': 'aorta', 'idx': 8}, {'name': 'inferior vena cava', 'idx': 9}, {'name': 'portal vein and splenic vein', 'idx': 10}, {'name': 'pancreas', 'idx': 11}, {'name': 'right adrenal gland', 'idx': 12}, {'name': 'left adrenal gland', 'idx': 13}], 'client_id': 'user-xyz', 'ts': 1664060790, 'checksum': 'SHA256:be4f651b196e6c8799deebee7718d2204c55ecd7af144ee7d11ce936e85c06ac', 'name': 'CV17056.nii.gz'}
    [2022-09-25 04:36:30,214] [393] [MainThread] [INFO] (monailabel.interfaces.app:489) - New label saved for: CV17056 => CV17056
    [2022-09-25 04:36:39,812] [393] [MainThread] [INFO] (monailabel.utils.async_tasks.task:36) - Train request: {'model': 'segmentation', 'name': 'train_01', 'pretrained': True, 'device': 'cuda', 'max_epochs': 50, 'early_stop_patience': -1, 'val_split': 0.2, 'train_batch_size': 1, 'val_batch_size': 1, 'multi_gpu': True, 'gpus': 'all', 'dataset': 'SmartCacheDataset', 'dataloader': 'ThreadDataLoader', 'client_id': 'user-xyz'}
    [2022-09-25 04:36:39,812] [393] [ThreadPoolExecutor-0_0] [INFO] (monailabel.utils.async_tasks.utils:59) - COMMAND:: /home/amitjc/anaconda3/envs/monailabel-env/bin/python -m monailabel.interfaces.utils.app -m train -r {"model":"segmentation","name":"train_01","pretrained":true,"device":"cuda","max_epochs":50,"early_stop_patience":-1,"val_split":0.2,"train_batch_size":1,"val_batch_size":1,"multi_gpu":true,"gpus":"all","dataset":"SmartCacheDataset","dataloader":"ThreadDataLoader","client_id":"user-xyz"}
    [2022-09-25 04:36:39,859] [446] [MainThread] [INFO] (__main__:38) - Initializing App from: /home/amitjc/apps/radiology; studies: /mnt/h/Cases/TMH/Liver; conf: {'models': 'segmentation'}
    Failed to load image Python extension: libtorch_cuda_cu.so: cannot open shared object file: No such file or directory
    [2022-09-25 04:36:41,070] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for MONAILabelApp Found: <class 'main.MyApp'>
    [2022-09-25 04:36:41,072] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepgrow_3d.Deepgrow3D'>
    [2022-09-25 04:36:41,072] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.localization_spine.LocalizationSpine'>
    [2022-09-25 04:36:41,073] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation.Segmentation'>
    [2022-09-25 04:36:41,073] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.localization_vertebra.LocalizationVertebra'>
    [2022-09-25 04:36:41,074] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_spleen.SegmentationSpleen'>
    [2022-09-25 04:36:41,074] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_vertebra.SegmentationVertebra'>
    [2022-09-25 04:36:41,074] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepgrow_2d.Deepgrow2D'>
    [2022-09-25 04:36:41,074] [446] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepedit.DeepEdit'>
    [2022-09-25 04:36:41,075] [446] [MainThread] [INFO] (main:86) - +++ Adding Model: segmentation => lib.configs.segmentation.Segmentation
    [2022-09-25 04:36:41,100] [446] [MainThread] [INFO] (main:90) - +++ Using Models: ['segmentation']
    [2022-09-25 04:36:41,100] [446] [MainThread] [INFO] (monailabel.interfaces.app:128) - Init Datastore for: /mnt/h/Cases/TMH/Liver
    [2022-09-25 04:36:41,100] [446] [MainThread] [INFO] (monailabel.datastore.local:126) - Auto Reload: False; Extensions: ['*.nii.gz', '*.nii', '*.nrrd', '*.jpg', '*.png', '*.tif', '*.svs', '*.xml']
    [2022-09-25 04:36:41,158] [446] [MainThread] [INFO] (monailabel.datastore.local:540) - Invalidate count: 0
    [2022-09-25 04:36:41,158] [446] [MainThread] [INFO] (main:116) - +++ Adding Inferer:: segmentation => <lib.infers.segmentation.Segmentation object at 0x7feeaf66edf0>
    [2022-09-25 04:36:41,158] [446] [MainThread] [INFO] (main:172) - {'segmentation': <lib.infers.segmentation.Segmentation object at 0x7feeaf66edf0>, 'Histogram+GraphCut': <monailabel.scribbles.infer.HistogramBasedGraphCut object at 0x7feea7102970>, 'GMM+GraphCut': <monailabel.scribbles.infer.GMMBasedGraphCut object at 0x7feea71029a0>}
    [2022-09-25 04:36:41,158] [446] [MainThread] [INFO] (main:185) - +++ Adding Trainer:: segmentation => <lib.trainers.segmentation.Segmentation object at 0x7feea7102a30>
    [2022-09-25 04:36:41,158] [446] [MainThread] [INFO] (monailabel.utils.sessions:51) - Session Path: /home/amitjc/.cache/monailabel/sessions
    [2022-09-25 04:36:41,158] [446] [MainThread] [INFO] (monailabel.utils.sessions:52) - Session Expiry (max): 3600
    [2022-09-25 04:36:41,158] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:365) - Train Request (input): {'model': 'segmentation', 'name': 'train_01', 'pretrained': True, 'device': 'cuda', 'max_epochs': 50, 'early_stop_patience': -1, 'val_split': 0.2, 'train_batch_size': 1, 'val_batch_size': 1, 'multi_gpu': True, 'gpus': 'all', 'dataset': 'SmartCacheDataset', 'dataloader': 'ThreadDataLoader', 'client_id': 'user-xyz', 'local_rank': 0}
    [2022-09-25 04:36:41,158] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:375) - CUDA_VISIBLE_DEVICES: None
    [2022-09-25 04:36:41,166] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:380) - Distributed/Multi GPU is limited
    [2022-09-25 04:36:41,166] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:395) - Distributed Training = FALSE
    [2022-09-25 04:36:41,166] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:422) - 0 - Train Request (final): {'name': 'train_01', 'pretrained': True, 'device': 'cuda', 'max_epochs': 50, 'early_stop_patience': -1, 'val_split': 0.2, 'train_batch_size': 1, 'val_batch_size': 1, 'multi_gpu': False, 'gpus': 'all', 'dataset': 'SmartCacheDataset', 'dataloader': 'ThreadDataLoader', 'model': 'segmentation', 'client_id': 'user-xyz', 'local_rank': 0, 'run_id': '20220925_0436'}
    [2022-09-25 04:36:41,166] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:521) - 0 - Using Device: cuda; IDX: None
    [2022-09-25 04:36:41,167] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:343) - Total Records for Training: 1
    [2022-09-25 04:36:41,167] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:344) - Total Records for Validation: 0
    NVIDIA GeForce RTX 3090 with CUDA capability sm_86 is not compatible with the current PyTorch installation.
    The current PyTorch install supports CUDA capabilities sm_37 sm_50 sm_60 sm_70.
    If you want to use the NVIDIA GeForce RTX 3090 GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/
    [2022-09-25 04:36:41,871] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:608) - 0 - Load Path /home/amitjc/apps/radiology/model/pretrained_segmentation.pt
    Loading dataset:   0%|          | 0/1 [00:00<?, ?it/s]
    Loading dataset: 100%|██████████| 1/1 [00:12<00:00, 12.42s/it]
    Loading dataset: 100%|██████████| 1/1 [00:12<00:00, 12.42s/it]
    cache_num is greater or equal than dataset length, fall back to regular monai.data.CacheDataset.
    [2022-09-25 04:36:54,299] [446] [MainThread] [INFO] (monailabel.tasks.train.basic_train:228) - 0 - Records for Training: 1
    [2022-09-25 04:36:54,302] [446] [MainThread] [INFO] (ignite.engine.engine.SupervisedTrainer:876) - Engine run resuming from iteration 0, epoch 0 until 50 epochs
    [2022-09-25 04:36:54,346] [446] [MainThread] [INFO] (ignite.engine.engine.SupervisedTrainer:138) - Restored all variables from /home/amitjc/apps/radiology/model/pretrained_segmentation.pt
    Exception in thread Thread-8:
    Traceback (most recent call last):
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/transform.py", line 90, in apply_transform
        return [_apply_transform(transform, item, unpack_items) for item in data]
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/transform.py", line 90, in <listcomp>
        return [_apply_transform(transform, item, unpack_items) for item in data]
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/transform.py", line 55, in _apply_transform
        return transform(parameters)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/spatial/dictionary.py", line 1227, in __call__
        d[key] = self.flipper(d[key])
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/spatial/array.py", line 765, in __call__
        out = self.forward_image(img, axes)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/spatial/array.py", line 756, in forward_image
        return torch.flip(img, axes)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/data/meta_tensor.py", line 249, in __torch_function__
        ret = super().__torch_function__(func, types, args, kwargs)
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/_tensor.py", line 1121, in __torch_function__
        ret = func(*args, **kwargs)
    RuntimeError: CUDA error: no kernel image is available for execution on the device
    CUDA kernel errors might be asynchronously reported at some other API call,so the stacktrace below might be incorrect.
    For debugging consider passing CUDA_LAUNCH_BLOCKING=1.
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/threading.py", line 980, in _bootstrap_inner
        self.run()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/threading.py", line 917, in run
        self._target(*self._args, **self._kwargs)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/data/thread_buffer.py", line 48, in enqueue_values
        for src_val in self.src:
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 681, in __next__
        data = self._next_data()
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 721, in _next_data
        data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 49, in fetch
        data = [self.dataset[idx] for idx in possibly_batched_index]
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 49, in <listcomp>
        data = [self.dataset[idx] for idx in possibly_batched_index]
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/data/dataset.py", line 105, in __getitem__
        return self._transform(index)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/data/dataset.py", line 878, in _transform
        data = apply_transform(_transform, data)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/transform.py", line 118, in apply_transform
        raise RuntimeError(f"applying transform {transform}") from e
    RuntimeError: applying transform <monai.transforms.spatial.dictionary.RandFlipd object at 0x7feea7583760>
    Exception in thread Thread-9:
    Traceback (most recent call last):
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/transform.py", line 90, in apply_transform
        return [_apply_transform(transform, item, unpack_items) for item in data]
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/transform.py", line 90, in <listcomp>
        return [_apply_transform(transform, item, unpack_items) for item in data]
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/transform.py", line 55, in _apply_transform
        return transform(parameters)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/intensity/dictionary.py", line 421, in __call__
        d[key] = self.shifter(d[key], factor=factor, randomize=False)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/intensity/array.py", line 289, in __call__
        return self._shifter(img, self._offset if factor is None else self._offset * factor)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/intensity/array.py", line 236, in __call__
        out = img + offset
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/data/meta_tensor.py", line 249, in __torch_function__
        ret = super().__torch_function__(func, types, args, kwargs)
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/_tensor.py", line 1121, in __torch_function__
        ret = func(*args, **kwargs)
    RuntimeError: CUDA error: no kernel image is available for execution on the device
    CUDA kernel errors might be asynchronously reported at some other API call,so the stacktrace below might be incorrect.
    For debugging consider passing CUDA_LAUNCH_BLOCKING=1.
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/threading.py", line 980, in _bootstrap_inner
        self.run()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/threading.py", line 917, in run
        self._target(*self._args, **self._kwargs)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/data/thread_buffer.py", line 48, in enqueue_values
        for src_val in self.src:
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 681, in __next__
        data = self._next_data()
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 721, in _next_data
        data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 49, in fetch
        data = [self.dataset[idx] for idx in possibly_batched_index]
      File "/home/amitjc/.local/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 49, in <listcomp>
        data = [self.dataset[idx] for idx in possibly_batched_index]
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/data/dataset.py", line 105, in __getitem__
        return self._transform(index)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/data/dataset.py", line 878, in _transform
        data = apply_transform(_transform, data)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/transforms/transform.py", line 118, in apply_transform
        raise RuntimeError(f"applying transform {transform}") from e
    RuntimeError: applying transform <monai.transforms.intensity.dictionary.RandShiftIntensityd object at 0x7feea7583850>
    Data iterator can not provide data anymore but required total number of iterations to run is not reached. Current iteration: 0 vs Total iterations to run : 50
    [2022-09-25 04:36:54,629] [446] [MainThread] [ERROR] (ignite.engine.engine.SupervisedTrainer:992) - Engine run is terminating due to exception: the data to aggregate must be PyTorch Tensor.
    [2022-09-25 04:36:54,629] [446] [MainThread] [ERROR] (ignite.engine.engine.SupervisedTrainer:178) - Exception: the data to aggregate must be PyTorch Tensor.
    Traceback (most recent call last):
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 965, in _internal_run_as_gen
        self._fire_event(Events.EPOCH_COMPLETED)
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 425, in _fire_event
        func(*first, *(event_args + others), **kwargs)
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/metrics/metric.py", line 329, in completed
        result = self.compute()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/handlers/ignite_metric.py", line 90, in compute
        result = self.metric_fn.aggregate()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/metrics/meandice.py", line 99, in aggregate
        raise ValueError("the data to aggregate must be PyTorch Tensor.")
    ValueError: the data to aggregate must be PyTorch Tensor.
    Traceback (most recent call last):
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/runpy.py", line 197, in _run_module_as_main
        return _run_code(code, main_globals, None,
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/runpy.py", line 87, in _run_code
        exec(code, run_globals)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monailabel/interfaces/utils/app.py", line 132, in <module>
        run_main()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monailabel/interfaces/utils/app.py", line 117, in run_main
        result = a.train(request)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monailabel/interfaces/app.py", line 416, in train
        result = task(request, self.datastore())
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monailabel/tasks/train/basic_train.py", line 396, in __call__
        res = self.train(0, world_size, req, datalist)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monailabel/tasks/train/basic_train.py", line 458, in train
        context.trainer.run()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/engines/trainer.py", line 53, in run
        super().run()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/engines/workflow.py", line 278, in run
        super().run(data=self.data_loader, max_epochs=self.state.max_epochs)
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 892, in run
        return self._internal_run()
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 935, in _internal_run
        return next(self._internal_run_generator)
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 993, in _internal_run_as_gen
        self._handle_exception(e)
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 636, in _handle_exception
        self._fire_event(Events.EXCEPTION_RAISED, e)
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 425, in _fire_event
        func(*first, *(event_args + others), **kwargs)
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/handlers/stats_handler.py", line 179, in exception_raised
        raise e
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 965, in _internal_run_as_gen
        self._fire_event(Events.EPOCH_COMPLETED)
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/engine/engine.py", line 425, in _fire_event
        func(*first, *(event_args + others), **kwargs)
      File "/home/amitjc/.local/lib/python3.9/site-packages/ignite/metrics/metric.py", line 329, in completed
        result = self.compute()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/handlers/ignite_metric.py", line 90, in compute
        result = self.metric_fn.aggregate()
      File "/home/amitjc/anaconda3/envs/monailabel-env/lib/python3.9/site-packages/monai/metrics/meandice.py", line 99, in aggregate
        raise ValueError("the data to aggregate must be PyTorch Tensor.")
    ValueError: the data to aggregate must be PyTorch Tensor.
    [2022-09-25 04:36:54,988] [393] [ThreadPoolExecutor-0_0] [INFO] (monailabel.utils.async_tasks.utils:77) - Return code: 1
    

    An RTX 3090 / PyTorch incompatibility error was returned; however,

    conda list | grep "torch"
    

    reveals

    pytorch                   1.13.0.dev20220924 py3.9_cuda11.7_cudnn8.5.0_0    pytorch-nightly
    pytorch-cuda              11.7                 h67b0de4_0    pytorch-nightly
    pytorch-mutex             1.0                        cuda    pytorch-nightly
    torchaudio                0.13.0.dev20220924      py39_cu117    pytorch-nightly
    torchvision               0.13.1                   pypi_0    pypi
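
    Since the traceback above shows torch being imported from /home/amitjc/.local/lib/python3.9/site-packages while conda lists a cu117 nightly build in the environment, a quick, hedged check of which PyTorch build the server process actually picks up (run it with the same Python used to start the server) is:

    # print where torch is imported from, its version, and whether CUDA is visible
    python -c "import torch; print(torch.__file__, torch.__version__, torch.cuda.is_available())"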
    

    The MONAI Label server was deployed on WSL because the Slicer MONAI Label module on Ubuntu lacks the "Train" and "Stop" tabs/buttons after the server is deployed on Ubuntu 22.04; an issue was raised on the Slicer forum.

    The MONAI Label server was also deployed successfully via PowerShell; however, training could not be done through that deployment either. Please find the logs below:

    monailabel start_server --app apps/radiology --studies datasets/TMH/Liver/ --conf models segmentation
    Using PYTHONPATH=C:\ProgramData\Anaconda3\envs;
    ""
    2022-09-25 05:00:26,485 - USING:: version = False
    2022-09-25 05:00:26,485 - USING:: app = N:\ApPz\MONAI\apps\radiology
    2022-09-25 05:00:26,486 - USING:: studies = N:\ApPz\MONAI\datasets\TMH\Liver
    2022-09-25 05:00:26,488 - USING:: verbose = INFO
    2022-09-25 05:00:26,489 - USING:: conf = [['models', 'segmentation']]
    2022-09-25 05:00:26,489 - USING:: host = 0.0.0.0
    2022-09-25 05:00:26,490 - USING:: port = 8000
    2022-09-25 05:00:26,494 - USING:: uvicorn_app = monailabel.app:app
    2022-09-25 05:00:26,494 - USING:: ssl_keyfile = None
    2022-09-25 05:00:26,495 - USING:: ssl_certfile = None
    2022-09-25 05:00:26,495 - USING:: ssl_keyfile_password = None
    2022-09-25 05:00:26,496 - USING:: ssl_ca_certs = None
    2022-09-25 05:00:26,496 - USING:: workers = None
    2022-09-25 05:00:26,496 - USING:: limit_concurrency = None
    2022-09-25 05:00:26,497 - USING:: access_log = False
    2022-09-25 05:00:26,497 - USING:: log_config = None
    2022-09-25 05:00:26,497 - USING:: dryrun = False
    2022-09-25 05:00:26,498 - USING:: action = start_server
    2022-09-25 05:00:26,499 - ENV SETTINGS:: MONAI_LABEL_API_STR =
    2022-09-25 05:00:26,499 - ENV SETTINGS:: MONAI_LABEL_PROJECT_NAME = MONAILabel
    2022-09-25 05:00:26,499 - ENV SETTINGS:: MONAI_LABEL_APP_DIR =
    2022-09-25 05:00:26,500 - ENV SETTINGS:: MONAI_LABEL_STUDIES =
    2022-09-25 05:00:26,500 - ENV SETTINGS:: MONAI_LABEL_AUTH_ENABLE = False
    2022-09-25 05:00:26,501 - ENV SETTINGS:: MONAI_LABEL_AUTH_DB =
    2022-09-25 05:00:26,501 - ENV SETTINGS:: MONAI_LABEL_APP_CONF = '{}'
    2022-09-25 05:00:26,504 - ENV SETTINGS:: MONAI_LABEL_TASKS_TRAIN = True
    2022-09-25 05:00:26,507 - ENV SETTINGS:: MONAI_LABEL_TASKS_STRATEGY = True
    2022-09-25 05:00:26,508 - ENV SETTINGS:: MONAI_LABEL_TASKS_SCORING = True
    2022-09-25 05:00:26,511 - ENV SETTINGS:: MONAI_LABEL_TASKS_BATCH_INFER = True
    2022-09-25 05:00:26,512 - ENV SETTINGS:: MONAI_LABEL_DATASTORE =
    2022-09-25 05:00:26,512 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_URL =
    2022-09-25 05:00:26,513 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_USERNAME =
    2022-09-25 05:00:26,513 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_PASSWORD =
    2022-09-25 05:00:26,513 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_API_KEY =
    2022-09-25 05:00:26,514 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_CACHE_PATH =
    2022-09-25 05:00:26,514 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_PROJECT =
    2022-09-25 05:00:26,515 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_ASSET_PATH =
    2022-09-25 05:00:26,515 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_DSA_ANNOTATION_GROUPS =
    2022-09-25 05:00:26,515 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_USERNAME =
    2022-09-25 05:00:26,516 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_PASSWORD =
    2022-09-25 05:00:26,516 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CACHE_PATH =
    2022-09-25 05:00:26,517 - ENV SETTINGS:: MONAI_LABEL_QIDO_PREFIX =
    2022-09-25 05:00:26,517 - ENV SETTINGS:: MONAI_LABEL_WADO_PREFIX =
    2022-09-25 05:00:26,517 - ENV SETTINGS:: MONAI_LABEL_STOW_PREFIX =
    2022-09-25 05:00:26,518 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_FETCH_BY_FRAME = False
    2022-09-25 05:00:26,521 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_SEARCH_FILTER = '{"Modality": "CT"}'
    2022-09-25 05:00:26,525 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CACHE_EXPIRY = 180
    2022-09-25 05:00:26,525 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_READ_TIMEOUT = 5.0
    2022-09-25 05:00:26,526 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_AUTO_RELOAD = True
    2022-09-25 05:00:26,526 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_FILE_EXT = '["*.nii.gz", "*.nii", "*.nrrd", "*.jpg", "*.png", "*.tif", "*.svs", "*.xml"]'
    2022-09-25 05:00:26,527 - ENV SETTINGS:: MONAI_LABEL_SERVER_PORT = 8000
    2022-09-25 05:00:26,527 - ENV SETTINGS:: MONAI_LABEL_CORS_ORIGINS = '[]'
    2022-09-25 05:00:26,528 - ENV SETTINGS:: MONAI_LABEL_SESSIONS = True
    2022-09-25 05:00:26,528 - ENV SETTINGS:: MONAI_LABEL_SESSION_PATH =
    2022-09-25 05:00:26,529 - ENV SETTINGS:: MONAI_LABEL_SESSION_EXPIRY = 3600
    2022-09-25 05:00:26,529 - ENV SETTINGS:: MONAI_LABEL_INFER_CONCURRENCY = -1
    2022-09-25 05:00:26,529 - ENV SETTINGS:: MONAI_LABEL_INFER_TIMEOUT = 600
    2022-09-25 05:00:26,530 - ENV SETTINGS:: MONAI_LABEL_AUTO_UPDATE_SCORING = True
    2022-09-25 05:00:26,530 -
    Allow Origins: ['*']
    [2022-09-25 05:00:27,327] [25360] [MainThread] [INFO] (uvicorn.error:75) - Started server process [25360]
    [2022-09-25 05:00:27,328] [25360] [MainThread] [INFO] (uvicorn.error:45) - Waiting for application startup.
    [2022-09-25 05:00:27,328] [25360] [MainThread] [INFO] (monailabel.interfaces.utils.app:38) - Initializing App from: N:\ApPz\MONAI\apps\radiology; studies: N:\ApPz\MONAI\datasets\TMH\Liver; conf: {'models': 'segmentation'}
    [2022-09-25 05:00:27,392] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for MONAILabelApp Found: <class 'main.MyApp'>
    [2022-09-25 05:00:27,408] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepedit.DeepEdit'>
    [2022-09-25 05:00:27,408] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepgrow_2d.Deepgrow2D'>
    [2022-09-25 05:00:27,409] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepgrow_3d.Deepgrow3D'>
    [2022-09-25 05:00:27,411] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.localization_spine.LocalizationSpine'>
    [2022-09-25 05:00:27,412] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.localization_vertebra.LocalizationVertebra'>
    [2022-09-25 05:00:27,413] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation.Segmentation'>
    [2022-09-25 05:00:27,418] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_spleen.SegmentationSpleen'>
    [2022-09-25 05:00:27,419] [25360] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_vertebra.SegmentationVertebra'>
    [2022-09-25 05:00:27,421] [25360] [MainThread] [INFO] (main:86) - +++ Adding Model: segmentation => lib.configs.segmentation.Segmentation
    [2022-09-25 05:00:27,463] [25360] [MainThread] [INFO] (main:90) - +++ Using Models: ['segmentation']
    [2022-09-25 05:00:27,463] [25360] [MainThread] [INFO] (monailabel.interfaces.app:128) - Init Datastore for: N:\ApPz\MONAI\datasets\TMH\Liver
    [2022-09-25 05:00:27,465] [25360] [MainThread] [INFO] (monailabel.datastore.local:126) - Auto Reload: True; Extensions: ['*.nii.gz', '*.nii', '*.nrrd', '*.jpg', '*.png', '*.tif', '*.svs', '*.xml']
    [2022-09-25 05:00:27,475] [25360] [MainThread] [INFO] (monailabel.datastore.local:540) - Invalidate count: 0
    [2022-09-25 05:00:27,475] [25360] [MainThread] [INFO] (monailabel.datastore.local:146) - Start observing external modifications on datastore (AUTO RELOAD)
    [2022-09-25 05:00:27,478] [25360] [MainThread] [INFO] (main:116) - +++ Adding Inferer:: segmentation => <lib.infers.segmentation.Segmentation object at 0x000002792556EB80>
    [2022-09-25 05:00:27,479] [25360] [MainThread] [INFO] (main:172) - {'segmentation': <lib.infers.segmentation.Segmentation object at 0x000002792556EB80>, 'Histogram+GraphCut': <monailabel.scribbles.infer.HistogramBasedGraphCut object at 0x0000027926507400>, 'GMM+GraphCut': <monailabel.scribbles.infer.GMMBasedGraphCut object at 0x00000279265073D0>}
    [2022-09-25 05:00:27,479] [25360] [MainThread] [INFO] (main:185) - +++ Adding Trainer:: segmentation => <lib.trainers.segmentation.Segmentation object at 0x0000027926507EE0>
    [2022-09-25 05:00:27,481] [25360] [MainThread] [INFO] (monailabel.utils.sessions:51) - Session Path: C:\Users\AMiT\.cache\monailabel\sessions
    [2022-09-25 05:00:27,482] [25360] [MainThread] [INFO] (monailabel.utils.sessions:52) - Session Expiry (max): 3600
    [2022-09-25 05:00:27,482] [25360] [MainThread] [INFO] (monailabel.interfaces.app:465) - App Init - completed
    [2022-09-25 05:00:27,483] [timeloop] [INFO] Starting Timeloop..
    [2022-09-25 05:00:27,483] [25360] [MainThread] [INFO] (timeloop:60) - Starting Timeloop..
    [2022-09-25 05:00:27,484] [timeloop] [INFO] Registered job <function MONAILabelApp.on_init_complete.<locals>.run_scheduler at 0x00000279255FB1F0>
    [2022-09-25 05:00:27,484] [25360] [MainThread] [INFO] (timeloop:42) - Registered job <function MONAILabelApp.on_init_complete.<locals>.run_scheduler at 0x00000279255FB1F0>
    [2022-09-25 05:00:27,485] [timeloop] [INFO] Timeloop now started. Jobs will run based on the interval set
    [2022-09-25 05:00:27,485] [25360] [MainThread] [INFO] (timeloop:63) - Timeloop now started. Jobs will run based on the interval set
    [2022-09-25 05:00:27,489] [25360] [MainThread] [INFO] (uvicorn.error:59) - Application startup complete.
    [2022-09-25 05:00:27,490] [25360] [MainThread] [INFO] (uvicorn.error:206) - Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
    [2022-09-25 05:02:10,277] [25360] [MainThread] [INFO] (monailabel.endpoints.datastore:100) - Saving Label for CV17056 for tag: final by admin
    [2022-09-25 05:02:10,278] [25360] [MainThread] [INFO] (monailabel.endpoints.datastore:111) - Save Label params: {"label_info": [{"name": "Liver", "idx": 1}, {"name": "right kidney", "idx": 2}, {"name": "left kidney", "idx": 3}, {"name": "gallbladder", "idx": 4}, {"name": "esophagus", "idx": 5}, {"name": "stomach", "idx": 6}, {"name": "aorta", "idx": 7}, {"name": "inferior vena cava", "idx": 8}, {"name": "portal vein and splenic vein", "idx": 9}, {"name": "pancreas", "idx": 10}, {"name": "right adrenal gland", "idx": 11}, {"name": "left adrenal gland", "idx": 12}], "client_id": "user-xyz"}
    [2022-09-25 05:02:10,279] [25360] [MainThread] [INFO] (monailabel.datastore.local:449) - Saving Label for Image: CV17056; Tag: final; Info: {'label_info': [{'name': 'Liver', 'idx': 1}, {'name': 'right kidney', 'idx': 2}, {'name': 'left kidney', 'idx': 3}, {'name': 'gallbladder', 'idx': 4}, {'name': 'esophagus', 'idx': 5}, {'name': 'stomach', 'idx': 6}, {'name': 'aorta', 'idx': 7}, {'name': 'inferior vena cava', 'idx': 8}, {'name': 'portal vein and splenic vein', 'idx': 9}, {'name': 'pancreas', 'idx': 10}, {'name': 'right adrenal gland', 'idx': 11}, {'name': 'left adrenal gland', 'idx': 12}], 'client_id': 'user-xyz'}
    [2022-09-25 05:02:10,282] [25360] [MainThread] [INFO] (monailabel.datastore.local:457) - Adding Label: CV17056 => final => C:\Users\AMiT\AppData\Local\Temp\tmpo1idaxoj.nii.gz
    [2022-09-25 05:02:10,289] [25360] [MainThread] [INFO] (monailabel.datastore.local:473) - Label Info: {'label_info': [{'name': 'Liver', 'idx': 1}, {'name': 'right kidney', 'idx': 2}, {'name': 'left kidney', 'idx': 3}, {'name': 'gallbladder', 'idx': 4}, {'name': 'esophagus', 'idx': 5}, {'name': 'stomach', 'idx': 6}, {'name': 'aorta', 'idx': 7}, {'name': 'inferior vena cava', 'idx': 8}, {'name': 'portal vein and splenic vein', 'idx': 9}, {'name': 'pancreas', 'idx': 10}, {'name': 'right adrenal gland', 'idx': 11}, {'name': 'left adrenal gland', 'idx': 12}], 'client_id': 'user-xyz', 'ts': 1664062330, 'checksum': 'SHA256:be4f651b196e6c8799deebee7718d2204c55ecd7af144ee7d11ce936e85c06ac', 'name': 'CV17056.nii.gz'}
    [2022-09-25 05:02:10,291] [25360] [MainThread] [INFO] (monailabel.interfaces.app:489) - New label saved for: CV17056 => CV17056
    [2022-09-25 05:02:20,220] [25360] [MainThread] [INFO] (monailabel.utils.async_tasks.task:36) - Train request: {'model': 'segmentation', 'name': 'train_01', 'pretrained': True, 'device': 'cuda', 'max_epochs': 50, 'early_stop_patience': -1, 'val_split': 0.2, 'train_batch_size': 1, 'val_batch_size': 1, 'multi_gpu': True, 'gpus': 'all', 'dataset': 'SmartCacheDataset', 'dataloader': 'ThreadDataLoader', 'client_id': 'user-xyz'}
    [2022-09-25 05:02:20,222] [25360] [ThreadPoolExecutor-0_0] [INFO] (monailabel.utils.async_tasks.utils:59) - COMMAND:: C:\ProgramData\Anaconda3\envs\monailabel-env\python.exe -m monailabel.interfaces.utils.app -m train -r {"model":"segmentation","name":"train_01","pretrained":true,"device":"cuda","max_epochs":50,"early_stop_patience":-1,"val_split":0.2,"train_batch_size":1,"val_batch_size":1,"multi_gpu":true,"gpus":"all","dataset":"SmartCacheDataset","dataloader":"ThreadDataLoader","client_id":"user-xyz"}
    [2022-09-25 05:02:20,328] [4396] [MainThread] [INFO] (__main__:38) - Initializing App from: N:\ApPz\MONAI\apps\radiology; studies: N:\ApPz\MONAI\datasets\TMH\Liver; conf: {'models': 'segmentation'}
    [2022-09-25 05:02:22,635] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for MONAILabelApp Found: <class 'main.MyApp'>
    [2022-09-25 05:02:22,645] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepedit.DeepEdit'>
    [2022-09-25 05:02:22,645] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepgrow_2d.Deepgrow2D'>
    [2022-09-25 05:02:22,645] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.deepgrow_3d.Deepgrow3D'>
    [2022-09-25 05:02:22,646] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.localization_spine.LocalizationSpine'>
    [2022-09-25 05:02:22,646] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.localization_vertebra.LocalizationVertebra'>
    [2022-09-25 05:02:22,647] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation.Segmentation'>
    [2022-09-25 05:02:22,647] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_spleen.SegmentationSpleen'>
    [2022-09-25 05:02:22,648] [4396] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_vertebra.SegmentationVertebra'>
    [2022-09-25 05:02:22,648] [4396] [MainThread] [INFO] (main:86) - +++ Adding Model: segmentation => lib.configs.segmentation.Segmentation
    [2022-09-25 05:02:22,674] [4396] [MainThread] [INFO] (main:90) - +++ Using Models: ['segmentation']
    [2022-09-25 05:02:22,674] [4396] [MainThread] [INFO] (monailabel.interfaces.app:128) - Init Datastore for: N:\ApPz\MONAI\datasets\TMH\Liver
    [2022-09-25 05:02:22,674] [4396] [MainThread] [INFO] (monailabel.datastore.local:126) - Auto Reload: False; Extensions: ['*.nii.gz', '*.nii', '*.nrrd', '*.jpg', '*.png', '*.tif', '*.svs', '*.xml']
    [2022-09-25 05:02:22,680] [4396] [MainThread] [INFO] (monailabel.datastore.local:540) - Invalidate count: 0
    [2022-09-25 05:02:22,680] [4396] [MainThread] [INFO] (main:116) - +++ Adding Inferer:: segmentation => <lib.infers.segmentation.Segmentation object at 0x0000026767A08400>
    [2022-09-25 05:02:22,680] [4396] [MainThread] [INFO] (main:172) - {'segmentation': <lib.infers.segmentation.Segmentation object at 0x0000026767A08400>, 'Histogram+GraphCut': <monailabel.scribbles.infer.HistogramBasedGraphCut object at 0x0000026769D73E50>, 'GMM+GraphCut': <monailabel.scribbles.infer.GMMBasedGraphCut object at 0x0000026769D73E20>}
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (main:185) - +++ Adding Trainer:: segmentation => <lib.trainers.segmentation.Segmentation object at 0x0000026769D73EB0>
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.utils.sessions:51) - Session Path: C:\Users\AMiT\.cache\monailabel\sessions
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.utils.sessions:52) - Session Expiry (max): 3600
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:365) - Train Request (input): {'model': 'segmentation', 'name': 'train_01', 'pretrained': True, 'device': 'cuda', 'max_epochs': 50, 'early_stop_patience': -1, 'val_split': 0.2, 'train_batch_size': 1, 'val_batch_size': 1, 'multi_gpu': True, 'gpus': 'all', 'dataset': 'SmartCacheDataset', 'dataloader': 'ThreadDataLoader', 'client_id': 'user-xyz', 'local_rank': 0}
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:375) - CUDA_VISIBLE_DEVICES: None
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:380) - Distributed/Multi GPU is limited
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:395) - Distributed Training = FALSE
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:422) - 0 - Train Request (final): {'name': 'train_01', 'pretrained': True, 'device': 'cuda', 'max_epochs': 50, 'early_stop_patience': -1, 'val_split': 0.2, 'train_batch_size': 1, 'val_batch_size': 1, 'multi_gpu': False, 'gpus': 'all', 'dataset': 'SmartCacheDataset', 'dataloader': 'ThreadDataLoader', 'model': 'segmentation', 'client_id': 'user-xyz', 'local_rank': 0, 'run_id': '20220925_0502'}
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:521) - 0 - Using Device: cpu; IDX: None
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:343) - Total Records for Training: 1
    [2022-09-25 05:02:22,681] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:344) - Total Records for Validation: 0
    [2022-09-25 05:02:22,688] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:608) - 0 - Load Path N:\ApPz\MONAI\apps\radiology\model\pretrained_segmentation.pt
    Loading dataset:   0%|          | 0/1 [00:00<?, ?it/s]
    Loading dataset: 100%|##########| 1/1 [00:14<00:00, 14.51s/it]
    Loading dataset: 100%|##########| 1/1 [00:14<00:00, 14.51s/it]
    cache_num is greater or equal than dataset length, fall back to regular monai.data.CacheDataset.
    [2022-09-25 05:02:37,202] [4396] [MainThread] [INFO] (monailabel.tasks.train.basic_train:228) - 0 - Records for Training: 1
    torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.
    [2022-09-25 05:02:37,205] [4396] [MainThread] [INFO] (ignite.engine.engine.SupervisedTrainer:876) - Engine run resuming from iteration 0, epoch 0 until 50 epochs
    [2022-09-25 05:02:37,221] [4396] [MainThread] [ERROR] (ignite.engine.engine.SupervisedTrainer:992) - Engine run is terminating due to exception: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
    [2022-09-25 05:02:37,221] [4396] [MainThread] [ERROR] (ignite.engine.engine.SupervisedTrainer:178) - Exception: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
    Traceback (most recent call last):
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 946, in _internal_run_as_gen
        self._fire_event(Events.STARTED)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 425, in _fire_event
        func(*first, *(event_args + others), **kwargs)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monai\handlers\checkpoint_loader.py", line 107, in __call__
        checkpoint = torch.load(self.load_path, map_location=self.map_location)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 712, in load
        return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 1049, in _load
        result = unpickler.load()
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 1019, in persistent_load
        load_tensor(dtype, nbytes, key, _maybe_decode_ascii(location))
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 1001, in load_tensor
        wrap_storage=restore_location(storage, location),
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 175, in default_restore_location
        result = fn(storage, location)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 152, in _cuda_deserialize
        device = validate_cuda_device(location)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 136, in validate_cuda_device
        raise RuntimeError('Attempting to deserialize object on a CUDA '
    RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
    Traceback (most recent call last):
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\runpy.py", line 197, in _run_module_as_main
        return _run_code(code, main_globals, None,
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\runpy.py", line 87, in _run_code
        exec(code, run_globals)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monailabel\interfaces\utils\app.py", line 132, in <module>
        run_main()
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monailabel\interfaces\utils\app.py", line 117, in run_main
        result = a.train(request)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monailabel\interfaces\app.py", line 416, in train
        result = task(request, self.datastore())
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monailabel\tasks\train\basic_train.py", line 396, in __call__
        res = self.train(0, world_size, req, datalist)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monailabel\tasks\train\basic_train.py", line 458, in train
        context.trainer.run()
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monai\engines\trainer.py", line 53, in run
        super().run()
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monai\engines\workflow.py", line 278, in run
        super().run(data=self.data_loader, max_epochs=self.state.max_epochs)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 892, in run
        return self._internal_run()
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 935, in _internal_run
        return next(self._internal_run_generator)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 993, in _internal_run_as_gen
        self._handle_exception(e)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 636, in _handle_exception
        self._fire_event(Events.EXCEPTION_RAISED, e)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 425, in _fire_event
        func(*first, *(event_args + others), **kwargs)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monai\handlers\stats_handler.py", line 179, in exception_raised
        raise e
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 946, in _internal_run_as_gen
        self._fire_event(Events.STARTED)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\ignite\engine\engine.py", line 425, in _fire_event
        func(*first, *(event_args + others), **kwargs)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\monai\handlers\checkpoint_loader.py", line 107, in __call__
        checkpoint = torch.load(self.load_path, map_location=self.map_location)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 712, in load
        return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 1049, in _load
        result = unpickler.load()
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 1019, in persistent_load
        load_tensor(dtype, nbytes, key, _maybe_decode_ascii(location))
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 1001, in load_tensor
        wrap_storage=restore_location(storage, location),
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 175, in default_restore_location
        result = fn(storage, location)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 152, in _cuda_deserialize
        device = validate_cuda_device(location)
      File "C:\ProgramData\Anaconda3\envs\monailabel-env\lib\site-packages\torch\serialization.py", line 136, in validate_cuda_device
        raise RuntimeError('Attempting to deserialize object on a CUDA '
    RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
    [2022-09-25 05:02:37,573] [25360] [ThreadPoolExecutor-0_0] [INFO] (monailabel.utils.async_tasks.utils:77) - Return code: 1
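
    A likely remedy, as the error message itself suggests, is to load the CUDA-trained checkpoint with an explicit map_location. The sketch below is a standalone check rather than the MONAI Label trainer's code; the checkpoint path is the one printed in the training log above, and whether the bundled trainer exposes a map_location option depends on the app version:

    import torch

    # Load the pretrained weights onto the CPU so torch does not try to restore
    # CUDA tensors on a machine where torch.cuda.is_available() is False.
    ckpt_path = r"N:\ApPz\MONAI\apps\radiology\model\pretrained_segmentation.pt"
    checkpoint = torch.load(ckpt_path, map_location=torch.device("cpu"))
    print(list(checkpoint.keys())[:5])  # quick inspection of the checkpoint contents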
    

    Looking forward to guidance on working with MONAI.

    Thanks again.

    Best Regards, Amit.

    question 
    opened by amitjc 22
  • Related to label deviating position (DeepEdit multi label app)

    Related to label deviating position (DeepEdit multi label app)

    Dear Sachidanand,

    First of all I would like to thank and congratulate you for your tremendous work. AI for healthcare imaging has never been more accessible.

    I am a medical physicist in radiotherapy department in Amsterdam. I am currently investigating the possibilities of deep learning based segmentation in adaptive radiotherapy using MRI images. I've been using/developing monailabel APP (deep_edit multilabel app) for a while and so far I'm very happy with it.

    However, I have encountered a few issues and have not been able to resolve them and I hope you can help me with that;

    • This app generates labels with a (default) deviating position (x: -2, y: -2, z: -2). Maybe this is because of a transform or something else; do you have an idea how I can solve this?

    • I use this app to segment different organs in the abdomen. The results are very good for 8 of the 9 organs (Dice > 0.95), but for the duodenum it is much lower. Is there an opportunity within MONAI Label to put more focus on improving a single structure?

    • Is there a possibility to automatically export the created labels in the DICOM-RT file structure?

    Kind Regards

    OB

    help wanted 0.4.0 
    opened by SachidanandAlle 21
  • Miscellaneous - Generated mask is shown with different origin from the original image

    Miscellaneous - Generated mask is shown with different origin from the original image

    Describe the bug Slicer shows the automatic label with a different origin. Apparently, it is related to the way ITK interprets the NIfTI header. From this discussion, it seems nibabel and ITK differ in the way they interpret the NIfTI header (https://github.com/nipy/nibabel/issues/863)

    CC @wyli

    To Reproduce Using the Fusion Challenge dataset MICCAI 2012 (OASIS project) to train highresnet for Brain parcellation.

    Fusion Challenge dataset MICCAI 2012: 35 brain MRI scans obtained from the OASIS project. The manual brain segmentations of these images were produced by Neuromorphometrics, Inc. (http://Neuromorphometrics.com/) using the brainCOLOR labeling protocol.

    Expected behavior To show the label in the same origin as the original image

    Screenshots The workaround for this is to first change the affine of images and labels. Then use MONAI Label to train the model. See the affine changes in the following images

    Original Affine:

    original_affine_origin

    Modified affine:

    modified_origin_affine
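
    A minimal sketch of that workaround with nibabel, assuming hypothetical file names and that the label should simply inherit the image's affine before training:

    import nibabel as nib

    # Load the image and its label, then rewrite the label so it carries exactly
    # the same affine as the image; ITK/Slicer and nibabel then agree on the origin.
    image = nib.load("image.nii.gz")                 # hypothetical paths
    label = nib.load("labels/final/image.nii.gz")

    fixed = nib.Nifti1Image(label.get_fdata().astype("uint8"), image.affine)
    nib.save(fixed, "labels/final/image.nii.gz")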

    bug 0.4.0 
    opened by diazandr3s 19
  • compatibility with MONAI v0.6.0 (15/July)

    compatibility with MONAI v0.6.0 (15/July)

    Is your feature request related to a problem? Please describe. MONAI v0.6.0rc1 is available: https://pypi.org/project/monai/0.6.0rc1/

    This ticket tracks any issues with the new version of MONAI and verifies that the new features are usable:

    • load MMAR pretrained models: https://github.com/Project-MONAI/MONAI/blob/0.6.0rc1/monai/apps/mmars/mmars.py#L204-L207
    • dynamic datalist: there are breaking changes since 0.5.x; we'll release demos https://github.com/Project-MONAI/MONAI/issues/2402

    cc @Nic-Ma @yiheng-wang-nv

    0.1.0 
    opened by wyli 19
  • Improve conversion of DICOM-SEGs to ITK images in DICOMWeb datastore

    Improve conversion of DICOM-SEGs to ITK images in DICOMWeb datastore

    Minor improvement to DICOMWeb datastore:

    • Now uses pydicom-seg in favor of dcmqi_tools's binary segimage2itkimage for conversion of DICOM-SEGs to ITK images (a usage sketch follows below)
    • Now supports multiclass conversion
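
    A rough usage sketch of the new path, assuming the reader classes documented by pydicom-seg (MultiClassReader) and a hypothetical DICOM-SEG file name:

    import pydicom
    import pydicom_seg
    import SimpleITK as sitk

    dcm = pydicom.dcmread("segmentation.dcm")  # hypothetical DICOM-SEG file

    # Multi-class read: all segments are merged into a single label map,
    # replacing the external segimage2itkimage call.
    reader = pydicom_seg.MultiClassReader()
    result = reader.read(dcm)
    sitk.WriteImage(result.image, "label.nrrd", True)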
    opened by jlvahldiek 18
  • Restoring scribbles using torchmaxflow

    Restoring scribbles using torchmaxflow

    Signed-off-by: masadcv [email protected]

    I have updated and released a binary for torchmaxflow (https://github.com/masadcv/torchmaxflow), which can now be used for scribbles. It ships pre-built Windows binaries, so it will resolve the issues reported in https://github.com/Project-MONAI/MONAILabel/issues/719

    This PR restores the scribbles functionality using the graphcut method from torchmaxflow. This PR still has the issue reported in https://github.com/Project-MONAI/MONAILabel/discussions/713; it will be resolved by merging the ongoing PR #717 once this one is merged.

    opened by masadcv 18
  • Activelearning strategy "Random" very slow with large datastores

    Activelearning strategy "Random" very slow with large datastores

    In various projects I have noticed that the speed at which the next sample is determined by the "Random" strategy depends strongly on the total number of studies in the datastore, e.g. it takes up to 10 s with 2,000 unlabeled studies in the store. I consider this a significant drawback for the user experience.

    The time bottleneck originates from the following for-loop which is used to get further information on every unlabeled image. This information is then used to retrieve the image's last timestamp in order to generate an image-specific weight: https://github.com/Project-MONAI/MONAILabel/blob/cb8421c4991a351468d0f2487601b02ec6911944/monailabel/tasks/activelearning/random.py#L39-L43

    All unlabeled images' weights are then used to determine one random image out of all unlabeled images: https://github.com/Project-MONAI/MONAILabel/blob/cb8421c4991a351468d0f2487601b02ec6911944/monailabel/tasks/activelearning/random.py#L45

    Now I wonder if we need this time-intensive weighting in a random draw at all? It is probably very valid for small datastores, to avoid repeated image selections, but in larger datastores it won't play a role. What do you think about a PR that deactivates weighting when more than a user-specified number of unlabeled images is available (e.g. > 50 images), as in the sketch below? Or is there a more time-efficient way to determine the weights?
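
    A rough sketch of what such a PR could look like, assuming the Datastore interface used in the linked lines (get_unlabeled_images / get_image_info); the threshold constant and the info keys for the stored timestamps are hypothetical:

    import random
    import time

    MAX_WEIGHTED_IMAGES = 50  # hypothetical user-configurable threshold

    def next_image(datastore):
        images = datastore.get_unlabeled_images()
        if not images:
            return None

        if len(images) > MAX_WEIGHTED_IMAGES:
            # Large datastore: a plain uniform draw avoids one get_image_info()
            # call per unlabeled image, which is the reported bottleneck.
            return random.choice(images)

        # Small datastore: keep a timestamp-based weighting so recently
        # selected images are less likely to be drawn again.
        now = int(time.time())
        weights = [
            now - datastore.get_image_info(i).get("strategy", {}).get("random", {}).get("ts", 0)
            for i in images
        ]
        return random.choices(images, weights=weights, k=1)[0]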

    opened by jlvahldiek 4
  • PIL.Image.DecompressionBombError: Image size (445833922 pixels) exceeds limit of 178956970 pixels, could be decompression bomb DOS attack.

    PIL.Image.DecompressionBombError: Image size (445833922 pixels) exceeds limit of 178956970 pixels, could be decompression bomb DOS attack.

    Thank you for your marvelous work!

    While using QuPath with your plugin to train on my annotations, I received the errors below. Please help me resolve them.


    [2022-12-28 21:03:17,778] [191288] [MainThread] [INFO] (monailabel.utils.async_tasks.task:41) - Train request: {'model': 'ss', 'train_batch_size': 10, 'val_batch_size': 10} [2022-12-28 21:03:17,778] [191288] [ThreadPoolExecutor-2_0] [INFO] (monailabel.utils.async_tasks.utils:49) - Before:: /miniconda3/envs: [2022-12-28 21:03:17,779] [191288] [ThreadPoolExecutor-2_0] [INFO] (monailabel.utils.async_tasks.utils:53) - After:: /miniconda3/envs: [2022-12-28 21:03:17,779] [191288] [ThreadPoolExecutor-2_0] [INFO] (monailabel.utils.async_tasks.utils:65) - COMMAND:: /miniconda3/envs/p39/bin/python -m monailabel.interfaces.utils.app -m train -r {"model":"ss","train_batch_size":10,"val_batch_size":10,"gpus":"all"} [2022-12-28 21:03:17,878] [202303] [MainThread] [INFO] (main:38) - Initializing App from: /MONAILabel/monailabel/apps/pathology; studies: /MONAILabel/monailabel/datasets/pathology; conf: {} [2022-12-28 21:03:19,114] [202303] [MainThread] [INFO] (numexpr.utils:148) - Note: NumExpr detected 24 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8. [2022-12-28 21:03:19,114] [202303] [MainThread] [INFO] (numexpr.utils:160) - NumExpr defaulting to 8 threads. [2022-12-28 21:03:20,654] [202303] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for MONAILabelApp Found: <class 'main.MyApp'> [2022-12-28 21:03:20,657] [202303] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.segmentation_nuclei.SegmentationNuclei'> [2022-12-28 21:03:20,658] [202303] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.nuclick.NuClick'> [2022-12-28 21:03:20,658] [202303] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.hovernet_nuclei.HovernetNuclei'> [2022-12-28 21:03:20,658] [202303] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.ss.SegmentationNuclei'> [2022-12-28 21:03:20,658] [202303] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for TaskConfig Found: <class 'lib.configs.classification_nuclei.ClassificationNuclei'> [2022-12-28 21:03:20,658] [202303] [MainThread] [INFO] (main:83) - +++ Adding Model: classification_nuclei => lib.configs.classification_nuclei.ClassificationNuclei [2022-12-28 21:03:21,073] [202303] [MainThread] [INFO] (main:83) - +++ Adding Model: hovernet_nuclei => lib.configs.hovernet_nuclei.HovernetNuclei [2022-12-28 21:03:21,148] [202303] [MainThread] [INFO] (main:83) - +++ Adding Model: nuclick => lib.configs.nuclick.NuClick [2022-12-28 21:03:21,211] [202303] [MainThread] [INFO] (main:83) - +++ Adding Model: ss => lib.configs.ss.SegmentationNuclei [2022-12-28 21:03:21,279] [202303] [MainThread] [INFO] (main:83) - +++ Adding Model: segmentation_nuclei => lib.configs.segmentation_nuclei.SegmentationNuclei [2022-12-28 21:03:21,325] [202303] [MainThread] [INFO] (main:87) - +++ Using Models: ['classification_nuclei', 'hovernet_nuclei', 'nuclick', 'ss', 'segmentation_nuclei'] [2022-12-28 21:03:21,325] [202303] [MainThread] [INFO] (monailabel.interfaces.app:135) - Init Datastore for: /MONAILabel/monailabel/datasets/pathology [2022-12-28 21:03:21,325] [202303] [MainThread] [INFO] (monailabel.datastore.local:129) - Auto Reload: False; Extensions: ['.nii.gz', '.nii', '.nrrd', '.jpg', '.png', '.tif', '.svs', '.xml'] [2022-12-28 21:03:21,327] [202303] [MainThread] [INFO] 
(monailabel.datastore.local:576) - Invalidate count: 0 [2022-12-28 21:03:21,483] [202303] [MainThread] [INFO] (main:129) - +++ Adding Inferer:: classification_nuclei => <lib.infers.classification_nuclei.ClassificationNuclei object at 0x7f29d0a49100> [2022-12-28 21:03:21,883] [202303] [MainThread] [INFO] (main:129) - +++ Adding Inferer:: hovernet_nuclei => <lib.infers.hovernet_nuclei.HovernetNuclei object at 0x7f29ae3176a0> [2022-12-28 21:03:21,941] [202303] [MainThread] [INFO] (main:129) - +++ Adding Inferer:: nuclick => <lib.infers.nuclick.NuClick object at 0x7f299bb10a60> [2022-12-28 21:03:21,941] [202303] [MainThread] [INFO] (lib.configs.ss:102) - Using Preload: False; ROI Size: [8192819281920, 8192819281920] [2022-12-28 21:03:21,941] [202303] [MainThread] [INFO] (main:129) - +++ Adding Inferer:: ss => <lib.infers.segmentation_nuclei.SegmentationNuclei object at 0x7f2990d99ee0> [2022-12-28 21:03:21,941] [202303] [MainThread] [INFO] (lib.configs.segmentation_nuclei:88) - Using Preload: False; ROI Size: [1024, 1024] [2022-12-28 21:03:21,941] [202303] [MainThread] [INFO] (main:129) - +++ Adding Inferer:: segmentation_nuclei => <lib.infers.segmentation_nuclei.SegmentationNuclei object at 0x7f29903d6e80> [2022-12-28 21:03:21,941] [202303] [MainThread] [INFO] (main:153) - +++ Adding Trainer:: classification_nuclei => <lib.trainers.classification_nuclei.ClassificationNuclei object at 0x7f29903d6fa0> [2022-12-28 21:03:21,942] [202303] [MainThread] [INFO] (main:153) - +++ Adding Trainer:: hovernet_nuclei => <lib.trainers.hovernet_nuclei.HovernetNuclei object at 0x7f29903d6f70> [2022-12-28 21:03:21,942] [202303] [MainThread] [INFO] (main:153) - +++ Adding Trainer:: nuclick => <lib.trainers.nuclick.NuClick object at 0x7f2990360bb0> [2022-12-28 21:03:21,942] [202303] [MainThread] [INFO] (main:153) - +++ Adding Trainer:: ss => <lib.trainers.segmentation_nuclei.SegmentationNuclei object at 0x7f2990360eb0> [2022-12-28 21:03:21,942] [202303] [MainThread] [INFO] (main:153) - +++ Adding Trainer:: segmentation_nuclei => <lib.trainers.segmentation_nuclei.SegmentationNuclei object at 0x7f2990360ee0> [2022-12-28 21:03:21,942] [202303] [MainThread] [INFO] (main:174) - Active Learning Strategies:: ['wsi_random'] [2022-12-28 21:03:21,942] [202303] [MainThread] [INFO] (monailabel.utils.sessions:51) - Session Path: /root/.cache/monailabel/sessions [2022-12-28 21:03:21,942] [202303] [MainThread] [INFO] (monailabel.utils.sessions:52) - Session Expiry (max): 3600 [2022-12-28 21:03:21,942] [202303] [MainThread] [INFO] (monailabel.tasks.train.basic_train:432) - Train Request (input): {'model': 'ss', 'train_batch_size': 10, 'val_batch_size': 10, 'gpus': 'all', 'local_rank': 0} [2022-12-28 21:03:21,943] [202303] [MainThread] [INFO] (monailabel.tasks.train.basic_train:442) - CUDA_VISIBLE_DEVICES: None [2022-12-28 21:03:21,943] [202303] [MainThread] [INFO] (monailabel.tasks.train.basic_train:605) - Running cleanup... [2022-12-28 21:03:21,943] [202303] [MainThread] [INFO] (lib.utils:90) - Split data based on tile size: (512, 512); groups: {'CC': 1, 'll': 2, 'hh': 3} BasicUNet features: (32, 64, 128, 256, 512, 32). BasicUNet features: (32, 64, 128, 256, 512, 32). BasicUNet features: (32, 64, 128, 256, 512, 32). 
0%| | 0/7 [00:00<?, ?it/s][2022-12-28 21:03:21,944] [202303] [MainThread] [INFO] (lib.utils:550) - ++ Using Groups: {'CC': 1, 'll': 2, 'hh': 3} [2022-12-28 21:03:21,944] [202303] [MainThread] [INFO] (lib.utils:552) - Fetching Image/Label : {'image': '/MONAILabel/monailabel/datasets/pathology/PPPP.png', 'label': '/MONAILabel/monailabel/datasets/pathology/labels/final/PPPP.xml'} [2022-12-28 21:03:22,129] [202303] [MainThread] [INFO] (lib.utils:560) - Total Points: 18458 [2022-12-28 21:03:22,133] [202303] [MainThread] [INFO] (lib.utils:562) - ID: PPPP => Groups: dict_keys(['CC', 'll', 'hh']); Location: (1990, 350); Size: 3077 x 6109 [2022-12-28 21:03:22,946] [202303] [MainThread] [INFO] (lib.utils:616) - Image => Input: (6864, 5312, 3); Total Patches to save: 154 [2022-12-28 21:03:33,933] [202303] [MainThread] [INFO] (lib.utils:593) - CC => p: 0; c: 1; unique: (array([0], dtype=uint8), array([36461568])) [2022-12-28 21:03:34,288] [202303] [MainThread] [INFO] (lib.utils:593) - ll => p: 14; c: 2; unique: (array([0, 2], dtype=uint8), array([35428467, 1033101])) [2022-12-28 21:03:34,636] [202303] [MainThread] [INFO] (lib.utils:593) - hh => p: 0; c: 3; unique: (array([0, 2], dtype=uint8), array([35428467, 1033101])) [2022-12-28 21:03:34,636] [202303] [MainThread] [INFO] (lib.utils:616) - Label => Input: (6864, 5312); Total Patches to save: 154 14%|█▍ | 1/7 [00:12<01:17, 12.91s/it][2022-12-28 21:03:34,850] [202303] [MainThread] [INFO] (lib.utils:550) - ++ Using Groups: {'CC': 1, 'll': 2, 'hh': 3} [2022-12-28 21:03:34,850] [202303] [MainThread] [INFO] (lib.utils:552) - Fetching Image/Label : {'image': '/MONAILabel/monailabel/datasets/pathology/PPPP.png', 'label': '/MONAILabel/monailabel/datasets/pathology/labels/final/PPPP.xml'} [2022-12-28 21:03:35,376] [202303] [MainThread] [INFO] (lib.utils:560) - Total Points: 85424 [2022-12-28 21:03:35,393] [202303] [MainThread] [INFO] (lib.utils:562) - ID: PPPP => Groups: dict_keys(['CC', 'll', 'hh']); Location: (-1289, -586); Size: 34681 x 16388 [2022-12-28 21:03:35,393] [202303] [MainThread] [WARNING] (lib.utils:565) - Reducing Region to Max-Width; w: 34681; max_w: 10240 [2022-12-28 21:03:35,393] [202303] [MainThread] [WARNING] (lib.utils:568) - Reducing Region to Max-Height; h: 16388; max_h: 10240 14%|█▍ | 1/7 [00:13<01:20, 13.45s/it] Traceback (most recent call last): File "/miniconda3/envs/p39/lib/python3.9/runpy.py", line 197, in _run_module_as_main return _run_code(code, main_globals, None, File "/miniconda3/envs/p39/lib/python3.9/runpy.py", line 87, in _run_code exec(code, run_globals) File "/miniconda3/envs/p39/lib/python3.9/site-packages/monailabel/interfaces/utils/app.py", line 132, in run_main() File "/miniconda3/envs/p39/lib/python3.9/site-packages/monailabel/interfaces/utils/app.py", line 117, in run_main result = a.train(request) File "/miniconda3/envs/p39/lib/python3.9/site-packages/monailabel/interfaces/app.py", line 429, in train result = task(request, self.datastore()) File "/miniconda3/envs/p39/lib/python3.9/site-packages/monailabel/tasks/train/basic_train.py", line 444, in call datalist = self.pre_process(req, datastore) File "/MONAILabel/monailabel/apps/pathology/lib/trainers/segmentation_nuclei.py", line 74, in pre_process return split_dataset( File "/MONAILabel/monailabel/apps/pathology/lib/utils.py", line 99, in split_dataset ds_new.extend(split_local_dataset(datastore, d, output_dir, groups, tile_size, max_region)) File "/MONAILabel/monailabel/apps/pathology/lib/utils.py", line 261, in split_local_dataset img = 
Image.open(d["image"]).convert("RGB") File "/miniconda3/envs/p39/lib/python3.9/site-packages/PIL/Image.py", line 3172, in open im = _open_core(fp, filename, prefix, formats) File "/miniconda3/envs/p39/lib/python3.9/site-packages/PIL/Image.py", line 3159, in _open_core _decompression_bomb_check(im.size) File "/miniconda3/envs/p39/lib/python3.9/site-packages/PIL/Image.py", line 3068, in _decompression_bomb_check raise DecompressionBombError( PIL.Image.DecompressionBombError: Image size (445833922 pixels) exceeds limit of 178956970 pixels, could be decompression bomb DOS attack. [2022-12-28 21:03:36,205] [191288] [ThreadPoolExecutor-2_0] [INFO] (monailabel.utils.async_tasks.utils:83) - Return code: 1
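
    A common workaround (not necessarily the project's recommended fix) is to raise or disable Pillow's decompression-bomb guard before opening such large whole-slide PNGs; the file path below is the one from the traceback:

    from PIL import Image

    # 445,833,922 pixels exceeds Pillow's default limit of ~179 million pixels,
    # so raise the limit (or set it to None to disable the check entirely).
    Image.MAX_IMAGE_PIXELS = 500_000_000
    img = Image.open("/MONAILabel/monailabel/datasets/pathology/PPPP.png").convert("RGB")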


    I would appreciate any help. Best.

    opened by BenoitKAO 2
  • Pytorch, Ignite, Monai -> Monai Label Tutorial

    Pytorch, Ignite, Monai -> Monai Label Tutorial

    Is your feature request related to a problem? Please describe. Due to the complexity of the interfaces required for the backend and frontend to interact, the "information" flow within MONAI Label is heavily obfuscated and hard to comprehend for development.

    Describe the solution you'd like Create the following four animations/figures:

    • A UML-like figure of MONAI Label that is specific to the backend, to complement the existing REST API one.
    • Three transition animations that exemplify how you move from X to MONAI Label, where X is:
      • Pure Pytorch
      • Ignite
      • MONAI

    The last point is inspired by the PyTorch Lightning animation, which conveys a lot of information in a very succinct way.

    Additional context This came to be after I had explained MONAI Label to @MichelaA and, a while ago, Ignite to @diazandr3s. This would facilitate a faster and better understanding of MONAI Label and, hopefully, fewer issues as a side effect.

    Furthermore, this falls in line with the style of tutorials available in MONAI Core, and it would be highly useful because MONAI Label borrows design ideas from PyTorch Lightning, ultimately merging two very different APIs, which leads to a steep learning curve.

    opened by danieltudosiu 0
  • MONAILabel Annotation job fails on _name_to_id - Digital Slide Archive (DSA) integration

    MONAILabel Annotation job fails on _name_to_id - Digital Slide Archive (DSA) integration

    Discussed in https://github.com/Project-MONAI/MONAILabel/discussions/1190

    Originally posted by orenlivne, December 8, 2022: I'm trying to annotate pathology images using MONAI within DSA. Any advice on fixing this would be greatly appreciated. Thank you.

    Problem

    1. An annotation (inference) MONAILabel server call from the DSA GUI fails inside _name_to_id, probably due to a wrong image path. When the infer method calls _name_to_id, nothing is returned from inside the for-loop (which would return an (id, name) tuple); the fallback returns only the name, causing the stack trace. This means _name_to_id has a small bug in its last line, but my real issue is that the input name to it is somehow wrong in the workflow I created.
    2. The DSA MONAILabel annotation is very slow (30-40 s before it finally calls the infer function and crashes), even for a small ROI of 400x200 pixels. This could be related to the wrong-image-path issue.

    Monailabel source dsa.py

        def _name_to_id(self, name):
            folders = self.folders if self.folders else self._get_all_folders()
            # I added the following two lines for debugging, cf. logs below:
            logger.info(f"name: {name}")
            logger.info(f"folders: {folders}")
            
            for folder in folders:
                data = self.gc.get("item", parameters={"folderId": folder, "limit": 0})
                for d in data:
                    if d.get("largeImage") and d["name"] == name or Path(d["name"]).stem == name:
                        return d["_id"], d["name"].  ###### nothing must have been returned during this loop.**
            return name  # so this fallback is called and causes an exception, since a tuple is expected by the caller
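
    For illustration only (not the upstream fix), here is a sketch of a fallback that keeps the return type consistent, so callers that unpack image_id, name = self._name_to_id(image_id) do not raise when no DSA item matches; the added parentheses spell out what the original condition presumably intends:

    from pathlib import Path

    def _name_to_id(self, name):
        folders = self.folders if self.folders else self._get_all_folders()
        for folder in folders:
            data = self.gc.get("item", parameters={"folderId": folder, "limit": 0})
            for d in data:
                if d.get("largeImage") and (d["name"] == name or Path(d["name"]).stem == name):
                    return d["_id"], d["name"]
        return name, name  # hypothetical fallback: still a 2-tuple, so unpacking succeeds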
    

    MONAILabel Server Stack Trace:

    File "/Users/oren/monai/MONAILabel/monailabel/endpoints/wsi_infer.py", line 126, in api_run_wsi_inference
        return run_wsi_inference(background_tasks, model, image, session_id, None, wsi, output)
      File "/Users/oren/monai/MONAILabel/monailabel/endpoints/wsi_infer.py", line 110, in run_wsi_inference
        result = instance.infer_wsi(request)
      File "/Users/oren/monai/MONAILabel/monailabel/interfaces/app.py", line 633, in infer_wsi
        image = datastore.get_image_uri(request["image"])
      File "/Users/oren/monai/MONAILabel/monailabel/datastore/dsa.py", line 152, in get_image_uri
        image_id, name = self._name_to_id(image_id)
    ValueError: too many values to unpack (expected 2)
    

    Steps to Reproduce

    Hardware: MacBook Air, Apple M1 CPU, macOS Monterey. Note that in the steps below I fixed two bugs specific to M1 + Docker, which I comment on but which are not relevant to the current discussion.

    • Install OpenSlide: brew install openslide.
    • Find the location of the openslide dylib: brew info openslide | grep Cellar | awk '{print $1}' and set it in an environment variable. On Monterey, this is export OPENSLIDE_LIB=/opt/homebrew/Cellar/openslide/3.4.1_7/lib
    • Install DSA. Add the line chmod g+w /var/run/docker.sock 2>/dev/null to start_girder.sh (BUG 1). Run DSA via docker-compose.
    • DSA is running on localhost:8080.
    • Within a conda env, install monailabel from the weekly release with pip install monailabel-weekly (note: the main release pip package is broken for Apple M1 due to the bad dependency numpymaxflow==0.0.2, which causes the numpy wheel build to fail; the weekly release upgraded it to 0.0.5, which works).
    • Download the monai pathology app.
    • Note that DYLD_LIBRARY_PATH cannot be passed to a script if SIP is enabled on macOS, and thus OpenSlide won't be recognized by the monailabel server (BUG 2). Thus, start the monailabel server script with an additional line dynamically added to it that sets the library path, with the command tmpfile=$(mktemp /tmp/monailabel.XXXXXX) && ( cat $(which monailabel) | awk -v n=2 -v s="export DYLD_LIBRARY_PATH=\"${OPENSLIDE_LIB}\"" 'NR == n {print s} {print}' > ${tmpfile} ) && chmod +x ${tmpfile} && ( ${tmpfile} start_server --app $HOME/out/pathology --studies http://0.0.0.0:8080/api/v1 ) ; rm ${tmpfile}.
    • Install the monai DSA plugin:
    • Download the monai DSA plugin docker image: docker pull projectmonai/monailabel-dsa. Install it using the Slicer CLI Web under DSA admin plugins.
    • Open the HistomicsUI, open an image, select an ROI, and set MONAILabel Address: http://host.docker.internal:8000/ (note: if localhost:8080 is used here, the job will fail on Permission Denied; this is the appropriate Docker address on Mac) and Model Name: segmentation_nuclei.

    DSA Job Log

    Title: MONAILabel Annotations on sample-image.tiff
    Type: projectmonai/monailabel-dsa:latest#MONAILabelAnnotation
    Job ID: 6391e73bf5e935c6e8bcb171
    Status: ERROR
    Timeline: 0 s - 163.411 s
    Created: December 8, 2022 at 8:31:39
    Scheduled start: December 8, 2022 at 8:31:39
    Last update: December 8, 2022 at 8:34:22
    Log output:
    [2022-12-08 13:34:19,563] INFO: Running container: image: projectmonai/monailabel-dsa@sha256:99c7a31b0be9790205efa2327b38a8bf8583d4cab71a157ddbbdf0812e69f100 args: ['MONAILabelAnnotation', '--analysis_level', '0', '--analysis_roi', '14967, 19208, 451, 217', '--analysis_tile_size', '1024', '--api-url', 'http://girder:8080/api/v1/', '--extra_params', '{}', '--girder-token', 'd9C4qgeV3sEku4QT6xLM6ecMsQCABJ2gOLE2k2qmWLOWaOy5J99W1CSB6wicLwNG', '--min_fgnd_frac', '-1', '--min_poly_area', '80', '--model_name', 'segmentation_nuclei', '--server', 'http://host.docker.internal:8000/', '/mnt/girder_worker/2e6ee16dbe1f4d86abbc4b9cc8e32dba/sample-image.tiff', '/mnt/girder_worker/2e6ee16dbe1f4d86abbc4b9cc8e32dba/MONAILabel Annotations-outputAnnotationFile.anot'] runtime: None kwargs: {'tty': False, 'volumes': {'/var/folders/zw/c2j1_twn3_g09ldg5frh0pmm0000gn/T/tmpjb3ekwrq': {'bind': '/mnt/girder_worker/2e6ee16dbe1f4d86abbc4b9cc8e32dba', 'mode': 'rw'}}, 'detach': True, 'network': 'container:4ec6011efa66054d0e9310701f107d5019de73d95f08a651484582a0f640dd67'}
    INFO:root:CLI Parameters ...
    
    INFO:root:USING:: inputImageFile = /mnt/girder_worker/2e6ee16dbe1f4d86abbc4b9cc8e32dba/sample-image.tiff
    INFO:root:USING:: outputAnnotationFile = /mnt/girder_worker/2e6ee16dbe1f4d86abbc4b9cc8e32dba/MONAILabel Annotations-outputAnnotationFile.anot
    INFO:root:USING:: analysis_level = 0
    INFO:root:USING:: analysis_roi = [14967.0, 19208.0, 451.0, 217.0]
    INFO:root:USING:: analysis_tile_size = 1024.0
    INFO:root:USING:: girderApiUrl = http://girder:8080/api/v1/
    INFO:root:USING:: extra_params = {}
    INFO:root:USING:: girderToken = d9C4qgeV3sEku4QT6xLM6ecMsQCABJ2gOLE2k2qmWLOWaOy5J99W1CSB6wicLwNG
    INFO:root:USING:: min_fgnd_frac = -1.0
    INFO:root:USING:: min_poly_area = 80.0
    INFO:root:USING:: model_name = segmentation_nuclei
    INFO:root:USING:: server = http://host.docker.internal:8000/
    INFO:root:>> Reading input image ... 
    
    INFO:root:Run MONAILabel Task... and collect the annotations: [14967.0, 19208.0] => [451.0, 217.0]
    INFO:root:For Server Logs click/open:  http://host.docker.internal:8000/logs/?refresh=3
    Traceback (most recent call last):
      File "/opt/monailabel/dsa/cli/MONAILabelAnnotation/MONAILabelAnnotation.py", line 174, in <module>
        main(CLIArgumentParser().parse_args())
      File "/opt/monailabel/dsa/cli/MONAILabelAnnotation/MONAILabelAnnotation.py", line 168, in main
        fetch_annotations(args, tiles)
      File "/opt/monailabel/dsa/cli/MONAILabelAnnotation/MONAILabelAnnotation.py", line 55, in fetch_annotations
        _, res = client.wsi_infer(model=args.model_name, image_in=image, body=body, output=output)
      File "/opt/monailabel/dsa/cli/client.py", line 188, in wsi_infer
        raise MONAILabelClientException(
    cli.client.MONAILabelClientException: (2, "Status: 500; Response: b'Internal Server Error'")
    DockerException: Non-zero exit code from docker container (1).
      File "/opt/venv/lib/python3.9/site-packages/celery/app/trace.py", line 451, in trace_task
        R = retval = fun(*args, **kwargs)
      File "/opt/slicer_cli_web/slicer_cli_web/girder_worker_plugin/direct_docker_run.py", line 87, in __call__
        super().__call__(*args, **kwargs)
      File "/opt/girder_worker/girder_worker/docker/tasks/__init__.py", line 337, in __call__
        super(DockerTask, self).__call__(*args, **kwargs)
      File "/opt/girder_worker/girder_worker/task.py", line 154, in __call__
        results = super(Task, self).__call__(*_t_args, **_t_kwargs)
      File "/opt/venv/lib/python3.9/site-packages/celery/app/trace.py", line 734, in __protected_call__
        return self.run(*args, **kwargs)
      File "/opt/slicer_cli_web/slicer_cli_web/girder_worker_plugin/direct_docker_run.py", line 123, in run
        return _docker_run(task, **kwargs)
      File "/opt/girder_worker/girder_worker/docker/tasks/__init__.py", line 405, in _docker_run
        _run_select_loop(task, container, read_streams, write_streams)
      File "/opt/girder_worker/girder_worker/docker/tasks/__init__.py", line 251, in _run_select_loop
        raise DockerException('Non-zero exit code from docker container (%d).' % exit_code)
    Keyword arguments:
    {
      "container_args": [
        "MONAILabelAnnotation",
        "--analysis_level",
        "0",
        "--analysis_roi",
        "14967, 19208, 451, 217",
        "--analysis_tile_size",
        "1024",
        "--api-url",
        "<slicer_cli_web.girder_worker_plugin.direct_docker_run.GirderApiUrl object at 0x41958da550>",
        "--extra_params",
        "{}",
        "--girder-token",
        "<slicer_cli_web.girder_worker_plugin.direct_docker_run.GirderToken object at 0x42891d5280>",
        "--min_fgnd_frac",
        "-1",
        "--min_poly_area",
        "80",
        "--model_name",
        "segmentation_nuclei",
        "--server",
        "http://host.docker.internal:8000/",
        "<slicer_cli_web.girder_worker_plugin.direct_docker_run.DirectGirderFileIdToVolume: File ID=6391ba8af5e935c6e8bcb152 -> \"sample-image.tiff\">",
        "<girder_worker.docker.transforms.VolumePath: \"MONAILabel Annotations-outputAnnotationFile.anot\">"
      ],
      "image": "projectmonai/monailabel-dsa@sha256:99c7a31b0be9790205efa2327b38a8bf8583d4cab71a157ddbbdf0812e69f100",
      "pull_image": "if-not-present"
    }
    

    MONAILabel Server Logs

    [2022-12-08 08:31:09,012] [60736] [MainThread] [INFO] (monailabel.interfaces.app:468) - App Init - completed
    [2022-12-08 08:31:09,012] [60736] [MainThread] [INFO] (timeloop:60) - Starting Timeloop..
    [2022-12-08 08:31:09,013] [60736] [MainThread] [INFO] (timeloop:42) - Registered job <function MONAILabelApp.on_init_complete..run_scheduler at 0x16b3d7e20>
    [2022-12-08 08:31:09,013] [60736] [MainThread] [INFO] (timeloop:63) - Timeloop now started. Jobs will run based on the interval set
    [2022-12-08 08:34:22,032] [60736] [MainThread] [INFO] (monailabel.endpoints.wsi_infer:108) - WSI Infer Request: {'model': 'segmentation_nuclei', 'image': 'sample-image', 'output': 'dsa', 'level': 0, 'location': [14967, 19208], 'size': [451, 217], 'tile_size': [1024, 1024], 'min_poly_area': 80, 'wsi_tiles': []}
    [2022-12-08 08:34:22,165] [60736] [MainThread] [INFO] (monailabel.datastore.dsa:138) - name: sample-image
    [2022-12-08 08:34:22,165] [60736] [MainThread] [INFO] (monailabel.datastore.dsa:139) - folders: ['63865dff31e1eb24754d2e99', '63865dd1f37a980298fa9ce9']

    Note the two lines I added, showing the name and folders inside _name_to_id.

    MONAILabel Server Screen Output

    [2022-12-08 08:31:09,013] [timeloop] [INFO] Timeloop now started. Jobs will run based on the interval set
    [2022-12-08 08:31:09,013] [60736] [MainThread] [INFO] (timeloop:63) - Timeloop now started. Jobs will run based on the interval set
    [2022-12-08 08:31:09,013] [60736] [MainThread] [INFO] (uvicorn.error:59) - Application startup complete.
    [2022-12-08 08:31:09,013] [60736] [MainThread] [INFO] (uvicorn.error:206) - Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
    [2022-12-08 08:34:22,032] [60736] [MainThread] [INFO] (monailabel.endpoints.wsi_infer:108) - WSI Infer Request: {'model': 'segmentation_nuclei', 'image': 'sample-image', 'output': 'dsa', 'level': 0, 'location': [14967, 19208], 'size': [451, 217], 'tile_size': [1024, 1024], 'min_poly_area': 80, 'wsi_tiles': []}
    [2022-12-08 08:34:22,165] [60736] [MainThread] [INFO] (monailabel.datastore.dsa:138) - name: sample-image
    [2022-12-08 08:34:22,165] [60736] [MainThread] [INFO] (monailabel.datastore.dsa:139) - folders: ['63865dff31e1eb24754d2e99', '63865dd1f37a980298fa9ce9']
    [2022-12-08 08:34:22,227] [60736] [MainThread] [ERROR] (uvicorn.error:369) - Exception in ASGI application
    Traceback (most recent call last):                                                                                                  
      File "/Users/oren/monai/MONAILabel/monailabel/datastore/dsa.py", line 150, in get_image_uri
        name = self.get_image_info(image_id)["name"]                                                                                    
      File "/Users/oren/monai/MONAILabel/monailabel/datastore/dsa.py", line 176, in get_image_info
        return self.gc.getItem(image_id)  # type: ignore                                                                                
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/girder_client/__init__.py", line 619, in getItem
        return self.getResource('item', itemId)                                                                                         
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/girder_client/__init__.py", line 519, in getResource
        return self.get(route)                                                                                                          
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/girder_client/__init__.py", line 471, in get
        return self.sendRestRequest('GET', path, parameters, jsonResp=jsonResp)                                   
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/girder_client/__init__.py", line 463, in sendRestRequest
        raise HttpError(                                                                                                                
    girder_client.HttpError: HTTP error 400: GET http://0.0.0.0:8080/api/v1/item/sample-image
    Response text: {"field": "id", "message": "Invalid ObjectId: sample-image", "type": "validation"}                              
     During handling of the above exception, another exception occurred:                                        
                                                                                                                                        
    Traceback (most recent call last):                                                                                                  
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 366, in run_asgi
        result = await app(self.scope, self.receive, self.send)                                                                         
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 75, in __call__
        return await self.app(scope, receive, send)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/fastapi/applications.py", line 269, in __call__
        await super().__call__(scope, receive, send)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/applications.py", line 124, in __call__
        await self.middleware_stack(scope, receive, send)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
        raise exc
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
        await self.app(scope, receive, _send)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/middleware/cors.py", line 84, in __call__
        await self.app(scope, receive, send)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/exceptions.py", line 93, in __call__
        raise exc
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/exceptions.py", line 82, in __call__
        await self.app(scope, receive, sender)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
        raise e
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
        await self.app(scope, receive, send)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/routing.py", line 670, in __call__
        await route.handle(scope, receive, send)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/routing.py", line 266, in handle
        await self.app(scope, receive, send)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/starlette/routing.py", line 65, in app
        response = await func(request)
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/fastapi/routing.py", line 227, in app
        raw_response = await run_endpoint_function(
      File "/Users/oren/miniconda3/envs/monai2/lib/python3.10/site-packages/fastapi/routing.py", line 160, in run_endpoint_function
        return await dependant.call(**values)
      File "/Users/oren/monai/MONAILabel/monailabel/endpoints/wsi_infer.py", line 126, in api_run_wsi_inference
        return run_wsi_inference(background_tasks, model, image, session_id, None, wsi, output)
      File "/Users/oren/monai/MONAILabel/monailabel/endpoints/wsi_infer.py", line 110, in run_wsi_inference
        result = instance.infer_wsi(request)
      File "/Users/oren/monai/MONAILabel/monailabel/interfaces/app.py", line 633, in infer_wsi
        image = datastore.get_image_uri(request["image"])
      File "/Users/oren/monai/MONAILabel/monailabel/datastore/dsa.py", line 152, in get_image_uri
        image_id, name = self._name_to_id(image_id)
    ValueError: too many values to unpack (expected 2)
    0.6.0 
    opened by orenlivne 2
  • 0.5 threshold used in multi-label deepedit

    0.5 threshold used in multi-label deepedit

    Describe the bug In the codebase (here, here and here) a threshold of 0.5 is used, disregarding the number of labels.

    Expected behavior The threshold should be 1/(len(labels)-1), such that for a 3-class problem (besides background) the threshold is 0.33.

    At the moment we are asking the models to be extra sure of a label before considering it right. This might be useful, but it is not implemented explicitly and it gets worse the more classes there are.
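
    As a concrete example of the proposed change (the label names here are hypothetical):

    labels = ["background", "liver", "spleen", "duodenum"]  # 3 foreground classes + background

    fixed_threshold = 0.5                          # current behaviour in the linked code
    proposed_threshold = 1.0 / (len(labels) - 1)   # 1/3 ~= 0.33 for this 3-class example
    print(fixed_threshold, proposed_threshold)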

    opened by danieltudosiu 2
  • Scalability of monailabel (OOM errors)

    Scalability of monailabel (OOM errors)

    Describe the bug I have encountered two different situations where MONAI Label uses far more memory than I would expect. Are these user errors, or are they related to my dataset? Has MONAI Label been designed with scalability in mind?

    1. When I push train, my entire dataset is loaded into CPU RAM. Our dataset is larger than some of the competition datasets (BTCV or MSD) but not extremely large - roughly 100 CT scans that are 512x512xH, where H is usually in the range of about 500. Uncompressed, that adds up to nearly 100GB, which leads to the program crashing. Is there an option to avoid loading all data into RAM, and just load it on demand? Perhaps using pre-fetch to avoid creating a bottleneck? Since I am using the segmentation model, which trains on patches, perhaps it would be sufficient to load just the patches into RAM, rather than the full images?

    In case it is relevant, my dataset has 12 foreground labels.

    My workaround is to use swap, but obviously that's not ideal.
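
    One thing worth trying, based on the train request format shown earlier in this thread, is asking the server for a non-caching dataset type; whether 'Dataset' or 'PersistentDataset' is accepted depends on the MONAI Label version, so treat this as a sketch (submitted via the server's train endpoint or the client's training options, not a verified API call):

    # Hypothetical train request: same shape as the request logged by the server,
    # but with a dataset type that does not cache the whole dataset in CPU RAM.
    train_request = {
        "model": "segmentation_custom",   # the custom config from the steps below
        "max_epochs": 50,
        "dataset": "Dataset",             # instead of CacheDataset / SmartCacheDataset
        "train_batch_size": 1,
        "val_batch_size": 1,
    }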

    2. After training, clicking RUN gives me another OOM error. I tried decreasing the roi_size for my model, but even at 64x64x64 I'm still exceeding the 8 GB of GPU VRAM available:

    For 128x128x128

    torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 8.25 GiB (GPU 0; 7.92 GiB total capacity; 440.46 MiB already allocated; 6.63 GiB free; 610.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
    

    For 96x96x96

    torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 7.62 GiB (GPU 0; 7.92 GiB total capacity; 1.20 GiB already allocated; 5.46 GiB free; 1.77 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
    

    For 64x64x64

    torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 7.62 GiB (GPU 0; 7.92 GiB total capacity; 1.20 GiB already allocated; 5.46 GiB free; 1.77 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
    

    It might be expected behaviour for DeepEdit models to cause an OOM, since they run on the full image. However, I expected a segmentation model to scale to arbitrarily sized images, because it analyses the image in patches. Have I misunderstood something? Or is the stitching of the patches also carried out on the GPU?
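    One way to avoid caching the whole dataset, as asked in point 1 above, is to build the training pipeline around a plain MONAI Dataset (lazy, per-item loading) rather than CacheDataset. A hedged sketch with hypothetical file paths; the actual train task in the radiology app may wire this differently:

        from monai.data import Dataset, DataLoader                       # plain Dataset loads lazily
        from monai.transforms import (Compose, LoadImaged, EnsureChannelFirstd,
                                      RandCropByPosNegLabeld)

        transforms = Compose([
            LoadImaged(keys=["image", "label"]),
            EnsureChannelFirstd(keys=["image", "label"]),
            # train on patches so only crops, not whole volumes, reach the GPU
            RandCropByPosNegLabeld(keys=["image", "label"], label_key="label",
                                   spatial_size=(96, 96, 96), pos=1, neg=1, num_samples=4),
        ])

        data = [{"image": "img_001.nii.gz", "label": "lbl_001.nii.gz"}]  # hypothetical paths
        ds = Dataset(data=data, transform=transforms)                     # nothing is cached up front
        loader = DataLoader(ds, batch_size=1, num_workers=4)

    CacheDataset (or a cache_rate > 0) trades RAM for speed; with roughly 100GB of uncompressed volumes, a plain Dataset or a small cache_rate keeps memory bounded at the cost of repeated disk reads.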
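    On the stitching question, MONAI's sliding-window inference accumulates the full-volume output on the device given by its device argument (the input's device by default), and for a 512x512x~500 multi-class volume that buffer alone can exceed 8GB. A hedged sketch of running the per-patch forward passes on the GPU while stitching on the CPU (parameter names as in monai.inferers.sliding_window_inference):

        import torch
        from monai.inferers import sliding_window_inference

        def predict(volume, model):
            # volume: a (1, C, H, W, D) tensor kept on the CPU
            return sliding_window_inference(
                inputs=volume,
                roi_size=(128, 128, 128),
                sw_batch_size=1,
                predictor=model,
                overlap=0.25,
                sw_device=torch.device("cuda"),   # each patch is run through the model on the GPU
                device=torch.device("cpu"),       # the stitched output is accumulated in system RAM
            )

    This keeps peak GPU memory roughly proportional to the patch size rather than to the full stitched volume.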

    To Reproduce Steps to reproduce the behavior:

    1. Get hold of a medium-sized dataset with ground truth labels. Put them in a folder structure as expected by MONAI Label. Hold back the ground truth labels for at least one image for the final step.
    2. Make a copy of the radiology/lib/config/segmentation.py file (e.g. segmentation_custom.py) and modify the foreground classes and roi_size.
    3. Run the monailabel app:
    monailabel start_server --app radiology --studies relative/path/to/images --conf models segmentation_custom --conf use_pretrained_model false
    
    4. In Slicer, connect to the server and click Train.
    5. If you have enough CPU RAM and training completes, click Next Sample to get an unlabelled image and then Run to automatically generate labels.

    Expected behavior I expected to be able to train a network and run inference on a dataset with an arbitrary number of arbitrarily sized images.

    I've used 128x128x128 patches with nnunet, and been able to run inference on GPUs with only 4GB of VRAM. I'm surprised that an 8GB GPU gets an OOM when trying to run the segmentation network with 64x64x64 patches.

    8GB of GPU memory was enough to train the network, so I assumed it would also be enough to run inference.

    Screenshots N/A

    Environment

    Ensuring you use the relevant python executable, please paste the output of:

    python -c 'import monai; monai.config.print_debug_info()'
    
    ================================
    Printing MONAI config...
    ================================
    MONAI version: 1.0.1
    Numpy version: 1.23.4
    Pytorch version: 1.13.0+cu117
    MONAI flags: HAS_EXT = False, USE_COMPILED = False, USE_META_DICT = False
    MONAI rev id: 8271a193229fe4437026185e218d5b06f7c8ce69
    MONAI __file__: /home/chris/Software/monai/venv/lib/python3.8/site-packages/monai/__init__.py
    
    Optional dependencies:
    Pytorch Ignite version: 0.4.10
    Nibabel version: 4.0.2
    scikit-image version: 0.19.3
    Pillow version: 9.3.0
    Tensorboard version: 2.11.0
    gdown version: 4.5.3
    TorchVision version: 0.14.0+cu117
    tqdm version: 4.64.1
    lmdb version: 1.3.0
    psutil version: 5.9.4
    pandas version: NOT INSTALLED or UNKNOWN VERSION.
    einops version: 0.6.0
    transformers version: NOT INSTALLED or UNKNOWN VERSION.
    mlflow version: NOT INSTALLED or UNKNOWN VERSION.
    pynrrd version: 0.4.3
    
    For details about installing the optional dependencies, please visit:
        https://docs.monai.io/en/latest/installation.html#installing-the-recommended-dependencies
    
    
    ================================
    Printing system config...
    ================================
    System: Linux
    Linux version: Ubuntu 20.04.5 LTS
    Platform: Linux-5.14.0-1054-oem-x86_64-with-glibc2.29
    Processor: x86_64
    Machine: x86_64
    Python version: 3.8.10
    Process name: python
    Command: ['python', '-c', 'import monai; monai.config.print_debug_info()']
    Open files: []
    Num physical CPUs: 6
    Num logical CPUs: 12
    Num usable CPUs: 12
    CPU usage (%): [16.5, 22.2, 15.4, 25.0, 20.9, 82.1, 13.4, 10.5, 12.3, 12.3, 13.9, 15.2]
    CPU freq. (MHz): 1579
    Load avg. in last 1, 5, 15 mins (%): [11.6, 10.4, 26.0]
    Disk usage (%): 81.0
    Avg. sensor temp. (Celsius): UNKNOWN for given OS
    Total physical memory (GB): 31.0
    Available memory (GB): 28.3
    Used memory (GB): 2.2
    
    ================================
    Printing GPU config...
    ================================
    Num GPUs: 1
    Has CUDA: True
    CUDA version: 11.7
    cuDNN enabled: True
    cuDNN version: 8500
    Current device: 0
    Library compiled for CUDA architectures: ['sm_37', 'sm_50', 'sm_60', 'sm_70', 'sm_75', 'sm_80', 'sm_86']
    GPU 0 Name: NVIDIA GeForce GTX 1080
    GPU 0 Is integrated: False
    GPU 0 Is multi GPU board: False
    GPU 0 Multi processor count: 20
    GPU 0 Total memory (GB): 7.9
    GPU 0 CUDA capability (maj.min): 6.1
    

    Additional context N/A

    opened by chrisrapson 8
Releases(0.6.0)
Owner
Project MONAI
AI Toolkit for Healthcare Imaging
Awesome multilingual OCR toolkits based on PaddlePaddle (practical ultra lightweight OCR system, provide data annotation and synthesis tools, support training and deployment among server, mobile, embedded and IoT devices)

English | Simplified Chinese Introduction PaddleOCR aims to create multilingual, awesome, leading, and practical OCR tools that help users train better models and a

null 27.5k Jan 8, 2023
Text layer for bio-image annotation.

napari-text-layer Napari text layer for bio-image annotation. Installation You can install using pip: pip install napari-text-layer Keybindings and m

null 6 Sep 29, 2022
An interactive interface for using OpenCV's GrabCut algorithm for image segmentation.

Interactive GrabCut An interactive interface for using OpenCV's GrabCut algorithm for image segmentation. Setup Install dependencies: pip install nump

Jason Y. Zhang 16 Oct 10, 2022
(L2ID@CVPR2021) Boosting Co-teaching with Compression Regularization for Label Noise

Nested-Co-teaching (L2ID@CVPR2021) Pytorch implementation of paper "Boosting Co-teaching with Compression Regularization for Label Noise" [PDF] If our

YINGYI CHEN 41 Jan 3, 2023
An interactive document scanner built in Python using OpenCV

The scanner takes a poorly scanned image, finds the corners of the document, applies the perspective transformation to get a top-down view of the document, sharpens the image, and applies an adaptive color threshold to clean up the image.

Kushal Shingote 1 Feb 12, 2022
Awesome anomaly detection in medical images

A curated list of awesome anomaly detection works in medical imaging, inspired by the other awesome-* initiatives.

Kang Zhou 57 Dec 19, 2022
This project proposes a camera-vision-based cursor control system, using hand movement captured from a webcam through hand landmarks detected with the MediaPipe module

This project proposes a camera-vision-based cursor control system, using hand movement captured from a webcam through hand landmarks detected with the MediaPipe module

Chandru 2 Feb 20, 2022
scantailor - Scan Tailor is an interactive post-processing tool for scanned pages.

Scan Tailor - scantailor.org This project is no longer maintained, and has not been maintained for a while. About Scan Tailor is an interactive post-p

null 1.5k Dec 28, 2022
An advanced 2D image manipulation with features such as edge detection and image segmentation built using OpenCV

OpenCV-ToothPaint3-Advanced-Digital-Image-Editor This application named ‘Tooth Paint’ version TP_2020.3 (64-bit) or version 3 was developed within a w

JunHong 1 Nov 5, 2021
This Python script converts a PDF to an image, then uses Tesseract as the OCR engine to convert the image to text

Script_Convertir_PDF_IMG_TXT This Python script converts a PDF to an image, then uses Tesseract as the OCR engine to convert the image to text.

alebogado 1 Jan 27, 2022
Thresholding-and-masking-using-OpenCV - Image Thresholding is used for image segmentation

Image Thresholding is used for image segmentation. From a grayscale image, thresholding can be used to create binary images. In thresholding we pick a threshold T.

Grace Ugochi Nneji 3 Feb 15, 2022
Developed an AI-based system to control the mouse cursor using Python and OpenCV with the real-time camera.

Developed an AI-based system to control the mouse cursor using Python and OpenCV with the real-time camera. Fingertip location is mapped to RGB images to control the mouse cursor.

Ravi Sharma 71 Dec 20, 2022
Multi-choice answer sheet correction system using computer vision with opencv & python.

Multi choice answer correction: 5 answer sheet samples with a specific solution for detecting answers and sheet correction. By running the soluti

Reza Firouzi 7 Mar 7, 2022
Deskew is a command line tool for deskewing scanned text documents. It uses Hough transform to detect "text lines" in the image. As an output, you get an image rotated so that the lines are horizontal.

Deskew by Marek Mauder https://galfar.vevb.net/deskew https://github.com/galfar/deskew v1.30 2019-06-07 Overview Deskew is a command line tool for des

Marek Mauder 127 Dec 3, 2022
A facial recognition device is a device that takes an image or a video of a human face and compares it to another image faces in a database.

A facial recognition device is a device that takes an image or a video of a human face and compares it to another image faces in a database. The structure, shape and proportions of the faces are compared during the face recognition steps.

Pavankumar Khot 4 Mar 19, 2022
WACV 2022 Paper - Is An Image Worth Five Sentences? A New Look into Semantics for Image-Text Matching

Is An Image Worth Five Sentences? A New Look into Semantics for Image-Text Matching Code based on our WACV 2022 Accepted Paper: https://arxiv.org/pdf/

Andres 13 Dec 17, 2022
A simple OCR API server, seriously easy to be deployed by Docker, on Heroku as well

ocrserver Simple OCR server, as a small working sample for gosseract. Try now here https://ocr-example.herokuapp.com/, and deploy your own now. Deploy

Hiromu OCHIAI 541 Dec 28, 2022
Handwriting Recognition System based on a deep Convolutional Recurrent Neural Network architecture

Handwriting Recognition System This repository is the Tensorflow implementation of the Handwriting Recognition System described in Handwriting Recogni

Edgard Chammas 346 Jan 7, 2023