Simple node deletion tool for onnx.

Overview

snd4onnx

Simple node deletion tool for onnx. I only test a limited, miscellaneous set of patterns as a hobby, so there are probably many bugs. Pull requests are welcome.

https://github.com/PINTO0309/simple-onnx-processing-tools


1. Setup

1-1. HostPC

### option
$ echo 'export PATH=$HOME/.local/bin:$PATH' >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U snd4onnx
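
If you want a quick sanity check after installation, the following minimal Python snippet (just an import check, not part of the tool itself) confirms that onnx, onnx_graphsurgeon, and snd4onnx are all importable:

# sanity check: assumes the three packages above installed successfully
import onnx
import onnx_graphsurgeon
import snd4onnx
print(onnx.__version__)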

1-2. Docker

### docker pull
$ docker pull pinto0309/snd4onnx:latest

### docker build
$ docker build -t pinto0309/snd4onnx:latest .

### docker run
$ docker run --rm -it -v `pwd`:/workdir pinto0309/snd4onnx:latest
$ cd /workdir

2. CLI Usage

$ snd4onnx -h

usage:
  snd4onnx [-h]
    --remove_node_names REMOVE_NODE_NAMES [REMOVE_NODE_NAMES ...]
    --input_onnx_file_path INPUT_ONNX_FILE_PATH
    --output_onnx_file_path OUTPUT_ONNX_FILE_PATH

optional arguments:
  -h, --help
        show this help message and exit

  --remove_node_names REMOVE_NODE_NAMES [REMOVE_NODE_NAMES ...]
        ONNX node name to be deleted.

  --input_onnx_file_path INPUT_ONNX_FILE_PATH
        Input onnx file path.

  --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
        Output onnx file path.

3. In-script Usage

>>> from snd4onnx import remove
>>> help(remove)

Help on function remove in module snd4onnx.onnx_remove_node:

remove(
    remove_node_names: List[str],
    input_onnx_file_path: Union[str, NoneType] = '',
    output_onnx_file_path: Union[str, NoneType] = '',
    onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None
) -> onnx.onnx_ml_pb2.ModelProto

    Parameters
    ----------
    remove_node_names: List[str]
        List of OP names to be deleted.
        e.g. remove_node_names = ['op_name1', 'op_name2', 'op_name3', ...]

    input_onnx_file_path: Optional[str]
        Input onnx file path.
        Either input_onnx_file_path or onnx_graph must be specified.

    output_onnx_file_path: Optional[str]
        Output onnx file path.
        If output_onnx_file_path is not specified, no .onnx file is output.

    onnx_graph: Optional[onnx.ModelProto]
        onnx.ModelProto.
        Either input_onnx_file_path or onnx_graph must be specified.
        If onnx_graph is specified, input_onnx_file_path is ignored and onnx_graph is processed.

    Returns
    -------
    removed_graph: onnx.ModelProto
        onnx ModelProto with the specified OPs removed.
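
remove_node_names takes the names of the nodes (OPs) to delete, as shown in the help above. If you are unsure what those names are in your model, they can be listed with the plain onnx API (a minimal sketch; 'input.onnx' is just a placeholder path):

import onnx

model = onnx.load('input.onnx')
# Print the name and operator type of every node in the graph
for node in model.graph.node:
    print(node.name, node.op_type)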

4. CLI Execution

$ snd4onnx \
--remove_node_names node_name_a node_name_b \
--input_onnx_file_path input.onnx \
--output_onnx_file_path output.onnx

5. In-script Execution

from snd4onnx import remove

onnx_graph = remove(
    remove_node_names=['node_name_a', 'node_name_b'],
    input_onnx_file_path='input.onnx',
)

# or

onnx_graph = remove(
    remove_node_names=['node_name_a', 'node_name_b'],
    onnx_graph=graph,
)
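
remove returns an onnx.ModelProto, so if output_onnx_file_path is not given the result only exists in memory. A minimal sketch of writing it to disk yourself with the standard onnx API (file names are placeholders):

import onnx
from snd4onnx import remove

# Remove the nodes; the result is kept in memory only
removed_model = remove(
    remove_node_names=['node_name_a', 'node_name_b'],
    input_onnx_file_path='input.onnx',
)

# Persist the modified model with the standard onnx API
onnx.save(removed_model, 'output.onnx')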

6. Sample

6-1. sample.1

Before: test1.onnx / After: test1_removed.onnx (graph images omitted)

6-2. sample.2

Before: test3.onnx / After: test3_removed.onnx (graph images omitted)

6-3. sample.3

Before: test5.onnx / After: test5_removed.onnx (graph images omitted)

6-4. sample.4

Before: test7.onnx / After: test7_removed.onnx (graph images omitted)

6-5. sample.5

Before: test8.onnx / After: test8_removed.onnx (graph images omitted)

7. Reference

  1. https://docs.nvidia.com/deeplearning/tensorrt/onnx-graphsurgeon/docs/index.html
  2. https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon
  3. https://github.com/PINTO0309/scs4onnx
  4. https://github.com/PINTO0309/sne4onnx
  5. https://github.com/PINTO0309/snc4onnx
  6. https://github.com/PINTO0309/sog4onnx
  7. https://github.com/PINTO0309/PINTO_model_zoo

8. Issues

https://github.com/PINTO0309/simple-onnx-processing-tools/issues
