Faster RCNN classifier to ONNX format

Hello all,
I want to export a gesture detector built in MATLAB with the Deep Learning Toolbox so I can deploy it to another platform, in my case the Intel Neural Compute Stick 2 (using the OpenVINO framework). I have the code and the trained "detector" (the output of trainFasterRCNNObjectDetector) in a MAT file. My idea is to export this detector to the ONNX format.
My doubt is: is the exported "detector" the whole Faster R-CNN architecture, or only the CNN in charge of classifying the objects inside the Faster R-CNN architecture?
If I export the detector, do I have to implement the Faster R-CNN algorithm myself to use the exported network?
And last but not least: is there a tool or procedure to make this port more straightforward?
Thank you all in advance.
  1 Comment
Alessio Gagliardi on 29 Oct 2019
I have got the same issue. Following.


Accepted Answer

Raynier Suresh on 26 Mar 2020
exportONNXNetwork(net,filename) can be used to export the deep learning network net, with its weights, to the ONNX format, but this function does not support all deep learning layers. If you export a network that contains a layer the ONNX format does not support, exportONNXNetwork saves a placeholder ONNX operator in place of the unsupported layer and returns a warning. You cannot import an ONNX network with a placeholder operator into other deep learning frameworks. The region proposal layer in a Faster R-CNN network is not supported by exportONNXNetwork, so the function will leave a placeholder operator at that position, and the exported model cannot be used in other deep learning frameworks. However, you can export a YOLO object detection network to the ONNX model format.
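As a minimal sketch of what that export attempt looks like (the MAT-file and variable names here are assumptions; the Network property is the documented way to reach the DAG network inside a fasterRCNNObjectDetector object):

```matlab
% Hypothetical file/variable names: adjust to your own MAT file.
load('gestureDetector.mat', 'detector');   % detector = trainFasterRCNNObjectDetector output

% The detector object wraps the network; exportONNXNetwork takes the network itself.
net = detector.Network;

% This will succeed for the supported layers, but will emit a warning and
% write a placeholder ONNX operator for the region proposal layer, so the
% resulting file will NOT import cleanly into OpenVINO or other frameworks.
exportONNXNetwork(net, 'gestureDetector.onnx');
```

In practice this means the whole Faster R-CNN pipeline does not survive the export; a single-stage detector such as YOLO, whose layers are all ONNX-exportable, is the more straightforward path to OpenVINO.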

