
importONNXNetwork

(To be removed) Import pretrained ONNX network

importONNXNetwork will be removed in a future release. Use importNetworkFromONNX instead. (since R2023b) For more information about updating your code, see Version History.

Description


net = importONNXNetwork(modelfile) imports a pretrained ONNX™ (Open Neural Network Exchange) network from the file modelfile. The function returns the network net as a DAGNetwork or dlnetwork object.

importONNXNetwork requires the Deep Learning Toolbox™ Converter for ONNX Model Format support package. If this support package is not installed, then importONNXNetwork provides a download link.

Note

By default, importONNXNetwork tries to generate a custom layer when the software cannot convert an ONNX operator into an equivalent built-in MATLAB® layer. For a list of operators for which the software supports conversion, see ONNX Operators Supported for Conversion into Built-In MATLAB Layers.

importONNXNetwork saves the generated custom layers in the package +modelfile.

importONNXNetwork does not automatically generate a custom layer for each ONNX operator that is not supported for conversion into a built-in MATLAB layer. For more information on how to handle unsupported layers, see Alternative Functionality.


net = importONNXNetwork(modelfile,Name=Value) imports a pretrained ONNX network with additional options specified by one or more name-value arguments. For example, OutputLayerType="classification" imports the network as a DAGNetwork object with a classification output layer appended to the end of the first output branch of the imported network architecture.
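
For example, a call of the following form (the file name and class names here are placeholders, not files shipped with the toolbox) imports an ONNX classification model as a DAGNetwork object and attaches the class names to the appended classification output layer:

net = importONNXNetwork("myClassifier.onnx", ...
    OutputLayerType="classification", ...   % append a classification output layer
    Classes=["cat" "dog" "horse"]);         % class names for that layer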

Examples


Download and install the Deep Learning Toolbox Converter for ONNX Model Format support package.

Type importONNXNetwork at the command line.

importONNXNetwork

If Deep Learning Toolbox Converter for ONNX Model Format is not installed, then the function provides a link to the required support package in the Add-On Explorer. To install the support package, click the link, and then click Install. Check that the installation is successful by importing the network from the model file "simplenet.onnx" at the command line. If the support package is installed, then the function returns a DAGNetwork object.

modelfile = "simplenet.onnx";
net = importONNXNetwork(modelfile)
net = 
  DAGNetwork with properties:

         Layers: [9×1 nnet.cnn.layer.Layer]
    Connections: [8×2 table]
     InputNames: {'imageinput'}
    OutputNames: {'ClassificationLayer_softmax1002'}

Plot the network architecture.

plot(net)

Import a pretrained ONNX network as a DAGNetwork object, and use the imported network to classify an image.

Generate an ONNX model of the squeezenet convolutional neural network.

squeezeNet = squeezenet;
exportONNXNetwork(squeezeNet,"squeezeNet.onnx");

Specify the class names.

ClassNames = squeezeNet.Layers(end).Classes;

Import the pretrained squeezeNet.onnx model, and specify the classes. By default, importONNXNetwork imports the network as a DAGNetwork object.

net = importONNXNetwork("squeezeNet.onnx",Classes=ClassNames)
net = 
  DAGNetwork with properties:

         Layers: [70×1 nnet.cnn.layer.Layer]
    Connections: [77×2 table]
     InputNames: {'data'}
    OutputNames: {'ClassificationLayer_prob'}

Analyze the imported network.

analyzeNetwork(net)

(Deep Learning Network Analyzer view of the imported SqueezeNet network.)

Read the image you want to classify and display the size of the image. The image is 384-by-512 pixels and has three color channels (RGB).

I = imread("peppers.png");
size(I)
ans = 1×3

   384   512     3

Resize the image to the input size of the network. Show the image.

I = imresize(I,[227 227]);
imshow(I)

Classify the image using the imported network.

label = classify(net,I)
label = categorical
     bell pepper 

Import a pretrained ONNX network as a dlnetwork object, and use the imported network to classify an image.

Generate an ONNX model of the squeezenet convolutional neural network.

squeezeNet = squeezenet;
exportONNXNetwork(squeezeNet,"squeezeNet.onnx");

Specify the class names.

ClassNames = squeezeNet.Layers(end).Classes;

Import the pretrained squeezeNet.onnx model as a dlnetwork object.

net = importONNXNetwork("squeezeNet.onnx",TargetNetwork="dlnetwork")
net = 
  dlnetwork with properties:

         Layers: [70×1 nnet.cnn.layer.Layer]
    Connections: [77×2 table]
     Learnables: [52×3 table]
          State: [0×3 table]
     InputNames: {'data'}
    OutputNames: {'probOutput'}
    Initialized: 1

Read the image you want to classify and display the size of the image. The image is 384-by-512 pixels and has three color channels (RGB).

I = imread("peppers.png");
size(I)
ans = 1×3

   384   512     3

Resize the image to the input size of the network. Show the image.

I = imresize(I,[227 227]);
imshow(I)

Convert the image to a dlarray. Format the image with the dimension labels "SSCB" (spatial, spatial, channel, batch). In this case, the batch size is 1 and you can omit it ("SSC").

I_dlarray = dlarray(single(I),"SSCB");

Classify the sample image and find the predicted label.

prob = predict(net,I_dlarray);
[~,label] = max(prob);

Display the classification result.

ClassNames(label)
ans = categorical
     bell pepper 

Import a pretrained ONNX network, which contains operators that the software cannot convert to built-in MATLAB layers. The software automatically generates custom layers when you import this network.

This example uses the findCustomLayers helper function. To view the code for this function, see Helper Function.

Specify the model file to import as shufflenet with operator set 9 from the ONNX Model Zoo. shufflenet is a convolutional neural network that is trained on more than a million images from the ImageNet database. As a result, the network has learned rich feature representations for a wide range of images. The network can classify images into 1000 object categories, such as keyboard, mouse, pencil, and many animals.

modelfile = "shufflenet-9.onnx";

Import shufflenet. By default, importNetworkFromONNX imports the network as a dlnetwork object. importNetworkFromONNX saves each generated custom layer to a separate .m file in the +shufflenet_9 package in the current folder.

net = importNetworkFromONNX(modelfile,PackageName="shufflenet_9")
net = 
  dlnetwork with properties:

         Layers: [174×1 nnet.cnn.layer.Layer]
    Connections: [189×2 table]
     Learnables: [198×3 table]
          State: [98×3 table]
     InputNames: {'gpu_0_data_0'}
    OutputNames: {'gpu_0_softmax_1Output'}
    Initialized: 1

  View summary with summary.

Find the indices of the automatically generated custom layers by using the findCustomLayers helper function and display the custom layers.

ind = findCustomLayers(net.Layers,'+shufflenet_9');
net.Layers(ind)
ans = 
  16×1 Layer array with layers:

     1   'Reshape_To_ReshapeLayer1285'   shufflenet_9.Reshape_To_ReshapeLayer1285   shufflenet_9.Reshape_To_ReshapeLayer1285
     2   'Reshape_To_ReshapeLayer1290'   shufflenet_9.Reshape_To_ReshapeLayer1290   shufflenet_9.Reshape_To_ReshapeLayer1290
     3   'Reshape_To_ReshapeLayer1295'   shufflenet_9.Reshape_To_ReshapeLayer1295   shufflenet_9.Reshape_To_ReshapeLayer1295
     4   'Reshape_To_ReshapeLayer1300'   shufflenet_9.Reshape_To_ReshapeLayer1300   shufflenet_9.Reshape_To_ReshapeLayer1300
     5   'Reshape_To_ReshapeLayer1305'   shufflenet_9.Reshape_To_ReshapeLayer1305   shufflenet_9.Reshape_To_ReshapeLayer1305
     6   'Reshape_To_ReshapeLayer1310'   shufflenet_9.Reshape_To_ReshapeLayer1310   shufflenet_9.Reshape_To_ReshapeLayer1310
     7   'Reshape_To_ReshapeLayer1315'   shufflenet_9.Reshape_To_ReshapeLayer1315   shufflenet_9.Reshape_To_ReshapeLayer1315
     8   'Reshape_To_ReshapeLayer1320'   shufflenet_9.Reshape_To_ReshapeLayer1320   shufflenet_9.Reshape_To_ReshapeLayer1320
     9   'Reshape_To_ReshapeLayer1325'   shufflenet_9.Reshape_To_ReshapeLayer1325   shufflenet_9.Reshape_To_ReshapeLayer1325
    10   'Reshape_To_ReshapeLayer1330'   shufflenet_9.Reshape_To_ReshapeLayer1330   shufflenet_9.Reshape_To_ReshapeLayer1330
    11   'Reshape_To_ReshapeLayer1335'   shufflenet_9.Reshape_To_ReshapeLayer1335   shufflenet_9.Reshape_To_ReshapeLayer1335
    12   'Reshape_To_ReshapeLayer1340'   shufflenet_9.Reshape_To_ReshapeLayer1340   shufflenet_9.Reshape_To_ReshapeLayer1340
    13   'Reshape_To_ReshapeLayer1345'   shufflenet_9.Reshape_To_ReshapeLayer1345   shufflenet_9.Reshape_To_ReshapeLayer1345
    14   'Reshape_To_ReshapeLayer1350'   shufflenet_9.Reshape_To_ReshapeLayer1350   shufflenet_9.Reshape_To_ReshapeLayer1350
    15   'Reshape_To_ReshapeLayer1355'   shufflenet_9.Reshape_To_ReshapeLayer1355   shufflenet_9.Reshape_To_ReshapeLayer1355
    16   'Reshape_To_ReshapeLayer1360'   shufflenet_9.Reshape_To_ReshapeLayer1360   shufflenet_9.Reshape_To_ReshapeLayer1360

Helper Function

This section defines the findCustomLayers helper function. findCustomLayers returns the indices of the custom layers that importNetworkFromONNX automatically generates.

function indices = findCustomLayers(layers,PackageName)
% findCustomLayers returns the indices of the layers in the layer array
% layers whose classes are defined in the custom layer package
% PackageName (specified with a leading "+").

s = what(['.\' PackageName]);   % list the .m files in the package folder

indices = zeros(1,length(s.m));
for i = 1:length(layers)
    for j = 1:length(s.m)
        % Compare each layer class name with each custom layer file name.
        if strcmpi(class(layers(i)),[PackageName(2:end) '.' s.m{j}(1:end-2)])
            indices(j) = i;
        end
    end
end

end

Import an ONNX network that has multiple outputs as a DAGNetwork object.

Specify the ONNX model file and import the pretrained ONNX model. By default, importONNXNetwork imports the network as a DAGNetwork object.

modelfile = "digitsMIMO.onnx";
net = importONNXNetwork(modelfile)
net = 
  DAGNetwork with properties:

         Layers: [19×1 nnet.cnn.layer.Layer]
    Connections: [19×2 table]
     InputNames: {'input'}
    OutputNames: {'ClassificationLayer_sm_1'  'RegressionLayer_fc_1_Flatten'}

The network has two output layers: one classification layer (ClassificationLayer_sm_1) to classify digits and one regression layer (RegressionLayer_fc_1_Flatten) to compute the mean squared error for the predicted angles of the digits. Plot the network architecture.

plot(net)
title('digitsMIMO Network Architecture')

To make predictions using the imported network, use the predict function and set the ReturnCategorical option to true.
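
As a sketch, assuming the digitsMIMO network accepts 28-by-28-by-1 grayscale images (the random input below is placeholder data), you can retrieve both outputs in one call:

X = single(rand(28,28,1,5));                               % placeholder batch of five images
[labels,angles] = predict(net,X,ReturnCategorical=true);   % categorical labels and predicted angles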

Input Arguments


Name of the ONNX model file containing the network, specified as a character vector or string scalar. The file must be in the current folder or in a folder on the MATLAB path, or you must include a full or relative path to the file.

Example: "cifarResNet.onnx"

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Example: importONNXNetwork(modelfile,TargetNetwork="dagnetwork",GenerateCustomLayers=true,PackageName="CustomLayers") imports the network in modelfile as a DAGNetwork object and saves the automatically generated custom layers in the package +CustomLayers in the current folder.

Option for custom layer generation, specified as a numeric or logical 1 (true) or 0 (false). If you set GenerateCustomLayers to true, importONNXNetwork tries to generate a custom layer when the software cannot convert an ONNX operator into an equivalent built-in MATLAB layer. importONNXNetwork saves each generated custom layer to a separate .m file in +PackageName. To view or edit a custom layer, open the associated .m file. For more information on custom layers, see Custom Layers.

Example: GenerateCustomLayers=false

Name of the custom layers package in which importONNXNetwork saves custom layers, specified as a character vector or string scalar. importONNXNetwork saves the custom layers package +PackageName in the current folder. If you do not specify PackageName, then importONNXNetwork saves the custom layers in a package named +modelfile in the current folder. For more information about packages, see Packages Create Namespaces.

Example: PackageName="shufflenet_9"

Example: PackageName="CustomLayers"

Target type of Deep Learning Toolbox network, specified as "dagnetwork" or "dlnetwork". The function importONNXNetwork imports the network net as a DAGNetwork or dlnetwork object.

  • If you import the network as a DAGNetwork object, net must include input and output layers specified by the ONNX model or that you specify using the name-value arguments InputDataFormats, OutputDataFormats, or OutputLayerType.

  • If you import the network as a dlnetwork object, importONNXNetwork appends a CustomOutputLayer at the end of each output branch of net, and might append a CustomInputLayer at the beginning of an input branch. The function appends a CustomInputLayer if the input data formats or input image sizes are not known. For network-specific information on the data formats of these layers, see the properties of the CustomInputLayer and CustomOutputLayer objects. For information on how to interpret Deep Learning Toolbox input and output data formats, see Conversion of ONNX Input and Output Tensors into Built-In MATLAB Layers.

Example: TargetNetwork="dlnetwork"

Data format of the network inputs, specified as a character vector, string scalar, or string array. importONNXNetwork tries to interpret the input data formats from the ONNX file. The name-value argument InputDataFormats is useful when importONNXNetwork cannot derive the input data formats.

Set InputDataFormats to a data format in the ordering of an ONNX input tensor. For example, if you specify InputDataFormats as "BSSC", the imported network has one imageInputLayer input. For more information on how importONNXNetwork interprets the data format of ONNX input tensors and how to specify InputDataFormats for different Deep Learning Toolbox input layers, see Conversion of ONNX Input and Output Tensors into Built-In MATLAB Layers.

If you specify an empty data format ([] or ""), importONNXNetwork automatically interprets the input data format.

Example: InputDataFormats='BSSC'

Example: InputDataFormats="BSSC"

Example: InputDataFormats=["BCSS","","BC"]

Example: InputDataFormats={'BCSS',[],'BC'}

Data Types: char | string | cell

Data format of the network outputs, specified as a character vector, string scalar, or string array. importONNXNetwork tries to interpret the output data formats from the ONNX file. The name-value argument OutputDataFormats is useful when importONNXNetwork cannot derive the output data formats.

Set OutputDataFormats to a data format in the ordering of an ONNX output tensor. For example, if you specify OutputDataFormats as "BC", the imported network has one classificationLayer output. For more information on how importONNXNetwork interprets the data format of ONNX output tensors and how to specify OutputDataFormats for different Deep Learning Toolbox output layers, see Conversion of ONNX Input and Output Tensors into Built-In MATLAB Layers.

If you specify an empty data format ([] or ""), importONNXNetwork automatically interprets the output data format.

Example: OutputDataFormats='BC'

Example: OutputDataFormats="BC"

Example: OutputDataFormats=["BCSS","","BC"]

Example: OutputDataFormats={'BCSS',[],'BC'}

Data Types: char | string | cell
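
For example, if importONNXNetwork cannot derive the formats of a model with one image input and one classification output, a call of this form (the file name is hypothetical) states both orderings explicitly:

net = importONNXNetwork("ambiguousModel.onnx", ...
    InputDataFormats="BSSC", ...    % batch, spatial, spatial, channel -> imageInputLayer
    OutputDataFormats="BC");        % batch, channel -> classification output layer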

Size of the input image for the first network input, specified as a vector of three or four numerical values corresponding to [height,width,channels] for 2-D images and [height,width,depth,channels] for 3-D images. The network uses this information only when the ONNX model in modelfile does not specify the input size.

Example: ImageInputSize=[28 28 1] for a 2-D grayscale input image

Example: ImageInputSize=[224 224 3] for a 2-D color input image

Example: ImageInputSize=[28 28 36 3] for a 3-D color input image
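
For example, if the ONNX file omits the input size, a call like the following (hypothetical file name) supplies it for a 2-D color input:

net = importONNXNetwork("modelWithoutInputSize.onnx", ...
    ImageInputSize=[224 224 3]);   % height, width, channels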

Layer type for the first network output, specified as "classification", "regression", or "pixelclassification". The function importONNXNetwork appends a ClassificationOutputLayer, RegressionOutputLayer, or pixelClassificationLayer (Computer Vision Toolbox) object to the end of the first output branch of the imported network architecture. Appending a pixelClassificationLayer (Computer Vision Toolbox) object requires Computer Vision Toolbox™. If the ONNX model in modelfile specifies the output layer type or you specify TargetNetwork as "dlnetwork", importONNXNetwork ignores the name-value argument OutputLayerType.

Example: OutputLayerType="regression"

Classes of the output layer for the first network output, specified as a categorical vector, string array, cell array of character vectors, or "auto". If Classes is "auto", then importONNXNetwork sets the classes to categorical(1:N), where N is the number of classes. If you specify a string array or cell array of character vectors str, then importONNXNetwork sets the classes of the output layer to categorical(str,str). If you specify TargetNetwork as "dlnetwork", importONNXNetwork ignores the name-value argument Classes.

Example: Classes={'0','1','3'}

Example: Classes=categorical({'dog','cat'})

Data Types: char | categorical | string | cell

Constant folding optimization, specified as "deep", "shallow", or "none". Constant folding optimizes the imported network architecture by computing operations on ONNX initializers (initial constant values) during the conversion of ONNX operators to equivalent built-in MATLAB layers.

If the ONNX network contains operators that the software cannot convert to equivalent built-in MATLAB layers (see ONNX Operators Supported for Conversion into Built-In MATLAB Layers), constant folding optimization can reduce the number of unsupported layers. When you set FoldConstants to "deep", the network has the same or fewer unsupported layers, compared to when you set the argument to "shallow". However, the network importing time might increase. Set FoldConstants to "none" to disable the network architecture optimization.

If the network still contains unsupported layers after constant folding optimization, importONNXNetwork returns an error. In this case, you can import the network by using importONNXLayers or importONNXFunction. For more information, see Alternative Functionality.

Example: FoldConstants="shallow"
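
For example, this call (hypothetical file name) disables the optimization entirely:

net = importONNXNetwork("myModel.onnx",FoldConstants="none");   % skip constant folding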

Output Arguments


Pretrained ONNX network, returned as a DAGNetwork or dlnetwork object.

  • Specify TargetNetwork as "dagnetwork" to import the network as a DAGNetwork object. On the DAGNetwork object, you then predict class labels by using the classify function.

  • Specify TargetNetwork as "dlnetwork" to import the network as a dlnetwork object. On the dlnetwork object, you then predict class labels by using the predict function. Specify the input data as a dlarray using the correct data format (for more information, see the fmt argument of dlarray).

Limitations

  • importONNXNetwork supports ONNX versions as follows:

    • The function supports ONNX intermediate representation version 7.

    • The function supports ONNX operator sets 6 to 14.

Note

If you import an exported network, layers of the reimported network might differ from the original network and might not be supported.

More About


ONNX Operators Supported for Conversion into Built-In MATLAB Layers

importONNXNetwork supports these ONNX operators for conversion into built-in MATLAB layers, with some limitations.

ONNX Operator: Deep Learning Toolbox Layer

  • Add: additionLayer or nnet.onnx.layer.ElementwiseAffineLayer

  • AveragePool: averagePooling1dLayer or averagePooling2dLayer

  • BatchNormalization: batchNormalizationLayer

  • Concat: concatenationLayer

  • Constant: none (imported as weights)

  • Conv*: convolution1dLayer or convolution2dLayer

  • ConvTranspose: transposedConv2dLayer

  • Dropout: dropoutLayer

  • Elu: eluLayer

  • Gemm: fullyConnectedLayer if the ONNX network is recurrent, otherwise nnet.onnx.layer.FlattenLayer followed by convolution2dLayer

  • GlobalAveragePool: globalAveragePooling1dLayer or globalAveragePooling2dLayer

  • GlobalMaxPool: globalMaxPooling1dLayer or globalMaxPooling2dLayer

  • GRU: gruLayer

  • InstanceNormalization: groupNormalizationLayer with numGroups specified as "channel-wise"

  • LeakyRelu: leakyReluLayer

  • LRN: crossChannelNormalizationLayer

  • LSTM: lstmLayer or bilstmLayer

  • MatMul: fullyConnectedLayer if the ONNX network is recurrent, otherwise convolution2dLayer

  • MaxPool: maxPooling1dLayer or maxPooling2dLayer

  • Mul: multiplicationLayer

  • Relu: reluLayer or clippedReluLayer

  • Sigmoid: sigmoidLayer

  • Softmax: softmaxLayer

  • Sum: additionLayer

  • Tanh: tanhLayer

* If importONNXNetwork imports the Conv ONNX operator as a convolution2dLayer object and the padding of the Conv operator is a vector with only two elements [p1 p2], importONNXNetwork sets the Padding option of convolution2dLayer to [p1 p2 p1 p2].

ONNX Operator: ONNX Importer Custom Layer

  • Clip: nnet.onnx.layer.ClipLayer

  • Div: nnet.onnx.layer.ElementwiseAffineLayer

  • Flatten: nnet.onnx.layer.FlattenLayer or nnet.onnx.layer.Flatten3dLayer

  • Identity: nnet.onnx.layer.IdentityLayer

  • ImageScaler: nnet.onnx.layer.ElementwiseAffineLayer

  • PRelu: nnet.onnx.layer.PReluLayer

  • Reshape: nnet.onnx.layer.FlattenLayer

  • Sub: nnet.onnx.layer.ElementwiseAffineLayer

ONNX Operator: Corresponding Image Processing Toolbox™ Layer

  • DepthToSpace: depthToSpace2dLayer (Image Processing Toolbox)

  • Resize: resize2dLayer (Image Processing Toolbox) or resize3dLayer (Image Processing Toolbox)

  • SpaceToDepth: spaceToDepthLayer (Image Processing Toolbox)

  • Upsample: resize2dLayer (Image Processing Toolbox) or resize3dLayer (Image Processing Toolbox)

Conversion of ONNX Input and Output Tensors into Built-In MATLAB Layers

importONNXNetwork tries to interpret the data format of the ONNX network's input and output tensors, and then convert them into built-in MATLAB input and output layers. For details on the interpretation, see the tables Conversion of ONNX Input Tensors into Deep Learning Toolbox Layers and Conversion of ONNX Output Tensors into MATLAB Layers.

In Deep Learning Toolbox, each data format character must be one of these labels:

  • S — Spatial

  • C — Channel

  • B — Batch observations

  • T — Time or sequence

  • U — Unspecified
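
For example, a batch of 2-D RGB images maps to the format "SSCB" (spatial, spatial, channel, batch); the array sizes below are arbitrary:

X = dlarray(single(rand(224,224,3,8)),"SSCB");   % eight 224-by-224 RGB images
dims(X)                                          % returns 'SSCB'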

Conversion of ONNX Input Tensors into Deep Learning Toolbox Layers

  • ONNX input tensor format: BC. MATLAB input format: CB. Shape: c-by-n array, where c is the number of features and n is the number of observations. Type: features. Layer: featureInputLayer.

  • ONNX input tensor formats: BCSS, BSSC, CSS, SSC. MATLAB input format: SSCB. Shape: h-by-w-by-c-by-n numeric array, where h, w, c, and n are the height, width, number of channels of the images, and number of observations, respectively. Type: 2-D image. Layer: imageInputLayer.

  • ONNX input tensor formats: BCSSS, BSSSC, CSSS, SSSC. MATLAB input format: SSSCB. Shape: h-by-w-by-d-by-c-by-n numeric array, where h, w, d, c, and n are the height, width, depth, number of channels of the images, and number of image observations, respectively. Type: 3-D image. Layer: image3dInputLayer.

  • ONNX input tensor format: TBC. MATLAB input format: CBT. Shape: c-by-s-by-n matrix, where c is the number of features of the sequence, s is the sequence length, and n is the number of sequence observations. Type: vector sequence. Layer: sequenceInputLayer.

  • ONNX input tensor format: TBCSS. MATLAB input format: SSCBT. Shape: h-by-w-by-c-by-s-by-n array, where h, w, and c correspond to the height, width, and number of channels of the images, respectively, s is the sequence length, and n is the number of image sequence observations. Type: 2-D image sequence. Layer: sequenceInputLayer.

  • ONNX input tensor format: TBCSSS. MATLAB input format: SSSCBT. Shape: h-by-w-by-d-by-c-by-s-by-n array, where h, w, d, and c correspond to the height, width, depth, and number of channels of the images, respectively, s is the sequence length, and n is the number of image sequence observations. Type: 3-D image sequence. Layer: sequenceInputLayer.

Conversion of ONNX Output Tensors into MATLAB Layers

  • ONNX output tensor formats: BC or TBC. MATLAB output formats: CB or CBT. Layer: classificationLayer.

  • ONNX output tensor formats: BCSS, BSSC, CSS, SSC, BCSSS, BSSSC, CSSS, or SSSC. MATLAB output formats: SSCB or SSSCB. Layer: pixelClassificationLayer (Computer Vision Toolbox).

  • ONNX output tensor formats: TBCSS or TBCSSS. MATLAB output formats: SSCBT or SSSCBT. Layer: regressionLayer.

Generate Code for Imported Network

You can use MATLAB Coder™ or GPU Coder™ together with Deep Learning Toolbox to generate MEX, standalone CPU, CUDA® MEX, or standalone CUDA code for an imported network. For more information, see Code Generation.

  • Use MATLAB Coder with Deep Learning Toolbox to generate MEX or standalone CPU code that runs on desktop or embedded targets. You can deploy generated standalone code that uses the Intel® MKL-DNN library or the ARM® Compute library. Alternatively, you can generate generic C or C++ code that does not call third-party library functions. For more information, see Deep Learning with MATLAB Coder (MATLAB Coder).

  • Use GPU Coder with Deep Learning Toolbox to generate CUDA MEX or standalone CUDA code that runs on desktop or embedded targets. You can deploy generated standalone CUDA code that uses the CUDA deep neural network library (cuDNN), the TensorRT™ high performance inference library, or the ARM Compute library for Mali GPU. For more information, see Deep Learning with GPU Coder (GPU Coder).

importONNXNetwork returns the network net as a DAGNetwork or dlnetwork object. Both these objects support code generation. For more information on MATLAB Coder and GPU Coder support for Deep Learning Toolbox objects, see Supported Classes (MATLAB Coder) and Supported Classes (GPU Coder), respectively.

You can generate code for any imported network whose layers support code generation. For lists of the layers that support code generation with MATLAB Coder and GPU Coder, see Supported Layers (MATLAB Coder) and Supported Layers (GPU Coder), respectively. For more information on the code generation capabilities and limitations of each built-in MATLAB layer, see the Extended Capabilities section of the layer. For example, see Code Generation and GPU Code Generation of imageInputLayer.

Use Imported Network on GPU

importONNXNetwork does not execute on a GPU. However, importONNXNetwork imports a pretrained neural network for deep learning as a DAGNetwork or dlnetwork object, which you can use on a GPU.

  • If you import the network as a DAGNetwork object, you can make predictions with the imported network on either a CPU or GPU by using classify. Specify the hardware requirements using the name-value argument ExecutionEnvironment. For networks with multiple outputs, use the predict function for DAGNetwork objects.

  • If you import the network as a DAGNetwork object, you can make predictions with the imported network on either a CPU or GPU by using predict. Specify the hardware requirements using the name-value argument ExecutionEnvironment. If the network has multiple outputs, specify the name-value argument ReturnCategorical as true.

  • If you import the network as a dlnetwork object, you can make predictions with the imported network on either a CPU or GPU by using predict. The function predict executes on the GPU if either the input data or network parameters are stored on the GPU.

    • If you use minibatchqueue to process and manage the mini-batches of input data, the minibatchqueue object converts the output to a GPU array by default if a GPU is available.

    • Use dlupdate to convert the learnable parameters of a dlnetwork object to GPU arrays.

      net = dlupdate(@gpuArray,net)

  • You can train the imported network on either a CPU or GPU by using the trainnet and trainNetwork functions. To specify training options, including options for the execution environment, use the trainingOptions function. Specify the hardware requirements using the name-value argument ExecutionEnvironment. For more information on how to accelerate training, see Scale Up Deep Learning in Parallel, on GPUs, and in the Cloud.

Using a GPU requires a Parallel Computing Toolbox™ license and a supported GPU device. For information about supported devices, see GPU Computing Requirements (Parallel Computing Toolbox).
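
The two target types differ only in how you request GPU execution. A minimal sketch, assuming a supported GPU is available and that I is an input image already resized to the network input size:

% DAGNetwork: request GPU execution explicitly.
label = classify(net,I,ExecutionEnvironment="gpu");

% dlnetwork: predict runs on the GPU when the input data (or the
% network learnables, via dlupdate) are gpuArray values.
dlX = dlarray(gpuArray(single(I)),"SSCB");
scores = predict(net,dlX);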

Tips

  • To use a pretrained network for prediction or transfer learning on new images, you must preprocess your images in the same way as the images used to train the imported model. The most common preprocessing steps are resizing images, subtracting image average values, and converting the images from BGR format to RGB format (see the sketch after this list).

    • To resize images, use imresize. For example, imresize(image,[227 227]).

    • To convert images between RGB and BGR format, use flip. For example, flip(image,3).

    For more information about preprocessing images for training and prediction, see Preprocess Images for Deep Learning.

  • MATLAB uses one-based indexing, whereas Python® uses zero-based indexing. In other words, the first element in an array has an index of 1 and 0 in MATLAB and Python, respectively. For more information about MATLAB indexing, see Array Indexing. In MATLAB, to use an array of indices (ind) created in Python, convert the array to ind+1.

  • For more tips, see Tips on Importing Models from TensorFlow, PyTorch, and ONNX.
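
A minimal preprocessing sketch under the assumptions above, for a model with a 227-by-227 RGB input; the channel means are placeholders for the values used to train the original model, and the BGR flip applies only if the model expects BGR input:

I = imread("peppers.png");                             % sample RGB image
I = imresize(I,[227 227]);                             % match the network input size
I = single(I) - reshape([123.7 116.3 103.5],1,1,3);    % subtract placeholder channel means
I = flip(I,3);                                         % reorder RGB -> BGR if the model expects BGR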

Alternative Functionality

Deep Learning Toolbox Converter for ONNX Model Format provides three functions to import a pretrained ONNX network: importONNXNetwork, importONNXLayers, and importONNXFunction.

If the imported network contains an ONNX operator not supported for conversion into a built-in MATLAB layer (see ONNX Operators Supported for Conversion into Built-In MATLAB Layers) and importONNXNetwork does not generate a custom layer, then importONNXNetwork returns an error. In this case, you can still use importONNXLayers to import the network architecture and weights or importONNXFunction to import the network as an ONNXParameters object and a model function.

For more information on which import function best suits different scenarios, see Select Function to Import ONNX Pretrained Network.

References

[1] Open Neural Network Exchange. https://github.com/onnx/.

[2] ONNX. https://onnx.ai/.

Version History

Introduced in R2018a


R2023b: importONNXNetwork will be removed

Starting in R2023b, the importONNXNetwork function issues a warning. Use importNetworkFromONNX instead. The importNetworkFromONNX function has these advantages over importONNXNetwork:

  • Imports an ONNX model into a dlnetwork object in a single step

  • Provides a simplified workflow for importing models with unknown input and output information

  • Has improved name-value arguments that you can use to more easily specify import options