loadTFLiteModel

Load TensorFlow Lite model

Since R2022a

Description

net = loadTFLiteModel(modelFileName) loads a pretrained TensorFlow™ Lite model file modelFileName and returns a TFLiteModel object.

Use this TFLiteModel object with the predict function in your MATLAB® code to perform inference in MATLAB execution, in code generation, or in a MATLAB Function block in Simulink® models. For more information, see Prerequisites for Deep Learning with TensorFlow Lite Models.

To use this function, you must install the Deep Learning Toolbox Interface for TensorFlow Lite support package.

___ = loadTFLiteModel(___, PreserveDataFormats = false) loads the model with the default data layout behavior. PreserveDataFormats is false by default, so you do not need to specify it explicitly for this behavior. Set PreserveDataFormats to true only when you need to maintain and produce the same data layout as the pretrained TensorFlow Lite model.

___ = loadTFLiteModel(___, PreserveDataFormats = true) preserves the layouts of the input and output data so that they match the layouts of the pretrained TensorFlow Lite model. For code generation workflows, you must specify this option as a compile-time constant.
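For example, this sketch loads a model with its original data layouts preserved. The model file name is reused from the Examples section for illustration only.

% Keep the input and output data layouts of the pretrained TensorFlow Lite
% model instead of converting them to the default MATLAB layouts.
net = loadTFLiteModel('mobilenet_v1_0.5_224.tflite', PreserveDataFormats=true);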

Examples

Suppose that your current working directory contains a TensorFlow Lite model named mobilenet_v1_0.5_224.tflite.

Load the model by using the loadTFLiteModel function. Inspect the object this function creates.

net = loadTFLiteModel('mobilenet_v1_0.5_224.tflite');
disp(net)
  TFLiteModel with properties:
            ModelName: 'mobilenet_v1_0.5_224.tflite'
            NumInputs: 1
           NumOutputs: 1
            InputSize: {[224 224 3]}
           OutputSize: {[1001 1]}
           NumThreads: 8
                 Mean: 127.5000
    StandardDeviation: 127.5000

Create a MATLAB function that performs inference using this network. The function loads the Mobilenet-V1 model into a persistent TFLiteModel object and then performs prediction by passing the object to the predict function. Subsequent calls to the function reuse the persistent object.

function out = tflite_predict(in)
persistent net;
if isempty(net)
    % Load the model once and cache it for subsequent calls.
    net = loadTFLiteModel('mobilenet_v1_0.5_224.tflite');
end
out = predict(net,in);
end
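As a quick check in MATLAB, you can call this function on a sample image. In this sketch, the sample image and the imresize call (which requires Image Processing Toolbox) are illustrative assumptions; replace them with your own 224-by-224-by-3 input.

img = imresize(single(imread('peppers.png')), [224 224]);  % resize to the model input size
scores = tflite_predict(img);                              % run inference
[~, topClass] = max(scores);                               % index of the highest score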

For an example that shows how to generate code for this function and deploy on Raspberry Pi® hardware, see Generate Code for TensorFlow Lite (TFLite) Model and Deploy on Raspberry Pi.
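That example describes the full deployment workflow. As a rough sketch of the shape of the code generation call, assuming a C++ library configuration with the TensorFlow Lite deep learning target and Raspberry Pi hardware:

% Configure code generation for a C++ library that calls the TensorFlow Lite runtime.
cfg = coder.config('lib');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('tflite');
cfg.Hardware = coder.hardware('Raspberry Pi');
% Generate code for tflite_predict with a 224-by-224-by-3 single input.
codegen -config cfg tflite_predict -args ones(224,224,3,'single')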

Note

By default, the Mean and StandardDeviation properties of a TFLiteModel object are both set to 127.5. To change these default values after you create the object, make assignments by using the dot notation. For example:

net.Mean = 0;
net.StandardDeviation = 1;

If the input data is not normalized, you must set Mean to 0 and StandardDeviation to 1. Otherwise, set these properties based on how the input data is normalized.

Input Arguments

modelFileName — Name of the TensorFlow Lite model file, specified as a character vector or a string scalar.

Output Arguments

net — TFLiteModel object that represents the loaded TensorFlow Lite model.

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

Version History

Introduced in R2022a