Deep Learning Toolbox Interface for LiteRT Library
Incorporate pretrained LiteRT (aka TFLite) models into MATLAB and Simulink applications for simulation and deployment to hardware.
Updated 15 Oct 2025
The Deep Learning Toolbox Interface for LiteRT Library enables you to run cosimulations of MATLAB and Simulink applications with LiteRT (aka TensorFlow Lite or TFLite) models. This workflow allows you to use pretrained LiteRT models, including classification and object detection networks, with the rest of the application code implemented in MATLAB or Simulink for development and testing.
Inference of pretrained LiteRT models is executed by the LiteRT Interpreter, while the rest of the application code is executed by MATLAB or Simulink. Data exchange between MATLAB or Simulink and LiteRT is handled automatically.
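As a minimal sketch of this cosimulation workflow, the code below loads a pretrained image classification model and runs inference from MATLAB. The model file name (mobilenet_v1.tflite), the 224-by-224 input size, and the input scaling are illustrative assumptions, not part of this package:

% Load the pretrained LiteRT model; inference on the returned model object
% is executed by the LiteRT Interpreter.
net = loadTFLiteModel('mobilenet_v1.tflite');

% Pre-processing implemented in MATLAB.
img = imread('peppers.png');                   % sample image shipped with MATLAB
img = single(imresize(img, [224 224])) / 255;  % assumed input size and scaling

% Run inference; data exchange between MATLAB and LiteRT is automatic.
scores = predict(net, img);

% Post-processing implemented in MATLAB: report the top-scoring class index.
[~, classIdx] = max(scores);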
When used with MATLAB Coder, this interface lets you generate C++ code for the complete application for deployment to target hardware. In the generated code, inference of the LiteRT model is executed by the LiteRT Interpreter, while C++ code is generated for the remainder of the MATLAB or Simulink application, including pre- and post-processing. Data exchange between the generated code and the LiteRT Interpreter is again handled automatically.
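A sketch of that code generation workflow, assuming a MATLAB Coder license, wraps the model call in an entry-point function and passes it to codegen. The function name, model file, and input size below are illustrative assumptions:

% Entry-point function, saved as tflite_predict.m (hypothetical name).
function out = tflite_predict(in)
persistent net;
if isempty(net)
    % Loaded once; in the generated C++ code this initializes the
    % LiteRT Interpreter, which executes the model inference.
    net = loadTFLiteModel('mobilenet_v1.tflite');
end
out = predict(net, in);
end

% Generate C++ source code for the application with MATLAB Coder.
cfg = coder.config('lib');
cfg.TargetLang = 'C++';
codegen -config cfg tflite_predict -args {ones(224,224,3,'single')}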
If you need to generate code for the LiteRT models themselves, alongside the pre- and post-processing, you can use the MATLAB Coder Support Package for PyTorch and LiteRT Models.
See the following list of prerequisites for using this support package:
If you experience download or installation problems, please contact Technical Support.
MATLAB Release Compatibility
Created with R2022a
Compatible with R2022a to R2026a
Platform Compatibility
Windows, macOS (Apple Silicon), macOS (Intel), Linux