Custom Training Loops
If the trainingOptions function does not provide the training options that
you need for your task, or custom output layers do not support the loss
functions that you need, then you can define a custom training loop. For
networks that cannot be created using layer graphs, you can define custom
networks as a function. To learn more, see Define Custom Training Loops,
Loss Functions, and Networks.
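A minimal custom training loop can look like the following sketch: a dlnetwork model is evaluated inside a model loss function with dlfeval, gradients come from dlgradient, and parameters are updated with adamupdate. The layer sizes, iteration count, and data below are made up for illustration.

```matlab
% Illustrative sketch of a custom training loop (all sizes and data invented).
layers = [
    featureInputLayer(10)
    fullyConnectedLayer(3)
    softmaxLayer];
net = dlnetwork(layers);

X = dlarray(rand(10,32),"CB");                 % 10 features, 32 observations
T = zeros(3,32);
T(sub2ind(size(T),randi(3,1,32),1:32)) = 1;    % random one-hot targets

avgGrad = []; avgSqGrad = [];
for iteration = 1:100
    % Evaluate the model loss and gradients with automatic differentiation.
    [loss,gradients] = dlfeval(@modelLoss,net,X,T);
    % Update the learnable parameters with the Adam solver.
    [net,avgGrad,avgSqGrad] = adamupdate(net,gradients, ...
        avgGrad,avgSqGrad,iteration);
end

function [loss,gradients] = modelLoss(net,X,T)
    Y = forward(net,X);
    loss = crossentropy(Y,T);
    gradients = dlgradient(loss,net.Learnables);
end
```

In a full example, the fixed X and T would be replaced by mini-batches drawn from a minibatchqueue object inside an epoch loop.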
Custom Training Loops
dlnetwork | Deep learning network for custom training loops (Since R2019b)
trainingProgressMonitor | Monitor and plot training progress for deep learning custom training loops (Since R2022b)
minibatchqueue | Create mini-batches for deep learning (Since R2020b)
padsequences | Pad or truncate sequence data to same length (Since R2021a)
dlarray | Deep learning array for customization (Since R2019b)
dlgradient | Compute gradients for custom training loops using automatic differentiation (Since R2019b)
dlfeval | Evaluate deep learning model for custom training loops (Since R2019b)
crossentropy | Cross-entropy loss for classification tasks (Since R2019b)
l1loss | L1 loss for regression tasks (Since R2021b)
l2loss | L2 loss for regression tasks (Since R2021b)
huber | Huber loss for regression tasks (Since R2021a)
mse | Half mean squared error (Since R2019b)
ctc | Connectionist temporal classification (CTC) loss for unaligned sequence classification (Since R2021a)
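As a brief illustration of how the loss functions above are called, the following sketch evaluates a few of them on formatted dlarray data. All sizes and values are made up for illustration.

```matlab
% Classification loss on predicted probabilities and one-hot targets.
Y = softmax(dlarray(rand(4,16),"CB"));         % 4 classes, 16 observations
T = zeros(4,16);
T(sub2ind(size(T),randi(4,1,16),1:16)) = 1;    % random one-hot targets
lossCE = crossentropy(Y,dlarray(T,"CB"));

% Regression losses on predicted and target values.
Yreg = dlarray(rand(1,16),"CB");
Treg = dlarray(rand(1,16),"CB");
lossMSE = mse(Yreg,Treg);                      % half mean squared error
lossHuber = huber(Yreg,Treg);                  % Huber loss
```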
Deep Learning Operations
dlconv | Deep learning convolution (Since R2019b)
dltranspconv | Deep learning transposed convolution (Since R2019b)
lstm | Long short-term memory (Since R2019b)
gru | Gated recurrent unit (Since R2020a)
attention | Dot-product attention (Since R2022b)
embed | Embed discrete data (Since R2020b)
fullyconnect | Sum all weighted input data and apply a bias (Since R2019b)
dlode45 | Deep learning solution of nonstiff ordinary differential equation (ODE) (Since R2021b)
batchnorm | Normalize data across all observations for each channel independently (Since R2019b)
crosschannelnorm | Cross channel square-normalize using local responses (Since R2020a)
groupnorm | Normalize data across grouped subsets of channels for each observation independently (Since R2020b)
instancenorm | Normalize across each channel for each observation independently (Since R2021a)
layernorm | Normalize data across all channels for each observation independently (Since R2021a)
avgpool | Pool data to average values over spatial dimensions (Since R2019b)
maxpool | Pool data to maximum value (Since R2019b)
maxunpool | Unpool the output of a maximum pooling operation (Since R2019b)
relu | Apply rectified linear unit activation (Since R2019b)
leakyrelu | Apply leaky rectified linear unit activation (Since R2019b)
gelu | Apply Gaussian error linear unit (GELU) activation (Since R2022b)
softmax | Apply softmax activation to channel dimension (Since R2019b)
sigmoid | Apply sigmoid activation (Since R2019b)
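These operations take formatted dlarray inputs directly, which is how model functions are built without layer graphs. The following sketch chains a few of them; the image sizes, filter sizes, and random weights are made up for illustration.

```matlab
% Batch of 8 random 28x28 RGB images in "SSCB" (spatial, spatial,
% channel, batch) format.
X = dlarray(rand(28,28,3,8),"SSCB");

% Random convolution weights: 3x3 filters, 3 input and 16 output channels.
weights = dlarray(rand(3,3,3,16));
bias = dlarray(zeros(16,1));

Y = dlconv(X,weights,bias,Padding="same");   % deep learning convolution
Y = relu(Y);                                 % rectified linear unit activation
Y = avgpool(Y,2,Stride=2);                   % pool over spatial dimensions
```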
Custom Training Loops
- Train Deep Learning Model in MATLAB
Learn how to train deep learning models in MATLAB®.
- Define Custom Training Loops, Loss Functions, and Networks
Learn how to define and customize deep learning training loops, loss functions, and networks using automatic differentiation.
- Train Sequence Classification Network Using Custom Training Loop
This example shows how to train a network that classifies sequences with a custom learning rate schedule.
- Monitor Custom Training Loop Progress
Track and plot custom training loop progress.
- Train Network with Multiple Outputs
This example shows how to train a deep learning network with multiple outputs that predict both labels and angles of rotations of handwritten digits.
- Classify Videos Using Deep Learning with Custom Training Loop
This example shows how to create a network for video classification by combining a pretrained image classification model and a sequence classification network.
- Train Neural ODE Network
This example shows how to train an augmented neural ordinary differential equation (ODE) network.
- Solve Ordinary Differential Equation Using Neural Network
This example shows how to solve an ordinary differential equation (ODE) using a neural network.
- Create Bidirectional LSTM (BiLSTM) Function
This example shows how to create a bidirectional long short-term memory (BiLSTM) function for custom deep learning functions. (Since R2023b)
- List of Functions with dlarray Support
View the list of functions that support dlarray objects.
- Automatic Differentiation Background
Learn how automatic differentiation works.
- Use Automatic Differentiation In Deep Learning Toolbox
How to use automatic differentiation in deep learning.