Specify Custom Output Layer Backward Loss Function
Tip
Custom output layers are not recommended. Use the trainnet function and specify a custom loss function instead. To specify a custom backward function for the loss function, use a deep.DifferentiableFunction object. For more information, see Define Custom Deep Learning Operations.
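For example, an SSE loss equivalent to the custom layer defined on this page can be passed to trainnet as a function handle. This is a minimal sketch; XTrain, TTrain, layers, and options are placeholders assumed to be defined elsewhere, and observations are assumed to lie along the fourth dimension of Y.

% Sketch: SSE loss passed to trainnet as a function handle instead of
% defining a custom output layer. XTrain, TTrain, layers, and options
% are placeholders assumed to be defined elsewhere.
lossFcn = @(Y,T) sum((Y-T).^2,"all")/size(Y,4);
net = trainnet(XTrain,TTrain,layers,lossFcn,options);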
When you use the trainNetwork function, if Deep Learning Toolbox™ does not provide the output layer that you require for your task, then you can define your own custom output layer. When you create a custom output layer, creating a backward loss function is optional. If the forward loss function only uses functions that support dlarray objects, then the software determines the derivatives automatically using automatic differentiation. For a list of functions that support dlarray objects, see List of Functions with dlarray Support. If you want to use functions that do not support dlarray objects, or you want to use a specific algorithm for the backward loss function, then you can define a custom backward function using this example as a guide.
The example Define Custom Classification Output Layer shows how to define and create a custom classification output layer with sum of squares error (SSE) loss and goes through these steps:
Name the layer – Give the layer a name so that you can use it in MATLAB®.
Declare the layer properties – Specify the properties of the layer.
Create a constructor function (optional) – Specify how to construct the layer and initialize its properties. If you do not specify a constructor function, then the software initializes the properties with '' at creation.
Create a forward loss function – Specify the loss between the predictions and the training targets.
Create a backward loss function (optional) – Specify the derivative of the loss with respect to the predictions. If you do not specify a backward loss function, then the forward loss function must support dlarray objects.
Create Custom Layer
The example Define Custom Classification Output Layer shows how to create a SSE classification layer.
A classification SSE layer computes the sum of squares error loss for classification problems. SSE is an error measure between two continuous random variables. For predictions Y and training targets T, the SSE loss between Y and T is given by

$$\mathrm{loss} = \frac{1}{N}\sum_{n=1}^{N}\sum_{i=1}^{K}\left(Y_{ni}-T_{ni}\right)^{2},$$

where N is the number of observations and K is the number of classes.
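As a quick sanity check, the formula can be evaluated by hand on a small batch. This is a hypothetical snippet, not part of the example; observations are stored along the fourth dimension, matching the layer code below.

% Hypothetical numeric check of the SSE formula (not part of the example).
% K = 3 classes, N = 2 observations, stored as 1-by-1-by-K-by-N arrays.
Y = reshape([0.7 0.2 0.1, 0.1 0.8 0.1],1,1,3,2); % predictions
T = reshape([1 0 0, 0 1 0],1,1,3,2);             % one-hot targets
loss = sum((Y(:)-T(:)).^2)/size(Y,4)             % returns 0.1000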
View the layer created in the example Define Custom Classification Output Layer. This layer does not have a backwardLoss function.
classdef sseClassificationLayer < nnet.layer.ClassificationLayer ...
        & nnet.layer.Acceleratable
    % Example custom classification layer with sum of squares error loss.

    methods
        function layer = sseClassificationLayer(name)
            % layer = sseClassificationLayer(name) creates a sum of squares
            % error classification layer and specifies the layer name.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = 'Sum of squares error';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the SSE loss between
            % the predictions Y and the training targets T.

            % Calculate sum of squares.
            sumSquares = sum((Y-T).^2);

            % Take mean over mini-batch.
            N = size(Y,4);
            loss = sum(sumSquares)/N;
        end
    end
end
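You can validate a layer like this one with checkLayer before training. The sketch below assumes the class file is saved on the MATLAB path and uses a hypothetical input size of 10 classes; for output layers, observations lie along dimension 4, matching size(Y,4) in forwardLoss.

% Sketch: validate the custom output layer. The number of classes (10)
% is an assumption for illustration.
layer = sseClassificationLayer('sse');
numClasses = 10;
validInputSize = [1 1 numClasses];
checkLayer(layer,validInputSize,'ObservationDimension',4)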
Create Backward Loss Function
Implement the backwardLoss function that returns the derivatives of the loss with respect to the input data and the learnable parameters.

The syntax for backwardLoss is dLdY = backwardLoss(layer,Y,T). The input Y contains the predictions made by the network and T contains the training targets. The output dLdY is the derivative of the loss with respect to the predictions Y. The output dLdY must be the same size as the layer input Y.

The dimensions of Y and T are the same as the inputs in forwardLoss.
The derivative of the SSE loss with respect to the predictions Y is given by

$$\frac{\partial \mathrm{loss}}{\partial Y_{ni}} = \frac{2\left(Y_{ni}-T_{ni}\right)}{N},$$

where N is the number of observations in the input.
Create the backward loss function that returns these derivatives.
function dLdY = backwardLoss(layer, Y, T)
    % dLdY = backwardLoss(layer, Y, T) returns the derivatives of
    % the SSE loss with respect to the predictions Y.

    N = size(Y,4);
    dLdY = 2*(Y-T)/N;
end
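One way to gain confidence in the analytic gradient is to compare it against a central finite-difference estimate of the forward loss. This is a hypothetical check, not part of the example; it assumes the complete class file below is saved on the MATLAB path.

% Hypothetical gradient check: compare the analytic derivative from
% backwardLoss with a central finite-difference estimate for one
% element of Y.
layer = sseClassificationLayer('sse');
Y = rand(1,1,3,5);                   % K = 3 classes, N = 5 observations
T = zeros(size(Y)); T(1,1,1,:) = 1;  % one-hot targets
dLdY = backwardLoss(layer,Y,T);      % analytic gradient, same size as Y

h = 1e-6;
Yp = Y; Yp(1,1,2,3) = Yp(1,1,2,3) + h;
Ym = Y; Ym(1,1,2,3) = Ym(1,1,2,3) - h;
numGrad = (forwardLoss(layer,Yp,T) - forwardLoss(layer,Ym,T))/(2*h);

abs(numGrad - dLdY(1,1,2,3))         % should be close to zero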
Complete Layer
View the completed layer class file.
classdef sseClassificationLayer < nnet.layer.ClassificationLayer
    % Example custom classification layer with sum of squares error loss.

    methods
        function layer = sseClassificationLayer(name)
            % layer = sseClassificationLayer(name) creates a sum of squares
            % error classification layer and specifies the layer name.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = 'Sum of squares error';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the SSE loss between
            % the predictions Y and the training targets T.

            % Calculate sum of squares.
            sumSquares = sum((Y-T).^2);

            % Take mean over mini-batch.
            N = size(Y,4);
            loss = sum(sumSquares)/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % dLdY = backwardLoss(layer, Y, T) returns the derivatives of
            % the SSE loss with respect to the predictions Y.

            N = size(Y,4);
            dLdY = 2*(Y-T)/N;
        end
    end
end
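The completed layer can then be used in place of a built-in output layer. The sketch below uses a placeholder architecture for illustration; the data and training options are assumed to be defined elsewhere.

% Sketch: use the custom output layer in a network for trainNetwork.
% The architecture is a placeholder for illustration.
layers = [
    imageInputLayer([28 28 1])
    fullyConnectedLayer(10)
    softmaxLayer
    sseClassificationLayer('sse')];
% net = trainNetwork(XTrain,TTrain,layers,options);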
GPU Compatibility
If the layer forward functions fully support dlarray objects, then the layer is GPU compatible. Otherwise, to be GPU compatible, the layer functions must support inputs and return outputs of type gpuArray (Parallel Computing Toolbox).

Many MATLAB built-in functions support gpuArray (Parallel Computing Toolbox) and dlarray input arguments. For a list of functions that support dlarray objects, see List of Functions with dlarray Support. For a list of functions that execute on a GPU, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox). To use a GPU for deep learning, you must also have a supported GPU device. For information on supported devices, see GPU Computing Requirements (Parallel Computing Toolbox). For more information on working with GPUs in MATLAB, see GPU Computing in MATLAB (Parallel Computing Toolbox).
See Also
trainnet | trainingOptions | dlnetwork | checkLayer | findPlaceholderLayers | replaceLayer | PlaceholderLayer