DAGNetwork

Directed acyclic graph (DAG) network for deep learning


A DAG network is a neural network for deep learning with layers arranged as a directed acyclic graph. Unlike a network with a simple sequential architecture, a DAG network can have a more complex topology in which a layer receives inputs from multiple layers and sends outputs to multiple layers.


There are several ways to create a DAGNetwork object:

  • Load a pretrained network such as googlenet or squeezenet.

  • Train or fine-tune a network with a directed acyclic graph architecture by using trainNetwork, passing a LayerGraph object as the layers argument.

  • Import a network from another framework, for example by using importKerasNetwork.


To learn about other pretrained networks, see Pretrained Deep Neural Networks.
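For example, loading a pretrained model returns a DAGNetwork directly. This sketch assumes the corresponding support package (Deep Learning Toolbox Model for GoogLeNet Network) is installed:

```matlab
% Load a pretrained GoogLeNet model; the result is a DAGNetwork.
% Requires the Deep Learning Toolbox Model for GoogLeNet Network
% support package.
net = googlenet;

% Inspect the class and the input/output layer names of the network.
class(net)       % 'DAGNetwork'
net.InputNames
net.OutputNames
```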


Properties

Layers - Network layers, specified as a Layer array.

Connections - Layer connections, specified as a table with two columns.

Each table row represents a connection in the layer graph. The first column, Source, specifies the source of each connection. The second column, Destination, specifies the destination of each connection. The connection sources and destinations are either layer names or have the form 'layerName/IOName', where 'IOName' is the name of the layer input or output.

Data Types: table
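As an illustrative sketch, you can inspect the connection table of any trained network. Here net stands for an existing DAGNetwork, such as one returned by trainNetwork:

```matlab
% Display the connection table of a DAGNetwork (net is assumed to be
% an existing DAGNetwork, e.g. one returned by trainNetwork).
net.Connections

% A row such as {'skipConv'} {'add/in2'} means that the output of the
% layer named 'skipConv' feeds the input named 'in2' of the layer
% named 'add'.
```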

InputNames - Network input layer names, specified as a cell array of character vectors.

Data Types: cell

OutputNames - Network output layer names, specified as a cell array of character vectors.

Data Types: cell

Object Functions

activations - Compute deep learning network layer activations
classify - Classify data using a trained deep learning neural network
predict - Predict responses using a trained deep learning neural network
plot - Plot neural network layer graph
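As a sketch of how these functions are typically called, assuming net is a trained DAGNetwork and X is input data of the size the network expects:

```matlab
% Assumes net is a trained DAGNetwork and X is a 4-D array of input
% images of the size expected by the network's image input layer.

scores = predict(net,X);    % class scores, one row per observation
labels = classify(net,X);   % categorical predicted labels

% Activations of a named intermediate layer, for example 'relu_3':
act = activations(net,X,'relu_3');
```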


Examples

Create Simple DAG Network

Create a simple directed acyclic graph (DAG) network for deep learning. Train the network to classify images of digits. The simple network in this example consists of:

  • A main branch with layers connected sequentially.

  • A shortcut connection containing a single 1-by-1 convolutional layer. Shortcut connections enable the parameter gradients to flow more easily from the output layer to the earlier layers of the network.

Create the main branch of the network as a layer array. The addition layer sums multiple inputs element-wise. Specify the number of inputs for the addition layer to sum. All layers must have names and all names must be unique.

layers = [
    imageInputLayer([28 28 1],'Name','input')

    convolution2dLayer(5,16,'Padding','same','Name','conv_1')
    batchNormalizationLayer('Name','BN_1')
    reluLayer('Name','relu_1')

    convolution2dLayer(3,32,'Padding','same','Stride',2,'Name','conv_2')
    batchNormalizationLayer('Name','BN_2')
    reluLayer('Name','relu_2')

    convolution2dLayer(3,32,'Padding','same','Name','conv_3')
    batchNormalizationLayer('Name','BN_3')
    reluLayer('Name','relu_3')

    additionLayer(2,'Name','add')

    averagePooling2dLayer(2,'Stride',2,'Name','avpool')
    fullyConnectedLayer(10,'Name','fc')
    softmaxLayer('Name','softmax')
    classificationLayer('Name','classOutput')];

Create a layer graph from the layer array. layerGraph connects all the layers in layers sequentially. Plot the layer graph.

lgraph = layerGraph(layers);
figure
plot(lgraph)

Create the 1-by-1 convolutional layer and add it to the layer graph. Specify the number of convolutional filters and the stride so that the activation size matches the activation size of the 'relu_3' layer. This arrangement enables the addition layer to add the outputs of the 'skipConv' and 'relu_3' layers. To check that the layer is in the graph, plot the layer graph.

skipConv = convolution2dLayer(1,32,'Stride',2,'Name','skipConv');
lgraph = addLayers(lgraph,skipConv);
figure
plot(lgraph)

Create the shortcut connection from the 'relu_1' layer to the 'add' layer. Because you specified two as the number of inputs to the addition layer when you created it, the layer has two inputs named 'in1' and 'in2'. The 'relu_3' layer is already connected to the 'in1' input. Connect the 'relu_1' layer to the 'skipConv' layer and the 'skipConv' layer to the 'in2' input of the 'add' layer. The addition layer now sums the outputs of the 'relu_3' and 'skipConv' layers. To check that the layers are connected correctly, plot the layer graph.

lgraph = connectLayers(lgraph,'relu_1','skipConv');
lgraph = connectLayers(lgraph,'skipConv','add/in2');
figure
plot(lgraph)

Load the training and validation data, which consists of 28-by-28 grayscale images of digits.

[XTrain,YTrain] = digitTrain4DArrayData;
[XValidation,YValidation] = digitTest4DArrayData;

Specify training options and train the network. trainNetwork validates the network using the validation data every ValidationFrequency iterations.

options = trainingOptions('sgdm', ...
    'MaxEpochs',8, ...
    'Shuffle','every-epoch', ...
    'ValidationData',{XValidation,YValidation}, ...
    'ValidationFrequency',30, ...
    'Verbose',false);
net = trainNetwork(XTrain,YTrain,lgraph,options);

Display the properties of the trained network. The network is a DAGNetwork object.

net = 
  DAGNetwork with properties:

         Layers: [16×1 nnet.cnn.layer.Layer]
    Connections: [16×2 table]
     InputNames: {'input'}
    OutputNames: {'classOutput'}

Classify the validation images and calculate the accuracy. The network classifies the validation set with approximately 99% accuracy.

YPredicted = classify(net,XValidation);
accuracy = mean(YPredicted == YValidation)
accuracy = 0.9930
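To see where the remaining misclassifications occur, one option (a sketch, not part of the original example) is to plot a confusion matrix chart of the predictions:

```matlab
% Visualize per-class performance with a confusion matrix chart.
% YValidation and YPredicted are the categorical label arrays from above.
figure
confusionchart(YValidation,YPredicted)
```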


Introduced in R2017b