Understanding Shallow Network Data Structures
This topic discusses how the format of input data structures affects the simulation of networks. It starts with static networks, and then continues with dynamic networks. The following section describes how the format of the data structures affects network training.
There are two basic types of input vectors: those that occur concurrently (at the same time, or in no particular time sequence), and those that occur sequentially in time. For concurrent vectors, the order is not important, and if there were a number of networks running in parallel, you could present one input vector to each of the networks. For sequential vectors, the order in which the vectors appear is important.
Simulation with Concurrent Inputs in a Static Network
The simplest situation for simulating a network occurs when the network to be simulated is static (has no feedback or delays). In this case, you need not be concerned about whether or not the input vectors occur in a particular time sequence, so you can treat the inputs as concurrent. In addition, the problem is made even simpler by assuming that the network has only one input vector. Use the following network as an example.
To set up this linear feedforward network, use the following commands:
net = linearlayer;
net.inputs{1}.size = 2;
net.layers{1}.dimensions = 1;
For simplicity, assign the weight matrix and bias to be W = [1 2] and b = [0].
The commands for these assignments are
net.IW{1,1} = [1 2];
net.b{1} = 0;
Suppose that the network simulation data set consists of Q = 4 concurrent vectors: p1 = [1; 2], p2 = [2; 1], p3 = [2; 3], and p4 = [3; 1].
Concurrent vectors are presented to the network as a single matrix:
P = [1 2 2 3; 2 1 3 1];
You can now simulate the network:
A = net(P)
A =
     5     4     8     5
A single matrix of concurrent vectors is presented to the network, and the network produces a single matrix of concurrent vectors as output. The result would be the same if there were four networks operating in parallel and each network received one of the input vectors and produced one of the outputs. The ordering of the input vectors is not important, because they do not interact with each other.
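You can check this independence directly: simulating any single column of P on its own reproduces the corresponding element of A. A minimal sketch, assuming net still holds the weights assigned above:
A3 = net(P(:,3))    % third input vector [2; 3] on its own
% A3 =
%      8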
Simulation with Sequential Inputs in a Dynamic Network
When a network contains delays, the input to the network would normally be a sequence of input vectors that occur in a certain time order. To illustrate this case, the next figure shows a simple network that contains one delay.
The following commands create this network:
net = linearlayer([0 1]);
net.inputs{1}.size = 1;
net.layers{1}.dimensions = 1;
net.biasConnect = 0;
Assign the weight matrix to be W = [1 2].
The command is:
net.IW{1,1} = [1 2];
Suppose that the input sequence is: p(1) = 1, p(2) = 2, p(3) = 3, p(4) = 4.
Sequential inputs are presented to the network as elements of a cell array:
P = {1 2 3 4};
You can now simulate the network:
A = net(P)
A =
    [1]    [4]    [7]    [10]
You input a cell array containing a sequence of inputs, and the network produces a cell array containing a sequence of outputs. The order of the inputs is important when they are presented as a sequence. In this case, the current output is obtained by multiplying the current input by 1 and the preceding input by 2 and summing the result. If you were to change the order of the inputs, the numbers obtained in the output would change.
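To make the delay explicit, you can reproduce this output by hand. The following sketch (assuming the weights W = [1 2] assigned above) computes each output from the current and preceding inputs, with the preceding input taken as 0 at the first time step:
p = [1 2 3 4];
pPrev = [0 p(1:end-1)];    % delayed input, initial condition 0
y = 1*p + 2*pPrev          % y = 1 4 7 10, matching net(P)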
Simulation with Concurrent Inputs in a Dynamic Network
If you were to apply the same inputs as a set of concurrent inputs instead of a sequence of inputs, you would obtain a completely different response. (However, it is not clear why you would want to do this with a dynamic network.) It would be as if each input were applied concurrently to a separate parallel network. For the previous example, Simulation with Sequential Inputs in a Dynamic Network, if you use a concurrent set of inputs you have p1 = 1, p2 = 2, p3 = 3, and p4 = 4,
which can be created with the following code:
P = [1 2 3 4];
When you simulate with concurrent inputs, you obtain
A = net(P)
A =
     1     2     3     4
The result is the same as if you had concurrently applied each one of the inputs to a separate network and computed one output. Note that because you did not assign any initial conditions to the network delays, they were assumed to be 0. For this case the output is simply 1 times the input, because the weight that multiplies the current input is 1.
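Because the delayed term contributes nothing here, the output is just the first input weight times each input. A quick check, assuming the weights assigned above:
net.IW{1,1}(1) * P    % returns 1 2 3 4, the same as net(P)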
In certain special cases, you might want to simulate the network response to several different sequences at the same time. In this case, you would want to present the network with a concurrent set of sequences. For example, suppose you wanted to present the following two sequences to the network: p1(1) = 1, p1(2) = 2, p1(3) = 3, p1(4) = 4 and p2(1) = 4, p2(2) = 3, p2(3) = 2, p2(4) = 1.
The input P should be a cell array, where each element of the array contains the two elements of the two sequences that occur at the same time:
P = {[1 4] [2 3] [3 2] [4 1]};
You can now simulate the network:
A = net(P);
The resulting network output would be
A = {[1 4] [4 11] [7 8] [10 5]}
As you can see, the first column of each matrix makes up the output sequence produced by the first input sequence, which was the one used in an earlier example. The second column of each matrix makes up the output sequence produced by the second input sequence. There is no interaction between the two concurrent sequences. It is as if they were each applied to separate networks running in parallel.
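Because the sequences do not interact, you could equally simulate each one on its own and recover the corresponding columns. A minimal sketch, assuming the dynamic network defined above:
A1 = net({1 2 3 4})    % first sequence:  [1] [4] [7] [10]
A2 = net({4 3 2 1})    % second sequence: [4] [11] [8] [5]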
The following diagram shows the general format for the network input P when there are Q concurrent sequences of TS time steps. It covers all cases where there is a single input vector. Each element of the cell array is a matrix of concurrent vectors that correspond to the same point in time for each sequence. If there are multiple input vectors, there will be multiple rows of matrices in the cell array.
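As a concrete illustration of this format, the following sketch builds the cell array used above from a matrix in which each row is one sequence (hypothetical variable names; a single scalar input is assumed):
% seqs(q,t) holds the value of sequence q at time step t
seqs = [1 2 3 4;     % first sequence
        4 3 2 1];    % second sequence
TS = size(seqs,2);
P = cell(1,TS);
for t = 1:TS
    P{t} = seqs(:,t)';   % 1-by-Q matrix of concurrent values at time t
end
% P is now {[1 4] [2 3] [3 2] [4 1]}, the format used above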
In this topic, you apply sequential and concurrent inputs to dynamic networks. In Simulation with Concurrent Inputs in a Static Network, you applied concurrent inputs to static networks. It is also possible to apply sequential inputs to static networks. It does not change the simulated response of the network, but it can affect the way in which the network is trained. This will become clear in Neural Network Training Concepts.
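For example, you could present the four concurrent vectors from the static-network example as a sequence instead. A minimal sketch, assuming net has been reset to the static linear layer configured at the start of this topic, using con2seq to convert the matrix of concurrent vectors into a sequence:
P = [1 2 2 3; 2 1 3 1];
Pseq = con2seq(P);    % {[1;2] [2;1] [2;3] [3;1]}
Aseq = net(Pseq)      % {[5] [4] [8] [5]} -- the same values, now returned as a sequence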
See also Configure Shallow Neural Network Inputs and Outputs.