Changing the activation function to ReLU in nntool?
I want to build a neural network that consists of 3 layers (2 ReLU layers and an output layer), with 10 neurons in each of the nonlinear layers.
I am currently using "nntool".
However, I couldn't figure out how to change the activation function to ReLU.
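Programmatically, I would expect something like the following sketch to work (assuming `poslin` is the ReLU transfer function in the Deep Learning Toolbox), but I'd like to know how to do the equivalent in the nntool GUI:

```matlab
% Sketch: a feedforward network with two hidden layers of 10 neurons each.
% 'poslin' (positive linear) is MATLAB's ReLU transfer function;
% the output layer keeps the default 'purelin' (linear) transfer function.
net = feedforwardnet([10 10]);
net.layers{1}.transferFcn = 'poslin';  % first hidden layer -> ReLU
net.layers{2}.transferFcn = 'poslin';  % second hidden layer -> ReLU
view(net)
```

Is there a way to select `poslin` (or any ReLU-like function) from within nntool itself?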