scale data for NN

Hi,
How do I scale data in a neural network multilayer backpropagation?

 Accepted Answer

Greg Heath
Greg Heath on 17 May 2012
You do not have to scale the data, because the variables are AUTOMATICALLY scaled to [-1,1] with MAPMINMAX by NEWFF, FITNET, PATTERNNET, and FEEDFORWARDNET.
The command
type feedforwardnet
yields these commands
========================
% Inputs
net.numInputs = 1;
net.inputConnect(1,1) = true;
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
% Outputs
net.outputConnect(Nl) = true;
net.outputs{Nl}.processFcns = {'removeconstantrows','mapminmax'};
===================================================
If you wish, you can replace either occurrence of MAPMINMAX with 'mapstd' (zero-mean/unit-variance) or 'none'.
I prefer to use MAPSTD before creating the net to try to understand the input/output relationships via plots, correlation coefficients and outliers.
Then I accept the automatic minmax normalization instead of removing or changing it.
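As a sketch of that replacement (assuming the Neural Network Toolbox; the hidden-layer size of 10 is just an illustration), you can change the process functions after creating the net:

```matlab
% Swap the default mapminmax normalization after creating the net.
net = feedforwardnet(10);                                     % 10 hidden neurons (illustrative)
net.inputs{1}.processFcns  = {'removeconstantrows','mapstd'}; % inputs: zero-mean/unit-variance
net.outputs{2}.processFcns = {'removeconstantrows','none'};   % outputs: no normalization
```

Here `net.outputs{2}` refers to the output of the second (final) layer of the default two-layer feedforward net.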
Hope this helps.
Greg

2 Comments

mustafa
mustafa on 18 May 2012
Thanks Greg for your reply. I want to build my program on my own (I don't use the toolbox). Is there a formula for scaling? Actually, I found this formula "I = Imin + (Imax-Imin)*(X-Dmin)/(Dmax-Dmin)" by searching on Google, but I don't know its reference.
mustafa
mustafa on 18 May 2012
I also saw an old comment of yours where you used this formula "xn = -1 + 2*(x-xmin)/(xmax-xmin);". Is this formula for scaling? If yes, can I know the reference? And what do the numbers (1 and 2) in the formula mean?


More Answers (1)

Greg Heath
Greg Heath on 18 May 2012
1. Derive a linear transformation xn(x) = a.*x + b such that
xn( x = min(x) ) = -1
xn( x = max(x) ) = +1
2. Derive a linear transformation xn(x) = a.*x + b such that
mean(xn) = 0
var( xn ) = 1
Hope this helps.
Greg

1 Comment

mustafa
mustafa on 18 May 2012
Sorry Greg, it is not clear. Please, I need some details about this.

