Calculating prediction RMSE of partial least squares regression with leave-one-out cross-validation

8 views (last 30 days)
Hello,
I am trying to calculate the prediction error of a partial least squares regression (PLSR) model. I know how to do this for a given set of variables, based on what I found in the MATLAB help. I use the code below:
ZN = zscore(N);      % standardize the response
Zref = zscore(ref);  % standardize the predictors
% Calibration fit with 10 components; MSE holds the calibration error
[XL,YL,XS,YS,BETA,PCTVAR,MSE,stats] = plsregress(Zref,ZN,10);
% Leave-one-out cross-validation (34 folds = 34 observations), 1 component
[XL,YL,XS,YS,BETA,PCTVAR,PLSmsep] = plsregress(Zref,ZN,1,'CV',34);
The first plsregress call generates the calibration MSE, and the second yields the prediction error from leave-one-out cross-validation ('CV',34 with 34 observations) using one latent variable. As you may have noticed, I apply a standard-normal (z-score) transformation to the data before the regression analysis because it increases the explanatory power of the model. So the errors I get are in that transformed scale, and this is where I need help: I want to report the calibration and prediction errors in the original scale of the response variable, not the z-transformed one. The calibration error is easy, because I can compute the fitted response values with the code below and then calculate the calibration RMSE from them.
yfitPLS = [ones(34,1) ref]*BETA;  % fitted responses: intercept column plus predictors times the PLS coefficients
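Concretely, I imagine something like the sketch below for the calibration RMSE in the original scale (assuming BETA comes from the z-scored fit, so the predictors need the same z-scoring and the fitted values the inverse transform; mu_y and sig_y denote the mean and standard deviation of N):

```matlab
% Sketch: calibration RMSE in the original response scale.
% Assumes ref (34-by-p predictors), N (34-by-1 response), and BETA
% from plsregress on the z-scored data.
[ZN, mu_y, sig_y] = zscore(N);             % keep the response mean/std
yfitZ = [ones(34,1) zscore(ref)] * BETA;   % fitted values, z-scale
yfit  = yfitZ*sig_y + mu_y;                % back-transform to original scale
RMSEC = sqrt(mean((N - yfit).^2));         % calibration RMSE, original units
```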
I have 34 observations, and BETA holds the model coefficients. I don't know MATLAB well enough to do the same for the prediction error. I am envisioning code that would leave out one observation, use the remaining 33 to fit the PLSR model, predict the response for the left-out observation, compute the residual, and repeat this for all 34 observations in turn. I think this way I can obtain the RMSECV in the original scale of the response variable.
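A minimal sketch of that loop is below, assuming ref is the 34-by-p predictor matrix and N the 34-by-1 response. The z-scoring is redone inside each fold using only the 33 training rows, so the held-out prediction can be back-transformed to the original scale before the residual is taken:

```matlab
% Sketch: leave-one-out cross-validation with RMSECV in the original scale.
% Assumes ref (n-by-p predictors) and N (n-by-1 response); ncomp is the
% number of PLS components (1 here, matching the plsregress call above).
n     = size(ref,1);
ncomp = 1;
res   = zeros(n,1);
for i = 1:n
    train = true(n,1);
    train(i) = false;                          % hold out observation i
    % Standardize using the training rows only
    [Xtr, mu_x, sig_x] = zscore(ref(train,:));
    [Ytr, mu_y, sig_y] = zscore(N(train));
    % Fit PLSR on the 33 training observations
    [~,~,~,~,BETA] = plsregress(Xtr, Ytr, ncomp);
    % Predict the held-out observation in the z-scale ...
    zpred = [1, (ref(i,:) - mu_x)./sig_x] * BETA;
    % ... and back-transform to the original response scale
    ypred  = zpred*sig_y + mu_y;
    res(i) = N(i) - ypred;                     % residual in original units
end
RMSECV = sqrt(mean(res.^2));
```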
I will REALLY appreciate any help. I’ve been struggling with this for a while and I need to get this done to move on.
Thank you!

Answers (0)
