LIBSVM training error, need help.
Hello team,
I am running a LIBSVM regression analysis on time-series data, but training returns rho as NaN and the model does not predict anything. The code is below:
trn_data.X = WCA(1:500, 1:85);
trn_data.y = WCA(1:500, 86);
tst_data.X = WCA(501:800, 1:85);
tst_data.y = WCA(501:800, 86);
%%
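(For context, WCA holds the time-series data: columns 1:85 are the features and column 86 is the target. Below is just a quick sanity-check sketch for the split, not part of the script itself:)

% sanity-check sketch: confirm the split sizes and look for NaN/Inf in the raw data
size(trn_data.X)          % expected: 500 x 85
size(tst_data.X)          % expected: 300 x 85
nnz(~isfinite(WCA(:)))    % number of NaN/Inf entries in WCA (0 if the data is clean)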
param.s = 3;                                  % epsilon-SVR
param.C = max(trn_data.y) - min(trn_data.y);  % C set to the range of the targets
param.t = 2;                                  % RBF kernel
param.gset = 2.^[-7:20];                      % range of the gamma parameter
param.eset = [0:1000];                        % range of the epsilon parameter
param.nfold = 25;                             % 25-fold CV
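(To illustrate what ends up being passed to svmtrain, here is the option string for the first grid point, g = 2^-7 and e = 0; the actual -c value depends on my targets, so it is only shown symbolically in the comment:)

% example option string for the first grid point (sketch)
opts = ['-s 3 -t 2 -c ', num2str(max(trn_data.y) - min(trn_data.y)), ...
        ' -g ', num2str(2^-7), ' -p ', num2str(0)]
% gives something like '-s 3 -t 2 -c <max(y)-min(y)> -g 0.0078125 -p 0'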
%%
Rval = zeros(length(param.gset), length(param.eset));
for i = 1:param.nfold
    % partition the training data into the learning/validation sets
    % in this example, the 5-fold data partitioning is done by the following strategy:
    % for partition 1: Use samples 1, 6, 11, ... as validation samples and the remaining as learning samples
    % for partition 2: Use samples 2, 7, 12, ... as validation samples and the remaining as learning samples
    % :
    % for partition 5: Use samples 5, 10, 15, ... as validation samples and the remaining as learning samples
    data = [trn_data.y, trn_data.X];
    [learn, val] = k_FoldCV_SPLIT(data, param.nfold, i);
    lrndata.X = learn(:, 2:end);
    lrndata.y = learn(:, 1);
    valdata.X = val(:, 2:end);
    valdata.y = val(:, 1);

    for j = 1:length(param.gset)
        param.g = param.gset(j);

        for k = 1:length(param.eset)
            param.e = param.eset(k);
            param.libsvm = ['-s ', num2str(param.s), ' -t ', num2str(param.t), ...
                            ' -c ', num2str(param.C), ' -g ', num2str(param.g), ...
                            ' -p ', num2str(param.e)];

            % build model on learning data
            model = svmtrain(lrndata.y, lrndata.X, param.libsvm);

            % predict on the validation data
            [y_hat, Acc, projection] = svmpredict(valdata.y, valdata.X, model);

            Rval(j,k) = Rval(j,k) + mean((y_hat - valdata.y).^2);
        end
    end
end
%%
Rval = Rval ./ param.nfold;

[v1, i1] = min(Rval);
[v2, i2] = min(v1);
optparam = param;
optparam.g = param.gset(i1(i2));
optparam.e = param.eset(i2);
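(For completeness, here is roughly what k_FoldCV_SPLIT does, a simplified sketch based on the partitioning strategy described in the comments above; the actual helper may differ slightly:)

function [learn, val] = k_FoldCV_SPLIT(data, nfold, i)
% simplified sketch: rows i, i+nfold, i+2*nfold, ... form the validation set,
% all remaining rows form the learning set
valIdx = i:nfold:size(data, 1);
lrnIdx = setdiff(1:size(data, 1), valIdx);
val    = data(valIdx, :);
learn  = data(lrnIdx, :);
end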
For every svmtrain/svmpredict call I get output like this:

optimization finished, #iter = 0
nu = -nan(ind)
obj = 0.000000, rho = -nan(ind)
nSV = 0, nBSV = 0
Mean squared error = -1.#IND (regression)
Squared correlation coefficient = -1.#IND (regression)

The same block repeats for every fold and parameter combination.
Any idea what is going wrong?