File Exchange

Support Vector Machine

version 1.0.0.0 (204 KB) by Bhartendu

SVM (Linearly Separable Data) using a linear kernel with gradient ascent

Updated 28 May 2017

Refer: An Introduction to Support Vector Machines and Other Kernel-based Learning Methods by Nello Cristianini and John Shawe-Taylor.
In this demo: training or cross-validation of a support vector machine (SVM) model for two-class (binary) classification on a low-dimensional data set.

The training algorithm depends on the data only through dot products in H, i.e. through functions of the form Φ(x_i)·Φ(x_j). Now if there were a “kernel function” K such that
K(x_i,x_j) = Φ(x_i)·Φ(x_j),
we would only need to use K in the training algorithm, and would never need to know Φ explicitly. One example is the radial basis function (RBF) or Gaussian kernel, for which H is infinite-dimensional, so it would not be easy to work with Φ explicitly.
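As a minimal sketch (the data points and the width sigma below are illustrative, not taken from this submission), the Gram matrix for a Gaussian (RBF) kernel can be computed directly from the data, without ever forming Φ:

```
% Sketch: Gram matrix K(x_i,x_j) = exp(-||x_i - x_j||^2 / (2*sigma^2))
X = [1 2; 2 3; 3 3];        % three illustrative points in 2-D
sigma = 1;                  % kernel width (a free parameter)
n = size(X,1);
K = zeros(n);
for i = 1:n
    for j = 1:n
        d = X(i,:) - X(j,:);
        K(i,j) = exp(-(d*d') / (2*sigma^2));
    end
end
% K is all the training algorithm needs; Phi is never formed explicitly.
```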

Training the model requires the choice of:
• the kernel function, which determines the shape of the decision surface
• the parameters of the kernel function (e.g. the variance of the Gaussian for a Gaussian kernel, or the degree of the polynomial for a polynomial kernel)
• the regularization parameter λ.
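Given those choices, gradient ascent on the SVM dual objective can be sketched as below. This is a hedged sketch: the step size eta, the box constraint C (which plays the role of the regularization parameter), and the iteration count are illustrative names, not taken from this submission's code.

```
% Maximize  sum(alpha) - 0.5 * alpha' * (K .* (Y*Y')) * alpha
% subject to 0 <= alpha_i <= C, by projected gradient ascent.
% K is the n-by-n Gram matrix, Y an n-by-1 vector of labels in {-1,+1}.
function alpha = dual_gradient_ascent(K, Y, C, eta, num_iter)
    n = numel(Y);
    alpha = zeros(n,1);
    for t = 1:num_iter
        grad  = 1 - (K .* (Y*Y')) * alpha;  % gradient of the dual objective
        alpha = alpha + eta * grad;         % ascent step
        alpha = min(max(alpha, 0), C);      % project back onto the box [0,C]
    end
end
```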

Related Examples:

2. SVM using various kernels
https://in.mathworks.com/matlabcentral/fileexchange/63033-svm-using-various-kernels

3. SVM for nonlinear classification
https://in.mathworks.com/matlabcentral/fileexchange/63024-svm-for-nonlinear-classification

Amaresh Singh

Mohamed Farchi

Bhartendu

@Matthys: Holdout is one method of CV (cross-validation) partitioning — a fraction of the data is held out for testing and the rest is used for training.
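As an illustration of the idea (the size n and holdout ratio below are arbitrary, not from this submission), MATLAB's cvpartition can build such a split:

```
% Hold out 30% of 100 observations for testing.
n = 100;
c = cvpartition(n, 'HoldOut', 0.3);
trainIdx = training(c);   % logical index of training rows
testIdx  = test(c);       % logical index of held-out rows
% The model is fit on X(trainIdx,:) and evaluated on X(testIdx,:).
```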

joachim Matthys

Can someone tell me what the purpose of the holdout function is?

fatima farooq

Inturi srivani

ammar noori

Dear Bhartendu,
Do you have documentation that describes your work? Your input is highly appreciated.

BR

GANESH SINGARAJU

Bhartendu

Thanks @Yeonjong, the two errors are probably due to a mismatch of MATLAB versions.

Yeonjong

I ran into two errors while running this code.
For me, the following changes work very well.

1. Weight vector
------------------------------------------------
w1=(alp_old.*Y).*X; ==> w1=(alp_old.*Y)'*X;
w2=(alpha.*Y).*X; ==> w2=(alpha.*Y)'*X;
------------------------------------------------

2. Plotting
------------------------------------------------
syms x
fn=vpa((-bias-W(1)*x)/W(2),4);
fplot(fn,'Linewidth',2);
fn1=vpa((1-bias-W(1)*x)/W(2),4);
fplot(fn1,'--');
fn2=vpa((-1-bias-W(1)*x)/W(2),4);
fplot(fn2,'--');
------------------------------------------------
I changed to the following and it works for me.
------------------------------------------------
xItv = linspace(-5,5,1000);
fn = @(x) vpa((-bias-W(1)*x)/W(2),4);
plot(xItv,fn(xItv),'Linewidth',2);
fn1 = @(x) vpa((1-bias-W(1)*x)/W(2),4);
plot(xItv,fn1(xItv),'--');
fn2 = @(x) vpa((-1-bias-W(1)*x)/W(2),4);
plot(xItv,fn2(xItv),'--');

John Martin

Jairo Fernando Gudiño

earth science learner

Bhartendu

What is the reason for your poor rating, nhat truong?

nhat truong

sagar kumar dash

MATLAB Release Compatibility
Created with R2015a
Compatible with any release
Platform Compatibility
Windows macOS Linux