Part 1: Simplified Mathematical Proof of SVM
Imagine you have two kinds of points on a 2D graph: one class labeled +1 and another labeled -1.
The goal of SVM is to draw the best straight line that:
Separates the two classes
Is as far away as possible from the closest points
This line is called the decision boundary, and the closest points are called support vectors.
Step 1: Define the Line
We define the separating line (or hyperplane) as:
$W^{T}X + b = 0$
Where:
$W$ is the weight vector (slope of the line)
$b$ is the bias (how far the line is from the origin)
$X$ is the input point (like (2, 3), etc.)
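To make the notation concrete, here is how $W^{T}X + b$ is evaluated for a single point (the values of $W$, $b$, and $X$ below are arbitrary, chosen only for illustration):

```python
import numpy as np

# Hypothetical values, just to show how the line equation is evaluated
W = np.array([1.0, -1.0])   # weight vector
b = 0.0                     # bias
X = np.array([2.0, 3.0])    # an input point (2, 3)

score = W @ X + b           # W^T X + b = 2*1 + 3*(-1) + 0
print(score)                # -1.0
```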
Step 2: Set the Condition for Correct Classification
We want:
If $y = +1$, the point is above the line:
$W^{T}X + b \ge 1$
If $y = -1$, the point is below the line:
$W^{T}X + b \le -1$
Combine both:
$y_{i}\left( W^{T}x_{i} + b \right) \ge 1$
Step 3: Maximize the Margin
The margin is the distance from the line to the closest point. SVM wants to maximize this margin. The margin is:
$\text{Margin} = \frac{2}{\left\| W \right\|}$
To maximize this, we minimize:
$\frac{1}{2}\left\| W \right\|^{2}$
Subject to:
$y_{i}\left( W^{T}x_{i} + b \right) \ge 1$
That's the core idea of SVM!
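To see the margin formula in action, here is a tiny numeric sketch (the weight vector below is an arbitrary, hypothetical value):

```python
import numpy as np

# Hypothetical weight vector, just to illustrate Margin = 2 / ||W||
W = np.array([3.0, 4.0])          # ||W|| = sqrt(9 + 16) = 5
margin = 2 / np.linalg.norm(W)    # 2 / 5
print(margin)                     # 0.4
```

A smaller $\|W\|$ gives a larger margin, which is why minimizing $\frac{1}{2}\|W\|^{2}$ maximizes the margin.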
Part 2: Simple Numerical Example (Step by Step)
We want to draw a line that separates the +1 and -1 classes.
Step 1: Assume a Solution (try values for $W$ and $b$)
Let's guess a line:
$W = (1, -1), \quad b = 0$
So the equation of the line is:
$x_{1} - x_{2} = 0 \;\; \text{(or)} \;\; x_{2} = x_{1}$
Step 2: Plug into the condition $y_{i}\left( W^{T}x_{i} + b \right) \ge 1$
Check all points:
So our guess was wrong.
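This check can be reproduced in code. Since the original data points are not listed in the text, the dataset below is a hypothetical stand-in, chosen only to illustrate the constraint test:

```python
import numpy as np

# Hypothetical dataset (the post's exact points are not shown):
# two +1 points and two -1 points
X = np.array([[2, 1], [3, 2], [1, 1], [0, 1]], dtype=float)
y = np.array([+1, +1, -1, -1])

# First guess: W = (1, -1), b = 0
W, b = np.array([1.0, -1.0]), 0.0

# The constraint y_i (W^T x_i + b) >= 1 must hold for every point
margins = y * (X @ W + b)
print(margins)               # any value below 1 violates the constraint
print(np.all(margins >= 1))  # False -> the guess is wrong
```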
Step 3: Try Better Line
Try:
$W = (1, 1), \quad b = -3$
So the line is:
$x_{1} + x_{2} = 3$
Check:
So only point A fails. Almost correct!
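The second check can be scripted the same way. Again, the exact points are not shown in the text, so the dataset below is a hypothetical stand-in:

```python
import numpy as np

# Same hypothetical dataset as before (the post's exact points are not shown)
X = np.array([[2, 1], [3, 2], [1, 1], [0, 1]], dtype=float)
y = np.array([+1, +1, -1, -1])

# Better guess: W = (1, 1), b = -3, i.e. the line x1 + x2 = 3
W, b = np.array([1.0, 1.0]), -3.0

margins = y * (X @ W + b)
print(margins)              # margins below 1 violate the constraint
print(np.sum(margins < 1))  # how many points still fail
```

With this hypothetical data, exactly one point violates the constraint, mirroring the "only point A fails" situation in the text.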
Step-by-Step Python Code: SVM Example
Let's walk through a simple linear Support Vector Machine (SVM) example using Python, NumPy, and scikit-learn, step by step, based on the math we discussed.
We’ll:
Train a simple linear SVM using sklearn.
Show the decision boundary and support vectors.
Step 1: Import Libraries
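A minimal set of imports that covers the steps below (NumPy for arrays, Matplotlib for plotting, and scikit-learn's SVC for the linear SVM):

```python
import numpy as np                # arrays and linear algebra
import matplotlib.pyplot as plt   # plotting the boundary and margins
from sklearn.svm import SVC       # support vector classifier
```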
Step 2: Define a Small Dataset
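Since the post does not list the exact points, here is a small hypothetical, linearly separable dataset that works for this example (the same stand-in used in Part 2):

```python
import numpy as np

# Hypothetical toy dataset: two +1 points and two -1 points,
# separable by a straight line
X = np.array([[2, 1], [3, 2], [1, 1], [0, 1]], dtype=float)
y = np.array([+1, +1, -1, -1])
```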
Step 3: Train the SVM Model
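A minimal training sketch, using the hypothetical dataset above; a large C approximates the hard-margin formulation from Part 1:

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical dataset from the previous step
X = np.array([[2, 1], [3, 2], [1, 1], [0, 1]], dtype=float)
y = np.array([+1, +1, -1, -1])

# Linear kernel; large C penalizes margin violations heavily,
# approximating a hard-margin SVM
svm = SVC(kernel="linear", C=1e6)
svm.fit(X, y)

print(svm.coef_)             # learned weight vector W
print(svm.intercept_)        # learned bias b
print(svm.support_vectors_)  # the points that define the margin
```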
Step 4: Plot the Data, Decision Boundary, and Support Vectors
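One possible plotting sketch, again with the hypothetical dataset; scikit-learn's decision_function gives the signed value $W^{T}X + b$ used to draw the boundary (level 0) and the margins (levels ±1):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.svm import SVC

# Hypothetical dataset and model from the previous steps
X = np.array([[2, 1], [3, 2], [1, 1], [0, 1]], dtype=float)
y = np.array([+1, +1, -1, -1])
svm = SVC(kernel="linear", C=1e6).fit(X, y)

# Scatter the two classes
plt.scatter(X[:, 0], X[:, 1], c=y, cmap="bwr")

# Evaluate W^T X + b on a grid
xx, yy = np.meshgrid(np.linspace(-1, 4, 200), np.linspace(-1, 4, 200))
Z = svm.decision_function(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

# Solid line: hyperplane (level 0); dashed lines: margins (levels -1 and +1)
plt.contour(xx, yy, Z, levels=[-1, 0, 1],
            colors="k", linestyles=["--", "-", "--"])

# Circle the support vectors found by the solver
plt.scatter(svm.support_vectors_[:, 0], svm.support_vectors_[:, 1],
            s=120, facecolors="none", edgecolors="k")
plt.show()
```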
svm.support_vectors_: The actual support vectors found by the algorithm.
svm.coef_: The learned weight vector $W$.
svm.intercept_: The learned bias $b$.
The dashed lines are the margins (distance from the decision boundary).
The solid black line is the separating hyperplane.
Step 5: Print the Model Parameters
Add this code after training the SVM :
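The original snippet for this step is not shown; a minimal sketch that prints $W$, $b$, and the resulting margin (using the hypothetical dataset from the earlier steps) could look like this:

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical dataset and trained model from the earlier steps
X = np.array([[2, 1], [3, 2], [1, 1], [0, 1]], dtype=float)
y = np.array([+1, +1, -1, -1])
svm = SVC(kernel="linear", C=1e6).fit(X, y)

W = svm.coef_[0]        # learned weight vector
b = svm.intercept_[0]   # learned bias
print("W =", W)
print("b =", b)
print("Margin = 2 / ||W|| =", 2 / np.linalg.norm(W))
```

This ties the fitted model back to the math in Part 1: the printed margin is exactly $\frac{2}{\|W\|}$.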