Solving the SVM problem by inspection

By inspection we can see that the decision boundary is the line $x_2 = x_1 - 3$. Using the formula $w^T x + b = 0$ we can obtain a first guess of the parameters as

$$w = \begin{bmatrix} 1 \\ -1 \end{bmatrix}, \quad b = -3$$

Using these values we would obtain the following width between the support vectors: $\frac{2}{\|w\|} = \frac{2}{\sqrt{2}} = \sqrt{2}$. Again by inspection we see that the width between the support vectors is in fact $4\sqrt{2}$, meaning that these values are incorrect.
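As a quick numeric check (a minimal numpy sketch; the six points are the training data listed in the scikit-learn snippet at the end of this post):

import numpy as np

# Signed distance of each point to the line x1 - x2 - 3 = 0.
X = np.array([[3, 4], [1, 4], [2, 3], [6, -1], [7, -1], [5, -3]])
d = (X[:, 0] - X[:, 1] - 3) / np.sqrt(2)
print(np.abs(d))            # the three smallest distances (2*sqrt(2)) mark the support vectors
print(2 * np.abs(d).min())  # margin width: 5.657 ≈ 4*sqrt(2)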

Recall that scaling the boundary equation by a factor of $c$ does not change the boundary line, hence we can generalize the equation as

$$cx_1 - cx_2 - 3c = 0$$

Plugging this back into the equation for the width, with $w = \begin{bmatrix} c \\ -c \end{bmatrix}$ and hence $\|w\| = \sqrt{2}\,c$, we get

$$\begin{aligned} \frac{2}{\|w\|} &= 4\sqrt{2} \\ \frac{2}{\sqrt{2}\,c} &= 4\sqrt{2} \\ c &= \frac{1}{4} \end{aligned}$$

Hence the parameters are in fact

$$w = \begin{bmatrix} 1/4 \\ -1/4 \end{bmatrix}, \quad b = -\frac{3}{4}$$
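(Check: $\|w\| = \sqrt{(1/4)^2 + (1/4)^2} = \frac{\sqrt{2}}{4}$, so $\frac{2}{\|w\|} = \frac{8}{\sqrt{2}} = 4\sqrt{2}$, matching the observed width.)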

To find the values of $\alpha_i$ we can use the following two constraints, which come from the dual problem:

$$w = \sum_i \alpha_i y_i x_i, \qquad \sum_i \alpha_i y_i = 0$$

And using the fact that $\alpha_i$ can be nonzero for support vectors only (i.e. the 3 support vectors here: $s_1 = (6,-1)$ with $y_1 = +1$, and $s_2 = (2,3)$, $s_3 = (3,4)$ with $y_2 = y_3 = -1$), we obtain the system of simultaneous linear equations:

$$\begin{aligned} \begin{bmatrix} 6\alpha_1 - 2\alpha_2 - 3\alpha_3 \\ -\alpha_1 - 3\alpha_2 - 4\alpha_3 \\ \alpha_1 - \alpha_2 - \alpha_3 \end{bmatrix} &= \begin{bmatrix} 1/4 \\ -1/4 \\ 0 \end{bmatrix} \\ \alpha &= \begin{bmatrix} 1/16 \\ 1/16 \\ 0 \end{bmatrix} \end{aligned}$$
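The same system can also be solved mechanically; a small numpy sketch (the matrix rows are exactly the three equations above):

import numpy as np

# Coefficients of [alpha_1, alpha_2, alpha_3] in the three equations.
A = np.array([[ 6, -2, -3],
              [-1, -3, -4],
              [ 1, -1, -1]], dtype=float)
b = np.array([1/4, -1/4, 0])
print(np.linalg.solve(A, b))  # [0.0625 0.0625 0.] i.e. [1/16, 1/16, 0]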

Source

  • https://ai6034.mit.edu/wiki/images/SVM_and_Boosting.pdf
  • Full post: https://xavierbourretsicotte.github.io/SVM_by_hand.html (answer by Xavier Bourret Sicotte on Stack Exchange)

A second answer to the same question offers an alternative derivation of $c$:
Instead of computing the width between the support vectors (which in this case was easy because two of them happened to be directly across from each other over the decision line), it might be more convenient to use the fact that the support vectors take the values $\pm 1$ under the decision function:

$$cx_1 - cx_2 - 3c = 0$$

represents the line, but using the point $(6,-1)$ with target $+1$ in the diagram, we should have

$$6c - (-1)c - 3c = 4c = 1$$

and hence (again) $c = \frac{1}{4}$.
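A short check of this for all three support vectors (a sketch in Python; the points are read off the data as above):

c = 1/4
f = lambda x1, x2: c * x1 - c * x2 - 3 * c  # the decision function above
print(f(6, -1))          # +1.0: positive-class support vector
print(f(2, 3), f(3, 4))  # -1.0 -1.0: negative-class support vectors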

The full post (Xavier Bourret Sicotte, "Support Vector Machine: calculate coefficients manually", Data Blog, June 25, 2018) verifies the hand computation with scikit-learn:

from sklearn.svm import SVC
import numpy as np

X = np.array([[3, 4], [1, 4], [2, 3], [6, -1], [7, -1], [5, -3]])
y = np.array([-1, -1, -1, 1, 1, 1])

# A very large C approximates the hard-margin problem solved above.
clf = SVC(C=1e5, kernel='linear')
clf.fit(X, y)

print('w = ', clf.coef_)
print('b = ', clf.intercept_)
print('Indices of support vectors = ', clf.support_)
print('Support vectors = ', clf.support_vectors_)
print('Number of support vectors for each class = ', clf.n_support_)
print('Coefficients of the support vectors in the decision function = ', np.abs(clf.dual_coef_))