Unfortunately there seems to be no way to do that directly. LinearSVC calls liblinear (see the relevant code) but doesn't retrieve the support vectors, only the coefficients and the intercept.
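
That said, the support vectors can be recovered indirectly: with the standard hinge loss, they are exactly the training points on or inside the margin, i.e. with |decision value| <= 1 (this is the approach taken in scikit-learn's plot_linearsvc_support_vectors example). A minimal sketch along those lines:

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import LinearSVC

X, y = make_blobs(n_samples=40, centers=2, random_state=0)
# "hinge" is the standard SVM loss; the margin reasoning below relies on it
clf = LinearSVC(C=1, loss="hinge", random_state=42).fit(X, y)

# support vectors lie on or inside the margin: |decision value| <= 1
decision = clf.decision_function(X)
support_vector_indices = np.where(np.abs(decision) <= 1 + 1e-15)[0]
print(X[support_vector_indices])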

One alternative would be to use SVC with the 'linear' kernel (libsvm-based rather than liblinear-based); the poly, rbf, and sigmoid kernels expose the support vectors as well:

from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]

clf = svm.SVC(kernel='linear')
clf.fit(X, y)
print(clf.support_vectors_)

Output:

[[ 0.  0.]
 [ 1.  1.]]
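
A fitted SVC also exposes the indices of the support vectors and their dual coefficients, which is handy for mapping back to the original training rows:

print(clf.support_)      # indices of the support vectors in X
print(clf.n_support_)    # number of support vectors per class
print(clf.dual_coef_)    # y_i * alpha_i for each support vector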

liblinear scales better to a large number of samples, but otherwise the two libraries are mostly equivalent.
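
For completeness, a rough side-by-side of the two estimators (a sketch; they optimize slightly different formulations, so the fitted models are not numerically identical):

from sklearn.datasets import make_blobs
from sklearn.svm import SVC, LinearSVC

X, y = make_blobs(n_samples=200, centers=2, random_state=0)
svc = SVC(kernel='linear', C=1).fit(X, y)   # libsvm: stores support vectors
lin = LinearSVC(C=1).fit(X, y)              # liblinear: only w and b
print(svc.support_vectors_.shape)           # (n_SV, n_features)
print(lin.coef_, lin.intercept_)            # primal weight vector and bias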

Answer from elyase on Stack Overflow
Answer 1 of 2

You can get the support vectors using clf.support_vectors_.

Plotting the support vectors:

import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm

# we create 40 separable points
np.random.seed(0)
X = np.r_[np.random.randn(20, 2) - [2, 2], np.random.randn(20, 2) + [2, 2]]
Y = [0] * 20 + [1] * 20

# fit the model
clf = svm.SVC(kernel='linear', C=1)
clf.fit(X, Y)

# get the separating hyperplane
w = clf.coef_[0]
a = -w[0] / w[1]
xx = np.linspace(-5, 5)
yy = a * xx - (clf.intercept_[0]) / w[1]


# the perpendicular distance from the hyperplane to each margin line is 1/||w||
margin = 1 / np.sqrt(np.sum(clf.coef_ ** 2))
# sqrt(1 + a^2) converts that perpendicular distance into a vertical offset
yy_down = yy - np.sqrt(1 + a ** 2) * margin
yy_up = yy + np.sqrt(1 + a ** 2) * margin

plt.figure(1, figsize=(4, 3))
plt.clf()
plt.plot(xx, yy, 'k-')
plt.plot(xx, yy_down, 'k--')
plt.plot(xx, yy_up, 'k--')

plt.scatter(clf.support_vectors_[:, 0], clf.support_vectors_[:, 1], s=80,
            facecolors='none', zorder=10, edgecolors='k')
plt.scatter(X[:, 0], X[:, 1], c=Y, zorder=10, cmap=plt.cm.Paired,
            edgecolors='k')

plt.axis('tight')
x_min = -4.8
x_max = 4.2
y_min = -6
y_max = 6

XX, YY = np.mgrid[x_min:x_max:200j, y_min:y_max:200j]
Z = clf.predict(np.c_[XX.ravel(), YY.ravel()])

# Put the result into a color plot
Z = Z.reshape(XX.shape)
plt.figure(1, figsize=(4, 3))
plt.pcolormesh(XX, YY, Z, cmap=plt.cm.Paired)

plt.xlim(x_min, x_max)
plt.ylim(y_min, y_max)

plt.xticks(())
plt.yticks(())

plt.show()

Answer 2 of 2

Let me assume we are talking about liblinear itself rather than the sklearn wrapper.

The answer can be found in the LIBLINEAR FAQ. In short, you can't. You need to modify the source code.

Q: How could I know which training instances are support vectors?

Some LIBLINEAR solvers consider the primal problem, so support vectors are not obtained during the training procedure. For dual solvers, we output only the primal weight vector w, so support vectors are not stored in the model. This is different from LIBSVM.

To know support vectors, you can modify the following loop in solve_l2r_l1l2_svc() of linear.cpp to print out indices:

    for(i=0; i<l; i++)
    {
        v += alpha[i]*(alpha[i]*diag[GETI(i)] - 2);
        if(alpha[i] > 0)
            ++nSV;
    }

Note that we group data in the same class together before calling this subroutine. Thus the order of your training instances has been changed. You can sort your data (e.g., positive instances before negative ones) before using liblinear. Then indices will be the same.
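
A sketch of that pre-sort (assuming a numpy label vector y in {+1, -1}):

import numpy as np

# place positive instances before negative ones so liblinear's internal
# grouping by class leaves the training indices unchanged
order = np.argsort(-y, kind='stable')
X_sorted, y_sorted = X[order], y[order]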

Answer 1 of 3

Solving the SVM problem by inspection

By inspection we can see that the decision boundary is the line $x_2 = x_1 - 3$. Using the formula $w^T x + b = 0$ we can obtain a first guess of the parameters as

$$ w = [1,-1] \ \ b = -3$$

Using these values we would obtain a width between the support vectors of $\frac{2}{||w||} = \frac{2}{\sqrt{2}} = \sqrt{2}$. Again by inspection we see that the width between the support vectors is in fact $4 \sqrt{2}$, meaning that these values are incorrect.

Recall that scaling the boundary by a factor of $c$ does not change the boundary line, hence we can generalize the equation as

$$ cx_1 - cx_2 - 3c = 0$$ $$ w = [c,-c] \ \ b = -3c$$

Plugging back into the equation for the width we get

\begin{aligned} \frac{2}{||w||} &= 4 \sqrt{2} \\ \frac{2}{\sqrt{2}\,c} &= 4 \sqrt{2} \\ c &= \frac{1}{4} \end{aligned}

Hence the parameters are in fact $$ w = [\frac{1}{4},-\frac{1}{4}] \ \ b = -\frac{3}{4}$$
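
A quick numerical check of these parameters (a minimal numpy sketch):

import numpy as np

w = np.array([0.25, -0.25])
print(2 / np.linalg.norm(w))  # margin width: 5.657 ≈ 4 * sqrt(2)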

To find the values of $\alpha_i$ we can use the following two constraints which come from the dual problem:

$$ w = \sum_i^m \alpha_i y^{(i)} x^{(i)} $$ $$\sum_i^m \alpha_i y^{(i)} = 0 $$

And using the fact that $\alpha_i > 0$ only for the support vectors (the 3 vectors in this case; $\alpha_i = 0$ for all other points) we obtain the system of simultaneous linear equations: \begin{aligned} \begin{bmatrix} 6 \alpha_1 - 2 \alpha_2 - 3 \alpha_3 \\ - \alpha_1 - 3 \alpha_2 - 4 \alpha_3 \\ \alpha_1 - \alpha_2 - \alpha_3 \end{bmatrix} & = \begin{bmatrix} 1/4 \\ -1/4 \\ 0 \end{bmatrix} \\ \alpha & = \begin{bmatrix} 1/16 \\ 1/16 \\ 0 \end{bmatrix} \end{aligned}
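
As a sanity check, this system can be solved numerically (a quick numpy sketch):

import numpy as np

# rows: the two components of w = sum_i alpha_i y_i x_i,
# plus the constraint sum_i alpha_i y_i = 0
A = np.array([[ 6.0, -2.0, -3.0],
              [-1.0, -3.0, -4.0],
              [ 1.0, -1.0, -1.0]])
rhs = np.array([0.25, -0.25, 0.0])
print(np.linalg.solve(A, rhs))  # [0.0625 0.0625 0.] = [1/16, 1/16, 0]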

Source

  • https://ai6034.mit.edu/wiki/images/SVM_and_Boosting.pdf
Answer 2 of 3

Instead of computing the width between the support vectors (which in this case was easy because two of them happened to be directly across from each other over the decision line), it might be more convenient to use the fact that the support vectors should have value $\pm 1$ under the decision function:

$$ cx_1 - cx_2 -3c =0 $$

represents the line, but using the point $B=(2,3)$ with target $-1$ in the diagram, we should have

$$ c(2) - c(3) -3c =-1$$

and hence (again) $c=1/4$.
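
The same check in two lines of Python:

c = 0.25
print(c * 2 - c * 3 - 3 * c)  # -1.0, matching the target of B = (2, 3)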
