🌐
Python Programming
pythonprogramming.net › linear-svc-example-scikit-learn-svm-python
Linear SVC Machine learning SVM example with Python
We're going to be using the SVC (support vector classifier) form of the SVM (support vector machine). Our kernel is going to be linear, and C is equal to 1.0. What is C, you ask? Don't worry about it for now, but, if you must know, C is a measure of "how badly" you want to properly classify, or fit, ...
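To make the role of C concrete, here is a minimal sketch (an editor's illustration, not code from the tutorial; the toy dataset is an assumption):

from sklearn.svm import SVC
from sklearn.datasets import make_classification

# Toy two-feature dataset standing in for the tutorial's own data
X, y = make_classification(n_samples=100, n_features=2, n_redundant=0, random_state=0)

# kernel='linear' and C=1.0 mirror the settings described above;
# a smaller C tolerates more misclassified points for a simpler boundary
clf = SVC(kernel='linear', C=1.0)
clf.fit(X, y)
print(clf.score(X, y))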
🌐
Medium
randomresearchai.medium.com › svc-model-in-python-2d7b6d9434b4
SVC Model in Python. To create a Support Vector Classifier… | by RandomResearchAI | Medium
June 18, 2023 - Create an instance of the SVC model by calling SVC(). Train the model on the training data using the fit method, which takes the features (X_train) and corresponding labels (y_train).
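A minimal sketch of those two steps, with a placeholder dataset since the article's own X_train/y_train are not shown in the snippet:

from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Placeholder split so the two steps are runnable end to end
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = SVC()                # create an instance of the SVC model
clf.fit(X_train, y_train)  # train on the features and corresponding labels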
🌐
ProgramCreek
programcreek.com › python › example › 75182 › sklearn.svm.SVC
Python Examples of sklearn.svm.SVC
import pandas as pd
from sklearn import svm
from sklearn.base import TransformerMixin
from sklearn.pipeline import Pipeline

def create_pandas_only_svm_classifier(X, y, probability=True):
    class PandasOnlyEstimator(TransformerMixin):
        def fit(self, X, y=None, **fitparams):
            return self

        def transform(self, X, **transformparams):
            dataset_is_df = isinstance(X, pd.DataFrame)
            if not dataset_is_df:
                raise Exception("Dataset must be a pandas dataframe!")
            return X

    pandas_only = PandasOnlyEstimator()
    clf = svm.SVC(gamma=0.001, C=100.0, probability=probability, random_state=777)
    pipeline = Pipeline([("pandas_only", pandas_only), ("clf", clf)])
    return pipeline.fit(X, y)
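A hedged usage sketch: the returned pipeline only accepts pandas DataFrames, so a call might look like this (assumes scikit-learn >= 0.23 for as_frame=True; not part of the original example):

from sklearn.datasets import load_iris

# Hypothetical call: the transformer requires DataFrame input
iris = load_iris(as_frame=True)
pipeline = create_pandas_only_svm_classifier(iris.data, iris.target)
print(pipeline.predict(iris.data.head()))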
🌐
Datatechnotes
datatechnotes.com › 2020 › 07 › classification-example-with-linearsvm-in-python.html
DataTechNotes: Classification Example with Linear SVC in Python
from sklearn.svm import LinearSVC
from sklearn.datasets import load_iris
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.model_selection import cross_val_score
from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report

x, y = make_classification(n_samples=5000, n_features=10, n_classes=3, n_clusters_per_class=1)
xtrain, xtest, ytrain, ytest = train_test_split(x, y, test_size=0.15)

lsvc = LinearSVC()
print(lsvc)

lsvc.fit(xtrain, ytrain)
score = lsvc.score(xtrain, ytrain)
print("Score: ", score)
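The snippet stops at the training score; a natural continuation, using the metrics it already imports, would evaluate on the held-out split (a sketch, not part of the original post):

ypred = lsvc.predict(xtest)
print(confusion_matrix(ytest, ypred))
print(classification_report(ytest, ypred))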
🌐
scikit-learn
scikit-learn.org › stable › modules › svm.html
1.4. Support Vector Machines — scikit-learn 1.8.0 documentation
>>> linear_svc = svm.SVC(kernel='linear')
>>> linear_svc.kernel
'linear'
>>> rbf_svc = svm.SVC(kernel='rbf')
>>> rbf_svc.kernel
'rbf'

See also Kernel Approximation for a solution to use RBF kernels that is much faster and more scalable. When training an SVM with the Radial Basis Function (RBF) kernel, two parameters must be considered: C and gamma. The parameter C, common to all SVM kernels, trades off misclassification of training examples against simplicity of the decision surface.
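A short sketch of that C/gamma tradeoff (the dataset and values here are illustrative assumptions, not from the docs):

from sklearn.svm import SVC
from sklearn.datasets import make_moons

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

# Larger C chases individual training points; larger gamma shrinks
# each point's radius of influence, so both push toward overfitting
for C, gamma in [(0.1, 0.1), (1.0, 1.0), (100.0, 10.0)]:
    clf = SVC(kernel='rbf', C=C, gamma=gamma).fit(X, y)
    print(C, gamma, clf.score(X, y))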
🌐
Python Data Science Handbook
jakevdp.github.io › PythonDataScienceHandbook › 05.07-support-vector-machines.html
In-Depth: Support Vector Machines | Python Data Science Handbook
We can see this, for example, if we plot the model learned from the first 60 points and first 120 points of this dataset:

def plot_svm(N=10, ax=None):
    X, y = make_blobs(n_samples=200, centers=2, random_state=0, cluster_std=0.60)
    X = X[:N]
    y = y[:N]
    model = SVC(kernel='linear', C=1E10)
    ...
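The snippet cuts off mid-function; a rough sketch of the idea it describes, fitting a linear SVC on only the first N points (imports are my assumption, and the handbook's plotting code is omitted):

from sklearn.datasets import make_blobs
from sklearn.svm import SVC

def fit_first_n(N=60):
    X, y = make_blobs(n_samples=200, centers=2, random_state=0, cluster_std=0.60)
    X, y = X[:N], y[:N]  # keep only the first N points, as in plot_svm
    return SVC(kernel='linear', C=1E10).fit(X, y)

# The fitted support vectors can differ between the 60- and 120-point models
print(fit_first_n(60).support_vectors_)
print(fit_first_n(120).support_vectors_)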
Top answer
1 of 6
63

I don't fully understand your code, but let's go through the example in the documentation page you referenced:

import numpy as np
X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])
y = np.array([1, 1, 2, 2])
from sklearn.svm import SVC
clf = SVC()
clf.fit(X, y) 

Now let's apply both decision_function() and predict() to the samples:

clf.decision_function(X)
clf.predict(X)

The output we get is:

array([[-1.00052254],
       [-1.00006594],
       [ 1.00029424],
       [ 1.00029424]])
array([1, 1, 2, 2])

And that is easy to interpret: The decision function tells us on which side of the hyperplane generated by the classifier we are (and how far we are away from it). Based on that information, the estimator then labels the examples with the corresponding label.
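To check that reading directly (an editor's sketch using the clf and X fitted above; for a binary fit, a positive decision value selects clf.classes_[1]):

import numpy as np

scores = clf.decision_function(X).ravel()
labels = clf.classes_[(scores > 0).astype(int)]
print(labels)  # matches clf.predict(X) -> [1 1 2 2]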

2 of 6
33

For those interested, I'll post a quick example of the predict function translated from C++ (here) to Python:

import math
import numpy as np

# I've only implemented the linear and rbf kernels
# (params comes from clf.get_params(), which returns a dict)
def kernel(params, sv, X):
    if params['kernel'] == 'linear':
        return [np.dot(vi, X) for vi in sv]
    elif params['kernel'] == 'rbf':
        return [math.exp(-params['gamma'] * np.dot(vi - X, vi - X)) for vi in sv]

# This replicates clf.decision_function(X)
def decision_function(params, sv, nv, a, b, X):
    # calculate the kernels
    k = kernel(params, sv, X)

    # define the start and end index for support vectors for each class
    start = [sum(nv[:i]) for i in range(len(nv))]
    end = [start[i] + nv[i] for i in range(len(nv))]

    # calculate: sum(a_p * k(x_p, x)) between every 2 classes
    c = [ sum(a[i][p] * k[p] for p in range(start[j], end[j])) +
          sum(a[j-1][p] * k[p] for p in range(start[i], end[i]))
          for i in range(len(nv)) for j in range(i+1, len(nv))]

    # add the intercept
    return [sum(x) for x in zip(c, b)]

# This replicates clf.predict(X)
def predict(params, sv, nv, a, b, cs, X):
    ''' params = model parameters
        sv = support vectors
        nv = # of support vectors per class
        a  = dual coefficients
        b  = intercepts 
        cs = list of class names
        X  = feature to predict       
    '''
    decision = decision_function(params, sv, nv, a, b, X)
    votes = [(i if decision[p] > 0 else j) for p,(i,j) in enumerate((i,j) 
                                           for i in range(len(cs))
                                           for j in range(i+1,len(cs)))]

    return cs[max(set(votes), key=votes.count)]

There are a lot of input arguments for predict and decision_function, but note that these are all used internally by the model when calling predict(X). In fact, all of the arguments are accessible to you inside the model after fitting:

# Create model
clf = svm.SVC(gamma=0.001, C=100.)

# Fit model using features, X, and labels, y.
clf.fit(X, y)

# Get parameters from model
params = clf.get_params()
sv = clf.support_vectors_ #added missing underscore
nv = clf.n_support_
#a  = clf.dual_coef_
a  = clf._dual_coef_ #use complementary dual coefficients
b  = clf._intercept_
cs = clf.classes_

# Use the functions to predict
print(predict(params, sv, nv, a, b, cs, X))

# Compare with the builtin predict
print(clf.predict(X))
🌐
Optunity
pythonhosted.org › Optunity › examples › python › sklearn › svc.html
Support vector machine classification (SVC) — Optunity 0.2.1 documentation
In this example, we will use optunity.maximize().

import optunity
import optunity.metrics
import sklearn.svm

# score function: twice iterated 10-fold cross-validated ROC AUC
@optunity.cross_validated(x=data, y=labels, num_folds=10, num_iter=2)
def svm_auc(x_train, y_train, x_test, y_test, C, gamma):
    model = sklearn.svm.SVC(C=C, gamma=gamma).fit(x_train, y_train)
    decision_values = model.decision_function(x_test)
    return optunity.metrics.roc_auc(y_test, decision_values)

# perform tuning
optimal_pars, _, _ = optunity.maximize(svm_auc, num_evals=200, C=[0, 10], gamma=[0, 1])

# train model on the full training set with tuned hyperparameters
optimal_model = sklearn.svm.SVC(**optimal_pars).fit(data, labels)
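Note that data and labels must already exist when the decorator runs; a stand-in setup (my assumption, not in the Optunity docs) could be:

from sklearn.datasets import make_classification

# Binary stand-in dataset so the tuning snippet has data/labels to work with
data, labels = make_classification(n_samples=300, n_features=10, random_state=0)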
🌐
DataCamp
datacamp.com › tutorial › svm-classification-scikit-learn-python
Scikit-learn SVM Tutorial with Python (Support Vector Machines) | DataCamp
December 27, 2019 - Let's build a support vector machine model. First, import the SVM module and create a support vector classifier object by passing kernel='linear' to the SVC() function.
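Roughly, that first step looks like this (a sketch of what the tutorial describes, not its exact code):

from sklearn import svm

# Create a support vector classifier object with the linear kernel;
# training would then follow with clf.fit(X_train, y_train)
clf = svm.SVC(kernel='linear')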
🌐
GeeksforGeeks
geeksforgeeks.org › machine learning › understanding-scikit-learns-svc-decision-function-and-predict
Understanding Scikit-Learn's SVC: Decision Function and Predict - GeeksforGeeks
July 23, 2025 -

# Calculate decision function
decision_scores = clf.decision_function(X_test)
print("Decision Scores:", decision_scores)

Output: Decision Scores: [-0.04274893 0.29143233 -0.13001369]

In this example, the decision scores provide insight into how far each test point is from the hyperplane. The predict method in SVC is used to assign a class label to each sample in the input data based on the decision scores.
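A self-contained sketch of the relationship the article describes (the setup is my assumption; the article's clf and X_test are not shown in full):

import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Stand-in binary problem with three test points, as in the output above
X, y = make_classification(n_samples=100, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=3, random_state=0)
clf = SVC(kernel='linear').fit(X_train, y_train)

# For a binary SVC, predict is equivalent to thresholding the scores at zero
scores = clf.decision_function(X_test)
print(np.array_equal(clf.classes_[(scores > 0).astype(int)], clf.predict(X_test)))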
🌐
Smith College
science.smith.edu › ~jcrouser › SDS293 › labs › lab15-py.html
Lab 15 - Support Vector Machines in Python
import pandas as pd
import numpy as np
import matplotlib as mpl
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix
%matplotlib inline

# We'll define a function to draw a nice plot of an SVM
def plot_svc(svc, X, y, h=0.02, pad=0.25):
    x_min, x_max = X[:, 0].min()-pad, ...
🌐
scikit-learn
scikit-learn.org › 0.21 › modules › generated › sklearn.svm.SVC.html
sklearn.svm.SVC — scikit-learn 0.21.3 documentation
Examples

>>> import numpy as np
>>> X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])
>>> y = np.array([1, 1, 2, 2])
>>> from sklearn.svm import SVC
>>> clf = SVC(gamma='auto')
>>> clf.fit(X, y)
SVC(C=1.0, cache_size=200, class_weight=None, coef0=0.0,
    decision_function_shape='ovr', degree=3, gamma='auto', kernel='rbf',
    max_iter=-1, probability=False, random_state=None, shrinking=True,
    tol=0.001, verbose=False)
>>> print(clf.predict([[-0.8, -1]]))
[1]
🌐
MyScale
myscale.com › blog › understanding-support-vector-machines-scikit-learn-svc
Mastering scikit learn SVC: A Comprehensive Guide
A comprehensive guide to implementing Support Vector Machines in Python with scikit-learn's SVC.
🌐
scikit-learn
ogrisel.github.io › scikit-learn.org › sklearn-tutorial › modules › generated › sklearn.svm.SVC.html
8.26.1.1. sklearn.svm.SVC — scikit-learn 0.11-git documentation
Examples

>>> import numpy as np
>>> X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])
>>> y = np.array([1, 1, 2, 2])
>>> from sklearn.svm import SVC
>>> clf = SVC()
>>> clf.fit(X, y)
SVC(C=1.0, cache_size=200, class_weight=None, coef0=0.0, degree=3,
    gamma=0.5, kernel='rbf', probability=False, scale_C=True,
    shrinking=True, tol=0.001)
>>> print clf.predict([[-0.8, -1]])
[ 1.]
🌐
TutorialsPoint
tutorialspoint.com › scikit_learn › scikit_learn_support_vector_machines.htm
Scikit Learn - Support Vector Machines
The rest of the parameters and attributes are similar to those we used in SVC. The following Python script uses the sklearn.svm.SVR class:
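The script itself is cut off in this snippet; a minimal SVR sketch along those lines (my example, not TutorialsPoint's) might be:

import numpy as np
from sklearn.svm import SVR

# Toy regression data: a noise-free sine curve
X = np.sort(5 * np.random.RandomState(0).rand(40, 1), axis=0)
y = np.sin(X).ravel()

# SVR shares most parameters with SVC (kernel, C, gamma, ...)
regr = SVR(kernel='rbf', C=100, gamma=0.1, epsilon=0.1)
regr.fit(X, y)
print(regr.predict(X[:5]))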
🌐
Codecademy
codecademy.com › docs › python:sklearn › support vector machines
Python:Sklearn | Support Vector Machines | Codecademy
October 17, 2024 - This codebyte example demonstrates the use of a Support Vector Classifier (SVC) with a linear kernel on a synthetic two-class dataset and predicts the class of a new data point: ...
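A sketch in the spirit of that codebyte (the actual Codecademy example is not shown in the snippet):

from sklearn.svm import SVC
from sklearn.datasets import make_blobs

# Synthetic two-class dataset
X, y = make_blobs(n_samples=100, centers=2, random_state=0)
clf = SVC(kernel='linear').fit(X, y)

# Predict the class of a new data point
print(clf.predict([[1.0, 2.0]]))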