๐ŸŒ
scikit-learn
scikit-learn.org โ€บ stable โ€บ auto_examples โ€บ svm โ€บ plot_rbf_parameters.html
RBF SVM parameters โ€” scikit-learn 1.8.0 documentation
This example illustrates the effect of the parameters gamma and C of the Radial Basis Function (RBF) kernel SVM. Intuitively, the gamma parameter defines how far the influence of a single training ...
machine learning kernel function
In machine learning, the radial basis function kernel, or RBF kernel, is a popular kernel function used in various kernelized learning algorithms. In particular, it is commonly used in support vector machine โ€ฆ Wikipedia
๐ŸŒ
Wikipedia
en.wikipedia.org โ€บ wiki โ€บ Radial_basis_function_kernel
Radial basis function kernel - Wikipedia
3 weeks ago - Because support vector machines and other models employing the kernel trick do not scale well to large numbers of training samples or large numbers of features in the input space, several approximations to the RBF kernel (and similar kernels) have been introduced.
๐ŸŒ
Quark Machine Learning
quarkml.com โ€บ home โ€บ data science โ€บ machine learning
The RBF kernel in SVM: A Complete Guide - Quark Machine Learning
April 6, 2025 - Now let's see the RBF kernel in action! For that, we need a dataset that is non-linearly separable, which can be created using the Scikit-Learn make_circles dataset. ... Now let's plot the dataset to see its distribution. ... Now let's try to fit this data to a linear SVM to check the accuracy of predictions.
๐ŸŒ
ScienceDirect
sciencedirect.com โ€บ science โ€บ article โ€บ abs โ€บ pii โ€บ S0016003221006025
Random radial basis function kernel-based support vector machine - ScienceDirect
October 21, 2021 - For example, orthogonal polynomial kernels were plugged into SVMs, enabling these SVMs to find a smaller number of support vectors and generalize well to unseen data [4], [5], [6], [7]. The work in [4] proposed a new Chebyshev kernel constructed ...
๐ŸŒ
DZone
dzone.com โ€บ data engineering โ€บ ai/ml โ€บ svm rbf kernel parameters with code examples
SVM RBF Kernel Parameters With Code Examples
July 28, 2020 - In this post, you will learn about SVM RBF (Radial Basis Function) kernel hyperparameters with the python code example.
๐ŸŒ
Towards Data Science
towardsdatascience.com โ€บ home โ€บ latest โ€บ radial basis function (rbf) kernel: the go-to kernel
Radial Basis Function (RBF) Kernel: The Go-To Kernel | Towards Data Science
January 21, 2025 - Fig 6: RBF Kernel SVM for Iris Dataset [Image Credits: https://scikit-learn.org/] From the figure, we can see that as ฮณ increases, i.e. ฯƒ reduces, the model tends to overfit for a given value of C. Finding the right ฮณ or ฯƒ along with the value of C is essential in order to achieve the best Bias-Variance Trade off. ... Scikit-Learn Implementation of SVM: https://scikit-learn.org/stable/auto_examples/svm/plot_rbf_parameters.html
Top answer

(So an RBF is the correct choice?)

It depends. RBF is a very simple, generic kernel that might work, but there are dozens of others. Take a look, for example, at the ones included in pykernels: https://github.com/gmum/pykernels

When the SVM is trained it will plot a hyperplane (which I think is like a plane in 3D but with more dimensions?) that best separates the data.

Let's avoid some common confusions. Nothing is plotted here. The SVM looks for a hyperplane defined by v (a normal vector) and b (a bias, the distance from the origin), which is simply the set of points x such that <v, x> = b. In 2D a hyperplane is a line, in 3D it is a plane, and in (d+1) dimensions it is a d-dimensional object, always one dimension lower than the space itself (a line is 1D, a plane is 2D).
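In code, checking which side of such a hyperplane a point falls on is a single dot product (a minimal sketch; the vector v, bias b, and test points are made up for illustration):

```python
import numpy as np

# A hyperplane in R^d is the set {x : <v, x> = b}. Classification just
# checks which side of it a point falls on.
v = np.array([1.0, -2.0, 0.5])   # normal vector (hypothetical values)
b = 1.0                          # bias

def side(x):
    """Return +1 or -1 for the side of the hyperplane, 0 if exactly on it."""
    return int(np.sign(np.dot(v, x) - b))

print(side(np.array([3.0, 0.0, 0.0])))   # <v, x> = 3.0  > b -> 1
print(side(np.array([0.0, 1.0, 0.0])))   # <v, x> = -2.0 < b -> -1
```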

When tuning, changing the value of gamma changes the surface of the hyperplane (also called the decision boundary?).

Now this is a common mistake. The decision boundary is not the hyperplane; it is the projection of the hyperplane onto the input space. You cannot observe the actual hyperplane, as it often lives in a space of very high dimension; you can express it as a functional equation, but nothing more. The decision boundary, on the other hand, "lives" in your input space, and if the input is low-dimensional you can even plot this object. But it is not a hyperplane: it is just the way the hyperplane intersects your input space. This is why the decision boundary is often curved or even discontinuous, even though the hyperplane itself is always linear and continuous: you are just seeing a nonlinear section through it. So what is gamma doing? The RBF kernel leads to optimization in a space of continuous functions, and there are plenty of these (a continuum of such objects). However, the SVM can express only a tiny fraction of them: linear combinations of kernel values at the training points. Fixing a particular gamma limits the set of functions under consideration: the bigger the gamma, the narrower the kernels, so the functions considered are linear combinations of such "spiky" distributions. So gamma itself does not change the surface; it changes the space of hypotheses under consideration.
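A quick way to see this hypothesis-space effect is to fit the same data with a small and a large gamma and compare how tightly each model fits its training set (a minimal sketch; the dataset and parameter values are illustrative, not from the discussion above):

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Same data, two values of gamma: a larger gamma means narrower kernels,
# i.e. a hypothesis space built from "spikier" functions that can hug
# the training points much more tightly.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

train_acc = {}
for gamma in (0.1, 100.0):
    clf = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)
    train_acc[gamma] = clf.score(X, y)  # accuracy on the training set
    print(f"gamma={gamma}: train accuracy={train_acc[gamma]:.2f}, "
          f"support vectors={clf.n_support_.sum()}")
```

The tighter fit of the large-gamma model to its own training data is exactly the overfitting risk discussed here; held-out accuracy would typically tell a different story.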

So an increase in the value of gamma results in a Gaussian which is narrower. Is this like saying that the bumps on the plane (if plotted in 3D) are allowed to be narrower to fit the training data better? Or, in 2D, is this like saying gamma defines how bendy the line that separates the data can be?

I think I answered this with the previous point: high gamma means that you only consider hyperplanes of the form

<v, x> - b = SUM_i alpha_i K_gamma(x_i, x) - b

where K_gamma(x_i, x) = exp(-gamma ||x_i - x||^2), so you get very "spiky" elements in your basis. This leads to a very tight fit to your training data. The exact shape of the decision boundary is hard to predict, as it depends on the optimal Lagrange multipliers alpha_i selected during training.
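Assuming scikit-learn's SVC is the implementation in question, this decomposition can be verified directly: the fitted model exposes the nonzero alpha_i (signed by the labels) as dual_coef_, the corresponding x_i as support_vectors_, and the bias term as intercept_ (a sketch; the dataset and gamma are illustrative):

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Rebuild SVC's decision function by hand from the learned dual
# coefficients (alpha_i * y_i), the support vectors x_i, and the bias.
X, y = make_moons(n_samples=100, noise=0.15, random_state=42)
gamma = 0.5
clf = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)

# K[j, i] = K_gamma(x_j, x_i) for every input x_j and support vector x_i
K = rbf_kernel(X, clf.support_vectors_, gamma=gamma)
manual = K @ clf.dual_coef_.ravel() + clf.intercept_

print(np.allclose(manual, clf.decision_function(X)))  # -> True
```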

I'm also very confused about how this can lead to an infinite-dimensional representation from a finite number of features. Any good analogies would help me greatly.

The "infinite representation" comes from the fact that, in order to work with vectors and hyperplanes, each of your points is actually mapped to a continuous function. So internally the SVM is not really working with d-dimensional points anymore; it is working with functions. Consider the 2D case with points [0, 0] and [1, 1]: a simple 2D problem. When you apply an SVM with the RBF kernel here, you instead work with an unnormalized Gaussian centered at [0, 0] and another one at [1, 1]. Each such Gaussian is a function from R^2 to R, expressing its probability density function (pdf). It is a bit confusing because the kernel looks like a Gaussian too, but that is only because the dot product of two functions is usually defined as the integral of their product, and the integral of a product of two Gaussians is... a Gaussian too!

So where is the infinity? Remember that you are supposed to work with vectors. How do you write a function down as a vector? You would have to list all its values, so for a function f(x) = 1/sqrt(2*pi*sigma^2) * exp(-||x - m||^2 / (2*sigma^2)) you would have to list an infinite number of values to fully define it. This is the concept of infinite dimension: you map points to functions, functions are infinite-dimensional as elements of a vector space, and thus your representation is infinite-dimensional.
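The claim that the integral of a product of two Gaussians is itself a Gaussian can be checked numerically in 1D (a sketch assuming unnormalized Gaussians exp(-(x - c)^2); the centers a and b are arbitrary):

```python
import numpy as np
from scipy.integrate import quad

# "Dot product" of two unnormalized Gaussians centered at a and b,
# i.e. the integral of their product over the whole real line. In
# closed form it is sqrt(pi/2) * exp(-(a - b)^2 / 2): a Gaussian in
# the distance |a - b|, which is exactly the RBF kernel shape.
def gauss(x, c):
    return np.exp(-(x - c) ** 2)

a, b = 0.3, 1.7
numeric, _ = quad(lambda x: gauss(x, a) * gauss(x, b), -np.inf, np.inf)
closed_form = np.sqrt(np.pi / 2) * np.exp(-(a - b) ** 2 / 2)

print(np.isclose(numeric, closed_form))  # -> True
```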

A good example might be a different mapping. Consider a 1D dataset of the numbers 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, and assign the odd numbers a different label than the even ones. You cannot linearly separate these. But you can instead map each point (number) to a kind of characteristic function, a function of the form

f_x(y) = 1 iff y ∈ [x - 0.5, x + 0.5]

Now, in the space of all such functions, I can easily linearly separate the ones created from odd x's from the rest, simply by building a hyperplane with the equation

<v, f_x> = SUM_[v_odd] <f_v_odd, f_x> = SUM_[v_odd] INTEGRAL f_v_odd(y) * f_x(y) dy

And this will equal 1 iff x is odd, since only then is one of the integrals non-zero. Obviously I am only using a finite number of training points (the v_odd here), but the representation itself is infinite-dimensional. Where does this additional "information" come from? From my assumptions: the way I defined the mapping introduces a particular structure into the space I am considering. Similarly with RBF: you get infinite dimension, but it does not mean you are actually considering every continuous function; you are limiting yourself to linear combinations of Gaussians centered at the training points. Similarly, you could use a sinusoidal kernel, which limits you to combinations of sinusoidal functions. The choice of a particular "best" kernel is a whole other story, complex and without clear answers. Hope this helps a bit.
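The odd/even construction above can be sketched in code: each number x becomes the indicator function of [x - 0.5, x + 0.5], the inner product of two such functions is the length of the overlap of their intervals, and summing inner products against the odd training points reproduces the labels:

```python
# Each integer x is mapped to the indicator function of [x - 0.5, x + 0.5].
# The inner product <f_x1, f_x2> is then the overlap length of the two
# intervals: 1.0 when x1 == x2, and 0.0 for distinct integers (the
# intervals touch at most at a single point, which has measure zero).
def inner(x1, x2):
    lo = max(x1 - 0.5, x2 - 0.5)
    hi = min(x1 + 0.5, x2 + 0.5)
    return max(0.0, hi - lo)

odd_train = [1, 3, 5, 7, 9]

def g(x):
    # <v, f_x> with v = sum of the functions of the odd training points
    return sum(inner(v, x) for v in odd_train)

print([g(x) for x in range(1, 11)])
# -> [1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0]
```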

๐ŸŒ
GitHub
github.com โ€บ xbeat โ€บ Machine-Learning โ€บ blob โ€บ main โ€บ The Mathematics of RBF Kernel in Python.md
Machine-Learning/The Mathematics of RBF Kernel in Python.md at main ยท xbeat/Machine-Learning
Selecting the right gamma value is crucial for the performance of RBF kernel-based models. Too small gamma can lead to underfitting, while too large gamma can cause overfitting. from sklearn.svm import SVC from sklearn.datasets import make_moons from sklearn.model_selection import train_test_split import numpy as np import matplotlib.pyplot as plt # Generate non-linear data X, y = make_moons(n_samples=100, noise=0.15, random_state=42) # Split the data X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42) # Train SVM with different gamma values gammas = [0.01
Author ย  xbeat
๐ŸŒ
GeeksforGeeks
geeksforgeeks.org โ€บ python โ€บ rbf-svm-parameters-in-scikit-learn
RBF SVM Parameters in Scikit Learn - GeeksforGeeks
April 28, 2025 - The coef0 parameter is used when the kernel is set to polynomial or sigmoid, and it controls the independent term in the kernel function. To find the optimal values for these parameters, a grid search or randomized search can be performed over a range of values. Cross-validation can also be used to evaluate the performance of the model for different parameter values. It is important to note that selecting the right combination of parameters is a crucial step in building an accurate and robust SVM model with the RBF kernel.
๐ŸŒ
Mldemystified
mldemystified.com โ€บ demystifying support vector machines: kernel machines
Demystifying Support Vector Machines: Kernel Machines | MLDemystified
February 18, 2024 - Linear Kernel: \(K(x, x') = x^T x'\). This kernel does not actually transform the data and is equivalent to the standard linear SVM. Polynomial Kernel: \(K(x, x') = (\gamma x^T x' + r)^d\), where \(d\) is the degree of the polynomial, \(\gamma\) is a scale factor, and \(r\) is a constant term. Radial Basis Function (RBF) Kernel: \(K(x, x') = \exp(-\gamma \|x - x'\|^2)\), where \(\gamma\) is a scale factor.
๐ŸŒ
Towards Data Science
towardsdatascience.com โ€บ home โ€บ latest โ€บ svm classifier and rbf kernel โ€“ how to make better models in python
SVM Classifier and RBF Kernel - How to Make Better Models in Python | Towards Data Science
January 23, 2025 - SVM with RBF kernel and high gamma. See how it was created in the Python section at the end of this story. Image by author. It is essential to understand how different Machine Learning algorithms work to succeed in your Data Science projects. I have written this story as part of the series that dives into each ML algorithm explaining its mechanics, supplemented by Python code examples and intuitive visualizations.
๐ŸŒ
scikit-learn
scikit-learn.org โ€บ dev โ€บ auto_examples โ€บ svm โ€บ plot_rbf_parameters.html
RBF SVM parameters โ€” scikit-learn 1.9.dev0 documentation
This example illustrates the effect of the parameters gamma and C of the Radial Basis Function (RBF) kernel SVM. Intuitively, the gamma parameter defines how far the influence of a single training ...
๐ŸŒ
AI Mind
pub.aimind.so โ€บ using-radial-basis-functions-for-svms-with-python-and-scikit-learn-c935aa06a56e
Using Radial Basis Functions for Support Vector Machines | by Francesco Franco | AI Mind
June 12, 2025 - As you can see, the RBF kernelโ€™s Distribution graph resembles the Gaussian Distribution curve, sometimes known as a bell-shaped curve. RBF kernel is also known as the Gaussian Radial Basis Kernel. We can easily implement an RBF-based SVM classifier with Scikit-learn: the only thing we have to do is change kernel='linear' to kernel='rbf' during SVC(...) initialization.
๐ŸŒ
VitalFlux
vitalflux.com โ€บ home โ€บ data science โ€บ svm rbf kernel parameters: python examples
SVM RBF Kernel Parameters: Python Examples - Analytics Yogi
April 15, 2023 - In this post, you will learn about SVM RBF (Radial Basis Function) kernel hyperparameters with the python code example.
๐ŸŒ
Quora
quora.com โ€บ What-is-RBF-kernel-in-SVM
What is RBF kernel in SVM? - Quora
If you are classifying images, you can try a RBF Kernel--because the RBF Kernel selects solutions that are smooth (this can be easily shown in frequency space...I started a blog to explain...bear with me as I proof it: http://charlesmartin14.wordpress.com/2012/02/06/kernels_part_1/ ) If you think your solutions are naturally sparse, then pick an L1-regularizer. If you only have a small set of labels but lots of unlabeled data, then you might try a Manifold Regularizer (i.e. Transductive SVM), with or without a non-linear Kernel
๐ŸŒ
GitHub
github.com โ€บ christianversloot โ€บ machine-learning-articles โ€บ blob โ€บ main โ€บ using-radial-basis-functions-for-svms-with-python-and-scikit-learn.md
machine-learning-articles/using-radial-basis-functions-for-svms-with-python-and-scikit-learn.md at main ยท christianversloot/machine-learning-articles
November 25, 2020 - Let's take a look what happens when we implement our Scikit-learn classifier with the RBF kernel. We can easily implement an RBF based SVM classifier with Scikit-learn: the only thing we have to do is change kernel='linear' to kernel='rbf' during SVC(...) initialization.
Author ย  christianversloot
๐ŸŒ
Wordpress
dwbi1.wordpress.com โ€บ 2021 โ€บ 05 โ€บ 24 โ€บ svm-with-rbf-kernel
SVM with RBF Kernel | Data Platform and Data Science
July 10, 2021 - For example, distance of 3 from point (5,5) is like this: Figure 9. Radial Basis Function ยท We can sum multiple RBFs to get shapes with multiple centres like this: ... SVM with RBF Kernel is a machine learning algorithm which is capable to ...
๐ŸŒ
Kaggle
kaggle.com โ€บ code โ€บ manmohan291 โ€บ 16-sklearn-svm-rbf-kernel
16 SKLearn - SVM RBF Kernel