🌐
scikit-learn
scikit-learn.org › stable › modules › generated › sklearn.linear_model.SGDRegressor.html
SGDRegressor — scikit-learn 1.8.0 documentation
Theil-Sen Estimator robust multivariate regression model. ... Online Passive-Aggressive Algorithms <http://jmlr.csail.mit.edu/papers/volume7/crammer06a/crammer06a.pdf> K. Crammer, O. Dekel, J. Keshet, S. Shalev-Shwartz, Y. Singer - JMLR (2006) ... >>> import numpy as np >>> from sklearn.linear_model import SGDRegressor >>> from sklearn.pipeline import make_pipeline >>> from sklearn.preprocessing import StandardScaler >>> n_samples, n_features = 10, 5 >>> rng = np.random.RandomState(0) >>> y = rng.randn(n_samples) >>> X = rng.randn(n_samples, n_features) >>> # Always scale the input.
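The documentation snippet above is cut off at the scaling comment; a minimal completed sketch of that example (the `max_iter`/`tol` values here are illustrative defaults, not mandated by the docs) might look like:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

n_samples, n_features = 10, 5
rng = np.random.RandomState(0)
y = rng.randn(n_samples)
X = rng.randn(n_samples, n_features)

# Always scale the input: SGD is sensitive to feature scaling,
# so wrap the regressor in a pipeline with StandardScaler.
reg = make_pipeline(StandardScaler(), SGDRegressor(max_iter=1000, tol=1e-3))
reg.fit(X, y)
pred = reg.predict(X)
```

Using the pipeline keeps the scaler's statistics tied to the model, so `predict` applies the same transformation that was fit on the training data.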
🌐
scikit-learn
scikit-learn.org › stable › modules › sgd.html
1.5. Stochastic Gradient Descent — scikit-learn 1.8.0 documentation
For regression with a squared loss and an \(L_2\) penalty, another variant of SGD with an averaging strategy is available with the Stochastic Average Gradient (SAG) algorithm, available as a solver in Ridge. ... The class sklearn.linear_model.SGDOneClassSVM implements an online linear version of the One-Class SVM using a stochastic gradient descent.
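The SAG variant mentioned in this snippet is exposed as a solver on `Ridge` rather than through `SGDRegressor`; a hedged sketch on made-up synthetic data:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.RandomState(0)
X = rng.randn(200, 5)
y = X @ np.array([1.0, 2.0, 0.5, -1.0, 0.0]) + 0.1 * rng.randn(200)

# For squared loss + L2 penalty, Stochastic Average Gradient
# is selected via Ridge's solver parameter.
ridge_sag = Ridge(alpha=1.0, solver="sag", random_state=0)
ridge_sag.fit(X, y)
```

SAG keeps a running average of per-sample gradients, which is why it only applies to this particular loss/penalty combination.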
🌐
scikit-learn
scikit-learn.org › stable › modules › generated › sklearn.linear_model.SGDClassifier.html
SGDClassifier — scikit-learn 1.8.0 documentation
Logistic regression. ... Inherits from SGDClassifier. Perceptron() is equivalent to SGDClassifier(loss="perceptron", eta0=1, learning_rate="constant", penalty=None). ... Online Passive-Aggressive Algorithms <http://jmlr.csail.mit.edu/papers/volume7/crammer06a/crammer06a.pdf> K. Crammer, O. Dekel, J. Keshet, S. Shalev-Shwartz, Y. Singer - JMLR (2006) ... >>> import numpy as np >>> from sklearn.linear_model import SGDClassifier >>> from sklearn.preprocessing import StandardScaler >>> from sklearn.pipeline import make_pipeline >>> X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]]) >>> Y = np.array([1, 1, 2, 2]) >>> # Always scale the input.
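The Perceptron equivalence stated in this snippet can be checked directly; a small sketch using the snippet's toy data (the `random_state` is added here for reproducibility and is not part of the quoted example):

```python
import numpy as np
from sklearn.linear_model import Perceptron, SGDClassifier

X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])
Y = np.array([1, 1, 2, 2])

# Per the docs, Perceptron() is equivalent to this SGDClassifier config.
sgd_perc = SGDClassifier(loss="perceptron", eta0=1,
                         learning_rate="constant", penalty=None,
                         random_state=0)
perc = Perceptron(random_state=0)

sgd_perc.fit(X, Y)
perc.fit(X, Y)
```

On this linearly separable toy set both models should recover a perfect separator.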
🌐
TutorialsPoint
tutorialspoint.com › scikit_learn › scikit_learn_stochastic_gradient_descent.htm
Scikit Learn - Stochastic Gradient Descent
Attributes of SGDRegressor are the same as those of the SGDClassifier module, except that it has three extra attributes, as follows − ... As the name suggests, it provides the average weights assigned to the features. ... As the name suggests, it provides the averaged intercept term. ... It provides the number of weight updates performed during the training phase. Note − the attributes average_coef_ and average_intercept_ only become available after setting the parameter average to True. ... import numpy as np from sklearn import linear_model n_samples, n_features = 10, 5 rng = np.random.RandomState(0) y = rng.randn(n_samples) X = rng.randn(n_samples, n_features) SGDReg = linear_model.SGDRegressor(max_iter=1000, penalty="elasticnet", loss='huber', tol=1e-3, average=True) SGDReg.fit(X, y)
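A short sketch of reading the averaged-SGD attributes described above, reusing the tutorial's setup (attribute names as in the scikit-learn API):

```python
import numpy as np
from sklearn import linear_model

n_samples, n_features = 10, 5
rng = np.random.RandomState(0)
y = rng.randn(n_samples)
X = rng.randn(n_samples, n_features)

# average=True enables averaged SGD, which exposes the extra
# average_coef_ and average_intercept_ attributes after fitting.
SGDReg = linear_model.SGDRegressor(max_iter=1000, penalty="elasticnet",
                                   loss="huber", tol=1e-3, average=True)
SGDReg.fit(X, y)

print(SGDReg.average_coef_)       # averaged weights, one per feature
print(SGDReg.average_intercept_)  # averaged intercept term
print(SGDReg.t_)                  # weight-update counter from training
```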
🌐
Datatechnotes
datatechnotes.com › 2020 › 09 › regression-example-with-sgdregressor-in-python.html
DataTechNotes: Regression Example with SGDRegressor in Python
In this tutorial, we've briefly learned how to fit and predict regression data by using Scikit-learn API's SGDRegressor class in Python. The full source code is listed below. ... from sklearn.linear_model import SGDRegressor from sklearn.datasets import load_boston from sklearn.datasets import make_regression from sklearn.metrics import mean_squared_error from sklearn.model_selection import train_test_split from sklearn.model_selection import cross_val_score from sklearn.preprocessing import scale import matplotlib.pyplot as plt x, y = make_regression(n_samples=1000, n_features=30) print(x[0:2])
🌐
scikit-learn
ogrisel.github.io › scikit-learn.org › sklearn-tutorial › modules › generated › sklearn.linear_model.SGDRegressor.html
8.15.1.18. sklearn.linear_model.SGDRegressor — scikit-learn 0.11-git documentation
>>> import numpy as np >>> from sklearn import linear_model >>> n_samples, n_features = 10, 5 >>> np.random.seed(0) >>> y = np.random.randn(n_samples) >>> X = np.random.randn(n_samples, n_features) >>> clf = linear_model.SGDRegressor() >>> clf.fit(X, y) SGDRegressor(alpha=0.0001, eta0=0.01, fit_intercept=True, learning_rate='invscaling', loss='squared_loss', n_iter=5, p=0.1, penalty='l2', power_t=0.25, rho=0.85, seed=0, shuffle=False, verbose=0, warm_start=False)
🌐
GeeksforGeeks
geeksforgeeks.org › machine learning › stochastic-gradient-descent-regressor-using-scikit-learn
Stochastic Gradient Descent Regressor using Scikit-learn - GeeksforGeeks
July 23, 2025 - We will use the diabetes dataset to build and evaluate a linear regression model using SGD. ... from sklearn.datasets import load_diabetes import numpy as np from sklearn.linear_model import LinearRegression from sklearn.metrics import r2_score from sklearn.model_selection import train_test_split
🌐
scikit-learn
scikit-learn.org › 1.5 › modules › generated › sklearn.linear_model.SGDRegressor.html
SGDRegressor — scikit-learn 1.5.2 documentation
Theil-Sen Estimator robust multivariate regression model. ... >>> import numpy as np >>> from sklearn.linear_model import SGDRegressor >>> from sklearn.pipeline import make_pipeline >>> from sklearn.preprocessing import StandardScaler >>> n_samples, n_features = 10, 5 >>> rng = np.random.RandomState(0) >>> y = rng.randn(n_samples) >>> X = rng.randn(n_samples, n_features) >>> # Always scale the input.
🌐
scikit-learn
scikit-learn.org › 0.15 › modules › generated › sklearn.linear_model.SGDRegressor.html
sklearn.linear_model.SGDRegressor — scikit-learn 0.15-git documentation
>>> import numpy as np >>> from sklearn import linear_model >>> n_samples, n_features = 10, 5 >>> np.random.seed(0) >>> y = np.random.randn(n_samples) >>> X = np.random.randn(n_samples, n_features) >>> clf = linear_model.SGDRegressor() >>> clf.fit(X, y) SGDRegressor(alpha=0.0001, epsilon=0.1, eta0=0.01, fit_intercept=True, l1_ratio=0.15, learning_rate='invscaling', loss='squared_loss', n_iter=5, penalty='l2', power_t=0.25, random_state=None, shuffle=False, verbose=0, warm_start=False)
🌐
scikit-learn
scikit-learn.org › dev › modules › generated › sklearn.linear_model.SGDRegressor.html
SGDRegressor — scikit-learn 1.9.dev0 documentation
Theil-Sen Estimator robust multivariate regression model. ... Online Passive-Aggressive Algorithms <http://jmlr.csail.mit.edu/papers/volume7/crammer06a/crammer06a.pdf> K. Crammer, O. Dekel, J. Keshet, S. Shalev-Shwartz, Y. Singer - JMLR (2006) ... >>> import numpy as np >>> from sklearn.linear_model import SGDRegressor >>> from sklearn.pipeline import make_pipeline >>> from sklearn.preprocessing import StandardScaler >>> n_samples, n_features = 10, 5 >>> rng = np.random.RandomState(0) >>> y = rng.randn(n_samples) >>> X = rng.randn(n_samples, n_features) >>> # Always scale the input.
🌐
scikit-learn
scikit-learn.org › 0.15 › modules › sgd.html
1.3. Stochastic Gradient Descent — scikit-learn 0.15-git documentation
The class SGDRegressor implements a plain stochastic gradient descent learning routine which supports different loss functions and penalties to fit linear regression models.
🌐
GeeksforGeeks
geeksforgeeks.org › python › stochastic-gradient-descent-regressor
Stochastic Gradient Descent Regressor - GeeksforGeeks
July 23, 2025 - from sklearn.linear_model import SGDRegressor sgd_regressor = SGDRegressor(parameters,...)
🌐
Evening Session
sdsawtelle.github.io › blog › output › week2-andrew-ng-machine-learning-with-python.html
LinearRegression vs. SGDRegressor
September 1, 2016 - If you really want to get gradient descent in scikit-learn then the relevant object is called SGDRegressor (or for classification problems SGDClassifier - recall classification is where your labels are discrete category-like).
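A side-by-side sketch of the two estimators this post contrasts, on a synthetic near-noiseless linear signal (data and thresholds are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, SGDRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.RandomState(0)
X = rng.randn(500, 2)
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.01 * rng.randn(500)

# Closed-form least squares vs. iterative stochastic gradient descent.
ols = LinearRegression().fit(X, y)
sgd = make_pipeline(StandardScaler(),
                    SGDRegressor(max_iter=1000, tol=1e-3,
                                 random_state=0)).fit(X, y)

# With scaled features and enough iterations, both should fit
# this almost-noiseless linear signal nearly perfectly.
print(ols.score(X, y), sgd.score(X, y))
```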
🌐
scikit-learn
scikit-learn.org › 1.5 › modules › sgd.html
1.5. Stochastic Gradient Descent — scikit-learn 1.5.2 documentation
For regression with a squared loss and a l2 penalty, another variant of SGD with an averaging strategy is available with Stochastic Average Gradient (SAG) algorithm, available as a solver in Ridge. The class sklearn.linear_model.SGDOneClassSVM implements an online linear version of the One-Class ...
🌐
SourceForge
scikit-learn.sourceforge.net › stable › modules › sgd.html
1.5. Stochastic Gradient Descent — scikit-learn 0.16.1 documentation
The class SGDRegressor implements a plain stochastic gradient descent learning routine which supports different loss functions and penalties to fit linear regression models.
🌐
GitHub
github.com › scikit-learn › scikit-learn › blob › main › sklearn › linear_model › _stochastic_gradient.py
scikit-learn/sklearn/linear_model/_stochastic_gradient.py at main · scikit-learn/scikit-learn
"""Classification, regression and One-Class SVM using Stochastic Gradient · Descent (SGD). """ · import warnings · from abc import ABCMeta, abstractmethod · from numbers import Integral, Real · · import numpy as np · · from sklearn._loss._loss import CyHalfBinomialLoss, CyHalfSquaredError, CyHuberLoss ·
Top answer
1 of 3

Logistic Regression in Sklearn doesn't have an 'sgd' solver, though. It implements a regularized logistic regression: it minimizes the penalized negative log-likelihood.

SGDClassifier is a generalized linear classifier that uses Stochastic Gradient Descent as a solver. As mentioned here http://scikit-learn.org/stable/modules/sgd.html: "Even though SGD has been around in the machine learning community for a long time, it has received a considerable amount of attention just recently in the context of large-scale learning." It is easy to implement and efficient. For example, it is one of the solvers used for neural networks.

With SGDClassifier you can use lots of different loss functions (the function minimized or maximized to find the optimum solution), which allows you to "tune" your model and find the best SGD-based linear model for your data. Indeed, some data structures or some problems will need different loss functions.

In your example, the SGD classifier will have the same loss function as the Logistic Regression but a different solver. Depending on your data, you can get different results. You may try to find the best one using cross-validation, or even try a grid-search cross-validation to find the best hyper-parameters.

Hope that answers your questions.

2 of 3

Basically, SGD is like an umbrella capable of fitting different linear models. SGD is an approximation algorithm: it takes single points at a time, and as the number of points increases it converges closer to the optimal solution. Therefore, it is mostly used when the dataset is large. LogisticRegression uses a batch solver by default, so it is slower (if compared on a large dataset). To make SGD perform well for any particular linear model, say logistic regression here, we tune its parameters, which is called hyperparameter tuning.

🌐
scikit-learn
scikit-learn.org › 1.0 › modules › generated › sklearn.linear_model.SGDRegressor.html
sklearn.linear_model.SGDRegressor — scikit-learn 1.0.2 documentation
Theil-Sen Estimator robust multivariate regression model. Examples · >>> import numpy as np >>> from sklearn.linear_model import SGDRegressor >>> from sklearn.pipeline import make_pipeline >>> from sklearn.preprocessing import StandardScaler >>> n_samples, n_features = 10, 5 >>> rng = np.random.RandomState(0) >>> y = rng.randn(n_samples) >>> X = rng.randn(n_samples, n_features) >>> # Always scale the input.
🌐
scikit-learn
scikit-learn.org › 0.19 › modules › generated › sklearn.linear_model.SGDRegressor.html
sklearn.linear_model.SGDRegressor — scikit-learn 0.19.2 documentation
class sklearn.linear_model.SGDRegressor(loss='squared_loss', penalty='l2', alpha=0.0001, l1_ratio=0.15, fit_intercept=True, max_iter=None, tol=None, shuffle=True, verbose=0, epsilon=0.1, random_state=None, learning_rate='invscaling', eta0=0.01, power_t=0.25, warm_start=False, average=False, n_iter=None)[source]