Update README.md

Signed-off-by: David Rotermund <54365609+davrot@users.noreply.github.com>
## [sklearn.impute: Impute](https://scikit-learn.org/stable/modules/classes.html#module-sklearn.impute)
|||
|---|---|
|impute.SimpleImputer(*[, missing_values, ...])|Univariate imputer for completing missing values with simple strategies.|
|impute.IterativeImputer([estimator, ...])|Multivariate imputer that estimates each feature from all the others.|
|impute.MissingIndicator(*[, missing_values, ...])|Binary indicators for missing values.|
|impute.KNNImputer(*[, missing_values, ...])|Imputation for completing missing values using k-Nearest Neighbors.|
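
A minimal usage sketch of the two most common imputers; the toy array and parameter values below are illustrative, not prescribed by the API:

```python
import numpy as np
from sklearn.impute import KNNImputer, SimpleImputer

# Toy matrix with missing entries
X = np.array([[1.0, 2.0], [np.nan, 3.0], [7.0, np.nan], [4.0, 6.0]])

# Univariate: replace each missing value with its column mean
X_mean = SimpleImputer(strategy="mean").fit_transform(X)

# Neighbor-based: replace each missing value with the mean of the
# k nearest rows, measured on the observed features
X_knn = KNNImputer(n_neighbors=2).fit_transform(X)

print(X_mean)
print(X_knn)
```
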
## [sklearn.inspection: Inspection](https://scikit-learn.org/stable/modules/classes.html#module-sklearn.inspection)
|||
|---|---|
|inspection.partial_dependence(estimator, X, ...)|Partial dependence of features.|
|inspection.permutation_importance(estimator, ...)|Permutation importance for feature evaluation.|
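
As a sketch of permutation importance (the dataset, estimator, and `n_repeats` value are illustrative assumptions; in practice the importance is usually computed on a held-out set):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Shuffle each feature column in turn and measure how much the score drops
result = permutation_importance(clf, X, y, n_repeats=10, random_state=0)
for name, mean, std in zip(load_iris().feature_names,
                           result.importances_mean, result.importances_std):
    print(f"{name}: {mean:.3f} +/- {std:.3f}")
```
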
### Plotting
|||
|---|---|
|inspection.DecisionBoundaryDisplay(*, xx0, ...)|Decision boundary visualization.|
|inspection.PartialDependenceDisplay(...[, ...])|Partial Dependence Plot (PDP).|
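
A possible plotting sketch (requires matplotlib; the two-feature subset and classifier are illustrative assumptions):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.inspection import DecisionBoundaryDisplay
from sklearn.tree import DecisionTreeClassifier

# Keep only two features so the boundary can be drawn in the plane
X, y = load_iris(return_X_y=True)
X = X[:, :2]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Plot the predicted class regions and overlay the training points
disp = DecisionBoundaryDisplay.from_estimator(clf, X, response_method="predict", alpha=0.4)
disp.ax_.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")
plt.show()
```
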
## [sklearn.isotonic: Isotonic regression](https://scikit-learn.org/stable/modules/classes.html#module-sklearn.isotonic)
|||
|---|---|
|isotonic.IsotonicRegression(*[, y_min, ...])|Isotonic regression model.|
|isotonic.check_increasing(x, y)|Determine whether y is monotonically correlated with x.|
|isotonic.isotonic_regression(y, *[, ...])|Solve the isotonic regression model.|
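
A minimal sketch of fitting a monotonic step function to noisy data (the synthetic data is an assumption made for illustration):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression, check_increasing

rng = np.random.default_rng(0)
x = np.arange(50, dtype=float)
y = 0.1 * x + rng.normal(scale=1.0, size=50)  # noisy but increasing trend

print(check_increasing(x, y))  # True if y is positively correlated with x

# Fit a non-decreasing step function to the noisy observations
iso = IsotonicRegression()
y_fit = iso.fit_transform(x, y)
print(y_fit[:5])
```
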
## [sklearn.kernel_approximation: Kernel Approximation](https://scikit-learn.org/stable/modules/classes.html#module-sklearn.kernel_approximation)
|||
|---|---|
|kernel_approximation.AdditiveChi2Sampler(*)|Approximate feature map for the additive chi2 kernel.|
|kernel_approximation.Nystroem([kernel, ...])|Approximate a kernel map using a subset of the training data.|
|kernel_approximation.PolynomialCountSketch(*)|Polynomial kernel approximation via Tensor Sketch.|
|kernel_approximation.RBFSampler(*[, gamma, ...])|Approximate an RBF kernel feature map using random Fourier features.|
|kernel_approximation.SkewedChi2Sampler(*[, ...])|Approximate feature map for the "skewed chi-squared" kernel.|
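
A sketch of the usual pattern, approximate kernel map followed by a linear model; the dataset, `gamma`, and `n_components` values are illustrative assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import SGDClassifier
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Approximate an RBF kernel map with 100 landmark points, then train a
# linear classifier in the approximated feature space
model = make_pipeline(
    Nystroem(kernel="rbf", gamma=0.1, n_components=100, random_state=0),
    SGDClassifier(max_iter=1000, random_state=0),
)
model.fit(X, y)
print(model.score(X, y))
```
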
## [sklearn.kernel_ridge: Kernel Ridge Regression](https://scikit-learn.org/stable/modules/classes.html#module-sklearn.kernel_ridge)
|||
|---|---|
|kernel_ridge.KernelRidge([alpha, kernel, ...])|Kernel ridge regression.|
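
A minimal sketch (the noisy sine data and hyperparameters are illustrative assumptions):

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(100, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=100)

# Ridge regression in an RBF kernel feature space
model = KernelRidge(alpha=1.0, kernel="rbf", gamma=0.5).fit(X, y)
print(model.predict(X[:3]))
```
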
## [sklearn.linear_model: Linear Models](https://scikit-learn.org/stable/modules/classes.html#module-sklearn.linear_model)
### Linear classifiers
|||
|---|---|
|linear_model.LogisticRegression([penalty, ...])|Logistic Regression (aka logit, MaxEnt) classifier.|
|linear_model.LogisticRegressionCV(*[, Cs, ...])|Logistic Regression CV (aka logit, MaxEnt) classifier.|
|linear_model.PassiveAggressiveClassifier(*)|Passive Aggressive Classifier.|
|linear_model.Perceptron(*[, penalty, alpha, ...])|Linear perceptron classifier.|
|linear_model.RidgeClassifier([alpha, ...])|Classifier using Ridge regression.|
|linear_model.RidgeClassifierCV([alphas, ...])|Ridge classifier with built-in cross-validation.|
|linear_model.SGDClassifier([loss, penalty, ...])|Linear classifiers (SVM, logistic regression, etc.) with SGD training.|
|linear_model.SGDOneClassSVM([nu, ...])|Solves linear One-Class SVM using Stochastic Gradient Descent.|
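
A short sketch comparing a solver-based and an SGD-trained linear classifier; the dataset, scaling, and split are illustrative assumptions:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Same linear decision function, two different training strategies
logreg = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
sgd = make_pipeline(StandardScaler(), SGDClassifier(random_state=0))

for name, clf in [("LogisticRegression", logreg), ("SGDClassifier", sgd)]:
    clf.fit(X_train, y_train)
    print(name, clf.score(X_test, y_test))
```
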
### Classical linear regressors
|||
|---|---|
|linear_model.LinearRegression(*[, ...])|Ordinary least squares Linear Regression.|
|linear_model.Ridge([alpha, fit_intercept, ...])|Linear least squares with l2 regularization.|
|linear_model.RidgeCV([alphas, ...])|Ridge regression with built-in cross-validation.|
|linear_model.SGDRegressor([loss, penalty, ...])|Linear model fitted by minimizing a regularized empirical loss with SGD.|
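
A minimal sketch of OLS versus ridge, with the alpha grid chosen only for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, RidgeCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.5, -2.0, 0.0, 0.5, 3.0]) + rng.normal(scale=0.5, size=200)

ols = LinearRegression().fit(X, y)                       # ordinary least squares
ridge = Ridge(alpha=1.0).fit(X, y)                       # l2-regularized least squares
ridge_cv = RidgeCV(alphas=[0.1, 1.0, 10.0]).fit(X, y)    # alpha picked by cross-validation

print(ols.coef_)
print(ridge.coef_)
print("chosen alpha:", ridge_cv.alpha_)
```
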
### Regressors with variable selection
|||
|---|---|
|linear_model.ElasticNet([alpha, l1_ratio, ...])|Linear regression with combined L1 and L2 priors as regularizer.|
|linear_model.ElasticNetCV(*[, l1_ratio, ...])|Elastic Net model with iterative fitting along a regularization path.|
|linear_model.Lars(*[, fit_intercept, ...])|Least Angle Regression model a.k.a. LAR.|
|linear_model.LarsCV(*[, fit_intercept, ...])|Cross-validated Least Angle Regression model.|
|linear_model.Lasso([alpha, fit_intercept, ...])|Linear Model trained with L1 prior as regularizer (aka the Lasso).|
|linear_model.LassoCV(*[, eps, n_alphas, ...])|Lasso linear model with iterative fitting along a regularization path.|
|linear_model.LassoLars([alpha, ...])|Lasso model fit with Least Angle Regression a.k.a. Lars.|
|linear_model.LassoLarsCV(*[, fit_intercept, ...])|Cross-validated Lasso, using the LARS algorithm.|
|linear_model.LassoLarsIC([criterion, ...])|Lasso model fit with Lars using BIC or AIC for model selection.|
|linear_model.OrthogonalMatchingPursuit(*[, ...])|Orthogonal Matching Pursuit model (OMP).|
|linear_model.OrthogonalMatchingPursuitCV(*)|Cross-validated Orthogonal Matching Pursuit model (OMP).|
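
A sketch of sparse variable selection with LassoCV; the synthetic problem and `cv` setting are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV

# Sparse ground truth: only 5 of 50 features are informative
X, y = make_regression(n_samples=200, n_features=50, n_informative=5,
                       noise=5.0, random_state=0)

# Fit the whole regularization path and select alpha by cross-validation
lasso = LassoCV(cv=5, random_state=0).fit(X, y)
print("chosen alpha:", lasso.alpha_)
print("non-zero coefficients:", np.sum(lasso.coef_ != 0))
```
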
### Bayesian regressors
|||
|---|---|
|linear_model.ARDRegression(*[, max_iter, ...])|Bayesian ARD regression.|
|linear_model.BayesianRidge(*[, max_iter, ...])|Bayesian ridge regression.|
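
A minimal sketch of Bayesian ridge regression with predictive uncertainty (the synthetic data is an assumption):

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, 0.0, -1.0]) + rng.normal(scale=0.3, size=100)

# Gaussian prior on the coefficients; its precision is estimated from the data
model = BayesianRidge().fit(X, y)
mean, std = model.predict(X[:3], return_std=True)  # predictive mean and std
print(mean, std)
```
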
### Multi-task linear regressors with variable selection
|||
|---|---|
|linear_model.MultiTaskElasticNet([alpha, ...])|Multi-task ElasticNet model trained with L1/L2 mixed-norm as regularizer.|
|linear_model.MultiTaskElasticNetCV(*[, ...])|Multi-task L1/L2 ElasticNet with built-in cross-validation.|
|linear_model.MultiTaskLasso([alpha, ...])|Multi-task Lasso model trained with L1/L2 mixed-norm as regularizer.|
|linear_model.MultiTaskLassoCV(*[, eps, ...])|Multi-task Lasso model trained with L1/L2 mixed-norm as regularizer, with built-in cross-validation.|
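
A sketch of joint feature selection across several targets; the synthetic tasks and `alpha` value are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
W = np.zeros((10, 3))
W[:4] = rng.normal(size=(4, 3))     # the same 4 features matter for all 3 tasks
Y = X @ W + rng.normal(scale=0.1, size=(100, 3))

# The L1/L2 mixed norm selects the same features for every task
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
print("features kept across tasks:", np.sum(np.any(model.coef_ != 0, axis=0)))
```
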
### Outlier-robust regressors
|||
|---|---|
|linear_model.HuberRegressor(*[, epsilon, ...])|L2-regularized linear regression model that is robust to outliers.|
|linear_model.QuantileRegressor(*[, ...])|Linear regression model that predicts conditional quantiles.|
|linear_model.RANSACRegressor([estimator, ...])|RANSAC (RANdom SAmple Consensus) algorithm.|
|linear_model.TheilSenRegressor(*[, ...])|Theil-Sen Estimator: robust multivariate regression model.|
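
A sketch of two robust fits on data with injected outliers (the toy data and outlier fraction are assumptions):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, RANSACRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = 2.5 * X.ravel() + rng.normal(scale=0.3, size=100)
y[:10] += 20  # inject a few gross outliers

huber = HuberRegressor().fit(X, y)                   # downweights large residuals
ransac = RANSACRegressor(random_state=0).fit(X, y)   # fits on a consensus set of inliers

print("Huber slope:", huber.coef_[0])
print("RANSAC slope:", ransac.estimator_.coef_[0])
print("inliers found:", ransac.inlier_mask_.sum())
```
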
### Generalized linear models (GLM) for regression
|||
|---|---|
|linear_model.PoissonRegressor(*[, alpha, ...])|Generalized Linear Model with a Poisson distribution.|
|linear_model.TweedieRegressor(*[, power, ...])|Generalized Linear Model with a Tweedie distribution.|
|linear_model.GammaRegressor(*[, alpha, ...])|Generalized Linear Model with a Gamma distribution.|
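
A minimal sketch of a Poisson GLM on simulated count data (the data-generating process and `alpha` are assumptions):

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
# Counts drawn from a Poisson law whose rate depends log-linearly on X
rate = np.exp(X @ np.array([0.3, -0.2, 0.5]))
y = rng.poisson(rate)

# Poisson GLM with log link and a small L2 penalty
model = PoissonRegressor(alpha=1e-3).fit(X, y)
print(model.coef_)
```
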
### Miscellaneous
|||
|---|---|
|linear_model.PassiveAggressiveRegressor(*[, ...])|Passive Aggressive Regressor.|
|linear_model.enet_path(X, y, *[, l1_ratio, ...])|Compute elastic net path with coordinate descent.|
|linear_model.lars_path(X, y[, Xy, Gram, ...])|Compute Least Angle Regression or Lasso path using the LARS algorithm [1].|
|linear_model.lars_path_gram(Xy, Gram, *, ...)|The lars_path in the sufficient stats mode [1].|
|linear_model.lasso_path(X, y, *[, eps, ...])|Compute Lasso path with coordinate descent.|
|linear_model.orthogonal_mp(X, y, *[, ...])|Orthogonal Matching Pursuit (OMP).|
|linear_model.orthogonal_mp_gram(Gram, Xy, *)|Gram Orthogonal Matching Pursuit (OMP).|
|linear_model.ridge_regression(X, y, alpha, *)|Solve the ridge equation by the method of normal equations.|
## [sklearn.manifold: Manifold Learning](https://scikit-learn.org/stable/modules/classes.html#module-sklearn.manifold)
|||
|---|---|
|manifold.Isomap(*[, n_neighbors, radius, ...])|Isomap Embedding.|
|manifold.LocallyLinearEmbedding(*[, ...])|Locally Linear Embedding.|
|manifold.MDS([n_components, metric, n_init, ...])|Multidimensional scaling.|
|manifold.SpectralEmbedding([n_components, ...])|Spectral embedding for non-linear dimensionality reduction.|
|manifold.TSNE([n_components, perplexity, ...])|T-distributed Stochastic Neighbor Embedding.|
|manifold.locally_linear_embedding(X, *, ...)|Perform a Locally Linear Embedding analysis on the data.|
|manifold.smacof(dissimilarities, *[, ...])|Compute multidimensional scaling using the SMACOF algorithm.|
|manifold.spectral_embedding(adjacency, *[, ...])|Project the sample on the first eigenvectors of the graph Laplacian.|
|manifold.trustworthiness(X, X_embedded, *[, ...])|Indicate to what extent the local structure is retained.|
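
A sketch combining an embedding with its quality check; the digits subset, `perplexity`, and `n_neighbors` values are illustrative assumptions:

```python
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE, trustworthiness

X, _ = load_digits(return_X_y=True)
X = X[:500]  # small subset keeps the run fast

# Non-linear embedding of the 64-dimensional digits into 2D
X_2d = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

# How well are local neighborhoods preserved by the embedding?
print(trustworthiness(X, X_2d, n_neighbors=5))
```
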
## [sklearn.metrics: Metrics](https://scikit-learn.org/stable/modules/classes.html#sklearn-metrics-metrics)
### Model selection interface
|||
|---|---|
|metrics.check_scoring(estimator[, scoring, ...])|Determine scorer from user options.|
|metrics.get_scorer(scoring)|Get a scorer from string.|
|metrics.get_scorer_names()|Get the names of all available scorers.|
|metrics.make_scorer(score_func, *[, ...])|Make a scorer from a performance metric or loss function.|
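
A sketch of turning a metric into a scorer and using it for cross-validation; the dataset, estimator, and `average="macro"` choice are illustrative assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, get_scorer, get_scorer_names, make_scorer
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000)

print("accuracy" in get_scorer_names())   # built-in scorer names
accuracy = get_scorer("accuracy")         # callable with signature scorer(fitted_estimator, X, y)

# Wrap a plain metric function into a scorer usable by cross_val_score / *SearchCV
macro_f1 = make_scorer(f1_score, average="macro")
print(cross_val_score(clf, X, y, scoring=macro_f1, cv=5))
```
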
### Classification metrics