Robust Linear Models

Robust linear models with support for the M-estimators listed under Norms.

See Module Reference for commands and arguments.

Examples

# Load modules and data
import statsmodels.api as sm
data = sm.datasets.stackloss.load()
data.exog = sm.add_constant(data.exog)

# Fit model and print summary
rlm_model = sm.RLM(data.endog, data.exog, M=sm.robust.norms.HuberT())
rlm_results = rlm_model.fit()
print(rlm_results.params)

Detailed examples can be found here:

Technical Documentation
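
As a brief orientation, RLM solves the standard M-estimation problem; the sketch below is in standard notation (not reproduced verbatim from this page), where rho is one of the norms listed under Norms and psi is its derivative:

\[
\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} \rho\!\left(\frac{y_i - x_i^{\top}\beta}{\hat{\sigma}}\right),
\qquad
\sum_{i=1}^{n} \psi\!\left(\frac{y_i - x_i^{\top}\beta}{\hat{\sigma}}\right) x_i = 0,
\]

with the estimating equations solved by iteratively reweighted least squares and \(\hat{\sigma}\) a robust estimate of scale (see the Scale section of the module reference).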

References

  • PJ Huber. "Robust Statistics." John Wiley and Sons, Inc., New York, 1981.
  • PJ Huber. 1973. "The 1972 Wald Memorial Lectures: Robust Regression: Asymptotics, Conjectures, and Monte Carlo." The Annals of Statistics, 1.5, 799-821.
  • R Venables, B Ripley. "Modern Applied Statistics in S." Springer, New York.

Module Reference

Model Classes

RLM(endog, exog[, M, missing]) Robust Linear Models
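
A minimal usage sketch, assuming the constructor signature listed above; the data here is made up for illustration, and missing='drop' follows the usual statsmodels missing-data convention:

# Sketch: robust fit with Tukey's biweight norm; rows with NaN would be dropped
import numpy as np
import statsmodels.api as sm

# Illustrative data: a linear trend with one gross outlier in y
y = np.array([1.0, 2.1, 2.9, 4.2, 25.0])
X = sm.add_constant(np.arange(5.0))

model = sm.RLM(y, X, M=sm.robust.norms.TukeyBiweight(), missing='drop')
results = model.fit()
print(results.params)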

Model Results

RLMResults(model, params, ...) Class to contain RLM results
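
A short sketch of inspecting a fit, continuing the stack loss example above; the attributes shown (params, bse, weights) and the summary() method are assumed to be available on RLMResults:

print(rlm_results.summary())   # coefficient table with robust standard errors
print(rlm_results.bse)         # standard errors of the parameter estimates
print(rlm_results.weights)     # final IRLS weights; small values flag
                               # downweighted (outlying) observations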

Norms

AndrewWave([a]) Andrew's wave for M-estimation.
Hampel([a, b, c]) Hampel function for M-estimation.
HuberT([t]) Huber's T for M-estimation.
LeastSquares Least squares rho for M-estimation and its derived functions.
RamsayE([a]) Ramsay's Ea for M-estimation.
RobustNorm The parent class for the norms used for robust regression.
TrimmedMean([c]) Trimmed mean function for M-estimation.
TukeyBiweight([c]) Tukey's biweight function for M-estimation.
estimate_location(a, scale[, norm, axis, ...]) M-estimator of location using the given norm and a current estimate of scale.
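
The norm classes above share a common interface; a minimal sketch, assuming each norm exposes rho (the loss), psi (its derivative) and weights (psi(z)/z) as methods:

# Sketch: comparing how two norms downweight large residuals
import numpy as np
import statsmodels.api as sm

z = np.array([0.0, 1.0, 3.0, 10.0])    # standardized residuals
huber = sm.robust.norms.HuberT()
tukey = sm.robust.norms.TukeyBiweight()

print(huber.weights(z))   # Huber: weight 1 up to the threshold t, then t/|z|
print(tukey.weights(z))   # Tukey: weight falls to exactly 0 for |z| > c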

Scale

Huber([c, tol, maxiter, norm]) Huber's proposal 2 for estimating location and scale jointly.
HuberScale([d, tol, maxiter]) Huber's scaling for fitting robust linear models.
mad(a[, c, axis, center]) The median absolute deviation along the given axis of an array.
huber Huber's proposal 2 for estimating location and scale jointly.
hubers_scale Huber's scaling for fitting robust linear models.
stand_mad(a[, c, axis])
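
A minimal sketch of the scale estimators, assuming the call conventions of statsmodels.robust.scale (mad returns the normal-consistent median absolute deviation; a Huber instance, called on the data, returns joint location and scale estimates):

# Sketch: robust scale estimates for a contaminated sample
import numpy as np
from statsmodels.robust.scale import mad, Huber

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(size=100), [15.0, -20.0]])  # two gross outliers

print(x.std())   # the classical standard deviation is inflated by the outliers
print(mad(x))    # median absolute deviation / 0.6745, consistent for the
                 # normal distribution and barely affected by the outliers

loc, scale = Huber()(x)   # Huber's proposal 2: joint M-estimates of location and scale
print(loc, scale)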