GLMResults.wald_test()

statsmodels.genmod.generalized_linear_model.GLMResults.wald_test

GLMResults.wald_test(r_matrix, cov_p=None, scale=1.0, invcov=None, use_f=None)

Compute a Wald test for a joint linear hypothesis.

Parameters:

r_matrix : array-like, str, or tuple
    array : An r x k array where r is the number of restrictions to test and k is the number of regressors. It is assumed that the linear combination is equal to zero.
    str : The full hypotheses to test can be given as a string. See the examples.
    tuple : A tuple of arrays in the form (R, q), where q can be either a scalar or a length p row vector.
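For illustration, a minimal sketch of a joint Wald test on a fitted GLM; the simulated data and variable names below are hypothetical, not from the original docs:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)  # hypothetical simulated data
X = sm.add_constant(rng.normal(size=(100, 2)))
y = rng.poisson(lam=np.exp(X @ np.array([0.5, 0.2, -0.1])))

res = sm.GLM(y, X, family=sm.families.Poisson()).fit()

# Each row of r_matrix encodes one restriction R @ params = 0;
# here: both slope coefficients (but not the constant) are zero.
r_matrix = np.array([[0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
print(res.wald_test(r_matrix))

The same hypothesis can also be given as a string against the model's default parameter names, e.g. res.wald_test("x1 = 0, x2 = 0").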

WLS.fit_regularized()

statsmodels.regression.linear_model.WLS.fit_regularized

WLS.fit_regularized(method='coord_descent', maxiter=1000, alpha=0.0, L1_wt=1.0, start_params=None, cnvrg_tol=1e-08, zero_tol=1e-08, **kwargs)

Return a regularized fit to a linear regression model.

Parameters:

method : string
    Only the coordinate descent algorithm is implemented.
maxiter : integer
    The maximum number of iteration cycles (an iteration cycle involves running coordinate descent on all variables).
alpha : scalar or array-like
    The penalty weight. If a scalar, the same penalty weight applies to all variables in the model.
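A minimal sketch of a regularized WLS fit, using hypothetical simulated data and penalty settings; L1_wt=1.0 corresponds to a pure lasso penalty, L1_wt=0.0 to pure ridge:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)  # hypothetical simulated data
X = sm.add_constant(rng.normal(size=(200, 5)))
y = X @ np.r_[1.0, 2.0, 0.0, 0.0, -1.5, 0.0] + rng.normal(size=200)
weights = rng.uniform(0.5, 1.5, size=200)

model = sm.WLS(y, X, weights=weights)
# alpha sets the overall penalty weight; with L1_wt=1.0 the fit is a lasso,
# so some coefficients may be shrunk exactly to zero.
res = model.fit_regularized(alpha=0.1, L1_wt=1.0)
print(res.params)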

QuantReg.initialize()

statsmodels.regression.quantile_regression.QuantReg.initialize

QuantReg.initialize()

static OLSInfluence.resid_studentized_external()

statsmodels.stats.outliers_influence.OLSInfluence.resid_studentized_external

static OLSInfluence.resid_studentized_external() [source] (cached attribute)

Studentized residuals using LOO variance: each residual is scaled by a sigma estimated with that observation left out, which requires a leave-one-out loop over the observations.
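A minimal sketch (with hypothetical data) of retrieving the externally studentized residuals of an OLS fit through OLSInfluence:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import OLSInfluence

rng = np.random.default_rng(2)  # hypothetical simulated data
X = sm.add_constant(rng.normal(size=(50, 2)))
y = X @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=50)

res = sm.OLS(y, X).fit()
infl = OLSInfluence(res)
# Each residual is scaled by an error variance estimated from a fit that
# leaves that observation out, hence "external" studentization.
print(infl.resid_studentized_external[:5])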

Interactions and ANOVA

Note: This script is based heavily on Jonathan Taylor's class notes: http://www.stanford.edu/class/stats191/interactions.html

Download and format data:

In [1]:
from __future__ import print_function
from statsmodels.compat import urlopen
import numpy as np
np.set_printoptions(precision=4, suppress=True)
import statsmodels.api as sm
import pandas as pd
pd.set_option("display.width", 100)
import matplotlib.pyplot as plt
from statsmodels.formula.api import ols

static QuantRegResults.cov_HC0()

statsmodels.regression.quantile_regression.QuantRegResults.cov_HC0

static QuantRegResults.cov_HC0()

Heteroscedasticity-robust (White) covariance matrix of the parameter estimates; see statsmodels.RegressionResults.

Patsy: Contrast Coding Systems for categorical variables

Note: This document is based heavily on this excellent resource from UCLA.

A categorical variable of K categories, or levels, usually enters a regression as a sequence of K-1 dummy variables. This amounts to a linear hypothesis on the level means; that is, each test statistic for these variables amounts to testing whether the mean for that level is statistically significantly different from the mean of the base category. This dummy coding is called Treatment coding in R parlance, and we will follow this convention.
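A minimal sketch (with a hypothetical 3-level factor) of this default treatment coding in patsy, where K=3 levels enter as K-1=2 indicator columns against a base category:

import pandas as pd
from patsy import dmatrix

df = pd.DataFrame({"level": ["a", "b", "c", "a", "b", "c"]})
# Treatment coding is patsy's default; "a" serves as the base (reference)
# category, and the two columns indicate membership in "b" and "c".
design = dmatrix("C(level, Treatment)", df, return_type="dataframe")
print(design)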

static BinaryResults.fittedvalues()

statsmodels.discrete.discrete_model.BinaryResults.fittedvalues

static BinaryResults.fittedvalues()

MultinomialResults.summary2()

statsmodels.discrete.discrete_model.MultinomialResults.summary2

MultinomialResults.summary2(alpha=0.05, float_format='%.4f') [source]

Experimental function to summarize regression results.

Parameters:

alpha : float
    significance level for the confidence intervals
float_format : string
    print format for floats in parameters summary

Returns:

smry : Summary instance
    This holds the summary tables and text, which can be printed or converted to various output formats.

See also: statsmodels.iolib.summary2.Summary
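A minimal sketch (hypothetical simulated data) of producing a summary2 table from a multinomial logit fit:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)  # hypothetical simulated data
X = sm.add_constant(rng.normal(size=(300, 2)))
y = rng.integers(0, 3, size=300)  # three outcome categories

res = sm.MNLogit(y, X).fit(disp=False)
# alpha sets the confidence-interval level; float_format the numeric display.
print(res.summary2(alpha=0.05, float_format="%.4f"))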

PoissonOffsetGMLE.loglikeobs()

statsmodels.miscmodels.count.PoissonOffsetGMLE.loglikeobs

PoissonOffsetGMLE.loglikeobs(params)