DiscreteResults.predict()

statsmodels.discrete.discrete_model.DiscreteResults.predict

DiscreteResults.predict(exog=None, transform=True, *args, **kwargs)

Call self.model.predict with self.params as the first argument.

Parameters:
exog : array_like, optional
    The values for which you want to predict.
transform : bool, optional
    If the model was fit via a formula, whether to pass exog through the formula. Default is True. For example, if you fit a model y ~ log(x1) + log(x2) and transform is True, you can pass a data structure that contains x1 and x2 in their original form; otherwise you would need to log the data first.
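A minimal sketch of the transform behaviour, using a formula-based Logit fit on simulated data (the frame df and the columns x1, x2, y are invented for illustration):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data: two positive regressors and a binary outcome.
rng = np.random.default_rng(0)
df = pd.DataFrame({"x1": rng.uniform(1, 5, 200), "x2": rng.uniform(1, 5, 200)})
df["y"] = (np.log(df["x1"]) + np.log(df["x2"]) + rng.normal(size=200) > 1.5).astype(int)

res = smf.logit("y ~ np.log(x1) + np.log(x2)", data=df).fit(disp=0)

# With transform=True (the default), new data is passed in its original form;
# the formula applies np.log internally before predicting.
new = pd.DataFrame({"x1": [1.5, 3.0], "x2": [2.0, 4.0]})
print(res.predict(new))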

GEEMargins.conf_int()

statsmodels.genmod.generalized_estimating_equations.GEEMargins.conf_int

GEEMargins.conf_int(alpha=0.05)

Returns the confidence intervals of the marginal effects.

Parameters:
alpha : float
    Number between 0 and 1. The confidence intervals have probability 1 - alpha.

Returns:
conf_int : ndarray
    An array with lower and upper confidence intervals for the marginal effects.
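GEEMargins objects come from a fitted GEE model, but the alpha and return semantics of conf_int match those of discrete-model margins; as a hedged stand-in, the sketch below calls conf_int on a Logit margins object built from simulated data:

import numpy as np
import statsmodels.api as sm

# Simulated binary-response data; the Logit margins object is only a stand-in
# whose conf_int(alpha=...) mirrors GEEMargins.conf_int.
rng = np.random.default_rng(1)
X = sm.add_constant(rng.normal(size=(300, 2)))
y = (X @ np.array([0.2, 0.9, -0.6]) + rng.logistic(size=300) > 0).astype(int)

margins = sm.Logit(y, X).fit(disp=0).get_margeff()
print(margins.conf_int(alpha=0.10))  # 90% intervals: columns are lower, upper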

NegativeBinomialResults.summary2()

statsmodels.discrete.discrete_model.NegativeBinomialResults.summary2

NegativeBinomialResults.summary2(yname=None, xname=None, title=None, alpha=0.05, float_format='%.4f')

Experimental function to summarize regression results.

Parameters:
xname : list of strings, optional
    Names of the independent variables; length must equal the number of parameters.
yname : string, optional
    Name of the dependent variable.
title : string, optional
    Title for the top table. If not None, this replaces the default title.
alpha : float
    Significance level for the confidence intervals.
float_format : string
    Print format for floats in the parameters summary.
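A minimal sketch on simulated count data (all names are illustrative); note that the fitted model also estimates a dispersion parameter alpha, so an explicit xname would need one more entry than there are regressors:

import numpy as np
import statsmodels.api as sm

# Simulated count data for illustration only.
rng = np.random.default_rng(2)
X = sm.add_constant(rng.normal(size=(300, 2)))
y = rng.poisson(np.exp(X @ np.array([0.5, 0.3, -0.2])))

res = sm.NegativeBinomial(y, X).fit(disp=0)
# Custom title, 90% confidence intervals and three-decimal floats.
print(res.summary2(yname="counts", title="NB2 fit (simulated data)",
                   alpha=0.10, float_format="%.3f"))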

stats.diagnostic.linear_rainbow()

statsmodels.stats.diagnostic.linear_rainbow

statsmodels.stats.diagnostic.linear_rainbow(res, frac=0.5)

Rainbow test for linearity.

The null hypothesis is that the regression is correctly modelled as linear. The alternative, for which the power of the test may be large, is a convex (nonlinear) relationship.

Parameters:
res : Result instance
    A fitted regression results instance.

Returns:
fstat : float
    Test statistic based on an F test.
pvalue : float
    p-value of the test.
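A minimal sketch on simulated data that is linear by construction, so the test should not reject the null:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import linear_rainbow

# Simulated data with a genuinely linear relationship.
rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 200)
y = 1.0 + 2.0 * x + rng.normal(size=200)
res = sm.OLS(y, sm.add_constant(x)).fit()

fstat, pvalue = linear_rainbow(res, frac=0.5)
print(fstat, pvalue)  # a large p-value gives no evidence against linearity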

DiscreteResults.summary()

statsmodels.discrete.discrete_model.DiscreteResults.summary

DiscreteResults.summary(yname=None, xname=None, title=None, alpha=0.05, yname_list=None)

Summarize the regression results.

Parameters:
yname : string, optional
    Name of the dependent variable. Default is y.
xname : list of strings, optional
    Names of the regressors. Default is var_## for ## in range(p), where p is the number of regressors.
title : string, optional
    Title for the top table. If not None, this replaces the default title.
alpha : float
    Significance level for the confidence intervals.
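A minimal sketch using a Logit fit on simulated data (the variable names income and balance are invented for illustration):

import numpy as np
import statsmodels.api as sm

# Simulated binary-choice data.
rng = np.random.default_rng(4)
X = sm.add_constant(rng.normal(size=(250, 2)))
y = (X @ np.array([-0.5, 1.0, 0.8]) + rng.logistic(size=250) > 0).astype(int)

res = sm.Logit(y, X).fit(disp=0)
# Custom variable names and 90% confidence intervals in the summary table.
print(res.summary(yname="default", xname=["const", "income", "balance"], alpha=0.10))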

TransfTwo_gen.std()

statsmodels.sandbox.distributions.transformed.TransfTwo_gen.std

TransfTwo_gen.std(*args, **kwds)

Standard deviation of the distribution.

Parameters:
arg1, arg2, arg3, ... : array_like
    The shape parameter(s) for the distribution (see the docstring of the instance object for more information).
loc : array_like, optional
    Location parameter (default=0).
scale : array_like, optional
    Scale parameter (default=1).

Returns:
std : float
    Standard deviation of the distribution.
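TransfTwo_gen lives in the sandbox and its instances inherit this method from scipy's rv_continuous, so the shape/loc/scale semantics above are the generic ones; as a hedged stand-in, the sketch below shows the same call pattern on scipy distributions:

from scipy import stats

# std() with location and scale arguments; shape parameters, when the
# distribution has them, come first. TransfTwo_gen instances accept the
# same signature.
print(stats.norm.std(loc=2.0, scale=3.0))        # 3.0
print(stats.gamma.std(2.0, loc=0.0, scale=1.5))  # shape a=2.0: sqrt(2) * 1.5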

QuantReg.score()

statsmodels.regression.quantile_regression.QuantReg.score

QuantReg.score(params)

Score vector of the model: the gradient of the log-likelihood with respect to each parameter.
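The docstring above is the generic likelihood-model signature; as a hedged stand-in, the sketch below evaluates the analogous score vector of a Logit model, where the gradient has a closed form and is close to zero at the fitted parameters:

import numpy as np
import statsmodels.api as sm

# Simulated binary-response data.
rng = np.random.default_rng(5)
X = sm.add_constant(rng.normal(size=(200, 2)))
y = (X @ np.array([0.3, 1.0, -0.7]) + rng.logistic(size=200) > 0).astype(int)

model = sm.Logit(y, X)
res = model.fit(disp=0)
print(model.score(res.params))  # gradient of the log-likelihood, ~0 at the MLE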

static IVRegressionResults.fvalue()

statsmodels.sandbox.regression.gmm.IVRegressionResults.fvalue

static IVRegressionResults.fvalue()
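fvalue is exposed as a results attribute (the F statistic of the fitted regression); a minimal sketch with simulated instrumental-variables data, where every name is illustrative:

import numpy as np
import statsmodels.api as sm
from statsmodels.sandbox.regression.gmm import IV2SLS

# Simulated IV setup: z instruments the endogenous regressor x.
rng = np.random.default_rng(6)
n = 500
z = rng.normal(size=n)
u = rng.normal(size=n)                     # unobserved confounder
x = 0.8 * z + u + rng.normal(size=n)
y = 1.0 + 2.0 * x + u + rng.normal(size=n)

res = IV2SLS(y, sm.add_constant(x), instrument=sm.add_constant(z)).fit()
print(res.fvalue)  # accessed as an attribute, not called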

static DynamicVAR.coefs()

statsmodels.tsa.vector_ar.dynamic.DynamicVAR.coefs

static DynamicVAR.coefs()

Return dynamic regression coefficients as WidePanel.
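DynamicVAR and its WidePanel return type depend on the long-deprecated pandas Panel, so a runnable example needs an older pandas/statsmodels stack; as a rough stand-in, the sketch below shows the analogous coefs attribute of a statically estimated VAR, which holds one coefficient matrix per lag:

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated bivariate series, differenced to something roughly stationary.
rng = np.random.default_rng(7)
levels = pd.DataFrame(rng.normal(size=(200, 2)).cumsum(axis=0), columns=["y1", "y2"])
res = VAR(levels.diff().dropna()).fit(maxlags=2)
print(res.coefs.shape)  # (lags, neqs, neqs): one coefficient matrix per lag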

CLogLog.deriv2()

statsmodels.genmod.families.links.CLogLog.deriv2

CLogLog.deriv2(p)

Second derivative of the link function g''(p), implemented through numerical differentiation.
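A minimal sketch that evaluates the numerically differentiated second derivative and checks it against the analytic form of g(p) = log(-log(1 - p)); the closed-form expression below is derived by hand here, not taken from the docs:

import numpy as np
from statsmodels.genmod.families.links import CLogLog

link = CLogLog()
p = np.array([0.2, 0.5, 0.8])

# Numerical second derivative provided by statsmodels.
print(link.deriv2(p))

# Hand-derived analytic check: with L = -log(1 - p),
# g''(p) = (L - 1) / ((1 - p)**2 * L**2)
L = -np.log(1 - p)
print((L - 1) / ((1 - p) ** 2 * L ** 2))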