Probit.score()

statsmodels.discrete.discrete_model.Probit.score

Probit.score(params)

Probit model score (gradient) vector.

Parameters:
    params : array-like
        The parameters of the model.

Returns:
    score : ndarray, 1-D
        The score vector of the model, i.e. the first derivative of the loglikelihood function, evaluated at params.

Notes

\frac{\partial \ln L}{\partial \beta} = \sum_{i=1}^{n} \left[ \frac{q_i \, \phi(q_i x_i'\beta)}{\Phi(q_i x_i'\beta)} \right] x_i

where q_i = 2y_i - 1 and \phi, \Phi are the standard normal pdf and cdf. This simplification comes from the fact that the normal distribution is symmetric.
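
A minimal usage sketch, not from the original page: the simulated data, variable names, and sample size below are arbitrary. At the maximum likelihood estimate the score vector should be numerically close to zero.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data for illustration; names and sizes are arbitrary.
rng = np.random.default_rng(0)
x = sm.add_constant(rng.normal(size=(200, 2)))
y = (x @ np.array([0.5, 1.0, -1.0]) + rng.normal(size=200) > 0).astype(float)

model = sm.Probit(y, x)
res = model.fit(disp=0)

# The score (gradient of the log-likelihood) evaluated at the MLE is ~0.
print(model.score(res.params))
```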

static MultinomialResults.aic()

statsmodels.discrete.discrete_model.MultinomialResults.aic

static MultinomialResults.aic()
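
A brief hedged sketch: although rendered above as a method, aic is read as an attribute on fitted multinomial results. The anes96 dataset and the column choice are just one convenient illustration.

```python
import statsmodels.api as sm

# Illustrative only: any multinomial outcome would do.
data = sm.datasets.anes96.load_pandas().data
exog = sm.add_constant(data[["age", "educ"]])
res = sm.MNLogit(data["PID"], exog).fit(disp=0)

# Akaike information criterion of the fitted multinomial logit.
print(res.aic)
```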

DiscreteResults.load()

statsmodels.discrete.discrete_model.DiscreteResults.load

classmethod DiscreteResults.load(fname)

Load a pickled results instance (class method).

Parameters:
    fname : string or filehandle
        A string giving the file path or filename, or an open file handle.

Returns:
    unpickled instance
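
A round-trip sketch under the assumption that results were previously written with save(); the Logit model, simulated data, and file name are illustrative.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.discrete_model import DiscreteResults

# Fit and pickle some discrete-model results (file name is arbitrary).
rng = np.random.default_rng(0)
x = sm.add_constant(rng.normal(size=(100, 1)))
y = (rng.uniform(size=100) < 0.5).astype(float)
res = sm.Logit(y, x).fit(disp=0)
res.save("logit_results.pickle")

# load() is a classmethod that simply unpickles whatever the file contains.
loaded = DiscreteResults.load("logit_results.pickle")
print(loaded.params)
```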

StepDown.iter_subsets()

statsmodels.sandbox.stats.multicomp.StepDown.iter_subsets

StepDown.iter_subsets(indices)

TLinearModel.from_formula()

statsmodels.miscmodels.tmodel.TLinearModel.from_formula

classmethod TLinearModel.from_formula(formula, data, subset=None, *args, **kwargs)

Create a Model from a formula and dataframe.

Parameters:
    formula : str or generic Formula object
        The formula specifying the model.
    data : array-like
        The data for the model. See Notes.
    subset : array-like
        An array-like object of booleans, integers, or index values that indicate the subset of df to use in the model. Assumes df is a pandas.DataFrame.
    args : extra arguments
        These are passed to the model.
    kwargs : extra keyword arguments
        These are passed to the model.

Returns:
    model : Model instance
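
A construction sketch, assuming a small synthetic DataFrame with arbitrary column names: from_formula builds the design matrices from the formula string; model.fit() would then estimate the coefficients, degrees of freedom, and scale by maximum likelihood.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.tmodel import TLinearModel

# Synthetic data with heavy-tailed (t-distributed) noise; names are arbitrary.
rng = np.random.default_rng(0)
df = pd.DataFrame({"x": rng.normal(size=100)})
df["y"] = 1.0 + 2.0 * df["x"] + rng.standard_t(3, size=100)

model = TLinearModel.from_formula("y ~ x", data=df)
print(model.exog_names)   # design built from the formula, e.g. ['Intercept', 'x']
# model.fit() would then carry out the maximum likelihood estimation.
```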

GlobalOddsRatio.update()

statsmodels.genmod.cov_struct.GlobalOddsRatio.update

GlobalOddsRatio.update(params)

Updates the association parameter values based on the current regression coefficients.

Parameters:
    params : array-like
        Working values for the regression parameters.
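
update() is normally driven by the GEE fitting loop rather than called by hand. The sketch below, with simulated clustered binary data and arbitrary names, shows where it fits in and reads back the association parameter it maintains.

```python
import numpy as np
from statsmodels.genmod.generalized_estimating_equations import OrdinalGEE
from statsmodels.genmod.cov_struct import GlobalOddsRatio

# Simulated clustered binary data (names and sizes are arbitrary).
rng = np.random.default_rng(0)
n = 200
groups = np.repeat(np.arange(n // 4), 4)
x = rng.normal(size=(n, 1))
y = (x[:, 0] + rng.normal(size=n) > 0).astype(int)

cov = GlobalOddsRatio("ordinal")
model = OrdinalGEE(y, x, groups, cov_struct=cov)
res = model.fit()

# update() was called at each iteration of the fit; the association
# parameter it maintains is now available on the covariance structure.
print(cov.dep_params)
```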

robust.scale.HuberScale()

statsmodels.robust.scale.HuberScale

class statsmodels.robust.scale.HuberScale(d=2.5, tol=1e-08, maxiter=30)

Huber's scaling for fitting robust linear models.

Huber's scale is intended to be used as the scale estimate in the IRLS algorithm and is slightly different than the Huber class.

Parameters:
    d : float, optional
        The tuning constant for Huber's scale. Default is 2.5.
    tol : float, optional
        The convergence tolerance.
    maxiter : int, optional
        The maximum number of iterations.
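
A usage sketch, assuming simulated regression data with a few outliers: a HuberScale instance is passed as the scale_est argument of RLM.fit so the IRLS iterations use Huber's scale estimate.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.robust.norms import HuberT
from statsmodels.robust.scale import HuberScale

# Simulated regression data with a handful of outliers (purely illustrative).
rng = np.random.default_rng(0)
x = sm.add_constant(rng.normal(size=(100, 1)))
y = x @ np.array([1.0, 2.0]) + rng.normal(size=100)
y[:5] += 10.0

# Use HuberScale as the scale estimate inside the IRLS fit of a robust model.
res = sm.RLM(y, x, M=HuberT()).fit(scale_est=HuberScale(d=2.5, tol=1e-08, maxiter=30))
print(res.params)
print(res.scale)
```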

NegativeBinomial.inverse_deriv()

statsmodels.genmod.families.links.NegativeBinomial.inverse_deriv

NegativeBinomial.inverse_deriv(z)

Derivative of the inverse of the negative binomial transform.

Parameters:
    z : array-like
        Usually the linear predictor for a GLM or GEE model.

Returns:
    The value of the derivative of the inverse of the negative binomial link.
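
A small sketch, assuming alpha=1 and a few negative linear-predictor values so the inverse link stays in the valid mean range; the finite-difference comparison is only there to illustrate what the method computes.

```python
import numpy as np
from statsmodels.genmod.families.links import NegativeBinomial

link = NegativeBinomial(alpha=1.0)
z = np.array([-2.0, -1.0, -0.5])   # linear predictor values (illustrative)

# Derivative of the inverse negative binomial link, evaluated at z.
print(link.inverse_deriv(z))

# Finite-difference check against link.inverse (illustration only).
eps = 1e-6
print((link.inverse(z + eps) - link.inverse(z - eps)) / (2.0 * eps))
```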

static IVRegressionResults.resid_pearson()

statsmodels.sandbox.regression.gmm.IVRegressionResults.resid_pearson

static IVRegressionResults.resid_pearson()

Residuals, normalized to have unit variance.

Returns:
    An array wresid / sqrt(scale).
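
A sketch using IV2SLS from the same sandbox module, with a simulated endogenous regressor; resid_pearson is read as an attribute on the fitted results, and all names and coefficients below are arbitrary.

```python
import numpy as np
from statsmodels.sandbox.regression.gmm import IV2SLS

# Simulated instrumental-variables setup (purely illustrative).
rng = np.random.default_rng(0)
n = 200
z = rng.normal(size=(n, 2))                                   # instruments
u = rng.normal(size=n)                                        # confounder
x = z @ np.array([1.0, -0.5]) + 0.5 * u + rng.normal(size=n)  # endogenous regressor
y = 1.0 + 2.0 * x + u

exog = np.column_stack([np.ones(n), x])
instrument = np.column_stack([np.ones(n), z])
res = IV2SLS(y, exog, instrument=instrument).fit()

# Residuals scaled to unit variance, i.e. wresid / sqrt(scale).
print(res.resid_pearson[:5])
```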

DescStatUV.test_mean()

statsmodels.emplike.descriptive.DescStatUV.test_mean

DescStatUV.test_mean(mu0, return_weights=False)

Returns -2 x log-likelihood ratio, p-value, and weights for a hypothesis test of the mean.

Parameters:
    mu0 : float
        Mean value to be tested.
    return_weights : bool
        If return_weights is True, the function returns the weights of the observations under the null hypothesis. Default is False.

Returns:
    test_results : tuple
        The log-likelihood ratio and p-value of mu0.
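
A short sketch with simulated data; the null value of 5 and the sample size below are arbitrary choices for illustration.

```python
import numpy as np
from statsmodels.emplike.descriptive import DescStatUV

# Simulated univariate sample (size and distribution are arbitrary).
rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=100)

el = DescStatUV(sample)

# Empirical likelihood test that the mean equals 5: (-2 log LR, p-value).
llr, pval = el.test_mean(5.0)
print(llr, pval)
```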