sandbox.distributions.transformed.SquareFunc

statsmodels.sandbox.distributions.transformed.SquareFunc

class statsmodels.sandbox.distributions.transformed.SquareFunc [source]

Class to hold the quadratic function together with its inverse function and derivative.

Uses instance methods instead of class methods, in case we want to extend this to parameterized functions.

Methods:
    derivminus(x)
    derivplus(x)
    inverseminus(x)
    inverseplus(x)
    squarefunc(x)
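As a rough, stand-alone sketch (not the statsmodels implementation; the exact semantics of the branch methods are assumed here from their names), such a class might look like:

```python
import math

class SquareFuncSketch:
    """Minimal sketch of a class holding f(x) = x**2 with its two
    inverse branches and their derivatives (assumed semantics)."""

    def squarefunc(self, x):
        # forward transform: f(x) = x**2
        return x ** 2

    def inverseplus(self, x):
        # positive branch of the inverse: sqrt(x)
        return math.sqrt(x)

    def inverseminus(self, x):
        # negative branch of the inverse: -sqrt(x)
        return -math.sqrt(x)

    def derivplus(self, x):
        # derivative of the positive inverse branch: 1 / (2 sqrt(x))
        return 0.5 / math.sqrt(x)

    def derivminus(self, x):
        # derivative of the negative inverse branch: -1 / (2 sqrt(x))
        return -0.5 / math.sqrt(x)

sq = SquareFuncSketch()
print(sq.squarefunc(3.0), sq.inverseplus(9.0), sq.derivplus(4.0))
```

Instance methods (rather than class methods) keep the door open for attaching parameters, e.g. a shift or scale, to each instance.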

Summary.as_csv()

statsmodels.iolib.summary.Summary.as_csv

Summary.as_csv() [source]

Return tables as a string.

Returns:
    csv : string
        Concatenated summary tables in comma-delimited format.
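The general idea, several tables concatenated into one comma-delimited string, can be sketched with the standard library (the function name and table layout here are made up for illustration; the real as_csv output format is not reproduced):

```python
import csv
import io

def tables_as_csv(tables):
    """Render a list of tables (each a list of rows) as one
    comma-delimited string, with a blank line between tables."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for i, table in enumerate(tables):
        if i:
            buf.write("\n")  # blank-line separator between tables
        writer.writerows(table)
    return buf.getvalue()

header = [["coef", "std err"], ["1.23", "0.45"]]
notes = [["Notes", "example only"]]
print(tables_as_csv([header, notes]))
```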

OLSResults.remove_data()

statsmodels.regression.linear_model.OLSResults.remove_data

OLSResults.remove_data()

Remove data arrays (all nobs-length arrays) from the result and model.

This reduces the size of the instance, so it can be pickled with less memory. Currently tested for use with predict from an unpickled results and model instance.

Warning: Since the data and some intermediate results have been removed, calculating new statistics that require them will raise exceptions. The exception will occur the first time an attribute is accessed.
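The memory trade-off can be illustrated with a hypothetical results object (ResultsSketch and its attributes are invented for this sketch; only the stdlib is used):

```python
import pickle

class ResultsSketch:
    """Hypothetical results object holding per-observation arrays."""

    def __init__(self, nobs):
        self.params = [0.5, 1.5]               # small: always kept
        self.fittedvalues = list(range(nobs))  # nobs-sized: droppable
        self.resid = [0.0] * nobs              # nobs-sized: droppable

    def remove_data(self):
        # drop all nobs-length arrays; any later computation that
        # needs them will fail when it hits a None attribute
        self.fittedvalues = None
        self.resid = None

res = ResultsSketch(nobs=100000)
full = len(pickle.dumps(res))
res.remove_data()
slim = len(pickle.dumps(res))
print(full, slim)  # the trimmed pickle is far smaller
```

The parameters survive, so anything that only needs them (such as prediction on new data) still works after unpickling.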

sandbox.stats.multicomp.tiecorrect()

statsmodels.sandbox.stats.multicomp.tiecorrect

statsmodels.sandbox.stats.multicomp.tiecorrect(xranks) [source]

Should be equivalent to scipy.stats.tiecorrect.
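The usual tie-correction factor for rank tests is 1 - sum(t**3 - t) / (n**3 - n), where t runs over the sizes of the tied groups; this is what scipy.stats.tiecorrect computes from a vector of ranks. A minimal stdlib sketch:

```python
from collections import Counter

def tiecorrect_sketch(ranks):
    """Tie-correction factor 1 - sum(t**3 - t) / (n**3 - n), where
    t runs over the sizes of tied groups in the rank vector."""
    n = len(ranks)
    if n < 2:
        return 1.0
    group_sizes = Counter(ranks).values()
    return 1.0 - sum(t ** 3 - t for t in group_sizes) / float(n ** 3 - n)

# ranks [1, 2.5, 2.5, 4] have one tied pair -> factor 0.9
print(tiecorrect_sketch([1.0, 2.5, 2.5, 4.0]))
```

With no ties every group has size 1, so the factor is exactly 1.0 (no correction).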

ANOVA

ANOVA

Analysis of Variance models.

Examples

In [1]: import statsmodels.api as sm

In [2]: from statsmodels.formula.api import ols

In [3]: moore = sm.datasets.get_rdataset("Moore", "car", cache=True)  # load data

In [4]: data = moore.data

In [5]: data = data.rename(columns={"partner.status": "partner_status"})  # make name pythonic

In [6]: moore_lm = ols('conformity ~ C(fcategory, Sum)*C(partner_status, Sum)', data=data).fit()
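Independent of the formula interface, the quantity a one-way ANOVA tests can be computed directly; a minimal pure-Python sketch of the F statistic (between-group over within-group mean squares):

```python
def oneway_anova_f(groups):
    """One-way ANOVA F statistic for a list of samples:
    F = (SSB / (k - 1)) / (SSW / (N - k))."""
    k = len(groups)
    ns = [len(g) for g in groups]
    N = sum(ns)
    grand = sum(sum(g) for g in groups) / N
    means = [sum(g) / len(g) for g in groups]
    # between-group sum of squares
    ssb = sum(n * (m - grand) ** 2 for n, m in zip(ns, means))
    # within-group sum of squares
    ssw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ssb / (k - 1)) / (ssw / (N - k))

f = oneway_anova_f([[10, 12, 11], [20, 21, 19], [30, 29, 31]])
print(round(f, 2))  # well-separated group means -> large F
```

Large F means the group means differ by much more than the within-group scatter would suggest; statsmodels' anova_lm reports this together with degrees of freedom and p-values.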

static RLMResults.llf()

statsmodels.robust.robust_linear_model.RLMResults.llf

static RLMResults.llf()

ProbitResults.predict()

statsmodels.discrete.discrete_model.ProbitResults.predict

ProbitResults.predict(exog=None, transform=True, *args, **kwargs)

Call self.model.predict with self.params as the first argument.

Parameters:
    exog : array-like, optional
        The values for which you want to predict.
    transform : bool, optional
        If the model was fit via a formula, whether to pass exog through the formula. Default is True. E.g., if you fit a model y ~ log(x1) + log(x2), and transform is True, then you can pass a data structure that contains x1 and x2 in their original form.
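For a probit model, the predicted probability is Phi(x @ beta), the standard normal CDF of the linear predictor. A minimal sketch (the coefficients below are hypothetical, and this bypasses the formula-transform machinery the real method handles):

```python
import math

def norm_cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def probit_predict(params, exog_rows):
    """Predicted probabilities Phi(x @ beta) for each row of exog."""
    return [norm_cdf(sum(b * x for b, x in zip(params, row)))
            for row in exog_rows]

# hypothetical fitted coefficients: intercept and one slope
params = [-0.5, 1.0]
rows = [[1.0, 0.0], [1.0, 0.5], [1.0, 2.0]]
print([round(p, 3) for p in probit_predict(params, rows)])
```

Probabilities are monotone in the linear predictor and always lie strictly between 0 and 1.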

KDEMultivariate.loo_likelihood()

statsmodels.nonparametric.kernel_density.KDEMultivariate.loo_likelihood

KDEMultivariate.loo_likelihood(bw, func=<identity function>) [source]

Returns the leave-one-out likelihood function for the unconditional KDE.

Parameters:
    bw : array_like
        The value for the bandwidth parameter(s).
    func : callable, optional
        Function to transform the likelihood values (before summing); for the log likelihood, use func=np.log. Default is f(x) = x.
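For intuition, a one-dimensional Gaussian-kernel version of the leave-one-out log likelihood (i.e. with func = log) can be written directly; this is a sketch of the idea, not the multivariate statsmodels implementation:

```python
import math

def loo_loglikelihood_1d(data, bw):
    """Leave-one-out log likelihood of a 1-D Gaussian KDE: for each
    point, evaluate the density built from all *other* points, then
    sum the logs of those densities."""
    n = len(data)
    norm = (n - 1) * bw * math.sqrt(2.0 * math.pi)
    total = 0.0
    for i, xi in enumerate(data):
        dens = sum(math.exp(-0.5 * ((xi - xj) / bw) ** 2)
                   for j, xj in enumerate(data) if j != i) / norm
        total += math.log(dens)
    return total

data = [0.0, 1.0, 2.0, 3.0]
print(loo_loglikelihood_1d(data, bw=0.5))
```

Maximizing this quantity over bw is the likelihood cross-validation approach to bandwidth selection: an absurdly small bandwidth scores badly because each point sees almost no density from its neighbors.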

NonlinearIVGMM.gradient_momcond()

statsmodels.sandbox.regression.gmm.NonlinearIVGMM.gradient_momcond

NonlinearIVGMM.gradient_momcond(params, epsilon=0.0001, centered=True)

Gradient of the moment conditions.

Parameters:
    params : ndarray
        Parameters at which the moment conditions are evaluated.
    epsilon : float
        Step size for the finite difference calculation.
    centered : bool
        Refers to the finite difference calculation. If centered is true, the centered finite difference calculation is used; otherwise the one-sided forward difference is used.
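The two finite-difference schemes can be illustrated on a scalar function (the method applies them to the vector of moment conditions instead); the centered scheme has O(epsilon**2) error versus O(epsilon) for the forward scheme:

```python
def forward_diff(f, x, epsilon=1e-4):
    # one-sided forward difference: (f(x + h) - f(x)) / h
    return (f(x + epsilon) - f(x)) / epsilon

def centered_diff(f, x, epsilon=1e-4):
    # centered difference: (f(x + h) - f(x - h)) / (2 h)
    return (f(x + epsilon) - f(x - epsilon)) / (2.0 * epsilon)

f = lambda p: p ** 3          # exact derivative: 3 * p**2, so f'(2) = 12
print(forward_diff(f, 2.0), centered_diff(f, 2.0))
```

The extra accuracy of the centered scheme costs one additional function evaluation per parameter, which is why it is offered as an option rather than hard-coded.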

static OLSInfluence.dffits()

statsmodels.stats.outliers_influence.OLSInfluence.dffits

static OLSInfluence.dffits() [source] (cached attribute)

DFFITS measure for the influence of an observation, based on resid_studentized_external; uses results from the leave-one-observation-out loop.

It is recommended that observations with DFFITS larger than a threshold of 2 * sqrt(k / n), where k is the number of parameters, should be investigated.

Returns:
    dffits : float
    dffits_threshold : float

References: Wikipedia
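For the simple regression y = a + b*x (k = 2 parameters), DFFITS and its threshold can be computed with an explicit leave-one-observation-out loop; an illustrative pure-Python sketch, while OLSInfluence handles the general multivariate case:

```python
import math

def simple_ols(xs, ys):
    """Intercept and slope of y = a + b*x by least squares."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    return ybar - b * xbar, b

def dffits_simple(xs, ys):
    """DFFITS_i = (yhat_i - yhat_(i),i) / (s_(i) * sqrt(h_ii)),
    via an explicit leave-one-observation-out loop."""
    n = len(xs)
    a, b = simple_ols(xs, ys)
    xbar = sum(xs) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    out = []
    for i in range(n):
        h = 1.0 / n + (xs[i] - xbar) ** 2 / sxx    # leverage of obs i
        xs_i = xs[:i] + xs[i + 1:]
        ys_i = ys[:i] + ys[i + 1:]
        ai, bi = simple_ols(xs_i, ys_i)            # fit without obs i
        s2 = sum((y - ai - bi * x) ** 2
                 for x, y in zip(xs_i, ys_i)) / (n - 1 - 2)
        pred_full = a + b * xs[i]
        pred_loo = ai + bi * xs[i]
        out.append((pred_full - pred_loo) / math.sqrt(s2 * h))
    threshold = 2.0 * math.sqrt(2.0 / n)           # 2 * sqrt(k / n), k = 2
    return out, threshold

xs = [float(i) for i in range(10)]
ys = [2.0 * x + (0.1 if i % 2 else -0.1) for i, x in enumerate(xs)]
ys[9] += 25.0                                      # one gross outlier
d, thr = dffits_simple(xs, ys)
print(max(range(10), key=lambda i: abs(d[i])), thr)
```

The planted outlier stands far above the 2 * sqrt(k / n) threshold, while the well-behaved points stay near it.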