StataReader.file_timestamp()

statsmodels.iolib.foreign.StataReader.file_timestamp StataReader.file_timestamp() [source] Returns the date and time that Stata recorded at the last save of the file. Returns: out : str

stats.diagnostic.linear_lm()

statsmodels.stats.diagnostic.linear_lm statsmodels.stats.diagnostic.linear_lm(resid, exog, func=None) Lagrange multiplier test for linearity against a functional alternative. Limitations: currently assumes that the first column is integer. It does not check whether the transformed variables contain NaNs, for example the log of a negative number. Parameters: resid : ndarray residuals of a regression exog : ndarray exogenous variables for which linearity is tested func : callable If func is None, squares of the exogenous variables are used as the transformed regressors.
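The auxiliary-regression idea behind this LM statistic can be sketched in plain NumPy/SciPy. This is an illustration, not the library's implementation: the hypothetical `linear_lm_sketch` regresses the residuals on the original regressors plus their squares (the default when `func` is None) and forms the statistic n * R^2, on synthetic data.

```python
import numpy as np
from scipy import stats

def linear_lm_sketch(resid, exog):
    """Auxiliary regression of the residuals on exog plus squared terms;
    the statistic n * R^2 is asymptotically chi2 with df equal to the
    number of added regressors."""
    n = resid.shape[0]
    # Assume the first column of exog is the constant; square the rest.
    exog_aug = np.column_stack([exog, exog[:, 1:] ** 2])
    beta, *_ = np.linalg.lstsq(exog_aug, resid, rcond=None)
    fitted = exog_aug @ beta
    ss_res = ((resid - fitted) ** 2).sum()
    ss_tot = ((resid - resid.mean()) ** 2).sum()
    lm = n * (1.0 - ss_res / ss_tot)
    df = exog.shape[1] - 1  # number of squared terms added
    return lm, stats.chi2.sf(lm, df)

# Synthetic linear data, so the test should usually not reject.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
X = np.column_stack([np.ones(200), x])
y = 1.0 + 2.0 * x + rng.normal(size=200)
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
lm, pval = linear_lm_sketch(y - X @ beta_ols, X)
```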

SkewNorm_gen.fit_loc_scale()

statsmodels.sandbox.distributions.extras.SkewNorm_gen.fit_loc_scale SkewNorm_gen.fit_loc_scale(data, *args) Estimate loc and scale parameters from data using 1st and 2nd moments. Parameters: data : array_like Data to fit. arg1, arg2, arg3,... : array_like The shape parameter(s) for the distribution (see docstring of the instance object for more information). Returns: Lhat : float Estimated location parameter for the data. Shat : float Estimated scale parameter for the data.
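SkewNorm_gen inherits this method from scipy.stats.rv_continuous, so the moment-matching idea can be illustrated with SciPy's normal distribution on synthetic data (an illustrative sketch; for the standard normal, the first two standard moments are 0 and 1, so the estimates reduce to the sample mean and standard deviation).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=2.0, size=10_000)

# Moment matching: with standard moments mu, mu2 of the unscaled
# distribution, Shat = sqrt(var(data) / mu2) and Lhat = mean(data) - Shat * mu.
loc_hat, scale_hat = stats.norm.fit_loc_scale(data)
```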

MNLogit.hessian()

statsmodels.discrete.discrete_model.MNLogit.hessian MNLogit.hessian(params) [source] Multinomial logit Hessian matrix of the log-likelihood. Parameters: params : array-like The parameters of the model Returns: hess : ndarray, (J*K, J*K) The Hessian, second derivative of the loglikelihood function with respect to the flattened parameters, evaluated at params. Notes: the block for categories j and l involves delta_jl, which equals 1 if j = l and 0 otherwise. The actual Hessian matrix has J**2 * K x K elements. Our Hessian is reshaped to be square (J*K, J*K) so that it can be used with standard optimization routines.

DiscreteResults.predict()

statsmodels.discrete.discrete_model.DiscreteResults.predict DiscreteResults.predict(exog=None, transform=True, *args, **kwargs) Call self.model.predict with self.params as the first argument. Parameters: exog : array-like, optional The values for which you want to predict. transform : bool, optional If the model was fit via a formula, whether to pass exog through the formula. Default is True. E.g., if you fit a model y ~ log(x1) + log(x2), and transform is True, then you can pass a data structure that contains x1 and x2 in their original form; otherwise, you would need to log the data first.
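A sketch of the transform behaviour using a formula-based Logit on synthetic data (all variable names hypothetical): because the model is fit with np.log(x1) in the formula, new observations can be passed in their original units and the formula machinery applies the log before predicting.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"x1": rng.uniform(1.0, 10.0, size=200)})
df["y"] = (np.log(df["x1"]) + 0.5 * rng.normal(size=200) > 1.0).astype(int)

res = smf.logit("y ~ np.log(x1)", data=df).fit(disp=0)

# transform=True (the default): pass x1 in original form; the formula
# applies np.log(x1) internally before computing predicted probabilities.
new = pd.DataFrame({"x1": [2.0, 5.0]})
probs = res.predict(new)
```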

GEEMargins.conf_int()

statsmodels.genmod.generalized_estimating_equations.GEEMargins.conf_int GEEMargins.conf_int(alpha=0.05) [source] Returns the confidence intervals of the marginal effects Parameters: alpha : float Number between 0 and 1. The confidence intervals have coverage probability 1-alpha. Returns: conf_int : ndarray An array with lower and upper confidence intervals for the marginal effects.
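The normal-approximation interval this describes can be sketched directly, using hypothetical marginal effects and standard errors (an illustration of the 1-alpha coverage construction, not the library's internals):

```python
import numpy as np
from scipy import stats

def margins_conf_int(marg_eff, se, alpha=0.05):
    """Normal-approximation interval: marg_eff +/- z_{1-alpha/2} * se,
    returned as an array of (lower, upper) pairs."""
    z = stats.norm.ppf(1.0 - alpha / 2.0)
    return np.column_stack([marg_eff - z * se, marg_eff + z * se])

# Hypothetical marginal effects and standard errors.
ci = margins_conf_int(np.array([0.12, -0.05]), np.array([0.03, 0.02]))
```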

stats.diagnostic.linear_rainbow()

statsmodels.stats.diagnostic.linear_rainbow statsmodels.stats.diagnostic.linear_rainbow(res, frac=0.5) Rainbow test for linearity. The null hypothesis is that the regression is correctly modelled as linear; the alternatives for which the test has power are convex (nonlinear) specifications. Parameters: res : Result instance result of an OLS regression frac : float fraction of the observations included in the central subsample Returns: fstat : float test statistic based on an F test pvalue : float pvalue of the test
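One way to sketch the idea, as an illustration rather than the library's exact algorithm: fit OLS on a central fraction of the sample, then F-test whether the full-sample fit is significantly worse (all names and data synthetic).

```python
import numpy as np
from scipy import stats

def rainbow_sketch(y, X, frac=0.5):
    """Compare the OLS fit on a central subsample with the full-sample fit.
    Under the null of linearity the full fit is not much worse."""
    n, k = X.shape
    lower = int(np.ceil(n * (1.0 - frac) / 2.0))
    nobs_sub = int(np.floor(n * frac))
    Xs, ys = X[lower:lower + nobs_sub], y[lower:lower + nobs_sub]

    def ssr(Xa, ya):
        b, *_ = np.linalg.lstsq(Xa, ya, rcond=None)
        return ((ya - Xa @ b) ** 2).sum()

    num = (ssr(X, y) - ssr(Xs, ys)) / (n - nobs_sub)
    den = ssr(Xs, ys) / (nobs_sub - k)
    fstat = num / den
    return fstat, stats.f.sf(fstat, n - nobs_sub, nobs_sub - k)

# Synthetic linear data, so the test should usually not reject.
rng = np.random.default_rng(2)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(size=n)
fstat, pval = rainbow_sketch(y, X)
```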

NegativeBinomialResults.initialize()

statsmodels.discrete.discrete_model.NegativeBinomialResults.initialize NegativeBinomialResults.initialize(model, params, **kwd)

static ProbPlot.sample_percentiles()

statsmodels.graphics.gofplots.ProbPlot.sample_percentiles static ProbPlot.sample_percentiles() [source]

nonparametric.bandwidths.bw_silverman()

statsmodels.nonparametric.bandwidths.bw_silverman statsmodels.nonparametric.bandwidths.bw_silverman(x, kernel=None) [source] Silverman's Rule of Thumb Parameters: x : array-like Array for which to get the bandwidth kernel : CustomKernel object Unused Returns: bw : float The estimate of the bandwidth Notes Returns .9 * A * n ** (-1/5.) where A = min(std(x, ddof=1), IQR/1.349) IQR = np.subtract.reduce(np.percentile(x, [75,25])) References Silverman, B.W. (1986) Density Estimation for Statistics and Data Analysis. Chapman and Hall.
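The Notes formula can be written out directly in NumPy (a sketch of the rule on synthetic data, not the library's implementation):

```python
import numpy as np

def bw_silverman_sketch(x):
    """Silverman's rule of thumb: 0.9 * A * n**(-1/5),
    where A = min(std(x, ddof=1), IQR / 1.349)."""
    n = len(x)
    iqr = np.percentile(x, 75) - np.percentile(x, 25)
    a = min(np.std(x, ddof=1), iqr / 1.349)
    return 0.9 * a * n ** (-0.2)

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
bw = bw_silverman_sketch(x)
```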