MultinomialResults.summary()

statsmodels.discrete.discrete_model.MultinomialResults.summary

MultinomialResults.summary(yname=None, xname=None, title=None, alpha=0.05, yname_list=None)

Summarize the regression results.

Parameters:
    yname : string, optional
        Name for the endogenous variable. Default is `y`.
    xname : list of strings, optional
        Names for the regressors. Default is `var_##` for ## in range(p), where p is the number of regressors.
    title : string, optional
        Title for the top table. If not None, this replaces the default title.
    alpha : float
        Significance level for the confidence intervals.

MultinomialResults.save()

statsmodels.discrete.discrete_model.MultinomialResults.save

MultinomialResults.save(fname, remove_data=False)

Save a pickle of this instance.

Parameters:
    fname : string or filehandle
        A file path or filename, or an open filehandle.
    remove_data : bool
        If False (default), the instance is pickled without changes. If True, all arrays with length nobs are set to None before pickling; see the remove_data method. In some cases not all arrays will be set to None.

MultinomialResults.remove_data()

statsmodels.discrete.discrete_model.MultinomialResults.remove_data

MultinomialResults.remove_data()

Remove data arrays (all nobs-length arrays) from the result and model. This reduces the size of the instance so it can be pickled with less memory. Currently tested for use with predict from an unpickled results and model instance.

Warning: since the data and some intermediate results have been removed, calculating new statistics that require them will raise exceptions. The exception will occur the first time such a statistic is accessed.

MultinomialResults.predict()

statsmodels.discrete.discrete_model.MultinomialResults.predict

MultinomialResults.predict(exog=None, transform=True, *args, **kwargs)

Call self.model.predict with self.params as the first argument.

Parameters:
    exog : array-like, optional
        The values for which you want to predict.
    transform : bool, optional
        If the model was fit via a formula, whether to pass exog through the formula. Default is True. E.g., if you fit a model y ~ log(x1) + log(x2), and transform is True, then you can pass a data structure that contains x1 and x2 in their original form.

MultinomialResults.normalized_cov_params()

statsmodels.discrete.discrete_model.MultinomialResults.normalized_cov_params

MultinomialResults.normalized_cov_params()

MultinomialResults.pred_table()

statsmodels.discrete.discrete_model.MultinomialResults.pred_table

MultinomialResults.pred_table()

Returns the J x J prediction table.

Notes: pred_table[i, j] is the number of times outcome i was observed and the model predicted j. Correct predictions lie along the diagonal.

MultinomialResults.load()

statsmodels.discrete.discrete_model.MultinomialResults.load

classmethod MultinomialResults.load(fname)

Load a pickled instance (class method).

Parameters:
    fname : string or filehandle
        A file path or filename, or an open filehandle.

Returns:
    The unpickled instance.

MultinomialResults.initialize()

statsmodels.discrete.discrete_model.MultinomialResults.initialize

MultinomialResults.initialize(model, params, **kwd)

MultinomialResults.margeff()

statsmodels.discrete.discrete_model.MultinomialResults.margeff

MultinomialResults.margeff()

MultinomialResults.f_test()

statsmodels.discrete.discrete_model.MultinomialResults.f_test

MultinomialResults.f_test(r_matrix, cov_p=None, scale=1.0, invcov=None)

Compute the F-test for a joint linear hypothesis. This is a special case of wald_test that always uses the F distribution.

Parameters:
    r_matrix : array-like, str, or tuple
        array : An r x k array where r is the number of restrictions to test and k is the number of regressors. It is assumed that the linear combination is equal to zero.
        str : The full hypotheses to test can be given as a string.