OLSResults.el_test()

statsmodels.regression.linear_model.OLSResults.el_test

OLSResults.el_test(b0_vals, param_nums, return_weights=0, ret_params=0, method='nm', stochastic_exog=1)

Tests single or joint hypotheses of the regression parameters using Empirical Likelihood.

Parameters:

b0_vals : 1darray

The hypothesized values of the parameters to be tested

param_nums : 1darray

The parameter numbers to be tested

return_weights : bool

If True, returns the weights that optimize the likelihood ratio at b0_vals. Default is False

ret_params : bool

If True, returns the parameter vector that maximizes the likelihood ratio at b0_vals. Also returns the weights. Default is False

method : string

Can either be 'nm' for Nelder-Mead or 'powell' for Powell. The optimization method that optimizes over nuisance parameters. Default is 'nm'

stochastic_exog : bool

When True, the exogenous variables are assumed to be stochastic. When the regressors are nonstochastic, moment conditions are placed on the exogenous variables. Confidence intervals for stochastic regressors are at least as large as those for non-stochastic regressors. Default is True

Returns:

res : tuple

The p-value and -2 times the log-likelihood ratio for the hypothesized values.
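Because b0_vals and param_nums are 1darrays, a joint hypothesis is tested by passing one hypothesized value per parameter number. A minimal sketch, assuming the fitted stackloss model from the Examples below; the returned tuple has the same (p-value, -2 log-likelihood ratio) form described above:

>>> # Jointly test that the slopes on the first two variables are both 0
>>> fitted.el_test([0, 0], [1, 2])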

Examples

>>> import statsmodels.api as sm
>>> data = sm.datasets.stackloss.load()
>>> endog = data.endog
>>> exog = sm.add_constant(data.exog)
>>> model = sm.OLS(endog, exog)
>>> fitted = model.fit()
>>> fitted.params
array([-39.91967442,   0.7156402 ,   1.29528612,  -0.15212252])
>>> fitted.rsquared
0.91357690446068196
>>> # Test that the slope on the first variable is 0
>>> fitted.el_test([0], [1])
(1.7894660442330235e-07, 27.248146353709153)
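The optional flags follow the same call pattern. A hedged sketch; the exact shape of the extended return value (which additionally carries the weights and, with ret_params, the restricted parameter vector) is not reproduced here:

>>> # Also return the probability weights that attain the restricted optimum
>>> fitted.el_test([0], [1], return_weights=1)
>>> # Treat the regressors as non-stochastic
>>> fitted.el_test([0], [1], stochastic_exog=0)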