Regression diagnostics

This example shows how to use a few of the statsmodels regression diagnostic tests in a real-life context. You can learn about more tests and find more information on the Regression Diagnostics page.

Note that most of the tests described here only return a tuple of numbers, without any annotation. A full description of the outputs is always included in the docstring and in the online statsmodels documentation. For presentation purposes, we use the lzip(name, test) construct to pretty-print short descriptions in the examples below.
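The lzip helper is simply a Python 2/3 compatibility wrapper equivalent to list(zip(...)). A minimal sketch of the labeling pattern, with made-up statistic values for illustration:

```python
# statsmodels' lzip is equivalent to list(zip(...)); it pairs each
# label with the corresponding number in the test's result tuple.
# The values below are made up for illustration.
name = ['statistic', 'p-value']
test = (3.39, 0.18)
pairs = list(zip(name, test))
print(pairs)  # [('statistic', 3.39), ('p-value', 0.18)]
```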

Estimate a regression model

In [1]:
<span class="kn">from</span> <span class="nn">__future__</span> <span class="kn">import</span> <span class="n">print_function</span>
<span class="kn">from</span> <span class="nn">statsmodels.compat</span> <span class="kn">import</span> <span class="n">lzip</span>
<span class="kn">import</span> <span class="nn">statsmodels</span>
<span class="kn">import</span> <span class="nn">numpy</span> <span class="kn">as</span> <span class="nn">np</span>
<span class="kn">import</span> <span class="nn">pandas</span> <span class="kn">as</span> <span class="nn">pd</span>
<span class="kn">import</span> <span class="nn">statsmodels.formula.api</span> <span class="kn">as</span> <span class="nn">smf</span>
<span class="kn">import</span> <span class="nn">statsmodels.stats.api</span> <span class="kn">as</span> <span class="nn">sms</span>
<span class="kn">import</span> <span class="nn">matplotlib.pyplot</span> <span class="kn">as</span> <span class="nn">plt</span>
 
<span class="c"># Load data</span>
<span class="n">url</span> <span class="o">=</span> <span class="s">'http://vincentarelbundock.github.io/Rdatasets/csv/HistData/Guerry.csv'</span>
<span class="n">dat</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">url</span><span class="p">)</span>
 
<span class="c"># Fit regression model (using the natural log of one of the regressors)</span>
<span class="n">results</span> <span class="o">=</span> <span class="n">smf</span><span class="o">.</span><span class="n">ols</span><span class="p">(</span><span class="s">'Lottery ~ Literacy + np.log(Pop1831)'</span><span class="p">,</span> <span class="n">data</span><span class="o">=</span><span class="n">dat</span><span class="p">)</span><span class="o">.</span><span class="n">fit</span><span class="p">()</span>
 
<span class="c"># Inspect the results</span>
<span class="k">print</span><span class="p">(</span><span class="n">results</span><span class="o">.</span><span class="n">summary</span><span class="p">())</span>
                            OLS Regression Results
==============================================================================
Dep. Variable:                Lottery   R-squared:                       0.348
Model:                            OLS   Adj. R-squared:                  0.333
Method:                 Least Squares   F-statistic:                     22.20
Date:                Tue, 02 Dec 2014   Prob (F-statistic):           1.90e-08
Time:                        12:52:28   Log-Likelihood:                -379.82
No. Observations:                  86   AIC:                             765.6
Df Residuals:                      83   BIC:                             773.0
Df Model:                           2
Covariance Type:            nonrobust
===================================================================================
                      coef    std err          t      P>|t|      [95.0% Conf. Int.]
-----------------------------------------------------------------------------------
Intercept         246.4341     35.233      6.995      0.000       176.358   316.510
Literacy           -0.4889      0.128     -3.832      0.000        -0.743    -0.235
np.log(Pop1831)   -31.3114      5.977     -5.239      0.000       -43.199   -19.424
==============================================================================
Omnibus:                        3.713   Durbin-Watson:                   2.019
Prob(Omnibus):                  0.156   Jarque-Bera (JB):                3.394
Skew:                          -0.487   Prob(JB):                        0.183
Kurtosis:                       3.003   Cond. No.                         702.
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.

Normality of the residuals

Jarque-Bera test:

In [2]:
<span class="n">name</span> <span class="o">=</span> <span class="p">[</span><span class="s">'Jarque-Bera'</span><span class="p">,</span> <span class="s">'Chi^2 two-tail prob.'</span><span class="p">,</span> <span class="s">'Skew'</span><span class="p">,</span> <span class="s">'Kurtosis'</span><span class="p">]</span>
<span class="n">test</span> <span class="o">=</span> <span class="n">sms</span><span class="o">.</span><span class="n">jarque_bera</span><span class="p">(</span><span class="n">results</span><span class="o">.</span><span class="n">resid</span><span class="p">)</span>
<span class="n">lzip</span><span class="p">(</span><span class="n">name</span><span class="p">,</span> <span class="n">test</span><span class="p">)</span>
Out[2]:
[('Jarque-Bera', 3.393608024843173),
 ('Chi^2 two-tail prob.', 0.1832683123166331),
 ('Skew', -0.4865803431122342),
 ('Kurtosis', 3.003417757881634)]
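The statistic itself is easy to reproduce with plain NumPy. A minimal sketch on synthetic residuals (not the Guerry regression above), using the textbook formula JB = n/6 · (S² + (K − 3)²/4):

```python
import numpy as np

# Jarque-Bera by hand on synthetic "residuals" (illustrative only):
# JB = n/6 * (S^2 + (K - 3)^2 / 4), where S is the sample skewness
# and K the (non-excess) sample kurtosis of the residuals.
rng = np.random.default_rng(0)
resid = rng.normal(size=500)
z = (resid - resid.mean()) / resid.std()
S = np.mean(z**3)   # sample skewness
K = np.mean(z**4)   # sample kurtosis
JB = resid.size / 6.0 * (S**2 + (K - 3.0)**2 / 4.0)
print(JB)
```

For normal residuals, S should be near 0 and K near 3, so JB stays small; under non-normality it grows and the chi-squared(2) p-value shrinks.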

Omnibus test:

In [3]:
<span class="n">name</span> <span class="o">=</span> <span class="p">[</span><span class="s">'Chi^2'</span><span class="p">,</span> <span class="s">'Two-tail probability'</span><span class="p">]</span>
<span class="n">test</span> <span class="o">=</span> <span class="n">sms</span><span class="o">.</span><span class="n">omni_normtest</span><span class="p">(</span><span class="n">results</span><span class="o">.</span><span class="n">resid</span><span class="p">)</span>
<span class="n">lzip</span><span class="p">(</span><span class="n">name</span><span class="p">,</span> <span class="n">test</span><span class="p">)</span>
Out[3]:
[('Chi^2', 3.7134378115971902), ('Two-tail probability', 0.15618424580304752)]

Influence tests

Once created, an object of class OLSInfluence holds attributes and methods that allow users to assess the influence of each observation. For example, we can compute and extract the first few rows of DFbetas by:

In [4]:
<span class="kn">from</span> <span class="nn">statsmodels.stats.outliers_influence</span> <span class="kn">import</span> <span class="n">OLSInfluence</span>
<span class="n">test_class</span> <span class="o">=</span> <span class="n">OLSInfluence</span><span class="p">(</span><span class="n">results</span><span class="p">)</span>
<span class="n">test_class</span><span class="o">.</span><span class="n">dfbetas</span><span class="p">[:</span><span class="mi">5</span><span class="p">,:]</span>
Out[4]:
array([[-0.003 ,  0.0029,  0.0012],
       [-0.0643,  0.0404,  0.0628],
       [ 0.0155, -0.0356, -0.0091],
       [ 0.179 ,  0.041 , -0.1806],
       [ 0.2968,  0.2125, -0.3214]])

Explore other options by typing dir(test_class)

Useful information on leverage can also be plotted:

In [5]:
<span class="kn">from</span> <span class="nn">statsmodels.graphics.regressionplots</span> <span class="kn">import</span> <span class="n">plot_leverage_resid2</span>
<span class="n">fig</span><span class="p">,</span> <span class="n">ax</span> <span class="o">=</span> <span class="n">plt</span><span class="o">.</span><span class="n">subplots</span><span class="p">(</span><span class="n">figsize</span><span class="o">=</span><span class="p">(</span><span class="mi">8</span><span class="p">,</span><span class="mi">6</span><span class="p">))</span>
<span class="n">fig</span> <span class="o">=</span> <span class="n">plot_leverage_resid2</span><span class="p">(</span><span class="n">results</span><span class="p">,</span> <span class="n">ax</span> <span class="o">=</span> <span class="n">ax</span><span class="p">)</span>

Other plotting options can be found on the Graphics page.
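The leverage values on the horizontal axis of that plot are the diagonal of the hat matrix. A minimal NumPy sketch on a synthetic design matrix (not the Guerry regression):

```python
import numpy as np

# Leverage (hat) values by hand: h = diag(X (X'X)^{-1} X').
# Synthetic design matrix: intercept plus two random regressors.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
hat = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)
# Each leverage lies in (0, 1); together they sum to the number
# of parameters (here 3), so the average leverage is p/n.
print(hat.sum())
```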

Multicollinearity

Condition number:

In [6]:
<span class="n">np</span><span class="o">.</span><span class="n">linalg</span><span class="o">.</span><span class="n">cond</span><span class="p">(</span><span class="n">results</span><span class="o">.</span><span class="n">model</span><span class="o">.</span><span class="n">exog</span><span class="p">)</span>
Out[6]:
702.17921454900625
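np.linalg.cond reports the ratio of the largest to smallest singular value of the design matrix; large values signal near-collinear columns. A sketch with synthetic data:

```python
import numpy as np

# Condition number by hand: the ratio of the extreme singular
# values of the design matrix. Synthetic X for illustration.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(30), rng.normal(size=30)])
sv = np.linalg.svd(X, compute_uv=False)
cond = sv.max() / sv.min()
# Matches np.linalg.cond(X), the 2-norm condition number.
print(cond)
```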

Heteroskedasticity tests

Breusch-Pagan test:

In [7]:
<span class="n">name</span> <span class="o">=</span> <span class="p">[</span><span class="s">'Lagrange multiplier statistic'</span><span class="p">,</span> <span class="s">'p-value'</span><span class="p">,</span>
        <span class="s">'f-value'</span><span class="p">,</span> <span class="s">'f p-value'</span><span class="p">]</span>
<span class="n">test</span> <span class="o">=</span> <span class="n">sms</span><span class="o">.</span><span class="n">het_breushpagan</span><span class="p">(</span><span class="n">results</span><span class="o">.</span><span class="n">resid</span><span class="p">,</span> <span class="n">results</span><span class="o">.</span><span class="n">model</span><span class="o">.</span><span class="n">exog</span><span class="p">)</span>
<span class="n">lzip</span><span class="p">(</span><span class="n">name</span><span class="p">,</span> <span class="n">test</span><span class="p">)</span>
Out[7]:
[('Lagrange multiplier statistic', 4.8932133740939481),
 ('p-value', 0.086586905023522523),
 ('f-value', 2.5037159462564333),
 ('f p-value', 0.087940287826730287)]
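The Lagrange multiplier statistic here is n times the R² from an auxiliary regression of the squared OLS residuals on the explanatory variables. A self-contained sketch on synthetic data (not the model above):

```python
import numpy as np

# Breusch-Pagan LM statistic by hand: regress the squared OLS
# residuals on the regressors; LM = n * R^2 of that regression.
rng = np.random.default_rng(4)
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(size=n)

# Main OLS fit and squared residuals.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e2 = (y - X @ beta) ** 2

# Auxiliary regression of e^2 on X.
g, *_ = np.linalg.lstsq(X, e2, rcond=None)
resid_aux = e2 - X @ g
r2 = 1.0 - (resid_aux @ resid_aux) / ((e2 - e2.mean()) @ (e2 - e2.mean()))
lm = n * r2
print(lm)
```

Under homoskedasticity the LM statistic is asymptotically chi-squared with degrees of freedom equal to the number of non-constant regressors.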

Goldfeld-Quandt test:

In [8]:
<span class="n">name</span> <span class="o">=</span> <span class="p">[</span><span class="s">'F statistic'</span><span class="p">,</span> <span class="s">'p-value'</span><span class="p">]</span>
<span class="n">test</span> <span class="o">=</span> <span class="n">sms</span><span class="o">.</span><span class="n">het_goldfeldquandt</span><span class="p">(</span><span class="n">results</span><span class="o">.</span><span class="n">resid</span><span class="p">,</span> <span class="n">results</span><span class="o">.</span><span class="n">model</span><span class="o">.</span><span class="n">exog</span><span class="p">)</span>
<span class="n">lzip</span><span class="p">(</span><span class="n">name</span><span class="p">,</span> <span class="n">test</span><span class="p">)</span>
Out[8]:
[('F statistic', 1.1002422436378136), ('p-value', 0.38202950686925313)]
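The core idea of the Goldfeld-Quandt test can be sketched by hand: split the sample, fit OLS on each half, and compare residual variances via an F ratio. A self-contained example on synthetic data (statsmodels additionally supports sorting by a regressor and dropping a middle band):

```python
import numpy as np

# Goldfeld-Quandt idea by hand: fit each half of the (sorted)
# sample separately and compare the residual variances.
rng = np.random.default_rng(3)
n, k = 80, 2                      # observations, parameters
x = np.sort(rng.normal(size=n))
y = 2.0 + 3.0 * x + rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

def rss(Xs, ys):
    """Residual sum of squares from an OLS fit of ys on Xs."""
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    e = ys - Xs @ beta
    return e @ e

h = n // 2
f_stat = (rss(X[h:], y[h:]) / (h - k)) / (rss(X[:h], y[:h]) / (h - k))
print(f_stat)
```

Under homoskedasticity the ratio follows an F distribution with (h − k, h − k) degrees of freedom, so values far from 1 are evidence of heteroskedasticity.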

Linearity

Harvey-Collier multiplier test for the null hypothesis that the linear specification is correct:

In [9]:
<span class="n">name</span> <span class="o">=</span> <span class="p">[</span><span class="s">'t value'</span><span class="p">,</span> <span class="s">'p value'</span><span class="p">]</span>
<span class="n">test</span> <span class="o">=</span> <span class="n">sms</span><span class="o">.</span><span class="n">linear_harvey_collier</span><span class="p">(</span><span class="n">results</span><span class="p">)</span>
<span class="n">lzip</span><span class="p">(</span><span class="n">name</span><span class="p">,</span> <span class="n">test</span><span class="p">)</span>
Out[9]:
[('t value', -1.0796490077784162), ('p value', 0.28346392475584337)]
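The test's mechanics can be sketched by hand: if the linear specification is correct, standardized recursive residuals from an expanding-window OLS fit have mean zero, so a one-sample t-test applies. A simplified illustration on synthetic data (statsmodels' implementation differs in details such as the initial window):

```python
import numpy as np

# Harvey-Collier idea by hand: recursive residuals from an
# expanding OLS fit, then a one-sample t-test for mean zero.
rng = np.random.default_rng(5)
n, k = 60, 2
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)   # correctly specified model
X = np.column_stack([np.ones(n), x])

w = []
for t in range(k, n):
    Xt, yt = X[:t], y[:t]
    beta, *_ = np.linalg.lstsq(Xt, yt, rcond=None)
    xt1 = X[t]
    # Standardize the one-step-ahead prediction error.
    denom = np.sqrt(1.0 + xt1 @ np.linalg.inv(Xt.T @ Xt) @ xt1)
    w.append((y[t] - xt1 @ beta) / denom)
w = np.array(w)
t_stat = w.mean() / (w.std(ddof=1) / np.sqrt(w.size))
print(t_stat)
```

Since the synthetic model really is linear, the t statistic should be unremarkable; a systematically nonlinear relationship drifts the recursive residuals away from zero.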