Multiple Regression

Author: verified (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 27 Nov 2018 11:18:31 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2018/Nov/27/t1543313976sa9iplga79p6jz9.htm/, Retrieved Fri, 03 May 2024 17:20:15 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=315708, Retrieved Fri, 03 May 2024 17:20:15 +0000

Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 115
Revision history (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
- [Multiple Regression] [] [2018-11-27 10:18:31] [9e2b27cf675a1d7b8e7da04c0e26bb6a] [Current]

Dataseries X:
63.2
68.6
77.7
68.1
75.1
73.3
60.5
65.9
77.7
77.1
77.7
71.3
76
75.3
81.7
72.5
77.4
81.1
65.1
68.7
75.6
79.7
75.3
67.7
73.2
72.2
79.3
77.5
75.6
77.4
69.2
67.1
77.9
82.7
75.7
70.1
76.4
74.3
80.5
78
73.5
78.8
71.2
66.2
82.7
83.8
75
80.4
74.6
77.7
89.8
82.4
77
89.6
75.7
75.1
89.9
88.8
86.5
90
84
82.7
91.7
87.5
82
92.2
73.1
75.6
91.6
87.5
90.1
91.3
87.6
88.4
100.7
85.3
92
96.8
77.9
80.9
95.3
99.3
96.1
92.5
93.7
92.1
103.6
92.5
95.7
103.4
89
89.1
98.7
109.4
101.1
95.4
101.4
102.1
103.6
106
98.4
106.6
95.8
87.2
108.5
107
92
94.9
84.4
85
94
84.5
88.2
92.1
81.1
81.2
96.1
95.3
92.1
91.7
90.3
96.1
108.7
95.9
95.1
109.4
91.2
91.4
107.4
105.6
105.3
103.7
99.5
103.2
123.1
102.2
110
106.2
91.3
99.3
111.8
104.4
102.4
101
100.6
104.5
117.4
97.4
99.5
106.4
95.2
94
104.1
105.8
101.1
93.5
97.9
96.8
108.4
103.5
101.3
107.4
100.7
91.1
105
112.8
105.6
101
101.9
103.5
109.5
105
102.9
108.5
96.9
88.4
112.4
111.3
101.6
101.2
101.8
98.8
114.4
104.5
97.6
109.1
94.5
90.4
111.8
110.5
106.8
101.8
103.7
107.4
117.5
109.6
102.8
115.5
97.8
100.2
112.9
108.7
109
113.9
106.9
109.6
124.5
104.2
110.8
118.7
102.1
105.1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 22 seconds
R Server: Big Analytics Cloud Computing Center


Multiple Linear Regression - Estimated Regression Equation
[t] = + 29.104 + 0.199592`X6(t-1)`[t] + 0.20534`X6(t-2)`[t] + 0.471149`X6(t-3)`[t] -0.0370984`(t-1s)`[t] -0.0601437`(t-2s)`[t] -12.5293M1[t] -14.2767M2[t] -9.4313M3[t] -21.051M4[t] -20.8715M5[t] -4.71901M6[t] -0.685146M7[t] -8.5297M8[t] -16.2902M9[t] -16.5133M10[t] -12.5547M11[t] + 0.0390226t + e[t]
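
The fitted equation combines an intercept, three lags of the exogenous regressor X6, two further lagged terms, eleven seasonal (monthly) dummies M1-M11, and a linear trend. Below is a minimal R sketch of how a model of this general form can be fit with lm(); the vectors y and x6 and all variable names are illustrative assumptions (the dependent series is unnamed in the output above), and the two `(t-1s)`/`(t-2s)` terms are omitted because their exact construction is not identifiable from the output alone.

# Minimal sketch, not the original module code. Assumes a monthly dependent series y
# and an exogenous regressor x6 of equal length; all names are illustrative.
n <- length(y)
d <- data.frame(
  y     = y[4:n],                                     # three observations lost to the longest lag
  x6.l1 = x6[3:(n - 1)],                              # X6(t-1)
  x6.l2 = x6[2:(n - 2)],                              # X6(t-2)
  x6.l3 = x6[1:(n - 3)],                              # X6(t-3)
  M     = factor(((seq_len(n) - 1) %% 12) + 1)[4:n],  # month-of-year dummies (seasonal)
  t     = 4:n                                         # linear trend
)
fit <- lm(y ~ x6.l1 + x6.l2 + x6.l3 + M + t, data = d)
summary(fit)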








Multiple Linear Regression - Ordinary Least Squares

Variable       Parameter   S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    +29.1       6.393     +4.5520e+00                  1.018e-05        5.09e-06
`X6(t-1)`      +0.1996     0.06881   +2.9010e+00                  0.004226         0.002113
`X6(t-2)`      +0.2053     0.06802   +3.0190e+00                  0.002937         0.001469
`X6(t-3)`      +0.4712     0.07045   +6.6880e+00                  3.259e-10        1.63e-10
`(t-1s)`       -0.0371     0.05549   -6.6860e-01                  0.5047           0.2523
`(t-2s)`       -0.06014    0.05236   -1.1490e+00                  0.2523           0.1262
M1             -12.53      1.602     -7.8220e+00                  5.613e-13        2.807e-13
M2             -14.28      1.567     -9.1140e+00                  2.425e-16        1.212e-16
M3             -9.431      1.451     -6.5000e+00                  8.884e-10        4.442e-10
M4             -21.05      1.769     -1.1900e+01                  4.843e-24        2.422e-24
M5             -20.87      1.828     -1.1420e+01                  1.115e-22        5.573e-23
M6             -4.719      1.581     -2.9850e+00                  0.003262         0.001631
M7             -0.6852     1.595     -4.2960e-01                  0.668            0.334
M8             -8.53       1.708     -4.9940e+00                  1.479e-06        7.395e-07
M9             -16.29      1.525     -1.0680e+01                  1.252e-20        6.26e-21
M10            -16.51      1.53      -1.0790e+01                  6.133e-21        3.067e-21
M11            -12.55      1.436     -8.7450e+00                  2.308e-15        1.154e-15
t              +0.03902    0.01563   +2.4970e+00                  0.01349          0.006746
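
The one-tail p-value reported by the module is half of the corresponding two-tail p-value. As a sketch (assuming the illustrative lm object fit from the sketch above, not the module's own code), the same kind of coefficient table can be extracted in R as follows.

# Sketch: pull the OLS coefficient table from a fitted lm object.
ols <- summary(fit)$coefficients                      # Estimate, Std. Error, t value, Pr(>|t|)
ols <- cbind(ols, "1-tail p" = ols[, "Pr(>|t|)"] / 2) # one-sided p-value = half the two-sided one
round(ols, 6)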








Multiple Linear Regression - Regression Statistics
Multiple R: 0.962
R-squared: 0.9254
Adjusted R-squared: 0.9179
F-TEST (value): 121.9
F-TEST (DF numerator): 17
F-TEST (DF denominator): 167
p-value: 0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 3.469
Sum Squared Residuals: 2010
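
These statistics can be recovered from a fitted lm object as sketched below (again assuming the illustrative fit object from earlier; the p-value of 0 shown above is an F-test p-value smaller than the display precision).

# Sketch: recover the regression and residual statistics from a fitted lm.
s <- summary(fit)
c(Multiple.R    = sqrt(s$r.squared),
  R.squared     = s$r.squared,
  Adj.R.squared = s$adj.r.squared,
  F.value       = unname(s$fstatistic["value"]),
  F.df.num      = unname(s$fstatistic["numdf"]),
  F.df.den      = unname(s$fstatistic["dendf"]),
  F.p.value     = unname(pf(s$fstatistic["value"], s$fstatistic["numdf"],
                            s$fstatistic["dendf"], lower.tail = FALSE)),
  Residual.SD   = s$sigma,
  SSR           = sum(resid(fit)^2))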








Menu of Residual Diagnostics (each entry links to a follow-up computation on the residuals of this regression; a few are reproduced in the R sketch below)
- Histogram
- Central Tendency
- QQ Plot
- Kernel Density Plot
- Skewness/Kurtosis Test
- Skewness-Kurtosis Plot
- Harrell-Davis Plot
- Bootstrap Plot -- Central Tendency
- Blocked Bootstrap Plot -- Central Tendency
- (Partial) Autocorrelation Plot
- Spectral Analysis
- Tukey lambda PPCC Plot
- Box-Cox Normality Plot
- Summary Statistics
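
A few of the listed diagnostics can be reproduced directly on the residuals with base R; this is a sketch using the same illustrative fit object, not the module's own implementation.

# Sketch: a handful of the residual diagnostics listed above, in base R.
r <- resid(fit)
hist(r, breaks = 20, main = "Histogram of residuals")     # Histogram
qqnorm(r); qqline(r)                                      # QQ Plot
plot(density(r), main = "Kernel density of residuals")    # Kernel Density Plot
acf(r); pacf(r)                                           # (Partial) Autocorrelation Plot
spectrum(r, main = "Spectral analysis of residuals")      # Spectral Analysis
summary(r)                                                # Summary Statistics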








Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 77.5 74.97 2.531
2 75.6 73.29 2.314
3 77.4 80.74-3.338
4 69.2 69.64-0.4413
5 67.1 67.24-0.1393
6 77.9 81.21-3.31
7 82.7 83.03-0.3281
8 75.7 77.54-1.836
9 70.1 75.16-5.058
10 76.4 74.19 2.206
11 74.3 75.08-0.7803
12 80.5 85.26-4.762
13 78 77.17 0.834
14 73.5 75.02-1.518
15 78.8 81.12-2.323
16 71.2 69.76 1.435
17 66.2 67.3-1.096
18 82.7 82.61 0.0898
19 83.8 84.94-1.144
20 75 78.91-3.915
21 80.4 78.1 2.298
22 74.6 77.14-2.542
23 77.7 77.08 0.617
24 89.8 90.99-1.192
25 82.4 79.02 3.379
26 77 80.06-3.062
27 89.6 87.75 1.855
28 75.7 74.86 0.8405
29 75.1 72.66 2.441
30 89.9 90.55-0.6509
31 88.8 90.58-1.776
32 86.5 86.05 0.4452
33 90 84.76 5.242
34 84 84.12-0.118
35 82.7 86.56-3.864
36 91.7 98.49-6.794
37 87.5 85.13 2.369
38 82 84.29-2.291
39 92.2 90.67 1.531
40 73.1 78.99-5.889
41 75.6 75.22 0.3785
42 91.6 91.25 0.3458
43 87.5 90.01-2.51
44 90.1 86.46 3.636
45 91.3 85.5 5.797
46 87.6 84.73 2.868
47 88.4 89.32-0.9244
48 100.7 100.8-0.1217
49 85.3 89.81-4.508
50 92 88.46 3.542
51 96.8 96.18 0.624
52 77.9 81.22-3.318
53 80.9 81.75-0.85
54 95.3 95.44-0.1372
55 99.3 94.31 4.986
56 96.1 91.72 4.381
57 92.5 90.71 1.791
58 93.7 91.53 2.167
59 92.1 93.57-1.471
60 103.6 103.4 0.2017
61 92.5 94.26-1.764
62 95.7 92.03 3.67
63 103.4 99.9 3.499
64 89 87.13 1.866
65 89.1 87.31 1.794
66 98.7 102.7-3.991
67 109.4 102 7.385
68 101.1 98.33 2.775
69 95.4 95.73-0.3288
70 101.4 97.92 3.478
71 102.1 98.05 4.052
72 103.6 108.2-4.561
73 106 100.3 5.721
74 98.4 99.17-0.7656
75 106.6 103.2 3.442
76 95.8 94.46 1.345
77 87.2 90.44-3.237
78 108.5 105.3 3.164
79 107 106.2 0.8319
80 92 98.89-6.885
81 94.9 98.33-3.426
82 84.4 94.64-10.24
83 85 90.14-5.139
84 94 101.3-7.316
85 84.5 86.38-1.876
86 88.2 84.99 3.208
87 92.1 92.14-0.03724
88 81.1 78.89 2.214
89 81.2 79.77 1.434
90 96.1 94.19 1.912
91 95.3 95.49-0.1851
92 92.1 91.68 0.4178
93 91.7 90.41 1.287
94 90.3 89.14 1.156
95 96.1 91.21 4.892
96 108.7 104.1 4.641
97 95.9 94.82 1.077
98 95.1 96.2-1.1
99 109.4 103.6 5.805
100 91.2 89.73 1.469
101 91.4 89.39 2.01
102 107.4 106.8 0.6122
103 105.6 105.6-0.04023
104 105.3 101.9 3.424
105 103.7 101.1 2.596
106 99.5 100.4-0.8742
107 103.2 102.8 0.3876
108 123.1 113.5 9.58
109 102.2 104.8-2.628
110 110 104.6 5.415
111 106.2 115.3-9.146
112 91.3 96.1-4.798
113 99.3 96.22 3.076
114 111.8 107.7 4.128
115 104.4 109-4.578
116 102.4 106.2-3.835
117 101 102.6-1.567
118 100.6 98.45 2.153
119 104.5 100.6 3.852
120 117.4 111.8 5.617
121 97.4 104-6.625
122 99.5 102.6-3.07
123 106.4 109.1-2.725
124 95.2 91.58 3.623
125 94 91.66 2.342
126 104.1 107.1-3.035
127 105.8 108.1-2.283
128 101.1 102.2-1.118
129 93.5 98.81-5.314
130 97.9 97.22 0.6836
131 96.8 97.95-1.15
132 108.4 106 2.428
133 103.5 99.64 3.857
134 101.3 98.27 3.027
135 107.4 107.2 0.25
136 100.7 95.34 5.362
137 91.1 94-2.899
138 105 108.6-3.646
139 112.8 110.7 2.053
140 105.6 103.1 2.476
141 101 102.5-1.483
142 101.9 103.4-1.538
143 103.5 103.1 0.4157
144 109.5 112.8-3.309
145 105 103.7 1.347
146 102.9 103-0.0879
147 108.5 108.7-0.2147
148 96.9 96.62 0.2774
149 88.4 95.11-6.715
150 112.4 108.7 3.657
151 111.3 110 1.296
152 101.6 103.5-1.852
153 101.2 105.5-4.304
154 101.8 102.4-0.6318
155 98.8 101.9-3.104
156 114.4 112.9 1.487
157 104.5 103.7 0.8353
158 97.6 102-4.381
159 109.1 110.2-1.13
160 94.5 95.7-1.197
161 90.4 93-2.605
162 111.8 109.1 2.728
163 110.5 109.3 1.233
164 106.8 104.5 2.343
165 101.8 106.1-4.304
166 103.7 103.5 0.2262
167 107.4 105.1 2.304
168 117.5 115.5 1.977
169 109.6 107.3 2.259
170 102.8 108.3-5.456
171 115.5 114.2 1.344
172 97.8 101.2-3.431
173 100.2 97.98 2.216
174 112.9 114.8-1.866
175 108.7 113.6-4.942
176 109 109.5-0.4571
177 113.9 107.1 6.774
178 106.9 105.9 1.004
179 109.6 109.7-0.08748
180 124.5 122.4 2.122
181 104.2 111-6.807
182 110.8 110.2 0.5542
183 118.7 118.1 0.5636
184 102.1 101.5 0.6418
185 105.1 103.3 1.847
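
The table above pairs each in-sample observation with its interpolation (fitted value) and residual. The same kind of table can be assembled from a fitted lm object as sketched below (illustrative fit object assumed).

# Sketch: rebuild the actuals / interpolation / residuals table from a fitted lm.
tab <- data.frame(Index    = seq_along(fitted(fit)),
                  Actual   = model.response(model.frame(fit)),
                  Fitted   = fitted(fit),
                  Residual = resid(fit))
head(round(tab, 3), 10)                                   # first 10 rows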








Goldfeld-Quandt test for Heteroskedasticity (p-values per alternative hypothesis)

breakpoint index   greater   2-sided   less
21 0.155 0.31 0.845
22 0.06434 0.1287 0.9357
23 0.04907 0.09815 0.9509
24 0.02199 0.04397 0.978
25 0.1644 0.3288 0.8356
26 0.1209 0.2418 0.8791
27 0.08781 0.1756 0.9122
28 0.05207 0.1041 0.9479
29 0.05891 0.1178 0.9411
30 0.04816 0.09633 0.9518
31 0.0304 0.06081 0.9696
32 0.02009 0.04017 0.9799
33 0.01743 0.03486 0.9826
34 0.01123 0.02246 0.9888
35 0.03483 0.06966 0.9652
36 0.08661 0.1732 0.9134
37 0.06115 0.1223 0.9389
38 0.05633 0.1127 0.9437
39 0.03973 0.07946 0.9603
40 0.09903 0.1981 0.901
41 0.07702 0.154 0.923
42 0.05631 0.1126 0.9437
43 0.04622 0.09244 0.9538
44 0.04329 0.08657 0.9567
45 0.04962 0.09924 0.9504
46 0.03945 0.0789 0.9605
47 0.02897 0.05794 0.971
48 0.03279 0.06558 0.9672
49 0.0689 0.1378 0.9311
50 0.06247 0.1249 0.9375
51 0.04799 0.09597 0.952
52 0.04323 0.08647 0.9568
53 0.03664 0.07329 0.9634
54 0.02805 0.05611 0.9719
55 0.05277 0.1055 0.9472
56 0.05733 0.1147 0.9427
57 0.0442 0.08841 0.9558
58 0.03338 0.06676 0.9666
59 0.02707 0.05413 0.9729
60 0.02182 0.04363 0.9782
61 0.01832 0.03665 0.9817
62 0.01381 0.02762 0.9862
63 0.01226 0.02451 0.9877
64 0.01149 0.02298 0.9885
65 0.008313 0.01663 0.9917
66 0.01059 0.02117 0.9894
67 0.01621 0.03243 0.9838
68 0.01315 0.0263 0.9869
69 0.01039 0.02078 0.9896
70 0.009066 0.01813 0.9909
71 0.01123 0.02246 0.9888
72 0.0113 0.0226 0.9887
73 0.01472 0.02944 0.9853
74 0.0118 0.0236 0.9882
75 0.01116 0.02233 0.9888
76 0.008311 0.01662 0.9917
77 0.01122 0.02244 0.9888
78 0.009926 0.01985 0.9901
79 0.009817 0.01963 0.9902
80 0.03796 0.07592 0.962
81 0.06494 0.1299 0.9351
82 0.3301 0.6602 0.6699
83 0.3444 0.6889 0.6556
84 0.5669 0.8662 0.4331
85 0.5455 0.909 0.4545
86 0.5951 0.8098 0.4049
87 0.568 0.864 0.432
88 0.569 0.862 0.431
89 0.5479 0.9042 0.4521
90 0.5178 0.9644 0.4822
91 0.4753 0.9507 0.5247
92 0.4539 0.9078 0.5461
93 0.4112 0.8224 0.5888
94 0.388 0.7761 0.612
95 0.4186 0.8371 0.5814
96 0.4467 0.8933 0.5533
97 0.4036 0.8072 0.5964
98 0.3936 0.7873 0.6064
99 0.4162 0.8323 0.5838
100 0.3724 0.7448 0.6276
101 0.3374 0.6747 0.6626
102 0.2978 0.5955 0.7022
103 0.2779 0.5558 0.7221
104 0.2544 0.5087 0.7456
105 0.2666 0.5332 0.7334
106 0.2675 0.535 0.7325
107 0.2379 0.4758 0.7621
108 0.3793 0.7587 0.6207
109 0.4248 0.8496 0.5752
110 0.6156 0.7688 0.3844
111 0.848 0.304 0.152
112 0.8714 0.2572 0.1286
113 0.864 0.272 0.136
114 0.861 0.278 0.139
115 0.8721 0.2557 0.1279
116 0.8756 0.2487 0.1244
117 0.8568 0.2865 0.1432
118 0.8301 0.3398 0.1699
119 0.8429 0.3141 0.1571
120 0.8891 0.2218 0.1109
121 0.9211 0.1578 0.07888
122 0.9097 0.1807 0.09033
123 0.8987 0.2026 0.1013
124 0.8897 0.2206 0.1103
125 0.8934 0.2133 0.1066
126 0.8777 0.2446 0.1223
127 0.8572 0.2856 0.1428
128 0.8291 0.3418 0.1709
129 0.8481 0.3038 0.1519
130 0.8135 0.3729 0.1865
131 0.7884 0.4231 0.2116
132 0.7643 0.4714 0.2357
133 0.7526 0.4948 0.2474
134 0.8448 0.3103 0.1552
135 0.8097 0.3806 0.1903
136 0.8624 0.2752 0.1376
137 0.8548 0.2904 0.1452
138 0.8287 0.3427 0.1713
139 0.7986 0.4027 0.2014
140 0.7859 0.4282 0.2141
141 0.7626 0.4747 0.2374
142 0.7163 0.5675 0.2837
143 0.7177 0.5647 0.2823
144 0.6685 0.6631 0.3315
145 0.6317 0.7366 0.3683
146 0.6473 0.7055 0.3527
147 0.6109 0.7783 0.3891
148 0.6742 0.6516 0.3258
149 0.6464 0.7072 0.3536
150 0.6862 0.6275 0.3138
151 0.7747 0.4506 0.2253
152 0.7743 0.4514 0.2257
153 0.7291 0.5417 0.2709
154 0.708 0.584 0.292
155 0.6362 0.7275 0.3638
156 0.551 0.8979 0.449
157 0.523 0.954 0.477
158 0.4418 0.8837 0.5582
159 0.3528 0.7056 0.6472
160 0.2828 0.5656 0.7172
161 0.6402 0.7197 0.3598
162 0.5217 0.9567 0.4783
163 0.4696 0.9392 0.5304
164 0.3303 0.6607 0.6697
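
The Goldfeld-Quandt p-values above are reported for every admissible breakpoint and for the three alternative hypotheses (greater, 2-sided, less). A comparable table can be produced with the lmtest package as sketched below; the fit object is the illustrative one from earlier, and the module's exact settings (for example the ordering of observations) are not visible in the output, so the numbers need not match exactly.

# Sketch: Goldfeld-Quandt p-values over a range of breakpoints (requires the 'lmtest' package).
library(lmtest)
breakpoints <- 21:164
gq <- t(sapply(breakpoints, function(i)
  sapply(c("greater", "two.sided", "less"), function(alt)
    gqtest(fit, point = i, alternative = alt)$p.value)))
gq <- data.frame(breakpoint = breakpoints, gq)
head(gq)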

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
21 &  0.155 &  0.31 &  0.845 \tabularnewline
22 &  0.06434 &  0.1287 &  0.9357 \tabularnewline
23 &  0.04907 &  0.09815 &  0.9509 \tabularnewline
24 &  0.02199 &  0.04397 &  0.978 \tabularnewline
25 &  0.1644 &  0.3288 &  0.8356 \tabularnewline
26 &  0.1209 &  0.2418 &  0.8791 \tabularnewline
27 &  0.08781 &  0.1756 &  0.9122 \tabularnewline
28 &  0.05207 &  0.1041 &  0.9479 \tabularnewline
29 &  0.05891 &  0.1178 &  0.9411 \tabularnewline
30 &  0.04816 &  0.09633 &  0.9518 \tabularnewline
31 &  0.0304 &  0.06081 &  0.9696 \tabularnewline
32 &  0.02009 &  0.04017 &  0.9799 \tabularnewline
33 &  0.01743 &  0.03486 &  0.9826 \tabularnewline
34 &  0.01123 &  0.02246 &  0.9888 \tabularnewline
35 &  0.03483 &  0.06966 &  0.9652 \tabularnewline
36 &  0.08661 &  0.1732 &  0.9134 \tabularnewline
37 &  0.06115 &  0.1223 &  0.9389 \tabularnewline
38 &  0.05633 &  0.1127 &  0.9437 \tabularnewline
39 &  0.03973 &  0.07946 &  0.9603 \tabularnewline
40 &  0.09903 &  0.1981 &  0.901 \tabularnewline
41 &  0.07702 &  0.154 &  0.923 \tabularnewline
42 &  0.05631 &  0.1126 &  0.9437 \tabularnewline
43 &  0.04622 &  0.09244 &  0.9538 \tabularnewline
44 &  0.04329 &  0.08657 &  0.9567 \tabularnewline
45 &  0.04962 &  0.09924 &  0.9504 \tabularnewline
46 &  0.03945 &  0.0789 &  0.9605 \tabularnewline
47 &  0.02897 &  0.05794 &  0.971 \tabularnewline
48 &  0.03279 &  0.06558 &  0.9672 \tabularnewline
49 &  0.0689 &  0.1378 &  0.9311 \tabularnewline
50 &  0.06247 &  0.1249 &  0.9375 \tabularnewline
51 &  0.04799 &  0.09597 &  0.952 \tabularnewline
52 &  0.04323 &  0.08647 &  0.9568 \tabularnewline
53 &  0.03664 &  0.07329 &  0.9634 \tabularnewline
54 &  0.02805 &  0.05611 &  0.9719 \tabularnewline
55 &  0.05277 &  0.1055 &  0.9472 \tabularnewline
56 &  0.05733 &  0.1147 &  0.9427 \tabularnewline
57 &  0.0442 &  0.08841 &  0.9558 \tabularnewline
58 &  0.03338 &  0.06676 &  0.9666 \tabularnewline
59 &  0.02707 &  0.05413 &  0.9729 \tabularnewline
60 &  0.02182 &  0.04363 &  0.9782 \tabularnewline
61 &  0.01832 &  0.03665 &  0.9817 \tabularnewline
62 &  0.01381 &  0.02762 &  0.9862 \tabularnewline
63 &  0.01226 &  0.02451 &  0.9877 \tabularnewline
64 &  0.01149 &  0.02298 &  0.9885 \tabularnewline
65 &  0.008313 &  0.01663 &  0.9917 \tabularnewline
66 &  0.01059 &  0.02117 &  0.9894 \tabularnewline
67 &  0.01621 &  0.03243 &  0.9838 \tabularnewline
68 &  0.01315 &  0.0263 &  0.9869 \tabularnewline
69 &  0.01039 &  0.02078 &  0.9896 \tabularnewline
70 &  0.009066 &  0.01813 &  0.9909 \tabularnewline
71 &  0.01123 &  0.02246 &  0.9888 \tabularnewline
72 &  0.0113 &  0.0226 &  0.9887 \tabularnewline
73 &  0.01472 &  0.02944 &  0.9853 \tabularnewline
74 &  0.0118 &  0.0236 &  0.9882 \tabularnewline
75 &  0.01116 &  0.02233 &  0.9888 \tabularnewline
76 &  0.008311 &  0.01662 &  0.9917 \tabularnewline
77 &  0.01122 &  0.02244 &  0.9888 \tabularnewline
78 &  0.009926 &  0.01985 &  0.9901 \tabularnewline
79 &  0.009817 &  0.01963 &  0.9902 \tabularnewline
80 &  0.03796 &  0.07592 &  0.962 \tabularnewline
81 &  0.06494 &  0.1299 &  0.9351 \tabularnewline
82 &  0.3301 &  0.6602 &  0.6699 \tabularnewline
83 &  0.3444 &  0.6889 &  0.6556 \tabularnewline
84 &  0.5669 &  0.8662 &  0.4331 \tabularnewline
85 &  0.5455 &  0.909 &  0.4545 \tabularnewline
86 &  0.5951 &  0.8098 &  0.4049 \tabularnewline
87 &  0.568 &  0.864 &  0.432 \tabularnewline
88 &  0.569 &  0.862 &  0.431 \tabularnewline
89 &  0.5479 &  0.9042 &  0.4521 \tabularnewline
90 &  0.5178 &  0.9644 &  0.4822 \tabularnewline
91 &  0.4753 &  0.9507 &  0.5247 \tabularnewline
92 &  0.4539 &  0.9078 &  0.5461 \tabularnewline
93 &  0.4112 &  0.8224 &  0.5888 \tabularnewline
94 &  0.388 &  0.7761 &  0.612 \tabularnewline
95 &  0.4186 &  0.8371 &  0.5814 \tabularnewline
96 &  0.4467 &  0.8933 &  0.5533 \tabularnewline
97 &  0.4036 &  0.8072 &  0.5964 \tabularnewline
98 &  0.3936 &  0.7873 &  0.6064 \tabularnewline
99 &  0.4162 &  0.8323 &  0.5838 \tabularnewline
100 &  0.3724 &  0.7448 &  0.6276 \tabularnewline
101 &  0.3374 &  0.6747 &  0.6626 \tabularnewline
102 &  0.2978 &  0.5955 &  0.7022 \tabularnewline
103 &  0.2779 &  0.5558 &  0.7221 \tabularnewline
104 &  0.2544 &  0.5087 &  0.7456 \tabularnewline
105 &  0.2666 &  0.5332 &  0.7334 \tabularnewline
106 &  0.2675 &  0.535 &  0.7325 \tabularnewline
107 &  0.2379 &  0.4758 &  0.7621 \tabularnewline
108 &  0.3793 &  0.7587 &  0.6207 \tabularnewline
109 &  0.4248 &  0.8496 &  0.5752 \tabularnewline
110 &  0.6156 &  0.7688 &  0.3844 \tabularnewline
111 &  0.848 &  0.304 &  0.152 \tabularnewline
112 &  0.8714 &  0.2572 &  0.1286 \tabularnewline
113 &  0.864 &  0.272 &  0.136 \tabularnewline
114 &  0.861 &  0.278 &  0.139 \tabularnewline
115 &  0.8721 &  0.2557 &  0.1279 \tabularnewline
116 &  0.8756 &  0.2487 &  0.1244 \tabularnewline
117 &  0.8568 &  0.2865 &  0.1432 \tabularnewline
118 &  0.8301 &  0.3398 &  0.1699 \tabularnewline
119 &  0.8429 &  0.3141 &  0.1571 \tabularnewline
120 &  0.8891 &  0.2218 &  0.1109 \tabularnewline
121 &  0.9211 &  0.1578 &  0.07888 \tabularnewline
122 &  0.9097 &  0.1807 &  0.09033 \tabularnewline
123 &  0.8987 &  0.2026 &  0.1013 \tabularnewline
124 &  0.8897 &  0.2206 &  0.1103 \tabularnewline
125 &  0.8934 &  0.2133 &  0.1066 \tabularnewline
126 &  0.8777 &  0.2446 &  0.1223 \tabularnewline
127 &  0.8572 &  0.2856 &  0.1428 \tabularnewline
128 &  0.8291 &  0.3418 &  0.1709 \tabularnewline
129 &  0.8481 &  0.3038 &  0.1519 \tabularnewline
130 &  0.8135 &  0.3729 &  0.1865 \tabularnewline
131 &  0.7884 &  0.4231 &  0.2116 \tabularnewline
132 &  0.7643 &  0.4714 &  0.2357 \tabularnewline
133 &  0.7526 &  0.4948 &  0.2474 \tabularnewline
134 &  0.8448 &  0.3103 &  0.1552 \tabularnewline
135 &  0.8097 &  0.3806 &  0.1903 \tabularnewline
136 &  0.8624 &  0.2752 &  0.1376 \tabularnewline
137 &  0.8548 &  0.2904 &  0.1452 \tabularnewline
138 &  0.8287 &  0.3427 &  0.1713 \tabularnewline
139 &  0.7986 &  0.4027 &  0.2014 \tabularnewline
140 &  0.7859 &  0.4282 &  0.2141 \tabularnewline
141 &  0.7626 &  0.4747 &  0.2374 \tabularnewline
142 &  0.7163 &  0.5675 &  0.2837 \tabularnewline
143 &  0.7177 &  0.5647 &  0.2823 \tabularnewline
144 &  0.6685 &  0.6631 &  0.3315 \tabularnewline
145 &  0.6317 &  0.7366 &  0.3683 \tabularnewline
146 &  0.6473 &  0.7055 &  0.3527 \tabularnewline
147 &  0.6109 &  0.7783 &  0.3891 \tabularnewline
148 &  0.6742 &  0.6516 &  0.3258 \tabularnewline
149 &  0.6464 &  0.7072 &  0.3536 \tabularnewline
150 &  0.6862 &  0.6275 &  0.3138 \tabularnewline
151 &  0.7747 &  0.4506 &  0.2253 \tabularnewline
152 &  0.7743 &  0.4514 &  0.2257 \tabularnewline
153 &  0.7291 &  0.5417 &  0.2709 \tabularnewline
154 &  0.708 &  0.584 &  0.292 \tabularnewline
155 &  0.6362 &  0.7275 &  0.3638 \tabularnewline
156 &  0.551 &  0.8979 &  0.449 \tabularnewline
157 &  0.523 &  0.954 &  0.477 \tabularnewline
158 &  0.4418 &  0.8837 &  0.5582 \tabularnewline
159 &  0.3528 &  0.7056 &  0.6472 \tabularnewline
160 &  0.2828 &  0.5656 &  0.7172 \tabularnewline
161 &  0.6402 &  0.7197 &  0.3598 \tabularnewline
162 &  0.5217 &  0.9567 &  0.4783 \tabularnewline
163 &  0.4696 &  0.9392 &  0.5304 \tabularnewline
164 &  0.3303 &  0.6607 &  0.6697 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=315708&T=6


\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1\% type I error level & 0 & 0 & OK \tabularnewline
5\% type I error level & 24 & 0.166667 & NOK \tabularnewline
10\% type I error level & 43 & 0.298611 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=315708&T=7
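
The meta analysis above counts how many of the 144 two-sided Goldfeld-Quandt p-values fall below each nominal significance level; the verdict is NOK whenever that share exceeds the level itself. A minimal sketch, reusing the gq matrix from the breakpoint scan shown earlier:

# Share of significant two-sided Goldfeld-Quandt tests per level; the
# verdict is 'NOK' when the observed share exceeds the nominal level
# (sketch; 'gq' is the p-value matrix from the breakpoint scan above).
alpha  <- c(0.01, 0.05, 0.10)
counts <- sapply(alpha, function(a) sum(gq[, 'two.sided'] < a))
data.frame(level = alpha,
           significant = counts,
           fraction = counts / nrow(gq),
           verdict = ifelse(counts / nrow(gq) < alpha, 'OK', 'NOK'))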

Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 0.049398, df1 = 2, df2 = 165, p-value = 0.9518
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.66411, df1 = 34, df2 = 133, p-value = 0.9171
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 3.2505, df1 = 2, df2 = 165, p-value = 0.04125

Source: https://freestatistics.org/blog/index.php?pk=315708&T=8
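
The three RESET variants above test the null of correct specification against models augmented with powers 2 and 3 of, respectively, the fitted values, the regressors, and the first principal component of the regressors; only the principal-component variant rejects at the 5% level (p ≈ 0.041). A minimal sketch with lmtest::resettest, assuming the fitted model mylm from the R code below:

library(lmtest)
# Ramsey RESET tests for powers 2 and 3 of the fitted values, the
# regressors, and the first principal component of the regressors
# (sketch; 'mylm' is the fitted lm object from the R code below).
resettest(mylm, power = 2:3, type = 'fitted')
resettest(mylm, power = 2:3, type = 'regressor')
resettest(mylm, power = 2:3, type = 'princomp')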

Variance Inflation Factors (Multicollinearity)
> vif
`X6(t-1)` `X6(t-2)` `X6(t-3)`  `(t-1s)`  `(t-2s)`        M1        M2        M3 
10.677121 10.627612 11.376867  7.378301  6.950873  3.116439  2.980612  2.556970 
       M4        M5        M6        M7        M8        M9       M10       M11 
 3.800397  4.059677  2.862720  2.913575  3.341568  2.663732  2.680988  2.360666 
        t 
10.707551 

Source: https://freestatistics.org/blog/index.php?pk=315708&T=9
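
The variance inflation factors above are obtained with car::vif on the fitted model; by the common rule of thumb, values above 10 (here the lagged endogenous terms X6(t-1) to X6(t-3) and the trend t) indicate strong multicollinearity. A minimal sketch, again assuming the fitted model mylm from the R code below:

library(car)
# Variance inflation factors of the regressors; a common rule of thumb
# flags VIF > 10 as a sign of problematic multicollinearity
# (sketch; 'mylm' is the fitted lm object from the R code below).
v <- vif(mylm)
sort(v, decreasing = TRUE)   # all VIFs, largest first
v[v > 10]                    # regressors exceeding the rule-of-thumb cutoff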

Parameters (Session):
par1 = 1 ; par2 = Include Seasonal Dummies ; par3 = Linear Trend ; par4 = 3 ; par5 = 2 ; par6 = 12 ;
Parameters (R input):
par1 = 1 ; par2 = Include Seasonal Dummies ; par3 = Linear Trend ; par4 = 3 ; par5 = 2 ; par6 = 12 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par6 <- as.numeric(par6)
if(is.na(par6)) {
par6 <- 12
mywarning = 'Warning: you did not specify the seasonality. The seasonal period was set to s = 12.'
}
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (!is.numeric(par4)) par4 <- 0
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
if (!is.numeric(par5)) par5 <- 0
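# Prepare the data matrix: transpose the input block, drop rows with
# missing values, and move the endogenous column (par1) to the front.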
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
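# Optional differencing of all columns, depending on par3: first
# differences, seasonal differences (lag par6), or both.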
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s)'){
(n <- n - par6)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-Bs)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+par6,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - par6)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-Bs)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+par6,j] - x[i,j]
}
}
x <- x2
}
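# Add par4 non-seasonal lags of the endogenous variable as regressors;
# the first par4 observations are dropped.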
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
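# Add par5 seasonal lags (multiples of the period par6) of the
# endogenous variable as additional regressors.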
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*par6,par5), dimnames=list(1:(n-par5*par6), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*par6)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*par6-j*par6,par1]
}
}
x <- cbind(x[(par5*par6+1):n,], x2)
n <- n - par5*par6
}
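# Add seasonal dummy variables (par6 - 1 dummies M1, M2, ...) or fixed
# monthly/quarterly dummies, depending on par2.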
if (par2 == 'Include Seasonal Dummies'){
x2 <- array(0, dim=c(n,par6-1), dimnames=list(1:n, paste('M', seq(1:(par6-1)), sep ='')))
for (i in 1:(par6-1)){
x2[seq(i,n,par6),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
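# Recount the number of columns and, if requested, append a linear
# trend column 't'.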
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
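# Goldfeld-Quandt test at every admissible breakpoint (k+3 to n-k-3):
# store the p-values for the three alternatives and count how many
# two-sided tests are significant at the 1%, 5% and 10% levels.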
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
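# Diagnostic plots: actuals with interpolation, residuals, histogram and
# density of the (studentized) residuals, QQ plot, residual lag plot,
# ACF/PACF, standard lm diagnostics, and the Goldfeld-Quandt p-values.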
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
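# Build the output tables: estimated regression equation, OLS coefficient
# table, regression and residual statistics, links to residual diagnostics,
# actuals/interpolation/residuals, and the Goldfeld-Quandt (meta) tables.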
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
a <-table.start()
a <- table.row.start(a)
a <- table.element(a,'Menu of Residual Diagnostics',2,TRUE)
a <- table.row.end(a)
a <- table.row.start(a)
a <- table.element(a,'Description',1,TRUE)
a <- table.element(a,'Link',1,TRUE)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Histogram',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_histogram.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_centraltendency.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'QQ Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_fitdistrnorm.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Kernel Density Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_density.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Skewness/Kurtosis Test',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_skewness_kurtosis.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Skewness-Kurtosis Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_skewness_kurtosis_plot.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Harrell-Davis Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_harrell_davis.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Bootstrap Plot -- Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_bootstrapplot1.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Blocked Bootstrap Plot -- Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_bootstrapplot.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'(Partial) Autocorrelation Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_autocorrelation.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Spectral Analysis',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_spectrum.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Tukey lambda PPCC Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_tukeylambda.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Box-Cox Normality Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_boxcoxnorm.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <- table.element(a,'Summary Statistics',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_summary1.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable7.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
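# Ramsey RESET tests for powers 2 and 3 of the fitted values, the
# regressors, and the principal components.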
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
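# Variance inflation factors of the regressors in the fitted model.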
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')