Free Statistics


Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 02 Feb 2017 10:25:58 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2017/Feb/02/t1486027593kxyy0ozvc6bvu2l.htm/, Retrieved Fri, 10 May 2024 06:58:28 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=306634, Retrieved Fri, 10 May 2024 06:58:28 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 143
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [mp] [2017-02-02 09:25:58] [c383a3f496d779b12e2493a523dfe438] [Current]
Dataseries X:
13 22 14 22
16 24 19 24
17 21 17 26
NA 21 17 21
NA 24 15 26
16 20 20 25
NA 22 15 21
NA 20 19 24
NA 19 15 27
17 23 15 28
17 21 19 23
15 19 NA 25
16 19 20 24
14 21 18 24
16 21 15 24
17 22 14 25
NA 22 20 25
NA 19 NA NA
NA 21 16 25
NA 21 16 25
16 21 16 24
NA 20 10 26
16 22 19 26
NA 22 19 25
NA 24 16 26
NA 21 15 23
16 19 18 24
15 19 17 24
16 23 19 25
16 21 17 25
13 21 NA 24
15 19 19 28
17 21 20 27
NA 19 5 NA
13 21 19 23
17 21 16 23
NA 23 15 24
14 19 16 24
14 19 18 22
18 19 16 25
NA 18 15 25
17 22 17 28
13 18 NA 22
16 22 20 28
15 18 19 25
15 22 7 24
NA 22 13 24
15 19 16 23
13 22 16 25
NA 25 NA NA
17 19 18 26
NA 19 18 25
NA 19 16 27
11 19 17 26
14 21 19 23
13 21 16 25
NA 20 19 21
17 19 13 22
16 19 16 24
NA 22 13 25
17 26 12 27
16 19 17 24
16 21 17 26
16 21 17 21
15 20 16 27
12 23 16 22
17 22 14 23
14 22 16 24
14 22 13 25
16 21 16 24
NA 21 14 23
NA 22 20 28
NA 23 12 NA
NA 18 13 24
NA 24 18 26
15 22 14 22
16 21 19 25
14 21 18 25
15 21 14 24
17 23 18 24
NA 21 19 26
10 23 15 21
NA 21 14 25
17 19 17 25
NA 21 19 26
20 21 13 25
17 21 19 26
18 23 18 27
NA 23 20 25
17 20 15 NA
14 20 15 20
NA 19 15 24
17 23 20 26
NA 22 15 25
17 19 19 25
NA 23 18 24
16 22 18 26
18 22 15 25
18 21 20 28
16 21 17 27
NA 21 12 25
NA 21 18 26
15 22 19 26
13 25 20 26
NA 21 NA NA
NA 23 17 28
NA 19 15 NA
NA 22 16 21
NA 20 18 25
16 21 18 25
NA 25 14 24
NA 21 15 24
NA 19 12 24
12 23 17 23
NA 22 14 23
16 21 18 24
16 24 17 24
NA 21 17 25
16 19 20 28
14 18 16 23
15 19 14 24
14 20 15 23
NA 19 18 24
15 22 20 25
NA 21 17 24
15 22 17 23
16 24 17 23
NA 28 17 25
NA 19 15 21
NA 18 17 22
11 23 18 19
NA 19 17 24
18 23 20 25
NA 19 15 21
11 22 16 22
NA 21 15 23
18 19 18 27
NA 22 11 NA
15 21 15 26
19 23 18 29
17 22 20 28
NA 19 19 24
14 19 14 25
NA 21 16 25
13 22 15 22
17 21 17 25
14 20 18 26
19 23 20 26
14 22 17 24
NA 23 18 25
NA 22 15 19
16 21 16 25
16 20 11 23
15 18 15 25
12 18 18 25
NA 20 17 26
17 19 16 27
NA 21 12 24
NA 24 19 22
18 19 18 25
15 20 15 24
18 19 17 23
15 23 19 27
NA 22 18 24
NA 21 19 24
NA 24 16 21
16 21 16 25
NA 21 16 25
16 22 14 23




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 7 seconds \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=306634&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code)[/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine[/C][/ROW]
[ROW][C]Computing time[/C][C]7 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]Big Analytics Cloud Computing Center[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=306634&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=0








Multiple Linear Regression - Estimated Regression Equation
TVDC[t] = +4.93434 - 0.0293838 Bevr_Leeftijd[t] - 0.0360228 ITHSUM[t] + 0.478778 SKEOUSUM[t] + e[t]
Warning: you did not specify the column number of the endogenous series! The first column was selected by default.

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TVDC[t] =  +  4.93434 -0.0293838Bevr_Leeftijd[t] -0.0360228ITHSUM[t] +  0.478778SKEOUSUM[t]  + e[t] \tabularnewline
Warning: you did not specify the column number of the endogenous series! The first column was selected by default. \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=306634&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]TVDC[t] =  +  4.93434 -0.0293838Bevr_Leeftijd[t] -0.0360228ITHSUM[t] +  0.478778SKEOUSUM[t]  + e[t][/C][/ROW]
[ROW][C]Warning: you did not specify the column number of the endogenous series! The first column was selected by default.[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=306634&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=1
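
A minimal R sketch (not part of the original output) showing how this fit could be reproduced from the data series listed above; the file name series.txt and the reading step are assumptions, while the variable names follow the equation.

# Hedged reproduction sketch: assumes the four columns above are saved as
# whitespace-separated values in "series.txt" (file name is an assumption).
dat <- read.table("series.txt", na.strings = "NA",
                  col.names = c("TVDC", "Bevr_Leeftijd", "ITHSUM", "SKEOUSUM"))
dat <- na.omit(dat)                                   # listwise deletion, n = 99
fit <- lm(TVDC ~ Bevr_Leeftijd + ITHSUM + SKEOUSUM, data = dat)
coef(fit)   # should approximate 4.93434, -0.0293838, -0.0360228, 0.478778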








Multiple Linear Regression - Ordinary Least Squares
Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | +4.934 | 3.092 | +1.5960e+00 | 0.1139 | 0.05693
Bevr_Leeftijd | -0.02938 | 0.1019 | -2.8830e-01 | 0.7737 | 0.3869
ITHSUM | -0.03602 | 0.07875 | -4.5740e-01 | 0.6484 | 0.3242
SKEOUSUM | +0.4788 | 0.09603 | +4.9860e+00 | 2.772e-06 | 1.386e-06

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +4.934 &  3.092 & +1.5960e+00 &  0.1139 &  0.05693 \tabularnewline
Bevr_Leeftijd & -0.02938 &  0.1019 & -2.8830e-01 &  0.7737 &  0.3869 \tabularnewline
ITHSUM & -0.03602 &  0.07875 & -4.5740e-01 &  0.6484 &  0.3242 \tabularnewline
SKEOUSUM & +0.4788 &  0.09603 & +4.9860e+00 &  2.772e-06 &  1.386e-06 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=306634&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]+4.934[/C][C] 3.092[/C][C]+1.5960e+00[/C][C] 0.1139[/C][C] 0.05693[/C][/ROW]
[ROW][C]Bevr_Leeftijd[/C][C]-0.02938[/C][C] 0.1019[/C][C]-2.8830e-01[/C][C] 0.7737[/C][C] 0.3869[/C][/ROW]
[ROW][C]ITHSUM[/C][C]-0.03602[/C][C] 0.07875[/C][C]-4.5740e-01[/C][C] 0.6484[/C][C] 0.3242[/C][/ROW]
[ROW][C]SKEOUSUM[/C][C]+0.4788[/C][C] 0.09603[/C][C]+4.9860e+00[/C][C] 2.772e-06[/C][C] 1.386e-06[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=306634&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=2
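
As a rough guide (assuming the fit object from the sketch above), the columns of this table map onto R's summary() output, with the 1-tail p-value being half the 2-tail value:

cf <- summary(fit)$coefficients        # Estimate, Std. Error, t value, Pr(>|t|)
cbind(cf, "1-tail p" = cf[, 4] / 2)    # compare with the Parameter / S.D. / T-STAT / p-value columns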








Multiple Linear Regression - Regression Statistics
Multiple R: 0.4669
R-squared: 0.218
Adjusted R-squared: 0.1933
F-TEST (value): 8.829
F-TEST (DF numerator): 3
F-TEST (DF denominator): 95
p-value: 3.201e-05
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.68
Sum Squared Residuals: 268

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.4669 \tabularnewline
R-squared &  0.218 \tabularnewline
Adjusted R-squared &  0.1933 \tabularnewline
F-TEST (value) &  8.829 \tabularnewline
F-TEST (DF numerator) & 3 \tabularnewline
F-TEST (DF denominator) & 95 \tabularnewline
p-value &  3.201e-05 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  1.68 \tabularnewline
Sum Squared Residuals &  268 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=306634&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C] 0.4669[/C][/ROW]
[ROW][C]R-squared[/C][C] 0.218[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C] 0.1933[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C] 8.829[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]3[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]95[/C][/ROW]
[ROW][C]p-value[/C][C] 3.201e-05[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C] 1.68[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C] 268[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=306634&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=3
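
The summary statistics above can be recovered from the same fit; a sketch (again assuming the fit object from the first sketch):

s <- summary(fit)
sqrt(s$r.squared)                      # Multiple R, about 0.4669
s$adj.r.squared                        # Adjusted R-squared, about 0.1933
s$fstatistic                           # F value (8.829) with DF 3 and 95
pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3],
   lower.tail = FALSE)                 # overall p-value, about 3.2e-05
s$sigma                                # residual standard deviation, about 1.68
sum(residuals(fit)^2)                  # sum of squared residuals, about 268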








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 13 14.32-1.317
2 16 15.04 0.9646
3 17 16.15 0.8469
4 16 15.6 0.4043
5 17 17.12-0.124
6 17 14.64 2.355
7 16 15.15 0.8537
8 14 15.16-1.16
9 16 15.27 0.7324
10 17 15.75 1.247
11 16 15.23 0.7684
12 16 16.05-0.0517
13 16 15.22 0.7817
14 15 15.25-0.2543
15 16 15.54 0.4565
16 16 15.67 0.3256
17 15 17.1-2.097
18 17 16.52 0.4762
19 13 14.64-1.645
20 17 14.75 2.247
21 14 15.29-1.29
22 14 14.26-0.2608
23 18 15.77 2.231
24 17 17.08-0.0813
25 16 16.97-0.9732
26 15 15.69-0.6905
27 15 15.53-0.5264
28 15 14.81 0.1884
29 13 15.68-2.681
30 17 16.18 0.8241
31 11 16.21-5.212
32 14 14.64-0.6448
33 13 15.71-2.71
34 17 14.44 2.559
35 16 15.29 0.7096
36 17 16.67 0.3349
37 16 15.25 0.7457
38 16 16.15-0.1531
39 16 13.76 2.241
40 15 16.7-1.697
41 12 14.22-2.215
42 17 14.8 2.205
43 14 15.2-1.202
44 14 15.79-1.789
45 16 15.23 0.7684
46 15 14.32 0.6833
47 16 15.6 0.3977
48 14 15.64-1.638
49 15 15.3-0.3036
50 17 15.1 1.899
51 10 13.77-3.773
52 17 15.73 1.267
53 20 15.82 4.182
54 17 16.08 0.9189
55 18 16.54 1.463
56 14 13.38 0.6181
57 17 15.99 1.014
58 17 15.66 1.339
59 16 16.09-0.08772
60 18 15.72 2.283
61 18 17 0.9974
62 16 16.63-0.6319
63 15 16.05-1.052
64 13 15.93-2.928
65 16 15.64 0.3617
66 12 14.66-2.658
67 16 15.16 0.8404
68 16 15.11 0.8926
69 16 17.06-1.061
70 14 14.84-0.841
71 15 15.36-0.3624
72 14 14.82-0.8182
73 15 15.54-0.5369
74 15 14.69 0.3126
75 16 14.63 1.371
76 11 12.71-1.707
77 18 15.51 2.492
78 11 14.24-3.245
79 18 16.65 1.345
80 15 16.23-1.225
81 19 17.49 1.505
82 17 16.97 0.02676
83 14 15.84-1.841
84 13 14.28-1.281
85 17 15.67 1.326
86 14 16.15-2.146
87 19 15.99 3.014
88 14 15.17-1.166
89 16 15.71 0.2896
90 16 14.96 1.038
91 15 15.83-0.8345
92 12 15.73-3.726
93 17 16.73 0.2733
94 18 15.7 2.303
95 15 15.3-0.297
96 18 14.78 3.224
97 15 16.5-1.501
98 16 15.71 0.2896
99 16 14.8 1.205

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 &  13 &  14.32 & -1.317 \tabularnewline
2 &  16 &  15.04 &  0.9646 \tabularnewline
3 &  17 &  16.15 &  0.8469 \tabularnewline
4 &  16 &  15.6 &  0.4043 \tabularnewline
5 &  17 &  17.12 & -0.124 \tabularnewline
6 &  17 &  14.64 &  2.355 \tabularnewline
7 &  16 &  15.15 &  0.8537 \tabularnewline
8 &  14 &  15.16 & -1.16 \tabularnewline
9 &  16 &  15.27 &  0.7324 \tabularnewline
10 &  17 &  15.75 &  1.247 \tabularnewline
11 &  16 &  15.23 &  0.7684 \tabularnewline
12 &  16 &  16.05 & -0.0517 \tabularnewline
13 &  16 &  15.22 &  0.7817 \tabularnewline
14 &  15 &  15.25 & -0.2543 \tabularnewline
15 &  16 &  15.54 &  0.4565 \tabularnewline
16 &  16 &  15.67 &  0.3256 \tabularnewline
17 &  15 &  17.1 & -2.097 \tabularnewline
18 &  17 &  16.52 &  0.4762 \tabularnewline
19 &  13 &  14.64 & -1.645 \tabularnewline
20 &  17 &  14.75 &  2.247 \tabularnewline
21 &  14 &  15.29 & -1.29 \tabularnewline
22 &  14 &  14.26 & -0.2608 \tabularnewline
23 &  18 &  15.77 &  2.231 \tabularnewline
24 &  17 &  17.08 & -0.0813 \tabularnewline
25 &  16 &  16.97 & -0.9732 \tabularnewline
26 &  15 &  15.69 & -0.6905 \tabularnewline
27 &  15 &  15.53 & -0.5264 \tabularnewline
28 &  15 &  14.81 &  0.1884 \tabularnewline
29 &  13 &  15.68 & -2.681 \tabularnewline
30 &  17 &  16.18 &  0.8241 \tabularnewline
31 &  11 &  16.21 & -5.212 \tabularnewline
32 &  14 &  14.64 & -0.6448 \tabularnewline
33 &  13 &  15.71 & -2.71 \tabularnewline
34 &  17 &  14.44 &  2.559 \tabularnewline
35 &  16 &  15.29 &  0.7096 \tabularnewline
36 &  17 &  16.67 &  0.3349 \tabularnewline
37 &  16 &  15.25 &  0.7457 \tabularnewline
38 &  16 &  16.15 & -0.1531 \tabularnewline
39 &  16 &  13.76 &  2.241 \tabularnewline
40 &  15 &  16.7 & -1.697 \tabularnewline
41 &  12 &  14.22 & -2.215 \tabularnewline
42 &  17 &  14.8 &  2.205 \tabularnewline
43 &  14 &  15.2 & -1.202 \tabularnewline
44 &  14 &  15.79 & -1.789 \tabularnewline
45 &  16 &  15.23 &  0.7684 \tabularnewline
46 &  15 &  14.32 &  0.6833 \tabularnewline
47 &  16 &  15.6 &  0.3977 \tabularnewline
48 &  14 &  15.64 & -1.638 \tabularnewline
49 &  15 &  15.3 & -0.3036 \tabularnewline
50 &  17 &  15.1 &  1.899 \tabularnewline
51 &  10 &  13.77 & -3.773 \tabularnewline
52 &  17 &  15.73 &  1.267 \tabularnewline
53 &  20 &  15.82 &  4.182 \tabularnewline
54 &  17 &  16.08 &  0.9189 \tabularnewline
55 &  18 &  16.54 &  1.463 \tabularnewline
56 &  14 &  13.38 &  0.6181 \tabularnewline
57 &  17 &  15.99 &  1.014 \tabularnewline
58 &  17 &  15.66 &  1.339 \tabularnewline
59 &  16 &  16.09 & -0.08772 \tabularnewline
60 &  18 &  15.72 &  2.283 \tabularnewline
61 &  18 &  17 &  0.9974 \tabularnewline
62 &  16 &  16.63 & -0.6319 \tabularnewline
63 &  15 &  16.05 & -1.052 \tabularnewline
64 &  13 &  15.93 & -2.928 \tabularnewline
65 &  16 &  15.64 &  0.3617 \tabularnewline
66 &  12 &  14.66 & -2.658 \tabularnewline
67 &  16 &  15.16 &  0.8404 \tabularnewline
68 &  16 &  15.11 &  0.8926 \tabularnewline
69 &  16 &  17.06 & -1.061 \tabularnewline
70 &  14 &  14.84 & -0.841 \tabularnewline
71 &  15 &  15.36 & -0.3624 \tabularnewline
72 &  14 &  14.82 & -0.8182 \tabularnewline
73 &  15 &  15.54 & -0.5369 \tabularnewline
74 &  15 &  14.69 &  0.3126 \tabularnewline
75 &  16 &  14.63 &  1.371 \tabularnewline
76 &  11 &  12.71 & -1.707 \tabularnewline
77 &  18 &  15.51 &  2.492 \tabularnewline
78 &  11 &  14.24 & -3.245 \tabularnewline
79 &  18 &  16.65 &  1.345 \tabularnewline
80 &  15 &  16.23 & -1.225 \tabularnewline
81 &  19 &  17.49 &  1.505 \tabularnewline
82 &  17 &  16.97 &  0.02676 \tabularnewline
83 &  14 &  15.84 & -1.841 \tabularnewline
84 &  13 &  14.28 & -1.281 \tabularnewline
85 &  17 &  15.67 &  1.326 \tabularnewline
86 &  14 &  16.15 & -2.146 \tabularnewline
87 &  19 &  15.99 &  3.014 \tabularnewline
88 &  14 &  15.17 & -1.166 \tabularnewline
89 &  16 &  15.71 &  0.2896 \tabularnewline
90 &  16 &  14.96 &  1.038 \tabularnewline
91 &  15 &  15.83 & -0.8345 \tabularnewline
92 &  12 &  15.73 & -3.726 \tabularnewline
93 &  17 &  16.73 &  0.2733 \tabularnewline
94 &  18 &  15.7 &  2.303 \tabularnewline
95 &  15 &  15.3 & -0.297 \tabularnewline
96 &  18 &  14.78 &  3.224 \tabularnewline
97 &  15 &  16.5 & -1.501 \tabularnewline
98 &  16 &  15.71 &  0.2896 \tabularnewline
99 &  16 &  14.8 &  1.205 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=306634&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C] 13[/C][C] 14.32[/C][C]-1.317[/C][/ROW]
[ROW][C]2[/C][C] 16[/C][C] 15.04[/C][C] 0.9646[/C][/ROW]
[ROW][C]3[/C][C] 17[/C][C] 16.15[/C][C] 0.8469[/C][/ROW]
[ROW][C]4[/C][C] 16[/C][C] 15.6[/C][C] 0.4043[/C][/ROW]
[ROW][C]5[/C][C] 17[/C][C] 17.12[/C][C]-0.124[/C][/ROW]
[ROW][C]6[/C][C] 17[/C][C] 14.64[/C][C] 2.355[/C][/ROW]
[ROW][C]7[/C][C] 16[/C][C] 15.15[/C][C] 0.8537[/C][/ROW]
[ROW][C]8[/C][C] 14[/C][C] 15.16[/C][C]-1.16[/C][/ROW]
[ROW][C]9[/C][C] 16[/C][C] 15.27[/C][C] 0.7324[/C][/ROW]
[ROW][C]10[/C][C] 17[/C][C] 15.75[/C][C] 1.247[/C][/ROW]
[ROW][C]11[/C][C] 16[/C][C] 15.23[/C][C] 0.7684[/C][/ROW]
[ROW][C]12[/C][C] 16[/C][C] 16.05[/C][C]-0.0517[/C][/ROW]
[ROW][C]13[/C][C] 16[/C][C] 15.22[/C][C] 0.7817[/C][/ROW]
[ROW][C]14[/C][C] 15[/C][C] 15.25[/C][C]-0.2543[/C][/ROW]
[ROW][C]15[/C][C] 16[/C][C] 15.54[/C][C] 0.4565[/C][/ROW]
[ROW][C]16[/C][C] 16[/C][C] 15.67[/C][C] 0.3256[/C][/ROW]
[ROW][C]17[/C][C] 15[/C][C] 17.1[/C][C]-2.097[/C][/ROW]
[ROW][C]18[/C][C] 17[/C][C] 16.52[/C][C] 0.4762[/C][/ROW]
[ROW][C]19[/C][C] 13[/C][C] 14.64[/C][C]-1.645[/C][/ROW]
[ROW][C]20[/C][C] 17[/C][C] 14.75[/C][C] 2.247[/C][/ROW]
[ROW][C]21[/C][C] 14[/C][C] 15.29[/C][C]-1.29[/C][/ROW]
[ROW][C]22[/C][C] 14[/C][C] 14.26[/C][C]-0.2608[/C][/ROW]
[ROW][C]23[/C][C] 18[/C][C] 15.77[/C][C] 2.231[/C][/ROW]
[ROW][C]24[/C][C] 17[/C][C] 17.08[/C][C]-0.0813[/C][/ROW]
[ROW][C]25[/C][C] 16[/C][C] 16.97[/C][C]-0.9732[/C][/ROW]
[ROW][C]26[/C][C] 15[/C][C] 15.69[/C][C]-0.6905[/C][/ROW]
[ROW][C]27[/C][C] 15[/C][C] 15.53[/C][C]-0.5264[/C][/ROW]
[ROW][C]28[/C][C] 15[/C][C] 14.81[/C][C] 0.1884[/C][/ROW]
[ROW][C]29[/C][C] 13[/C][C] 15.68[/C][C]-2.681[/C][/ROW]
[ROW][C]30[/C][C] 17[/C][C] 16.18[/C][C] 0.8241[/C][/ROW]
[ROW][C]31[/C][C] 11[/C][C] 16.21[/C][C]-5.212[/C][/ROW]
[ROW][C]32[/C][C] 14[/C][C] 14.64[/C][C]-0.6448[/C][/ROW]
[ROW][C]33[/C][C] 13[/C][C] 15.71[/C][C]-2.71[/C][/ROW]
[ROW][C]34[/C][C] 17[/C][C] 14.44[/C][C] 2.559[/C][/ROW]
[ROW][C]35[/C][C] 16[/C][C] 15.29[/C][C] 0.7096[/C][/ROW]
[ROW][C]36[/C][C] 17[/C][C] 16.67[/C][C] 0.3349[/C][/ROW]
[ROW][C]37[/C][C] 16[/C][C] 15.25[/C][C] 0.7457[/C][/ROW]
[ROW][C]38[/C][C] 16[/C][C] 16.15[/C][C]-0.1531[/C][/ROW]
[ROW][C]39[/C][C] 16[/C][C] 13.76[/C][C] 2.241[/C][/ROW]
[ROW][C]40[/C][C] 15[/C][C] 16.7[/C][C]-1.697[/C][/ROW]
[ROW][C]41[/C][C] 12[/C][C] 14.22[/C][C]-2.215[/C][/ROW]
[ROW][C]42[/C][C] 17[/C][C] 14.8[/C][C] 2.205[/C][/ROW]
[ROW][C]43[/C][C] 14[/C][C] 15.2[/C][C]-1.202[/C][/ROW]
[ROW][C]44[/C][C] 14[/C][C] 15.79[/C][C]-1.789[/C][/ROW]
[ROW][C]45[/C][C] 16[/C][C] 15.23[/C][C] 0.7684[/C][/ROW]
[ROW][C]46[/C][C] 15[/C][C] 14.32[/C][C] 0.6833[/C][/ROW]
[ROW][C]47[/C][C] 16[/C][C] 15.6[/C][C] 0.3977[/C][/ROW]
[ROW][C]48[/C][C] 14[/C][C] 15.64[/C][C]-1.638[/C][/ROW]
[ROW][C]49[/C][C] 15[/C][C] 15.3[/C][C]-0.3036[/C][/ROW]
[ROW][C]50[/C][C] 17[/C][C] 15.1[/C][C] 1.899[/C][/ROW]
[ROW][C]51[/C][C] 10[/C][C] 13.77[/C][C]-3.773[/C][/ROW]
[ROW][C]52[/C][C] 17[/C][C] 15.73[/C][C] 1.267[/C][/ROW]
[ROW][C]53[/C][C] 20[/C][C] 15.82[/C][C] 4.182[/C][/ROW]
[ROW][C]54[/C][C] 17[/C][C] 16.08[/C][C] 0.9189[/C][/ROW]
[ROW][C]55[/C][C] 18[/C][C] 16.54[/C][C] 1.463[/C][/ROW]
[ROW][C]56[/C][C] 14[/C][C] 13.38[/C][C] 0.6181[/C][/ROW]
[ROW][C]57[/C][C] 17[/C][C] 15.99[/C][C] 1.014[/C][/ROW]
[ROW][C]58[/C][C] 17[/C][C] 15.66[/C][C] 1.339[/C][/ROW]
[ROW][C]59[/C][C] 16[/C][C] 16.09[/C][C]-0.08772[/C][/ROW]
[ROW][C]60[/C][C] 18[/C][C] 15.72[/C][C] 2.283[/C][/ROW]
[ROW][C]61[/C][C] 18[/C][C] 17[/C][C] 0.9974[/C][/ROW]
[ROW][C]62[/C][C] 16[/C][C] 16.63[/C][C]-0.6319[/C][/ROW]
[ROW][C]63[/C][C] 15[/C][C] 16.05[/C][C]-1.052[/C][/ROW]
[ROW][C]64[/C][C] 13[/C][C] 15.93[/C][C]-2.928[/C][/ROW]
[ROW][C]65[/C][C] 16[/C][C] 15.64[/C][C] 0.3617[/C][/ROW]
[ROW][C]66[/C][C] 12[/C][C] 14.66[/C][C]-2.658[/C][/ROW]
[ROW][C]67[/C][C] 16[/C][C] 15.16[/C][C] 0.8404[/C][/ROW]
[ROW][C]68[/C][C] 16[/C][C] 15.11[/C][C] 0.8926[/C][/ROW]
[ROW][C]69[/C][C] 16[/C][C] 17.06[/C][C]-1.061[/C][/ROW]
[ROW][C]70[/C][C] 14[/C][C] 14.84[/C][C]-0.841[/C][/ROW]
[ROW][C]71[/C][C] 15[/C][C] 15.36[/C][C]-0.3624[/C][/ROW]
[ROW][C]72[/C][C] 14[/C][C] 14.82[/C][C]-0.8182[/C][/ROW]
[ROW][C]73[/C][C] 15[/C][C] 15.54[/C][C]-0.5369[/C][/ROW]
[ROW][C]74[/C][C] 15[/C][C] 14.69[/C][C] 0.3126[/C][/ROW]
[ROW][C]75[/C][C] 16[/C][C] 14.63[/C][C] 1.371[/C][/ROW]
[ROW][C]76[/C][C] 11[/C][C] 12.71[/C][C]-1.707[/C][/ROW]
[ROW][C]77[/C][C] 18[/C][C] 15.51[/C][C] 2.492[/C][/ROW]
[ROW][C]78[/C][C] 11[/C][C] 14.24[/C][C]-3.245[/C][/ROW]
[ROW][C]79[/C][C] 18[/C][C] 16.65[/C][C] 1.345[/C][/ROW]
[ROW][C]80[/C][C] 15[/C][C] 16.23[/C][C]-1.225[/C][/ROW]
[ROW][C]81[/C][C] 19[/C][C] 17.49[/C][C] 1.505[/C][/ROW]
[ROW][C]82[/C][C] 17[/C][C] 16.97[/C][C] 0.02676[/C][/ROW]
[ROW][C]83[/C][C] 14[/C][C] 15.84[/C][C]-1.841[/C][/ROW]
[ROW][C]84[/C][C] 13[/C][C] 14.28[/C][C]-1.281[/C][/ROW]
[ROW][C]85[/C][C] 17[/C][C] 15.67[/C][C] 1.326[/C][/ROW]
[ROW][C]86[/C][C] 14[/C][C] 16.15[/C][C]-2.146[/C][/ROW]
[ROW][C]87[/C][C] 19[/C][C] 15.99[/C][C] 3.014[/C][/ROW]
[ROW][C]88[/C][C] 14[/C][C] 15.17[/C][C]-1.166[/C][/ROW]
[ROW][C]89[/C][C] 16[/C][C] 15.71[/C][C] 0.2896[/C][/ROW]
[ROW][C]90[/C][C] 16[/C][C] 14.96[/C][C] 1.038[/C][/ROW]
[ROW][C]91[/C][C] 15[/C][C] 15.83[/C][C]-0.8345[/C][/ROW]
[ROW][C]92[/C][C] 12[/C][C] 15.73[/C][C]-3.726[/C][/ROW]
[ROW][C]93[/C][C] 17[/C][C] 16.73[/C][C] 0.2733[/C][/ROW]
[ROW][C]94[/C][C] 18[/C][C] 15.7[/C][C] 2.303[/C][/ROW]
[ROW][C]95[/C][C] 15[/C][C] 15.3[/C][C]-0.297[/C][/ROW]
[ROW][C]96[/C][C] 18[/C][C] 14.78[/C][C] 3.224[/C][/ROW]
[ROW][C]97[/C][C] 15[/C][C] 16.5[/C][C]-1.501[/C][/ROW]
[ROW][C]98[/C][C] 16[/C][C] 15.71[/C][C] 0.2896[/C][/ROW]
[ROW][C]99[/C][C] 16[/C][C] 14.8[/C][C] 1.205[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=306634&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=4
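
In this table the Interpolation column is the in-sample fitted value and the Residuals column is actual minus fitted; a sketch that rebuilds it from the assumed fit object:

data.frame(index    = seq_along(fitted(fit)),
           actual   = model.response(model.frame(fit)),
           fitted   = round(fitted(fit), 2),
           residual = round(residuals(fit), 4))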








Goldfeld-Quandt test for Heteroskedasticity
p-values per Alternative Hypothesis
breakpoint index | greater | 2-sided | less
7 0.2432 0.4863 0.7568
8 0.2811 0.5622 0.7189
9 0.2365 0.4731 0.7635
10 0.2255 0.4511 0.7745
11 0.1469 0.2938 0.8531
12 0.102 0.2039 0.898
13 0.05938 0.1188 0.9406
14 0.03841 0.07682 0.9616
15 0.02093 0.04187 0.9791
16 0.01079 0.02159 0.9892
17 0.02366 0.04731 0.9763
18 0.01386 0.02771 0.9861
19 0.03199 0.06398 0.968
20 0.04368 0.08736 0.9563
21 0.03887 0.07774 0.9611
22 0.0254 0.05079 0.9746
23 0.04848 0.09696 0.9515
24 0.03182 0.06365 0.9682
25 0.02378 0.04757 0.9762
26 0.0159 0.0318 0.9841
27 0.01178 0.02357 0.9882
28 0.007162 0.01432 0.9928
29 0.02287 0.04574 0.9771
30 0.01757 0.03514 0.9824
31 0.2527 0.5053 0.7473
32 0.22 0.44 0.78
33 0.2953 0.5905 0.7047
34 0.367 0.7339 0.633
35 0.3203 0.6406 0.6797
36 0.2694 0.5388 0.7306
37 0.2303 0.4606 0.7697
38 0.1862 0.3723 0.8138
39 0.1953 0.3907 0.8047
40 0.1849 0.3699 0.8151
41 0.2631 0.5263 0.7369
42 0.2857 0.5715 0.7143
43 0.2658 0.5316 0.7342
44 0.2792 0.5583 0.7208
45 0.2396 0.4791 0.7604
46 0.2002 0.4003 0.7998
47 0.164 0.328 0.836
48 0.1599 0.3199 0.8401
49 0.1279 0.2558 0.8721
50 0.1346 0.2692 0.8654
51 0.3465 0.6931 0.6535
52 0.3265 0.6531 0.6735
53 0.5987 0.8026 0.4013
54 0.5595 0.881 0.4405
55 0.5446 0.9109 0.4554
56 0.5022 0.9956 0.4978
57 0.4631 0.9262 0.5369
58 0.4484 0.8969 0.5516
59 0.3893 0.7785 0.6107
60 0.4324 0.8647 0.5676
61 0.3918 0.7837 0.6082
62 0.3422 0.6844 0.6578
63 0.3067 0.6133 0.6933
64 0.4448 0.8897 0.5552
65 0.3862 0.7724 0.6138
66 0.4792 0.9584 0.5208
67 0.434 0.8681 0.566
68 0.3823 0.7647 0.6177
69 0.3476 0.6953 0.6524
70 0.2979 0.5958 0.7021
71 0.2453 0.4906 0.7547
72 0.2016 0.4033 0.7984
73 0.1652 0.3303 0.8348
74 0.1274 0.2548 0.8726
75 0.1108 0.2215 0.8892
76 0.1015 0.203 0.8985
77 0.1181 0.2362 0.8819
78 0.2321 0.4641 0.7679
79 0.2228 0.4455 0.7772
80 0.1908 0.3815 0.8092
81 0.1783 0.3566 0.8217
82 0.1311 0.2623 0.8689
83 0.1165 0.233 0.8835
84 0.1613 0.3227 0.8387
85 0.1275 0.2551 0.8725
86 0.1278 0.2557 0.8722
87 0.229 0.4579 0.771
88 0.2217 0.4435 0.7783
89 0.1475 0.295 0.8525
90 0.09097 0.1819 0.909
91 0.05384 0.1077 0.9462
92 0.7849 0.4302 0.2151

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
7 &  0.2432 &  0.4863 &  0.7568 \tabularnewline
8 &  0.2811 &  0.5622 &  0.7189 \tabularnewline
9 &  0.2365 &  0.4731 &  0.7635 \tabularnewline
10 &  0.2255 &  0.4511 &  0.7745 \tabularnewline
11 &  0.1469 &  0.2938 &  0.8531 \tabularnewline
12 &  0.102 &  0.2039 &  0.898 \tabularnewline
13 &  0.05938 &  0.1188 &  0.9406 \tabularnewline
14 &  0.03841 &  0.07682 &  0.9616 \tabularnewline
15 &  0.02093 &  0.04187 &  0.9791 \tabularnewline
16 &  0.01079 &  0.02159 &  0.9892 \tabularnewline
17 &  0.02366 &  0.04731 &  0.9763 \tabularnewline
18 &  0.01386 &  0.02771 &  0.9861 \tabularnewline
19 &  0.03199 &  0.06398 &  0.968 \tabularnewline
20 &  0.04368 &  0.08736 &  0.9563 \tabularnewline
21 &  0.03887 &  0.07774 &  0.9611 \tabularnewline
22 &  0.0254 &  0.05079 &  0.9746 \tabularnewline
23 &  0.04848 &  0.09696 &  0.9515 \tabularnewline
24 &  0.03182 &  0.06365 &  0.9682 \tabularnewline
25 &  0.02378 &  0.04757 &  0.9762 \tabularnewline
26 &  0.0159 &  0.0318 &  0.9841 \tabularnewline
27 &  0.01178 &  0.02357 &  0.9882 \tabularnewline
28 &  0.007162 &  0.01432 &  0.9928 \tabularnewline
29 &  0.02287 &  0.04574 &  0.9771 \tabularnewline
30 &  0.01757 &  0.03514 &  0.9824 \tabularnewline
31 &  0.2527 &  0.5053 &  0.7473 \tabularnewline
32 &  0.22 &  0.44 &  0.78 \tabularnewline
33 &  0.2953 &  0.5905 &  0.7047 \tabularnewline
34 &  0.367 &  0.7339 &  0.633 \tabularnewline
35 &  0.3203 &  0.6406 &  0.6797 \tabularnewline
36 &  0.2694 &  0.5388 &  0.7306 \tabularnewline
37 &  0.2303 &  0.4606 &  0.7697 \tabularnewline
38 &  0.1862 &  0.3723 &  0.8138 \tabularnewline
39 &  0.1953 &  0.3907 &  0.8047 \tabularnewline
40 &  0.1849 &  0.3699 &  0.8151 \tabularnewline
41 &  0.2631 &  0.5263 &  0.7369 \tabularnewline
42 &  0.2857 &  0.5715 &  0.7143 \tabularnewline
43 &  0.2658 &  0.5316 &  0.7342 \tabularnewline
44 &  0.2792 &  0.5583 &  0.7208 \tabularnewline
45 &  0.2396 &  0.4791 &  0.7604 \tabularnewline
46 &  0.2002 &  0.4003 &  0.7998 \tabularnewline
47 &  0.164 &  0.328 &  0.836 \tabularnewline
48 &  0.1599 &  0.3199 &  0.8401 \tabularnewline
49 &  0.1279 &  0.2558 &  0.8721 \tabularnewline
50 &  0.1346 &  0.2692 &  0.8654 \tabularnewline
51 &  0.3465 &  0.6931 &  0.6535 \tabularnewline
52 &  0.3265 &  0.6531 &  0.6735 \tabularnewline
53 &  0.5987 &  0.8026 &  0.4013 \tabularnewline
54 &  0.5595 &  0.881 &  0.4405 \tabularnewline
55 &  0.5446 &  0.9109 &  0.4554 \tabularnewline
56 &  0.5022 &  0.9956 &  0.4978 \tabularnewline
57 &  0.4631 &  0.9262 &  0.5369 \tabularnewline
58 &  0.4484 &  0.8969 &  0.5516 \tabularnewline
59 &  0.3893 &  0.7785 &  0.6107 \tabularnewline
60 &  0.4324 &  0.8647 &  0.5676 \tabularnewline
61 &  0.3918 &  0.7837 &  0.6082 \tabularnewline
62 &  0.3422 &  0.6844 &  0.6578 \tabularnewline
63 &  0.3067 &  0.6133 &  0.6933 \tabularnewline
64 &  0.4448 &  0.8897 &  0.5552 \tabularnewline
65 &  0.3862 &  0.7724 &  0.6138 \tabularnewline
66 &  0.4792 &  0.9584 &  0.5208 \tabularnewline
67 &  0.434 &  0.8681 &  0.566 \tabularnewline
68 &  0.3823 &  0.7647 &  0.6177 \tabularnewline
69 &  0.3476 &  0.6953 &  0.6524 \tabularnewline
70 &  0.2979 &  0.5958 &  0.7021 \tabularnewline
71 &  0.2453 &  0.4906 &  0.7547 \tabularnewline
72 &  0.2016 &  0.4033 &  0.7984 \tabularnewline
73 &  0.1652 &  0.3303 &  0.8348 \tabularnewline
74 &  0.1274 &  0.2548 &  0.8726 \tabularnewline
75 &  0.1108 &  0.2215 &  0.8892 \tabularnewline
76 &  0.1015 &  0.203 &  0.8985 \tabularnewline
77 &  0.1181 &  0.2362 &  0.8819 \tabularnewline
78 &  0.2321 &  0.4641 &  0.7679 \tabularnewline
79 &  0.2228 &  0.4455 &  0.7772 \tabularnewline
80 &  0.1908 &  0.3815 &  0.8092 \tabularnewline
81 &  0.1783 &  0.3566 &  0.8217 \tabularnewline
82 &  0.1311 &  0.2623 &  0.8689 \tabularnewline
83 &  0.1165 &  0.233 &  0.8835 \tabularnewline
84 &  0.1613 &  0.3227 &  0.8387 \tabularnewline
85 &  0.1275 &  0.2551 &  0.8725 \tabularnewline
86 &  0.1278 &  0.2557 &  0.8722 \tabularnewline
87 &  0.229 &  0.4579 &  0.771 \tabularnewline
88 &  0.2217 &  0.4435 &  0.7783 \tabularnewline
89 &  0.1475 &  0.295 &  0.8525 \tabularnewline
90 &  0.09097 &  0.1819 &  0.909 \tabularnewline
91 &  0.05384 &  0.1077 &  0.9462 \tabularnewline
92 &  0.7849 &  0.4302 &  0.2151 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=306634&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]7[/C][C] 0.2432[/C][C] 0.4863[/C][C] 0.7568[/C][/ROW]
[ROW][C]8[/C][C] 0.2811[/C][C] 0.5622[/C][C] 0.7189[/C][/ROW]
[ROW][C]9[/C][C] 0.2365[/C][C] 0.4731[/C][C] 0.7635[/C][/ROW]
[ROW][C]10[/C][C] 0.2255[/C][C] 0.4511[/C][C] 0.7745[/C][/ROW]
[ROW][C]11[/C][C] 0.1469[/C][C] 0.2938[/C][C] 0.8531[/C][/ROW]
[ROW][C]12[/C][C] 0.102[/C][C] 0.2039[/C][C] 0.898[/C][/ROW]
[ROW][C]13[/C][C] 0.05938[/C][C] 0.1188[/C][C] 0.9406[/C][/ROW]
[ROW][C]14[/C][C] 0.03841[/C][C] 0.07682[/C][C] 0.9616[/C][/ROW]
[ROW][C]15[/C][C] 0.02093[/C][C] 0.04187[/C][C] 0.9791[/C][/ROW]
[ROW][C]16[/C][C] 0.01079[/C][C] 0.02159[/C][C] 0.9892[/C][/ROW]
[ROW][C]17[/C][C] 0.02366[/C][C] 0.04731[/C][C] 0.9763[/C][/ROW]
[ROW][C]18[/C][C] 0.01386[/C][C] 0.02771[/C][C] 0.9861[/C][/ROW]
[ROW][C]19[/C][C] 0.03199[/C][C] 0.06398[/C][C] 0.968[/C][/ROW]
[ROW][C]20[/C][C] 0.04368[/C][C] 0.08736[/C][C] 0.9563[/C][/ROW]
[ROW][C]21[/C][C] 0.03887[/C][C] 0.07774[/C][C] 0.9611[/C][/ROW]
[ROW][C]22[/C][C] 0.0254[/C][C] 0.05079[/C][C] 0.9746[/C][/ROW]
[ROW][C]23[/C][C] 0.04848[/C][C] 0.09696[/C][C] 0.9515[/C][/ROW]
[ROW][C]24[/C][C] 0.03182[/C][C] 0.06365[/C][C] 0.9682[/C][/ROW]
[ROW][C]25[/C][C] 0.02378[/C][C] 0.04757[/C][C] 0.9762[/C][/ROW]
[ROW][C]26[/C][C] 0.0159[/C][C] 0.0318[/C][C] 0.9841[/C][/ROW]
[ROW][C]27[/C][C] 0.01178[/C][C] 0.02357[/C][C] 0.9882[/C][/ROW]
[ROW][C]28[/C][C] 0.007162[/C][C] 0.01432[/C][C] 0.9928[/C][/ROW]
[ROW][C]29[/C][C] 0.02287[/C][C] 0.04574[/C][C] 0.9771[/C][/ROW]
[ROW][C]30[/C][C] 0.01757[/C][C] 0.03514[/C][C] 0.9824[/C][/ROW]
[ROW][C]31[/C][C] 0.2527[/C][C] 0.5053[/C][C] 0.7473[/C][/ROW]
[ROW][C]32[/C][C] 0.22[/C][C] 0.44[/C][C] 0.78[/C][/ROW]
[ROW][C]33[/C][C] 0.2953[/C][C] 0.5905[/C][C] 0.7047[/C][/ROW]
[ROW][C]34[/C][C] 0.367[/C][C] 0.7339[/C][C] 0.633[/C][/ROW]
[ROW][C]35[/C][C] 0.3203[/C][C] 0.6406[/C][C] 0.6797[/C][/ROW]
[ROW][C]36[/C][C] 0.2694[/C][C] 0.5388[/C][C] 0.7306[/C][/ROW]
[ROW][C]37[/C][C] 0.2303[/C][C] 0.4606[/C][C] 0.7697[/C][/ROW]
[ROW][C]38[/C][C] 0.1862[/C][C] 0.3723[/C][C] 0.8138[/C][/ROW]
[ROW][C]39[/C][C] 0.1953[/C][C] 0.3907[/C][C] 0.8047[/C][/ROW]
[ROW][C]40[/C][C] 0.1849[/C][C] 0.3699[/C][C] 0.8151[/C][/ROW]
[ROW][C]41[/C][C] 0.2631[/C][C] 0.5263[/C][C] 0.7369[/C][/ROW]
[ROW][C]42[/C][C] 0.2857[/C][C] 0.5715[/C][C] 0.7143[/C][/ROW]
[ROW][C]43[/C][C] 0.2658[/C][C] 0.5316[/C][C] 0.7342[/C][/ROW]
[ROW][C]44[/C][C] 0.2792[/C][C] 0.5583[/C][C] 0.7208[/C][/ROW]
[ROW][C]45[/C][C] 0.2396[/C][C] 0.4791[/C][C] 0.7604[/C][/ROW]
[ROW][C]46[/C][C] 0.2002[/C][C] 0.4003[/C][C] 0.7998[/C][/ROW]
[ROW][C]47[/C][C] 0.164[/C][C] 0.328[/C][C] 0.836[/C][/ROW]
[ROW][C]48[/C][C] 0.1599[/C][C] 0.3199[/C][C] 0.8401[/C][/ROW]
[ROW][C]49[/C][C] 0.1279[/C][C] 0.2558[/C][C] 0.8721[/C][/ROW]
[ROW][C]50[/C][C] 0.1346[/C][C] 0.2692[/C][C] 0.8654[/C][/ROW]
[ROW][C]51[/C][C] 0.3465[/C][C] 0.6931[/C][C] 0.6535[/C][/ROW]
[ROW][C]52[/C][C] 0.3265[/C][C] 0.6531[/C][C] 0.6735[/C][/ROW]
[ROW][C]53[/C][C] 0.5987[/C][C] 0.8026[/C][C] 0.4013[/C][/ROW]
[ROW][C]54[/C][C] 0.5595[/C][C] 0.881[/C][C] 0.4405[/C][/ROW]
[ROW][C]55[/C][C] 0.5446[/C][C] 0.9109[/C][C] 0.4554[/C][/ROW]
[ROW][C]56[/C][C] 0.5022[/C][C] 0.9956[/C][C] 0.4978[/C][/ROW]
[ROW][C]57[/C][C] 0.4631[/C][C] 0.9262[/C][C] 0.5369[/C][/ROW]
[ROW][C]58[/C][C] 0.4484[/C][C] 0.8969[/C][C] 0.5516[/C][/ROW]
[ROW][C]59[/C][C] 0.3893[/C][C] 0.7785[/C][C] 0.6107[/C][/ROW]
[ROW][C]60[/C][C] 0.4324[/C][C] 0.8647[/C][C] 0.5676[/C][/ROW]
[ROW][C]61[/C][C] 0.3918[/C][C] 0.7837[/C][C] 0.6082[/C][/ROW]
[ROW][C]62[/C][C] 0.3422[/C][C] 0.6844[/C][C] 0.6578[/C][/ROW]
[ROW][C]63[/C][C] 0.3067[/C][C] 0.6133[/C][C] 0.6933[/C][/ROW]
[ROW][C]64[/C][C] 0.4448[/C][C] 0.8897[/C][C] 0.5552[/C][/ROW]
[ROW][C]65[/C][C] 0.3862[/C][C] 0.7724[/C][C] 0.6138[/C][/ROW]
[ROW][C]66[/C][C] 0.4792[/C][C] 0.9584[/C][C] 0.5208[/C][/ROW]
[ROW][C]67[/C][C] 0.434[/C][C] 0.8681[/C][C] 0.566[/C][/ROW]
[ROW][C]68[/C][C] 0.3823[/C][C] 0.7647[/C][C] 0.6177[/C][/ROW]
[ROW][C]69[/C][C] 0.3476[/C][C] 0.6953[/C][C] 0.6524[/C][/ROW]
[ROW][C]70[/C][C] 0.2979[/C][C] 0.5958[/C][C] 0.7021[/C][/ROW]
[ROW][C]71[/C][C] 0.2453[/C][C] 0.4906[/C][C] 0.7547[/C][/ROW]
[ROW][C]72[/C][C] 0.2016[/C][C] 0.4033[/C][C] 0.7984[/C][/ROW]
[ROW][C]73[/C][C] 0.1652[/C][C] 0.3303[/C][C] 0.8348[/C][/ROW]
[ROW][C]74[/C][C] 0.1274[/C][C] 0.2548[/C][C] 0.8726[/C][/ROW]
[ROW][C]75[/C][C] 0.1108[/C][C] 0.2215[/C][C] 0.8892[/C][/ROW]
[ROW][C]76[/C][C] 0.1015[/C][C] 0.203[/C][C] 0.8985[/C][/ROW]
[ROW][C]77[/C][C] 0.1181[/C][C] 0.2362[/C][C] 0.8819[/C][/ROW]
[ROW][C]78[/C][C] 0.2321[/C][C] 0.4641[/C][C] 0.7679[/C][/ROW]
[ROW][C]79[/C][C] 0.2228[/C][C] 0.4455[/C][C] 0.7772[/C][/ROW]
[ROW][C]80[/C][C] 0.1908[/C][C] 0.3815[/C][C] 0.8092[/C][/ROW]
[ROW][C]81[/C][C] 0.1783[/C][C] 0.3566[/C][C] 0.8217[/C][/ROW]
[ROW][C]82[/C][C] 0.1311[/C][C] 0.2623[/C][C] 0.8689[/C][/ROW]
[ROW][C]83[/C][C] 0.1165[/C][C] 0.233[/C][C] 0.8835[/C][/ROW]
[ROW][C]84[/C][C] 0.1613[/C][C] 0.3227[/C][C] 0.8387[/C][/ROW]
[ROW][C]85[/C][C] 0.1275[/C][C] 0.2551[/C][C] 0.8725[/C][/ROW]
[ROW][C]86[/C][C] 0.1278[/C][C] 0.2557[/C][C] 0.8722[/C][/ROW]
[ROW][C]87[/C][C] 0.229[/C][C] 0.4579[/C][C] 0.771[/C][/ROW]
[ROW][C]88[/C][C] 0.2217[/C][C] 0.4435[/C][C] 0.7783[/C][/ROW]
[ROW][C]89[/C][C] 0.1475[/C][C] 0.295[/C][C] 0.8525[/C][/ROW]
[ROW][C]90[/C][C] 0.09097[/C][C] 0.1819[/C][C] 0.909[/C][/ROW]
[ROW][C]91[/C][C] 0.05384[/C][C] 0.1077[/C][C] 0.9462[/C][/ROW]
[ROW][C]92[/C][C] 0.7849[/C][C] 0.4302[/C][C] 0.2151[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=306634&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=5
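
The table reports, for every candidate breakpoint, the Goldfeld-Quandt p-value under three alternatives; a sketch with lmtest::gqtest(), using the same breakpoint range as the module's loop further below (the fit object is assumed from the earlier sketch):

library(lmtest)
k <- length(coef(fit))                 # number of estimated parameters (4)
n <- nobs(fit)                         # number of observations (99)
breaks <- (k + 3):(n - k - 3)          # breakpoints 7..92, as in the table
gq <- t(sapply(breaks, function(bp)
  sapply(c("greater", "two.sided", "less"), function(alt)
    gqtest(fit, point = bp, alternative = alt)$p.value)))
rownames(gq) <- breaks
head(gq)                               # p-values per breakpoint and alternative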








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description | # significant tests | % significant tests | OK/NOK
1% type I error level | 0 | 0 | OK
5% type I error level | 10 | 0.116279 | NOK
10% type I error level | 17 | 0.197674 | NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 &  0 & OK \tabularnewline
5% type I error level & 10 & 0.116279 & NOK \tabularnewline
10% type I error level & 17 & 0.197674 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=306634&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]10[/C][C]0.116279[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]17[/C][C]0.197674[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=306634&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=6
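
The meta-analysis simply counts how often the 2-sided p-value falls below each type I error level and flags NOK when that share exceeds the level; a sketch building on the gq matrix from the previous sketch:

p2 <- gq[, "two.sided"]
sapply(c(0.01, 0.05, 0.10), function(a)
  c(level = a, n.sig = sum(p2 < a), share = mean(p2 < a)))
# OK when the observed share stays below the nominal level, NOK otherwise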








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.2921, df1 = 2, df2 = 93, p-value = 0.2796
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.0568, df1 = 6, df2 = 89, p-value = 0.3946
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.080494, df1 = 2, df2 = 93, p-value = 0.9227

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.2921, df1 = 2, df2 = 93, p-value = 0.2796
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.0568, df1 = 6, df2 = 89, p-value = 0.3946
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.080494, df1 = 2, df2 = 93, p-value = 0.9227
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=306634&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.2921, df1 = 2, df2 = 93, p-value = 0.2796
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.0568, df1 = 6, df2 = 89, p-value = 0.3946
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.080494, df1 = 2, df2 = 93, p-value = 0.9227
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=306634&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=7
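
These three RESET variants come from lmtest::resettest() applied to the same model; a sketch (the fit object is assumed from the earlier sketch):

library(lmtest)
resettest(fit, power = 2:3, type = "fitted")     # powers of the fitted values
resettest(fit, power = 2:3, type = "regressor")  # powers of the regressors
resettest(fit, power = 2:3, type = "princomp")   # powers of the principal components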








Variance Inflation Factors (Multicollinearity)
> vif
Bevr_Leeftijd        ITHSUM      SKEOUSUM 
     1.001221      1.124252      1.125474 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
Bevr_Leeftijd        ITHSUM      SKEOUSUM 
     1.001221      1.124252      1.125474 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=306634&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
Bevr_Leeftijd        ITHSUM      SKEOUSUM 
     1.001221      1.124252      1.125474 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=306634&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=306634&T=8
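
The variance inflation factors are computed with car::vif(); values this close to 1 indicate little multicollinearity among the three regressors. A sketch, assuming the same fit object:

library(car)
vif(fit)   # roughly 1.00 (Bevr_Leeftijd), 1.12 (ITHSUM), 1.13 (SKEOUSUM)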




Parameters (Session):
Parameters (R input):
par1 = ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
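# transpose the input, drop rows with missing values, and move the endogenous column (par1) to the front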
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
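# if there are enough observations, run a Goldfeld-Quandt test at every admissible breakpoint and count the significant ones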
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
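# diagnostic plots: actuals vs. interpolation, residuals, their distribution, lag plot, ACF/PACF, and standard lm diagnostics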
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
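# build the output tables shown above: estimated equation, OLS coefficients, regression statistics, residuals, and Goldfeld-Quandt results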
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')