Free Statistics


Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 06 Dec 2016 16:21:44 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/06/t1481037798yv5yhlxk55hy2gb.htm/, Retrieved Fri, 01 Nov 2024 03:48:11 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297850, Retrieved Fri, 01 Nov 2024 03:48:11 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 117
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [vragen imago en p...] [2016-12-06 15:21:44] [a5a1109b2531d70fe6d77f4f71bfe676] [Current]
Dataseries X:
2	2	3	4	11
4	2	1	4	9
4	2	5	4	12
4	3	4	4	NA
3	4	3	3	NA
4	3	2	5	12
1	4	4	4	12
4	2	5	4	NA
3	NA	5	2	NA
4	4	3	4	11
2	2	2	4	12
4	2	2	3	12
4	5	4	3	15
5	4	4	4	13
4	2	4	4	12
1	3	5	4	11
2	1	2	5	NA
4	1	NA	NA	NA
4	3	2	4	9
5	4	4	4	NA
5	5	4	4	11
4	5	4	4	NA
1	1	5	4	12
4	4	3	4	NA
2	2	4	4	NA
4	4	3	4	NA
5	4	3	3	12
3	3	3	3	12
5	4	5	5	14
3	2	4	4	NA
5	2	4	4	12
2	4	3	4	9
1	2	3	4	13
NA	4	5	1	NA
4	2	3	3	13
4	4	3	4	12
3	3	3	4	NA
5	3	5	5	12
4	4	3	4	12
NA	2	3	4	12
4	3	3	4	NA
2	2	4	3	12
3	4	3	4	11
1	2	1	5	13
3	2	4	4	13
3	3	4	3	NA
3	3	3	3	NA
4	NA	4	5	13
4	4	4	4	10
4	5	5	1	NA
4	4	4	4	13
4	4	4	4	NA
2	4	3	4	NA
5	2	2	4	5
3	2	4	3	NA
3	1	3	4	10
4	3	3	3	NA
4	4	3	4	15
4	3	4	2	13
3	3	4	4	NA
4	2	3	4	12
4	3	4	4	13
4	2	5	3	13
4	4	2	4	11
4	3	3	3	NA
2	2	3	4	NA
4	4	3	3	12
4	5	4	4	12
4	4	3	4	13
4	3	4	4	14
4	2	3	4	NA
5	3	1	3	NA
3	4	4	3	NA
2	4	3	2	NA
4	4	2	4	NA
5	5	3	5	12
4	4	3	4	12
5	4	4	5	10
5	4	5	2	12
2	3	3	4	12
4	2	4	4	NA
4	4	2	4	NA
4	4	2	4	12
3	4	2	5	13
4	2	3	4	NA
2	2	4	4	14
5	1	3	4	10
3	NA	5	4	12
4	4	4	1	NA
2	4	4	4	13
4	4	3	4	11
3	3	4	3	NA
3	4	3	4	12
4	4	5	4	NA
4	4	4	3	12
4	2	4	3	13
3	4	3	4	12
4	4	4	5	9
3	1	1	3	NA
3	4	4	4	12
1	2	4	3	NA
4	3	4	4	14
3	3	4	5	NA
3	4	4	3	11
5	3	3	4	NA
5	4	5	4	NA
4	4	3	NA	NA
5	4	5	5	NA
4	4	4	4	NA
4	5	4	4	12
4	5	4	5	NA
4	2	4	3	NA
3	1	3	3	NA
4	3	4	3	12
3	3	3	4	NA
4	1	3	4	9
2	4	3	4	13
1	4	3	4	NA
5	2	2	4	10
4	4	4	4	14
3	3	3	3	10
4	4	2	4	12
4	4	4	5	NA
4	2	4	4	11
4	2	3	3	NA
2	4	4	4	14
4	4	5	4	13
4	2	4	3	12
4	2	NA	3	NA
4	2	4	4	NA
3	2	4	2	10
4	5	4	4	NA
5	2	5	3	12
2	NA	2	4	NA
5	2	4	4	12
4	4	4	4	NA
3	5	5	4	15
NA	4	4	3	NA
2	4	4	2	NA
2	3	5	5	12
2	3	2	3	12
4	1	4	4	10
4	4	5	4	12
5	5	3	4	12
3	4	4	5	NA
3	4	4	4	12
4	5	3	4	11
4	4	5	3	13
4	5	5	1	NA
4	5	3	4	NA
4	3	2	5	NA
4	5	4	4	13
4	1	5	4	11
2	3	3	4	10
5	2	3	5	9
4	2	4	4	NA
4	NA	3	4	12
4	4	2	4	NA
4	2	3	4	NA
4	5	3	4	13
2	4	4	3	10
3	5	1	5	13
3	3	4	3	NA
4	2	3	4	NA
4	4	3	4	NA
4	2	2	5	NA
4	3	3	4	12
3	3	3	4	NA
3	2	5	2	12




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: Big Analytics Cloud Computing Center

Source: https://freestatistics.org/blog/index.php?pk=297850&T=0







Multiple Linear Regression - Estimated Regression Equation
V5[t] = +10.2852 - 0.230543 V1[t] + 0.441032 V2[t] + 0.442236 V3[t] - 0.099862 V4[t] - 0.00458843 t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=297850&T=1
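
To reproduce this fit outside the module, a minimal sketch in R (assuming the complete cases of the data series above are stored in a data frame `df` with columns V1..V5, V5 being the endogenous variable; `df` and the trend construction are illustrative, the module's own code appears at the end of this page):

# Refit the reported OLS model with a linear trend term (par3 = 'Linear Trend')
df$t <- seq_len(nrow(df))
mylm <- lm(V5 ~ V1 + V2 + V3 + V4 + t, data = df)
coef(mylm)   # should match the parameters in the equation above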







Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +10.29      1.223      +8.4070e+00                  6.315e-13        3.157e-13
V1            -0.2305     0.1353     -1.7040e+00                  0.09184          0.04592
V2            +0.441      0.1299     +3.3950e+00                  0.001028         0.000514
V3            +0.4422     0.1519     +2.9120e+00                  0.004535         0.002267
V4            -0.09986    0.2253     -4.4320e-01                  0.6587           0.3293
t             -0.004588   0.005421   -8.4640e-01                  0.3996           0.1998

Source: https://freestatistics.org/blog/index.php?pk=297850&T=2
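
The two-tail p-values above are taken from summary(lm); each one-tail p-value is simply half of the corresponding two-tail value. A minimal sketch (assuming `mylm` is the fitted model from the sketch above):

mysum <- summary(mylm)
ctab  <- mysum$coefficients              # Estimate, Std. Error, t value, Pr(>|t|)
cbind(ctab, "1-tail p" = ctab[, 4] / 2)  # append the one-tail p-values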







Multiple Linear Regression - Regression Statistics
Multiple R: 0.4427
R-squared: 0.196
Adjusted R-squared: 0.1508
F-TEST (value): 4.338
F-TEST (DF numerator): 5
F-TEST (DF denominator): 89
p-value: 0.001403
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.414
Sum Squared Residuals: 177.9

Source: https://freestatistics.org/blog/index.php?pk=297850&T=3
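
All of the statistics in this table can be recovered from the same summary object; a minimal sketch (again assuming `mylm` and `mysum` from the sketches above):

sqrt(mysum$r.squared)      # Multiple R
mysum$r.squared            # R-squared
mysum$adj.r.squared        # Adjusted R-squared
f <- mysum$fstatistic      # F value with numerator and denominator df
1 - pf(f[1], f[2], f[3])   # overall F-test p-value
mysum$sigma                # residual standard deviation
sum(resid(mylm)^2)         # sum of squared residuals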







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 11 11.63-0.6289
2 9 10.28-1.279
3 12 12.04-0.04308
4 12 11.05 0.947
5 12 13.17-1.165
6 11 12.03-1.027
7 12 11.16 0.8409
8 12 10.79 1.207
9 15 13 2.004
10 13 12.22 0.7798
11 12 11.56 0.4359
12 11 13.13-2.134
13 9 11.11-2.112
14 11 12.64-1.643
15 12 12.24-0.2386
16 12 11.85 0.1497
17 12 11.87 0.1342
18 14 12.53 1.474
19 12 11.3 0.7031
20 9 12.42-3.424
21 13 11.77 1.232
22 13 11.17 1.829
23 12 11.95 0.0511
24 12 12.06-0.05735
25 12 11.94 0.06027
26 12 12.06-0.05626
27 11 12.16-1.161
28 13 10.75 2.249
29 13 11.71 1.288
30 10 12.36-2.359
31 13 12.35 0.6456
32 5 10.35-5.353
33 10 10.81-0.8105
34 15 11.9 3.102
35 13 12.09 0.9052
36 12 11.01 0.9928
37 13 11.89 1.114
38 13 11.98 1.018
39 11 11.43-0.4333
40 12 11.97 0.02924
41 12 12.75-0.7496
42 13 11.86 1.138
43 14 11.86 2.142
44 12 11.96 0.03683
45 12 11.85 0.152
46 10 11.96-1.955
47 12 12.69-0.6924
48 12 11.85 0.1458
49 12 11.39 0.6126
50 13 11.51 1.487
51 14 11.84 2.158
52 10 10.26-0.2622
53 13 12.71 0.2854
54 11 11.81-0.8067
55 12 12.03-0.03262
56 12 12.34-0.3396
57 13 11.45 1.547
58 12 12.02-0.01885
59 9 12.13-3.126
60 12 12.45-0.4519
61 14 11.78 2.224
62 11 12.54-1.543
63 12 12.65-0.6486
64 12 11.86 0.1382
65 9 10.43-1.433
66 13 12.21 0.7873
67 10 10.19-0.1922
68 14 12.18 1.815
69 10 11.63-1.627
70 12 11.29 0.709
71 11 11.29-0.2888
72 14 12.63 1.373
73 13 12.6 0.396
74 12 11.37 0.6251
75 10 11.7-1.701
76 12 11.58 0.4226
77 12 11.03 0.9692
78 15 13.25 1.747
79 12 12.5-0.4966
80 12 11.37 0.635
81 10 10.8-0.8019
82 12 12.56-0.5627
83 12 11.88 0.1159
84 12 12.34-0.3418
85 11 12.11-1.105
86 13 12.64 0.3558
87 13 12.54 0.4615
88 11 11.21-0.212
89 10 11.67-1.666
90 9 10.43-1.429
91 13 12.08 0.9221
92 10 12.64-2.635
93 13 11.31 1.685
94 12 11.18 0.8179
95 12 12.05-0.05121

Source: https://freestatistics.org/blog/index.php?pk=297850&T=4
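
The interpolation (in-sample forecast) and residual columns above are simply the fitted values and residuals of the model; a minimal sketch (assuming `mylm` from above):

interp <- fitted(mylm)    # interpolation / forecast
err    <- resid(mylm)     # residual / prediction error
head(data.frame(Actuals = interp + err, Interpolation = interp, Residuals = err))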







Goldfeld-Quandt test for Heteroskedasticity
p-values           Alternative Hypothesis
breakpoint index   greater   2-sided   less
9 0.486 0.972 0.514
10 0.4025 0.8049 0.5975
11 0.3361 0.6723 0.6639
12 0.4269 0.8539 0.5731
13 0.5898 0.8204 0.4102
14 0.5872 0.8256 0.4128
15 0.5262 0.9475 0.4738
16 0.4263 0.8525 0.5737
17 0.3357 0.6715 0.6643
18 0.4165 0.8329 0.5835
19 0.3337 0.6674 0.6663
20 0.4584 0.9168 0.5416
21 0.6098 0.7804 0.3902
22 0.5847 0.8307 0.4153
23 0.5104 0.9791 0.4895
24 0.4398 0.8796 0.5602
25 0.3692 0.7384 0.6308
26 0.3082 0.6163 0.6918
27 0.2678 0.5357 0.7322
28 0.4264 0.8528 0.5736
29 0.3754 0.7508 0.6246
30 0.5125 0.9751 0.4875
31 0.4583 0.9167 0.5417
32 0.9881 0.0239 0.01195
33 0.9854 0.02917 0.01458
34 0.997 0.005989 0.002994
35 0.9956 0.008817 0.004409
36 0.9939 0.0123 0.00615
37 0.9919 0.01623 0.008113
38 0.989 0.022 0.011
39 0.9845 0.03091 0.01545
40 0.9774 0.04528 0.02264
41 0.9713 0.05736 0.02868
42 0.9654 0.06911 0.03456
43 0.9744 0.05123 0.02562
44 0.9635 0.07299 0.0365
45 0.9491 0.1018 0.05091
46 0.9665 0.06699 0.0335
47 0.96 0.08001 0.04
48 0.9445 0.1111 0.05555
49 0.9264 0.1473 0.07365
50 0.9218 0.1564 0.07821
51 0.9486 0.1027 0.05135
52 0.9334 0.1332 0.06659
53 0.9134 0.1733 0.08663
54 0.8996 0.2008 0.1004
55 0.8688 0.2624 0.1312
56 0.8381 0.3237 0.1619
57 0.8491 0.3018 0.1509
58 0.8084 0.3832 0.1916
59 0.9596 0.08076 0.04038
60 0.9484 0.1033 0.05163
61 0.9677 0.06467 0.03234
62 0.9741 0.05184 0.02592
63 0.9772 0.04556 0.02278
64 0.9658 0.06837 0.03419
65 0.9671 0.06577 0.03288
66 0.954 0.092 0.046
67 0.9367 0.1265 0.06325
68 0.9351 0.1298 0.06489
69 0.9521 0.09585 0.04793
70 0.931 0.1379 0.06897
71 0.9066 0.1867 0.09336
72 0.9007 0.1986 0.09931
73 0.8594 0.2813 0.1406
74 0.821 0.358 0.179
75 0.8396 0.3209 0.1604
76 0.7787 0.4426 0.2213
77 0.73 0.5399 0.27
78 0.7964 0.4071 0.2036
79 0.7801 0.4398 0.2199
80 0.7714 0.4572 0.2286
81 0.7307 0.5386 0.2693
82 0.6276 0.7447 0.3724
83 0.5073 0.9854 0.4927
84 0.4317 0.8635 0.5683
85 0.3746 0.7493 0.6254
86 0.238 0.476 0.762

Source: https://freestatistics.org/blog/index.php?pk=297850&T=5
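
The table above repeats the Goldfeld-Quandt test once for every admissible breakpoint; a minimal sketch of a single test (assuming `mylm` from above and the lmtest package; breakpoint 32 is used here only because it has the smallest 2-sided p-value in the table):

library(lmtest)
gqtest(mylm, point = 32, alternative = "two.sided")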







Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    2                     0.02564               NOK
5% type I error level    10                    0.128205              NOK
10% type I error level   23                    0.294872              NOK

Source: https://freestatistics.org/blog/index.php?pk=297850&T=6
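
The meta analysis simply counts how many of the 78 two-sided p-values above fall below each nominal level; for example, 10 of 78 gives 10/78 = 0.128205 > 0.05, hence NOK. A minimal sketch (assuming `gq2` holds the vector of 2-sided p-values from the table above):

sapply(c(0.01, 0.05, 0.10), function(alpha) {
  k <- sum(gq2 < alpha)                 # number of significant tests
  c(count = k, fraction = k / length(gq2), ok = (k / length(gq2)) < alpha)
})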







Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 6.8132, df1 = 2, df2 = 87, p-value = 0.001783
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.055, df1 = 10, df2 = 79, p-value = 0.4067
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.35495, df1 = 2, df2 = 87, p-value = 0.7022

Source: https://freestatistics.org/blog/index.php?pk=297850&T=7
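
All three RESET variants reported above can be reproduced with lmtest::resettest; a minimal sketch (assuming `mylm` from above):

library(lmtest)
resettest(mylm, power = 2:3, type = "fitted")     # powers of the fitted values
resettest(mylm, power = 2:3, type = "regressor")  # powers of the regressors
resettest(mylm, power = 2:3, type = "princomp")   # powers of the first principal component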







Variance Inflation Factors (Multicollinearity)
> vif
      V1       V2       V3       V4        t 
1.017552 1.054518 1.081245 1.082508 1.050060 

Source: https://freestatistics.org/blog/index.php?pk=297850&T=8
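
The variance inflation factors are obtained from car::vif on the same model; values this close to 1 indicate that multicollinearity among V1-V4 and the trend is negligible. A minimal sketch (assuming `mylm` from above):

library(car)
vif(mylm)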



Parameters (Session):
par1 = TRUE ;
Parameters (R input):
par1 = 5 ; par2 = Do not include Seasonal Dummies ; par3 = Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
par5 <- '0'
par4 <- '0'
par3 <- 'Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '5'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
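# Build the design matrix: transpose the uploaded series to observations-by-variables,
# drop rows with missing values, and move the endogenous column (par1) to the front.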
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
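# Goldfeld-Quandt test at every admissible breakpoint (only if more than n25 observations):
# store p-values for the 'greater', 'two.sided' and 'less' alternatives and count how many
# 2-sided p-values are significant at the 1%, 5% and 10% levels.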
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
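# Diagnostic plots (test0.png - test9.png): actuals vs. interpolation, residuals,
# studentized residual histogram, residual density, QQ plot, residual lag plot,
# residual ACF/PACF, standard lm diagnostics and the Goldfeld-Quandt p-value plot.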
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
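# Build the HTML output tables with the site's 'createtable' helpers
# (table.start, table.row.start, table.element, table.row.end, table.end, table.save).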
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')
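
The module code above expects a matrix `y` with one row per uploaded variable and the table helpers loaded from 'createtable'; a minimal, hypothetical sketch of how the core regression could be rerun locally without the site's helpers (the file name and object names are illustrative):

# Stand-alone rerun of the core fit only (no plots, no HTML tables)
mydata <- read.table("dataseries.txt", header = FALSE,
                     col.names = c("V1", "V2", "V3", "V4", "V5"))
y <- t(as.matrix(mydata))          # module convention: variables in rows
x <- na.omit(t(y))                 # observations in rows, complete cases only
x <- cbind(x[, 5], x[, -5])        # endogenous column first (par1 = 5)
colnames(x)[1] <- "V5"
x <- cbind(x, t = 1:nrow(x))       # par3 = 'Linear Trend'
mylm <- lm(V5 ~ ., data = as.data.frame(x))   # equivalent to the module's lm(df)
summary(mylm)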