Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 06 Dec 2016 16:47:46 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/06/t1481039282wkfas7ty2gsqsdi.htm/, Retrieved Fri, 01 Nov 2024 03:29:50 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297852, Retrieved Fri, 01 Nov 2024 03:29:50 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 93
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2016-12-06 15:47:46] [a5a1109b2531d70fe6d77f4f71bfe676] [Current]
Dataseries X (five tab-separated columns, read by the module as V1, V2, V3, V4 and PWSUM; PWSUM is the endogenous series selected with par1 = 5):
2	2	3	4	11
4	2	1	4	9
4	2	5	4	12
4	3	4	4	NA
3	4	3	3	NA
4	3	2	5	12
1	4	4	4	12
4	2	5	4	NA
3	NA	5	2	NA
4	4	3	4	11
2	2	2	4	12
4	2	2	3	12
4	5	4	3	15
5	4	4	4	13
4	2	4	4	12
1	3	5	4	11
2	1	2	5	NA
4	1	NA	NA	NA
4	3	2	4	9
5	4	4	4	NA
5	5	4	4	11
4	5	4	4	NA
1	1	5	4	12
4	4	3	4	NA
2	2	4	4	NA
4	4	3	4	NA
5	4	3	3	12
3	3	3	3	12
5	4	5	5	14
3	2	4	4	NA
5	2	4	4	12
2	4	3	4	9
1	2	3	4	13
NA	4	5	1	NA
4	2	3	3	13
4	4	3	4	12
3	3	3	4	NA
5	3	5	5	12
4	4	3	4	12
NA	2	3	4	12
4	3	3	4	NA
2	2	4	3	12
3	4	3	4	11
1	2	1	5	13
3	2	4	4	13
3	3	4	3	NA
3	3	3	3	NA
4	NA	4	5	13
4	4	4	4	10
4	5	5	1	NA
4	4	4	4	13
4	4	4	4	NA
2	4	3	4	NA
5	2	2	4	5
3	2	4	3	NA
3	1	3	4	10
4	3	3	3	NA
4	4	3	4	15
4	3	4	2	13
3	3	4	4	NA
4	2	3	4	12
4	3	4	4	13
4	2	5	3	13
4	4	2	4	11
4	3	3	3	NA
2	2	3	4	NA
4	4	3	3	12
4	5	4	4	12
4	4	3	4	13
4	3	4	4	14
4	2	3	4	NA
5	3	1	3	NA
3	4	4	3	NA
2	4	3	2	NA
4	4	2	4	NA
5	5	3	5	12
4	4	3	4	12
5	4	4	5	10
5	4	5	2	12
2	3	3	4	12
4	2	4	4	NA
4	4	2	4	NA
4	4	2	4	12
3	4	2	5	13
4	2	3	4	NA
2	2	4	4	14
5	1	3	4	10
3	NA	5	4	12
4	4	4	1	NA
2	4	4	4	13
4	4	3	4	11
3	3	4	3	NA
3	4	3	4	12
4	4	5	4	NA
4	4	4	3	12
4	2	4	3	13
3	4	3	4	12
4	4	4	5	9
3	1	1	3	NA
3	4	4	4	12
1	2	4	3	NA
4	3	4	4	14
3	3	4	5	NA
3	4	4	3	11
5	3	3	4	NA
5	4	5	4	NA
4	4	3	NA	NA
5	4	5	5	NA
4	4	4	4	NA
4	5	4	4	12
4	5	4	5	NA
4	2	4	3	NA
3	1	3	3	NA
4	3	4	3	12
3	3	3	4	NA
4	1	3	4	9
2	4	3	4	13
1	4	3	4	NA
5	2	2	4	10
4	4	4	4	14
3	3	3	3	10
4	4	2	4	12
4	4	4	5	NA
4	2	4	4	11
4	2	3	3	NA
2	4	4	4	14
4	4	5	4	13
4	2	4	3	12
4	2	NA	3	NA
4	2	4	4	NA
3	2	4	2	10
4	5	4	4	NA
5	2	5	3	12
2	NA	2	4	NA
5	2	4	4	12
4	4	4	4	NA
3	5	5	4	15
NA	4	4	3	NA
2	4	4	2	NA
2	3	5	5	12
2	3	2	3	12
4	1	4	4	10
4	4	5	4	12
5	5	3	4	12
3	4	4	5	NA
3	4	4	4	12
4	5	3	4	11
4	4	5	3	13
4	5	5	1	NA
4	5	3	4	NA
4	3	2	5	NA
4	5	4	4	13
4	1	5	4	11
2	3	3	4	10
5	2	3	5	9
4	2	4	4	NA
4	NA	3	4	12
4	4	2	4	NA
4	2	3	4	NA
4	5	3	4	13
2	4	4	3	10
3	5	1	5	13
3	3	4	3	NA
4	2	3	4	NA
4	4	3	4	NA
4	2	2	5	NA
4	3	3	4	12
3	3	3	4	NA
3	2	5	2	12




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: Big Analytics Cloud Computing Center
Source: https://freestatistics.org/blog/index.php?pk=297852&T=0


Multiple Linear Regression - Estimated Regression Equation
PWSUM[t] = 10.1393 - 0.23353 V1[t] + 0.425823 V2[t] + 0.424058 V3[t] - 0.0870506 V4[t] + e[t]
Source: https://freestatistics.org/blog/index.php?pk=297852&T=1
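The coefficients in this equation can be reproduced with base R's lm(). A minimal sketch, assuming the Dataseries X block above has been saved as a tab-separated file (the filename dataseries_x.txt is hypothetical) with columns V1-V4 and PWSUM; like the module, it drops rows that contain NA:

dat <- read.table("dataseries_x.txt", header = FALSE, na.strings = "NA",
                  col.names = c("V1", "V2", "V3", "V4", "PWSUM"))
dat <- na.omit(dat)                         # keep only the complete rows (n = 95)
fit <- lm(PWSUM ~ V1 + V2 + V3 + V4, data = dat)
coef(fit)                                   # should match the equation above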








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	+10.14	1.209	+8.385	6.537e-13	3.269e-13
V1	-0.2335	0.135	-1.730	0.08714	0.04357
V2	+0.4258	0.1285	+3.315	0.001324	0.0006618
V3	+0.4241	0.1501	+2.825	0.005816	0.002908
V4	-0.08705	0.2245	-0.3878	0.6991	0.3495
Source: https://freestatistics.org/blog/index.php?pk=297852&T=2
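These columns map directly onto summary() output for the fit from the sketch above; as in the module code further below, the 1-tail p-value is simply half the 2-tail value:

tab <- summary(fit)$coefficients        # Estimate, Std. Error, t value, Pr(>|t|)
cbind(round(tab, 5), "1-tail p" = round(tab[, 4] / 2, 5))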








Multiple Linear Regression - Regression Statistics
Multiple R	0.4353
R-squared	0.1895
Adjusted R-squared	0.1535
F-TEST (value)	5.261
F-TEST (DF numerator)	4
F-TEST (DF denominator)	90
p-value	0.0007463
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	1.412
Sum Squared Residuals	179.4
Source: https://freestatistics.org/blog/index.php?pk=297852&T=3
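Each of these values can be read off the same summary object; a short sketch, again assuming the fit from the first sketch:

s <- summary(fit)
c(MultipleR = sqrt(s$r.squared), R2 = s$r.squared, adjR2 = s$adj.r.squared)
s$fstatistic                                # F value, DF numerator, DF denominator
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])   # overall p-value
c(ResidSD = s$sigma, SSR = sum(residuals(fit)^2))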








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	11	11.45	-0.4479
2	9	10.13	-1.133
3	12	11.83	0.1711
4	12	10.9	1.104
5	12	12.96	-0.9571
6	11	11.83	-0.8325
7	12	11.02	0.9762
8	12	10.64	1.356
9	15	12.77	2.231
10	13	12.02	0.977
11	12	11.4	0.5951
12	11	12.96	-1.955
13	9	10.98	-1.983
14	11	12.45	-1.449
15	12	12.1	-0.1037
16	12	11.69	0.314
17	12	11.73	0.2728
18	14	12.36	1.64
19	12	11.17	0.8287
20	9	12.3	-3.3
21	13	11.68	1.319
22	13	11.07	1.932
23	12	11.83	0.1675
24	12	11.93	0.06583
25	12	11.83	0.1675
26	12	11.96	0.04102
27	11	12.07	-1.066
28	13	10.75	2.254
29	13	11.64	1.362
30	10	12.26	-2.257
31	13	12.26	0.7435
32	5	10.32	-5.323
33	10	10.79	-0.7885
34	15	11.83	3.168
35	13	12	0.9952
36	12	10.98	1.019
37	13	11.83	1.169
38	13	11.92	1.084
39	11	11.41	-0.4084
40	12	11.92	0.08049
41	12	12.68	-0.6823
42	13	11.83	1.168
43	14	11.83	2.169
44	12	11.94	0.0623
45	12	11.83	0.1675
46	10	11.94	-1.936
47	12	12.62	-0.6211
48	12	11.87	0.1263
49	12	11.41	0.5916
50	13	11.55	1.445
51	14	11.87	2.128
52	10	10.32	-0.3215
53	13	12.72	0.2764
54	11	11.83	-0.8325
55	12	12.07	-0.06599
56	12	12.34	-0.3436
57	13	11.49	1.508
58	12	12.07	-0.06599
59	9	12.17	-3.169
60	12	12.49	-0.49
61	14	11.83	2.169
62	11	12.58	-1.577
63	12	12.68	-0.6823
64	12	11.92	0.08226
65	9	10.55	-1.555
66	13	12.3	0.7005
67	10	10.32	-0.3232
68	14	12.26	1.743
69	10	11.73	-1.727
70	12	11.41	0.5916
71	11	11.4	-0.4049
72	14	12.72	1.276
73	13	12.68	0.3194
74	12	11.49	0.5081
75	10	11.81	-1.812
76	12	11.68	0.3176
77	12	11.17	0.8287
78	15	13.34	1.66
79	12	12.63	-0.6348
80	12	11.54	0.4633
81	10	10.98	-0.979
82	12	12.68	-0.6806
83	12	12.02	-0.02475
84	12	12.49	-0.49
85	11	12.26	-1.258
86	13	12.77	0.2324
87	13	12.68	0.3177
88	11	11.4	-0.4031
89	10	11.87	-1.874
90	9	10.66	-1.66
91	13	12.26	0.7417
92	10	12.81	-2.811
93	13	11.56	1.443
94	12	11.41	0.5934
95	12	12.24	-0.2366
Source: https://freestatistics.org/blog/index.php?pk=297852&T=4
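The interpolation column is simply the vector of fitted values and the residual column the corresponding prediction errors; with the fit from the first sketch:

round(cbind(Actuals = dat$PWSUM, Interpolation = fitted(fit),
            Residuals = residuals(fit)), 4)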








Goldfeld-Quandt test for Heteroskedasticity
p-values by alternative hypothesis
breakpoint index	greater	2-sided	less
8 0.5627 0.8745 0.4373
9 0.6183 0.7635 0.3817
10 0.4779 0.9558 0.5221
11 0.3499 0.6999 0.6501
12 0.3345 0.669 0.6655
13 0.5306 0.9389 0.4694
14 0.575 0.85 0.425
15 0.4792 0.9584 0.5208
16 0.3974 0.7948 0.6026
17 0.3113 0.6226 0.6887
18 0.3464 0.6929 0.6536
19 0.2742 0.5483 0.7258
20 0.4498 0.8996 0.5502
21 0.535 0.93 0.465
22 0.5184 0.9632 0.4816
23 0.4471 0.8941 0.5529
24 0.3786 0.7572 0.6214
25 0.3142 0.6284 0.6858
26 0.256 0.512 0.744
27 0.2139 0.4278 0.7861
28 0.4044 0.8088 0.5956
29 0.3771 0.7542 0.6229
30 0.4628 0.9256 0.5372
31 0.4238 0.8476 0.5762
32 0.98 0.03991 0.01995
33 0.9746 0.05074 0.02537
34 0.9946 0.0108 0.005401
35 0.9928 0.01431 0.007153
36 0.9909 0.01812 0.009062
37 0.9894 0.0213 0.01065
38 0.9871 0.02581 0.0129
39 0.9814 0.03715 0.01857
40 0.9731 0.05372 0.02686
41 0.9644 0.07128 0.03564
42 0.9605 0.0789 0.03945
43 0.9752 0.04962 0.02481
44 0.9649 0.07017 0.03509
45 0.9514 0.09719 0.04859
46 0.9629 0.07418 0.03709
47 0.9536 0.09275 0.04638
48 0.9368 0.1265 0.06323
49 0.9197 0.1606 0.08031
50 0.9227 0.1547 0.07734
51 0.9571 0.08572 0.04286
52 0.9422 0.1157 0.05783
53 0.925 0.15 0.07501
54 0.9098 0.1805 0.09024
55 0.8812 0.2376 0.1188
56 0.8502 0.2997 0.1498
57 0.8673 0.2655 0.1327
58 0.8296 0.3408 0.1704
59 0.9601 0.07979 0.0399
60 0.9458 0.1083 0.05416
61 0.9724 0.05527 0.02764
62 0.9748 0.05049 0.02524
63 0.9718 0.05636 0.02818
64 0.9594 0.08128 0.04064
65 0.9551 0.08989 0.04494
66 0.9464 0.1071 0.05355
67 0.9241 0.1518 0.07588
68 0.9365 0.1269 0.06345
69 0.939 0.122 0.06101
70 0.9171 0.1659 0.08294
71 0.8844 0.2312 0.1156
72 0.9056 0.1889 0.09443
73 0.8687 0.2625 0.1313
74 0.8459 0.3082 0.1541
75 0.8382 0.3235 0.1618
76 0.7874 0.4252 0.2126
77 0.7705 0.459 0.2295
78 0.8458 0.3083 0.1542
79 0.8347 0.3307 0.1653
80 0.7879 0.4242 0.2121
81 0.7094 0.5812 0.2906
82 0.6117 0.7767 0.3883
83 0.5362 0.9276 0.4638
84 0.4361 0.8721 0.5639
85 0.5059 0.9882 0.4941
86 0.3656 0.7312 0.6344
87 0.2393 0.4785 0.7607
Source: https://freestatistics.org/blog/index.php?pk=297852&T=5
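Each row is one call to lmtest::gqtest() with the sample split at the given breakpoint; a sketch of a single row, using the fit from the first sketch (the module loops the breakpoint over indices 8 through 87 and over the three alternatives):

library(lmtest)
gqtest(fit, point = 32, alternative = "two.sided")$p.value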








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests (reported as a fraction)	OK/NOK
1% type I error level	0	0	OK
5% type I error level	8	0.1	NOK
10% type I error level	23	0.2875	NOK
Source: https://freestatistics.org/blog/index.php?pk=297852&T=6
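The counts tally how many of the 80 two-sided p-values above fall below each significance level; for example, under the same assumptions as the earlier sketches:

library(lmtest)
pvals <- sapply(8:87, function(bp)
  gqtest(fit, point = bp, alternative = "two.sided")$p.value)
c(sig_at_5pct = sum(pvals < 0.05), fraction = mean(pvals < 0.05))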








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 7.1741, df1 = 2, df2 = 88, p-value = 0.001299
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.0552, df1 = 8, df2 = 82, p-value = 0.4025
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.42124, df1 = 2, df2 = 88, p-value = 0.6575
Source: https://freestatistics.org/blog/index.php?pk=297852&T=7
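These three tests are produced by lmtest::resettest() with powers 2 and 3, as in the module code below; a sketch against the fit from the first sketch:

library(lmtest)
resettest(fit, power = 2:3, type = "fitted")
resettest(fit, power = 2:3, type = "regressor")
resettest(fit, power = 2:3, type = "princomp")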








Variance Inflation Factors (Multicollinearity)
> vif
      V1       V2       V3       V4 
1.016860 1.034345 1.059619 1.077623 
Source: https://freestatistics.org/blog/index.php?pk=297852&T=8
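The variance inflation factors come from car::vif() applied to the fitted model (values near 1 indicate little collinearity among V1-V4); a sketch with the fit from the first sketch:

library(car)
vif(fit)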




Parameters (Session):
par1 = TRUE ;
Parameters (R input):
par1 = 5 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
par5 <- ''
par4 <- ''
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '5'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
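# 'y' below is supplied by the surrounding FreeStatistics.org R framework as the
# data matrix built from Dataseries X (it is not defined in this listing);
# transpose it and drop incomplete rows.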
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
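# Optional differencing selected via par3: first differences, seasonal (s=12)
# differences, or both.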
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
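# Optional lagged terms of the endogenous column: par4 non-seasonal lags,
# par5 seasonal (s=12) lags.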
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
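# Optional seasonal dummies (par2: monthly or quarterly) and, via par3,
# a linear trend column.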
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
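# Goldfeld-Quandt test at every admissible breakpoint (run only when n > 25),
# storing p-values for the alternatives 'greater', 'two.sided' and 'less'.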
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
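# Diagnostic plots written as PNG bitmaps: actuals vs. interpolation, residuals,
# studentized-residual histogram, residual density, QQ plot, residual lag plot,
# ACF/PACF, and the standard lm() diagnostic panels.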
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
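# The remainder builds the output tables shown above with the framework's
# table.* helpers (loaded via load(file='createtable')) and saves them to disk.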
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')