Free Statistics

of Irreproducible Research!

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 02 Dec 2015 13:53:58 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/02/t144906450808yfofsfp0a3887.htm/, Retrieved Sat, 18 May 2024 17:20:31 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=284838, Retrieved Sat, 18 May 2024 17:20:31 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 99
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-      [ML Fitting and QQ Plot - Normal Distribution] [] [2015-09-28 10:11:33] [32b17a345b130fdf5cc88718ed94a974]
- RMPD [Multiple Regression] [dodenaantal per stad ("number of deaths per city")] [2015-12-02 13:53:58] [dde0885d390165e184eb3f8febefa56e] [Current]

Dataseries X:
8	78	284	9.100000381
9.300000191	68	433	8.699999809
7.5	70	739	7.199999809
8.899999619	96	1792	8.899999619
10.19999981	74	477	8.300000191
8.300000191	111	362	10.89999962
8.800000191	77	671	10
8.800000191	168	636	9.100000381
10.69999981	82	329	8.699999809
11.69999981	89	634	7.599999905
8.5	149	631	10.80000019
8.300000191	60	257	9.5
8.199999809	96	284	8.800000191
7.900000095	83	603	9.5
10.30000019	130	686	8.699999809
7.400000095	145	345	11.19999981
9.600000381	112	1357	9.699999809
9.300000191	131	544	9.600000381
10.60000038	80	205	9.100000381
9.699999809	130	1264	9.199999809
11.60000038	140	688	8.300000191
8.100000381	154	354	8.399999619
9.800000191	118	1632	9.399999619
7.400000095	94	348	9.800000191
9.399999619	119	370	10.39999962
11.19999981	153	648	9.899999619
9.100000381	116	366	9.199999809
10.5	97	540	10.30000019
11.89999962	176	680	8.899999619
8.399999619	75	345	9.600000381
5	134	525	10.30000019
9.800000191	161	870	10.39999962
9.800000191	111	669	9.699999809
10.80000019	114	452	9.600000381
10.10000038	142	430	10.69999981
10.89999962	238	822	10.30000019
9.199999809	78	190	10.69999981
8.300000191	196	867	9.600000381
7.300000191	125	969	10.5
9.399999619	82	499	7.699999809
9.399999619	125	925	10.19999981
9.800000191	129	353	9.899999619
3.599999905	84	288	8.399999619
8.399999619	183	718	10.39999962
10.80000019	119	540	9.199999809
10.10000038	180	668	13
9	82	347	8.800000191
10	71	345	9.199999809
11.30000019	118	463	7.800000191
11.30000019	121	728	8.199999809
12.80000019	68	383	7.400000095
10	112	316	10.39999962
6.699999809	109	388	8.899999619
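
For readers who want to work with the series directly, the following is a minimal sketch of how the four tab-separated columns above can be read into R as a data frame. The file name 'dataseries.txt' and the object name df are illustrative assumptions; the column names V1-V4 match the variable names used in the regression output below, with V1 (the first column) as the dependent variable, in line with the session parameter par1 = 1.

# Sketch: read the tab-separated data block above (saved as 'dataseries.txt',
# no header) into a data frame with the column names used in the output below.
df <- read.table('dataseries.txt', header = FALSE, sep = '\t',
                 col.names = c('V1', 'V2', 'V3', 'V4'))
str(df)   # 53 observations of 4 numeric variables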




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=284838&T=0







Multiple Linear Regression - Estimated Regression Equation
V1[t] = 11.9402 + 0.00944821 V2[t] + 0.000271286 V3[t] - 0.412411 V4[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=284838&T=1
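
The estimated equation can be reproduced with an ordinary least squares fit in R. This is a minimal sketch, assuming the data frame df from the sketch above; it is not the module code itself, which is listed in full at the bottom of this page.

# Sketch: OLS fit of V1 on V2, V3 and V4, assuming the data frame 'df' from above.
fit <- lm(V1 ~ V2 + V3 + V4, data = df)
coef(fit)      # should be close to 11.9402, 0.00944821, 0.000271286, -0.412411
summary(fit)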







Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter    S.D.        T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)  +11.94       2.069       +5.7710e+00                  5.254e-07        2.627e-07
V2           +0.009448    0.007041    +1.3420e+00                  0.1858           0.09292
V3           +0.0002713   0.0007231   +3.7520e-01                  0.7092           0.3546
V4           -0.4124      0.2371      -1.7400e+00                  0.0882           0.0441

Source: https://freestatistics.org/blog/index.php?pk=284838&T=2
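
The 1-tail p-value reported for each coefficient is simply half of the 2-tail p-value (for V4, 0.0882 / 2 = 0.0441). A sketch of how both columns can be obtained from the fitted model, again assuming the fit object from the sketch above:

# Sketch: coefficient table with 2-tail and 1-tail p-values, assuming 'fit' from above.
cf <- summary(fit)$coefficients   # Estimate, Std. Error, t value, Pr(>|t|)
p2 <- cf[, 4]                     # 2-tail p-values
p1 <- p2 / 2                      # 1-tail p-values, as reported in the table above
cbind(cf, '1-tail p-value' = p1)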







Multiple Linear Regression - Regression Statistics
Multiple R: 0.2771
R-squared: 0.07681
Adjusted R-squared: 0.02029
F-TEST (value): 1.359
F-TEST (DF numerator): 3
F-TEST (DF denominator): 49
p-value: 0.2663
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.646
Sum Squared Residuals: 132.7

Source: https://freestatistics.org/blog/index.php?pk=284838&T=3
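
With an R-squared of about 0.077 and an overall F-test p-value of 0.27, the three regressors jointly explain very little of the variation in V1. The statistics in the table can be pulled from the same fitted model; a sketch, assuming the fit object from the earlier sketch:

# Sketch: regression and residual statistics, assuming 'fit' from above.
s <- summary(fit)
s$r.squared          # 0.07681
s$adj.r.squared      # 0.02029
f <- s$fstatistic    # 1.359 on 3 and 49 degrees of freedom
pf(f[1], f[2], f[3], lower.tail = FALSE)   # overall p-value, 0.2663
s$sigma              # residual standard deviation, 1.646
sum(resid(fit)^2)    # sum of squared residuals, 132.7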







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 8 9.001-1.001
2 9.3 9.112 0.1878
3 7.5 9.833-2.333
4 8.9 9.663-0.7629
5 10.2 9.346 0.8542
6 8.3 8.592-0.2919
7 8.8 8.726 0.07434
8 8.8 9.947-1.147
9 10.7 9.216 1.484
10 11.7 9.819 1.881
11 8.5 9.065-0.5651
12 8.3 8.659-0.3589
13 8.2 9.295-1.095
14 7.9 8.97-1.07
15 10.3 9.767 0.5334
16 7.4 8.785-1.385
17 9.6 9.366 0.2338
18 9.3 9.366-0.06637
19 10.6 8.999 1.601
20 9.7 9.717-0.01721
21 11.6 10.03 1.573
22 8.1 10.03-1.927
23 9.8 9.621 0.1788
24 7.4 8.881-1.481
25 9.4 8.876 0.5241
26 11.2 9.479 1.721
27 9.1 9.341-0.2413
28 10.5 8.755 1.745
29 11.9 10.12 1.783
30 8.4 8.783-0.3833
31 5 9.101-4.101
32 9.8 9.408 0.3917
33 9.8 9.17 0.6299
34 10.8 9.181 1.619
35 10.1 8.986 1.114
36 10.9 10.16 0.7359
37 9.2 8.316 0.8841
38 8.3 10.07-1.768
39 7.3 9.054-1.754
40 9.4 9.675-0.2748
41 9.4 9.166 0.2344
42 9.8 9.172 0.6281
43 3.6 9.348-5.748
44 8.4 9.575-1.175
45 10.8 9.417 1.383
46 10.1 8.461 1.639
47 9 9.18-0.1799
48 10 8.91 1.09
49 11.3 9.964 1.336
50 11.3 9.899 1.401
51 12.8 9.635 3.165
52 10 8.795 1.205
53 6.7 9.405-2.705

Source: https://freestatistics.org/blog/index.php?pk=284838&T=4
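
The interpolation values are the in-sample fitted values of the model and the residuals are the corresponding prediction errors. A minimal sketch for reproducing the table, assuming the fit object and data frame df from the earlier sketches:

# Sketch: actuals, fitted values (interpolation) and residuals, assuming 'fit' and 'df'.
interp <- fitted(fit)
res    <- resid(fit)
head(data.frame(Index = seq_along(res), Actuals = df$V1,
                Interpolation = interp, Residuals = res))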







Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index   greater   2-sided   less
7 0.3041 0.6082 0.6959
8 0.1906 0.3812 0.8094
9 0.25 0.5001 0.75
10 0.3777 0.7553 0.6223
11 0.2625 0.5249 0.7375
12 0.1847 0.3693 0.8153
13 0.145 0.2901 0.855
14 0.1063 0.2127 0.8937
15 0.07684 0.1537 0.9232
16 0.05477 0.1095 0.9452
17 0.03696 0.07392 0.963
18 0.02127 0.04254 0.9787
19 0.02406 0.04813 0.9759
20 0.01402 0.02804 0.986
21 0.01478 0.02957 0.9852
22 0.02037 0.04075 0.9796
23 0.01188 0.02377 0.9881
24 0.01023 0.02045 0.9898
25 0.00678 0.01356 0.9932
26 0.009473 0.01895 0.9905
27 0.005192 0.01038 0.9948
28 0.006218 0.01244 0.9938
29 0.007032 0.01406 0.993
30 0.003889 0.007779 0.9961
31 0.05321 0.1064 0.9468
32 0.03511 0.07023 0.9649
33 0.02304 0.04609 0.977
34 0.02134 0.04268 0.9787
35 0.01537 0.03074 0.9846
36 0.01017 0.02035 0.9898
37 0.006226 0.01245 0.9938
38 0.005629 0.01126 0.9944
39 0.007428 0.01486 0.9926
40 0.003897 0.007793 0.9961
41 0.004331 0.008663 0.9957
42 0.004194 0.008389 0.9958
43 0.3171 0.6341 0.6829
44 0.261 0.5221 0.739
45 0.1711 0.3421 0.8289
46 0.1305 0.2609 0.8695

Source: https://freestatistics.org/blog/index.php?pk=284838&T=5
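
The table above reports, for every admissible breakpoint, the p-value of a Goldfeld-Quandt test that splits the sample at that observation and compares the residual variances of the two sub-samples. A sketch of the sweep with gqtest() from the lmtest package, assuming the fit object from the earlier sketch; the breakpoint range mirrors the one used by the module (k + 3 up to n - k - 3):

# Sketch: Goldfeld-Quandt p-values for all admissible breakpoints, assuming 'fit'.
library(lmtest)
n <- nobs(fit); k <- length(coef(fit))
breakpoints <- (k + 3):(n - k - 3)
gq <- t(sapply(breakpoints, function(bp)
  sapply(c('greater', 'two.sided', 'less'), function(alt)
    gqtest(fit, point = bp, alternative = alt)$p.value)))
rownames(gq) <- breakpoints
head(gq)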







Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     4                     0.1                   NOK
5% type I error level     23                    0.575                 NOK
10% type I error level    25                    0.625                 NOK

Source: https://freestatistics.org/blog/index.php?pk=284838&T=6



Parameters (Session):
par2 = grey ; par3 = FALSE ; par4 = Interval/Ratio ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
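# Prepare the data matrix: transpose the uploaded series, drop rows with
# missing values, move the selected endogenous column (par1) to the front,
# and relabel the columns accordingly.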
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1+par4,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1+par4,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
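# Goldfeld-Quandt test: when there are more than 25 observations, sweep the
# breakpoint over all admissible positions and store the p-values for the
# 'greater', 'two.sided' and 'less' alternatives; count how many two-sided
# tests are significant at the 1%, 5% and 10% levels.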
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
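# Diagnostic plots: actuals vs. interpolation, residuals, residual histogram,
# residual density, normal Q-Q plot, residual lag plot with lowess and
# regression line, residual ACF/PACF, standard lm diagnostics and (when
# computed) the Goldfeld-Quandt p-values per breakpoint.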
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
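# Build the HTML output tables (estimated equation, OLS coefficients,
# regression and residual statistics, actuals/interpolation/residuals and
# the Goldfeld-Quandt results) using the table.* helpers loaded from
# 'createtable'.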
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}