Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 02 Dec 2015 09:07:00 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/02/t14490473150dprigwbx8nviej.htm/, Retrieved Sat, 18 May 2024 16:17:01 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=284789, Retrieved Sat, 18 May 2024 16:17:01 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 123
Dataseries X (first column: %_winst_thuis, second column: Toeschouwers):
0.74 23867
0.76 24107
0.75 24041
0.69 24415
0.59 24496
0.71 24022
0.44 24367
0.72 23869
0.7 24495
0.71 23818
0.7 24081
0.57 24132
0.72 23651
0.58 23622
0.63 23726
0.78 23942
0.48 24573
0.58 23085
0.73 22612
0.68 22960
0.66 22921
0.74 23510
0.69 22729
0.63 23047
0.78 22850
0.59 23426
0.69 22812
0.78 22446
0.41 23567
0.68 23185
0.64 22777
0.55 23508
0.81 23193
0.81 23006
0.77 22332
0.77 22347
0.45 23061
0.57 22887
0.69 22890
0.74 22701
0.76 22467
0.83 22357
0.78 22443
0.68 22824
0.57 22906
0.78 23059
0.76 23055
0.67 22564
0.69 18570
0.59 20329
0.77 19279
0.54 19541
0.63 19517


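The two columns can be loaded into R directly by readers who want to redo the computation outside the web interface. The sketch below is illustrative only and is truncated to the first five observations; the column names are taken from the regression output further down, with 'winst_thuis' standing in for '%_winst_thuis' because '%' is not a valid character in an R name:

# Minimal sketch: read (part of) Dataseries X into a data frame.
# Extend the text block with the remaining rows listed above to reproduce the full analysis.
games <- read.table(text = "
0.74 23867
0.76 24107
0.75 24041
0.69 24415
0.59 24496
", col.names = c("winst_thuis", "Toeschouwers"))
str(games)   # 5 obs. of 2 variables in this truncated example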


Summary of computational transaction
Raw Input        view raw input (R code)
Raw Output       view raw output of R engine
Computing time   4 seconds
R Server         'Sir Ronald Aylmer Fisher' @ fisher.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=284789&T=0


Multiple Linear Regression - Estimated Regression Equation
%_winst_thuis[t] = +0.831639 - 6.83958e-06 Toeschouwers[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=284789&T=1


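Assuming the full 53-row data set has been read into the data frame 'games' sketched above, the estimated equation can be reproduced with a single lm() call:

# Sketch: refit the reported model %_winst_thuis[t] = b0 + b1 * Toeschouwers[t] + e[t].
fit <- lm(winst_thuis ~ Toeschouwers, data = games)
coef(fit)   # on the full data: intercept ~ 0.831639, slope ~ -6.83958e-06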

Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter   S.D.        T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    +0.8316     0.2449      +3.3950e+00                  0.001334         0.0006672
Toeschouwers   -6.84e-06   1.066e-05   -6.4170e-01                  0.5239           0.262

Source: https://freestatistics.org/blog/index.php?pk=284789&T=2


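The columns of the OLS table map directly onto the coefficient matrix of summary(); the 1-tail p-value reported by the module is simply half the 2-tail value. A sketch, reusing the assumed 'fit' object from above:

# Sketch: reproduce the OLS coefficient table.
ctab <- summary(fit)$coefficients
ctab[, "Estimate"]       # Parameter
ctab[, "Std. Error"]     # S.D.
ctab[, "t value"]        # T-STAT (H0: parameter = 0)
ctab[, "Pr(>|t|)"]       # 2-tail p-value
ctab[, "Pr(>|t|)"] / 2   # 1-tail p-value, as reported above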

Multiple Linear Regression - Regression Statistics
Multiple R                     0.0895
R-squared                      0.00801
Adjusted R-squared            -0.01144
F-TEST (value)                 0.4118
F-TEST (DF numerator)          1
F-TEST (DF denominator)        51
p-value                        0.5239
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation    0.1009
Sum Squared Residuals          0.5187

Source: https://freestatistics.org/blog/index.php?pk=284789&T=3


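All of these statistics are functions of the same summary object: Multiple R is the square root of R-squared, and the p-value is the upper tail of the F distribution at the reported degrees of freedom. A sketch, again assuming the 'fit' object from above:

# Sketch: reproduce the regression and residual statistics.
s <- summary(fit)
sqrt(s$r.squared)     # Multiple R
s$r.squared           # R-squared
s$adj.r.squared       # Adjusted R-squared
s$fstatistic          # F value, numerator df, denominator df
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])   # p-value of the F test
s$sigma               # Residual Standard Deviation
sum(resid(fit)^2)     # Sum Squared Residuals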


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1     0.74   0.6684    0.0716
2     0.76   0.6668    0.09324
3     0.75   0.6672    0.08279
4     0.69   0.6646    0.02535
5     0.59   0.6641   -0.0741
6     0.71   0.6673    0.04266
7     0.44   0.665    -0.225
8     0.72   0.6684    0.05162
9     0.7    0.6641    0.0359
10    0.71   0.6687    0.04127
11    0.7    0.6669    0.03307
12    0.57   0.6666   -0.09659
13    0.72   0.6699    0.05012
14    0.58   0.6701   -0.09007
15    0.63   0.6694   -0.03936
16    0.78   0.6679    0.1121
17    0.48   0.6636   -0.1836
18    0.58   0.6737   -0.09375
19    0.73   0.677     0.05302
20    0.68   0.6746    0.005398
21    0.66   0.6749   -0.01487
22    0.74   0.6708    0.06916
23    0.69   0.6762    0.01382
24    0.63   0.674    -0.04401
25    0.78   0.6754    0.1046
26    0.59   0.6714   -0.08141
27    0.69   0.6756    0.01439
28    0.78   0.6781    0.1019
29    0.41   0.6704   -0.2605
30    0.68   0.6731    0.006937
31    0.64   0.6759   -0.03585
32    0.55   0.6709   -0.1209
33    0.81   0.673     0.137
34    0.81   0.6743    0.1357
35    0.77   0.6789    0.0911
36    0.77   0.6788    0.09121
37    0.45   0.6739   -0.2239
38    0.57   0.6751   -0.1051
39    0.69   0.6751    0.01492
40    0.74   0.6764    0.06363
41    0.76   0.678     0.08203
42    0.83   0.6787    0.1513
43    0.78   0.6781    0.1019
44    0.68   0.6755    0.004468
45    0.57   0.675    -0.105
46    0.78   0.6739    0.1061
47    0.76   0.674     0.08605
48    0.67   0.6773   -0.00731
49    0.69   0.7046   -0.01463
50    0.59   0.6926   -0.1026
51    0.77   0.6998    0.07022
52    0.54   0.698    -0.158
53    0.63   0.6982   -0.06815

Source: https://freestatistics.org/blog/index.php?pk=284789&T=4


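Each row of this table is the observed value, its fitted (interpolated) value, and their difference; with the assumed 'fit' and 'games' objects from above, the three columns can be regenerated directly:

# Sketch: rebuild the Actuals / Interpolation / Residuals table.
tab <- data.frame(Index         = seq_along(fitted(fit)),
                  Actuals       = games$winst_thuis,
                  Interpolation = fitted(fit),
                  Residuals     = resid(fit))
head(tab)   # Residuals = Actuals - Interpolation by construction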






Goldfeld-Quandt test for Heteroskedasticity
(p-values under each Alternative Hypothesis)
breakpoint index   greater   2-sided   less
5 0.1013 0.2025 0.8987
6 0.0438 0.08759 0.9562
7 0.5133 0.9734 0.4867
8 0.4039 0.8078 0.5961
9 0.4109 0.8219 0.5891
10 0.3185 0.637 0.6815
11 0.2254 0.4507 0.7746
12 0.2505 0.501 0.7495
13 0.1878 0.3755 0.8122
14 0.2834 0.5669 0.7166
15 0.236 0.472 0.764
16 0.2435 0.4871 0.7565
17 0.3367 0.6735 0.6633
18 0.4381 0.8763 0.5619
19 0.3592 0.7183 0.6408
20 0.2857 0.5713 0.7143
21 0.226 0.4521 0.774
22 0.1879 0.3757 0.8121
23 0.1376 0.2752 0.8624
24 0.1095 0.219 0.8905
25 0.1003 0.2005 0.8997
26 0.09326 0.1865 0.9067
27 0.06351 0.127 0.9365
28 0.05391 0.1078 0.9461
29 0.3559 0.7118 0.6441
30 0.2834 0.5667 0.7166
31 0.2367 0.4735 0.7633
32 0.3021 0.6042 0.6979
33 0.3224 0.6447 0.6776
34 0.3401 0.6803 0.6599
35 0.2985 0.597 0.7015
36 0.2621 0.5243 0.7379
37 0.6912 0.6175 0.3088
38 0.7806 0.4387 0.2194
39 0.7106 0.5787 0.2894
40 0.626 0.7481 0.374
41 0.5492 0.9016 0.4508
42 0.6192 0.7616 0.3808
43 0.6018 0.7964 0.3982
44 0.4862 0.9723 0.5138
45 0.6029 0.7941 0.3971
46 0.5336 0.9328 0.4664
47 0.5008 0.9983 0.4992
48 0.4901 0.9801 0.5099

Source: https://freestatistics.org/blog/index.php?pk=284789&T=5


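This table is produced by calling gqtest() from the lmtest package once per candidate breakpoint, with each of the three alternative hypotheses. A condensed sketch, reusing the assumed 'fit' object (the breakpoint range 5 to 48 matches the table above):

# Sketch: Goldfeld-Quandt p-values over a range of breakpoints.
library(lmtest)
breakpoints <- 5:48
gq <- t(sapply(breakpoints, function(bp)
  sapply(c("greater", "two.sided", "less"),
         function(alt) gqtest(fit, point = bp, alternative = alt)$p.value)))
rownames(gq) <- breakpoints
head(gq)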






Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     0                     0                     OK
10% type I error level    1                     0.0227273             OK

Source: https://freestatistics.org/blog/index.php?pk=284789&T=6


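The meta analysis simply counts how many of the 2-sided Goldfeld-Quandt p-values fall below each type I error level and flags NOK when that fraction exceeds the level itself. A sketch, using the 'gq' matrix from the previous sketch:

# Sketch: meta analysis of the 2-sided Goldfeld-Quandt p-values.
p2 <- gq[, "two.sided"]
for (alpha in c(0.01, 0.05, 0.10)) {
  n_sig <- sum(p2 < alpha)
  frac  <- n_sig / length(p2)
  cat(alpha, "level:", n_sig, "significant tests, fraction", frac,
      if (frac < alpha) "OK" else "NOK", "\n")
}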


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
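# transpose the input, drop missing values, and put the endogenous series (column par1) first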
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # parentheses needed: '1:n-1' would evaluate to 0:(n-1)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1+par4,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1+par4,])
df <- as.data.frame(x)
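# regress the first column (the endogenous series) on all remaining columns by OLS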
(mylm <- lm(df))
(mysum <- summary(mylm))
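# Goldfeld-Quandt test at every admissible breakpoint; the 2-sided p-values are
# counted later against the 1%, 5% and 10% type I error levels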
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
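# diagnostic plots: actuals and interpolation, residual series, histogram, density,
# normal Q-Q plot, lag plot with lowess and regression line, residual ACF/PACF,
# and the standard lm diagnostic panel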
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
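# assemble the output tables with the 'createtable' helper functions loaded below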
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}