Statistical Computations at FreeStatistics.org

Author: the author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 10 Dec 2014 12:58:06 +0000

Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/10/t1418216315nvuu6w3736wzd5c.htm/, Retrieved Sun, 19 May 2024 16:10:48 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=265069, Retrieved Sun, 19 May 2024 16:10:48 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 77
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [MR impact consoft] [2014-12-10 12:58:06] [ec1b40d1a9751af99658fe8fca4f9eca] [Current]
Dataseries X:
12.9 12
12.2 8
12.8 11
7.4 13
6.7 11
12.6 10
14.8 7
13.3 10
11.1 15
8.2 12
11.4 12
6.4 10
10.6 10
12 14
6.3 6
11.3 12
11.9 14
9.3 11
9.6 8
10 12
6.4 15
13.8 13
10.8 11
13.8 12
11.7 7
10.9 11
16.1 7
13.4 12
9.9 12
11.5 13
8.3 9
11.7 11
9 12
9.7 15
10.8 12
10.3 6
10.4 5
12.7 13
9.3 11
11.8 6
5.9 12
11.4 10
13 6
10.8 12
12.3 11
11.3 6
11.8 12
7.9 12
12.7 8
12.3 10
11.6 11
6.7 7
10.9 12
12.1 13
13.3 14
10.1 12
5.7 6
14.3 14
8 10
13.3 12
9.3 11
12.5 10
7.6 7
15.9 12
9.2 7
9.1 12
11.1 12
13 10
14.5 10
12.2 12
12.3 12
11.4 12
8.8 8
14.6 10
12.6 5
13 10
12.6 12
13.2 11
9.9 9
7.7 12
10.5 11
13.4 10
10.9 12
4.3 10
10.3 9
11.8 11
11.2 12
11.4 7
8.6 11
13.2 12
12.6 6
5.6 9
9.9 15
8.8 10
7.7 11
9 12
7.3 12
11.4 12
13.6 11
7.9 9
10.7 11
10.3 12
8.3 12
9.6 14
14.2 8
8.5 10
13.5 9
4.9 10
6.4 9
9.6 10
11.6 12
11.1 11





\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ jenkins.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265069&T=0



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TOT[t] = 10.2434 + 0.0420621 CONFSOFTTOT[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265069&T=1
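For readers who want to reproduce the reported equation outside the module, a minimal R sketch follows. It assumes the first column of "Dataseries X" above is the dependent variable TOT and the second column is the regressor CONFSOFTTOT (consistent with par1 = 1 and with the Actuals column further down); only the first few observations are shown here.

# Minimal sketch (not the module itself): refit TOT on CONFSOFTTOT with base R.
# Only the first five observations of "Dataseries X" are listed below;
# paste in the full two columns to reproduce 10.2434 and 0.0420621 exactly.
TOT         <- c(12.9, 12.2, 12.8, 7.4, 6.7)
CONFSOFTTOT <- c(12, 8, 11, 13, 11)
fit <- lm(TOT ~ CONFSOFTTOT)
summary(fit)   # intercept and slope correspond to the equation above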



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 10.2434 & 1.10274 & 9.289 & 1.66574e-15 & 8.32872e-16 \tabularnewline
CONFSOFTTOT & 0.0420621 & 0.101843 & 0.413 & 0.680405 & 0.340202 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265069&T=2
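The columns of this table are related in a simple way: T-STAT is the parameter estimate divided by its standard deviation, the 2-tail p-value comes from a t distribution with the residual degrees of freedom (110 here, see the F-test below), and the 1-tail p-value is half the 2-tail value. A small sketch, using the CONFSOFTTOT row as an example:

# Sketch: recompute one row of the OLS table from its parts (values copied from above).
est <- 0.0420621   # parameter estimate
sd  <- 0.101843    # standard deviation of the estimate
df  <- 110         # residual degrees of freedom (112 observations minus 2 parameters)
tstat <- est / sd               # about 0.413
p2 <- 2 * pt(-abs(tstat), df)   # 2-tail p-value, about 0.680
p1 <- p2 / 2                    # 1-tail p-value, about 0.340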



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.0393483 \tabularnewline
R-squared & 0.00154829 \tabularnewline
Adjusted R-squared & -0.00752855 \tabularnewline
F-TEST (value) & 0.170576 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 110 \tabularnewline
p-value & 0.680405 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 2.48063 \tabularnewline
Sum Squared Residuals & 676.885 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265069&T=3
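Assuming a fitted model object fit as in the sketch after the regression equation, every entry in this table can be read off summary(fit):

# Sketch: where each statistic comes from in base R (fit as defined earlier).
s <- summary(fit)
sqrt(s$r.squared)    # Multiple R
s$r.squared          # R-squared
s$adj.r.squared      # Adjusted R-squared
s$fstatistic         # F-TEST value and its numerator/denominator degrees of freedom
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])   # p-value of the F-test
s$sigma              # Residual Standard Deviation
sum(s$residuals^2)   # Sum Squared Residuals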



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 12.9 & 10.7481 & 2.15189 \tabularnewline
2 & 12.2 & 10.5799 & 1.62014 \tabularnewline
3 & 12.8 & 10.706 & 2.09396 \tabularnewline
4 & 7.4 & 10.7902 & -3.39017 \tabularnewline
5 & 6.7 & 10.706 & -4.00604 \tabularnewline
6 & 12.6 & 10.664 & 1.93602 \tabularnewline
7 & 14.8 & 10.5378 & 4.2622 \tabularnewline
8 & 13.3 & 10.664 & 2.63602 \tabularnewline
9 & 11.1 & 10.8743 & 0.225708 \tabularnewline
10 & 8.2 & 10.7481 & -2.54811 \tabularnewline
11 & 11.4 & 10.7481 & 0.651894 \tabularnewline
12 & 6.4 & 10.664 & -4.26398 \tabularnewline
13 & 10.6 & 10.664 & -0.0639818 \tabularnewline
14 & 12 & 10.8322 & 1.16777 \tabularnewline
15 & 6.3 & 10.4957 & -4.19573 \tabularnewline
16 & 11.3 & 10.7481 & 0.551894 \tabularnewline
17 & 11.9 & 10.8322 & 1.06777 \tabularnewline
18 & 9.3 & 10.706 & -1.40604 \tabularnewline
19 & 9.6 & 10.5799 & -0.979858 \tabularnewline
20 & 10 & 10.7481 & -0.748106 \tabularnewline
21 & 6.4 & 10.8743 & -4.47429 \tabularnewline
22 & 13.8 & 10.7902 & 3.00983 \tabularnewline
23 & 10.8 & 10.706 & 0.0939561 \tabularnewline
24 & 13.8 & 10.7481 & 3.05189 \tabularnewline
25 & 11.7 & 10.5378 & 1.1622 \tabularnewline
26 & 10.9 & 10.706 & 0.193956 \tabularnewline
27 & 16.1 & 10.5378 & 5.5622 \tabularnewline
28 & 13.4 & 10.7481 & 2.65189 \tabularnewline
29 & 9.9 & 10.7481 & -0.848106 \tabularnewline
30 & 11.5 & 10.7902 & 0.709832 \tabularnewline
31 & 8.3 & 10.6219 & -2.32192 \tabularnewline
32 & 11.7 & 10.706 & 0.993956 \tabularnewline
33 & 9 & 10.7481 & -1.74811 \tabularnewline
34 & 9.7 & 10.8743 & -1.17429 \tabularnewline
35 & 10.8 & 10.7481 & 0.051894 \tabularnewline
36 & 10.3 & 10.4957 & -0.195733 \tabularnewline
37 & 10.4 & 10.4537 & -0.0536713 \tabularnewline
38 & 12.7 & 10.7902 & 1.90983 \tabularnewline
39 & 9.3 & 10.706 & -1.40604 \tabularnewline
40 & 11.8 & 10.4957 & 1.30427 \tabularnewline
41 & 5.9 & 10.7481 & -4.84811 \tabularnewline
42 & 11.4 & 10.664 & 0.736018 \tabularnewline
43 & 13 & 10.4957 & 2.50427 \tabularnewline
44 & 10.8 & 10.7481 & 0.051894 \tabularnewline
45 & 12.3 & 10.706 & 1.59396 \tabularnewline
46 & 11.3 & 10.4957 & 0.804267 \tabularnewline
47 & 11.8 & 10.7481 & 1.05189 \tabularnewline
48 & 7.9 & 10.7481 & -2.84811 \tabularnewline
49 & 12.7 & 10.5799 & 2.12014 \tabularnewline
50 & 12.3 & 10.664 & 1.63602 \tabularnewline
51 & 11.6 & 10.706 & 0.893956 \tabularnewline
52 & 6.7 & 10.5378 & -3.8378 \tabularnewline
53 & 10.9 & 10.7481 & 0.151894 \tabularnewline
54 & 12.1 & 10.7902 & 1.30983 \tabularnewline
55 & 13.3 & 10.8322 & 2.46777 \tabularnewline
56 & 10.1 & 10.7481 & -0.648106 \tabularnewline
57 & 5.7 & 10.4957 & -4.79573 \tabularnewline
58 & 14.3 & 10.8322 & 3.46777 \tabularnewline
59 & 8 & 10.664 & -2.66398 \tabularnewline
60 & 13.3 & 10.7481 & 2.55189 \tabularnewline
61 & 9.3 & 10.706 & -1.40604 \tabularnewline
62 & 12.5 & 10.664 & 1.83602 \tabularnewline
63 & 7.6 & 10.5378 & -2.9378 \tabularnewline
64 & 15.9 & 10.7481 & 5.15189 \tabularnewline
65 & 9.2 & 10.5378 & -1.3378 \tabularnewline
66 & 9.1 & 10.7481 & -1.64811 \tabularnewline
67 & 11.1 & 10.7481 & 0.351894 \tabularnewline
68 & 13 & 10.664 & 2.33602 \tabularnewline
69 & 14.5 & 10.664 & 3.83602 \tabularnewline
70 & 12.2 & 10.7481 & 1.45189 \tabularnewline
71 & 12.3 & 10.7481 & 1.55189 \tabularnewline
72 & 11.4 & 10.7481 & 0.651894 \tabularnewline
73 & 8.8 & 10.5799 & -1.77986 \tabularnewline
74 & 14.6 & 10.664 & 3.93602 \tabularnewline
75 & 12.6 & 10.4537 & 2.14633 \tabularnewline
76 & 13 & 10.664 & 2.33602 \tabularnewline
77 & 12.6 & 10.7481 & 1.85189 \tabularnewline
78 & 13.2 & 10.706 & 2.49396 \tabularnewline
79 & 9.9 & 10.6219 & -0.72192 \tabularnewline
80 & 7.7 & 10.7481 & -3.04811 \tabularnewline
81 & 10.5 & 10.706 & -0.206044 \tabularnewline
82 & 13.4 & 10.664 & 2.73602 \tabularnewline
83 & 10.9 & 10.7481 & 0.151894 \tabularnewline
84 & 4.3 & 10.664 & -6.36398 \tabularnewline
85 & 10.3 & 10.6219 & -0.32192 \tabularnewline
86 & 11.8 & 10.706 & 1.09396 \tabularnewline
87 & 11.2 & 10.7481 & 0.451894 \tabularnewline
88 & 11.4 & 10.5378 & 0.862204 \tabularnewline
89 & 8.6 & 10.706 & -2.10604 \tabularnewline
90 & 13.2 & 10.7481 & 2.45189 \tabularnewline
91 & 12.6 & 10.4957 & 2.10427 \tabularnewline
92 & 5.6 & 10.6219 & -5.02192 \tabularnewline
93 & 9.9 & 10.8743 & -0.974292 \tabularnewline
94 & 8.8 & 10.664 & -1.86398 \tabularnewline
95 & 7.7 & 10.706 & -3.00604 \tabularnewline
96 & 9 & 10.7481 & -1.74811 \tabularnewline
97 & 7.3 & 10.7481 & -3.44811 \tabularnewline
98 & 11.4 & 10.7481 & 0.651894 \tabularnewline
99 & 13.6 & 10.706 & 2.89396 \tabularnewline
100 & 7.9 & 10.6219 & -2.72192 \tabularnewline
101 & 10.7 & 10.706 & -0.00604391 \tabularnewline
102 & 10.3 & 10.7481 & -0.448106 \tabularnewline
103 & 8.3 & 10.7481 & -2.44811 \tabularnewline
104 & 9.6 & 10.8322 & -1.23223 \tabularnewline
105 & 14.2 & 10.5799 & 3.62014 \tabularnewline
106 & 8.5 & 10.664 & -2.16398 \tabularnewline
107 & 13.5 & 10.6219 & 2.87808 \tabularnewline
108 & 4.9 & 10.664 & -5.76398 \tabularnewline
109 & 6.4 & 10.6219 & -4.22192 \tabularnewline
110 & 9.6 & 10.664 & -1.06398 \tabularnewline
111 & 11.6 & 10.7481 & 0.851894 \tabularnewline
112 & 11.1 & 10.706 & 0.393956 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265069&T=4
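The two computed columns of this table are simply the fitted values and residuals of the regression; with the same fit object as in the earlier sketches:

# Sketch: reproduce the Interpolation (Forecast) and Residuals (Prediction Error) columns.
fitted(fit)      # interpolation / forecast for each observation
residuals(fit)   # residual = actual minus fitted value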



\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 & 0.843037 & 0.313927 & 0.156963 \tabularnewline
6 & 0.761969 & 0.476063 & 0.238031 \tabularnewline
7 & 0.67179 & 0.656421 & 0.32821 \tabularnewline
8 & 0.602247 & 0.795507 & 0.397753 \tabularnewline
9 & 0.606544 & 0.786911 & 0.393456 \tabularnewline
10 & 0.601579 & 0.796842 & 0.398421 \tabularnewline
11 & 0.50945 & 0.9811 & 0.49055 \tabularnewline
12 & 0.76511 & 0.46978 & 0.23489 \tabularnewline
13 & 0.693573 & 0.612855 & 0.306427 \tabularnewline
14 & 0.677426 & 0.645147 & 0.322574 \tabularnewline
15 & 0.847304 & 0.305392 & 0.152696 \tabularnewline
16 & 0.798357 & 0.403286 & 0.201643 \tabularnewline
17 & 0.750051 & 0.499897 & 0.249949 \tabularnewline
18 & 0.703433 & 0.593134 & 0.296567 \tabularnewline
19 & 0.642651 & 0.714698 & 0.357349 \tabularnewline
20 & 0.575973 & 0.848055 & 0.424027 \tabularnewline
21 & 0.684743 & 0.630515 & 0.315257 \tabularnewline
22 & 0.726336 & 0.547327 & 0.273664 \tabularnewline
23 & 0.66648 & 0.667039 & 0.33352 \tabularnewline
24 & 0.698171 & 0.603658 & 0.301829 \tabularnewline
25 & 0.647194 & 0.705612 & 0.352806 \tabularnewline
26 & 0.584314 & 0.831371 & 0.415686 \tabularnewline
27 & 0.759628 & 0.480744 & 0.240372 \tabularnewline
28 & 0.761853 & 0.476293 & 0.238147 \tabularnewline
29 & 0.717075 & 0.565851 & 0.282925 \tabularnewline
30 & 0.667401 & 0.665197 & 0.332599 \tabularnewline
31 & 0.673964 & 0.652072 & 0.326036 \tabularnewline
32 & 0.624082 & 0.751836 & 0.375918 \tabularnewline
33 & 0.59422 & 0.81156 & 0.40578 \tabularnewline
34 & 0.543439 & 0.913123 & 0.456561 \tabularnewline
35 & 0.483828 & 0.967655 & 0.516172 \tabularnewline
36 & 0.432856 & 0.865711 & 0.567144 \tabularnewline
37 & 0.381538 & 0.763076 & 0.618462 \tabularnewline
38 & 0.36039 & 0.720779 & 0.63961 \tabularnewline
39 & 0.325673 & 0.651345 & 0.674327 \tabularnewline
40 & 0.285092 & 0.570184 & 0.714908 \tabularnewline
41 & 0.434661 & 0.869322 & 0.565339 \tabularnewline
42 & 0.38333 & 0.766661 & 0.61667 \tabularnewline
43 & 0.377786 & 0.755573 & 0.622214 \tabularnewline
44 & 0.325521 & 0.651041 & 0.674479 \tabularnewline
45 & 0.295505 & 0.591011 & 0.704495 \tabularnewline
46 & 0.256714 & 0.513429 & 0.743286 \tabularnewline
47 & 0.221149 & 0.442298 & 0.778851 \tabularnewline
48 & 0.236573 & 0.473145 & 0.763427 \tabularnewline
49 & 0.225165 & 0.45033 & 0.774835 \tabularnewline
50 & 0.201952 & 0.403903 & 0.798048 \tabularnewline
51 & 0.169438 & 0.338876 & 0.830562 \tabularnewline
52 & 0.232611 & 0.465222 & 0.767389 \tabularnewline
53 & 0.192877 & 0.385754 & 0.807123 \tabularnewline
54 & 0.166942 & 0.333883 & 0.833058 \tabularnewline
55 & 0.165891 & 0.331781 & 0.834109 \tabularnewline
56 & 0.136309 & 0.272617 & 0.863691 \tabularnewline
57 & 0.232994 & 0.465987 & 0.767006 \tabularnewline
58 & 0.271227 & 0.542454 & 0.728773 \tabularnewline
59 & 0.27683 & 0.553661 & 0.72317 \tabularnewline
60 & 0.278531 & 0.557061 & 0.721469 \tabularnewline
61 & 0.247539 & 0.495079 & 0.752461 \tabularnewline
62 & 0.227911 & 0.455822 & 0.772089 \tabularnewline
63 & 0.244093 & 0.488185 & 0.755907 \tabularnewline
64 & 0.407005 & 0.814011 & 0.592995 \tabularnewline
65 & 0.372244 & 0.744488 & 0.627756 \tabularnewline
66 & 0.340083 & 0.680167 & 0.659917 \tabularnewline
67 & 0.292838 & 0.585676 & 0.707162 \tabularnewline
68 & 0.286959 & 0.573918 & 0.713041 \tabularnewline
69 & 0.356753 & 0.713507 & 0.643247 \tabularnewline
70 & 0.328628 & 0.657256 & 0.671372 \tabularnewline
71 & 0.305526 & 0.611051 & 0.694474 \tabularnewline
72 & 0.265639 & 0.531278 & 0.734361 \tabularnewline
73 & 0.24439 & 0.488779 & 0.75561 \tabularnewline
74 & 0.324466 & 0.648933 & 0.675534 \tabularnewline
75 & 0.303853 & 0.607707 & 0.696147 \tabularnewline
76 & 0.308611 & 0.617223 & 0.691389 \tabularnewline
77 & 0.302403 & 0.604806 & 0.697597 \tabularnewline
78 & 0.324918 & 0.649837 & 0.675082 \tabularnewline
79 & 0.275781 & 0.551562 & 0.724219 \tabularnewline
80 & 0.279188 & 0.558376 & 0.720812 \tabularnewline
81 & 0.232799 & 0.465598 & 0.767201 \tabularnewline
82 & 0.263465 & 0.52693 & 0.736535 \tabularnewline
83 & 0.222684 & 0.445368 & 0.777316 \tabularnewline
84 & 0.483979 & 0.967958 & 0.516021 \tabularnewline
85 & 0.420863 & 0.841726 & 0.579137 \tabularnewline
86 & 0.387578 & 0.775156 & 0.612422 \tabularnewline
87 & 0.34247 & 0.684941 & 0.65753 \tabularnewline
88 & 0.294999 & 0.589999 & 0.705001 \tabularnewline
89 & 0.258704 & 0.517407 & 0.741296 \tabularnewline
90 & 0.300462 & 0.600925 & 0.699538 \tabularnewline
91 & 0.308581 & 0.617161 & 0.691419 \tabularnewline
92 & 0.437676 & 0.875351 & 0.562324 \tabularnewline
93 & 0.371665 & 0.74333 & 0.628335 \tabularnewline
94 & 0.318968 & 0.637936 & 0.681032 \tabularnewline
95 & 0.30565 & 0.611299 & 0.69435 \tabularnewline
96 & 0.249676 & 0.499351 & 0.750324 \tabularnewline
97 & 0.259063 & 0.518126 & 0.740937 \tabularnewline
98 & 0.210105 & 0.420209 & 0.789895 \tabularnewline
99 & 0.27106 & 0.542119 & 0.72894 \tabularnewline
100 & 0.251529 & 0.503058 & 0.748471 \tabularnewline
101 & 0.18884 & 0.37768 & 0.81116 \tabularnewline
102 & 0.134451 & 0.268901 & 0.865549 \tabularnewline
103 & 0.0967154 & 0.193431 & 0.903285 \tabularnewline
104 & 0.0604361 & 0.120872 & 0.939564 \tabularnewline
105 & 0.136412 & 0.272824 & 0.863588 \tabularnewline
106 & 0.0829257 & 0.165851 & 0.917074 \tabularnewline
107 & 0.514336 & 0.971328 & 0.485664 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265069&T=5
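Each row of this table is a Goldfeld-Quandt test of the residual variance split at the given breakpoint; as in the R code at the end of this page, the p-values come from gqtest() in the lmtest package, once per alternative hypothesis. A minimal sketch for a single breakpoint, assuming the fit object from the earlier sketches:

# Sketch: one breakpoint of the table above, via lmtest::gqtest().
library(lmtest)
gqtest(fit, point = 50, alternative = "greater")$p.value
gqtest(fit, point = 50, alternative = "two.sided")$p.value
gqtest(fit, point = 50, alternative = "less")$p.value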



\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265069&T=6
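The meta analysis simply counts, over all breakpoints tested above, how many 2-sided p-values fall below each type I error level, and reports OK when that proportion stays below the level itself. A sketch, assuming gqarr holds the p-value matrix as built in the R code below:

# Sketch: the counts and proportions behind the meta analysis table.
p2sided <- gqarr[, 2]                      # 2-sided Goldfeld-Quandt p-values
counts  <- c(sum(p2sided < 0.01), sum(p2sided < 0.05), sum(p2sided < 0.10))
counts / length(p2sided)                   # compare with 0.01, 0.05 and 0.10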


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
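# lattice supplies densityplot() for the residual density chart;
# lmtest supplies gqtest() for the Goldfeld-Quandt heteroskedasticity test.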
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
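# Optional transformations driven by the user parameters:
# par3 = 'First Differences' replaces the series by first differences (1-B)X,
# par2 adds monthly or quarterly seasonal dummies,
# par3 = 'Linear Trend' appends a linear trend column 't'.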
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
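# Fit the multiple linear regression (the first column of df is the dependent variable)
# and keep its summary for the tables and charts below.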
(mylm <- lm(df))
(mysum <- summary(mylm))
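# Goldfeld-Quandt test at every admissible breakpoint (only computed when n > 25);
# the 2-sided p-values are also counted against the 1%, 5% and 10% levels.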
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
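# Diagnostic charts: actuals vs. interpolation, residuals, histogram, density plot,
# normal Q-Q plot, lag plot, ACF/PACF, lm() diagnostics and, when n > 25,
# the Goldfeld-Quandt p-values per breakpoint.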
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
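# Build the output tables shown on this page (equation, OLS estimates, regression
# statistics, actuals/interpolation/residuals, Goldfeld-Quandt results).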
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}