Free Statistics

Author: (the author of this computation has been verified)

R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 14 Dec 2017 10:20:29 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2017/Dec/14/t15132467804k7qlmkh6vgjc72.htm/, Retrieved Tue, 14 May 2024 09:17:25 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=309439, Retrieved Tue, 14 May 2024 09:17:25 +0000
Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 126
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Multiple Regression] [2017-12-14 09:20:29] [fda4350e119ddbaf0177fa3308cc9af4] [Current]

Dataseries X (columns: DEPDELAY, DEPTIME, DEPDAY):
1077	757	24
120	2159	16
121	1606	17
121	2046	2
127	1107	9
129	1554	11
130	2359	8
131	1641	19
132	1032	8
132	1712	3
136	1116	26
137	1717	6
137	47	10
141	1121	12
142	2352	6
144	2354	10
149	129	12
152	942	7
158	138	26
159	1739	8
161	1241	25
189	1709	10
190	40	26
192	1702	3
200	2319	6
208	58	6
208	1209	11
216	236	17
222	2341	10
242	132	10
249	1309	16
290	1150	2
305	1215	11
306	1851	8
344	1229	22
610	42	5
741	1100	1
782	1202	25
120	1735	18
124	52	30
124	39	31
125	1920	20
126	1518	10
127	1552	18
128	2018	10
131	1301	2
133	203	6
135	205	9
135	1305	14
137	1951	25
139	1858	27
140	1859	15
142	7	28
142	17	29
143	2328	17
145	10	17
147	1943	3
148	1952	14
150	2120	5
153	1949	4
154	1619	20
157	1916	10
158	1303	16
175	1727	10
178	1937	9
178	1813	3
190	1823	30
191	1856	24
192	1520	2
192	1411	18
193	1718	31
195	310	17
195	1255	3
207	226	10
218	337	6
219	1814	26
224	1409	17
227	1900	10
228	58	10
234	1829	10
234	1739	28
242	2126	10
244	202	6
259	1854	15
273	1913	10
286	1918	8
291	2305	4
343	1728	31
568	302	15
134	1849	5
134	1347	18
138	1058	16
140	2350	15
142	2007	4
144	900	12
155	2327	13
167	2042	10
177	234	3
179	1029	18
179	1429	6
182	1821	10
210	1655	19
224	2234	2
248	2008	2
273	1837	10
293	2239	24
410	1850	3
430	1435	11
452	1408	18
454	1604	2
520	1940	3
586	16	24
617	1047	10
659	2149	10
759	1939	11
924	1216	23
941	1126	2
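
A minimal sketch of loading this series into R for the analysis that follows; the file name dataseries.txt and the column names are assumptions (the column order is taken from the estimated regression equation further down):

# read the three tab-separated columns shown above (file name is hypothetical)
df <- read.table('dataseries.txt', header = FALSE, sep = '\t',
                 col.names = c('DEPDELAY', 'DEPTIME', 'DEPDAY'))
str(df)  # 117 observations of 3 variables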




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 11 seconds
R Server: Big Analytics Cloud Computing Center


Multiple Linear Regression - Estimated Regression Equation
DEPDELAY[t] = +291.387 - 0.0261852 DEPTIME[t] - 0.673369 DEPDAY[t] + e[t]
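
The fit can be reproduced directly with lm(); a minimal sketch, assuming a data frame df with columns DEPDELAY, DEPTIME and DEPDAY as above:

mylm <- lm(DEPDELAY ~ DEPTIME + DEPDAY, data = df)  # ordinary least squares
coef(mylm)  # intercept approx. +291.387, DEPTIME approx. -0.0261852, DEPDAY approx. -0.673369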


Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	+291.4	51.23	+5.6880e+00	1.007e-07	5.036e-08
DEPTIME	-0.02618	0.02466	-1.0620e+00	0.2905	0.1452
DEPDAY	-0.6734	2.156	-3.1220e-01	0.7554	0.3777
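
The columns of this table map onto summary(mylm)$coefficients; the 1-tail p-value is simply half of the 2-tail p-value (see the module code at the end of this page). A minimal sketch, assuming mylm from the sketch above:

mysum <- summary(mylm)
est <- mysum$coefficients               # Estimate, Std. Error, t value, Pr(>|t|)
cbind(est, '1-tail p' = est[, 4] / 2)   # append the one-sided p-value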


Multiple Linear Regression - Regression Statistics
Multiple R	0.09942
R-squared	0.009885
Adjusted R-squared	-0.007485
F-TEST (value)	0.5691
F-TEST (DF numerator)	2
F-TEST (DF denominator)	114
p-value	0.5677

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	187.1
Sum Squared Residuals	3.992e+06
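
All of these statistics can be read off the same summary object; a minimal sketch, assuming mysum <- summary(mylm) as above:

sqrt(mysum$r.squared)   # Multiple R (about 0.09942)
mysum$r.squared         # R-squared
mysum$adj.r.squared     # Adjusted R-squared
mysum$fstatistic        # F value with numerator and denominator degrees of freedom (2 and 114)
pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3], lower.tail = FALSE)  # F-test p-value
mysum$sigma             # residual standard deviation (about 187.1)
sum(resid(mylm)^2)      # sum of squared residuals (about 3.992e+06)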


Menu of Residual Diagnostics
Description	Link
Histogram	Compute
Central Tendency	Compute
QQ Plot	Compute
Kernel Density Plot	Compute
Skewness/Kurtosis Test	Compute
Skewness-Kurtosis Plot	Compute
Harrell-Davis Plot	Compute
Bootstrap Plot -- Central Tendency	Compute
Blocked Bootstrap Plot -- Central Tendency	Compute
(Partial) Autocorrelation Plot	Compute
Spectral Analysis	Compute
Tukey lambda PPCC Plot	Compute
Box-Cox Normality Plot	Compute
Summary Statistics	Compute
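
Two of these diagnostics are also produced by the module itself (see the R code at the end of this page); a minimal sketch, assuming mylm as above and the MASS and car packages:

library(MASS)
library(car)
sresid <- studres(mylm)   # studentized residuals
hist(sresid, freq = FALSE, main = 'Distribution of Studentized Residuals')
qqPlot(mylm, main = 'QQ Plot')  # normal quantile-quantile plot of the residuals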


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1 1077 255.4 821.6
2 120 224.1-104.1
3 121 237.9-116.9
4 121 236.5-115.5
5 127 256.3-129.3
6 129 243.3-114.3
7 130 224.2-94.23
8 131 235.6-104.6
9 132 259-127
10 132 244.5-112.5
11 136 244.7-108.7
12 137 242.4-105.4
13 137 283.4-146.4
14 141 254-113
15 142 225.8-83.76
16 144 223-79.01
17 149 279.9-130.9
18 152 262-110
19 158 270.3-112.3
20 159 240.5-81.46
21 161 242.1-81.06
22 189 239.9-50.9
23 190 272.8-82.83
24 192 244.8-52.8
25 200 226.6-26.62
26 208 285.8-77.83
27 208 252.3-44.32
28 216 273.8-57.76
29 222 223.4-1.354
30 242 281.2-39.2
31 249 246.3 2.664
32 290 259.9 30.07
33 305 252.2 52.84
34 306 237.5 68.47
35 344 244.4 99.61
36 610 286.9 323.1
37 741 261.9 479.1
38 782 243.1 538.9
39 120 233.8-113.8
40 124 269.8-145.8
41 124 269.5-145.5
42 125 227.6-102.6
43 126 244.9-118.9
44 127 238.6-111.6
45 128 231.8-103.8
46 131 256-125
47 133 282-149
48 135 280-145
49 135 247.8-112.8
50 137 223.5-86.47
51 139 224.6-85.55
52 140 232.6-92.61
53 142 272.3-130.3
54 142 271.4-129.4
55 143 219-75.98
56 145 279.7-134.7
57 147 238.5-91.49
58 148 230.8-82.85
59 150 232.5-82.51
60 153 237.7-84.66
61 154 235.5-81.53
62 157 234.5-77.48
63 158 246.5-88.49
64 175 239.4-64.43
65 178 234.6-56.61
66 178 241.9-63.89
67 190 223.4-33.45
68 191 226.6-35.63
69 192 250.2-58.24
70 192 242.3-50.32
71 193 225.5-32.53
72 195 271.8-76.82
73 195 256.5-61.5
74 207 278.7-71.74
75 218 278.5-60.52
76 219 226.4-7.379
77 224 243-19.04
78 227 234.9-7.901
79 228 283.1-55.13
80 234 236.8-2.76
81 234 227 7.004
82 242 229 13.02
83 244 282.1-38.06
84 259 232.7 26.26
85 273 234.6 38.44
86 286 235.8 50.22
87 291 228.3 62.66
88 343 225.3 117.7
89 568 273.4 294.6
90 134 239.6-105.6
91 134 244-110
92 138 252.9-114.9
93 140 219.8-79.75
94 142 236.1-94.14
95 144 259.7-115.7
96 155 221.7-66.7
97 167 231.2-64.18
98 177 283.2-106.2
99 179 252.3-73.32
100 179 249.9-70.93
101 182 237-54.97
102 210 235.3-25.26
103 224 231.5-7.542
104 248 237.5 10.54
105 273 236.6 36.45
106 293 216.6 76.4
107 410 240.9 169.1
108 430 246.4 183.6
109 452 242.4 209.6
110 454 248 206
111 520 238.6 281.4
112 586 274.8 311.2
113 617 257.2 359.8
114 659 228.4 430.6
115 759 233.2 525.8
116 924 244.1 679.9
117 941 260.6 680.4
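
The Interpolation (Forecast) column contains the fitted values and the Residuals column the prediction errors; a minimal consistency check, assuming mylm and df as above:

head(cbind(Actuals = df$DEPDELAY,
           Interpolation = fitted(mylm),
           Residuals = resid(mylm)))  # first rows should match the table above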


Goldfeld-Quandt test for Heteroskedasticity
p-values per Alternative Hypothesis:
breakpoint index	greater	2-sided	less
6 0.9579 0.08414 0.04207
7 0.949 0.1021 0.05103
8 0.9547 0.09062 0.04531
9 0.9407 0.1186 0.05931
10 0.9165 0.167 0.08349
11 0.9671 0.06571 0.03286
12 0.9474 0.1053 0.05263
13 0.955 0.0901 0.04505
14 0.9358 0.1284 0.0642
15 0.9137 0.1727 0.08633
16 0.8796 0.2408 0.1204
17 0.853 0.2941 0.147
18 0.8076 0.3849 0.1924
19 0.825 0.3501 0.175
20 0.7769 0.4463 0.2231
21 0.75 0.5 0.25
22 0.6935 0.6131 0.3065
23 0.6555 0.689 0.3445
24 0.6042 0.7916 0.3958
25 0.5471 0.9058 0.4529
26 0.4875 0.975 0.5125
27 0.4242 0.8484 0.5758
28 0.3631 0.7262 0.6369
29 0.3095 0.6191 0.6905
30 0.2595 0.5191 0.7405
31 0.2117 0.4233 0.7883
32 0.1931 0.3861 0.8069
33 0.1649 0.3297 0.8351
34 0.1453 0.2907 0.8547
35 0.1204 0.2408 0.8796
36 0.2427 0.4854 0.7573
37 0.5903 0.8193 0.4097
38 0.8837 0.2326 0.1163
39 0.8647 0.2706 0.1353
40 0.8612 0.2775 0.1388
41 0.8513 0.2973 0.1487
42 0.8249 0.3503 0.1751
43 0.7996 0.4008 0.2004
44 0.7705 0.459 0.2295
45 0.7373 0.5254 0.2627
46 0.7089 0.5822 0.2911
47 0.6904 0.6192 0.3096
48 0.67 0.6601 0.33
49 0.6359 0.7283 0.3641
50 0.5924 0.8152 0.4076
51 0.5479 0.9041 0.4521
52 0.5054 0.9892 0.4946
53 0.4801 0.9603 0.5199
54 0.4572 0.9143 0.5428
55 0.4118 0.8236 0.5882
56 0.3949 0.7898 0.6051
57 0.3545 0.7091 0.6455
58 0.3153 0.6305 0.6847
59 0.2774 0.5549 0.7226
60 0.2434 0.4869 0.7566
61 0.2126 0.4253 0.7874
62 0.1833 0.3666 0.8167
63 0.1597 0.3195 0.8403
64 0.1347 0.2693 0.8653
65 0.1118 0.2236 0.8882
66 0.09283 0.1857 0.9072
67 0.07467 0.1493 0.9253
68 0.05979 0.1196 0.9402
69 0.04797 0.09594 0.952
70 0.03815 0.07629 0.9619
71 0.03002 0.06005 0.97
72 0.0249 0.0498 0.9751
73 0.01942 0.03884 0.9806
74 0.01604 0.03208 0.984
75 0.01321 0.02641 0.9868
76 0.00973 0.01946 0.9903
77 0.007248 0.0145 0.9928
78 0.005166 0.01033 0.9948
79 0.004489 0.008978 0.9955
80 0.00314 0.006279 0.9969
81 0.002221 0.004443 0.9978
82 0.001488 0.002975 0.9985
83 0.001291 0.002582 0.9987
84 0.0008543 0.001709 0.9991
85 0.0005549 0.00111 0.9994
86 0.0003561 0.0007122 0.9996
87 0.0002258 0.0004515 0.9998
88 0.0001503 0.0003007 0.9999
89 0.000206 0.000412 0.9998
90 0.0001696 0.0003392 0.9998
91 0.0001715 0.0003429 0.9998
92 0.0002126 0.0004252 0.9998
93 0.0001688 0.0003375 0.9998
94 0.0001423 0.0002846 0.9999
95 0.0002183 0.0004367 0.9998
96 0.000188 0.000376 0.9998
97 0.0001824 0.0003647 0.9998
98 0.0005376 0.001075 0.9995
99 0.001415 0.00283 0.9986
100 0.003288 0.006576 0.9967
101 0.004774 0.009548 0.9952
102 0.007764 0.01553 0.9922
103 0.00795 0.0159 0.992
104 0.01192 0.02383 0.9881
105 0.02016 0.04033 0.9798
106 0.02942 0.05884 0.9706
107 0.03271 0.06543 0.9673
108 0.04389 0.08778 0.9561
109 0.07583 0.1517 0.9242
110 0.1017 0.2034 0.8983
111 0.1605 0.321 0.8395
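
Each row of this table is one call to gqtest() from the lmtest package at a different breakpoint, as in the module code below; a minimal sketch for a single breakpoint, assuming mylm as above:

library(lmtest)
# p-values for the three alternatives at breakpoint 21 (compare with the corresponding row above)
sapply(c('greater', 'two.sided', 'less'),
       function(alt) gqtest(mylm, point = 21, alternative = alt)$p.value)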


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	23	0.217	NOK
5% type I error level	34	0.320755	NOK
10% type I error level	44	0.415094	NOK
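
The meta analysis simply counts how many of the 2-sided Goldfeld-Quandt p-values fall below each type I error level; a minimal sketch, assuming gq2 is a hypothetical vector holding the 106 two-sided p-values from the table above:

sapply(c(0.01, 0.05, 0.10), function(alpha)
  c(count = sum(gq2 < alpha), fraction = mean(gq2 < alpha)))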


Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.2428, df1 = 2, df2 = 112, p-value = 0.2925
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 3.3223, df1 = 4, df2 = 110, p-value = 0.01307
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 3.371, df1 = 2, df2 = 112, p-value = 0.03788
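
These three tests correspond to resettest() from the lmtest package with its three type arguments; a minimal sketch, assuming mylm as above:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')     # powers of the fitted values
resettest(mylm, power = 2:3, type = 'regressor')  # powers of the regressors
resettest(mylm, power = 2:3, type = 'princomp')   # powers of the first principal component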


Variance Inflation Factors (Multicollinearity)
> vif
 DEPTIME   DEPDAY 
1.041941 1.041941 
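
The variance inflation factors come from vif() in the car package; with only two regressors both VIFs equal 1/(1 - r^2), where r is the correlation between DEPTIME and DEPDAY. A minimal sketch, assuming mylm and df as above:

library(car)
vif(mylm)                                 # both about 1.04: negligible multicollinearity
1 / (1 - cor(df$DEPTIME, df$DEPDAY)^2)    # equivalent computation for two regressors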


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ; par6 = 12 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ; par6 = 12 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par6 <- as.numeric(par6)
if(is.na(par6)) {
par6 <- 12
mywarning = 'Warning: you did not specify the seasonality. The seasonal period was set to s = 12.'
}
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (!is.numeric(par4)) par4 <- 0
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
if (!is.numeric(par5)) par5 <- 0
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s)'){
(n <- n - par6)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-Bs)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+par6,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - par6)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-Bs)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+par6,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*par6,par5), dimnames=list(1:(n-par5*par6), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*par6)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*par6-j*par6,par1]
}
}
x <- cbind(x[(par5*par6+1):n,], x2)
n <- n - par5*par6
}
if (par2 == 'Include Seasonal Dummies'){
x2 <- array(0, dim=c(n,par6-1), dimnames=list(1:n, paste('M', seq(1:(par6-1)), sep ='')))
for (i in 1:(par6-1)){
x2[seq(i,n,par6),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
# Goldfeld-Quandt test for every admissible breakpoint (only when there are enough observations)
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3)) # p-values for the 'greater', 'two.sided' and 'less' alternatives
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
# count breakpoints whose two-sided p-value is significant at the 1%, 5% and 10% levels
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
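# Load the table helper functions and write the output tables.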
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
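# Ordinary least squares estimates: coefficient, standard error, t-statistic, p-values.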
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT\nH0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
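# Overall regression statistics (R-squared, F-test) and residual statistics.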
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
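# Menu of hyperlinks that re-submit the residuals to other modules for further diagnostics.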
myr <- as.numeric(mysum$resid)
myr
a <-table.start()
a <- table.row.start(a)
a <- table.element(a,'Menu of Residual Diagnostics',2,TRUE)
a <- table.row.end(a)
a <- table.row.start(a)
a <- table.element(a,'Description',1,TRUE)
a <- table.element(a,'Link',1,TRUE)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Histogram',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_histogram.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_centraltendency.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'QQ Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_fitdistrnorm.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Kernel Density Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_density.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Skewness/Kurtosis Test',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_skewness_kurtosis.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Skewness-Kurtosis Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_skewness_kurtosis_plot.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Harrell-Davis Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_harrell_davis.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Bootstrap Plot -- Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_bootstrapplot1.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Blocked Bootstrap Plot -- Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_bootstrapplot.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'(Partial) Autocorrelation Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_autocorrelation.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Spectral Analysis',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_spectrum.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Tukey lambda PPCC Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_tukeylambda.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Box-Cox Normality Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_boxcoxnorm.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <- table.element(a,'Summary Statistics',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_summary1.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable7.tab')
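# Table of actuals, interpolated (fitted) values, and residuals (shown only when n < 200).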
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation\nForecast', 1, TRUE)
a<-table.element(a, 'Residuals\nPrediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i,1]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
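# Goldfeld-Quandt p-values per breakpoint, followed by a meta analysis of the test results.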
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
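# Ramsey RESET specification tests (powers 2 and 3 of fitted values, regressors,
# and principal components).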
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('\n',RC.texteval('reset_test_fitted'),'\n',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('\n',RC.texteval('reset_test_regressors'),'\n',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('\n',RC.texteval('reset_test_principal_components'),'\n',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
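# Variance inflation factors to assess multicollinearity among the regressors.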
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
myvif <- vif(mylm) # store under a new name to avoid shadowing car::vif
a<-table.element(a,paste('\n',RC.texteval('myvif'),'\n',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')