Free Statistics

Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 08 Dec 2014 11:57:14 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/08/t1418039865tx98ie0wgwa9v4u.htm/, Retrieved Sun, 19 May 2024 10:52:25 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=263951, Retrieved Sun, 19 May 2024 10:52:25 +0000

Original text written by user:
Is Private? No (this computation is public)
User-defined keywords:
Estimated Impact: 122
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [paper mr numeracy] [2014-12-08 11:51:34] [673773038936aef3a5778d7e6bda5c1e]
- R  D    [Multiple Regression] [paper mr numeracy] [2014-12-08 11:57:14] [ec1b40d1a9751af99658fe8fca4f9eca] [Current]
Dataseries X (two columns: gendercode 0/1, NUMERACYTOT_op_32):
1 21
0 22
1 22
0 18
0 23
0 12
1 20
0 22
0 21
0 19
0 22
0 15
0 20
1 19
1 18
1 15
0 20
1 21
0 21
1 15
0 16
0 23
1 21
0 18
0 25
0 9
0 30
1 20
0 23
1 16
1 16
1 19
0 25
0 18
0 23
0 21
1 10
0 14
0 22
1 26
0 23
0 23
0 24
0 24
0 18
1 23
0 15
0 19
1 16
0 25
0 23
0 17
0 19
0 21
0 18
0 27
1 21
0 13
1 8
0 29
0 28
1 23
1 21
0 19
1 19
0 20
1 18
0 19
0 17
1 19
1 25
1 19
1 22
0 23
1 14
0 28
1 16
0 24
1 20
1 12
0 24
1 22
1 12
1 22
0 20
1 10
0 23
0 17
1 22
1 24
1 18
0 21
0 20
0 20
1 22
0 19
1 20
0 26
0 23
0 24
0 21
0 21
1 19
0 8
0 17
0 20
1 11
1 8
1 15
1 18
1 18
1 19
0 19
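
The series above can be read back into R for re-analysis. A minimal sketch, assuming the two columns are saved to a plain-text file 'dataseries_x.txt' (a hypothetical filename) and correspond to the 0/1 gendercode and the numeracy total NUMERACYTOT_op_32 used in the tables below:

df <- read.table('dataseries_x.txt', col.names = c('gendercode', 'NUMERACYTOT_op_32'))
str(df)                # 113 observations of 2 variables
table(df$gendercode)   # frequency of the two gender codes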




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Gwilym Jenkins' @ jenkins.wessa.net
Source: https://freestatistics.org/blog/index.php?pk=263951&T=0


Multiple Linear Regression - Estimated Regression Equation
gendercode[t] = +1.00756 - 0.0301688 NUMERACYTOT_op_32[t] + e[t]
Source: https://freestatistics.org/blog/index.php?pk=263951&T=1
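
A minimal sketch of how this fit could be reproduced in R, assuming the data frame df from the sketch above:

fit <- lm(gendercode ~ NUMERACYTOT_op_32, data = df)
coef(fit)   # approximately: (Intercept) 1.00756, NUMERACYTOT_op_32 -0.0301688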



Multiple Linear Regression - Ordinary Least Squares
Variable            Parameter    S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)         1.00756      0.20468    4.923                        2.99247e-06      1.49623e-06
NUMERACYTOT_op_32   -0.0301688   0.0101814  -2.963                       0.00372673       0.00186336
Source: https://freestatistics.org/blog/index.php?pk=263951&T=2
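
The T-STAT column is the parameter estimate divided by its standard deviation, and the 1-tail p-value is half the 2-tail p-value. A small sketch for the slope row, using the 111 residual degrees of freedom reported below:

b  <- -0.0301688
se <- 0.0101814
tstat <- b / se                       # about -2.963
p2 <- 2 * pt(-abs(tstat), df = 111)   # 2-tail p-value, about 0.0037
p1 <- p2 / 2                          # 1-tail p-value, about 0.0019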



Multiple Linear Regression - Regression Statistics
Multiple R: 0.270743
R-squared: 0.0733017
Adjusted R-squared: 0.0649531
F-TEST (value): 8.78009
F-TEST (DF numerator): 1
F-TEST (DF denominator): 111
p-value: 0.00372673

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.478729
Sum Squared Residuals: 25.4391
Source: https://freestatistics.org/blog/index.php?pk=263951&T=3
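
These statistics can be read off summary() of the fitted model; with a single regressor the F value is the squared t-statistic of the slope (8.78 = (-2.963)^2, up to rounding) and the F-test p-value equals the slope's 2-tail p-value. A sketch, assuming the fit object from the earlier sketch:

s <- summary(fit)
s$r.squared          # 0.0733
s$adj.r.squared      # 0.0650
s$fstatistic         # 8.78 on 1 and 111 degrees of freedom
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])   # 0.00373
s$sigma              # residual standard deviation, 0.479
sum(resid(fit)^2)    # sum of squared residuals, 25.4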




\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation / Forecast & Residuals / Prediction Error \tabularnewline
1 & 1 & 0.374013 & 0.625987 \tabularnewline
2 & 0 & 0.343845 & -0.343845 \tabularnewline
3 & 1 & 0.343845 & 0.656155 \tabularnewline
4 & 0 & 0.46452 & -0.46452 \tabularnewline
5 & 0 & 0.313676 & -0.313676 \tabularnewline
6 & 0 & 0.645532 & -0.645532 \tabularnewline
7 & 1 & 0.404182 & 0.595818 \tabularnewline
8 & 0 & 0.343845 & -0.343845 \tabularnewline
9 & 0 & 0.374013 & -0.374013 \tabularnewline
10 & 0 & 0.434351 & -0.434351 \tabularnewline
11 & 0 & 0.343845 & -0.343845 \tabularnewline
12 & 0 & 0.555026 & -0.555026 \tabularnewline
13 & 0 & 0.404182 & -0.404182 \tabularnewline
14 & 1 & 0.434351 & 0.565649 \tabularnewline
15 & 1 & 0.46452 & 0.53548 \tabularnewline
16 & 1 & 0.555026 & 0.444974 \tabularnewline
17 & 0 & 0.404182 & -0.404182 \tabularnewline
18 & 1 & 0.374013 & 0.625987 \tabularnewline
19 & 0 & 0.374013 & -0.374013 \tabularnewline
20 & 1 & 0.555026 & 0.444974 \tabularnewline
21 & 0 & 0.524857 & -0.524857 \tabularnewline
22 & 0 & 0.313676 & -0.313676 \tabularnewline
23 & 1 & 0.374013 & 0.625987 \tabularnewline
24 & 0 & 0.46452 & -0.46452 \tabularnewline
25 & 0 & 0.253338 & -0.253338 \tabularnewline
26 & 0 & 0.736038 & -0.736038 \tabularnewline
27 & 0 & 0.102495 & -0.102495 \tabularnewline
28 & 1 & 0.404182 & 0.595818 \tabularnewline
29 & 0 & 0.313676 & -0.313676 \tabularnewline
30 & 1 & 0.524857 & 0.475143 \tabularnewline
31 & 1 & 0.524857 & 0.475143 \tabularnewline
32 & 1 & 0.434351 & 0.565649 \tabularnewline
33 & 0 & 0.253338 & -0.253338 \tabularnewline
34 & 0 & 0.46452 & -0.46452 \tabularnewline
35 & 0 & 0.313676 & -0.313676 \tabularnewline
36 & 0 & 0.374013 & -0.374013 \tabularnewline
37 & 1 & 0.70587 & 0.29413 \tabularnewline
38 & 0 & 0.585195 & -0.585195 \tabularnewline
39 & 0 & 0.343845 & -0.343845 \tabularnewline
40 & 1 & 0.22317 & 0.77683 \tabularnewline
41 & 0 & 0.313676 & -0.313676 \tabularnewline
42 & 0 & 0.313676 & -0.313676 \tabularnewline
43 & 0 & 0.283507 & -0.283507 \tabularnewline
44 & 0 & 0.283507 & -0.283507 \tabularnewline
45 & 0 & 0.46452 & -0.46452 \tabularnewline
46 & 1 & 0.313676 & 0.686324 \tabularnewline
47 & 0 & 0.555026 & -0.555026 \tabularnewline
48 & 0 & 0.434351 & -0.434351 \tabularnewline
49 & 1 & 0.524857 & 0.475143 \tabularnewline
50 & 0 & 0.253338 & -0.253338 \tabularnewline
51 & 0 & 0.313676 & -0.313676 \tabularnewline
52 & 0 & 0.494688 & -0.494688 \tabularnewline
53 & 0 & 0.434351 & -0.434351 \tabularnewline
54 & 0 & 0.374013 & -0.374013 \tabularnewline
55 & 0 & 0.46452 & -0.46452 \tabularnewline
56 & 0 & 0.193001 & -0.193001 \tabularnewline
57 & 1 & 0.374013 & 0.625987 \tabularnewline
58 & 0 & 0.615363 & -0.615363 \tabularnewline
59 & 1 & 0.766207 & 0.233793 \tabularnewline
60 & 0 & 0.132663 & -0.132663 \tabularnewline
61 & 0 & 0.162832 & -0.162832 \tabularnewline
62 & 1 & 0.313676 & 0.686324 \tabularnewline
63 & 1 & 0.374013 & 0.625987 \tabularnewline
64 & 0 & 0.434351 & -0.434351 \tabularnewline
65 & 1 & 0.434351 & 0.565649 \tabularnewline
66 & 0 & 0.404182 & -0.404182 \tabularnewline
67 & 1 & 0.46452 & 0.53548 \tabularnewline
68 & 0 & 0.434351 & -0.434351 \tabularnewline
69 & 0 & 0.494688 & -0.494688 \tabularnewline
70 & 1 & 0.434351 & 0.565649 \tabularnewline
71 & 1 & 0.253338 & 0.746662 \tabularnewline
72 & 1 & 0.434351 & 0.565649 \tabularnewline
73 & 1 & 0.343845 & 0.656155 \tabularnewline
74 & 0 & 0.313676 & -0.313676 \tabularnewline
75 & 1 & 0.585195 & 0.414805 \tabularnewline
76 & 0 & 0.162832 & -0.162832 \tabularnewline
77 & 1 & 0.524857 & 0.475143 \tabularnewline
78 & 0 & 0.283507 & -0.283507 \tabularnewline
79 & 1 & 0.404182 & 0.595818 \tabularnewline
80 & 1 & 0.645532 & 0.354468 \tabularnewline
81 & 0 & 0.283507 & -0.283507 \tabularnewline
82 & 1 & 0.343845 & 0.656155 \tabularnewline
83 & 1 & 0.645532 & 0.354468 \tabularnewline
84 & 1 & 0.343845 & 0.656155 \tabularnewline
85 & 0 & 0.404182 & -0.404182 \tabularnewline
86 & 1 & 0.70587 & 0.29413 \tabularnewline
87 & 0 & 0.313676 & -0.313676 \tabularnewline
88 & 0 & 0.494688 & -0.494688 \tabularnewline
89 & 1 & 0.343845 & 0.656155 \tabularnewline
90 & 1 & 0.283507 & 0.716493 \tabularnewline
91 & 1 & 0.46452 & 0.53548 \tabularnewline
92 & 0 & 0.374013 & -0.374013 \tabularnewline
93 & 0 & 0.404182 & -0.404182 \tabularnewline
94 & 0 & 0.404182 & -0.404182 \tabularnewline
95 & 1 & 0.343845 & 0.656155 \tabularnewline
96 & 0 & 0.434351 & -0.434351 \tabularnewline
97 & 1 & 0.404182 & 0.595818 \tabularnewline
98 & 0 & 0.22317 & -0.22317 \tabularnewline
99 & 0 & 0.313676 & -0.313676 \tabularnewline
100 & 0 & 0.283507 & -0.283507 \tabularnewline
101 & 0 & 0.374013 & -0.374013 \tabularnewline
102 & 0 & 0.374013 & -0.374013 \tabularnewline
103 & 1 & 0.434351 & 0.565649 \tabularnewline
104 & 0 & 0.766207 & -0.766207 \tabularnewline
105 & 0 & 0.494688 & -0.494688 \tabularnewline
106 & 0 & 0.404182 & -0.404182 \tabularnewline
107 & 1 & 0.675701 & 0.324299 \tabularnewline
108 & 1 & 0.766207 & 0.233793 \tabularnewline
109 & 1 & 0.555026 & 0.444974 \tabularnewline
110 & 1 & 0.46452 & 0.53548 \tabularnewline
111 & 1 & 0.46452 & 0.53548 \tabularnewline
112 & 1 & 0.434351 & 0.565649 \tabularnewline
113 & 0 & 0.434351 & -0.434351 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263951&T=4
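
In this table the Interpolation / Forecast column holds the fitted values and the Residuals / Prediction Error column the difference between actuals and fitted values. A sketch of how these columns could be recomputed, assuming the fit object from the earlier sketch:

head(data.frame(actual   = df$gendercode,
                fitted   = fitted(fit),
                residual = resid(fit)))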




\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 & 0.832216 & 0.335568 & 0.167784 \tabularnewline
6 & 0.725002 & 0.549995 & 0.274998 \tabularnewline
7 & 0.758151 & 0.483698 & 0.241849 \tabularnewline
8 & 0.740168 & 0.519664 & 0.259832 \tabularnewline
9 & 0.695996 & 0.608009 & 0.304004 \tabularnewline
10 & 0.631597 & 0.736805 & 0.368403 \tabularnewline
11 & 0.579607 & 0.840787 & 0.420393 \tabularnewline
12 & 0.499039 & 0.998078 & 0.500961 \tabularnewline
13 & 0.434522 & 0.869045 & 0.565478 \tabularnewline
14 & 0.56264 & 0.874721 & 0.43736 \tabularnewline
15 & 0.649016 & 0.701968 & 0.350984 \tabularnewline
16 & 0.695748 & 0.608504 & 0.304252 \tabularnewline
17 & 0.662866 & 0.674269 & 0.337134 \tabularnewline
18 & 0.704439 & 0.591123 & 0.295561 \tabularnewline
19 & 0.675368 & 0.649264 & 0.324632 \tabularnewline
20 & 0.686313 & 0.627373 & 0.313687 \tabularnewline
21 & 0.677216 & 0.645567 & 0.322784 \tabularnewline
22 & 0.638617 & 0.722766 & 0.361383 \tabularnewline
23 & 0.681691 & 0.636618 & 0.318309 \tabularnewline
24 & 0.663834 & 0.672333 & 0.336166 \tabularnewline
25 & 0.621818 & 0.756363 & 0.378182 \tabularnewline
26 & 0.636604 & 0.726791 & 0.363396 \tabularnewline
27 & 0.590602 & 0.818795 & 0.409398 \tabularnewline
28 & 0.634751 & 0.730497 & 0.365249 \tabularnewline
29 & 0.60005 & 0.799901 & 0.39995 \tabularnewline
30 & 0.618865 & 0.762269 & 0.381135 \tabularnewline
31 & 0.629568 & 0.740864 & 0.370432 \tabularnewline
32 & 0.653029 & 0.693942 & 0.346971 \tabularnewline
33 & 0.613594 & 0.772812 & 0.386406 \tabularnewline
34 & 0.602889 & 0.794222 & 0.397111 \tabularnewline
35 & 0.567846 & 0.864308 & 0.432154 \tabularnewline
36 & 0.540318 & 0.919363 & 0.459682 \tabularnewline
37 & 0.509902 & 0.980197 & 0.490098 \tabularnewline
38 & 0.528369 & 0.943262 & 0.471631 \tabularnewline
39 & 0.495941 & 0.991882 & 0.504059 \tabularnewline
40 & 0.586642 & 0.826717 & 0.413358 \tabularnewline
41 & 0.553329 & 0.893343 & 0.446671 \tabularnewline
42 & 0.519245 & 0.96151 & 0.480755 \tabularnewline
43 & 0.480985 & 0.961971 & 0.519015 \tabularnewline
44 & 0.442766 & 0.885533 & 0.557234 \tabularnewline
45 & 0.433172 & 0.866343 & 0.566828 \tabularnewline
46 & 0.495154 & 0.990307 & 0.504846 \tabularnewline
47 & 0.504825 & 0.99035 & 0.495175 \tabularnewline
48 & 0.490833 & 0.981666 & 0.509167 \tabularnewline
49 & 0.499238 & 0.998477 & 0.500762 \tabularnewline
50 & 0.459629 & 0.919257 & 0.540371 \tabularnewline
51 & 0.427893 & 0.855786 & 0.572107 \tabularnewline
52 & 0.427949 & 0.855899 & 0.572051 \tabularnewline
53 & 0.416969 & 0.833938 & 0.583031 \tabularnewline
54 & 0.396381 & 0.792761 & 0.603619 \tabularnewline
55 & 0.393233 & 0.786467 & 0.606767 \tabularnewline
56 & 0.35193 & 0.703861 & 0.64807 \tabularnewline
57 & 0.39247 & 0.78494 & 0.60753 \tabularnewline
58 & 0.428391 & 0.856782 & 0.571609 \tabularnewline
59 & 0.397867 & 0.795733 & 0.602133 \tabularnewline
60 & 0.351113 & 0.702226 & 0.648887 \tabularnewline
61 & 0.309403 & 0.618806 & 0.690597 \tabularnewline
62 & 0.360644 & 0.721289 & 0.639356 \tabularnewline
63 & 0.395586 & 0.791173 & 0.604414 \tabularnewline
64 & 0.391306 & 0.782613 & 0.608694 \tabularnewline
65 & 0.409374 & 0.818749 & 0.590626 \tabularnewline
66 & 0.399626 & 0.799253 & 0.600374 \tabularnewline
67 & 0.408715 & 0.817431 & 0.591285 \tabularnewline
68 & 0.406556 & 0.813113 & 0.593444 \tabularnewline
69 & 0.422035 & 0.84407 & 0.577965 \tabularnewline
70 & 0.435919 & 0.871837 & 0.564081 \tabularnewline
71 & 0.508139 & 0.983722 & 0.491861 \tabularnewline
72 & 0.522432 & 0.955136 & 0.477568 \tabularnewline
73 & 0.566704 & 0.866593 & 0.433296 \tabularnewline
74 & 0.535989 & 0.928022 & 0.464011 \tabularnewline
75 & 0.513331 & 0.973337 & 0.486669 \tabularnewline
76 & 0.461833 & 0.923667 & 0.538167 \tabularnewline
77 & 0.45134 & 0.902681 & 0.54866 \tabularnewline
78 & 0.416356 & 0.832712 & 0.583644 \tabularnewline
79 & 0.437129 & 0.874258 & 0.562871 \tabularnewline
80 & 0.403493 & 0.806985 & 0.596507 \tabularnewline
81 & 0.369118 & 0.738235 & 0.630882 \tabularnewline
82 & 0.407834 & 0.815668 & 0.592166 \tabularnewline
83 & 0.37542 & 0.750839 & 0.62458 \tabularnewline
84 & 0.420765 & 0.841531 & 0.579235 \tabularnewline
85 & 0.402151 & 0.804303 & 0.597849 \tabularnewline
86 & 0.361181 & 0.722362 & 0.638819 \tabularnewline
87 & 0.328854 & 0.657708 & 0.671146 \tabularnewline
88 & 0.333114 & 0.666228 & 0.666886 \tabularnewline
89 & 0.372768 & 0.745536 & 0.627232 \tabularnewline
90 & 0.456323 & 0.912646 & 0.543677 \tabularnewline
91 & 0.475607 & 0.951214 & 0.524393 \tabularnewline
92 & 0.438027 & 0.876054 & 0.561973 \tabularnewline
93 & 0.410708 & 0.821415 & 0.589292 \tabularnewline
94 & 0.386829 & 0.773657 & 0.613171 \tabularnewline
95 & 0.454257 & 0.908514 & 0.545743 \tabularnewline
96 & 0.434159 & 0.868319 & 0.565841 \tabularnewline
97 & 0.484872 & 0.969744 & 0.515128 \tabularnewline
98 & 0.407809 & 0.815619 & 0.592191 \tabularnewline
99 & 0.349842 & 0.699684 & 0.650158 \tabularnewline
100 & 0.295447 & 0.590895 & 0.704553 \tabularnewline
101 & 0.274729 & 0.549459 & 0.725271 \tabularnewline
102 & 0.278189 & 0.556377 & 0.721811 \tabularnewline
103 & 0.254649 & 0.509298 & 0.745351 \tabularnewline
104 & 0.461342 & 0.922684 & 0.538658 \tabularnewline
105 & 0.560106 & 0.879787 & 0.439894 \tabularnewline
106 & 0.663417 & 0.673166 & 0.336583 \tabularnewline
107 & 0.519517 & 0.960966 & 0.480483 \tabularnewline
108 & 0.406855 & 0.813709 & 0.593145 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263951&T=5
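
Each row of this table is a Goldfeld-Quandt test with the sample split at the given breakpoint index, evaluated for the three alternative hypotheses; the R code at the end of this page does this with lmtest::gqtest. A single-breakpoint sketch, assuming the fit object from the earlier sketch:

library(lmtest)
gqtest(fit, point = 50, alternative = 'two.sided')$p.value   # compare with the 2-sided entry at breakpoint 50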



Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     0                     0                     OK
10% type I error level    0                     0                     OK
Source: https://freestatistics.org/blog/index.php?pk=263951&T=6
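
The meta analysis counts, over all breakpoints, how many 2-sided Goldfeld-Quandt p-values fall below each type I error level (the same counting done in the R code below). A sketch, assuming the fit object from the earlier sketch:

library(lmtest)
pvals <- sapply(5:108, function(bp) gqtest(fit, point = bp, alternative = 'two.sided')$p.value)
c(sum(pvals < 0.01), sum(pvals < 0.05), sum(pvals < 0.10))   # all zero here, matching the OK verdicts above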



Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
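# par1 is the index of the column that is moved to the front and used as the
# dependent variable in lm(); par2 and par3 control optional seasonal dummies
# and a linear trend (neither is used in this run).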
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
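# The block below runs the Goldfeld-Quandt test at every admissible breakpoint
# (from k+3 up to n-k-3) for the alternatives 'greater', 'two.sided' and 'less',
# storing the p-values and counting how many 2-sided p-values fall below the
# 1%, 5% and 10% levels (reported in the meta analysis table).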
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
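# 'createtable' supplies the table.start/table.row.*/table.element helpers used
# below to assemble the output tables shown above and save them as .tab files.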
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}