Free Statistics

Author: The author of this computation has been verified.
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 10 Dec 2014 13:19:14 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/10/t1418217587qfjaa0o8bp43cgj.htm/, Retrieved Sun, 19 May 2024 15:26:22 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=265122, Retrieved Sun, 19 May 2024 15:26:22 +0000
IsPrivate? No (this computation is public)
Estimated Impact: 65
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [MR impact mot] [2014-12-10 13:19:14] [ec1b40d1a9751af99658fe8fca4f9eca] [Current]
Dataseries X (four columns; with par1 = 1 these appear to correspond to TOT, AMS.I, AMS.E and AMS.A, in that order):
12.9 59 71 6
12.2 50 73 4
12.8 34 65 11
7.4 41 75 7
6.7 39 52 4
12.6 68 75 4
14.8 70 65 4
13.3 65 71 4
11.1 60 66 5
8.2 43 54 4
11.4 50 63 8
6.4 58 65 8
10.6 67 71 4
12 41 75 15
6.3 53 71 4
11.3 48 60 5
11.9 61 69 4
9.3 64 74 4
9.6 45 63 4
10 67 42 4
6.4 29 70 7
13.8 26 68 5
10.8 51 63 7
13.8 56 68 7
11.7 56 60 10
10.9 32 65 14
16.1 58 72 7
13.4 67 71 4
9.9 44 62 9
11.5 53 60 10
8.3 56 70 4
11.7 61 58 7
9 61 64 4
9.7 56 61 6
10.8 61 72 4
10.3 56 62 4
10.4 43 69 5
12.7 64 67 4
9.3 34 68 4
11.8 69 68 6
5.9 74 81 6
11.4 57 70 4
13 53 62 5
10.8 40 59 10
12.3 66 76 4
11.3 54 70 4
11.8 49 66 4
7.9 52 78 8
12.7 58 69 5
12.3 58 71 5
11.6 51 71 8
6.7 35 76 4
10.9 53 70 4
12.1 43 60 4
13.3 49 72 4
10.1 84 84 4
5.7 66 73 8
14.3 61 66 4
8 60 66 4
13.3 62 72 11
9.3 68 67 4
12.5 49 66 6
7.6 48 59 4
15.9 51 64 4
9.2 63 78 4
9.1 57 68 9
11.1 53 76 4
13 56 75 4
14.5 63 69 4
12.2 63 68 9
12.3 54 60 12
11.4 47 76 4
8.8 49 59 7
14.6 58 64 7
12.6 43 66 4
13 58 73 4
12.6 67 73 4
13.2 57 62 4
9.9 37 64 4
7.7 30 65 7
10.5 47 63 4
13.4 51 59 9
10.9 56 70 5
4.3 52 65 9
10.3 37 78 4
11.8 38 59 5
11.2 52 62 8
11.4 56 64 9
8.6 67 76 4
13.2 37 54 5
12.6 26 50 4
5.6 50 68 4
9.9 58 61 4
8.8 42 67 7
7.7 66 66 4
9 54 73 4
7.3 43 52 11
11.4 62 63 4
13.6 48 63 4
7.9 51 61 10
10.7 55 73 4
10.3 55 59 10
8.3 42 64 4
9.6 63 69 8
14.2 57 72 4
8.5 52 67 4
13.5 54 66 12
4.9 46 66 5
6.4 66 56 4
9.6 52 73 8
11.6 43 61 5
11.1 57 68 7




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Gwilym Jenkins' @ jenkins.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ jenkins.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265122&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]6 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ jenkins.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265122&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265122&T=0








Multiple Linear Regression - Estimated Regression Equation
TOT[t] = 9.74987 + 0.0266956 AMS.I[t] - 0.00791095 AMS.E[t] + 0.00834692 AMS.A[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TOT[t] =  +  9.74987 +  0.0266956AMS.I[t] -0.00791095AMS.E[t] +  0.00834692AMS.A[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265122&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]TOT[t] =  +  9.74987 +  0.0266956AMS.I[t] -0.00791095AMS.E[t] +  0.00834692AMS.A[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265122&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265122&T=1
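
For readers who want to check this equation outside the module, here is a minimal re-estimation sketch in R. It assumes the data series above has been saved as a plain four-column text file; the file name 'dataseries.txt' and the column labels TOT, AMS.I, AMS.E, AMS.A are assumptions taken from the output above, not part of the original submission.

# Hypothetical file name; columns assumed to be TOT, AMS.I, AMS.E, AMS.A (see the equation above)
df <- read.table('dataseries.txt', col.names = c('TOT', 'AMS.I', 'AMS.E', 'AMS.A'))
fit <- lm(TOT ~ AMS.I + AMS.E + AMS.A, data = df)   # ordinary least squares, same model as the module
coef(fit)      # should reproduce 9.74987, 0.0266956, -0.00791095, 0.00834692
summary(fit)   # coefficient table, R-squared and F test reported in the tables below

The later sketches on this page reuse the df and fit objects created here.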








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter    S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   9.74987      2.57691    3.784                        0.000254139      0.00012707
AMS.I         0.0266956    0.0234198  1.14                         0.256861         0.128431
AMS.E         -0.00791095  0.0367516  -0.2153                      0.829975         0.414987
AMS.A         0.00834692   0.0956352  0.08728                      0.930612         0.465306

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 9.74987 & 2.57691 & 3.784 & 0.000254139 & 0.00012707 \tabularnewline
AMS.I & 0.0266956 & 0.0234198 & 1.14 & 0.256861 & 0.128431 \tabularnewline
AMS.E & -0.00791095 & 0.0367516 & -0.2153 & 0.829975 & 0.414987 \tabularnewline
AMS.A & 0.00834692 & 0.0956352 & 0.08728 & 0.930612 & 0.465306 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265122&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]9.74987[/C][C]2.57691[/C][C]3.784[/C][C]0.000254139[/C][C]0.00012707[/C][/ROW]
[ROW][C]AMS.I[/C][C]0.0266956[/C][C]0.0234198[/C][C]1.14[/C][C]0.256861[/C][C]0.128431[/C][/ROW]
[ROW][C]AMS.E[/C][C]-0.00791095[/C][C]0.0367516[/C][C]-0.2153[/C][C]0.829975[/C][C]0.414987[/C][/ROW]
[ROW][C]AMS.A[/C][C]0.00834692[/C][C]0.0956352[/C][C]0.08728[/C][C]0.930612[/C][C]0.465306[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265122&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265122&T=2
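
The one-tail p-values in this table are simply the two-tail p-values divided by two, which is how the module's own code (see the R listing at the bottom) computes them. A short check, reusing the hypothetical fit object from the sketch above:

ols <- summary(fit)$coefficients        # columns: estimate, std. error, t value, two-tail p
cbind(two.tail = signif(ols[, 4], 6),
      one.tail = signif(ols[, 4] / 2, 6))   # should match the last two columns above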








Multiple Linear Regression - Regression Statistics
Multiple R: 0.110099
R-squared: 0.0121217
Adjusted R-squared: -0.0153194
F-TEST (value): 0.441736
F-TEST (DF numerator): 3
F-TEST (DF denominator): 108
p-value: 0.723629
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.4902
Sum Squared Residuals: 669.717

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.110099 \tabularnewline
R-squared & 0.0121217 \tabularnewline
Adjusted R-squared & -0.0153194 \tabularnewline
F-TEST (value) & 0.441736 \tabularnewline
F-TEST (DF numerator) & 3 \tabularnewline
F-TEST (DF denominator) & 108 \tabularnewline
p-value & 0.723629 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 2.4902 \tabularnewline
Sum Squared Residuals & 669.717 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265122&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.110099[/C][/ROW]
[ROW][C]R-squared[/C][C]0.0121217[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]-0.0153194[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]0.441736[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]3[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]108[/C][/ROW]
[ROW][C]p-value[/C][C]0.723629[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]2.4902[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]669.717[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265122&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265122&T=3
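
The overall F-test p-value is the upper tail of an F(3, 108) distribution, and the adjusted R-squared follows from R-squared with n = 112 observations and 3 regressors; both can be checked directly from the values in the table:

1 - pf(0.441736, df1 = 3, df2 = 108)              # ~0.723629, the reported p-value
1 - (1 - 0.0121217) * (112 - 1) / (112 - 3 - 1)   # ~ -0.0153194, the adjusted R-squared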








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1     12.9   10.8133   2.08668
2     12.2   10.5405   1.65946
3     12.8   10.2351   2.56487
4     7.4    10.3095   -2.9095
5     6.7    10.413    -3.71302
6     12.6   11.0052   1.59476
7     14.8   11.1377   3.66226
8     13.3   10.9568   2.34321
9     11.1   10.8712   0.228781
10    8.2    10.504    -2.30398
11    11.4   10.653    0.746964
12    6.4    10.8508   -4.45078
13    10.6   11.0102   -0.410186
14    12     10.3763   1.62373
15    6.3    10.6364   -4.33645
16    11.3   10.5983   0.701663
17    11.9   10.8658   1.03417
18    9.3    10.9064   -1.60637
19    9.6    10.4862   -0.886171
20    10     11.2396   -1.2396
21    6.4    10.0287   -3.62871
22    13.8   9.94775   3.85225
23    10.8   10.6714   0.128615
24    13.8   10.7653   3.03469
25    11.7   10.8536   0.846363
26    10.9   10.2068   0.693225
27    16.1   10.7871   5.31294
28    13.4   11.0102   2.38981
29    9.9    10.5091   -0.609121
30    11.5   10.7735   0.72645
31    8.3    10.7244   -2.42445
32    11.7   10.9779   0.722104
33    9      10.9054   -1.90539
34    9.7    10.8123   -1.11234
35    10.8   10.8421   -0.0421016
36    10.3   10.7877   -0.487733
37    10.4   10.3937   0.00633934
38    12.7   10.9617   1.73826
39    9.3    10.153    -0.852964
40    11.8   11.104    0.695996
41    5.9    11.1346   -5.23464
42    11.4   10.7511   0.648859
43    13     10.716    2.28401
44    10.8   10.4344   0.365582
45    12.3   10.9439   1.35606
46    11.3   10.6711   0.628946
47    11.8   10.5692   1.23078
48    7.9    10.5878   -2.68776
49    12.7   10.7941   1.90591
50    12.3   10.7783   1.52173
51    11.6   10.6164   0.983556
52    6.7    10.1164   -3.41637
53    10.9   10.6444   0.255641
54    12.1   10.4565   1.64349
55    13.3   10.5218   2.77825
56    10.1   11.3612   -1.26117
57    5.7    11.0011   -5.30106
58    14.3   10.8896   3.41043
59    8      10.8629   -2.86287
60    13.3   10.9272   2.37277
61    9.3    11.0685   -1.76853
62    12.5   10.5859   1.91409
63    7.6    10.5979   -2.9979
64    15.9   10.6384   5.26157
65    9.2    10.848    -1.64803
66    9.1    10.8087   -1.7087
67    11.1   10.5969   0.503107
68    13     10.6849   2.31511
69    14.5   10.9192   3.58077
70    12.2   10.9689   1.23113
71    12.3   10.8169   1.48306
72    11.4   10.4367   0.963281
73    8.8    10.6496   -1.84964
74    14.6   10.8503   3.74966
75    12.6   10.409    2.19095
76    13     10.7541   2.2459
77    12.6   10.9944   1.60564
78    13.2   10.8144   2.38557
79    9.9    10.2647   -0.364695
80    7.7    10.095    -2.39496
81    10.5   10.5396   -0.0395618
82    13.4   10.7197   2.68028
83    10.9   10.7328   0.167208
84    4.3    10.699    -6.39895
85    10.3   10.1539   0.146058
86    11.8   10.3393   1.46071
87    11.2   10.7143   0.485662
88    11.4   10.8136   0.586354
89    8.6    10.9706   -2.37063
90    13.2   10.3522   2.84785
91    12.6   10.0818   2.5182
92    5.6    10.5801   -4.98009
93    9.9    10.849    -0.949035
94    8.8    10.3995   -1.59948
95    7.7    11.023    -3.32305
96    9      10.6473   -1.64732
97    7.3    10.5782   -3.27823
98    11.4   10.94     0.460004
99    13.6   10.5663   3.03374
100   7.9    10.7122   -2.81225
101   10.7   10.674    0.0259829
102   10.3   10.8349   -0.534852
103   8.3    10.3982   -2.09817
104   9.6    10.9526   -1.35261
105   14.2   10.7353   3.46468
106   8.5    10.6414   -2.1414
107   13.5   10.7695   2.73053
108   4.9    10.4975   -5.59748
109   6.4    11.1022   -4.70215
110   9.6    10.6273   -1.02732
111   11.6   10.4569   1.14305
112   11.1   10.792    0.307996

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 12.9 & 10.8133 & 2.08668 \tabularnewline
2 & 12.2 & 10.5405 & 1.65946 \tabularnewline
3 & 12.8 & 10.2351 & 2.56487 \tabularnewline
4 & 7.4 & 10.3095 & -2.9095 \tabularnewline
5 & 6.7 & 10.413 & -3.71302 \tabularnewline
6 & 12.6 & 11.0052 & 1.59476 \tabularnewline
7 & 14.8 & 11.1377 & 3.66226 \tabularnewline
8 & 13.3 & 10.9568 & 2.34321 \tabularnewline
9 & 11.1 & 10.8712 & 0.228781 \tabularnewline
10 & 8.2 & 10.504 & -2.30398 \tabularnewline
11 & 11.4 & 10.653 & 0.746964 \tabularnewline
12 & 6.4 & 10.8508 & -4.45078 \tabularnewline
13 & 10.6 & 11.0102 & -0.410186 \tabularnewline
14 & 12 & 10.3763 & 1.62373 \tabularnewline
15 & 6.3 & 10.6364 & -4.33645 \tabularnewline
16 & 11.3 & 10.5983 & 0.701663 \tabularnewline
17 & 11.9 & 10.8658 & 1.03417 \tabularnewline
18 & 9.3 & 10.9064 & -1.60637 \tabularnewline
19 & 9.6 & 10.4862 & -0.886171 \tabularnewline
20 & 10 & 11.2396 & -1.2396 \tabularnewline
21 & 6.4 & 10.0287 & -3.62871 \tabularnewline
22 & 13.8 & 9.94775 & 3.85225 \tabularnewline
23 & 10.8 & 10.6714 & 0.128615 \tabularnewline
24 & 13.8 & 10.7653 & 3.03469 \tabularnewline
25 & 11.7 & 10.8536 & 0.846363 \tabularnewline
26 & 10.9 & 10.2068 & 0.693225 \tabularnewline
27 & 16.1 & 10.7871 & 5.31294 \tabularnewline
28 & 13.4 & 11.0102 & 2.38981 \tabularnewline
29 & 9.9 & 10.5091 & -0.609121 \tabularnewline
30 & 11.5 & 10.7735 & 0.72645 \tabularnewline
31 & 8.3 & 10.7244 & -2.42445 \tabularnewline
32 & 11.7 & 10.9779 & 0.722104 \tabularnewline
33 & 9 & 10.9054 & -1.90539 \tabularnewline
34 & 9.7 & 10.8123 & -1.11234 \tabularnewline
35 & 10.8 & 10.8421 & -0.0421016 \tabularnewline
36 & 10.3 & 10.7877 & -0.487733 \tabularnewline
37 & 10.4 & 10.3937 & 0.00633934 \tabularnewline
38 & 12.7 & 10.9617 & 1.73826 \tabularnewline
39 & 9.3 & 10.153 & -0.852964 \tabularnewline
40 & 11.8 & 11.104 & 0.695996 \tabularnewline
41 & 5.9 & 11.1346 & -5.23464 \tabularnewline
42 & 11.4 & 10.7511 & 0.648859 \tabularnewline
43 & 13 & 10.716 & 2.28401 \tabularnewline
44 & 10.8 & 10.4344 & 0.365582 \tabularnewline
45 & 12.3 & 10.9439 & 1.35606 \tabularnewline
46 & 11.3 & 10.6711 & 0.628946 \tabularnewline
47 & 11.8 & 10.5692 & 1.23078 \tabularnewline
48 & 7.9 & 10.5878 & -2.68776 \tabularnewline
49 & 12.7 & 10.7941 & 1.90591 \tabularnewline
50 & 12.3 & 10.7783 & 1.52173 \tabularnewline
51 & 11.6 & 10.6164 & 0.983556 \tabularnewline
52 & 6.7 & 10.1164 & -3.41637 \tabularnewline
53 & 10.9 & 10.6444 & 0.255641 \tabularnewline
54 & 12.1 & 10.4565 & 1.64349 \tabularnewline
55 & 13.3 & 10.5218 & 2.77825 \tabularnewline
56 & 10.1 & 11.3612 & -1.26117 \tabularnewline
57 & 5.7 & 11.0011 & -5.30106 \tabularnewline
58 & 14.3 & 10.8896 & 3.41043 \tabularnewline
59 & 8 & 10.8629 & -2.86287 \tabularnewline
60 & 13.3 & 10.9272 & 2.37277 \tabularnewline
61 & 9.3 & 11.0685 & -1.76853 \tabularnewline
62 & 12.5 & 10.5859 & 1.91409 \tabularnewline
63 & 7.6 & 10.5979 & -2.9979 \tabularnewline
64 & 15.9 & 10.6384 & 5.26157 \tabularnewline
65 & 9.2 & 10.848 & -1.64803 \tabularnewline
66 & 9.1 & 10.8087 & -1.7087 \tabularnewline
67 & 11.1 & 10.5969 & 0.503107 \tabularnewline
68 & 13 & 10.6849 & 2.31511 \tabularnewline
69 & 14.5 & 10.9192 & 3.58077 \tabularnewline
70 & 12.2 & 10.9689 & 1.23113 \tabularnewline
71 & 12.3 & 10.8169 & 1.48306 \tabularnewline
72 & 11.4 & 10.4367 & 0.963281 \tabularnewline
73 & 8.8 & 10.6496 & -1.84964 \tabularnewline
74 & 14.6 & 10.8503 & 3.74966 \tabularnewline
75 & 12.6 & 10.409 & 2.19095 \tabularnewline
76 & 13 & 10.7541 & 2.2459 \tabularnewline
77 & 12.6 & 10.9944 & 1.60564 \tabularnewline
78 & 13.2 & 10.8144 & 2.38557 \tabularnewline
79 & 9.9 & 10.2647 & -0.364695 \tabularnewline
80 & 7.7 & 10.095 & -2.39496 \tabularnewline
81 & 10.5 & 10.5396 & -0.0395618 \tabularnewline
82 & 13.4 & 10.7197 & 2.68028 \tabularnewline
83 & 10.9 & 10.7328 & 0.167208 \tabularnewline
84 & 4.3 & 10.699 & -6.39895 \tabularnewline
85 & 10.3 & 10.1539 & 0.146058 \tabularnewline
86 & 11.8 & 10.3393 & 1.46071 \tabularnewline
87 & 11.2 & 10.7143 & 0.485662 \tabularnewline
88 & 11.4 & 10.8136 & 0.586354 \tabularnewline
89 & 8.6 & 10.9706 & -2.37063 \tabularnewline
90 & 13.2 & 10.3522 & 2.84785 \tabularnewline
91 & 12.6 & 10.0818 & 2.5182 \tabularnewline
92 & 5.6 & 10.5801 & -4.98009 \tabularnewline
93 & 9.9 & 10.849 & -0.949035 \tabularnewline
94 & 8.8 & 10.3995 & -1.59948 \tabularnewline
95 & 7.7 & 11.023 & -3.32305 \tabularnewline
96 & 9 & 10.6473 & -1.64732 \tabularnewline
97 & 7.3 & 10.5782 & -3.27823 \tabularnewline
98 & 11.4 & 10.94 & 0.460004 \tabularnewline
99 & 13.6 & 10.5663 & 3.03374 \tabularnewline
100 & 7.9 & 10.7122 & -2.81225 \tabularnewline
101 & 10.7 & 10.674 & 0.0259829 \tabularnewline
102 & 10.3 & 10.8349 & -0.534852 \tabularnewline
103 & 8.3 & 10.3982 & -2.09817 \tabularnewline
104 & 9.6 & 10.9526 & -1.35261 \tabularnewline
105 & 14.2 & 10.7353 & 3.46468 \tabularnewline
106 & 8.5 & 10.6414 & -2.1414 \tabularnewline
107 & 13.5 & 10.7695 & 2.73053 \tabularnewline
108 & 4.9 & 10.4975 & -5.59748 \tabularnewline
109 & 6.4 & 11.1022 & -4.70215 \tabularnewline
110 & 9.6 & 10.6273 & -1.02732 \tabularnewline
111 & 11.6 & 10.4569 & 1.14305 \tabularnewline
112 & 11.1 & 10.792 & 0.307996 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265122&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]12.9[/C][C]10.8133[/C][C]2.08668[/C][/ROW]
[ROW][C]2[/C][C]12.2[/C][C]10.5405[/C][C]1.65946[/C][/ROW]
[ROW][C]3[/C][C]12.8[/C][C]10.2351[/C][C]2.56487[/C][/ROW]
[ROW][C]4[/C][C]7.4[/C][C]10.3095[/C][C]-2.9095[/C][/ROW]
[ROW][C]5[/C][C]6.7[/C][C]10.413[/C][C]-3.71302[/C][/ROW]
[ROW][C]6[/C][C]12.6[/C][C]11.0052[/C][C]1.59476[/C][/ROW]
[ROW][C]7[/C][C]14.8[/C][C]11.1377[/C][C]3.66226[/C][/ROW]
[ROW][C]8[/C][C]13.3[/C][C]10.9568[/C][C]2.34321[/C][/ROW]
[ROW][C]9[/C][C]11.1[/C][C]10.8712[/C][C]0.228781[/C][/ROW]
[ROW][C]10[/C][C]8.2[/C][C]10.504[/C][C]-2.30398[/C][/ROW]
[ROW][C]11[/C][C]11.4[/C][C]10.653[/C][C]0.746964[/C][/ROW]
[ROW][C]12[/C][C]6.4[/C][C]10.8508[/C][C]-4.45078[/C][/ROW]
[ROW][C]13[/C][C]10.6[/C][C]11.0102[/C][C]-0.410186[/C][/ROW]
[ROW][C]14[/C][C]12[/C][C]10.3763[/C][C]1.62373[/C][/ROW]
[ROW][C]15[/C][C]6.3[/C][C]10.6364[/C][C]-4.33645[/C][/ROW]
[ROW][C]16[/C][C]11.3[/C][C]10.5983[/C][C]0.701663[/C][/ROW]
[ROW][C]17[/C][C]11.9[/C][C]10.8658[/C][C]1.03417[/C][/ROW]
[ROW][C]18[/C][C]9.3[/C][C]10.9064[/C][C]-1.60637[/C][/ROW]
[ROW][C]19[/C][C]9.6[/C][C]10.4862[/C][C]-0.886171[/C][/ROW]
[ROW][C]20[/C][C]10[/C][C]11.2396[/C][C]-1.2396[/C][/ROW]
[ROW][C]21[/C][C]6.4[/C][C]10.0287[/C][C]-3.62871[/C][/ROW]
[ROW][C]22[/C][C]13.8[/C][C]9.94775[/C][C]3.85225[/C][/ROW]
[ROW][C]23[/C][C]10.8[/C][C]10.6714[/C][C]0.128615[/C][/ROW]
[ROW][C]24[/C][C]13.8[/C][C]10.7653[/C][C]3.03469[/C][/ROW]
[ROW][C]25[/C][C]11.7[/C][C]10.8536[/C][C]0.846363[/C][/ROW]
[ROW][C]26[/C][C]10.9[/C][C]10.2068[/C][C]0.693225[/C][/ROW]
[ROW][C]27[/C][C]16.1[/C][C]10.7871[/C][C]5.31294[/C][/ROW]
[ROW][C]28[/C][C]13.4[/C][C]11.0102[/C][C]2.38981[/C][/ROW]
[ROW][C]29[/C][C]9.9[/C][C]10.5091[/C][C]-0.609121[/C][/ROW]
[ROW][C]30[/C][C]11.5[/C][C]10.7735[/C][C]0.72645[/C][/ROW]
[ROW][C]31[/C][C]8.3[/C][C]10.7244[/C][C]-2.42445[/C][/ROW]
[ROW][C]32[/C][C]11.7[/C][C]10.9779[/C][C]0.722104[/C][/ROW]
[ROW][C]33[/C][C]9[/C][C]10.9054[/C][C]-1.90539[/C][/ROW]
[ROW][C]34[/C][C]9.7[/C][C]10.8123[/C][C]-1.11234[/C][/ROW]
[ROW][C]35[/C][C]10.8[/C][C]10.8421[/C][C]-0.0421016[/C][/ROW]
[ROW][C]36[/C][C]10.3[/C][C]10.7877[/C][C]-0.487733[/C][/ROW]
[ROW][C]37[/C][C]10.4[/C][C]10.3937[/C][C]0.00633934[/C][/ROW]
[ROW][C]38[/C][C]12.7[/C][C]10.9617[/C][C]1.73826[/C][/ROW]
[ROW][C]39[/C][C]9.3[/C][C]10.153[/C][C]-0.852964[/C][/ROW]
[ROW][C]40[/C][C]11.8[/C][C]11.104[/C][C]0.695996[/C][/ROW]
[ROW][C]41[/C][C]5.9[/C][C]11.1346[/C][C]-5.23464[/C][/ROW]
[ROW][C]42[/C][C]11.4[/C][C]10.7511[/C][C]0.648859[/C][/ROW]
[ROW][C]43[/C][C]13[/C][C]10.716[/C][C]2.28401[/C][/ROW]
[ROW][C]44[/C][C]10.8[/C][C]10.4344[/C][C]0.365582[/C][/ROW]
[ROW][C]45[/C][C]12.3[/C][C]10.9439[/C][C]1.35606[/C][/ROW]
[ROW][C]46[/C][C]11.3[/C][C]10.6711[/C][C]0.628946[/C][/ROW]
[ROW][C]47[/C][C]11.8[/C][C]10.5692[/C][C]1.23078[/C][/ROW]
[ROW][C]48[/C][C]7.9[/C][C]10.5878[/C][C]-2.68776[/C][/ROW]
[ROW][C]49[/C][C]12.7[/C][C]10.7941[/C][C]1.90591[/C][/ROW]
[ROW][C]50[/C][C]12.3[/C][C]10.7783[/C][C]1.52173[/C][/ROW]
[ROW][C]51[/C][C]11.6[/C][C]10.6164[/C][C]0.983556[/C][/ROW]
[ROW][C]52[/C][C]6.7[/C][C]10.1164[/C][C]-3.41637[/C][/ROW]
[ROW][C]53[/C][C]10.9[/C][C]10.6444[/C][C]0.255641[/C][/ROW]
[ROW][C]54[/C][C]12.1[/C][C]10.4565[/C][C]1.64349[/C][/ROW]
[ROW][C]55[/C][C]13.3[/C][C]10.5218[/C][C]2.77825[/C][/ROW]
[ROW][C]56[/C][C]10.1[/C][C]11.3612[/C][C]-1.26117[/C][/ROW]
[ROW][C]57[/C][C]5.7[/C][C]11.0011[/C][C]-5.30106[/C][/ROW]
[ROW][C]58[/C][C]14.3[/C][C]10.8896[/C][C]3.41043[/C][/ROW]
[ROW][C]59[/C][C]8[/C][C]10.8629[/C][C]-2.86287[/C][/ROW]
[ROW][C]60[/C][C]13.3[/C][C]10.9272[/C][C]2.37277[/C][/ROW]
[ROW][C]61[/C][C]9.3[/C][C]11.0685[/C][C]-1.76853[/C][/ROW]
[ROW][C]62[/C][C]12.5[/C][C]10.5859[/C][C]1.91409[/C][/ROW]
[ROW][C]63[/C][C]7.6[/C][C]10.5979[/C][C]-2.9979[/C][/ROW]
[ROW][C]64[/C][C]15.9[/C][C]10.6384[/C][C]5.26157[/C][/ROW]
[ROW][C]65[/C][C]9.2[/C][C]10.848[/C][C]-1.64803[/C][/ROW]
[ROW][C]66[/C][C]9.1[/C][C]10.8087[/C][C]-1.7087[/C][/ROW]
[ROW][C]67[/C][C]11.1[/C][C]10.5969[/C][C]0.503107[/C][/ROW]
[ROW][C]68[/C][C]13[/C][C]10.6849[/C][C]2.31511[/C][/ROW]
[ROW][C]69[/C][C]14.5[/C][C]10.9192[/C][C]3.58077[/C][/ROW]
[ROW][C]70[/C][C]12.2[/C][C]10.9689[/C][C]1.23113[/C][/ROW]
[ROW][C]71[/C][C]12.3[/C][C]10.8169[/C][C]1.48306[/C][/ROW]
[ROW][C]72[/C][C]11.4[/C][C]10.4367[/C][C]0.963281[/C][/ROW]
[ROW][C]73[/C][C]8.8[/C][C]10.6496[/C][C]-1.84964[/C][/ROW]
[ROW][C]74[/C][C]14.6[/C][C]10.8503[/C][C]3.74966[/C][/ROW]
[ROW][C]75[/C][C]12.6[/C][C]10.409[/C][C]2.19095[/C][/ROW]
[ROW][C]76[/C][C]13[/C][C]10.7541[/C][C]2.2459[/C][/ROW]
[ROW][C]77[/C][C]12.6[/C][C]10.9944[/C][C]1.60564[/C][/ROW]
[ROW][C]78[/C][C]13.2[/C][C]10.8144[/C][C]2.38557[/C][/ROW]
[ROW][C]79[/C][C]9.9[/C][C]10.2647[/C][C]-0.364695[/C][/ROW]
[ROW][C]80[/C][C]7.7[/C][C]10.095[/C][C]-2.39496[/C][/ROW]
[ROW][C]81[/C][C]10.5[/C][C]10.5396[/C][C]-0.0395618[/C][/ROW]
[ROW][C]82[/C][C]13.4[/C][C]10.7197[/C][C]2.68028[/C][/ROW]
[ROW][C]83[/C][C]10.9[/C][C]10.7328[/C][C]0.167208[/C][/ROW]
[ROW][C]84[/C][C]4.3[/C][C]10.699[/C][C]-6.39895[/C][/ROW]
[ROW][C]85[/C][C]10.3[/C][C]10.1539[/C][C]0.146058[/C][/ROW]
[ROW][C]86[/C][C]11.8[/C][C]10.3393[/C][C]1.46071[/C][/ROW]
[ROW][C]87[/C][C]11.2[/C][C]10.7143[/C][C]0.485662[/C][/ROW]
[ROW][C]88[/C][C]11.4[/C][C]10.8136[/C][C]0.586354[/C][/ROW]
[ROW][C]89[/C][C]8.6[/C][C]10.9706[/C][C]-2.37063[/C][/ROW]
[ROW][C]90[/C][C]13.2[/C][C]10.3522[/C][C]2.84785[/C][/ROW]
[ROW][C]91[/C][C]12.6[/C][C]10.0818[/C][C]2.5182[/C][/ROW]
[ROW][C]92[/C][C]5.6[/C][C]10.5801[/C][C]-4.98009[/C][/ROW]
[ROW][C]93[/C][C]9.9[/C][C]10.849[/C][C]-0.949035[/C][/ROW]
[ROW][C]94[/C][C]8.8[/C][C]10.3995[/C][C]-1.59948[/C][/ROW]
[ROW][C]95[/C][C]7.7[/C][C]11.023[/C][C]-3.32305[/C][/ROW]
[ROW][C]96[/C][C]9[/C][C]10.6473[/C][C]-1.64732[/C][/ROW]
[ROW][C]97[/C][C]7.3[/C][C]10.5782[/C][C]-3.27823[/C][/ROW]
[ROW][C]98[/C][C]11.4[/C][C]10.94[/C][C]0.460004[/C][/ROW]
[ROW][C]99[/C][C]13.6[/C][C]10.5663[/C][C]3.03374[/C][/ROW]
[ROW][C]100[/C][C]7.9[/C][C]10.7122[/C][C]-2.81225[/C][/ROW]
[ROW][C]101[/C][C]10.7[/C][C]10.674[/C][C]0.0259829[/C][/ROW]
[ROW][C]102[/C][C]10.3[/C][C]10.8349[/C][C]-0.534852[/C][/ROW]
[ROW][C]103[/C][C]8.3[/C][C]10.3982[/C][C]-2.09817[/C][/ROW]
[ROW][C]104[/C][C]9.6[/C][C]10.9526[/C][C]-1.35261[/C][/ROW]
[ROW][C]105[/C][C]14.2[/C][C]10.7353[/C][C]3.46468[/C][/ROW]
[ROW][C]106[/C][C]8.5[/C][C]10.6414[/C][C]-2.1414[/C][/ROW]
[ROW][C]107[/C][C]13.5[/C][C]10.7695[/C][C]2.73053[/C][/ROW]
[ROW][C]108[/C][C]4.9[/C][C]10.4975[/C][C]-5.59748[/C][/ROW]
[ROW][C]109[/C][C]6.4[/C][C]11.1022[/C][C]-4.70215[/C][/ROW]
[ROW][C]110[/C][C]9.6[/C][C]10.6273[/C][C]-1.02732[/C][/ROW]
[ROW][C]111[/C][C]11.6[/C][C]10.4569[/C][C]1.14305[/C][/ROW]
[ROW][C]112[/C][C]11.1[/C][C]10.792[/C][C]0.307996[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265122&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265122&T=4
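
Each row above satisfies Actuals = Interpolation + Residuals; with the hypothetical fit object from the first sketch, the two computed columns are simply the fitted values and residuals of the regression:

interp <- fitted(fit)        # 'Interpolation (Forecast)' column
err    <- residuals(fit)     # 'Residuals (Prediction Error)' column
head(cbind(Actuals = df$TOT, Interpolation = interp, Residuals = err))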








Goldfeld-Quandt test for Heteroskedasticity
p-values under the Alternative Hypothesis
breakpoint index   greater     2-sided     less
7     0.549346    0.901308   0.450654
8     0.384937    0.769873   0.615063
9     0.311586    0.623172   0.688414
10    0.202835    0.40567    0.797165
11    0.143424    0.286847   0.856576
12    0.712554    0.574891   0.287446
13    0.660076    0.679848   0.339924
14    0.572043    0.855914   0.427957
15    0.700702    0.598597   0.299298
16    0.658921    0.682159   0.341079
17    0.585573    0.828853   0.414427
18    0.559014    0.881972   0.440986
19    0.485766    0.971532   0.514234
20    0.425573    0.851145   0.574427
21    0.40378     0.807561   0.59622
22    0.686676    0.626649   0.313324
23    0.620469    0.759062   0.379531
24    0.631241    0.737518   0.368759
25    0.566038    0.867923   0.433962
26    0.50064     0.99872    0.49936
27    0.661306    0.677389   0.338694
28    0.629038    0.741924   0.370962
29    0.56957     0.860861   0.43043
30    0.507625    0.98475    0.492375
31    0.515417    0.969166   0.484583
32    0.454257    0.908513   0.545743
33    0.432398    0.864797   0.567602
34    0.384167    0.768334   0.615833
35    0.328404    0.656808   0.671596
36    0.276636    0.553273   0.723364
37    0.229108    0.458217   0.770892
38    0.200971    0.401942   0.799029
39    0.163954    0.327908   0.836046
40    0.131852    0.263704   0.868148
41    0.375691    0.751382   0.624309
42    0.32601     0.652019   0.67399
43    0.320979    0.641957   0.679021
44    0.272107    0.544215   0.727893
45    0.237004    0.474008   0.762996
46    0.197909    0.395818   0.802091
47    0.17141     0.34282    0.82859
48    0.184641    0.369283   0.815359
49    0.168378    0.336756   0.831622
50    0.146205    0.292411   0.853795
51    0.120221    0.240443   0.879779
52    0.138662    0.277324   0.861338
53    0.110438    0.220875   0.889562
54    0.0994001   0.1988     0.9006
55    0.107209    0.214418   0.892791
56    0.0920865   0.184173   0.907913
57    0.20392     0.407839   0.79608
58    0.238487    0.476975   0.761513
59    0.248909    0.497817   0.751091
60    0.243197    0.486394   0.756803
61    0.221037    0.442075   0.778963
62    0.203784    0.407569   0.796216
63    0.221344    0.442687   0.778656
64    0.377555    0.755109   0.622445
65    0.345219    0.690438   0.654781
66    0.316056    0.632112   0.683944
67    0.269339    0.538677   0.730661
68    0.262104    0.524208   0.737896
69    0.314599    0.629198   0.685401
70    0.282841    0.565683   0.717159
71    0.261382    0.522764   0.738618
72    0.224382    0.448765   0.775618
73    0.20204     0.404079   0.79796
74    0.271874    0.543748   0.728126
75    0.260567    0.521133   0.739433
76    0.26645     0.5329     0.73355
77    0.263478    0.526957   0.736522
78    0.278639    0.557278   0.721361
79    0.230557    0.461114   0.769443
80    0.238788    0.477576   0.761212
81    0.193787    0.387574   0.806213
82    0.219553    0.439105   0.780447
83    0.183575    0.367151   0.816425
84    0.431607    0.863213   0.568393
85    0.368484    0.736968   0.631516
86    0.322942    0.645884   0.677058
87    0.274521    0.549043   0.725479
88    0.237461    0.474923   0.762539
89    0.199875    0.399749   0.800125
90    0.220348    0.440695   0.779652
91    0.2938      0.587601   0.7062
92    0.422695    0.84539    0.577305
93    0.359745    0.71949    0.640255
94    0.30628     0.61256    0.69372
95    0.297133    0.594266   0.702867
96    0.264804    0.529608   0.735196
97    0.220581    0.441162   0.779419
98    0.187235    0.374469   0.812765
99    0.341096    0.682193   0.658904
100   0.312348    0.624695   0.687652
101   0.226511    0.453023   0.773489
102   0.151832    0.303665   0.848168
103   0.0955163   0.191033   0.904484
104   0.0624622   0.124924   0.937538
105   0.246709    0.493418   0.753291

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
7 & 0.549346 & 0.901308 & 0.450654 \tabularnewline
8 & 0.384937 & 0.769873 & 0.615063 \tabularnewline
9 & 0.311586 & 0.623172 & 0.688414 \tabularnewline
10 & 0.202835 & 0.40567 & 0.797165 \tabularnewline
11 & 0.143424 & 0.286847 & 0.856576 \tabularnewline
12 & 0.712554 & 0.574891 & 0.287446 \tabularnewline
13 & 0.660076 & 0.679848 & 0.339924 \tabularnewline
14 & 0.572043 & 0.855914 & 0.427957 \tabularnewline
15 & 0.700702 & 0.598597 & 0.299298 \tabularnewline
16 & 0.658921 & 0.682159 & 0.341079 \tabularnewline
17 & 0.585573 & 0.828853 & 0.414427 \tabularnewline
18 & 0.559014 & 0.881972 & 0.440986 \tabularnewline
19 & 0.485766 & 0.971532 & 0.514234 \tabularnewline
20 & 0.425573 & 0.851145 & 0.574427 \tabularnewline
21 & 0.40378 & 0.807561 & 0.59622 \tabularnewline
22 & 0.686676 & 0.626649 & 0.313324 \tabularnewline
23 & 0.620469 & 0.759062 & 0.379531 \tabularnewline
24 & 0.631241 & 0.737518 & 0.368759 \tabularnewline
25 & 0.566038 & 0.867923 & 0.433962 \tabularnewline
26 & 0.50064 & 0.99872 & 0.49936 \tabularnewline
27 & 0.661306 & 0.677389 & 0.338694 \tabularnewline
28 & 0.629038 & 0.741924 & 0.370962 \tabularnewline
29 & 0.56957 & 0.860861 & 0.43043 \tabularnewline
30 & 0.507625 & 0.98475 & 0.492375 \tabularnewline
31 & 0.515417 & 0.969166 & 0.484583 \tabularnewline
32 & 0.454257 & 0.908513 & 0.545743 \tabularnewline
33 & 0.432398 & 0.864797 & 0.567602 \tabularnewline
34 & 0.384167 & 0.768334 & 0.615833 \tabularnewline
35 & 0.328404 & 0.656808 & 0.671596 \tabularnewline
36 & 0.276636 & 0.553273 & 0.723364 \tabularnewline
37 & 0.229108 & 0.458217 & 0.770892 \tabularnewline
38 & 0.200971 & 0.401942 & 0.799029 \tabularnewline
39 & 0.163954 & 0.327908 & 0.836046 \tabularnewline
40 & 0.131852 & 0.263704 & 0.868148 \tabularnewline
41 & 0.375691 & 0.751382 & 0.624309 \tabularnewline
42 & 0.32601 & 0.652019 & 0.67399 \tabularnewline
43 & 0.320979 & 0.641957 & 0.679021 \tabularnewline
44 & 0.272107 & 0.544215 & 0.727893 \tabularnewline
45 & 0.237004 & 0.474008 & 0.762996 \tabularnewline
46 & 0.197909 & 0.395818 & 0.802091 \tabularnewline
47 & 0.17141 & 0.34282 & 0.82859 \tabularnewline
48 & 0.184641 & 0.369283 & 0.815359 \tabularnewline
49 & 0.168378 & 0.336756 & 0.831622 \tabularnewline
50 & 0.146205 & 0.292411 & 0.853795 \tabularnewline
51 & 0.120221 & 0.240443 & 0.879779 \tabularnewline
52 & 0.138662 & 0.277324 & 0.861338 \tabularnewline
53 & 0.110438 & 0.220875 & 0.889562 \tabularnewline
54 & 0.0994001 & 0.1988 & 0.9006 \tabularnewline
55 & 0.107209 & 0.214418 & 0.892791 \tabularnewline
56 & 0.0920865 & 0.184173 & 0.907913 \tabularnewline
57 & 0.20392 & 0.407839 & 0.79608 \tabularnewline
58 & 0.238487 & 0.476975 & 0.761513 \tabularnewline
59 & 0.248909 & 0.497817 & 0.751091 \tabularnewline
60 & 0.243197 & 0.486394 & 0.756803 \tabularnewline
61 & 0.221037 & 0.442075 & 0.778963 \tabularnewline
62 & 0.203784 & 0.407569 & 0.796216 \tabularnewline
63 & 0.221344 & 0.442687 & 0.778656 \tabularnewline
64 & 0.377555 & 0.755109 & 0.622445 \tabularnewline
65 & 0.345219 & 0.690438 & 0.654781 \tabularnewline
66 & 0.316056 & 0.632112 & 0.683944 \tabularnewline
67 & 0.269339 & 0.538677 & 0.730661 \tabularnewline
68 & 0.262104 & 0.524208 & 0.737896 \tabularnewline
69 & 0.314599 & 0.629198 & 0.685401 \tabularnewline
70 & 0.282841 & 0.565683 & 0.717159 \tabularnewline
71 & 0.261382 & 0.522764 & 0.738618 \tabularnewline
72 & 0.224382 & 0.448765 & 0.775618 \tabularnewline
73 & 0.20204 & 0.404079 & 0.79796 \tabularnewline
74 & 0.271874 & 0.543748 & 0.728126 \tabularnewline
75 & 0.260567 & 0.521133 & 0.739433 \tabularnewline
76 & 0.26645 & 0.5329 & 0.73355 \tabularnewline
77 & 0.263478 & 0.526957 & 0.736522 \tabularnewline
78 & 0.278639 & 0.557278 & 0.721361 \tabularnewline
79 & 0.230557 & 0.461114 & 0.769443 \tabularnewline
80 & 0.238788 & 0.477576 & 0.761212 \tabularnewline
81 & 0.193787 & 0.387574 & 0.806213 \tabularnewline
82 & 0.219553 & 0.439105 & 0.780447 \tabularnewline
83 & 0.183575 & 0.367151 & 0.816425 \tabularnewline
84 & 0.431607 & 0.863213 & 0.568393 \tabularnewline
85 & 0.368484 & 0.736968 & 0.631516 \tabularnewline
86 & 0.322942 & 0.645884 & 0.677058 \tabularnewline
87 & 0.274521 & 0.549043 & 0.725479 \tabularnewline
88 & 0.237461 & 0.474923 & 0.762539 \tabularnewline
89 & 0.199875 & 0.399749 & 0.800125 \tabularnewline
90 & 0.220348 & 0.440695 & 0.779652 \tabularnewline
91 & 0.2938 & 0.587601 & 0.7062 \tabularnewline
92 & 0.422695 & 0.84539 & 0.577305 \tabularnewline
93 & 0.359745 & 0.71949 & 0.640255 \tabularnewline
94 & 0.30628 & 0.61256 & 0.69372 \tabularnewline
95 & 0.297133 & 0.594266 & 0.702867 \tabularnewline
96 & 0.264804 & 0.529608 & 0.735196 \tabularnewline
97 & 0.220581 & 0.441162 & 0.779419 \tabularnewline
98 & 0.187235 & 0.374469 & 0.812765 \tabularnewline
99 & 0.341096 & 0.682193 & 0.658904 \tabularnewline
100 & 0.312348 & 0.624695 & 0.687652 \tabularnewline
101 & 0.226511 & 0.453023 & 0.773489 \tabularnewline
102 & 0.151832 & 0.303665 & 0.848168 \tabularnewline
103 & 0.0955163 & 0.191033 & 0.904484 \tabularnewline
104 & 0.0624622 & 0.124924 & 0.937538 \tabularnewline
105 & 0.246709 & 0.493418 & 0.753291 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265122&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]7[/C][C]0.549346[/C][C]0.901308[/C][C]0.450654[/C][/ROW]
[ROW][C]8[/C][C]0.384937[/C][C]0.769873[/C][C]0.615063[/C][/ROW]
[ROW][C]9[/C][C]0.311586[/C][C]0.623172[/C][C]0.688414[/C][/ROW]
[ROW][C]10[/C][C]0.202835[/C][C]0.40567[/C][C]0.797165[/C][/ROW]
[ROW][C]11[/C][C]0.143424[/C][C]0.286847[/C][C]0.856576[/C][/ROW]
[ROW][C]12[/C][C]0.712554[/C][C]0.574891[/C][C]0.287446[/C][/ROW]
[ROW][C]13[/C][C]0.660076[/C][C]0.679848[/C][C]0.339924[/C][/ROW]
[ROW][C]14[/C][C]0.572043[/C][C]0.855914[/C][C]0.427957[/C][/ROW]
[ROW][C]15[/C][C]0.700702[/C][C]0.598597[/C][C]0.299298[/C][/ROW]
[ROW][C]16[/C][C]0.658921[/C][C]0.682159[/C][C]0.341079[/C][/ROW]
[ROW][C]17[/C][C]0.585573[/C][C]0.828853[/C][C]0.414427[/C][/ROW]
[ROW][C]18[/C][C]0.559014[/C][C]0.881972[/C][C]0.440986[/C][/ROW]
[ROW][C]19[/C][C]0.485766[/C][C]0.971532[/C][C]0.514234[/C][/ROW]
[ROW][C]20[/C][C]0.425573[/C][C]0.851145[/C][C]0.574427[/C][/ROW]
[ROW][C]21[/C][C]0.40378[/C][C]0.807561[/C][C]0.59622[/C][/ROW]
[ROW][C]22[/C][C]0.686676[/C][C]0.626649[/C][C]0.313324[/C][/ROW]
[ROW][C]23[/C][C]0.620469[/C][C]0.759062[/C][C]0.379531[/C][/ROW]
[ROW][C]24[/C][C]0.631241[/C][C]0.737518[/C][C]0.368759[/C][/ROW]
[ROW][C]25[/C][C]0.566038[/C][C]0.867923[/C][C]0.433962[/C][/ROW]
[ROW][C]26[/C][C]0.50064[/C][C]0.99872[/C][C]0.49936[/C][/ROW]
[ROW][C]27[/C][C]0.661306[/C][C]0.677389[/C][C]0.338694[/C][/ROW]
[ROW][C]28[/C][C]0.629038[/C][C]0.741924[/C][C]0.370962[/C][/ROW]
[ROW][C]29[/C][C]0.56957[/C][C]0.860861[/C][C]0.43043[/C][/ROW]
[ROW][C]30[/C][C]0.507625[/C][C]0.98475[/C][C]0.492375[/C][/ROW]
[ROW][C]31[/C][C]0.515417[/C][C]0.969166[/C][C]0.484583[/C][/ROW]
[ROW][C]32[/C][C]0.454257[/C][C]0.908513[/C][C]0.545743[/C][/ROW]
[ROW][C]33[/C][C]0.432398[/C][C]0.864797[/C][C]0.567602[/C][/ROW]
[ROW][C]34[/C][C]0.384167[/C][C]0.768334[/C][C]0.615833[/C][/ROW]
[ROW][C]35[/C][C]0.328404[/C][C]0.656808[/C][C]0.671596[/C][/ROW]
[ROW][C]36[/C][C]0.276636[/C][C]0.553273[/C][C]0.723364[/C][/ROW]
[ROW][C]37[/C][C]0.229108[/C][C]0.458217[/C][C]0.770892[/C][/ROW]
[ROW][C]38[/C][C]0.200971[/C][C]0.401942[/C][C]0.799029[/C][/ROW]
[ROW][C]39[/C][C]0.163954[/C][C]0.327908[/C][C]0.836046[/C][/ROW]
[ROW][C]40[/C][C]0.131852[/C][C]0.263704[/C][C]0.868148[/C][/ROW]
[ROW][C]41[/C][C]0.375691[/C][C]0.751382[/C][C]0.624309[/C][/ROW]
[ROW][C]42[/C][C]0.32601[/C][C]0.652019[/C][C]0.67399[/C][/ROW]
[ROW][C]43[/C][C]0.320979[/C][C]0.641957[/C][C]0.679021[/C][/ROW]
[ROW][C]44[/C][C]0.272107[/C][C]0.544215[/C][C]0.727893[/C][/ROW]
[ROW][C]45[/C][C]0.237004[/C][C]0.474008[/C][C]0.762996[/C][/ROW]
[ROW][C]46[/C][C]0.197909[/C][C]0.395818[/C][C]0.802091[/C][/ROW]
[ROW][C]47[/C][C]0.17141[/C][C]0.34282[/C][C]0.82859[/C][/ROW]
[ROW][C]48[/C][C]0.184641[/C][C]0.369283[/C][C]0.815359[/C][/ROW]
[ROW][C]49[/C][C]0.168378[/C][C]0.336756[/C][C]0.831622[/C][/ROW]
[ROW][C]50[/C][C]0.146205[/C][C]0.292411[/C][C]0.853795[/C][/ROW]
[ROW][C]51[/C][C]0.120221[/C][C]0.240443[/C][C]0.879779[/C][/ROW]
[ROW][C]52[/C][C]0.138662[/C][C]0.277324[/C][C]0.861338[/C][/ROW]
[ROW][C]53[/C][C]0.110438[/C][C]0.220875[/C][C]0.889562[/C][/ROW]
[ROW][C]54[/C][C]0.0994001[/C][C]0.1988[/C][C]0.9006[/C][/ROW]
[ROW][C]55[/C][C]0.107209[/C][C]0.214418[/C][C]0.892791[/C][/ROW]
[ROW][C]56[/C][C]0.0920865[/C][C]0.184173[/C][C]0.907913[/C][/ROW]
[ROW][C]57[/C][C]0.20392[/C][C]0.407839[/C][C]0.79608[/C][/ROW]
[ROW][C]58[/C][C]0.238487[/C][C]0.476975[/C][C]0.761513[/C][/ROW]
[ROW][C]59[/C][C]0.248909[/C][C]0.497817[/C][C]0.751091[/C][/ROW]
[ROW][C]60[/C][C]0.243197[/C][C]0.486394[/C][C]0.756803[/C][/ROW]
[ROW][C]61[/C][C]0.221037[/C][C]0.442075[/C][C]0.778963[/C][/ROW]
[ROW][C]62[/C][C]0.203784[/C][C]0.407569[/C][C]0.796216[/C][/ROW]
[ROW][C]63[/C][C]0.221344[/C][C]0.442687[/C][C]0.778656[/C][/ROW]
[ROW][C]64[/C][C]0.377555[/C][C]0.755109[/C][C]0.622445[/C][/ROW]
[ROW][C]65[/C][C]0.345219[/C][C]0.690438[/C][C]0.654781[/C][/ROW]
[ROW][C]66[/C][C]0.316056[/C][C]0.632112[/C][C]0.683944[/C][/ROW]
[ROW][C]67[/C][C]0.269339[/C][C]0.538677[/C][C]0.730661[/C][/ROW]
[ROW][C]68[/C][C]0.262104[/C][C]0.524208[/C][C]0.737896[/C][/ROW]
[ROW][C]69[/C][C]0.314599[/C][C]0.629198[/C][C]0.685401[/C][/ROW]
[ROW][C]70[/C][C]0.282841[/C][C]0.565683[/C][C]0.717159[/C][/ROW]
[ROW][C]71[/C][C]0.261382[/C][C]0.522764[/C][C]0.738618[/C][/ROW]
[ROW][C]72[/C][C]0.224382[/C][C]0.448765[/C][C]0.775618[/C][/ROW]
[ROW][C]73[/C][C]0.20204[/C][C]0.404079[/C][C]0.79796[/C][/ROW]
[ROW][C]74[/C][C]0.271874[/C][C]0.543748[/C][C]0.728126[/C][/ROW]
[ROW][C]75[/C][C]0.260567[/C][C]0.521133[/C][C]0.739433[/C][/ROW]
[ROW][C]76[/C][C]0.26645[/C][C]0.5329[/C][C]0.73355[/C][/ROW]
[ROW][C]77[/C][C]0.263478[/C][C]0.526957[/C][C]0.736522[/C][/ROW]
[ROW][C]78[/C][C]0.278639[/C][C]0.557278[/C][C]0.721361[/C][/ROW]
[ROW][C]79[/C][C]0.230557[/C][C]0.461114[/C][C]0.769443[/C][/ROW]
[ROW][C]80[/C][C]0.238788[/C][C]0.477576[/C][C]0.761212[/C][/ROW]
[ROW][C]81[/C][C]0.193787[/C][C]0.387574[/C][C]0.806213[/C][/ROW]
[ROW][C]82[/C][C]0.219553[/C][C]0.439105[/C][C]0.780447[/C][/ROW]
[ROW][C]83[/C][C]0.183575[/C][C]0.367151[/C][C]0.816425[/C][/ROW]
[ROW][C]84[/C][C]0.431607[/C][C]0.863213[/C][C]0.568393[/C][/ROW]
[ROW][C]85[/C][C]0.368484[/C][C]0.736968[/C][C]0.631516[/C][/ROW]
[ROW][C]86[/C][C]0.322942[/C][C]0.645884[/C][C]0.677058[/C][/ROW]
[ROW][C]87[/C][C]0.274521[/C][C]0.549043[/C][C]0.725479[/C][/ROW]
[ROW][C]88[/C][C]0.237461[/C][C]0.474923[/C][C]0.762539[/C][/ROW]
[ROW][C]89[/C][C]0.199875[/C][C]0.399749[/C][C]0.800125[/C][/ROW]
[ROW][C]90[/C][C]0.220348[/C][C]0.440695[/C][C]0.779652[/C][/ROW]
[ROW][C]91[/C][C]0.2938[/C][C]0.587601[/C][C]0.7062[/C][/ROW]
[ROW][C]92[/C][C]0.422695[/C][C]0.84539[/C][C]0.577305[/C][/ROW]
[ROW][C]93[/C][C]0.359745[/C][C]0.71949[/C][C]0.640255[/C][/ROW]
[ROW][C]94[/C][C]0.30628[/C][C]0.61256[/C][C]0.69372[/C][/ROW]
[ROW][C]95[/C][C]0.297133[/C][C]0.594266[/C][C]0.702867[/C][/ROW]
[ROW][C]96[/C][C]0.264804[/C][C]0.529608[/C][C]0.735196[/C][/ROW]
[ROW][C]97[/C][C]0.220581[/C][C]0.441162[/C][C]0.779419[/C][/ROW]
[ROW][C]98[/C][C]0.187235[/C][C]0.374469[/C][C]0.812765[/C][/ROW]
[ROW][C]99[/C][C]0.341096[/C][C]0.682193[/C][C]0.658904[/C][/ROW]
[ROW][C]100[/C][C]0.312348[/C][C]0.624695[/C][C]0.687652[/C][/ROW]
[ROW][C]101[/C][C]0.226511[/C][C]0.453023[/C][C]0.773489[/C][/ROW]
[ROW][C]102[/C][C]0.151832[/C][C]0.303665[/C][C]0.848168[/C][/ROW]
[ROW][C]103[/C][C]0.0955163[/C][C]0.191033[/C][C]0.904484[/C][/ROW]
[ROW][C]104[/C][C]0.0624622[/C][C]0.124924[/C][C]0.937538[/C][/ROW]
[ROW][C]105[/C][C]0.246709[/C][C]0.493418[/C][C]0.753291[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265122&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265122&T=5
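
Every row of this table is one Goldfeld-Quandt test with a different breakpoint, computed with gqtest() from the lmtest package (the same call used in the R listing below). For example, the row for breakpoint 35 can be reproduced with the hypothetical fit object from the first sketch:

library(lmtest)
gqtest(fit, point = 35, alternative = 'greater')$p.value     # ~0.328404
gqtest(fit, point = 35, alternative = 'two.sided')$p.value   # ~0.656808
gqtest(fit, point = 35, alternative = 'less')$p.value        # ~0.671596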








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     0                     0                     OK
10% type I error level    0                     0                     OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265122&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265122&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265122&T=6
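
The meta analysis simply counts how many of the two-sided p-values in the previous table fall below each type I error level. A sketch that recomputes the counts with the hypothetical fit object, over the same breakpoints 7 through 105:

library(lmtest)
p2 <- sapply(7:105, function(bp) gqtest(fit, point = bp, alternative = 'two.sided')$p.value)
c('1%' = sum(p2 < 0.01), '5%' = sum(p2 < 0.05), '10%' = sum(p2 < 0.10))   # all zero for this run, hence OK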




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
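# The three blocks below optionally difference the series, add seasonal dummies,
# or add a linear trend; with par2/par3 as set for this run, none of them is applied.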
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # parentheses give the intended range 1..(n-1); '1:n-1' evaluates to 0..(n-1)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
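# Goldfeld-Quandt tests at every admissible breakpoint (only run when n > 25 observations);
# the columns of gqarr hold the p-values for the 'greater', 'two.sided' and 'less' alternatives.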
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
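# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density, normal Q-Q plot,
# lag plot, ACF/PACF, standard lm diagnostics, and the Goldfeld-Quandt p-value plot.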
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
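# Build the output tables shown above using the table.* helpers loaded from 'createtable'.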
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}