Free Statistics

Author: the author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 08 Dec 2014 20:19:40 +0000
Cite this page as follows:
  Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/08/t1418069997xjfpiyrf2g5iesm.htm/, Retrieved Tue, 28 May 2024 00:48:47 +0000
  Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=264209, Retrieved Tue, 28 May 2024 00:48:47 +0000
Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 106
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-      [Multiple Regression] [mr totaalscores] [2014-12-08 20:19:40] [ec1b40d1a9751af99658fe8fca4f9eca] [Current]
- RMPD [Central Tendency] [tot central tendency] [2014-12-10 09:19:29] [673773038936aef3a5778d7e6bda5c1e]
Dataseries X (five columns per row; judging from the regression output below, these are gendercode, Ex, PR, PE, and PA, in that order):
1 7.5 1.8 2.1 1.5
0 6.0 2.1 2.0 2.1
1 6.5 2.2 2.0 2.1
0 1.0 2.3 2.1 1.9
0 1.0 2.1 2.0 1.6
0 5.5 2.7 2.3 2.1
1 8.5 2.1 2.1 2.1
0 6.5 2.4 2.1 2.2
0 4.5 2.9 2.2 1.5
0 2.0 2.2 2.1 1.9
0 5.0 2.1 2.1 2.2
0 0.5 2.2 2.1 1.6
0 5.0 2.2 2.0 1.5
1 5.0 2.7 2.3 1.9
1 2.5 1.9 1.8 0.1
1 5.0 2.0 2.0 2.2
0 5.5 2.5 2.2 1.8
1 3.5 2.2 2.0 1.6
0 3.0 2.3 2.1 2.2
1 4.0 1.9 2.0 2.1
0 0.5 2.1 1.8 1.9
0 6.5 3.5 2.2 1.6
1 4.5 2.1 2.2 1.9
0 7.5 2.3 1.7 2.2
0 5.5 2.3 2.1 1.8
0 4.0 2.2 2.3 2.4
0 7.5 3.5 2.7 2.4
1 7.0 1.9 1.9 2.5
0 4.0 1.9 2.0 1.9
1 5.5 1.9 2.0 2.1
1 2.5 1.9 1.9 1.9
1 5.5 2.1 2.0 2.1
0 3.5 2.0 2.0 1.5
0 2.5 3.2 2.1 1.9
0 4.5 2.3 2.0 2.1
0 4.5 2.5 1.8 1.5
1 4.5 1.8 2.0 2.1
0 6.0 2.4 2.2 2.1
0 2.5 2.8 2.2 1.8
1 5.0 2.3 2.1 2.4
0 0.0 2.0 1.8 2.1
0 5.0 2.5 1.9 1.9
0 6.5 2.3 2.1 2.1
0 5.0 1.8 2.0 1.9
0 6.0 1.9 1.9 2.4
1 4.5 2.6 2.2 2.1
0 5.5 2.0 2.0 2.2
0 1.0 2.6 2.0 2.2
1 7.5 1.6 1.7 1.8
0 6.0 2.2 2.0 2.1
0 5.0 2.1 2.2 2.4
0 1.0 1.8 1.7 2.2
0 5.0 1.8 2.0 2.1
0 6.5 1.9 2.2 1.5
0 7.0 2.4 2.0 1.9
0 4.5 1.9 1.9 1.8
1 0.0 2.0 2.0 1.8
0 8.5 2.1 2.0 1.6
1 3.5 1.7 1.6 1.2
0 7.5 1.9 2.1 1.8
0 3.5 2.1 2.1 1.5
1 6.0 2.4 2.0 2.1
1 1.5 1.8 1.9 2.4
0 9.0 2.3 2.2 2.4
1 3.5 2.1 2.1 1.5
0 3.5 2.0 1.8 1.8
1 4.0 2.8 2.3 2.1
0 6.5 2.0 2.3 2.2
0 7.5 2.7 2.2 2.1
1 6.0 2.1 2.1 1.9
1 5.0 2.9 2.2 2.1
1 5.5 2.0 1.9 1.9
1 3.5 1.8 1.8 1.6
0 7.5 2.6 2.1 2.4
1 6.5 2.1 2.0 1.9
0 NA 2.3 1.7 1.9
1 6.5 2.3 2.1 2.1
0 6.5 2.2 2.1 1.8
1 7.0 2.0 2.1 2.1
1 3.5 2.2 1.8 2.4
0 1.5 2.1 2.0 2.1
1 4.0 2.1 2.1 2.2
1 7.5 1.9 1.9 2.1
1 4.5 2.0 2.1 2.2
0 0.0 1.7 1.0 1.6
1 3.5 2.2 2.2 2.4
0 5.5 2.2 2.1 2.1
0 5.0 2.3 1.9 1.9
1 4.5 2.4 2.0 2.4
1 2.5 2.1 1.9 2.1
1 7.5 1.9 2.0 1.8
0 7.0 1.7 1.8 2.1
0 0.0 1.8 2.0 1.8
0 4.5 1.5 2.0 1.9
1 3.0 1.9 2.0 1.9
0 1.5 1.9 1.8 2.4
1 3.5 1.7 2.0 1.8
0 2.5 1.9 1.1 1.8
0 5.5 1.9 1.8 2.1
0 8.0 1.8 1.8 2.1
0 1.0 2.4 2.0 2.4
0 5.0 1.8 1.9 1.9
1 4.5 1.9 2.1 1.8
0 3.0 1.8 1.6 1.8
0 3.0 2.1 2.2 2.2
0 8.0 1.9 1.9 2.4
1 2.5 2.2 2.0 1.8
1 7.0 2.0 2.1 2.4
1 0.0 1.7 1.3 1.8
1 1.0 1.7 1.8 1.9
1 3.5 1.8 1.9 2.4
1 5.5 1.9 2.1 2.1
0 5.5 1.8 1.8 1.9
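
To rerun the analysis outside the website, the series can be loaded into R as sketched below. The file name and the column names are assumptions on my part (the names follow the variables that appear in the regression output further down); the single NA marks the missing value in the second column.

# Sketch: load the whitespace-separated series listed above (assumed file and column names)
mydata <- read.table('dataseries_x.txt', header = FALSE, na.strings = 'NA',
                     col.names = c('gendercode', 'Ex', 'PR', 'PE', 'PA'))
str(mydata)  # 113 observations of 5 variables; one NA in Ex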




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Gertrude Mary Cox' @ cox.wessa.net
Source: https://freestatistics.org/blog/index.php?pk=264209&T=0

Multiple Linear Regression - Estimated Regression Equation
gendercode[t] = 0.590484 - 0.000321919 Ex[t] - 0.383417 PR[t] + 0.346481 PE[t] - 0.0208856 PA[t] + e[t]
Source: https://freestatistics.org/blog/index.php?pk=264209&T=1
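
As a quick cross-check, the same equation can be reproduced with a plain lm() call. This is a sketch that assumes the mydata frame from the loading sketch above; the row containing the NA is dropped automatically, leaving 112 observations.

# Sketch: refit the model described by the equation above (assumes 'mydata' from the loading sketch)
fit <- lm(gendercode ~ Ex + PR + PE + PA, data = mydata)
coef(fit)  # should reproduce 0.590484, -0.000321919, -0.383417, 0.346481, -0.0208856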


Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter      S.D.        T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)     0.590484      0.479028     1.233                       0.220401         0.1102
Ex             -0.000321919   0.0226886   -0.01419                     0.988706         0.494353
PR             -0.383417      0.15678     -2.446                       0.0160934        0.00804672
PE              0.346481      0.273532     1.267                       0.208015         0.104007
PA             -0.0208856     0.148179    -0.1409                      0.888176         0.444088
Source: https://freestatistics.org/blog/index.php?pk=264209&T=2
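
These columns map directly onto summary(fit)$coefficients (estimate, standard error, t statistic, two-tailed p-value); the module labels the standard error "S.D." and obtains the one-tailed p-value by halving the two-tailed one. A sketch, reusing the hypothetical fit object from the previous snippet:

st <- summary(fit)$coefficients           # Estimate, Std. Error, t value, Pr(>|t|)
round(cbind(st, one.tail.p = st[, 4] / 2), 6)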


Multiple Linear Regression - Regression Statistics
Multiple R: 0.23066
R-squared: 0.0532039
Adjusted R-squared: 0.0178097
F-TEST (value): 1.50318
F-TEST (DF numerator): 4
F-TEST (DF denominator): 107
p-value: 0.206432

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.491284
Sum Squared Residuals: 25.8256
Source: https://freestatistics.org/blog/index.php?pk=264209&T=3
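
Multiple R is simply the square root of R-squared, and the overall p-value follows from the F statistic with 4 and 107 degrees of freedom. A sketch of how these figures come out of the same hypothetical fit object:

s <- summary(fit)
c(multiple.R = sqrt(s$r.squared), R.squared = s$r.squared, adj.R.squared = s$adj.r.squared)
s$fstatistic                                               # F value, numerator df, denominator df
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])  # overall p-value (about 0.2064)
c(resid.sd = s$sigma, ssr = sum(residuals(fit)^2))         # residual SD and sum of squared residuals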



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 1 & 0.5942 & 0.4058 \tabularnewline
2 & 0 & 0.432479 & -0.432479 \tabularnewline
3 & 1 & 0.393976 & 0.606024 \tabularnewline
4 & 0 & 0.39623 & -0.39623 \tabularnewline
5 & 0 & 0.444531 & -0.444531 \tabularnewline
6 & 0 & 0.306534 & -0.306534 \tabularnewline
7 & 1 & 0.466322 & 0.533678 \tabularnewline
8 & 0 & 0.349852 & -0.349852 \tabularnewline
9 & 0 & 0.208055 & -0.208055 \tabularnewline
10 & 0 & 0.43425 & -0.43425 \tabularnewline
11 & 0 & 0.46536 & -0.46536 \tabularnewline
12 & 0 & 0.440998 & -0.440998 \tabularnewline
13 & 0 & 0.40699 & -0.40699 \tabularnewline
14 & 1 & 0.310872 & 0.689128 \tabularnewline
15 & 1 & 0.482764 & 0.517236 \tabularnewline
16 & 1 & 0.469054 & 0.530946 \tabularnewline
17 & 0 & 0.354835 & -0.354835 \tabularnewline
18 & 1 & 0.405384 & 0.594616 \tabularnewline
19 & 0 & 0.38932 & -0.38932 \tabularnewline
20 & 1 & 0.509806 & 0.490194 \tabularnewline
21 & 0 & 0.36913 & -0.36913 \tabularnewline
22 & 0 & -0.0247275 & 0.0247275 \tabularnewline
23 & 1 & 0.506435 & 0.493565 \tabularnewline
24 & 0 & 0.249279 & -0.249279 \tabularnewline
25 & 0 & 0.39687 & -0.39687 \tabularnewline
26 & 0 & 0.492459 & -0.492459 \tabularnewline
27 & 0 & 0.131483 & -0.131483 \tabularnewline
28 & 1 & 0.465838 & 0.534162 \tabularnewline
29 & 0 & 0.513983 & -0.513983 \tabularnewline
30 & 1 & 0.509323 & 0.490677 \tabularnewline
31 & 1 & 0.479818 & 0.520182 \tabularnewline
32 & 1 & 0.43264 & 0.56736 \tabularnewline
33 & 0 & 0.484156 & -0.484156 \tabularnewline
34 & 0 & 0.0506715 & -0.0506715 \tabularnewline
35 & 0 & 0.356278 & -0.356278 \tabularnewline
36 & 0 & 0.22283 & -0.22283 \tabularnewline
37 & 1 & 0.547987 & 0.452013 \tabularnewline
38 & 0 & 0.38675 & -0.38675 \tabularnewline
39 & 0 & 0.240775 & -0.240775 \tabularnewline
40 & 1 & 0.384499 & 0.615501 \tabularnewline
41 & 0 & 0.403456 & -0.403456 \tabularnewline
42 & 0 & 0.248963 & -0.248963 \tabularnewline
43 & 0 & 0.390282 & -0.390282 \tabularnewline
44 & 0 & 0.552003 & -0.552003 \tabularnewline
45 & 0 & 0.468248 & -0.468248 \tabularnewline
46 & 1 & 0.310549 & 0.689451 \tabularnewline
47 & 0 & 0.468893 & -0.468893 \tabularnewline
48 & 0 & 0.240291 & -0.240291 \tabularnewline
49 & 1 & 0.526026 & 0.473974 \tabularnewline
50 & 0 & 0.394137 & -0.394137 \tabularnewline
51 & 0 & 0.495831 & -0.495831 \tabularnewline
52 & 0 & 0.44308 & -0.44308 \tabularnewline
53 & 0 & 0.547826 & -0.547826 \tabularnewline
54 & 0 & 0.590829 & -0.590829 \tabularnewline
55 & 0 & 0.321309 & -0.321309 \tabularnewline
56 & 0 & 0.481262 & -0.481262 \tabularnewline
57 & 1 & 0.479017 & 0.520983 \tabularnewline
58 & 0 & 0.442117 & -0.442117 \tabularnewline
59 & 1 & 0.466855 & 0.533145 \tabularnewline
60 & 0 & 0.549593 & -0.549593 \tabularnewline
61 & 0 & 0.480463 & -0.480463 \tabularnewline
62 & 1 & 0.317453 & 0.682547 \tabularnewline
63 & 1 & 0.508039 & 0.491961 \tabularnewline
64 & 0 & 0.41786 & -0.41786 \tabularnewline
65 & 1 & 0.480463 & 0.519537 \tabularnewline
66 & 0 & 0.408595 & -0.408595 \tabularnewline
67 & 1 & 0.268675 & 0.731325 \tabularnewline
68 & 0 & 0.572515 & -0.572515 \tabularnewline
69 & 0 & 0.271242 & -0.271242 \tabularnewline
70 & 1 & 0.471304 & 0.528696 \tabularnewline
71 & 1 & 0.195363 & 0.804637 \tabularnewline
72 & 1 & 0.44051 & 0.55949 \tabularnewline
73 & 1 & 0.489455 & 0.510545 \tabularnewline
74 & 0 & 0.26867 & -0.26867 \tabularnewline
75 & 1 & 0.436495 & 0.563505 \tabularnewline
76 & 0 & -0.609718 & 0.609718 \tabularnewline
77 & 1 & 1.43489 & -0.43489 \tabularnewline
78 & 0 & -0.494854 & 0.494854 \tabularnewline
79 & 1 & 0.31938 & 0.68062 \tabularnewline
80 & 1 & 1.43393 & -0.433927 \tabularnewline
81 & 0 & -0.534318 & 0.534318 \tabularnewline
82 & 1 & 0.474031 & 0.525969 \tabularnewline
83 & 1 & 0.503863 & 0.496137 \tabularnewline
84 & 1 & 1.25174 & -0.251739 \tabularnewline
85 & 0 & -0.542028 & 0.542028 \tabularnewline
86 & 1 & 1.42895 & -0.428946 \tabularnewline
87 & 0 & 0.325646 & -0.325646 \tabularnewline
88 & 0 & -0.688329 & 0.688329 \tabularnewline
89 & 1 & 0.398957 & 0.601043 \tabularnewline
90 & 1 & 0.514945 & 0.485055 \tabularnewline
91 & 1 & 1.51623 & -0.516227 \tabularnewline
92 & 0 & 0.555701 & -0.555701 \tabularnewline
93 & 0 & 0.667189 & -0.667189 \tabularnewline
94 & 0 & -0.485695 & 0.485695 \tabularnewline
95 & 1 & 1.43505 & -0.435049 \tabularnewline
96 & 0 & -0.407084 & 0.407084 \tabularnewline
97 & 1 & 1.20472 & -0.204721 \tabularnewline
98 & 0 & 0.440027 & -0.440027 \tabularnewline
99 & 0 & 0.477564 & -0.477564 \tabularnewline
100 & 0 & 0.312797 & -0.312797 \tabularnewline
101 & 0 & 0.517355 & -0.517355 \tabularnewline
102 & 0 & -0.449441 & 0.449441 \tabularnewline
103 & 1 & 1.41614 & -0.416143 \tabularnewline
104 & 0 & 0.500652 & -0.500652 \tabularnewline
105 & 0 & 0.467604 & -0.467604 \tabularnewline
106 & 0 & -0.598471 & 0.598471 \tabularnewline
107 & 1 & 0.498881 & 0.501119 \tabularnewline
108 & 1 & 0.351506 & 0.648494 \tabularnewline
109 & 1 & 0.522336 & 0.477664 \tabularnewline
110 & 1 & 0.507395 & 0.492605 \tabularnewline
111 & 1 & 0.543971 & 0.456029 \tabularnewline
112 & 1 & 1.48255 & -0.482546 \tabularnewline
113 & 0 & NA & NA \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264209&T=4
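
Each row of this table is just the observed value, the fitted ("interpolation") value, and their difference (the residual), so Actuals = Interpolation + Residuals. With the hypothetical fit object from the earlier sketches the same three columns can be listed as follows; note that only the 112 complete cases are returned, so row numbering may differ slightly around the missing observation.

# Sketch: observed, fitted and residual values for the complete cases (assumes 'fit' from above)
act <- cbind(Actuals = fit$model$gendercode,
             Interpolation = fitted(fit),
             Residuals = residuals(fit))
head(round(act, 6))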



\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & \multicolumn{3}{c}{Alternative Hypothesis} \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
8 & 0.576743 & 0.846515 & 0.423257 \tabularnewline
9 & 0.407621 & 0.815242 & 0.592379 \tabularnewline
10 & 0.270811 & 0.541621 & 0.729189 \tabularnewline
11 & 0.20437 & 0.40874 & 0.79563 \tabularnewline
12 & 0.129665 & 0.259329 & 0.870335 \tabularnewline
13 & 0.144746 & 0.289491 & 0.855254 \tabularnewline
14 & 0.288785 & 0.577569 & 0.711215 \tabularnewline
15 & 0.253427 & 0.506855 & 0.746573 \tabularnewline
16 & 0.373004 & 0.746007 & 0.626996 \tabularnewline
17 & 0.343033 & 0.686067 & 0.656967 \tabularnewline
18 & 0.422708 & 0.845415 & 0.577292 \tabularnewline
19 & 0.350245 & 0.700491 & 0.649755 \tabularnewline
20 & 0.370964 & 0.741928 & 0.629036 \tabularnewline
21 & 0.305386 & 0.610772 & 0.694614 \tabularnewline
22 & 0.243087 & 0.486174 & 0.756913 \tabularnewline
23 & 0.245819 & 0.491638 & 0.754181 \tabularnewline
24 & 0.204833 & 0.409665 & 0.795167 \tabularnewline
25 & 0.206575 & 0.413151 & 0.793425 \tabularnewline
26 & 0.191246 & 0.382492 & 0.808754 \tabularnewline
27 & 0.148833 & 0.297667 & 0.851167 \tabularnewline
28 & 0.152306 & 0.304612 & 0.847694 \tabularnewline
29 & 0.164796 & 0.329592 & 0.835204 \tabularnewline
30 & 0.156957 & 0.313913 & 0.843043 \tabularnewline
31 & 0.190454 & 0.380907 & 0.809546 \tabularnewline
32 & 0.197952 & 0.395903 & 0.802048 \tabularnewline
33 & 0.212955 & 0.42591 & 0.787045 \tabularnewline
34 & 0.197613 & 0.395226 & 0.802387 \tabularnewline
35 & 0.175993 & 0.351986 & 0.824007 \tabularnewline
36 & 0.148194 & 0.296388 & 0.851806 \tabularnewline
37 & 0.140196 & 0.280391 & 0.859804 \tabularnewline
38 & 0.134917 & 0.269834 & 0.865083 \tabularnewline
39 & 0.11635 & 0.232701 & 0.88365 \tabularnewline
40 & 0.14637 & 0.292739 & 0.85363 \tabularnewline
41 & 0.12667 & 0.253339 & 0.87333 \tabularnewline
42 & 0.108138 & 0.216276 & 0.891862 \tabularnewline
43 & 0.10999 & 0.219979 & 0.89001 \tabularnewline
44 & 0.134424 & 0.268848 & 0.865576 \tabularnewline
45 & 0.135898 & 0.271796 & 0.864102 \tabularnewline
46 & 0.181886 & 0.363773 & 0.818114 \tabularnewline
47 & 0.181032 & 0.362064 & 0.818968 \tabularnewline
48 & 0.166245 & 0.33249 & 0.833755 \tabularnewline
49 & 0.168474 & 0.336949 & 0.831526 \tabularnewline
50 & 0.161672 & 0.323344 & 0.838328 \tabularnewline
51 & 0.16321 & 0.32642 & 0.83679 \tabularnewline
52 & 0.150519 & 0.301037 & 0.849481 \tabularnewline
53 & 0.159079 & 0.318158 & 0.840921 \tabularnewline
54 & 0.192268 & 0.384536 & 0.807732 \tabularnewline
55 & 0.180583 & 0.361165 & 0.819417 \tabularnewline
56 & 0.180339 & 0.360678 & 0.819661 \tabularnewline
57 & 0.202365 & 0.404729 & 0.797635 \tabularnewline
58 & 0.202988 & 0.405976 & 0.797012 \tabularnewline
59 & 0.210553 & 0.421105 & 0.789447 \tabularnewline
60 & 0.219375 & 0.438751 & 0.780625 \tabularnewline
61 & 0.239378 & 0.478756 & 0.760622 \tabularnewline
62 & 0.270547 & 0.541094 & 0.729453 \tabularnewline
63 & 0.282693 & 0.565386 & 0.717307 \tabularnewline
64 & 0.270395 & 0.540789 & 0.729605 \tabularnewline
65 & 0.262374 & 0.524748 & 0.737626 \tabularnewline
66 & 0.259251 & 0.518502 & 0.740749 \tabularnewline
67 & 0.284673 & 0.569345 & 0.715327 \tabularnewline
68 & 0.30966 & 0.61932 & 0.69034 \tabularnewline
69 & 0.314426 & 0.628853 & 0.685574 \tabularnewline
70 & 0.306108 & 0.612215 & 0.693892 \tabularnewline
71 & 0.329035 & 0.658069 & 0.670965 \tabularnewline
72 & 0.332467 & 0.664934 & 0.667533 \tabularnewline
73 & 0.330739 & 0.661477 & 0.669261 \tabularnewline
74 & 0.326254 & 0.652507 & 0.673746 \tabularnewline
75 & 0.326399 & 0.652799 & 0.673601 \tabularnewline
76 & 0.328792 & 0.657585 & 0.671208 \tabularnewline
77 & 0.332607 & 0.665214 & 0.667393 \tabularnewline
78 & 0.321699 & 0.643399 & 0.678301 \tabularnewline
79 & 0.344162 & 0.688323 & 0.655838 \tabularnewline
80 & 0.358107 & 0.716214 & 0.641893 \tabularnewline
81 & 0.345909 & 0.691818 & 0.654091 \tabularnewline
82 & 0.368695 & 0.73739 & 0.631305 \tabularnewline
83 & 0.358562 & 0.717125 & 0.641438 \tabularnewline
84 & 0.311778 & 0.623555 & 0.688222 \tabularnewline
85 & 0.30121 & 0.602419 & 0.69879 \tabularnewline
86 & 0.299399 & 0.598798 & 0.700601 \tabularnewline
87 & 0.312956 & 0.625912 & 0.687044 \tabularnewline
88 & 0.322068 & 0.644136 & 0.677932 \tabularnewline
89 & 0.329073 & 0.658146 & 0.670927 \tabularnewline
90 & 0.322905 & 0.64581 & 0.677095 \tabularnewline
91 & 0.287504 & 0.575008 & 0.712496 \tabularnewline
92 & 0.385089 & 0.770178 & 0.614911 \tabularnewline
93 & 0.53986 & 0.92028 & 0.46014 \tabularnewline
94 & 0.485676 & 0.971353 & 0.514324 \tabularnewline
95 & 0.510772 & 0.978455 & 0.489228 \tabularnewline
96 & 0.429506 & 0.859012 & 0.570494 \tabularnewline
97 & 0.354637 & 0.709273 & 0.645363 \tabularnewline
98 & 0.291522 & 0.583045 & 0.708478 \tabularnewline
99 & 0.223762 & 0.447524 & 0.776238 \tabularnewline
100 & 0.215353 & 0.430705 & 0.784647 \tabularnewline
101 & 0.203808 & 0.407616 & 0.796192 \tabularnewline
102 & 0.163345 & 0.32669 & 0.836655 \tabularnewline
103 & 0.156589 & 0.313178 & 0.843411 \tabularnewline
104 & 0.654439 & 0.691122 & 0.345561 \tabularnewline
105 & 0.567986 & 0.864028 & 0.432014 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264209&T=5
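
Each row reports a Goldfeld-Quandt test from the lmtest package, with the sample split at the indicated breakpoint and the p-value computed under the three alternatives. A sketch for a single breakpoint, again using the hypothetical fit object; the values should agree with the corresponding row above up to how the missing observation is handled.

library(lmtest)  # gqtest()
sapply(c('greater', 'two.sided', 'less'),
       function(alt) gqtest(fit, point = 8, alternative = alt)$p.value)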


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    0                     0                     OK
10% type I error level   0                     0                     OK
Source: https://freestatistics.org/blog/index.php?pk=264209&T=6
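
The meta-analysis simply counts how often the two-sided Goldfeld-Quandt p-value falls below each type I error level across all breakpoints; since none of the 98 tests are significant, heteroskedasticity is not flagged at any of the three levels. A sketch of the counting step, under the same assumptions as the previous snippets:

# Sketch: recompute the 2-sided p-values over the breakpoints used above (8 to 105) and count rejections
pvals <- sapply(8:105, function(p) gqtest(fit, point = p, alternative = 'two.sided')$p.value)
sapply(c(0.01, 0.05, 0.10),
       function(alpha) c(count = sum(pvals < alpha), percent = 100 * mean(pvals < alpha)))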


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice) # provides densityplot() for the residual density plot
library(lmtest)  # provides gqtest() for the Goldfeld-Quandt test
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # note the parentheses: 1:n-1 is parsed by R as (1:n)-1, i.e. 0:(n-1)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) { # Goldfeld-Quandt tests are only computed when there are more than 25 observations
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}