Free Statistics


Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 24 Dec 2010 09:51:58 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2010/Dec/24/t1293184186082vqsd65z3mox8.htm/, Retrieved Tue, 30 Apr 2024 04:14:41 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=114675, Retrieved Tue, 30 Apr 2024 04:14:41 +0000

Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 177
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [] [2010-12-05 18:56:24] [b98453cac15ba1066b407e146608df68]
-   PD    [Multiple Regression] [] [2010-12-24 09:51:58] [7674ee8f347756742f81ca2ada5c384c] [Current]
-   PD      [Multiple Regression] [] [2010-12-24 17:22:11] [fa409bd323d47d7cf4d4bfe80571749f]
-    D      [Multiple Regression] [] [2010-12-24 20:31:46] [58af523ef9b33032fd2497c80088399b]
-   PD        [Multiple Regression] [Multiple regression] [2010-12-27 10:06:13] [d4d7f64064e581afd5f11cb27d8ab03c]
Dataseries X:
172.69	104.31
172.98	103.88
172.98	103.88
172.89	103.86
173.38	103.89
173.20	103.98
173.24	103.98
172.86	104.29
172.86	104.29
172.74	104.24
172.28	103.98
171.05	103.54
171.07	103.44
171.07	103.32
171.07	103.30
171.11	103.26
170.72	103.14
170.49	103.11
170.48	102.91
170.48	103.23
170.48	103.23
170.57	103.14
170.39	102.91
170.04	102.42
169.67	102.10
169.57	102.07
169.57	102.06
169.53	101.98
169.24	101.83
169.29	101.75
169.21	101.56
168.58	101.66
168.58	101.65
168.55	101.61
168.46	101.52
167.39	101.31
167.16	101.19
167.16	101.11
167.16	101.10
167.17	101.07
166.52	100.98
166.35	100.93
166.19	100.92
166.19	101.02
166.19	101.01
166.07	100.97
166.64	100.89
166.26	100.62
166.44	100.53
166.27	100.48
166.27	100.48
166.30	100.47
165.97	100.52
164.58	100.49
164.28	100.47
163.93	100.44
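To reload the series above in an R session, one could proceed along these lines (a sketch only; the column names Gemconsprijsblazers and consumptieindexkleding are inferred from the regression output further below, and the data string is truncated here):
raw <- "172.69 104.31
172.98 103.88
163.93 100.44"   # paste all 56 rows between the quotes
dat <- read.table(text = raw, col.names = c("Gemconsprijsblazers", "consumptieindexkleding"))
str(dat)   # 56 obs. of 2 numeric variables once the full series is pasted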




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: 'George Udny Yule' @ 72.249.76.132

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 7 seconds \tabularnewline
R Server & 'George Udny Yule' @ 72.249.76.132 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=114675&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]7 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'George Udny Yule' @ 72.249.76.132[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=114675&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=114675&T=0








Multiple Linear Regression - Estimated Regression Equation
Gemconsprijsblazers[t] = -31.5170225327134 + 1.96368138479357 consumptieindexkleding[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Gemconsprijsblazers[t] =  -31.5170225327134 +  1.96368138479357consumptieindexkleding[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=114675&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]Gemconsprijsblazers[t] =  -31.5170225327134 +  1.96368138479357consumptieindexkleding[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=114675&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=114675&T=1
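The estimated equation can be reproduced with base R's lm(); a sketch, assuming the data frame dat from the sketch under the data series:
fit <- lm(Gemconsprijsblazers ~ consumptieindexkleding, data = dat)   # blazer price on clothing consumption index
coef(fit)   # should agree with the table: intercept about -31.517, slope about 1.9637
# Worked example: at consumptieindexkleding = 103 the fitted value is
# -31.5170225327134 + 1.96368138479357 * 103, i.e. about 170.74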








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	-31.5170225327134	6.720968	-4.6894	1.9e-05	1e-05
consumptieindexkleding	1.96368138479357	0.065768	29.8579	0	0

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STATH0: parameter = 0 & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & -31.5170225327134 & 6.720968 & -4.6894 & 1.9e-05 & 1e-05 \tabularnewline
consumptieindexkleding & 1.96368138479357 & 0.065768 & 29.8579 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=114675&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STATH0: parameter = 0[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]-31.5170225327134[/C][C]6.720968[/C][C]-4.6894[/C][C]1.9e-05[/C][C]1e-05[/C][/ROW]
[ROW][C]consumptieindexkleding[/C][C]1.96368138479357[/C][C]0.065768[/C][C]29.8579[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=114675&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=114675&T=2
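The 2-tail and 1-tail p-value columns follow from the t-statistics and the residual degrees of freedom; a sketch, again assuming the fit object from the earlier sketch:
tstat <- summary(fit)$coefficients[, "t value"]
df <- fit$df.residual              # n - k = 56 - 2 = 54
p.two <- 2 * pt(-abs(tstat), df)   # 2-tail p-value column
p.one <- p.two / 2                 # 1-tail p-value column
cbind(t = tstat, p.two, p.one)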








Multiple Linear Regression - Regression Statistics
Multiple R: 0.971023618522707
R-squared: 0.942886867728931
Adjusted R-squared: 0.94182921713132
F-TEST (value): 891.491830910392
F-TEST (DF numerator): 1
F-TEST (DF denominator): 54
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.642161673218539
Sum Squared Residuals: 22.268067185745

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.971023618522707 \tabularnewline
R-squared & 0.942886867728931 \tabularnewline
Adjusted R-squared & 0.94182921713132 \tabularnewline
F-TEST (value) & 891.491830910392 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 54 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.642161673218539 \tabularnewline
Sum Squared Residuals & 22.268067185745 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=114675&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.971023618522707[/C][/ROW]
[ROW][C]R-squared[/C][C]0.942886867728931[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.94182921713132[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]891.491830910392[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]1[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]54[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.642161673218539[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]22.268067185745[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=114675&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=114675&T=3
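Each entry of this table is a standard summary of the same fit; a sketch of how the values could be recomputed from the assumed fit object:
s <- summary(fit)
sqrt(s$r.squared)        # Multiple R
s$r.squared              # R-squared
s$adj.r.squared          # Adjusted R-squared
s$fstatistic             # F value with numerator and denominator DF
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])   # F-test p-value
s$sigma                  # residual standard deviation
sum(residuals(fit)^2)    # sum of squared residuals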








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	172.69	173.314582715104	-0.62458271510448
2	172.98	172.470199719643	0.509800280357203
3	172.98	172.470199719643	0.509800280357196
4	172.89	172.430926091947	0.459073908053057
5	173.38	172.489836533491	0.890163466509256
6	173.2	172.666567858122	0.533432141877822
7	173.24	172.666567858122	0.573432141877842
8	172.86	173.275309087408	-0.415309087408165
9	172.86	173.275309087408	-0.415309087408165
10	172.74	173.177125018168	-0.437125018168468
11	172.28	172.666567858122	-0.386567858122166
12	171.05	171.802548048813	-0.752548048812989
13	171.07	171.606179910334	-0.536179910333634
14	171.07	171.370538144158	-0.300538144158397
15	171.07	171.331264516463	-0.261264516462533
16	171.11	171.252717261071	-0.142717261070785
17	170.72	171.017075494896	-0.297075494895562
18	170.49	170.958165053352	-0.468165053351743
19	170.48	170.565428776393	-0.0854287763930428
20	170.48	171.193806819527	-0.713806819527
21	170.48	171.193806819527	-0.713806819527
22	170.57	171.017075494896	-0.447075494895568
23	170.39	170.565428776393	-0.175428776393046
24	170.04	169.603224897844	0.436775102155799
25	169.67	168.974846854710	0.695153145289751
26	169.57	168.915936413166	0.654063586833566
27	169.57	168.896299599318	0.673700400681484
28	169.53	168.739205088535	0.790794911464974
29	169.24	168.444652880816	0.795347119184029
30	169.29	168.287558370032	1.00244162996749
31	169.21	167.914458906922	1.29554109307828
32	168.58	168.110827045401	0.469172954598943
33	168.58	168.091190231553	0.48880976844686
34	168.55	168.012642976161	0.537357023838615
35	168.46	167.83591165153	0.624088348470039
36	167.39	167.423538560723	-0.0335385607233452
37	167.16	167.187896794548	-0.0278967945480979
38	167.16	167.030802283765	0.129197716235385
39	167.16	167.011165469917	0.148834530083330
40	167.17	166.952255028373	0.217744971627130
41	166.52	166.775523703741	-0.255523703741447
42	166.35	166.677339634502	-0.327339634501789
43	166.19	166.657702820654	-0.46770282065384
44	166.19	166.854070959133	-0.664070959133187
45	166.19	166.834434145285	-0.644434145285268
46	166.07	166.755886889894	-0.685886889893518
47	166.64	166.59879237911	0.0412076208899576
48	166.26	166.068598405216	0.191401594784218
49	166.44	165.891867080584	0.548132919415653
50	166.27	165.793683011345	0.476316988655339
51	166.27	165.793683011345	0.476316988655339
52	166.3	165.774046197497	0.525953802503285
53	165.97	165.872230266736	0.0977697332635998
54	164.58	165.813319825193	-1.23331982519258
55	164.28	165.774046197497	-1.49404619749672
56	163.93	165.715135755953	-1.78513575595291

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & InterpolationForecast & ResidualsPrediction Error \tabularnewline
1 & 172.69 & 173.314582715104 & -0.62458271510448 \tabularnewline
2 & 172.98 & 172.470199719643 & 0.509800280357203 \tabularnewline
3 & 172.98 & 172.470199719643 & 0.509800280357196 \tabularnewline
4 & 172.89 & 172.430926091947 & 0.459073908053057 \tabularnewline
5 & 173.38 & 172.489836533491 & 0.890163466509256 \tabularnewline
6 & 173.2 & 172.666567858122 & 0.533432141877822 \tabularnewline
7 & 173.24 & 172.666567858122 & 0.573432141877842 \tabularnewline
8 & 172.86 & 173.275309087408 & -0.415309087408165 \tabularnewline
9 & 172.86 & 173.275309087408 & -0.415309087408165 \tabularnewline
10 & 172.74 & 173.177125018168 & -0.437125018168468 \tabularnewline
11 & 172.28 & 172.666567858122 & -0.386567858122166 \tabularnewline
12 & 171.05 & 171.802548048813 & -0.752548048812989 \tabularnewline
13 & 171.07 & 171.606179910334 & -0.536179910333634 \tabularnewline
14 & 171.07 & 171.370538144158 & -0.300538144158397 \tabularnewline
15 & 171.07 & 171.331264516463 & -0.261264516462533 \tabularnewline
16 & 171.11 & 171.252717261071 & -0.142717261070785 \tabularnewline
17 & 170.72 & 171.017075494896 & -0.297075494895562 \tabularnewline
18 & 170.49 & 170.958165053352 & -0.468165053351743 \tabularnewline
19 & 170.48 & 170.565428776393 & -0.0854287763930428 \tabularnewline
20 & 170.48 & 171.193806819527 & -0.713806819527 \tabularnewline
21 & 170.48 & 171.193806819527 & -0.713806819527 \tabularnewline
22 & 170.57 & 171.017075494896 & -0.447075494895568 \tabularnewline
23 & 170.39 & 170.565428776393 & -0.175428776393046 \tabularnewline
24 & 170.04 & 169.603224897844 & 0.436775102155799 \tabularnewline
25 & 169.67 & 168.974846854710 & 0.695153145289751 \tabularnewline
26 & 169.57 & 168.915936413166 & 0.654063586833566 \tabularnewline
27 & 169.57 & 168.896299599318 & 0.673700400681484 \tabularnewline
28 & 169.53 & 168.739205088535 & 0.790794911464974 \tabularnewline
29 & 169.24 & 168.444652880816 & 0.795347119184029 \tabularnewline
30 & 169.29 & 168.287558370032 & 1.00244162996749 \tabularnewline
31 & 169.21 & 167.914458906922 & 1.29554109307828 \tabularnewline
32 & 168.58 & 168.110827045401 & 0.469172954598943 \tabularnewline
33 & 168.58 & 168.091190231553 & 0.48880976844686 \tabularnewline
34 & 168.55 & 168.012642976161 & 0.537357023838615 \tabularnewline
35 & 168.46 & 167.83591165153 & 0.624088348470039 \tabularnewline
36 & 167.39 & 167.423538560723 & -0.0335385607233452 \tabularnewline
37 & 167.16 & 167.187896794548 & -0.0278967945480979 \tabularnewline
38 & 167.16 & 167.030802283765 & 0.129197716235385 \tabularnewline
39 & 167.16 & 167.011165469917 & 0.148834530083330 \tabularnewline
40 & 167.17 & 166.952255028373 & 0.217744971627130 \tabularnewline
41 & 166.52 & 166.775523703741 & -0.255523703741447 \tabularnewline
42 & 166.35 & 166.677339634502 & -0.327339634501789 \tabularnewline
43 & 166.19 & 166.657702820654 & -0.46770282065384 \tabularnewline
44 & 166.19 & 166.854070959133 & -0.664070959133187 \tabularnewline
45 & 166.19 & 166.834434145285 & -0.644434145285268 \tabularnewline
46 & 166.07 & 166.755886889894 & -0.685886889893518 \tabularnewline
47 & 166.64 & 166.59879237911 & 0.0412076208899576 \tabularnewline
48 & 166.26 & 166.068598405216 & 0.191401594784218 \tabularnewline
49 & 166.44 & 165.891867080584 & 0.548132919415653 \tabularnewline
50 & 166.27 & 165.793683011345 & 0.476316988655339 \tabularnewline
51 & 166.27 & 165.793683011345 & 0.476316988655339 \tabularnewline
52 & 166.3 & 165.774046197497 & 0.525953802503285 \tabularnewline
53 & 165.97 & 165.872230266736 & 0.0977697332635998 \tabularnewline
54 & 164.58 & 165.813319825193 & -1.23331982519258 \tabularnewline
55 & 164.28 & 165.774046197497 & -1.49404619749672 \tabularnewline
56 & 163.93 & 165.715135755953 & -1.78513575595291 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=114675&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]InterpolationForecast[/C][C]ResidualsPrediction Error[/C][/ROW]
[ROW][C]1[/C][C]172.69[/C][C]173.314582715104[/C][C]-0.62458271510448[/C][/ROW]
[ROW][C]2[/C][C]172.98[/C][C]172.470199719643[/C][C]0.509800280357203[/C][/ROW]
[ROW][C]3[/C][C]172.98[/C][C]172.470199719643[/C][C]0.509800280357196[/C][/ROW]
[ROW][C]4[/C][C]172.89[/C][C]172.430926091947[/C][C]0.459073908053057[/C][/ROW]
[ROW][C]5[/C][C]173.38[/C][C]172.489836533491[/C][C]0.890163466509256[/C][/ROW]
[ROW][C]6[/C][C]173.2[/C][C]172.666567858122[/C][C]0.533432141877822[/C][/ROW]
[ROW][C]7[/C][C]173.24[/C][C]172.666567858122[/C][C]0.573432141877842[/C][/ROW]
[ROW][C]8[/C][C]172.86[/C][C]173.275309087408[/C][C]-0.415309087408165[/C][/ROW]
[ROW][C]9[/C][C]172.86[/C][C]173.275309087408[/C][C]-0.415309087408165[/C][/ROW]
[ROW][C]10[/C][C]172.74[/C][C]173.177125018168[/C][C]-0.437125018168468[/C][/ROW]
[ROW][C]11[/C][C]172.28[/C][C]172.666567858122[/C][C]-0.386567858122166[/C][/ROW]
[ROW][C]12[/C][C]171.05[/C][C]171.802548048813[/C][C]-0.752548048812989[/C][/ROW]
[ROW][C]13[/C][C]171.07[/C][C]171.606179910334[/C][C]-0.536179910333634[/C][/ROW]
[ROW][C]14[/C][C]171.07[/C][C]171.370538144158[/C][C]-0.300538144158397[/C][/ROW]
[ROW][C]15[/C][C]171.07[/C][C]171.331264516463[/C][C]-0.261264516462533[/C][/ROW]
[ROW][C]16[/C][C]171.11[/C][C]171.252717261071[/C][C]-0.142717261070785[/C][/ROW]
[ROW][C]17[/C][C]170.72[/C][C]171.017075494896[/C][C]-0.297075494895562[/C][/ROW]
[ROW][C]18[/C][C]170.49[/C][C]170.958165053352[/C][C]-0.468165053351743[/C][/ROW]
[ROW][C]19[/C][C]170.48[/C][C]170.565428776393[/C][C]-0.0854287763930428[/C][/ROW]
[ROW][C]20[/C][C]170.48[/C][C]171.193806819527[/C][C]-0.713806819527[/C][/ROW]
[ROW][C]21[/C][C]170.48[/C][C]171.193806819527[/C][C]-0.713806819527[/C][/ROW]
[ROW][C]22[/C][C]170.57[/C][C]171.017075494896[/C][C]-0.447075494895568[/C][/ROW]
[ROW][C]23[/C][C]170.39[/C][C]170.565428776393[/C][C]-0.175428776393046[/C][/ROW]
[ROW][C]24[/C][C]170.04[/C][C]169.603224897844[/C][C]0.436775102155799[/C][/ROW]
[ROW][C]25[/C][C]169.67[/C][C]168.974846854710[/C][C]0.695153145289751[/C][/ROW]
[ROW][C]26[/C][C]169.57[/C][C]168.915936413166[/C][C]0.654063586833566[/C][/ROW]
[ROW][C]27[/C][C]169.57[/C][C]168.896299599318[/C][C]0.673700400681484[/C][/ROW]
[ROW][C]28[/C][C]169.53[/C][C]168.739205088535[/C][C]0.790794911464974[/C][/ROW]
[ROW][C]29[/C][C]169.24[/C][C]168.444652880816[/C][C]0.795347119184029[/C][/ROW]
[ROW][C]30[/C][C]169.29[/C][C]168.287558370032[/C][C]1.00244162996749[/C][/ROW]
[ROW][C]31[/C][C]169.21[/C][C]167.914458906922[/C][C]1.29554109307828[/C][/ROW]
[ROW][C]32[/C][C]168.58[/C][C]168.110827045401[/C][C]0.469172954598943[/C][/ROW]
[ROW][C]33[/C][C]168.58[/C][C]168.091190231553[/C][C]0.48880976844686[/C][/ROW]
[ROW][C]34[/C][C]168.55[/C][C]168.012642976161[/C][C]0.537357023838615[/C][/ROW]
[ROW][C]35[/C][C]168.46[/C][C]167.83591165153[/C][C]0.624088348470039[/C][/ROW]
[ROW][C]36[/C][C]167.39[/C][C]167.423538560723[/C][C]-0.0335385607233452[/C][/ROW]
[ROW][C]37[/C][C]167.16[/C][C]167.187896794548[/C][C]-0.0278967945480979[/C][/ROW]
[ROW][C]38[/C][C]167.16[/C][C]167.030802283765[/C][C]0.129197716235385[/C][/ROW]
[ROW][C]39[/C][C]167.16[/C][C]167.011165469917[/C][C]0.148834530083330[/C][/ROW]
[ROW][C]40[/C][C]167.17[/C][C]166.952255028373[/C][C]0.217744971627130[/C][/ROW]
[ROW][C]41[/C][C]166.52[/C][C]166.775523703741[/C][C]-0.255523703741447[/C][/ROW]
[ROW][C]42[/C][C]166.35[/C][C]166.677339634502[/C][C]-0.327339634501789[/C][/ROW]
[ROW][C]43[/C][C]166.19[/C][C]166.657702820654[/C][C]-0.46770282065384[/C][/ROW]
[ROW][C]44[/C][C]166.19[/C][C]166.854070959133[/C][C]-0.664070959133187[/C][/ROW]
[ROW][C]45[/C][C]166.19[/C][C]166.834434145285[/C][C]-0.644434145285268[/C][/ROW]
[ROW][C]46[/C][C]166.07[/C][C]166.755886889894[/C][C]-0.685886889893518[/C][/ROW]
[ROW][C]47[/C][C]166.64[/C][C]166.59879237911[/C][C]0.0412076208899576[/C][/ROW]
[ROW][C]48[/C][C]166.26[/C][C]166.068598405216[/C][C]0.191401594784218[/C][/ROW]
[ROW][C]49[/C][C]166.44[/C][C]165.891867080584[/C][C]0.548132919415653[/C][/ROW]
[ROW][C]50[/C][C]166.27[/C][C]165.793683011345[/C][C]0.476316988655339[/C][/ROW]
[ROW][C]51[/C][C]166.27[/C][C]165.793683011345[/C][C]0.476316988655339[/C][/ROW]
[ROW][C]52[/C][C]166.3[/C][C]165.774046197497[/C][C]0.525953802503285[/C][/ROW]
[ROW][C]53[/C][C]165.97[/C][C]165.872230266736[/C][C]0.0977697332635998[/C][/ROW]
[ROW][C]54[/C][C]164.58[/C][C]165.813319825193[/C][C]-1.23331982519258[/C][/ROW]
[ROW][C]55[/C][C]164.28[/C][C]165.774046197497[/C][C]-1.49404619749672[/C][/ROW]
[ROW][C]56[/C][C]163.93[/C][C]165.715135755953[/C][C]-1.78513575595291[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=114675&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=114675&T=4
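The Interpolation (Forecast) column is the in-sample fitted value and the Residuals column the corresponding prediction error; a sketch, assuming the same fit and dat objects as before:
interp <- fitted(fit)      # interpolation / in-sample forecast
err <- residuals(fit)      # residuals / prediction errors
head(data.frame(Index = seq_along(err), Actuals = dat$Gemconsprijsblazers, Interpolation = interp, Residuals = err))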








Goldfeld-Quandt test for Heteroskedasticity
p-values	Alternative Hypothesis
breakpoint index	greater	2-sided	less
5	0.0555280254535018	0.111056050907004	0.944471974546498
6	0.0256417491195943	0.0512834982391887	0.974358250880406
7	0.0126649377231174	0.0253298754462349	0.987335062276883
8	0.00382055576684765	0.0076411115336953	0.996179444233152
9	0.00106029476800363	0.00212058953600726	0.998939705231996
10	0.000338240559150095	0.000676481118300191	0.99966175944085
11	0.00666339201267776	0.0133267840253555	0.993336607987322
12	0.335351702412128	0.670703404824256	0.664648297587872
13	0.382730134741703	0.765460269483407	0.617269865258297
14	0.317763426407296	0.635526852814592	0.682236573592704
15	0.247402972213249	0.494805944426497	0.752597027786751
16	0.180941393196704	0.361882786393409	0.819058606803296
17	0.133521808572217	0.267043617144434	0.866478191427783
18	0.106070812717342	0.212141625434684	0.893929187282658
19	0.073794329862143	0.147588659724286	0.926205670137857
20	0.0838875100998753	0.167775020199751	0.916112489900125
21	0.110463380152671	0.220926760305343	0.889536619847328
22	0.134967324208931	0.269934648417862	0.865032675791069
23	0.175216787810136	0.350433575620273	0.824783212189864
24	0.22445690320819	0.44891380641638	0.77554309679181
25	0.253834977779599	0.507669955559198	0.746165022220401
26	0.23547769277782	0.47095538555564	0.76452230722218
27	0.204917301093638	0.409834602187277	0.795082698906362
28	0.174177365602480	0.348354731204960	0.82582263439752
29	0.138606954712146	0.277213909424292	0.861393045287854
30	0.121832643899561	0.243665287799122	0.87816735610044
31	0.157097810543333	0.314195621086666	0.842902189456667
32	0.117377244897597	0.234754489795195	0.882622755102403
33	0.0852758354652797	0.170551670930559	0.91472416453472
34	0.0618642808526044	0.123728561705209	0.938135719147396
35	0.0497271409344455	0.0994542818688911	0.950272859065554
36	0.0465194933591641	0.0930389867183282	0.953480506640836
37	0.0407365339555444	0.0814730679110888	0.959263466044456
38	0.0320681775451807	0.0641363550903613	0.96793182245482
39	0.0249059357241108	0.0498118714482216	0.97509406427589
40	0.0204113069114104	0.0408226138228209	0.97958869308859
41	0.017670683998665	0.03534136799733	0.982329316001335
42	0.0147000831866726	0.0294001663733452	0.985299916813327
43	0.0124852300359756	0.0249704600719513	0.987514769964024
44	0.0118211542797414	0.0236423085594829	0.988178845720258
45	0.0104141331222867	0.0208282662445735	0.989585866877713
46	0.0127932510071743	0.0255865020143486	0.987206748992826
47	0.014721713128344	0.029443426256688	0.985278286871656
48	0.0198125215394093	0.0396250430788186	0.98018747846059
49	0.00991879744339177	0.0198375948867835	0.990081202556608
50	0.0117479239808080	0.0234958479616159	0.988252076019192
51	0.0228626589996605	0.045725317999321	0.97713734100034

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 & 0.0555280254535018 & 0.111056050907004 & 0.944471974546498 \tabularnewline
6 & 0.0256417491195943 & 0.0512834982391887 & 0.974358250880406 \tabularnewline
7 & 0.0126649377231174 & 0.0253298754462349 & 0.987335062276883 \tabularnewline
8 & 0.00382055576684765 & 0.0076411115336953 & 0.996179444233152 \tabularnewline
9 & 0.00106029476800363 & 0.00212058953600726 & 0.998939705231996 \tabularnewline
10 & 0.000338240559150095 & 0.000676481118300191 & 0.99966175944085 \tabularnewline
11 & 0.00666339201267776 & 0.0133267840253555 & 0.993336607987322 \tabularnewline
12 & 0.335351702412128 & 0.670703404824256 & 0.664648297587872 \tabularnewline
13 & 0.382730134741703 & 0.765460269483407 & 0.617269865258297 \tabularnewline
14 & 0.317763426407296 & 0.635526852814592 & 0.682236573592704 \tabularnewline
15 & 0.247402972213249 & 0.494805944426497 & 0.752597027786751 \tabularnewline
16 & 0.180941393196704 & 0.361882786393409 & 0.819058606803296 \tabularnewline
17 & 0.133521808572217 & 0.267043617144434 & 0.866478191427783 \tabularnewline
18 & 0.106070812717342 & 0.212141625434684 & 0.893929187282658 \tabularnewline
19 & 0.073794329862143 & 0.147588659724286 & 0.926205670137857 \tabularnewline
20 & 0.0838875100998753 & 0.167775020199751 & 0.916112489900125 \tabularnewline
21 & 0.110463380152671 & 0.220926760305343 & 0.889536619847328 \tabularnewline
22 & 0.134967324208931 & 0.269934648417862 & 0.865032675791069 \tabularnewline
23 & 0.175216787810136 & 0.350433575620273 & 0.824783212189864 \tabularnewline
24 & 0.22445690320819 & 0.44891380641638 & 0.77554309679181 \tabularnewline
25 & 0.253834977779599 & 0.507669955559198 & 0.746165022220401 \tabularnewline
26 & 0.23547769277782 & 0.47095538555564 & 0.76452230722218 \tabularnewline
27 & 0.204917301093638 & 0.409834602187277 & 0.795082698906362 \tabularnewline
28 & 0.174177365602480 & 0.348354731204960 & 0.82582263439752 \tabularnewline
29 & 0.138606954712146 & 0.277213909424292 & 0.861393045287854 \tabularnewline
30 & 0.121832643899561 & 0.243665287799122 & 0.87816735610044 \tabularnewline
31 & 0.157097810543333 & 0.314195621086666 & 0.842902189456667 \tabularnewline
32 & 0.117377244897597 & 0.234754489795195 & 0.882622755102403 \tabularnewline
33 & 0.0852758354652797 & 0.170551670930559 & 0.91472416453472 \tabularnewline
34 & 0.0618642808526044 & 0.123728561705209 & 0.938135719147396 \tabularnewline
35 & 0.0497271409344455 & 0.0994542818688911 & 0.950272859065554 \tabularnewline
36 & 0.0465194933591641 & 0.0930389867183282 & 0.953480506640836 \tabularnewline
37 & 0.0407365339555444 & 0.0814730679110888 & 0.959263466044456 \tabularnewline
38 & 0.0320681775451807 & 0.0641363550903613 & 0.96793182245482 \tabularnewline
39 & 0.0249059357241108 & 0.0498118714482216 & 0.97509406427589 \tabularnewline
40 & 0.0204113069114104 & 0.0408226138228209 & 0.97958869308859 \tabularnewline
41 & 0.017670683998665 & 0.03534136799733 & 0.982329316001335 \tabularnewline
42 & 0.0147000831866726 & 0.0294001663733452 & 0.985299916813327 \tabularnewline
43 & 0.0124852300359756 & 0.0249704600719513 & 0.987514769964024 \tabularnewline
44 & 0.0118211542797414 & 0.0236423085594829 & 0.988178845720258 \tabularnewline
45 & 0.0104141331222867 & 0.0208282662445735 & 0.989585866877713 \tabularnewline
46 & 0.0127932510071743 & 0.0255865020143486 & 0.987206748992826 \tabularnewline
47 & 0.014721713128344 & 0.029443426256688 & 0.985278286871656 \tabularnewline
48 & 0.0198125215394093 & 0.0396250430788186 & 0.98018747846059 \tabularnewline
49 & 0.00991879744339177 & 0.0198375948867835 & 0.990081202556608 \tabularnewline
50 & 0.0117479239808080 & 0.0234958479616159 & 0.988252076019192 \tabularnewline
51 & 0.0228626589996605 & 0.045725317999321 & 0.97713734100034 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=114675&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]5[/C][C]0.0555280254535018[/C][C]0.111056050907004[/C][C]0.944471974546498[/C][/ROW]
[ROW][C]6[/C][C]0.0256417491195943[/C][C]0.0512834982391887[/C][C]0.974358250880406[/C][/ROW]
[ROW][C]7[/C][C]0.0126649377231174[/C][C]0.0253298754462349[/C][C]0.987335062276883[/C][/ROW]
[ROW][C]8[/C][C]0.00382055576684765[/C][C]0.0076411115336953[/C][C]0.996179444233152[/C][/ROW]
[ROW][C]9[/C][C]0.00106029476800363[/C][C]0.00212058953600726[/C][C]0.998939705231996[/C][/ROW]
[ROW][C]10[/C][C]0.000338240559150095[/C][C]0.000676481118300191[/C][C]0.99966175944085[/C][/ROW]
[ROW][C]11[/C][C]0.00666339201267776[/C][C]0.0133267840253555[/C][C]0.993336607987322[/C][/ROW]
[ROW][C]12[/C][C]0.335351702412128[/C][C]0.670703404824256[/C][C]0.664648297587872[/C][/ROW]
[ROW][C]13[/C][C]0.382730134741703[/C][C]0.765460269483407[/C][C]0.617269865258297[/C][/ROW]
[ROW][C]14[/C][C]0.317763426407296[/C][C]0.635526852814592[/C][C]0.682236573592704[/C][/ROW]
[ROW][C]15[/C][C]0.247402972213249[/C][C]0.494805944426497[/C][C]0.752597027786751[/C][/ROW]
[ROW][C]16[/C][C]0.180941393196704[/C][C]0.361882786393409[/C][C]0.819058606803296[/C][/ROW]
[ROW][C]17[/C][C]0.133521808572217[/C][C]0.267043617144434[/C][C]0.866478191427783[/C][/ROW]
[ROW][C]18[/C][C]0.106070812717342[/C][C]0.212141625434684[/C][C]0.893929187282658[/C][/ROW]
[ROW][C]19[/C][C]0.073794329862143[/C][C]0.147588659724286[/C][C]0.926205670137857[/C][/ROW]
[ROW][C]20[/C][C]0.0838875100998753[/C][C]0.167775020199751[/C][C]0.916112489900125[/C][/ROW]
[ROW][C]21[/C][C]0.110463380152671[/C][C]0.220926760305343[/C][C]0.889536619847328[/C][/ROW]
[ROW][C]22[/C][C]0.134967324208931[/C][C]0.269934648417862[/C][C]0.865032675791069[/C][/ROW]
[ROW][C]23[/C][C]0.175216787810136[/C][C]0.350433575620273[/C][C]0.824783212189864[/C][/ROW]
[ROW][C]24[/C][C]0.22445690320819[/C][C]0.44891380641638[/C][C]0.77554309679181[/C][/ROW]
[ROW][C]25[/C][C]0.253834977779599[/C][C]0.507669955559198[/C][C]0.746165022220401[/C][/ROW]
[ROW][C]26[/C][C]0.23547769277782[/C][C]0.47095538555564[/C][C]0.76452230722218[/C][/ROW]
[ROW][C]27[/C][C]0.204917301093638[/C][C]0.409834602187277[/C][C]0.795082698906362[/C][/ROW]
[ROW][C]28[/C][C]0.174177365602480[/C][C]0.348354731204960[/C][C]0.82582263439752[/C][/ROW]
[ROW][C]29[/C][C]0.138606954712146[/C][C]0.277213909424292[/C][C]0.861393045287854[/C][/ROW]
[ROW][C]30[/C][C]0.121832643899561[/C][C]0.243665287799122[/C][C]0.87816735610044[/C][/ROW]
[ROW][C]31[/C][C]0.157097810543333[/C][C]0.314195621086666[/C][C]0.842902189456667[/C][/ROW]
[ROW][C]32[/C][C]0.117377244897597[/C][C]0.234754489795195[/C][C]0.882622755102403[/C][/ROW]
[ROW][C]33[/C][C]0.0852758354652797[/C][C]0.170551670930559[/C][C]0.91472416453472[/C][/ROW]
[ROW][C]34[/C][C]0.0618642808526044[/C][C]0.123728561705209[/C][C]0.938135719147396[/C][/ROW]
[ROW][C]35[/C][C]0.0497271409344455[/C][C]0.0994542818688911[/C][C]0.950272859065554[/C][/ROW]
[ROW][C]36[/C][C]0.0465194933591641[/C][C]0.0930389867183282[/C][C]0.953480506640836[/C][/ROW]
[ROW][C]37[/C][C]0.0407365339555444[/C][C]0.0814730679110888[/C][C]0.959263466044456[/C][/ROW]
[ROW][C]38[/C][C]0.0320681775451807[/C][C]0.0641363550903613[/C][C]0.96793182245482[/C][/ROW]
[ROW][C]39[/C][C]0.0249059357241108[/C][C]0.0498118714482216[/C][C]0.97509406427589[/C][/ROW]
[ROW][C]40[/C][C]0.0204113069114104[/C][C]0.0408226138228209[/C][C]0.97958869308859[/C][/ROW]
[ROW][C]41[/C][C]0.017670683998665[/C][C]0.03534136799733[/C][C]0.982329316001335[/C][/ROW]
[ROW][C]42[/C][C]0.0147000831866726[/C][C]0.0294001663733452[/C][C]0.985299916813327[/C][/ROW]
[ROW][C]43[/C][C]0.0124852300359756[/C][C]0.0249704600719513[/C][C]0.987514769964024[/C][/ROW]
[ROW][C]44[/C][C]0.0118211542797414[/C][C]0.0236423085594829[/C][C]0.988178845720258[/C][/ROW]
[ROW][C]45[/C][C]0.0104141331222867[/C][C]0.0208282662445735[/C][C]0.989585866877713[/C][/ROW]
[ROW][C]46[/C][C]0.0127932510071743[/C][C]0.0255865020143486[/C][C]0.987206748992826[/C][/ROW]
[ROW][C]47[/C][C]0.014721713128344[/C][C]0.029443426256688[/C][C]0.985278286871656[/C][/ROW]
[ROW][C]48[/C][C]0.0198125215394093[/C][C]0.0396250430788186[/C][C]0.98018747846059[/C][/ROW]
[ROW][C]49[/C][C]0.00991879744339177[/C][C]0.0198375948867835[/C][C]0.990081202556608[/C][/ROW]
[ROW][C]50[/C][C]0.0117479239808080[/C][C]0.0234958479616159[/C][C]0.988252076019192[/C][/ROW]
[ROW][C]51[/C][C]0.0228626589996605[/C][C]0.045725317999321[/C][C]0.97713734100034[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=114675&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=114675&T=5
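Each row of this table is one Goldfeld-Quandt test of the same regression at a different breakpoint; a sketch of a single such test, assuming the fit object from the earlier sketches:
library(lmtest)
# two-sided test with the breakpoint after observation 28; compare the
# result with the '2-sided' entry in row 28 of the table above (about 0.348)
gqtest(fit, point = 28, alternative = "two.sided")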








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	3	0.0638297872340425	NOK
5% type I error level	18	0.382978723404255	NOK
10% type I error level	23	0.489361702127660	NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 3 & 0.0638297872340425 & NOK \tabularnewline
5% type I error level & 18 & 0.382978723404255 & NOK \tabularnewline
10% type I error level & 23 & 0.489361702127660 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=114675&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]3[/C][C]0.0638297872340425[/C][C]NOK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]18[/C][C]0.382978723404255[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]23[/C][C]0.489361702127660[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=114675&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=114675&T=6
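The meta-analysis counts how many of the two-sided p-values above fall below each type I error level and flags the proportion; a small self-contained sketch (the helper name gq_meta is ours, and gqarr is the p-value matrix produced by the module code below):
gq_meta <- function(pvals, levels = c(0.01, 0.05, 0.10)) {
  counts <- sapply(levels, function(a) sum(pvals < a))
  props <- counts / length(pvals)
  data.frame(level = levels, significant = counts, proportion = props,
             verdict = ifelse(props < levels, "OK", "NOK"))
}
# example: gq_meta(gqarr[, 2])   # 47 breakpoints -> 3, 18 and 23 significant tests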




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
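The module code below assumes that the session parameters and the data matrix y already exist in the workspace; a sketch of a plausible setup (the layout of y, with variables in rows, is inferred from the transpose x <- t(y) in the code, and dat is the data frame from the first sketch):
par1 <- '1'                                   # column index of the dependent variable
par2 <- 'Do not include Seasonal Dummies'
par3 <- 'No Linear Trend'
y <- rbind(Gemconsprijsblazers = dat$Gemconsprijsblazers,
           consumptieindexkleding = dat$consumptieindexkleding)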
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}