Free Statistics


Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 30 Nov 2017 15:46:51 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2017/Dec/01/t1512138293wa9gyfkc6okspz0.htm/, Retrieved Wed, 15 May 2024 15:30:05 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=308427, Retrieved Wed, 15 May 2024 15:30:05 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 124
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2017-11-30 14:46:51] [8329b9b38c877eb1bcf8703660df8d0b] [Current]
Dataseries X (first column: Elect.equipment, second column: Energy):
82	97.7
96.5	88.9
104.8	96.5
87.2	89.5
98.6	85.4
98.7	84.3
75	83.7
86.8	86.2
105	90.7
109.8	95.7
108.2	95.6
99	97
89.6	97.2
97.8	86.6
104.8	88.4
87	81.4
87.9	86.9
93.9	84.9
84.3	83.7
84	86.8
104.3	88.3
104.4	92.5
102.3	94.7
89.4	94.5
78.7	98.7
86.9	88.6
93.7	95.2
87	91.3
83.9	91.7
95.3	89.3
73.7	88.7
76.6	91.2
94.7	88.6
97.7	94.6
90	96
82.4	94.3
77.4	102
85	93.4
90.3	96.7
82.1	93.7
79.6	91.6
86.2	89.6
73.4	92.9
66.7	94.1
96.7	92
98.6	97.5
83.2	92.7
84	100.7
75.8	105.9
83.2	95.3
95.7	99.8
87.3	91.3
83.8	90.8
98.7	87.1
80.8	91.4
74.2	86.1
96.1	87.1
99.4	92.6
91.8	96.6
89.7	105.3
82.9	102.4
90	98.2
98.5	98.6
93.4	92.6
89.1	87.9
103	84.1
74.7	86.7
79	84.4
101.3	86
96.7	90.4
99.1	92.9
92.3	105.8
90.6	106
95.2	99.1
107.6	99.9
97.6	88.1
104	87.8
112	87.1
90.6	85.9
84.9	86.5
112.7	84.1
115.2	92.1
110.1	93.3
95.7	98.9
104.2	103
103.3	98.4
116.1	100.7
106.9	92.3
105.9	89
120.2	88.9
96.2	85.5
91.5	90.1
108.3	87
121.1	97.1
111.4	101.5
95.6	103
98.7	106.1
117.7	96.1
124.5	94.2
114.8	89.1
108	85.2
120.7	86.5
95.6	88
84.3	88.4
122.2	87.9
117.1	95.7
97.2	94.8
99.5	105.2
90.1	108.7
87.3	96.1
97.4	98.3
90.1	88.6
83.6	90.8
97.8	88.1
79.7	91.9
75.1	98.5
106.1	98.6
103.5	100.3
94.5	98.7
100.9	110.7
89.7	115.4
91.4	105.4
110.2	108
102.8	94.5
89.8	96.5
112.8	91
84	94.1
86.5	96.4
107.3	93.1
120.2	97.5
105.5	102.5
99.9	105.7
100.4	109.1
99.6	97.2
118.6	100.3
96	91.3
105.3	94.3
105.8	89.5
80.1	89.3
89.3	93.4
120.4	91.9
111.3	92.9
98.1	93.7
102.9	100.1
95.4	105.5
108.7	110.5
123	89.5
107.7	90.4
97.2	89.9
127.7	84.6
100.6	86.2
89.7	83.4
108.3	82.9
110	81.8
105.2	87.6
87.7	94.6
91.4	99.6
92.8	96.7
97.5	99.8
95.7	83.8
93.5	82.4
97.3	86.8
84.1	91
87.8	85.3
96.2	83.6
94.6	94
88.7	100.3
76.5	107.1
83.9	100.7
88.1	95.5
93	92.9
81.8	79.2
84.1	82
89.1	79.3
75.8	81.5
71.4	76
93.8	73.1
88.5	80.4
78.1	82.1
83.6	90.5
78.2	98.1
76.2	89.5
92	86.5
79.5	77
69.5	74.7
86.4	73.4
72.3	72.5
65	69.3
86	75.2
83.4	83.5
87.2	90.5
76.4	92.2
76.3	110.5
76.9	101.8
92.7	107.4
83.3	95.5
73.8	84.5
94	81.1
73.1	86.2
69.8	91.5
86	84.7
78.8	92.2
89.4	99.2
83.8	104.5
74.1	113
77.2	100.4
103.6	101
78	84.8
80.2	86.5
88.8	91.7
72.9	94.8
73.6	95
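For readers who want to rerun the analysis locally, a minimal R sketch for loading the two columns is given below. The file name dataseries_x.txt and the column labels are illustrative assumptions (the labels are inferred from the estimated equation further down), not part of the original computation.

x <- read.table("dataseries_x.txt", sep = "\t",
                col.names = c("Elect.equipment", "Energy"))  # hypothetical file holding the pasted data
str(x)   # two numeric columns, one row per month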




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 9 seconds
R Server: Big Analytics Cloud Computing Center


Source: https://freestatistics.org/blog/index.php?pk=308427&T=0


Multiple Linear Regression - Estimated Regression Equation
Elect.equipment[t] = + 11.0655 + 0.18125Energy[t] + 0.29539`Elect.equipment(t-1)`[t] + 0.102799`Elect.equipment(t-2)`[t] + 0.323177`Elect.equipment(t-3)`[t] + 0.26232`Elect.equipment(t-4)`[t] -0.00638473`Elect.equipment(t-5)`[t] + 0.0708733`Elect.equipment(t-6)`[t] -0.0111728`Elect.equipment(t-7)`[t] -0.127492`Elect.equipment(t-8)`[t] + 0.0453019`Elect.equipment(t-9)`[t] -0.0570579`Elect.equipment(t-10)`[t] + 0.00518098`Elect.equipment(t-1s)`[t] -0.0500016`Elect.equipment(t-2s)`[t] + 0.00466526`Elect.equipment(t-3s)`[t] -0.0536471`Elect.equipment(t-4s)`[t] -1.9766M1[t] -16.6319M2[t] -21.1729M3[t] -10.795M4[t] + 0.818477M5[t] -10.1066M6[t] -9.91948M7[t] + 0.709909M8[t] -25.4871M9[t] -19.9258M10[t] + 3.60962M11[t] + e[t]
Warning: you did not specify the column number of the endogenous series! The first column was selected by default.
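The equation combines the exogenous Energy series, ten ordinary lags and four seasonal (annual) lags of Elect.equipment, and eleven monthly dummies M1-M11. A minimal sketch of how such a design could be set up with base R's lm() follows; the helper lag_k, the assumption that observation 1 falls in calendar month 1, and the exact lag bookkeeping are illustrative and are not guaranteed to reproduce the module's output exactly.

y <- x$Elect.equipment
n <- length(y)
lag_k <- function(v, k) c(rep(NA, k), v[seq_len(n - k)])   # shift a series back by k periods
lags  <- sapply(1:10, function(k) lag_k(y, k));      colnames(lags)  <- paste0("lag",  1:10)
slags <- sapply(1:4,  function(s) lag_k(y, 12 * s)); colnames(slags) <- paste0("slag", 1:4)
d <- data.frame(y = y, Energy = x$Energy, lags, slags,
                month = factor(((seq_len(n) - 1) %% 12) + 1))   # monthly dummies
fit <- lm(y ~ ., data = d)   # rows with missing lags are dropped automatically
summary(fit)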


Source: https://freestatistics.org/blog/index.php?pk=308427&T=1


Multiple Linear Regression - Ordinary Least Squares
Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | +11.07 | 12 | +9.2240e-01 | 0.358 | 0.179
Energy | +0.1812 | 0.08411 | +2.1550e+00 | 0.03307 | 0.01653
`Elect.equipment(t-1)` | +0.2954 | 0.08814 | +3.3510e+00 | 0.001059 | 0.0005296
`Elect.equipment(t-2)` | +0.1028 | 0.09099 | +1.1300e+00 | 0.2607 | 0.1304
`Elect.equipment(t-3)` | +0.3232 | 0.09295 | +3.4770e+00 | 0.0006955 | 0.0003478
`Elect.equipment(t-4)` | +0.2623 | 0.09776 | +2.6830e+00 | 0.008261 | 0.00413
`Elect.equipment(t-5)` | -0.006385 | 0.1008 | -6.3370e-02 | 0.9496 | 0.4748
`Elect.equipment(t-6)` | +0.07087 | 0.1008 | +7.0330e-01 | 0.4832 | 0.2416
`Elect.equipment(t-7)` | -0.01117 | 0.09912 | -1.1270e-01 | 0.9104 | 0.4552
`Elect.equipment(t-8)` | -0.1275 | 0.0986 | -1.2930e+00 | 0.1983 | 0.09916
`Elect.equipment(t-9)` | +0.0453 | 0.1006 | +4.5050e-01 | 0.6531 | 0.3265
`Elect.equipment(t-10)` | -0.05706 | 0.09373 | -6.0870e-01 | 0.5438 | 0.2719
`Elect.equipment(t-1s)` | +0.005181 | 0.08973 | +5.7740e-02 | 0.954 | 0.477
`Elect.equipment(t-2s)` | -0.05 | 0.0614 | -8.1440e-01 | 0.417 | 0.2085
`Elect.equipment(t-3s)` | +0.004665 | 0.06331 | +7.3690e-02 | 0.9414 | 0.4707
`Elect.equipment(t-4s)` | -0.05365 | 0.06167 | -8.7000e-01 | 0.386 | 0.193
M1 | -1.977 | 4.362 | -4.5310e-01 | 0.6513 | 0.3256
M2 | -16.63 | 5.202 | -3.1970e+00 | 0.001753 | 0.0008765
M3 | -21.17 | 4.871 | -4.3470e+00 | 2.807e-05 | 1.403e-05
M4 | -10.79 | 4.916 | -2.1960e+00 | 0.02991 | 0.01496
M5 | +0.8185 | 4.106 | +1.9930e-01 | 0.8423 | 0.4212
M6 | -10.11 | 3.689 | -2.7400e+00 | 0.007037 | 0.003518
M7 | -9.919 | 4.373 | -2.2680e+00 | 0.02501 | 0.0125
M8 | +0.7099 | 4.39 | +1.6170e-01 | 0.8718 | 0.4359
M9 | -25.49 | 4.568 | -5.5790e+00 | 1.393e-07 | 6.966e-08
M10 | -19.93 | 5.482 | -3.6340e+00 | 0.0004032 | 0.0002016
M11 | +3.61 | 4.27 | +8.4530e-01 | 0.3995 | 0.1998
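The columns above map directly onto the coefficient matrix returned by summary.lm(), with the 1-tail p-value being half of the 2-tail value. A small sketch, assuming fit is the hypothetical model object from the earlier snippet:

ct <- coef(summary(fit))                                  # Estimate, Std. Error, t value, Pr(>|t|)
round(cbind(ct, `1-tail p` = ct[, "Pr(>|t|)"] / 2), 6)    # append 1-tail p-values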


Source: https://freestatistics.org/blog/index.php?pk=308427&T=2


Multiple Linear Regression - Regression Statistics
Multiple R | 0.9232
R-squared | 0.8523
Adjusted R-squared | 0.8221
F-TEST (value) | 28.2
F-TEST (DF numerator) | 26
F-TEST (DF denominator) | 127
p-value | 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation | 5.746
Sum Squared Residuals | 4193
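All of these summary statistics can be recovered from a fitted lm object. A minimal sketch under the same assumptions as above (fit being the hypothetical model object):

s <- summary(fit)
c(Multiple.R    = sqrt(s$r.squared),
  R.squared     = s$r.squared,
  Adj.R.squared = s$adj.r.squared,
  F.value       = unname(s$fstatistic["value"]),
  F.p.value     = unname(pf(s$fstatistic["value"], s$fstatistic["numdf"],
                            s$fstatistic["dendf"], lower.tail = FALSE)),
  Residual.SD   = s$sigma,
  SSR           = sum(residuals(fit)^2))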


Source: https://freestatistics.org/blog/index.php?pk=308427&T=3


Menu of Residual Diagnostics
Description | Link
Histogram | Compute
Central Tendency | Compute
QQ Plot | Compute
Kernel Density Plot | Compute
Skewness/Kurtosis Test | Compute
Skewness-Kurtosis Plot | Compute
Harrell-Davis Plot | Compute
Bootstrap Plot -- Central Tendency | Compute
Blocked Bootstrap Plot -- Central Tendency | Compute
(Partial) Autocorrelation Plot | Compute
Spectral Analysis | Compute
Tukey lambda PPCC Plot | Compute
Box-Cox Normality Plot | Compute
Summary Statistics | Compute
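Most of the diagnostics listed above have direct base-R counterparts that can be applied to the residuals of the fitted model. A brief sketch, again assuming fit from the earlier snippet; the Jarque-Bera line assumes the tseries package is installed:

r <- residuals(fit)
hist(r)                      # Histogram
plot(density(r))             # Kernel Density Plot
qqnorm(r); qqline(r)         # QQ Plot
acf(r); pacf(r)              # (Partial) Autocorrelation Plot
spectrum(r)                  # Spectral Analysis
summary(r)                   # Summary Statistics
# tseries::jarque.bera.test(r)   # skewness/kurtosis (normality) test, if tseries is available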


Source: https://freestatistics.org/blog/index.php?pk=308427&T=4


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 91.8 93.19-1.394
2 89.7 86.8 2.902
3 82.9 85.76-2.863
4 90 88.83 1.166
5 98.5 103.5-4.964
6 93.4 92.45 0.9484
7 89.1 89.08 0.02272
8 103 102.2 0.7611
9 74.7 81.09-6.385
10 79 77.16 1.838
11 101.3 102.1-0.7824
12 96.7 99.21-2.508
13 99.1 92.68 6.419
14 92.3 91.38 0.9154
15 90.6 88.05 2.55
16 95.2 93.61 1.59
17 107.6 110.1-2.509
18 97.6 96.35 1.252
19 104 95.38 8.619
20 112 111.6 0.4401
21 90.6 88.17 2.426
22 84.9 89.46-4.56
23 112.7 111.8 0.9066
24 115.2 110.9 4.262
25 110.1 105.3 4.808
26 95.7 100.6-4.878
27 104.2 97.12 7.082
28 103.3 105.5-2.194
29 116.1 116.3-0.1977
30 106.9 106.3 0.588
31 105.9 103.9 1.975
32 120.2 116.5 3.694
33 96.2 96.05 0.1481
34 91.5 95.4-3.897
35 108.3 115.9-7.627
36 121.1 115.3 5.789
37 111.4 110.3 1.147
38 95.6 101.7-6.111
39 98.7 98.3 0.4017
40 117.7 104.3 13.44
41 124.5 117.6 6.945
42 114.8 106.9 7.919
43 108 109.3-1.276
44 120.7 121.6-0.9226
45 95.6 100.3-4.706
46 84.3 97.84-13.54
47 122.2 115.1 7.116
48 117.1 115.8 1.316
49 97.2 105.8-8.584
50 99.5 98.21 1.289
51 90.1 99.37-9.271
52 87.3 94.8-7.503
53 97.4 106.6-9.151
54 90.1 93.69-3.586
55 83.6 84.78-1.179
56 97.8 96.88 0.9224
57 79.7 77.14 2.556
58 75.1 74.86 0.2434
59 106.1 99.6 6.504
60 103.5 101.7 1.843
61 94.5 94.46 0.04353
62 100.9 91.7 9.196
63 89.7 94.71-5.009
64 91.4 93.81-2.408
65 110.2 109.6 0.5557
66 102.8 99.48 3.316
67 89.8 93.57-3.772
68 112.8 106.8 5.974
69 84 89.77-5.77
70 86.5 83.34 3.162
71 107.3 108.5-1.221
72 120.2 107.3 12.94
73 105.5 103.9 1.596
74 99.9 98.46 1.435
75 100.4 98.91 1.494
76 99.6 102.9-3.285
77 118.6 115 3.634
78 96 105.2-9.248
79 105.3 99.36 5.937
80 105.8 113.1-7.297
81 80.1 88.95-8.849
82 89.3 84.36 4.939
83 120.4 110.2 10.24
84 111.3 106.7 4.558
85 98.1 100.9-2.774
86 102.9 99.4 3.499
87 95.4 96.29-0.8899
88 108.7 100.5 8.235
89 123 113.3 9.713
90 107.7 104.4 3.283
91 97.2 101.9-4.682
92 127.7 115.8 11.9
93 100.6 98.03 2.572
94 89.7 91.56-1.857
95 108.3 116.2-7.921
96 110 112.1-2.111
97 105.2 103.1 2.1
98 87.7 96.13-8.425
99 91.4 90.28 1.118
100 92.8 94.51-1.71
101 97.5 106.1-8.6
102 95.7 90.19 5.515
103 93.5 89.7 3.805
104 97.3 101.7-4.39
105 84.1 79.86 4.241
106 87.8 81.38 6.418
107 96.2 101.3-5.11
108 94.6 100.7-6.063
109 88.7 98.09-9.395
110 76.5 86.49-9.99
111 83.9 78.19 5.708
112 88.1 85.35 2.745
113 93 93.94-0.942
114 81.8 81.2 0.6047
115 84.1 83.17 0.9343
116 89.1 92.31-3.21
117 75.8 69.44 6.363
118 71.4 70.43 0.9722
119 93.8 90.34 3.455
120 88.5 90.04-1.541
121 78.1 84.53-6.435
122 83.6 76.83 6.771
123 78.2 76.12 2.075
124 76.2 79.02-2.819
125 92 90.29 1.708
126 79.5 82.09-2.59
127 69.5 74.29-4.794
128 86.4 87.46-1.064
129 72.3 66.39 5.907
130 65 60.63 4.369
131 86 85.34 0.6639
132 83.4 87.81-4.407
133 87.2 80.88 6.315
134 76.4 76.39 0.0131
135 76.3 75.72 0.581
136 76.9 80.61-3.711
137 92.7 94.72-2.024
138 83.3 83.7-0.3981
139 73.8 79.46-5.657
140 94 90.07 3.928
141 73.1 71.19 1.91
142 69.8 70.5-0.7032
143 86 92.23-6.229
144 78.8 92.88-14.08
145 89.4 83.25 6.152
146 83.8 80.42 3.384
147 74.1 77.08-2.977
148 77.2 80.74-3.542
149 103.6 97.77 5.831
150 78 85.6-7.604
151 80.2 80.13 0.06716
152 88.8 99.53-10.73
153 72.9 73.31-0.4133
154 73.6 70.99 2.613
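The Actuals, Interpolation, and Residuals columns correspond to the observed response, the in-sample fitted values, and their difference. A minimal sketch for rebuilding the table from the hypothetical fit object introduced earlier:

tab <- data.frame(Actuals       = fitted(fit) + residuals(fit),   # observed = fitted + residual
                  Interpolation = fitted(fit),
                  Residuals     = residuals(fit))
head(round(tab, 4))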


Source: https://freestatistics.org/blog/index.php?pk=308427&T=5


Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index | greater | 2-sided | less
30 0.239 0.478 0.761
31 0.1587 0.3173 0.8413
32 0.0793 0.1586 0.9207
33 0.07801 0.156 0.922
34 0.04131 0.08263 0.9587
35 0.0901 0.1802 0.9099
36 0.05359 0.1072 0.9464
37 0.02995 0.0599 0.9701
38 0.02191 0.04382 0.9781
39 0.01301 0.02601 0.987
40 0.03866 0.07732 0.9613
41 0.1128 0.2257 0.8872
42 0.1429 0.2858 0.8571
43 0.1251 0.2503 0.8749
44 0.1354 0.2708 0.8646
45 0.1798 0.3596 0.8202
46 0.195 0.39 0.805
47 0.3313 0.6626 0.6687
48 0.3143 0.6286 0.6857
49 0.5193 0.9614 0.4807
50 0.4585 0.9171 0.5415
51 0.4466 0.8932 0.5534
52 0.4644 0.9287 0.5356
53 0.4555 0.9111 0.5445
54 0.4122 0.8245 0.5878
55 0.3528 0.7056 0.6472
56 0.3686 0.7372 0.6314
57 0.414 0.828 0.586
58 0.3841 0.7682 0.6159
59 0.3676 0.7352 0.6324
60 0.3176 0.6352 0.6824
61 0.2656 0.5312 0.7344
62 0.3437 0.6874 0.6563
63 0.3058 0.6115 0.6942
64 0.2561 0.5121 0.7439
65 0.2166 0.4332 0.7834
66 0.1999 0.3999 0.8001
67 0.1646 0.3291 0.8354
68 0.1759 0.3517 0.8241
69 0.1632 0.3264 0.8368
70 0.1422 0.2843 0.8578
71 0.1137 0.2274 0.8863
72 0.2685 0.5369 0.7315
73 0.2323 0.4646 0.7677
74 0.2138 0.4276 0.7862
75 0.1845 0.369 0.8155
76 0.1732 0.3464 0.8268
77 0.1545 0.309 0.8455
78 0.1879 0.3757 0.8121
79 0.1875 0.3749 0.8125
80 0.1872 0.3743 0.8128
81 0.3191 0.6383 0.6809
82 0.3252 0.6503 0.6748
83 0.4659 0.9318 0.5341
84 0.5443 0.9114 0.4557
85 0.4984 0.9969 0.5016
86 0.4652 0.9304 0.5348
87 0.4424 0.8848 0.5576
88 0.5019 0.9961 0.4981
89 0.7054 0.5892 0.2946
90 0.7779 0.4442 0.2221
91 0.7583 0.4834 0.2417
92 0.8826 0.2349 0.1174
93 0.8648 0.2704 0.1352
94 0.8403 0.3194 0.1597
95 0.8247 0.3506 0.1753
96 0.8543 0.2914 0.1457
97 0.868 0.264 0.132
98 0.84 0.32 0.16
99 0.8008 0.3983 0.1992
100 0.7538 0.4925 0.2462
101 0.8527 0.2947 0.1473
102 0.8226 0.3547 0.1774
103 0.8291 0.3418 0.1709
104 0.7865 0.4269 0.2135
105 0.7545 0.491 0.2455
106 0.8258 0.3484 0.1742
107 0.8094 0.3813 0.1906
108 0.8203 0.3593 0.1797
109 0.795 0.4099 0.205
110 0.866 0.268 0.134
111 0.8524 0.2951 0.1476
112 0.8713 0.2574 0.1287
113 0.8227 0.3546 0.1773
114 0.7637 0.4726 0.2363
115 0.7062 0.5875 0.2938
116 0.6388 0.7223 0.3612
117 0.6022 0.7957 0.3978
118 0.5981 0.8037 0.4019
119 0.6621 0.6758 0.3379
120 0.7524 0.4951 0.2476
121 0.7323 0.5354 0.2677
122 0.6279 0.7441 0.3721
123 0.6505 0.6989 0.3495
124 0.6655 0.6691 0.3345
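The table scans a range of candidate breakpoints and reports, for each, the Goldfeld-Quandt p-value under the three alternatives. A sketch of the same idea using lmtest::gqtest() on the hypothetical fit object; exact values depend on how the module orders observations and splits the sample, so this need not reproduce the numbers above:

library(lmtest)                          # assumes the lmtest package is installed
bp <- 30:124                             # breakpoint indices as in the table above
gq <- t(sapply(bp, function(i)
  sapply(c("greater", "two.sided", "less"),
         function(a) gqtest(fit, point = i, alternative = a)$p.value)))
rownames(gq) <- bp
round(head(gq), 4)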


Source: https://freestatistics.org/blog/index.php?pk=308427&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=308427&T=6
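
The breakpoint p-values tabulated above (and the meta analysis that follows) can be reproduced with lmtest::gqtest(). Below is a minimal sketch, assuming mylm is the fitted model from the R code reproduced at the end of this page; the helper names k, n, breaks and gq are illustrative only.

# Sketch: Goldfeld-Quandt p-values for every admissible breakpoint of mylm
library(lmtest)
k <- length(coef(mylm))          # number of estimated parameters (27 in this fit)
n <- nobs(mylm)                  # number of observations used (154 in this fit)
breaks <- (k + 3):(n - k - 3)    # breakpoints 30 ... 124, as in the table above
gq <- t(sapply(breaks, function(bp)
  sapply(c('greater', 'two.sided', 'less'), function(alt)
    gqtest(mylm, point = bp, alternative = alt)$p.value)))
rownames(gq) <- breaks
# Meta analysis: fraction of 2-sided p-values below the 1%, 5% and 10% levels
sapply(c(0.01, 0.05, 0.10), function(a) mean(gq[, 'two.sided'] < a))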

Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     2                     0.0210526             OK
10% type I error level    5                     0.0526316             OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 &  0 & OK \tabularnewline
5% type I error level & 2 & 0.0210526 & OK \tabularnewline
10% type I error level & 5 & 0.0526316 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=308427&T=7

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]2[/C][C]0.0210526[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]5[/C][C]0.0526316[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=308427&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=308427&T=7

Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 4.3509, df1 = 2, df2 = 125, p-value = 0.0149
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.57445, df1 = 52, df2 = 75, p-value = 0.9821
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.6823, df1 = 2, df2 = 125, p-value = 0.07234

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 4.3509, df1 = 2, df2 = 125, p-value = 0.0149
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.57445, df1 = 52, df2 = 75, p-value = 0.9821
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.6823, df1 = 2, df2 = 125, p-value = 0.07234
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=308427&T=8

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 4.3509, df1 = 2, df2 = 125, p-value = 0.0149
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.57445, df1 = 52, df2 = 75, p-value = 0.9821
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.6823, df1 = 2, df2 = 125, p-value = 0.07234
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=308427&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=308427&T=8
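
The three RESET variants reported above come from lmtest::resettest(). A minimal sketch, assuming mylm is the fitted model from the R code at the end of this page; the degrees-of-freedom notes match the output shown:

# Sketch: Ramsey RESET tests for powers 2 and 3
library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')     # df1 = 2: two added powers of the fitted values
resettest(mylm, power = 2:3, type = 'regressor')  # df1 = 52: two added powers of each of the 26 regressors
resettest(mylm, power = 2:3, type = 'princomp')   # df1 = 2: two added powers of the first principal component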

Variance Inflation Factors (Multicollinearity)
> vif
                 Energy  `Elect.equipment(t-1)`  `Elect.equipment(t-2)` 
               2.631594                6.582814                6.896825 
 `Elect.equipment(t-3)`  `Elect.equipment(t-4)`  `Elect.equipment(t-5)` 
               7.299033                8.068077                8.487436 
 `Elect.equipment(t-6)`  `Elect.equipment(t-7)`  `Elect.equipment(t-8)` 
               8.505313                8.152339                7.938733 
 `Elect.equipment(t-9)` `Elect.equipment(t-10)` `Elect.equipment(t-1s)` 
               8.260676                7.268197                6.640223 
`Elect.equipment(t-2s)` `Elect.equipment(t-3s)` `Elect.equipment(t-4s)` 
               3.037417                2.966441                2.670900 
                     M1                      M2                      M3 
               6.860536                9.756861                8.553967 
                     M4                      M5                      M6 
               8.711894                6.078110                4.906072 
                     M7                      M8                      M9 
               6.894749                6.946762                7.523166 
                    M10                     M11 
              10.835742                6.111140 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
                 Energy  `Elect.equipment(t-1)`  `Elect.equipment(t-2)` 
               2.631594                6.582814                6.896825 
 `Elect.equipment(t-3)`  `Elect.equipment(t-4)`  `Elect.equipment(t-5)` 
               7.299033                8.068077                8.487436 
 `Elect.equipment(t-6)`  `Elect.equipment(t-7)`  `Elect.equipment(t-8)` 
               8.505313                8.152339                7.938733 
 `Elect.equipment(t-9)` `Elect.equipment(t-10)` `Elect.equipment(t-1s)` 
               8.260676                7.268197                6.640223 
`Elect.equipment(t-2s)` `Elect.equipment(t-3s)` `Elect.equipment(t-4s)` 
               3.037417                2.966441                2.670900 
                     M1                      M2                      M3 
               6.860536                9.756861                8.553967 
                     M4                      M5                      M6 
               8.711894                6.078110                4.906072 
                     M7                      M8                      M9 
               6.894749                6.946762                7.523166 
                    M10                     M11 
              10.835742                6.111140 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=308427&T=9

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
                 Energy  `Elect.equipment(t-1)`  `Elect.equipment(t-2)` 
               2.631594                6.582814                6.896825 
 `Elect.equipment(t-3)`  `Elect.equipment(t-4)`  `Elect.equipment(t-5)` 
               7.299033                8.068077                8.487436 
 `Elect.equipment(t-6)`  `Elect.equipment(t-7)`  `Elect.equipment(t-8)` 
               8.505313                8.152339                7.938733 
 `Elect.equipment(t-9)` `Elect.equipment(t-10)` `Elect.equipment(t-1s)` 
               8.260676                7.268197                6.640223 
`Elect.equipment(t-2s)` `Elect.equipment(t-3s)` `Elect.equipment(t-4s)` 
               3.037417                2.966441                2.670900 
                     M1                      M2                      M3 
               6.860536                9.756861                8.553967 
                     M4                      M5                      M6 
               8.711894                6.078110                4.906072 
                     M7                      M8                      M9 
               6.894749                6.946762                7.523166 
                    M10                     M11 
              10.835742                6.111140 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=308427&T=9

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=308427&T=9
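
The factors above are produced by car::vif(); VIF_j = 1/(1 - R_j^2), where R_j^2 comes from regressing regressor j on all other regressors. A minimal sketch, assuming mylm is the fitted model from the R code below; the cutoff of 10 is a common rule of thumb, not part of the module:

# Sketch: variance inflation factors and a rule-of-thumb screen
library(car)
v <- vif(mylm)                    # named vector, one VIF per regressor
sort(v, decreasing = TRUE)[1:3]   # largest values first (M10 = 10.84 leads in this output)
v[v > 10]                         # regressors exceeding the conventional threshold of 10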

Parameters (Session):
Parameters (R input):
par1 = ; par2 = Include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 10 ; par5 = 4 ; par6 = 12 ;
R code (references can be found in the software module):
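# Parameter meaning, as used below: par1 = column of the endogenous series (defaults to 1),
# par2 = seasonal dummy option, par3 = trend/differencing option, par4 = number of
# non-seasonal lags, par5 = number of seasonal lags, par6 = seasonal period s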
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par6 <- as.numeric(par6)
if(is.na(par6)) {
par6 <- 12
mywarning = 'Warning: you did not specify the seasonality. The seasonal period was set to s = 12.'
}
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (!is.numeric(par4)) par4 <- 0
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
if (!is.numeric(par5)) par5 <- 0
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
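# Optional differencing of all series, depending on par3: (1-B), (1-Bs), or both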
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s)'){
(n <- n - par6)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-Bs)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+par6,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - par6)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-Bs)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+par6,j] - x[i,j]
}
}
x <- x2
}
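# Append par4 non-seasonal lags of the endogenous series as regressors (par4 = 10 in this run)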
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
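# Append par5 seasonal lags (lags at multiples of the period par6) of the endogenous series (par5 = 4, par6 = 12 here)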
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*par6,par5), dimnames=list(1:(n-par5*par6), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*par6)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*par6-j*par6,par1]
}
}
x <- cbind(x[(par5*par6+1):n,], x2)
n <- n - par5*par6
}
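# Add seasonal dummies M1..M(par6-1), or monthly/quarterly dummies, depending on par2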
if (par2 == 'Include Seasonal Dummies'){
x2 <- array(0, dim=c(n,par6-1), dimnames=list(1:n, paste('M', seq(1:(par6-1)), sep ='')))
for (i in 1:(par6-1)){
x2[seq(i,n,par6),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
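# Goldfeld-Quandt test at every admissible breakpoint; count 2-sided p-values below 1%, 5% and 10%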
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
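# Diagnostic plots: actuals vs. interpolation, residuals, studentized-residual histogram,
# residual density, QQ plot, residual lag plot, ACF/PACF, lm diagnostics, and GQ p-values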
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
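# Build the output tables shown above: estimated equation, OLS estimates, regression and
# residual statistics, residual-diagnostics menu, actuals/interpolation/residuals, and the
# Goldfeld-Quandt tables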
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
a <-table.start()
a <- table.row.start(a)
a <- table.element(a,'Menu of Residual Diagnostics',2,TRUE)
a <- table.row.end(a)
a <- table.row.start(a)
a <- table.element(a,'Description',1,TRUE)
a <- table.element(a,'Link',1,TRUE)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Histogram',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_histogram.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_centraltendency.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'QQ Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_fitdistrnorm.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Kernel Density Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_density.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Skewness/Kurtosis Test',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_skewness_kurtosis.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Skewness-Kurtosis Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_skewness_kurtosis_plot.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Harrell-Davis Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_harrell_davis.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Bootstrap Plot -- Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_bootstrapplot1.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Blocked Bootstrap Plot -- Central Tendency',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_bootstrapplot.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'(Partial) Autocorrelation Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_autocorrelation.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Spectral Analysis',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_spectrum.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Tukey lambda PPCC Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_tukeylambda.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <-table.element(a,'Box-Cox Normality Plot',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_boxcoxnorm.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a <- table.row.start(a)
a <- table.element(a,'Summary Statistics',1,header=TRUE)
a <- table.element(a,hyperlink( paste('https://supernova.wessa.net/rwasp_summary1.wasp?convertgetintopost=1&data=',paste(as.character(mysum$resid),sep='',collapse=' '),sep='') ,'Compute','Click here to examine the Residuals.'),1)
a <- table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable7.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
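# Ramsey RESET tests: powers 2 and 3 of fitted values, regressors, and principal components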
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
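# Variance inflation factors of the regressors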
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')