Free Statistics

of Irreproducible Research!

Author's title: 
Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 12 Dec 2007 13:04:52 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/12/t1197489011k4o4o1prb18n84d.htm/, Retrieved Thu, 02 May 2024 21:15:24 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=3268, Retrieved Thu, 02 May 2024 21:15:24 +0000
Original text written by user: 
IsPrivate? No (this computation is public)
User-defined keywords: s0650062
Estimated Impact: 187
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [inflation paper] [2007-12-12 20:04:52] [85ebbca709d200023cfec93009cd575f] [Current]
Dataseries X:
1.3	0
1.2	0
1.6	0
1.7	0
1.5	0
0.9	0
1.5	0
1.4	0
1.6	0
1.7	0
1.4	0
1.8	0
1.7	0
1.4	0
1.2	0
1.0	0
1.7	0
2.4	0
2.0	0
2.1	0
2.0	0
1.8	0
2.7	0
2.3	0
1.9	0
2.0	0
2.3	0
2.8	0
2.4	0
2.3	0
2.7	0
2.7	0
2.9	0
3.0	0
2.2	0
2.3	0
2.8	0
2.8	0
2.8	0
2.2	0
2.6	0
2.8	0
2.5	0
2.4	0
2.3	0
1.9	0
1.7	0
2.0	0
2.1	0
1.7	0
1.8	0
1.8	0
1.8	0
1.3	0
1.3	0
1.3	0
1.2	0
1.4	0
2.2	0
2.9	1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3268&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3268&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3268&T=0


Multiple Linear Regression - Estimated Regression Equation
y[t] = + 1.85434782608696 + 0.554347826086957x[t] -0.0990579710144927M1[t] -0.247246376811593M2[t] -0.135434782608695M3[t] -0.183623188405796M4[t] -0.091811594202898M5[t] -0.159999999999999M6[t] -0.108188405797101M7[t] -0.136376811594202M8[t] -0.124565217391304M9[t] -0.172753623188405M10[t] -0.100942028985506M11[t] + 0.00818840579710145t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  1.85434782608696 +  0.554347826086957x[t] -0.0990579710144927M1[t] -0.247246376811593M2[t] -0.135434782608695M3[t] -0.183623188405796M4[t] -0.091811594202898M5[t] -0.159999999999999M6[t] -0.108188405797101M7[t] -0.136376811594202M8[t] -0.124565217391304M9[t] -0.172753623188405M10[t] -0.100942028985506M11[t] +  0.00818840579710145t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3268&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] =  +  1.85434782608696 +  0.554347826086957x[t] -0.0990579710144927M1[t] -0.247246376811593M2[t] -0.135434782608695M3[t] -0.183623188405796M4[t] -0.091811594202898M5[t] -0.159999999999999M6[t] -0.108188405797101M7[t] -0.136376811594202M8[t] -0.124565217391304M9[t] -0.172753623188405M10[t] -0.100942028985506M11[t] +  0.00818840579710145t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3268&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3268&T=1
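
As a quick check of the estimated equation, the fitted value for the last observation can be reproduced directly from the printed coefficients. The short R sketch below is illustrative only and is not part of the archived computation: observation 60 falls in month 12, so all monthly dummies M1..M11 are zero, and the user-supplied variable x equals 1 for that observation.

b0 <- 1.85434782608696      # intercept
bx <- 0.554347826086957     # coefficient of x
bt <- 0.00818840579710145   # coefficient of the linear trend t
yhat60 <- b0 + bx * 1 + bt * 60
yhat60                      # about 2.9, matching the interpolation reported for index 60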


Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	1.85434782608696	0.323788	5.727	1e-06	0
x	0.554347826086957	0.670306	0.827	0.412502	0.206251
M1	-0.0990579710144927	0.394375	-0.2512	0.802796	0.401398
M2	-0.247246376811593	0.394138	-0.6273	0.533558	0.266779
M3	-0.135434782608695	0.393953	-0.3438	0.732575	0.366288
M4	-0.183623188405796	0.393821	-0.4663	0.64323	0.321615
M5	-0.091811594202898	0.393742	-0.2332	0.816658	0.408329
M6	-0.159999999999999	0.393715	-0.4064	0.686344	0.343172
M7	-0.108188405797101	0.393742	-0.2748	0.784723	0.392362
M8	-0.136376811594202	0.393821	-0.3463	0.730703	0.365351
M9	-0.124565217391304	0.393953	-0.3162	0.753286	0.376643
M10	-0.172753623188405	0.394138	-0.4383	0.663215	0.331607
M11	-0.100942028985506	0.394375	-0.256	0.799127	0.399564
t	0.00818840579710145	0.004561	1.7954	0.079166	0.039583

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 1.85434782608696 & 0.323788 & 5.727 & 1e-06 & 0 \tabularnewline
x & 0.554347826086957 & 0.670306 & 0.827 & 0.412502 & 0.206251 \tabularnewline
M1 & -0.0990579710144927 & 0.394375 & -0.2512 & 0.802796 & 0.401398 \tabularnewline
M2 & -0.247246376811593 & 0.394138 & -0.6273 & 0.533558 & 0.266779 \tabularnewline
M3 & -0.135434782608695 & 0.393953 & -0.3438 & 0.732575 & 0.366288 \tabularnewline
M4 & -0.183623188405796 & 0.393821 & -0.4663 & 0.64323 & 0.321615 \tabularnewline
M5 & -0.091811594202898 & 0.393742 & -0.2332 & 0.816658 & 0.408329 \tabularnewline
M6 & -0.159999999999999 & 0.393715 & -0.4064 & 0.686344 & 0.343172 \tabularnewline
M7 & -0.108188405797101 & 0.393742 & -0.2748 & 0.784723 & 0.392362 \tabularnewline
M8 & -0.136376811594202 & 0.393821 & -0.3463 & 0.730703 & 0.365351 \tabularnewline
M9 & -0.124565217391304 & 0.393953 & -0.3162 & 0.753286 & 0.376643 \tabularnewline
M10 & -0.172753623188405 & 0.394138 & -0.4383 & 0.663215 & 0.331607 \tabularnewline
M11 & -0.100942028985506 & 0.394375 & -0.256 & 0.799127 & 0.399564 \tabularnewline
t & 0.00818840579710145 & 0.004561 & 1.7954 & 0.079166 & 0.039583 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3268&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]1.85434782608696[/C][C]0.323788[/C][C]5.727[/C][C]1e-06[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]0.554347826086957[/C][C]0.670306[/C][C]0.827[/C][C]0.412502[/C][C]0.206251[/C][/ROW]
[ROW][C]M1[/C][C]-0.0990579710144927[/C][C]0.394375[/C][C]-0.2512[/C][C]0.802796[/C][C]0.401398[/C][/ROW]
[ROW][C]M2[/C][C]-0.247246376811593[/C][C]0.394138[/C][C]-0.6273[/C][C]0.533558[/C][C]0.266779[/C][/ROW]
[ROW][C]M3[/C][C]-0.135434782608695[/C][C]0.393953[/C][C]-0.3438[/C][C]0.732575[/C][C]0.366288[/C][/ROW]
[ROW][C]M4[/C][C]-0.183623188405796[/C][C]0.393821[/C][C]-0.4663[/C][C]0.64323[/C][C]0.321615[/C][/ROW]
[ROW][C]M5[/C][C]-0.091811594202898[/C][C]0.393742[/C][C]-0.2332[/C][C]0.816658[/C][C]0.408329[/C][/ROW]
[ROW][C]M6[/C][C]-0.159999999999999[/C][C]0.393715[/C][C]-0.4064[/C][C]0.686344[/C][C]0.343172[/C][/ROW]
[ROW][C]M7[/C][C]-0.108188405797101[/C][C]0.393742[/C][C]-0.2748[/C][C]0.784723[/C][C]0.392362[/C][/ROW]
[ROW][C]M8[/C][C]-0.136376811594202[/C][C]0.393821[/C][C]-0.3463[/C][C]0.730703[/C][C]0.365351[/C][/ROW]
[ROW][C]M9[/C][C]-0.124565217391304[/C][C]0.393953[/C][C]-0.3162[/C][C]0.753286[/C][C]0.376643[/C][/ROW]
[ROW][C]M10[/C][C]-0.172753623188405[/C][C]0.394138[/C][C]-0.4383[/C][C]0.663215[/C][C]0.331607[/C][/ROW]
[ROW][C]M11[/C][C]-0.100942028985506[/C][C]0.394375[/C][C]-0.256[/C][C]0.799127[/C][C]0.399564[/C][/ROW]
[ROW][C]t[/C][C]0.00818840579710145[/C][C]0.004561[/C][C]1.7954[/C][C]0.079166[/C][C]0.039583[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3268&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3268&T=2
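
The columns of this table correspond to the coefficient matrix returned by summary() in R. A minimal sketch, assuming mylm is the lm fit produced by the module code listed at the end of this page: the first four columns come from summary(mylm)$coefficients, and the 1-tail p-value is the 2-tail p-value halved, exactly as the module computes it.

mysum <- summary(mylm)            # mylm: the fitted model from the R code below
coefs <- mysum$coefficients       # columns: Estimate, Std. Error, t value, Pr(>|t|)
one_tail <- coefs[, 4] / 2        # 1-tail p-value = half the 2-tail p-value
round(cbind(coefs, one_tail), 6)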


Multiple Linear Regression - Regression Statistics
Multiple R: 0.350389114316179
R-squared: 0.122772531431276
Adjusted R-squared: -0.125139579251189
F-TEST (value): 0.495226034312328
F-TEST (DF numerator): 13
F-TEST (DF denominator): 46
p-value: 0.915973912243563
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.586916262709505
Sum Squared Residuals: 15.8456521739130

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.350389114316179 \tabularnewline
R-squared & 0.122772531431276 \tabularnewline
Adjusted R-squared & -0.125139579251189 \tabularnewline
F-TEST (value) & 0.495226034312328 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 46 \tabularnewline
p-value & 0.915973912243563 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.586916262709505 \tabularnewline
Sum Squared Residuals & 15.8456521739130 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3268&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.350389114316179[/C][/ROW]
[ROW][C]R-squared[/C][C]0.122772531431276[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]-0.125139579251189[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]0.495226034312328[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]13[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]46[/C][/ROW]
[ROW][C]p-value[/C][C]0.915973912243563[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.586916262709505[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]15.8456521739130[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3268&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3268&T=3
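
All quantities in this table are derived from the same summary object. The sketch below, again assuming mylm and mysum from the module code at the end of the page, mirrors how the module computes them.

multiple_R <- sqrt(mysum$r.squared)      # Multiple R = square root of R-squared
f <- mysum$fstatistic                    # F value, numerator df, denominator df
f_pvalue <- 1 - pf(f[1], f[2], f[3])     # about 0.916 for this fit
resid_sd <- mysum$sigma                  # residual standard deviation
ssq <- sum(mysum$resid^2)                # sum of squared residuals, about 15.85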


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	1.3	1.76347826086957	-0.463478260869569
2	1.2	1.62347826086957	-0.423478260869565
3	1.6	1.74347826086957	-0.143478260869565
4	1.7	1.70347826086957	-0.00347826086956534
5	1.5	1.80347826086957	-0.303478260869566
6	0.9	1.74347826086957	-0.843478260869565
7	1.5	1.80347826086956	-0.303478260869565
8	1.4	1.78347826086956	-0.383478260869565
9	1.6	1.80347826086957	-0.203478260869565
10	1.7	1.76347826086956	-0.063478260869565
11	1.4	1.84347826086957	-0.443478260869565
12	1.8	1.95260869565217	-0.152608695652173
13	1.7	1.86173913043478	-0.161739130434782
14	1.4	1.72173913043478	-0.321739130434783
15	1.2	1.84173913043478	-0.641739130434783
16	1	1.80173913043478	-0.801739130434783
17	1.7	1.90173913043478	-0.201739130434782
18	2.4	1.84173913043478	0.558260869565217
19	2	1.90173913043478	0.0982608695652175
20	2.1	1.88173913043478	0.218260869565218
21	2	1.90173913043478	0.0982608695652175
22	1.8	1.86173913043478	-0.0617391304347827
23	2.7	1.94173913043478	0.758260869565217
24	2.3	2.05086956521739	0.249130434782609
25	1.9	1.96	-0.0599999999999992
26	2	1.82	0.180000000000000
27	2.3	1.94	0.36
28	2.8	1.9	0.9
29	2.4	2	0.4
30	2.3	1.94	0.36
31	2.7	2	0.7
32	2.7	1.98	0.72
33	2.9	2	0.9
34	3	1.96	1.04
35	2.2	2.04	0.16
36	2.3	2.14913043478261	0.150869565217392
37	2.8	2.05826086956522	0.741739130434783
38	2.8	1.91826086956522	0.881739130434783
39	2.8	2.03826086956522	0.761739130434782
40	2.2	1.99826086956522	0.201739130434783
41	2.6	2.09826086956522	0.501739130434782
42	2.8	2.03826086956522	0.761739130434782
43	2.5	2.09826086956522	0.401739130434783
44	2.4	2.07826086956522	0.321739130434782
45	2.3	2.09826086956522	0.201739130434783
46	1.9	2.05826086956522	-0.158260869565218
47	1.7	2.13826086956522	-0.438260869565218
48	2	2.24739130434783	-0.247391304347825
49	2.1	2.15652173913043	-0.0565217391304339
50	1.7	2.01652173913044	-0.316521739130435
51	1.8	2.13652173913043	-0.336521739130435
52	1.8	2.09652173913044	-0.296521739130435
53	1.8	2.19652173913043	-0.396521739130435
54	1.3	2.13652173913044	-0.836521739130435
55	1.3	2.19652173913044	-0.896521739130435
56	1.3	2.17652173913044	-0.876521739130435
57	1.2	2.19652173913044	-0.996521739130435
58	1.4	2.15652173913044	-0.756521739130435
59	2.2	2.23652173913043	-0.0365217391304347
60	2.9	2.9	-6.2450045135165e-17

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 1.3 & 1.76347826086957 & -0.463478260869569 \tabularnewline
2 & 1.2 & 1.62347826086957 & -0.423478260869565 \tabularnewline
3 & 1.6 & 1.74347826086957 & -0.143478260869565 \tabularnewline
4 & 1.7 & 1.70347826086957 & -0.00347826086956534 \tabularnewline
5 & 1.5 & 1.80347826086957 & -0.303478260869566 \tabularnewline
6 & 0.9 & 1.74347826086957 & -0.843478260869565 \tabularnewline
7 & 1.5 & 1.80347826086956 & -0.303478260869565 \tabularnewline
8 & 1.4 & 1.78347826086956 & -0.383478260869565 \tabularnewline
9 & 1.6 & 1.80347826086957 & -0.203478260869565 \tabularnewline
10 & 1.7 & 1.76347826086956 & -0.063478260869565 \tabularnewline
11 & 1.4 & 1.84347826086957 & -0.443478260869565 \tabularnewline
12 & 1.8 & 1.95260869565217 & -0.152608695652173 \tabularnewline
13 & 1.7 & 1.86173913043478 & -0.161739130434782 \tabularnewline
14 & 1.4 & 1.72173913043478 & -0.321739130434783 \tabularnewline
15 & 1.2 & 1.84173913043478 & -0.641739130434783 \tabularnewline
16 & 1 & 1.80173913043478 & -0.801739130434783 \tabularnewline
17 & 1.7 & 1.90173913043478 & -0.201739130434782 \tabularnewline
18 & 2.4 & 1.84173913043478 & 0.558260869565217 \tabularnewline
19 & 2 & 1.90173913043478 & 0.0982608695652175 \tabularnewline
20 & 2.1 & 1.88173913043478 & 0.218260869565218 \tabularnewline
21 & 2 & 1.90173913043478 & 0.0982608695652175 \tabularnewline
22 & 1.8 & 1.86173913043478 & -0.0617391304347827 \tabularnewline
23 & 2.7 & 1.94173913043478 & 0.758260869565217 \tabularnewline
24 & 2.3 & 2.05086956521739 & 0.249130434782609 \tabularnewline
25 & 1.9 & 1.96 & -0.0599999999999992 \tabularnewline
26 & 2 & 1.82 & 0.180000000000000 \tabularnewline
27 & 2.3 & 1.94 & 0.36 \tabularnewline
28 & 2.8 & 1.9 & 0.9 \tabularnewline
29 & 2.4 & 2 & 0.4 \tabularnewline
30 & 2.3 & 1.94 & 0.36 \tabularnewline
31 & 2.7 & 2 & 0.7 \tabularnewline
32 & 2.7 & 1.98 & 0.72 \tabularnewline
33 & 2.9 & 2 & 0.9 \tabularnewline
34 & 3 & 1.96 & 1.04 \tabularnewline
35 & 2.2 & 2.04 & 0.16 \tabularnewline
36 & 2.3 & 2.14913043478261 & 0.150869565217392 \tabularnewline
37 & 2.8 & 2.05826086956522 & 0.741739130434783 \tabularnewline
38 & 2.8 & 1.91826086956522 & 0.881739130434783 \tabularnewline
39 & 2.8 & 2.03826086956522 & 0.761739130434782 \tabularnewline
40 & 2.2 & 1.99826086956522 & 0.201739130434783 \tabularnewline
41 & 2.6 & 2.09826086956522 & 0.501739130434782 \tabularnewline
42 & 2.8 & 2.03826086956522 & 0.761739130434782 \tabularnewline
43 & 2.5 & 2.09826086956522 & 0.401739130434783 \tabularnewline
44 & 2.4 & 2.07826086956522 & 0.321739130434782 \tabularnewline
45 & 2.3 & 2.09826086956522 & 0.201739130434783 \tabularnewline
46 & 1.9 & 2.05826086956522 & -0.158260869565218 \tabularnewline
47 & 1.7 & 2.13826086956522 & -0.438260869565218 \tabularnewline
48 & 2 & 2.24739130434783 & -0.247391304347825 \tabularnewline
49 & 2.1 & 2.15652173913043 & -0.0565217391304339 \tabularnewline
50 & 1.7 & 2.01652173913044 & -0.316521739130435 \tabularnewline
51 & 1.8 & 2.13652173913043 & -0.336521739130435 \tabularnewline
52 & 1.8 & 2.09652173913044 & -0.296521739130435 \tabularnewline
53 & 1.8 & 2.19652173913043 & -0.396521739130435 \tabularnewline
54 & 1.3 & 2.13652173913044 & -0.836521739130435 \tabularnewline
55 & 1.3 & 2.19652173913044 & -0.896521739130435 \tabularnewline
56 & 1.3 & 2.17652173913044 & -0.876521739130435 \tabularnewline
57 & 1.2 & 2.19652173913044 & -0.996521739130435 \tabularnewline
58 & 1.4 & 2.15652173913044 & -0.756521739130435 \tabularnewline
59 & 2.2 & 2.23652173913043 & -0.0365217391304347 \tabularnewline
60 & 2.9 & 2.9 & -6.2450045135165e-17 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3268&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]1.3[/C][C]1.76347826086957[/C][C]-0.463478260869569[/C][/ROW]
[ROW][C]2[/C][C]1.2[/C][C]1.62347826086957[/C][C]-0.423478260869565[/C][/ROW]
[ROW][C]3[/C][C]1.6[/C][C]1.74347826086957[/C][C]-0.143478260869565[/C][/ROW]
[ROW][C]4[/C][C]1.7[/C][C]1.70347826086957[/C][C]-0.00347826086956534[/C][/ROW]
[ROW][C]5[/C][C]1.5[/C][C]1.80347826086957[/C][C]-0.303478260869566[/C][/ROW]
[ROW][C]6[/C][C]0.9[/C][C]1.74347826086957[/C][C]-0.843478260869565[/C][/ROW]
[ROW][C]7[/C][C]1.5[/C][C]1.80347826086956[/C][C]-0.303478260869565[/C][/ROW]
[ROW][C]8[/C][C]1.4[/C][C]1.78347826086956[/C][C]-0.383478260869565[/C][/ROW]
[ROW][C]9[/C][C]1.6[/C][C]1.80347826086957[/C][C]-0.203478260869565[/C][/ROW]
[ROW][C]10[/C][C]1.7[/C][C]1.76347826086956[/C][C]-0.063478260869565[/C][/ROW]
[ROW][C]11[/C][C]1.4[/C][C]1.84347826086957[/C][C]-0.443478260869565[/C][/ROW]
[ROW][C]12[/C][C]1.8[/C][C]1.95260869565217[/C][C]-0.152608695652173[/C][/ROW]
[ROW][C]13[/C][C]1.7[/C][C]1.86173913043478[/C][C]-0.161739130434782[/C][/ROW]
[ROW][C]14[/C][C]1.4[/C][C]1.72173913043478[/C][C]-0.321739130434783[/C][/ROW]
[ROW][C]15[/C][C]1.2[/C][C]1.84173913043478[/C][C]-0.641739130434783[/C][/ROW]
[ROW][C]16[/C][C]1[/C][C]1.80173913043478[/C][C]-0.801739130434783[/C][/ROW]
[ROW][C]17[/C][C]1.7[/C][C]1.90173913043478[/C][C]-0.201739130434782[/C][/ROW]
[ROW][C]18[/C][C]2.4[/C][C]1.84173913043478[/C][C]0.558260869565217[/C][/ROW]
[ROW][C]19[/C][C]2[/C][C]1.90173913043478[/C][C]0.0982608695652175[/C][/ROW]
[ROW][C]20[/C][C]2.1[/C][C]1.88173913043478[/C][C]0.218260869565218[/C][/ROW]
[ROW][C]21[/C][C]2[/C][C]1.90173913043478[/C][C]0.0982608695652175[/C][/ROW]
[ROW][C]22[/C][C]1.8[/C][C]1.86173913043478[/C][C]-0.0617391304347827[/C][/ROW]
[ROW][C]23[/C][C]2.7[/C][C]1.94173913043478[/C][C]0.758260869565217[/C][/ROW]
[ROW][C]24[/C][C]2.3[/C][C]2.05086956521739[/C][C]0.249130434782609[/C][/ROW]
[ROW][C]25[/C][C]1.9[/C][C]1.96[/C][C]-0.0599999999999992[/C][/ROW]
[ROW][C]26[/C][C]2[/C][C]1.82[/C][C]0.180000000000000[/C][/ROW]
[ROW][C]27[/C][C]2.3[/C][C]1.94[/C][C]0.36[/C][/ROW]
[ROW][C]28[/C][C]2.8[/C][C]1.9[/C][C]0.9[/C][/ROW]
[ROW][C]29[/C][C]2.4[/C][C]2[/C][C]0.4[/C][/ROW]
[ROW][C]30[/C][C]2.3[/C][C]1.94[/C][C]0.36[/C][/ROW]
[ROW][C]31[/C][C]2.7[/C][C]2[/C][C]0.7[/C][/ROW]
[ROW][C]32[/C][C]2.7[/C][C]1.98[/C][C]0.72[/C][/ROW]
[ROW][C]33[/C][C]2.9[/C][C]2[/C][C]0.9[/C][/ROW]
[ROW][C]34[/C][C]3[/C][C]1.96[/C][C]1.04[/C][/ROW]
[ROW][C]35[/C][C]2.2[/C][C]2.04[/C][C]0.16[/C][/ROW]
[ROW][C]36[/C][C]2.3[/C][C]2.14913043478261[/C][C]0.150869565217392[/C][/ROW]
[ROW][C]37[/C][C]2.8[/C][C]2.05826086956522[/C][C]0.741739130434783[/C][/ROW]
[ROW][C]38[/C][C]2.8[/C][C]1.91826086956522[/C][C]0.881739130434783[/C][/ROW]
[ROW][C]39[/C][C]2.8[/C][C]2.03826086956522[/C][C]0.761739130434782[/C][/ROW]
[ROW][C]40[/C][C]2.2[/C][C]1.99826086956522[/C][C]0.201739130434783[/C][/ROW]
[ROW][C]41[/C][C]2.6[/C][C]2.09826086956522[/C][C]0.501739130434782[/C][/ROW]
[ROW][C]42[/C][C]2.8[/C][C]2.03826086956522[/C][C]0.761739130434782[/C][/ROW]
[ROW][C]43[/C][C]2.5[/C][C]2.09826086956522[/C][C]0.401739130434783[/C][/ROW]
[ROW][C]44[/C][C]2.4[/C][C]2.07826086956522[/C][C]0.321739130434782[/C][/ROW]
[ROW][C]45[/C][C]2.3[/C][C]2.09826086956522[/C][C]0.201739130434783[/C][/ROW]
[ROW][C]46[/C][C]1.9[/C][C]2.05826086956522[/C][C]-0.158260869565218[/C][/ROW]
[ROW][C]47[/C][C]1.7[/C][C]2.13826086956522[/C][C]-0.438260869565218[/C][/ROW]
[ROW][C]48[/C][C]2[/C][C]2.24739130434783[/C][C]-0.247391304347825[/C][/ROW]
[ROW][C]49[/C][C]2.1[/C][C]2.15652173913043[/C][C]-0.0565217391304339[/C][/ROW]
[ROW][C]50[/C][C]1.7[/C][C]2.01652173913044[/C][C]-0.316521739130435[/C][/ROW]
[ROW][C]51[/C][C]1.8[/C][C]2.13652173913043[/C][C]-0.336521739130435[/C][/ROW]
[ROW][C]52[/C][C]1.8[/C][C]2.09652173913044[/C][C]-0.296521739130435[/C][/ROW]
[ROW][C]53[/C][C]1.8[/C][C]2.19652173913043[/C][C]-0.396521739130435[/C][/ROW]
[ROW][C]54[/C][C]1.3[/C][C]2.13652173913044[/C][C]-0.836521739130435[/C][/ROW]
[ROW][C]55[/C][C]1.3[/C][C]2.19652173913044[/C][C]-0.896521739130435[/C][/ROW]
[ROW][C]56[/C][C]1.3[/C][C]2.17652173913044[/C][C]-0.876521739130435[/C][/ROW]
[ROW][C]57[/C][C]1.2[/C][C]2.19652173913044[/C][C]-0.996521739130435[/C][/ROW]
[ROW][C]58[/C][C]1.4[/C][C]2.15652173913044[/C][C]-0.756521739130435[/C][/ROW]
[ROW][C]59[/C][C]2.2[/C][C]2.23652173913043[/C][C]-0.0365217391304347[/C][/ROW]
[ROW][C]60[/C][C]2.9[/C][C]2.9[/C][C]-6.2450045135165e-17[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3268&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3268&T=4
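
The "Interpolation (Forecast)" column holds the OLS fitted values; in the module code each entry is computed as the actual value minus the residual. A minimal sketch, assuming mylm and mysum as above:

interpolation <- fitted(mylm)             # same as actual minus residual in the table
prediction_error <- resid(mylm)           # the residuals column
actuals <- interpolation + prediction_error
head(cbind(actuals, interpolation, prediction_error))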



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
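# Note: y holds the data series pasted above (transposed below so that rows are
# observations), and par1, par2, par3 are the session parameters listed above;
# both are supplied by the software module before this script runs.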
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
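# Add 11 monthly dummy variables M1..M11 (month 12 acts as the reference category).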
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
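# Optionally append a linear trend column t = 1, 2, ..., n.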
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
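# lm() applied to a data frame regresses the first column on all remaining columns.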
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
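# table.start(), table.row.start(), table.element(), table.row.end(), table.end()
# and table.save() are HTML-table helpers loaded above from the 'createtable' file.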
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
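# the 1-tail p-value is reported as half of the 2-tail p-value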
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')