Author: (unverified author)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 15:52:29 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t11955124101j5rt61qfeig8c7.htm/, Retrieved Fri, 03 May 2024 11:41:32 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5789, Retrieved Fri, 03 May 2024 11:41:32 +0000

Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 169
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [Invloed olieprijs...] [2007-11-19 22:52:29] [9bb499d88394279c02e6a8b8cf177cf7] [Current]
Dataseries X:
99	0
92	0
91	0
92	0
93	0
95	0
98	0
98	0
97	0
95	0
93	0
102	0
102	0
113	0
112	0
114	0
104	0
98	0
88	0
93	0
96	0
101	0
107	0
104	0
96	1
86	1
83	1
90	1
95	1
102	1
95	1
98	1
95	1
92	1
94	1
96	1
109	1
117	1
118	1
107	1
104	1
101	1
110	1
101	1
101	1
98	1
99	1
92	1
85	1
81	1
79	1
82	1
81	1
83	1
82	1
89	1
88	1
86	1
88	1
87	1
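
The two columns above are the dependent series and a 0/1 dummy regressor; with the session parameters shown further down (par2 = Include Monthly Dummies, par3 = Linear Trend), the module regresses the series on the dummy, eleven monthly dummies, and a linear trend. Below is a minimal sketch of the same specification in base R (not the exact module code; the names Y, X, M and df are mine):

Y <- c(99, 92, 91, 92, 93, 95, 98, 98, 97, 95, 93, 102,
       102, 113, 112, 114, 104, 98, 88, 93, 96, 101, 107, 104,
       96, 86, 83, 90, 95, 102, 95, 98, 95, 92, 94, 96,
       109, 117, 118, 107, 104, 101, 110, 101, 101, 98, 99, 92,
       85, 81, 79, 82, 81, 83, 82, 89, 88, 86, 88, 87)
X <- c(rep(0, 24), rep(1, 36))   # second column of the data series
n <- length(Y)
# monthly dummies M1..M11, with month 12 as the baseline (as in the module code below)
M <- sapply(1:11, function(i) as.numeric(((seq_len(n) - 1) %% 12) + 1 == i))
colnames(M) <- paste0("M", 1:11)
df <- data.frame(Y, X, M, t = seq_len(n))
fit <- lm(Y ~ ., data = df)
summary(fit)   # should reproduce the coefficient table below, up to floating-point noise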




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135
Source: https://freestatistics.org/blog/index.php?pk=5789&T=0


Multiple Linear Regression - Estimated Regression Equation
Y[t] = 102.5 + 0.75 X[t] - 0.0625 M1[t] - 0.275 M2[t] - 1.2875 M3[t] - 0.7 M4[t] - 2.1125 M5[t] - 1.525 M6[t] - 2.5375 M7[t] - 1.15 M8[t] - 1.3625 M9[t] - 2.175 M10[t] - 0.1875 M11[t] - 0.1875 t + e[t]
Source: https://freestatistics.org/blog/index.php?pk=5789&T=1
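
As a quick check, for the first observation (t = 1, January of the first year, so M1 = 1 and X = 0) the equation gives 102.5 - 0.0625 - 0.1875*1 = 102.25, which matches the first interpolated value in the Actuals, Interpolation, and Residuals table further down.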



Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   102.5       5.444203   18.8274                      0                0
X             0.75        5.238687   0.1432                       0.886785         0.443392
M1            -0.0625     6.502798   -0.0096                      0.992373         0.496187
M2            -0.275      6.465765   -0.0425                      0.966259         0.483129
M3            -1.2875     6.432075   -0.2002                      0.842231         0.421115
M4            -0.7        6.401781   -0.1093                      0.913405         0.456702
M5            -2.1125     6.374932   -0.3314                      0.741865         0.370933
M6            -1.525      6.35157    -0.2401                      0.811321         0.40566
M7            -2.5375     6.331736   -0.4008                      0.690453         0.345226
M8            -1.15       6.315461   -0.1821                      0.85631          0.428155
M9            -1.3625     6.302774   -0.2162                      0.829807         0.414904
M10           -2.175      6.293696   -0.3456                      0.731231         0.365615
M11           -0.1875     6.288243   -0.0298                      0.976342         0.488171
t             -0.1875     0.151228   -1.2399                      0.221318         0.110659
Source: https://freestatistics.org/blog/index.php?pk=5789&T=2
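
Note that the 1-tail p-values are simply the 2-tail p-values divided by two, which is also how the R code below computes them. As a sketch, the columns of this table map onto base-R output as follows (assuming the fitted object fit from the sketch above):

coefs <- summary(fit)$coefficients   # columns: Estimate, Std. Error, t value, Pr(>|t|)
tab <- cbind(Parameter = coefs[, 1],
             S.D.      = coefs[, 2],
             T.STAT    = coefs[, 3],
             p.2tail   = coefs[, 4],
             p.1tail   = coefs[, 4] / 2)
round(tab, 6)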



Multiple Linear Regression - Regression Statistics
Multiple R: 0.333175720431569
R-squared: 0.111006060685095
Adjusted R-squared: -0.140231356947378
F-TEST (value): 0.441837293708704
F-TEST (DF numerator): 13
F-TEST (DF denominator): 46
p-value: 0.944772506132533
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 9.93970955747483
Sum Squared Residuals: 4544.7
Source: https://freestatistics.org/blog/index.php?pk=5789&T=3
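
These figures are mutually consistent: Multiple R is the square root of R-squared, the residual standard deviation is sqrt(Sum Squared Residuals / DF denominator) = sqrt(4544.7 / 46) ≈ 9.94, and the p-value is the upper tail of the F(13, 46) distribution at the F-TEST value. A short check, again assuming the fit object from the earlier sketch:

s <- summary(fit)
sqrt(s$r.squared)            # Multiple R
s$sigma                      # Residual Standard Deviation
sum(residuals(fit)^2)        # Sum Squared Residuals
unname(1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3]))   # p-value of the F-test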



Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1               99        102.25                     -3.25
2               92        101.85                     -9.85
3               91        100.65                     -9.65
4               92        101.05                     -9.05
5               93        99.45                      -6.45
6               95        99.85                      -4.85
7               98        98.65                      -0.65
8               98        99.85                      -1.85
9               97        99.45                      -2.45
10              95        98.45                      -3.45
11              93        100.25                     -7.25
12              102       100.25                     1.75
13              102       100                        2
14              113       99.6                       13.4
15              112       98.4                       13.6
16              114       98.8                       15.2
17              104       97.2                       6.8
18              98        97.6                       0.4
19              88        96.4                       -8.4
20              93        97.6                       -4.6
21              96        97.2                       -1.2
22              101       96.2                       4.8
23              107       98                         9
24              104       98                         6
25              96        98.5                       -2.5
26              86        98.1                       -12.1
27              83        96.9                       -13.9
28              90        97.3                       -7.3
29              95        95.7                       -0.7
30              102       96.1                       5.9
31              95        94.9                       0.1
32              98        96.1                       1.9
33              95        95.7                       -0.7
34              92        94.7                       -2.7
35              94        96.5                       -2.5
36              96        96.5                       -0.5
37              109       96.25                      12.75
38              117       95.85                      21.15
39              118       94.65                      23.35
40              107       95.05                      11.95
41              104       93.45                      10.55
42              101       93.85                      7.15
43              110       92.65                      17.35
44              101       93.85                      7.15
45              101       93.45                      7.55
46              98        92.45                      5.55
47              99        94.25                      4.75
48              92        94.25                      -2.25
49              85        94                         -9
50              81        93.6                       -12.6
51              79        92.4                       -13.4
52              82        92.8                       -10.8
53              81        91.2                       -10.2
54              83        91.6                       -8.6
55              82        90.4                       -8.4
56              89        91.6                       -2.6
57              88        91.2                       -3.2
58              86        90.2                       -4.2
59              88        92                         -4
60              87        92                         -5
Source: https://freestatistics.org/blog/index.php?pk=5789&T=4




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
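# Session parameters used in this computation (see 'Parameters' above):
#   par1 - index of the column used as the dependent variable (it is moved to the first position)
#   par2 - seasonal dummies: 'Include Monthly Dummies' or 'Include Quarterly Dummies'
#   par3 - trend handling: 'Linear Trend' appends t = 1..n; 'First Differences' differences the series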
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
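# add seasonal dummy variables when requested (11 monthly or 3 quarterly dummies; the omitted period is the baseline)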
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
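# append a linear trend column t = 1, ..., n when requested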
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
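# diagnostic plots: actuals with interpolation, residuals, histogram, density, normal Q-Q,
# lagged residual scatter, ACF, PACF, and the standard lm() diagnostic panel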
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
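# build the output tables shown above, using the table helper functions loaded from 'createtable'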
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')