Free Statistics


R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 14 Dec 2007 07:51:04 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/14/t1197646305yd8pq2m259nylqm.htm/, Retrieved Thu, 02 May 2024 20:14:25 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=3929, Retrieved Thu, 02 May 2024 20:14:25 +0000
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 166
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Multiple linear r...] [2007-12-14 14:51:04] [9ec4fcc2bfe8b6d942eac6074e595603] [Current]
Dataseries X (column 1: Ind.Nijverheid, column 2: Dummy):
106.70	0
110.20	0
125.90	0
100.10	0
106.40	0
114.80	0
81.30	0
87.00	0
104.20	0
108.00	0
105.00	0
94.50	0
92.00	0
95.90	0
108.80	0
103.40	0
102.10	0
110.10	0
83.20	0
82.70	0
106.80	0
113.70	0
102.50	0
96.60	0
92.10	0
95.60	0
102.30	0
98.60	0
98.20	0
104.50	0
84.00	0
73.80	0
103.90	0
106.00	0
97.20	0
102.60	0
89.00	0
93.80	0
116.70	0
106.80	0
98.50	0
118.70	0
90.00	0
91.90	0
113.30	0
113.10	1
104.10	1
108.70	1
96.70	1
101.00	1
116.90	1
105.80	1
99.00	1
129.40	1
83.00	1
88.90	1
115.90	1
104.20	1
113.40	1
112.20	1
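
A minimal sketch of how the two columns above, the production index Ind.Nijverheid and the level-shift Dummy, could be read into R; the file name dataseries.txt is a hypothetical placeholder and not part of the original computation.

# sketch: read the tab-separated data block above (saved without a header row)
mydata <- read.table('dataseries.txt', header = FALSE, sep = '\t',
                     col.names = c('Ind.Nijverheid', 'Dummy'))
nrow(mydata)   # 60 monthly observations
head(mydata)   # first rows: 106.7/0, 110.2/0, ...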




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=3929&T=0







Multiple Linear Regression - Estimated Regression Equation
Ind.Nijverheid[t] = 102.557671957672 + 6.9915343915344 Dummy[t] - 6.96550264550264 M1[t] - 2.89788359788361 M2[t] + 11.9897354497355 M3[t] + 0.877354497354476 M4[t] - 1.15502645502646 M5[t] + 13.5725925925926 M6[t] - 17.5597883597884 M7[t] - 16.9321693121693 M8[t] + 7.09544973544973 M9[t] + 5.94476190476189 M10[t] + 1.45238095238095 M11[t] - 0.0676190476190479 t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=3929&T=1
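
In generic notation, the fitted specification regresses the production index on a level-shift dummy, eleven monthly dummies (month 12 as reference), and a linear trend; this is a restatement of the equation above, not additional output:

\begin{equation*}
\text{Ind.Nijverheid}_t \;=\; \beta_0 \;+\; \beta_1\,\text{Dummy}_t \;+\; \sum_{j=1}^{11}\gamma_j\,M_{j,t} \;+\; \delta\,t \;+\; e_t
\end{equation*}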







Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	102.557671957672	3.225152	31.7993	0	0
Dummy	6.9915343915344	2.750419	2.542	0.014455	0.007228
M1	-6.96550264550264	3.814193	-1.8262	0.074314	0.037157
M2	-2.89788359788361	3.80861	-0.7609	0.450614	0.225307
M3	11.9897354497355	3.804263	3.1517	0.002855	0.001427
M4	0.877354497354476	3.801155	0.2308	0.818484	0.409242
M5	-1.15502645502646	3.799289	-0.304	0.76249	0.381245
M6	13.5725925925926	3.798666	3.573	0.000842	0.000421
M7	-17.5597883597884	3.799289	-4.6219	3.1e-05	1.5e-05
M8	-16.9321693121693	3.801155	-4.4545	5.3e-05	2.7e-05
M9	7.09544973544973	3.804263	1.8651	0.068549	0.034275
M10	5.94476190476189	3.783701	1.5712	0.123002	0.061501
M11	1.45238095238095	3.781826	0.384	0.702717	0.351358
t	-0.0676190476190479	0.06876	-0.9834	0.330555	0.165277

Source: https://freestatistics.org/blog/index.php?pk=3929&T=2
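
A short sketch of how the coefficient table above can be reproduced from the fitted model; it assumes the objects mylm and mysum created by the R code listed at the end of this page:

# sketch: OLS coefficient table with 1-tail p-values (half the 2-tail p-values)
coefs <- coef(mysum)                        # estimate, standard error, t value, 2-tail p-value
onetail <- coefs[, 4] / 2                   # 1-tail p-value as reported above
round(cbind(coefs, `1-tail p-value` = onetail), 6)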







Multiple Linear Regression - Regression Statistics
Multiple R	0.882076529432349
R-squared	0.778059003775417
Adjusted R-squared	0.715336548320644
F-TEST (value)	12.4047918426351
F-TEST (DF numerator)	13
F-TEST (DF denominator)	46
p-value	5.42883515919357e-11

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	5.97860374866314
Sum Squared Residuals	1644.21032804233
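
These entries satisfy the usual identities (a restatement for the reader, with n = 60 observations and p = 14 estimated coefficients, hence 13 numerator and 46 denominator degrees of freedom):

\begin{equation*}
\text{Multiple } R = \sqrt{R^2}, \qquad
\bar{R}^2 = 1 - (1 - R^2)\,\frac{n-1}{n-p}, \qquad
F = \frac{R^2/(p-1)}{(1-R^2)/(n-p)}, \qquad
\hat{\sigma} = \sqrt{\frac{\text{SSR}}{n-p}}
\end{equation*}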

Source: https://freestatistics.org/blog/index.php?pk=3929&T=3
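
The same quantities can be pulled directly from the model summary; the sketch below assumes the objects mylm and mysum created by the R code at the end of this page:

# sketch: regression and residual statistics as reported above
sqrt(mysum$r.squared)                                    # Multiple R
mysum$r.squared                                          # R-squared
mysum$adj.r.squared                                      # Adjusted R-squared
mysum$fstatistic                                         # F value, DF numerator, DF denominator
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])   # F-test p-value
mysum$sigma                                              # Residual Standard Deviation
sum(resid(mylm)^2)                                       # Sum Squared Residuals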







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	106.7	95.5245502645502	11.1754497354498
2	110.2	99.5245502645503	10.6754497354497
3	125.9	114.344550264550	11.5554497354497
4	100.1	103.164550264550	-3.06455026455027
5	106.4	101.064550264550	5.33544973544972
6	114.8	115.724550264550	-0.924550264550254
7	81.3	84.5245502645503	-3.22455026455028
8	87	85.0845502645503	1.91544973544975
9	104.2	109.044550264550	-4.84455026455027
10	108	107.826243386243	0.173756613756610
11	105	103.266243386243	1.73375661375661
12	94.5	101.746243386243	-7.24624338624339
13	92	94.7131216931217	-2.71312169312171
14	95.9	98.7131216931217	-2.81312169312169
15	108.8	113.533121693122	-4.7331216931217
16	103.4	102.353121693122	1.04687830687831
17	102.1	100.253121693122	1.84687830687830
18	110.1	114.913121693122	-4.8131216931217
19	83.2	83.7131216931217	-0.513121693121686
20	82.7	84.2731216931217	-1.57312169312170
21	106.8	108.233121693122	-1.43312169312170
22	113.7	107.014814814815	6.68518518518519
23	102.5	102.454814814815	0.045185185185183
24	96.6	100.934814814815	-4.33481481481483
25	92.1	93.9016931216931	-1.80169312169314
26	95.6	97.9016931216931	-2.30169312169313
27	102.3	112.721693121693	-10.4216931216931
28	98.6	101.541693121693	-2.94169312169312
29	98.2	99.4416931216931	-1.24169312169311
30	104.5	114.101693121693	-9.60169312169313
31	84	82.9016931216931	1.09830687830689
32	73.8	83.4616931216931	-9.66169312169313
33	103.9	107.421693121693	-3.52169312169311
34	106	106.203386243386	-0.203386243386239
35	97.2	101.643386243386	-4.44338624338624
36	102.6	100.123386243386	2.47661375661375
37	89	93.0902645502646	-4.09026455026456
38	93.8	97.0902645502645	-3.29026455026455
39	116.7	111.910264550265	4.78973544973545
40	106.8	100.730264550265	6.06973544973545
41	98.5	98.6302645502645	-0.130264550264542
42	118.7	113.290264550265	5.40973544973545
43	90	82.0902645502645	7.90973544973546
44	91.9	82.6502645502646	9.24973544973545
45	113.3	106.610264550265	6.68973544973545
46	113.1	112.383492063492	0.716507936507934
47	104.1	107.823492063492	-3.72349206349207
48	108.7	106.303492063492	2.39650793650793
49	96.7	99.2703703703704	-2.57037037037038
50	101	103.270370370370	-2.27037037037037
51	116.9	118.090370370370	-1.19037037037037
52	105.8	106.910370370370	-1.11037037037037
53	99	104.810370370370	-5.81037037037037
54	129.4	119.470370370370	9.92962962962963
55	83	88.2703703703704	-5.27037037037037
56	88.9	88.8303703703704	0.0696296296296282
57	115.9	112.790370370370	3.10962962962964
58	104.2	111.572063492063	-7.37206349206349
59	113.4	107.012063492063	6.38793650793652
60	112.2	105.492063492063	6.7079365079365

Source: https://freestatistics.org/blog/index.php?pk=3929&T=4
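
A compact sketch of how the columns above relate to the fitted model; it assumes the object mylm created by the R code below:

# sketch: actuals, in-sample interpolation (fitted values) and residuals
interp <- fitted(mylm)        # Interpolation (Forecast)
res <- resid(mylm)            # Residuals (Prediction Error)
cbind(`Time or Index` = seq_along(res), Actuals = interp + res,
      Interpolation = interp, Residuals = res)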



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
# optionally replace all series by their first differences (1-B)
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
# append monthly dummies M1..M11 (month 12 acts as the reference month)
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
# append a linear trend variable t = 1, 2, ..., n
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
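
For comparison, a sketch of the same design, level-shift dummy, monthly seasonal dummies and a linear trend, fitted with an explicit model formula. It is not part of the module; it assumes a data frame mydata holding the two columns of the data series above (60 monthly observations starting in month 1), and its seasonal coefficients are parameterized against a different reference month than the M1-M11 dummies used above.

# alternative sketch, not part of the original module
mydata$month <- factor(((seq_len(nrow(mydata)) - 1) %% 12) + 1)   # calendar month 1..12
mydata$trend <- seq_len(nrow(mydata))                             # linear trend t
fit <- lm(Ind.Nijverheid ~ Dummy + month + trend, data = mydata)
summary(fit)   # same fitted values; only the dummy coding of the seasonal effect differs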