Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 03:01:59 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t11954661255rb13y8d3hjmukh.htm/, Retrieved Fri, 03 May 2024 14:17:03 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5653, Retrieved Fri, 03 May 2024 14:17:03 +0000
IsPrivate? No (this computation is public)
Estimated Impact: 203
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [Workshop 6] [2007-11-19 10:01:59] [6bae8369195607c4cbc8a8485fed7b2f] [Current]
Dataseries X:
110.40	72.50	0
96.40	59.40	0
101.90	85.70	0
106.20	88.20	0
81.00	62.80	0
94.70	87.00	0
101.00	79.20	0
109.40	112.00	1
102.30	79.20	1
90.70	132.10	1
96.20	40.10	1
96.10	69.00	1
106.00	59.40	1
103.10	73.80	1
102.00	57.40	1
104.70	81.10	1
86.00	46.60	1
92.10	41.40	1
106.90	71.20	1
112.60	67.90	1
101.70	72.00	1
92.00	145.50	1
97.40	39.70	1
97.00	51.90	1
105.40	73.70	1
102.70	70.90	1
98.10	60.80	1
104.50	61.00	1
87.40	54.50	1
89.90	39.10	1
109.80	66.60	1
111.70	58.50	1
98.60	59.80	1
96.90	80.90	1
95.10	37.30	1
97.00	44.60	1
112.70	48.70	1
102.90	54.00	1
97.40	49.50	1
111.40	61.60	1
87.40	35.00	1
96.80	35.70	1
114.10	51.30	1
110.30	49.00	1
103.90	41.50	1
101.60	72.50	1
94.60	42.10	1
95.90	44.10	1
104.70	45.10	1
102.80	50.30	1
98.10	40.90	1
113.90	47.20	1
80.90	36.90	1
95.70	40.90	1
113.20	38.30	1
105.90	46.30	1
108.80	28.40	1
102.30	78.40	1
99.00	36.80	1
100.70	50.70	1
115.50	42.80	1
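
For reference, a minimal R sketch of how this data series could be read back in for reanalysis (the file name is hypothetical; the column names and their order are an assumption inferred from the regression output below, which uses the second column as the dependent variable because par1 = 2):

# hedged sketch: load the three tab-separated columns of Dataseries X
df <- read.table('dataseries_x.txt', sep = '\t',
                 col.names = c('Tot.prod', 'Invest', 'Tijd'))
str(df)   # should report 61 observations of 3 variables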




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135
Source: https://freestatistics.org/blog/index.php?pk=5653&T=0









Multiple Linear Regression - Estimated Regression Equation
Invest[t] = 129.477741045368 - 0.532610496389665 Tot.prod[t] - 1.22880551638674 Tijd[t] + 7.6597101083052 M1[t] + 4.87011023185619 M2[t] + 1.61852011632155 M3[t] + 15.8565145220841 M4[t] - 16.6968534757561 M5[t] - 9.40733614237636 M6[t] + 11.8432786998468 M7[t] + 18.7072378065419 M8[t] + 5.1378128884813 M9[t] + 48.1266498483989 M10[t] - 14.0049369537788 M11[t] - 0.676239716955866 t + e[t]
Source: https://freestatistics.org/blog/index.php?pk=5653&T=1


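
To read this equation: M1 through M11 are monthly dummies (M1 = 1 in periods 1, 13, 25, ..., with month 12 as the baseline), Tot.prod and Tijd are the user-supplied regressors, and t is the linear trend. As a worked check, evaluating the equation for the first observation of Dataseries X (assuming the first data column is Tot.prod, so Tot.prod = 110.4, Tijd = 0, M1 = 1, t = 1):

# evaluating the estimated equation at t = 1
129.477741045368 - 0.532610496389665*110.4 - 1.22880551638674*0 +
  7.6597101083052*1 - 0.676239716955866*1
# = 77.6610126..., the first 'Interpolation (Forecast)' value in the residuals table further below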







Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	129.477741045368	51.292611	2.5243	0.015108	0.007554
Tot.prod	-0.532610496389665	0.537457	-0.991	0.326879	0.163439
Tijd	-1.22880551638674	6.098418	-0.2015	0.841199	0.4206
M1	7.6597101083052	9.895941	0.774	0.442876	0.221438
M2	4.87011023185619	8.238903	0.5911	0.557339	0.27867
M3	1.61852011632155	7.926833	0.2042	0.839112	0.419556
M4	15.8565145220841	9.911901	1.5997	0.116502	0.058251
M5	-16.6968534757561	10.175477	-1.6409	0.107641	0.05382
M6	-9.40733614237636	7.93386	-1.1857	0.241822	0.120911
M7	11.8432786998468	10.118104	1.1705	0.247827	0.123913
M8	18.7072378065419	10.395067	1.7996	0.078481	0.03924
M9	5.1378128884813	8.34649	0.6156	0.541215	0.270608
M10	48.1266498483989	7.705906	6.2454	0	0
M11	-14.0049369537788	7.711377	-1.8161	0.075869	0.037935
t	-0.676239716955866	0.115742	-5.8426	0	0
Source: https://freestatistics.org/blog/index.php?pk=5653&T=2


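
The columns of this table are related in the usual way: T-STAT equals Parameter divided by S.D. (the standard error), the 2-tail p-value follows from the t distribution with n - k = 61 - 15 = 46 residual degrees of freedom, and the 1-tail p-value is simply half of the 2-tail value. A quick check in R for the intercept row:

129.477741045368 / 51.292611      # 2.5243   = T-STAT
2 * pt(-2.5243, df = 46)          # about 0.0151 = 2-tail p-value
0.015108 / 2                      # 0.007554 = 1-tail p-value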







Multiple Linear Regression - Regression Statistics
Multiple R	0.879870656126318
R-squared	0.774172371512158
Adjusted R-squared	0.70544222371151
F-TEST (value)	11.2639416076574
F-TEST (DF numerator)	14
F-TEST (DF denominator)	46
p-value	1.51000545400848e-10
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	12.1727554778682
Sum Squared Residuals	6816.09489250269
Source: https://freestatistics.org/blog/index.php?pk=5653&T=3


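
These statistics are mutually consistent and can be reproduced from one another in R:

sqrt(0.774172371512158)                        # 0.8798707     = Multiple R
1 - pf(11.2639416076574, df1 = 14, df2 = 46)   # about 1.5e-10 = p-value of the F-test
sqrt(6816.09489250269 / 46)                    # 12.17276      = Residual Standard Deviation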







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	72.5	77.6610126352981	-5.16101263529812
2	59.4	81.6517199913486	-22.2517199913486
3	85.7	74.794532428715	10.9054675712850
4	88.2	86.0660619830461	2.13393801695387
5	62.8	66.2582387772696	-3.4582387772696
6	87	65.574752593155	21.4252474068449
7	79.2	82.7936815911675	-3.59368159116749
8	112	83.2786672948468	28.7213327051532
9	79.2	72.814537184197	6.38546281580308
10	132.1	121.305416185279	10.7945838147212
11	40.1	55.5682319360021	-15.4682319360021
12	69	68.950190222464	0.0498097775360103
13	59.4	70.6608166995556	-11.2608166995556
14	73.8	68.7395475456808	5.06045245431924
15	57.4	65.3975892592189	-7.9975892592189
16	81.1	77.5212956077735	3.5787043922265
17	46.6	54.2515041754641	-7.65150417546413
18	41.4	57.6158577639111	-16.2158577639111
19	71.2	70.3075975426113	0.89240245738866
20	67.9	73.4594371029295	-5.55943710292948
21	72	65.0192268785603	6.98077312143966
22	145.5	112.498145936502	33.0018540634981
23	39.7	46.8142227368641	-7.1142227368641
24	51.9	60.3559641722429	-8.45596417224288
25	73.7	62.865506393919	10.834493606081
26	70.9	60.8377151407662	10.0622848592338
27	60.8	59.3598935916682	1.44010640833180
28	61	69.512941103581	-8.51294110358103
29	54.5	45.3909728770482	9.1090271229518
30	39.1	50.6727242524979	-11.5727242524979
31	66.6	60.6481504996109	5.95184950038907
32	58.5	65.8239099462098	-7.32390994620978
33	59.8	58.5554428138979	1.24455718610209
34	80.9	101.773477900722	-20.8734779007221
35	37.3	39.92435027509	-2.62435027508996
36	44.6	52.2410875687725	-7.64108756877248
37	48.7	50.8625731668041	-2.16257316680406
38	54	52.6163164380179	1.38368356198211
39	49.5	51.6178443356706	-2.11784433567055
40	61.6	57.723052075022	3.87694792497805
41	35	37.2760962735778	-2.27609627357781
42	35.7	38.8828352239389	-3.18283522393885
43	51.3	50.243048761665	1.05695123833502
44	49	58.4546880376849	-9.45468803768492
45	41.5	47.6177305795623	-6.11773057956229
46	72.5	91.1553319642203	-18.6553319642203
47	42.1	32.0757789198144	10.0242210801856
48	44.1	44.7120825113307	-0.612082511330715
49	45.1	47.008580534451	-1.90858053445098
50	50.3	44.5547008841865	5.74529911581352
51	40.9	43.1301403847274	-2.23014038472740
52	47.2	48.2766492305774	-1.07664923057739
53	36.9	32.6231878966402	4.27681210335976
54	40.9	31.3538301664971	9.54616983350292
55	38.3	42.6075216049453	-4.30752160494527
56	46.3	52.683297618329	-6.38329761832904
57	28.4	36.8930625437825	-8.49306254378254
58	78.4	82.667628013277	-4.26762801327708
59	36.8	21.6174161322295	15.1825838677705
60	50.7	34.0406755251899	16.6593244748101
61	42.8	33.1415105699722	9.65848943002778
Source: https://freestatistics.org/blog/index.php?pk=5653&T=4


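
In this table, 'Interpolation (Forecast)' holds the in-sample fitted values and 'Residuals (Prediction Error)' the corresponding residuals, so Actuals = Interpolation + Residual on every row (e.g. 72.5 = 77.6610126352981 - 5.16101263529812 for the first observation). With the fitted model object produced by the R code below, these columns correspond to:

# fitted values and residuals of the model mylm fitted below
head(cbind(fitted(mylm), resid(mylm)))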



Parameters (Session):
par1 = 2 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 2 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)                                      # needed for densityplot() further down
par1 <- as.numeric(par1)                              # par1 = column number of the dependent variable
x <- t(y)                                             # y is supplied by the module with the series in rows
k <- length(x[1,])                                    # number of variables (columns)
n <- length(x[,1])                                    # number of observations (rows)
x1 <- cbind(x[,par1], x[,1:k!=par1])                  # move the dependent variable to the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
# replace each series by its first differences: (1-B)x[t] = x[t] - x[t-1]
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {   # note the parentheses: 1:n-1 would start the loop at 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
# seasonal dummies M1..M11: row t gets M_i = 1 when t falls in month i; month 12 is the baseline
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))                                 # append a linear trend column 1..n
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])                                    # number of columns; also the number of estimated coefficients (intercept + k-1 regressors)
df <- as.data.frame(x)
(mylm <- lm(df))                                      # regress the first column (dependent) on all the others
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){                                       # build the printed equation term by term
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')   # write a '+' only before positive coefficients
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
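
The table.* helpers above are internal to the software module. As a minimal stand-alone sketch of the core computation (assuming a data frame df with the Dataseries X columns Tot.prod, Invest and Tijd, as in the reading sketch near the top), something along the following lines reproduces the fit. Note that factor() uses month 1 rather than month 12 as the baseline, so the individual seasonal coefficients differ from the table above, although the fitted values, residuals, R-squared and F statistic are identical:

n <- nrow(df)
df$t <- seq_len(n)                        # linear trend (par3 = 'Linear Trend')
df$M <- factor(((df$t - 1) %% 12) + 1)    # month of year 1..12 (par2 = 'Include Monthly Dummies')
fit <- lm(Invest ~ Tot.prod + Tijd + M + t, data = df)
summary(fit)                              # coefficient table, R-squared, F test
sum(resid(fit)^2)                         # Sum Squared Residuals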