Free Statistics

Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 19 Dec 2007 11:20:00 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/19/t1198087392bswxoljskq2zks2.htm/, Retrieved Mon, 06 May 2024 21:16:30 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=4695, Retrieved Mon, 06 May 2024 21:16:30 +0000

Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 208
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2007-12-19 18:20:00] [9fe578921d87f9af8e79a90d6142ba02] [Current]
Dataseries X (first column: Brent; second column: Katrina dummy):
25.62	0
27.5	0
24.5	0
25.66	0
28.31	0
27.85	0
24.61	0
25.68	0
25.62	0
20.54	0
18.8	0
18.71	0
19.46	0
20.12	0
23.54	0
25.6	0
25.39	0
24.09	0
25.69	0
26.56	0
28.33	0
27.5	0
24.23	0
28.23	0
31.29	0
32.72	0
30.46	0
24.89	0
25.68	0
27.52	0
28.4	0
29.71	0
26.85	0
29.62	0
28.69	0
29.76	0
31.3	0
30.86	0
33.46	0
33.15	0
37.99	0
35.24	0
38.24	0
43.16	0
43.33	0
49.67	0
43.17	0
39.56	0
44.36	0
45.22	0
53.1	0
52.1	0
48.52	0
54.84	0
57.57	0
64.14	0
62.85	1
58.75	1
55.33	1
57.03	1
63.18	1
60.19	1
62.12	1
70.12	1
69.75	1
68.56	1
73.77	0
73.23	0
61.96	0
57.81	0
58.76	0
62.47	0
53.68	0
57.56	0
62.05	0
67.49	0
67.21	0
71.05	0
76.93	0
70.76	0




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135
Source: https://freestatistics.org/blog/index.php?pk=4695&T=0


Multiple Linear Regression - Estimated Regression Equation
Brent[t] = 10.5325353628536 + 9.3619926199262 Katrina[t] + 2.58057909418378 M1[t] + 2.68723466877524 M2[t] + 4.19103310050956 M3[t] + 4.94054581795817 M4[t] + 4.84148710683533 M5[t] + 5.09385696714109 M6[t] + 8.07794005886487 M7[t] + 8.57745277631347 M8[t] + 4.13955708574942 M9[t] + 2.65026027938849 M10[t] - 0.482369860305762 M11[t] + 0.647630139694254 t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=4695&T=1

Multiple Linear Regression - Ordinary Least Squares
Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 10.5325353628536 | 2.709209 | 3.8877 | 0.000238 | 0.000119
Katrina | 9.3619926199262 | 2.184172 | 4.2863 | 6e-05 | 3e-05
M1 | 2.58057909418378 | 3.302965 | 0.7813 | 0.437425 | 0.218712
M2 | 2.68723466877524 | 3.301839 | 0.8139 | 0.418652 | 0.209326
M3 | 4.19103310050956 | 3.301002 | 1.2696 | 0.208677 | 0.104338
M4 | 4.94054581795817 | 3.300454 | 1.4969 | 0.139179 | 0.069589
M5 | 4.84148710683533 | 3.300196 | 1.467 | 0.147117 | 0.073558
M6 | 5.09385696714109 | 3.300227 | 1.5435 | 0.127494 | 0.063747
M7 | 8.07794005886487 | 3.321201 | 2.4322 | 0.017726 | 0.008863
M8 | 8.57745277631347 | 3.322854 | 2.5814 | 0.01207 | 0.006035
M9 | 4.13955708574942 | 3.425637 | 1.2084 | 0.231203 | 0.115602
M10 | 2.65026027938849 | 3.42494 | 0.7738 | 0.441805 | 0.220902
M11 | -0.482369860305762 | 3.424521 | -0.1409 | 0.888411 | 0.444206
t | 0.647630139694254 | 0.030905 | 20.9553 | 0 | 0
Source: https://freestatistics.org/blog/index.php?pk=4695&T=2
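
The coefficient table and the estimated equation can be reproduced in a plain R session without the framework. The sketch below is illustrative only: it assumes the two columns of Dataseries X have been stored in vectors named brent and katrina (hypothetical names) and rebuilds the same design matrix as the module code listed at the bottom of this page (eleven monthly dummies, with M1 marking observations 1, 13, 25, ..., plus a linear trend t).

# assumed: brent = first column of Dataseries X (80 observations)
#          katrina = second column (0/1 dummy; equal to 1 for observations 57 to 66)
n <- length(brent)
M <- sapply(1:11, function(i) as.numeric(seq_len(n) %% 12 == i))  # M1 flags obs 1, 13, 25, ...
colnames(M) <- paste0('M', 1:11)
df <- data.frame(Brent = brent, Katrina = katrina, M, t = 1:n)
fit <- lm(Brent ~ ., data = df)               # equivalent to the module's lm(df)
summary(fit)                                  # should reproduce the parameter estimates above
summary(fit)$coefficients[, 'Pr(>|t|)'] / 2   # the 1-tail p-values are half the 2-tail values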



Multiple Linear Regression - Regression Statistics
Multiple R: 0.950697261132452
R-squared: 0.903825282324745
Adjusted R-squared: 0.884881777328104
F-TEST (value): 47.711618440463
F-TEST (DF numerator): 13
F-TEST (DF denominator): 66
p-value: 0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 5.93120340517714
Sum Squared Residuals: 2321.82547301660
Source: https://freestatistics.org/blog/index.php?pk=4695&T=3
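
These statistics are taken directly from R's summary() of the fitted model, as the module code at the bottom of this page shows. A brief sketch, reusing the fit object from the sketch above:

s <- summary(fit)
sqrt(s$r.squared)                  # Multiple R
s$r.squared                        # R-squared
s$adj.r.squared                    # Adjusted R-squared
s$fstatistic                       # F value with numerator and denominator degrees of freedom
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])  # F-test p-value (reported above as 0)
s$sigma                            # Residual Standard Deviation
sum(residuals(fit)^2)              # Sum Squared Residuals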



\begin{tabular}{llll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 25.62 & 13.7607445967317 & 11.8592554032683 \tabularnewline
2 & 27.5 & 14.5150303110174 & 12.9849696889826 \tabularnewline
3 & 24.5 & 16.6664588824460 & 7.83354111755403 \tabularnewline
4 & 25.66 & 18.0636017395888 & 7.59639826041118 \tabularnewline
5 & 28.31 & 18.6121731681602 & 9.69782683183975 \tabularnewline
6 & 27.85 & 19.5121731681602 & 8.33782683183976 \tabularnewline
7 & 24.61 & 23.1438863995783 & 1.46611360042175 \tabularnewline
8 & 25.68 & 24.2910292567211 & 1.38897074327886 \tabularnewline
9 & 25.62 & 20.5007637058513 & 5.11923629414866 \tabularnewline
10 & 20.54 & 19.6590970391847 & 0.880902960815301 \tabularnewline
11 & 18.8 & 17.1740970391847 & 1.62590296081533 \tabularnewline
12 & 18.71 & 18.3040970391847 & 0.405902960815316 \tabularnewline
13 & 19.46 & 21.5323062730627 & -2.07230627306272 \tabularnewline
14 & 20.12 & 22.2865919873484 & -2.16659198734845 \tabularnewline
15 & 23.54 & 24.438020558777 & -0.898020558777015 \tabularnewline
16 & 25.6 & 25.8351634159199 & -0.235163415919871 \tabularnewline
17 & 25.39 & 26.3837348444913 & -0.9937348444913 \tabularnewline
18 & 24.09 & 27.2837348444913 & -3.19373484449130 \tabularnewline
19 & 25.69 & 30.9154480759093 & -5.22544807590933 \tabularnewline
20 & 26.56 & 32.0625909330522 & -5.50259093305219 \tabularnewline
21 & 28.33 & 28.2723253821824 & 0.0576746178176064 \tabularnewline
22 & 27.5 & 27.4306587155157 & 0.0693412844842811 \tabularnewline
23 & 24.23 & 24.9456587155157 & -0.715658715515722 \tabularnewline
24 & 28.23 & 26.0756587155157 & 2.15434128448427 \tabularnewline
25 & 31.29 & 29.3038679493938 & 1.98613205060623 \tabularnewline
26 & 32.72 & 30.0581536636795 & 2.66184633632050 \tabularnewline
27 & 30.46 & 32.2095822351081 & -1.74958223510806 \tabularnewline
28 & 24.89 & 33.6067250922509 & -8.71672509225092 \tabularnewline
29 & 25.68 & 34.1552965208224 & -8.47529652082235 \tabularnewline
30 & 27.52 & 35.0552965208224 & -7.53529652082235 \tabularnewline
31 & 28.4 & 38.6870097522404 & -10.2870097522404 \tabularnewline
32 & 29.71 & 39.8341526093832 & -10.1241526093832 \tabularnewline
33 & 26.85 & 36.0438870585134 & -9.19388705851344 \tabularnewline
34 & 29.62 & 35.2022203918468 & -5.58222039184677 \tabularnewline
35 & 28.69 & 32.7172203918468 & -4.02722039184677 \tabularnewline
36 & 29.76 & 33.8472203918468 & -4.08722039184678 \tabularnewline
37 & 31.3 & 37.0754296257248 & -5.77542962572482 \tabularnewline
38 & 30.86 & 37.8297153400105 & -6.96971534001054 \tabularnewline
39 & 33.46 & 39.9811439114391 & -6.5211439114391 \tabularnewline
40 & 33.15 & 41.378286768582 & -8.22828676858197 \tabularnewline
41 & 37.99 & 41.9268581971534 & -3.9368581971534 \tabularnewline
42 & 35.24 & 42.8268581971534 & -7.5868581971534 \tabularnewline
43 & 38.24 & 46.4585714285714 & -8.21857142857143 \tabularnewline
44 & 43.16 & 47.6057142857143 & -4.44571428571429 \tabularnewline
45 & 43.33 & 43.8154487348445 & -0.485448734844493 \tabularnewline
46 & 49.67 & 42.9737820681778 & 6.69621793182218 \tabularnewline
47 & 43.17 & 40.4887820681778 & 2.68121793182218 \tabularnewline
48 & 39.56 & 41.6187820681778 & -2.05878206817783 \tabularnewline
49 & 44.36 & 44.8469913020559 & -0.486991302055866 \tabularnewline
50 & 45.22 & 45.6012770163416 & -0.381277016341593 \tabularnewline
51 & 53.1 & 47.7527055877702 & 5.34729441222984 \tabularnewline
52 & 52.1 & 49.149848444913 & 2.95015155508698 \tabularnewline
53 & 48.52 & 49.6984198734844 & -1.17841987348444 \tabularnewline
54 & 54.84 & 50.5984198734844 & 4.24158012651556 \tabularnewline
55 & 57.57 & 54.2301331049025 & 3.33986689509752 \tabularnewline
56 & 64.14 & 55.3772759620453 & 8.76272403795467 \tabularnewline
57 & 62.85 & 60.9490030311017 & 1.90099696889826 \tabularnewline
58 & 58.75 & 60.1073363644351 & -1.35733636443507 \tabularnewline
59 & 55.33 & 57.6223363644351 & -2.29233636443508 \tabularnewline
60 & 57.03 & 58.7523363644351 & -1.72233636443508 \tabularnewline
61 & 63.18 & 61.9805455983131 & 1.19945440168688 \tabularnewline
62 & 60.19 & 62.7348313125988 & -2.54483131259885 \tabularnewline
63 & 62.12 & 64.8862598840274 & -2.76625988402742 \tabularnewline
64 & 70.12 & 66.2834027411703 & 3.83659725882973 \tabularnewline
65 & 69.75 & 66.8319741697417 & 2.9180258302583 \tabularnewline
66 & 68.56 & 67.7319741697417 & 0.8280258302583 \tabularnewline
67 & 73.77 & 62.0016947812335 & 11.7683052187665 \tabularnewline
68 & 73.23 & 63.1488376383764 & 10.0811623616236 \tabularnewline
69 & 61.96 & 59.3585720875066 & 2.60142791249342 \tabularnewline
70 & 57.81 & 58.5169054208399 & -0.706905420839916 \tabularnewline
71 & 58.76 & 56.0319054208399 & 2.72809457916007 \tabularnewline
72 & 62.47 & 57.1619054208399 & 5.30809457916007 \tabularnewline
73 & 53.68 & 60.390114654718 & -6.71011465471796 \tabularnewline
74 & 57.56 & 61.1444003690037 & -3.58440036900369 \tabularnewline
75 & 62.05 & 63.2958289404323 & -1.24582894043226 \tabularnewline
76 & 67.49 & 64.6929717975751 & 2.79702820242487 \tabularnewline
77 & 67.21 & 65.2415432261465 & 1.96845677385344 \tabularnewline
78 & 71.05 & 66.1415432261466 & 4.90845677385345 \tabularnewline
79 & 76.93 & 69.7732564575646 & 7.15674354243542 \tabularnewline
80 & 70.76 & 70.9203993147074 & -0.160399314707429 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=4695&T=4
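
In the table above, "Interpolation (Forecast)" is the in-sample fitted value and "Residuals (Prediction Error)" equals Actuals minus Interpolation, exactly as the module code computes them (x[i] - mysum$resid[i] and mysum$resid[i]). A quick check with the fit object from the earlier sketch:

head(cbind(Actuals = df$Brent, Interpolation = fitted(fit), Residuals = residuals(fit)), 3)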




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
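# Note: y (the data series) and par1, par2, par3 are supplied by the session
# (see the Parameters listed above); x <- t(y) below expects the variables of
# Dataseries X as the rows of y.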
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
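# Optional transformation: replace every series by its first differences
# (not used in this run, since par3 = 'Linear Trend')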
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
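# Seasonal dummies: par2 = 'Include Monthly Dummies' adds eleven indicator
# columns M1-M11, where M1 marks observations 1, 13, 25, ...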
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
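# Deterministic trend: par3 = 'Linear Trend' appends the column t = 1, 2, ..., n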
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
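# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density,
# normal Q-Q plot, lag plot with lowess, ACF/PACF, and the standard lm()
# diagnostics panel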
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
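# The table.start/table.row.start/table.element/table.save helpers loaded from
# 'createtable' belong to the hosting framework; they build the result tables
# shown above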
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
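
To rerun this code outside the FreeStatistics.org framework, the session objects have to be created by hand. A minimal, illustrative sketch (brent and katrina refer to the vectors from the earlier sketch; the table.* helpers loaded from 'createtable' are framework-specific, so the table-building part would have to be dropped, and bitmap() needs Ghostscript; png() can be used locally instead):

# assumed setup before running the module code above
y <- t(cbind(Brent = brent, Katrina = katrina))  # variables as rows, so that x <- t(y) has them as columns
par1 <- '1'                          # column number of the endogenous series (converted with as.numeric)
par2 <- 'Include Monthly Dummies'
par3 <- 'Linear Trend'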