Author's title: (not provided)
Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 17 Dec 2007 06:47:56 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/17/t1197898305c6xr61zxttao8yv.htm/, Retrieved Sat, 04 May 2024 05:29:51 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=4367, Retrieved Sat, 04 May 2024 05:29:51 +0000

Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 173
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-	[Multiple Regression] [Linear Regression...] [2007-12-17 13:47:56] [9fe578921d87f9af8e79a90d6142ba02] [Current]
Dataseries X:
25.62	0
27.5	0
24.5	0
25.66	0
28.31	0
27.85	0
24.61	0
25.68	0
25.62	1
20.54	0
18.8	0
18.71	0
19.46	0
20.12	0
23.54	0
25.6	0
25.39	0
24.09	0
25.69	0
26.56	0
28.33	0
27.5	0
24.23	0
28.23	0
31.29	0
32.72	0
30.46	1
24.89	1
25.68	0
27.52	0
28.4	1
29.71	0
26.85	0
29.62	1
28.69	1
29.76	1
31.3	1
30.86	0
33.46	1
33.15	1
37.99	1
35.24	1
38.24	1
43.16	1
43.33	1
49.67	1
43.17	1
39.56	1
44.36	1
45.22	1
53.1	1
52.1	1
48.52	1
54.84	1
57.57	1
64.14	1
62.85	1
58.75	1
55.33	1
57.03	1
63.18	1
60.19	1
62.12	0
70.12	1
69.75	1
68.56	1
73.77	1
73.23	1
61.96	1
57.81	1
58.76	1
62.47	1
53.68	1
57.56	1
62.05	1
67.49	1
67.21	1
71.05	1
76.93	1
70.76	1
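The regression reported below can be reproduced with a plain lm() call. A minimal sketch (not the archive's exact pipeline), assuming the two columns above have been read into numeric vectors brent and irak (these names are illustrative and not part of the archived code):

n     <- length(brent)                            # 80 monthly observations
month <- factor(((seq_len(n) - 1) %% 12) + 1)     # seasonal dummies; observation 1 corresponds to M1
month <- relevel(month, ref = '12')               # use M12 as the base month, as in the module output
trend <- seq_len(n)                               # linear trend t
fit   <- lm(brent ~ irak + month + trend)
summary(fit)                                      # coefficients, R-squared, F-test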




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=4367&T=0



Multiple Linear Regression - Estimated Regression Equation
Brent[t] = 10.0700259581615 - 1.40281951442969 Irak[t] + 2.57622199681522 M1[t] + 2.41204491416357 M2[t] + 4.0458162642061 M3[t] + 4.92530189996293 M4[t] + 4.55541053159698 M5[t] + 4.73735052243528 M6[t] + 6.51397901533497 M7[t] + 6.74265907554045 M8[t] + 4.35084669415179 M9[t] + 2.79112001832342 M10[t] - 0.411939990838284 M11[t] + 0.718060009161704 t + e[t]
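The interpolated (fitted) values tabulated further below follow directly from this equation. For example, for the first observation (t = 1, M1 = 1, Irak = 0) the fitted value is 10.0700259581615 + 2.57622199681522 + 0.718060009161704 = 13.3643, which matches the first interpolated value reported below.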

Source: https://freestatistics.org/blog/index.php?pk=4367&T=1



Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	10.0700259581615	3.053503	3.2979	0.001572	0.000786
Irak	-1.40281951442969	2.454968	-0.5714	0.569655	0.284828
M1	2.57622199681522	3.725458	0.6915	0.491665	0.245833
M2	2.41204491416357	3.748283	0.6435	0.522125	0.261063
M3	4.0458162642061	3.724789	1.0862	0.281348	0.140674
M4	4.92530189996293	3.727477	1.3213	0.190947	0.095474
M5	4.55541053159698	3.726943	1.2223	0.225945	0.112972
M6	4.73735052243528	3.729078	1.2704	0.208409	0.104204
M7	6.51397901533497	3.722684	1.7498	0.0848	0.0424
M8	6.74265907554045	3.735456	1.805	0.07563	0.037815
M9	4.35084669415179	3.865296	1.1256	0.264404	0.132202
M10	2.79112001832342	3.863595	0.7224	0.472591	0.236296
M11	-0.411939990838284	3.862573	-0.1066	0.915391	0.457695
t	0.718060009161704	0.051286	14.0011	0	0
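In this table the T-STAT is the parameter estimate divided by its standard deviation (for example, for the trend t, 0.718060 / 0.051286 = 14.0011), and the 1-tail p-value is half the 2-tail p-value, as computed in the R code at the end of this page.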

Source: https://freestatistics.org/blog/index.php?pk=4367&T=2



Multiple Linear Regression - Regression Statistics
Multiple R: 0.936834349754005
R-squared: 0.877658598879009
Adjusted R-squared: 0.85356105017336
F-TEST (value): 36.4210737614668
F-TEST (DF numerator): 13
F-TEST (DF denominator): 66
p-value: 0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 6.6895835607987
Sum Squared Residuals: 2953.53486231595
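These statistics are internally consistent: Multiple R is the square root of R-squared (0.936834^2 = 0.877659), the F-test uses 13 regressors in the numerator and 80 - 14 = 66 residual degrees of freedom in the denominator, and the Residual Standard Deviation is the square root of the Sum Squared Residuals divided by the residual degrees of freedom (sqrt(2953.5349 / 66) = 6.6896).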

Source: https://freestatistics.org/blog/index.php?pk=4367&T=3



Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	25.62	13.3643079641385	12.2556920358615
2	27.5	13.9181908906485	13.5818091093515
3	24.5	16.2700222498528	8.22997775014722
4	25.66	17.8675678947713	7.79243210522872
5	28.31	18.2157365355670	10.0942634644330
6	27.85	19.1157365355670	8.73426346443297
7	24.61	21.6104250376284	2.99957496237157
8	25.68	22.5571651069956	3.12283489300439
9	25.62	19.480593220339	6.139406779661
10	20.54	20.041746068102	0.498253931898005
11	18.8	17.556746068102	1.24325393189801
12	18.71	18.686746068102	0.0232539318980090
13	19.46	21.9810280740789	-2.5210280740789
14	20.12	22.534911000589	-2.41491100058899
15	23.54	24.8867423597932	-1.34674235979320
16	25.6	26.4842880047117	-0.884288004711731
17	25.39	26.8324566455075	-1.44245664550749
18	24.09	27.7324566455075	-3.64245664550749
19	25.69	30.2271451475689	-4.53714514756887
20	26.56	31.1738852169361	-4.61388521693607
21	28.33	29.5001328447091	-1.17013284470911
22	27.5	28.6584661780424	-1.15846617804244
23	24.23	26.1734661780424	-1.94346617804245
24	28.23	27.3034661780424	0.926533821957564
25	31.29	30.5977481840194	0.692251815980644
26	32.72	31.1516311105294	1.56836888947058
27	30.46	32.100642955304	-1.64064295530398
28	24.89	33.6981886002225	-8.8081886002225
29	25.68	35.4491767554479	-9.76917675544793
30	27.52	36.3491767554479	-8.82917675544794
31	28.4	37.4410457430796	-9.04104574307964
32	29.71	39.7906053268765	-10.0806053268765
33	26.85	38.1168529546496	-11.2668529546496
34	29.62	35.8723667735532	-6.25236677355321
35	28.69	33.3873667735532	-4.69736677355322
36	29.76	34.5173667735532	-4.75736677355321
37	31.3	37.8116487795301	-6.51164877953012
38	30.86	39.7683512204699	-8.90835122046987
39	33.46	40.7173630652444	-7.25736306524443
40	33.15	42.3149087101629	-9.16490871016295
41	37.99	42.6630773509587	-4.67307735095871
42	35.24	43.5630773509587	-8.3230773509587
43	38.24	46.0577658530201	-7.81776585302009
44	43.16	47.0045059223873	-3.84450592238729
45	43.33	45.3307535501603	-2.00075355016033
46	49.67	44.4890868834937	5.18091311650633
47	43.17	42.0040868834937	1.16591311650633
48	39.56	43.1340868834937	-3.57408688349366
49	44.36	46.4283688894706	-2.06836888947058
50	45.22	46.9822518159806	-1.76225181598064
51	53.1	49.3340831751849	3.76591682481513
52	52.1	50.9316288201034	1.16837117989660
53	48.52	51.2797974608992	-2.75979746089915
54	54.84	52.1797974608992	2.66020253910084
55	57.57	54.6744859629605	2.89551403703945
56	64.14	55.6212260323277	8.51877396767227
57	62.85	53.9474736601008	8.90252633989922
58	58.75	53.1058069934341	5.64419300656588
59	55.33	50.6208069934341	4.70919300656588
60	57.03	51.7508069934341	5.2791930065659
61	63.18	55.045088999411	8.13491100058897
62	60.19	55.5989719259211	4.59102807407891
63	62.12	59.353622799555	2.76637720044500
64	70.12	59.5483489300438	10.5716510699562
65	69.75	59.8965175708396	9.8534824291604
66	68.56	60.7965175708396	7.7634824291604
67	73.77	63.291206072901	10.478793927099
68	73.23	64.2379461422682	8.99205385773184
69	61.96	62.5641937700412	-0.604193770041223
70	57.81	61.7225271033746	-3.91252710337456
71	58.76	59.2375271033746	-0.477527103374563
72	62.47	60.3675271033746	2.10247289662544
73	53.68	63.6618091093515	-9.98180910935148
74	57.56	64.2156920358615	-6.65569203586153
75	62.05	66.5675233950658	-4.51752339506578
76	67.49	68.1650690399843	-0.675069039984306
77	67.21	68.51323768078	-1.30323768078006
78	71.05	69.41323768078	1.63676231921994
79	76.93	71.9079261828414	5.02207381715857
80	70.76	72.8546662522086	-2.09466625220862
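In each row the residual (prediction error) is the actual value minus the interpolated value; for observation 1, 25.62 - 13.3643 = 12.2557.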

Source: https://freestatistics.org/blog/index.php?pk=4367&T=4



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
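As the code below shows, par1 selects which column of the data series is used as the dependent variable (here column 1, Brent), par2 = 'Include Monthly Dummies' adds the seasonal dummies M1-M11, and par3 = 'Linear Trend' appends the trend variable t.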
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])   # move the selected dependent variable into the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {   # first differences of each column (shrinks the sample by one)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))   # regress the first column of df on all the other columns
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
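The quantities tabulated above map onto standard summary.lm() components, as the module code shows. Continuing the lm() sketch given after the data series (the object fit is hypothetical and not part of the archived transaction):

s <- summary(fit)
s$coefficients        # Parameter, S.D., T-STAT and 2-tail p-value
sqrt(s$r.squared)     # Multiple R
s$adj.r.squared       # Adjusted R-squared
s$fstatistic          # F-test value and both degrees of freedom
s$sigma               # Residual Standard Deviation
sum(resid(fit)^2)     # Sum Squared Residuals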