Author's title:
Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 17 Dec 2007 06:13:40 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/17/t1197898015o8jisp79titomc5.htm/, Retrieved Fri, 03 May 2024 23:58:47 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=4366, Retrieved Fri, 03 May 2024 23:58:47 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 183
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Linear Regression...] [2007-12-17 13:13:40] [9fe578921d87f9af8e79a90d6142ba02] [Current]

Dataseries X:
25.62	0
27.5	0
24.5	0
25.66	0
28.31	0
27.85	0
24.61	0
25.68	0
25.62	0
20.54	0
18.8	0
18.71	0
19.46	0
20.12	0
23.54	0
25.6	0
25.39	0
24.09	0
25.69	0
26.56	0
28.33	0
27.5	0
24.23	0
28.23	0
31.29	0
32.72	0
30.46	0
24.89	0
25.68	0
27.52	0
28.4	0
29.71	0
26.85	0
29.62	0
28.69	0
29.76	0
31.3	0
30.86	0
33.46	0
33.15	0
37.99	0
35.24	0
38.24	0
43.16	0
43.33	0
49.67	0
43.17	0
39.56	0
44.36	0
45.22	0
53.1	0
52.1	0
48.52	0
54.84	1
57.57	1
64.14	1
62.85	1
58.75	1
55.33	1
57.03	1
63.18	1
60.19	1
62.12	1
70.12	1
69.75	1
68.56	1
73.77	1
73.23	1
61.96	1
57.81	1
58.76	1
62.47	1
53.68	1
57.56	1
62.05	1
67.49	1
67.21	1
71.05	1
76.93	1
70.76	1
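For readers who want to work with this series directly, the sketch below loads the two tab-separated columns into R. The file name brent_iran.txt and the column names Brent and Iran are assumptions (the names are taken from the estimated regression equation further down this page); nothing here is part of the original transaction.

# Hypothetical: the data series above saved to 'brent_iran.txt' (two tab-separated columns)
dat <- read.table('brent_iran.txt', header = FALSE, col.names = c('Brent', 'Iran'))
str(dat)         # 80 observations: Brent price level and a 0/1 Iran dummy
table(dat$Iran)  # 53 zeros, 27 ones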




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=4366&T=0


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=4366&T=0









Multiple Linear Regression - Estimated Regression Equation
Brent[t] = + 15.3991514616441 + 14.3949587992937Iran[t] + 2.07831497519547M1[t] + 2.37793730205442M2[t] + 4.07470248605622M3[t] + 5.0171819557723M4[t] + 5.11108999691695M5[t] + 3.50000392387678M6[t] + 5.33962625073572M7[t] + 6.0321057204518M8[t] + 3.560656828947M9[t] + 2.26432677485355M10[t] -0.675336612573228M11[t] + 0.454663387426778t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Brent[t] =  +  15.3991514616441 +  14.3949587992937Iran[t] +  2.07831497519547M1[t] +  2.37793730205442M2[t] +  4.07470248605622M3[t] +  5.0171819557723M4[t] +  5.11108999691695M5[t] +  3.50000392387678M6[t] +  5.33962625073572M7[t] +  6.0321057204518M8[t] +  3.560656828947M9[t] +  2.26432677485355M10[t] -0.675336612573228M11[t] +  0.454663387426778t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=4366&T=1


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=4366&T=1


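The equation above can be reproduced with a plain lm() call once the eleven monthly dummies and the linear trend are built by hand. The sketch below is not the module's own code (that is listed at the bottom of this page) but a compact equivalent under stated assumptions: dat is a hypothetical data frame holding the two columns of the data series (Brent price and Iran dummy), and the first observation is taken to correspond to dummy M1, as in the module.

# Minimal sketch of the fitted model (assumes 'dat' from the earlier sketch)
n     <- nrow(dat)
month <- ((seq_len(n) - 1) %% 12) + 1     # 1,2,...,12,1,2,...; month 12 is the reference
dummies <- outer(month, 1:11, '==') * 1   # monthly dummies M1..M11
colnames(dummies) <- paste0('M', 1:11)
dat2 <- data.frame(Brent = dat$Brent, Iran = dat$Iran, dummies, t = seq_len(n))
fit  <- lm(Brent ~ ., data = dat2)        # Iran + M1..M11 + linear trend t
summary(fit)                              # should match the coefficient table below, up to rounding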







Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	15.3991514616441	2.494298	6.1737	0	0
Iran	14.3949587992937	2.154636	6.6809	0	0
M1	2.07831497519547	2.885458	0.7203	0.473901	0.23695
M2	2.37793730205442	2.883726	0.8246	0.412566	0.206283
M3	4.07470248605622	2.882663	1.4135	0.162202	0.081101
M4	5.0171819557723	2.88227	1.7407	0.086396	0.043198
M5	5.11108999691695	2.882547	1.7731	0.080823	0.040412
M6	3.50000392387678	2.888965	1.2115	0.230021	0.11501
M7	5.33962625073572	2.886739	1.8497	0.068833	0.034417
M8	6.0321057204518	2.885181	2.0907	0.040408	0.020204
M9	3.560656828947	2.993354	1.1895	0.238497	0.119248
M10	2.26432677485355	2.99174	0.7569	0.451828	0.225914
M11	-0.675336612573228	2.990771	-0.2258	0.822049	0.411025
t	0.454663387426778	0.043955	10.3438	0	0

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 15.3991514616441 & 2.494298 & 6.1737 & 0 & 0 \tabularnewline
Iran & 14.3949587992937 & 2.154636 & 6.6809 & 0 & 0 \tabularnewline
M1 & 2.07831497519547 & 2.885458 & 0.7203 & 0.473901 & 0.23695 \tabularnewline
M2 & 2.37793730205442 & 2.883726 & 0.8246 & 0.412566 & 0.206283 \tabularnewline
M3 & 4.07470248605622 & 2.882663 & 1.4135 & 0.162202 & 0.081101 \tabularnewline
M4 & 5.0171819557723 & 2.88227 & 1.7407 & 0.086396 & 0.043198 \tabularnewline
M5 & 5.11108999691695 & 2.882547 & 1.7731 & 0.080823 & 0.040412 \tabularnewline
M6 & 3.50000392387678 & 2.888965 & 1.2115 & 0.230021 & 0.11501 \tabularnewline
M7 & 5.33962625073572 & 2.886739 & 1.8497 & 0.068833 & 0.034417 \tabularnewline
M8 & 6.0321057204518 & 2.885181 & 2.0907 & 0.040408 & 0.020204 \tabularnewline
M9 & 3.560656828947 & 2.993354 & 1.1895 & 0.238497 & 0.119248 \tabularnewline
M10 & 2.26432677485355 & 2.99174 & 0.7569 & 0.451828 & 0.225914 \tabularnewline
M11 & -0.675336612573228 & 2.990771 & -0.2258 & 0.822049 & 0.411025 \tabularnewline
t & 0.454663387426778 & 0.043955 & 10.3438 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=4366&T=2


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=4366&T=2


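The columns of this table map directly onto summary(lm) output; the module (see the R code at the bottom of the page) rounds the standard deviation and p-values to six decimals, the t-statistic to four, and reports the 1-tail p-value as half the 2-tail value. A sketch, assuming fit from the earlier lm() sketch:

co <- summary(fit)$coefficients   # columns: Estimate, Std. Error, t value, Pr(>|t|)
data.frame(Variable  = rownames(co),
           Parameter = co[, 1],
           S.D.      = round(co[, 2], 6),
           T.STAT    = round(co[, 3], 4),
           p.2tail   = round(co[, 4], 6),
           p.1tail   = round(co[, 4] / 2, 6),   # 1-tail p-value = half the 2-tail p-value
           row.names = NULL)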







Multiple Linear Regression - Regression Statistics
Multiple R	0.962629332889965
R-squared	0.92665523254018
Adjusted R-squared	0.912208535919305
F-TEST (value)	64.1430533815784
F-TEST (DF numerator)	13
F-TEST (DF denominator)	66
p-value	0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	5.1796083934521
Sum Squared Residuals	1770.67064522828

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.962629332889965 \tabularnewline
R-squared & 0.92665523254018 \tabularnewline
Adjusted R-squared & 0.912208535919305 \tabularnewline
F-TEST (value) & 64.1430533815784 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 66 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 5.1796083934521 \tabularnewline
Sum Squared Residuals & 1770.67064522828 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=4366&T=3


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=4366&T=3


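These summary statistics are straightforward functions of the summary(lm) object; the module computes them exactly this way (Multiple R as the square root of R-squared, the F-test p-value via pf, the residual standard deviation as sigma, and the sum of squared residuals from the residual vector). A sketch, again assuming fit from the earlier sketch:

s <- summary(fit)
sqrt(s$r.squared)                                           # Multiple R
c(s$r.squared, s$adj.r.squared)                             # R-squared, adjusted R-squared
s$fstatistic                                                # F value, DF numerator, DF denominator
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])   # F-test p-value
s$sigma                                                     # residual standard deviation
sum(residuals(fit)^2)                                       # sum of squared residuals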







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	25.62	17.9321298242665	7.68787017573355
2	27.5	18.6864155385521	8.81358446144793
3	24.5	20.8378441099807	3.66215589001934
4	25.66	22.2349869671235	3.4250130328765
5	28.31	22.7835583956949	5.52644160430505
6	27.85	21.6271357100816	6.22286428991844
7	24.61	23.9214214243673	0.68857857563273
8	25.68	25.0685642815101	0.611435718489871
9	25.62	23.0517787774321	2.5682212225679
10	20.54	22.2101121107654	-1.67011211076543
11	18.8	19.7251121107654	-0.925112110765436
12	18.71	20.8551121107654	-2.14511211076543
13	19.46	23.3880904733877	-3.92809047338770
14	20.12	24.1423761876734	-4.02237618767342
15	23.54	26.293804759102	-2.75380475910199
16	25.6	27.6909476162449	-2.09094761624485
17	25.39	28.2395190448163	-2.84951904481627
18	24.09	27.0830963592029	-2.99309635920289
19	25.69	29.3773820734886	-3.68738207348860
20	26.56	30.5245249306315	-3.96452493063146
21	28.33	28.5077394265534	-0.177739426553438
22	27.5	27.6660727598868	-0.166072759886770
23	24.23	25.1810727598868	-0.951072759886767
24	28.23	26.3110727598868	1.91892724011323
25	31.29	28.844051122509	2.44594887749098
26	32.72	29.5983368367948	3.12166316320524
27	30.46	31.7497654082233	-1.28976540822332
28	24.89	33.1469082653662	-8.25690826536619
29	25.68	33.6954796939376	-8.0154796939376
30	27.52	32.5390570083242	-5.01905700832422
31	28.4	34.8333427226099	-6.43334272260994
32	29.71	35.9804855797528	-6.27048557975279
33	26.85	33.9637000756748	-7.11370007567476
34	29.62	33.1220334090081	-3.5020334090081
35	28.69	30.6370334090081	-1.9470334090081
36	29.76	31.7670334090081	-2.0070334090081
37	31.3	34.3000117716304	-3.00001177163035
38	30.86	35.0542974859161	-4.19429748591609
39	33.46	37.2057260573447	-3.74572605734466
40	33.15	38.6028689144875	-5.45286891448752
41	37.99	39.1514403430589	-1.16144034305894
42	35.24	37.9950176574456	-2.75501765744556
43	38.24	40.2893033717313	-2.04930337173127
44	43.16	41.4364462288741	1.72355377112587
45	43.33	39.4196607247961	3.91033927520389
46	49.67	38.5779940581294	11.0920059418706
47	43.17	36.0929940581294	7.07700594187056
48	39.56	37.2229940581294	2.33700594187057
49	44.36	39.7559724207517	4.60402757924831
50	45.22	40.5102581350374	4.70974186496258
51	53.1	42.661686706466	10.4383132935340
52	52.1	44.0588295636088	8.04117043639115
53	48.52	44.6074009921803	3.91259900781973
54	54.84	57.8459371058606	-3.00593710586058
55	57.57	60.1402228201463	-2.5702228201463
56	64.14	61.2873656772892	2.85263432271084
57	62.85	59.2705801732111	3.57941982678887
58	58.75	58.4289135065445	0.321086493455532
59	55.33	55.9439135065445	-0.613913506544466
60	57.03	57.0739135065445	-0.0439135065444644
61	63.18	59.6068918691667	3.57310813083328
62	60.19	60.3611775834525	-0.171177583452458
63	62.12	62.512606154881	-0.392606154881023
64	70.12	63.9097490120239	6.21025098797612
65	69.75	64.4583204405953	5.29167955940469
66	68.56	63.3018977549819	5.25810224501808
67	73.77	65.5961834692676	8.17381653073236
68	73.23	66.7433263264105	6.48667367358951
69	61.96	64.7265408223325	-2.76654082233247
70	57.81	63.8848741556658	-6.0748741556658
71	58.76	61.3998741556658	-2.6398741556658
72	62.47	62.5298741556658	-0.0598741556658028
73	53.68	65.062852518288	-11.3828525182881
74	57.56	65.8171382325738	-8.25713823257379
75	62.05	67.9685668040024	-5.91856680400236
76	67.49	69.3657096611452	-1.87570966114522
77	67.21	69.9142810897166	-2.70428108971665
78	71.05	68.7578584041033	2.29214159589674
79	76.93	71.052144118389	5.87785588161104
80	70.76	72.1992869755318	-1.43928697553182

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 25.62 & 17.9321298242665 & 7.68787017573355 \tabularnewline
2 & 27.5 & 18.6864155385521 & 8.81358446144793 \tabularnewline
3 & 24.5 & 20.8378441099807 & 3.66215589001934 \tabularnewline
4 & 25.66 & 22.2349869671235 & 3.4250130328765 \tabularnewline
5 & 28.31 & 22.7835583956949 & 5.52644160430505 \tabularnewline
6 & 27.85 & 21.6271357100816 & 6.22286428991844 \tabularnewline
7 & 24.61 & 23.9214214243673 & 0.68857857563273 \tabularnewline
8 & 25.68 & 25.0685642815101 & 0.611435718489871 \tabularnewline
9 & 25.62 & 23.0517787774321 & 2.5682212225679 \tabularnewline
10 & 20.54 & 22.2101121107654 & -1.67011211076543 \tabularnewline
11 & 18.8 & 19.7251121107654 & -0.925112110765436 \tabularnewline
12 & 18.71 & 20.8551121107654 & -2.14511211076543 \tabularnewline
13 & 19.46 & 23.3880904733877 & -3.92809047338770 \tabularnewline
14 & 20.12 & 24.1423761876734 & -4.02237618767342 \tabularnewline
15 & 23.54 & 26.293804759102 & -2.75380475910199 \tabularnewline
16 & 25.6 & 27.6909476162449 & -2.09094761624485 \tabularnewline
17 & 25.39 & 28.2395190448163 & -2.84951904481627 \tabularnewline
18 & 24.09 & 27.0830963592029 & -2.99309635920289 \tabularnewline
19 & 25.69 & 29.3773820734886 & -3.68738207348860 \tabularnewline
20 & 26.56 & 30.5245249306315 & -3.96452493063146 \tabularnewline
21 & 28.33 & 28.5077394265534 & -0.177739426553438 \tabularnewline
22 & 27.5 & 27.6660727598868 & -0.166072759886770 \tabularnewline
23 & 24.23 & 25.1810727598868 & -0.951072759886767 \tabularnewline
24 & 28.23 & 26.3110727598868 & 1.91892724011323 \tabularnewline
25 & 31.29 & 28.844051122509 & 2.44594887749098 \tabularnewline
26 & 32.72 & 29.5983368367948 & 3.12166316320524 \tabularnewline
27 & 30.46 & 31.7497654082233 & -1.28976540822332 \tabularnewline
28 & 24.89 & 33.1469082653662 & -8.25690826536619 \tabularnewline
29 & 25.68 & 33.6954796939376 & -8.0154796939376 \tabularnewline
30 & 27.52 & 32.5390570083242 & -5.01905700832422 \tabularnewline
31 & 28.4 & 34.8333427226099 & -6.43334272260994 \tabularnewline
32 & 29.71 & 35.9804855797528 & -6.27048557975279 \tabularnewline
33 & 26.85 & 33.9637000756748 & -7.11370007567476 \tabularnewline
34 & 29.62 & 33.1220334090081 & -3.5020334090081 \tabularnewline
35 & 28.69 & 30.6370334090081 & -1.9470334090081 \tabularnewline
36 & 29.76 & 31.7670334090081 & -2.0070334090081 \tabularnewline
37 & 31.3 & 34.3000117716304 & -3.00001177163035 \tabularnewline
38 & 30.86 & 35.0542974859161 & -4.19429748591609 \tabularnewline
39 & 33.46 & 37.2057260573447 & -3.74572605734466 \tabularnewline
40 & 33.15 & 38.6028689144875 & -5.45286891448752 \tabularnewline
41 & 37.99 & 39.1514403430589 & -1.16144034305894 \tabularnewline
42 & 35.24 & 37.9950176574456 & -2.75501765744556 \tabularnewline
43 & 38.24 & 40.2893033717313 & -2.04930337173127 \tabularnewline
44 & 43.16 & 41.4364462288741 & 1.72355377112587 \tabularnewline
45 & 43.33 & 39.4196607247961 & 3.91033927520389 \tabularnewline
46 & 49.67 & 38.5779940581294 & 11.0920059418706 \tabularnewline
47 & 43.17 & 36.0929940581294 & 7.07700594187056 \tabularnewline
48 & 39.56 & 37.2229940581294 & 2.33700594187057 \tabularnewline
49 & 44.36 & 39.7559724207517 & 4.60402757924831 \tabularnewline
50 & 45.22 & 40.5102581350374 & 4.70974186496258 \tabularnewline
51 & 53.1 & 42.661686706466 & 10.4383132935340 \tabularnewline
52 & 52.1 & 44.0588295636088 & 8.04117043639115 \tabularnewline
53 & 48.52 & 44.6074009921803 & 3.91259900781973 \tabularnewline
54 & 54.84 & 57.8459371058606 & -3.00593710586058 \tabularnewline
55 & 57.57 & 60.1402228201463 & -2.5702228201463 \tabularnewline
56 & 64.14 & 61.2873656772892 & 2.85263432271084 \tabularnewline
57 & 62.85 & 59.2705801732111 & 3.57941982678887 \tabularnewline
58 & 58.75 & 58.4289135065445 & 0.321086493455532 \tabularnewline
59 & 55.33 & 55.9439135065445 & -0.613913506544466 \tabularnewline
60 & 57.03 & 57.0739135065445 & -0.0439135065444644 \tabularnewline
61 & 63.18 & 59.6068918691667 & 3.57310813083328 \tabularnewline
62 & 60.19 & 60.3611775834525 & -0.171177583452458 \tabularnewline
63 & 62.12 & 62.512606154881 & -0.392606154881023 \tabularnewline
64 & 70.12 & 63.9097490120239 & 6.21025098797612 \tabularnewline
65 & 69.75 & 64.4583204405953 & 5.29167955940469 \tabularnewline
66 & 68.56 & 63.3018977549819 & 5.25810224501808 \tabularnewline
67 & 73.77 & 65.5961834692676 & 8.17381653073236 \tabularnewline
68 & 73.23 & 66.7433263264105 & 6.48667367358951 \tabularnewline
69 & 61.96 & 64.7265408223325 & -2.76654082233247 \tabularnewline
70 & 57.81 & 63.8848741556658 & -6.0748741556658 \tabularnewline
71 & 58.76 & 61.3998741556658 & -2.6398741556658 \tabularnewline
72 & 62.47 & 62.5298741556658 & -0.0598741556658028 \tabularnewline
73 & 53.68 & 65.062852518288 & -11.3828525182881 \tabularnewline
74 & 57.56 & 65.8171382325738 & -8.25713823257379 \tabularnewline
75 & 62.05 & 67.9685668040024 & -5.91856680400236 \tabularnewline
76 & 67.49 & 69.3657096611452 & -1.87570966114522 \tabularnewline
77 & 67.21 & 69.9142810897166 & -2.70428108971665 \tabularnewline
78 & 71.05 & 68.7578584041033 & 2.29214159589674 \tabularnewline
79 & 76.93 & 71.052144118389 & 5.87785588161104 \tabularnewline
80 & 70.76 & 72.1992869755318 & -1.43928697553182 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=4366&T=4


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=4366&T=4


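The Interpolation (Forecast) column is simply the in-sample fitted value, i.e. the actual minus the residual, which is how the module builds the table. A sketch that reproduces the first few rows, assuming fit and dat2 from the earlier sketches:

interp <- fitted(fit)                     # identical to dat2$Brent - residuals(fit)
head(data.frame(Time          = seq_along(interp),
                Actuals       = dat2$Brent,
                Interpolation = interp,
                Residuals     = residuals(fit)))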



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1]) # move the dependent variable (column par1) into the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # parenthesized: 1:n-1 would start the loop at 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable') # server-side helpers (table.start, table.element, hyperlink, ...) used below
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
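The script above expects several objects that the FreeStatistics.org server normally supplies: the data matrix y (it is transposed back with x <- t(y) near the top, so y holds variables in rows), the parameters par1, par2 and par3, and the 'createtable' workspace that defines table.start(), table.element(), table.row.start(), hyperlink() and related helpers. A hedged sketch of how those inputs might be supplied when experimenting with the script locally; every name here is an assumption, and the table-building section will still fail without the helper workspace:

# Hypothetical local setup for the script above (not part of the original transaction)
dat  <- read.table('brent_iran.txt', col.names = c('Brent', 'Iran'))  # hypothetical file with the data series
y    <- t(as.matrix(dat))        # variables in rows; the script transposes it back via x <- t(y)
par1 <- '1'                      # dependent variable = column 1 (Brent); the script calls as.numeric(par1)
par2 <- 'Include Monthly Dummies'
par3 <- 'Linear Trend'
# The data preparation and the lm() fit will run with these inputs; the table.*() and
# hyperlink() calls additionally need the server-side 'createtable' workspace, which is
# not reproduced on this page.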