Free Statistics


Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 14 Nov 2007 13:28:26 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/14/t1195071848fsvi267qwgcloi3.htm/, Retrieved Tue, 07 May 2024 00:13:34 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=14449, Retrieved Tue, 07 May 2024 00:13:34 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords: WS6RMPV
Estimated Impact: 266
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [WS6 - Regression ...] [2007-11-14 20:28:26] [043b25469ff995e98bde0a26b8a4f1a8] [Current]
Feedback Forum

Dataseries X:
97,3	0
101	0
113,2	0
101	0
105,7	0
113,9	0
86,4	0
96,5	0
103,3	0
114,9	0
105,8	0
94,2	0
98,4	0
99,4	0
108,8	0
112,6	0
104,4	0
112,2	0
81,1	0
97,1	0
112,6	0
113,8	0
107,8	0
103,2	0
103,3	0
101,2	0
107,7	0
110,4	0
101,9	0
115,9	0
89,9	1
88,6	1
117,2	1
123,9	1
100	1
103,6	1
94,1	1
98,7	1
119,5	1
112,7	1
104,4	1
124,7	1
89,1	1
97	1
121,6	1
118,8	1
114	1
111,5	1
97,2	1
102,5	1
113,4	1
109,8	1
104,9	1
126,1	1
80	1
96,8	1
117,2	1
112,3	1
117,3	1
111,1	1
102,2	1
104,3	1
122,9	1
107,6	1
121,3	1
131,5	1
89	1
104,4	1
128,9	1
135,9	1
133,3	1
121,3	1
120,5	1
120,4	1
137,9	1
126,1	1
133,2	1
146,6	0
103,4	0
117,2	0
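
The series above is tab-separated with European decimal commas (e.g. 97,3 means 97.3); the first column is the response Y and the second is the 0/1 regressor X used in the model below. A minimal sketch of how it could be read into R, assuming the series has been saved locally as 'dataseries.txt' (a hypothetical file name):

# Read the two-column series; dec=',' converts the comma decimals.
y <- read.table('dataseries.txt', sep='\t', dec=',', col.names=c('Y', 'X'))
str(y)   # expected: 80 observations of 2 numeric variables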




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 4 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 4 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=14449&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]4 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=14449&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=14449&T=0



Multiple Linear Regression - Estimated Regression Equation
Y[t] = + 95.6270870655679 -6.5084378359011X[t] -4.31804112424862M1[t] -2.63221266770174M2[t] + 10.6821872174165M3[t] + 4.12515853110623M4[t] + 3.11098698765305M5[t] + 15.3813243247855M6[t] -20.0744989563961M7[t] -9.21724192842071M8[t] + 10.4734670113119M9[t] + 13.2212002297635M10[t] + 5.93560011488173M11[t] + 0.385600114881735t + e[t]
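
The fitted value for any observation follows directly from this equation: X[t] is the 0/1 regressor from the data series, M1[t]..M11[t] are monthly dummies, and t is the linear trend index. A small arithmetic sketch in R for observation 3 (X = 0, M3 = 1, all other dummies 0):

# Fitted value for index t = 3 from the coefficients above.
95.6270870655679 - 6.5084378359011*0 + 10.6821872174165*1 + 0.385600114881735*3
# = 107.466074627630, matching the interpolation reported for index 3 below.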

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Y[t] =  +  95.6270870655679 -6.5084378359011X[t] -4.31804112424862M1[t] -2.63221266770174M2[t] +  10.6821872174165M3[t] +  4.12515853110623M4[t] +  3.11098698765305M5[t] +  15.3813243247855M6[t] -20.0744989563961M7[t] -9.21724192842071M8[t] +  10.4734670113119M9[t] +  13.2212002297635M10[t] +  5.93560011488173M11[t] +  0.385600114881735t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=14449&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]Y[t] =  +  95.6270870655679 -6.5084378359011X[t] -4.31804112424862M1[t] -2.63221266770174M2[t] +  10.6821872174165M3[t] +  4.12515853110623M4[t] +  3.11098698765305M5[t] +  15.3813243247855M6[t] -20.0744989563961M7[t] -9.21724192842071M8[t] +  10.4734670113119M9[t] +  13.2212002297635M10[t] +  5.93560011488173M11[t] +  0.385600114881735t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=14449&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=14449&T=1



Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	95.6270870655679	2.705772	35.3419	0	0
X	-6.5084378359011	1.923382	-3.3839	0.001207	0.000603
M1	-4.31804112424862	3.30037	-1.3084	0.195293	0.097646
M2	-2.63221266770174	3.299704	-0.7977	0.427899	0.213949
M3	10.6821872174165	3.299543	3.2375	0.001888	0.000944
M4	4.12515853110623	3.299886	1.2501	0.215681	0.10784
M5	3.11098698765305	3.300733	0.9425	0.349367	0.174683
M6	15.3813243247855	3.328652	4.6209	1.8e-05	9e-06
M7	-20.0744989563961	3.303939	-6.0759	0	0
M8	-9.21724192842071	3.306295	-2.7878	0.006925	0.003463
M9	10.4734670113119	3.423646	3.0592	0.003206	0.001603
M10	13.2212002297635	3.422431	3.8631	0.000258	0.000129
M11	5.93560011488173	3.421701	1.7347	0.087463	0.043732
t	0.385600114881735	0.040797	9.4516	0	0
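
The 1-tail p-value in each row is half of the 2-tail value, and the 2-tail value follows from the t statistic with 80 - 14 = 66 residual degrees of freedom. A sketch for the X coefficient, using the rounded t statistic from the table:

# Two-tail and one-tail p-values for X from its t statistic (66 residual df).
p2 <- 2 * pt(abs(-3.3839), df = 66, lower.tail = FALSE)   # about 0.001207
p1 <- p2 / 2                                              # about 0.000603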

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 95.6270870655679 & 2.705772 & 35.3419 & 0 & 0 \tabularnewline
X & -6.5084378359011 & 1.923382 & -3.3839 & 0.001207 & 0.000603 \tabularnewline
M1 & -4.31804112424862 & 3.30037 & -1.3084 & 0.195293 & 0.097646 \tabularnewline
M2 & -2.63221266770174 & 3.299704 & -0.7977 & 0.427899 & 0.213949 \tabularnewline
M3 & 10.6821872174165 & 3.299543 & 3.2375 & 0.001888 & 0.000944 \tabularnewline
M4 & 4.12515853110623 & 3.299886 & 1.2501 & 0.215681 & 0.10784 \tabularnewline
M5 & 3.11098698765305 & 3.300733 & 0.9425 & 0.349367 & 0.174683 \tabularnewline
M6 & 15.3813243247855 & 3.328652 & 4.6209 & 1.8e-05 & 9e-06 \tabularnewline
M7 & -20.0744989563961 & 3.303939 & -6.0759 & 0 & 0 \tabularnewline
M8 & -9.21724192842071 & 3.306295 & -2.7878 & 0.006925 & 0.003463 \tabularnewline
M9 & 10.4734670113119 & 3.423646 & 3.0592 & 0.003206 & 0.001603 \tabularnewline
M10 & 13.2212002297635 & 3.422431 & 3.8631 & 0.000258 & 0.000129 \tabularnewline
M11 & 5.93560011488173 & 3.421701 & 1.7347 & 0.087463 & 0.043732 \tabularnewline
t & 0.385600114881735 & 0.040797 & 9.4516 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=14449&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]95.6270870655679[/C][C]2.705772[/C][C]35.3419[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]X[/C][C]-6.5084378359011[/C][C]1.923382[/C][C]-3.3839[/C][C]0.001207[/C][C]0.000603[/C][/ROW]
[ROW][C]M1[/C][C]-4.31804112424862[/C][C]3.30037[/C][C]-1.3084[/C][C]0.195293[/C][C]0.097646[/C][/ROW]
[ROW][C]M2[/C][C]-2.63221266770174[/C][C]3.299704[/C][C]-0.7977[/C][C]0.427899[/C][C]0.213949[/C][/ROW]
[ROW][C]M3[/C][C]10.6821872174165[/C][C]3.299543[/C][C]3.2375[/C][C]0.001888[/C][C]0.000944[/C][/ROW]
[ROW][C]M4[/C][C]4.12515853110623[/C][C]3.299886[/C][C]1.2501[/C][C]0.215681[/C][C]0.10784[/C][/ROW]
[ROW][C]M5[/C][C]3.11098698765305[/C][C]3.300733[/C][C]0.9425[/C][C]0.349367[/C][C]0.174683[/C][/ROW]
[ROW][C]M6[/C][C]15.3813243247855[/C][C]3.328652[/C][C]4.6209[/C][C]1.8e-05[/C][C]9e-06[/C][/ROW]
[ROW][C]M7[/C][C]-20.0744989563961[/C][C]3.303939[/C][C]-6.0759[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M8[/C][C]-9.21724192842071[/C][C]3.306295[/C][C]-2.7878[/C][C]0.006925[/C][C]0.003463[/C][/ROW]
[ROW][C]M9[/C][C]10.4734670113119[/C][C]3.423646[/C][C]3.0592[/C][C]0.003206[/C][C]0.001603[/C][/ROW]
[ROW][C]M10[/C][C]13.2212002297635[/C][C]3.422431[/C][C]3.8631[/C][C]0.000258[/C][C]0.000129[/C][/ROW]
[ROW][C]M11[/C][C]5.93560011488173[/C][C]3.421701[/C][C]1.7347[/C][C]0.087463[/C][C]0.043732[/C][/ROW]
[ROW][C]t[/C][C]0.385600114881735[/C][C]0.040797[/C][C]9.4516[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=14449&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=14449&T=2



Multiple Linear Regression - Regression Statistics
Multiple R	0.911986029816551
R-squared	0.831718518580555
Adjusted R-squared	0.798572166179755
F-TEST (value)	25.0923090578282
F-TEST (DF numerator)	13
F-TEST (DF denominator)	66
p-value	0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	5.92613924965973
Sum Squared Residuals	2317.86234281960
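
These statistics are related in the usual way: Multiple R is the square root of R-squared, Adjusted R-squared corrects R-squared for the 13 regressors, and the F-test p-value comes from the F(13, 66) distribution. A sketch that reproduces the values above:

# Regression statistics recomputed from the values reported above.
r2 <- 0.831718518580555
sqrt(r2)                                          # Multiple R, about 0.911986
1 - (1 - r2) * (80 - 1) / (80 - 13 - 1)           # Adjusted R-squared, about 0.798572
pf(25.0923090578282, 13, 66, lower.tail = FALSE)  # F-test p-value, effectively 0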

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.911986029816551 \tabularnewline
R-squared & 0.831718518580555 \tabularnewline
Adjusted R-squared & 0.798572166179755 \tabularnewline
F-TEST (value) & 25.0923090578282 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 66 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 5.92613924965973 \tabularnewline
Sum Squared Residuals & 2317.86234281960 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=14449&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.911986029816551[/C][/ROW]
[ROW][C]R-squared[/C][C]0.831718518580555[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.798572166179755[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]25.0923090578282[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]13[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]66[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]5.92613924965973[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]2317.86234281960[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=14449&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=14449&T=3



Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	97.3	91.6946460562013	5.60535394379869
2	101	93.7660746276296	7.23392537237036
3	113.2	107.466074627630	5.73392537237038
4	101	101.294646056201	-0.294646056201046
5	105.7	100.666074627630	5.0339253723704
6	113.9	113.322012079644	0.577987920356247
7	86.4	78.2517889133439	8.14821108665613
8	96.5	89.494646056201	7.00535394379895
9	103.3	109.570955110815	-6.2709551108154
10	114.9	112.704288444149	2.19571155585129
11	105.8	105.804288444149	-0.00428844414869245
12	94.2	100.254288444149	-6.05428844414871
13	98.4	96.3218474347818	2.07815256521820
14	99.4	98.3932760062104	1.00672399378957
15	108.8	112.093276006210	-3.29327600621043
16	112.6	105.921847434782	6.67815256521813
17	104.4	105.293276006210	-0.89327600621043
18	112.2	117.949213458225	-5.74921345822455
19	81.1	82.8789902919247	-1.77899029192473
20	97.1	94.1218474347819	2.97815256521814
21	112.6	114.198156489396	-1.59815648939619
22	113.8	117.331489822730	-3.53148982272952
23	107.8	110.431489822730	-2.63148982272953
24	103.2	104.881489822730	-1.68148982272952
25	103.3	100.949048813363	2.35095118663737
26	101.2	103.020477384791	-1.82047738479124
27	107.7	116.720477384791	-9.02047738479125
28	110.4	110.549048813363	-0.149048813362669
29	101.9	109.920477384791	-8.02047738479125
30	115.9	122.576414836805	-6.67641483680536
31	89.9	80.9977538346044	8.90224616539558
32	88.6	92.2406109774616	-3.64061097746157
33	117.2	112.316920032076	4.88307996792412
34	123.9	115.450253365409	8.44974663459079
35	100	108.550253365409	-8.55025336540922
36	103.6	103.000253365409	0.599746634590766
37	94.1	99.0678123560423	-4.96781235604234
38	98.7	101.139240927471	-2.43924092747095
39	119.5	114.839240927471	4.66075907252905
40	112.7	108.667812356042	4.03218764395762
41	104.4	108.039240927471	-3.63924092747095
42	124.7	120.695178379485	4.00482162051493
43	89.1	85.6249552131852	3.47504478681475
44	97	96.8678123560424	0.132187643957619
45	121.6	116.944121410657	4.6558785893433
46	118.8	120.07745474399	-1.27745474399004
47	114	113.17745474399	0.82254525600996
48	111.5	107.62745474399	3.87254525600996
49	97.2	103.695013734623	-6.49501373462315
50	102.5	105.766442306052	-3.26644230605176
51	113.4	119.466442306052	-6.06644230605176
52	109.8	113.295013734623	-3.4950137346232
53	104.9	112.666442306052	-7.76644230605176
54	126.1	125.322379758066	0.7776202419341
55	80	90.252156591766	-10.2521565917661
56	96.8	101.495013734623	-4.6950137346232
57	117.2	121.571322789238	-4.37132278923751
58	112.3	124.704656122571	-12.4046561225709
59	117.3	117.804656122571	-0.50465612257086
60	111.1	112.254656122571	-1.15465612257086
61	102.2	108.322215113204	-6.12221511320397
62	104.3	110.393643684633	-6.09364368463258
63	122.9	124.093643684633	-1.19364368463258
64	107.6	117.922215113204	-10.322215113204
65	121.3	117.293643684633	4.00635631536741
66	131.5	129.949581136647	1.55041886335329
67	89	94.8793579703469	-5.87935797034688
68	104.4	106.122215113204	-1.72221511320401
69	128.9	126.198524167818	2.70147583218168
70	135.9	129.331857501152	6.56814249884834
71	133.3	122.431857501152	10.8681424988483
72	121.3	116.881857501152	4.41814249884832
73	120.5	112.949416491785	7.55058350821522
74	120.4	115.020845063213	5.37915493678661
75	137.9	128.720845063213	9.17915493678661
76	126.1	122.549416491785	3.55058350821516
77	133.2	121.920845063213	11.2791549367866
78	146.6	141.085220351129	5.51477964887135
79	103.4	106.014997184829	-2.6149971848288
80	117.2	117.257854327686	-0.0578543276859312
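
Each Interpolation (Forecast) value is the model's fitted value and each Residual (Prediction Error) is Actual minus Interpolation; the squared residuals sum to the Sum Squared Residuals reported above (about 2317.86). A sketch of how this table can be rebuilt from the fitted model in the R code at the end of this page, where mylm, mysum, x and n are defined:

# Rebuild the actuals/interpolation/residuals table from the fitted model.
tab <- data.frame(Index = 1:n,
                  Actuals = x[, 1],
                  Interpolation = x[, 1] - mysum$resid,
                  Residuals = mysum$resid)
sum(tab$Residuals^2)   # should reproduce the Sum Squared Residuals above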

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 97.3 & 91.6946460562013 & 5.60535394379869 \tabularnewline
2 & 101 & 93.7660746276296 & 7.23392537237036 \tabularnewline
3 & 113.2 & 107.466074627630 & 5.73392537237038 \tabularnewline
4 & 101 & 101.294646056201 & -0.294646056201046 \tabularnewline
5 & 105.7 & 100.666074627630 & 5.0339253723704 \tabularnewline
6 & 113.9 & 113.322012079644 & 0.577987920356247 \tabularnewline
7 & 86.4 & 78.2517889133439 & 8.14821108665613 \tabularnewline
8 & 96.5 & 89.494646056201 & 7.00535394379895 \tabularnewline
9 & 103.3 & 109.570955110815 & -6.2709551108154 \tabularnewline
10 & 114.9 & 112.704288444149 & 2.19571155585129 \tabularnewline
11 & 105.8 & 105.804288444149 & -0.00428844414869245 \tabularnewline
12 & 94.2 & 100.254288444149 & -6.05428844414871 \tabularnewline
13 & 98.4 & 96.3218474347818 & 2.07815256521820 \tabularnewline
14 & 99.4 & 98.3932760062104 & 1.00672399378957 \tabularnewline
15 & 108.8 & 112.093276006210 & -3.29327600621043 \tabularnewline
16 & 112.6 & 105.921847434782 & 6.67815256521813 \tabularnewline
17 & 104.4 & 105.293276006210 & -0.89327600621043 \tabularnewline
18 & 112.2 & 117.949213458225 & -5.74921345822455 \tabularnewline
19 & 81.1 & 82.8789902919247 & -1.77899029192473 \tabularnewline
20 & 97.1 & 94.1218474347819 & 2.97815256521814 \tabularnewline
21 & 112.6 & 114.198156489396 & -1.59815648939619 \tabularnewline
22 & 113.8 & 117.331489822730 & -3.53148982272952 \tabularnewline
23 & 107.8 & 110.431489822730 & -2.63148982272953 \tabularnewline
24 & 103.2 & 104.881489822730 & -1.68148982272952 \tabularnewline
25 & 103.3 & 100.949048813363 & 2.35095118663737 \tabularnewline
26 & 101.2 & 103.020477384791 & -1.82047738479124 \tabularnewline
27 & 107.7 & 116.720477384791 & -9.02047738479125 \tabularnewline
28 & 110.4 & 110.549048813363 & -0.149048813362669 \tabularnewline
29 & 101.9 & 109.920477384791 & -8.02047738479125 \tabularnewline
30 & 115.9 & 122.576414836805 & -6.67641483680536 \tabularnewline
31 & 89.9 & 80.9977538346044 & 8.90224616539558 \tabularnewline
32 & 88.6 & 92.2406109774616 & -3.64061097746157 \tabularnewline
33 & 117.2 & 112.316920032076 & 4.88307996792412 \tabularnewline
34 & 123.9 & 115.450253365409 & 8.44974663459079 \tabularnewline
35 & 100 & 108.550253365409 & -8.55025336540922 \tabularnewline
36 & 103.6 & 103.000253365409 & 0.599746634590766 \tabularnewline
37 & 94.1 & 99.0678123560423 & -4.96781235604234 \tabularnewline
38 & 98.7 & 101.139240927471 & -2.43924092747095 \tabularnewline
39 & 119.5 & 114.839240927471 & 4.66075907252905 \tabularnewline
40 & 112.7 & 108.667812356042 & 4.03218764395762 \tabularnewline
41 & 104.4 & 108.039240927471 & -3.63924092747095 \tabularnewline
42 & 124.7 & 120.695178379485 & 4.00482162051493 \tabularnewline
43 & 89.1 & 85.6249552131852 & 3.47504478681475 \tabularnewline
44 & 97 & 96.8678123560424 & 0.132187643957619 \tabularnewline
45 & 121.6 & 116.944121410657 & 4.6558785893433 \tabularnewline
46 & 118.8 & 120.07745474399 & -1.27745474399004 \tabularnewline
47 & 114 & 113.17745474399 & 0.82254525600996 \tabularnewline
48 & 111.5 & 107.62745474399 & 3.87254525600996 \tabularnewline
49 & 97.2 & 103.695013734623 & -6.49501373462315 \tabularnewline
50 & 102.5 & 105.766442306052 & -3.26644230605176 \tabularnewline
51 & 113.4 & 119.466442306052 & -6.06644230605176 \tabularnewline
52 & 109.8 & 113.295013734623 & -3.4950137346232 \tabularnewline
53 & 104.9 & 112.666442306052 & -7.76644230605176 \tabularnewline
54 & 126.1 & 125.322379758066 & 0.7776202419341 \tabularnewline
55 & 80 & 90.252156591766 & -10.2521565917661 \tabularnewline
56 & 96.8 & 101.495013734623 & -4.6950137346232 \tabularnewline
57 & 117.2 & 121.571322789238 & -4.37132278923751 \tabularnewline
58 & 112.3 & 124.704656122571 & -12.4046561225709 \tabularnewline
59 & 117.3 & 117.804656122571 & -0.50465612257086 \tabularnewline
60 & 111.1 & 112.254656122571 & -1.15465612257086 \tabularnewline
61 & 102.2 & 108.322215113204 & -6.12221511320397 \tabularnewline
62 & 104.3 & 110.393643684633 & -6.09364368463258 \tabularnewline
63 & 122.9 & 124.093643684633 & -1.19364368463258 \tabularnewline
64 & 107.6 & 117.922215113204 & -10.322215113204 \tabularnewline
65 & 121.3 & 117.293643684633 & 4.00635631536741 \tabularnewline
66 & 131.5 & 129.949581136647 & 1.55041886335329 \tabularnewline
67 & 89 & 94.8793579703469 & -5.87935797034688 \tabularnewline
68 & 104.4 & 106.122215113204 & -1.72221511320401 \tabularnewline
69 & 128.9 & 126.198524167818 & 2.70147583218168 \tabularnewline
70 & 135.9 & 129.331857501152 & 6.56814249884834 \tabularnewline
71 & 133.3 & 122.431857501152 & 10.8681424988483 \tabularnewline
72 & 121.3 & 116.881857501152 & 4.41814249884832 \tabularnewline
73 & 120.5 & 112.949416491785 & 7.55058350821522 \tabularnewline
74 & 120.4 & 115.020845063213 & 5.37915493678661 \tabularnewline
75 & 137.9 & 128.720845063213 & 9.17915493678661 \tabularnewline
76 & 126.1 & 122.549416491785 & 3.55058350821516 \tabularnewline
77 & 133.2 & 121.920845063213 & 11.2791549367866 \tabularnewline
78 & 146.6 & 141.085220351129 & 5.51477964887135 \tabularnewline
79 & 103.4 & 106.014997184829 & -2.6149971848288 \tabularnewline
80 & 117.2 & 117.257854327686 & -0.0578543276859312 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=14449&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]97.3[/C][C]91.6946460562013[/C][C]5.60535394379869[/C][/ROW]
[ROW][C]2[/C][C]101[/C][C]93.7660746276296[/C][C]7.23392537237036[/C][/ROW]
[ROW][C]3[/C][C]113.2[/C][C]107.466074627630[/C][C]5.73392537237038[/C][/ROW]
[ROW][C]4[/C][C]101[/C][C]101.294646056201[/C][C]-0.294646056201046[/C][/ROW]
[ROW][C]5[/C][C]105.7[/C][C]100.666074627630[/C][C]5.0339253723704[/C][/ROW]
[ROW][C]6[/C][C]113.9[/C][C]113.322012079644[/C][C]0.577987920356247[/C][/ROW]
[ROW][C]7[/C][C]86.4[/C][C]78.2517889133439[/C][C]8.14821108665613[/C][/ROW]
[ROW][C]8[/C][C]96.5[/C][C]89.494646056201[/C][C]7.00535394379895[/C][/ROW]
[ROW][C]9[/C][C]103.3[/C][C]109.570955110815[/C][C]-6.2709551108154[/C][/ROW]
[ROW][C]10[/C][C]114.9[/C][C]112.704288444149[/C][C]2.19571155585129[/C][/ROW]
[ROW][C]11[/C][C]105.8[/C][C]105.804288444149[/C][C]-0.00428844414869245[/C][/ROW]
[ROW][C]12[/C][C]94.2[/C][C]100.254288444149[/C][C]-6.05428844414871[/C][/ROW]
[ROW][C]13[/C][C]98.4[/C][C]96.3218474347818[/C][C]2.07815256521820[/C][/ROW]
[ROW][C]14[/C][C]99.4[/C][C]98.3932760062104[/C][C]1.00672399378957[/C][/ROW]
[ROW][C]15[/C][C]108.8[/C][C]112.093276006210[/C][C]-3.29327600621043[/C][/ROW]
[ROW][C]16[/C][C]112.6[/C][C]105.921847434782[/C][C]6.67815256521813[/C][/ROW]
[ROW][C]17[/C][C]104.4[/C][C]105.293276006210[/C][C]-0.89327600621043[/C][/ROW]
[ROW][C]18[/C][C]112.2[/C][C]117.949213458225[/C][C]-5.74921345822455[/C][/ROW]
[ROW][C]19[/C][C]81.1[/C][C]82.8789902919247[/C][C]-1.77899029192473[/C][/ROW]
[ROW][C]20[/C][C]97.1[/C][C]94.1218474347819[/C][C]2.97815256521814[/C][/ROW]
[ROW][C]21[/C][C]112.6[/C][C]114.198156489396[/C][C]-1.59815648939619[/C][/ROW]
[ROW][C]22[/C][C]113.8[/C][C]117.331489822730[/C][C]-3.53148982272952[/C][/ROW]
[ROW][C]23[/C][C]107.8[/C][C]110.431489822730[/C][C]-2.63148982272953[/C][/ROW]
[ROW][C]24[/C][C]103.2[/C][C]104.881489822730[/C][C]-1.68148982272952[/C][/ROW]
[ROW][C]25[/C][C]103.3[/C][C]100.949048813363[/C][C]2.35095118663737[/C][/ROW]
[ROW][C]26[/C][C]101.2[/C][C]103.020477384791[/C][C]-1.82047738479124[/C][/ROW]
[ROW][C]27[/C][C]107.7[/C][C]116.720477384791[/C][C]-9.02047738479125[/C][/ROW]
[ROW][C]28[/C][C]110.4[/C][C]110.549048813363[/C][C]-0.149048813362669[/C][/ROW]
[ROW][C]29[/C][C]101.9[/C][C]109.920477384791[/C][C]-8.02047738479125[/C][/ROW]
[ROW][C]30[/C][C]115.9[/C][C]122.576414836805[/C][C]-6.67641483680536[/C][/ROW]
[ROW][C]31[/C][C]89.9[/C][C]80.9977538346044[/C][C]8.90224616539558[/C][/ROW]
[ROW][C]32[/C][C]88.6[/C][C]92.2406109774616[/C][C]-3.64061097746157[/C][/ROW]
[ROW][C]33[/C][C]117.2[/C][C]112.316920032076[/C][C]4.88307996792412[/C][/ROW]
[ROW][C]34[/C][C]123.9[/C][C]115.450253365409[/C][C]8.44974663459079[/C][/ROW]
[ROW][C]35[/C][C]100[/C][C]108.550253365409[/C][C]-8.55025336540922[/C][/ROW]
[ROW][C]36[/C][C]103.6[/C][C]103.000253365409[/C][C]0.599746634590766[/C][/ROW]
[ROW][C]37[/C][C]94.1[/C][C]99.0678123560423[/C][C]-4.96781235604234[/C][/ROW]
[ROW][C]38[/C][C]98.7[/C][C]101.139240927471[/C][C]-2.43924092747095[/C][/ROW]
[ROW][C]39[/C][C]119.5[/C][C]114.839240927471[/C][C]4.66075907252905[/C][/ROW]
[ROW][C]40[/C][C]112.7[/C][C]108.667812356042[/C][C]4.03218764395762[/C][/ROW]
[ROW][C]41[/C][C]104.4[/C][C]108.039240927471[/C][C]-3.63924092747095[/C][/ROW]
[ROW][C]42[/C][C]124.7[/C][C]120.695178379485[/C][C]4.00482162051493[/C][/ROW]
[ROW][C]43[/C][C]89.1[/C][C]85.6249552131852[/C][C]3.47504478681475[/C][/ROW]
[ROW][C]44[/C][C]97[/C][C]96.8678123560424[/C][C]0.132187643957619[/C][/ROW]
[ROW][C]45[/C][C]121.6[/C][C]116.944121410657[/C][C]4.6558785893433[/C][/ROW]
[ROW][C]46[/C][C]118.8[/C][C]120.07745474399[/C][C]-1.27745474399004[/C][/ROW]
[ROW][C]47[/C][C]114[/C][C]113.17745474399[/C][C]0.82254525600996[/C][/ROW]
[ROW][C]48[/C][C]111.5[/C][C]107.62745474399[/C][C]3.87254525600996[/C][/ROW]
[ROW][C]49[/C][C]97.2[/C][C]103.695013734623[/C][C]-6.49501373462315[/C][/ROW]
[ROW][C]50[/C][C]102.5[/C][C]105.766442306052[/C][C]-3.26644230605176[/C][/ROW]
[ROW][C]51[/C][C]113.4[/C][C]119.466442306052[/C][C]-6.06644230605176[/C][/ROW]
[ROW][C]52[/C][C]109.8[/C][C]113.295013734623[/C][C]-3.4950137346232[/C][/ROW]
[ROW][C]53[/C][C]104.9[/C][C]112.666442306052[/C][C]-7.76644230605176[/C][/ROW]
[ROW][C]54[/C][C]126.1[/C][C]125.322379758066[/C][C]0.7776202419341[/C][/ROW]
[ROW][C]55[/C][C]80[/C][C]90.252156591766[/C][C]-10.2521565917661[/C][/ROW]
[ROW][C]56[/C][C]96.8[/C][C]101.495013734623[/C][C]-4.6950137346232[/C][/ROW]
[ROW][C]57[/C][C]117.2[/C][C]121.571322789238[/C][C]-4.37132278923751[/C][/ROW]
[ROW][C]58[/C][C]112.3[/C][C]124.704656122571[/C][C]-12.4046561225709[/C][/ROW]
[ROW][C]59[/C][C]117.3[/C][C]117.804656122571[/C][C]-0.50465612257086[/C][/ROW]
[ROW][C]60[/C][C]111.1[/C][C]112.254656122571[/C][C]-1.15465612257086[/C][/ROW]
[ROW][C]61[/C][C]102.2[/C][C]108.322215113204[/C][C]-6.12221511320397[/C][/ROW]
[ROW][C]62[/C][C]104.3[/C][C]110.393643684633[/C][C]-6.09364368463258[/C][/ROW]
[ROW][C]63[/C][C]122.9[/C][C]124.093643684633[/C][C]-1.19364368463258[/C][/ROW]
[ROW][C]64[/C][C]107.6[/C][C]117.922215113204[/C][C]-10.322215113204[/C][/ROW]
[ROW][C]65[/C][C]121.3[/C][C]117.293643684633[/C][C]4.00635631536741[/C][/ROW]
[ROW][C]66[/C][C]131.5[/C][C]129.949581136647[/C][C]1.55041886335329[/C][/ROW]
[ROW][C]67[/C][C]89[/C][C]94.8793579703469[/C][C]-5.87935797034688[/C][/ROW]
[ROW][C]68[/C][C]104.4[/C][C]106.122215113204[/C][C]-1.72221511320401[/C][/ROW]
[ROW][C]69[/C][C]128.9[/C][C]126.198524167818[/C][C]2.70147583218168[/C][/ROW]
[ROW][C]70[/C][C]135.9[/C][C]129.331857501152[/C][C]6.56814249884834[/C][/ROW]
[ROW][C]71[/C][C]133.3[/C][C]122.431857501152[/C][C]10.8681424988483[/C][/ROW]
[ROW][C]72[/C][C]121.3[/C][C]116.881857501152[/C][C]4.41814249884832[/C][/ROW]
[ROW][C]73[/C][C]120.5[/C][C]112.949416491785[/C][C]7.55058350821522[/C][/ROW]
[ROW][C]74[/C][C]120.4[/C][C]115.020845063213[/C][C]5.37915493678661[/C][/ROW]
[ROW][C]75[/C][C]137.9[/C][C]128.720845063213[/C][C]9.17915493678661[/C][/ROW]
[ROW][C]76[/C][C]126.1[/C][C]122.549416491785[/C][C]3.55058350821516[/C][/ROW]
[ROW][C]77[/C][C]133.2[/C][C]121.920845063213[/C][C]11.2791549367866[/C][/ROW]
[ROW][C]78[/C][C]146.6[/C][C]141.085220351129[/C][C]5.51477964887135[/C][/ROW]
[ROW][C]79[/C][C]103.4[/C][C]106.014997184829[/C][C]-2.6149971848288[/C][/ROW]
[ROW][C]80[/C][C]117.2[/C][C]117.257854327686[/C][C]-0.0578543276859312[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=14449&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=14449&T=4



Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
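# Optional transformation selected by par3: replace every column by its first
# differences (not used in this run, where par3 = 'Linear Trend').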
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
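# Optional seasonal dummies selected by par2: append 11 monthly 0/1 indicator
# columns M1..M11 (used in this run).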
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
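# Alternative seasonal dummies: 3 quarterly indicator columns Q1..Q3 (not used here).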
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
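# Optional deterministic trend selected by par3: append a column 't' = 1..n (used in this run).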
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
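# mysum holds the coefficient table, R-squared and F statistic reported above;
# the bitmap() blocks below write the residual diagnostic plots to test0.png-test8.png.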
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
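# The table.* helpers used below (presumably defined in the 'createtable'
# workspace loaded above) build the tables reproduced earlier on this page.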
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')