Free Statistics

of Irreproducible Research!

Author's title:
Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 11:02:20 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t1195494936ritknomo2lx48ex.htm/, Retrieved Fri, 03 May 2024 07:26:30 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5751, Retrieved Fri, 03 May 2024 07:26:30 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords: Q3 Totale productie
Estimated Impact: 160
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [The Seatbeltlaw] [2007-11-19 18:02:20] [3cbd35878d9bd3c68c81c01c5c6ec146] [Current]
Dataseries X:
106,7	0
110,2	0
125,9	0
100,1	0
106,4	0
114,8	0
81,3	0
87	0
104,2	0
108	0
105	0
94,5	0
92	0
95,9	0
108,8	0
103,4	0
102,1	0
110,1	0
83,2	0
82,7	0
106,8	0
113,7	0
102,5	0
96,6	0
92,1	0
95,6	0
102,3	0
98,6	0
98,2	0
104,5	0
84	0
73,8	0
103,9	0
106	0
97,2	0
102,6	0
89	0
93,8	0
116,7	0
106,8	0
98,5	0
118,7	0
90	0
91,9	0
113,3	1
113,1	1
104,1	1
108,7	1
96,7	1
101	1
116,9	1
105,8	1
99	1
129,4	1
83	1
88,9	1
115,9	1
104,2	1
113,4	1
112,2	1
100,8	1
107,3	1
126,6	1
102,9	1
117,9	1
128,8	1
87,5	1
93,8	1
122,7	1
126,2	1
124,6	1
116,7	1
115,2	1
111,1	1
129,9	1
113,3	1
118,5	1
133,5	1
102,1	1
102,4	1
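
For readers who want to rerun the analysis locally: the series above is tab-separated and uses decimal commas. A minimal sketch of reading it into R (the file name and column names are illustrative assumptions, not part of the original computation):

# Hypothetical local read of the pasted series (tab-separated, decimal commas).
X <- read.table('dataseries.txt', header=FALSE, sep='\t', dec=',',
                col.names=c('y', 'x'))   # y = production index, x = 0/1 dummy
str(X)   # 80 observations of 2 variables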




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 4 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 4 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5751&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]4 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5751&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5751&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = 97.6433333333333 + 5.88291666666667x[t] - 5.31647321428578M1[t] - 2.22675595238096M2[t] + 13.6915327380952M3[t] - 0.161607142857151M4[t] + 1.11382440476191M5[t] + 15.1749702380952M6[t] - 17.6067410714286M7[t] - 16.3741666666667M8[t] + 6.24751488095237M9[t] + 6.87056547619047M10[t] + 2.69361607142857M11[t] + 0.110282738095238t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  97.6433333333333 +  5.88291666666667x[t] -5.31647321428578M1[t] -2.22675595238096M2[t] +  13.6915327380952M3[t] -0.161607142857151M4[t] +  1.11382440476191M5[t] +  15.1749702380952M6[t] -17.6067410714286M7[t] -16.3741666666667M8[t] +  6.24751488095237M9[t] +  6.87056547619047M10[t] +  2.69361607142857M11[t] +  0.110282738095238t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5751&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] =  +  97.6433333333333 +  5.88291666666667x[t] -5.31647321428578M1[t] -2.22675595238096M2[t] +  13.6915327380952M3[t] -0.161607142857151M4[t] +  1.11382440476191M5[t] +  15.1749702380952M6[t] -17.6067410714286M7[t] -16.3741666666667M8[t] +  6.24751488095237M9[t] +  6.87056547619047M10[t] +  2.69361607142857M11[t] +  0.110282738095238t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5751&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5751&T=1
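
The equation above lists the fitted coefficients of the linear model estimated by the R code at the bottom of this page. A minimal sketch for inspecting them directly, assuming that fitted object `mylm` is available:

coef(mylm)      # intercept, x, the monthly dummies M1..M11 and the trend t, as printed above
confint(mylm)   # 95% confidence intervals for the same parameters (not shown in the table)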








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	97.6433333333333	3.106726	31.4297	0	0
x	5.88291666666667	2.959403	1.9879	0.050977	0.025488
M1	-5.31647321428578	3.647896	-1.4574	0.149746	0.074873
M2	-2.22675595238096	3.646073	-0.6107	0.543478	0.271739
M3	13.6915327380952	3.645367	3.7559	0.000368	0.000184
M4	-0.161607142857151	3.645781	-0.0443	0.964777	0.482389
M5	1.11382440476191	3.647312	0.3054	0.761036	0.380518
M6	15.1749702380952	3.649961	4.1576	9.5e-05	4.7e-05
M7	-17.6067410714286	3.653724	-4.8188	9e-06	4e-06
M8	-16.3741666666667	3.658598	-4.4755	3.1e-05	1.5e-05
M9	6.24751488095237	3.786236	1.6501	0.103682	0.051841
M10	6.87056547619047	3.783542	1.8159	0.073927	0.036963
M11	2.69361607142857	3.781925	0.7122	0.47883	0.239415
t	0.110282738095238	0.063862	1.7269	0.088863	0.044432

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 97.6433333333333 & 3.106726 & 31.4297 & 0 & 0 \tabularnewline
x & 5.88291666666667 & 2.959403 & 1.9879 & 0.050977 & 0.025488 \tabularnewline
M1 & -5.31647321428578 & 3.647896 & -1.4574 & 0.149746 & 0.074873 \tabularnewline
M2 & -2.22675595238096 & 3.646073 & -0.6107 & 0.543478 & 0.271739 \tabularnewline
M3 & 13.6915327380952 & 3.645367 & 3.7559 & 0.000368 & 0.000184 \tabularnewline
M4 & -0.161607142857151 & 3.645781 & -0.0443 & 0.964777 & 0.482389 \tabularnewline
M5 & 1.11382440476191 & 3.647312 & 0.3054 & 0.761036 & 0.380518 \tabularnewline
M6 & 15.1749702380952 & 3.649961 & 4.1576 & 9.5e-05 & 4.7e-05 \tabularnewline
M7 & -17.6067410714286 & 3.653724 & -4.8188 & 9e-06 & 4e-06 \tabularnewline
M8 & -16.3741666666667 & 3.658598 & -4.4755 & 3.1e-05 & 1.5e-05 \tabularnewline
M9 & 6.24751488095237 & 3.786236 & 1.6501 & 0.103682 & 0.051841 \tabularnewline
M10 & 6.87056547619047 & 3.783542 & 1.8159 & 0.073927 & 0.036963 \tabularnewline
M11 & 2.69361607142857 & 3.781925 & 0.7122 & 0.47883 & 0.239415 \tabularnewline
t & 0.110282738095238 & 0.063862 & 1.7269 & 0.088863 & 0.044432 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5751&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]97.6433333333333[/C][C]3.106726[/C][C]31.4297[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]5.88291666666667[/C][C]2.959403[/C][C]1.9879[/C][C]0.050977[/C][C]0.025488[/C][/ROW]
[ROW][C]M1[/C][C]-5.31647321428578[/C][C]3.647896[/C][C]-1.4574[/C][C]0.149746[/C][C]0.074873[/C][/ROW]
[ROW][C]M2[/C][C]-2.22675595238096[/C][C]3.646073[/C][C]-0.6107[/C][C]0.543478[/C][C]0.271739[/C][/ROW]
[ROW][C]M3[/C][C]13.6915327380952[/C][C]3.645367[/C][C]3.7559[/C][C]0.000368[/C][C]0.000184[/C][/ROW]
[ROW][C]M4[/C][C]-0.161607142857151[/C][C]3.645781[/C][C]-0.0443[/C][C]0.964777[/C][C]0.482389[/C][/ROW]
[ROW][C]M5[/C][C]1.11382440476191[/C][C]3.647312[/C][C]0.3054[/C][C]0.761036[/C][C]0.380518[/C][/ROW]
[ROW][C]M6[/C][C]15.1749702380952[/C][C]3.649961[/C][C]4.1576[/C][C]9.5e-05[/C][C]4.7e-05[/C][/ROW]
[ROW][C]M7[/C][C]-17.6067410714286[/C][C]3.653724[/C][C]-4.8188[/C][C]9e-06[/C][C]4e-06[/C][/ROW]
[ROW][C]M8[/C][C]-16.3741666666667[/C][C]3.658598[/C][C]-4.4755[/C][C]3.1e-05[/C][C]1.5e-05[/C][/ROW]
[ROW][C]M9[/C][C]6.24751488095237[/C][C]3.786236[/C][C]1.6501[/C][C]0.103682[/C][C]0.051841[/C][/ROW]
[ROW][C]M10[/C][C]6.87056547619047[/C][C]3.783542[/C][C]1.8159[/C][C]0.073927[/C][C]0.036963[/C][/ROW]
[ROW][C]M11[/C][C]2.69361607142857[/C][C]3.781925[/C][C]0.7122[/C][C]0.47883[/C][C]0.239415[/C][/ROW]
[ROW][C]t[/C][C]0.110282738095238[/C][C]0.063862[/C][C]1.7269[/C][C]0.088863[/C][C]0.044432[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5751&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5751&T=2
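
The columns of this table are taken from summary(mylm)$coefficients: the parameter estimate, its standard deviation (standard error), the t-statistic for H0: parameter = 0, and the two-tailed p-value; the one-tailed p-value is simply the two-tailed value halved, exactly as in the module code below. A short sketch, assuming `mysum <- summary(mylm)`:

est <- mysum$coefficients                        # Estimate, Std. Error, t value, Pr(>|t|)
cbind(round(est, 6), '1-tail p' = round(est[, 4] / 2, 6))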








Multiple Linear Regression - Regression Statistics
Multiple R: 0.882316047395406
R-squared: 0.778481607491453
Adjusted R-squared: 0.73484919684583
F-TEST (value): 17.8418197842467
F-TEST (DF numerator): 13
F-TEST (DF denominator): 66
p-value: 1.11022302462516e-16
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 6.54955267493152
Sum Squared Residuals: 2831.17825595237

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.882316047395406 \tabularnewline
R-squared & 0.778481607491453 \tabularnewline
Adjusted R-squared & 0.73484919684583 \tabularnewline
F-TEST (value) & 17.8418197842467 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 66 \tabularnewline
p-value & 1.11022302462516e-16 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 6.54955267493152 \tabularnewline
Sum Squared Residuals & 2831.17825595237 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5751&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.882316047395406[/C][/ROW]
[ROW][C]R-squared[/C][C]0.778481607491453[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.73484919684583[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]17.8418197842467[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]13[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]66[/C][/ROW]
[ROW][C]p-value[/C][C]1.11022302462516e-16[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]6.54955267493152[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]2831.17825595237[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5751&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5751&T=3
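
Each of these statistics is derived from the model summary. A sketch of how the entries are obtained, assuming `mylm` and `mysum <- summary(mylm)` as in the code below:

sqrt(mysum$r.squared)        # Multiple R
mysum$r.squared              # R-squared
mysum$adj.r.squared          # Adjusted R-squared
mysum$fstatistic             # F value with numerator and denominator degrees of freedom
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])   # F-test p-value
mysum$sigma                  # Residual Standard Deviation
sum(residuals(mylm)^2)       # Sum Squared Residuals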








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	106.7	92.4371428571432	14.2628571428568
2	110.2	95.6371428571429	14.5628571428571
3	125.9	111.665714285714	14.2342857142858
4	100.1	97.9228571428571	2.17714285714287
5	106.4	99.3085714285714	7.09142857142859
6	114.8	113.48	1.32000000000003
7	81.3	80.8085714285714	0.491428571428584
8	87	82.1514285714286	4.84857142857144
9	104.2	104.883392857143	-0.683392857142873
10	108	105.616726190476	2.38327380952379
11	105	101.550059523810	3.44994047619049
12	94.5	98.9667261904762	-4.46672619047619
13	92	93.7605357142857	-1.76053571428566
14	95.9	96.9605357142857	-1.06053571428570
15	108.8	112.989107142857	-4.18910714285714
16	103.4	99.24625	4.15375000000001
17	102.1	100.631964285714	1.46803571428571
18	110.1	114.803392857143	-4.70339285714286
19	83.2	82.1319642857143	1.06803571428572
20	82.7	83.4748214285714	-0.77482142857143
21	106.8	106.206785714286	0.59321428571429
22	113.7	106.940119047619	6.75988095238096
23	102.5	102.873452380952	-0.373452380952379
24	96.6	100.290119047619	-3.69011904761905
25	92.1	95.0839285714285	-2.98392857142852
26	95.6	98.2839285714286	-2.68392857142857
27	102.3	114.3125	-12.0125
28	98.6	100.569642857143	-1.96964285714286
29	98.2	101.955357142857	-3.75535714285714
30	104.5	116.126785714286	-11.6267857142857
31	84	83.4553571428571	0.544642857142854
32	73.8	84.7982142857143	-10.9982142857143
33	103.9	107.530178571429	-3.63017857142856
34	106	108.263511904762	-2.2635119047619
35	97.2	104.196845238095	-6.99684523809523
36	102.6	101.613511904762	0.986488095238083
37	89	96.4073214285714	-7.40732142857138
38	93.8	99.6073214285714	-5.80732142857143
39	116.7	115.635892857143	1.06410714285714
40	106.8	101.893035714286	4.90696428571428
41	98.5	103.27875	-4.77875
42	118.7	117.450178571429	1.24982142857142
43	90	84.77875	5.22125
44	91.9	86.1216071428571	5.77839285714286
45	113.3	114.736488095238	-1.43648809523809
46	113.1	115.469821428571	-2.36982142857143
47	104.1	111.403154761905	-7.30315476190477
48	108.7	108.819821428571	-0.119821428571430
49	96.7	103.613630952381	-6.9136309523809
50	101	106.813630952381	-5.81363095238095
51	116.9	122.842202380952	-5.94220238095238
52	105.8	109.099345238095	-3.29934523809524
53	99	110.485059523810	-11.4850595238095
54	129.4	124.656488095238	4.7435119047619
55	83	91.9850595238095	-8.98505952380953
56	88.9	93.3279166666667	-4.42791666666666
57	115.9	116.059880952381	-0.159880952380947
58	104.2	116.793214285714	-12.5932142857143
59	113.4	112.726547619048	0.673452380952381
60	112.2	110.143214285714	2.05678571428571
61	100.8	104.937023809524	-4.13702380952377
62	107.3	108.137023809524	-0.837023809523818
63	126.6	124.165595238095	2.43440476190474
64	102.9	110.422738095238	-7.5227380952381
65	117.9	111.808452380952	6.09154761904762
66	128.8	125.979880952381	2.82011904761905
67	87.5	93.3084523809524	-5.80845238095238
68	93.8	94.6513095238095	-0.851309523809535
69	122.7	117.383273809524	5.31672619047619
70	126.2	118.116607142857	8.08339285714285
71	124.6	114.049940476190	10.5500595238095
72	116.7	111.466607142857	5.23339285714285
73	115.2	106.260416666667	8.93958333333338
74	111.1	109.460416666667	1.63958333333332
75	129.9	125.488988095238	4.41101190476189
76	113.3	111.746130952381	1.55386904761904
77	118.5	113.131845238095	5.36815476190475
78	133.5	127.303273809524	6.19672619047617
79	102.1	94.6318452380953	7.46815476190475
80	102.4	95.9747023809524	6.42529761904761

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 106.7 & 92.4371428571432 & 14.2628571428568 \tabularnewline
2 & 110.2 & 95.6371428571429 & 14.5628571428571 \tabularnewline
3 & 125.9 & 111.665714285714 & 14.2342857142858 \tabularnewline
4 & 100.1 & 97.9228571428571 & 2.17714285714287 \tabularnewline
5 & 106.4 & 99.3085714285714 & 7.09142857142859 \tabularnewline
6 & 114.8 & 113.48 & 1.32000000000003 \tabularnewline
7 & 81.3 & 80.8085714285714 & 0.491428571428584 \tabularnewline
8 & 87 & 82.1514285714286 & 4.84857142857144 \tabularnewline
9 & 104.2 & 104.883392857143 & -0.683392857142873 \tabularnewline
10 & 108 & 105.616726190476 & 2.38327380952379 \tabularnewline
11 & 105 & 101.550059523810 & 3.44994047619049 \tabularnewline
12 & 94.5 & 98.9667261904762 & -4.46672619047619 \tabularnewline
13 & 92 & 93.7605357142857 & -1.76053571428566 \tabularnewline
14 & 95.9 & 96.9605357142857 & -1.06053571428570 \tabularnewline
15 & 108.8 & 112.989107142857 & -4.18910714285714 \tabularnewline
16 & 103.4 & 99.24625 & 4.15375000000001 \tabularnewline
17 & 102.1 & 100.631964285714 & 1.46803571428571 \tabularnewline
18 & 110.1 & 114.803392857143 & -4.70339285714286 \tabularnewline
19 & 83.2 & 82.1319642857143 & 1.06803571428572 \tabularnewline
20 & 82.7 & 83.4748214285714 & -0.77482142857143 \tabularnewline
21 & 106.8 & 106.206785714286 & 0.59321428571429 \tabularnewline
22 & 113.7 & 106.940119047619 & 6.75988095238096 \tabularnewline
23 & 102.5 & 102.873452380952 & -0.373452380952379 \tabularnewline
24 & 96.6 & 100.290119047619 & -3.69011904761905 \tabularnewline
25 & 92.1 & 95.0839285714285 & -2.98392857142852 \tabularnewline
26 & 95.6 & 98.2839285714286 & -2.68392857142857 \tabularnewline
27 & 102.3 & 114.3125 & -12.0125 \tabularnewline
28 & 98.6 & 100.569642857143 & -1.96964285714286 \tabularnewline
29 & 98.2 & 101.955357142857 & -3.75535714285714 \tabularnewline
30 & 104.5 & 116.126785714286 & -11.6267857142857 \tabularnewline
31 & 84 & 83.4553571428571 & 0.544642857142854 \tabularnewline
32 & 73.8 & 84.7982142857143 & -10.9982142857143 \tabularnewline
33 & 103.9 & 107.530178571429 & -3.63017857142856 \tabularnewline
34 & 106 & 108.263511904762 & -2.2635119047619 \tabularnewline
35 & 97.2 & 104.196845238095 & -6.99684523809523 \tabularnewline
36 & 102.6 & 101.613511904762 & 0.986488095238083 \tabularnewline
37 & 89 & 96.4073214285714 & -7.40732142857138 \tabularnewline
38 & 93.8 & 99.6073214285714 & -5.80732142857143 \tabularnewline
39 & 116.7 & 115.635892857143 & 1.06410714285714 \tabularnewline
40 & 106.8 & 101.893035714286 & 4.90696428571428 \tabularnewline
41 & 98.5 & 103.27875 & -4.77875 \tabularnewline
42 & 118.7 & 117.450178571429 & 1.24982142857142 \tabularnewline
43 & 90 & 84.77875 & 5.22125 \tabularnewline
44 & 91.9 & 86.1216071428571 & 5.77839285714286 \tabularnewline
45 & 113.3 & 114.736488095238 & -1.43648809523809 \tabularnewline
46 & 113.1 & 115.469821428571 & -2.36982142857143 \tabularnewline
47 & 104.1 & 111.403154761905 & -7.30315476190477 \tabularnewline
48 & 108.7 & 108.819821428571 & -0.119821428571430 \tabularnewline
49 & 96.7 & 103.613630952381 & -6.9136309523809 \tabularnewline
50 & 101 & 106.813630952381 & -5.81363095238095 \tabularnewline
51 & 116.9 & 122.842202380952 & -5.94220238095238 \tabularnewline
52 & 105.8 & 109.099345238095 & -3.29934523809524 \tabularnewline
53 & 99 & 110.485059523810 & -11.4850595238095 \tabularnewline
54 & 129.4 & 124.656488095238 & 4.7435119047619 \tabularnewline
55 & 83 & 91.9850595238095 & -8.98505952380953 \tabularnewline
56 & 88.9 & 93.3279166666667 & -4.42791666666666 \tabularnewline
57 & 115.9 & 116.059880952381 & -0.159880952380947 \tabularnewline
58 & 104.2 & 116.793214285714 & -12.5932142857143 \tabularnewline
59 & 113.4 & 112.726547619048 & 0.673452380952381 \tabularnewline
60 & 112.2 & 110.143214285714 & 2.05678571428571 \tabularnewline
61 & 100.8 & 104.937023809524 & -4.13702380952377 \tabularnewline
62 & 107.3 & 108.137023809524 & -0.837023809523818 \tabularnewline
63 & 126.6 & 124.165595238095 & 2.43440476190474 \tabularnewline
64 & 102.9 & 110.422738095238 & -7.5227380952381 \tabularnewline
65 & 117.9 & 111.808452380952 & 6.09154761904762 \tabularnewline
66 & 128.8 & 125.979880952381 & 2.82011904761905 \tabularnewline
67 & 87.5 & 93.3084523809524 & -5.80845238095238 \tabularnewline
68 & 93.8 & 94.6513095238095 & -0.851309523809535 \tabularnewline
69 & 122.7 & 117.383273809524 & 5.31672619047619 \tabularnewline
70 & 126.2 & 118.116607142857 & 8.08339285714285 \tabularnewline
71 & 124.6 & 114.049940476190 & 10.5500595238095 \tabularnewline
72 & 116.7 & 111.466607142857 & 5.23339285714285 \tabularnewline
73 & 115.2 & 106.260416666667 & 8.93958333333338 \tabularnewline
74 & 111.1 & 109.460416666667 & 1.63958333333332 \tabularnewline
75 & 129.9 & 125.488988095238 & 4.41101190476189 \tabularnewline
76 & 113.3 & 111.746130952381 & 1.55386904761904 \tabularnewline
77 & 118.5 & 113.131845238095 & 5.36815476190475 \tabularnewline
78 & 133.5 & 127.303273809524 & 6.19672619047617 \tabularnewline
79 & 102.1 & 94.6318452380953 & 7.46815476190475 \tabularnewline
80 & 102.4 & 95.9747023809524 & 6.42529761904761 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5751&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]106.7[/C][C]92.4371428571432[/C][C]14.2628571428568[/C][/ROW]
[ROW][C]2[/C][C]110.2[/C][C]95.6371428571429[/C][C]14.5628571428571[/C][/ROW]
[ROW][C]3[/C][C]125.9[/C][C]111.665714285714[/C][C]14.2342857142858[/C][/ROW]
[ROW][C]4[/C][C]100.1[/C][C]97.9228571428571[/C][C]2.17714285714287[/C][/ROW]
[ROW][C]5[/C][C]106.4[/C][C]99.3085714285714[/C][C]7.09142857142859[/C][/ROW]
[ROW][C]6[/C][C]114.8[/C][C]113.48[/C][C]1.32000000000003[/C][/ROW]
[ROW][C]7[/C][C]81.3[/C][C]80.8085714285714[/C][C]0.491428571428584[/C][/ROW]
[ROW][C]8[/C][C]87[/C][C]82.1514285714286[/C][C]4.84857142857144[/C][/ROW]
[ROW][C]9[/C][C]104.2[/C][C]104.883392857143[/C][C]-0.683392857142873[/C][/ROW]
[ROW][C]10[/C][C]108[/C][C]105.616726190476[/C][C]2.38327380952379[/C][/ROW]
[ROW][C]11[/C][C]105[/C][C]101.550059523810[/C][C]3.44994047619049[/C][/ROW]
[ROW][C]12[/C][C]94.5[/C][C]98.9667261904762[/C][C]-4.46672619047619[/C][/ROW]
[ROW][C]13[/C][C]92[/C][C]93.7605357142857[/C][C]-1.76053571428566[/C][/ROW]
[ROW][C]14[/C][C]95.9[/C][C]96.9605357142857[/C][C]-1.06053571428570[/C][/ROW]
[ROW][C]15[/C][C]108.8[/C][C]112.989107142857[/C][C]-4.18910714285714[/C][/ROW]
[ROW][C]16[/C][C]103.4[/C][C]99.24625[/C][C]4.15375000000001[/C][/ROW]
[ROW][C]17[/C][C]102.1[/C][C]100.631964285714[/C][C]1.46803571428571[/C][/ROW]
[ROW][C]18[/C][C]110.1[/C][C]114.803392857143[/C][C]-4.70339285714286[/C][/ROW]
[ROW][C]19[/C][C]83.2[/C][C]82.1319642857143[/C][C]1.06803571428572[/C][/ROW]
[ROW][C]20[/C][C]82.7[/C][C]83.4748214285714[/C][C]-0.77482142857143[/C][/ROW]
[ROW][C]21[/C][C]106.8[/C][C]106.206785714286[/C][C]0.59321428571429[/C][/ROW]
[ROW][C]22[/C][C]113.7[/C][C]106.940119047619[/C][C]6.75988095238096[/C][/ROW]
[ROW][C]23[/C][C]102.5[/C][C]102.873452380952[/C][C]-0.373452380952379[/C][/ROW]
[ROW][C]24[/C][C]96.6[/C][C]100.290119047619[/C][C]-3.69011904761905[/C][/ROW]
[ROW][C]25[/C][C]92.1[/C][C]95.0839285714285[/C][C]-2.98392857142852[/C][/ROW]
[ROW][C]26[/C][C]95.6[/C][C]98.2839285714286[/C][C]-2.68392857142857[/C][/ROW]
[ROW][C]27[/C][C]102.3[/C][C]114.3125[/C][C]-12.0125[/C][/ROW]
[ROW][C]28[/C][C]98.6[/C][C]100.569642857143[/C][C]-1.96964285714286[/C][/ROW]
[ROW][C]29[/C][C]98.2[/C][C]101.955357142857[/C][C]-3.75535714285714[/C][/ROW]
[ROW][C]30[/C][C]104.5[/C][C]116.126785714286[/C][C]-11.6267857142857[/C][/ROW]
[ROW][C]31[/C][C]84[/C][C]83.4553571428571[/C][C]0.544642857142854[/C][/ROW]
[ROW][C]32[/C][C]73.8[/C][C]84.7982142857143[/C][C]-10.9982142857143[/C][/ROW]
[ROW][C]33[/C][C]103.9[/C][C]107.530178571429[/C][C]-3.63017857142856[/C][/ROW]
[ROW][C]34[/C][C]106[/C][C]108.263511904762[/C][C]-2.2635119047619[/C][/ROW]
[ROW][C]35[/C][C]97.2[/C][C]104.196845238095[/C][C]-6.99684523809523[/C][/ROW]
[ROW][C]36[/C][C]102.6[/C][C]101.613511904762[/C][C]0.986488095238083[/C][/ROW]
[ROW][C]37[/C][C]89[/C][C]96.4073214285714[/C][C]-7.40732142857138[/C][/ROW]
[ROW][C]38[/C][C]93.8[/C][C]99.6073214285714[/C][C]-5.80732142857143[/C][/ROW]
[ROW][C]39[/C][C]116.7[/C][C]115.635892857143[/C][C]1.06410714285714[/C][/ROW]
[ROW][C]40[/C][C]106.8[/C][C]101.893035714286[/C][C]4.90696428571428[/C][/ROW]
[ROW][C]41[/C][C]98.5[/C][C]103.27875[/C][C]-4.77875[/C][/ROW]
[ROW][C]42[/C][C]118.7[/C][C]117.450178571429[/C][C]1.24982142857142[/C][/ROW]
[ROW][C]43[/C][C]90[/C][C]84.77875[/C][C]5.22125[/C][/ROW]
[ROW][C]44[/C][C]91.9[/C][C]86.1216071428571[/C][C]5.77839285714286[/C][/ROW]
[ROW][C]45[/C][C]113.3[/C][C]114.736488095238[/C][C]-1.43648809523809[/C][/ROW]
[ROW][C]46[/C][C]113.1[/C][C]115.469821428571[/C][C]-2.36982142857143[/C][/ROW]
[ROW][C]47[/C][C]104.1[/C][C]111.403154761905[/C][C]-7.30315476190477[/C][/ROW]
[ROW][C]48[/C][C]108.7[/C][C]108.819821428571[/C][C]-0.119821428571430[/C][/ROW]
[ROW][C]49[/C][C]96.7[/C][C]103.613630952381[/C][C]-6.9136309523809[/C][/ROW]
[ROW][C]50[/C][C]101[/C][C]106.813630952381[/C][C]-5.81363095238095[/C][/ROW]
[ROW][C]51[/C][C]116.9[/C][C]122.842202380952[/C][C]-5.94220238095238[/C][/ROW]
[ROW][C]52[/C][C]105.8[/C][C]109.099345238095[/C][C]-3.29934523809524[/C][/ROW]
[ROW][C]53[/C][C]99[/C][C]110.485059523810[/C][C]-11.4850595238095[/C][/ROW]
[ROW][C]54[/C][C]129.4[/C][C]124.656488095238[/C][C]4.7435119047619[/C][/ROW]
[ROW][C]55[/C][C]83[/C][C]91.9850595238095[/C][C]-8.98505952380953[/C][/ROW]
[ROW][C]56[/C][C]88.9[/C][C]93.3279166666667[/C][C]-4.42791666666666[/C][/ROW]
[ROW][C]57[/C][C]115.9[/C][C]116.059880952381[/C][C]-0.159880952380947[/C][/ROW]
[ROW][C]58[/C][C]104.2[/C][C]116.793214285714[/C][C]-12.5932142857143[/C][/ROW]
[ROW][C]59[/C][C]113.4[/C][C]112.726547619048[/C][C]0.673452380952381[/C][/ROW]
[ROW][C]60[/C][C]112.2[/C][C]110.143214285714[/C][C]2.05678571428571[/C][/ROW]
[ROW][C]61[/C][C]100.8[/C][C]104.937023809524[/C][C]-4.13702380952377[/C][/ROW]
[ROW][C]62[/C][C]107.3[/C][C]108.137023809524[/C][C]-0.837023809523818[/C][/ROW]
[ROW][C]63[/C][C]126.6[/C][C]124.165595238095[/C][C]2.43440476190474[/C][/ROW]
[ROW][C]64[/C][C]102.9[/C][C]110.422738095238[/C][C]-7.5227380952381[/C][/ROW]
[ROW][C]65[/C][C]117.9[/C][C]111.808452380952[/C][C]6.09154761904762[/C][/ROW]
[ROW][C]66[/C][C]128.8[/C][C]125.979880952381[/C][C]2.82011904761905[/C][/ROW]
[ROW][C]67[/C][C]87.5[/C][C]93.3084523809524[/C][C]-5.80845238095238[/C][/ROW]
[ROW][C]68[/C][C]93.8[/C][C]94.6513095238095[/C][C]-0.851309523809535[/C][/ROW]
[ROW][C]69[/C][C]122.7[/C][C]117.383273809524[/C][C]5.31672619047619[/C][/ROW]
[ROW][C]70[/C][C]126.2[/C][C]118.116607142857[/C][C]8.08339285714285[/C][/ROW]
[ROW][C]71[/C][C]124.6[/C][C]114.049940476190[/C][C]10.5500595238095[/C][/ROW]
[ROW][C]72[/C][C]116.7[/C][C]111.466607142857[/C][C]5.23339285714285[/C][/ROW]
[ROW][C]73[/C][C]115.2[/C][C]106.260416666667[/C][C]8.93958333333338[/C][/ROW]
[ROW][C]74[/C][C]111.1[/C][C]109.460416666667[/C][C]1.63958333333332[/C][/ROW]
[ROW][C]75[/C][C]129.9[/C][C]125.488988095238[/C][C]4.41101190476189[/C][/ROW]
[ROW][C]76[/C][C]113.3[/C][C]111.746130952381[/C][C]1.55386904761904[/C][/ROW]
[ROW][C]77[/C][C]118.5[/C][C]113.131845238095[/C][C]5.36815476190475[/C][/ROW]
[ROW][C]78[/C][C]133.5[/C][C]127.303273809524[/C][C]6.19672619047617[/C][/ROW]
[ROW][C]79[/C][C]102.1[/C][C]94.6318452380953[/C][C]7.46815476190475[/C][/ROW]
[ROW][C]80[/C][C]102.4[/C][C]95.9747023809524[/C][C]6.42529761904761[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5751&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5751&T=4
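
The three numeric columns are the observed values, the in-sample fitted values, and the residuals of the fitted model. A minimal sketch for reproducing them, assuming `mylm` from the code below:

act <- model.frame(mylm)[, 1]   # Actuals (the response series)
fit <- fitted(mylm)             # Interpolation (Forecast)
res <- residuals(mylm)          # Residuals (Prediction Error)
head(cbind(act, fit, res))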




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
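The R code below is executed on the FreeStatistics.org server with the data matrix `y`, the parameters `par1`..`par3`, and the table.*/hyperlink() helpers already defined. A hedged sketch of how the inputs could be prepared locally (the object names are assumptions; the table helpers are server-side and are not reproduced here):

# Assumed local setup; X is the two-column data frame read from the series above.
y <- t(as.matrix(X))            # one row per variable, one column per observation
rownames(y) <- c('y', 'x')      # assumed variable names, matching the output above
par1 <- '1'                     # column number of the dependent variable
par2 <- 'Include Monthly Dummies'
par3 <- 'Linear Trend'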
R code (references can be found in the software module):
library(lattice)
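# Put the column selected by par1 (the dependent variable) first in the data matrix.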
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
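# Optionally replace every series by its first differences (1-B) when requested.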
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
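# Optionally add eleven monthly dummies M1..M11; the twelfth period in each cycle is the reference.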
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
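# Or add three quarterly dummies Q1..Q3; the fourth quarter in each cycle is the reference.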
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
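# Optionally append a linear trend column t = 1..n.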
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
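# Diagnostic plots: actuals with interpolation, residuals, histogram, density, normal Q-Q,
# lag plot with lowess and regression line, ACF, PACF, and the standard lm() diagnostics.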
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
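# Build the output tables shown above using the server-side table.* helpers loaded from 'createtable'.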
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')