Free Statistics


Author: (unverified author)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 13 Dec 2007 13:22:17 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/13/t1197576468cptvivylpz9x2dw.htm/, Retrieved Sun, 05 May 2024 15:23:17 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=3710, Retrieved Sun, 05 May 2024 15:23:17 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords: wtc
Estimated Impact: 182
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [paper] [2007-12-13 20:22:17] [8ce1ad2ac57e06e10fb37a1292ae8cb6] [Current]
Dataseries X:
98.6	0
98	0
106.8	0
96.6	0
100.1	0
107.7	0
91.5	0
97.8	0
107.4	1
117.5	1
105.6	1
97.4	1
99.5	1
98	1
104.3	1
100.6	1
101.1	1
103.9	1
96.9	1
95.5	1
108.4	1
117	1
103.8	1
100.8	1
110.6	1
104	1
112.6	1
107.3	1
98.9	1
109.8	1
104.9	1
102.2	1
123.9	1
124.9	1
112.7	1
121.9	1
100.6	1
104.3	1
120.4	1
107.5	1
102.9	1
125.6	1
107.5	1
108.8	1
128.4	1
121.1	1
119.5	1
128.7	1
108.7	1
105.5	1
119.8	1
111.3	1
110.6	1
120.1	1
97.5	1
107.7	1
127.3	1
117.2	1
119.8	1
116.2	1
111	1
112.4	1
130.6	1
109.1	1
118.8	1
123.9	1
101.6	1
112.8	1
128	1
129.6	1
125.8	1
119.5	1
115.7	1
113.6	1
129.7	1
112	1
116.8	1
126.3	1
112.9	1
115.9	1
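The tables below summarize a regression of the dependent variable Totaal (first column) on the dummy variable `WTC11-09` (second column), eleven monthly dummies, and a linear trend. The module's own R code is reproduced at the bottom of this page; the lines below are only a minimal stand-alone sketch of the same model, under the assumption that the two columns above have been read into a data frame df with the second column named WTC11.09 (a hyphen is awkward in an R name).

# Minimal stand-alone sketch (not the module's code): fit the same model with lm().
# Assumes df holds the two columns shown above, named Totaal and WTC11.09.
n <- nrow(df)
df$M <- relevel(factor(((seq_len(n) - 1) %% 12) + 1), ref = "12")  # month 1..12; month 12 is the reference
df$t <- seq_len(n)                                                 # linear trend
fit <- lm(Totaal ~ WTC11.09 + M + t, data = df)
summary(fit)   # should reproduce the coefficient table further down (up to variable names)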




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135
Source: https://freestatistics.org/blog/index.php?pk=3710&T=0









Multiple Linear Regression - Estimated Regression Equation
Totaal[t] = 103.887500000000 - 2.05083333333325 `WTC11-09`[t] - 6.5326587301588 M1[t] - 8.09567460317459 M2[t] + 4.24130952380952 M3[t] - 7.45027777777778 M4[t] - 7.05615079365079 M5[t] + 2.38083333333334 M6[t] - 12.8393253968254 M7[t] - 9.1451984126984 M8[t] + 7.35809523809524 M9[t] + 7.71650793650794 M10[t] + 0.741587301587295 M11[t] + 0.291587301587302 t + e[t]
Source: https://freestatistics.org/blog/index.php?pk=3710&T=1
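As a quick sanity check, the equation can be evaluated by hand. For the first observation (t = 1, `WTC11-09` = 0, M1 = 1, all other dummies 0) it gives the first Interpolation value reported in the actuals table further down:

# Fitted value for observation 1: intercept + M1 coefficient + trend coefficient * 1
103.887500000000 - 6.5326587301588 + 0.291587301587302 * 1
# about 97.64643  (cf. Interpolation = 97.646428571429 at Time or Index 1)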









Multiple Linear Regression - Ordinary Least Squares
Variable  Parameter  S.D.  T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  103.887500000000  2.598345  39.9822  0  0
`WTC11-09`  -2.05083333333325  2.100217  -0.9765  0.332389  0.166195
M1  -6.5326587301588  2.598482  -2.514  0.014383  0.007191
M2  -8.09567460317459  2.598894  -3.115  0.002721  0.001361
M3  4.24130952380952  2.599582  1.6315  0.10754  0.05377
M4  -7.45027777777778  2.600543  -2.8649  0.005589  0.002795
M5  -7.05615079365079  2.601779  -2.712  0.008517  0.004259
M6  2.38083333333334  2.603289  0.9145  0.363758  0.181879
M7  -12.8393253968254  2.605072  -4.9286  6e-06  3e-06
M8  -9.1451984126984  2.607128  -3.5078  0.000819  0.000409
M9  7.35809523809524  2.684756  2.7407  0.00788  0.00394
M10  7.71650793650794  2.684091  2.8749  0.005434  0.002717
M11  0.741587301587295  2.683692  0.2763  0.783157  0.391579
t  0.291587301587302  0.026729  10.9091  0  0
Source: https://freestatistics.org/blog/index.php?pk=3710&T=2
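Each T-STAT is the Parameter divided by its S.D., and the p-values are tail probabilities of a t distribution with 66 residual degrees of freedom (80 observations minus 14 estimated coefficients); the 1-tail value is simply half the 2-tail value. For example, the `WTC11-09` row can be reproduced as follows:

# Reconstruct the `WTC11-09` row from its Parameter and S.D. (residual df = 80 - 14 = 66)
tstat <- -2.05083333333325 / 2.100217
tstat                           # about -0.9765    (T-STAT)
2 * pt(-abs(tstat), df = 66)    # about  0.332389  (2-tail p-value)
pt(-abs(tstat), df = 66)        # about  0.166195  (1-tail p-value)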









Multiple Linear Regression - Regression Statistics
Multiple R: 0.906469959176328
R-squared: 0.821687786889133
Adjusted R-squared: 0.78656568430669
F-TEST (value): 23.3951764408283
F-TEST (DF numerator): 13
F-TEST (DF denominator): 66
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 4.64805996085586
Sum Squared Residuals: 1425.89445238095
Source: https://freestatistics.org/blog/index.php?pk=3710&T=3
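The entries in these two tables are linked by standard identities: Multiple R is the square root of R-squared, the p-value is the upper tail of an F(13, 66) distribution evaluated at the F-TEST value, and the squared Residual Standard Deviation times the 66 residual degrees of freedom equals the Sum Squared Residuals. A quick check in R:

# Consistency checks for the regression and residual statistics
sqrt(0.821687786889133)                        # about 0.90647   (Multiple R)
1 - pf(23.3951764408283, df1 = 13, df2 = 66)   # 0 to machine precision (p-value of the F-test)
4.64805996085586^2 * 66                        # about 1425.894  (Sum Squared Residuals)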









Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1  98.6  97.646428571429  0.953571428571022
2  98  96.375  1.62500000000004
3  106.8  109.003571428571  -2.20357142857138
4  96.6  97.6035714285714  -1.00357142857139
5  100.1  98.2892857142857  1.81071428571434
6  107.7  108.017857142857  -0.317857142857068
7  91.5  93.0892857142857  -1.58928571428565
8  97.8  97.075  0.725000000000066
9  107.4  111.819047619048  -4.41904761904763
10  117.5  112.469047619048  5.03095238095239
11  105.6  105.785714285714  -0.185714285714298
12  97.4  105.335714285714  -7.93571428571427
13  99.5  99.0946428571428  0.405357142857214
14  98  97.8232142857143  0.176785714285718
15  104.3  110.451785714286  -6.15178571428572
16  100.6  99.0517857142857  1.54821428571428
17  101.1  99.7375  1.36249999999998
18  103.9  109.466071428571  -5.56607142857143
19  96.9  94.5375  2.36249999999999
20  95.5  98.5232142857143  -3.0232142857143
21  108.4  115.318095238095  -6.91809523809523
22  117  115.968095238095  1.03190476190476
23  103.8  109.284761904762  -5.4847619047619
24  100.8  108.834761904762  -8.03476190476191
25  110.6  102.593690476190  8.00630952380959
26  104  101.322261904762  2.67773809523809
27  112.6  113.950833333333  -1.35083333333334
28  107.3  102.550833333333  4.74916666666666
29  98.9  103.236547619048  -4.33654761904762
30  109.8  112.965119047619  -3.16511904761906
31  104.9  98.0365476190476  6.86345238095237
32  102.2  102.022261904762  0.177738095238085
33  123.9  118.817142857143  5.08285714285715
34  124.9  119.467142857143  5.43285714285715
35  112.7  112.783809523810  -0.083809523809515
36  121.9  112.333809523810  9.56619047619048
37  100.6  106.092738095238  -5.49273809523803
38  104.3  104.821309523810  -0.521309523809533
39  120.4  117.449880952381  2.95011904761905
40  107.5  106.049880952381  1.45011904761904
41  102.9  106.735595238095  -3.83559523809524
42  125.6  116.464166666667  9.13583333333332
43  107.5  101.535595238095  5.96440476190475
44  108.8  105.521309523810  3.27869047619046
45  128.4  122.316190476190  6.08380952380953
46  121.1  122.966190476190  -1.86619047619049
47  119.5  116.282857142857  3.21714285714286
48  128.7  115.832857142857  12.8671428571428
49  108.7  109.591785714286  -0.891785714285641
50  105.5  108.320357142857  -2.82035714285715
51  119.8  120.948928571429  -1.14892857142858
52  111.3  109.548928571429  1.75107142857142
53  110.6  110.234642857143  0.365357142857127
54  120.1  119.963214285714  0.136785714285695
55  97.5  105.034642857143  -7.53464285714287
56  107.7  109.020357142857  -1.32035714285715
57  127.3  125.815238095238  1.48476190476190
58  117.2  126.465238095238  -9.2652380952381
59  119.8  119.781904761905  0.0180952380952398
60  116.2  119.331904761905  -3.13190476190476
61  111  113.090833333333  -2.09083333333326
62  112.4  111.819404761905  0.580595238095237
63  130.6  124.447976190476  6.1520238095238
64  109.1  113.047976190476  -3.9479761904762
65  118.8  113.733690476190  5.06630952380951
66  123.9  123.462261904762  0.437738095238089
67  101.6  108.533690476190  -6.93369047619049
68  112.8  112.519404761905  0.280595238095223
69  128  129.314285714286  -1.31428571428571
70  129.6  129.964285714286  -0.364285714285725
71  125.8  123.280952380952  2.51904761904762
72  119.5  122.830952380952  -3.33095238095238
73  115.7  116.589880952381  -0.88988095238088
74  113.6  115.318452380952  -1.71845238095239
75  129.7  127.947023809524  1.75297619047618
76  112  116.547023809524  -4.54702380952381
77  116.8  117.232738095238  -0.432738095238105
78  126.3  126.961309523810  -0.661309523809538
79  112.9  112.032738095238  0.8672619047619
80  115.9  116.018452380952  -0.118452380952390
Source: https://freestatistics.org/blog/index.php?pk=3710&T=4
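Interpolation is the model's fitted value and the Residual is the corresponding prediction error, so Actuals = Interpolation + Residuals row by row. Given a fitted lm object such as mylm from the code at the bottom of this page (or fit from the sketch near the top), the table can be regenerated with a few lines; a minimal sketch:

# Rebuild the Actuals / Interpolation / Residuals table from a fitted lm object mylm
tab <- data.frame(Actuals = fitted(mylm) + resid(mylm), Interpolation = fitted(mylm), Residuals = resid(mylm))
head(tab)   # first rows of the table above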





Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
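# Optionally transform all columns to first differences (1-B)X[t] = X[t] - X[t-1]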
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
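# Add monthly dummy variables M1..M11; observation 1 is month 1 and month 12 is the reference category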
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
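# Or add quarterly dummy variables Q1..Q3; the fourth quarter is the reference category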
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
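# Append a linear trend variable t = 1, 2, ..., n when requested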
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
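# Fit the OLS regression: lm(df) regresses the first column (here Totaal) on all remaining columns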
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
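# Write the diagnostic plots to PNG files: actuals vs. interpolation, residuals, histogram,
# density, normal Q-Q, lag plot, ACF, PACF, and the standard lm diagnostics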
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
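# Assemble the published output tables (estimated equation, OLS coefficients,
# regression/residual statistics, actuals-interpolation-residuals) with the
# table.* helper functions loaded from 'createtable'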
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')