Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 21 Nov 2007 05:36:11 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/21/t1195648242tlugvr8gym70xu1.htm/, Retrieved Tue, 07 May 2024 13:10:32 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5846, Retrieved Tue, 07 May 2024 13:10:32 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords: textile with monthly dummies and linear trend
Estimated Impact: 257
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Seatbelt law] [2007-11-21 12:36:11] [e38ae300fa323c405e42b78372d772d6] [Current]
- R  D    [Multiple Regression] [Multiple regressi...] [2008-12-18 16:01:35] [072df11bdb18ed8d65d8164df87f26f2]
-  M        [Multiple Regression] [multiple regressi...] [2009-12-16 11:03:47] [072df11bdb18ed8d65d8164df87f26f2]
Dataseries X:
101.5	0
99.2	0
107.8	0
92.3	0
99.2	0
101.6	0
87	0
71.4	0
104.7	0
115.1	0
102.5	0
75.3	0
96.7	1
94.6	1
98.6	1
99.5	1
92	1
93.6	1
89.3	1
66.9	1
108.8	1
113.2	1
105.5	1
77.8	1
102.1	1
97	1
95.5	1
99.3	1
86.4	1
92.4	1
85.7	1
61.9	1
104.9	1
107.9	1
95.6	1
79.8	1
94.8	1
93.7	1
108.1	1
96.9	1
88.8	1
106.7	1
86.8	1
69.8	1
110.9	1
105.4	1
99.2	1
84.4	1
87.2	1
91.9	1
97.9	1
94.5	1
85	1
100.3	1
78.7	1
65.8	1
104.8	1
96	1
103.3	1
82.9	1
91.4	1
94.5	1
109.3	1
92.1	1
99.3	1
109.6	1
87.5	1
73.1	1
110.7	1
111.6	1
110.7	1
84	1
101.6	1
102.1	1
113.9	1
99	1
100.4	1
109.5	1
93	1
76.8	1
105.3	1
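For orientation, here is a minimal sketch (the file name and column labels are assumptions, not part of the original submission) of how this two-column series could be read into R as the object y that the module code at the bottom of this page transposes via x <- t(y):

# hypothetical file 'dataseries_x.txt' holding the tab-separated rows above
dat <- read.table('dataseries_x.txt', header = FALSE, col.names = c('y', 'x'))
y <- t(as.matrix(dat))   # the module transposes y, so series go in rows, observations in columns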




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5846&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]6 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5846&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5846&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = + 81.0978124999999 -5.16078124999995x[t] + 16.3589279513888M1[t] + 15.9374317956349M2[t] + 24.1445070684524M3[t] + 15.8372966269841M4[t] + 12.5300861855159M5[t] + 21.3800186011905M6[t] + 6.18709387400795M7[t] -11.3772594246032M8[t] + 26.3012444196429M9[t] + 27.6858494543651M10[t] + 22.1929247271826M11[t] + 0.0929247271825398t + e[t]
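The reported equation can, in principle, be reproduced directly with lm() by regressing the series on the seatbelt indicator, eleven monthly dummies (the twelfth period is the reference) and a linear trend. The sketch below is only an illustration under assumed object names (dat, series, xdum); the code actually run by the module is listed at the end of this page.

series <- dat$y                                   # observed values (hypothetical 'dat' from the sketch above)
xdum   <- dat$x                                   # 0/1 indicator, labelled x in the tables
n      <- length(series)
month  <- relevel(factor(rep(1:12, length.out = n)), ref = '12')  # 12th period as reference, like M1..M11
trend  <- seq_len(n)
fit    <- lm(series ~ xdum + month + trend)
coef(fit)                                         # should match the parameters in the equation above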

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  81.0978124999999 -5.16078124999995x[t] +  16.3589279513888M1[t] +  15.9374317956349M2[t] +  24.1445070684524M3[t] +  15.8372966269841M4[t] +  12.5300861855159M5[t] +  21.3800186011905M6[t] +  6.18709387400795M7[t] -11.3772594246032M8[t] +  26.3012444196429M9[t] +  27.6858494543651M10[t] +  22.1929247271826M11[t] +  0.0929247271825398t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5846&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] =  +  81.0978124999999 -5.16078124999995x[t] +  16.3589279513888M1[t] +  15.9374317956349M2[t] +  24.1445070684524M3[t] +  15.8372966269841M4[t] +  12.5300861855159M5[t] +  21.3800186011905M6[t] +  6.18709387400795M7[t] -11.3772594246032M8[t] +  26.3012444196429M9[t] +  27.6858494543651M10[t] +  22.1929247271826M11[t] +  0.0929247271825398t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5846&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5846&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	81.0978124999999	2.446634	33.1467	0	0
x	-5.16078124999995	2.000773	-2.5794	0.012098	0.006049
M1	16.3589279513888	2.801308	5.8397	0	0
M2	15.9374317956349	2.799488	5.693	0	0
M3	24.1445070684524	2.798001	8.6292	0	0
M4	15.8372966269841	2.796846	5.6626	0	0
M5	12.5300861855159	2.796024	4.4814	3e-05	1.5e-05
M6	21.3800186011905	2.795535	7.6479	0	0
M7	6.18709387400795	2.79538	2.2133	0.030285	0.015142
M8	-11.3772594246032	2.795558	-4.0698	0.000127	6.3e-05
M9	26.3012444196429	2.79607	9.4065	0	0
M10	27.6858494543651	2.901285	9.5426	0	0
M11	22.1929247271826	2.900803	7.6506	0	0
t	0.0929247271825398	0.030537	3.043	0.003344	0.001672
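Two relationships in this table are worth spelling out: T-STAT is the parameter estimate divided by its S.D. (standard error), and the 1-tail p-value is half of the 2-tail p-value, which is exactly how the module code below computes it. A quick check for the x row, using only numbers from the table:

b   <- -5.16078124999995      # parameter for x
se  <-  2.000773              # its standard error
tst <- b / se                 # about -2.5794
df  <- 81 - 14                # observations minus estimated coefficients = 67
p2  <- 2 * pt(-abs(tst), df)  # 2-tail p-value, about 0.012098
p2 / 2                        # 1-tail p-value, about 0.006049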

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 81.0978124999999 & 2.446634 & 33.1467 & 0 & 0 \tabularnewline
x & -5.16078124999995 & 2.000773 & -2.5794 & 0.012098 & 0.006049 \tabularnewline
M1 & 16.3589279513888 & 2.801308 & 5.8397 & 0 & 0 \tabularnewline
M2 & 15.9374317956349 & 2.799488 & 5.693 & 0 & 0 \tabularnewline
M3 & 24.1445070684524 & 2.798001 & 8.6292 & 0 & 0 \tabularnewline
M4 & 15.8372966269841 & 2.796846 & 5.6626 & 0 & 0 \tabularnewline
M5 & 12.5300861855159 & 2.796024 & 4.4814 & 3e-05 & 1.5e-05 \tabularnewline
M6 & 21.3800186011905 & 2.795535 & 7.6479 & 0 & 0 \tabularnewline
M7 & 6.18709387400795 & 2.79538 & 2.2133 & 0.030285 & 0.015142 \tabularnewline
M8 & -11.3772594246032 & 2.795558 & -4.0698 & 0.000127 & 6.3e-05 \tabularnewline
M9 & 26.3012444196429 & 2.79607 & 9.4065 & 0 & 0 \tabularnewline
M10 & 27.6858494543651 & 2.901285 & 9.5426 & 0 & 0 \tabularnewline
M11 & 22.1929247271826 & 2.900803 & 7.6506 & 0 & 0 \tabularnewline
t & 0.0929247271825398 & 0.030537 & 3.043 & 0.003344 & 0.001672 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5846&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]81.0978124999999[/C][C]2.446634[/C][C]33.1467[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]-5.16078124999995[/C][C]2.000773[/C][C]-2.5794[/C][C]0.012098[/C][C]0.006049[/C][/ROW]
[ROW][C]M1[/C][C]16.3589279513888[/C][C]2.801308[/C][C]5.8397[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M2[/C][C]15.9374317956349[/C][C]2.799488[/C][C]5.693[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M3[/C][C]24.1445070684524[/C][C]2.798001[/C][C]8.6292[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M4[/C][C]15.8372966269841[/C][C]2.796846[/C][C]5.6626[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M5[/C][C]12.5300861855159[/C][C]2.796024[/C][C]4.4814[/C][C]3e-05[/C][C]1.5e-05[/C][/ROW]
[ROW][C]M6[/C][C]21.3800186011905[/C][C]2.795535[/C][C]7.6479[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M7[/C][C]6.18709387400795[/C][C]2.79538[/C][C]2.2133[/C][C]0.030285[/C][C]0.015142[/C][/ROW]
[ROW][C]M8[/C][C]-11.3772594246032[/C][C]2.795558[/C][C]-4.0698[/C][C]0.000127[/C][C]6.3e-05[/C][/ROW]
[ROW][C]M9[/C][C]26.3012444196429[/C][C]2.79607[/C][C]9.4065[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M10[/C][C]27.6858494543651[/C][C]2.901285[/C][C]9.5426[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M11[/C][C]22.1929247271826[/C][C]2.900803[/C][C]7.6506[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]t[/C][C]0.0929247271825398[/C][C]0.030537[/C][C]3.043[/C][C]0.003344[/C][C]0.001672[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5846&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5846&T=2








Multiple Linear Regression - Regression Statistics
Multiple R	0.924878815061786
R-squared	0.855400822550093
Adjusted R-squared	0.827344265731454
F-TEST (value)	30.4884461796049
F-TEST (DF numerator)	13
F-TEST (DF denominator)	67
p-value	0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	5.02406007724201
Sum Squared Residuals	1691.15903720238
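These summary statistics are internally consistent and can be re-derived from one another; a short check using only the figures reported above (n = 81 observations, 14 estimated coefficients):

R2 <- 0.855400822550093
n  <- 81; k <- 14
sqrt(R2)                                  # Multiple R, about 0.9248788
1 - (1 - R2) * (n - 1) / (n - k)          # Adjusted R-squared, about 0.8273443
1 - pf(30.4884461796049, 13, 67)          # F-test p-value, 0 at the displayed precision
sqrt(1691.15903720238 / (n - k))          # Residual Standard Deviation, about 5.024060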

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.924878815061786 \tabularnewline
R-squared & 0.855400822550093 \tabularnewline
Adjusted R-squared & 0.827344265731454 \tabularnewline
F-TEST (value) & 30.4884461796049 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 67 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 5.02406007724201 \tabularnewline
Sum Squared Residuals & 1691.15903720238 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5846&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.924878815061786[/C][/ROW]
[ROW][C]R-squared[/C][C]0.855400822550093[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.827344265731454[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]30.4884461796049[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]13[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]67[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]5.02406007724201[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]1691.15903720238[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5846&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5846&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	101.5	97.5496651785718	3.95033482142816
2	99.2	97.22109375	1.97890625
3	107.8	105.52109375	2.27890625000002
4	92.3	97.3068080357142	-5.00680803571423
5	99.2	94.0925223214286	5.10747767857144
6	101.6	103.035379464286	-1.43537946428569
7	87	87.9353794642857	-0.935379464285681
8	71.4	70.4639508928571	0.936049107142891
9	104.7	108.235379464286	-3.5353794642857
10	115.1	109.712909226190	5.38709077380959
11	102.5	104.312909226190	-1.81290922619042
12	75.3	82.2129092261904	-6.91290922619044
13	96.7	93.5039806547618	3.19601934523818
14	94.6	93.1754092261905	1.42459077380952
15	98.6	101.475409226190	-2.87540922619048
16	99.5	93.2611235119048	6.23887648809523
17	92	90.046837797619	1.95316220238095
18	93.6	98.9896949404762	-5.38969494047620
19	89.3	83.8896949404762	5.41030505952381
20	66.9	66.4182663690476	0.481733630952386
21	108.8	104.189694940476	4.61030505952381
22	113.2	105.667224702381	7.53277529761904
23	105.5	100.267224702381	5.23277529761904
24	77.8	78.167224702381	-0.367224702380958
25	102.1	94.6190773809523	7.48092261904768
26	97	94.290505952381	2.70949404761905
27	95.5	102.590505952381	-7.09050595238095
28	99.3	94.3762202380952	4.92377976190475
29	86.4	91.1619345238095	-4.76193452380952
30	92.4	100.104791666667	-7.70479166666666
31	85.7	85.0047916666667	0.695208333333334
32	61.9	67.5333630952381	-5.6333630952381
33	104.9	105.304791666667	-0.404791666666663
34	107.9	106.782321428571	1.11767857142857
35	95.6	101.382321428571	-5.78232142857144
36	79.8	79.2823214285714	0.517678571428565
37	94.8	95.7341741071428	-0.934174107142794
38	93.7	95.4056026785714	-1.70560267857143
39	108.1	103.705602678571	4.39439732142857
40	96.9	95.4913169642857	1.40868303571428
41	88.8	92.27703125	-3.47703125000001
42	106.7	101.219888392857	5.48011160714286
43	86.8	86.1198883928571	0.680111607142851
44	69.8	68.6484598214286	1.15154017857142
45	110.9	106.419888392857	4.48011160714286
46	105.4	107.897418154762	-2.49741815476191
47	99.2	102.497418154762	-3.29741815476191
48	84.4	80.3974181547619	4.00258184523809
49	87.2	96.8492708333333	-9.64927083333327
50	91.9	96.520699404762	-4.6206994047619
51	97.9	104.820699404762	-6.9206994047619
52	94.5	96.6064136904762	-2.1064136904762
53	85	93.3921279761905	-8.39212797619048
54	100.3	102.334985119048	-2.03498511904762
55	78.7	87.2349851190476	-8.53498511904762
56	65.8	69.763556547619	-3.96355654761906
57	104.8	107.534985119048	-2.73498511904763
58	96	109.012514880952	-13.0125148809524
59	103.3	103.612514880952	-0.312514880952395
60	82.9	81.5125148809524	1.38748511904762
61	91.4	97.9643675595238	-6.56436755952374
62	94.5	97.6357961309524	-3.13579613095239
63	109.3	105.935796130952	3.36420386904761
64	92.1	97.7215104166667	-5.62151041666668
65	99.3	94.507224702381	4.79277529761904
66	109.6	103.450081845238	6.1499181547619
67	87.5	88.3500818452381	-0.850081845238102
68	73.1	70.8786532738095	2.22134672619046
69	110.7	108.650081845238	2.0499181547619
70	111.6	110.127611607143	1.47238839285712
71	110.7	104.727611607143	5.97238839285714
72	84	82.627611607143	1.37238839285713
73	101.6	99.0794642857142	2.52053571428577
74	102.1	98.7508928571429	3.34910714285713
75	113.9	107.050892857143	6.84910714285714
76	99	98.8366071428571	0.163392857142846
77	100.4	95.6223214285714	4.77767857142857
78	109.5	104.565178571429	4.93482142857142
79	93	89.4651785714286	3.53482142857142
80	76.8	71.99375	4.80624999999999
81	105.3	109.765178571429	-4.46517857142858

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 101.5 & 97.5496651785718 & 3.95033482142816 \tabularnewline
2 & 99.2 & 97.22109375 & 1.97890625 \tabularnewline
3 & 107.8 & 105.52109375 & 2.27890625000002 \tabularnewline
4 & 92.3 & 97.3068080357142 & -5.00680803571423 \tabularnewline
5 & 99.2 & 94.0925223214286 & 5.10747767857144 \tabularnewline
6 & 101.6 & 103.035379464286 & -1.43537946428569 \tabularnewline
7 & 87 & 87.9353794642857 & -0.935379464285681 \tabularnewline
8 & 71.4 & 70.4639508928571 & 0.936049107142891 \tabularnewline
9 & 104.7 & 108.235379464286 & -3.5353794642857 \tabularnewline
10 & 115.1 & 109.712909226190 & 5.38709077380959 \tabularnewline
11 & 102.5 & 104.312909226190 & -1.81290922619042 \tabularnewline
12 & 75.3 & 82.2129092261904 & -6.91290922619044 \tabularnewline
13 & 96.7 & 93.5039806547618 & 3.19601934523818 \tabularnewline
14 & 94.6 & 93.1754092261905 & 1.42459077380952 \tabularnewline
15 & 98.6 & 101.475409226190 & -2.87540922619048 \tabularnewline
16 & 99.5 & 93.2611235119048 & 6.23887648809523 \tabularnewline
17 & 92 & 90.046837797619 & 1.95316220238095 \tabularnewline
18 & 93.6 & 98.9896949404762 & -5.38969494047620 \tabularnewline
19 & 89.3 & 83.8896949404762 & 5.41030505952381 \tabularnewline
20 & 66.9 & 66.4182663690476 & 0.481733630952386 \tabularnewline
21 & 108.8 & 104.189694940476 & 4.61030505952381 \tabularnewline
22 & 113.2 & 105.667224702381 & 7.53277529761904 \tabularnewline
23 & 105.5 & 100.267224702381 & 5.23277529761904 \tabularnewline
24 & 77.8 & 78.167224702381 & -0.367224702380958 \tabularnewline
25 & 102.1 & 94.6190773809523 & 7.48092261904768 \tabularnewline
26 & 97 & 94.290505952381 & 2.70949404761905 \tabularnewline
27 & 95.5 & 102.590505952381 & -7.09050595238095 \tabularnewline
28 & 99.3 & 94.3762202380952 & 4.92377976190475 \tabularnewline
29 & 86.4 & 91.1619345238095 & -4.76193452380952 \tabularnewline
30 & 92.4 & 100.104791666667 & -7.70479166666666 \tabularnewline
31 & 85.7 & 85.0047916666667 & 0.695208333333334 \tabularnewline
32 & 61.9 & 67.5333630952381 & -5.6333630952381 \tabularnewline
33 & 104.9 & 105.304791666667 & -0.404791666666663 \tabularnewline
34 & 107.9 & 106.782321428571 & 1.11767857142857 \tabularnewline
35 & 95.6 & 101.382321428571 & -5.78232142857144 \tabularnewline
36 & 79.8 & 79.2823214285714 & 0.517678571428565 \tabularnewline
37 & 94.8 & 95.7341741071428 & -0.934174107142794 \tabularnewline
38 & 93.7 & 95.4056026785714 & -1.70560267857143 \tabularnewline
39 & 108.1 & 103.705602678571 & 4.39439732142857 \tabularnewline
40 & 96.9 & 95.4913169642857 & 1.40868303571428 \tabularnewline
41 & 88.8 & 92.27703125 & -3.47703125000001 \tabularnewline
42 & 106.7 & 101.219888392857 & 5.48011160714286 \tabularnewline
43 & 86.8 & 86.1198883928571 & 0.680111607142851 \tabularnewline
44 & 69.8 & 68.6484598214286 & 1.15154017857142 \tabularnewline
45 & 110.9 & 106.419888392857 & 4.48011160714286 \tabularnewline
46 & 105.4 & 107.897418154762 & -2.49741815476191 \tabularnewline
47 & 99.2 & 102.497418154762 & -3.29741815476191 \tabularnewline
48 & 84.4 & 80.3974181547619 & 4.00258184523809 \tabularnewline
49 & 87.2 & 96.8492708333333 & -9.64927083333327 \tabularnewline
50 & 91.9 & 96.520699404762 & -4.6206994047619 \tabularnewline
51 & 97.9 & 104.820699404762 & -6.9206994047619 \tabularnewline
52 & 94.5 & 96.6064136904762 & -2.1064136904762 \tabularnewline
53 & 85 & 93.3921279761905 & -8.39212797619048 \tabularnewline
54 & 100.3 & 102.334985119048 & -2.03498511904762 \tabularnewline
55 & 78.7 & 87.2349851190476 & -8.53498511904762 \tabularnewline
56 & 65.8 & 69.763556547619 & -3.96355654761906 \tabularnewline
57 & 104.8 & 107.534985119048 & -2.73498511904763 \tabularnewline
58 & 96 & 109.012514880952 & -13.0125148809524 \tabularnewline
59 & 103.3 & 103.612514880952 & -0.312514880952395 \tabularnewline
60 & 82.9 & 81.5125148809524 & 1.38748511904762 \tabularnewline
61 & 91.4 & 97.9643675595238 & -6.56436755952374 \tabularnewline
62 & 94.5 & 97.6357961309524 & -3.13579613095239 \tabularnewline
63 & 109.3 & 105.935796130952 & 3.36420386904761 \tabularnewline
64 & 92.1 & 97.7215104166667 & -5.62151041666668 \tabularnewline
65 & 99.3 & 94.507224702381 & 4.79277529761904 \tabularnewline
66 & 109.6 & 103.450081845238 & 6.1499181547619 \tabularnewline
67 & 87.5 & 88.3500818452381 & -0.850081845238102 \tabularnewline
68 & 73.1 & 70.8786532738095 & 2.22134672619046 \tabularnewline
69 & 110.7 & 108.650081845238 & 2.0499181547619 \tabularnewline
70 & 111.6 & 110.127611607143 & 1.47238839285712 \tabularnewline
71 & 110.7 & 104.727611607143 & 5.97238839285714 \tabularnewline
72 & 84 & 82.627611607143 & 1.37238839285713 \tabularnewline
73 & 101.6 & 99.0794642857142 & 2.52053571428577 \tabularnewline
74 & 102.1 & 98.7508928571429 & 3.34910714285713 \tabularnewline
75 & 113.9 & 107.050892857143 & 6.84910714285714 \tabularnewline
76 & 99 & 98.8366071428571 & 0.163392857142846 \tabularnewline
77 & 100.4 & 95.6223214285714 & 4.77767857142857 \tabularnewline
78 & 109.5 & 104.565178571429 & 4.93482142857142 \tabularnewline
79 & 93 & 89.4651785714286 & 3.53482142857142 \tabularnewline
80 & 76.8 & 71.99375 & 4.80624999999999 \tabularnewline
81 & 105.3 & 109.765178571429 & -4.46517857142858 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5846&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]101.5[/C][C]97.5496651785718[/C][C]3.95033482142816[/C][/ROW]
[ROW][C]2[/C][C]99.2[/C][C]97.22109375[/C][C]1.97890625[/C][/ROW]
[ROW][C]3[/C][C]107.8[/C][C]105.52109375[/C][C]2.27890625000002[/C][/ROW]
[ROW][C]4[/C][C]92.3[/C][C]97.3068080357142[/C][C]-5.00680803571423[/C][/ROW]
[ROW][C]5[/C][C]99.2[/C][C]94.0925223214286[/C][C]5.10747767857144[/C][/ROW]
[ROW][C]6[/C][C]101.6[/C][C]103.035379464286[/C][C]-1.43537946428569[/C][/ROW]
[ROW][C]7[/C][C]87[/C][C]87.9353794642857[/C][C]-0.935379464285681[/C][/ROW]
[ROW][C]8[/C][C]71.4[/C][C]70.4639508928571[/C][C]0.936049107142891[/C][/ROW]
[ROW][C]9[/C][C]104.7[/C][C]108.235379464286[/C][C]-3.5353794642857[/C][/ROW]
[ROW][C]10[/C][C]115.1[/C][C]109.712909226190[/C][C]5.38709077380959[/C][/ROW]
[ROW][C]11[/C][C]102.5[/C][C]104.312909226190[/C][C]-1.81290922619042[/C][/ROW]
[ROW][C]12[/C][C]75.3[/C][C]82.2129092261904[/C][C]-6.91290922619044[/C][/ROW]
[ROW][C]13[/C][C]96.7[/C][C]93.5039806547618[/C][C]3.19601934523818[/C][/ROW]
[ROW][C]14[/C][C]94.6[/C][C]93.1754092261905[/C][C]1.42459077380952[/C][/ROW]
[ROW][C]15[/C][C]98.6[/C][C]101.475409226190[/C][C]-2.87540922619048[/C][/ROW]
[ROW][C]16[/C][C]99.5[/C][C]93.2611235119048[/C][C]6.23887648809523[/C][/ROW]
[ROW][C]17[/C][C]92[/C][C]90.046837797619[/C][C]1.95316220238095[/C][/ROW]
[ROW][C]18[/C][C]93.6[/C][C]98.9896949404762[/C][C]-5.38969494047620[/C][/ROW]
[ROW][C]19[/C][C]89.3[/C][C]83.8896949404762[/C][C]5.41030505952381[/C][/ROW]
[ROW][C]20[/C][C]66.9[/C][C]66.4182663690476[/C][C]0.481733630952386[/C][/ROW]
[ROW][C]21[/C][C]108.8[/C][C]104.189694940476[/C][C]4.61030505952381[/C][/ROW]
[ROW][C]22[/C][C]113.2[/C][C]105.667224702381[/C][C]7.53277529761904[/C][/ROW]
[ROW][C]23[/C][C]105.5[/C][C]100.267224702381[/C][C]5.23277529761904[/C][/ROW]
[ROW][C]24[/C][C]77.8[/C][C]78.167224702381[/C][C]-0.367224702380958[/C][/ROW]
[ROW][C]25[/C][C]102.1[/C][C]94.6190773809523[/C][C]7.48092261904768[/C][/ROW]
[ROW][C]26[/C][C]97[/C][C]94.290505952381[/C][C]2.70949404761905[/C][/ROW]
[ROW][C]27[/C][C]95.5[/C][C]102.590505952381[/C][C]-7.09050595238095[/C][/ROW]
[ROW][C]28[/C][C]99.3[/C][C]94.3762202380952[/C][C]4.92377976190475[/C][/ROW]
[ROW][C]29[/C][C]86.4[/C][C]91.1619345238095[/C][C]-4.76193452380952[/C][/ROW]
[ROW][C]30[/C][C]92.4[/C][C]100.104791666667[/C][C]-7.70479166666666[/C][/ROW]
[ROW][C]31[/C][C]85.7[/C][C]85.0047916666667[/C][C]0.695208333333334[/C][/ROW]
[ROW][C]32[/C][C]61.9[/C][C]67.5333630952381[/C][C]-5.6333630952381[/C][/ROW]
[ROW][C]33[/C][C]104.9[/C][C]105.304791666667[/C][C]-0.404791666666663[/C][/ROW]
[ROW][C]34[/C][C]107.9[/C][C]106.782321428571[/C][C]1.11767857142857[/C][/ROW]
[ROW][C]35[/C][C]95.6[/C][C]101.382321428571[/C][C]-5.78232142857144[/C][/ROW]
[ROW][C]36[/C][C]79.8[/C][C]79.2823214285714[/C][C]0.517678571428565[/C][/ROW]
[ROW][C]37[/C][C]94.8[/C][C]95.7341741071428[/C][C]-0.934174107142794[/C][/ROW]
[ROW][C]38[/C][C]93.7[/C][C]95.4056026785714[/C][C]-1.70560267857143[/C][/ROW]
[ROW][C]39[/C][C]108.1[/C][C]103.705602678571[/C][C]4.39439732142857[/C][/ROW]
[ROW][C]40[/C][C]96.9[/C][C]95.4913169642857[/C][C]1.40868303571428[/C][/ROW]
[ROW][C]41[/C][C]88.8[/C][C]92.27703125[/C][C]-3.47703125000001[/C][/ROW]
[ROW][C]42[/C][C]106.7[/C][C]101.219888392857[/C][C]5.48011160714286[/C][/ROW]
[ROW][C]43[/C][C]86.8[/C][C]86.1198883928571[/C][C]0.680111607142851[/C][/ROW]
[ROW][C]44[/C][C]69.8[/C][C]68.6484598214286[/C][C]1.15154017857142[/C][/ROW]
[ROW][C]45[/C][C]110.9[/C][C]106.419888392857[/C][C]4.48011160714286[/C][/ROW]
[ROW][C]46[/C][C]105.4[/C][C]107.897418154762[/C][C]-2.49741815476191[/C][/ROW]
[ROW][C]47[/C][C]99.2[/C][C]102.497418154762[/C][C]-3.29741815476191[/C][/ROW]
[ROW][C]48[/C][C]84.4[/C][C]80.3974181547619[/C][C]4.00258184523809[/C][/ROW]
[ROW][C]49[/C][C]87.2[/C][C]96.8492708333333[/C][C]-9.64927083333327[/C][/ROW]
[ROW][C]50[/C][C]91.9[/C][C]96.520699404762[/C][C]-4.6206994047619[/C][/ROW]
[ROW][C]51[/C][C]97.9[/C][C]104.820699404762[/C][C]-6.9206994047619[/C][/ROW]
[ROW][C]52[/C][C]94.5[/C][C]96.6064136904762[/C][C]-2.1064136904762[/C][/ROW]
[ROW][C]53[/C][C]85[/C][C]93.3921279761905[/C][C]-8.39212797619048[/C][/ROW]
[ROW][C]54[/C][C]100.3[/C][C]102.334985119048[/C][C]-2.03498511904762[/C][/ROW]
[ROW][C]55[/C][C]78.7[/C][C]87.2349851190476[/C][C]-8.53498511904762[/C][/ROW]
[ROW][C]56[/C][C]65.8[/C][C]69.763556547619[/C][C]-3.96355654761906[/C][/ROW]
[ROW][C]57[/C][C]104.8[/C][C]107.534985119048[/C][C]-2.73498511904763[/C][/ROW]
[ROW][C]58[/C][C]96[/C][C]109.012514880952[/C][C]-13.0125148809524[/C][/ROW]
[ROW][C]59[/C][C]103.3[/C][C]103.612514880952[/C][C]-0.312514880952395[/C][/ROW]
[ROW][C]60[/C][C]82.9[/C][C]81.5125148809524[/C][C]1.38748511904762[/C][/ROW]
[ROW][C]61[/C][C]91.4[/C][C]97.9643675595238[/C][C]-6.56436755952374[/C][/ROW]
[ROW][C]62[/C][C]94.5[/C][C]97.6357961309524[/C][C]-3.13579613095239[/C][/ROW]
[ROW][C]63[/C][C]109.3[/C][C]105.935796130952[/C][C]3.36420386904761[/C][/ROW]
[ROW][C]64[/C][C]92.1[/C][C]97.7215104166667[/C][C]-5.62151041666668[/C][/ROW]
[ROW][C]65[/C][C]99.3[/C][C]94.507224702381[/C][C]4.79277529761904[/C][/ROW]
[ROW][C]66[/C][C]109.6[/C][C]103.450081845238[/C][C]6.1499181547619[/C][/ROW]
[ROW][C]67[/C][C]87.5[/C][C]88.3500818452381[/C][C]-0.850081845238102[/C][/ROW]
[ROW][C]68[/C][C]73.1[/C][C]70.8786532738095[/C][C]2.22134672619046[/C][/ROW]
[ROW][C]69[/C][C]110.7[/C][C]108.650081845238[/C][C]2.0499181547619[/C][/ROW]
[ROW][C]70[/C][C]111.6[/C][C]110.127611607143[/C][C]1.47238839285712[/C][/ROW]
[ROW][C]71[/C][C]110.7[/C][C]104.727611607143[/C][C]5.97238839285714[/C][/ROW]
[ROW][C]72[/C][C]84[/C][C]82.627611607143[/C][C]1.37238839285713[/C][/ROW]
[ROW][C]73[/C][C]101.6[/C][C]99.0794642857142[/C][C]2.52053571428577[/C][/ROW]
[ROW][C]74[/C][C]102.1[/C][C]98.7508928571429[/C][C]3.34910714285713[/C][/ROW]
[ROW][C]75[/C][C]113.9[/C][C]107.050892857143[/C][C]6.84910714285714[/C][/ROW]
[ROW][C]76[/C][C]99[/C][C]98.8366071428571[/C][C]0.163392857142846[/C][/ROW]
[ROW][C]77[/C][C]100.4[/C][C]95.6223214285714[/C][C]4.77767857142857[/C][/ROW]
[ROW][C]78[/C][C]109.5[/C][C]104.565178571429[/C][C]4.93482142857142[/C][/ROW]
[ROW][C]79[/C][C]93[/C][C]89.4651785714286[/C][C]3.53482142857142[/C][/ROW]
[ROW][C]80[/C][C]76.8[/C][C]71.99375[/C][C]4.80624999999999[/C][/ROW]
[ROW][C]81[/C][C]105.3[/C][C]109.765178571429[/C][C]-4.46517857142858[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5846&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5846&T=4




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
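In the module code below, par1 selects which column of the uploaded series is treated as the dependent variable, par2 switches the monthly (or quarterly) dummies on, and par3 adds the deterministic component (a linear trend here, or first differences). A hedged sketch of the objects that the server defines before the listed script runs (the data object y is described near the top of this page):

par1 <- '1'                         # dependent variable: first column of the series
par2 <- 'Include Monthly Dummies'   # adds the seasonal dummies M1..M11
par3 <- 'Linear Trend'              # adds the trend variable t
# y (the uploaded data series) is also supplied by the server before the script below is run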
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
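# optionally replace the series with first differences (skipped in this run: par3 = 'Linear Trend')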
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
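# add eleven monthly dummy columns M1..M11 when requested; the twelfth period acts as the reference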
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
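# append a linear trend column t = 1..n when requested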
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
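# fit ordinary least squares with the first column of df as the response and all remaining columns as regressors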
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
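# 'createtable' provides the server-side helpers table.start, table.row.start, table.element, table.row.end, table.end and table.save used below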
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')