Author: (unverified author)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 13 Dec 2007 04:36:49 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/13/t11975450856v46i5w0uuvpad0.htm/, Retrieved Sun, 05 May 2024 19:33:30 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=3452, Retrieved Sun, 05 May 2024 19:33:30 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords
Estimated Impact: 209
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [paper 2] [2007-12-13 11:36:49] [0608207d88b1eaba866515cf0d1cb34d] [Current]
Dataseries X:
106.5
112.3
102.8
96.5
101.0
98.9
105.1
103.0
99.0
104.3
94.6
90.4
108.9
111.4
100.8
102.5
98.2
98.7
113.3
104.6
99.3
111.8
97.3
97.7
115.6
111.9
107.0
107.1
100.6
99.2
108.4
103.0
99.8
115.0
90.8
95.9
114.4
108.2
112.6
109.1
105.0
105.0
118.5
103.7
112.5
116.6
96.6
101.9
116.5
119.3
115.4
108.5
111.5
108.8
121.8
109.6
112.2
119.6
103.4
105.3
113.5




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3452&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3452&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3452&T=0








Multiple Linear Regression - Estimated Regression Equation
industriële_productie[t] = + 90.3011764705882 + 15.4292810457517M1[t] + 16.5852287581699M2[t] + 11.4647058823530M3[t] + 8.26418300653594M4[t] + 6.56366013071895M5[t] + 5.20313725490195M6[t] + 16.2826143790850M7[t] + 7.42209150326796M8[t] + 6.98156862745097M9[t] + 15.661045751634M10[t] -1.47947712418301M11[t] + 0.220522875816993t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
industriële_productie[t] =  +  90.3011764705882 +  15.4292810457517M1[t] +  16.5852287581699M2[t] +  11.4647058823530M3[t] +  8.26418300653594M4[t] +  6.56366013071895M5[t] +  5.20313725490195M6[t] +  16.2826143790850M7[t] +  7.42209150326796M8[t] +  6.98156862745097M9[t] +  15.661045751634M10[t] -1.47947712418301M11[t] +  0.220522875816993t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3452&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]industriële_productie[t] =  +  90.3011764705882 +  15.4292810457517M1[t] +  16.5852287581699M2[t] +  11.4647058823530M3[t] +  8.26418300653594M4[t] +  6.56366013071895M5[t] +  5.20313725490195M6[t] +  16.2826143790850M7[t] +  7.42209150326796M8[t] +  6.98156862745097M9[t] +  15.661045751634M10[t] -1.47947712418301M11[t] +  0.220522875816993t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3452&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3452&T=1

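The fitted equation uses eleven monthly dummies (M1 for the first month of the sample through M11, with the twelfth month as the baseline) and a deterministic linear trend t. As a rough check outside the module, the same design can be rebuilt in base R. This is only a minimal sketch under the assumption that the Dataseries X values above are stored in a numeric vector y (a name introduced here for illustration); the module's own R code is listed at the bottom of this page.

# Minimal sketch, not the module code: rebuild the monthly-dummy + trend design for 'y'
n     <- length(y)                      # 61 monthly observations assumed
month <- ((seq_len(n) - 1) %% 12) + 1   # 1 = first month of the sample
M     <- outer(month, 1:11, "==") * 1   # dummies M1..M11; month 12 is the baseline
colnames(M) <- paste0("M", 1:11)
t     <- seq_len(n)                     # linear trend
fit   <- lm(y ~ M + t)                  # OLS with intercept, dummies, and trend
coef(fit)                               # should reproduce the estimated equation above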







Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter           S.D.      T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  90.3011764705882    1.657133  54.4924                     0               0
M1           15.4292810457517    1.932607  7.9837                      0               0
M2           16.5852287581699    2.028475  8.1762                      0               0
M3           11.4647058823530    2.025885  5.6591                      1e-06           0
M4           8.26418300653594    2.023564  4.084                       0.000167        8.3e-05
M5           6.56366013071895    2.021515  3.2469                      0.002131        0.001065
M6           5.20313725490195    2.019736  2.5761                      0.013124        0.006562
M7           16.2826143790850    2.018231  8.0678                      0               0
M8           7.42209150326796    2.016998  3.6798                      0.00059         0.000295
M9           6.98156862745097    2.016038  3.463                       0.001133        0.000567
M10          15.661045751634     2.015353  7.7709                      0               0
M11          -1.47947712418301   2.014941  -0.7343                     0.466366        0.233183
t            0.220522875816993   0.023511  9.3797                      0               0

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 90.3011764705882 & 1.657133 & 54.4924 & 0 & 0 \tabularnewline
M1 & 15.4292810457517 & 1.932607 & 7.9837 & 0 & 0 \tabularnewline
M2 & 16.5852287581699 & 2.028475 & 8.1762 & 0 & 0 \tabularnewline
M3 & 11.4647058823530 & 2.025885 & 5.6591 & 1e-06 & 0 \tabularnewline
M4 & 8.26418300653594 & 2.023564 & 4.084 & 0.000167 & 8.3e-05 \tabularnewline
M5 & 6.56366013071895 & 2.021515 & 3.2469 & 0.002131 & 0.001065 \tabularnewline
M6 & 5.20313725490195 & 2.019736 & 2.5761 & 0.013124 & 0.006562 \tabularnewline
M7 & 16.2826143790850 & 2.018231 & 8.0678 & 0 & 0 \tabularnewline
M8 & 7.42209150326796 & 2.016998 & 3.6798 & 0.00059 & 0.000295 \tabularnewline
M9 & 6.98156862745097 & 2.016038 & 3.463 & 0.001133 & 0.000567 \tabularnewline
M10 & 15.661045751634 & 2.015353 & 7.7709 & 0 & 0 \tabularnewline
M11 & -1.47947712418301 & 2.014941 & -0.7343 & 0.466366 & 0.233183 \tabularnewline
t & 0.220522875816993 & 0.023511 & 9.3797 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3452&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]90.3011764705882[/C][C]1.657133[/C][C]54.4924[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M1[/C][C]15.4292810457517[/C][C]1.932607[/C][C]7.9837[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M2[/C][C]16.5852287581699[/C][C]2.028475[/C][C]8.1762[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M3[/C][C]11.4647058823530[/C][C]2.025885[/C][C]5.6591[/C][C]1e-06[/C][C]0[/C][/ROW]
[ROW][C]M4[/C][C]8.26418300653594[/C][C]2.023564[/C][C]4.084[/C][C]0.000167[/C][C]8.3e-05[/C][/ROW]
[ROW][C]M5[/C][C]6.56366013071895[/C][C]2.021515[/C][C]3.2469[/C][C]0.002131[/C][C]0.001065[/C][/ROW]
[ROW][C]M6[/C][C]5.20313725490195[/C][C]2.019736[/C][C]2.5761[/C][C]0.013124[/C][C]0.006562[/C][/ROW]
[ROW][C]M7[/C][C]16.2826143790850[/C][C]2.018231[/C][C]8.0678[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M8[/C][C]7.42209150326796[/C][C]2.016998[/C][C]3.6798[/C][C]0.00059[/C][C]0.000295[/C][/ROW]
[ROW][C]M9[/C][C]6.98156862745097[/C][C]2.016038[/C][C]3.463[/C][C]0.001133[/C][C]0.000567[/C][/ROW]
[ROW][C]M10[/C][C]15.661045751634[/C][C]2.015353[/C][C]7.7709[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M11[/C][C]-1.47947712418301[/C][C]2.014941[/C][C]-0.7343[/C][C]0.466366[/C][C]0.233183[/C][/ROW]
[ROW][C]t[/C][C]0.220522875816993[/C][C]0.023511[/C][C]9.3797[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3452&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3452&T=2

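The table above is the standard coefficient summary of the fitted model; the module reports the 1-tail p-value as half of the 2-tail p-value, as its R code at the bottom of this page shows. A minimal sketch, reusing the hypothetical fit object from the sketch after the estimated equation:

# Sketch: coefficient table with the module's 1-tail p-value column added
tab <- summary(fit)$coefficients                   # Estimate, Std. Error, t value, Pr(>|t|)
cbind(round(tab, 6),
      "1-tail p-value" = round(tab[, 4] / 2, 6))   # Pr(>|t|) / 2, as in the module code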







Multiple Linear Regression - Regression Statistics
Multiple R: 0.923456661459081
R-squared: 0.852772205593152
Adjusted R-squared: 0.81596525699144
F-TEST (value): 23.1687830148867
F-TEST (DF numerator): 12
F-TEST (DF denominator): 48
p-value: 6.66133814775094e-16
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 3.1856851491133
Sum Squared Residuals: 487.13231372549

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.923456661459081 \tabularnewline
R-squared & 0.852772205593152 \tabularnewline
Adjusted R-squared & 0.81596525699144 \tabularnewline
F-TEST (value) & 23.1687830148867 \tabularnewline
F-TEST (DF numerator) & 12 \tabularnewline
F-TEST (DF denominator) & 48 \tabularnewline
p-value & 6.66133814775094e-16 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 3.1856851491133 \tabularnewline
Sum Squared Residuals & 487.13231372549 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3452&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.923456661459081[/C][/ROW]
[ROW][C]R-squared[/C][C]0.852772205593152[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.81596525699144[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]23.1687830148867[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]12[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]48[/C][/ROW]
[ROW][C]p-value[/C][C]6.66133814775094e-16[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]3.1856851491133[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]487.13231372549[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3452&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3452&T=3

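Every statistic in this table comes from summary() of the fitted model, and the F-test p-value is computed from the F statistic and its degrees of freedom, as in the module code below. A sketch using the same hypothetical fit object as before:

# Sketch: the reported regression and residual statistics
s <- summary(fit)
sqrt(s$r.squared)                                          # Multiple R
s$r.squared                                                # R-squared
s$adj.r.squared                                            # Adjusted R-squared
s$fstatistic                                               # F value, DF numerator, DF denominator
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])  # F-test p-value
s$sigma                                                    # Residual Standard Deviation
sum(residuals(fit)^2)                                      # Sum Squared Residuals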







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation / Forecast  Residuals / Prediction Error
1    106.5  105.950980392157   0.549019607843207
2    112.3  107.327450980392   4.97254901960782
3    102.8  102.427450980392   0.37254901960783
4    96.5   99.4474509803922   -2.94745098039219
5    101    97.9674509803922   3.03254901960782
6    98.9   96.8274509803922   2.07254901960784
7    105.1  108.127450980392   -3.02745098039216
8    103    99.4874509803922   3.51254901960784
9    99     99.2674509803921   -0.267450980392144
10   104.3  108.167450980392   -3.86745098039215
11   94.6   91.2474509803922   3.35254901960783
12   90.4   92.9474509803922   -2.54745098039216
13   108.9  108.597254901961   0.302745098039209
14   111.4  109.973725490196   1.42627450980393
15   100.8  105.073725490196   -4.27372549019608
16   102.5  102.093725490196   0.406274509803932
17   98.2   100.613725490196   -2.41372549019607
18   98.7   99.473725490196    -0.773725490196077
19   113.3  110.773725490196   2.52627450980392
20   104.6  102.133725490196   2.46627450980392
21   99.3   101.913725490196   -2.61372549019609
22   111.8  110.813725490196   0.986274509803915
23   97.3   93.893725490196    3.40627450980392
24   97.7   95.593725490196    2.10627450980392
25   115.6  111.243529411765   4.35647058823527
26   111.9  112.62             -0.719999999999992
27   107    107.72             -0.719999999999996
28   107.1  104.74             2.36000000000000
29   100.6  103.26             -2.66000000000000
30   99.2   102.12             -2.92
31   108.4  113.42             -5.01999999999999
32   103    104.78             -1.78000000000000
33   99.8   104.56             -4.76000000000001
34   115    113.46             1.54000000000000
35   90.8   96.54              -5.74
36   95.9   98.24              -2.33999999999999
37   114.4  113.889803921569   0.510196078431359
38   108.2  115.266274509804   -7.06627450980391
39   112.6  110.366274509804   2.23372549019608
40   109.1  107.386274509804   1.71372549019609
41   105    105.906274509804   -0.906274509803916
42   105    104.766274509804   0.233725490196078
43   118.5  116.066274509804   2.43372549019608
44   103.7  107.426274509804   -3.72627450980392
45   112.5  107.206274509804   5.29372549019608
46   116.6  116.106274509804   0.493725490196069
47   96.6   99.186274509804    -2.58627450980392
48   101.9  100.886274509804   1.01372549019608
49   116.5  116.536078431373   -0.0360784313725666
50   119.3  117.912549019608   1.38745098039216
51   115.4  113.012549019608   2.38745098039217
52   108.5  110.032549019608   -1.53254901960783
53   111.5  108.552549019608   2.94745098039216
54   108.8  107.412549019608   1.38745098039216
55   121.8  118.712549019608   3.08745098039215
56   109.6  110.072549019608   -0.472549019607844
57   112.2  109.852549019608   2.34745098039216
58   119.6  118.752549019608   0.847450980392155
59   103.4  101.832549019608   1.56745098039217
60   105.3  103.532549019608   1.76745098039216
61   113.5  119.182352941176   -5.68235294117648

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation / Forecast & Residuals / Prediction Error \tabularnewline
1 & 106.5 & 105.950980392157 & 0.549019607843207 \tabularnewline
2 & 112.3 & 107.327450980392 & 4.97254901960782 \tabularnewline
3 & 102.8 & 102.427450980392 & 0.37254901960783 \tabularnewline
4 & 96.5 & 99.4474509803922 & -2.94745098039219 \tabularnewline
5 & 101 & 97.9674509803922 & 3.03254901960782 \tabularnewline
6 & 98.9 & 96.8274509803922 & 2.07254901960784 \tabularnewline
7 & 105.1 & 108.127450980392 & -3.02745098039216 \tabularnewline
8 & 103 & 99.4874509803922 & 3.51254901960784 \tabularnewline
9 & 99 & 99.2674509803921 & -0.267450980392144 \tabularnewline
10 & 104.3 & 108.167450980392 & -3.86745098039215 \tabularnewline
11 & 94.6 & 91.2474509803922 & 3.35254901960783 \tabularnewline
12 & 90.4 & 92.9474509803922 & -2.54745098039216 \tabularnewline
13 & 108.9 & 108.597254901961 & 0.302745098039209 \tabularnewline
14 & 111.4 & 109.973725490196 & 1.42627450980393 \tabularnewline
15 & 100.8 & 105.073725490196 & -4.27372549019608 \tabularnewline
16 & 102.5 & 102.093725490196 & 0.406274509803932 \tabularnewline
17 & 98.2 & 100.613725490196 & -2.41372549019607 \tabularnewline
18 & 98.7 & 99.473725490196 & -0.773725490196077 \tabularnewline
19 & 113.3 & 110.773725490196 & 2.52627450980392 \tabularnewline
20 & 104.6 & 102.133725490196 & 2.46627450980392 \tabularnewline
21 & 99.3 & 101.913725490196 & -2.61372549019609 \tabularnewline
22 & 111.8 & 110.813725490196 & 0.986274509803915 \tabularnewline
23 & 97.3 & 93.893725490196 & 3.40627450980392 \tabularnewline
24 & 97.7 & 95.593725490196 & 2.10627450980392 \tabularnewline
25 & 115.6 & 111.243529411765 & 4.35647058823527 \tabularnewline
26 & 111.9 & 112.62 & -0.719999999999992 \tabularnewline
27 & 107 & 107.72 & -0.719999999999996 \tabularnewline
28 & 107.1 & 104.74 & 2.36000000000000 \tabularnewline
29 & 100.6 & 103.26 & -2.66000000000000 \tabularnewline
30 & 99.2 & 102.12 & -2.92 \tabularnewline
31 & 108.4 & 113.42 & -5.01999999999999 \tabularnewline
32 & 103 & 104.78 & -1.78000000000000 \tabularnewline
33 & 99.8 & 104.56 & -4.76000000000001 \tabularnewline
34 & 115 & 113.46 & 1.54000000000000 \tabularnewline
35 & 90.8 & 96.54 & -5.74 \tabularnewline
36 & 95.9 & 98.24 & -2.33999999999999 \tabularnewline
37 & 114.4 & 113.889803921569 & 0.510196078431359 \tabularnewline
38 & 108.2 & 115.266274509804 & -7.06627450980391 \tabularnewline
39 & 112.6 & 110.366274509804 & 2.23372549019608 \tabularnewline
40 & 109.1 & 107.386274509804 & 1.71372549019609 \tabularnewline
41 & 105 & 105.906274509804 & -0.906274509803916 \tabularnewline
42 & 105 & 104.766274509804 & 0.233725490196078 \tabularnewline
43 & 118.5 & 116.066274509804 & 2.43372549019608 \tabularnewline
44 & 103.7 & 107.426274509804 & -3.72627450980392 \tabularnewline
45 & 112.5 & 107.206274509804 & 5.29372549019608 \tabularnewline
46 & 116.6 & 116.106274509804 & 0.493725490196069 \tabularnewline
47 & 96.6 & 99.186274509804 & -2.58627450980392 \tabularnewline
48 & 101.9 & 100.886274509804 & 1.01372549019608 \tabularnewline
49 & 116.5 & 116.536078431373 & -0.0360784313725666 \tabularnewline
50 & 119.3 & 117.912549019608 & 1.38745098039216 \tabularnewline
51 & 115.4 & 113.012549019608 & 2.38745098039217 \tabularnewline
52 & 108.5 & 110.032549019608 & -1.53254901960783 \tabularnewline
53 & 111.5 & 108.552549019608 & 2.94745098039216 \tabularnewline
54 & 108.8 & 107.412549019608 & 1.38745098039216 \tabularnewline
55 & 121.8 & 118.712549019608 & 3.08745098039215 \tabularnewline
56 & 109.6 & 110.072549019608 & -0.472549019607844 \tabularnewline
57 & 112.2 & 109.852549019608 & 2.34745098039216 \tabularnewline
58 & 119.6 & 118.752549019608 & 0.847450980392155 \tabularnewline
59 & 103.4 & 101.832549019608 & 1.56745098039217 \tabularnewline
60 & 105.3 & 103.532549019608 & 1.76745098039216 \tabularnewline
61 & 113.5 & 119.182352941176 & -5.68235294117648 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=3452&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation / Forecast[/C][C]Residuals / Prediction Error[/C][/ROW]
[ROW][C]1[/C][C]106.5[/C][C]105.950980392157[/C][C]0.549019607843207[/C][/ROW]
[ROW][C]2[/C][C]112.3[/C][C]107.327450980392[/C][C]4.97254901960782[/C][/ROW]
[ROW][C]3[/C][C]102.8[/C][C]102.427450980392[/C][C]0.37254901960783[/C][/ROW]
[ROW][C]4[/C][C]96.5[/C][C]99.4474509803922[/C][C]-2.94745098039219[/C][/ROW]
[ROW][C]5[/C][C]101[/C][C]97.9674509803922[/C][C]3.03254901960782[/C][/ROW]
[ROW][C]6[/C][C]98.9[/C][C]96.8274509803922[/C][C]2.07254901960784[/C][/ROW]
[ROW][C]7[/C][C]105.1[/C][C]108.127450980392[/C][C]-3.02745098039216[/C][/ROW]
[ROW][C]8[/C][C]103[/C][C]99.4874509803922[/C][C]3.51254901960784[/C][/ROW]
[ROW][C]9[/C][C]99[/C][C]99.2674509803921[/C][C]-0.267450980392144[/C][/ROW]
[ROW][C]10[/C][C]104.3[/C][C]108.167450980392[/C][C]-3.86745098039215[/C][/ROW]
[ROW][C]11[/C][C]94.6[/C][C]91.2474509803922[/C][C]3.35254901960783[/C][/ROW]
[ROW][C]12[/C][C]90.4[/C][C]92.9474509803922[/C][C]-2.54745098039216[/C][/ROW]
[ROW][C]13[/C][C]108.9[/C][C]108.597254901961[/C][C]0.302745098039209[/C][/ROW]
[ROW][C]14[/C][C]111.4[/C][C]109.973725490196[/C][C]1.42627450980393[/C][/ROW]
[ROW][C]15[/C][C]100.8[/C][C]105.073725490196[/C][C]-4.27372549019608[/C][/ROW]
[ROW][C]16[/C][C]102.5[/C][C]102.093725490196[/C][C]0.406274509803932[/C][/ROW]
[ROW][C]17[/C][C]98.2[/C][C]100.613725490196[/C][C]-2.41372549019607[/C][/ROW]
[ROW][C]18[/C][C]98.7[/C][C]99.473725490196[/C][C]-0.773725490196077[/C][/ROW]
[ROW][C]19[/C][C]113.3[/C][C]110.773725490196[/C][C]2.52627450980392[/C][/ROW]
[ROW][C]20[/C][C]104.6[/C][C]102.133725490196[/C][C]2.46627450980392[/C][/ROW]
[ROW][C]21[/C][C]99.3[/C][C]101.913725490196[/C][C]-2.61372549019609[/C][/ROW]
[ROW][C]22[/C][C]111.8[/C][C]110.813725490196[/C][C]0.986274509803915[/C][/ROW]
[ROW][C]23[/C][C]97.3[/C][C]93.893725490196[/C][C]3.40627450980392[/C][/ROW]
[ROW][C]24[/C][C]97.7[/C][C]95.593725490196[/C][C]2.10627450980392[/C][/ROW]
[ROW][C]25[/C][C]115.6[/C][C]111.243529411765[/C][C]4.35647058823527[/C][/ROW]
[ROW][C]26[/C][C]111.9[/C][C]112.62[/C][C]-0.719999999999992[/C][/ROW]
[ROW][C]27[/C][C]107[/C][C]107.72[/C][C]-0.719999999999996[/C][/ROW]
[ROW][C]28[/C][C]107.1[/C][C]104.74[/C][C]2.36000000000000[/C][/ROW]
[ROW][C]29[/C][C]100.6[/C][C]103.26[/C][C]-2.66000000000000[/C][/ROW]
[ROW][C]30[/C][C]99.2[/C][C]102.12[/C][C]-2.92[/C][/ROW]
[ROW][C]31[/C][C]108.4[/C][C]113.42[/C][C]-5.01999999999999[/C][/ROW]
[ROW][C]32[/C][C]103[/C][C]104.78[/C][C]-1.78000000000000[/C][/ROW]
[ROW][C]33[/C][C]99.8[/C][C]104.56[/C][C]-4.76000000000001[/C][/ROW]
[ROW][C]34[/C][C]115[/C][C]113.46[/C][C]1.54000000000000[/C][/ROW]
[ROW][C]35[/C][C]90.8[/C][C]96.54[/C][C]-5.74[/C][/ROW]
[ROW][C]36[/C][C]95.9[/C][C]98.24[/C][C]-2.33999999999999[/C][/ROW]
[ROW][C]37[/C][C]114.4[/C][C]113.889803921569[/C][C]0.510196078431359[/C][/ROW]
[ROW][C]38[/C][C]108.2[/C][C]115.266274509804[/C][C]-7.06627450980391[/C][/ROW]
[ROW][C]39[/C][C]112.6[/C][C]110.366274509804[/C][C]2.23372549019608[/C][/ROW]
[ROW][C]40[/C][C]109.1[/C][C]107.386274509804[/C][C]1.71372549019609[/C][/ROW]
[ROW][C]41[/C][C]105[/C][C]105.906274509804[/C][C]-0.906274509803916[/C][/ROW]
[ROW][C]42[/C][C]105[/C][C]104.766274509804[/C][C]0.233725490196078[/C][/ROW]
[ROW][C]43[/C][C]118.5[/C][C]116.066274509804[/C][C]2.43372549019608[/C][/ROW]
[ROW][C]44[/C][C]103.7[/C][C]107.426274509804[/C][C]-3.72627450980392[/C][/ROW]
[ROW][C]45[/C][C]112.5[/C][C]107.206274509804[/C][C]5.29372549019608[/C][/ROW]
[ROW][C]46[/C][C]116.6[/C][C]116.106274509804[/C][C]0.493725490196069[/C][/ROW]
[ROW][C]47[/C][C]96.6[/C][C]99.186274509804[/C][C]-2.58627450980392[/C][/ROW]
[ROW][C]48[/C][C]101.9[/C][C]100.886274509804[/C][C]1.01372549019608[/C][/ROW]
[ROW][C]49[/C][C]116.5[/C][C]116.536078431373[/C][C]-0.0360784313725666[/C][/ROW]
[ROW][C]50[/C][C]119.3[/C][C]117.912549019608[/C][C]1.38745098039216[/C][/ROW]
[ROW][C]51[/C][C]115.4[/C][C]113.012549019608[/C][C]2.38745098039217[/C][/ROW]
[ROW][C]52[/C][C]108.5[/C][C]110.032549019608[/C][C]-1.53254901960783[/C][/ROW]
[ROW][C]53[/C][C]111.5[/C][C]108.552549019608[/C][C]2.94745098039216[/C][/ROW]
[ROW][C]54[/C][C]108.8[/C][C]107.412549019608[/C][C]1.38745098039216[/C][/ROW]
[ROW][C]55[/C][C]121.8[/C][C]118.712549019608[/C][C]3.08745098039215[/C][/ROW]
[ROW][C]56[/C][C]109.6[/C][C]110.072549019608[/C][C]-0.472549019607844[/C][/ROW]
[ROW][C]57[/C][C]112.2[/C][C]109.852549019608[/C][C]2.34745098039216[/C][/ROW]
[ROW][C]58[/C][C]119.6[/C][C]118.752549019608[/C][C]0.847450980392155[/C][/ROW]
[ROW][C]59[/C][C]103.4[/C][C]101.832549019608[/C][C]1.56745098039217[/C][/ROW]
[ROW][C]60[/C][C]105.3[/C][C]103.532549019608[/C][C]1.76745098039216[/C][/ROW]
[ROW][C]61[/C][C]113.5[/C][C]119.182352941176[/C][C]-5.68235294117648[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=3452&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=3452&T=4

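The interpolation column is simply the fitted value for each observation, i.e. the actual minus the residual, which is how the module computes it. A sketch with the hypothetical fit and y from the earlier sketches:

# Sketch: actuals, in-sample interpolation (fitted values), and residuals
data.frame(Time          = seq_along(y),
           Actuals       = y,
           Interpolation = fitted(fit),    # equals y - residuals(fit)
           Residuals     = residuals(fit))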



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
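# Reorder the columns so that the dependent variable (column par1) comes first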
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
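# Optionally replace every series by its first differences (par3 == 'First Differences')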
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
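# Optionally add monthly dummy variables M1..M11 (the twelfth month is the baseline)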
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
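# Or quarterly dummies Q1..Q3 (the fourth quarter is the baseline)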
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
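# Optionally append a deterministic linear trend column 't'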
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
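# Fit the regression: lm() on the data frame regresses the first column on all the others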
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
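# Write the diagnostic plots as PNG bitmaps (actuals and interpolation, residuals, histogram, density, Q-Q, lag plot, ACF, PACF, lm diagnostics)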
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
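# Assemble the output tables shown above, using the table helpers loaded from 'createtable'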
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')