FreeStatistics.org - Free Statistics Software

R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 03:36:18 -0700

Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t1195468163gr10zjjpwbpdrb6.htm/, Retrieved Fri, 03 May 2024 04:58:37 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5675, Retrieved Fri, 03 May 2024 04:58:37 +0000
IsPrivate? No (this computation is public)
Estimated Impact: 224
Revision history (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
- [Multiple Regression] [Ws6 T2] [2007-11-19 10:36:18] [6bae8369195607c4cbc8a8485fed7b2f] [Current]
Dataseries X:
110.40	0
96.40	0
101.90	0
106.20	0
81.00	0
94.70	0
101.00	0
109.40	1
102.30	1
90.70	1
96.20	1
96.10	1
106.00	1
103.10	1
102.00	1
104.70	1
86.00	1
92.10	1
106.90	1
112.60	1
101.70	1
92.00	1
97.40	1
97.00	1
105.40	1
102.70	1
98.10	1
104.50	1
87.40	1
89.90	1
109.80	1
111.70	1
98.60	1
96.90	1
95.10	1
97.00	1
112.70	1
102.90	1
97.40	1
111.40	1
87.40	1
96.80	1
114.10	1
110.30	1
103.90	1
101.60	1
94.60	1
95.90	1
104.70	1
102.80	1
98.10	1
113.90	1
80.90	1
95.70	1
113.20	1
105.90	1
108.80	1
102.30	1
99.00	1
100.70	1
115.50	1
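The data series above is a plain two-column listing: the observation value and a 0/1 dummy. A minimal sketch of loading such a listing into R with read.table (the column names y and x are assumptions, chosen to match the variable names in the regression output below; only the first few rows are repeated):

```r
# Sketch: parse a whitespace-separated two-column series like the one above.
# 'y' and 'x' are assumed column names, matching the regression output below.
txt <- "110.40 0
96.40 0
101.90 0
106.20 0
81.00 0"
df <- read.table(text = txt, col.names = c("y", "x"))
head(df)
```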




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=5675&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5675&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = 94.8340828402367 + 2.50591715976330 x[t] + 12.1943195266272 M1[t] + 4.74118343195268 M2[t] + 2.66118343195267 M3[t] + 11.3011834319527 M4[t] - 12.2988165680473 M5[t] - 2.99881656804733 M6[t] + 12.1611834319527 M7[t] + 12.64 M8[t] + 5.72000000000001 M9[t] - 0.639999999999997 M10[t] - 0.879999999999997 M11[t] + e[t]
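The fitted model regresses y on the level-shift dummy x plus eleven monthly dummies M1..M11, with December (month 12) as the baseline month. A minimal sketch of the same model structure in base R; the y values here are synthetic for illustration, so the coefficients will not reproduce the table values:

```r
# Sketch: intercept + level-shift dummy + 11 monthly dummies, as in the
# equation above. y is synthetic here; the real series is listed earlier.
set.seed(1)
n <- 61
y <- rnorm(n, mean = 100, sd = 8)
x <- c(rep(0, 7), rep(1, n - 7))                 # dummy switches on at t = 8
M <- sapply(1:11, function(i) as.numeric(seq_len(n) %% 12 == i))
colnames(M) <- paste0("M", 1:11)
fit <- lm(y ~ x + M)
length(coef(fit))                                # 13 parameters, as in the output
```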

Source: https://freestatistics.org/blog/index.php?pk=5675&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5675&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter           S.D.      T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  94.8340828402367    2.176328  43.5753                     0               0
x            2.50591715976330    1.492364  1.6792                      0.09962         0.04981
M1           12.1943195266272    2.159205  5.6476                      1e-06           0
M2           4.74118343195268    2.259999  2.0979                      0.041206        0.020603
M3           2.66118343195267    2.259999  1.1775                      0.244795        0.122398
M4           11.3011834319527    2.259999  5.0005                      8e-06           4e-06
M5           -12.2988165680473   2.259999  -5.442                      2e-06           1e-06
M6           -2.99881656804733   2.259999  -1.3269                     0.190816        0.095408
M7           12.1611834319527    2.259999  5.3811                      2e-06           1e-06
M8           12.64               2.240203  5.6423                      1e-06           0
M9           5.72000000000001    2.240203  2.5533                      0.013902        0.006951
M10          -0.639999999999997  2.240203  -0.2857                     0.776346        0.388173
M11          -0.879999999999997  2.240203  -0.3928                     0.69619         0.348095
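The T-STAT and p-value columns follow directly from the estimate and its standard deviation: t = Parameter / S.D., the 2-tail p-value is 2 P(T <= -|t|) under a t distribution with 48 residual degrees of freedom (61 observations minus 13 parameters), and the 1-tail p-value is half of that. A sketch checking this for the x row of the table (using the rounded S.D. as printed, so the results agree only approximately):

```r
# Sketch: recompute the test statistics for the 'x' row of the table above.
est   <- 2.50591715976330
se    <- 1.492364                 # the printed (rounded) S.D.
dfree <- 48                       # 61 observations - 13 parameters
tstat <- est / se
p2 <- 2 * pt(-abs(tstat), dfree)  # 2-tail p-value
p1 <- p2 / 2                      # 1-tail p-value
round(c(tstat, p2, p1), 5)        # close to 1.6792, 0.09962, 0.04981
```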

Source: https://freestatistics.org/blog/index.php?pk=5675&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5675&T=2








Multiple Linear Regression - Regression Statistics
Multiple R: 0.91994407847051
R-squared: 0.846297107512956
Adjusted R-squared: 0.807871384391195
F-TEST (value): 22.0242337361164
F-TEST (DF numerator): 12
F-TEST (DF denominator): 48
p-value: 1.55431223447522e-15

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 3.54207194571751
Sum Squared Residuals: 602.221136094675
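The summary statistics above are tied together by standard identities: Multiple R = sqrt(R²), adjusted R² = 1 - (1 - R²)(n - 1)/(n - p - 1), and the F statistic is (R²/p) / ((1 - R²)/(n - p - 1)) with p = 12 numerator and n - p - 1 = 48 denominator degrees of freedom. A sketch verifying the printed values from R-squared alone:

```r
# Sketch: reproduce the summary statistics above from R-squared alone.
R2 <- 0.846297107512956
n  <- 61                 # observations
p  <- 12                 # regressors excluding the intercept
multR <- sqrt(R2)                                  # Multiple R
adjR2 <- 1 - (1 - R2) * (n - 1) / (n - p - 1)      # Adjusted R-squared
Fval  <- (R2 / p) / ((1 - R2) / (n - p - 1))       # F-TEST (value)
pval  <- pf(Fval, p, n - p - 1, lower.tail = FALSE)
c(multR, adjR2, Fval, pval)
```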

Source: https://freestatistics.org/blog/index.php?pk=5675&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5675&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1    110.4  107.028402366864  3.37159763313617
2    96.4   99.5752662721893  -3.17526627218933
3    101.9  97.4952662721894  4.40473372781065
4    106.2  106.135266272189  0.0647337278106297
5    81     82.5352662721894  -1.53526627218938
6    94.7   91.8352662721894  2.86473372781064
7    101    106.995266272189  -5.99526627218935
8    109.4  109.98            -0.580000000000001
9    102.3  103.06            -0.759999999999995
10   90.7   96.7              -6
11   96.2   96.46             -0.259999999999999
12   96.1   97.34             -1.24000000000000
13   106    109.534319526627  -3.53431952662724
14   103.1  102.081183431953  1.01881656804733
15   102    100.001183431953  1.99881656804734
16   104.7  108.641183431953  -3.94118343195266
17   86     85.0411834319527  0.958816568047336
18   92.1   94.3411834319527  -2.24118343195266
19   106.9  109.501183431953  -2.60118343195266
20   112.6  109.98            2.61999999999999
21   101.7  103.06            -1.36
22   92     96.7              -4.7
23   97.4   96.46             0.940000000000006
24   97     97.34             -0.339999999999997
25   105.4  109.534319526627  -4.13431952662723
26   102.7  102.081183431953  0.618816568047336
27   98.1   100.001183431953  -1.90118343195267
28   104.5  108.641183431953  -4.14118343195266
29   87.4   85.0411834319527  2.35881656804734
30   89.9   94.3411834319527  -4.44118343195265
31   109.8  109.501183431953  0.298816568047331
32   111.7  109.98            1.72
33   98.6   103.06            -4.46000000000001
34   96.9   96.7              0.200000000000006
35   95.1   96.46             -1.36000000000000
36   97     97.34             -0.339999999999997
37   112.7  109.534319526627  3.16568047337277
38   102.9  102.081183431953  0.818816568047338
39   97.4   100.001183431953  -2.60118343195265
40   111.4  108.641183431953  2.75881656804734
41   87.4   85.0411834319527  2.35881656804734
42   96.8   94.3411834319527  2.45881656804734
43   114.1  109.501183431953  4.59881656804733
44   110.3  109.98            0.319999999999996
45   103.9  103.06            0.840000000000005
46   101.6  96.7              4.90000000000000
47   94.6   96.46             -1.86000000000000
48   95.9   97.34             -1.43999999999999
49   104.7  109.534319526627  -4.83431952662723
50   102.8  102.081183431953  0.71881656804733
51   98.1   100.001183431953  -1.90118343195267
52   113.9  108.641183431953  5.25881656804734
53   80.9   85.0411834319527  -4.14118343195265
54   95.7   94.3411834319527  1.35881656804734
55   113.2  109.501183431953  3.69881656804734
56   105.9  109.98            -4.08000000000000
57   108.8  103.06            5.74
58   102.3  96.7              5.6
59   99     96.46             2.54
60   100.7  97.34             3.36000000000001
61   115.5  109.534319526627  5.96568047337276
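In this table the Residuals (Prediction Error) column is simply Actuals minus Interpolation, i.e. e[t] = y[t] - y_hat[t]. A sketch checking the identity against the first row:

```r
# Sketch: residual = actual - interpolation, checked against row 1 above.
actual <- 110.4
interp <- 107.028402366864
resid  <- actual - interp
resid                     # about 3.3716, matching the table
```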

Source: https://freestatistics.org/blog/index.php?pk=5675&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5675&T=4




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])  # number of variables (columns)
n <- length(x[,1])  # number of observations (rows)
# Move the column selected by par1 to the front so it becomes the response in lm()
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
# Replace each series by its first differences (1-B)x; note 1:(n-1), not 1:n-1
# (the latter parses as (1:n)-1 and silently skips index 0)
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
# Monthly dummies M1..M11: observation t gets Mi = 1 when t falls in month i
# (rows i, i+12, i+24, ...); month 12 is the baseline category
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
print(densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals'))  # lattice plots must be print()ed inside scripts
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
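The module's dummy construction (`x2[seq(i,n,12),i] <- 1`) is equivalent to expanding a month factor with December as the reference level. A sketch verifying the equivalence against base R's model.matrix (the `month` variable here is illustrative, not part of the module):

```r
# Sketch: the module's monthly-dummy loop vs. model.matrix on a month factor.
n <- 61
x2 <- array(0, dim = c(n, 11), dimnames = list(1:n, paste('M', 1:11, sep = '')))
for (i in 1:11) x2[seq(i, n, 12), i] <- 1
month <- factor(((seq_len(n) - 1) %% 12) + 1, levels = c(12, 1:11))  # Dec = baseline
mm <- model.matrix(~ month)[, -1]                                    # drop intercept
all(x2 == mm)                                                        # TRUE
```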