FreeStatistics.org - Multiple Regression

Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 03:27:18 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t1195467640a97uui1v73ib42z.htm/, Retrieved Fri, 03 May 2024 14:44:01 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5666, Retrieved Fri, 03 May 2024 14:44:01 +0000
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 219
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Works 6 T1] [2007-11-19 10:27:18] [6bae8369195607c4cbc8a8485fed7b2f] [Current]
Dataseries X (first column = dependent variable y, second column = binary regressor x):
110.40	0
96.40	0
101.90	0
106.20	0
81.00	0
94.70	0
101.00	0
109.40	1
102.30	1
90.70	1
96.20	1
96.10	1
106.00	1
103.10	1
102.00	1
104.70	1
86.00	1
92.10	1
106.90	1
112.60	1
101.70	1
92.00	1
97.40	1
97.00	1
105.40	1
102.70	1
98.10	1
104.50	1
87.40	1
89.90	1
109.80	1
111.70	1
98.60	1
96.90	1
95.10	1
97.00	1
112.70	1
102.90	1
97.40	1
111.40	1
87.40	1
96.80	1
114.10	1
110.30	1
103.90	1
101.60	1
94.60	1
95.90	1
104.70	1
102.80	1
98.10	1
113.90	1
80.90	1
95.70	1
113.20	1
105.90	1
108.80	1
102.30	1
99.00	1
100.70	1
115.50	1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=5666&T=0

Multiple Linear Regression - Estimated Regression Equation
y[t] = 98.8 + 2.38148148148147 x[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=5666&T=1

Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter          S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   98.8               3.066235   32.2219                      0                0
x             2.38148148148147   3.258918   0.7308                       0.467819         0.233909

Source: https://freestatistics.org/blog/index.php?pk=5666&T=2
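
The fit is easy to reproduce outside the FreeStatistics.org engine. A minimal sketch (assuming the two columns of Dataseries X above are saved to a plain-text file, here hypothetically called 'data.txt', with the response in the first column and the dummy in the second) recovers the parameter estimates, standard errors and p-values reported above with plain lm():

# Minimal reproduction sketch; 'data.txt' is a hypothetical file holding the
# two whitespace-separated columns of Dataseries X (response y, dummy x).
df <- read.table('data.txt', col.names = c('y', 'x'))
fit <- lm(y ~ x, data = df)
summary(fit)$coefficients
# Expected, up to rounding:
#             Estimate  Std. Error  t value  Pr(>|t|)
# (Intercept) 98.80000    3.066235  32.2219  ~0
# x            2.38148    3.258918   0.7308   0.467819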








Multiple Linear Regression - Regression Statistics
Multiple R: 0.0947089920200564
R-squared: 0.00896979316945512
Adjusted R-squared: -0.0078273289802151
F-TEST (value): 0.534007735940124
F-TEST (DF numerator): 1
F-TEST (DF denominator): 59
p-value: 0.467818896835352

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 8.1124945289804
Sum Squared Residuals: 3882.94148148148

Source: https://freestatistics.org/blog/index.php?pk=5666&T=3
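
The regression and residual statistics above are internally consistent and can be checked by hand: Multiple R is the square root of R-squared, the F statistic for a single regressor is R-squared/(1 - R-squared) scaled by the degrees of freedom, its p-value follows from the F distribution, and the residual standard deviation is the square root of the sum of squared residuals divided by the residual degrees of freedom. A short check in R, using only the values printed in the table above:

# Consistency check for the statistics above (all inputs copied from the table).
R2  <- 0.00896979316945512      # R-squared
df1 <- 1                        # F-TEST (DF numerator)
df2 <- 59                       # F-TEST (DF denominator)
SSR <- 3882.94148148148         # Sum Squared Residuals
n   <- 61                       # number of observations
sqrt(R2)                           # Multiple R                  ~ 0.094709
1 - (1 - R2) * (n - 1)/df2         # Adjusted R-squared          ~ -0.007827
(Fval <- (R2/(1 - R2)) * df2/df1)  # F-TEST (value)              ~ 0.534008
1 - pf(Fval, df1, df2)             # p-value                     ~ 0.467819
sqrt(SSR/df2)                      # Residual Standard Deviation ~ 8.112495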








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index    Actuals    Interpolation (Forecast)    Residuals (Prediction Error)
1     110.4    98.8                11.6000000000001
2     96.4     98.8                -2.40000000000000
3     101.9    98.8                3.1
4     106.2    98.8                7.4
5     81       98.8                -17.8
6     94.7     98.8                -4.10000000000001
7     101      98.8                2.19999999999999
8     109.4    101.181481481481    8.21851851851852
9     102.3    101.181481481481    1.11851851851851
10    90.7     101.181481481481    -10.4814814814815
11    96.2     101.181481481481    -4.98148148148148
12    96.1     101.181481481481    -5.08148148148149
13    106      101.181481481481    4.81851851851852
14    103.1    101.181481481481    1.91851851851851
15    102      101.181481481481    0.818518518518517
16    104.7    101.181481481481    3.51851851851852
17    86       101.181481481481    -15.1814814814815
18    92.1     101.181481481481    -9.0814814814815
19    106.9    101.181481481481    5.71851851851852
20    112.6    101.181481481481    11.4185185185185
21    101.7    101.181481481481    0.518518518518519
22    92       101.181481481481    -9.18148148148148
23    97.4     101.181481481481    -3.78148148148148
24    97       101.181481481481    -4.18148148148148
25    105.4    101.181481481481    4.21851851851852
26    102.7    101.181481481481    1.51851851851852
27    98.1     101.181481481481    -3.08148148148149
28    104.5    101.181481481481    3.31851851851852
29    87.4     101.181481481481    -13.7814814814815
30    89.9     101.181481481481    -11.2814814814815
31    109.8    101.181481481481    8.61851851851852
32    111.7    101.181481481481    10.5185185185185
33    98.6     101.181481481481    -2.58148148148149
34    96.9     101.181481481481    -4.28148148148148
35    95.1     101.181481481481    -6.08148148148149
36    97       101.181481481481    -4.18148148148148
37    112.7    101.181481481481    11.5185185185185
38    102.9    101.181481481481    1.71851851851852
39    97.4     101.181481481481    -3.78148148148148
40    111.4    101.181481481481    10.2185185185185
41    87.4     101.181481481481    -13.7814814814815
42    96.8     101.181481481481    -4.38148148148149
43    114.1    101.181481481481    12.9185185185185
44    110.3    101.181481481481    9.11851851851852
45    103.9    101.181481481481    2.71851851851852
46    101.6    101.181481481481    0.418518518518511
47    94.6     101.181481481481    -6.58148148148149
48    95.9     101.181481481481    -5.28148148148148
49    104.7    101.181481481481    3.51851851851852
50    102.8    101.181481481481    1.61851851851851
51    98.1     101.181481481481    -3.08148148148149
52    113.9    101.181481481481    12.7185185185185
53    80.9     101.181481481481    -20.2814814814815
54    95.7     101.181481481481    -5.48148148148148
55    113.2    101.181481481481    12.0185185185185
56    105.9    101.181481481481    4.71851851851852
57    108.8    101.181481481481    7.61851851851851
58    102.3    101.181481481481    1.11851851851851
59    99       101.181481481481    -2.18148148148148
60    100.7    101.181481481481    -0.48148148148148
61    115.5    101.181481481481    14.3185185185185

Source: https://freestatistics.org/blog/index.php?pk=5666&T=4
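
Because the only regressor is a 0/1 dummy, the interpolation (forecast) column is simply the mean of y within each group: 98.8 for the seven observations with x = 0 and 98.8 + 2.3815 = 101.1815 for the observations with x = 1, and the residuals are the deviations from those group means. A small sketch, reusing the data frame df from the reproduction example above, makes this explicit:

# Fitted values of a regression on a single dummy are the group means of y.
fit  <- lm(y ~ x, data = df)
ghat <- ave(df$y, df$x)                       # mean of y within each level of x
all.equal(unname(ghat), unname(fitted(fit)))  # TRUE
head(df$y - ghat)                             # matches the Residuals column above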




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
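To rerun the module code below outside the FreeStatistics.org engine, the data matrix y and the three parameters have to exist in the workspace first. A hedged stand-alone setup sketch (the file name 'data.txt' is hypothetical; note that the bitmap() calls need Ghostscript and the table.* calls need the server-side 'createtable' helpers, so those parts will not run as-is):

# Hypothetical stand-alone setup for the module code below.
dat  <- read.table('data.txt', col.names = c('y', 'x'))  # Dataseries X
y    <- t(as.matrix(dat))          # the module expects the series in the rows of y
par1 <- '1'                        # column number of the dependent variable
par2 <- 'Do not include Seasonal Dummies'
par3 <- 'No Linear Trend'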
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
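# transpose the uploaded data (the engine stores the series in the rows of y)
# and move the column selected by par1 to the front so it becomes the response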
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
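# fit the regression: with a data frame argument the first column (the
# reordered response) is regressed on all remaining columns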
(mylm <- lm(df))
(mysum <- summary(mylm))
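# diagnostic plots: actuals vs. interpolation, residuals, histogram, density,
# normal Q-Q, lag plot with lowess, ACF, PACF, and the standard lm() diagnostics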
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
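# assemble the output tables with the table.* helpers loaded from 'createtable'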
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')