Free Statistics

Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 19 Nov 2007 04:01:48 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/19/t1195469763yqp27qafvvu2sqv.htm/, Retrieved Fri, 03 May 2024 07:39:26 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5698, Retrieved Fri, 03 May 2024 07:39:26 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords: Q3
Estimated Impact: 193
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [the seatbelt law] [2007-11-19 11:01:48] [c4516de5538230e4cf0ae0b9d9e43dd3] [Current]
Dataseries X:
102.3	0
98.7	0
104.4	0
97.6	0
102.7	0
103.0	0
92.9	0
96.1	0
94.9	0
99.9	0
96.3	0
89.5	0
104.6	0
101.5	0
109.8	0
112.1	0
110.1	0
107.1	0
108.1	0
99.0	0
104.0	0
106.7	0
101.1	0
97.8	0
113.8	0
107.1	0
117.5	1
113.7	1
106.6	1
109.8	1
108.8	1
102.0	1
114.5	1
116.5	1
108.6	1
113.9	1
109.3	1
112.5	1
123.4	1
115.2	1
110.8	1
120.4	1
117.6	1
111.2	1
131.1	1
118.9	1
115.7	1
119.6	1
113.1	1
106.4	1
115.5	1
111.8	1
109.6	1
121.5	1
109.5	1
109.0	1
113.4	1
112.7	1
114.4	1
109.2	1
116.2	1
113.8	1
123.6	1
112.6	1
117.7	1
113.3	1
110.7	1
114.7	1
116.9	1
120.6	1
111.6	1
111.9	1
116.1	1
111.9	1
125.1	1
115.1	1
116.7	1
115.8	1
116.8	1
113.0	1
106.5	1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5698&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5698&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5698&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = 102.35 + 11.91 x[t] + e[t]
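Since the only regressor is the 0/1 seatbelt-law dummy, the fitted equation reduces to two group means: the intercept is the mean of the pre-law observations, and intercept plus slope is the mean of the post-law observations. A minimal R check against the first 26 values of Dataseries X (the x = 0 regime):

```r
# Pre-law (x = 0) observations copied from the Dataseries X block above.
y0 <- c(102.3, 98.7, 104.4, 97.6, 102.7, 103.0, 92.9, 96.1, 94.9, 99.9,
        96.3, 89.5, 104.6, 101.5, 109.8, 112.1, 110.1, 107.1, 108.1, 99.0,
        104.0, 106.7, 101.1, 97.8, 113.8, 107.1)
# With a single dummy regressor, the OLS intercept is this group's mean.
mean(y0)  # 102.35, matching the reported intercept
```

The slope, 11.91, is then the post-law regime mean (114.26) minus this value.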

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] = 102.35 + 11.91 x[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5698&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] = 102.35 + 11.91 x[t] + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5698&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5698&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	102.35	1.068477	95.7905	0	0
x	11.91	1.296661	9.1851	0	0
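The T-STAT column is Parameter divided by S.D., and the tail probabilities come from the t distribution with 81 − 2 = 79 residual degrees of freedom. A quick check of the slope row:

```r
b  <- 11.91     # slope estimate (Parameter)
se <- 1.296661  # its standard error (S.D.)
tstat <- b / se                     # about 9.1851, as reported
p2 <- 2 * pt(-abs(tstat), df = 79)  # 2-tail p-value, about 4.2e-14
p1 <- p2 / 2                        # 1-tail p-value
```

The table rounds both p-values to 0; because there is a single regressor, the exact 2-tail value equals the F-test p-value reported further down.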

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 102.35 & 1.068477 & 95.7905 & 0 & 0 \tabularnewline
x & 11.91 & 1.296661 & 9.1851 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5698&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]102.35[/C][C]1.068477[/C][C]95.7905[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]11.91[/C][C]1.296661[/C][C]9.1851[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5698&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5698&T=2








Multiple Linear Regression - Regression Statistics
Multiple R: 0.718627141240444
R-squared: 0.516424968127414
Adjusted R-squared: 0.510303765192318
F-TEST (value): 84.3665817982413
F-TEST (DF numerator): 1
F-TEST (DF denominator): 79
p-value: 4.2410519540681e-14
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 5.44818748167159
Sum Squared Residuals: 2344.937
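For a one-regressor model these statistics are tied together: the F value equals the squared slope t-statistic, R² = F / (F + denominator df), and the residual standard deviation is the square root of SSR divided by the residual df. A consistency check in R:

```r
tstat <- 9.1851               # slope t-statistic from the OLS table
df2   <- 79                   # F-TEST (DF denominator)
ssr   <- 2344.937             # Sum Squared Residuals
f   <- tstat^2                # about 84.37 (F-TEST value)
r2  <- f / (f + df2)          # about 0.5164 (R-squared)
adj <- 1 - (1 - r2) * 80 / 79 # about 0.5103 (Adjusted R-squared); n = 81, k = 1
rsd <- sqrt(ssr / df2)        # about 5.448 (Residual Standard Deviation)
```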

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.718627141240444 \tabularnewline
R-squared & 0.516424968127414 \tabularnewline
Adjusted R-squared & 0.510303765192318 \tabularnewline
F-TEST (value) & 84.3665817982413 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 79 \tabularnewline
p-value & 4.2410519540681e-14 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 5.44818748167159 \tabularnewline
Sum Squared Residuals & 2344.937 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5698&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.718627141240444[/C][/ROW]
[ROW][C]R-squared[/C][C]0.516424968127414[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.510303765192318[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]84.3665817982413[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]1[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]79[/C][/ROW]
[ROW][C]p-value[/C][C]4.2410519540681e-14[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]5.44818748167159[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]2344.937[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5698&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5698&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	102.3	102.350000000000	-0.0500000000001618
2	98.7	102.35	-3.64999999999991
3	104.4	102.35	2.05000000000001
4	97.6	102.35	-4.75
5	102.7	102.35	0.350000000000008
6	103	102.35	0.650000000000005
7	92.9	102.35	-9.44999999999999
8	96.1	102.35	-6.25
9	94.9	102.35	-7.44999999999999
10	99.9	102.35	-2.44999999999999
11	96.3	102.35	-6.05
12	89.5	102.35	-12.85
13	104.6	102.35	2.25
14	101.5	102.35	-0.849999999999995
15	109.8	102.35	7.45
16	112.1	102.35	9.75
17	110.1	102.35	7.75
18	107.1	102.35	4.75
19	108.1	102.35	5.75
20	99	102.35	-3.34999999999999
21	104	102.35	1.65000000000001
22	106.7	102.35	4.35000000000001
23	101.1	102.35	-1.25
24	97.8	102.35	-4.55
25	113.8	102.35	11.45
26	107.1	102.35	4.75
27	117.5	114.26	3.24
28	113.7	114.26	-0.559999999999997
29	106.6	114.26	-7.66
30	109.8	114.26	-4.46
31	108.8	114.26	-5.46
32	102	114.26	-12.26
33	114.5	114.26	0.24
34	116.5	114.26	2.24
35	108.6	114.26	-5.66
36	113.9	114.26	-0.359999999999994
37	109.3	114.26	-4.96
38	112.5	114.26	-1.76
39	123.4	114.26	9.14
40	115.2	114.26	0.940000000000003
41	110.8	114.26	-3.46
42	120.4	114.26	6.14000000000001
43	117.6	114.26	3.33999999999999
44	111.2	114.26	-3.06
45	131.1	114.26	16.84
46	118.9	114.26	4.64000000000001
47	115.7	114.26	1.44000000000000
48	119.6	114.26	5.34
49	113.1	114.26	-1.16000000000001
50	106.4	114.26	-7.86
51	115.5	114.26	1.24
52	111.8	114.26	-2.46000000000000
53	109.6	114.26	-4.66000000000001
54	121.5	114.26	7.24
55	109.5	114.26	-4.76
56	109	114.26	-5.26
57	113.4	114.26	-0.859999999999994
58	112.7	114.26	-1.56000000000000
59	114.4	114.26	0.140000000000006
60	109.2	114.26	-5.06
61	116.2	114.26	1.94000000000000
62	113.8	114.26	-0.460000000000003
63	123.6	114.26	9.34
64	112.6	114.26	-1.66000000000001
65	117.7	114.26	3.44
66	113.3	114.26	-0.960000000000003
67	110.7	114.26	-3.56
68	114.7	114.26	0.440000000000003
69	116.9	114.26	2.64000000000001
70	120.6	114.26	6.34
71	111.6	114.26	-2.66000000000001
72	111.9	114.26	-2.35999999999999
73	116.1	114.26	1.83999999999999
74	111.9	114.26	-2.35999999999999
75	125.1	114.26	10.84
76	115.1	114.26	0.839999999999994
77	116.7	114.26	2.44000000000000
78	115.8	114.26	1.54000000000000
79	116.8	114.26	2.54000000000000
80	113	114.26	-1.26
81	106.5	114.26	-7.76
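Because the fitted (interpolation) values are just the two regime means, every residual in this table is the actual minus 102.35 (indexes 1-26) or minus 114.26 (indexes 27-81). For example, the largest residual, at index 45:

```r
actual <- 131.1
fitted <- 114.26           # post-law regime mean (102.35 + 11.91)
resid  <- actual - fitted  # 16.84, the value reported for index 45
```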

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 102.3 & 102.350000000000 & -0.0500000000001618 \tabularnewline
2 & 98.7 & 102.35 & -3.64999999999991 \tabularnewline
3 & 104.4 & 102.35 & 2.05000000000001 \tabularnewline
4 & 97.6 & 102.35 & -4.75 \tabularnewline
5 & 102.7 & 102.35 & 0.350000000000008 \tabularnewline
6 & 103 & 102.35 & 0.650000000000005 \tabularnewline
7 & 92.9 & 102.35 & -9.44999999999999 \tabularnewline
8 & 96.1 & 102.35 & -6.25 \tabularnewline
9 & 94.9 & 102.35 & -7.44999999999999 \tabularnewline
10 & 99.9 & 102.35 & -2.44999999999999 \tabularnewline
11 & 96.3 & 102.35 & -6.05 \tabularnewline
12 & 89.5 & 102.35 & -12.85 \tabularnewline
13 & 104.6 & 102.35 & 2.25 \tabularnewline
14 & 101.5 & 102.35 & -0.849999999999995 \tabularnewline
15 & 109.8 & 102.35 & 7.45 \tabularnewline
16 & 112.1 & 102.35 & 9.75 \tabularnewline
17 & 110.1 & 102.35 & 7.75 \tabularnewline
18 & 107.1 & 102.35 & 4.75 \tabularnewline
19 & 108.1 & 102.35 & 5.75 \tabularnewline
20 & 99 & 102.35 & -3.34999999999999 \tabularnewline
21 & 104 & 102.35 & 1.65000000000001 \tabularnewline
22 & 106.7 & 102.35 & 4.35000000000001 \tabularnewline
23 & 101.1 & 102.35 & -1.25 \tabularnewline
24 & 97.8 & 102.35 & -4.55 \tabularnewline
25 & 113.8 & 102.35 & 11.45 \tabularnewline
26 & 107.1 & 102.35 & 4.75 \tabularnewline
27 & 117.5 & 114.26 & 3.24 \tabularnewline
28 & 113.7 & 114.26 & -0.559999999999997 \tabularnewline
29 & 106.6 & 114.26 & -7.66 \tabularnewline
30 & 109.8 & 114.26 & -4.46 \tabularnewline
31 & 108.8 & 114.26 & -5.46 \tabularnewline
32 & 102 & 114.26 & -12.26 \tabularnewline
33 & 114.5 & 114.26 & 0.24 \tabularnewline
34 & 116.5 & 114.26 & 2.24 \tabularnewline
35 & 108.6 & 114.26 & -5.66 \tabularnewline
36 & 113.9 & 114.26 & -0.359999999999994 \tabularnewline
37 & 109.3 & 114.26 & -4.96 \tabularnewline
38 & 112.5 & 114.26 & -1.76 \tabularnewline
39 & 123.4 & 114.26 & 9.14 \tabularnewline
40 & 115.2 & 114.26 & 0.940000000000003 \tabularnewline
41 & 110.8 & 114.26 & -3.46 \tabularnewline
42 & 120.4 & 114.26 & 6.14000000000001 \tabularnewline
43 & 117.6 & 114.26 & 3.33999999999999 \tabularnewline
44 & 111.2 & 114.26 & -3.06 \tabularnewline
45 & 131.1 & 114.26 & 16.84 \tabularnewline
46 & 118.9 & 114.26 & 4.64000000000001 \tabularnewline
47 & 115.7 & 114.26 & 1.44000000000000 \tabularnewline
48 & 119.6 & 114.26 & 5.34 \tabularnewline
49 & 113.1 & 114.26 & -1.16000000000001 \tabularnewline
50 & 106.4 & 114.26 & -7.86 \tabularnewline
51 & 115.5 & 114.26 & 1.24 \tabularnewline
52 & 111.8 & 114.26 & -2.46000000000000 \tabularnewline
53 & 109.6 & 114.26 & -4.66000000000001 \tabularnewline
54 & 121.5 & 114.26 & 7.24 \tabularnewline
55 & 109.5 & 114.26 & -4.76 \tabularnewline
56 & 109 & 114.26 & -5.26 \tabularnewline
57 & 113.4 & 114.26 & -0.859999999999994 \tabularnewline
58 & 112.7 & 114.26 & -1.56000000000000 \tabularnewline
59 & 114.4 & 114.26 & 0.140000000000006 \tabularnewline
60 & 109.2 & 114.26 & -5.06 \tabularnewline
61 & 116.2 & 114.26 & 1.94000000000000 \tabularnewline
62 & 113.8 & 114.26 & -0.460000000000003 \tabularnewline
63 & 123.6 & 114.26 & 9.34 \tabularnewline
64 & 112.6 & 114.26 & -1.66000000000001 \tabularnewline
65 & 117.7 & 114.26 & 3.44 \tabularnewline
66 & 113.3 & 114.26 & -0.960000000000003 \tabularnewline
67 & 110.7 & 114.26 & -3.56 \tabularnewline
68 & 114.7 & 114.26 & 0.440000000000003 \tabularnewline
69 & 116.9 & 114.26 & 2.64000000000001 \tabularnewline
70 & 120.6 & 114.26 & 6.34 \tabularnewline
71 & 111.6 & 114.26 & -2.66000000000001 \tabularnewline
72 & 111.9 & 114.26 & -2.35999999999999 \tabularnewline
73 & 116.1 & 114.26 & 1.83999999999999 \tabularnewline
74 & 111.9 & 114.26 & -2.35999999999999 \tabularnewline
75 & 125.1 & 114.26 & 10.84 \tabularnewline
76 & 115.1 & 114.26 & 0.839999999999994 \tabularnewline
77 & 116.7 & 114.26 & 2.44000000000000 \tabularnewline
78 & 115.8 & 114.26 & 1.54000000000000 \tabularnewline
79 & 116.8 & 114.26 & 2.54000000000000 \tabularnewline
80 & 113 & 114.26 & -1.26 \tabularnewline
81 & 106.5 & 114.26 & -7.76 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5698&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]102.3[/C][C]102.350000000000[/C][C]-0.0500000000001618[/C][/ROW]
[ROW][C]2[/C][C]98.7[/C][C]102.35[/C][C]-3.64999999999991[/C][/ROW]
[ROW][C]3[/C][C]104.4[/C][C]102.35[/C][C]2.05000000000001[/C][/ROW]
[ROW][C]4[/C][C]97.6[/C][C]102.35[/C][C]-4.75[/C][/ROW]
[ROW][C]5[/C][C]102.7[/C][C]102.35[/C][C]0.350000000000008[/C][/ROW]
[ROW][C]6[/C][C]103[/C][C]102.35[/C][C]0.650000000000005[/C][/ROW]
[ROW][C]7[/C][C]92.9[/C][C]102.35[/C][C]-9.44999999999999[/C][/ROW]
[ROW][C]8[/C][C]96.1[/C][C]102.35[/C][C]-6.25[/C][/ROW]
[ROW][C]9[/C][C]94.9[/C][C]102.35[/C][C]-7.44999999999999[/C][/ROW]
[ROW][C]10[/C][C]99.9[/C][C]102.35[/C][C]-2.44999999999999[/C][/ROW]
[ROW][C]11[/C][C]96.3[/C][C]102.35[/C][C]-6.05[/C][/ROW]
[ROW][C]12[/C][C]89.5[/C][C]102.35[/C][C]-12.85[/C][/ROW]
[ROW][C]13[/C][C]104.6[/C][C]102.35[/C][C]2.25[/C][/ROW]
[ROW][C]14[/C][C]101.5[/C][C]102.35[/C][C]-0.849999999999995[/C][/ROW]
[ROW][C]15[/C][C]109.8[/C][C]102.35[/C][C]7.45[/C][/ROW]
[ROW][C]16[/C][C]112.1[/C][C]102.35[/C][C]9.75[/C][/ROW]
[ROW][C]17[/C][C]110.1[/C][C]102.35[/C][C]7.75[/C][/ROW]
[ROW][C]18[/C][C]107.1[/C][C]102.35[/C][C]4.75[/C][/ROW]
[ROW][C]19[/C][C]108.1[/C][C]102.35[/C][C]5.75[/C][/ROW]
[ROW][C]20[/C][C]99[/C][C]102.35[/C][C]-3.34999999999999[/C][/ROW]
[ROW][C]21[/C][C]104[/C][C]102.35[/C][C]1.65000000000001[/C][/ROW]
[ROW][C]22[/C][C]106.7[/C][C]102.35[/C][C]4.35000000000001[/C][/ROW]
[ROW][C]23[/C][C]101.1[/C][C]102.35[/C][C]-1.25[/C][/ROW]
[ROW][C]24[/C][C]97.8[/C][C]102.35[/C][C]-4.55[/C][/ROW]
[ROW][C]25[/C][C]113.8[/C][C]102.35[/C][C]11.45[/C][/ROW]
[ROW][C]26[/C][C]107.1[/C][C]102.35[/C][C]4.75[/C][/ROW]
[ROW][C]27[/C][C]117.5[/C][C]114.26[/C][C]3.24[/C][/ROW]
[ROW][C]28[/C][C]113.7[/C][C]114.26[/C][C]-0.559999999999997[/C][/ROW]
[ROW][C]29[/C][C]106.6[/C][C]114.26[/C][C]-7.66[/C][/ROW]
[ROW][C]30[/C][C]109.8[/C][C]114.26[/C][C]-4.46[/C][/ROW]
[ROW][C]31[/C][C]108.8[/C][C]114.26[/C][C]-5.46[/C][/ROW]
[ROW][C]32[/C][C]102[/C][C]114.26[/C][C]-12.26[/C][/ROW]
[ROW][C]33[/C][C]114.5[/C][C]114.26[/C][C]0.24[/C][/ROW]
[ROW][C]34[/C][C]116.5[/C][C]114.26[/C][C]2.24[/C][/ROW]
[ROW][C]35[/C][C]108.6[/C][C]114.26[/C][C]-5.66[/C][/ROW]
[ROW][C]36[/C][C]113.9[/C][C]114.26[/C][C]-0.359999999999994[/C][/ROW]
[ROW][C]37[/C][C]109.3[/C][C]114.26[/C][C]-4.96[/C][/ROW]
[ROW][C]38[/C][C]112.5[/C][C]114.26[/C][C]-1.76[/C][/ROW]
[ROW][C]39[/C][C]123.4[/C][C]114.26[/C][C]9.14[/C][/ROW]
[ROW][C]40[/C][C]115.2[/C][C]114.26[/C][C]0.940000000000003[/C][/ROW]
[ROW][C]41[/C][C]110.8[/C][C]114.26[/C][C]-3.46[/C][/ROW]
[ROW][C]42[/C][C]120.4[/C][C]114.26[/C][C]6.14000000000001[/C][/ROW]
[ROW][C]43[/C][C]117.6[/C][C]114.26[/C][C]3.33999999999999[/C][/ROW]
[ROW][C]44[/C][C]111.2[/C][C]114.26[/C][C]-3.06[/C][/ROW]
[ROW][C]45[/C][C]131.1[/C][C]114.26[/C][C]16.84[/C][/ROW]
[ROW][C]46[/C][C]118.9[/C][C]114.26[/C][C]4.64000000000001[/C][/ROW]
[ROW][C]47[/C][C]115.7[/C][C]114.26[/C][C]1.44000000000000[/C][/ROW]
[ROW][C]48[/C][C]119.6[/C][C]114.26[/C][C]5.34[/C][/ROW]
[ROW][C]49[/C][C]113.1[/C][C]114.26[/C][C]-1.16000000000001[/C][/ROW]
[ROW][C]50[/C][C]106.4[/C][C]114.26[/C][C]-7.86[/C][/ROW]
[ROW][C]51[/C][C]115.5[/C][C]114.26[/C][C]1.24[/C][/ROW]
[ROW][C]52[/C][C]111.8[/C][C]114.26[/C][C]-2.46000000000000[/C][/ROW]
[ROW][C]53[/C][C]109.6[/C][C]114.26[/C][C]-4.66000000000001[/C][/ROW]
[ROW][C]54[/C][C]121.5[/C][C]114.26[/C][C]7.24[/C][/ROW]
[ROW][C]55[/C][C]109.5[/C][C]114.26[/C][C]-4.76[/C][/ROW]
[ROW][C]56[/C][C]109[/C][C]114.26[/C][C]-5.26[/C][/ROW]
[ROW][C]57[/C][C]113.4[/C][C]114.26[/C][C]-0.859999999999994[/C][/ROW]
[ROW][C]58[/C][C]112.7[/C][C]114.26[/C][C]-1.56000000000000[/C][/ROW]
[ROW][C]59[/C][C]114.4[/C][C]114.26[/C][C]0.140000000000006[/C][/ROW]
[ROW][C]60[/C][C]109.2[/C][C]114.26[/C][C]-5.06[/C][/ROW]
[ROW][C]61[/C][C]116.2[/C][C]114.26[/C][C]1.94000000000000[/C][/ROW]
[ROW][C]62[/C][C]113.8[/C][C]114.26[/C][C]-0.460000000000003[/C][/ROW]
[ROW][C]63[/C][C]123.6[/C][C]114.26[/C][C]9.34[/C][/ROW]
[ROW][C]64[/C][C]112.6[/C][C]114.26[/C][C]-1.66000000000001[/C][/ROW]
[ROW][C]65[/C][C]117.7[/C][C]114.26[/C][C]3.44[/C][/ROW]
[ROW][C]66[/C][C]113.3[/C][C]114.26[/C][C]-0.960000000000003[/C][/ROW]
[ROW][C]67[/C][C]110.7[/C][C]114.26[/C][C]-3.56[/C][/ROW]
[ROW][C]68[/C][C]114.7[/C][C]114.26[/C][C]0.440000000000003[/C][/ROW]
[ROW][C]69[/C][C]116.9[/C][C]114.26[/C][C]2.64000000000001[/C][/ROW]
[ROW][C]70[/C][C]120.6[/C][C]114.26[/C][C]6.34[/C][/ROW]
[ROW][C]71[/C][C]111.6[/C][C]114.26[/C][C]-2.66000000000001[/C][/ROW]
[ROW][C]72[/C][C]111.9[/C][C]114.26[/C][C]-2.35999999999999[/C][/ROW]
[ROW][C]73[/C][C]116.1[/C][C]114.26[/C][C]1.83999999999999[/C][/ROW]
[ROW][C]74[/C][C]111.9[/C][C]114.26[/C][C]-2.35999999999999[/C][/ROW]
[ROW][C]75[/C][C]125.1[/C][C]114.26[/C][C]10.84[/C][/ROW]
[ROW][C]76[/C][C]115.1[/C][C]114.26[/C][C]0.839999999999994[/C][/ROW]
[ROW][C]77[/C][C]116.7[/C][C]114.26[/C][C]2.44000000000000[/C][/ROW]
[ROW][C]78[/C][C]115.8[/C][C]114.26[/C][C]1.54000000000000[/C][/ROW]
[ROW][C]79[/C][C]116.8[/C][C]114.26[/C][C]2.54000000000000[/C][/ROW]
[ROW][C]80[/C][C]113[/C][C]114.26[/C][C]-1.26[/C][/ROW]
[ROW][C]81[/C][C]106.5[/C][C]114.26[/C][C]-7.76[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5698&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5698&T=4




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # note: 1:n-1 parses as (1:n)-1 and would start at 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
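The module code above depends on server-side helpers (table.start, table.element, hyperlink, the createtable image) and so will not run stand-alone. A self-contained sketch of its core, lm() followed by summary(), using only the first five observations of each regime from Dataseries X for brevity (so these estimates differ from the full-sample fit):

```r
# First five pre-law and first five post-law values from Dataseries X.
y <- c(102.3, 98.7, 104.4, 97.6, 102.7,    # x = 0: law not yet in force
       117.5, 113.7, 106.6, 109.8, 108.8)  # x = 1: law in force
x <- rep(c(0, 1), each = 5)
mylm  <- lm(y ~ x)   # the module builds the same kind of fit from its data frame
mysum <- summary(mylm)
coef(mylm)  # intercept = pre-law mean (101.14); slope = difference in means (10.14)
```

summary(mylm) then supplies the coefficient table, R-squared, F statistic, and residuals that the table.* calls above format for display.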