Free Statistics

Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 14 Dec 2007 12:28:18 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Dec/14/t11976598787mpe30j4oqtjjn1.htm/, Retrieved Thu, 02 May 2024 23:01:49 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=3948, Retrieved Thu, 02 May 2024 23:01:49 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 207
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [broodprijs dummy1...] [2007-12-14 19:28:18] [5a8e7c1f041681f87e3014e302618e0c] [Current]
Dataseries X:
1,43	0	0	0
1,43	0	0	0
1,43	0	0	0
1,43	0	0	0
1,43	0	0	0
1,43	0	0	0
1,44	0	0	0
1,48	0	0	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,48	0	1	0
1,57	0	1	0
1,58	0	1	0
1,58	0	1	0
1,58	0	1	0
1,58	0	1	0
1,59	1	1	43
1,6	1	1	44
1,6	1	1	45
1,61	1	1	46
1,61	1	1	47
1,61	1	1	48
1,62	1	1	49
1,63	1	1	50
1,63	1	1	51
1,64	1	1	52
1,64	1	1	53
1,64	1	1	54
1,64	1	1	55
1,64	1	1	56
1,65	1	1	57
1,65	1	1	58
1,65	1	1	59
1,65	1	1	60
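
The block above is tab-separated and uses a decimal comma. A minimal sketch for loading it into R (assuming it has been saved locally under the illustrative file name 'dataseries.txt'; the column names y, dummy1, dummy2 and trend are taken from the regression output reported below):
# Read the tab-separated series with decimal commas into a data frame.
x <- read.table('dataseries.txt', sep = '\t', dec = ',',
  col.names = c('y', 'dummy1', 'dummy2', 'trend'))
str(x)   # should show 60 observations of 4 numeric variables
head(x)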




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 2 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=3948&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = 1.43750000000000 - 0.0515875472996208 dummy1[t] + 0.0569117647058833 dummy2[t] + 0.00359133126934983 trend[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=3948&T=1
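
The estimated equation can be reproduced directly with lm(); a minimal sketch, assuming the data frame x from the loading sketch above (this uses an explicit formula rather than the module's exact code path, which calls lm() on a reordered data frame):
# Fit y on dummy1, dummy2 and trend by ordinary least squares.
mylm <- lm(y ~ dummy1 + dummy2 + trend, data = x)
coef(mylm)   # should return approximately 1.4375, -0.0516, 0.0569, 0.0036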








Multiple Linear Regression - Ordinary Least Squares
Variable    | Parameter           | S.D.     | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 1.43750000000000    | 0.009871 | 145.6306                   | 0              | 0
dummy1      | -0.0515875472996208 | 0.065827 | -0.7837                    | 0.436529       | 0.218264
dummy2      | 0.0569117647058833  | 0.010971 | 5.1875                     | 3e-06          | 2e-06
trend       | 0.00359133126934983 | 0.001268 | 2.8314                     | 0.006426       | 0.003213

Source: https://freestatistics.org/blog/index.php?pk=3948&T=2
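
Every column of the table above can be recovered from the summary of the fitted model; a minimal sketch, assuming mylm from the previous sketch (the 1-tail p-value is simply half the 2-tail p-value, as in the module code):
mysum <- summary(mylm)
est   <- mysum$coefficients[, 'Estimate']     # Parameter
se    <- mysum$coefficients[, 'Std. Error']   # S.D.
tstat <- est / se                             # T-STAT (H0: parameter = 0)
p2    <- 2 * pt(-abs(tstat), df = mylm$df.residual)   # 2-tail p-value
cbind(est, se, tstat, p2, p2 / 2)             # last column: 1-tail p-value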








Multiple Linear Regression - Regression Statistics
Multiple R: 0.932136039037301
R-squared: 0.86887759527215
Adjusted R-squared: 0.861853180733158
F-TEST (value): 123.693952065197
F-TEST (DF numerator): 3
F-TEST (DF denominator): 56
p-value: 0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.0279190251302636
Sum Squared Residuals: 0.0436504299965602

Source: https://freestatistics.org/blog/index.php?pk=3948&T=3
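
These statistics follow from the same summary object; a minimal sketch, assuming mylm and mysum from the sketches above (the expressions mirror the ones used in the module code at the bottom of this page):
sqrt(mysum$r.squared)        # Multiple R
mysum$r.squared              # R-squared
mysum$adj.r.squared          # Adjusted R-squared
f <- mysum$fstatistic        # value, DF numerator, DF denominator
1 - pf(f[1], f[2], f[3])     # p-value of the F-test
mysum$sigma                  # Residual Standard Deviation
sum(resid(mylm)^2)           # Sum Squared Residuals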








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 | 1.43 | 1.43750000000001 | -0.00750000000000674
2 | 1.43 | 1.437500 | -0.00749999999999881
3 | 1.43 | 1.4375 | -0.00749999999999905
4 | 1.43 | 1.4375 | -0.00749999999999912
5 | 1.43 | 1.4375 | -0.00749999999999906
6 | 1.43 | 1.4375 | -0.00749999999999906
7 | 1.44 | 1.4375 | 0.00250000000000094
8 | 1.48 | 1.4375 | 0.042500000000001
9 | 1.48 | 1.49441176470588 | -0.0144117647058824
10 | 1.48 | 1.49441176470588 | -0.0144117647058824
11 | 1.48 | 1.49441176470588 | -0.0144117647058824
12 | 1.48 | 1.49441176470588 | -0.0144117647058824
13 | 1.48 | 1.49441176470588 | -0.0144117647058824
14 | 1.48 | 1.49441176470588 | -0.0144117647058824
15 | 1.48 | 1.49441176470588 | -0.0144117647058824
16 | 1.48 | 1.49441176470588 | -0.0144117647058824
17 | 1.48 | 1.49441176470588 | -0.0144117647058824
18 | 1.48 | 1.49441176470588 | -0.0144117647058824
19 | 1.48 | 1.49441176470588 | -0.0144117647058824
20 | 1.48 | 1.49441176470588 | -0.0144117647058824
21 | 1.48 | 1.49441176470588 | -0.0144117647058824
22 | 1.48 | 1.49441176470588 | -0.0144117647058824
23 | 1.48 | 1.49441176470588 | -0.0144117647058824
24 | 1.48 | 1.49441176470588 | -0.0144117647058824
25 | 1.48 | 1.49441176470588 | -0.0144117647058824
26 | 1.48 | 1.49441176470588 | -0.0144117647058824
27 | 1.48 | 1.49441176470588 | -0.0144117647058824
28 | 1.48 | 1.49441176470588 | -0.0144117647058824
29 | 1.48 | 1.49441176470588 | -0.0144117647058824
30 | 1.48 | 1.49441176470588 | -0.0144117647058824
31 | 1.48 | 1.49441176470588 | -0.0144117647058824
32 | 1.48 | 1.49441176470588 | -0.0144117647058824
33 | 1.48 | 1.49441176470588 | -0.0144117647058824
34 | 1.48 | 1.49441176470588 | -0.0144117647058824
35 | 1.48 | 1.49441176470588 | -0.0144117647058824
36 | 1.48 | 1.49441176470588 | -0.0144117647058824
37 | 1.48 | 1.49441176470588 | -0.0144117647058824
38 | 1.57 | 1.49441176470588 | 0.0755882352941177
39 | 1.58 | 1.49441176470588 | 0.0855882352941177
40 | 1.58 | 1.49441176470588 | 0.0855882352941177
41 | 1.58 | 1.49441176470588 | 0.0855882352941177
42 | 1.58 | 1.49441176470588 | 0.0855882352941177
43 | 1.59 | 1.59725146198830 | -0.0072514619883041
44 | 1.6 | 1.60084279325765 | -0.000842793257653927
45 | 1.6 | 1.60443412452700 | -0.00443412452700376
46 | 1.61 | 1.60802545579635 | 0.00197454420364642
47 | 1.61 | 1.61161678706570 | -0.00161678706570341
48 | 1.61 | 1.61520811833505 | -0.00520811833505324
49 | 1.62 | 1.61879944960440 | 0.00120055039559694
50 | 1.63 | 1.62239078087375 | 0.00760921912624689
51 | 1.63 | 1.62598211214310 | 0.00401788785689706
52 | 1.64 | 1.62957344341245 | 0.0104265565875472
53 | 1.64 | 1.63316477468180 | 0.0068352253181974
54 | 1.64 | 1.63675610595115 | 0.00324389404884758
55 | 1.64 | 1.64034743722050 | -0.000347437220502256
56 | 1.64 | 1.64393876848985 | -0.00393876848985209
57 | 1.65 | 1.6475300997592 | 0.00246990024079809
58 | 1.65 | 1.65112143102855 | -0.00112143102855174
59 | 1.65 | 1.65471276229790 | -0.00471276229790157
60 | 1.65 | 1.65830409356725 | -0.0083040935672514

Source: https://freestatistics.org/blog/index.php?pk=3948&T=4
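
The interpolation and residual columns correspond to the fitted values and residuals of the same model; a minimal sketch, assuming x and mylm from the sketches above:
interp <- fitted(mylm)   # Interpolation (Forecast)
res    <- resid(mylm)    # Residuals (Prediction Error)
head(data.frame(Index = seq_along(res), Actuals = x$y,
  Interpolation = interp, Residuals = res))
# Quick visual check, similar to the module's 'Actuals and Interpolation' chart:
plot(x$y, type = 'l', xlab = 'time or index', ylab = 'value')
points(interp)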




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice) # needed for densityplot() below
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,]) # number of variables (columns)
n <- length(x[,1]) # number of observations (rows)
x1 <- cbind(x[,par1], x[,1:k!=par1]) # move the endogenous variable (column par1) to the front
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # parentheses needed: 1:n-1 would evaluate as (1:n)-1 and start the loop at 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df)) # lm() on a data frame regresses its first column (the response, moved to the front above) on the others
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
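
Note that table.start(), table.row.start(), table.element(), table.row.end(), table.end(), table.save() and hyperlink() are loaded from the server-side file 'createtable' and are not available in a plain R session. To rerun only the statistical part of this computation locally, a minimal stand-alone sketch (parameter values copied from this page; the file name 'dataseries.txt' and column names are the illustrative ones from the loading sketch near the top) could look like this:
par1 <- 1
par2 <- 'Do not include Seasonal Dummies'
par3 <- 'No Linear Trend'
df <- read.table('dataseries.txt', sep = '\t', dec = ',',
  col.names = c('y', 'dummy1', 'dummy2', 'trend'))
mylm <- lm(df)   # as in the module: the first column is treated as the response
summary(mylm)    # reproduces the coefficient and regression statistics tables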