Author's title:
Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sun, 25 Nov 2007 09:59:13 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/25/t1196009479zfyhlg734xzmdh7.htm/, Retrieved Sat, 04 May 2024 07:11:43 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=6500, Retrieved Sat, 04 May 2024 07:11:43 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 182
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-   [Multiple Regression] [paper] [2007-11-25 16:59:13] [4bd8a0043457404de73994ae0e323922] [Current]

Dataseries X:
8,7	0
8,5	0
8,2	0
8,3	0
8	0
8,1	0
8,7	0
9,3	0
8,9	0
8,8	0
8,4	0
8,4	0
7,3	0
7,2	0
7	0
7	0
6,9	0
6,9	0
7,1	0
7,5	0
7,4	0
8,9	0
8,3	1
8,3	1
9	1
8,9	1
8,8	1
7,8	1
7,8	1
7,8	1
9,2	1
9,3	1
9,2	1
8,6	1
8,5	1
8,5	1
9	1
9	1
8,8	1
8	1
7,9	1
8,1	1
9,3	1
9,4	1
9,4	1
9,3	1
9	1
9,1	1
9,7	1
9,7	1
9,6	1
8,3	1
8,2	1
8,4	1
10,6	1
10,9	1
10,9	1
9,6	1
9,3	1
9,3	1
9,6	1
9,5	1
9,5	1
9	1
8,9	1
9	1
10,1	1
10,2	1
10,2	1
9,5	1
9,3	1
9,3	1
9,4	1
9,3	1
9,1	1
9	1
8,9	1
9	1
9,8	1
10	1
9,8	1
9,4	1
9	1
8,9	1
9,3	1
9,1	1
8,8	1
8,9	1
8,7	1
8,6	1
9,1	1
9,3	1
8,9	1
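
For reference, this is a two-column series: the first column (decimal comma, tab-separated) is the response WLHvrouwen and the second is a 0/1 dummy, with 22 zeros followed by 71 ones. A minimal sketch of how the series could be loaded into R locally, assuming it is saved verbatim as 'dataseries.txt' (the file name and column names are assumptions, not part of the original submission):

# sketch: read the tab-separated, comma-decimal series shown above
y <- read.table('dataseries.txt', sep='\t', dec=',', header=FALSE,
                col.names=c('WLHvrouwen', 'x'))
str(y)       # 93 observations of 2 variables
table(y$x)   # 22 observations with x = 0, 71 with x = 1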




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6500&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6500&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6500&T=0








Multiple Linear Regression - Estimated Regression Equation
WLHvrouwen[t] = + 7.97727272727272 + 1.13822023047375x[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
WLHvrouwen[t] =  +  7.97727272727272 +  1.13822023047375x[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6500&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]WLHvrouwen[t] =  +  7.97727272727272 +  1.13822023047375x[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6500&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6500&T=1
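
Read as a dummy-variable regression, the equation above says the expected level is 7.977 when x = 0 and 7.977 + 1.138 = 9.115 when x = 1; with a single 0/1 regressor these are simply the two group means. A minimal sketch of the equivalent fit in base R, assuming the data frame y constructed in the earlier sketch (this is not the module's own code):

fit <- lm(WLHvrouwen ~ x, data = y)
coef(fit)                       # intercept ~ 7.9773, slope ~ 1.1382
mean(y$WLHvrouwen[y$x == 0])    # equals the intercept
mean(y$WLHvrouwen[y$x == 1])    # equals intercept + slope (~ 9.1155)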








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	7.97727272727272	0.146848	54.3235	0	0
x	1.13822023047375	0.168066	6.7725	0	0

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 7.97727272727272 & 0.146848 & 54.3235 & 0 & 0 \tabularnewline
x & 1.13822023047375 & 0.168066 & 6.7725 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6500&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]7.97727272727272[/C][C]0.146848[/C][C]54.3235[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]1.13822023047375[/C][C]0.168066[/C][C]6.7725[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6500&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6500&T=2
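
In this table the T-STAT is the parameter estimate divided by its standard deviation (S.D.), tested against H0: parameter = 0 with 91 residual degrees of freedom; the 1-tail p-value is half the 2-tail value, and both display as 0 because they fall below the printed precision. A rough check in R (a sketch, not part of the module):

t_x <- 1.13822023047375 / 0.168066   # ~ 6.7725, the reported T-STAT for x
2 * pt(-abs(t_x), df = 91)           # 2-tail p-value, ~ 1.2e-09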








Multiple Linear Regression - Regression Statistics
Multiple R: 0.578893514173126
R-squared: 0.335117700751711
Adjusted R-squared: 0.327811301858873
F-TEST (value): 45.8663297291626
F-TEST (DF numerator): 1
F-TEST (DF denominator): 91
p-value: 1.21531473773473e-09
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.688776540025267
Sum Squared Residuals: 43.1715941101152

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.578893514173126 \tabularnewline
R-squared & 0.335117700751711 \tabularnewline
Adjusted R-squared & 0.327811301858873 \tabularnewline
F-TEST (value) & 45.8663297291626 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 91 \tabularnewline
p-value & 1.21531473773473e-09 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.688776540025267 \tabularnewline
Sum Squared Residuals & 43.1715941101152 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6500&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.578893514173126[/C][/ROW]
[ROW][C]R-squared[/C][C]0.335117700751711[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.327811301858873[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]45.8663297291626[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]1[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]91[/C][/ROW]
[ROW][C]p-value[/C][C]1.21531473773473e-09[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.688776540025267[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]43.1715941101152[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6500&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6500&T=3
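
These statistics hang together: with one regressor and 91 residual degrees of freedom, the F statistic equals the squared t statistic of the slope, the p-value comes from the F(1, 91) distribution, and the residual standard deviation is the square root of the sum of squared residuals divided by 91. A quick consistency check in R (a sketch using the reported numbers, not the module code):

r2 <- 0.335117700751711
(f <- (r2 / 1) / ((1 - r2) / 91))      # ~ 45.866, the F-TEST value
1 - pf(f, 1, 91)                       # ~ 1.22e-09, the reported p-value
sqrt(43.1715941101152 / 91)            # ~ 0.6888, residual standard deviation
1 - (1 - r2) * (93 - 1) / (93 - 2)     # ~ 0.3278, adjusted R-squared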








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	8.7	7.97727272727275	0.722727272727247
2	8.5	7.97727272727272	0.522727272727278
3	8.2	7.97727272727273	0.222727272727273
4	8.3	7.97727272727273	0.322727272727274
5	8	7.97727272727273	0.0227272727272737
6	8.1	7.97727272727273	0.122727272727273
7	8.7	7.97727272727273	0.722727272727273
8	9.3	7.97727272727273	1.32272727272727
9	8.9	7.97727272727273	0.922727272727274
10	8.8	7.97727272727273	0.822727272727274
11	8.4	7.97727272727273	0.422727272727274
12	8.4	7.97727272727273	0.422727272727274
13	7.3	7.97727272727273	-0.677272727272726
14	7.2	7.97727272727273	-0.777272727272726
15	7	7.97727272727273	-0.977272727272726
16	7	7.97727272727273	-0.977272727272726
17	6.9	7.97727272727273	-1.07727272727273
18	6.9	7.97727272727273	-1.07727272727273
19	7.1	7.97727272727273	-0.877272727272727
20	7.5	7.97727272727273	-0.477272727272726
21	7.4	7.97727272727273	-0.577272727272726
22	8.9	7.97727272727273	0.922727272727274
23	8.3	9.11549295774648	-0.815492957746478
24	8.3	9.11549295774648	-0.815492957746478
25	9	9.11549295774648	-0.115492957746479
26	8.9	9.11549295774648	-0.215492957746479
27	8.8	9.11549295774648	-0.315492957746478
28	7.8	9.11549295774648	-1.31549295774648
29	7.8	9.11549295774648	-1.31549295774648
30	7.8	9.11549295774648	-1.31549295774648
31	9.2	9.11549295774648	0.0845070422535203
32	9.3	9.11549295774648	0.184507042253522
33	9.2	9.11549295774648	0.0845070422535203
34	8.6	9.11549295774648	-0.51549295774648
35	8.5	9.11549295774648	-0.615492957746479
36	8.5	9.11549295774648	-0.615492957746479
37	9	9.11549295774648	-0.115492957746479
38	9	9.11549295774648	-0.115492957746479
39	8.8	9.11549295774648	-0.315492957746478
40	8	9.11549295774648	-1.11549295774648
41	7.9	9.11549295774648	-1.21549295774648
42	8.1	9.11549295774648	-1.01549295774648
43	9.3	9.11549295774648	0.184507042253522
44	9.4	9.11549295774648	0.284507042253521
45	9.4	9.11549295774648	0.284507042253521
46	9.3	9.11549295774648	0.184507042253522
47	9	9.11549295774648	-0.115492957746479
48	9.1	9.11549295774648	-0.0154929577464794
49	9.7	9.11549295774648	0.58450704225352
50	9.7	9.11549295774648	0.58450704225352
51	9.6	9.11549295774648	0.48450704225352
52	8.3	9.11549295774648	-0.815492957746478
53	8.2	9.11549295774648	-0.91549295774648
54	8.4	9.11549295774648	-0.715492957746479
55	10.6	9.11549295774648	1.48450704225352
56	10.9	9.11549295774648	1.78450704225352
57	10.9	9.11549295774648	1.78450704225352
58	9.6	9.11549295774648	0.48450704225352
59	9.3	9.11549295774648	0.184507042253522
60	9.3	9.11549295774648	0.184507042253522
61	9.6	9.11549295774648	0.48450704225352
62	9.5	9.11549295774648	0.384507042253521
63	9.5	9.11549295774648	0.384507042253521
64	9	9.11549295774648	-0.115492957746479
65	8.9	9.11549295774648	-0.215492957746479
66	9	9.11549295774648	-0.115492957746479
67	10.1	9.11549295774648	0.98450704225352
68	10.2	9.11549295774648	1.08450704225352
69	10.2	9.11549295774648	1.08450704225352
70	9.5	9.11549295774648	0.384507042253521
71	9.3	9.11549295774648	0.184507042253522
72	9.3	9.11549295774648	0.184507042253522
73	9.4	9.11549295774648	0.284507042253521
74	9.3	9.11549295774648	0.184507042253522
75	9.1	9.11549295774648	-0.0154929577464794
76	9	9.11549295774648	-0.115492957746479
77	8.9	9.11549295774648	-0.215492957746479
78	9	9.11549295774648	-0.115492957746479
79	9.8	9.11549295774648	0.684507042253522
80	10	9.11549295774648	0.884507042253521
81	9.8	9.11549295774648	0.684507042253522
82	9.4	9.11549295774648	0.284507042253521
83	9	9.11549295774648	-0.115492957746479
84	8.9	9.11549295774648	-0.215492957746479
85	9.3	9.11549295774648	0.184507042253522
86	9.1	9.11549295774648	-0.0154929577464794
87	8.8	9.11549295774648	-0.315492957746478
88	8.9	9.11549295774648	-0.215492957746479
89	8.7	9.11549295774648	-0.41549295774648
90	8.6	9.11549295774648	-0.51549295774648
91	9.1	9.11549295774648	-0.0154929577464794
92	9.3	9.11549295774648	0.184507042253522
93	8.9	9.11549295774648	-0.215492957746479

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 8.7 & 7.97727272727275 & 0.722727272727247 \tabularnewline
2 & 8.5 & 7.97727272727272 & 0.522727272727278 \tabularnewline
3 & 8.2 & 7.97727272727273 & 0.222727272727273 \tabularnewline
4 & 8.3 & 7.97727272727273 & 0.322727272727274 \tabularnewline
5 & 8 & 7.97727272727273 & 0.0227272727272737 \tabularnewline
6 & 8.1 & 7.97727272727273 & 0.122727272727273 \tabularnewline
7 & 8.7 & 7.97727272727273 & 0.722727272727273 \tabularnewline
8 & 9.3 & 7.97727272727273 & 1.32272727272727 \tabularnewline
9 & 8.9 & 7.97727272727273 & 0.922727272727274 \tabularnewline
10 & 8.8 & 7.97727272727273 & 0.822727272727274 \tabularnewline
11 & 8.4 & 7.97727272727273 & 0.422727272727274 \tabularnewline
12 & 8.4 & 7.97727272727273 & 0.422727272727274 \tabularnewline
13 & 7.3 & 7.97727272727273 & -0.677272727272726 \tabularnewline
14 & 7.2 & 7.97727272727273 & -0.777272727272726 \tabularnewline
15 & 7 & 7.97727272727273 & -0.977272727272726 \tabularnewline
16 & 7 & 7.97727272727273 & -0.977272727272726 \tabularnewline
17 & 6.9 & 7.97727272727273 & -1.07727272727273 \tabularnewline
18 & 6.9 & 7.97727272727273 & -1.07727272727273 \tabularnewline
19 & 7.1 & 7.97727272727273 & -0.877272727272727 \tabularnewline
20 & 7.5 & 7.97727272727273 & -0.477272727272726 \tabularnewline
21 & 7.4 & 7.97727272727273 & -0.577272727272726 \tabularnewline
22 & 8.9 & 7.97727272727273 & 0.922727272727274 \tabularnewline
23 & 8.3 & 9.11549295774648 & -0.815492957746478 \tabularnewline
24 & 8.3 & 9.11549295774648 & -0.815492957746478 \tabularnewline
25 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
26 & 8.9 & 9.11549295774648 & -0.215492957746479 \tabularnewline
27 & 8.8 & 9.11549295774648 & -0.315492957746478 \tabularnewline
28 & 7.8 & 9.11549295774648 & -1.31549295774648 \tabularnewline
29 & 7.8 & 9.11549295774648 & -1.31549295774648 \tabularnewline
30 & 7.8 & 9.11549295774648 & -1.31549295774648 \tabularnewline
31 & 9.2 & 9.11549295774648 & 0.0845070422535203 \tabularnewline
32 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
33 & 9.2 & 9.11549295774648 & 0.0845070422535203 \tabularnewline
34 & 8.6 & 9.11549295774648 & -0.51549295774648 \tabularnewline
35 & 8.5 & 9.11549295774648 & -0.615492957746479 \tabularnewline
36 & 8.5 & 9.11549295774648 & -0.615492957746479 \tabularnewline
37 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
38 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
39 & 8.8 & 9.11549295774648 & -0.315492957746478 \tabularnewline
40 & 8 & 9.11549295774648 & -1.11549295774648 \tabularnewline
41 & 7.9 & 9.11549295774648 & -1.21549295774648 \tabularnewline
42 & 8.1 & 9.11549295774648 & -1.01549295774648 \tabularnewline
43 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
44 & 9.4 & 9.11549295774648 & 0.284507042253521 \tabularnewline
45 & 9.4 & 9.11549295774648 & 0.284507042253521 \tabularnewline
46 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
47 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
48 & 9.1 & 9.11549295774648 & -0.0154929577464794 \tabularnewline
49 & 9.7 & 9.11549295774648 & 0.58450704225352 \tabularnewline
50 & 9.7 & 9.11549295774648 & 0.58450704225352 \tabularnewline
51 & 9.6 & 9.11549295774648 & 0.48450704225352 \tabularnewline
52 & 8.3 & 9.11549295774648 & -0.815492957746478 \tabularnewline
53 & 8.2 & 9.11549295774648 & -0.91549295774648 \tabularnewline
54 & 8.4 & 9.11549295774648 & -0.715492957746479 \tabularnewline
55 & 10.6 & 9.11549295774648 & 1.48450704225352 \tabularnewline
56 & 10.9 & 9.11549295774648 & 1.78450704225352 \tabularnewline
57 & 10.9 & 9.11549295774648 & 1.78450704225352 \tabularnewline
58 & 9.6 & 9.11549295774648 & 0.48450704225352 \tabularnewline
59 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
60 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
61 & 9.6 & 9.11549295774648 & 0.48450704225352 \tabularnewline
62 & 9.5 & 9.11549295774648 & 0.384507042253521 \tabularnewline
63 & 9.5 & 9.11549295774648 & 0.384507042253521 \tabularnewline
64 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
65 & 8.9 & 9.11549295774648 & -0.215492957746479 \tabularnewline
66 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
67 & 10.1 & 9.11549295774648 & 0.98450704225352 \tabularnewline
68 & 10.2 & 9.11549295774648 & 1.08450704225352 \tabularnewline
69 & 10.2 & 9.11549295774648 & 1.08450704225352 \tabularnewline
70 & 9.5 & 9.11549295774648 & 0.384507042253521 \tabularnewline
71 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
72 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
73 & 9.4 & 9.11549295774648 & 0.284507042253521 \tabularnewline
74 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
75 & 9.1 & 9.11549295774648 & -0.0154929577464794 \tabularnewline
76 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
77 & 8.9 & 9.11549295774648 & -0.215492957746479 \tabularnewline
78 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
79 & 9.8 & 9.11549295774648 & 0.684507042253522 \tabularnewline
80 & 10 & 9.11549295774648 & 0.884507042253521 \tabularnewline
81 & 9.8 & 9.11549295774648 & 0.684507042253522 \tabularnewline
82 & 9.4 & 9.11549295774648 & 0.284507042253521 \tabularnewline
83 & 9 & 9.11549295774648 & -0.115492957746479 \tabularnewline
84 & 8.9 & 9.11549295774648 & -0.215492957746479 \tabularnewline
85 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
86 & 9.1 & 9.11549295774648 & -0.0154929577464794 \tabularnewline
87 & 8.8 & 9.11549295774648 & -0.315492957746478 \tabularnewline
88 & 8.9 & 9.11549295774648 & -0.215492957746479 \tabularnewline
89 & 8.7 & 9.11549295774648 & -0.41549295774648 \tabularnewline
90 & 8.6 & 9.11549295774648 & -0.51549295774648 \tabularnewline
91 & 9.1 & 9.11549295774648 & -0.0154929577464794 \tabularnewline
92 & 9.3 & 9.11549295774648 & 0.184507042253522 \tabularnewline
93 & 8.9 & 9.11549295774648 & -0.215492957746479 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=6500&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]8.7[/C][C]7.97727272727275[/C][C]0.722727272727247[/C][/ROW]
[ROW][C]2[/C][C]8.5[/C][C]7.97727272727272[/C][C]0.522727272727278[/C][/ROW]
[ROW][C]3[/C][C]8.2[/C][C]7.97727272727273[/C][C]0.222727272727273[/C][/ROW]
[ROW][C]4[/C][C]8.3[/C][C]7.97727272727273[/C][C]0.322727272727274[/C][/ROW]
[ROW][C]5[/C][C]8[/C][C]7.97727272727273[/C][C]0.0227272727272737[/C][/ROW]
[ROW][C]6[/C][C]8.1[/C][C]7.97727272727273[/C][C]0.122727272727273[/C][/ROW]
[ROW][C]7[/C][C]8.7[/C][C]7.97727272727273[/C][C]0.722727272727273[/C][/ROW]
[ROW][C]8[/C][C]9.3[/C][C]7.97727272727273[/C][C]1.32272727272727[/C][/ROW]
[ROW][C]9[/C][C]8.9[/C][C]7.97727272727273[/C][C]0.922727272727274[/C][/ROW]
[ROW][C]10[/C][C]8.8[/C][C]7.97727272727273[/C][C]0.822727272727274[/C][/ROW]
[ROW][C]11[/C][C]8.4[/C][C]7.97727272727273[/C][C]0.422727272727274[/C][/ROW]
[ROW][C]12[/C][C]8.4[/C][C]7.97727272727273[/C][C]0.422727272727274[/C][/ROW]
[ROW][C]13[/C][C]7.3[/C][C]7.97727272727273[/C][C]-0.677272727272726[/C][/ROW]
[ROW][C]14[/C][C]7.2[/C][C]7.97727272727273[/C][C]-0.777272727272726[/C][/ROW]
[ROW][C]15[/C][C]7[/C][C]7.97727272727273[/C][C]-0.977272727272726[/C][/ROW]
[ROW][C]16[/C][C]7[/C][C]7.97727272727273[/C][C]-0.977272727272726[/C][/ROW]
[ROW][C]17[/C][C]6.9[/C][C]7.97727272727273[/C][C]-1.07727272727273[/C][/ROW]
[ROW][C]18[/C][C]6.9[/C][C]7.97727272727273[/C][C]-1.07727272727273[/C][/ROW]
[ROW][C]19[/C][C]7.1[/C][C]7.97727272727273[/C][C]-0.877272727272727[/C][/ROW]
[ROW][C]20[/C][C]7.5[/C][C]7.97727272727273[/C][C]-0.477272727272726[/C][/ROW]
[ROW][C]21[/C][C]7.4[/C][C]7.97727272727273[/C][C]-0.577272727272726[/C][/ROW]
[ROW][C]22[/C][C]8.9[/C][C]7.97727272727273[/C][C]0.922727272727274[/C][/ROW]
[ROW][C]23[/C][C]8.3[/C][C]9.11549295774648[/C][C]-0.815492957746478[/C][/ROW]
[ROW][C]24[/C][C]8.3[/C][C]9.11549295774648[/C][C]-0.815492957746478[/C][/ROW]
[ROW][C]25[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]26[/C][C]8.9[/C][C]9.11549295774648[/C][C]-0.215492957746479[/C][/ROW]
[ROW][C]27[/C][C]8.8[/C][C]9.11549295774648[/C][C]-0.315492957746478[/C][/ROW]
[ROW][C]28[/C][C]7.8[/C][C]9.11549295774648[/C][C]-1.31549295774648[/C][/ROW]
[ROW][C]29[/C][C]7.8[/C][C]9.11549295774648[/C][C]-1.31549295774648[/C][/ROW]
[ROW][C]30[/C][C]7.8[/C][C]9.11549295774648[/C][C]-1.31549295774648[/C][/ROW]
[ROW][C]31[/C][C]9.2[/C][C]9.11549295774648[/C][C]0.0845070422535203[/C][/ROW]
[ROW][C]32[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]33[/C][C]9.2[/C][C]9.11549295774648[/C][C]0.0845070422535203[/C][/ROW]
[ROW][C]34[/C][C]8.6[/C][C]9.11549295774648[/C][C]-0.51549295774648[/C][/ROW]
[ROW][C]35[/C][C]8.5[/C][C]9.11549295774648[/C][C]-0.615492957746479[/C][/ROW]
[ROW][C]36[/C][C]8.5[/C][C]9.11549295774648[/C][C]-0.615492957746479[/C][/ROW]
[ROW][C]37[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]38[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]39[/C][C]8.8[/C][C]9.11549295774648[/C][C]-0.315492957746478[/C][/ROW]
[ROW][C]40[/C][C]8[/C][C]9.11549295774648[/C][C]-1.11549295774648[/C][/ROW]
[ROW][C]41[/C][C]7.9[/C][C]9.11549295774648[/C][C]-1.21549295774648[/C][/ROW]
[ROW][C]42[/C][C]8.1[/C][C]9.11549295774648[/C][C]-1.01549295774648[/C][/ROW]
[ROW][C]43[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]44[/C][C]9.4[/C][C]9.11549295774648[/C][C]0.284507042253521[/C][/ROW]
[ROW][C]45[/C][C]9.4[/C][C]9.11549295774648[/C][C]0.284507042253521[/C][/ROW]
[ROW][C]46[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]47[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]48[/C][C]9.1[/C][C]9.11549295774648[/C][C]-0.0154929577464794[/C][/ROW]
[ROW][C]49[/C][C]9.7[/C][C]9.11549295774648[/C][C]0.58450704225352[/C][/ROW]
[ROW][C]50[/C][C]9.7[/C][C]9.11549295774648[/C][C]0.58450704225352[/C][/ROW]
[ROW][C]51[/C][C]9.6[/C][C]9.11549295774648[/C][C]0.48450704225352[/C][/ROW]
[ROW][C]52[/C][C]8.3[/C][C]9.11549295774648[/C][C]-0.815492957746478[/C][/ROW]
[ROW][C]53[/C][C]8.2[/C][C]9.11549295774648[/C][C]-0.91549295774648[/C][/ROW]
[ROW][C]54[/C][C]8.4[/C][C]9.11549295774648[/C][C]-0.715492957746479[/C][/ROW]
[ROW][C]55[/C][C]10.6[/C][C]9.11549295774648[/C][C]1.48450704225352[/C][/ROW]
[ROW][C]56[/C][C]10.9[/C][C]9.11549295774648[/C][C]1.78450704225352[/C][/ROW]
[ROW][C]57[/C][C]10.9[/C][C]9.11549295774648[/C][C]1.78450704225352[/C][/ROW]
[ROW][C]58[/C][C]9.6[/C][C]9.11549295774648[/C][C]0.48450704225352[/C][/ROW]
[ROW][C]59[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]60[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]61[/C][C]9.6[/C][C]9.11549295774648[/C][C]0.48450704225352[/C][/ROW]
[ROW][C]62[/C][C]9.5[/C][C]9.11549295774648[/C][C]0.384507042253521[/C][/ROW]
[ROW][C]63[/C][C]9.5[/C][C]9.11549295774648[/C][C]0.384507042253521[/C][/ROW]
[ROW][C]64[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]65[/C][C]8.9[/C][C]9.11549295774648[/C][C]-0.215492957746479[/C][/ROW]
[ROW][C]66[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]67[/C][C]10.1[/C][C]9.11549295774648[/C][C]0.98450704225352[/C][/ROW]
[ROW][C]68[/C][C]10.2[/C][C]9.11549295774648[/C][C]1.08450704225352[/C][/ROW]
[ROW][C]69[/C][C]10.2[/C][C]9.11549295774648[/C][C]1.08450704225352[/C][/ROW]
[ROW][C]70[/C][C]9.5[/C][C]9.11549295774648[/C][C]0.384507042253521[/C][/ROW]
[ROW][C]71[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]72[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]73[/C][C]9.4[/C][C]9.11549295774648[/C][C]0.284507042253521[/C][/ROW]
[ROW][C]74[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]75[/C][C]9.1[/C][C]9.11549295774648[/C][C]-0.0154929577464794[/C][/ROW]
[ROW][C]76[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]77[/C][C]8.9[/C][C]9.11549295774648[/C][C]-0.215492957746479[/C][/ROW]
[ROW][C]78[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]79[/C][C]9.8[/C][C]9.11549295774648[/C][C]0.684507042253522[/C][/ROW]
[ROW][C]80[/C][C]10[/C][C]9.11549295774648[/C][C]0.884507042253521[/C][/ROW]
[ROW][C]81[/C][C]9.8[/C][C]9.11549295774648[/C][C]0.684507042253522[/C][/ROW]
[ROW][C]82[/C][C]9.4[/C][C]9.11549295774648[/C][C]0.284507042253521[/C][/ROW]
[ROW][C]83[/C][C]9[/C][C]9.11549295774648[/C][C]-0.115492957746479[/C][/ROW]
[ROW][C]84[/C][C]8.9[/C][C]9.11549295774648[/C][C]-0.215492957746479[/C][/ROW]
[ROW][C]85[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]86[/C][C]9.1[/C][C]9.11549295774648[/C][C]-0.0154929577464794[/C][/ROW]
[ROW][C]87[/C][C]8.8[/C][C]9.11549295774648[/C][C]-0.315492957746478[/C][/ROW]
[ROW][C]88[/C][C]8.9[/C][C]9.11549295774648[/C][C]-0.215492957746479[/C][/ROW]
[ROW][C]89[/C][C]8.7[/C][C]9.11549295774648[/C][C]-0.41549295774648[/C][/ROW]
[ROW][C]90[/C][C]8.6[/C][C]9.11549295774648[/C][C]-0.51549295774648[/C][/ROW]
[ROW][C]91[/C][C]9.1[/C][C]9.11549295774648[/C][C]-0.0154929577464794[/C][/ROW]
[ROW][C]92[/C][C]9.3[/C][C]9.11549295774648[/C][C]0.184507042253522[/C][/ROW]
[ROW][C]93[/C][C]8.9[/C][C]9.11549295774648[/C][C]-0.215492957746479[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=6500&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=6500&T=4
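
Because the only regressor is a 0/1 dummy, the Interpolation (Forecast) column takes just two values: 7.9773 for the first 22 observations (x = 0) and 9.1155 for the remaining 71 (x = 1), i.e. the two group means, and each residual is the actual value minus its group mean. A compact way to reproduce these columns in R (a sketch, assuming the fit object from the earlier sketch):

unique(round(fitted(fit), 4))     # 7.9773 and 9.1155, the two fitted levels
head(resid(fit), 3)               # 0.7227, 0.5227, 0.2227, as in the table
tapply(y$WLHvrouwen, y$x, mean)   # group means reproduce the fitted levels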




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)              # y arrives with variables in rows; transpose to observations x variables
k <- length(x[1,])     # number of variables
n <- length(x[,1])     # number of observations
x1 <- cbind(x[,par1], x[,1:k!=par1])   # move column par1 to the front; it becomes the dependent variable below
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){   # optionally replace every series by its first difference
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){   # add 11 monthly seasonal dummies M1..M11
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){   # add 3 quarterly seasonal dummies Q1..Q3
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){   # append a linear trend regressor t = 1, ..., n
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))         # OLS fit: the first column of df is regressed on all remaining columns
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')   # server-side helpers providing table.start(), table.row.*, table.element() and table.save()
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
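
The script above assumes the FreeStatistics server environment: y, par1, par2 and par3 are injected by the module, bitmap() requires Ghostscript, and load(file='createtable') supplies the table.start/table.row.*/table.element/table.save helpers used to build the HTML tables. A minimal sketch of the local stand-ins needed to run the statistical part of the script (everything below is an assumption, not part of the original module):

# assumed local setup; the table.* calls would still need the server's 'createtable' helpers
y <- t(as.matrix(read.table('dataseries.txt', sep='\t', dec=',',
                            col.names=c('WLHvrouwen', 'x'))))
par1 <- '1'
par2 <- 'Do not include Seasonal Dummies'
par3 <- 'No Linear Trend'
bitmap <- function(file, ...) png(file)   # replace the Ghostscript-based device with png()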