Free Statistics


Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sun, 18 Nov 2007 04:51:00 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2007/Nov/18/t1195386304snm8bly5iq4ncvs.htm/, Retrieved Sun, 05 May 2024 01:43:41 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=5554, Retrieved Sun, 05 May 2024 01:43:41 +0000

Original text written by user:
Is Private? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 204
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2007-11-18 11:51:00] [01c398ee8ca2f8c0964b19b0b10c7536] [Current]
Dataseries X:
7.4	0
7.2	0
7.0	0
6.6	0
6.4	0
6.4	0
6.8	0
7.3	0
7.0	0
7.0	0
6.7	0
6.7	0
6.3	0
6.2	0
6.0	0
6.3	0
6.2	0
6.1	0
6.2	0
6.6	0
6.6	0
7.8	0
7.4	0
7.4	1
7.5	1
7.4	1
7.4	1
7.0	1
6.9	1
6.9	1
7.6	1
7.7	1
7.6	1
8.2	1
8.0	1
8.1	1
8.3	1
8.2	1
8.1	1
7.7	1
7.6	1
7.7	1
8.2	1
8.4	1
8.4	1
8.6	1
8.4	1
8.5	1
8.7	1
8.7	1
8.6	1
7.4	1
7.3	1
7.4	1
9.0	1
9.2	1
9.2	1
8.5	1
8.3	1
8.3	1
8.6	1
8.6	1
8.5	1
8.1	1
8.1	1
8.0	1
8.6	1
8.7	1
8.7	1
8.6	1
8.4	1
8.4	1
8.7	1
8.7	1
8.5	1
8.3	1
8.3	1
8.3	1
8.1	1
8.2	1
8.1	1
8.1	1
7.9	1
7.7	1
8.1	1
8.0	1
7.7	1
7.8	1
7.6	1
7.4	1
7.7	1
7.9	1
7.6	1





Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Herman Ole Andreas Wold' @ 193.190.124.10:1001

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Herman Ole Andreas Wold' @ 193.190.124.10:1001 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5554&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]5 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Herman Ole Andreas Wold' @ 193.190.124.10:1001[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5554&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5554&T=0








Multiple Linear Regression - Estimated Regression Equation
x[t] = 6.70434782608696 + 1.38708074534161 y[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
x[t] =  +  6.70434782608696 +  1.38708074534161y[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5554&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]x[t] =  +  6.70434782608696 +  1.38708074534161y[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5554&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5554&T=1
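
With a single 0/1 regressor this equation just compares the two groups: the intercept (6.7043) is the mean of the 23 observations with y = 0, and the intercept plus the slope (8.0914) is the mean of the 70 observations with y = 1. A minimal reproduction sketch in R, assuming df is the data frame built by the module code at the bottom of this page (first column x, dummy column y):

# Reproduction sketch (assumes the data frame df from the module code below)
fit <- lm(x ~ y, data = df)
coef(fit)                  # (Intercept) 6.704348, y 1.387081
tapply(df$x, df$y, mean)   # group means: 6.704348 (y = 0), 8.091429 (y = 1)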








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	6.70434782608696	0.10687	62.7339	0	0
y	1.38708074534161	0.123182	11.2604	0	0

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 6.70434782608696 & 0.10687 & 62.7339 & 0 & 0 \tabularnewline
y & 1.38708074534161 & 0.123182 & 11.2604 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5554&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]6.70434782608696[/C][C]0.10687[/C][C]62.7339[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]y[/C][C]1.38708074534161[/C][C]0.123182[/C][C]11.2604[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5554&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5554&T=2
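
The columns above are related as follows: T-STAT = Parameter / S.D., the 2-tail p-value comes from the Student t distribution with n - k = 91 degrees of freedom, and the 1-tail value is half the 2-tail value (as in the module code below). A sketch using the mysum and mylm objects created by that code:

# Sketch: recomputing the test statistics from summary(mylm)
est   <- mysum$coefficients[, "Estimate"]
se    <- mysum$coefficients[, "Std. Error"]
tstat <- est / se                                     # e.g. 1.387081 / 0.123182 = 11.2604 for y
p2    <- 2 * pt(-abs(tstat), df = mylm$df.residual)   # 2-tail p-value (0 after rounding)
p1    <- p2 / 2                                       # 1-tail p-value, as in the last column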








Multiple Linear Regression - Regression Statistics
Multiple R: 0.763007246200546
R-squared: 0.58218005775454
Adjusted R-squared: 0.577588629817777
F-TEST (value): 126.797167629064
F-TEST (DF numerator): 1
F-TEST (DF denominator): 91
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.512528984548903
Sum Squared Residuals: 23.9044223602484

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.763007246200546 \tabularnewline
R-squared & 0.58218005775454 \tabularnewline
Adjusted R-squared & 0.577588629817777 \tabularnewline
F-TEST (value) & 126.797167629064 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 91 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.512528984548903 \tabularnewline
Sum Squared Residuals & 23.9044223602484 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5554&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.763007246200546[/C][/ROW]
[ROW][C]R-squared[/C][C]0.58218005775454[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.577588629817777[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]126.797167629064[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]1[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]91[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.512528984548903[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]23.9044223602484[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5554&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5554&T=3
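
These statistics all come from the same fit: Multiple R is the square root of R-squared (sqrt(0.5822) = 0.7630), and with a single regressor the F value is the square of the t-statistic for y (11.2604^2 = 126.80). A sketch using the objects from the module code below:

# Sketch: recovering the regression and residual statistics from summary(mylm)
mysum$r.squared        # 0.5822; its square root is the Multiple R
mysum$adj.r.squared    # 0.5776
mysum$fstatistic       # 126.80 on 1 and 91 degrees of freedom
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])   # p-value
mysum$sigma            # residual standard deviation, 0.5125
sum(mysum$resid^2)     # sum of squared residuals, 23.904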








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	7.4	6.70434782608696	0.695652173913044
2	7.2	6.70434782608696	0.495652173913043
3	7	6.70434782608696	0.295652173913044
4	6.6	6.70434782608696	-0.104347826086957
5	6.4	6.70434782608696	-0.304347826086956
6	6.4	6.70434782608696	-0.304347826086956
7	6.8	6.70434782608696	0.0956521739130434
8	7.3	6.70434782608696	0.595652173913043
9	7	6.70434782608696	0.295652173913044
10	7	6.70434782608696	0.295652173913044
11	6.7	6.70434782608696	-0.00434782608695632
12	6.7	6.70434782608696	-0.00434782608695632
13	6.3	6.70434782608696	-0.404347826086957
14	6.2	6.70434782608696	-0.504347826086956
15	6	6.70434782608696	-0.704347826086957
16	6.3	6.70434782608696	-0.404347826086957
17	6.2	6.70434782608696	-0.504347826086956
18	6.1	6.70434782608696	-0.604347826086957
19	6.2	6.70434782608696	-0.504347826086956
20	6.6	6.70434782608696	-0.104347826086957
21	6.6	6.70434782608696	-0.104347826086957
22	7.8	6.70434782608696	1.09565217391304
23	7.4	6.70434782608696	0.695652173913044
24	7.4	8.09142857142857	-0.691428571428571
25	7.5	8.09142857142857	-0.591428571428571
26	7.4	8.09142857142857	-0.691428571428571
27	7.4	8.09142857142857	-0.691428571428571
28	7	8.09142857142857	-1.09142857142857
29	6.9	8.09142857142857	-1.19142857142857
30	6.9	8.09142857142857	-1.19142857142857
31	7.6	8.09142857142857	-0.491428571428572
32	7.7	8.09142857142857	-0.391428571428571
33	7.6	8.09142857142857	-0.491428571428572
34	8.2	8.09142857142857	0.108571428571428
35	8	8.09142857142857	-0.0914285714285714
36	8.1	8.09142857142857	0.00857142857142826
37	8.3	8.09142857142857	0.208571428571429
38	8.2	8.09142857142857	0.108571428571428
39	8.1	8.09142857142857	0.00857142857142826
40	7.7	8.09142857142857	-0.391428571428571
41	7.6	8.09142857142857	-0.491428571428572
42	7.7	8.09142857142857	-0.391428571428571
43	8.2	8.09142857142857	0.108571428571428
44	8.4	8.09142857142857	0.308571428571429
45	8.4	8.09142857142857	0.308571428571429
46	8.6	8.09142857142857	0.508571428571428
47	8.4	8.09142857142857	0.308571428571429
48	8.5	8.09142857142857	0.408571428571429
49	8.7	8.09142857142857	0.608571428571428
50	8.7	8.09142857142857	0.608571428571428
51	8.6	8.09142857142857	0.508571428571428
52	7.4	8.09142857142857	-0.691428571428571
53	7.3	8.09142857142857	-0.791428571428572
54	7.4	8.09142857142857	-0.691428571428571
55	9	8.09142857142857	0.908571428571429
56	9.2	8.09142857142857	1.10857142857143
57	9.2	8.09142857142857	1.10857142857143
58	8.5	8.09142857142857	0.408571428571429
59	8.3	8.09142857142857	0.208571428571429
60	8.3	8.09142857142857	0.208571428571429
61	8.6	8.09142857142857	0.508571428571428
62	8.6	8.09142857142857	0.508571428571428
63	8.5	8.09142857142857	0.408571428571429
64	8.1	8.09142857142857	0.00857142857142826
65	8.1	8.09142857142857	0.00857142857142826
66	8	8.09142857142857	-0.0914285714285714
67	8.6	8.09142857142857	0.508571428571428
68	8.7	8.09142857142857	0.608571428571428
69	8.7	8.09142857142857	0.608571428571428
70	8.6	8.09142857142857	0.508571428571428
71	8.4	8.09142857142857	0.308571428571429
72	8.4	8.09142857142857	0.308571428571429
73	8.7	8.09142857142857	0.608571428571428
74	8.7	8.09142857142857	0.608571428571428
75	8.5	8.09142857142857	0.408571428571429
76	8.3	8.09142857142857	0.208571428571429
77	8.3	8.09142857142857	0.208571428571429
78	8.3	8.09142857142857	0.208571428571429
79	8.1	8.09142857142857	0.00857142857142826
80	8.2	8.09142857142857	0.108571428571428
81	8.1	8.09142857142857	0.00857142857142826
82	8.1	8.09142857142857	0.00857142857142826
83	7.9	8.09142857142857	-0.191428571428571
84	7.7	8.09142857142857	-0.391428571428571
85	8.1	8.09142857142857	0.00857142857142826
86	8	8.09142857142857	-0.0914285714285714
87	7.7	8.09142857142857	-0.391428571428571
88	7.8	8.09142857142857	-0.291428571428572
89	7.6	8.09142857142857	-0.491428571428572
90	7.4	8.09142857142857	-0.691428571428571
91	7.7	8.09142857142857	-0.391428571428571
92	7.9	8.09142857142857	-0.191428571428571
93	7.6	8.09142857142857	-0.491428571428572

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 7.4 & 6.70434782608696 & 0.695652173913044 \tabularnewline
2 & 7.2 & 6.70434782608696 & 0.495652173913043 \tabularnewline
3 & 7 & 6.70434782608696 & 0.295652173913044 \tabularnewline
4 & 6.6 & 6.70434782608696 & -0.104347826086957 \tabularnewline
5 & 6.4 & 6.70434782608696 & -0.304347826086956 \tabularnewline
6 & 6.4 & 6.70434782608696 & -0.304347826086956 \tabularnewline
7 & 6.8 & 6.70434782608696 & 0.0956521739130434 \tabularnewline
8 & 7.3 & 6.70434782608696 & 0.595652173913043 \tabularnewline
9 & 7 & 6.70434782608696 & 0.295652173913044 \tabularnewline
10 & 7 & 6.70434782608696 & 0.295652173913044 \tabularnewline
11 & 6.7 & 6.70434782608696 & -0.00434782608695632 \tabularnewline
12 & 6.7 & 6.70434782608696 & -0.00434782608695632 \tabularnewline
13 & 6.3 & 6.70434782608696 & -0.404347826086957 \tabularnewline
14 & 6.2 & 6.70434782608696 & -0.504347826086956 \tabularnewline
15 & 6 & 6.70434782608696 & -0.704347826086957 \tabularnewline
16 & 6.3 & 6.70434782608696 & -0.404347826086957 \tabularnewline
17 & 6.2 & 6.70434782608696 & -0.504347826086956 \tabularnewline
18 & 6.1 & 6.70434782608696 & -0.604347826086957 \tabularnewline
19 & 6.2 & 6.70434782608696 & -0.504347826086956 \tabularnewline
20 & 6.6 & 6.70434782608696 & -0.104347826086957 \tabularnewline
21 & 6.6 & 6.70434782608696 & -0.104347826086957 \tabularnewline
22 & 7.8 & 6.70434782608696 & 1.09565217391304 \tabularnewline
23 & 7.4 & 6.70434782608696 & 0.695652173913044 \tabularnewline
24 & 7.4 & 8.09142857142857 & -0.691428571428571 \tabularnewline
25 & 7.5 & 8.09142857142857 & -0.591428571428571 \tabularnewline
26 & 7.4 & 8.09142857142857 & -0.691428571428571 \tabularnewline
27 & 7.4 & 8.09142857142857 & -0.691428571428571 \tabularnewline
28 & 7 & 8.09142857142857 & -1.09142857142857 \tabularnewline
29 & 6.9 & 8.09142857142857 & -1.19142857142857 \tabularnewline
30 & 6.9 & 8.09142857142857 & -1.19142857142857 \tabularnewline
31 & 7.6 & 8.09142857142857 & -0.491428571428572 \tabularnewline
32 & 7.7 & 8.09142857142857 & -0.391428571428571 \tabularnewline
33 & 7.6 & 8.09142857142857 & -0.491428571428572 \tabularnewline
34 & 8.2 & 8.09142857142857 & 0.108571428571428 \tabularnewline
35 & 8 & 8.09142857142857 & -0.0914285714285714 \tabularnewline
36 & 8.1 & 8.09142857142857 & 0.00857142857142826 \tabularnewline
37 & 8.3 & 8.09142857142857 & 0.208571428571429 \tabularnewline
38 & 8.2 & 8.09142857142857 & 0.108571428571428 \tabularnewline
39 & 8.1 & 8.09142857142857 & 0.00857142857142826 \tabularnewline
40 & 7.7 & 8.09142857142857 & -0.391428571428571 \tabularnewline
41 & 7.6 & 8.09142857142857 & -0.491428571428572 \tabularnewline
42 & 7.7 & 8.09142857142857 & -0.391428571428571 \tabularnewline
43 & 8.2 & 8.09142857142857 & 0.108571428571428 \tabularnewline
44 & 8.4 & 8.09142857142857 & 0.308571428571429 \tabularnewline
45 & 8.4 & 8.09142857142857 & 0.308571428571429 \tabularnewline
46 & 8.6 & 8.09142857142857 & 0.508571428571428 \tabularnewline
47 & 8.4 & 8.09142857142857 & 0.308571428571429 \tabularnewline
48 & 8.5 & 8.09142857142857 & 0.408571428571429 \tabularnewline
49 & 8.7 & 8.09142857142857 & 0.608571428571428 \tabularnewline
50 & 8.7 & 8.09142857142857 & 0.608571428571428 \tabularnewline
51 & 8.6 & 8.09142857142857 & 0.508571428571428 \tabularnewline
52 & 7.4 & 8.09142857142857 & -0.691428571428571 \tabularnewline
53 & 7.3 & 8.09142857142857 & -0.791428571428572 \tabularnewline
54 & 7.4 & 8.09142857142857 & -0.691428571428571 \tabularnewline
55 & 9 & 8.09142857142857 & 0.908571428571429 \tabularnewline
56 & 9.2 & 8.09142857142857 & 1.10857142857143 \tabularnewline
57 & 9.2 & 8.09142857142857 & 1.10857142857143 \tabularnewline
58 & 8.5 & 8.09142857142857 & 0.408571428571429 \tabularnewline
59 & 8.3 & 8.09142857142857 & 0.208571428571429 \tabularnewline
60 & 8.3 & 8.09142857142857 & 0.208571428571429 \tabularnewline
61 & 8.6 & 8.09142857142857 & 0.508571428571428 \tabularnewline
62 & 8.6 & 8.09142857142857 & 0.508571428571428 \tabularnewline
63 & 8.5 & 8.09142857142857 & 0.408571428571429 \tabularnewline
64 & 8.1 & 8.09142857142857 & 0.00857142857142826 \tabularnewline
65 & 8.1 & 8.09142857142857 & 0.00857142857142826 \tabularnewline
66 & 8 & 8.09142857142857 & -0.0914285714285714 \tabularnewline
67 & 8.6 & 8.09142857142857 & 0.508571428571428 \tabularnewline
68 & 8.7 & 8.09142857142857 & 0.608571428571428 \tabularnewline
69 & 8.7 & 8.09142857142857 & 0.608571428571428 \tabularnewline
70 & 8.6 & 8.09142857142857 & 0.508571428571428 \tabularnewline
71 & 8.4 & 8.09142857142857 & 0.308571428571429 \tabularnewline
72 & 8.4 & 8.09142857142857 & 0.308571428571429 \tabularnewline
73 & 8.7 & 8.09142857142857 & 0.608571428571428 \tabularnewline
74 & 8.7 & 8.09142857142857 & 0.608571428571428 \tabularnewline
75 & 8.5 & 8.09142857142857 & 0.408571428571429 \tabularnewline
76 & 8.3 & 8.09142857142857 & 0.208571428571429 \tabularnewline
77 & 8.3 & 8.09142857142857 & 0.208571428571429 \tabularnewline
78 & 8.3 & 8.09142857142857 & 0.208571428571429 \tabularnewline
79 & 8.1 & 8.09142857142857 & 0.00857142857142826 \tabularnewline
80 & 8.2 & 8.09142857142857 & 0.108571428571428 \tabularnewline
81 & 8.1 & 8.09142857142857 & 0.00857142857142826 \tabularnewline
82 & 8.1 & 8.09142857142857 & 0.00857142857142826 \tabularnewline
83 & 7.9 & 8.09142857142857 & -0.191428571428571 \tabularnewline
84 & 7.7 & 8.09142857142857 & -0.391428571428571 \tabularnewline
85 & 8.1 & 8.09142857142857 & 0.00857142857142826 \tabularnewline
86 & 8 & 8.09142857142857 & -0.0914285714285714 \tabularnewline
87 & 7.7 & 8.09142857142857 & -0.391428571428571 \tabularnewline
88 & 7.8 & 8.09142857142857 & -0.291428571428572 \tabularnewline
89 & 7.6 & 8.09142857142857 & -0.491428571428572 \tabularnewline
90 & 7.4 & 8.09142857142857 & -0.691428571428571 \tabularnewline
91 & 7.7 & 8.09142857142857 & -0.391428571428571 \tabularnewline
92 & 7.9 & 8.09142857142857 & -0.191428571428571 \tabularnewline
93 & 7.6 & 8.09142857142857 & -0.491428571428572 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=5554&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]7.4[/C][C]6.70434782608696[/C][C]0.695652173913044[/C][/ROW]
[ROW][C]2[/C][C]7.2[/C][C]6.70434782608696[/C][C]0.495652173913043[/C][/ROW]
[ROW][C]3[/C][C]7[/C][C]6.70434782608696[/C][C]0.295652173913044[/C][/ROW]
[ROW][C]4[/C][C]6.6[/C][C]6.70434782608696[/C][C]-0.104347826086957[/C][/ROW]
[ROW][C]5[/C][C]6.4[/C][C]6.70434782608696[/C][C]-0.304347826086956[/C][/ROW]
[ROW][C]6[/C][C]6.4[/C][C]6.70434782608696[/C][C]-0.304347826086956[/C][/ROW]
[ROW][C]7[/C][C]6.8[/C][C]6.70434782608696[/C][C]0.0956521739130434[/C][/ROW]
[ROW][C]8[/C][C]7.3[/C][C]6.70434782608696[/C][C]0.595652173913043[/C][/ROW]
[ROW][C]9[/C][C]7[/C][C]6.70434782608696[/C][C]0.295652173913044[/C][/ROW]
[ROW][C]10[/C][C]7[/C][C]6.70434782608696[/C][C]0.295652173913044[/C][/ROW]
[ROW][C]11[/C][C]6.7[/C][C]6.70434782608696[/C][C]-0.00434782608695632[/C][/ROW]
[ROW][C]12[/C][C]6.7[/C][C]6.70434782608696[/C][C]-0.00434782608695632[/C][/ROW]
[ROW][C]13[/C][C]6.3[/C][C]6.70434782608696[/C][C]-0.404347826086957[/C][/ROW]
[ROW][C]14[/C][C]6.2[/C][C]6.70434782608696[/C][C]-0.504347826086956[/C][/ROW]
[ROW][C]15[/C][C]6[/C][C]6.70434782608696[/C][C]-0.704347826086957[/C][/ROW]
[ROW][C]16[/C][C]6.3[/C][C]6.70434782608696[/C][C]-0.404347826086957[/C][/ROW]
[ROW][C]17[/C][C]6.2[/C][C]6.70434782608696[/C][C]-0.504347826086956[/C][/ROW]
[ROW][C]18[/C][C]6.1[/C][C]6.70434782608696[/C][C]-0.604347826086957[/C][/ROW]
[ROW][C]19[/C][C]6.2[/C][C]6.70434782608696[/C][C]-0.504347826086956[/C][/ROW]
[ROW][C]20[/C][C]6.6[/C][C]6.70434782608696[/C][C]-0.104347826086957[/C][/ROW]
[ROW][C]21[/C][C]6.6[/C][C]6.70434782608696[/C][C]-0.104347826086957[/C][/ROW]
[ROW][C]22[/C][C]7.8[/C][C]6.70434782608696[/C][C]1.09565217391304[/C][/ROW]
[ROW][C]23[/C][C]7.4[/C][C]6.70434782608696[/C][C]0.695652173913044[/C][/ROW]
[ROW][C]24[/C][C]7.4[/C][C]8.09142857142857[/C][C]-0.691428571428571[/C][/ROW]
[ROW][C]25[/C][C]7.5[/C][C]8.09142857142857[/C][C]-0.591428571428571[/C][/ROW]
[ROW][C]26[/C][C]7.4[/C][C]8.09142857142857[/C][C]-0.691428571428571[/C][/ROW]
[ROW][C]27[/C][C]7.4[/C][C]8.09142857142857[/C][C]-0.691428571428571[/C][/ROW]
[ROW][C]28[/C][C]7[/C][C]8.09142857142857[/C][C]-1.09142857142857[/C][/ROW]
[ROW][C]29[/C][C]6.9[/C][C]8.09142857142857[/C][C]-1.19142857142857[/C][/ROW]
[ROW][C]30[/C][C]6.9[/C][C]8.09142857142857[/C][C]-1.19142857142857[/C][/ROW]
[ROW][C]31[/C][C]7.6[/C][C]8.09142857142857[/C][C]-0.491428571428572[/C][/ROW]
[ROW][C]32[/C][C]7.7[/C][C]8.09142857142857[/C][C]-0.391428571428571[/C][/ROW]
[ROW][C]33[/C][C]7.6[/C][C]8.09142857142857[/C][C]-0.491428571428572[/C][/ROW]
[ROW][C]34[/C][C]8.2[/C][C]8.09142857142857[/C][C]0.108571428571428[/C][/ROW]
[ROW][C]35[/C][C]8[/C][C]8.09142857142857[/C][C]-0.0914285714285714[/C][/ROW]
[ROW][C]36[/C][C]8.1[/C][C]8.09142857142857[/C][C]0.00857142857142826[/C][/ROW]
[ROW][C]37[/C][C]8.3[/C][C]8.09142857142857[/C][C]0.208571428571429[/C][/ROW]
[ROW][C]38[/C][C]8.2[/C][C]8.09142857142857[/C][C]0.108571428571428[/C][/ROW]
[ROW][C]39[/C][C]8.1[/C][C]8.09142857142857[/C][C]0.00857142857142826[/C][/ROW]
[ROW][C]40[/C][C]7.7[/C][C]8.09142857142857[/C][C]-0.391428571428571[/C][/ROW]
[ROW][C]41[/C][C]7.6[/C][C]8.09142857142857[/C][C]-0.491428571428572[/C][/ROW]
[ROW][C]42[/C][C]7.7[/C][C]8.09142857142857[/C][C]-0.391428571428571[/C][/ROW]
[ROW][C]43[/C][C]8.2[/C][C]8.09142857142857[/C][C]0.108571428571428[/C][/ROW]
[ROW][C]44[/C][C]8.4[/C][C]8.09142857142857[/C][C]0.308571428571429[/C][/ROW]
[ROW][C]45[/C][C]8.4[/C][C]8.09142857142857[/C][C]0.308571428571429[/C][/ROW]
[ROW][C]46[/C][C]8.6[/C][C]8.09142857142857[/C][C]0.508571428571428[/C][/ROW]
[ROW][C]47[/C][C]8.4[/C][C]8.09142857142857[/C][C]0.308571428571429[/C][/ROW]
[ROW][C]48[/C][C]8.5[/C][C]8.09142857142857[/C][C]0.408571428571429[/C][/ROW]
[ROW][C]49[/C][C]8.7[/C][C]8.09142857142857[/C][C]0.608571428571428[/C][/ROW]
[ROW][C]50[/C][C]8.7[/C][C]8.09142857142857[/C][C]0.608571428571428[/C][/ROW]
[ROW][C]51[/C][C]8.6[/C][C]8.09142857142857[/C][C]0.508571428571428[/C][/ROW]
[ROW][C]52[/C][C]7.4[/C][C]8.09142857142857[/C][C]-0.691428571428571[/C][/ROW]
[ROW][C]53[/C][C]7.3[/C][C]8.09142857142857[/C][C]-0.791428571428572[/C][/ROW]
[ROW][C]54[/C][C]7.4[/C][C]8.09142857142857[/C][C]-0.691428571428571[/C][/ROW]
[ROW][C]55[/C][C]9[/C][C]8.09142857142857[/C][C]0.908571428571429[/C][/ROW]
[ROW][C]56[/C][C]9.2[/C][C]8.09142857142857[/C][C]1.10857142857143[/C][/ROW]
[ROW][C]57[/C][C]9.2[/C][C]8.09142857142857[/C][C]1.10857142857143[/C][/ROW]
[ROW][C]58[/C][C]8.5[/C][C]8.09142857142857[/C][C]0.408571428571429[/C][/ROW]
[ROW][C]59[/C][C]8.3[/C][C]8.09142857142857[/C][C]0.208571428571429[/C][/ROW]
[ROW][C]60[/C][C]8.3[/C][C]8.09142857142857[/C][C]0.208571428571429[/C][/ROW]
[ROW][C]61[/C][C]8.6[/C][C]8.09142857142857[/C][C]0.508571428571428[/C][/ROW]
[ROW][C]62[/C][C]8.6[/C][C]8.09142857142857[/C][C]0.508571428571428[/C][/ROW]
[ROW][C]63[/C][C]8.5[/C][C]8.09142857142857[/C][C]0.408571428571429[/C][/ROW]
[ROW][C]64[/C][C]8.1[/C][C]8.09142857142857[/C][C]0.00857142857142826[/C][/ROW]
[ROW][C]65[/C][C]8.1[/C][C]8.09142857142857[/C][C]0.00857142857142826[/C][/ROW]
[ROW][C]66[/C][C]8[/C][C]8.09142857142857[/C][C]-0.0914285714285714[/C][/ROW]
[ROW][C]67[/C][C]8.6[/C][C]8.09142857142857[/C][C]0.508571428571428[/C][/ROW]
[ROW][C]68[/C][C]8.7[/C][C]8.09142857142857[/C][C]0.608571428571428[/C][/ROW]
[ROW][C]69[/C][C]8.7[/C][C]8.09142857142857[/C][C]0.608571428571428[/C][/ROW]
[ROW][C]70[/C][C]8.6[/C][C]8.09142857142857[/C][C]0.508571428571428[/C][/ROW]
[ROW][C]71[/C][C]8.4[/C][C]8.09142857142857[/C][C]0.308571428571429[/C][/ROW]
[ROW][C]72[/C][C]8.4[/C][C]8.09142857142857[/C][C]0.308571428571429[/C][/ROW]
[ROW][C]73[/C][C]8.7[/C][C]8.09142857142857[/C][C]0.608571428571428[/C][/ROW]
[ROW][C]74[/C][C]8.7[/C][C]8.09142857142857[/C][C]0.608571428571428[/C][/ROW]
[ROW][C]75[/C][C]8.5[/C][C]8.09142857142857[/C][C]0.408571428571429[/C][/ROW]
[ROW][C]76[/C][C]8.3[/C][C]8.09142857142857[/C][C]0.208571428571429[/C][/ROW]
[ROW][C]77[/C][C]8.3[/C][C]8.09142857142857[/C][C]0.208571428571429[/C][/ROW]
[ROW][C]78[/C][C]8.3[/C][C]8.09142857142857[/C][C]0.208571428571429[/C][/ROW]
[ROW][C]79[/C][C]8.1[/C][C]8.09142857142857[/C][C]0.00857142857142826[/C][/ROW]
[ROW][C]80[/C][C]8.2[/C][C]8.09142857142857[/C][C]0.108571428571428[/C][/ROW]
[ROW][C]81[/C][C]8.1[/C][C]8.09142857142857[/C][C]0.00857142857142826[/C][/ROW]
[ROW][C]82[/C][C]8.1[/C][C]8.09142857142857[/C][C]0.00857142857142826[/C][/ROW]
[ROW][C]83[/C][C]7.9[/C][C]8.09142857142857[/C][C]-0.191428571428571[/C][/ROW]
[ROW][C]84[/C][C]7.7[/C][C]8.09142857142857[/C][C]-0.391428571428571[/C][/ROW]
[ROW][C]85[/C][C]8.1[/C][C]8.09142857142857[/C][C]0.00857142857142826[/C][/ROW]
[ROW][C]86[/C][C]8[/C][C]8.09142857142857[/C][C]-0.0914285714285714[/C][/ROW]
[ROW][C]87[/C][C]7.7[/C][C]8.09142857142857[/C][C]-0.391428571428571[/C][/ROW]
[ROW][C]88[/C][C]7.8[/C][C]8.09142857142857[/C][C]-0.291428571428572[/C][/ROW]
[ROW][C]89[/C][C]7.6[/C][C]8.09142857142857[/C][C]-0.491428571428572[/C][/ROW]
[ROW][C]90[/C][C]7.4[/C][C]8.09142857142857[/C][C]-0.691428571428571[/C][/ROW]
[ROW][C]91[/C][C]7.7[/C][C]8.09142857142857[/C][C]-0.391428571428571[/C][/ROW]
[ROW][C]92[/C][C]7.9[/C][C]8.09142857142857[/C][C]-0.191428571428571[/C][/ROW]
[ROW][C]93[/C][C]7.6[/C][C]8.09142857142857[/C][C]-0.491428571428572[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=5554&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=5554&T=4
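
The "Interpolation (Forecast)" column is simply the fitted value of the regression (6.704348 for the first 23 observations, 8.091429 thereafter), and the "Residuals (Prediction Error)" column is the actual value minus the fitted value. A sketch using the objects from the module code below:

# Sketch: reconstructing the actuals/interpolation/residuals table
interp <- fitted(mylm)                       # 6.704348 or 8.091429, depending on y
resids <- mysum$resid                        # actuals minus interpolation
head(cbind(actual = df$x, interp, resids))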




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
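# Overview of the archived module code below:
#   1. reorder the uploaded columns so the endogenous variable (par1) comes first,
#      and optionally take first differences, add seasonal dummies, or a linear trend;
#   2. fit the multiple linear regression with lm() and keep its summary();
#   3. write the diagnostic plots (test0.png ... test8.png);
#   4. build the HTML tables reproduced above with the 'createtable' helpers.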
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
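# Fit the regression of the first column on the remaining column(s);
# mysum supplies the coefficient table, R-squared, F statistic and residuals used below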
(mylm <- lm(df))
(mysum <- summary(mylm))
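# Diagnostic plots, written as PNG bitmaps: actuals with interpolation, residuals,
# histogram, density, normal Q-Q, lag plot (with lowess and regression line),
# ACF, PACF, and the standard lm() diagnostic panel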
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
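# Assemble the HTML tables shown above with the server-side 'createtable'
# helpers (table.start, table.row.start, table.element, table.save)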
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')